903R92011
Current Approaches for Modeling
Estuarine Ecosystem Processes
Scientific and Technical Advisory Committee
Chesapeake Bay Program
To receive additional copies of this publication, please call or write
Chesapeake Research Consortium, Inc.
P.O. Box 1280
Solomons, MD 20688
(410) 326-6700
CRC Publication No. 144
Disclaimer
Current Approaches for Modeling Estuarine Ecosystem Processes is supported by the U.S. Environmental Protection Agency (EPA) through
cooperative agreement X-3454-04-0 with the Chesapeake Research Consortium, Inc. (CRC). This publication has been prepared by staff
of the Scientific and Technical Advisory Committee (STAC) of the Chesapeake Bay Program. Recommendations contained herein have
not been formally endorsed by STAC members. Mention of trade names or commercial products does not constitute endorsement or
recommendation for use.
Table of Contents
Introduction
Objectives and Organization of Workshop 1
- Background
- Format of the executive summary
State of the Art
Contemporary Approaches to Estuarine Modeling 2
- Introduction
- Ecosystem process models
- Water quality models
- Spatially explicit fish bioenergetic models
- Individual-based fishery management models
- Ecosystem regression models
- Ecosystem network analysis models
- Landscape spatial models
Linkage and Coupling
Toward an Integrated Modeling Framework 17
- Overview
- Conceptual issues
- Physical process issues
- Water quality model issues
- Submerged aquatic vegetation model issues
- Ecosystem feedback and control issues
- Fishery management and recruitment issues
Methodology and Technology
Factors in Concept, Design, and Implementation 21
- Introduction
- Scales of ecological systems
- Aggregation and scale
- Predictive versus experimental
- Specific scale problems
- Limits of technology
- Physics: fine-scale versus coarse-scale
- Feedbacks, trophic interactions, and time
- Data availability
- Model development: formulation, calibration, and sensitivity analysis
Recommendations
An Agenda for Action 27
- Recommendations
List of Participants

Conference chairmen
Walter Boynton, University of Maryland, Chesapeake Biological Lab*
Mike Kemp, University of Maryland, Horn Point Environmental Lab*
Dick Wetzel, College of William and Mary, Virginia Inst. of Marine Science*

Steering committee
Rich Batiuk, U.S. EPA, Chesapeake Bay Program Office
Steve Jordan, Maryland Department of Natural Resources
Joe Mihursky, University of Maryland, Chesapeake Biological Lab/CRC
Steve Nelson, University of Maryland, Chesapeake Biological Lab/CRC

Invited scientists
Steve Bartell, Oak Ridge National Lab***
Steve Brandt, University of Maryland, Chesapeake Biological Lab*
Carl Cerco, U.S. Army Corps of Engineers
Shenn-Yu Chao, University of Maryland, Horn Point Environmental Lab*
Bob Costanza, University of Maryland, Chesapeake Biological Lab*
Jim Cowan, University of South Alabama
Dominic DiToro, Hydroqual, Inc.*
Stuart Findley, New York Botanical Garden
Jay Gooch, University of Maryland, Chesapeake Biological Lab
Cynthia Jones, Old Dominion University
James Kremer, University of Southern California*
Bill Richkus, Versar, Inc.
Ken Rose, Oak Ridge National Lab*
Terry Schemm, Johns Hopkins University
Bill Seufzer, College of William and Mary, Virginia Inst. of Marine Science
Gordon Smith, Johns Hopkins University
Katrina Smith, Duke University
Robert Ulanowicz, University of Maryland, Chesapeake Biological Lab**
Dick Wiegert, University of Georgia*

Invited managers
Steve Bieber, Maryland Department of Environment
Carin Bisland, U.S. EPA, Chesapeake Bay Program Office
Arthur Butt, Virginia Water Control Board
Bess Gillelan, NOAA Coastal Ocean Program
Lewis Linker, U.S. EPA, Chesapeake Bay Program Office
Bob Lippson, NOAA/National Marine Fisheries Service
Rob Magnien, Maryland Department of Environment
Gail McKiernan, Maryland Sea Grant
Paul Miller, Maryland Department of Natural Resources
Ed Pendleton, U.S. Fish and Wildlife Service

Editors and writers
Steve Nelson, Mike Kemp, and Walter Boynton

STAC Staff
Michele Aud, design and layout; Paul Elliott, Brooke Farquhar, Jim Hagy, Nadine Lymn, Lisa Wainger, and Elizabeth Watkins, rapporteurs; Marianne Fisher, editor
* Workshop speakers.
** Invited but unable to attend.
*** Presentation given by Ken Rose.
Introduction
Objectives and Organization of Workshop

Background
A group of estuarine scientists and managers met in March 1992 in St. Michaels,
Maryland, to discuss a variety of issues relevant to modeling estuarine ecosystem
processes in Chesapeake Bay. Sponsored by the Chesapeake Bay Program Scientific
and Technical Advisory Committee (STAC), the three-day workshop brought
together leading researchers and modelers from across the United States to investi-
gate and to compare a range of numerical ecosystem approaches and to evaluate
relevant applications for managing estuarine resources in Chesapeake Bay.
The workshop featured plenary presentations by leading modelers who are
working on various approaches to ecosystem modeling. These presentations summa-
rized state-of-the-art methodologies and technologies used in several important
areas, including:
Ecosystem process models
Water quality models
Spatially explicit fish bioenergetic models
Individual-based fishery management models (IBFMs)
Ecosystem regression models (ERMs)
Ecosystem network analysis models
Landscape spatial models
After the plenary presentations, the group divided into smaller subgroups to
address a range of technical and philosophical questions pertaining to ecosystem
process modeling. Each subgroup focused on the same basic questions of appropriate
scales and levels of aggregation, and technical issues associated with model develop-
ment, sensitivity analysis, calibration, and documentation.
Other discussions focused on coupling and integration questions: How do models
integrate scientific research into a coherent framework? What are the best approaches
for linking ecosystem process models with physical transport and fish population
models? What are the technical and philosophical issues regarding links between
such key components as water quality, submerged aquatic vegetation (SAV), fish,
and habitat?
On the final day, the group reconvened to summarize its recommendations and to
outline an agenda for action.
Format of the executive summary

After the workshop, STAC staff prepared this document—a technical executive
summary of the presentations and discussions. To cover issues more effectively, the
executive summary does not follow the chronological events of the workshop; rather,
it borrows from all presentations, discussions, and papers to summarize essential
points. In this way, the executive summary outlines important modeling approaches,
reviews conceptual issues, and provides specific recommendations to the Chesa-
peake Bay Program.
Moreover, readers should be reminded that any good discussion contains a range
of opinions and alternative viewpoints. In this context, any statement presented
herein as consensus or fact, upon closer scrutiny, will reveal a full spectrum of
scientific interpretation and emphasis.
State of the Art
Contemporary Approaches to Estuarine Modeling

Introduction
Workshop participants described a wide range of modeling approaches in use to
simulate various processes, populations, and communities in Chesapeake Bay and
other coastal ecosystems. Moreover, participants recognized the importance of
complex interactions in relating human activities to estuarine resources. There was
general consensus that linking or coupling hydrodynamic, water quality, ecosystem,
and fish models into an integrated Bay-system framework is a major objective for the
future. (See Figure 1.)
To achieve this goal, participants endorsed the idea of developing a suite of
coupled (directly or indirectly) models to describe Bay-system processes, popula-
tions, and communities. However, they also agreed that because the models are
based on different assumptions and attempt to quantify processes at different spatial
and temporal scales and at different levels of biological organization, it would be a
considerable challenge to overcome the problems inherent in developing a fully
integrated model. The following initial steps were proposed: (1) generate output of
biologically relevant variables from traditional water quality models; (2) use ecosys-
tem models to explore feedback effects on water quality; (3) apply ecosystem models
to examine water quality effects on lower trophic-level organisms (e.g., zooplankton,
benthos) and SAV habitat; and (4) iterate the process, progressively refining the models.
Although there are many approaches to modeling estuarine systems, several types
of models stand out for making important contributions to Chesapeake Bay science
or for potentially making such contributions in the future. They include:
• Ecosystem process models:
  - Plankton/benthos of Chesapeake Bay
  - Plankton dynamics of Narragansett Bay
  - Seagrass photosynthesis/growth
  - Salt marsh production
• Water quality models
• Spatially explicit fish bioenergetic models
• Individual-based fishery management models (IBFMs)
• Ecosystem regression models (ERMs)
• Ecosystem network analysis models
• Landscape spatial models
The rest of this section summarizes the chief characteristics of these model types. It
is based largely on the papers presented at the workshop.
Ecosystem process models

Ecosystem process models address the mechanistic interactions that control the
flow of nutrients and organic materials in coastal systems. Based on earlier models
used to describe nutrient cycling between sediments and phytoplankton, some
current ecosystem models explicitly simulate the effects of organic matter on benthic
animals and bacteria. They often emphasize complex biogeochemical processes and
interactions with higher trophic species.
Plankton/benthos of Chesapeake Bay
One ecosystem process model that simulates the trophic and biogeochemical
processes that control the vertical exchanges of material between sediments and
euphotic waters of Chesapeake Bay is Mike Kemp's planktonic/benthic model,
which he described in this workshop. The model was designed to answer a range of
specific scientific questions about the deposition of particulate organic matter (POM)
and the factors controlling the process; for instance:
Figure 1. This diagram summarizes the types of models discussed at the workshop. Dark shading represents models currently in use, light stippling describes models under development, and clear boxes show approaches to future models. Connecting lines represent links between current and planned or potential modeling activities. This simple diagram reveals a great diversity in modeling approaches being developed for estuarine systems. It also shows that there are multiple, but largely untested, approaches for linking different parts of the ecosystem—such as water quality and living resources—into a unified whole.
[Figure 1 diagram: modeling approaches I. Water quality model; II. Ecological habitat models; III. User interface capabilities (i.e., HyperCard); IV. Fishery stock/recruitment models; V. Chemical models; VI. Linear analysis multiple species models; VII. Spatial models. Legend: doing it now; starting to do it; hope to start; models under development.]
-------
Current Approaches for Modeling
• How do zooplankton grazing and fish predation affect POM deposition?
• How does nutrient enrichment influence POM deposition and oxygen
consumption in bottom waters?
• To what extent does the microbial loop affect POM deposition?
• What controls the balance between denitrification and ammonia recycling
in sediments?
• How do benthic suspension-feeders affect plankton dynamics and associ-
ated benthic-pelagic coupling?
This benthic-pelagic ecosystem process model includes 36 state variables
interacting according to equations well-established in the scientific literature
(depicted in Figure 2). Based on well-understood relationships, such equations
have a high degree of generality and can be applied to a broad range of condi-
tions and Bay regions. The model spatially averages ecosystem processes in a
stratified water column over an area of approximately 200 km2. To simulate
sediment processes, the model includes the upper 10 cm of sediment and uses a
pore-water oxygen pool as a state variable to separate the sediment into two
distinct redox zones. It simulates time frames ranging from a day to a decade and
uses Apple Macintosh STELLA simulation software to integrate finite difference
equations that describe the temporal rates of change in state variables (time-step
equals 2-4 h). Simulation time for an annual cycle of the model is about 15
minutes.
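The model's equations are not reproduced here, but the numerical scheme it uses (coupled state variables advanced by finite-difference Euler steps of a few hours) can be sketched in a few lines of Python. The two state variables, rate constants, and light forcing below are invented placeholders, not values from the Kemp and Bartleson model.

```python
# Minimal sketch of finite-difference integration of coupled state variables,
# in the spirit of the STELLA scheme described above (Euler steps, dt = 3 h).
# State variables, parameters, and forcing are illustrative only.
import math

DT_HOURS = 3.0
HOURS_PER_YEAR = 24 * 365

def light(hour):
    """Hypothetical seasonal light forcing (relative units, 0 to 1)."""
    day_of_year = hour / 24.0
    return 0.5 * (1.0 + math.sin(2.0 * math.pi * (day_of_year - 80) / 365.0))

def run_year(phyto=1.0, nutrient=10.0):
    """Integrate dP/dt and dN/dt with simple uptake and remineralization terms."""
    mu_max, k_n, loss, remin = 0.05, 2.0, 0.02, 0.8  # per-hour rates (assumed)
    for step in range(int(HOURS_PER_YEAR / DT_HOURS)):
        hour = step * DT_HOURS
        growth = mu_max * light(hour) * nutrient / (k_n + nutrient) * phyto
        mortality = loss * phyto
        phyto += (growth - mortality) * DT_HOURS
        nutrient += (-growth + remin * mortality) * DT_HOURS
    return phyto, nutrient

if __name__ == "__main__":
    print(run_year())
```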
Initial model experiments have been used to investigate the seasonal effects of
nutrient inputs, the effects of zooplankton and fish grazing, and the interactions
between denitrification and bottom water dissolved oxygen concentrations.
Moreover, these models have been used to evaluate the effects of nutrient
reductions on summer bottom water dissolved oxygen conditions and the results
are broadly comparable to the output from the hydrodynamically driven, time-
variable, Chesapeake Bay water quality model.
Mike Kemp suggested that, in general, ecosystem process models offer several
strengths in that they:
• Provide a framework for empirical research
• Simulate functional attributes of the system
• Can be reasonably calibrated
• Simulate nonlinear ecological feedbacks
• Can be linked to fishery bioenergetic models
Ecosystem process models, however, also tend to have certain weaknesses in that
they:
• Often have limited spatial articulation
• Require expensive ecological process data for calibration
• Present calibration difficulties owing to high dimensionality
• Seldom include higher trophic organisms with complex life cycles (e.g. fish
and birds)
Plankton dynamics of Narragansett Bay
In reviewing the Narragansett model, Jim Kremer described this classic estua-
rine ecosystem process model. The model simulates interactions between nutri-
ents, phytoplankton, zooplankton, benthos, and small carnivores in eight spatial
elements subject to daily tidal variations. In this approach, two phytoplankton
subdivisions grow as an exponential function of temperature; Monod hyperbolic
functions define nutrient limitation for nitrogen, phosphorus, and silicon, and
Steele's curve specifies light levels for optimal growth. Both zooplankton and
benthos graze phytoplankton, thereby controlling the realized increase in phy-
toplankton biomass.
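As a rough illustration of that formulation, the sketch below combines an exponential temperature term, Monod limitation by nitrogen, phosphorus, and silicon (the minimum is taken here as the limiting term), and Steele's light curve. All parameter values are assumptions for illustration, not coefficients from the Narragansett model.

```python
# Sketch of the phytoplankton growth formulation described above: exponential
# temperature dependence, Monod nutrient limitation, and Steele's light curve.
import math

def monod(conc, half_sat):
    return conc / (half_sat + conc)

def steele(irradiance, optimal):
    """Steele's curve: peaks at the optimal irradiance, inhibited above it."""
    return (irradiance / optimal) * math.exp(1.0 - irradiance / optimal)

def growth_rate(temp_c, n, p, si, irradiance,
                mu0=0.6, kt=0.063, kn=1.5, kp=0.1, ksi=1.0, i_opt=300.0):
    temperature_term = mu0 * math.exp(kt * temp_c)   # growth scales with temperature
    nutrient_term = min(monod(n, kn), monod(p, kp), monod(si, ksi))
    light_term = steele(irradiance, i_opt)
    return temperature_term * nutrient_term * light_term  # specific growth per day

if __name__ == "__main__":
    print(growth_rate(temp_c=15.0, n=5.0, p=0.5, si=3.0, irradiance=200.0))
```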
Figure 2. Ecosystem process model of Kemp and Bartleson. Based on Odum's symbols, it is a conceptual model of ecosystem processes in Chesapeake Bay. Arrows indicate flow of either carbon, oxygen, nitrogen, sulfur, or iron, except those from the sun, tides, and wind. Many functional compartments contain more than one biotic or chemical species.
In the Narragansett model, zooplankton are divided into two groupings with
algorithms for egg production, growth, respiration, and feeding efficiencies. By
comparing preferred respiration activity with available food, the model calcu-
lates a filtering rate for phytoplankton. The model assumes unique rates for
adults and juveniles and its small carnivores include larval fish, ctenophores, and
menhaden. According to formulations for grazing and excretion, these carnivores
graze zooplankton, but not algae.
Similar respiration and grazing rates can be calculated for benthic organisms,
which are subdivided into filter feeders and deposit feeders. Available particu-
late food determines rates for filter feeders, while deposit feeder rates depend on
sinking phytoplankton and zooplankton feces as food sources. Microbial decom-
position occurs as first order decay of particulate organic carbon (POC) in the
sediments; phytoplankton, zooplankton, and filter feeders contribute to the POC
pool.
Describing this approach as mechanistic, Dr. Kremer stressed the need for
detailed understanding of the processes and species controlling the ecosystem.
Early versions of the model failed due to a poor understanding of sediment
denitrification and zooplankton populations controlled by overwintering eggs.
By making the appropriate scientific adjustments, the current model provides
better results for nitrogen balances in the estuary.
Seagrass photosynthesis/growth
Models of seagrasses and other SAV typically simulate photosynthesis and
growth of macrophytes, epiphytes, and phytoplankton. Some seagrass models
also include important nutrient cycling pathways and grazing by herbivores.
Based on physical forcing functions and biotic interactions in plant and grazer
communities, these models have shown the importance of light, epiphytic
colonization, and temperature on plant growth and survival. Irradiance, or
photosynthetically active radiation (PAR), is attenuated in the water column
owing to the scattering and absorption by particulate and dissolved organic
compounds. Epiphytes also attenuate radiation and act as a limiting boundary
for the exchange of nutrients and gases. Temperature controls specific physi-
ological processes and limits the geographic range of sea grass distribution.
In this workshop, Dick Wetzel described a seagrass model where many statisti-
cal or empirical equations were defined using physical/environmental forcing
functions, such as temperature, solar insolation, water depth, light attenuation,
and photoperiod. However, for biotic interactions the model applies nonlinear,
density-dependent feedback mechanisms to limit biological processes. These
density-dependent effects include shading, crowding, and nutrient limitation.
For instance, the model defines carbon dioxide availability and leaf biomass as
limiting factors which keep photosynthesis to less than full physiological capac-
ity. In this model, nonlinear equations describe biological processes like photo-
synthesis, nutrient assimilation, and other enzyme reactions with relations such
as hyperbolic, exponential, and sigmoid functions. In addition to SAV leaf
photosynthesis, the model includes biotic components for microflora epiphytes
and for isopod and amphipod grazers of both leaves and epiphytes. A concep-
tual diagram of this model is depicted in Figure 3.
Based on simulations for one, four, and ten years, the model indicates that
small changes in irradiance or temperature, or their combined interaction, result
in decreased plant productivity and the eventual loss of the eelgrass community.
Moreover, the model shows that epiphyte colonization, losses owing to grazing,
and other factors controlling the epiphytic community affect long-term eelgrass
survival.
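The light and photosynthesis relationships underlying such models can be sketched simply: Beer-Lambert attenuation of PAR through the water column and the epiphyte layer, followed by a hyperbolic photosynthesis-irradiance response. The coefficients below are assumed for illustration and are not taken from the Wetzel model.

```python
# Sketch of two relationships central to the seagrass models described above:
# exponential attenuation of PAR and a hyperbolic photosynthesis-irradiance
# response for the leaves. Coefficients are illustrative placeholders.
import math

def par_at_leaf(surface_par, depth_m, kd_per_m, epiphyte_biomass, k_epi):
    """Attenuate surface PAR by the water column (Beer-Lambert) and epiphytes."""
    water_term = math.exp(-kd_per_m * depth_m)
    epiphyte_term = math.exp(-k_epi * epiphyte_biomass)
    return surface_par * water_term * epiphyte_term

def leaf_photosynthesis(par, p_max=5.0, k_i=150.0):
    """Hyperbolic (Michaelis-Menten-like) response of photosynthesis to light."""
    return p_max * par / (k_i + par)

if __name__ == "__main__":
    par = par_at_leaf(surface_par=1200.0, depth_m=1.5, kd_per_m=1.2,
                      epiphyte_biomass=2.0, k_epi=0.15)
    print(round(par, 1), round(leaf_photosynthesis(par), 2))
```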
Figure 3. These models simulate photosynthesis and plant growth. Epiphytes and particulate and dissolved organic matter limit light availability and control growth of the seagrasses.
[Figure 3 diagram: compartments include eelgrass leaves (X03) and roots-rhizomes (X04).]
Salt marsh production
Daily tidal cycles give salt marsh ecosystems unique spatial and temporal variabil-
ity. Bidirectional tidal movements can both deposit organic material in the marsh
and/or transport carbon in an outwelling flux to surrounding coastal areas. Dick
Wiegert reviewed salt marsh models and showed how they can simulate different
processes in a coastal Georgia salt marsh. His model was initially designed to test
hypotheses about (1) the net flow of carbon into and out of salt marsh systems, and
(2) which biological components are most important in influencing this process. The
model was also constructed to identify significant information gaps in order to
improve predictive accuracy by guiding research priorities.
Wiegert's model describes a marsh environment composed of both the marsh
proper, with its resident organisms, and the water-borne tidal system, which carries
migrant organisms and materials into and out of the marsh. The 23 compartments of
the model are made up of 15 biotic and 8 abiotic components, including Spartina
roots and shoots, decaying plant material, dissolved organic carbon (DOC), particu-
late organic carbon (POC), aerobic and anaerobic microbes, benthic algae, phy-
toplankton, zooplankton, benthic infauna, meiofauna, filter feeders, particle feeders,
and top carnivores.
Within this ecological framework, model simulations can be used like experi-
ments to determine how different processes and mechanisms affect each other. In
one series of simulations, Wiegert changed values associated with:
• migration of particle feeders and top predators (fish, shrimp, and crabs) into
and out of the marsh;
• net primary productivity differences between high marsh and creek bank
vegetation; and
• threshold limits controlling rates of anaerobic and aerobic microbial degra-
dation.
Because of the complexities of the interrelationships of the components, some
results generated by these manipulations were counterintuitive. For instance,
simulations that doubled the number of migrants moving into and out of the salt
marsh produced results similar to those that eliminated the migrants: a decrease
in aerobic bacteria, benthic infauna, and meiofauna.
Wiegert's model demonstrated that important whole system changes can result
from relatively small changes in one of the compartments, particularly the biotic
components. One of the key findings was that feedback mechanisms, which
control rates of aerobic microbial degradation, can determine whether the marsh
impounds or exports carbon, based on small variations in the standing stock of
microbes. The model indicated that this is one of the most important processes in
determining the spatial and temporal variation in salt marshes. It also presents
the possibility that relatively small perturbations could significantly alter a
seemingly stable system.
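A generic, two-compartment sketch of this kind of threshold-controlled carbon flow appears below. The compartments, rates, and threshold values are invented for illustration; the published Georgia marsh model has 23 compartments and far more detailed controls.

```python
# Generic sketch of a donor-controlled carbon flow between two compartments
# (POC and aerobic microbes) regulated by a simple threshold feedback, in the
# spirit of the feedback controls described above. All values are assumed.

def threshold_control(stock, lower, upper):
    """Scale a flow from 1 (stock at or below lower) to 0 (at or above upper)."""
    if stock <= lower:
        return 1.0
    if stock >= upper:
        return 0.0
    return (upper - stock) / (upper - lower)

def step(poc, microbes, dt=1.0, uptake_rate=0.05, respiration=0.02,
         microbe_lower=5.0, microbe_upper=50.0):
    """One daily time step of carbon transfer (arbitrary units of g C per m2)."""
    control = threshold_control(microbes, microbe_lower, microbe_upper)
    degradation = uptake_rate * poc * control          # donor-controlled flow
    poc += (-degradation) * dt
    microbes += (degradation - respiration * microbes) * dt
    return poc, microbes

if __name__ == "__main__":
    poc, microbes = 200.0, 10.0
    for _ in range(30):
        poc, microbes = step(poc, microbes)
    print(round(poc, 1), round(microbes, 1))
```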
Water quality models

Dominic DiToro reviewed the development of water quality models and described several applications of these models to lakes and estuaries. Summarizing the evolution of water quality models, he outlined important developments in physical transport phenomena, nutrient loading estimations, and chemical/biological kinetics.
Essentially, water quality models predict chemical and biological responses to
dynamic spatial distributions of nutrients, light, temperature, and organic
material. Current water quality models rely on sophisticated mass balance
equations to model physical transport phenomena and to evaluate chemical/
biological kinetics in each segment of the water column. Under the mass balance
assumptions, closed cycles account for all materials in the system as they move
through water and sediment.
These finite-segment, mass balance models use advective and dispersive
transport phenomena to simulate the flow of materials to and from adjoining
segments. Specifically, the hydrodynamic component of the model predicts
water velocity, diffusion, surface elevation, salinity, and temperature on an
intratidal, five-minute scale. It formulates turbulence closure for vertical disper-
sion and thus replaces the Pritchard kinetics and salt-dispersion coefficients used
in earlier efforts.
Nonlinear differential equations simulate the exchange of materials among
various components (state variables) in each segment in the water column. The
Chesapeake Bay water quality model contains 22 state variables (See Table 1) and
4029 segments, or cells. (See Figure 4)
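The mass balance bookkeeping can be illustrated with a one-dimensional, three-segment sketch in which each cell exchanges material with its neighbors by advection and dispersion and loses material through a first-order kinetic term. The geometry, flows, and decay rate below are hypothetical; the actual model solves this balance for 22 state variables in 4,029 cells.

```python
# Minimal sketch of a finite-segment mass balance: each cell exchanges material
# with its neighbors by advection and dispersion, and a first-order loss term
# stands in for the chemical/biological kinetics. Values are illustrative.

N_SEG = 3
VOLUME = [1.0e6] * N_SEG          # m^3 per segment (assumed)
FLOW = 50.0                       # m^3/s advective flow from segment i to i+1
EXCHANGE = 20.0                   # m^3/s bulk dispersive exchange between neighbors
DECAY = 1.0e-6                    # 1/s first-order loss (placeholder kinetics)

def step(conc, boundary_in, dt=600.0):
    """Advance concentrations (g/m^3) one time step of dt seconds."""
    new = conc[:]
    for i in range(N_SEG):
        upstream = boundary_in if i == 0 else conc[i - 1]
        advection = FLOW * (upstream - conc[i])
        dispersion = 0.0
        if i > 0:
            dispersion += EXCHANGE * (conc[i - 1] - conc[i])
        if i < N_SEG - 1:
            dispersion += EXCHANGE * (conc[i + 1] - conc[i])
        kinetics = -DECAY * conc[i] * VOLUME[i]
        new[i] = conc[i] + dt * (advection + dispersion + kinetics) / VOLUME[i]
    return new

if __name__ == "__main__":
    conc = [0.0, 0.0, 0.0]
    for _ in range(1000):
        conc = step(conc, boundary_in=5.0)
    print([round(c, 3) for c in conc])
```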
Inputs of nutrients and other materials come from external sources, including
point and nonpoint source loadings, and from internal sources, especially the
sediments. Current models include sediment fluxes in mass balance calculations
and consequently close the nutrient cycle within the modeled system. Sediment
loadings are driven by the settling of particulate organic matter from the phy-
toplankton community.
Phytoplankton taxa include two functional groups—winter diatoms and a
summer group of green algae and cyanobacteria —with different nutrient
requirements and metabolic processes. Under model assumptions, phytoplank-
ton are considered to be particles of carbon, nitrogen, phosphorus, and silica;
Table 1. The Chesa-
peake Bay Program
model uses 22 state
variables to simulate
changes in water
quality.
Chesapeake Bay Water Quality Model
State Variables
Temperature
Salinity
Diatoms
Green algae
Cyanobacteria
Dissolved organic carbon
Labile particulate organic carbon
Refractory particulate organic carbon
Ammonium
Nitrate-nitrite
Dissolved organic nitrogen
Labile particulate organic nitrogen
Refractory particulate organic nitrogen
Total phosphate
Dissolved organic phosphate
Labile particulate organic phosphorus
Refractory particulate organic phosphorus
Particulate biogenic silica
Available silica
Total active metal
Chemical oxygen demand
Dissolved oxygen
these materials circulate through completely closed cycles in the water column, in the
sediments, and in phytoplankton biomass. Moreover, these materials are categorized
as either labile or refractory components and as either particulate or dissolved
matter. Such distinctions allow the model to quantify specific aspects of phytoplank-
ton-nutrient kinetics and to predict the status of state variables in each water column
segment.
The status, or amount, of state variables depends on the flow of materials through
the closed system; process formulations control system flow and determine the
importance of sources and sinks for each variable. For example, algae production
(source) depends on available nutrients, temperature, and light; while algal loss
(sink) results from settling, basal metabolism, predation, and benthic grazing.
Oxygen production (source) results from photosynthesis and reaeration; while
oxygen loss (sink) is due to algal respiration, nitrification, and the oxidation of
organic carbon.
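For a single segment and a single state variable, that source and sink accounting might be sketched as follows. The stoichiometric ratios and reaeration coefficient are generic textbook-style values, not the Chesapeake Bay model's coefficients.

```python
# Sketch of source/sink bookkeeping for one state variable (dissolved oxygen)
# in a single segment, following the accounting described above. The rates and
# stoichiometric constants are rough placeholders.

def d_oxygen_dt(algal_growth, algal_respiration, nitrification_rate,
                doc_oxidation, do_conc, do_saturation, k_reaeration=0.3):
    """Net rate of change of dissolved oxygen (g O2 per m3 per day)."""
    photosynthesis = 2.67 * algal_growth          # approx. g O2 per g C fixed
    reaeration = k_reaeration * (do_saturation - do_conc)
    respiration = 2.67 * algal_respiration
    nitrification = 4.57 * nitrification_rate     # approx. g O2 per g NH4-N nitrified
    oxidation = 2.67 * doc_oxidation
    return photosynthesis + reaeration - respiration - nitrification - oxidation

if __name__ == "__main__":
    print(d_oxygen_dt(algal_growth=0.5, algal_respiration=0.2,
                      nitrification_rate=0.05, doc_oxidation=0.1,
                      do_conc=6.0, do_saturation=8.0))
```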
For management purposes, water quality models have often been used to predict
changes in dissolved oxygen concentration in response to changes in nutrient loading
to the system. However, estuaries present especially difficult modeling problems due
to complex intratidal transport processes and the variable phytoplankton-nutrient
kinetic interactions controlled by salinity gradients.
DiToro went on to review the predictive capability of other models and showed
how certain assumptions can modify model results. For example, a water quality
model in San Francisco Bay was used to predict the effects of agricultural runoff on
phytoplankton and dissolved oxygen in the estuary. Although the model correctly
predicted salt and freshwater flows, it did not consider the effects of freshwater
flushing on the benthic community, nor did it predict the significant effect of benthic
filter feeders on phytoplankton concentration. In other words, the model simulated
the physics correctly, but it missed important biological processes. Subsequent model
corrections were made to account for densities and filtration rates of benthic filter
feeders and their effect on water quality parameters.
In another example, DiToro explained how a Lake Erie water quality model
correctly predicted a required reduction in total phosphorus necessary to achieve a
4.0 mg/L level of dissolved oxygen. The Lake Erie model included a sediment
component that calculated internal sediment oxygen demand (SOD) based on
diagenesis mechanisms in anaerobic and aerobic sediments. In this example, the
model correctly determined allowable loads of phosphorus required to achieve a
specific dissolved oxygen goal.
Figure 4. The Bay
Program model is an
integrated compartment
box model; boxes corre-
spond to cells in three-
dimensional space. The
surface plan contains 729
cells, roughly 10 km by 5
km by 1.7m. The vertical
dimension contains two to
fifteen cells, for a total of 4,029 cells, or boxes.
As a final example, DiToro described the failure of the Potomac eutrophication
model (PEM) to predict a large Microcystis algal bloom in the summer of 1983.
Based on dynamic, nonlinear nutrient-phytoplankton interactions, the model
outlined the need for phosphorus reduction efforts. With reduction efforts well
underway, a major algal bloom occurred, which raised questions over the cause
and doubts about the model. However, scientists used the model to investigate
bloom conditions and performed sensitivity analysis to identify the source of
additional phosphorus loading to the system.
As a result of these investigations, scientists determined that increased algal
production raised pH in the water column, which subsequently led to a release of
sediment phosphorus, thereby further fueling the bloom. In this case, increased
algae, increased pH, and increased sediment flux of phosphorus present a
positive feedback to the nutrient-phytoplankton kinetic loop. The Potomac
estuary example shows how a failure to correctly portray natural processes in the
model formulation leads to erroneous predictions.
Spatially explicit fish bioenergetic models

Figure 5. Bioenergetic model output. Map of water temperature (top panel), fish biomass density (middle panel), and growth rate potential (bottom panel) of a 1.9-kg striped bass across a 7.5-km section of the mesohaline, middle portion of the Chesapeake Bay at night on 2 May 1990. Maximum bottom depth is 34 m. Scanned from color graphics. Color representations from blue to red are linear for water temperature (9 to 28°C) and fish growth rate (-0.004 to 0.019 g·g⁻¹·d⁻¹) but logarithmically spaced for fish biomass density (0 to 158 g·m⁻³). Graphic by Jiangang Luo.
Spatially explicit fish bioenergetic models determine volumetric maps of fish-
growth potential. They identify specific areas, or volumes of water, with favorable
prey populations and water temperature. In this context, bioenergetic models define
the habitats with the highest potential for fish growth. Moreover, they predict fish
growth based on consumption of prey species, and they attempt to quantify the
functional responses of fish to their physical and biological environment.
Steve Brandt presented an example of such a spatial bioenergetic model in this
workshop. The model, which describes estuarine populations of striped bass,
divides the water column into horizontal and vertical grids and estimates prey size
and densities and water temperature in each cell. Prey densities (e.g., bay anchovies)
are measured with acoustic sampling procedures, while temperature is measured by
field observations. A foraging submodel translates prey size and density data into
prey availability and consumption by predators. Physiological growth equations
relate prey consumption to striped bass fish production. By integrating predator fish
movement, or behavior/migration, through the cells and by adding potential growth
rates for all the cells, modelers can obtain a bioenergetic estimate for total system
predator fish production. Figure 5 contains an example of striped bass bioenergetic
model output.
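The cell-by-cell growth rate potential calculation can be sketched as consumption from a saturating response to prey density, scaled by temperature, minus temperature-dependent metabolic costs. The functional forms and parameters below are generic bioenergetics placeholders, not those of the striped bass model.

```python
# Sketch of a cell-by-cell growth rate potential calculation: consumption from
# prey density via a saturating functional response, minus temperature-dependent
# respiration. Forms and parameters are illustrative only.
import math

def consumption(prey_density, temp_c, c_max=0.2, k_prey=5.0, t_opt=20.0, sigma=6.0):
    """Specific consumption (g prey per g fish per day)."""
    prey_term = prey_density / (k_prey + prey_density)          # saturating response
    temp_term = math.exp(-((temp_c - t_opt) ** 2) / (2 * sigma ** 2))
    return c_max * prey_term * temp_term

def growth_rate_potential(prey_density, temp_c, assimilation=0.7, r0=0.01, q10=2.0):
    """Specific growth (g per g per day) a fish could achieve in this cell."""
    respiration = r0 * q10 ** ((temp_c - 10.0) / 10.0)
    return assimilation * consumption(prey_density, temp_c) - respiration

if __name__ == "__main__":
    # One (prey density g/m3, temperature deg C) pair per spatial cell.
    cells = [(2.0, 15.0), (8.0, 19.0), (1.0, 26.0)]
    print([round(growth_rate_potential(p, t), 4) for p, t in cells])
```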
Individual-based fishery management models

Individual-based fishery models (IBFMs) describe population dynamics based on
the characteristics of the individual fish. The IBFM uses physiological and behavioral
attributes of the individual to ascertain day-to-day survival and growth through
various life stages. Such a model, therefore, uses a reductionist approach to infer
population status by tracking the characteristics of individual survivors.
IBFMs are appealing because they are conceptually simple and because they
account for the many stochastic events, density-dependent factors, and nonlinear
processes that control fish populations in particular and many other biological
systems in general. For example, such models can predict growth based on prey
consumption and estimate mortality based on starvation, predation, and tem-
perature fluctuations. To model such random and episodic phenomena, IBFMs
use Monte Carlo simulation to distribute events according to relevant stochastic
probability functions.
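A minimal sketch of that Monte Carlo logic follows: each individual grows and faces stochastic daily mortality, and the fate of the cohort is read from the survivors. The distributions and rates below are invented for illustration.

```python
# Sketch of an individual-based Monte Carlo approach: grow each individual and
# expose it to stochastic daily mortality; the survivors define the cohort.
import random

def simulate_cohort(n_individuals=1000, n_days=90, seed=1):
    rng = random.Random(seed)
    survivors = []
    for _ in range(n_individuals):
        length_mm = rng.gauss(6.0, 0.5)              # hatch length, assumed
        alive = True
        for _ in range(n_days):
            length_mm += max(0.0, rng.gauss(0.3, 0.1))   # daily growth, assumed
            # Larger individuals face lower daily predation risk (assumed form).
            daily_mortality = 0.08 * (6.0 / length_mm)
            if rng.random() < daily_mortality:
                alive = False
                break
        if alive:
            survivors.append(length_mm)
    return survivors

if __name__ == "__main__":
    survivors = simulate_cohort()
    print(len(survivors), round(sum(survivors) / max(len(survivors), 1), 1))
```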
In this way, IBFMs provide a mechanism for modeling the intense variability in
fish recruitment, growth, and reproductive success. Because they are based on
the rules of individual survival, IBFMs account for the genotypic fact that the
survivor is not the average, but rather the exceptional individual.
In this workshop, Ken Rose described an IBFM for Potomac River striped bass.
The IBFM uses population data for length, weight, and age, along with data for
zooplankton, and white perch as a food competitor. By varying the combinations
of factors, this model was able to estimate how each factor and the interaction
among factors control the variability in each yearclass.
Despite their heuristic advantages, IBFMs are data-hungry; they require
comprehensive data on the recruitment, growth, and survival of individual
species. Moreover, IBFMs ignore foodweb interactions and do not account for
spatial heterogeneity. Nonetheless, IBFMs are important because of their
straightforward approach to modeling recruitment variability.
Ecosystem regression models

Avoiding the mechanistic details of ecosystem processes, ecosystem regression
models (ERMs) use regression analysis to identify strong relationships in sys-
tems. ERMs typically correlate a chosen environmental forcing function, such as
nutrient loading or river flow, with a selected ecosystem response, such as
primary production or fishery yield. Based on these regressions, modelers can
identify patterns in the whole system and test hypotheses based on various
cause-and-effect assumptions. Because ERMs evaluate relationships without
specific mechanistic details, they are holistic rather than reductionist; because
they correlate large-scale trends rather than facilitate analysis of processes, they
are empirical rather than analytical. Walter Boynton described several ERMs, an
example of which can be found in Figure 6.
ERMs provide ecologists tools to establish and measure basic rules of thumb
for estuarine response to a given causative agent or predictive variable. For
example, although we can correlate loading with eutrophication, we know little
about how individual estuary morphology and hydrology affect the status and
responses of areas within an estuary or different types of estuaries. ERMs help
ecologists answer questions about the relative status of particular estuaries,
including:
• What kinds of estuaries are most susceptible to eutrophication?
• What regions are most vulnerable to the effects of excess nitrogen or phosphorus nutrients?
• What level of nutrient decrease is necessary to achieve restored conditions?
The availability of the many good databases developed to assess and monitor
water quality provides an important advantage for the ERM approach. When
scaled appropriately, these data make it possible to compare processes and
trends in a large number of estuaries.
Some of the disadvantages of ERMs stem from problems associated with time
lags, averaging errors, and spatial shifts that occur in large-scale systems. None-
theless, numerous ERM applications have been successful.
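The mechanics of an ERM can be shown with a simple log-log regression of a response variable against a forcing variable across many systems. The data points below are fabricated solely to demonstrate the fitting and prediction steps.

```python
# Sketch of the ERM approach: fit a log-log regression of an ecosystem response
# (here, annual primary production) against a forcing function (here, nitrogen
# loading) across many estuaries. The data are fabricated for illustration.
import math

# (nitrogen load, primary production) pairs in arbitrary units.
data = [(10, 120), (25, 210), (60, 390), (150, 620), (400, 1050), (900, 1500)]

def loglog_regression(pairs):
    """Ordinary least squares on log10-transformed x and y."""
    xs = [math.log10(x) for x, _ in pairs]
    ys = [math.log10(y) for _, y in pairs]
    n = len(pairs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) /
             sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return slope, intercept

if __name__ == "__main__":
    slope, intercept = loglog_regression(data)
    # Predict the response of a new system from its loading (power-law form).
    predicted = 10 ** (intercept + slope * math.log10(300))
    print(round(slope, 2), round(intercept, 2), round(predicted))
```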
Ecosystem network analysis models

Ecosystem network analysis models allow investigators to examine the indirect
connections between species in an ecosystem. These models use linear algebra
techniques to assess the interactions between species, or components, not in
direct trophic communication. Network analysis was developed by Robert
Ulanowicz and his colleagues as a tool to improve the predictive power of
complex ecosystem process models and to analyze indirect ecosystem effects in
nonlinear trophic interactions.
Figure 6. Ecosystem regression model (limnological model): phosphorus load versus chlorophyll-a concentration.
[Log-log plot of chlorophyll a concentration against (Lp/qs)/(1 + √(z/qs)), with oligotrophic, mesotrophic, and eutrophic regions indicated.]
Lp = phosphorus loading rate, mg·m⁻²·yr⁻¹
qs = hydraulic load = z/τw
z = mean depth, m
τw = V/Qv, years
V = volume, m³
Qv = freshwater input rate, m³·yr⁻¹
Network analysis requires data on the connections and magnitudes of all trophic
transfers occurring within an ecosystem. Such data rarely come from a single study;
rather, they are collected from various studies of the relevant ecosystem. Using
estimation techniques, trophic interactions can be assembled and averaged to create a
snapshot of ecosystem function over some convenient time interval. For example,
Figure 7 shows the average annual transfers of carbon (mg/yr) moving through 36
major components of the Chesapeake Bay ecosystem from 1984 to 1985.
In network analysis, matrix manipulation techniques can be used to quantify
trophic interactions and to characterize features of an ecosystem. Based on such
techniques, investigators can:
• identify indirect feeding relationships
• determine trophic position and status
• calculate trophic efficiencies
• elucidate material and energy pathways
• determine measures of ecosystem health
Consider some examples. By examining material flow through several species,
investigators can identify feeding relationships between species not in direct preda-
tor-prey relationships. For example, although striped bass do not directly consume
microzooplankton, analyses reveal that over 25 percent of the carbon in the striped
bass diet once passed through microzooplankton biomass.
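The indirect-diet calculation rests on standard input-output algebra: normalize the flows by each consumer's total intake and accumulate contributions over pathways of every length (equivalent to inverting I minus the diet matrix). The three-compartment web below is hypothetical and far smaller than the 36-compartment Chesapeake network.

```python
# Sketch of indirect diet contributions via network analysis: G[i][j] is the
# direct fraction of consumer j's intake coming from compartment i; summing
# I + G + G^2 + ... accumulates all indirect pathways. Toy web, invented values.

# Compartments: 0 = microzooplankton, 1 = planktivorous fish, 2 = striped bass.
G = [
    [0.0, 0.6, 0.0],   # microzooplankton feed the fish directly, not striped bass
    [0.0, 0.0, 0.8],   # planktivorous fish feed striped bass
    [0.0, 0.0, 0.0],
]
N = len(G)

def mat_mul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(N)) for j in range(N)]
            for i in range(N)]

def total_dependency(g, terms=20):
    """Accumulate direct plus indirect contributions over pathways of all lengths."""
    total = [[1.0 if i == j else 0.0 for j in range(N)] for i in range(N)]  # identity
    power = [row[:] for row in g]
    for _ in range(terms):
        total = [[total[i][j] + power[i][j] for j in range(N)] for i in range(N)]
        power = mat_mul(power, g)
    return total

if __name__ == "__main__":
    dep = total_dependency(G)
    # Fraction of striped bass diet that passed through microzooplankton
    # in this toy web: 0.6 * 0.8 = 0.48.
    print(round(dep[0][2], 2))
```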
In the same way, network analysis can trace the number of trophic interactions
food undergoes on the way to each species. This allows the investigator to determine
the trophic status of each species and the average trophic position of the entire
system (the so-called "average-path length"). This latter averaged value can serve as
a useful indicator of system response to applied stress. In one such analysis,
Ulanowicz used information theory to compare indices of the Chesapeake Bay
network to corresponding values in the Baltic Sea in order to assess the metaphoric
"health" of the two systems. He showed that the Bay faces more intensive stress than
its high-latitude counterpart.
Moreover, analysis techniques can be used to map elements of a complicated
food web into a linear series of transfers in order to calculate the trophic efficien-
cies associated with transitions at each consecutive level. Analyses show that
drops in trophic efficiencies, as may occur under community perturbations, often
result in dramatic decreases in the amounts of material and energy reaching
higher levels.
Algorithms also can be used to enumerate and quantify recycling pathways for
the circulation of energy and materials within the ecosystem. When applied to
the Chesapeake Bay ecosystem, network analysis reveals a bipartite carbon
cycling structure. In this structure, carbon recycling pathways in the planktonic
section of the ecosystem do not overlap with the more substantial recycle path-
ways between the benthic and nektonic sections. Only the filter feeding species
were common to both domains. This analysis revealed for the first time the
ecological role filter feeding species (oysters, soft clams, menhaden, and ale-
wives) play in the overall carbon processing in the ecosystem.
Ecological network models offer a powerful analytical tool. They are not,
however, designed to predict or examine causal relationships. Although net-
work models do not depict dynamic ecological relationships, they can portray
sequential changes in ecological relationships over time.
Figure 7. Network analysis models simulate the trophic transfers occurring in the ecosystem. This figure shows the average annual carbon transfers among 36 major components of the Chesapeake Bay ecosystem.
Landscape spatial models
Most ecological models simulate processes occurring at a point in space and
extrapolate findings to the entire landscape by assuming the environment is homoge-
neous. Such models provide little or no spatial articulation. However, landscape
models incorporate space as well as time to a level of resolution that is meaningful to
a given management question.
In general, landscape models can be used to predict the temporal evolution of
landscapes and to quantitatively describe landscape phenomena. More specifically,
these models can:
• map flows of energy, matter, and information in space
• designate source, sink, and receptor areas
• predict succession in two and three-dimensional space
• determine cumulative use thresholds for anthropogenic substances
• address questions of scale
Process-based landscape models simulate spatial structure by compartmentalizing
the landscape into a geometric design and then describing the flows between and
within compartments. Specific algorithms determine spatially explicit processes
controlling material and energy flow, including inputs from outside the system,
transfers within the system, outputs of useful material, and energy dissipation.
Bob Costanza outlined process-based landscape models and described recent
efforts to model landscape dynamics in a Louisiana marsh complex and the Patuxent
River watershed. He stressed the spatial characteristics of landscape models and
showed how dimensions of scale affect interpretation of model results.
For example, the Coastal Ecological Landscape Spatial Simulation (CELSS) model
is a process-based simulation of the Atchafalaya marsh area in southern Louisiana.
(See Figure 8.) It compartmentalizes the marsh complex into 2479 interconnected
cells, each representing 1 square kilometer. Under the control of connectivity param-
eters, the model simulates the movement of water, salts, nitrogen, and organic and
inorganic sediments between adjacent cells. Water movement from one cell to
another is modeled as a function of water storage and connectivity, which in turn
depend on various landscape characteristics including habitat type, drainage density,
waterway orientation, and levee height.
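The cell-to-cell exchange idea can be sketched on a small grid in which water moves down the storage gradient at a rate set by a connectivity parameter. The grid, storages, and connectivity value below are invented; the CELSS model applies far more elaborate, habitat-dependent rules.

```python
# Sketch of connectivity-controlled exchange between adjacent landscape cells,
# in the spirit of the CELSS formulation described above. All values are assumed.

ROWS, COLS = 3, 3
water = [[10.0, 8.0, 6.0],
         [9.0, 7.0, 5.0],
         [8.0, 6.0, 4.0]]            # water storage per cell (arbitrary units)
connectivity = 0.1                   # fraction of the gradient exchanged per step

def neighbors(r, c):
    for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
        nr, nc = r + dr, c + dc
        if 0 <= nr < ROWS and 0 <= nc < COLS:
            yield nr, nc

def step(grid):
    """Symmetric diffusive exchange down the storage gradient; water is conserved."""
    flux = [[0.0] * COLS for _ in range(ROWS)]
    for r in range(ROWS):
        for c in range(COLS):
            for nr, nc in neighbors(r, c):
                flux[r][c] += 0.5 * connectivity * (grid[nr][nc] - grid[r][c])
    return [[grid[r][c] + flux[r][c] for c in range(COLS)] for r in range(ROWS)]

if __name__ == "__main__":
    state = water
    for _ in range(50):
        state = step(state)
    print([[round(v, 2) for v in row] for row in state])
```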
Each cell in the CELSS model contains a dynamic, nonlinear ecosystem simulation
model with eight state variables. Based on parameter values for these variables, each
cell is defined as one of six marsh habitat types:
• Fresh marsh
• Brackish marsh
• Salt marsh
• Swamp forest
• Upland
• Open water
Habitat succession occurs when environmental variables fall outside the range of
values specified for a designated habitat type. For example, changes in elevation,
salinity, water level, and primary productivity could convert a cell habitat from fresh
to brackish marsh. By evaluating the transition of ecosystem types due to natural and
anthropogenic changes, managers can use these spatial, process-oriented models to
predict the impacts of events such as sea-level rise or levee and canal construction in
a coastal area.
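A habitat-switching rule of this kind can be sketched as a lookup of environmental ranges per habitat type. The two types, variables, and ranges below are invented and much simpler than the six-type CELSS classification.

```python
# Sketch of habitat succession as a rule: a cell keeps its habitat type while its
# environmental variables stay within range, and is reassigned otherwise.
# Types, variables, and ranges are hypothetical.

HABITAT_RULES = {
    # habitat type: (salinity range in ppt, water level range in m)
    "fresh marsh":    ((0.0, 3.0), (0.0, 0.5)),
    "brackish marsh": ((3.0, 15.0), (0.0, 0.8)),
}

def classify(salinity, water_level, current="fresh marsh"):
    """Keep the current type while conditions allow it; otherwise reassign."""
    (s_lo, s_hi), (w_lo, w_hi) = HABITAT_RULES[current]
    if s_lo <= salinity <= s_hi and w_lo <= water_level <= w_hi:
        return current
    for habitat, ((s_lo, s_hi), (w_lo, w_hi)) in HABITAT_RULES.items():
        if s_lo <= salinity <= s_hi and w_lo <= water_level <= w_hi:
            return habitat
    return "open water"   # fallback when no vegetated type fits (assumed)

if __name__ == "__main__":
    print(classify(salinity=1.5, water_level=0.2))   # stays fresh marsh
    print(classify(salinity=6.0, water_level=0.3))   # converts to brackish marsh
    print(classify(salinity=25.0, water_level=1.5))  # falls through to open water
```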
Dr. Costanza also described a Patuxent landscape model that was designed to
estimate nutrient and sediment loads resulting from various land use practices in the
Patuxent watershed. Containing approximately 6,000 spatial cells, 10 state variables,
and 22 land uses, the model predicts the effect of natural vegetation associations
(wetlands, riparian forests, grassed buffer strips, etc.) on nutrient and sediment
loadings to the aquatic environment. It also simulates the effects of vegetative
growth, buffers, and retention ponds on rates of soil erosion and nutrient retention.
Moreover, the terrestrial Patuxent landscape model is designed to link directly with a
Patuxent estuarine ecosystem model.
Early versions of landscape models ran on supercomputers due to the inten-
sive data requirements for modeling dynamic, spatial processes and to the many
simulations required for parameter estimation. For example, a typical run of the
CELSS model simulates 22 years of landscape dynamics based on weekly time
steps for 2479 cells, each with eight state variables, for a total of 19,832 simultaneous
difference equations. Although initial simulations required 15 minutes of Cray
X/MP supercomputer time, recent software and hardware developments now
allow these simulations to run on Macintosh II computers using parallel proces-
sors.
Linkage and Coupling
Toward an Integrated Modeling Framework

Overview
Section 3 characterizes conceptual issues to address before building an integrated
modeling framework. It describes important linkage problems inherent in coupling
hydrodynamic and water quality models to various ecosystem and fishery models.
In considering linkage problems, the workshop participants discussed specific
issues for key components of an integrated modeling strategy, including:
Conceptual issues
Physical process issues
Water quality model issues
Submerged aquatic vegetation model issues
Ecosystem feedback and control issues
Fishery management and recruitment issues
Conceptual issues

Clearly, there are difficulties inherent in trying to link models that range from
hydrodynamic models to traditional fishery management models. The general
problems include the choice(s) of appropriate time and space scales. The more
specific problems revolve around the difficulties of linking water quality models to
biological phenomena often characterized by discontinuities, thresholds, rare events,
behavior, recruitment success, and counterintuitive feedback mechanisms. Other
conceptual linkage problems arise in trying to incorporate feedback mechanisms
operating between water quality models and foodweb processes.
Most participants agreed that modeling fishery recruitment—the fish that survive
early life history stages to enter into the adult population—was the greatest obstacle
to incorporating fishery models into an integrated ecological framework. Moreover,
there are scale problems to consider in linking diverse processes at various levels of
aggregation in time, space, and biological detail. The participants agreed that in
choosing appropriate scales, modelers should not necessarily copy scales used for the
water quality models. Scaling problems are treated in more detail in Section 4.
Some of the linkage problems have been addressed by making broad assumptions
about the relevance of using one model variable as a simple surrogate for processes
in other models. For example, dissolved oxygen concentrations are often used as an
indicator of habitat conditions. Although this loosely links the water quality model to
fish populations, it fails to account for the highly variable and complex factors
controlling fish behavior, recruitment success, and trophic interactions.
Despite the importance of the linkage issue, most participants agreed that linkage
problems should not be overemphasized at this early stage of model development,
simply because so much basic biology remains unknown. The group agreed that to
make ecosystem models more accurate, research was needed to provide additional
biological details, especially at higher trophic levels. In this context, the workshop
participants emphasized two important roles of modeling: (1) to identify gaps in our
understanding of estuarine ecology, and (2) to integrate and synthesize diverse
research results.
Physical process issues

Most participants agreed that current physical process models can describe mate-
rial transport and hydrodynamic phenomena to an acceptable degree of precision
and accuracy. Recent advances have been particularly exciting and useful. Methods
are currently being developed to adjust detailed hydrodynamic model output to time
and space scales appropriate for specific ecosystem process models. Appropriate
time-space scaling of these models will make them useful to a diversity of modeling
efforts.
The participants raised the need for finer resolution, or segmentation, of
current Chesapeake Bay hydrodynamic models for shallow estuarine areas, such
as SAV habitat. Such an improvement would provide a link to the important
ecological processes and living resources along shorelines and in shallow areas.
Output from physical/chemical models also provides important habitat data
for fishery bioenergetic models. Because output variables such as dissolved
oxygen and temperature directly link to such bioenergetic models, these data
may be enhanced to suit the specific needs of a model of higher trophic organ-
isms. For example, dissolved oxygen concentration and temperature fields
predicted from water quality models provide limits on striped bass habitat and
control fish growth and production. In this way, the development of a spatially
articulated database of physical parameters allows better modeling of fish
populations.
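In practice, such a linkage can be as simple as masking the predicted temperature and dissolved oxygen fields with habitat criteria and summing the volume that remains suitable. The grid values and thresholds used below (dissolved oxygen above 3 mg/L, temperature below 27 degrees C) are illustrative assumptions, not adopted Bay Program criteria.

```python
# Sketch of using water quality model output as fish habitat limits: mask a grid
# of predicted temperature and dissolved oxygen with threshold criteria and sum
# the volume of cells that remain suitable. Values and criteria are illustrative.

CELL_VOLUME_M3 = 1.0e7   # assumed volume represented by each model cell

# (temperature deg C, dissolved oxygen mg/L) for a handful of cells.
cells = [(22.0, 6.5), (26.0, 4.2), (28.5, 5.0), (24.0, 1.8), (25.5, 3.4)]

def suitable(temp_c, do_mg_l, do_min=3.0, temp_max=27.0):
    return do_mg_l > do_min and temp_c < temp_max

def habitat_volume(cell_fields):
    return sum(CELL_VOLUME_M3 for t, do in cell_fields if suitable(t, do))

if __name__ == "__main__":
    vol = habitat_volume(cells)
    print(vol, f"{vol / (len(cells) * CELL_VOLUME_M3):.0%} of total volume")
```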
Water quality model issues

The Chesapeake Bay Program has developed a three-dimensional, time-
variable water quality model that describes the concentration of nutrients and
dissolved oxygen throughout the main Bay. As described in Section 2, the Bay
model is based on physical transport, closed nutrient cycles, mass balance, and a
distinction between summer and winter algae. The participants endorsed the
Bay Program water quality model as a robust technology for predicting dis-
solved oxygen and nutrient concentrations in the Bay mainstem. The hydrody-
namics have been sufficiently resolved to describe most transport and distribu-
tion phenomena relevant to other ecosystem processes.
To be useful as a tool for managing living resources, water quality models
must generate biologically useful output. Models should be able to generate
useful chemical and physical information corresponding to the habitat require-
ments for key species. For example, submerged aquatic vegetation (SAV) habitat
requirements are described based on water quality parameters, such as light
attenuation and the distribution of total suspended solids, chlorophyll a, dis-
solved inorganic nitrogen, and dissolved inorganic phosphorous.
To meet these habitat requirements and to provide better links to living
resources, it was recommended that time steps for water quality model output
should match the averaging periods used to define habitat parameters. Partici-
pants also recommended the development of finer scales for temporal and spatial
distribution of water quality output.
The Chesapeake Bay time-variable water quality model currently does not
receive feedback input from biological processes occurring in or at the borders of
the water column; in reality, however, zooplankton, benthos, SAV, filter feeding
fish, and many other organisms can affect water quality. These organisms may
graze phytoplankton, shift carbon flow, alter microbial processes, or in other
ways indirectly control aspects of water quality.
The workshop participants endorsed incorporating ecological feedback models
into the water quality model. The first step in such an effort is to carefully choose
appropriate organisms to model and integrate them into the water quality
model; in this context, the group recommended three important criteria:
• Sensitivity—organisms and/or processes should be suitably responsive to environmental changes and/or capable of altering environmental conditions
• Trophic importance—organisms should be important in the Bay's foodweb
• Data sufficiency—adequate data should exist for realistic simulation of a species in a model.
Based on these criteria, the participants agreed that zooplankton and benthos
should be the first groups of organisms to be integrated into the water quality
model. In both cases, there is a good monitoring database and both hold a key
trophic position between primary producers and large fish. Some participants
warned, however, that because of their relative instability in the environment,
zooplankton would be difficult to model.
Submerged aquatic vegetation model issues
Currently, SAV models provide the most direct link to water quality models.
Several of the habitat requirements for SAV, such as chlorophyll a, dissolved inor-
ganic nitrogen (DIN), and dissolved inorganic phosphorus (DIP), can be simulated
by the Chesapeake Bay Program (CBP) water quality model. However, the CBP
model does not explicitly extend to littoral areas; before linking to SAV models, the
CBP model must be able to simulate habitat requirement parameters in these shal-
low areas. The participants pointed out, however, that parameters for habitat re-
quirements are not species-specific, nor do they fully account for the enormous
temporal and spatial variations in SAV communities. SAV models will also require
better information about the grazer communities. Moreover, links between SAV and
water quality models will have to incorporate feedback mechanisms—the effects of
the grasses on the ambient water conditions.
Ecosystem feedback and control issues

Currently, the Chesapeake Bay Program water quality model incorporates linked
physical processes and can simulate the dynamics of chemical parameters relevant to
ecosystem models. However, the outputs from ecosystem models do not link into the
water quality model. To anticipate linkage problems, the participants discussed
interactions, controls, and feedbacks relevant to linking components.
Numerical models of marine processes have existed since the late 1930s, but not
until 1949 did researchers begin to incorporate phytoplankton and zooplankton
equations into a system of equations with feedback control. Current ecosystem
models attempt to quantify feedback interactions and incorporate controls to more
accurately describe actual ecological phenomena. The microcomputer, coupled with
the diagrammatic languages of Odum and Forrester, has enabled modelers to easily
simulate ecosystems by solving a system of nonlinear, differential equations.
Currently, several nonlinear ecological feedback mechanisms occurring in estuaries
have been studied and incorporated into ecosystem models, including:
Benthic suspension feeding
Seagrass nutrient assimilation
Seagrass sediment trapping
Grazing on seagrass epiphytes
Oxygen effects on denitrification/nutrient recycling
Benthic bioturbation effects on nutrient cycling
Incorporating diverse nonlinear ecological responses into an integrated framework
presents both a conceptual and a mathematical challenge. In the conceptual context,
control mechanisms and feedback interactions are difficult to determine and often
are counterintuitive to an initial understanding of the ecosystem. The piecemeal
nature of existing data makes it difficult to gain a comprehensive understanding of
the complex interactions of biological, physical, and geochemical processes that
govern the cycling of nutrients and organic matter in estuaries. Moreover, averaging
nonlinear responses across diverse time and space scales presents a difficult math-
ematical challenge. Averaged fine-scale phenomena may accumulate and propagate
through the system to make aggregated results inaccurate.
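A simple worked example, with invented numbers, illustrates the averaging problem. Suppose nutrient uptake follows a Monod function f(N) = N/(K + N) with K = 5. If concentrations at two nearby points are 1 and 9, the average of the local uptake rates is (f(1) + f(9))/2, roughly (0.17 + 0.64)/2 = 0.40; but uptake computed from the spatially averaged concentration is f(5) = 0.50, about 25 percent higher. Applying a nonlinear function to averaged inputs gives a different answer than averaging the function's output, and such discrepancies can accumulate as results are aggregated across a large model.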
To help clarify the influence of control processes, one can apply sensitivity analysis
techniques to ecological models to reveal indirect feedbacks that may seem unimpor-
tant but actually exert great control over ecosystem phenomena. When such tech-
niques identify processes previously thought to be unimportant or when the results
do not adequately follow observed field data, alternative hypotheses can be devel-
oped to account for the predicted behavior. In this context, models can be used to
guide new research or to refine monitoring programs. Also, they can be used to test
new hypotheses or to obtain additional information about controlling processes.
Fishery management and recruitment issues

Fishery models have been widely used by fishery managers since the 1950s to
generate estimates of maximum sustainable yield (MSY) and yield-per-recruit. However, fishery
models have been marked by numerous failures and deficiencies, and they have lost
credibility in recent years.
To some managers, linking the water quality model with fishery models
represents the ultimate challenge in building an integrated ecological modeling
framework. As one workshop participant pointed out, "We manage the Bay from
two points—nutrient input and fishery harvest—and we need models to explic-
itly address this strategy." Others commented that water quality models and
fishery models were at opposite ends of the spectrum; the best ecological model-
ing strategy would be to "meet in the middle."
Population models usually fail owing to their inability to predict recruitment
variability. According to some workshop participants, the problem of quantify-
ing the intense variability in recruitment success remains the single greatest
obstacle in linking ecological processes with traditional fishery management
models.
Recruitment success is often a random event based on biological phenomena,
such as fish behavior and predator-prey interactions, and on physical characteris-
tics, such as temperature and river flow. Modeling recruitment success is compli-
cated by episodic perturbations in the environment and the density-dependent
characteristics of many fish populations. Models work best with averages, but in
the context of population genetics, survival often depends not on the average,
but rather on the exceptional individual, because larval survival itself is a rare
event.
There are other problems with traditional fishery models. For example, fishery
management models based on surplus production and yield-per-recruit assume
a population in equilibrium with predictable or constant recruitment. In addi-
tion, the critical interactions of fish populations with other components of the
estuarine ecosystem are not included in traditional fish population models.
Typically, such factors as food abundance, habitat (e.g., refuge) availability,
disease, and predation losses are either ignored or implicit within empirical
coefficients used for natural mortality, growth, and fecundity.
To address these problems, current stock recruitment models attempt to
predict the number of progeny from the size of the spawning stock. But such
relationships are poorly understood, and recruitment models require long time-
series data on environmental parameters (e.g., temperature and river flow),
biological factors (e.g., mortality rates, fecundity, and growth rates), and fishing
characteristics (e.g., annual harvest and catch composition). Even with available
data, unpredictable natural and anthropogenic environmental changes remain as
obstacles to the reliable prediction of recruitment success.
For many species, management agencies address the recruitment problem by
using other quantification approaches, such as virtual population analysis,
juvenile indices, or stock recruitment models. But important questions regarding
fisheries modeling remain open to further consideration. One such question is
whether fish population models can be improved by linking them to other
models. For example, physical circulation models and water quality models
provide physical and chemical information that could be used to describe fish
habitat; ecosystem models also generate information about potential habitat,
including food availability and predation pressure. These physical, chemical, and
biological data may provide important links to improve fishery models.
The workshop participants discussed how ecological models might add new
insights to fishery models. Spatially explicit fish bioenergetic models describe the
potential for fish growth based on density of prey species and temperature.
Because they evaluate growth conditions, rather than actual populations, fish
bioenergetic models can be linked to ecosystem models of fish habitat. More-
over, they can be linked to the water quality model through data on temperature
and dissolved oxygen. Individual-based fishery models were also discussed as
another approach for handling recruitment variability for specific species.
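A minimal sketch of the growth-potential idea follows. It assumes hypothetical gridded fields of temperature, dissolved oxygen, and prey density and combines them with an invented dome-shaped temperature response, a dissolved oxygen threshold, and a saturating feeding term; the functional forms and thresholds are placeholders and are not those of the striped bass bioenergetics model discussed at the workshop.

    # Hypothetical growth-potential map: combine gridded temperature, dissolved
    # oxygen, and prey density into a relative growth index per cell.
    # Functional forms and thresholds are illustrative placeholders.
    import numpy as np

    def growth_potential(temp_c, do_mgl, prey, t_opt=20.0, t_width=6.0,
                         do_crit=2.0, k_prey=5.0):
        thermal = np.exp(-((temp_c - t_opt) / t_width) ** 2)  # dome-shaped T response
        oxygen = np.where(do_mgl >= do_crit, 1.0, 0.0)        # habitat excluded below DO threshold
        feeding = prey / (k_prey + prey)                       # saturating feeding response
        return thermal * oxygen * feeding                      # 0-1 relative growth index

    # Example on a tiny 2 x 3 grid of water-column cells (invented values)
    temp = np.array([[18.0, 22.0, 26.0], [15.0, 20.0, 24.0]])
    do   = np.array([[ 6.0,  4.0,  1.5], [ 7.0,  5.0,  3.0]])
    prey = np.array([[ 2.0, 10.0,  8.0], [ 1.0,  6.0, 12.0]])
    print(growth_potential(temp, do, prey))

Because the index depends only on habitat conditions, not on an explicit fish population, temperature and dissolved oxygen fields from a water quality model could in principle be fed directly into such a calculation.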
Methodology and Technology
Factors in Concept, Design, and Implementation
Introduction
In discussions and presentations, workshop participants evaluated the technical
issues associated with model conceptualization and construction. Such discussions
covered various approaches for addressing scaling and aggregation issues, and they
reviewed the evolving methodologies for model development, calibration, and
sensitivity analysis. Section 4 provides an overview of these issues based on both
plenary presentations and technical discussions in individual subgroups. It summa-
rizes criteria for deciding on appropriate scales, and it outlines trade-offs in aggrega-
tion or disaggregation over scales of time, space, or trophic complexity.
Scales of ecological systems
By scales, we mean the fundamental metrics used to describe the dimensions of a
system. The divisions, or levels of articulation, are chosen to capture the details of a
specific process.
specific process. To describe ecological phenomena, models must incorporate scales
of time, space, and some level of biological organization. Because relevant biological,
chemical, and physical phenomena occur on such varied dimensions, ecological
modelers must choose scales appropriate to a specific modeling objective. As a result,
the degree of articulation in scale and the level of aggregation determine the value of
any modeling approach to a specific problem. In other words, scales for time, space,
and ecological complexity may be appropriate for one use but not for another.
Decisions over appropriate scales should be determined by the objectives of the
model, the limits of technology, and the availability of relevant data. The participants
generally agreed that the model objective, or question, defines the appropriate scale
for each model, and different scales are appropriate for different questions. Others
specifically noted that scales used in the water quality model, although appropriate
to predicting anoxia, may be inappropriate for questions regarding ecosystem
processes or population dynamics.
Distinguishing between types of models and their intended uses is an important
step in determining appropriate scales. To emphasize this point by an analogy, Bob
Costanza suggested that we compare models to maps. Fine-scale street maps are
relevant to some problems, whereas large-scale global maps (aggregated ecosystem
models) are relevant to others. Modelers must articulate dynamic models in time,
space, and trophic complexity. An important step is to choose the appropriate level
of articulation to evaluate the processes defined in model goals.
Aggregation and scale
Modelers aggregate systems to simplify the conceptual framework or to convert
myriad and diverse phenomena into more tractable computer simulations. However,
by combining fine-scale, nonlinear systems into coarse-scale, aggregated ones,
modelers may introduce averaging errors that accumulate and propagate through
the entire simulation system. Aggregation errors occur most often when modelers
average nonlinear systems into a larger whole; the aggregate average may lose the
rich nonlinear features of the natural system. For example, in a forest leaf canopy, the
photosynthetic capability of the canopy could be estimated by summing up the
individual capabilities of all leaves. Such an aggregation, however, would ignore the
interactive effects (shading, etc.) of leaves on each other. In this way, the coarse-scale
aggregate does not behave the same way as the sum of its component fine-scale
simulations. Important details of fine-scale components with high-frequency, tran-
sient behavior may be lost or damped in aggregation. Thus, a coarse-scale model can
fail even though it was assembled from well-understood, accurately modeled, fine-
scale processes.
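The consequence of averaging a nonlinear response can be shown with a few lines of arithmetic. The sketch below uses a standard saturating photosynthesis-irradiance curve (the hyperbolic tangent form is one common textbook choice) and compares production computed leaf-by-leaf under variable light with production computed from the average light; the light values and parameters are invented for illustration.

    # Averaging error for a nonlinear (saturating) photosynthesis-irradiance curve.
    # Light levels and parameters are invented for illustration only.
    import numpy as np

    def photosynthesis(irradiance, p_max=10.0, i_k=100.0):
        return p_max * np.tanh(irradiance / i_k)   # saturating P-I response

    light = np.array([20.0, 60.0, 150.0, 400.0, 900.0])  # leaves see very different light

    fine_scale = photosynthesis(light).mean()    # average of the nonlinear responses
    aggregated = photosynthesis(light.mean())    # nonlinear response of the average

    print(f"mean of responses : {fine_scale:.2f}")
    print(f"response of mean  : {aggregated:.2f}")  # larger: averaging hides saturation

The gap between the two numbers is the aggregation error; it grows as fine-scale variability pushes more of the response into the saturated part of the curve.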
Predictive versus experimental
In one presentation, Jim Kremer outlined a continuum of modeling strategies
ranging from the highly aggregated, holistic ecosystem models to the detailed,
mechanistic models used to describe microbial processes in the sediments.
Large-scale holistic models are highly aggregated relative to the phenomena to
be explained; although extremely predictive, such models are often so broad they
preclude direct experimental measurement. In other words, they seldom fail, but
often they are right for the wrong reasons. Such models are accurate; they are not
precise.
Large-scale, aggregated models may best be applied to management applica-
tions in which predicting system response is more important than describing
detailed mechanisms. Managers use predictive models to extrapolate the struc-
ture or behavior of a system outside the existing data boundaries. Moreover,
such models can be used to support pollution abatement strategies based on
broad assumptions— such as limiting nutrient input to raise levels of dissolved
oxygen. It might be argued that large-scale landscape models, such as those
outlined by Bob Costanza, provide practical scales because pollution and envi-
ronmental policy occur at the large, landscape scale.
In contrast, small-scale, mechanistic models are best applied in experimental
approaches in which hypotheses about causal relationships can be tested and
rejected. Indeed, models can play a role analogous to experiments in the scientific
method. Models can serve as rejectable alternative hypotheses, however, only if their
compartments are defined on scales appropriately small relative to the question
being asked. Such descriptive models are used to test hypotheses, and they often
fail because of limited understanding of processes controlling the system. In this
context, small-scale models can be used to guide research.
Problems of scale ultimately require decisions about the appropriate level of
aggregation/disaggregation, and the best solution depends on making trade-offs
among accuracy, precision, and generality. In practice, there must be a compromise
between model resolution and model predictability; finer resolution lowers
predictability.
Specific scale problems
Between the two extremes of aggregated, holistic, predictive models and
disaggregated, mechanistic, descriptive models, there are many practical scale
problems to address before building an integrated modeling framework. Con-
sider some examples involved in creating an integrated model for the Chesa-
peake Bay. In the Chesapeake Bay water quality model, the Bay is divided into
57,871 cells each measuring 1 km2 in area and 1 m in depth. Mass or concentra-
tion values of dissolved oxygen are calculated based on statistical interpolation
from nearby sampling sites. Steve Brandt presented a striped bass bioenergetics
model that uses 18,000 water column cells to describe fish growth and survival.
To integrate his model with the water quality model, Brandt would need to use
dissolved oxygen and temperature information at other scales. The SAV models
of Dick Wetzel operate on a spatial scale of 1 m2. These smaller scales are more
realistic for SAV growth and survival than the km2 scales of water-column
models. Other participants raised other scaling problems, including the integra-
tion of groundwater transport phenomena into mainstem Bay hydrodynamic
models and the large-scale problems associated with atmospheric deposition and
sea-level rise.
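Although the interpolation scheme actually used by the Bay Program is not described here, a simple inverse-distance-weighted estimate illustrates the general idea of assigning a cell value from nearby sampling sites; the station locations and concentrations below are invented.

    # Inverse-distance-weighted estimate of a cell value from nearby stations.
    # One common scheme among many; stations and values are invented.
    import numpy as np

    def idw(cell_xy, station_xy, station_values, power=2.0):
        d = np.linalg.norm(station_xy - cell_xy, axis=1)
        if np.any(d == 0):                     # cell coincides with a station
            return float(station_values[np.argmin(d)])
        w = 1.0 / d ** power
        return float(np.sum(w * station_values) / np.sum(w))

    stations = np.array([[0.0, 0.0], [4.0, 1.0], [2.0, 5.0]])  # km
    do_obs   = np.array([6.5, 4.0, 7.2])                       # mg/L dissolved oxygen
    print(idw(np.array([1.0, 2.0]), stations, do_obs))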
Limits of technology
Numerical ecosystem modeling requires the use of computers. The partici-
pants agreed that, although limited, the technology is available to support the
development and integration of ecosystem process models. In fact, hardware
development has catalyzed the ecological modeling field. Computers enable
modelers to organize complex ecosystem information, simulate possible out-
comes over time, examine control mechanisms, describe changes, and model
uncertainty.
Consider that the time-variable, three-dimensional water quality model of the
entire Chesapeake Bay runs an annual cycle on 25 hours of Cray supercomputer
time. In contrast, mechanistically complex but spatially-averaged ecosystem process
models, such as those outlined by Mike Kemp, can be run on desktop computers
with a one year simulation requiring 15 minutes.
As a result, practical concerns over computing power can define appropriate scale.
Increasing scale resolution causes an exponential increase in computing require-
ments; some models aggregate scales simply to maximize computing resources. One
participant remarked that, "in the early days, we designed our water quality models
to run in eight hours computing time." As a result, we can expect some scales to be
determined not by the relevance of the problem, but rather by the size of a workable
computer model.
Currently, runs of the Bay water quality model push the limits of technology in terms
of required supercomputer time. Several participants said that adding ecosystem
processes or other new biological components to the present version of the water
quality model would strain existing computing resources. For these reasons, the work-
shop participants stressed that species for initial ecological modeling efforts should
be carefully chosen based on a high probability of success (see water quality model
issues in Section 3).
Physics: fine-scale versus coarse-scale
Given limited computing resources, the participants discussed trade-offs in fine-
scale versus coarse-scale hydrodynamics and the benefits of aggregating cells in the
water quality model. According to several participants, in order to save computer
power, individual cells in the bottom 6-7 m of the Bay water column could be col-
lapsed into a single cell without changing data for key processes.
Other participants said that the water quality model has too many cells; Chesa-
peake Bay Program managers should aggregate the Bay into larger segments to
interpret results. Dominic DiToro described a successful 2:1 grid collapse in the Long
Island Sound water quality models; the aggregation was made to "improve computa-
tion tractability."
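A grid collapse of this kind can be sketched as volume-weighted averaging of paired cells, which conserves total mass while halving the number of cells. The concentrations and volumes below are invented and have nothing to do with the actual Long Island Sound or Chesapeake Bay grids.

    # Collapse pairs of adjacent cells 2:1 by volume-weighted averaging.
    # Concentrations and volumes are invented; mass is conserved, gradients smoothed.
    import numpy as np

    def collapse_2to1(conc, vol):
        conc, vol = np.asarray(conc, float), np.asarray(vol, float)
        c2 = conc.reshape(-1, 2)               # pair up adjacent cells
        v2 = vol.reshape(-1, 2)
        new_vol = v2.sum(axis=1)
        new_conc = (c2 * v2).sum(axis=1) / new_vol
        return new_conc, new_vol

    conc = [8.0, 7.5, 3.0, 2.0, 0.5, 0.4]      # mg/L in six fine cells
    vol  = [1.0, 1.0, 1.2, 0.8, 1.0, 1.0]      # relative cell volumes
    new_conc, new_vol = collapse_2to1(conc, vol)
    print(new_conc, new_vol)
    assert np.isclose((np.array(conc) * np.array(vol)).sum(), (new_conc * new_vol).sum())

Volume weighting keeps the total mass of the collapsed grid identical to that of the fine grid, but the gradient across the fine cells is smoothed, which is exactly the loss of detail that critics of coarse cells described.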
On the other hand, some participants warned of the potential problems stemming
from coarse hydrodynamic cells. For example, in collapsing small cells into bigger
ones, the model may lose important details and gradients in the water column. Dick
Wetzel suggested that finer-scale hydrodynamic cells were needed to model SAV
phenomena. Most participants agreed that hydrodynamic segments of the current
water quality model are too big to model resuspension and turbidity phenomena in
tributaries and shoal areas. Participants recommended using finer physical scales in
shallow areas.
From the ecological perspective, participants identified transport and diffusion
phenomena as the processes that should determine optimal hydrodynamic cell size.
Because material inputs are assumed to be instantaneously distributed across the
entire cell, the natural rate of transport and diffusion in the water column limits the
size of the modeled cell. Steve Brandt pointed out that scale-dependent measure-
ments of prey-fish density also determine optimal hydrodynamic cell size; scales
must be small enough to account for the patchy distributions of small schooling fish.
The participants discussed possible rules for aggregation and disaggregation
decisions. One suggestion: run models on different spatial and temporal scales to
evaluate the effects on model output. Specifically, the Chesapeake Bay Program
could run the time-variable water quality model under fine-scale and coarse-scale
scenarios to evaluate changes in water quality results. Rules for cell aggregation
could evolve from this effort.
To study trade-offs and benefits of various scales and to examine aggregation/
disaggregation issues more systematically, Mike Kemp announced that the Univer-
sity of Maryland will soon begin the 10-year Multi-scale Experimental Ecosystem
Research Center (MEERC), funded by EPA's Centers for Exploratory Research
Program.
Feedbacks, trophic interactions, and time
Trophic interactions and nonlinear biological responses complicate the aggre-
gation of ecological models. By simply aggregating segments into larger wholes,
a model may not accurately account for the many nonlinear processes occurring
in biological systems. For example, processes such as photosynthesis, nutrient
uptake, growth, and some density-dependent population phenomena depend on
nonlinear biological responses that are subject to thresholds and feedback
mechanisms.
Moreover, biological processes may have unique characteristics that require
special temporal considerations. The participants discussed temporal variation in
many processes. Fish, for example, have different functional responses during
phases of their life cycle—larval, juvenile and adult stages—that may need to be
distinguished as separate model variables. Water quality models have evolved
to separate winter and summer algae into distinct functional components and to
account for how the relative importance of top-down versus bottom-up control
changes over time.
Large-scale aggregation can blur many of these important processes; thus, the
participants supported flexible time scales to incorporate the temporal variability
in many biological systems. The ability to characterize rare or episodic events
and tease out such events from seasonal or monthly averages may define appro-
priate scales for modeling. In reality, aggregated time scales often average long-
term data and produce periodic averages too general to describe important
episodic phenomena. For example, Ken Rose pointed out that fish larvae may be
killed by a rare event, such as a sudden temperature drop, rather than by the long-
term seasonal temperature average.
To better understand biological events, the participants recommended that
selected variables be monitored on finer time scales, particularly for key ecosys-
tem processes or in areas characterized by feedback mechanisms. Several model-
ers described techniques to dynamically adjust temporal parameters according to
seasonal fluctuations.
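One simple way to adjust temporal parameters dynamically is to shorten the integration time step during periods of rapid change, such as the spring bloom, and lengthen it otherwise. The rule below is a hypothetical illustration, not a method reported by the workshop modelers; the bloom window, step sizes, and example forcing are all assumptions.

    # Hypothetical seasonal time-step rule: integrate with a fine step during an
    # assumed spring-bloom window (days 60-150) and a coarser step otherwise.
    import math

    def seasonal_dt(day_of_year, fine=0.05, coarse=0.5):
        return fine if 60 <= day_of_year <= 150 else coarse

    def integrate_year(state, derivative):
        t = 0.0
        while t < 365.0:
            dt = seasonal_dt(t)
            state = state + dt * derivative(t, state)   # explicit Euler step
            t += dt
        return state

    # Example: relaxation toward a seasonally varying target value
    final = integrate_year(
        1.0, lambda t, x: 0.1 * (5.0 + 3.0 * math.sin(2 * math.pi * t / 365.0) - x))
    print(final)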
Data availability
The availability of data may also limit the choice of scales, especially in the
temporal dimension. For example, monitoring data may be available only for
limited periods, and extrapolation and averaging may be required for application
to specific problems.
Although the group acknowledged the good monitoring database available for
Chesapeake Bay, there is a need for more comprehensive data, especially for the
basic natural history of important species and ecosystem processes.
The group recommended a balance between data gathering and modeling.
According to Dominic DiToro, current modeling efforts cost only about one-third
as much as data collection and monitoring. Moreover, modeling should not be kept
separate from monitoring; to be most useful, choices about what to monitor, and
when, should be made more carefully.
endorsed more detailed data collection in areas in which nonlinear processes are
likely to occur.
Also, the group identified data availability as a key criterion for choosing
appropriate and relevant species to model. In addition to having a solid data-
base, a species selected for inclusion in models should also be significant in the
foodweb and should be environmentally sensitive. Based on these criteria, the
group endorsed macrozooplankton (e.g., copepods) as a key taxonomic group for
initial links to the water quality model. Zooplankton, in particular, have a good
monitoring database. The group also recognized the existence of good databases
for striped bass and Chesapeake Bay water quality.
To make the most of current databases, the participants recommended better
communication and exchange of existing databases. In his presentation, Jim
Kremer described HyperCard software as an example of a system that makes
databases more transparent and easy to use.
Model development: formulation, calibration, and sensitivity analysis
Ecological models begin as tools for addressing specific problems or questions. As
such, models must have a purpose, or goal, and must be defined in time and space.
To achieve stated goals, modelers design a conceptual framework to describe
relevant natural processes, stocks, or resources. Such a framework usually includes
the components of the ecosystem, defined as state variables, and uses equations to
control input and output to state variables. State variables are usually connected
through material or energy flow.
The process of model building includes six important steps:
(1) Define and conceptualize the process;
(2) Express relationships empirically with equations;
(3) Translate equations into computer code;
(4) Simulate, calibrate, and validate model results;
(5) Analyze results and perform sensitivity analysis;
(6) Apply results.
Define and conceptualize
Before conceptualizing processes and interactions, modelers must clearly define
purposes and goals of the model and set boundaries in time and space. Models with
different purposes and goals will have different structures and components. Model
structure and the interactions between components define a conceptual framework
for the model.
Build equations and empirical relationships
Using a conceptual model with clear boundaries and stated goals, modelers
mathematically describe the relationships between components and processes with a
collection of equations for various ecological processes. Ecological modelers most
often rely on standard equations for processes such as photosynthesis, nutrient
assimilation, metabolic rates, light irradiance, and predation.
Although these processes are well understood and have solid mathematical
descriptions, many natural history processes, such as predator behavior and migra-
tions, are complex and cannot always be described in precise mathematical terms.
Often, equations will result from a researcher's own work and subsequent simula-
tions will be used to test the new equation. In any case, modelers must borrow
existing equations and/or derive and estimate new formulations.
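The standard equations referred to here include saturating forms such as the Michaelis-Menten expression for nutrient uptake and simple light- and temperature-limitation terms. The sketch below collects three common textbook forms in Python for readability; the particular choices and coefficient values are illustrative and are not the equations adopted by any model discussed at the workshop.

    # Three textbook process formulations often used as building blocks.
    # Coefficient values are illustrative only.
    import math

    def michaelis_menten(nutrient, v_max=2.0, k_s=0.5):
        """Nutrient uptake rate saturating with ambient concentration."""
        return v_max * nutrient / (k_s + nutrient)

    def steele_light(irradiance, i_opt=300.0):
        """Steele's light-limitation curve with photoinhibition above I_opt."""
        return (irradiance / i_opt) * math.exp(1.0 - irradiance / i_opt)

    def q10_rate(rate_at_ref, temp_c, t_ref=20.0, q10=2.0):
        """Temperature scaling of a metabolic rate by the Q10 rule."""
        return rate_at_ref * q10 ** ((temp_c - t_ref) / 10.0)

    print(michaelis_menten(1.0), steele_light(150.0), q10_rate(0.1, 28.0))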
Generate computer code
Because ecological models run on computers, their mathematical formulations
must be translated into computer code. There are several approaches to the problem;
most often, modelers either write their own code in FORTRAN or C, or they use
modeling software packages such as STELLA to construct heuristic, graphic models
that, in turn, are translated into computer code.
Some participants anticipate future difficulties integrating computer code across a
standard software platform. Earlier water quality models, including the Chesapeake
Bay time-variable, water quality model, have been written in FORTRAN and run on
supercomputers; current ecosystem models are constructed in STELLA and run on
Apple Macintosh microcomputers. Bob Costanza described a transputer technology
using parallel processors that greatly increases computation speed for repetitive
calculations with Macintosh-based STELLA models, with the resulting computation
systems rivaling supercomputers. Dick Wetzel showed how he planned to integrate
SAV data into geographic information systems. Some participants suggested plan-
ning ahead to anticipate and thus minimize problems resulting from the integration
of varied software and hardware technologies used in ecosystem models.
Simulate, validate, and calibrate
Using ecological equations, computers simulate the interactions among variables in
time and space. Normally simulations occur in time; that is, models evaluate ecosys-
tem structure and function as time changes. This capability enables modelers to
predict potential changes to the system based on modified inputs and it invites
"what if" comparisons of alternative scenarios.
However, before simulated results can be applied with confidence, simulated
data must be validated, or compared with collected data, and calibrated to better
fit real-world conditions. Often, validation/comparison and calibration/adjust-
ment constitute an iterative process, as adjustments in forcing functions or
equation coefficients produce more accurate or more precise simulation output.
Some participants discussed techniques for improved validation and calibra-
tion and the need for a set of standards to judge the "goodness of fit" of various
models. For model validation, basic statistical methods, such as the Student t
test, might be used to compare simulated data with collected data, but partici-
pants warned of the difficulties in comparing model averages with actual col-
lected data. Overall, there were suggestions for robust quantitative methods to
test whether or not model variables are within a prescribed margin of error
relative to collected data.
One approach to model calibration is to adjust standard equation coefficients.
In the process, modelers first define equation coefficients, or parameters, based
on accepted values in appropriate scientific literature. Coefficients are then
adjusted to match model output with actual data. Adjustments normally occur
within a range of standard error; if modeled processes and state variables still do
not match expected values, equations can be modified or rejected.
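As a sketch of coefficient adjustment, the snippet below tunes a single half-saturation constant within an assumed literature range so that simulated values best match a handful of observations, reporting the root-mean-square error as a crude goodness-of-fit measure. The "model," the observations, and the parameter bounds are all invented for illustration.

    # Calibrate one coefficient within an assumed literature range by minimizing
    # root-mean-square error against observations.  All numbers are invented.
    import numpy as np
    from scipy.optimize import minimize_scalar

    nutrient_obs = np.array([0.2, 0.5, 1.0, 2.0, 4.0])      # forcing data
    uptake_obs   = np.array([0.55, 1.05, 1.45, 1.75, 1.90])  # observed response

    def simulate(k_s, v_max=2.0):
        return v_max * nutrient_obs / (k_s + nutrient_obs)

    def rmse(k_s):
        return float(np.sqrt(np.mean((simulate(k_s) - uptake_obs) ** 2)))

    # Half-saturation constant assumed, for illustration, to lie between 0.1 and 2.0
    result = minimize_scalar(rmse, bounds=(0.1, 2.0), method="bounded")
    print("calibrated k_s:", round(result.x, 3), " RMSE:", round(result.fun, 3))

If the best fit still leaves an unacceptable error, the next step in the iterative process described above is to question the equation itself rather than the coefficient.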
Often, historical data are used to validate and calibrate model simulations. For
example, Bob Costanza made use of U.S. Fish and Wildlife Service historical data
to calibrate a large spatial simulation of 2,479 interconnected cells of 1 km2 each,
representing the Atchafalaya delta. Historical data were available for 1956 for
initial conditions and for 1978 and 1983 for correlation points. The use of histori-
cal data to calibrate models foreshadows the need to collect relevant monitoring
data for use in future ecological models.
Perform sensitivity analysis
Simulations can be used to explore and determine components and processes
controlling ecosystem structure and to evaluate how changes in model structure
or inputs affect model results. By using sensitivity analysis or network analysis,
modelers can determine the components that have the greatest control over
modeled processes and simulation output.
Monte Carlo simulation techniques involve repetitive simulations with
coefficients for each variable selected randomly from prescribed confidence
intervals. Using these techniques, one can determine which variable has the
greatest control over some process (e.g., production) and calculate the sensitivi-
ties to evaluate how the variance in each component variable controls variance in
output variables. Such sensitivity analysis techniques also determine the residual
variance owing to nonlinear and interactive effects among components.
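A Monte Carlo sensitivity sketch along these lines might look like the following: coefficients are drawn repeatedly from assumed ranges, the model output is recorded, and a rank correlation between each coefficient and the output indicates which coefficient exerts the most control. The toy "production" model and the parameter ranges are invented for illustration.

    # Monte Carlo sensitivity sketch: draw coefficients from assumed ranges,
    # run a toy production model, and rank parameters by Spearman correlation
    # with the output.  Model and ranges are invented for illustration.
    import numpy as np
    from scipy.stats import spearmanr

    rng = np.random.default_rng(1)
    n = 2000
    params = {
        "v_max": rng.uniform(1.0, 3.0, n),   # maximum uptake rate
        "k_s":   rng.uniform(0.1, 1.5, n),   # half-saturation constant
        "graze": rng.uniform(0.0, 0.5, n),   # grazing loss fraction
    }

    nutrient = 0.8
    production = (params["v_max"] * nutrient / (params["k_s"] + nutrient)
                  * (1.0 - params["graze"]))

    for name, values in params.items():
        rho, _ = spearmanr(values, production)
        print(f"{name:6s} rank correlation with production: {rho:+.2f}")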
Apply results
Well-calibrated models can be used to predict changes, to test hypotheses, to
understand mechanisms controlling a system, or to explore how various pro-
cesses control other processes.
Recommendations
An Agenda for Action
Recommendations
In the final session of the three-day workshop, the participants recommended a
series of specific steps to guide the ecosystem modeling effort. The plan includes the
following steps:
• Define the objectives
• Establish a conceptual framework
• Identify gaps in data and models
• Support consensus and facilitate standardization
• Gain the confidence of model users
• Establish institutional support
• Communicate results and boost credibility
Define the management objectives of a modeling program
A successful ecosystem modeling strategy should address the management objec-
tives of the agencies responsible for restoring and protecting Chesapeake Bay. In this
context, managers will be responsible for asking the questions that ultimately guide
the ecosystem modeling program with clearly stated goals and objectives.
Establish a conceptual framework
A hierarchy of management questions/objectives can initiate the research/man-
agement dialogue necessary to establish a conceptual modeling framework. Such a
framework should link ecologically valuable species and important processes (e.g.,
water quality and habitats) in the Chesapeake Bay ecosystem. Moreover, it should
incorporate a continuum of models ranging from traditional water quality models to
seagrass models to fishery management models. Several earlier conceptual frame-
works were suggested in the 1980s.
Identify gaps in data and models
The workshop participants endorsed the diversity of approaches currently under-
way to model the Chesapeake Bay ecosystem (see Figure 1). A conceptual framework
can put these models in the perspective of an integrated whole and highlight areas in
which more work needs to be done. Based on the goals of the ecosystem modeling
framework, existing and new models should be identified for further development
and refinement. Moreover, because the utility of any model depends on the availabil-
ity of relevant data, managers and scientists should work to identify data gaps or
missing information critical to the modeling strategy. Recognition of required data
and information should lead to improvements in ongoing monitoring programs and
help guide specific research activities to investigate and/or quantify processes
relevant to ecosystem model development, calibration, and validation.1
1 See:
Green, K.A. 1978. A conceptual ecological model for Chesapeake Bay. Washington, D.C.: U.S. Fish and
Wildlife Service, Division of Biological Services, FWS/OBS-78/69. 22 pp.
Kemp, W.M., R.R. Twilley, J.C. Stevenson, W.R. Boynton, and J.C. Means. 1983. The decline of
submerged aquatic vascular plants in upper Chesapeake Bay: Summary of results concerning
possible causes. Mar. Tech. Soc. J. 17:78-89.
Support consensus and facilitate standardization
Owing to the complexity of the Chesapeake Bay system, a diversity of model-
ing approaches remains essential for understanding and/or quantifying ecosys-
tem processes. However, the workshop participants encouraged planning efforts
that anticipate integration and linkage issues. Managers and scientists should
support the movement toward consensus on technical issues involving appropri-
ate scales, levels of aggregation or disaggregation, linkages between individual
models, and recognized standards for model development, calibration, and
sensitivity analysis.
Gain the confidence of model users
To streamline development, boost the credibility of model results, and ulti-
mately ensure model acceptance, technical program managers should develop
marketing strategies aimed at winning the confidence of management agencies
and other potential model users. Specific suggestions included (1) facilitating
communication among managers, modelers, and researchers and (2) carefully
choosing modeling approaches and/or species with a high probability of success
and with direct management implications. Initial modeling efforts should focus
on zooplankton, SAV, and benthos.
Establish institutional support
To provide long-term financial support, the Chesapeake Bay Program and
other regulatory and resource management agencies should establish a consor-
tium of funding sources. Such a funding consortium would help guide the
modeling program through integration and standardization issues, establish
mechanisms for technical guidance and peer review, and identify specific
monitoring and research needs.
Specific funding agency consortium objectives could include the following:
• Define a hierarchy of objectives and establish a conceptual framework for
the ecosystem modeling strategy
• Select principal investigators and technical program managers to direct
and coordinate the program
• Establish an independent peer review structure (similar to the Modeling
Evaluation Group [MEG] of the Chesapeake Bay Program Modeling
Subcommittee)
• Develop a detailed two- to three-year short-term workplan along with a
longer five- to ten-year strategy for ecosystem processes model develop-
ment, calibration, and application
• Create a specific list of monitoring and research needs to fill identified data
and information gaps
Communicate results and boost credibility
Model developers and model users should actively communicate model results
to a wide variety of audiences. Communication strategies should include tech-
nologies, documentation, and presentations that explain model assumptions and
make the models easy to use.