United States          Office of Research and Development       EPA-600/A-00/083
Environmental Protection        Washington, DC 20460             August 2000
Agency
              Proceedings of the
  Cross Discipline Ecosystem Modeling
           and Analysis Workshop
                August 15-17, 2000
          U.S. Environmental Protection Agency
              EPA Administration Building
               79 T.W. Alexander Drive
           Research Triangle Park, NC 27709

-------
                Cross-Discipline Ecosystem Modeling and Analysis Workshop
                                         August 15-17, 2000


Tuesday August 15, 2000

8:00            Registration for visitor passes begins; coffee will be available

Overview and Problem Description
8:30 - 8:50      Welcome, Joan Novak
8:50 - 9:10      Office of Water (OW) Water Quality Modeling Program: BASINS
                (Russ Kinerson - EPA OW)
9:10 - 9:30      EPA Regional perspective (Jim Greenfield - EPA Region IV)
9:30 - 9:50      Near Laboratory Ecological Research Area (NLERA) Studies
                in the Neuse River Basin (Joseph Bumgarner - EPA HEASD)

Break (9:50 -10:05)

10:05 - 10:30     Water Quality Issues in the Neuse River Basin (Ken Reckhow - WRRI)
10:30 - 10:55     Neuse River Estuary Monitoring (Rick Luettich - UNC-CH)
10:55 - 11:20     Nutrient Cycling and Algal Blooms in the Neuse River Estuary
                (Hans Paerl - UNC-CH)
11:20 - 11:45     Atmospheric Deposition of Reduced Nitrogen to Estuarine Waters
                (Robin Dennis - EPA AMD)

Lunch (11:45 -1:15)

Multimedia Ecosystem Models: Integration and Development Activities
1:15 - 1:25       Introduction: MIMS Ecosystem Modeling Component Integration Efforts
                (Alice Gilliland - EPA AMD)
1:25 - 1:50       Applications of the TOPLATS Land Surface Hydrology Model, Including
                Strategies for Modeling Large River Basins and Coupling to Saturated
                Groundwater Models (Christa Peters-Lidard - Ga Tech)
1:50 - 2:15       Coupling a Mesoscale Meteorological Model to a High-Resolution Land-Surface
                Hydrology Model: Review of Results, Current Status, and Future Plans
                (John McHenry - NCSC)
2:15 - 2:40       High-Performance Analytic Element Modeling of Ground Water Flow
                (Alan Rabideau - SUNY)

Break (2:40-2:55)

2:55 - 3:20       A spatially distributed object-oriented model for simulating terrestrial water, carbon
                and nitrogen cycling and transport (Larry Band - UNC-CH)
3:20 - 3:45       Issues and Data Needs for Introducing Nitrogenous Pollutants from Swine
                Waste Land Application into an Object-Oriented Ecosystem Model
                (Harvey Jeffries, Steve Whalen - UNC-CH)
3:45 - 4:10       Development of a Surface Water Object-Oriented Modeling System (SWOOMS)
                for the Neuse River Estuary (Rick Luettich - UNC-CH)
4:10 - 4:35       Mechanistic-based Ecosystem Exposure Modeling on a Watershed Scale
                (Gour-Tsyh Yeh - PSU)

4:35 - 5:15       Discussion (Facilitator Christa Peters-Lidard): Opportunities for future collaboration

5:15   Adjourn
-------
Wednesday August 16, 2000

Framework Developments for Coupled Multimedia Models
8:30 - 8:55       Status of MIMS Architecture Design (Steve Fine - EPA AMD)                       15
8:55 - 9:20       Dynamic Information Architecture System (DIAS) (Jayne Dolph - ANL)               19
9:20 - 9:45       MMS (Steve Markstrom, George Leavesley - USGS)                                 20
9:45 - 10:10      Development of Object-based Simulation Tools for Distributed Modular                21
                Ecological Modeling (John Bolte - ORST)

Break (10:10 -10:25)

10:25 - 10:50     A Toolbox for Assembling Spatially Explicit Multimedia Ecological                     22
                Models from Reusable Components (David Weinstein - Cornell)
10:50- 11:15     Object-oriented Architecture for Integrated Multimedia Model (David Stotts,             Not available
                Jason Smith - UNC-CH)

11:15 - 12:15      Discussion Panel  (Facilitator Steve Fine)

Lunch (12:15 -1:45)

Tools and Techniques for Integrated Environmental Modeling
1:45-2:15       Software Reuse and Cluster Computing Tools for Environmental                       23
                Modeling (Suraj Kothari - IA State) (Demonstration included)
2:15 - 2:40       The Coupling-Mode Extensions for the Models-3 I/O API: Tools for                   24
                Building Coupled Cross-Media Models (Carlie Coats - NCSC)
2:40 - 3:05       Model Design for High-Performance on Microprocessor Based Parallel                  24
                Computer Systems (Carlie Coats - NCSC)

Break (3:05 - 3:20)

3:20 - 3:45       High Performance Computing for Large Scale Environmental Flow and                 25
                Transport Processes (Tate Tsang - UKY)
3:45 - 4:10       High-Performance Environmental Models for Estuarine and Coastal                    26
                Ecosystems (Peter Sheng - UFL)
4:10 - 4:35       Novel Techniques for Parallel Simulations (S.Rajasekeran - UFL)                       27
4:35 - 5:00       Tool-box for Parallel Simulation of Flows in Porous Media                            28
                (Raytcho Lazarov - Texas A&M)

5:00   Adjourn

-------
 Thursday August 17, 2000

 Landscape and Subsurface Characterization
 8:30 - 8:55       Development of a 1-km Landscape Vegetation Database for Modeling                   29
                Biogenic Fluxes of Hydrocarbons and Nitric Oxides (Tom Pierce - EPA AMD)
 8:55 - 9:20       Landscape Characterization and Non-Point Source Nitrogen Modeling                   30
                in Support of TMDL Development in the Neuse River Basin, NC
                (Ross Lunetta - EPA ESD)
 9:20 - 9:45       Observations and Modeling of O3 and SO2 Deposition Velocity over                    31
                Agricultural and Forest Ecosystems: Synthesis of a Six Year Field
                Program (Pete Finkelstein - EPA AMD)

 Break (9:45 -10:00)

 10:00 - 10:25     Four Dimensional Visualization and Analysis of Ground Penetrating                    32
                Radar Data for the Determination of Three-Dimensional Hydraulic
                Conductivity and Time-Dependent Fluid Flow
                (Roelof Versteeg - Columbia)
 10:25 - 10:50     Determination of Aquifer Recharge, Ground Water Flow and  Basin                     33
                Discharge-Methods and Examples
                (Tim Spruill and Ted Mew - USGS, NC DENR)

 10:50-12:15      Panel Discussion (Facilitator: Larry Band)

 Lunch (12:15 -1:45)

 Data Visualization and Analysis Tools
 1:45-2:15       Advances in Environmental Decision Support Systems through HPCC                   34
                (Dan Loughlin, Downey Brill - NCSU)
 2:15 - 2:45       A Framework for Creating and Managing Graphical Analyses                           35
                (Steve Fine - USEPA)
 2:45 - 3:15       Space Time Toolkit: Web-Based Integration and Visualization of Spatially          Not available
                and Temporally-Disparate Data from Distributed Sources (Mike Botts - UAH)
 3:15 - 3:45       Development of a Field Data Model for Coupling Environmental Models with            37
                Spatial and Temporal Discrepancies (Todd Plessel - Lockheed Martin)

 3:45 Adjourn
This proceedings has been reviewed in accordance with the U.S. Environmental Protection Agency's peer review and
administrative review policies and approved for presentation and publication.

-------
 Cross-Discipline Ecosystem Modeling and Analysis Workshop:  Introduction

 Joan H. Novak*
 Atmospheric Modeling Division, National Exposure Research Laboratory, USEPA, Research Triangle Park, NC 27711
 email: novakjoan@epa.gov

 The  complexity of  environmental problems we face  now and in the future is ever increasing.
 Process linkages among air, land, surface and subsurface  water require interdisciplinary modeling
 approaches.   The dynamics of land use  change spurred by population and economic growth, and
 the impact of Best Management Practices in urban and agricultural areas  must be considered in
 environmental exposure and  risk assessments.  An abundance  of related research and  model
 development is proceeding in Universities, Federal agencies and research laboratories, and  related
 research is being sponsored by industry-based research foundations. Thus, one of the primary goals
 of this workshop is to bring together stakeholders from many of these diverse groups for exchange
 of information  about  their  modeling needs  and research activities  with special emphasis  on
 techniques,  tools,  and  frameworks for  model  integration, characterization of  landscape  and
 subsurface features, and data visualization and analysis tools.

 EPA is interested in fostering a "community approach" to multi-disciplinary ecosystem modeling
 and analysis.  The emerging problems are larger than one group or one agency can expect to solve,
 so our goal is to work together toward open-architecture problem solving environments that
 facilitate the integration  of state-of-the-science process models/modules, application domain
 specification and data preparation, and decision support. A flexible Problem Solving Environment
 will enable exploration of a variety of modeling approaches dealing with multiple scale and stressor
 interactions.  Object technology, new computing algorithms and architectures, and intelligent data
 analysis techniques offer promise for overcoming previous computing limitations and modeling
 inflexibility.  During the workshop, investigators from the 1996 EPA STAR grants for High
 Performance  Computing and Communications will be presenting the results of their three year
 research efforts, and investigators for the 1999/2000 EPA STAR grants for Computing Technology
 for Ecosystem Modeling will be presenting their research directions for the next three years.
 Numerous other researchers and stakeholders engaged in ecosystem modeling and monitoring will
 also be presenting progress-to-date on their projects. The anticipated outcomes of the workshop are
 1) a better understanding of cross-media exchange processes and scale issues, 2) exposure to a variety
 of framework approaches for dealing with cross-discipline model integration and application issues,
 and 3) identification of inter-disciplinary opportunities for collaboration.
* On assignment from Atmospheric Sciences Modeling Division, National Oceanic and Atmospheric Administration, U.S. Department of
Commerce

-------
Office of Water (OW) Water Quality Modeling Program: BASINS

Russell Kinerson
US EPA, Office of Water (4305), Ariel Rios Bldg, 1200 Pennsylvania Ave, NW, Washington, DC 20460,
email: Kinerson.Russell@epa.gov

BASINS,  EPA's  GIS-based modeling system,  makes it  easier to  locate potential sources of
pollutants and estimate their effects on drinking water, recreational waters, aquatic life, wildlife
habitat, and other critical uses of waters in a watershed.  BASINS makes it easier to  evaluate
management strategies and perform "what if" scenarios in a particular watershed.  Many factors
affect water quality in a watershed, and each watershed is different. By using data compiled in
BASINS,  in conjunction with local data sources, users can evaluate large amounts of pollutant
source, chemical discharge, and stream flow information for every watershed in the continental
United States. Users may add their own data to that contained in BASINS, thus ensuring use of
the most current, reliable, and accurate data that exists for each watershed.   They also may run
models to estimate  changes to water quality that may result from different  land use practices.
The watershed models use weather data to  generate stream flow, estimate loadings to streams
from non-point sources, combine these non-point source contributions with  facility discharges,
and calculate changes in pollutant  concentrations (e.g., sediment, nutrients,  bacteria, and toxic
substances) as they are diluted and flow downstream.  This presentation provides an overview of
the features found in BASINS v 3.0 and presents a discussion of design considerations associated
with working with evolving commercial software (ArcView) and Object Oriented programming
languages.

-------
Near Laboratory Ecological Research Area Studies in the Neuse River Basin

Joseph E. Bumgarner
National Exposure Research Laboratory, USEPA, Research Triangle Park, NC 27709,
email: bumgarner.joseph@epa.gov

Two years ago NERL began to establish field research sites in order to support methods
development under EPA's Office of Research and Development risk assessment strategy.  The near-lab
sites will serve as this research platform, providing a scientific framework for risk reduction,
targeting available resources at those environmental stressors that account for the greatest impact
on human and ecological communities, and helping to define newly emerging problem areas.  To this end,
four locations were chosen: 1) the Little Miami River Basin in Ohio, 2) the Lower Colorado Basin in
Nevada, 3) the Savannah River Basin in Georgia and South Carolina, and 4) the Neuse River Basin in
North Carolina.

The advantages of these NLERA sites are
    1-  Facilitation of ecological research planning
    2-  Reduced costs and logistical effort
    3-  Enhanced research quality
    4-  Encouragement of collaboration (Federal, State, and local)
    5-  Integration of programs and leveraging of resources

In the Neuse River Basin there are three base sites:
    1-  Hill Forest, on the Flat River northwest of the Triangle in the Piedmont region, at one of
        the headwaters of the Neuse
    2-  Cunningham Farms, in the coastal plain section of the state at Kinston, NC (Neuse River/
        Contentnea Creek)
    3-  a coastal site near Havelock, NC, at Cherry Point Marine Air Station

In addition to these base sites, NERL has secured access to eight swine farms in the Neuse Basin,
and two of these farms are the scene of an intensive study by four Federal agencies, two State
agencies, and two universities to define the impact of CAFOs (confined animal feeding operations)
on the surrounding environment.

There are many environmental stressors to be found in this Basin, ranging from urban, industrial,
and agricultural pollution to natural stressors such as harmful algal blooms, coupled with rapid
growth and development and changing economies. The Neuse represents a valuable ecological
research resource.

-------
Neuse River/Estuary Monitoring
Rick Luettich1 and Bill Showers2
1 University of North Carolina at Chapel Hill, Institute of Marine Sciences, 3431 Arendell St., Morehead City, NC,
28557, email: rick_luettich@unc.edu
2Dept. of Marine, Earth and Atmospheric Sciences, North Carolina State University, Raleigh, NC, 27695
In the early to mid 1990s, the occurrence of several large fish kills together with the intense hype
surrounding the Pfiesteria organism helped to galvanize public opinion about the need for significant
management action to resuscitate water quality in the Neuse River Estuary (NRE). In response, the
NC General Assembly passed House Bill 15A NCAC 2B.0232 in 1997, requiring that a 30% reduction
in external nitrogen (N) loading (based on a mean 1990-1995 N loading "cap") be in place by 2003.
Concurrently,  the State became  more  proactive  in funding monitoring  programs  to  provide
information necessary to develop effective  management  strategies.   This talk will  provide  an
overview of two monitoring programs.

In 1997, the Neuse River Estuary MODeling and MONitoring (MODMON)  program was initiated.
The principal objectives of the MODMON program have been (i) to develop water quality models
of the NRE to assist staff  at the NC  Department of Environment and  Natural Resources in
evaluating the effectiveness of proposed Total Maximum Daily Loads (TMDLs) and (ii) to develop a
monitoring program that would support these modeling efforts  and furthermore allow the detection
of trends in water quality parameters that were presumably responding to management actions.

In 1999, the RiverNet program was funded to provide water quality monitoring data in the
freshwater sections of the Neuse to (i) continuously monitor river nutrients and physical water quality
parameters, (ii) transfer the data from the remote stations to NCSU and disseminate the data on the
web so that they can be accessed by policy makers, scientists, educators, government agencies, and
university, middle, and high school students, and (iii) obtain long-term records of
nutrient flux in various portions of the basin to determine associations with land use changes and
climatic cycles. At present, 3 of the 15 stations are installed.

-------
 Nutrient Cycling and Algal Blooms in the Neuse River Estuary

 Hans W. Paerl
 UNC-CH Institute of Marine Sciences, 3431 Arendell Street, Morehead City, NC 28557, email: hans_paerl@unc.edu

 In eutrophying coastal  ecosystems like  the  Neuse  River Estuary,  nutrient-stimulated primary
 production and associated phytoplankton blooms have been  implicated in various water quality
 problems.  Blooms may be toxic,  disrupt food webs and provide the organic matter fueling bottom
 water oxygen depletion or hypoxia.  In the Neuse River Estuary, episodic (e.g., storm-related runoff)
 and chronic (low-level, long term) nutrient loading events, coupled to long residence times (30-120
 days), persistent  vertical stratification and elevated summer temperatures, promote blooms and
 hypoxia that can last for weeks and cover large areas.  The timing, sources (point vs. non-point,
 land-based vs. atmospheric), loading characteristics (episodic vs. chronic), and levels of nutrient
 input are critical determinants of bloom dynamics. The ecosystem-level trophic implications are that
 a large component of the estuarine food web may be negatively impacted or removed, altering both
 the  structure and function of the system.   Hypoxia also alters biogeochemical  (nutrient) cycling
 processes in affected habitats.

 Nitrogen (N) is the most limiting  nutrient  in the Neuse.  N inputs from riverine, groundwater, and
 atmospheric sources intensify hypoxia/anoxia  potentials by stimulating phytoplankton  growth and
 bloom formation. Changes in composition and loading rates of N sources influence both standing
 stock and composition of phytoplankton.  If, for example, specific N inputs result  in selection for
 rapidly-proliferating dinoflagellate, cryptomonad, and cyanobacterial nuisance blooms which are not
 effectively grazed, residual labile carbon will be available for fueling hypoxia/anoxia, altering benthic
 regeneration, nutrient cycling, and  ultimately,  determining  the temporal and spatial patterns and
 magnitudes of estuarine production.

 The Neuse River hydrodynamic water quality model currently lacks  the information necessary to link
 specific nitrogen formulations (i.e., NOx, NH4+, DON) with responses of phytoplankton species-
 specific growth and bloom events. In addition, the relative importance of acute vs. chronic loadings
 of these sources  plays a role in  determining  phytoplankton competitive interactions  and bloom
 dynamics. Because  phytoplankton  composition and activity play key roles in dissolved oxygen
 dynamics and the trophic status of the estuary, it is crucial that the water quality model account for
 algal group-specific responses to varying nutrient sources and concentrations.  Our contribution to
 the Neuse River Modeling and Monitoring Program (ModMon) is  designed to provide data needed
to incorporate the effects of concentrations, formulations and loading characteristics  of N and other
nutrients on phytoplankton community structure  and its contribution to O2 depletion potentials
within the model framework.  The predictability of the spatial and temporal dynamics of
phytoplankton blooms in dynamic estuarine environments depends on the ability of the water
quality model to accurately assess the collective responses of many different ecosystem components.
A mechanistic understanding of phytoplankton community dynamics and linkages to O2 depletion
potentials is critical for model validation, applications, and simulations.
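
One common way to express such algal group-specific responses is a group-specific Monod formulation; the expression below is a standard textbook illustration offered only for clarity (it is not the formulation used in the Neuse water quality model), in which the maximum growth rate and half-saturation constant differ among phytoplankton groups:

\mu_i \;=\; \mu_{\max,i}\,\frac{[\mathrm{NH_4^+}] + [\mathrm{NO_x}]}{K_{N,i} + [\mathrm{NH_4^+}] + [\mathrm{NO_x}]}

Because \mu_{\max,i} and K_{N,i} differ among dinoflagellates, cryptomonads, and cyanobacteria, the same dissolved inorganic nitrogen pool can favor different bloom-formers depending on its form, timing, and concentration.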

-------
Atmospheric Deposition of Inorganic Nitrogen to
Watersheds/Estuarine Waters

Robin L. Dennis*
Atmospheric Modeling Division, MD-80, U.S. EPA, Research Triangle Park, NC 27711, email: rdennis@hpcc.epa.gov

Atmospheric deposition of nutrients to coastal watersheds and estuaries contributes a significant
fraction of nitrogen loading and is thus a contributor to nutrient stimulation.  For various estuaries in the
eastern U.S., the  atmospheric contribution is estimated to  range between 10-30% of the nitrogen
load and for some estuaries the fraction is estimated to be even higher.  The great majority of this
deposition is in the form of inorganic nitrogen: oxidized nitrogen (ox-N = nitrate and nitric acid)
and reduced nitrogen (red-N = ammonia and ammonium).  This talk will provide some background
on the defining characteristics  of  atmospheric  inorganic nitrogen  deposition that  also  relate to
estuarine ecosystem modeling. This will  include the two pathways of delivery of deposition (wet and
dry), how gas-particle partitioning significantly affects  dry deposition, and how the  two inorganic
forms are different and when they can be similar. The background will also include a perspective on
land versus water exchange, direct versus indirect loading, and emission hot  spots.  The talk will
then move into examination of transport lifetimes and airsheds.   Because  these  pollutants are
involved in long-range transport, the airsheds are large relative to the watersheds. Airsheds of ox-N
and red-N for Chesapeake Bay and Neuse/Pamlico will be presented  and compared and contrasted
in terms of some of their characteristics. The characteristics of the Neuse/Pamlico reduced nitrogen
deposition and airshed responsibility will also be discussed in terms of conventional wisdom (which
may not be correct) and the large increase in ammonia emissions in the 1990s.
* On assignment from Atmospheric Sciences Modeling Division, National Oceanic and Atmospheric Administration, U.S. Department of
Commerce

-------
 Introduction: MIMS Ecosystem Modeling Integration Efforts

 Alice B. Gilliland*
 Atmospheric Modeling Division, National Exposure Research Laboratory, US EPA, Research Triangle Park, NC 27711
 Email: gilliland.alice@epa.gov

 The U.S. EPA Multimedia Integrated Modeling System (MIMS) is a long-term project that aims to
 apply  advancements  in software engineering to develop  a problem-solving  environment for
 multiscale ecosystem modeling and  analysis.   As part of the MIMS development, environmental
 modeling components such as atmospheric and surface hydrology models will be linked as initial
 prototypes to test  the  MIMS software architecture.  This  initial prototype  will then  be further
 developed to address water quality issues such as urban drainage and nitrogen flux contributions
 from the land surface, groundwater, and atmospheric sources, through an integrated ecosystem
 modeling perspective.

 The focus of this afternoon session is to present environmental model development and  integration
 efforts. For initial prototype development, terrestrial or land surface hydrology models are being
 linked with atmospheric and  groundwater models.  These linkages must address  the issues of
 disparate  scales between  the  individual  models.  Spatial scaling of these models  is  also being
 addressed to formulate strategies for modeling larger watershed and river basin domains.  This is of
 particular concern  for  nutrient  cycling processes  where smaller vegetation  zones  (e.g., riparian
 buffers) play a  significant role.  In addition to ongoing  projects, the  new USEPA  Computer
 Technology for Ecosystem Modeling grant program includes  projects to develop new modeling
 approaches  for  surface water, swine waste  land application,  groundwater flow, and  ecosystem
 exposure.   It is anticipated  that future  MIMS prototype  developments will benefit  from the
 modeling approaches developed as part of these studies.
* On assignment from Atmospheric Sciences Modeling Division, National Oceanic and Atmospheric Administration, U.S.
Department of Commerce

-------
Applications of the TOPLATS Land Surface Hydrology Model, Including Strategies
for Modeling Large River Basins and Coupling to Saturated Groundwater Models
Christa Peters-Lidard1, John McHenry2, Carlie Coats2, Henk Haitjema3, Feifei Pan1, Brian
Keel1
                                               1 School of Civil and Environmental Engineering
                                               Georgia Institute of Technology
                                               Atlanta, GA 30332-0355
                                               Phone: 404.894.5190 Fax: 404.385.1131
                                               Email: cpeters@ce.gatech.edu
                                               WWW: http://www.ce.gatech.edu/~cpeters
                                               2 Environmental Programs
                                               MCNC North Carolina Supercomputing Center
                                              Research Triangle Park, NC

                                              3 The School of Public and Environmental Affairs
                                              Indiana University
                                              Bloomington, IN
Environmental modeling has traditionally been executed separately for each major medium, e.g. air,
surface water, and groundwater. Historical separation was justified on the basis that early pollution
concerns were primarily confined to one medium (e.g., DNAPLs in groundwater). Ongoing work in
the Chesapeake Bay, Great Lakes and the Neuse River Estuary has indicated the need  for fully
interactive multimedia modeling.

The TOPLATS-MM5 hydrologic- atmospheric model has been designed as part of an HPCC grant
to McHenry and Peters-Lidard, and may serve as a prototype for coupling air and surface water
media. TOPLATS-MM5 has been shown to improve on current state-of-science coupling of land-
atmosphere media at the watershed-scale.  One of  the unique aspects of the coupled modeling
system is that TOPLATS/MM5 incorporates an Input/Output Applications Programming Interface
(I/O  API) which encapsulates the complexity of coordinate systems, grid definitions, and data
structures at lower software  layers, and uses self-describing files  and communication channels to
provide this data to the coupled components.
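
As a purely illustrative sketch of this encapsulation idea (the type and function names below are invented for this example and are not the Models-3 I/O API), a coupled component can exchange fields whose grid and coordinate description travels with the data, so that neither model needs to know the other's internal data structures:

// Hypothetical sketch only: a self-describing field exchange between coupled
// components. Names (FieldMetadata, CoupledField, ExchangeChannel) are invented
// for illustration and are not part of the Models-3 I/O API.
#include <iostream>
#include <string>
#include <vector>

struct FieldMetadata {              // "self-describing" header for a field
    std::string name;               // e.g. "latent_heat_flux"
    std::string units;              // e.g. "W m-2"
    std::string gridId;             // identifies projection/grid definition
    int nCols;
    int nRows;
    double timestamp;               // seconds since simulation start
};

struct CoupledField {               // the data and its description travel together
    FieldMetadata meta;
    std::vector<double> values;     // row-major, nCols * nRows cells
};

// A consuming model asks for a field by name and time; it never needs to know
// how the producing model stored or gridded the data internally.
class ExchangeChannel {
public:
    void publish(const CoupledField& f) { buffer_.push_back(f); }
    bool request(const std::string& name, double time, CoupledField& out) const {
        for (const auto& f : buffer_) {
            if (f.meta.name == name && f.meta.timestamp == time) { out = f; return true; }
        }
        return false;               // caller decides whether to wait or fail
    }
private:
    std::vector<CoupledField> buffer_;
};

int main() {
    ExchangeChannel channel;
    CoupledField latentHeat{{"latent_heat_flux", "W m-2", "neuse_1km", 2, 2, 3600.0},
                            {110.0, 95.0, 120.0, 101.5}};
    channel.publish(latentHeat);

    CoupledField received;
    if (channel.request("latent_heat_flux", 3600.0, received)) {
        std::cout << received.meta.name << " on grid " << received.meta.gridId
                  << " (" << received.values.size() << " cells)\n";
    }
    return 0;
}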

The following three tasks are the focus of the project: 1) Apply and calibrate the TOPLATS-MM5
hydrologic-atmospheric model to the Sandy Run watershed, which is a sub-basin of the Neuse River
basin  in North Carolina; 2) Scope out the feasibility of applying the TOPLATS-MM5  model to the
following three basins: Contentnea Creek watershed (980 mi²), Neuse River basin (5,600 mi²),
and the Albemarle-Pamlico drainage basin (28,000 mi²); and 3) Provide technical advice on linking
TOPLATS-MM5 to grid-based (e.g., MODFLOW) or analytic element (e.g., GFLOW)  groundwater
models.  Early results from the work, including progress  developing project databases, as well as
strategies for coupling environmental models will be presented.

-------
 Coupling a Mesoscale Meteorology Model to a High-Resolution Land-Surface
 Hydrology Model: Review of Results, Current Status, and Future Plans (EPA
 Grant No. CR825210)

 John N. McHenry1, Christa Peters-Lidard2, Carlie J. Coats1, Atanas Trayanov1
 1 Environmental Programs, MCNC North Carolina Supercomputing Center, RTP, NC 27709, email: mchenry@mcnc.org
 2 School of Civil and Environmental Engineering, Georgia Institute of Technology, Atlanta, GA 30332

 Current interest in the processes that govern the fluxes of heat, moisture and momentum at the
 land-atmosphere interface is high. It is a difficult modeling problem because the relatively small-
 spatial and long-temporal hydrological scales interact directly with the relatively large-spatial and
 rapid-temporal scales of the atmosphere. In this project, we have coupled a high-resolution land-
 surface hydrology model to a mesoscale meteorology model in order to represent these complex
 interactions. Furthermore, we have designed the coupled model to accommodate assimilation of
 observed rainfall and downward solar radiation in order to provide for a variety of model coupling
 strategies that could be useful for different types of simulation demands.

 The models involved are the PSU/NCAR MM5V2 and V3 and the Top-Model-based Land-
 Atmosphere Transfer Scheme (TOPLATS). The initial work focused on designing and constructing
 a flux-coupler that passes surface water and energy fluxes  from TOPLATS to MM5 and passes
 downward directed atmospheric fluxes (precipitation, solar radiation) from MM5 to TOPLATS. The
 design of the coupler also accounts for the need to disaggregate fluxes passed from MM5 to
 TOPLATS, and to aggregate fluxes passed from TOPLATS to MM5, preserving  conservation of
 mass and energy. The coupler is not a single subroutine but rather a set of subroutines that, when
 implemented in MM5 and TOPLATS, performs the desired functions.
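
A minimal sketch of the aggregation/disaggregation idea is given below, assuming an intensive flux (such as latent heat in W m-2) aggregated by area weighting and a coarse value copied uniformly back to the pixels; it illustrates the conservation requirement only and is not the project's coupler code:

// Minimal sketch (not the actual coupler): area-weighted aggregation of a
// fine-grid flux to a coarse cell, and uniform disaggregation back, chosen so
// that the area-integrated quantity is conserved in both directions.
#include <cassert>
#include <cstdio>
#include <vector>

// Aggregate an intensive flux (e.g., W m-2) from fine pixels to one coarse cell.
double aggregate(const std::vector<double>& flux, const std::vector<double>& area) {
    assert(flux.size() == area.size());
    double total = 0.0, areaSum = 0.0;
    for (size_t i = 0; i < flux.size(); ++i) {
        total += flux[i] * area[i];     // integrate the flux over pixel areas
        areaSum += area[i];
    }
    return total / areaSum;             // coarse-cell mean flux
}

// Disaggregate a coarse value (e.g., a grid-cell precipitation rate) to pixels.
// Copying the value to every pixel keeps the area-integrated total unchanged.
std::vector<double> disaggregate(double coarseValue, size_t nPixels) {
    return std::vector<double>(nPixels, coarseValue);
}

int main() {
    std::vector<double> latentHeat = {120.0, 95.0, 130.0, 80.0};   // W m-2 per pixel
    std::vector<double> pixelArea  = {1.0, 1.0, 2.0, 1.0};         // km2 per pixel
    double coarse = aggregate(latentHeat, pixelArea);
    std::printf("coarse-cell latent heat flux: %.2f W m-2\n", coarse);

    std::vector<double> rainOnPixels = disaggregate(2.5 /* mm h-1 */, latentHeat.size());
    std::printf("pixel rain rate: %.2f mm h-1\n", rainOnPixels[0]);
    return 0;
}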

 The initial coupler was tested using a 1-dimensional column version of MM5 for a case-study that
 occurred in the Little Washita Watershed in 1994. Results  from this study clearly suggested that the
 temporal variability of soil moisture in TOPLATS plays a  crucial role in accurate  estimation of
 sensible and latent heat fluxes, which in turn determine ABL characteristics. A summary of these
 results will be shown.

 A significant amount of software re-engineering was also conducted on the TOPLATS model code
 in order to make it more computationally efficient. Code was re-organized in order to support
 parallelization, and parallel directives were added, tested and validated. In addition, by making use of
 "hydrological similarity," the code was optimized to eliminate pixel-based redundant calculations,
 improving the code performance by a factor of nearly 1000. Further, a restart capability was added,
 I/O was re-engineered by implementing EPA's Models-3  I/O API, and I/O API filters were built
 to support import of a variety of data types  from both stations and remote-sensing platforms,
 including NEXRAD rainfall estimates. Together, this set of improvements has made it possible to
 regionalize TOPLATS' application domain.
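
The sketch below illustrates the general idea behind such a similarity-based optimization (binning pixels by a topographic index, running the costly calculation once per bin, and broadcasting the result); the binning and the placeholder calculation are invented for illustration and do not reproduce TOPLATS internals:

// Illustrative sketch of the "hydrological similarity" optimization: pixels in
// the same similarity class (here, a binned topographic index) share one
// representative calculation instead of repeating it for every pixel.
#include <cmath>
#include <cstdio>
#include <map>
#include <vector>

double expensiveUpdate(double topoIndex) {
    // placeholder for a costly per-column water-balance calculation
    return 0.4 * std::tanh(topoIndex - 6.0) + 0.5;   // fake soil wetness
}

int main() {
    std::vector<double> topoIndex = {5.1, 5.2, 7.8, 5.15, 7.75, 9.3};  // one per pixel
    const double binWidth = 0.5;

    // 1) assign each pixel to a similarity class
    std::vector<int> classOf(topoIndex.size());
    for (size_t i = 0; i < topoIndex.size(); ++i)
        classOf[i] = static_cast<int>(std::floor(topoIndex[i] / binWidth));

    // 2) run the expensive calculation once per class, not once per pixel
    std::map<int, double> classResult;
    for (int c : classOf)
        if (!classResult.count(c))
            classResult[c] = expensiveUpdate((c + 0.5) * binWidth);

    // 3) broadcast each class result back to its member pixels
    for (size_t i = 0; i < topoIndex.size(); ++i)
        std::printf("pixel %zu (class %d): wetness %.3f\n",
                    i, classOf[i], classResult.at(classOf[i]));
    return 0;
}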

At present, tests of the fully 3-D coupled system are underway, though no results will be presented
at the conference. Currently, the project is focusing on three areas: the ARM-CART domain, the
Neuse River basin, and the Houston-Galveston area. Though funding for the current grant will
come to a close within the next year, follow-on projects have been funded based on the success of
the grant work. The state of all of these projects and future goals will be outlined.

-------
A High  Performance Analytic  Element  Model:  GIS  Interface,  Calibration
Tools, and Application to the Niagara Falls Region

Alan J. Rabideau, Matthew W. Becker, Douglas M. Flewelling, and Igor Jankovic
Department of Civil, Structural, and Environmental Engineering, email: rabideau@eng.buffalo.edu
State University of New York at Buffalo, Buffalo, NY

The  presentation will summarize the proposed activities of a recently funded EPA STAR project.
The primary project tasks include:
•  Development of a high performance groundwater flow model based on  the Analytic Element
   Method (AEM), with an emphasis on MPI-based portability.
•  Development  and comparison  of automated  calibration/optimization  tools  based  on:  1)
   nonlinear regression and 2) the genetic algorithm.
•  Development of a GIS-based graphical user interface for the  modeling system.
•  Extensive testing of the modeling system on 3 high-performance parallel  computing platforms:
   the SGI Origin, the IBM SP, and a heterogeneous Sun workstation cluster.
•  Application of the modeling tools to a case study of regional groundwater flow in the Niagara
   Falls/Lake  Ontario region, including comparisons with previous efforts  based on the traditional
   finite difference approach.

The  presentation will also include a brief  overview of the  AEM for simulating groundwater flow,
with  an emphasis on large scale regional applications.
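
For readers unfamiliar with the AEM, the sketch below shows its central idea in the simplest textbook form: the discharge potential of a regional uniform flow and a few extraction wells is obtained by superposing the closed-form solution of each element. It is offered only as an illustration of the method, not as part of the project's high-performance model:

// Textbook-style sketch of the analytic element idea (not the project's code):
// superpose the closed-form potential of each element (uniform flow plus wells).
#include <cmath>
#include <cstdio>
#include <vector>

const double kPi = 3.14159265358979323846;

struct Well { double x, y, Q; };     // location and pumping rate [m3/d]; Q > 0 extracts

// Discharge potential Phi(x, y) relative to an arbitrary reference constant.
double potential(double x, double y, double qx0, const std::vector<Well>& wells) {
    double phi = -qx0 * x;           // uniform regional flow in the +x direction
    for (const auto& w : wells) {
        double r = std::hypot(x - w.x, y - w.y);
        phi += w.Q / (2.0 * kPi) * std::log(r);   // Thiem-type well element
    }
    return phi;
}

int main() {
    std::vector<Well> wells = {{0.0, 0.0, 500.0}, {800.0, 200.0, 250.0}};
    double qx0 = 0.8;                // regional discharge per unit width [m2/d]
    for (double x = -1000.0; x <= 1000.0; x += 500.0)
        std::printf("Phi(%7.1f, 100) = %10.2f\n", x, potential(x, 100.0, qx0, wells));
    return 0;
}
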
                                           10

-------
Spatially Distributed Simulation of Hydroecological Processes at the Lizzy
Site: Model Structure, Preliminary Applications and Coupling Issues

Larry Band and David Tenenbaum
University of North Carolina, Chapel Hill, lband@email.unc.edu

We present the preliminary applications and design of a spatially distributed, hydroecological model
of water, carbon and nutrient (WCN) cycling and transport to the Lizzy site. While not all required
field data for model parameterization are available for this site, the form and operation of the model
are illustrated to engender discussion of issues for coupling with groundwater, atmospheric and
confined animal feed  operation  (CAFO) models.   The process models for WCN cycling and
transport are coupled with a GIS which functions to set up and parameterize the spatially distributed
model, manage simulation runs  and visualize  results. The modeling system identifies a set of
terrestrial, aquatic and atmospheric containers, which may be  further broken down  into spatial
hierarchies  of land (or water)  units.  In the terrestrial phase, the  model  contains successively
embedded basins, slopes, patches and (canopy) strata.  WCN cycling is modeled at the  level of
patches and strata, while lateral transport through hydrologic flowpaths occurs between patches at
the levels of hillslopes and basins. The aquatic phase is described by the components  of the stream
network hierarchy, and contains all stream reaches, natural and built water bodies. We will describe
preliminary operations and results from the hydrological portions of the model applied to the small
monitored catchments in Lizzy.
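
A minimal sketch of the containment structure described above (basins containing hillslopes, patches, and canopy strata, with fluxes computed at the patch/stratum level and aggregated upward) is given below; the class names and the toy uptake rate are invented for illustration and are not the model's code:

// Sketch only (invented class names): one way to express the basin -> hillslope
// -> patch -> canopy stratum containment, with cycling computed at the
// patch/stratum level and aggregated upward through the hierarchy.
#include <cstdio>
#include <vector>

struct Stratum {                       // canopy layer: does the local cycling work
    double leafAreaIndex;
    double dailyCarbonUptake() const { return 2.1 * leafAreaIndex; }  // toy rate, gC/m2/day
};

struct Patch {
    std::vector<Stratum> strata;
    double dailyCarbonUptake() const {
        double total = 0.0;
        for (const auto& s : strata) total += s.dailyCarbonUptake();
        return total;
    }
};

struct Hillslope { std::vector<Patch> patches; };
struct Basin     { std::vector<Hillslope> hillslopes; };

// Aggregate patch-level fluxes up the spatial hierarchy.
double basinCarbonUptake(const Basin& b) {
    double total = 0.0;
    for (const auto& h : b.hillslopes)
        for (const auto& p : h.patches)
            total += p.dailyCarbonUptake();
    return total;
}

int main() {
    Patch p1, p2;
    p1.strata = { {3.5}, {1.2} };      // two canopy strata
    p2.strata = { {2.0} };             // one canopy stratum
    Hillslope slope;
    slope.patches = { p1, p2 };
    Basin basin;
    basin.hillslopes = { slope };
    std::printf("basin carbon uptake: %.2f gC/m2/day (summed over patches)\n",
                basinCarbonUptake(basin));
    return 0;
}
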
                                           11

-------
Issues and Data Needs for Introducing Nitrogenous Pollutants from Swine
Waste Land Application Into An Object-Oriented Ecosystem Model

Harvey Jeffries, Steve Whalen, and Heather Fraser
University of North Carolina, Chapel Hill

Our work begins with the Regional Hydro-Ecologic Simulation System model (RHESSys), which is implemented as a typical procedural-based code
operating on global data structures. Conceptually, the model has many natural hierarchies including
spatial nesting  of  landscape and  layering of land cover.  While this model provides  an excellent
conceptual framework for studying the effects of swine waste application to land, it is difficult for
new users  to understand and even harder for them to extend it.  We  will describe a substantial
reengineering of this  software into a fully  object oriented system.  The entire model has been
subdivided into a series of namespaces and class scopes and no global data or global procedures
remain. Substantial use has been made of C++ Standard Library components, as well as our own
C++ library components that provide abstractions of whole-part aggregation and support for model
utility needs that are common across  several simulation codes. In addition, community-developed
libraries that support  the processing of extensible Markup Language (XML) are incorporated to
provide for the external representation and easily extensible  parsing  of  a series  of supporting
ecophysiological variable data sets used to initialize the model. All of these libraries are used to
create and manage the spatial hierarchies via multiple inheritance from template base classes. The
need for an explicit "atmosphere" object was identified and this functionality was abstracted out of
the spatial  hierarchy and the ecosystem process representations. At  the  lowest levels of the spatial
hierarchy,  where the  ecosystem process representations reside, we have applied abstraction and
encapsulation principles to produce abstract classes that provide the necessary interfaces for  the
model to function. Within the smallest spatial object, an aspatial vertical structure of above-, on-,
and below-ground ecosystem or physical system abstract objects has been implemented. Concrete
classes that implement the physical and ecosystem processes are derived from these interface classes.
The existing procedural  process codes (with significant changes in the arguments) have been used
(mostly as private parts of the classes) to implement these concrete classes and these provide one
example of how different process  representations might be used in the model. This new design will
significantly improve the education of  new users and developers, permit ready modification of the
system, and can more easily provide for coupling with  other environmental models. We will also
present a conceptual model detailing potential fates of nitrogen in land-applied liquid lagoonal swine
effluent. We will summarize the results  of field observations and laboratory, process-oriented studies
regarding rates and environmental controls on nitrification, denitrification, N-mineralization and
N2O emission on representative spray fields.  Further, we will provide an overview of ongoing
research directed toward assessing rates and controls of plant assimilation, microbial immobilization
and off-site transport of N via NH3 volatilization and surface and ground waters. Collectively, these
data will provide the information necessary to formulate and quantify significant parameters in the
model.
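
The fragment below sketches the kind of abstract-interface/concrete-process design described above, using an invented below-ground nitrogen process interface and a toy nitrification implementation; it illustrates the design pattern only and is not drawn from the reengineered RHESSys code:

// Hedged sketch (names invented): an abstract interface for a below-ground
// nitrogen process with one concrete implementation, mirroring the abstract
// base / concrete process class design described in the abstract.
#include <cstdio>
#include <memory>
#include <vector>

struct SoilState {
    double nh4;     // ammonium pool, kg N / ha
    double no3;     // nitrate pool,  kg N / ha
    double tempC;   // soil temperature
};

// Interface every below-ground N process must satisfy.
class NitrogenProcess {
public:
    virtual ~NitrogenProcess() = default;
    virtual void step(SoilState& s, double dtDays) const = 0;
};

// Concrete process: first-order, temperature-modified nitrification (toy rates).
class Nitrification : public NitrogenProcess {
public:
    explicit Nitrification(double ratePerDay) : k_(ratePerDay) {}
    void step(SoilState& s, double dtDays) const override {
        double tempFactor = s.tempC > 5.0 ? 1.0 : 0.1;       // crude inhibition below 5 C
        double converted = k_ * tempFactor * s.nh4 * dtDays; // NH4+ oxidized to NO3-
        if (converted > s.nh4) converted = s.nh4;
        s.nh4 -= converted;
        s.no3 += converted;
    }
private:
    double k_;
};

int main() {
    SoilState soil{12.0, 3.0, 18.0};
    std::vector<std::unique_ptr<NitrogenProcess>> processes;
    processes.push_back(std::make_unique<Nitrification>(0.05));

    for (const auto& p : processes) p->step(soil, 1.0);      // advance one day
    std::printf("NH4: %.2f, NO3: %.2f kg N/ha\n", soil.nh4, soil.no3);
    return 0;
}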
                                            12

-------
Development  of  a   Surface   Water   Object-Oriented  Modeling  System
(SWOOMS) for the Neuse River Estuary, North Carolina

Rick Luettich1, Jim Bowen2, and Marc Alperin3
1 University of North Carolina at Chapel Hill, Institute of Marine Sciences, 3431 Arendell St., Morehead City, NC,
28557, email: rick_luettich@unc.edu
2 Department of Civil Engineering, Univ. of North Carolina at Charlotte, Charlotte, NC, 28223
3 Department of Marine Sciences, University of North Carolina at Chapel Hill, Chapel Hill, NC, 27599-3300
The Neuse River Basin has experienced unprecedented growth in the past 50 years.  Changing and
diversifying land use practices  that include forestry, agriculture, industry, and  urbanization have
placed increasing pressure on estuarine and coastal habitats to accommodate anthropogenic inputs
of nutrients.  It is widely believed that these pressures have resulted in the long-term degradation of
water quality (e.g., increased nuisance and/or harmful algal blooms and fish kills) in the Neuse River
Estuary (NRE).  In an attempt to reverse  this trend, the NC General Assembly passed legislation in
1997 requiring that a 30% reduction in external nitrogen (N) loading (based on a mean 1990-1995 N
loading "cap") be in place  by 2003.  However, there is little evidence of  how effective these
measures will be in improving water quality in the NRE. It is also very unclear how to equitably and
efficiently manage the highly diversified nutrient sources (e.g., automobile, smokestack and waste
lagoon emissions that are transported through the air, agricultural and urban runoff  that are
transported through the watershed, and point source discharges that are introduced directly into the
streams and rivers), to achieve a desired nutrient loading to the estuary.

Our overall objective is to develop a prototype Surface Water Object-Oriented Modeling System
(SWOOMS) that will allow us to (1) model and understand the ecological response of the NRE to
varying nutrient  (primarily N) loading and (2)  integrate this estuary model within a larger suite of
models that can track nutrients from their source in the airshed and watershed to  their ultimate
arrival and impacts within the estuary.  Existing  estuarine water quality models have not been
designed to be part of such a modeling suite and therefore they are poorly suited for linking with
dynamic models  of other environmental media.  SWOOMS will be developed using object-oriented
design and implementation together with collaboration with researchers developing models in other
media (i.e., the watershed, ground  water and  atmosphere) to yield an  integrative  environmental
multi-media modeling system.

Specific  components   of  the  SWOOMS  project   are  to   (1)  develop   a   finite-volume
hydrodynamic/transport model, (2) integrate a  water column, biogeochemical, water quality model
with the hydrodynamic/transport model, (3) develop a sediment diagenesis model that is integrated
with the water quality model, and  (4) build the resulting surface water modeling system using an
object-oriented framework that promotes seamless input/output with object-oriented models of the
atmosphere, watershed, and groundwater media.
                                             13

-------
Mechanistic-based Ecosystem Exposure Modeling on a Watershed Scale

Gour-Tsyh (George) Yeh
University of Central Florida, email: gyeh@mail.ucf.edu

The approaches to environmental transport and hydrological simulations on a watershed scale can
be classified into three broad groups: stochastic  methods, parametric methods, and physics-based
mathematical methods. In the past 30 years, the watershed modeling communities have employed
parametric-based models (the most famous being HSPF; other parametric models, e.g., SWMM,
CREAMS, STORM, ANSWERS, and SWRRBWQ, are similar to HSPF) for watershed management
and assessment including ecological exposure assessments and TMDL calculations.  Evolved from
the pioneer model STANFORD WATERSHED IV, HSPF has dominated watershed simulations in
the past 20 years.  Physics-based, process-level chemical transport and hydrological models have
been practically nonexistent until recently. It is easy to see that only the physics-based, process-level
contaminant  and  sediment transport and  fluid  flow models  have the potential to further the
understanding of the fundamental biological, chemical and physical factors that take place in nature,
and give definite and consensus  predictions.  It is  precisely for this reason that EPA ecological
research strategies (EPA, 1998) clearly stated that first-principles physical models should be used
in ecological system assessment on a watershed scale.   This presentation will discuss the essential
elements that must be included for a model to be first-principle based.  In particular, biochemical
and geochemical processes must be dealt with using reaction-based approaches and rate equations
must  be  mechanistically formulated.    The  presentation  will  address  and  emphasize  the
computational approaches that must be taken to enable the application of such a model on the field
scale. This includes the development of innovative numerical algorithms to handle moving sharp
fronts, the implementation of high-performance computation to speed up calculations by several
orders of magnitude, and the inclusion of interactive graphical pre-processing and post-processing
visualizations to
facilitate the application to complex watersheds and subsurface media.
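
As an illustration of what a reaction-based, mechanistically formulated rate law looks like (a generic textbook example, not an equation from the model discussed here), nitrification of ammonium can be written with a single reaction rate and species equations built from its stoichiometry:

R \;=\; k\,[\mathrm{NH_4^+}]\,\frac{[\mathrm{O_2}]}{K_{O_2} + [\mathrm{O_2}]},\qquad
\frac{d[\mathrm{NH_4^+}]}{dt} = -R,\quad
\frac{d[\mathrm{NO_3^-}]}{dt} = +R,\quad
\frac{d[\mathrm{O_2}]}{dt} = -2R

Here every species source/sink term is derived from the one reaction rate R through the stoichiometry of NH4+ + 2 O2 -> NO3- + H2O + 2 H+, which is what distinguishes a reaction-based formulation from an ad hoc, species-by-species parametric one.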
                                            14

-------
 Status of MIMS Architecture Design

 Steven Fine1*, Karl Castleton2, Steve Howard1, Joan Novak1*, Todd Plessel3, Jason Smith4
 1 Atmospheric Modeling Division, National Exposure Research Laboratory, USEPA, RTP, NC,
 email: fine.steven@epa.gov
 2 Ecosystems Research Division, National Exposure Research Laboratory, USEPA, Athens, GA
 3 Lockheed Martin
 4 University of North Carolina-Chapel Hill

 The Multimedia Integrated Modeling System (MIMS) is a multi-scale, cross-discipline modeling and
 decision support system that is intended to support ecosystem modeling and environmental health
 assessment. The MIMS architecture design team has collected information about MIMS
 requirements from stakeholders, including presentations and discussions from three prior
 multimedia modeling meetings, and from the team's own experience. The team has also surveyed
 other modeling systems and talked with a variety of model and modeling system developers. From
 this information the team has developed a draft MIMS architecture.

 While the architecture may continue to evolve through the life of MIMS, the team intends to have a
 version of the architecture by the end of September that could be used to begin development.
 Simultaneously the architecture design team is investigating technologies, such as object databases,
 and collaborations that could provide the capabilities required to implement parts of the
 architecture.

 Current MIMS Draft Conceptual Structure

 One representation of the architecture is the "conceptual structure" which shows abstractions of
MIMS's functional requirements and data sharing relationships among them. The conceptual
structure is represented as a description of entities and a diagram (which uses an informal notation)
among these entities. The lines on the diagram can be read as sentences starting with the entity at the
origin of the line, followed by the text along the line, and finally the text at the destination of the
line. For example, the line between the Session Manager and the System Administration Manager at
the left side of the diagram could be read as "Session Manager invokes  System Administration
Manager."
Functional Abstractions

Files. Files on a storage device.

Databases. Databases that contain indexed information.

External Data Sources. Data services external to the organization running MIMS, such as web sites
and Distributed Oceanographic Data System servers.
* On assignment from Atmospheric Sciences Modeling Division, National Oceanic and Atmospheric Administration,
U.S. Department of Commerce
                                             15

-------
 Concurrently Executing Data Processors. In many cases, required data will be produced by other
 data processors.

 Environmental Data Request Satisfier. Accepts requests for environmental data and determines if
 and how the request can be satisfied. Sources of data include files, databases, external data sources,
 and programs that are expected to be running. Some requests might imply aggregating, subsetting, or
 transforming the data. The Satisfier returns an object that provides data that satisfies the request.

 Environmental Data I/O and Transformation Library. This library provides routines to read, write,
 subset, aggregate, and interpolate environmental data. It provides one or more APIs to support field
 data, tabular data, and arbitrary sets of parameters. Instances can be constructed by the
 Environmental Data Request Satisfier.

 Persistent Object Storage. This provides storage for objects used within MIMS. It may only be
 accessible from object-oriented languages.

 Analysis and Visualization: Performs analyses of data and prepares visualizations and reports to be
 used by people. A type of visualization tool that might require additional thought is one that both
 displays model results and allows the user to change inputs to models. For example, a control
 strategy development tool might display the cost and resulting air quality for a set of control
 strategies and allow the user to change the strategy and invoke a new simulation. It's possible that
 this would be a type of iteration controller.

 Data Processor. One or more computational processes that perform transformations of data.
 Examples include what are typically called preprocessors, models, and postprocessors. Data
 processors can be executables, scripts, or collections of data processors. If the architecture allows
 data processors to be part of the framework's process space, then a data processor might simply be
 an object. Each data processor should be able to describe or should contain metadata that describe
 the resources that are required including data sets, computers, and GUIs.

Module. One or more subroutines that conform to a standard calling interface and paradigm that is
 designed to support interchangeable modules. An example of modules is alternative algorithms for
 advection that could be plugged into a model and interchanged. There could be a number of
 standard calling interfaces and paradigms, each supporting a different type of data processor (e.g., air
 quality model, meteorological model, groundwater model, emissions processor). Each module
should be able to describe or should contain metadata that describe the resources that are required
including data sets, computers, and GUIs.
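
The sketch below illustrates the module concept in code form; the interface, the metadata structure, and the upwind advection example are hypothetical and are meant only to show how interchangeable modules could conform to one calling interface while describing their own resource needs:

// Sketch of the idea only (invented types, not the MIMS design itself): a
// module conforms to a standard calling interface and can report metadata
// describing the resources it requires.
#include <cstdio>
#include <string>
#include <vector>

struct ModuleMetadata {
    std::string name;
    std::vector<std::string> requiredDataSets;   // e.g. wind field, grid definition
    std::vector<std::string> requiredHosts;      // e.g. "any", "mpi-cluster"
};

struct GridField { std::vector<double> values; double dx; };

// Standard calling interface for interchangeable advection modules.
class AdvectionModule {
public:
    virtual ~AdvectionModule() = default;
    virtual ModuleMetadata describe() const = 0;
    virtual void advance(GridField& c, double windSpeed, double dt) const = 0;
};

// One interchangeable implementation: first-order upwind advection.
class UpwindAdvection : public AdvectionModule {
public:
    ModuleMetadata describe() const override {
        return {"upwind_advection", {"wind_field", "grid_definition"}, {"any"}};
    }
    void advance(GridField& c, double u, double dt) const override {
        std::vector<double> next = c.values;
        for (size_t i = 1; i < c.values.size(); ++i)       // u > 0 assumed
            next[i] = c.values[i] - u * dt / c.dx * (c.values[i] - c.values[i - 1]);
        c.values = next;
    }
};

int main() {
    UpwindAdvection module;
    ModuleMetadata meta = module.describe();
    std::printf("module %s needs %zu data set(s)\n",
                meta.name.c_str(), meta.requiredDataSets.size());

    GridField conc{{1.0, 0.0, 0.0, 0.0}, 1000.0};          // puff in the first cell
    module.advance(conc, 5.0, 60.0);                        // 5 m/s wind, 60 s step
    std::printf("after one step: %.2f %.2f %.2f %.2f\n",
                conc.values[0], conc.values[1], conc.values[2], conc.values[3]);
    return 0;
}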

Execution Manager. Used by Iteration Controllers (IC) and Conceptual Simulation Editors (CSE) to
control execution of Data Processors.

Component Mediator. This  provides configuration management functionality, where configuration
items are Iteration Controllers, Data Processors, and Modules. It likely will be the Concurrent
Versions System (CVS) or built on top of CVS, which will allow users to perform configuration
management without using the MIMS framework.
                                             16

-------
System Administration Manager. Provides administration services, such as registering users and
hosts.

Iteration Controller. An entity that understands the characteristics of data processors, modules, and
other iteration controllers; provides values for inputs that the user has specified will vary; starts the
execution of the data processors and other iteration controllers; and determines if iteration is
required, possibly by examining the results of previous runs. Examples of iteration controllers
include a Monte Carlo simulation controller, a calibration tool, an iteration tool (e.g., repeat runs for
different chemicals or time periods), an automated testing facility, and an optimization package.

Conceptual Simulation Editor. This includes functionality for building models, connecting models
(planning computational studies), and specifying relevant domain, chemicals, date/time, output time
steps, output parameters, scale, etc. It is undecided if this will include specification of a timeline of
events  (e.g., land development), or if that functionality belongs in an Iteration Controller or a model.

Session Manager. The Session Manager is what is initially started when MIMS is invoked. It
performs authorization checks before starting other MIMS components. It might also keep track of
which components and windows were last open  and reopen those items.

Monitor. System monitors run on each platform where iteration controllers and data processors run,
invoke those components at the request of the execution manager, and indicate when those
components have completed. The monitor also returns information about the capabilities of the
host on which it runs.
                                             17

-------
[Diagram (page 18): MIMS draft conceptual structure - functional entities and their data-sharing relationships]
-------
 Dynamic Information Architecture System (DIAS)

 Jayne Dolph
 Decision and Information Sciences Division, Argonne National Laboratory, email: dolph@anl.gov

 A major challenge in modeling and simulation is the need to account for the complex, dynamic, and
 multi-scale nature of "real-world" systems for the purpose of experimentation, problem solving and
 decision support. Over the past 7 years, DIS has  developed the Dynamic Information Architecture
 System (DIAS), an object-oriented framework with capabilities for attacking complex modeling and
 simulation problems. The main components of any DIAS simulation are 1) software objects (Entity
 objects) that represent the real-world entities that  comprise the problem space; 2) simulation models
 and other applications that express the dynamic behaviors of the domain entities; and 3) the unique
 DIAS infrastructure that makes it feasible to build and manipulate complex simulation scenarios in
 which Entity objects can interact via numerous concurrent dynamic processes.

 DIAS extends the Object paradigm of encapsulation and inheritance by abstraction  of the objects'
 dynamic behaviors, separating the "WHAT" from the "HOW." DIAS object class definitions
 contain an abstract description of the various aspects of the object's behavior (the WHAT), but no
 implementation details (the HOW).  Separate DIAS infrastructure objects carry the implementation
 of object behaviors (the HOW) and provide the  linkage to models/applications. These models  are
 linked to appropriate domain  objects "on the fly", to meet  specific needs of a given simulation
 context.  Therefore, in a DIAS simulation, models communicate only with domain (Entity) objects,
 never directly with each other. From a software  perspective,  this makes it easy to add models, or
 swap alternative models in and out without major re-coding. This gives DIAS the ability to scale
 very well to increasingly complex problems. To adequately address the scientific domain of these
 new models, however, requires that intelligent domain (discipline) expertise be used in Entity object
 design.
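
One minimal way to picture this separation (with invented class names; this is not DIAS code) is an entity object that declares a behavior but delegates its implementation to a separately registered object bound at simulation setup time:

// Illustrative sketch only: separating an entity's abstract behavior (the
// "WHAT") from a swappable implementation (the "HOW") bound "on the fly".
#include <cstdio>
#include <memory>
#include <string>

// The HOW: alternative implementations of one behavior.
class EvapotranspirationBehavior {
public:
    virtual ~EvapotranspirationBehavior() = default;
    virtual double dailyET(double tempC, double soilMoisture) const = 0;  // mm/day
};

class SimpleTemperatureET : public EvapotranspirationBehavior {
public:
    double dailyET(double tempC, double soilMoisture) const override {
        return 0.2 * tempC * soilMoisture;          // toy empirical formula
    }
};

// The WHAT: a domain entity that declares the behavior but owns no algorithm.
class LandParcelEntity {
public:
    explicit LandParcelEntity(std::string name) : name_(std::move(name)) {}
    void bindBehavior(std::shared_ptr<EvapotranspirationBehavior> b) { et_ = std::move(b); }
    double evapotranspire(double tempC, double soilMoisture) const {
        return et_ ? et_->dailyET(tempC, soilMoisture) : 0.0;
    }
    const std::string& name() const { return name_; }
private:
    std::string name_;
    std::shared_ptr<EvapotranspirationBehavior> et_;   // linked at setup time
};

int main() {
    LandParcelEntity parcel("spray_field_7");
    parcel.bindBehavior(std::make_shared<SimpleTemperatureET>());  // swap-in model
    std::printf("%s: ET = %.2f mm/day\n", parcel.name().c_str(),
                parcel.evapotranspire(24.0, 0.35));
    return 0;
}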

Because of the wealth of existing legacy models/applications, it is important to promote model and
code  reuse.   Application  registration through  DIAS  provides  for  integration  of legacy-type
applications (e.g. simulation models,  GIS  applications, database  management systems) without
expensive reworking or recoding.  The DIAS Java-based Application Programmers Interface (API)
allows modelers to register models/applications and create DIAS simulation suites. Using the API,
developers can extend existing DIAS  Entity  objects, create new Entities  as  needed,  write the
"wrappers" for their existing models/applications, or code new models within the DIAS framework
DIAS external  applications  are executed in their native languages  (e.g. FORTRAN, Q etc.), and
DIAS  can  be used to integrate disparate models, databases, or applications that  can  reside  on
multiple machines across a network, including the  Internet.

Additional  functionality associated  with  DIAS  is the Framework for Addressing Cooperative
Extended Transactions, or FACET.  FACET is an object-based software framework that provides a
mechanism for  simulating complex patterns  of societal interaction of multiple entities (agents) over
time.  FACET models are analogous to flowcharts and can be used to implement societal behavior
patterns such as land management plans, business  practices, government or corporate policies,
clinical guidelines, etc. FACET models operate within DIAS, providing  dynamic, integrated societal
and natural process modeling capability.
                                            19

-------
Modular Modeling Framework for Multi-disciplinary Ecosystem Modeling

G.H. Leavesley, S.L. Markstrom, and R.J. Viger
U.S. Geological Survey, Denver, CO, email: george@usgs.gov

The interdisciplinary nature and increasing complexity of ecosystem and natural-resource problems
require the use of a variety of modeling approaches  that can incorporate knowledge from a broad
range of scientific disciplines. Optimal models for selected problems should be composed of a
combination of process-simulation modules that are most appropriate for a given set of problem
objectives, data constraints, and spatial and temporal scales of application. A modular modeling
framework is being developed to facilitate the development, evaluation, and application of such
models to a wide range of research and operational problems. The framework is an integration of
modular concepts and tools from the U.S. Geological Survey's Modular Modeling System (MMS)
and the Friedrich Schiller University's Object Modeling System (OMS). The framework uses the Java
programming language to provide an object-oriented approach and platform independence. Model
resource definition and the integration of database and GIS technologies are supported through the
use of the extensible Markup Language (XML).
                                           20

-------
 Development of Object-based Simulation Tools for Distributed Modular
 Ecological Modeling

 John Bolte1 & Timothy Budd2
 1 Associate Professor, Bioresource Engineering Department, Oregon State University, email: boltej@engr.orst.edu
 2 Associate Professor, Computer Science Department, Oregon State University

 Ecological models, and supporting computing technology, have evolved to the point where it is
 feasible to develop truly modular approaches to model construction and implementation.  A
 consistent, robust framework is needed to make this technology readily accessible to the ecological
 modeling community. The primary objective of the study is to provide this technology by extending
 an existing object-oriented simulation framework for ecological modeling in four focused areas: 1)
 underlying object-based technology for implementing and coordinating collections of simulation
 objects (modules) in an integrated simulation environment, 2) interobject communication
 technology supporting both single-machine and network-based communication between
 components of a modular simulation, 3) spatial and nonspatial data input, collection, analysis and
 visualization, and 4) development of visual programming tools for rapid definition and assembly of
 model modules into complete, fully functional ecological models and visual interpretation of model
 results.

 This research will result in a set of tools for automating and simplifying the process of developing
 interoperable object-oriented simulation models for spatially- and nonspatially-explicit ecological
 systems. The study will use current standard object-oriented languages (C++ and Java) and
 language-independent object technologies (CORBA/COM/DCOM) to refine and develop a
 collection of object classes and application programming interfaces (APIs), and visual development
 tools geared towards ecological modelers for simplifying and automating the development of
 ecological models which may be both spatially distributed and networked via the World-Wide Web.
 We will work with other modeling groups to coordinate activities and approaches to providing
 technology for modular model development and implementation. To test and evaluate the tools
 developed, the resulting simulation framework will be utilized to implement a watershed model with
 hydrologic, ecological and economic components currently being developed in a separate project.

 Deliverables from this project will include (1) a series of interface specifications and corresponding
 code to allow intermodule communication and identification to facilitate modular model
 development and assembly, (2) a simulation environment providing coordinated execution of module
 collections, each potentially running as a continuous or discrete object at variable timesteps, and (3) a
visual assembly tool capable of querying single-machine or networked model modules for identification
and interface specification, assembling these modules into complete simulation models,
executing the resulting model, and providing visual interpretation of the results.  These products will be
made available on the web for utilization by other modeling groups.
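
To make deliverable (1) concrete, the sketch below shows, with invented names, the kind of
self-describing module a visual assembly tool could query for identification and interface
specification before wiring modules together. It is illustrative only and not the project's actual
interface specification.

// Hypothetical self-describing module; names are invented for illustration.
interface ModuleDescriptor {
    String moduleName();              // identification
    String[] requiredInputs();        // interface specification: variables consumed
    String[] providedOutputs();       // interface specification: variables produced
}

/** Example description an assembly tool might retrieve locally or over a network. */
class StreamTemperatureModule implements ModuleDescriptor {
    public String moduleName()        { return "StreamTemperature"; }
    public String[] requiredInputs()  { return new String[] {"airTemp", "flow", "shading"}; }
    public String[] providedOutputs() { return new String[] {"waterTemp"}; }
}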
                                            21

-------
A Toolbox for Assembling Spatially-Explicit Multimedia Ecological Models
from Reusable Components

D. A. Weinstein, P. Woodbury, D. Swaney, R. Beloin, B. Gollands
Boyce Thompson Institute for Plant Research, Cornell University, email: daw@cornell.edu

Simulation models have become essential tools for evaluating the effect of human perturbations on
ecosystems.  We have developed a system for constructing ecological models for simulating the
inter-linking of processes, such as water flow and nutrient flow, and the spatially-explicit
connections between adjacent ecosystems. This  system is called the Ecological Component Library
for Parallel Spatial Simulation (ECLPSS), and can be viewed as a toolbox for ecological simulation
modeling. The goal of this framework is to permit users to design and build fully three-dimensional
spatially-explicit ecological models by connecting existing reusable components and readily creating
new components. In ECLPSS, models can be quickly assembled from a library of components, each
representing a single function or aspect of a process. These components only communicate via state
variables - they cannot communicate directly or with any hidden variables.  Because of this
constraint on model looping structure, components can be readily replaced without 'breaking' the
model.  This functionality makes it easy to test the influence of a given representation of a process
on the predictions, or to easily modify the model when understanding of a process improves. The
ECLPSS framework uses component-based techniques to simulate ecosystem and ecological
phenomena at multiple spatial scales. The framework is designed to take advantage of parallel
processing to enable simulation of many individuals in a large spatial domain at high speeds.  It is
designed to operate on multiple platforms and be used across networks via a World Wide Web-
based user interface.
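
The state-variable constraint can be illustrated with a brief sketch (invented names, not the ECLPSS
API): two alternative components implement the same contract and touch only named state
variables, so one representation of a process can replace the other without breaking the rest of the
model.

// Illustrative only; class and variable names are assumptions.
import java.util.Map;

interface EcosystemComponent {
    void update(Map<String, Double> state);   // communicate only through named state variables
}

class LinearDecomposition implements EcosystemComponent {
    public void update(Map<String, Double> state) {
        double litter = state.get("litterCarbon");
        double loss = 0.01 * litter;                        // first-order decay, illustrative rate
        state.put("litterCarbon", litter - loss);
        state.put("soilRespiration", loss);
    }
}

class TemperatureDependentDecomposition implements EcosystemComponent {
    public void update(Map<String, Double> state) {
        double litter = state.get("litterCarbon");
        double temp = state.get("soilTemperature");
        double loss = 0.005 * litter * Math.max(0.0, temp / 10.0); // alternative representation
        state.put("litterCarbon", litter - loss);
        state.put("soilRespiration", loss);
    }
}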

This system allows the user to separate chemical, physical, and biological concepts from simulation
support functionality. Hence biologists can readily construct new spatially-explicit models without
becoming  computer scientists, while computer scientists can improve the modeling support
framework without being ecologists.  The system assists ecologists to build robust spatially-explicit
simulations of ecological processes from a growing library of reusable components representing
terrestrial ecosystem processes.

With this system, scientists can create more reliable models of ecological processes by using
interchangeable components that can be easily replaced without interfering with other aspects of
model behavior.
                                            22

-------
Software Reuse and Cluster Computing Tools for Environmental Modeling

Suraj Kothari
Iowa State University, Electrical and Computer Engineering, email: kothari@iastate.edu

The Software Reengineering Laboratory (SRL) at Iowa State University has developed ParAgent, a
tool for automatic  parallelization of legacy codes.  The SRL has  assembled ParAgent  and other
complementary tools in a comprehensive environment to make cluster computing a reality for
long-range climate and environment modeling. The software tools and their applicability will be
described using the NCAR/Penn State MM5 as an  example. Results of performance experiments
on several Pentium clusters with different communication networks will be presented.
                                          23

-------
The Coupling-Mode Extensions of the Models-3 I/O API: Tools for Building
Coupled Cross-Media Models

Carlie Coats
Environmental Programs, MCNC-North Carolina Supercomputing Center, RTP, NC 27709, email: coats@ncsc.org

The Coupling-Mode Extensions were developed in the "Practical Parallel Computing" STAR
Cooperative Agreement to provide a robust and modeler-friendly set of tools for coupling multiple
cooperating environmental models.  Data is exchanged by means of a data access library in a
selective, direct-access fashion, so that the individual models do not need to know whether they
are running stand-alone, receiving their data from files, or whether they are running in parallel,
receiving data from other processes  running at the same time (on possibly different — even remote —
computer systems). Cooperating-process models for both hydrology-meteorology and numerical air
quality prediction have been constructed using the coupling-mode extensions of the Models-3 I/O
API.
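
The coupling idea can be sketched generically as follows. This is a conceptual Java illustration only,
not the Models-3 I/O API itself (which is a FORTRAN/C library): the point is that a model requests
a named variable for a time step through a single data-access call, and the library decides whether
the values come from a file or from a concurrently running process.

// Conceptual sketch; interface and class names are invented.
import java.util.Map;

interface DataAccess {
    double[] read(String variable, int timestep);   // selective, direct access by name and time
}

/** Stand-alone mode: values are read from a file (file handling omitted). */
class FileAccess implements DataAccess {
    public double[] read(String variable, int timestep) {
        return new double[0];   // placeholder: read the requested record from disk
    }
}

/** Coupled mode: values arrive from another model process running at the same time. */
class CoupledAccess implements DataAccess {
    private final Map<String, double[]> sharedBuffers;
    CoupledAccess(Map<String, double[]> sharedBuffers) { this.sharedBuffers = sharedBuffers; }
    public double[] read(String variable, int timestep) {
        return sharedBuffers.get(variable + "@" + timestep);   // placeholder for a synchronized exchange
    }
}
// A model written against DataAccess.read(...) need not know which mode it is running in.
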
Model Design for High Performance on Microprocessor Based Parallel
Computer Systems

Carlie Coats
Environmental Programs, MCNC-North Carolina Supercomputing Center, RTP, NC 27709, email: coats@ncsc.org

Several aspects of environmental model design and parallel decomposition have substantial effects
upon the computational performance of models running on microprocessor-based parallel computer
systems. The "Practical Parallel Computing" STAR Cooperative Agreement studied the sensitivity
of model performance to these aspects for several environmental models, across a range of several
vendors' computer systems. It also studied modeler-friendly means of parallelization on available
systems. We discuss these results.
                                          24

-------
High Performance Computing for Large Scale Environmental Flow and
Transport Processes

Tate T. H. Tsang
Department of Chemical & Materials Engineering, University of Kentucky, email: tsang@engr.uky.edu

The objective of this  work is to develop parallel algorithms for large scale environmental flow and
transport  processes.  We present  a  domain-decomposition based  Least-Squares  Finite Element
Method (LSFEM) which can provide implicit and fully coupled transient solutions to large scale,
three-dimensional Navier-Stokes equations and convection diffusion equations. The LSFEM leads
to a symmetric and positive definite  (SPD)  system of  linear equations. A parallel and matrix-free
conjugate  gradient method is  used  as  an  iterative solver for the SPD  linear systems. Parallel
computations of fluid flow problems with more than 4 million finite elements and 29 million
unknowns were carried out on an HP-Convex Exemplar X-class scalable computer (SPP 2200).
The efficiency of parallelization is about 65 to 86 percent.
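
Because the linear systems are symmetric and positive definite, a matrix-free conjugate gradient
iteration needs only a routine that applies the operator to a vector. The serial Java sketch below
illustrates that idea; the operator callback is an assumption, and the domain-decomposed parallel
solver actually used in this work is not shown.

// Illustrative serial matrix-free conjugate gradient for an SPD system A x = b.
import java.util.function.UnaryOperator;

final class MatrixFreeCG {
    static double[] solve(UnaryOperator<double[]> applyA, double[] b, int maxIter, double tol) {
        int n = b.length;
        double[] x = new double[n];      // initial guess x = 0
        double[] r = b.clone();          // residual r = b - A x
        double[] p = r.clone();          // search direction
        double rsOld = dot(r, r);
        for (int k = 0; k < maxIter && Math.sqrt(rsOld) > tol; k++) {
            double[] Ap = applyA.apply(p);              // the only place the operator is needed
            double alpha = rsOld / dot(p, Ap);
            for (int i = 0; i < n; i++) { x[i] += alpha * p[i]; r[i] -= alpha * Ap[i]; }
            double rsNew = dot(r, r);
            double beta = rsNew / rsOld;
            for (int i = 0; i < n; i++) p[i] = r[i] + beta * p[i];
            rsOld = rsNew;
        }
        return x;
    }

    private static double dot(double[] a, double[] b) {
        double s = 0.0;
        for (int i = 0; i < a.length; i++) s += a[i] * b[i];
        return s;
    }
}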

We formulate a LSFEM for large eddy simulation (LES) of turbulent flow and transport processes.
The formulation  is based on a dynamic subgrid scale model, which can capture the dynamics of the
wall layer and provide correct profiles of eddy viscosity and eddy diffusivity.

Other  preliminary results  obtained  by  the LSFEM include  multiphase flow  in porous  media,
pollutant dispersion in convective boundary layers, concentration fluctuations and reactive plumes.

Other  activities  in this  project include parallel computations  of open  channel flows on  the
Roadrunner cluster.
                                           25

-------
High-Performance Modeling of Estuarine and Coastal Dynamics

Y. Peter Sheng1, Justin Davis1, S. Rajasekeran2, and J. Luo2
1 Civil & Coastal Engineering Department, University of Florida, Gainesville, FL, email: pete@coastal.ufl.edu
2 Computer and Information Science and Engineering Department, University of Florida, Gainesville, FL

In order to develop the ability to predict the long-term response of estuarine and coastal environments to
natural and anthropogenic changes, high-performance environmental models are needed.  Multiple-
year simulations using existing multi-dimensional environmental models require excessive computer
time on conventional single-CPU computers.  This work examines the application of high-
performance computing to models of estuarine and coastal dynamics.  Both the shared-memory
approach and the message-passing approach (PVM and MPI) have been tested with 1-D, 2-D and 3-D
estuarine and coastal models on several different platforms (SGI Origin-2000, Sun Enterprise,
Beowulf Cluster, and a cluster of various UNIX workstations).

This paper focuses on the application of high-performance modeling techniques to the curvilinear-
grid multi-dimensional model CH3D, which was the cornerstone of the Chesapeake Bay model.
Based on our tests, the shared-memory approach is very effective on the SGI Origin-2000, due to the
particular solution algorithm deployed by CH3D.  However, CH3D is not very amenable to
parallelization on the Beowulf Cluster at UF.  A new approach which uses a very fine rectangular
grid and special parallel algorithms has been found to be very effective.  A new parallel matrix
inversion routine that we developed is being implemented in the model.
                                           26

-------
Novel Techniques for Parallel Simulations

S. Rajasekeran
University of Florida, email: raj@cise.ufl.edu

In this  talk we discuss two novel ideas that have proven useful in coastal simulations. The first is a
simple  parallel algorithm for solving banded linear systems. This algorithm is optimal and practical.
The second is a paradigm called LESS_TALK. LESS_TALK can be used to speed up computations
in general on any parallel model of computing.
                                          27

-------
Tool-box for parallel simulation of flows in porous media

Raytcho Lazarov and S.Z. Tomov
Department of Mathematics, Texas A&M University, e-mail: lazarov@math.tamu.edu

For parallel simulation of flows in porous media we have implemented and tested a computer
system that is based on:
(a) discretization techniques utilizing finite elements and finite volumes;
(b) efficiently preconditioned iterative methods for the resulting large sparse system;
(c) error control and adaptive grid refinement; and
(d) parallel implementation on multiprocessor systems utilizing domain decomposition
concepts.

The tools that we have used include:
(1) stand-alone 3-D mesh generator (NETGEN);
(2) stand-alone partitioning and load balancing software (METIS);
(3) local error control based on a posteriori error analysis and subsequent refinement procedures;
(4) parallel methods based on domain decomposition and multigrid/multilevel subdomain
preconditioning;
(5) MPI and the OpenMP standards for parallel computations.

Examples of various tests on problems of flow and contaminant transport in porous media will be
presented.
                                           28

-------
 Development of a 1-km vegetation  database for modeling biogenic fluxes of
 hydrocarbons and nitric oxide

 Thomas E. Pierce*
 Atmospheric Modeling Division, National Exposure Research Laboratory, USEPA, Research Triangle Park, NC 27711
 email: pierce.tom@epa.gov

 Regional air quality modeling  systems  are composed  of an intricate network of programs that
 include emission processors, dry deposition algorithms,  and meteorological modules. Characterizing
 the land surface is critical for modeling biogenic emissions, dry deposition of gases and particles, and
 fluxes  of moisture, heat, and momentum.  Existing land use inventories, such as the popular USGS
 1-km satellite-derived database, lack information on the distribution of specific tree species and crop
 types.  Emissions of isoprene, an important hydrocarbon emitted from vegetation, vary by several
 orders of magnitude among deciduous tree species.  Similar variations occur with the emission of
 nitric oxide from different crop types.  Less pronounced, but still significant (about a factor of three),
 variations occur with stomatal resistance among different tree species  and crop types.  Stomatal
 resistance strongly influences dry deposition and surface  energy fluxes.   To support the USEPA's
 existing Community Multi-Scale Air  Quality (CMAQ) modeling system and evolving Multi-Media
 Integrated  Modeling System  (MIMS), we  have  developed a 1-km vegetation database for North
 America. This database integrates the USGS database with forest inventories collected by the U.S.
 Forest Service and crop statistics compiled by the U.S. Department of Agriculture.  We will
 demonstrate the value of characterizing specific tree species and crop types in the third generation of
 the Biogenic Emissions Inventory System  (BEIS3), which is  an important emissions processor for
 the CMAQ modeling system.
* On assignment from Atmospheric Sciences Modeling Division, National Oceanic and Atmospheric Administration, U.S.
Department of Commerce
                                            29

-------
Landscape Characterization and  Non-Point  Source Nitrogen  Modeling in
Support of TMDL Development in the Neuse River Basin, North Carolina

Ross S. Lunetta1, Jayantha Ediriwickrema2, Charles T. Garten3, Richard G. Greene4, John
Iiames1, David Johnson2, John G. Lyon5, Alexa McKerrow2, and Drew Pilant1
1 U.S. Environmental Protection Agency, National Exposure Research Laboratory (MD-56), Research Triangle Park, NC
27711, email: lunetta.ross@epa.gov
2 Lockheed Martin Services, Inc., Research Triangle Park, North Carolina 27709
3 National Research Council, U.S. Environmental Protection Agency, National Exposure Research Laboratory (MD-56),
Research Triangle Park, NC 27711
4 Oak Ridge National Laboratory, Environmental Sciences Division, Ecological Sciences Section, Oak Ridge, TN 37831
5 U.S. Environmental Protection Agency, National Exposure Research Laboratory, 944 E. Harmon, Las Vegas, Nevada
89193-3478

Pfiesteria-like toxic  blooms have been  implicated  as the causative agent responsible for numerous
outbreaks  of  fish lesions and fish kills in the Mid-Atlantic and southeastern United States.  An
increase in frequency, intensity, and severity of toxic blooms in recent years is thought to be a result
of surface water nutrient enrichment mediated by changing land-use practices.  The goal  of this
research is to apply land-cover/use information to quantify the extent and distribution of terrestrial
sources of nitrogen contributing to harmful algal blooms and possible Pfiesteria outbreaks.  This is
being accomplished by coupling high-resolution land-cover data sets with GIS-based nutrient
models that will be calibrated and validated using a stratified subset of Neuse River basin (NRB)
sub-watersheds.  A high-resolution land-cover/use data set has been developed using advanced
satellite-based remote sensor systems, including the new SPOT 4 (XS) and Landsat 7 Enhanced
Thematic Mapper Plus (ETM+) remote sensor systems, to provide basin-wide land cover/use data at
0.4 ha and 5.8 ha minimum mapping units (MMU), or landscape patches.  Also, new IKONOS sub-
meter stereo imagery is being used for the characterization of riparian zone vegetation structure.  NRB modeling
includes the development and implementation of a nitrogen mass balance model to quantify patch-
specific potential nitrogen sources, coupled with a hydrologic model for routing nitrogen to water
courses based on event-driven precipitation.
                                            30

-------
 Observations   and  Modeling   of  O3  and   SO2  Deposition  Velocity  over
 Agricultural and Forest Ecosystems: Synthesis of a Six-Year Field Program.

 Peter L. Finkelstein*
 Atmospheric Modeling Division, National Exposure Research Laboratory, USEPA, Research Triangle Park, NC 27709,
 email: finkelstein@epa.gov

 Dry deposition networks, such as the CASTNET network in the United States, usually measure
 concentration of pollutants and infer deposition velocity from a model.  In the CASTNET network, the
 multi-layer model (MLM) of Meyers is used.  To evaluate the MLM and other deposition velocity
 models, and to provide a database that can be used to improve our understanding of dry deposition
 processes, we launched a field program that was designed to directly measure fluxes and deposition
 velocity of O3 and SO2 over a variety of natural and managed ecosystems that were representative of
 major land use types. Agricultural studies have included two studies over pasture, one over corn,
 and two over soybean. One of the soybean studies lasted the full growing season of the crop, from
 planting to harvest, and included a major drought and a hurricane.  Three studies were conducted
 over forests: a six-week study over a pine plantation, and growing-season-long studies over a
 deciduous and a mixed coniferous-deciduous forest.  Additionally, a six-week study was conducted
 over a salt-water estuary. This paper will summarize and synthesize these studies.  It compares and
 contrasts the observations from each study, extracting the salient features and noting the similarities
 and differences in the deposition process for each pollutant between the  different cover types.
 Besides a general  "Climatology" of deposition velocity for  the sites, special attention is given to
 issues such as the impact of water on leaf surfaces, day/night differences between plant species, the
 differing diurnal cycles of deposition velocity over different plants, uptake by surfaces, and  other
 issues. The MLM was run using data from the field studies.  The modeling results are shown, and
 its strengths and weaknesses discussed. Finally, some thoughts on a new deposition velocity model
 that shows promise for improved predictions will be presented.
* On assignment from Atmospheric Sciences Modeling Division, National Oceanic and Atmospheric Administration, U.S.
Department of Commerce
                                             31

-------
Four-dimensional (4D) acquisition, visualization and analysis of Ground Penetrating
Radar data for the determination of three-dimensional hydraulic conductivity and
time-dependent fluid flow

Roelof Versteeg and Ralf Birken
Columbia University, email: Versteeg@ldeo.columbia.edu

The key problem in subsurface remediation is knowing what happens below the surface. If we knew
this, we could tailor our remediation and containment activities to the processes that occur. This
knowledge can partly be derived from modeling. However, modeling will always be limited by data
and thus what we  really need are data on subsurface processes. In all current approaches our data
on the subsurface is limited to the  results of single geophysical surveys and a small number of wells,
most of which are not even equipped with continuous logging probes.

Sponsored by EPA Grant  # GR825209-01-0, we developed and  refined a  methodology to use
multiple three-dimensional (3D) GPR surveys (4D surveys)  to image processes.  Our direction of
research was based on the recognition that 4D data do provide the potential to see processes but
also that  field 4D  data were  of  insufficient  quality to resolve many of the  issues involved in
understanding 4D  data. We thus chose to develop our methodology in a custom-created highly-
controlled environment (Columbia's subsurface imaging lab) in which we can  create and control
both the subsurface and the processes in the subsurface.  The lab is centered around a large tank
that can hold 20 tons of material, which can be emplaced in a relatively simple fashion.

This lab is completely automated from data acquisition to processing and web  based visualization
(all of which occur in near real time).  Thus, we can observe processes as they occur (which of
course allows us - at least in theory - to observe the effect of feedback systems on these
processes).  Once we collect high-density 4D data, the focus shifts to how to interpret 4D geophysical
data.  As the temporal density of our 4D datasets is one to two orders of magnitude higher than that of all
other 4D datasets collected anywhere else, we had to develop a new approach.  Our data allow us to
exploit the temporal density of the data to invert for the changes in our data (which are a result of
processes) to obtain information on the fluid flow.  This of course requires a fairly complex
visualization and computational approach, as we are manipulating data coming in at a rate of
approximately 20 MB/hour, and we have to invert these data for processes.  However, the amount of
data makes the inversion feasible (if not trivial).

As the fluid flow is related to the hydraulic conductivity, we can - with some assumptions -
derive a 3D distribution of hydraulic conductivity from our data.  While this methodology has
been developed in a controlled setting, the only problems in expanding this approach to the field
setting are engineering related, and we are now close to being able to image subsurface flow (and the
contamination or remediation associated with this) in field settings in real time.  One caveat is the
geophysical method used: while radar was appropriate for our setting, it is well known not to work in
all areas, and we may thus have to use other geophysical methods (e.g., acoustic or electrical) in other
settings.
                                            32

-------
Determination of aquifer recharge, ground-water flow and basin discharge:
methods and examples

Ted Mew1 and Tim Spruill2
1 North Carolina Department of Environment and Natural Resources, Raleigh, NC,
email: T_Mew@gw.dem.ehnr.state.nc.us
2 United States Geological Survey, Raleigh, NC

The water balance equation provides the basis for understanding movement of water and dissolved
matter through a watershed. Several hydrologic and chemical techniques can be combined to
understand a hydrologic system so that resulting models more accurately reflect controlling
processes taking place in a watershed. Useful techniques include hydrograph separation, isotope
analysis, and environmental tracers. A simple analysis using tritium data and hydrograph separation
is demonstrated which can be used to separate surface runoff, deep ground water and shallow
ground water contributions to streamflow. Environmental tracer data collected during the summer
of 1999 (chlorofluorocarbons) are presented which can be used to verify model-predicted
movement of groundwater through Coastal Plain aquifers on the Lizzie Study Site.
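
As a generic illustration of tracer-based separation (not the specific tritium, CFC, or
Rorabaugh-Daniel procedures used in this study), a two-component mass balance on streamflow and
tracer concentration yields the ground-water contribution directly, as in the sketch below.

// Two-component tracer mass balance:
//   Q = Qgw + Qro  and  Q*C = Qgw*Cgw + Qro*Cro
//   =>  Qgw = Q * (C - Cro) / (Cgw - Cro)
final class TracerSeparation {
    /**
     * @param q   total streamflow
     * @param c   tracer concentration measured in streamflow
     * @param cgw tracer concentration of the ground-water end member
     * @param cro tracer concentration of the surface-runoff end member
     * @return the ground-water (baseflow) contribution to streamflow
     */
    static double groundWaterFlow(double q, double c, double cgw, double cro) {
        return q * (c - cro) / (cgw - cro);
    }
}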

In order to demonstrate application of the hydrograph separation technique, a case study for
mapping ground-water recharge in North Carolina is presented. The method focuses  on the
drainage basin, dividing the landscape into upland flats, valley slopes, and valley bottoms. The
method is based on a conceptual understanding of ground-water flow within the basin, where water
moves beneath the land surface from upland recharge areas to lower riverine areas of ground-water
discharge. The building blocks of the recharge map are 1:24,000-scale soil mapping units digitized
from detailed county soil surveys. These mapping units are aggregated into "hydrogeologic areas"
having similar recharge characteristics.  Several factors govern the rate of ground-water recharge,
including: depth to the water table; slope of the land surface; and the  infiltration capacity of the
unsaturated soil profile.  Corresponding properties of soil mapping units are: drainage class; slope
gradient; and underlying geology. The Rorabaugh-Daniel streamflow recession-curve displacement
technique was used to estimate ground-water recharge in selected drainage areas at representative
USGS stream gauging stations.  These calculated recharge values were used, in turn, to calibrate
recharge rates for the different hydrogeologic areas using a Monte Carlo simulation technique.
Stream hydrograph separation techniques were used to determine the ground-water contribution to
streamflow.
                                            33

-------
Advancement of Environmental Decision Support Through HPCC

John Baugh1, Downey Brill1, Dan Loughlin1, Ranji Ranjithan1, and Steve Fine2*
1 Department of Civil Engineering, North Carolina State University, email: dhloughl@eos.ncsu.edu
2 North Carolina Supercomputing Center, RTP, NC 27709

Decision making for single- and cross-media environmental problems is often complex and tedious.
Competing design objectives and constraints, coupled with large quantities of data and sophisticated
simulation models,  produce problems that can seem overwhelming to a policy maker.  In some
cases, the time and effort spent to find feasible management strategies limits  the resources available
for exploring issues  such as cost, equity, and uncertainty. The fields of Decision Support Systems
(DSSs), Operations  Research, and Decision Theory offer concepts and techniques that promise to
facilitate the  design process, potentially yielding  more effective and efficient solutions, as  well as
allowing more  comprehensive consideration of complex design issues. Many such  techniques,
however, are highly computationally intensive.  While this has limited their application in the past,
high performance  computing is making these approaches more practical  for  supporting policy
making in the present and future. We have performed a variety of activities to explore the use of
decision support tools in regulatory decision making. Among the topics that will be discussed in our
presentation are: a prototype decision support system for air quality management, optimization using
distributed computing, advancements in optimization algorithms, the conjunctive use of simple and
complex models, and new approaches for modeling emissions trading programs.
* Now on assignment to Atmospheric Modeling Division, National Exposure Research Laboratory, USEPA from Atmospheric
Sciences Modeling Division, National Oceanic and Atmospheric Administration, U.S. Department of Commerce
                                            34

-------
 A Framework for Creating and Managing Graphical Analyses

 Steven S. Fine*, W. Ted Smith, Daniel M. Gatti, Steve R. Thorpe, Neil J. M. Wheeler+,
 Kiran Alapaty, Alison M. Eyth
 Environmental Programs, MCNC-North Carolina Supercomputing Center, RTP, NC 27709,
 email:  fine.steven@epa.gov

 Environmental data analysis often consists of  a long and convoluted series of questions, with each
 analysis providing some answers and leading to more questions. During the exploration of the data,
 statistics and visualizations  are  a means  to an  end,  such as  a  better  understanding of  the
 environment, an improved model, or a new policy. While a number of environmental data analysis
 tools are available, there remains a fundamental gap between the work many scientists perform and
 the  tools that are  available. Most environmental data analysis  systems concentrate on only  the
 capabilities needed to crunch  numbers or create pictures. The user retains a substantial  burden of
 repetitively explaining to an analysis package — often via low-level commands — how to process
 the data. At the same time, many people keep no or inadequate records of how they have analyzed
 their data, which can be a problem if questions arise later. Issues such as lack of awareness of tools
 and their capabilities further hinder scientists' data analysis efforts.

 To address the gap between data analysis  work and analysis tools' capabilities, we developed  the
 Knowledge-based Environmental Data Analysis Assistant (KEDAA), a prototype software system.
 The KEDAA provides analysis management  capabilities that provide more complete support  for
 data analysis than is found in most  analysis  packages. Note that the  KEDAA was intended  to
 support people who analyze environmental data, not replace them. The KEDAA does not interpret
 data or draw conclusions from analyses.

 The KEDAA also does not perform analyses  itself. Instead,  it relies  on off-the-shelf data analysis
 packages to perform that work.  In other words, the KEDAA provides an interface between the user
 and  existing analysis packages. A user specifies an analysis  to perform via the  KEDAA's graphical
 user interface (GUI), which allows the user to select analysis techniques, variables to plot, options,
 panes (i.e., divide a window or page into multiple plot areas), and  overlays. The  KEDAA determines
 which of the external analysis packages that KEDAA utilizes can best satisfy the user's request and
 sends the appropriate instructions to that package.
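
The dispatch step can be pictured with a small sketch (invented class names, not the actual KEDAA
implementation): each registered package reports whether it can satisfy a request, and the first
capable package receives the instructions.

// Hypothetical dispatcher sketch; names do not reflect the real KEDAA code.
import java.util.List;

class AnalysisRequest {
    final String technique;          // e.g., a time-series plot (hypothetical technique name)
    final List<String> variables;    // variables to analyze
    AnalysisRequest(String technique, List<String> variables) {
        this.technique = technique;
        this.variables = variables;
    }
}

interface AnalysisPackage {
    boolean canPerform(AnalysisRequest request);
    void perform(AnalysisRequest request);   // translate into package-specific instructions
}

class AnalysisDispatcher {
    private final List<AnalysisPackage> packages;
    AnalysisDispatcher(List<AnalysisPackage> packages) { this.packages = packages; }

    /** Forward the request to the first registered package able to satisfy it. */
    void submit(AnalysisRequest request) {
        for (AnalysisPackage pkg : packages) {
            if (pkg.canPerform(request)) {
                pkg.perform(request);
                return;
            }
        }
        throw new IllegalArgumentException("No registered package supports: " + request.technique);
    }
}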

 The user can store the specifications of analyses in  an electronic outline. Collapsing and expanding
 sections of the outline allows the user to review past work at the desired  level of detail. Textual
 notes can be added to the outline to  record observations, hypotheses, and conclusions, allowing a
user to create an integrated record of what analyses were performed  and what insight was gained.
Variables read from data files  and formulas for deriving variables can be assigned to topics in  the
outline, providing  a scope  for variables that are analyzed. Analyses can be copied from one topic
 (scope) to another, allowing complex analyses to be easily applied  to different variables.
* Now on assignment to Atmospheric Modeling Division, National Exposure Research Laboratory, USEPA from the Atmospheric
Sciences Modeling Division, National Oceanic and Atmospheric Administration, U.S. Department of Commerce.

+ Currently with Sonoma Technology, Inc.
                                             35

-------
The KEDAA currently provides six types of analyses, implemented by three analysis packages, and
reads files in four formats. The KEDAA has been designed in a modular manner that allows  its
capabilities to easily be extended with new analyses,  packages, and file formats.  The KEDAA is
implemented in Java and has been used on several flavors of UNIX.

We believe that the KEDAA can provide several benefits for users. Users can perform repetitive
analyses, for instance of multiple time periods or scenarios, by simply copying and pasting analyses.
The KEDAA provides a common user interface for creating analyses with multiple packages and
indicates when  analysis techniques can and  cannot  be  used, which allows  users to apply new
techniques to their work without learning new analysis packages or commands. Since the KEDAA
provides a general graphical user interface for creating analyses, it can be used as a front-end for
packages that do  not have  a GUI.  Finally, users can produce  an integrated record of analyses
performed and insight gained that can be invaluable when questions arise later.

The development  and evaluation of the approaches described above were terminated prematurely
when the principal investigator changed organizations. The KEDAA provides significant analysis
management capabilities, but users have not  extensively evaluated the concepts.  The KEDAA is
available as open source from http://envpro.ncsc.org/keda.  The distribution includes source, jar
files, and on-line documentation.
                                            36

-------
Space Time Toolkit: Web-Based Integration and Visualization of
Spatially and Temporally-Disparate Data from Distributed Sources

Mike Botts
Principal Research Scientist, Earth System Science Center, GHCC - NSSTC, University of Alabama in
Huntsville, Huntsville, AL 35899, email: mike.botts@nsstc.uah.edu (http://vast.uah.edu, (256) 961-7760)

The Space-Time Toolkit (STT) is a Java-based environment designed for interactive fusion
and 3D visualization of disparate data sets. It is capable of running as either an application or
an  applet  on virtually  any workstation  platform with highly interactive, dynamic 3D
performance. Unlike  most visualization environments, the STT does not require input data
to be gridded into a common spatial domain nor stacked as common time slices in order to
be integrated together. In fact, the STT acts as  an on-demand transform manager that will
"on-the-fly" georegister and synchronize multisource disparate data into any user-selectable
display domain. A powerful capability within the STT is the presence of an Observation
Dynamics Model that provides geolocation capabilities for virtually any dynamic sensor,
whether hand-carried, stationary, or mounted on satellite, aircraft, ship, or automobile.

For any user-selected project, the STT determines the availability of data using web-based
resource  documents.  All data is accessed over the web using URLs that point to Internet
files, CGI scripts,  or Java Servlets.

While the STT prototype was  developed primarily using NASA funding, the current Java
redevelopment was partially funded (10/96 - 09/99) by the  EPA HPCC program under
Chris Saint. The  EPA Nashville SOS field experiment data  has served  as  a testbed for
merging disparate data from multiple sources. With current funding primarily coming from
NASA, the STT is being extended to  meet the needs of NASA's EOS and Digital Earth
programs.

-------
Field Data Model for Supporting Cross-media Modeling and Visualization
of Large Diverse Time-varying Geospatial Data and Metadata

Todd Plessel
Lockheed Martin Services, RTP, NC 27709, email: plessel.todd@epa.gov

Goal:
Research scientific data models and develop high-quality multi-language software libraries for
efficient representation, integrated analysis and visualization of large diverse time-varying geospatial
environmental data and metadata for supporting cross-media modeling and decision-support
applications operating in a high-performance networked multi-platform computing environment.

Background:
Computer modeling of environmental processes is crucial for understanding how the environment
changes due to natural and human processes and the long-term effects of these changes on
ecosystems and human health. Environmental decision-making requires the support of reasonably
accurate modeling of the relevant processes  and summary analysis of their effects.

Environmental modeling is typically done separately in each domain - air, land surface, water, soil,
groundwater, etc. - and at different scales: global, meso, regional, urban and micro. Modeling results
must be compared and correlated with actual observations, samples and reported data. Also, ideally,
the various kinds of models (air, terrain, soil, etc.) should be coupled to better account for the
exchanges that occur at these domain interfaces, such as wet deposition of air pollutants onto the
surface of a lake and biogenic emissions from surface vegetation, etc.

Decision-support analysis imposes additional requirements on models, such as steering (of model
execution) and generation of metadata (allowing traceability during review and future analysis).
Metadata necessary to support EPA's "20-year rule" is a significant challenge that must be met by
such decision-support systems to help justify (e.g., in court) regulatory laws enacted based,  in part,
upon such computer-aided analysis.
                                           37

-------