United States Environmental Protection Agency
Region 10 (Alaska, Idaho, Oregon, Washington)
1200 Sixth Avenue, Seattle, WA 98101
Office of Research & Development
Office of Science, Planning, & Regulatory Evaluation

EPA Decision Support Tools Workshop
The Executive Inn • Seattle, Washington
June 28-29, 1994

CONTENTS
Page
Agency Task Force on Environmental Regulatory Modeling:
Conclusions and Recommendations	 1
The History of Decision Support Tools in the Region 7 RCRA Program 	 3
Strategic Directions for Achieving Data Integration in Region 10	 5
Lessons Learned in the Application of Decision Support Tools
by Sandia National Laboratories' Environmental Restoration Project	 9
Technical Requirements for the Application of Decision Support
Tools at a Private Superfund Site 	 13
Decision Support Systems in the Public Domain:
Issues and Considerations	 15
Advantages of Commercialized Products Versus Publicly
Developed Products Remaining in the Public Domain 	 19
Environmental Modeling Resources at the U.S. Environmental
Protection Agency's Center for Exposure Assessment Modeling	 21
The Center for Subsurface Modeling Support (CSMoS) 	 23
Strategic Use of Decision Support Tools Throughout
the Remediation Process	 25
Information Systems for Site Management:
Current Use and Future Trends 	 27
Application of a Ground-Water Monitoring Trigger
for Underground Storage Tank Sites	 29
Software Products From the Environmental Monitoring Systems Laboratory 	 31
SmartRISK 1.0: Risk Assessment Software for Windows	 33
Risk-On-Site: A Tool for Characterizing Site Contamination 	 35
THERdbASE: A Modeling and Database System for Making Total
Human Exposure Assessments 	 37
Development of RESRAD and Other Environmental Pathway and
Health Risk Models at Argonne National Laboratory	 39

ProTech: The Prospective Technology Communication System		41
Remedial Action Cost Engineering and Requirements (RACER) System		45
The EnviroText Retrieval System 		47
Quality Assurance Issues and Suggestions for Environmental
Software Development 		49
Software Development Process: The Key to Success		51
Getting the Right Answer		53
The Defense Environmental Corporate Information Management (DECIM) Program:
Software Development and Data Standardization	 55
Decision Support System for Evaluating Remediation Performance With
Interactive Pump-and-Treat Simulator 	 57
Sandia's Environmental Decision Support System (SEDSS): A Tool To Guide
Site Characterization, Risk Assessment, and Remedial Design Selection	 59
DuPont's CD/ROM Decision Support System: HEART 	 61
Introduction to the Internet: Environmental Resources	 63
Waste Management and Technologies Analytical Database System (WMTADS) 	 65
Access to the U.S. Environmental Protection Agency's High Performance
Computing Resources for Environmental Decision Support 	 67
EnviroTRADE: A Commercialization Case Study	 69
Products and Services of the National Technical Information Service	 71
The U.S. Environmental Protection Agency's Environmental Monitoring
Methods Index (EMMI): A Tool for Environmental Monitoring	 73
NOTICE: The Abstracts contained in this Proceedings do not necessarily
reflect the views of the U.S. Environmental Protection Agency, and no official
endorsement should be inferred. Mention of trade names or commercial
products does not constitute endorsement or recommendation for use.

Agency Task Force on Environmental Regulatory Modeling:
Conclusions and Recommendations
Steve Cordle
U.S. Environmental Protection Agency, Office of Environmental Processes and
Effects Research, Washington, DC
In March 1992, the Deputy Administrator, in his role as Chair of the Risk Assessment
Council, established the ad hoc Task Force on Environmental Regulatory Modeling. Its
charge was to "...complete within 12 months a recommendation to the Agency on specific
actions that should be taken to satisfy the needs for improvement in the way that models are
developed and used in policy and regulatory assessment and decision-making." In addition,
the following was to be addressed:
¦	Acceptability criteria for model use, generally and in particular circumstances.
¦	Formal technical and policy guidance on model development.
¦	Agency requirements for peer review and for documentation of models prior
to use.
¦	Expansion of training and technical support activities for the U.S.
Environmental Protection Agency (EPA) personnel who oversee model
applications.
The Task Force was cochaired by Steve Cordle of the Office of Research and Development
(ORD) and Larry Reed of the Office of Solid Waste and Emergency Response (OSWER).
Members came from all headquarters' offices, seven regions, and two ORD laboratories. The
final report of the Task Force was transmitted to the Deputy Administrator on April 25,
1994. It contained four sections: Training and Technical Support Needs, Model Use
Acceptability Criteria, Agency Guidance for Conducting External Peer Review of
Environmental Regulatory Modeling, and Proposed Charter for a Permanent Committee on
Regulatory Environmental Modeling.
Training and Technical Support Needs
There is a need to support those models developed and used to further specific program
objectives. Such support could come from Agency technical experts or by training that makes
use of the latest technologies. Personnel responsible for model use or interpretation need to
be properly trained in the exercise of that responsibility. More technical support needs to be
provided to model users in general, so that these decision support tools can be fully
exploited. Technical support can be provided in the form of training, direct help to users, and
information transfer. Short-term technical support needs could be met by panels of experts,
or forums, until more formal support programs are needed and instituted.
Model Use Acceptability Criteria
Model code acceptability should be judged on the basis of appropriateness, accessibility,
reliability, and usability. There is a need for a "Model Information System," which lists
models that meet the acceptability criteria. Once a collection of acceptable models is
assembled, there is a need for a process to periodically assess the models being used to
support rulemaking decisions and regulatory impact assessments.
Agency Guidance for Conducting External Peer Review of Environmental Regulatory
Modeling
Peer review is an important tool in EPA's campaign to document the quality and credibility
of the science upon which its regulatory and policy decisions are based. Not all managers
who must consider the utility of peer reviews are aware of their importance, and many of
those who are aware lack a clear description of the procedures by which peer review should
be conducted. There is a need to begin external peer review as early in the model
development phase as possible to maximize its value.
External peer review of a model's applicability needs to be conducted well in advance of any
decision-making that depends upon the model's results. Information gathered from the peer
review of scientific issues is critical in understanding the uncertainties and usefulness of a
model in regulatory decision-making. Therefore, such information needs to be available to
the decision-maker before decisions based on the model are made.
Proposed Charter for a Permanent Committee on Regulatory Environmental
Modeling
There is a need for a centralized focus to promote the goal of providing EPA's senior
policymakers with a set of well-developed, well-documented, and well-understood modeling
tools to support environmental decision-making.
The History of Decision Support Tools in the Region 7 RCRA
Program
William A. Pedicino
U.S. Environmental Protection Agency, Region 7 RCRA Branch, Hydrogeology
Section, Kansas City, KS
During the last 10 years, Region 7 has strived to develop strategies by which ground-water
reviews of Resource Conservation and Recovery Act (RCRA) facilities could be
accomplished faster and with more confidence. The advent of tools such as the ground-water
workstation and the Ground-Water Information Tracking System (GRITS), and models such
as MODFLOW and BIOPLUME attached to the Surfer program, have significantly
improved our ability to meet the ever-increasing demands on our ground-water staff. Our
history can be broken down into four distinct eras: the precomputer years, the ground-water
workstation, GRITS, and the present.
Even in our precomputer years (1984-1985), the RCRA Branch recognized the increasing
load of ground-water documents, which needed a rapid review. To meet the expectations of
our engineering staff, a plan was developed to incorporate ground-water information on land
disposal facilities into a specific file. This file, which was available to the geologist, contained
geological information specific to the site in question. Included in this file were all pertinent
maps of the area, including soil maps as well as ground-water data. These files enabled the
ground-water staff to focus their time reviewing the site rather than collecting data.
During the period of time between 1986-1989, the ground-water workstation was actively
used as a tool to evaluate different ground-water parameters. The workstation, which
consisted of an IBM 286 computer (4-meg RAM, 40-meg hard drive, math coprocessor), a
digitizer, a six-pen plotter, and a 24-pin printer, enabled the ground-water staff to initiate
rudimentary modeling. The workstation contained software from which contours, both two
and three dimensional, could be developed using data from a facility. Additionally, the
workstation contained two simple models, plume and slug. During the latter part of 1986,
Region 7 developed a database program, which was compatible with the workstation.
This enabled Region 7, with the help of a contractor, to place individual site ground-water
information on disk, which could be transferred directly to the workstation software.
During the early part of 1990, Region 7, with the help of our contractor, the Center for
Environmental Research Information (CERI), Headquarters, and Region 5, developed GRITS.
This database system is capable of storing data from both soils and ground water.
Additionally, modules have been added to this program, which make it possible to determine
statistically the status of the ground-water data. Future additions will enable staff to draw
contours and connect data to models. This system is currently being used by some states,
regions, and private companies.
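The kind of statistical status check such a database module performs can be illustrated with a minimal sketch. This is not the actual GRITS algorithm; the screening rule, the analyte, and all concentration values below are hypothetical, shown only to convey how stored ground-water data can be flagged statistically.

```python
import statistics

def exceeds_background(background, downgradient, k=3.0):
    """Illustrative ground-water screen: flag a downgradient well when its
    mean concentration exceeds the background mean plus k standard
    deviations (a simple upper prediction limit)."""
    limit = statistics.mean(background) + k * statistics.stdev(background)
    return statistics.mean(downgradient) > limit

# Hypothetical chloride data (mg/L): background wells vs. one downgradient well
background = [12.0, 14.5, 11.8, 13.2, 12.9]
downgradient = [30.1, 28.7, 33.4]
flag = exceeds_background(background, downgradient)
```

A real program would also account for seasonal variation, non-detects, and multiple comparisons, but the flow — query stored measurements, compute a limit, flag exceedances — is the same.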
Currently, each member of the RCRA Branch in Region 7 has at least a 286 computer. The
Geology Section staff all have 386 computers, along with a 486 workstation. We are using
data generated by GRITS and sent to Surfer to develop two- and three-dimensional models
of plumes and ground-water flow. We are currently developing ways in which our support
staff will become more familiar with the more complex models.
We are now discovering it is no longer possible to have a staff schooled in geology or
engineering without some education in the use of computers. Each year we are aware of the
need to become increasingly literate in the use of models and graphics, something that was
unheard of 10 years ago.
Strategic Directions for Achieving Data Integration in Region 10
Robin L. Gonzalez
U.S. Environmental Protection Agency, Information and Facilities Branch,
Region 10, Seattle, WA
Strategic Theme
In every area of environmental protection, from enforcement to public outreach, from
state/U.S. Environmental Protection Agency (EPA) information exchange to environmental
monitoring, the ability to access the right data with which to make sound policy decisions will
play a large part in the success or failure of our overall mission.
The Agency's IRM strategic plan, drafted by the Office of Information Resources
Management (OIRM) in 1992, clearly underscores the fact that our information resources
such as computers, software, and communication links have become the "engine" of the
Agency; an engine that is underpowered for the future demands of Agency decision-makers.
Data integration, which is a key theme in the strategic plan, can be defined as "the ability to
integrate data from various sources into a tangible format for decision-making." Our ability
to accomplish the automation of this critical function is key to having the right information
for making sound environmental decisions. Yet, we cannot provide that decision-making
capability necessary for success with existing technology.
Current State of Computing in Region 10
According to OIRM's strategic vision, "(EPA) workers will have access to greater power,
more information sources, and larger communication networks. Enhanced processing and
communication capabilities will improve the productivity of workers and the effectiveness of
government decision-making and services. Graphical User Interfaces (GUIs) and
object-oriented programs will make these technologies easier to learn and use."
Looking at our current strengths in the IRM area, we have improved electronic
communication greatly throughout the Region in the past 4 years and have provided users
with improved automation tools to perform administrative tasks more efficiently. Looking
at our weaknesses, we have not been able to provide the technological tools to improve our
ability to make better environmental decisions. The key steps that we are taking to meet the
future needs of the Agency are:
¦	Expanding and upgrading our hardware and software infrastructure.
¦	Expanding and upgrading our communications infrastructure.
¦	Targeting decision support tools for use at the desktop; leveraging
Geographic Information Systems (GIS) technology.
¦	Targeting our user training and information outreach efforts effectively.
¦	Partnering with ESD and outside organizations to develop better scientific
support systems.
Forces Affecting Change
The success of our ability to build a workable infrastructure to support data integration and
decision support mechanisms will depend on two things: first, our ability to predict trends
in the rapidly changing information technology field; and second, our ability to attain the
necessary resources to complete the task.
One of the major platforms for data integration is the local area network (LAN), as well as
our GIS/UNIX system. Microsoft Windows and X-Windows products will play a large role
in how the products will look and act for all levels of users. Lotus Notes is an information
manager for group computing. Notes concentrates many disparate PC functions such as word
processing, spreadsheets, databases, E-mail, and communications in one environment. This
is a product that has the greatest potential for changing the way that the Agency leverages
administrative resources on the LAN. In fact, OIRM has recently endorsed Lotus Notes as
the Agency standard for "groupware" computing. Other products that are key to data
integration and decision support are ArcInfo and ArcView. These products will play a
significant role in the visual display of critical Agency data.
Multiplatform integration will be the key to moving information easily between disparate
systems. We have been working on the integration of our LAN platform and our GIS/UNIX
platform and we will continue to invest and expand in this area to bring decision support
tools to the desktop.
The Internet and public-information access are key growth areas for us as well. We have
developed a public-access BBS in the Region, and we hope to have an Internet connection
available to the BBS by the end of fiscal year 1994. We are also addressing the security
concerns with the Internet and our internal access and availability issues in conjunction with
our user training on the Internet and public systems.
Conclusion
Region 10 will continue to invest in infrastructure and cross-platform integration at a
moderate rate for the next 5 years. We will build on our successful LAN platform and
provide our users with the necessary tools for data integration and decision support on the
desktop. We will continue to seek new ways of opening up access to EPA information to the
business community, educational community, and general public through available technology
such as our BBS and the Internet.
Sound environmental decision-making will depend more and more on the ability to extract
and use critical data easily. For Region 10, this will mean a comprehensive, integrated
approach to information management of both scientific and administrative information,
which carefully considers all the available options and implements those with the highest
payback for the Agency and the public.
Lessons Learned in the Application of Decision Support Tools by
Sandia National Laboratories' Environmental Restoration Project*
Robert G. Knowlton, Ph.D., P.E.
Sandia National Laboratories, Albuquerque, NM
Sandia National Laboratories (SNL) has been tasked by the U.S. Department of Energy to
assess and remediate waste sites associated with past laboratory activities in testing and
disposal operations. The Environmental Restoration (ER) project at SNL is responsible for
this work at approximately 200 potential waste sites. Current funding scenarios do not permit
unlimited spending and/or the pursuit of scientific curiosity in addressing regulatory concerns
at these sites, and future funding is limited. Decision support tools (DSTs) are proving to be
a great asset in the effort to reduce costs and to make defensible decisions. DSTs provide
information to aid the decision-making process by combining 1) U.S. Environmental
Protection Agency (EPA) risk methodology, 2) probabilistic estimates of key model
parameters, 3) site-specific contaminant information, and 4) realistic exposure scenarios.
Areas of interest for DST-supplied information include: What risk does a site pose? What
cleanup levels are appropriate? How many samples are needed? How many monitor wells
are needed? Which sites should be investigated first with the limited funding that exists?
What is the worth of collecting additional information? What remediation technology is best?
What is the cost uncertainty in the assessment or remediation alternatives?
DST work is being done under the scrutiny of the New Mexico Environment Department
(NMED) and EPA Region 6. SNL's interactions and negotiations with the regulatory
community regarding DST methodologies and codes have been ongoing for over 2 years. The
lessons learned in these interactions and regulatory negotiations have influenced the
attributes of DSTs currently under development at SNL. Some of the regulatory concerns
and the solutions to problems with DSTs can be summarized as: 1) concern about
conservatism in risk assessment, 2) concern about the treatment of uncertainty, 3)
documentation of assumptions and data/parameter selection, 4) "what if" (or hypothetical)
case study analyses, 5) visualization of results, and 6) statistical and/or geostatistical
approaches.
DSTs used at SNL are adhering to the Risk Assessment Guidance for Superfund (RAGS)
methodology, but have the added attribute of performing quantitative uncertainty analyses
in the form of probabilistic Monte Carlo simulation techniques. This provides conservatism
in the quantification of risk and identifies possible impacts to human health and the
environment. The quantification of the uncertainty in risk using the Monte Carlo techniques
allows for more defensible decisions that are not overly conservative.
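A rough sketch of how such a probabilistic Monte Carlo risk calculation works follows. The intake equation mirrors the general RAGS form, but every distribution and parameter value here is a hypothetical placeholder, not SNL's or EPA's actual inputs.

```python
import random

def simulate_risk(n_trials=10_000):
    """Monte Carlo sketch: propagate uncertain exposure parameters
    through a simple intake/risk equation (illustrative values only)."""
    risks = []
    for _ in range(n_trials):
        # Sample uncertain inputs from assumed distributions
        concentration = random.lognormvariate(0.0, 0.5)   # mg/L in ground water
        intake_rate = random.triangular(1.0, 3.0, 2.0)    # L/day drinking water
        body_weight = random.normalvariate(70.0, 10.0)    # kg
        slope_factor = 0.01                               # (mg/kg-day)^-1, held fixed
        # Chronic daily intake, then lifetime excess risk
        cdi = concentration * intake_rate / body_weight
        risks.append(cdi * slope_factor)
    risks.sort()
    return risks

risks = simulate_risk()
# Report an upper percentile of the risk distribution rather than a
# single worst-case point estimate stacked from conservative inputs.
p95 = risks[int(0.95 * len(risks))]
```

The output is a distribution of risk, so a decision-maker can cite, say, the 95th percentile with a known confidence instead of compounding worst-case assumptions.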
*This work was supported by the U.S. Department of Energy under Contract DE-AC04-
94AL85000. SAND94-1509A.
Regulators have expressed concern about how the distributions of uncertain parameters are
established when implementing uncertainty analyses. SNL's ER project approaches these
concerns by using 1) actual site data, 2) analog site data (i.e., data from a similar or adjacent
site at SNL), 3) published literature values, and 4) subjective or expert judgement.
Regulators have some concern about using expert judgement in establishing parameter
distributions; therefore, SNL will concentrate on obtaining information from the other three
classes of data, as these might be less biased. Each DST will prompt the user for
documentation of his or her assumptions for the data/parameter inputs, as well as the
assumptions for the conceptual model under investigation. This concept has been praised,
since typical contractor reports submitted to the regulator do not contain an explicit list of
assumptions used in support of data analysis and modeling. Explicit definitions of
assumptions and data distributions should help facilitate understanding between the
regulators, stakeholders, and potentially responsible parties.
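A minimal sketch of how the first class — actual site data — can be turned into a documented input distribution for uncertainty analysis follows. The data values and the lognormal choice are illustrative assumptions, not SNL's procedure.

```python
import math
import statistics

def fit_lognormal(measurements):
    """Sketch: derive lognormal parameters from site measurements.
    Hydraulic and concentration data are often right-skewed, so we fit
    the mean and standard deviation of ln(x) rather than of x itself."""
    logs = [math.log(x) for x in measurements]
    return statistics.mean(logs), statistics.stdev(logs)

# Hypothetical hydraulic conductivity measurements (cm/s) from site wells
site_data = [3.2e-4, 1.1e-4, 5.6e-4, 2.4e-4, 8.9e-5]
mu, sigma = fit_lognormal(site_data)
# mu and sigma can now drive random.lognormvariate(mu, sigma) draws in a
# Monte Carlo run, with the data source recorded alongside the inputs.
```

Prompting the user to record which wells, literature values, or analog sites produced `mu` and `sigma` is exactly the kind of assumption documentation described above.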
With the advent of the current sophistication in hardware and software packages for desktop
computers, the ability to do real-time, or near real-time, computer simulations has been
greatly improved. In addition, the computer graphics/visualization capabilities of the systems
in use today are outstanding. As a result, DSTs at SNL are employing Monte Carlo analyses
of flow and transport simulations, with contaminant plume visualizations, to aid in the
understanding of uncertainties in contaminant plume distributions. Many variations of
hypothetical conceptual model scenarios may be tested to achieve a greater understanding
of the physical system. One such scenario, which proved beneficial, was the ability to simulate
buried stream channel deposits between existing monitor wells as part of the spatial
variability within an aquifer to evaluate the effects of dispersion within the contaminant
plume. Because the flow and transport simulators are constrained by physics, the result was
a better understanding of the contaminant movement through and about this zone. The
visualization aspects helped to better understand the problem, and also can serve as
educational tools.
The statistical sampling and geostatistical simulation work required of DSTs to aid in the
definition of the adequacy of sampling and analysis plans has come under scrutiny by
regulators. The basic concern in this area is the appropriateness of the methods employed
in defining the nature and extent of contamination. Again, the visualization capabilities of
DSTs have helped with the initial attempts at regulatory acceptance of these techniques.
DSTs employed, or to be employed, in SNL's ER project are 1) the Probabilistic Risk
Evaluation and Characterization Investigation System (PRECIS), 2) the Borehole Optimization
Support System (BOSS), 3) the Environmental Decision Support System (EDSS), and 4) the
Cost/Risk Performance Assessment (CRPA) tool.
Preliminary estimates of the relative impact of DST use in the ER project are available.
DSTs reduce the time to perform risk assessments and other computer simulation tasks,
typically from weeks to days or even hours, depending on the complexity of the analysis. This
performance boost allows SNL to perform risk estimates at the outset of a site investigation
to decide whether a site is a logical candidate for No Further Action (NFA), requires
additional site characterization, or poses a possible threat and therefore should have remedial
alternatives defined early in the process, almost in real time. An estimated 40 percent of the
sites at SNL may be candidates for early NFA petitions, thereby eliminating needless
characterization. The cost savings in this area of DST use totals approximately $20 million.
The cost savings associated with the use of DSTs in defining borehole and monitor well
installations, as well as sampling strategies, are estimated to be approximately $10 million.
The time savings are significant as well, allowing SNL to concentrate early on the sites that
pose a possible threat, to make optimal use of limited funds, and to compress overall site
characterization schedules by several years.
Technical Requirements for the Application of Decision Support
Tools at a Private Superfund Site
Calvin C. Chien
DuPont Company, Wilmington, DE
David S. Ward
GeoTrans, Inc., Sterling, VA
At Superfund sites, decision support tools (DSTs) provide owners with the ability to employ
integrated technologies, such as a Geographical Information System (GIS), to develop
conceptual models of subsurface transport processes. Requirements for the application of
DSTs include compilation of hydrogeological database/computer-aided design (CAD) files,
development of a conceptual model, development of the technical approach (modeling,
statistical, expert system), and experience with the application of the tools. The
implementation of DSTs leads to the application of ground-water models to address specific
questions in the areas of hydrogeological investigation, risk assessment, feasibility study,
remedial design of ground-water extraction systems, and improvement of monitoring well
design. These technologies require comprehensive and consistent construction and
maintenance of relational database and CAD files. The models often require that
point-based or inferred knowledge of aquifer parameters (hydraulic conductivity, leakage
rates) and hydrogeological boundaries (stream elevation, conductance, recharge) be
extrapolated to span beyond the site boundary as defined by the modeled area. Difficulties
often arise when data (hydraulic head and water concentration) are clustered on site or are
intended to determine the extent of waste plume migration without specific regard for plume
characterization.
While many of the technical requirements for the application of DSTs are scientific and
generally straightforward, the requirements for regulatory acceptance are often less tangible.
Acceptance issues include model selection (public versus proprietary), parameter assignment
(acceptable range or distribution of hydraulic parameters), and "conservative bias"
(perpetuation of worst-case assumptions). Through the negotiation process, many of these
differences can often be resolved through training and face-to-face informal technical exchange.
Problems encountered often include unclear definition of DST objectives and limitations,
application of generic models without regard to site-specific details, necessity of modeling
as determined by regulation rather than remediation objectives, lack of appreciation of the
uncertainty, and error propagation from measurement to predictive simulation.
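The last problem — error propagation from measurement to predictive simulation — can be illustrated with a minimal sketch in which measurement-scale uncertainty in hydraulic conductivity is carried through the Darcy seepage-velocity relation. All parameter values here are hypothetical.

```python
import random

def darcy_velocity_interval(n=5000):
    """Sketch: propagate hydraulic-conductivity measurement uncertainty
    into a prediction interval for seepage velocity (illustrative values)."""
    gradient = 0.005          # hydraulic gradient, assumed well known
    porosity = 0.30           # effective porosity, assumed well known
    velocities = []
    for _ in range(n):
        # Slug/pump tests typically suggest log-normally distributed K;
        # the ln-mean and spread below are placeholders.
        k = random.lognormvariate(-9.2, 1.0)        # m/s
        velocities.append(k * gradient / porosity)  # seepage velocity, m/s
    velocities.sort()
    # The 5th-95th percentile spread shows how a factor-of-a-few
    # measurement error widens into a large prediction interval.
    return velocities[int(0.05 * n)], velocities[int(0.95 * n)]

lo, hi = darcy_velocity_interval()
```

Even this one-equation model shows why a point prediction without an interval can misstate how far a plume front has moved.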
Decision Support Systems in the Public Domain: Issues and
Considerations
Hanadi S. Rifai, Ph.D., P.E.
Rice University, Energy and Environmental Systems Institute, Houston, TX
Decision support systems (DSSs) describe a class of software for supporting executive
decision-making. The introduction of DSSs for solving environmental problems is relatively
new, and their acceptance and widespread use has been somewhat slow. One of the questions
that needs to be answered regarding decision support tools (DSTs) is whether they should
be developed as proprietary software or in the public domain. This paper addresses some of
the issues and concerns that need to be addressed when developing a DST in the public
domain.
Development Platform
One of the key decision-making steps in designing DSTs is the selection of hardware and
software environments to be used. Prior to undertaking such a project, a choice needs to be
made between personal computers and workstations, and whether DSTs would support
multiple users across a network or be solely intended for single users. Then the question of
which software platform(s) should be used to develop DSTs arises: Should one use
commercial software or develop DST-specific software? These considerations are not to be
taken lightly since they greatly impact the extent of use of the DST. The average public
domain DST user likely has access to a personal computer and has a somewhat limited
knowledge of software, for example, familiarity with word processors and spreadsheet
programs. These constraints preclude the development of "sophisticated" DSTs in the public
domain and dictate the use of personal computers and commonly used software tools.
User Profile and User Needs
A misconception related to DSTs may be that they are intended for technically oriented
users. The fact is that there may be a broader spectrum of users which includes technical
project staff as well as site managers and regulators. The main purpose of public-domain
DSTs should be to facilitate interaction and collaboration between project team members
and to accelerate and enhance the decision-making process at the management level. As
such, consideration has to be given to issues of usability, applicability, and acceptance by the
users.
Software Upgrades and Software Support
A main impediment to the success of public-domain DSTs is the lack of commitment on the
part of the funding agency for the maintenance and support of these tools. The majority of
commercial software development companies invest significant resources toward upgrading
and responding to their users. Public-domain software requires the same commitment from
the funding source and the developers toward responding to questions about the software
and improving on the system performance.
To demonstrate the feasibility of public-domain DSTs, two such systems developed at Rice
University will be presented.
The OASIS Decision Support System for Ground-Water Contaminant Modeling (1)
The OASIS system was the first of its kind in ground water. The DSS was developed using
hypertext on a Macintosh platform and utilized graphical interfaces to create a more intuitive
form of communication with the computer. At the time of its development, IBM compatibles
did not allow for the graphical nature of the OASIS system. As a result, the federal- and
private-user base of OASIS was limited. The current users of OASIS are mostly in the
educational and research communities. Rice University has invested in in-house support of
OASIS, which is currently allowing us to upgrade the software and bring it to the PC
platform.
A Decision Support System for Evaluating Remediation Performance With
Interactive Pump-and-Treat Simulator (P&T DSS)
The P&T DSS is a second-generation OASIS-like tool that can be used to manage, analyze,
and model site data and remediation activities. The key differences between OASIS and the
P&T DSS are: 1) the P&T system uses a color hypertext interface instead of the black-and-
white interface in OASIS, 2) the P&T DSS incorporates a site ORACLE database and two-
dimensional and three-dimensional visualization tools, and 3) the P&T DSS will run on
Macintosh and IBM-compatible computers.
The P&T DSS contains the Global, Site-Specific, and Simulator modules. The Global module
includes hydrogeologic and chemical databases, a remediation technologies library, and
remediation decision flowcharts. The Site-Specific module is built around an ORACLE
database for the U.S. Air Force Plant 44 site in Tucson, Arizona. The ORACLE database
was linked to a user-friendly interface that allows the user to analyze and visualize data using
a series of two-dimensional and three-dimensional plots. The Simulator module includes
analytical models, an analytical simulator, and a numerical simulator, built around the
BIOPLUMEII model (2).
Thus far, the P&T DSS has received significant interest from the federal sector because of
its simple-to-use interfaces for site data analyses and decision-making. A key element in
developing those interfaces, however, has been the use of third-party software such as
ORACLE and ORACLE CARD (database). This implies that the user needs to make an
investment in software to have the intended functionality and flexibility of the DSS. On the
positive side, the DSS can be used by technical persons as well as by site managers and
decision-makers. The P&T DSS will not be ready for distribution until later in the year.
More information about the success of the adopted approach will emerge at that time.
References
1.	Newell, C.J., J.F. Haasbeek, and P.B. Bedient. 1990. OASIS: A graphical decision support system for ground-water contaminant modeling. Ground Water 28(2):224-234.
2.	Rifai, H.S., P.B. Bedient, J.T. Wilson, K.M. Miller, and J.M. Armstrong. 1988. Biodegradation modeling at an aviation fuel spill. ASCE J. Environ. Engineer. 114(5):1007-1029.
-------
Advantages of Commercialized Products Versus Publicly Developed
Products Remaining in the Public Domain
Richard A. Ferguson
The Skylonda Group, Inc., Palo Alto, CA
Commercial software products—models, databases, database managers, expert systems,
mapping applications, and the like—generally have advantages over publicly developed
software products where there must be a fixed budget and timeframe for development and
implementation. Software products solve "information problems" for the various customers
who have stakes in Superfund or Resource Conservation and Recovery Act (RCRA) sites.
A commercialization proposal, or "the business case," for new-product development is
generally easy to make when 1) the information problem to be solved arises in analogous
form at dozens of different privately managed sites and 2) the cost of solving the information
problem is a small fraction of the cost of the physical site work that must be done.
Where these conditions do not hold true, the case for commercialized products weakens, but
interestingly enough, so does the case for using software tools in the first place.
Commercial incentives move both producers and users more quickly toward such goals as
a) establishing workable standards, b) setting measurable performance benchmarks, c)
achieving customer satisfaction, d) maintaining products, e) transferring technology to the
next analogous problem sites, f) reducing costs over time, and g) introducing innovation at
appropriate times and settings. In contrast—and here is where the commercial advantages tend to accrue—the
public-agency contract and grant mechanisms under which "public-domain" products are
developed can create disincentives to achieving the same goals. The history of publicly
developed Contract Lab Program data standards and related software products offers useful
illustrations of the operation of public-development incentive systems, in contrast to
private-sector developments in a commercial context.
In nearly every area of public- and private-sector information system work today, trends
point toward decentralized, smaller-scale, private initiatives and away from centralized solutions,
even where the centralized solution appears to be "free." Public development induces the
developer to pay attention primarily to the needs of the public sponsor. In contrast, reliance
on commercial incentives necessarily forces the developer to pay attention to the needs of
the many "local" customers at sites, and to the technologies available to meet those needs at
a finite price within a finite time.
-------
Environmental Modeling Resources at the U.S. Environmental
Protection Agency's Center for Exposure Assessment Modeling
Dermont C. Bouchard
U.S. Environmental Protection Agency, Athens, GA
The Center for Exposure Assessment Modeling (CEAM) provides microcomputer-based
software for modeling aquatic, terrestrial, and multimedia exposure pathways for organic
chemicals and metals. CEAM models range from simple desktop techniques suitable for
screening analysis, through computerized steady-state models, to sophisticated,
state-of-the-art continuous simulation models. Currently distributed software includes
simulation models and databases that can be applied to urban runoff, leaching and runoff
from soils, conventional pollution of streams, toxic pollution of streams, toxic pollution of
lakes and estuaries, conventional pollution of lakes and estuaries, tidal hydrodynamics,
geochemical equilibrium, and aquatic food chain bioaccumulation. CEAM software is
available through diskette exchange, an electronic bulletin board system, and over the
Internet via anonymous file transfer protocol.
-------
The Center for Subsurface Modeling Support (CSMoS)
David S. Burden, Ph.D.
U.S. Environmental Protection Agency, Robert S. Kerr Environmental Research
Laboratory, Ada, OK
The Center for Subsurface Modeling Support (CSMoS) provides ground-water and vadose
zone modeling software and services to public agencies and private companies throughout
the nation. CSMoS is located in Ada, Oklahoma, at the Robert S. Kerr Environmental
Research Laboratory (RSKERL), the U.S. Environmental Protection Agency's (EPA's)
Center for Ground-Water Research.
The primary aims of CSMoS are to provide direct technical support to EPA and state
decision-makers in subsurface model applications and to manage and support the
ground-water models and databases resulting from the research at RSKERL. This research
encompasses the transport and fate of contaminants in the subsurface, the development of
methodologies for protection and restoration of ground-water quality, and the evaluation of
subsurface remedial technologies. As a result, a major focus of CSMoS entails coordinating
the use of models for risk assessment, site characterization, remedial activities, wellhead
protection, and Geographic Information Systems (GIS) application. In these ways, CSMoS
performs an active role in protecting, restoring, and preserving our nation's ground-water
resources.
Modeling Services
CSMoS integrates numerous individuals and organizations with expertise in all aspects of the
environmental field in its effort to apply models to better understand and resolve ground-
water problems. Internally, CSMoS is supported by the scientists and engineers of RSKERL,
whose specialties include hydrology, chemistry, soil science, biology, mathematics, and
environmental engineering. This forms the nucleus of CSMoS. Externally, CSMoS is
supported by technical support contractors and extramural experts. Through this network,
CSMoS is able to interface with a wide variety of experts to provide the comprehensive
modeling services required to resolve complex environmental problems.
CSMoS provides assistance in the following modeling areas:
¦	Conceptualization
¦	Model development
¦	Model verification
¦	Model validation
¦	Model application
¦	Model distribution
¦	Modeling training and education
Technical Assistance
CSMoS is an integral part of RSKERL's Technology Support Center. CSMoS distributes and
services all models and databases developed by RSKERL and provides general support on
model application to ground-water and vadose zone problems. Technical assistance activities
include developing educational documents, providing training courses, and distributing
update notices and other pertinent information for all software developed at RSKERL as
well as software developed under laboratory grants and contracts.
CSMoS provides direct technical assistance for a broad spectrum of modeling applications.
Models and/or databases are available to assist in the following areas:
¦	Site assessment
¦	Site characterization
¦	Soil remediation
¦	Ground-water remediation
¦	Treatability studies
¦	Remedial action management
¦	Ground-water resource development
¦	Wellhead protection
¦	Environmental planning
¦	Geostatistics
¦	Resource Conservation and Recovery Act (RCRA) corrective action
¦	Superfund activities
-------
Strategic Use of Decision Support Tools Throughout the Remediation
Process
Bill Byers
CH2M Hill, Corvallis, OR
At most hazardous waste sites, site characterization results in the generation of a tremendous
amount of data. The data can be thought to have a life cycle that begins when the need for
a decision is established. Subsequently, a sampling plan and a health and safety plan are
generated, data quality objectives are established, a sample tracking system is identified,
samples are produced and analyzed, data are produced, data quality is determined, and data
are entered into a data management system from which they are displayed and analyzed as
appropriate to support a decision about remedial action at the site.
Frequently, data generation has occurred in cycles: an initial site assessment has been
performed, followed by two, three, or even more phases of remedial investigation, producing
staggering amounts of data in the process of reaching a decision about remedial action at the
site. Decision tools have largely focused on the analysis of all of this data to reach the
"ultimate" decision: the remedial action at the site.
While the remedy selection decision may receive the most attention, decisions are actually
made throughout the cycles of data generation. Better decisions at every point in the cycle
can greatly reduce the cost of the investigation and remediation process. Thus, opportunities
to make beneficial use of decision support tools are present at every step of the site
characterization and remediation process. The challenge is to recognize a decision as an
opportunity to use a decision support tool, identify an appropriate tool(s), and apply the tool
appropriately. Frequently, the limiting factor in the use of decision support tools is either a
lack of awareness of the tools' existence or an inability to identify appropriate uses for the
tools.
A model for identifying a decision as warranting use of a decision support tool, selecting an
appropriate tool, and understanding the resources needed to properly use the tool is outlined
in this paper. Examples of successful uses of decision support tools are included as support
for the model.
-------
Information Systems for Site Management: Current Use and Future
Trends
Richard C. Michael and Clifford A. Kottman, Ph.D.
Intergraph Corporation, Reston, VA
The concept of an integrated, computer-based model of a site has been developed to the
point where project managers, engineers, scientists, and others involved with its cleanup can
access all or part of the available data from an integrated set of applications. Currently, most
systems are used primarily to manage project data and to perform technical analysis.
Recently, there has been increased interest in using this information as part of the broader
decision process. This interest is motivated by the desire to base decisions on the most
comprehensive and accurate information available as well as the need to communicate
information and decisions to the stakeholder community in a trusted manner. Increasingly,
the accessibility of data requires the adoption of a variety of organizational standards. These
include a) data-related standards that standardize database syntax and semantics, b) standard
operating procedures that allow interoperability across management authorities, and c)
system-related standards that facilitate, rather than exclude, interoperability between vendor-
specific system components. This paper will use examples from existing sites to illustrate how
systems are currently implemented and used. It also will discuss current efforts by
government and industry consortia to develop and promote standards.
-------
Application of a Ground-Water Monitoring Trigger for Underground
Storage Tank Sites
Stephen G. Zemba, Ph.D., and Steven J. Luis, C.E.
Cambridge Environmental, Inc., Cambridge, MA
This paper describes the application of a user-friendly software tool to decide whether to
sample ground water at underground storage tank (UST) sites. Development of the software
was funded by the New Jersey Department of Environmental Protection and Energy
(NJDEPE) Division of Research, and the software is designed for use by NJDEPE case
managers.
The desire for a methodical, consistent approach to the treatment of UST sites served as the
impetus for this project. In the past, the excavation of a UST typically has been accompanied
by soil sampling in the immediate area of the tank. A decision to sample ground water has
been made in an ad hoc manner, depending upon the magnitude of contamination detected
(if any) in the soil samples and other factors such as depth to ground water and site cover.
The goal of this research was to analyze and codify the procedures used by case managers
in UST investigations.
The ground-water trigger (i.e., the decision to install a well and sample ground water) is
based on a statistical analysis of cases taken from the files of NJDEPE's Bureau of
Underground Storage Tanks. The files were used to evaluate the ability of variables such as
soil concentration, depth to ground water, soil texture, and simple estimates of travel time
to indicate exceedances of NJDEPE ground-water standards. Explanatory variables identified
as good indicators were ranked and incorporated into the tiers of the trigger according to
indicative ability and ease of characterization. By employing multiple tiers, the trigger enables
the user to refine the decision to monitor ground water in response to site conditions and
to make these decisions in a consistent manner.
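The tiered logic described above lends itself to a simple decision function. The sketch below is purely illustrative: the variable names, thresholds, and tier order are invented placeholders, not the statistically derived NJDEPE criteria.

```python
# Hypothetical multi-tier ground-water monitoring trigger. The variable
# names, thresholds, and tier logic are invented placeholders, NOT the
# statistically derived NJDEPE criteria described in the abstract.

def groundwater_trigger(soil_conc_mg_kg, depth_to_gw_ft=None, soil_texture=None):
    """Walk the tiers as far as needed; return (decision, deciding tier)."""
    # Tier 1: screen on soil concentration alone.
    if soil_conc_mg_kg < 1.0:
        return ("no monitoring", 1)
    if soil_conc_mg_kg > 100.0:
        return ("monitor ground water", 1)
    # Tier 2: refine with depth to ground water, if characterized.
    if depth_to_gw_ft is not None:
        if depth_to_gw_ft > 50.0:
            return ("no monitoring", 2)
        if depth_to_gw_ft < 10.0:
            return ("monitor ground water", 2)
    # Tier 3: refine with soil texture (coarser soils transmit faster).
    if soil_texture == "clay":
        return ("no monitoring", 3)
    # Inconclusive cases default to the protective choice.
    return ("monitor ground water", 3)
```

A case manager would supply each tier's inputs only as the decision demands them, which is the role the spreadsheet interface plays in the actual tool.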
A menu-driven spreadsheet interface has been created to make the trigger easy to use. The
interface was written in the macro programming language of a commonly used spreadsheet
program, Quattro Pro, and is intended to provide a comfortable and familiar user
environment. The software prompts the user to input values of indicator variables required
by each tier, advances the user through the tiers as necessary, and provides the user with the
conclusions of the analysis.
Applications of the software are presented using input 1) obtained from NJDEPE case
studies (i.e., the data used to develop the trigger) and 2) derived for hypothetical sites. The
software is available for hands-on demonstration on a portable PC, and individuals are
invited to test the algorithm by applying it to example cases of their choosing.
-------
Software Products From the Environmental Monitoring Systems
Laboratory*
Jeff van Ee
U.S. Environmental Protection Agency, Las Vegas, NV
A number of software products for the personal computer have been produced at the U.S.
Environmental Protection Agency (EPA)-Las Vegas in the areas of statistics, risk assessment,
site characterization, and quality assurance. This software will be briefly described with more
details being provided during the poster session. Problems in the production and distribution
of this software will be highlighted, and future directions for the production of software at
the Environmental Monitoring Systems Laboratory (EMSL)-Las Vegas will be described.
Increased emphasis will be devoted to the integration of software modules into more
comprehensive software packages. EMSL-Las Vegas has proposed a standard for the exchange
of data between software modules. Other standards also have been proposed. Production and
eventual acceptance of software products from EMSL-Las Vegas will be dependent on the
ability of the software to interface easily with other software products and databases.
Production of sophisticated, environmental software has become increasingly difficult because
of the need to draw upon greater numbers of people in university, private, and government
sectors. Frequent changes in technical expertise, personnel, and contracting procedures can
slow the long-term developmental efforts that are required to produce this software. The
financial resources that are required to distribute and support the software can also be a
limiting factor.
The successes and problems experienced at EMSL-Las Vegas in the development of software
may serve as lessons to others in their production of software for environmental problems.
"The research described in this abstract has been funded by EPA through its Office of
Research and Development (ORD). This abstract and the oral presentation it summarizes
have not been subjected to ORD's peer and administrative review and do not necessarily
reflect the views of EPA or ORD.
-------
SmartRISK 1.0: Risk Assessment Software for Windows
Chris Waldron
Pioneer Environmental Consulting, Kirkland, WA
SmartRISK is a complete multichemical, multipathway human health risk assessment
modeling package for Microsoft Windows. SmartRISK eliminates the time-consuming process
of developing exposure models in spreadsheets for calculating risks. SmartRISK also provides
tools for managing risk assessment information such as exposure factors, toxicity values, and
references.
Exposure models are built by selecting media, chemicals, and exposure routes from
easy-to-use pick lists. This information is used with exposure point concentrations and data in
user-defined default databases, such as exposure factors, to evaluate exposure and calculate risk.
Exposure point concentrations can be hand entered or imported from a variety of file
formats. Users can change, update, or customize any of the information in the default media,
chemicals, exposure routes, exposure factors, toxicity values, and physical constants databases.
Four different exposure scenarios can be evaluated at one time for reasonable maximum
exposed (RME) and alternate exposed populations. The results of a risk assessment can be
evaluated using interactive tools to ask "what if" questions to determine the media, exposure
routes, and chemicals that are responsible for the majority of the risks. Noncarcinogenic risks
can be summed by toxic endpoint. Users can select from 25 standard documentation reports
to print out assumptions and the results of the assessment, or they can create their own
custom documentation reports using the integrated report writer. Data can also be exported
to a variety of file formats for analysis and presentation using numerous applications.
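As one illustration of the endpoint summation mentioned above, the sketch below adds up noncarcinogenic hazard quotients (HQ = intake / reference dose) by toxic endpoint. The chemical names, intakes, reference doses, and endpoint labels are hypothetical; this is not SmartRISK's implementation.

```python
# Sketch: summing noncarcinogenic hazard quotients (HQ = intake / RfD) by
# toxic endpoint. Chemical names, intakes, reference doses (RfDs), and
# endpoints are hypothetical; this is not SmartRISK's implementation.
from collections import defaultdict

def hazard_index_by_endpoint(chemicals):
    """chemicals: iterable of (name, intake_mg_kg_day, rfd_mg_kg_day, endpoint)."""
    hi = defaultdict(float)
    for _name, intake, rfd, endpoint in chemicals:
        hi[endpoint] += intake / rfd   # hazard quotient for one chemical
    return dict(hi)                    # hazard index per endpoint

data = [
    ("chem_A", 0.002, 0.020, "liver"),   # HQ = 0.1
    ("chem_B", 0.001, 0.005, "liver"),   # HQ = 0.2
    ("chem_C", 0.003, 0.010, "kidney"),  # HQ = 0.3
]
hi = hazard_index_by_endpoint(data)      # liver HI about 0.3, kidney HI about 0.3
```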
Planned upgrades include Monte Carlo Simulation, Geographic Information System (GIS)
compatibility, and a Cleanup Level Calculator.
-------
Risk-On-Site: A Tool for Characterizing Site Contamination
Stephen G. Zemba, Ph.D., and Edmund A.C. Crouch, Ph.D.
Cambridge Environmental, Inc., Cambridge, MA
Risk-On-Site is a software tool designed to evaluate the pattern and magnitude of
contamination at hazardous waste disposal sites. The program provides four features that aid
in site characterization. First, Risk-On-Site constructs false-color contamination maps from
on-site sampling data (pollutant concentrations and geographic coordinates). Maps are
defined using a nearest neighbor principle.1 A two-dimensional area is divided into a series
of polygons (one for each sample). The shape and size of each polygon is determined by the
relative locations of sampling points and site boundaries, such that each polygon represents
the portion of the site closest to the measurement it circumscribes. For a given configuration
of sampling points, a contamination map is both unique and intuitive. Only one map can be
drawn for a given pattern of sampling locations, and the closest neighbor approach is easily
grasped.
Risk-On-Site's topological algorithms are implemented in an efficient, automatic, and
reproducible manner. On a 50 MHz 80486DX platform, the algorithms process a site with
more than 800 sampling points in less than 10 seconds.
Risk-On-Site maps facilitate efforts to characterize contamination by providing a visual
display of measurements that illustrates the extent and pattern of contamination, hot spots,
and undersampled areas that may warrant additional investigation. "What if" scenarios are
easily investigated; for example, the consequences of adding sampling points into existing
diagrams can be evaluated within seconds.
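The nearest-neighbor principle described above can be approximated on a regular grid: each cell takes the concentration of its closest sampling point, tiling the site into the Thiessen polygons the abstract describes. This is an illustrative sketch, not Risk-On-Site's algorithm; the coordinates and concentrations are invented.

```python
# Grid approximation of the nearest-neighbor (Thiessen polygon) map described
# above: every cell of a rectangular site takes the concentration of its
# closest sampling point. Coordinates and concentrations are hypothetical;
# this is a sketch, not Risk-On-Site's algorithm.
import math

def nearest_neighbor_map(samples, nx, ny, width, height):
    """samples: list of (x, y, concentration); returns an ny-by-nx value grid."""
    grid = []
    for j in range(ny):
        row = []
        for i in range(nx):
            cx = (i + 0.5) * width / nx    # cell-center coordinates
            cy = (j + 0.5) * height / ny
            nearest = min(samples, key=lambda s: math.hypot(s[0] - cx, s[1] - cy))
            row.append(nearest[2])          # value of the closest sample
        grid.append(row)
    return grid

samples = [(10.0, 10.0, 5.0), (90.0, 90.0, 50.0)]    # two sampling points
grid = nearest_neighbor_map(samples, nx=10, ny=10, width=100.0, height=100.0)
```

Because each cell is assigned deterministically to its closest point, the map is unique for a given sampling configuration, as the abstract notes; adding a point and regenerating the grid answers a "what if" question directly.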
Estimation of exposure point concentrations is the second feature provided by Risk-On-Site.
Area-weighted concentrations are easily calculated using measurements and their associated
areas, as defined by the contamination map. Area-weighted averages provide an objective
comparison to other statistical measures such as arithmetic averages and associated
parameters, such as upper 95th percentile confidence limits of arithmetic means, which are
recommended in Superfund risk assessment guidance. In cases where exposure is equally
probable across a site, an area-weighted average is arguably the best estimate of the exposure
point concentration. The nearest neighbor approach embodied by Risk-On-Site provides an
objective estimate of area-weighted averages that—unlike methods such as interpolation,
kriging, and other statistical techniques—requires no application of professional judgment
regarding underlying data distributions and parameters.
Risk-On-Site's third advantage is the speed with which it can evaluate the consequences of
remedial action alternatives. For a given contamination map, target cleanup levels are easily
investigated by substituting remediated levels at sampling points at which concentrations
exceed the target criterion. For a given target cleanup level, Risk-On-Site 1) updates the
contamination map by highlighting the areas to be remediated, and 2) estimates the
postremediation area-weighted average concentration and total area of remediation.
1Delaunay/Voronoi diagrams and Thiessen polygons use this technique.
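A minimal sketch of these two computations, assuming the nearest-neighbor map has already been reduced to (polygon area, concentration) pairs; the areas, units, and target levels are hypothetical, and this is not Risk-On-Site's code.

```python
# Sketch of the area-weighted average and the cleanup-level substitution
# described above, given (polygon area, concentration) pairs from a
# nearest-neighbor map. All values are hypothetical.

def area_weighted_average(polygons):
    """polygons: list of (area, concentration) pairs."""
    total_area = sum(a for a, _ in polygons)
    return sum(a * c for a, c in polygons) / total_area

def apply_cleanup(polygons, target, remediated_level):
    """Substitute `remediated_level` where concentration exceeds `target`;
    return (post-remediation polygons, total area remediated)."""
    out, remediated_area = [], 0.0
    for area, conc in polygons:
        if conc > target:
            out.append((area, remediated_level))
            remediated_area += area
        else:
            out.append((area, conc))
    return out, remediated_area

site = [(100.0, 2.0), (50.0, 10.0), (50.0, 40.0)]    # (m^2, mg/kg)
baseline = area_weighted_average(site)               # 13.5 mg/kg
cleaned, area = apply_cleanup(site, target=20.0, remediated_level=1.0)
# area remediated = 50.0 m^2; post-remediation average = 3.75 mg/kg
```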
Finally, additional capabilities of Risk-On-Site encompass the elements needed for exposure
and risk assessment. The modular style used to program Risk-On-Site facilitates interfacing
with additional models. In a Risk-On-Site application at a complex, multiuse site,2 fate and
transport algorithms have been used to estimate exposure point concentrations in
environmental media in which measurements were unavailable (e.g., pollutant concentrations
in air due to evaporation from surface water or ground water). Receptor networks were
constructed to superimpose on contamination maps, and exposure scenarios were used to
describe the rate and frequency of human contact with site-related contamination. The
exposure estimates provided by the integration of this information were combined with dose-
response data to generate geographic distributions of risk estimates.
A demonstration disk of Risk-On-Site's capabilities is available upon request. The
demonstration runs on a DOS-based PC. A color monitor is preferable to display the graphic
portions of the demonstration.
2The site encompasses 90 acres, and is to be redeveloped for mixed industrial, commercial,
and residential land usage. A copy of the risk assessment of this site, in which Risk-On-Site
was extensively applied, will be provided upon request.
-------
THERdbASE: A Modeling and Database System for Making Total
Human Exposure Assessments
Muhilan D. Pandian
University of Nevada at Las Vegas, Harry Reid Center for Environmental
Studies, Las Vegas, NV
Joseph V. Behar
U.S. Environmental Protection Agency, Las Vegas, NV
THERdbASE is being developed as a PC-based computer modeling and database system that
contains exposure-related information. The system provides an optimal framework for the
construction of a suite of exposure-related models within the Modeling Engine by using
information available in data files within the Database Engine. It will be possible to use
information available in THERdbASE to determine total (multiple pollutants present in
multiple media and crossing the human envelope through multiple pathways) human
exposure estimates. Scientists and engineers will be able to use THERdbASE to make better
exposure assessments for various population(s) of interest and determine contributions of
different variables to total exposure.
The state-of-the-art models and submodels within THERdbASE are being organized into the
following categories:
¦	Human population distributions
¦	Human location/activity patterns
¦	Human food consumption patterns
¦	Pollutant releases from sources
¦	Microenvironmental pollutant concentrations (microscale)
¦	Ambient pollutant concentrations (macroscale)
¦	Exposure patterns
The submodels belonging to these categories are being integrated to obtain estimates of total
human exposure.
Data input to models is achieved through a standardized procedure. Input can be provided
as single values, custom distributions (normal, lognormal, beta, gamma, etc.), distributions
based on data files present in THERdbASE, or specific percentile values. When distributions
based on data files are required as inputs, only the appropriate THERdbASE data files are
provided as choices. Model execution is based on the mathematics behind the model itself.
Efficient algorithms are provided to access input data and generate the appropriate
output data. Multiple runs of a model can be executed through a batch process. Data output
from models is done in the following two ways: 1) as THERdbASE data files or 2) as preset
graphs. Any THERdbASE data analysis feature can be performed on the data files. The
preset graphs allow viewing of model results immediately after execution. Provision has been
made to save output data for multiple executions of the same model.
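The standardized input procedure described above (single values or named distributions feeding a batch of model runs) might be sketched as follows. The toy exposure model, distribution choices, and parameter names are assumptions for illustration, not THERdbASE's actual interface.

```python
# Sketch of the standardized input idea described above: each model input is
# a single value or a named distribution, and the model is batch-executed.
# The toy exposure model and all parameters are illustrative assumptions,
# not THERdbASE's actual interface.
import random

def draw_input(spec, rng):
    """spec: ('value', v) | ('normal', mean, sd) | ('lognormal', mu, sigma)."""
    kind = spec[0]
    if kind == "value":
        return spec[1]
    if kind == "normal":
        return rng.gauss(spec[1], spec[2])
    if kind == "lognormal":
        return rng.lognormvariate(spec[1], spec[2])
    raise ValueError(f"unknown input spec: {kind!r}")

def run_model(inputs, n_runs, seed=0):
    """Batch-execute a toy exposure model: exposure = concentration * intake rate."""
    rng = random.Random(seed)   # seeded so repeated batches are reproducible
    return [draw_input(inputs["concentration"], rng) *
            draw_input(inputs["intake_rate"], rng)
            for _ in range(n_runs)]

inputs = {"concentration": ("lognormal", 0.0, 0.5),  # geometric mean 1.0
          "intake_rate": ("value", 2.0)}             # fixed single value
out = run_model(inputs, n_runs=1000)
```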
The data files within THERdbASE are being organized into the following nine categories:
¦	Human population distributions
¦	Human location/activity patterns
¦	Human food consumption patterns
¦	Pollutant properties
¦	Pollutant sources (+ use patterns)
¦	Environmental characterizations
¦	Environmental pollutant concentrations
¦	Food contamination
¦	Human physiological parameters
Contents of data files can be viewed in tabular form (records and fields/rows and columns).
Data files are mostly organized by codes. The built-in relational structure allows users to
switch between coded and decoded information with the click of a button. While viewing,
columns of data can be set either to "show" or to "hide" mode. Multiple data files can be
viewed at the same time. While viewing contents of a data file, users can access records
based on simple queries (filters) on field values. Queries can be performed on both coded
and uncoded information. While viewing contents of a data file, users can perform simple
statistics on appropriate data fields. Only those fields relevant to statistics are highlighted.
Given a numerical data field, the following can be determined: a) summary statistics (mean,
standard deviation, minimum, and maximum), b) percentile values at desired intervals, and
c) distribution parameters. Given two numerical data fields, a correlation analysis can be
done. For text data fields, statistics will produce occurrence frequencies. "Smart" graphs allow
viewing model output results in pregraphed formats. Through the print function, the
following can be output to any printer: a) contents of a data file, b) query results, c) statistics
results, and d) graphs. Subsets of data files can be saved while viewing. Entire data files or
subsets thereof can be exported out of THERdbASE to standard formats (e.g., ASCII, dbf).
External data files in most popular formats that conform to the structure of existing
THERdbASE data files can be imported into THERdbASE.
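The per-field statistics described above (summary statistics, nearest-rank percentiles, and correlation between two numeric fields) can be sketched in a few lines. The field names and values are hypothetical; this is not THERdbASE code.

```python
# Sketch of the per-field statistics described above: summary statistics,
# nearest-rank percentiles, and Pearson correlation between two numeric
# fields. Field values are hypothetical; this is not THERdbASE code.
import math
import statistics

def summary_stats(values):
    """Mean, sample standard deviation, minimum, and maximum of a field."""
    return {"mean": statistics.mean(values),
            "sd": statistics.stdev(values),
            "min": min(values),
            "max": max(values)}

def percentile(values, p):
    """Nearest-rank percentile (0 < p <= 100)."""
    ordered = sorted(values)
    k = max(0, math.ceil(p / 100.0 * len(ordered)) - 1)
    return ordered[k]

def correlation(xs, ys):
    """Pearson correlation coefficient between two numeric fields."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = math.sqrt(sum((x - mx) ** 2 for x in xs) *
                    sum((y - my) ** 2 for y in ys))
    return num / den

field = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
stats = summary_stats(field)            # mean 5.0, sd about 2.14
median = percentile(field, 50)          # 4.0
r = correlation(field, [v + 1.0 for v in field])   # perfectly correlated
```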
-------
Development of RESRAD and Other Environmental Pathway and
Health Risk Models at Argonne National Laboratory
S.Y. Chen
Environmental Assessment Division, Argonne National Laboratory, Argonne, IL
Several multimedia environmental pathway computer models have been, or are in the process
of being, developed at Argonne National Laboratory (ANL). These models are centered on
the existing RESRAD code, although each has a separate computer package design and
serves different objectives.
RESRAD has been developed to implement the U.S. Department of Energy's (DOE's)
residual radioactive materials guidelines for contaminated soils (DOE Order 5400.5). It is
currently a participating code in the international verification and validation (BIOMOVS)
effort. The code has been used widely by DOE and its contractors, and to a certain extent
outside of DOE. To date, some 30 RESRAD workshops have been conducted by ANL.
Several major DOE programs have successfully utilized the code in assessing human health
risks and developing site cleanup criteria.
Several pathways can be analyzed by RESRAD: direct external exposure; inhalation of
particulates and radon progeny; and ingestion of foodstuffs (plants and animals), water, and
soil. All of these pathways are modeled in relationship to the contaminated area(s). As such,
the code has been equipped with modeling of above-surface (air, surface water, vegetation,
etc.) as well as subsurface (ground water) transport mechanisms. The code calculates human
health risk as the endpoint for each individual pathway and for all pathways aggregated, and
computes allowable soil contaminant concentrations. Analysis is performed in a time-
dependent fashion (up to 10,000 years).
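As a conceptual sketch of the pathway aggregation and time dependence described above (and not RESRAD's actual models), the example below decays a soil source term over time and applies hypothetical per-pathway risk factors:

```python
# Conceptual sketch (not RESRAD's actual models): the soil source term decays
# radioactively with time, a hypothetical risk factor converts the remaining
# concentration to risk for each pathway, and pathway risks are aggregated.
import math

def pathway_risks(initial_conc_pci_g, half_life_yr, t_yr, risk_factors):
    """risk_factors: {pathway: lifetime risk per pCi/g}; returns per-pathway risks."""
    conc = initial_conc_pci_g * math.exp(-math.log(2.0) * t_yr / half_life_yr)
    risks = {p: rf * conc for p, rf in risk_factors.items()}
    risks["all pathways"] = sum(rf * conc for rf in risk_factors.values())
    return risks

factors = {"external": 2e-6, "inhalation": 5e-7, "ingestion": 1e-6}  # hypothetical
r = pathway_risks(initial_conc_pci_g=10.0, half_life_yr=30.0, t_yr=30.0,
                  risk_factors=factors)
# after one half-life the source is 5.0 pCi/g, so external risk is about 1e-5
```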
Over the years, RESRAD has undergone major improvements in terms of both added
modeling capability and input database. The latest updates have been described in the
RESRAD Manual, Version 5.0 (ANL/EAD/LD-2, September 1993). The following have been
accomplished:
¦	Information regarding input parameters has been published in the Data
Collection Handbook.
¦	Sensitivity analytical capability now enables users to identify key input
parameters and set priorities for data collection efforts.
¦	Models for special nuclides (tritium and C-14) have been added.
¦	The menu system is improved and more user-friendly.
Plans for improving RESRAD in the future include developing the capabilities for
benchmarking and performing uncertainty analysis, among other things.
RESRAD-CHEM analyzes hazardous chemicals. The design of RESRAD-CHEM is parallel
to that of RESRAD, except for the addition of chemical properties and their related human
toxicity data. Chemical health risk (slope) factors (for cancer incidence) are taken from the
U.S. Environmental Protection Agency's (EPA's) IRIS and HEAST databases. Another
major design difference lies in the environmental pathways included. For instance, the
external exposure pathway is absent in RESRAD-CHEM, while the dermal absorption
pathway is unimportant in RESRAD (for radionuclides other than tritium). A draft version
of RESRAD-CHEM has been completed and has undergone DOE review.
RESRAD-BUILD analyzes human health risks from contaminants during building
dismantling (for workers) and occupancy (for the general public) through decommissioning
or rehabilitation. The RESRAD-BUILD code is based on a room compartmental model that
analyzes the effects on room air quality of contaminant emission and resuspension (as well
as radon emanation), the external radiation pathway, and other pathways such as air
immersion and indirect ingestion. Because of the proximity of human-to-contaminant contact
during building decommissioning or dismantling, RESRAD-BUILD requires more precise
pathway modeling and input data than does RESRAD. For instance, a detailed description
of contaminant distribution is needed to better define the source-to-receptor configuration.
The same applies to the room air quality. A draft version of RESRAD-BUILD has been completed
and has undergone DOE review. The code has been used successfully in a separate DOE
effort to assess potential release standards by calculating human health risks from radioactive
scrap metal recycle and reuse.
RESRAD-BASELINE and RESRAD-PROBABILISTIC are separate models currently under
development. RESRAD-BASELINE is a convenient tool designed to implement EPA's
guidance on human health risk assessment. RESRAD-PROBABILISTIC is intended to
perform uncertainty analysis for RESRAD using the Monte Carlo approach based on the
Latin-Hypercube sampling scheme.
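The Latin-Hypercube scheme mentioned above can be sketched briefly. The function below is a generic illustration of the sampling idea, not code from RESRAD-PROBABILISTIC: each parameter's range is divided into equal-probability strata, one sample is drawn per stratum, and the columns are shuffled independently.

```python
import random

def latin_hypercube(n_samples, n_params, seed=0):
    """Return n_samples points in [0,1)^n_params, with exactly one point
    per equal-probability stratum along each axis (Latin-Hypercube)."""
    rng = random.Random(seed)
    columns = []
    for _ in range(n_params):
        # one random point inside each of n_samples equal strata, shuffled
        column = [(i + rng.random()) / n_samples for i in range(n_samples)]
        rng.shuffle(column)
        columns.append(column)
    # transpose: each row is one parameter vector for a model run
    return list(zip(*columns))

pts = latin_hypercube(10, 3)  # 10 runs over 3 uncertain parameters
```

Compared with plain Monte Carlo, this guarantees full coverage of each parameter's distribution even with a small number of model runs, which is why it is a common choice for uncertainty analysis.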
In summary, a RESRAD code series is under development at ANL. These codes are
designed to perform human health risk analyses based on multimedia environmental pathway
models. Each code is tailored to meet a specific objective of human health risk assessment,
which requires specific parameter definition and data gathering. The combined capabilities
of these codes serve to satisfy various risk assessment requirements in environmental
restoration and remediation activities.

-------
ProTech: The Prospective Technology Communication System
Ann Lesperance
Battelle Pacific Northwest Laboratory, Seattle, WA
ProTech, the Prospective Technology Communication System, describes innovative
environmental cleanup technologies to a wide audience. The technologies described are
funded through the U.S. Department of Energy's (DOE's) Office of Technology
Development (OTD). The primary audience for ProTech consists of individuals and groups
who are interested in, or feel they have a stake in, waste management activities at each
Integrated Demonstration (ID) site. These stakeholders include the interested public, regulators, Native
Americans, and technology users. ProTech's objective is to provide three main benefits. First,
ProTech functions as a communication tool which can greatly enhance ID and Integrated
Program (IP) public-involvement activities. Second, ProTech describes innovative
technologies to varied stakeholders with the intent of soliciting input on technology
acceptance. Finally, ProTech increases national exposure for technologies and enhances
technology transfer activities. ProTech's secondary objectives include providing management
support to Integrated Demonstration Coordinators (IDCs) and OTD personnel and
increasing communication between people involved in technology development activities
throughout the DOE complex. A demonstration of ProTech and its capabilities will be
provided.
ProTech was developed in 1992, as a communication tool to describe innovative technologies
being demonstrated at DOE sites. The intended audience for ProTech is stakeholders
interested in or concerned about OTD's activities. Currently, ProTech focuses on IDs being
conducted across the DOE complex. ProTech includes site maps, summary fact sheets,
technology profiles, and technology diagrams. A prototype version of ProTech has been
developed for the Volatile Organic Compound (VOC)-Arid Site ID. Customized applications
of ProTech were available in January 1994. These versions describe technologies supporting
the following:
¦	VOCs at Arid Sites ID
¦	Underground Storage Tank ID
¦	Mixed Waste Landfill ID
¦	VOCs at Non-Arid Sites
¦	Uranium in Soils ID
¦	Buried Waste ID
As public-involvement activities were developed for the VOC-Arid ID, it was clear that the
traditional sources of information for stakeholders were Technical Test Plans (TTPs),
technical reports, and presentations. These sources, however, were often unavailable, too
technical to be readily understood, or incomplete because the desired information was not
available or was too uncertain. It became clear that no tool provided information on these
technologies in one place
and in an understandable format, and that a new approach to disseminating information
about new technologies was needed.
ProTech was initially conceived as a tool to assist in stakeholder involvement activities and
to thereby evaluate and enhance public, regulator, and technology user acceptance of
VOC-Arid ID technologies. The tool was expected to 1) describe to stakeholders what they
wanted to know about the technologies in a form they could understand and 2) describe how
technologies relate to one another.
ProTech filled an identified consumer need for data to make informed decisions about new
technologies. The development requirements were to create a tool that was appropriate for
meeting that consumer need; therefore, ProTech's design is user friendly, based on existing
hardware and software, understandable, and low cost.
ProTech allows users to learn about innovative technologies by displaying fact sheets and
comparing innovative cleanup technologies to established baseline technologies or to other
innovative technologies. The fact sheets include a text description (the need, process,
advantages, and challenges) and a diagram of each technology. They are simple and can be
used in a number of ways, such as in press releases. For example, the fact sheet on a drilling
technology called the cone penetrometer appeared in "Tech Trends." The fact sheet on Hybrid
Plasma Technology, developed by the Massachusetts Institute of Technology, was used by
the Boston Globe in a news article.
The technology comparison feature allows the user to compare technologies based on criteria
of interest in categories including effectiveness, environmental safety and health, sociopolitical
interests, and regulatory objectives. ProTech then looks up the two technologies (two
innovative or one innovative and one baseline), retrieves data on the selected criteria for
both technologies, and summarizes the results in a comparison chart. Technology information
for both the fact sheets and the comparisons comes from a ProTech technology profile, a
detailed form based on interests and concerns identified in over 40 stakeholder interviews
and two workshops.
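The lookup-and-compare step described above amounts to retrieving criterion values from each technology's profile and laying them side by side. The sketch below illustrates this with invented profile fields and values; it is not ProTech's actual data model.

```python
# Hedged sketch of the criteria lookup behind a technology comparison chart.
# The technology names, fields, and ratings are hypothetical.
profiles = {
    "cone penetrometer":  {"effectiveness": "high",   "cost": "low"},
    "baseline drilling":  {"effectiveness": "medium", "cost": "high"},
}

def compare(tech_a, tech_b, criteria):
    """Build chart rows of (criterion, value for A, value for B)."""
    return [(c,
             profiles[tech_a].get(c, "n/a"),
             profiles[tech_b].get(c, "n/a"))
            for c in criteria]

chart = compare("cone penetrometer", "baseline drilling",
                ["effectiveness", "cost"])
```

The same rows feed either an innovative-versus-baseline or an innovative-versus-innovative comparison, since both are just pairs of profiles.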
The comparison capability is an important function of ProTech in that it allows the user to
compare apples to apples to understand the advantages and, in some cases, the limitations
of the innovative technologies in comparison to the baselines or other innovative
technologies. This comparison feature has been found to be important to industry. Each
application of ProTech also allows users to get an overview of the problem addressed by and
the technologies supporting the ID. A series of site maps is also provided for each of the six
ID sites; the maps progress from a map of the United States to a cross-sectional view of the
site and the problem that the ID addresses. ProTech also furnishes contact names for those
who require further information.
The ProTech technology profiles can be accessed through the file transfer protocol (ftp)
server at 131.167.239.40.

-------
Remedial Action Cost Engineering and Requirements (RACER)
System
Thomas Ove
U.S. Air Force, Tyndall Air Force Base, FL
The U.S. Air Force (USAF) has established an ambitious goal of cleaning up all of its known
hazardous waste sites by the year 2000. This goal, combined with increasing pressure to
conclude studies and perform remediation activities sooner, prompted the USAF to look for
better and more efficient ways to do its work. RACER was developed to assist the USAF
in meeting these objectives by providing automated tools and data to characterize sites,
consider alternative remediation approaches, document the decision process, accurately
predict the costs of remediation, and manage those costs throughout the design process.
RACER is a knowledge-based system designed to aid in selecting appropriate remediation
approaches, estimating the cost of alternative remediation technologies, and preparing
remedial investigation/feasibility study (RI/FS) documentation. RACER includes two major
components: 1) the Remedial Action Assessment System (RAAS), an integrated object-
oriented expert system used to select remediation approaches, and 2) the Environmental
Cost Engineering System (ENVEST), a parametric cost model used to estimate the cost of
all phases of the remediation process.
RAAS, which is being developed by Battelle Pacific Northwest Laboratories for the U.S.
Department of Energy, will be used to select remediation technology trains. It includes
descriptions and performance data for approximately 100 different remediation technologies
and 200 different contaminants. Other criteria for selection include laws, regulations, and site
applicability.
ENVEST, which is being developed by Delta Research Corporation, Niceville, Florida, for
the USAF, will be used to estimate the total cost of remediation approaches, including RI/FS
or RFI/CMS, remedial design, remedial action, and postremediation operations and
maintenance modules. The system uses a hierarchical structure: parameters, system,
subsystem, assembly category, assembly, and line items.
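A parametric estimate organized this way rolls up naturally: line-item costs sum into assemblies, assemblies into subsystems, and so on to the system total. The sketch below shows that rollup over a nested structure; the names and dollar figures are invented, not ENVEST data.

```python
# Hedged sketch of a hierarchical cost rollup of the ENVEST style:
# system -> subsystem -> assembly -> line-item costs. All values invented.
estimate = {
    "ground-water extraction": {               # system
        "wells": {                             # subsystem
            "drilling": [12_000.0, 3_500.0],   # assembly: line-item costs
            "pumps":    [8_200.0],
        },
    },
}

def rollup(node):
    """Sum line-item costs recursively through any depth of nesting."""
    if isinstance(node, list):
        return sum(node)
    return sum(rollup(child) for child in node.values())

total = rollup(estimate)
```

Because the rollup is recursive, the same function prices a single assembly or the whole remediation phase, which is what lets a parametric model re-estimate quickly as parameters change.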

-------
The EnviroText Retrieval System
Rhea Cohen
U.S. Environmental Protection Agency, Superfund Office of Program
Management, Washington, DC
EnviroText, a full-text retrieval knowledge base, is being developed by the U.S.
Environmental Protection Agency with matching funds and data from the Departments of
the Army, Defense, Energy, Interior/Bureau of Mines, and Justice. In August 1993, users in
the sponsor agencies began a one-year pilot test of the system, while the U.S. Army
Construction Engineering Research Laboratories (CERL) completes the data acquisition,
software development, and establishment of the EnviroText Support Center at the University
of Illinois. Resident on the UNIX mainframe of the University, EnviroText will be opened
to the public in early 1995 as a not-for-profit national resource.
This shared federal/state system provides a one-stop library for government staffs,
environmental justice proponents, citizen groups, site managers, project reviewers,
environmental media specialists, planners, researchers, and lawyers. Topics include ecosystem
protection, natural resource trusteeship, pollution prevention, site restoration, occupational
safety and health protection, Resource Conservation and Recovery Act (RCRA) compliance,
and identification of potential federal and state Applicable or Relevant and Appropriate
Requirements (ARARs) under the Comprehensive Environmental Response, Compensation,
and Liability Act (CERCLA). Environmental data sets will include U.S. and state laws and
regulations (full texts and abstracts), U.S. Executive orders, U.S. Indian policies and Indian
tribal codes, international treaties and agreements, pending Congressional legislation, Federal
Register, and policies of federal agencies.
Designed to provide easy public access to environmental requirements, EnviroText—even in
its pilot phase—promotes interagency and intergovernmental cooperation, as well as faster,
more thorough research and economical use of public funds for information collection and
distribution. To realize the advantages of this system, the Superfund Program is currently
providing training to regional staff in the use of EnviroText as a means of saving research
and decision-making time in the CERCLA remedy selection process.

-------
Quality Assurance Issues and Suggestions for Environmental
Software Development*
Jeff van Ee
U.S. Environmental Protection Agency, Las Vegas, NV
Environmental software has been under development for some time. Some software has
proven to be quite useful, but a variety of issues impede greater acceptance of the software.
The first problem is knowing what software is available, and the second problem is
determining if the software is adequate and acceptable for solving an environmental problem.
Distribution of software has been mixed with private and public domain software being
distributed through a variety of channels. Major considerations in using a software product
are whether the software is current, easy to use, and produces quality results.
One area in which quality software can be ensured is through a rigorous quality
assurance/quality control (QA/QC) program. Periodic testing of code to ensure that
algorithms have been properly captured, recording changes in code, updating documentation,
and beta testing of code before it is widely released are some of the traditional ways to
enhance the quality of software. Unfortunately, as software becomes more complex, the
resources required to ensure quality code become larger. Developers of environmental
software have an especially difficult task in ensuring quality code because of the complexity
of the problems the code is trying to address.
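The "periodic testing of code to ensure that algorithms have been properly captured" amounts to checking coded routines against independently known answers. A minimal illustration, with a generic numerical routine standing in for an environmental algorithm:

```python
# Hedged illustration of an algorithm-capture check: compare a coded routine
# against an analytic reference value. The routine below is generic; it is
# not taken from any of the environmental codes discussed in this abstract.

def trapezoid(f, a, b, n):
    """Trapezoid-rule approximation of the integral of f over [a, b]."""
    h = (b - a) / n
    total = 0.5 * (f(a) + f(b))
    for i in range(1, n):
        total += f(a + i * h)
    return total * h

# Regression check: the integral of x^2 on [0, 1] is exactly 1/3, so any
# change to the routine that breaks the algorithm will fail this comparison.
approx = trapezoid(lambda x: x * x, 0.0, 1.0, 1000)
```

Keeping a suite of such reference comparisons, and rerunning it whenever the code changes, is one of the cheapest of the traditional QA practices listed above.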
A significant problem in environmental software is whether the models and algorithms are
correctly chosen and applied to the problem. A wide variety of potential users exist as well
as a wide variety of potential applications of the software. Precautions must be taken to
ensure that the environmental software is appropriately applied to the problem at hand.
Finally, there is the problem of inputting and outputting data. A large variety of interrelated
software is being developed by a number of parties; unfortunately, few widely accepted data
exchange standards exist to allow data to be easily interchanged. In addition, standard
definitions for data are usually weak, and ancillary data required to accompany critical data
in an environmental program are often poorly handled or neglected.
Suggestions are provided on steps that can be taken to promote the development and use
of environmental software.
*The research described in this abstract has been funded by the U.S. Environmental
Protection Agency (EPA) through its Office of Research and Development (ORD). This
abstract and the oral presentations it summarizes have not been subjected to ORD's peer
and administrative review and do not necessarily reflect the views of EPA or ORD.

-------
Software Development Process: The Key to Success
Lillian Snyder
Sandia National Laboratories, Albuquerque, NM
The keys to a successful software development process are 1) an open approach that includes
the customer as an important member of the team and gives the customer access to all
aspects of the development process; 2) a team approach that includes the customer, the
conceptual or methodology developers (if different than the customer), and the software
developers, testers, and document writers; and 3) a control system consisting of two parts:
a) a tracking system (preferably electronic) and b) a control board composed of the
development team including, and in many cases led by, the customer.
The control system maintains consistency and control over all aspects of the software
development cycle. Once documentation or software has been accepted as the standard and
entered into the control system, it can be changed only through a change request (also
entered into the system). The status and resolution of these change requests are
maintained in the control system and can be accessed for review at any time.
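The tracking half of such a control system can be sketched in a few lines. The class, fields, and statuses below are illustrative assumptions, not Sandia's actual tracking system:

```python
# Hedged sketch of change-request tracking: requests are filed against a
# controlled item, resolved, and remain reviewable at any time. All names
# and statuses here are invented for illustration.
class ChangeLog:
    def __init__(self):
        self._requests = []

    def submit(self, item, description):
        """Record a change request against a controlled item; return its id."""
        req = {"id": len(self._requests) + 1, "item": item,
               "description": description, "status": "open",
               "resolution": None}
        self._requests.append(req)
        return req["id"]

    def resolve(self, req_id, resolution):
        """Close a request, recording how it was resolved."""
        self._requests[req_id - 1].update(status="closed",
                                          resolution=resolution)

    def history(self, item):
        """Every request ever filed against an item, open or closed."""
        return [r for r in self._requests if r["item"] == item]

log = ChangeLog()
rid = log.submit("user_manual_v1", "update pathway table")
log.resolve(rid, "table revised in v1.1")
```

The essential property is that nothing is ever deleted: the history of an item is the audit trail the control board reviews.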
The process above only works if: 1) quality and product excellence is assumed to be the
responsibility of each and every member of the team and begins on Day 1 of the project
(quality is considered intrinsic to every step in the process, not an external exercise); 2) each
member of the team respects the others and allows the experts in any one aspect to assume
leadership when discussions or actions are required in that area; 3) each member of the
team, including the customer, agrees to be actively involved in all steps of the process; 4) all
members review the documentation and come to an agreement on the version to be
considered the standard; 5) all members enter into the electronic control system all changes
necessary to make the product successful and correct; 6) all members, including the
customer, log onto the control system to view and test the software as each new version is
finished, and each person who tests the system follows through by entering all changes or
problems into the electronic control system; and 7) all members of the team feel that the
final product is a reflection on their professionalism, and so insist on excellence from
themselves and each other.

-------
Getting the Right Answer
Robert J. Gelinas
Lawrence Livermore National Laboratory, Environmental Restoration Division,
Livermore, CA
The most effective decision support tools (DSTs) enable science to deliver the best available
answers, at any given time, to managers, policy makers, and the general public. DSTs must
perform two major functions: first, they must demonstrably increase the productivity of
practicing scientists and aid in the decision-making process; and second, they must convey
highly technical results, accurately and in context, to technical nonexperts. These are
especially critical functions in Superfund/Resource Conservation and Recovery Act (RCRA)
actions, where advanced fate and transport models and site-specific hydrogeologic
characterizations are at the heart of evaluations for controlling and remediating toxic
materials in the environment, reducing risks, minimizing the economic and social
consequences of toxic materials, and maximizing the effectiveness of public policy.
Successful development and use of DSTs face a basic dilemma, arising when physics models,
which were originally built as science tools by specialists, often wary of misapplications,
become the engines of DSTs that will be used by technical nonexperts, who simply want to
get the right answer. On the one hand, DSTs should represent the best available science,
which is fundamentally uncertain and constantly advancing. On the other hand, firm answers
are needed with little uncertainty. Pragmatic environmental predictions must be made prior
to knowing final outcomes while recognizing intrinsic physical and representative
uncertainties.
The key to resolving this dilemma is in asking the right questions with appropriate technical
expectations. I will draw from successful historical paradigms that have resolved some of
these complex dilemmas now faced in applying DSTs to RCRA actions.

-------
The Defense Environmental Corporate Information Management
(DECIM) Program: Software Development and Data Standardization
Windell Ingram
U.S. Army Corps of Engineers, Computer Science Division,
Waterways Experiment Station, Vicksburg, MS
Mark Bovelsky
U.S. Army Environmental Center, Information Management Branch,
Aberdeen Proving Ground, MD
The U.S. Department of Defense (DOD) Corporate Information Management (CIM)
initiative intends to adopt standard business practices and standard information systems
throughout DOD. The CIM initiative is DOD's version of "reinventing government." One
CIM goal is to reduce costs by eliminating the development, operation, and maintenance of
redundant information systems. The DECIM program is the environmental component of
the CIM initiative.
The general DECIM strategy is to select and deploy "migration" systems, i.e., existing
information systems that may be "reengineered" to meet initial requirements and will then
evolve to support improved business practices as they are implemented.
Migration systems are selected through a process that includes group sessions where subject
matter experts (SMEs) representing DOD components use an electronic meeting system
(EMS) environment to examine business practices and information needs. These sessions
lead to the development of activity models and data models corresponding to current
requirements. Similar sessions identify system selection criteria and evaluate candidate
systems. The functionally acceptable candidate systems are then evaluated for technical merit,
and a selection recommendation follows.
Migration systems will be used primarily by environmental coordinators at DOD installations.
Initially, systems must be targeted to the host platform commonly available to this user
community, i.e., a 386-based machine running DOS. Data transfer will occur either
electronically via file transfers or by mailing floppy disks.
Ultimately, the intent is to provide the necessary information infrastructure, such that all
users have high-speed access to common database servers and data sharing can occur via
client-server applications. It is envisioned that Geographic Information Systems (GIS) will
play a large role in the future infrastructure.
DECIM has adopted software development practices that comply with DOD mandates and
satisfy pragmatic requirements. Four development centers collaborate, adhere to common
practices, and use common tool sets. Data models are developed, object-based design
methods are used, and Ada is the standard programming language. AdaSage is currently the
standard tool set. Each center has recently acquired a Rational software engineering
environment.
Data standardization is a primary goal for DECIM. Logical data models are produced early
in the analysis process for each business activity. These models will be integrated and
conflicts and inconsistencies resolved. A comprehensive standard DECIM data dictionary is
a goal. The integrated DECIM data model and data dictionary will ultimately become part
of the global CIM data model and data dictionary.

-------
Decision Support System for Evaluating Remediation Performance
With Interactive Pump-and-Treat Simulator
David S. Burden, Ph.D.
U.S. Environmental Protection Agency, Robert S. Kerr Environmental Research
Laboratory, Ada, OK
Hanadi S. Rifai, Ph.D.
Rice University, Energy and Environmental Systems Institute, Houston, TX
Over the past decade, numerous Superfund sites have implemented pump-and-treat systems
in an effort to remediate contaminated ground water. In 1989, the U.S. Environmental
Protection Agency conducted a detailed evaluation of ground-water extraction systems at 112
sites and determined that the ground-water extraction systems were generally effective in
maintaining hydraulic containment of contaminant plumes, thus preventing further migration
of contaminants. Significant removal of contaminant mass from the subsurface is often
achieved by ground-water extraction systems. When site conditions are favorable and the
extraction system is properly designed and operated, it may be possible to remediate the
aquifer to health-based levels. Contaminant concentrations usually decrease most rapidly
soon after the initiation of extraction. After this initial reduction, the concentrations often
tend to level off, and progress toward complete aquifer restoration is usually slower than
expected or impossible to achieve. Data collection, both prior to system design and during
operation, was frequently insufficient to fully assess contaminant movement and response of
the ground-water system to extraction.
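The decline-then-tailing behavior described above is often pictured as fast flushing of mobile contaminant superimposed on slow release from the aquifer matrix. The two-exponential model and every coefficient below are illustrative, not a calibrated site model:

```python
import math

# Hedged sketch of pump-and-treat concentration history: rapid early decline
# (flushing of mobile mass) plus a slowly declining tail (e.g., desorption).
# The functional form and all coefficients are invented for illustration.
def concentration(t, c_fast=90.0, k_fast=1.0, c_slow=10.0, k_slow=0.01):
    """Concentration at time t (years) as a fast plus a slow exponential."""
    return c_fast * math.exp(-k_fast * t) + c_slow * math.exp(-k_slow * t)

early_drop = concentration(0.0) - concentration(2.0)    # large initial drop
late_drop = concentration(10.0) - concentration(12.0)   # the "leveling off"
```

The slow term is why extrapolating the early decline badly overestimates cleanup progress, and why data collected only during early operation can be insufficient to assess the system's response.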
A graphical decision support system (DSS) is being developed that would be used for
evaluating extraction networks at contaminated sites. The Pump-and-Treat DSS (P&T DSS)
design includes three components: 1) a Global module, 2) a Site-Specific module, and 3) an
Interactive Simulator module. The Global module is aimed at familiarizing the user with
remediation technologies that can be used for cleanup of soils and ground water
contaminated with petroleum hydrocarbons or solvent mixtures. The Site-Specific module
develops a case study of evaluating a pump-and-treat system using data from the U.S. Air
Force Plant 44 site in Tucson, Arizona. Finally, the Interactive Simulator module is a
graphical modeling interface that allows the user to design a pump-and-treat system for a
given site. The interface is built around the BIOPLUMEII model.

-------
Sandia's Environmental Decision Support System (SEDSS): A Tool
To Guide Site Characterization, Risk Assessment, and Remedial
Design Selection
Erik K. Webb, Steve Conrad, and Lillian Snyder
Sandia National Laboratories, Albuquerque, NM
SEDSS is a process and supporting computer system designed to use probabilistic modeling
and measures of risk to guide decisions related to site safety, remediation selection, and
closure. This approach explicitly captures the inherent uncertainties associated with irregular
physical/chemical systems, sources, transport pathways, exposure, and consequences so that
these uncertainties can be factored into site-specific decisions. In addition, SEDSS provides
a direct link between data analysis (risk assessment, etc.) and new data collection by
guiding/optimizing new site characterization and site monitoring activities based on estimates
of risk and cost. Potentially, this methodology can be applied to any environmental pathway,
both radionuclide and hazardous contaminants, and works with any type of detection or site
characterization tool.
The automated system is designed to provide a user-friendly computer-based system where
the user works through a Graphical User Interface (GUI) to select the type of problem that
is to be solved (Application). This in turn allows the user access to data in databases and
Geographical Information Systems (GIS) and to probabilistic modeling tools, specific to site
conditions, from an extensive toolbox.
For any of the system's applications, there is a set framework for solving the problem. This
framework consists of several steps that are arranged to allow the user to iterate between
analysis and data collection to solve the questions listed above. As shown in Figure 1, the
automated system (following this framework) queries the user on the exact objective of the
analysis to formulate a set of numerical performance measures for the site (1); provides
access to data stored in a Geographic Information System or database (2); queries the user
in a consistent manner to develop a comprehensive description of the user's understanding
of site conditions, establishes probabilistic models to simulate the user-defined system, and
accepts uncertainty in input parameters (3); performs the uncertainty analysis (4); and
compares the analysis directly to the original performance measures for the user to then
make a decision (5). If the user is faced with inadequate information to make a decision,
sensitivity analysis (6), data worth, and cost/benefit functions (7) provide a comparison of the
value of additional data versus their cost, and the user can decide to collect additional data
or move to a new phase of action (8). New data would be collected and entered into the data
management system (9), thereby completing the process.
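The decision step of this loop (steps 4 through 8) can be sketched as a comparison of the uncertain risk estimate against the performance measure, with a data-worth tie-breaker. The threshold logic and all numbers below are invented to illustrate the iteration, and are not SEDSS's actual decision rule:

```python
# Hedged sketch of one pass through the SEDSS-style loop: compare an
# uncertain risk estimate to a performance measure, and if the uncertainty
# straddles the limit, weigh the value of new data against its cost.
def decide(mean_risk, risk_uncertainty, performance_limit,
           data_cost, data_value):
    """Return the action suggested by one analysis iteration."""
    if mean_risk + risk_uncertainty < performance_limit:
        return "site acceptable"          # confidently below the limit
    if mean_risk - risk_uncertainty > performance_limit:
        return "remediate"                # confidently above the limit
    # Uncertainty straddles the limit: collect data only if it is worth it.
    return "collect data" if data_value > data_cost else "remediate"

outcome = decide(mean_risk=0.8, risk_uncertainty=0.5,
                 performance_limit=1.0, data_cost=50.0, data_value=200.0)
```

Each "collect data" outcome feeds new measurements back into the data management system, shrinking the uncertainty band for the next pass, which is the iteration between analysis and data collection described above.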
The only baseline to which we can compare SEDSS is the subjective decision-making process
currently used at most sites. Individual decision-makers, when queried, will usually be found
to follow a process that either explicitly or implicitly covers most of these steps. When that
process is not explicitly documented, however, decision-making becomes highly inconsistent, and in most cases
extremely inefficient. Furthermore, subjective decision-making is usually full of qualitative
assessments. SEDSS provides a consistent, explicit approach to defining and solving problems
related to risk assessment, site characterization, remediation selection, and monitoring. It
also speeds up the process by providing automated tools to facilitate the decision-maker's
job.
SEDSS is being developed in stages. The first release will be a beta version which will be
tested in fiscal year 1995. This version will contain applications to perform risk assessment
for both radionuclides and hazardous constituents through the ground-water pathway.
SEDSS is funded in part by the U.S. Environmental Protection Agency (EPA) Office of
Emergency and Remedial Response (Superfund) headquarters, EPA Radiation and Indoor
Air, the Nuclear Regulatory Commission (NRC) Office of Research for Low-Level Waste,
the U.S. Department of Energy (DOE) Mixed Waste Landfill Integrated Demonstration,
DOE Uranium Mill Tailings Remedial Action Program, and DOE Environmental
Restoration Albuquerque Project Office.
Figure 1.

-------
DuPont's CD/ROM Decision Support System: HEART
Richard Jensen
E.I. DuPont, Wilmington, DE
DuPont expends approximately $10 million annually on remediation technology development
and many times that amount on cleanup operations. Technology development includes
researching, developing, external networking, gathering, evaluating, interpreting, ranking, and
disseminating. Faced with an "information glut," DuPont has developed a CD/ROM-based
technology transfer and decision support system.
Philosophy: DuPont finds that information is most useful when it is the right information,
delivered to the right place, at the right time, in a form that can easily be used. Information
should be organized into multiple tiers, beginning with the simplest, most friendly tier, and
progressing stagewise toward the most complex. Information may start out as data; as it is
manipulated, it progresses through the stages of knowledge, know-how, and finally wisdom.
As information progresses through these stages, the volume which must be communicated
decreases, and the usefulness increases. The advent of personal computers and optical
storage devices has made it possible to follow these rules.
Written entirely in Turbo Pascal for the DOS environment, DuPont's system consists of
separate "objects" for displaying text, graphics, and multipage scanned and compound
documents. Information elements are linked in hypertext fashion for developing decision
trees and matrices. Smooth transitions between text and graphics are provided. The system
includes internally generated guidance, U.S. Environmental Protection Agency (EPA) and
U.S. Department of Defense (DOD) guidance, scanned technical articles, vendor brochures,
photographs of equipment, and special decision support matrices and questionnaires.
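Linking information elements "in hypertext fashion" means any element can point to others, and a reader can reach guidance, articles, and brochures by following links from a starting point. The node names and link structure below are invented for illustration, not DuPont's actual system:

```python
# Hedged sketch of hypertext-style linking between information elements.
# Node names, text, and links are all hypothetical.
nodes = {
    "soil-washing":    {"text": "Soil washing overview",
                        "links": ["epa-guidance", "vendor-brochure"]},
    "epa-guidance":    {"text": "EPA guidance excerpt", "links": []},
    "vendor-brochure": {"text": "Vendor brochure scan", "links": []},
}

def reachable(start):
    """All nodes a reader can reach by following links from `start`."""
    seen, stack = set(), [start]
    while stack:
        n = stack.pop()
        if n not in seen:
            seen.add(n)
            stack.extend(nodes[n]["links"])
    return seen

found = reachable("soil-washing")
```

A decision tree or matrix in such a system is just a disciplined arrangement of these links, which is what lets the simplest tier lead stagewise into the most complex material.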
EPA might consider CD/ROM as a means of disseminating remediation guidance and
support tools. Numbers of important documents could be grouped on a CD/ROM, linked
in a logical, tiered fashion through decision trees and matrices. EPA could say, "On this
single CD are all the decision support tools for ____. The tools are backed up
by the entire body of important information we are aware of on the subject, as of
____. If you have this CD, you have all that we have!"

-------
Introduction to the Internet: Environmental Resources
Byron Palmer
Los Alamos National Laboratory, Los Alamos, NM
Internet History and Tools
The Internet started in the 1970s with the Advanced Research Projects Agency (ARPA) and
has grown to be a global network with millions of users. Except for the very recent
phenomenon of America Online, it is the fastest growing network of users, and has the
largest number of users (17 million as of 1993) of any collection, including CompuServe. It
is unique in that there exists no one governing body, but a collection of people interested in
its continuance. It connects a number of different networks and has a variety of protocols.
There are a number of different things that the user can do on the Internet. All of these are
driven by the protocols that are collectively agreed upon using the Request for Comment
(RFC) approach. The earliest protocol was telnet, wherein the user's computer connects with
another computer and operates the other computer from his or her "terminal." The following
list is some of the standard protocols that exist on the Internet:
telnet	connect to a remote computer
ftp	file transfer protocol for moving files between computers
smtp	simple mail transport protocol for E-mail
pop	post office protocol for handling mail on a server
slip	serial line Internet protocol for dialing into the Internet
ppp	point to point protocol—similar to slip, only better
The other protocols are not listed as they are not commonly used, at least as far as the user
sees. There are a number of different services that the Internet provides, based in part on
the protocols listed above. These are:
■	E-mail	electronic mail to anyplace
■	finger	locate a user on a computer
■	phone	locate a user in a "phone directory"
■	gopher	information front end that relies on intelligent ftp clients
-------
■	mosaic	graphical information front end to World Wide Web (WWW) servers
■	wais	wide area information server; finds information based on phrases
■	news	news postings on a variety of topics
■	lists	information servers based on lists of users; sent via E-mail
■	archie	search Internet sites for software
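As a concrete illustration of one of the services above, the sketch below composes an RFC 822-style message of the kind the smtp protocol carries. The addresses and text are invented, and the example uses a modern standard library rather than any tool discussed at the workshop:

```python
# Sketch: composing an RFC 822-style message, the wire format that the
# smtp protocol listed above transmits. Addresses and content are
# invented for illustration.
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "user@example.gov"        # hypothetical sender
msg["To"] = "colleague@example.gov"     # hypothetical recipient
msg["Subject"] = "Workshop notes"
msg.set_content("The slides are on the ftp server.")

raw = msg.as_string()  # headers plus body, as smtp would send them
```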
Connection to the Internet
There are a number of different ways to connect to the Internet. If the user's organization
is connected, he or she should find it relatively easy to obtain access at a personal computer.
If the user's organization is not connected and does not plan to connect in the near future, he or she
will have to go with a personal connection provided by dial-up services. These range from
"shell" accounts to slip and ppp services. Only through the use of slip or ppp will the user
obtain the full connection capabilities of the Internet.
Guides to the Internet
There has been an explosion of books on the Internet in the past couple of years. These
books often contain software for connection as well as many of the tools to get started. Some
even contain offers from service providers to try out the Internet. Service providers have also
grown in the past couple of years, and it is now hard to find a current listing as new areas
are constantly being added by new and established providers.
Serving the Internet
If the user has a direct connection to the network, he or she can become one of the
information providers—if not for the world, then for his or her own work group. These
servers now run on desktop computers rather than on large mainframe computers, and many
are free or relatively cheap to set up and administer. The cost, except for the Internet access,
is minor and maintenance is easy.
WWW servers are very popular but require some work to establish; ftp sites are easier to
run; and gopher servers fall between ftp and WWW servers in complexity. E-mail via pop
servers is easy to administer. WAIS servers have not yet made the transition to the
desktop, but it is only a matter of time.
Guidelines for serving the Internet will be discussed as well as recommendations for servers
and clients. There will also be a general discussion of what the Internet does and does not
provide.
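The claim above that desktop servers are cheap and easy to run can be illustrated with a minimal WWW server written against the standard library alone; the page content and use of an automatically chosen port are arbitrary choices for this sketch:

```python
# Sketch: a minimal WWW (HTTP) server of the desktop kind described
# above, built entirely from the standard library. Page content is
# invented; port 0 asks the operating system for any free port.
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"<html><body>Decision support index</body></html>"
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the console quiet

server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
```

Any WWW client, such as a mosaic-style browser, could then fetch the page from the server's address and port.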
-------
Waste Management and Technologies Analytical Database System
(WMTADS)
Byron Palmer
Los Alamos National Laboratory, Los Alamos, NM
Los Alamos National Laboratory, with U.S. Department of Energy (DOE) support, has been
developing analytical tools for waste management and treatment. These tools have been
directed toward the waste that DOE has generated and continues to generate. They center
around the technologies and their selection to treat waste and have been applied to the Site
Treatment Plans and to the Environmental Impact Statement (EIS), both of which are
ongoing projects within the DOE complex.
We have collected a large number of different technology descriptions from industry, DOE,
and the U.S. Environmental Protection Agency's databases of treatment technologies. We
have reviewed these technologies and tagged them appropriately so that any given technology
can be evaluated for a specific problem. We are in the process of further refining these tags
to better match the technologies to wastes.
We have also implemented tools for analyzing wastes and the treatment systems necessary
(type and size) to handle the wastes. These tools have been applied to the EIS work to
generate waste volume loads for treatment modules in the various schemes proposed by
DOE.
Finally, we are developing tools that combine both the technology selection and the
treatment system analysis to help in the design of a treatment system appropriate for the
problems. These tools will offer alternatives for treatment systems and provide cost
information for these treatment systems. The goals of this effort are to provide a rapid way
for managers to analyze various options available to handle waste problems and to aid in
providing alternatives that are feasible, while helping to reject those that will not work well
for a given situation.
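The tag-based matching of technologies to wastes described above can be sketched as a simple filter over tagged technology records. The technology names, tags, and waste description below are invented, not taken from WMTADS:

```python
# Sketch: tag-based matching of treatment technologies to a waste
# stream, in the spirit of WMTADS. All records and tags are invented.

technologies = [
    {"name": "Incineration",  "treats": {"organic", "combustible"}},
    {"name": "Stabilization", "treats": {"metals", "inorganic"}},
    {"name": "Vitrification", "treats": {"metals", "radioactive"}},
]

def candidates(waste_tags, techs):
    """Return technologies whose tags cover every tag of the waste."""
    return [t["name"] for t in techs if waste_tags <= t["treats"]]

# A waste tagged both "metals" and "radioactive" rules out technologies
# that handle only one of the two.
matches = candidates({"metals", "radioactive"}, technologies)
```

A production system would refine the tags much further, as the abstract notes, but the selection step reduces to this kind of set comparison.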
-------
Access to the U.S. Environmental Protection Agency's High
Performance Computing Resources for Environmental Decision
Support
Joan H. Novak
U.S. Environmental Protection Agency, Research Triangle Park, NC
The U.S. Environmental Protection Agency (EPA), as part of the Federal High Performance
Computing and Communications Initiative, is developing an Environmental Modeling and
Decision Support System to take maximum advantage of emerging high performance
computing platforms and network technology. One main goal of EPA's program is to provide
powerful, yet easy to use, environmental management tools to state, federal, and industrial
organizations involved in day-to-day environmental problem-solving. The approach
encompasses building a high performance computing and communications environment within
EPA to handle more complex regional multipollutant and multimedia environmental issues,
while at the same time ensuring compatible, low-cost solutions for use at local levels. Thus,
with a distributed computing approach, a problem can transparently migrate to the most cost
and time effective computing platform as the problem size grows or as time constraints
become critical. Decision-makers at all levels will have the ability to access the most powerful
resources necessary to resolve critical environmental issues.
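The transparent migration described above, where a problem moves to the most cost- and time-effective platform as it grows, amounts to a dispatch decision. The sketch below is purely illustrative; the platform names and size thresholds are invented and not part of EPA's system:

```python
# Sketch: routing a model run to a computing platform by problem size,
# illustrating the transparent migration the abstract describes.
# Platform names and thresholds are invented for illustration.

PLATFORMS = [
    ("desktop workstation", 1_000),       # up to 1,000 grid cells
    ("regional server",     100_000),     # up to 100,000 grid cells
    ("supercomputer",       float("inf")),
]

def choose_platform(grid_cells):
    """Pick the smallest platform that can hold the problem."""
    for name, limit in PLATFORMS:
        if grid_cells <= limit:
            return name
```

In the real system the decision would also weigh time constraints and cost, but the user-facing effect is the same: the run lands on an appropriate machine without manual intervention.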
-------
EnviroTRADE: A Commercialization Case Study
Susan W. Johnson
U.S. Department of Energy, Office of Technology Development,
Germantown, MD
The U.S. Department of Energy (DOE) is developing the EnviroTRADE environmental
information system for the purpose of managing large amounts of information on waste site
and environmental technologies. The system uses decision support tools through an easy-to-
use interface for matching technologies to sites and locating potential sites where
technologies might be applied. Spatial data is managed using state-of-the-art geographic
information system technology. The information will be delivered on a network and must be
both accurate and current. DOE has begun the process of commercializing the
EnviroTRADE information system because the private sector is well suited to market and
deliver the data.
-------
Products and Services of the National Technical Information Service
Jan Guastaferro
U.S. Department of Commerce, National Technical Information Service,
Springfield, VA
The National Technical Information Service (NTIS), an agency of the U.S. Department of
Commerce, operates a national clearinghouse for federal scientific, technical, engineering,
and business information. The primary mission of NTIS is to make federal information
products accessible to business and industry. This presentation describes some of the
environmental software products available through NTIS and discusses services NTIS offers
to federal agencies, including a case study of how the Superfund program office has
partnered with NTIS to disseminate information.
NTIS helps federal agencies accelerate the distribution of their information products to a
broad user community by offering a number of services:
¦	Basic information dissemination and order fulfillment services
¦	Special collection management, marketing, and distribution services
¦	Reproduction and electronic media production services
¦	Financial and contractual services
NTIS performs many of these services for the Office of Emergency and Remedial Response
within the Office of Solid Waste and Emergency Response in support of the Superfund
program. Superfund instituted a policy change in 1992 under which the program office
continues to pay for the U.S. Environmental Protection Agency's (EPA's) internal distribution
of Superfund information, while NTIS assumes the burden of public dissemination. Several
new subscription products are now offered based on topics of interest such as site assessment
and remediation, technologies and analytical services, and program policies, where a
customer automatically receives any document assigned that subject category when it is
released. NTIS production staff currently are on site at the Superfiind Document Center to
handle the sizable number of internal orders.
Collection management services represent a large part of the EPA Superfund-NTIS
partnership. There are about 2,000 documents in the Superfund collection including 600
active documents. Each type of document (publication, fact sheet, directive, etc.) has its own
format and printing specifications. NTIS maintains a separate inventory control system for
Superfund internal copies, which are stored at NTIS in Springfield, Virginia.
-------
NTIS can help announce and market new government reports and products. An example of
how we help EPA Superfund get the word out on what's new is the Superfund early bird
window on NTIS's FedWorld, an on-line service and bulletin board. The window lists the
new Superfund documents available that month and has helped cut down on calls to EPA
hotlines.
NTIS's Electronic Media Production Service provides complete production services for
electronic media on magnetic tape, digital tape, floppy disks, microdiskettes, and CD-ROM.
We assist our clients at every step of the development, production, and distribution process,
adding value based on our experience with over 350 different projects and over 71 different
agencies. The scope of this work has ranged from projects costing several thousand dollars
to very large projects, such as the National Library of Medicine's Grateful Med, which cost over
$1 million. NTIS has in-house production capabilities as well as prescreened and prequalified
contractors for all types of media. Most requests for quotations close after three to five
business days.
NTIS also provides a complete spectrum of business services that support the production
effort. NTIS's accounting services assist agencies by managing billing and collections for their
information dissemination programs. This is a very time-consuming and labor-intensive
activity, and NTIS is an excellent alternative to tying up scarce resources within your agency
by performing these tasks.
NTIS receives over 50 new environmental documents a day and has over 4,000 computer
data files, models, and software programs. Several environmental decision support tools
available from NTIS will be described. Copies of the catalog of U.S. Government
Environmental Datafiles & Software will be available at the presentation and the poster
exhibit.
-------
The U.S. Environmental Protection Agency's Environmental
Monitoring Methods Index (EMMI): A Tool for Environmental
Monitoring
William A. Telliard
U.S. Environmental Protection Agency, Office of Water Programs,
Washington, DC
The U.S. Environmental Protection Agency (EPA) has developed a computerized database
containing analytical testing method and regulatory information on environmentally
significant analytes that are monitored by EPA. The database, the Environmental Monitoring
Methods Index (EMMI), is the result of efforts by the Agency's Environmental Monitoring
Management Council (EMMC) and the Office of Water. EMMI is designed to aid
environmental program managers and others who must develop lists of analytes to study,
identify appropriate analytical methods, determine the most suitable methods for a particular
purpose, evaluate available methods prior to developing new analytical procedures, locate
sources for analytical standards, and identify contact points for environmental regulation and
analytical methods.
The present version of EMMI is the result of an exhaustive search of the U.S. Code of
Federal Regulations, the Federal Register, and published analytical methods. EMMI
encompasses a total of 2,607 analytes, 49 regulatory and monitoring lists, 1,167 analytical
methods, and a database cross-reference to 5,740 analytes. The Chemical Abstracts Service
(CAS) Registry Number is used to unambiguously identify analytes contained in the
database. This presentation will focus on using EMMI as a problem-solving tool to identify
appropriate analytes and methods for environmental monitoring studies. A case study will
be examined where EMMI was used to identify analytical methods that met the required
analytical detection limits and other study data quality objectives. EMMI will be
demonstrated to the audience during the presentation and a working demonstration copy will
be provided to all attendees.
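The case-study use of EMMI described above, screening methods against required detection limits, can be sketched as a simple filter over a method table. The table entries and the required limit below are invented for illustration and are not drawn from EMMI:

```python
# Sketch: selecting analytical methods that meet a required detection
# limit, in the spirit of the EMMI case study above. Analytes, method
# numbers, and detection limits are invented for illustration.

methods = [
    {"method": "Method A", "analyte": "benzene", "detection_limit_ug_l": 0.5},
    {"method": "Method B", "analyte": "benzene", "detection_limit_ug_l": 2.0},
    {"method": "Method C", "analyte": "lead",    "detection_limit_ug_l": 0.1},
]

def suitable(analyte, required_limit_ug_l, table):
    """Methods for the analyte whose detection limits meet the requirement."""
    return [m["method"] for m in table
            if m["analyte"] == analyte
            and m["detection_limit_ug_l"] <= required_limit_ug_l]

# A study needing benzene detected at 1.0 ug/L rules out the coarser method.
chosen = suitable("benzene", 1.0, methods)
```

The real database adds cross-references to regulatory lists and standard sources, but the method-selection step is essentially this comparison.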
-------