EPA
United States
Environmental Protection
Agency
Office of the Administrator
Science Advisory Board
Washington DC 20460
SAB-EC-88-040A
September 1988
Final Report
Appendix A:
Strategies for Sources,
Transport and Fate Research
Report of the Subcommittee
on Sources, Transport and Fate
Research Strategies Committee
-------
NOTICE
This report has been written as a part of the activities
of the Science Advisory Board, a public advisory group providing
extramural scientific information and advice to the Administrator
and other officials of the Environmental Protection Agency.
The Board is structured to provide a balanced, expert assessment
of scientific matters related to problems facing the Agency.
This report has not been reviewed for approval by the Agency;
hence, the contents of this report do not necessarily
represent the views and policies of the Environmental Protection
Agency or of other Federal agencies. Any mention of trade
names or commercial products does not constitute endorsement or
recommendation for use.
-------
U. S. Environmental Protection Agency
Science Advisory Board
Research Strategies Committee
Sources, Transport and Fate Subcommittee
Chairman
Dr. George Hidy
Electric Power Research Institute, Environmental Division
3412 Hillview Avenue
Palo Alto, CA 94303
Members
Dr. Anders Andren
University of Wisconsin, Water Chemistry Laboratory
660 N. Park Street
Madison, Wisconsin 53706
Dr. Jack Calvert
National Center for Atmospheric Research
1850 Table Mesa Drive
Boulder, Colorado 80303
Dr. Yoram Cohen
University of California at Los Angeles,
Chemical Engineering Department
Boelter Hall
Los Angeles, CA 90024
Dr. Richard Conway
Union Carbide Corporation, South Charleston Technical Center
3200 Kanawha Turnpike (Bldg. 770)
South Charleston, WV 25303
Dr. Robert Huggett
Virginia Institute of Marine Science,
School of Marine Sciences
9 Raymond Drive
Seaford, Virginia 23696
Dr. Donald O'Connor
Manhattan College,
Environmental Engineering and Science Program,
Riverdale, New York 10471
Dr. Barbara Walton
Oak Ridge National Laboratory, Environmental Sciences Division
Post Office Box X
Oak Ridge, TN 37831-6083
Dr. Herbert Ward
Rice University,
Department of Environmental Science and Engineering
6100 South Main
Houston, TX 77005
-------
Support Staff
Dr. Donald G. Barnes*
Acting Staff Director, Science Advisory Board
U.S. Environmental Protection Agency
Ms. Joanna Foellmer
Secretary to the Staff Director, Science Advisory Board
U.S. Environmental Protection Agency
* Original draft prepared through the support of
Dr. Terry F. Yosie, former Director, Science Advisory Board
-------
Strategies for Sources, Transport and Fate Research
Appendix to FUTURE RISK
TABLE OF CONTENTS
Page
1.0 Executive Summary 1
2.0 Importance of Sources, Transport and Fate Research 4
2.1 The Role of Sources, Transport and Fate Research 4
2.2 Key Elements Needed in Sources, Transport, 5
and Fate Research
3.0 Strategy for Sources, Transport and Fate Research 7
3.1 FIRST STRATEGIC ELEMENT:
Reduction of Uncertainty in Estimating
Environmental Concentrations of Pollutants 8
3.1.1 Modeling and Model Validation 8
3.1.2 Source Characterization 10
3.1.2.1 Chemical Characterization 11
3.1.2.2 Release Rates 11
3.1.2.3 Episodic Releases 11
3.1.2.4 Source Characterization by Medium 12
3.1.2.4.1 Air 12
3.1.2.4.2 Surface Water 12
3.1.2.4.3 Ground Water 13
3.1.2.4.4 Soils and Sediments 13
-------
3.1.3 Transport Processes 14
3.1.3.1 Surface Water 15
3.1.3.2 Ground Water 15
3.1.3.3 Water-Underlying Bed Interactions 16
3.1.3.4 Soils 16
3.1.4 Fate Processes 17
3.2 SECOND STRATEGIC ELEMENT:
Early Detection of Environmental Problems 17
3.2.1 New Stressor Identification:
The Need for Early Warning 17
3.2.2 Early Warning Data Sources 18
3.2.2.1 Chemical, Biological and Physical Monitoring 18
3.2.2.2 Societal, Economic and Technological Changes 21
3.2.2.3 Literature Reviews and Expert Workshops 22
3.3 Implementation at EPA 22
4.0 Recommendations 23
4.1 Recommendation I: Emphasis on STF Models 23
4.2 Recommendation II: Leadership by
Risk Assessment Council 24
4.3 Recommendation III: Establishment of an
Early Warning Group 24
-------
STRATEGIES FOR SOURCES, TRANSPORT AND FATE RESEARCH
1.0 Executive Summary
Sources, transport and fate (STF) research explores the
interconnections between sources of environmental pollutants,
their transport and transformation through the environment, and
their ultimate fate. These research findings allow measurement
or prediction of pollutant concentrations at points distant from
the source. These exposure data are coupled with toxicity
information to assess risk. In other cases, STF research can be
used to identify sources of environmental risks. For example,
previously unsuspected pollution sources fortuitously have been
identified through field measurements (e.g., chlorinated
dibenzo-p-dioxins from pulp and paper mills), and mathematical
models have successfully related suspect source emissions to
particular environmental findings (e.g., stratospheric ozone
depletion). In addition to risk assessment purposes, STF
research is being looked to as a generator of "early warning"
information on potential, emerging, and/or escalating
environmental problems.
In order to meet these growing demands, STF research
strategy in the 1990's should have two major elements, which are
central to this report:
a. Strengthening EPA's capability for predicting
environmental form and concentration of pollutants,
with a known level of uncertainty, through
measurements and modelling.
b. Utilizing STF knowledge to provide an early warning
vehicle for anticipating issues that are likely to
become priority concerns for EPA.
The first element of the strategy calls for expansion of the
knowledge base on transport and transformation processes in order
to develop and validate models needed in the assessment and
management of environmental risks. The second element is
designed to raise Agency and public awareness of environmental
problems at a stage early enough to permit adoption of a
cost-effective approach to risk reduction.
Regarding the first strategic element, much of the success
of STF research depends upon the development and validation of
mathematical models, specifying their degrees of uncertainty.
While the basic principles applicable to many of these models are
known, site-specific conditions, process data needs and differing
scale requirements (e.g., local vs. regional vs. global, or
short-term vs. long-term) limit the current successful
application of these models and introduce uncertainties which
-1-
-------
need to be identified, quantified, and narrowed.
Models are predicated on mass conservation and, necessarily,
require data on source characterization, media transport and
chemical conversion processes, and deposition or media removal
processes, which are sometimes called "fate" terms. A broad
range of known and potential sources in all media (e.g., air,
surface water, ground water, and soils) should be characterized
through a core program examining the chemical characterization,
release rates, and frequency of releases from the sources. This
source information, coupled with fundamental knowledge of
transport and transformation processes, should feed into the
mathematical models which predict the behavior and ultimate fate
of the pollutant(s) in various media and generate estimates of
exposure.
Regarding the second major element of the strategy -- the
need for early warning -- great benefits can be derived from
early identification of problems; i.e., reasoned risk reduction
actions can be implemented to correct a situation before it
requires a costly crisis response. The ability to detect
problems before they would traditionally appear is related to
foresighted collection and judicious use of key environmental
data. Chemical, biological and physical monitoring activities, a
major source of such data, need to be strengthened considerably.
In addition, a shift in strategic thinking is needed in the
basic approach to environmental monitoring. Currently, the
Agency focuses on a limited number of selected pollutants,
adopting a "feedback" strategy. That is,, if certain pollutants
are found in excess of some existing standard or limit, the
information is fed back and regulatory action is taken. In the
future, the Agency should adopt a "feed forward" strategy that
involves monitoring a much broader range of compounds and other
environmental stressors of potential interest, many of which do
not have regulatory standards. The resulting data would provide
an increasingly realistic and complete estimate of the total
toxic burden in the environment and a context within which to
determine more easily the extent to which chemically transformed
products or new, unregulated compounds enter the environment. It
would also highlight situations in which the distribution of
chemicals changes, perhaps indicating significant changes in
environmental conditions; e.g., global air (climate) warming from
the increased presence of radiation-absorbing gases in the
atmosphere. Such information would be fed forward and analyzed,
possibly leading to the development of a regulatory or other risk
reduction response.
Careful consideration of societal, economic and
technological trends could also be helpful in anticipating, and
possibly avoiding, environmental problems. Measurements alone
will not suffice to provide an anticipatory framework; a
forward-looking analysis also will be needed. The Agency should assign
the task of achieving this "early warning" goal to a group
-2-
-------
charged with discerning the implications of emerging observations
and knowledge in the context of past knowledge. This group would
submit an annual report on their findings and projections, or a
special alert of findings as necessary.
This Report contains three specific recommendations:
a. Recommendation I: Improved STF Models
EPA should strengthen its research on STF model
development, evaluation and validation, as a means
for reducing uncertainty in risk assessment and risk
management.
b. Recommendation II: Leadership by the Risk Assessment
Council
The Agency's Risk Assessment Council should take steps
to insure that STF research is integrated into EPA's
approach to exposure assessment analyses.
c. Recommendation III: Establishment of Early Warning Group
The Agency should establish a group of senior scientists
and engineers to identify potential, emerging, and/or
escalating environmental health and ecological problems
using systematic, long term measurements and their
interpretation.
-3-
-------
2.0 Importance of Source, Transport and Fate Research
2.1 The Role of Sources, Transport and Fate Research
The study of the sources, transport and fate (STF) of
pollutants is an essential part of environmental research. This
type of work has served three major roles in environmental
assessment, resulting in the ability to estimate exposure
concentration levels and to relate excessive exposure levels to
sources needing emission reduction. The roles are:
a. Generation of fundamental knowledge about the physical
and chemical characteristics of emissions from
pollution sources.
b. Clarification of the nature of transport, conversion and
media loss processes (e.g. deposition and absorption)
that lead to exposure.
c. Highlighting of chemical conversions leading to
pollutants that differ from the direct emissions;
e.g., the conversion of sulfur dioxide into sulfuric
acid in the atmosphere.
In the first instance, an expanding inventory of compounds
or radioactive substances has emerged that potentially affect the
environment. In some research the interpretation of temporal and
spatial distributions of ambient measurements has led to the
identification of sources otherwise not considered (e.g.,
polychlorinated dibenzo-p-dioxins from pulp and paper mills).
Accurate source identification information is important for the
development of cost-effective control strategies.
Second, STF research has provided the principal basis for
developing mathematical models to relate source emissions to
ambient conditions, yielding exposure estimates. Such models
provide a critical element of exposure estimation in space and
time to assess the impact of both existing and new sources.
Particularly notable accomplishments of modeling that have not
only provided regulatory tools but have advanced the
understanding of source-receptor relationships include multiple
source air dispersion models and chemical fate models for ground
water contamination.
Specially designed measurement programs have been required
to provide data to investigate environmental processes and to
verify and test the reliability of such models. These
measurements are distinct from monitoring and surveillance for
existing regulatory requirements.
A third role of STF research involves understanding
large-scale environmental phenomena, using basic knowledge of
chemical processes, results from laboratory experimentation,
development of mathematical models, and critical measurements in
the field. Examples of major contributions in this category
-4-
-------
include:
a. Tropospheric ozone prediction schemes employing
meteorological factors and highly complicated
chemistry.
b. An explanation of the role of chlorofluorocarbons in
modifying stratospheric ozone.
c. The discovery of organic chlorine compounds in treated
drinking water.
d. The exploration of complex bio-geochemical factors
affecting the speciation of heavy metals such as
mercury, chromium and selenium.
An additional emerging role of transport and fate research
concerns the identification, analysis and interpretation of
certain long-term trends that can alert policy makers to
significant environmental issues in the future. These trends have
included inferences from long-term monitoring that emerged
through ecological and biological effects, acid deposition and
long-range transport, the build-up of greenhouse gases to produce
climate change, surface hydrogeological concerns in the storage
and land disposal of wastes, and identification of global
contamination of the oceans from certain pesticides.
The past successes of STF research in raising scientific and
public awareness on a wide range of environmental issues
indicate that the public investment in this work is well
founded. The area should continue to be an important component
of EPA efforts in research and development.
2.2 Key Elements Needed in Sources, Transport and Fate Research
An important factor in risk assessment is knowledge of the
sources of pollutants and the processes that subsequently
govern the dispersal of a pollutant into the environment. This
relationship can be conceived as links in a chain beginning with
pollutant emissions followed by transport, transformation and
media removal (deposition or sorption). Understanding these
relationships contributes scientific insight as to how
pollutants, through space and time, reach humans and other
receptors. In addition, definition of source-receptor
relationships aids decisionmakers in targeting specific sources
for risk reduction efforts. Failure to address these questions
can result in environmental policies and regulations that are
cost-inefficient, spending too little on some problems and too
much on others.
EPA's research has contained a substantial component of STF
work in the past. However, the effort frequently has been poorly
defined and, consequently, has not always focused on issues
central to the Agency's needs. For the 1990's, STF research
strategy should have two major elements:
a. Strengthening the capability of predicting environmental
form and concentration of pollutants, with a known
-5-
-------
level of uncertainty, through measurements and
modeling.
b. Utilizing STF knowledge to provide an early warning
vehicle for anticipating issues that are likely to
become priority concerns for EPA.
The first strategic element would expand the base of
knowledge on transport and transformation processes in order to
develop and validate mathematical models for assessing and
managing environmental risks through exposure estimation and
identification of significant sources and their relative
contributions. The second goal would use a combination of
measurements, theory and analysis to contribute to raising
Agency and public awareness of issues potentially harmful to
public health and the environment at a stage early enough to
permit adoption of a cost-effective approach to risk reduction.
Section 3 discusses these key elements of the STF research
strategy. Section 4 offers specific recommendations.
-6-
-------
3.0 Strategy for Sources, Transport and Fate Research
Environmental risk assessment and management requires
reliable means for estimating exposure in space and time, as well
as estimating the contribution of sources to those exposure
patterns. Exposure to humans and to ecosystems can occur through
respiration, ingestion, direct contact, or the food chain.
Exposure generally is defined in terms of an ambient
concentration or bioaccumulation concentration over time, a
deposition rate to a collector, or a total medium burden (amount
of material in a defined volume). In the absence of direct
measurements of exposure, estimation of concentration, deposition
or burden can be carried out through interpolation or
extrapolation of results of mathematical calculations based on
the principles of mass and energy conservation. Continuing
research is needed on both direct measurement of exposure and
predictive models, as discussed in the report of the Exposure
Assessment Subcommittee. In general, projections of source
contributions to exposure at a receptor (source-receptor
relationships, or SRRs) presently can be done only through
mathematical models.
Inherent in either the interpretation of field observations
or mathematical modeling are uncertainties in exposure or SRRs
that are seldom known. A scientifically supportable risk
assessment requires an accurate and precise exposure estimate.
Therefore, determination of uncertainty in estimates is as
important as the estimate itself. This is also true in risk
management where a balancing among cost, technological
effectiveness and reliability, and other factors is often
required for selecting among emission control
options. Inherent in the improvement of risk methodology is
reduction of uncertainty in estimates of the attribution of
exposure to specific sources.
Uncertainty in exposure estimation derives from two factors.
The first concerns a mismatch in the spatial or temporal scale of
calculations relative to receptors. The second uncertainty stems
from errors in the models themselves, the input data to the
models, and computational errors inherent in numerical techniques
which are employed in the execution of the model.
Models calculate concentrations or burdens on a relatively
coarse or macro scale. An individual receptor generally is much
smaller than this resolution and often is mobile. These factors
make it necessary to use population mobility and statistical
factors to relate measurements at fixed stations and model
calculations to receptor exposure. Little research has been done
to reduce uncertainty in these factors or in estimation of target
tissue doses resulting from environmental exposure. This subject
is more fully discussed in the report of the Exposure Assessment
Subcommittee.
-7-
-------
Work has been done to define uncertainties in model
calculations but, in general, models and interpolation schemes
are not well tested for reliability or validated for the quality
of their simulations.
The validation of theoretically based models is often
defined at two performance levels. The first is determined by a
suitable comparison of its predictions with ambient concentration
measurements for given physical or chemical input conditions.
However, close correspondence between model prediction and a few
selected environmental measurements does not necessarily
constitute adequate model validation. Thus, a second validation
criterion is needed that tests the model for its integrity in
simulating media processes that link source emissions to ambient
conditions. Testing models at this level is far more demanding
than the first evaluation, but it needs to be an integral part of
STF research. Experimentation with models at this level leads to
advances in basic knowledge as well as added confidence in the
model performance.
3.1 FIRST STRATEGIC ELEMENT:
Reduction of Uncertainty in Estimating Environmental
Concentrations of Pollutants.
3.1.1 Modeling and Model Validation
Effective risk assessment begins with an attempt to
estimate the environmental concentration of contaminants within
an acceptable margin of error, followed by exposure assessment
that assumes a reasonable level of confidence in estimates of
contact with these environmental pollutant concentrations. Three
factors significantly affect the accuracy of estimates of
environmental concentration:
a. Specification of source location, chemical
characteristics and emission rates.
b. Description of transport and chemical conversion
processes.
c. Description of fate or removal processes.
Information on these factors is used to construct models to
represent the phenomena believed to be involved in the movement
and transformation of chemicals in the environment, and computer
codes are developed to facilitate what are frequently very
complex calculations.
The EPA requires reliable data on concentrations in
environmental media in order to determine exposures to target
organisms and populations for risk assessment. The Agency also
requires data on the contribution of specific sources to
pollutant concentrations (i.e., SRRs) as a means of identifying
which sources to target for reducing risk. In the absence of
sufficient data on exposures, source data may be combined with
transport and transformation models to provide estimates of
-8-
-------
exposure. In such cases, it is critical that the models be
capable of providing exposure estimates within acceptable (or at
least defined) bounds of uncertainty. Uncertainty is determined
by the suitability of the model (i.e., whether it includes
appropriate terms for all of the important variables, such as
dispersion and meteorological conditions) and by the accuracy of
coefficients and other input data associated with each variable
or connecting mathematical term. In an ideal world, each model
would be fully validated before it is used for risk assessment or
risk management decisions. However, the high costs and long lead
times required for model validation inhibit the validation effort
in a regulatory agency such as EPA.
In reality, a model can prove useful, even without a full
validation, provided that it can be empirically verified for a
range of conditions comparable to those in the situation for
which it is to be applied. Furthermore, as it is applied more
widely and tested periodically against available environmental
data, it can be refined and/or modified, based on operating
experience. Thus, iterative applications of models have served
the EPA in several ways. They help to define the bounds of
uncertainty associated with use of the model in risk assessment
or risk management, thereby increasing confidence in the results.
At the same time, the data generated provide a basis for refining
the model and/or extending the bounds over which it can be used.
Although STF models primarily have been structured to
address a single medium of the environment--air, surface water,
ground water or soil -- it is increasingly apparent that
intermedia and multi-media models are also necessary to analyze
certain problems. In any case, the most general and appropriate
structure of a model is based on the conservation of mass or
continuity principles, incorporating source, transport, transfer
and transformation components.
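To make this continuity structure concrete, the following sketch integrates a mass balance for a single well-mixed compartment. It is purely illustrative: the compartment volume, flow, loss rate, and source term are hypothetical values, not drawn from any Agency model.

```python
# Illustrative sketch only: conservation of mass for one well-mixed compartment,
# with inflow/outflow (transport), a first-order loss (transformation/removal),
# and a direct source term. All parameter values are hypothetical.

def simulate_box(c0, c_in, Q, V, k, S, t_end, dt=0.1):
    """Integrate dC/dt = (Q/V)*(c_in - C) - k*C + S/V by forward Euler.
    Units: C, c_in in g/m3; Q in m3/day; V in m3; k in 1/day; S in g/day."""
    c, t = c0, 0.0
    while t < t_end:
        c += ((Q / V) * (c_in - c) - k * c + S / V) * dt
        t += dt
    return c

# Hypothetical impoundment: 1e6 m3 volume, 1e4 m3/day throughflow, 0.05/day loss,
# a 2 kg/day direct source, and 0.01 g/m3 in the inflow.
print("C after one year: %.3f g/m3" %
      simulate_box(c0=0.0, c_in=0.01, Q=1.0e4, V=1.0e6, k=0.05, S=2000.0, t_end=365.0))
```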
The basic principles that underlie any modeling effort are
at least qualitatively understood and the numerical coefficients
relating to the above mentioned components often can be
estimated. The application to a particular problem, however,
often requires more detailed qualitative descriptions of
transport and transformation processes; e.g., resuspension of
aerosols from soils and the role of wet scavenging of reactive
organics. In addition, more accurate quantitative measures of
the coefficients which model these processes may be required in
order to obtain projected estimates that are of practical use to
the risk manager. Thus, there arises the need for model
calibration and validation specific to the problem and region,
the degree and extent of which should be guided by the
significance of the question and the environmental and economic
consequences. EPA should continue to establish a systematic
procedure and a specific schedule to validate key environmental
models. This effort should include documenting underlying
assumptions, updated modeling procedures and protocols, and
estimating uncertainties in prediction capability for a range of
-9-
-------
conditions.
The verification of models to a defined uncertainty requires
a combination of special data acquisition, including source
emission and field tests, laboratory experiments, and theoretical
or mechanistic studies of media processes. These generally
involve progressive and incremental design considerations based
on a continuing improvement in our knowledge.
The model components to be quantified are source
characterization, media transport and conversion processes, and
ultimate disposition ("fate") processes. These components and
their associated uncertainties are discussed in
the following sections.
3.1.2 Source Characterization
In deriving estimates of environmental concentrations of
pollutants, quantification of sources, their strengths, and
interactions is potentially one of the larger sources of
uncertainty. Because of legislative mandates, source inventory
and characterization have been directed toward release into
specific environmental media such as air, surface water, ground
water and soil. Great strides have been made on emissions
estimation in the last 15 years; however, source characterization
should continue to be a high priority because:
a. Historical sources such as abandoned waste pits and
dumps have been inadequately characterized as to the
presence of particular pollutants or to releases.
b. More recently acknowledged sources, such as contaminated
sediments, present additional assessment problems.
c. Advances in emission control technology and evolution of
industrial processes and activities require
progressive re-evaluation of emissions inventories.
d. Multi-source and multi-media interactions have been
inadequately characterized.
The sources to be studied will change according to the
prioritization of current and projected environmental problems
and introduction of new technology. Within a given problem area,
the sources studied should not be limited to those addressed by
current Federal regulations. Rather, the outlook should be as
comprehensive as possible to define the magnitude of current and
emerging problems. In studying ground water pollution sources,
for example, municipal landfills should be included as well as
RCRA Subtitle D facilities and the use of agricultural chemicals.
Source research should address area, as well as point, sources
and both mobile and stationary sources. A continuing core
research program in this area is recommended both to develop
generic methodology and to apply it to critical environmental
problems. Three aspects of a core program are addressed in this
report: chemical characterization, release rates, and episodic
releases. Emphasis also should be placed on emerging
-10-
-------
technologies and new chemicals entering the environment.
3.1.2.1 Chemical Characterization
The objective of chemical characterization is to develop and
apply efficient methods that adequately define problem sources
and point to solutions. Accuracy, precision, detection limits,
matrix effects, cost, and time are all critical factors. Besides
the identity and concentration of chemical constituents, tests
are needed to predict the mobility of materials under various
scenarios and to provide data for selection/design of control
techniques.
3.1.2.2 Release Rates
Emissions from point sources often can be directly measured,
while the flux of contaminants from various area or diffuse
sources into the environment is estimated by applying a
mathematical model to either source characterization data or
ambient monitoring data. Each approach is associated with levels
of uncertainty that need to be established and then reduced when
greater accuracy and precision is required. For example, many
exposure estimates assume a nominally steady discharge of a
pollutant when, in fact, variation in emission rates may be
critical to an accurate estimate of exposure. Also, there is a
need to develop approaches which utilize all available data
sources, be they NPDES reports, air permits or RCRA Part B
applications. Improved release rate models need to be soundly
conceived and adequately verified for a variety of applications.
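As one hedged illustration of a model-based release estimate for a diffuse source, the short sketch below applies an emission-factor calculation; the activity level, factor, and control efficiency are invented placeholders, not Agency values.

```python
# Illustrative sketch: emission-factor estimate of an area-source release rate.
# Activity, factor, and control values are hypothetical placeholders.

def annual_release_kg(activity, emission_factor_kg_per_unit, control_efficiency=0.0):
    """activity: units of activity per year (e.g., hectares treated);
    emission_factor_kg_per_unit: kg released per unit of activity;
    control_efficiency: fraction of the release prevented (0 to 1)."""
    return activity * emission_factor_kg_per_unit * (1.0 - control_efficiency)

# Hypothetical example: 5,000 hectares treated, assumed factor of 0.8 kg/ha.
print("%.0f kg/yr" % annual_release_kg(5000, 0.8))
```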
3.1.2.3 Episodic Releases
In many situations EPA may emphasize the regulation of
emissions under stable, steady conditions while serious
environmental and/or human health problems are caused by sudden
releases or "upset" conditions. Formal procedures are needed
for:
a. Identifying specific potential hazards; i.e. situations
that could result in a sudden release.
b. Estimating probability of that hazard occurring.
c. Predicting the magnitude and chemical or physical form
of the release.
Projections of equipment failure and handling or
transportation accident rates are needed. Acceptable standards
of practice need to be established and tested. One approach for
hazard identification is to divide operations into segments and
compare possible risks against a hazard checklist, including
combustible mixtures, mechanical stress, vapor cloud release and
over-pressurization. The potential for natural disasters, like a
sudden gas release (e.g. carbon dioxide or hydrogen sulfide) from
-11-
-------
a volcanic disturbance or deep lake sediments should be
investigated.
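A frequency-times-consequence tabulation is one simple way to organize the hazard identification and probability steps described above; the sketch below uses entirely hypothetical failure modes, frequencies, and release quantities.

```python
# Illustrative sketch: combining assumed failure-mode frequencies and release
# quantities into an expected annual frequency and episodic release estimate.
# Every number below is hypothetical and for demonstration only.

failure_modes = {
    # mode: (events per year, kg released per event)
    "seal rupture":        (0.02, 500.0),
    "relief valve lift":   (0.05, 150.0),
    "transfer hose break": (0.01, 2000.0),
}

total_frequency = sum(freq for freq, _ in failure_modes.values())
expected_release = sum(freq * mass for freq, mass in failure_modes.values())

print("expected upset events per year: %.3f" % total_frequency)
print("expected episodic release: %.1f kg/yr" % expected_release)
```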
3.1.2.4 Source Characterization by Medium
3.1.2.4.1 Air
Although air quality research is said to be more advanced
than research for other media, a continuing effort will be needed
to refine and improve knowledge of emissions for regulatory
decision making. Over the past decade there has been
considerable effort to characterize emissions of criteria
pollutants and certain hazardous chemicals from stationary and
mobile sources.
Data acquisition for source characterization will be needed
for air regulatory analysis at a modest level of priority for at
least the next decade. Continuing work will be required to
maintain and update the inventories. Characterization of
emissions from new or rebuilt facilities will be required, as
will the estimation of pollutant forms not previously considered.
Additional research will continue to be needed to provide
improved emission factors and to define the uncertainties and
limitations in available data. With such refinements, high
priority should be assigned to upgrading estimates of emissions
of nitrogen oxides and volatile organic compounds for
use in source-receptor modeling and control strategy analysis of
oxidants and air toxics.
3.1.2.4.2 Surface Water
In many instances non-point sources are the major
contributors to freshwater surface problems; e.g., toxics in Lake
Superior and Lake Michigan. Risk reduction efforts will
increasingly turn to non-point sources because of the large
fraction of surface water pollution problems they may represent
and since point sources have been more effectively controlled.
Potentially important non-point surface water pollution
sources include the following: run-off and leachates from
agricultural and other land uses, deposition of wind-borne
volatile organic chemicals and heavy metals, groundwater inflow
and sediment releases. At this time source models for predicting
organic loadings are much further developed than models for
inorganic loadings. Also, agricultural run-off is considered to
be better characterized than is urban run-off. Research should
be balanced between monitoring (direct measurements for use in
identifying/defining problems, as input data to models, and in
model validation) and development of predictive run-off models.
Specific models need to couple the source information with
hydrodynamic and process kinetic models, describing sediment
-12-
-------
transport, and elucidating biologically mediated reactions, metal
speciation kinetics, and hydrophobic compound transport.
Reconstructive models based on concentrations in receptor
organisms, including humans, also are useful. Balanced funding
of field measurements and predictive modeling is recommended.
3.1.2.4.3 Ground Water
Contamination of ground water from human activities
frequently originates from surface impoundments, landfills,
agriculture, leaks and spills, septic tanks, mining, petroleum
and gas production, and underground injection of wastes. EPA's
1977 "Report to Congress on Waste Disposal Practices and Their
Effects on Ground Water" (Premier Press, Berkley, CA, 1980)
identified the disposal of wastes at industrial impoundments and
other solid waste disposal sites as the most important sources of
groundwater contamination. It estimated that approximately 15%
of the liquid and solid industrial wastes generated in the United
States can be classified as hazardous. Such wastes represent
potential sources of groundwater contamination, depending on the
method of disposal. Most of the past land-disposed wastes were
not managed by means that comply with more recent Federal
regulations, and, therefore, they may threaten groundwater
quality in many areas.
In addition to industrial wastes, the 1977 report identified
so-called secondary sources of national importance including
septic tanks, municipal wastewater, mining, and petroleum
exploration and production residues. Although concentrations of
toxic material from these sources are generally lower than from
industrial wastes, they can be significant on a regional basis.
In an area of substantial manufacturing activity containing large
numbers of people, there exists a potential for pollution of
groundwater resources, especially from products such as gasoline,
fuel oils, and solvents. Areas where mining, agriculture, and/or
petroleum production are prevalent are also at potential risk.
3.1.2.4.4 Soils and Sediments
Soils and sediments can retain organic and inorganic
chemicals released to the environment. Therefore, they can
become sources for release and subsequent contamination of air,
ground water, and surface waters through resuspension, vapor
losses, leaching, and removal of particulates containing sorbed
compounds. Defensible risk assessments and risk management
strategies require reliable information on the amount of
contaminants accumulated in soils at sites and knowledge of how
to predict contaminant persistence, transformation, and transport
to other media.
The spatial distribution of chemical contaminants in soils
is often extremely heterogeneous. Consequently, extensive core
-13-
-------
sampling and/or exhumation to delineate zones of contamination
can be time-consuming and expensive. In situ and remote assay
equipment and sampling methods are needed to determine
concentrations of chemicals in surface and subsurface soils.
Among approaches that show promise are those that couple recent
advances in laser technology with those in fiber optics in order
to improve the detection of organics and the development of
portable gas chromatographs for analysis of volatile organics in
the field. In addition, neutron and scintillation probes may
prove useful for in situ detection of transuranic and
gamma-emitting radionuclides, respectively. Development of these
and other techniques to detect and quantify contaminants will
require a significant research effort, but one that would yield a
high payoff in monitoring capability.
In order to adequately assess soil sources, there is a need
for appropriate leaching test(s). Improved methods are needed to
evaluate contaminated soils and wastes to account for variability
in leach rates of constituent chemicals over a long period of
time at specific sites. Such methods could be used to improve
cleanup and closure of RCRA and CERCLA sites, as well as to serve
as guidance for management of land treatment and landfill
facilities.
3.1.3 Transport Processes
There is general acceptance of the basic principles of the
transport component of STF models. The fundamental equations of
fluid (air and water) motion which follow from these principles
are reasonably well-established and have been utilized for
calculations. The principal limitations of these models often
lie with certain empirical transfer coefficients that are media
specific. The mixing and dispersion associated with the fluid
motion, although understood, have less of a research base to
support them. Nevertheless, empirical relations, based on field
and laboratory data, provide important insight into the nature of
transfer coefficient variability. The recent advances in
turbulence theory increase scientific understanding of this
important mixing mechanism. At the present time, the development
and application of fluid dynamic models incorporating this theory
reside primarily with the research scientists. Transferring this
knowledge to the environmental analyst is an important need for
the future development and application of air and water quality
models. Certain specialized areas; e.g., underlying sediment bed
failure, remain as more fundamental challenges for environmental
modellers.
Analytical and numerical solutions and the associated codes
are available for surface and ground water transport models.
Stochastic or Monte Carlo techniques have also been used to
define the uncertainty of the various elements in these models.
Different research groups should test a selected group of models
in order to determine those which are most appropriate for use in
-14-
-------
environmental quality analysis. This testing should be followed
by a comparison of model performance with sets of observations
from various air regimes and water systems.
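The sketch below indicates, in simplified form, how a Monte Carlo analysis propagates assumed input uncertainty through a steady-state dilution-and-decay expression; the distributions and nominal values are hypothetical, not results from any tested Agency model.

```python
# Illustrative sketch: Monte Carlo propagation of assumed input uncertainty
# through the steady-state expression C = W / (Q + k*V) for a well-mixed water
# body. Distributions and nominal values are hypothetical.
import math
import random

def predicted_concentration(rng):
    W = rng.lognormvariate(math.log(1000.0), 0.3)  # load, g/day
    Q = rng.lognormvariate(math.log(5.0e4), 0.2)   # outflow, m3/day
    k = rng.lognormvariate(math.log(0.02), 0.5)    # first-order loss, 1/day
    V = 1.0e6                                      # volume, m3 (assumed known)
    return W / (Q + k * V)                         # g/m3

rng = random.Random(1)
samples = sorted(predicted_concentration(rng) for _ in range(5000))
p05, p50, p95 = (samples[int(f * len(samples))] for f in (0.05, 0.50, 0.95))
print("median %.2e g/m3; 90%% interval %.2e to %.2e" % (p50, p05, p95))
```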
3.1.3.1 Surface Water
The transport of pollutants in freshwater bodies is
predominantly advective (moving with the mean flow), rather than
dispersive (associated with the eddying). Given the knowledge of
the hydrologic balance of a drainage area and empirical
correlation of the dispersion, transport of pollution in
freshwater systems can usually be determined with greater
accuracy than some other components of the mass balance; i.e.
sources and transformations. The hydraulic interaction (e.g.,
caused by fluctuating levels) between the surface and ground
water deserves further attention as it relates to estimating
contamination.
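For illustration, the familiar one-dimensional advection-dispersion solution for an instantaneous release, with first-order decay, is sketched below; the stream cross-section, velocity, dispersion coefficient, and decay rate are assumed values, not data for any particular river.

```python
# Illustrative sketch: 1-D advection-dispersion of an instantaneous spill in a
# river, with first-order decay. All stream properties are assumed values.
import math

def concentration(x_m, t_s, M_g=1.0e5, A_m2=50.0, u_mps=0.3, D_m2ps=10.0, k_per_s=1.0e-6):
    """C(x,t) for mass M released at x=0, t=0; the plug spreads by dispersion D,
    moves downstream at velocity u, and decays at first-order rate k."""
    spread = math.sqrt(4.0 * math.pi * D_m2ps * t_s)
    return (M_g / (A_m2 * spread)) * math.exp(-(x_m - u_mps * t_s) ** 2 /
                                              (4.0 * D_m2ps * t_s)) * math.exp(-k_per_s * t_s)

# The peak passes x = 10 km at roughly t = x/u for these assumed values.
print("%.2f g/m3" % concentration(x_m=10000.0, t_s=33000.0))
```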
The dispersive component in estuarine and coastal systems is
more significant than in fresh waters. In the former, the
effects of density stratification on vertical and lateral
dispersion and the distribution and disposition of sediment and
organic particulates need to be further developed. Given the
intensive computational manipulations required to solve
multi-dimensional fluid dynamic/quality models, a significant
effort is needed to enable these models to interact for
long-term, time variable simulations and projections. Much
remains to be done on transitions for coastal systems both on the
east and west coasts of the country. The eddies from the Gulf
Stream in the Atlantic have marked effects on the transport
within the region north of Cape Hatteras and notably in the New
York Bight. The Gulf Stream has not been successfully modeled in
spite of the advanced state of knowledge of the field and
aforementioned developments.
Wind and temperature effects are important factors in
defining transport in marine systems, as well as in lakes and
reservoirs. The latter have frequently been modelled in a
simplistic fashion by assuming complete mixing. For long term
projections this approximation has been successfully applied in
many cases. For the more refined, multi-dimensional analysis,
however, much needs to be done in applying turbulence theory to
the analysis of transport phenomena in lakes and reservoirs so
that detailed concentration patterns can be estimated. In
addition, intermedia transfer phenomena (e.g., the role of the
surface microlayer in pollutant transport across the
water/atmosphere interface and critical factors in water/sediment
transfer) need additional study.
3.1.3.2 Ground Water
As in the case of surface water, the basic equations of
-15-
-------
fluid motion in the saturated zone of ground waters are generally
well established and understood. Qualitatively, fluid transport
in some important ground water media has successfully been
modeled. Similarly, the dispersive effects, referred to as
dispersivity, have been empirically defined to some degree, but
site-specific evaluation of this component is usually required.
By contrast, knowledge of the transport in the unsaturated zone
is inadequate, and further development in this area is needed. A
major source of uncertainty in groundwater modelling is the
inherent heterogeneity of the soil media and underlying rock
structure, which must be addressed on a site-specific basis.
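As a hedged illustration of why such site-specific parameters matter, the sketch below computes a plug-flow solute travel time from Darcy's law with a linear-sorption retardation factor; the hydraulic conductivity, gradient, porosity, bulk density, and sorption coefficient are all assumed values.

```python
# Illustrative sketch: saturated-zone travel time from Darcy's law with a
# linear-sorption retardation factor. All aquifer and solute properties are
# assumed values for demonstration.

def travel_time_years(distance_m, K_m_per_day, gradient, porosity,
                      bulk_density_kg_per_L, Kd_L_per_kg):
    seepage_velocity = K_m_per_day * gradient / porosity                 # m/day
    retardation = 1.0 + bulk_density_kg_per_L * Kd_L_per_kg / porosity   # dimensionless
    return distance_m * retardation / seepage_velocity / 365.0

# Hypothetical sandy aquifer and a moderately sorbing solute.
print("%.0f years" % travel_time_years(distance_m=500.0, K_m_per_day=5.0, gradient=0.002,
                                        porosity=0.3, bulk_density_kg_per_L=1.6,
                                        Kd_L_per_kg=2.0))
```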
3.1.3.3 Water-Underlying Bed Interactions
The exchange between the water and the underlying sediment
bed acts as both source and sink of dissolved and particulate
forms of pollutant constituents. In some cases, the transfer
rates from or to the bed far exceed the current mass inputs from
point and non-point sources. Two general constituent categories
are considered: nutrients and toxic substances. The former
affects the bacterial and algal levels in the water column. In
many locations this interaction is the major factor in a
dissolved oxygen budget. There is a pressing need for the
analysis of dissolved oxygen and eutrophication to better
characterize the flux of nutrients at the water-underlying bed
interface.
Many organic chemicals and heavy metals partition to
particulate matter, particularly to the organic and clay fraction
of the solids. The degree to which the contaminated particulates
accumulate in the sediment depends on the characteristics of the
solids and the turbulence and shear at the water-underlying bed
interface. Estuaries, lakes or reservoirs are net sedimenting
systems and accumulate these toxic substances in the bed. The
broad area of sediment interaction with the water column in all
systems requires a significant effort in order to understand the
phenomena of sediment transport, settling, resuspension and
bioturbation affecting water quality.
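The degree of partitioning described above can be illustrated with a simple equilibrium calculation; the partition coefficient and suspended-solids concentration below are assumed, and real systems would require site-specific values.

```python
# Illustrative sketch: equilibrium split of a chemical between dissolved and
# particulate phases in the water column. Kd and suspended solids are assumed.

def dissolved_fraction(Kd_L_per_kg, tss_mg_per_L):
    tss_kg_per_L = tss_mg_per_L * 1.0e-6
    return 1.0 / (1.0 + Kd_L_per_kg * tss_kg_per_L)

fd = dissolved_fraction(Kd_L_per_kg=1.0e5, tss_mg_per_L=50.0)
print("dissolved fraction %.2f, sorbed fraction %.2f" % (fd, 1.0 - fd))
```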
3.1.3.4 Soils
Current methods are inadequate to predict accurately
conditions in soils (e.g., moisture and temperature fluctuations)
or the transport of organic and inorganic contaminants in this
medium. Studies are needed to refine the conceptual models for
organic and inorganic mobility and to provide for the influence
of soil heterogeneity and other environmental variables on these
processes. Structure-activity analyses should be explored to
improve predictions based on physicochemical properties of
specific compounds.
Pollutant transport through soils is often viewed simply as
-16-
-------
a single chemical chromatographic process. This view fails to
account for the influence of soil structure (e.g., macropores),
rainfall events, contaminant interactions in waste mixtures, and
the possibility of movement via colloidal transport and
adsorption to mobile microparticulates. Studies are needed to
delineate the extent to which these additional processes affect
rates of contaminant transport in soils.
3.1.4 Fate Processes "
The environmental fate of a substance depends on physical
dispersion processes as well as on its physical, chemical, and
biological properties or interactions with substrates.
Information required for future predictions of the fate of
chemicals in air, soil, or water includes such basic data as
aqueous solubility, vapor pressure, air-water partition
coefficient (Henry's Law constant), molecular diffusivity, phase
partition coefficient, melting point and absorptivity. There has
been progress in acquiring data in pure homogeneous systems.
This fundamental information is needed in order to understand the
effects of cosolvents, micelles, and colloids on these
properties. In addition, many chemicals hydrolyze, photolyze or
participate in additional abiotic or biotic degradation
processes, such as electron transfer reactions. An increased
effort is needed to produce thermodynamic and kinetic data for
heterogeneous systems as well; e.g., the influence of metal
oxides and microorganisms on the persistence of chemicals in
soils.
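For several of the basic properties listed above, screening-level estimates can be derived from other measured data; as one hedged example, a Henry's law constant can be approximated from vapor pressure and aqueous solubility. The values below roughly resemble benzene but are used only for illustration.

```python
# Illustrative sketch: screening-level Henry's law constant from vapor pressure
# and aqueous solubility (no substitute for measured data). Values are assumed.

def henrys_law_constant(vapor_pressure_atm, solubility_mg_per_L, mol_weight_g_per_mol):
    solubility_mol_per_m3 = solubility_mg_per_L / mol_weight_g_per_mol  # (mg/L)/(g/mol) = mol/m3
    return vapor_pressure_atm / solubility_mol_per_m3                   # atm*m3/mol

H = henrys_law_constant(vapor_pressure_atm=0.125, solubility_mg_per_L=1780.0,
                        mol_weight_g_per_mol=78.1)
print("H = %.2e atm*m3/mol; dimensionless H = %.2f" % (H, H / (8.205e-5 * 298.0)))
```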
Since economic and logistical constraints prohibit
laboratory measurements for all these properties and rate
constants, an alternative predictive tool is recommended:
investigation of chemical structure-activity relationships.
These estimated parameters are then adapted for appropriate fate
models. Such an approach has been used with great success in
chemical engineering to design unit processes for chemical
manufacturing and in pharmacology to construct pharmacokinetic
drug transport models.
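A minimal sketch of such a structure-activity (property-property) estimate is given below: a generic log-linear regression predicting an organic-carbon partition coefficient from the octanol-water partition coefficient. The coefficients and the input value are placeholders; an actual application would use fitted coefficients and report their uncertainty.

```python
# Illustrative sketch: property-property regression of the form
# log Koc = a * log Kow + b. Coefficients and input are placeholders.

def estimate_log_koc(log_kow, a=1.0, b=-0.4):
    return a * log_kow + b

log_koc = estimate_log_koc(log_kow=5.2)   # hypothetical hydrophobic compound
print("estimated log Koc = %.2f (Koc about %.0f L/kg)" % (log_koc, 10.0 ** log_koc))
```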
3.2 SECOND STRATEGIC ELEMENT:
Early Detection of Environmental Problems
3.2.1 New Stressor Identification: The Need for Early Warning
Early identification of potential, emerging and/or
escalating environmental problems should take its place along
with risk assessment and risk management as a central part of
EPA's mission. The current research program provides no funds
specifically earmarked toward this objective. This is
disappointing in view of the number of issues (e.g., radon,
stratospheric ozone depletion and global climate change) that
have only recently risen to priority in EPA's policy agenda but
-17-
-------
which have been known to the scientific community for a number of
years. It is also surprising because of relatively high
perceived risks and the rising priority for these "newer"
problems which are discussed in EPA's February, 1987 report
entitled "Unfinished Business: A Comparative Assessment of
Environmental Problems" (EPA/230/2-87/025a-e). While admittedly
not a scientific study, the "Unfinished Business" report provides
a rationale for follow-up investigations that need to be pursued,
if only to minimize future surprises and to ensure a better match
between research expenditures and significant sources of public*
health and environmental risk.
The benefits of early identification of stressors to human
health and ecological systems include:
a. Cost reduction: more orderly conduct of the research vs.
expensive crash programs.
b. Improved regulation: more time is available to develop
data bases for scientifically supportable
regulations.
c. Risk reduction: steps can be taken early to reduce or
prevent risk either by non-regulatory and/or
regulatory means.
Initiation of a program to identify new or potential risks, which
can complement the Agency's ongoing efforts to assess known
risks, is strongly recommended.
3.2.2 Early Warning Data Sources
3.2.2.1 Chemical, Biological, and Physical Monitoring
Many of the environmental stresses that concern this nation
and the world are caused by anthropogenic chemicals. Often a
crisis is first detected through the direct observation of a
biological effect caused by pollutants rather than by earlier
prediction or detection of the release. There are numerous
examples of this pattern, including:
a. Kepone in the James River detected by the observation of
worker illness.
b. Tributyltin in harbors detected by the observation of
malformed oysters.
c. Polybrominated biphenyls in Michigan cattle detected by
the observation of dead and dying animals.
d. Polynuclear aromatic hydrocarbons in areas of the Puget
Sound detected by the observation of fish with
cancers.
Often by the time a problem is detected, biological damage
has already occurred and remediation is difficult, expensive or
impossible from a practical standpoint. In other words, the
anticipatory regulatory systems have been inadequate.
Improved anticipation can be promoted through an improved
surveillance system. Such a system would continue to include a
-18-
-------
chemical monitoring program designed to quantify a preselected
set of compounds already of regulatory interest. This approach
has been the thrust of most environmental monitoring to date. An
improved surveillance system should, in addition, provide
qualitative identification of additional chemicals of concern.
This latter approach has been haphazard in deployment, but has
proven important.
There are advantages and disadvantages to the first, directed
approach. One advantage is that the qualitative aspects of
chemical analyses are simplified. Analytical methodologies can
be selected or developed for specific compounds, decreasing the
possibility of false identification. The quantitative outputs of
the analyses are usually more accurate and precise because the
methodologies employed are optimized for the preselected
compounds. These outputs are particularly important if the
objective of monitoring is to determine compliance with some
regulatory program or permit.
A disadvantage of the directed approach is that only the
preselected compounds are surveyed even though other compounds
may also be detected. The data for the latter compounds are
generally ignored and even lost. New compounds, which may later
prove to be damaging to human health or the environment, are not
systematically tracked. Examples exist where chemical problems
have been needlessly overlooked. Among these are the impacts of
such organic chemicals as polychlorinated biphenyls in the
1960's, Kepone and dioxins in the 1970's. In other words,
potentially valuable chemical data have not been and are not
being utilized specifically for environmental assessment because
of a narrow focus on chemical-specific monitoring.
Another way of describing most existing monitoring systems
for toxic chemicals is to describe them as "feedback" programs.
Such feedback programs are keyed by error signals. For example,
if a permit allows a certain amount of a specific compound in an
effluent, a concentration that exceeds the permitted level by an
established margin constitutes an exceedance; i.e., a violation.
Detection of this violation may feed back, initiating regulatory
action. Compounds not specified in a permit and, therefore, not
analytically sought, cannot trigger a warning even though
these "new" compounds may be detrimental to the biological
communities in the receiving media.
Technologies and expertise now exist to reduce such
oversight through improved design of broad-based chemical
monitoring programs and the use of biological endpoints in
monitoring. The use of techniques such as gas chromatography
(GC) and gas chromatography-mass spectrometry (GC-MS) provides
effective means for efficient, anticipatory monitoring. Through
the use of various columns and detectors, these methods yield
signals for essentially all of the compounds present, both those
which are analytical targets and those which are unexpected.
Even though many of the output signals are not essential to a
-19-
-------
feedback system, they can be collected, stored, and analyzed
through the use of data handling systems. This broad-based
record can be examined historically for chemicals of possible
concern and for apparent shifts or trends that may signal an
accumulation of material.
In such a program, data systems could be linked together to
create networks. Software could be developed to query the
networks to determine whether new chemicals have appeared between
samplings and whether a compound is increasing or decreasing over
time. The network could provide efficient access to the areal
distribution of a compound(s) of potential interest.
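A minimal sketch of the kind of network query described here appears below: the latest sampling results are compared with a stored baseline to flag newly detected compounds and marked concentration increases. The compound names and concentrations are invented for illustration.

```python
# Illustrative sketch: "feed forward" screening of monitoring results against a
# stored baseline. Compound names and concentrations are invented.

baseline = {"benzene": 1.2, "toluene": 0.8}                    # prior survey, ug/L
latest = {"benzene": 1.3, "toluene": 2.1, "compound_X": 0.4}   # newest survey, ug/L

new_compounds = sorted(set(latest) - set(baseline))
notable_increases = sorted(c for c in latest
                           if c in baseline and latest[c] > 2.0 * baseline[c])

print("newly detected:", new_compounds)            # ['compound_X']
print("notable increases:", notable_increases)     # ['toluene']
```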
This alternative to targeted chemical monitoring would
sacrifice some quantitative aspects of the analyses in order to
maximize the qualitative outputs. That is, this added
information would be obtained at the cost of somewhat higher
limits of detection.
The results of these refinements to chemical monitoring
would be progress toward "feed forward", rather than feedback,
monitoring. Feed forward monitoring, in this case, is defined as
monitoring designed to determine when new, unregulated or
unselected compounds enter a system and when shifts in the
distribution of chemicals in the environment occur. Feed forward
monitoring has the advantage of determining many more compounds,
which in turn provides a much more realistic estimate of the
total toxic burden to which organisms are exposed. Such
information would be "fed forward" and analyzed, possibly leading
to the development of a regulatory or other risk reduction
response.
Although feed forward monitoring programs may be less
cost-effective in the short-term than routine feedback monitoring
for regulatory compliance, the long-term benefits of avoiding a
future kepone-type event justify the costs associated with the
development and maintenance of such an early warning system.
Further, judicious application of knowledge about what biological
and physical processes are possible can increase the efficiency
with which feed forward monitoring is conducted.
Plants and animals can also be used to gain important
information about the sources and availability to biota of
chemical contaminants and their resulting effects. Insight can
result from analyses of tissue that may not be possible from
chemical analyses of abiotic components of ecosystems. Another
advantage of biological monitoring is the extremely high
sensitivity of certain biochemical endpoints, such as enzyme
induction, that can supply evidence of the presence of chemicals
at concentrations below thresholds for chemical analyses.
Recently published studies have shown that biomedical tests
derived from research on mammals are useful when applied to
aquatic systems. The detection of chemical stresses on aquatic
biota by utilizing histopathological and immunological techniques
-20-
-------
is now possible. The observation of tumors in fish from Puget
Sound, the finding of lesions and depressed immune systems in
fish from the Elizabeth River, Virginia, and the determination of
elevations in metallothionein concentrations in fish from
Prickly Pear Creek, Montana, are examples of the use of such
technologies. Also, non-specific indicators of toxicant exposure
can be valuable monitoring tools in broad-scale screening
programs. For example, deviations from normal ratios of
single-stranded to double-stranded DNA reflect exposure to a
broad array of genotoxic chemicals.
There is no doubt that the ability to analyze environmental
samples will improve and become more comprehensive in the future.
There is also no doubt that the need for long-term monitoring
data will increase as technology and human populations expand.
Both of these developments support the concept of collecting and
storing environmental samples to be analyzed in the future as new
techniques become available or other needs dictate. The Agency
now participates in such a program, the Environmental Specimen
Bank. Consideration should be given to expanding the effort.
The availability of documented samples on which to perform
retrospective analyses could be extremely advantageous for
determining temporal or spatial trends. Similar efforts should
continue with the National Human Adipose Tissue Survey (NHATS).
Monitoring efforts should also address risks caused by
stresses on humans and ecological systems other than direct
toxicity of anthropogenic chemicals. These stresses include
global warming, increased UV-B radiation, physical modification
of habitat, radon, pathogenic and engineered organisms, and
natural chemical emissions.
3.2.2.2 Societal, Economic and Technological Changes
Clues as to potential and emerging public health or
ecological stressors (risk) can be gained by periodically
examining societal, economic and technological trends. For
example, energy conservation scenarios developed in the 1970's
because of rising energy prices could have predicted the rising
importance of indoor air pollution problems heightened by
increasing insulation and resulting decreased ventilation.
Similarly, more recent estimates that approximately 70% of the
American people will live within 50 miles of a coastal area by
the year 2000 strengthen the urgency for protecting estuarine and
marine ecosystems. Other examples of trends which can be studied
are the significance of superconductors, climate change and urban
population changes.
Some aspects of such an effort were included in a 1980 ORD
report entitled "Environmental Outlook 1980" (EPA-600/8-80-0003,
July 1980). Potentially useful procedures for identifying the
environmental impact of trends in energy supply/demand,
demographics, human activities, economics, regulations, natural
cycles, international activities, and technology are described.
However, the thrust of the report was not on identification of
new risks or rapidly escalating risks; rather, the report was
directed at determining the effects of such trends on existing
efforts to assess and control known risks. In order to make this
risk-identification effort successful at a reasonable cost,
greater emphasis needs to be placed on the identification of new,
emerging, and rapidly escalating stressors/risks.
3.2.2.3 Literature Reviews and Expert Workshops
Selected literature should be monitored with the aim of
searching for signals of new stressors. Also, workshops should
be held at least annually to solicit the thinking of outside
experts on potentially significant environmental problems.
Possible mechanisms include utilizing units of the National
Academy of Sciences, the National Academy of Engineering, the
Office of Technology Assessment, professional societies or other
Federal agencies to host or co-sponsor such workshops. Working
with EPA, these and other institutions (e.g., NIEHS) can organize
leading scientists, engineers, sociologists, economists, and
others to identify potential and emerging ecological and health
stresses.
3.3 Implementation by EPA
Essential to the development and success of an early warning
system is the formation of a group of people within EPA that
includes, at a minimum, staff drawn from the Office of Research
and Development and the Office of Policy, Planning and
Evaluation. The group would prepare analyses and studies of
potential problems, draw upon other Agency expertise, as
appropriate, and fund certain outside studies in the data source
areas cited above. These people should be experienced
individuals who can discern the implications of existing and new
information and be able to assess its importance. Inclusion of
visiting scientists from academia, industry or private groups
would assist this effort by adding external inputs to the Agency.
This group would prepare an annual report to the
Administrator, Deputy Administrator and Assistant Administrators
of new, emerging, and/or escalating health and environmental
problems. The Assistant Administrator for ORD would develop a
mechanism to ensure that the conclusions and recommendations of
this group receive formal consideration in the research planning
process.
4.0 Recommendations
4.1 Recommendation I: Emphasis on STF Models
EPA should maintain its research on sources, transport and
fate (STF) model development, evaluation and validation, and
continue improving its methods for reducing uncertainty in risk
assessment.
To implement this recommendation, EPA should take the
following actions:
a. Continue to formalize the mechanism and criteria for
acceptability of STF models for all media, using
methods such as the current procedures of the Office
of Air Quality Planning and Standards (OAQPS).
b. Evaluate and validate widely used STF models (single
medium or multi-media) on a priority schedule, using
a combination of field measurements and laboratory
data to determine the level of uncertainty in model
predictions and to provide guidelines for reducing
these uncertainties.
c. Continue research on media processes to ensure the
quality of model input data in order to improve the
detection and prediction of chemical transport and
transformation in environmental media.
d. Adopt a systematic review schedule for STF model
progress, including target milestones for achieving
reductions in prediction uncertainty.
These four actions will facilitate an orderly and focused
Agencywide effort to advance the development and use of STF models
in risk analysis.
Currently there exists a profusion of numerical codes that
are exposure estimators. However, in general, they are not
validated or tested, nor have they been ascribed specific
quantitative uncertainties. The methods adopted by OAQPS serve
as a useful Agency guide for applying a more uniform certification
process to these types of models for regulatory analysis.
Validated models are essential for this use so that public
confidence in the reliability of risk assessment results can be
increased. To achieve the goal of systematic and continued
improvement of STF models, research funds should be provided to
improve model input data for source emissions, fluid flow
estimation, and physicochemical rate parameters. Improvements in
these components need to be assimilated progressively into models
to ensure that the models reflect the current state of knowledge.
Comparisons between older and newer models should also be
attempted on a regular basis to evaluate progress in reducing
uncertainty. These comparisons should also be incorporated into a
systematic review process to update the STF models recommended
for regulatory applications.
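The model-to-measurement comparisons recommended above can be reduced to a small set of summary statistics. The sketch below computes mean bias and root-mean-square error for an older and a newer model against the same set of observations; the concentrations, units, and model labels are hypothetical.

    import math

    def bias_and_rmse(predicted, observed):
        """Return (mean bias, root-mean-square error) for paired lists of
        model predictions and field observations."""
        errors = [p - o for p, o in zip(predicted, observed)]
        bias = sum(errors) / len(errors)
        rmse = math.sqrt(sum(e * e for e in errors) / len(errors))
        return bias, rmse

    # Hypothetical paired data set: one set of field measurements (ug/m3)
    # and predictions from an older and a newer version of the same STF model.
    observed  = [3.1, 4.8, 2.2, 6.0, 5.4]
    model_old = [4.0, 6.1, 3.5, 7.9, 6.8]
    model_new = [3.4, 5.1, 2.0, 6.5, 5.9]

    for name, pred in [("older model", model_old), ("newer model", model_new)]:
        b, r = bias_and_rmse(pred, observed)
        print(f"{name}: bias = {b:+.2f}, RMSE = {r:.2f}")

A decline in bias and RMSE from the older to the newer model on a common data set is one simple, reportable measure of progress in reducing prediction uncertainty.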
4.2 Recommendation II: Leadership by Risk Assessment Council
EPA's Risk Assessment Council (Council) should ensure that
STF research is integrated into EPA's approach to exposure
assessment. Specifically, the Council should:
a. Initiate the development of Agencywide guidelines for
STF model performance criteria and their
acceptability, following methods adopted by OAQPS.
b. Endorse and promote the coordination and use of
interagency STF research as part of an effective
research strategy for EPA.
4.3 Recommendation III: Establishment of an Early Warning Group
EPA should establish a formal and continuing group of senior
scientists and engineers who would be drawn from the Office of
Research and Development, the Office of Policy, Planning and
Evaluation, and extra-Agency groups. These individuals,
representing a number of disciplines, would be charged with
identifying potential, emerging, and/or escalating public health
and environmental problems. Such a group would, at a minimum,
perform the following functions:
a. Survey early warning data sources that can be developed
through modest refinements to existing chemical,
biological and physical monitoring systems. Such
refinements lead to feed-forward monitoring, which
can determine when new, unregulated or unselected
compounds enter a system or when shifts in the
distribution of chemicals in the environment occur.
Methods for analyzing such data should be
developed.
b. Identify potential human health and environmental risks
that are currently not classified as major EPA
priorities. The process for identifying such risks
should include an examination of social, economic and
technological changes that can create new risks, use
of existing models and measurement data, and
sponsorship of periodic expert workshops to survey
expert judgment on trends and risks.
c. Prepare an annual report of potential new problems to be
submitted to the Administrator, Deputy Administrator,
and the Assistant Administrators. The Assistant
Administrator for ORD should ensure that this report is
formally considered in each year's research planning
process.