United States Environmental Protection Agency
Risk Reduction Engineering Laboratory
Cincinnati OH 45268
Research and Development
EPA/600/M-89/014  Jan. 1990
Environmental Research Brief
Methods of Analysis for Waste Load Allocation
J. Wayland Eheart,1 Jon C. Liebman,1 and E. Downey Brill, Jr.2
Abstract
This research has addressed several unresolved questions
concerning the allocation of allowable waste loads among
multiple wastewater dischargers within a water quality
limited stream segment. First, the traditional assumptions
about critical design conditions for waste load allocation
were shown to be true (except for some highly uncommon
situations) in multi-discharger settings—namely, that lower
streamflows and higher temperatures lead to more
stringent allowable loads. Second, a method was
developed for aggregating dischargers together into
discrete groupings so that the water quality interactions
between groups are minimized. This allows waste load
allocations to be made separately for each grouping,
thereby simplifying the overall computational process.
Third, the issue of setting aside unallocated waste load
capacity as a reserve against future growth or modeling
uncertainty was examined. A case study illustrated the
unique relationships that might exist between this reserve
capacity, the frequency of water quality excursions, and
the cost of wastewater treatment. Finally, a method was
developed for designing multi-discharger seasonal waste
load allocations. It determines seasonal discharge limits for
each discharger that minimize the degree of treatment
necessary to provide the same risk of water quality
excursion as would exist under a nonseasonal waste load
allocation.
This Research Brief was developed by the principal
investigators and EPA's Risk Reduction Engineering
Laboratory, Cincinnati, OH, to announce key findings of the
research project that is fully documented in the reports
and publications listed at the end.
1 University of Illinois, Urbana, IL 61801
2 North Carolina State University, Raleigh, NC 27695
Introduction
Waste load allocation is the process of determining
allowable levels of pollutant discharges that will maintain
acceptable receiving water quality. Several factors can
complicate this task. The presence of multiple pollutant
sources, with the attendant need to find an economic and
equitable allocation of allowable load among these
sources, is one such factor. Another is the need to account
for natural variations in receiving water conditions such as
streamflow and temperature when determining the effects
of a proposed load allocation on water quality. Additional
concerns may involve accommodating possible future
discharge sources and uncertainty in water quality
modeling predictions when conducting a waste load
allocation.
This project has addressed several previously unresolved
questions related to allocation of waste loads to
dischargers along a stream segment. The questions are:
1. Parameters that affect water quality, such as the
   reaeration coefficient and travel time, change with
   streamflow and temperature in a manner that
   casts doubt on the traditional assumptions of low
   flow and high temperature as the "worst-case"
   conditions for decaying pollutants. What are the
   limits of validity of these assumptions?
2. On a large river there is usually an unmanageably
large number of dischargers. Is there a way to
split them into groups that can be analyzed
separately, thereby reducing the computational
burden?
3. How should part of the allowable load capacity be
reserved and how will this affect the goals of a
waste load allocation program?
4. Seasonal (or periodic) discharge limits are used
by many states to reduce the costs of water
quality control programs. How can this approach
be applied in an economically and
environmentally sound manner to multiple
discharger settings?
A brief discussion of the research findings related to each
of these questions follows.
Critical Water Quality Conditions
The assumption that the worst water quality occurs at the
lowest streamflow may not always hold in instances
involving multiple discharges of nonconservative
pollutants. The additional dilution resulting from increased
streamflow may be offset by adverse changes in the
parameters that govern water quality and by decreased
travel time, which leaves the stream less time to recover
from the effect of one discharge before receiving another.
The question is whether, with multiple sources of
nonconservative pollutants, water quality might worsen
with increasing streamflow. This can be examined by
finding the pattern of discharge that maximizes the rate of
change of critical pollutant concentration (i.e., the
concentration at the location where the quality standard is
binding) with respect to streamflow. If that rate is positive,
the standard will be violated with an increase in
streamflow. If, regardless of the loading pattern, the rate is
negative, then it is impossible to have such a situation and
it may be assumed that the lowest streamflow is the worst
streamflow.
Under constant temperature conditions, a linear
programming model showed that this discharge pattern is
a uniformly distributed load along the entire length of
stream. This suggests that streams receiving a large
number of discharges may be more susceptible to
concentration increasing with increasing flow than are
streams receiving a small number of discharges [Eheart,
1988].
For substances that decay according to first-order kinetics,
such as biochemical oxygen demand (BOD), whether the
maximum value of the rate of change of critical
concentration with respect to streamflow is positive or
negative depends on the value of the exponent in the
power law that relates stream velocity to streamflow. For
dissolved oxygen (DO) deficit, the exponent relating
reaeration coefficient to streamflow is also important.
Results indicate that the rate is negative for most natural
streams for both types of water quality parameters. Thus,
the traditional assumption that the lowest streamflow is the
worst from a water quality perspective will usually be valid.
Exceptions, however, could occur in highly polluted
backwater (impounded) stream reaches for which depth
decreases as velocity increases.
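As a concrete illustration of how these factors interact, the short Python
sketch below (illustrative only, not the project's model; the loads, outfall
locations, decay rate, and power-law coefficients are all assumed values)
computes the concentration of a first-order decaying pollutant at a
downstream checkpoint for several streamflows, with velocity following the
power law u = a*Q**b so that higher flow means both more dilution and less
travel time.

import numpy as np

# Illustrative sketch only: concentration of a first-order decaying
# pollutant at a downstream checkpoint, with stream velocity following
# the power law u = a * Q**b.  All numerical values are assumptions.
def checkpoint_conc(Q, loads_kg_day, sites_km, x_check_km,
                    k_decay=0.3, a=0.05, b=0.4):
    """Concentration (mg/L) at x_check_km from the upstream dischargers.
    Q is streamflow in m^3/s, k_decay is the decay coefficient in 1/day,
    and u = a * Q**b gives velocity in m/s."""
    u = a * Q**b
    conc = 0.0
    for load, x in zip(loads_kg_day, sites_km):
        if x > x_check_km:
            continue                                  # below the checkpoint
        travel_days = (x_check_km - x) * 1000.0 / u / 86400.0
        conc += load / (86.4 * Q) * np.exp(-k_decay * travel_days)
    return conc

loads = [2000.0, 2000.0, 2000.0]      # kg/day at each outfall (assumed)
sites = [0.0, 20.0, 40.0]             # outfall locations, km (assumed)
for Q in (5.0, 10.0, 20.0):           # compare concentrations across flows
    print(Q, round(checkpoint_conc(Q, loads, sites, x_check_km=60.0), 2))

Whether the checkpoint concentration can actually rise with increasing flow
in such a calculation depends, as noted above, on the velocity exponent and
on how the load is distributed along the stream.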
For first-order pollutants, low temperature is the most
pessimistic condition, since the pollutant decays more
slowly and is hence present in larger concentration. For
DO, however, the question is more complicated. The
classical assumption that the lowest DO occurs at the
highest temperature may not always hold. The DO
saturation concentration decreases monotonically with
increasing temperature, lowering the DO, but the
reaeration coefficient increases monotonically with
increasing temperature, tending to raise it. The BOD decay
coefficient increases monotonically with increasing
temperature, lowering the DO for single discharges but not
necessarily for multiple discharges. (Lower decay rates
attending lower temperatures could result in low DO at the
point where the effect of one discharge meets that of
another.)
The question of""whether Dt) rmgTft"under'some
circumstances worsen with decreasing temperature has
been addressed [Eheart and Park, 1989]. A linear
programming model showed that, for a uniform stream at
constant streamflow, the discharge pattern that maximizes
the rate of change of critical DO with respect to
temperature is a uniformly distributed load along the
entire length of stream. This suggests that streams
receiving a large number of discharges may be more
susceptible to DO worsening with decreasing temperature
than are streams receiving a small number of discharges.
The maximum rate of change of critical DO with respect to
temperature depends on temperature, the DO standard
(C*), and the temperature adjustment factor for the
reaeration coefficient (φ). It does not depend on the BOD
decay coefficient or its temperature adjustment factor. For
the maximum reported value of φ of 1.047, the assumption
that DO decreases monotonically with increasing
temperature is valid for C* greater than about 6 mg/L. This
assumption breaks down, however, for φ values just above
the range reported in the literature and for C* values just
below the normally chosen range of 5 to 6 mg/L.
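These competing temperature effects can be illustrated with the brief
sketch below (an illustration, not the analysis of Eheart and Park [1989];
the saturation curve is a commonly used polynomial approximation at 1 atm,
and L0, the 20 deg C rate constants, and the φ values are assumptions). It
evaluates the minimum DO of a classical Streeter-Phelps sag at several
temperatures, adjusting both the reaeration and decay coefficients as
k(T) = k20 * φ**(T - 20).

import numpy as np

# Illustrative sketch: competing temperature effects on the classical
# Streeter-Phelps DO sag.  The saturation polynomial is a commonly used
# approximation; L0, the rate constants, and the phi values are assumed.
def min_do(T, L0=15.0, ka20=0.6, kd20=0.3, phi_a=1.047, phi_d=1.047):
    """Minimum DO (mg/L) over 10 days of travel time at temperature T (deg C)."""
    cs = 14.652 - 0.41022*T + 0.0079910*T**2 - 0.000077774*T**3
    ka = ka20 * phi_a**(T - 20.0)         # reaeration coefficient, 1/day
    kd = kd20 * phi_d**(T - 20.0)         # BOD decay coefficient, 1/day
    t = np.linspace(0.0, 10.0, 400)       # travel time, days
    deficit = kd * L0 / (ka - kd) * (np.exp(-kd*t) - np.exp(-ka*t))
    return float(np.min(cs - deficit))

for T in (10.0, 15.0, 20.0, 25.0):
    print(T, round(min_do(T), 2))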
Discharger Grouping
In developing programs for regulating waste discharges
into water bodies, it is often convenient and, in many
cases, necessary to subdivide the dischargers into
independently administered groups. If each group of
dischargers is to be responsible for the water quality at a
particular group of checkpoints, the groups should be
chosen so that the effect of the dischargers of one group
on the checkpoints of another is minimal.
A heuristic method was developed that groups dischargers
on the basis of minimizing the effects of the dischargers
included in a group on checkpoints associated with other
groups of dischargers [Eheart et al., 1989]. The method
was illustrated with the use of data for several river basins,
viz., the Lower Fox River in Wisconsin, the Willamette
River in Oregon, and the Mohawk River in New York.
Several different grouping criteria were chosen to
represent the effect of the excluded dischargers, and it
was observed that, for these example rivers, the
groupings are somewhat insensitive to the choice of
this criterion when the excluded effect is low, but not when
it is high.
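The flavor of such a grouping can be conveyed with the small sketch below
(an illustration of the general idea, not the heuristic developed in the
study). Dischargers ordered from upstream to downstream are split into
contiguous groups wherever the influence that the current group carries onto
the checkpoints farther downstream (the "excluded effect") drops below a
threshold; the transfer coefficients a[i, j], giving the effect of discharger
i on checkpoint j, are assumed here to decay exponentially with distance but
would normally come from a calibrated water quality model.

import numpy as np

# Illustrative sketch of the grouping idea (not the study's heuristic):
# split dischargers, ordered downstream, into contiguous groups wherever
# the influence carried onto the checkpoints below the split is small.
def contiguous_groups(a, threshold):
    """Split dischargers 0..n-1 (upstream to downstream) into groups."""
    n = a.shape[0]
    groups, current = [], [0]
    for j in range(1, n):
        carry_over = a[current, j:].sum()   # excluded effect if we cut here
        if carry_over <= threshold:
            groups.append(current)
            current = [j]
        else:
            current.append(j)
    groups.append(current)
    return groups

km = np.array([0.0, 5.0, 10.0, 40.0, 45.0, 80.0])   # assumed outfall/checkpoint sites
a = np.array([[np.exp(-0.1 * (km[j] - km[i])) if j >= i else 0.0
               for j in range(len(km))] for i in range(len(km))])
print(contiguous_groups(a, threshold=0.5))           # -> [[0, 1, 2], [3, 4], [5]]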
Reserve Capacity
To act as a hedge against any potential errors or
inaccuracies in the predictions made with the water quality
model and/or to allow for future growth in waste
discharges, some of the river's allowable load capacity can
be set aside and made unavailable for use by the
dischargers. Depending on how the reserve capacity is
actually incorporated into the waste load allocation
process, however, the effects on water quality could vary.
To include reserve capacity in such a process, several
criteria should be considered and evaluated. Acceptable
levels of water quality should be maintained at an
acceptable probability and an acceptable cost. In addition,
the methods for including reserves should be equitable in
terms of the distribution of required treatment levels and
costs.
Approaches have been examined for including a reserve
capacity in the waste load allocation process, and methods
have been developed for analyzing these approaches in a
given application [Michels, 1987]. Relationships among
reserve capacity, frequency of water quality excursions,
and total cost have been examined. It was found that,
depending on how reserve is incorporated into a waste
load allocation, these relationships may be quite different
and the effect on overall water quality may vary. The
relationships were illustrated quantitatively with the use of
linear programming models for three load allocation
programs employing Lower Fox River data.
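A minimal sketch of one way a reserve can enter such a linear program is
shown below (an assumed formulation for illustration; it is not one of the
three Lower Fox River programs, and all coefficients are invented). A
fraction r of each checkpoint's assimilative capacity is withheld, and
allowable loads are then allocated by maximizing the total permitted
discharge subject to the tightened quality constraints.

import numpy as np
from scipy.optimize import linprog

# Illustrative sketch only: allocate allowable loads with a reserve fraction
# r of each checkpoint's assimilative capacity withheld.  The transfer
# coefficients, capacities, and raw loads are assumed numbers.
a = np.array([[0.8, 0.0, 0.0],      # row j: effect of each load (mg/L per
              [0.5, 0.9, 0.0],      # 1000 kg/day) on checkpoint j
              [0.2, 0.6, 0.9]])
cap = np.array([4.0, 4.0, 4.0])     # allowable impact at each checkpoint, mg/L
raw = np.array([6.0, 4.0, 5.0])     # untreated load of each discharger, 1000 kg/day
r = 0.2                             # reserve fraction held back

# maximize total allowed discharge (linprog minimizes, hence the minus sign)
res = linprog(c=-np.ones(3), A_ub=a, b_ub=(1.0 - r) * cap,
              bounds=[(0.0, float(u)) for u in raw])
allowed = res.x
print("allowed loads:      ", np.round(allowed, 2))
print("required removal, %:", np.round(100.0 * (1.0 - allowed / raw), 1))

Re-solving over a range of r values traces out how the required treatment
levels grow as more capacity is held in reserve.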
Seasonal Waste Load Allocation
Water quality management programs that allow different
waste discharge rates during different times of the year are
an innovative approach for reducing the cost of waste
treatment. Under such programs, referred to as periodic or
seasonal discharge programs, the rate of waste discharge
allowed at a given time is based on the assimilative
capacity of the receiving water body during that time, the
water quality goals of the river basin authority, and the
acceptable risk of water quality violation. The number and
length of the seasons are typically chosen considering the
physical conditions of the stream, the waste treatment
limitations of the dischargers, and the estimated
administrative burden and treatment cost savings
associated with each potential combination of season
number and length.
An important criterion of seasonal discharge programs is
that the degree of water quality protection achieved under
such programs should be the same as that achieved under
an accepted or existing nonseasonal discharge program
for the same river basin. One expression of this criterion is
referred to as risk equivalency [Rossman, 1989] because it
requires that the risk of violating a given water quality
standard under the seasonal discharge program be no
greater than that allowed under an existing nonseasonal
discharge program. Rossman [1989] describes an
approach for designing risk equivalent seasonal discharge
limits for single discharger stream segments. Under this
approach, risk is defined as the probability of incurring one
or more water quality violations in any given year, and the
level of risk allowed may be related to the return period of
the design streamflow used in an existing nonseasonal
discharge program. For example, the 10-year return period
associated with the annual 7-day, 10-year (7Q10) low flow
would correspond to a 10% risk of water quality violation in
any year. The seasonal discharge limits for a single
discharger under this approach are designed to maintain
risk equivalency with a nonseasonal discharge limit while
minimizing the waste treatment effort of that discharger.
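The annual-risk bookkeeping behind this idea can be shown in a few lines (a
simplified sketch, not Rossman's procedure; it assumes excursions in
different seasons are statistically independent): seasonal risks are chosen
so that the probability of one or more excursions in a year equals the 10%
implied by a 10-year return period.

def equal_seasonal_risks(annual_risk=0.10, n_seasons=2):
    """Seasonal risk p satisfying 1 - (1 - p)**n_seasons == annual_risk
    (independence of seasonal excursions is assumed)."""
    return 1.0 - (1.0 - annual_risk) ** (1.0 / n_seasons)

p = equal_seasonal_risks(0.10, 2)
print(round(p, 4))                       # about 0.0513 per season
print(round(1.0 - (1.0 - p) ** 2, 4))    # recovers the 0.10 annual risk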
In this research, risk equivalent seasonal waste discharge
programs for stream segments with several dischargers
were examined [Lence et al., 1989]. Although this
approach is similar to that proposed by Rossman [1989], it
was modified to accommodate river segments with several
dischargers. Two management objectives were proposed
as substitutes for minimizing waste treatment effort: the
minimum average uniform treatment (UT) and the
maximum total discharge (TD) objectives. Under the UT
objective, a seasonal set of uniform percent removal levels
is designed such that the average percent removal over
the year is minimized and a limit on the frequency of water
quality violations is satisfied. Under the TD objective, the
total design waste load in each season (i.e., the sum of the
waste loads for the individual dischargers) is determined
such that the TD over all seasons is maximized and a limit
on the frequency of water quality violations is satisfied.
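The UT objective, for example, can be sketched as follows for an assumed
two-discharger, two-season example (illustrative only; none of the
coefficients, loads, limits, or season lengths are taken from the study). In
each season a single removal level, applied to every discharger, is set just
high enough to meet the quality limit at all checkpoints, and the yearly
average removal is the season-length-weighted mean.

import numpy as np

# Illustrative sketch of the uniform-treatment (UT) idea with assumed data
# (not the study's model): each season gets the smallest single removal
# level, applied to all dischargers, that meets the quality limit at every
# checkpoint; the yearly average removal is weighted by season length.
def uniform_removal(a, raw, cap):
    """Smallest uniform fractional removal meeting cap at all checkpoints."""
    impact = a @ raw                            # impact of untreated loads
    return float(np.clip(np.max(1.0 - cap / impact), 0.0, 1.0))

a_low  = np.array([[0.9, 0.0], [0.7, 1.0]])     # low-flow (critical) season
a_high = np.array([[0.4, 0.0], [0.3, 0.5]])     # high-flow season
raw = np.array([5.0, 4.0])                      # untreated loads, 1000 kg/day
cap = np.array([3.0, 3.0])                      # allowable impact, mg/L

r_low, r_high = uniform_removal(a_low, raw, cap), uniform_removal(a_high, raw, cap)
months_low, months_high = 4, 8                  # assumed season lengths
avg = (months_low * r_low + months_high * r_high) / 12.0
print(round(r_low, 3), round(r_high, 3), round(avg, 3))   # 0.6 0.143 0.295

Under the TD objective the same seasonal constraints would instead enter a
linear program that maximizes the sum of the dischargers' seasonal loads, so
the removal levels need not be uniform across dischargers.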
The seasonal waste discharge programs that meet these
objectives were demonstrated for managing BOD
discharges on the Willamette River. Streamflow and
temperature data for the middle fork of the Willamette
River, the stream segment on which the 10 major
dischargers are located, were obtained from the United
States Geological Survey. The dischargers' locations,
waste load characteristics, design flows, and treatment
cost data were taken from Kilgore [1985]. Alternative risk
equivalent discharge programs for the dischargers on the
Willamette River were compared with each other and were
evaluated with respect to the resulting total waste
discharge, water quality, and total treatment cost. A
nonseasonal and a two-season discharge schedule was
examined for both the UT and TD objectives. The DO
standard was to be achieved with an annual risk of water
quality violation of 10%, and the alternative standards
considered varied from 5.0 to 7.5 mg/L.
The results of this analysis show that substantial
reductions in waste treatment effort and cost are possible
under a seasonal discharge schedule for the Willamette
River. The two-season discharge programs, which allowed
between 0 and 66,500 lb/day more waste discharge than
the nonseasonal TD programs, resulted in treatment cost
savings of between $0.0 and $22.0 million/yr (0% to 48%)
with respect to the nonseasonal discharge schedule.
To evaluate the dependence of cost on the relative lengths
of the seasons under the two-season discharge schedule,
the length of the critical season (i.e., the season in which
the average streamflow is the lowest) was varied. For a
given DO standard and critical season length, the total
treatment costs of both the UT and TD management
programs were obtained. For both of these discharge
programs, the optimal critical season length ranges from 2
to 5 months, depending on the DO standard. The total cost
of waste treatment is more sensitive to the critical season
length for high values of the DO standard than for low
values. The results for the UT allocation follows the same
trend, but the UT allocation is generally less expensive
than the TD allocation.
This research brief summarizes work done under
Cooperative Agreement No. CR812577-01 by the
University of Illinois under the sponsorship of the U.S.
Environmental Protection Agency. The Principal
Investigator was J. Wayland Eheart and the EPA Project
Officer was Lewis A. Rossman.
References

*Eheart, J. W., "Effects of streamflow variation on critical
   water quality for multiple discharges of decaying
   pollutants," Water Resources Research, 24(1):1-8, 1988.
*Eheart, J. W. and H. Park, "Effects of temperature
   variation on critical stream dissolved oxygen," Water
   Resources Research, 25(2):145-151, 1989.
*Eheart, J. W., E. D. Brill, Jr., and J. C. Liebman,
   "Discharger grouping for water quality control," in
   review, 1989.
Kilgore, J. D., "Seasonal static transferable discharge
   permits for the control of biochemical oxygen demand
   in the Willamette River," Working Paper No. 2, NSF
   Award PRA 81-21692, Department of Civil Engineering,
   University of Illinois, Urbana, Illinois, 1985.
*Lence, B. J., J. W. Eheart, and E. D. Brill, Jr., "Risk
   equivalent seasonal discharge programs for river
   basins with several dischargers," in review, 1989.
*Michels, C. M., "Incorporating reserve assimilative
   capacity in the waste load allocation process," M.S.
   Thesis, Environmental Engineering and Science
   Program, Department of Civil Engineering, University
   of Illinois, 1987.
*Rossman, L. A., "Risk equivalent seasonal waste load
   allocation," accepted by Water Resources Research,
   1989.

* Product of the current project