United States
Environmental Protection
Agency
Office of
Public Affairs (A-107)
Washington DC 20460
Volume 11
Number 8
October 1985
EPA JOURNAL
Mysteries of the Atmosphere
Another Challenge for
Environmental Technology
EPA's mobile incinerator in operation during a 1982 trial burn in Edison, N.J.
The incinerator's success demonstrates how technology can help clean up the
environment. (See story on page 17.)
Technology
and the
Environment
Behind the cleanup of the
environment stands a
powerful workhorse—
technology. This issue of
the EPA Journal examines
technology's role.
EPA Administrator Lee M.
Thomas sets a perspective,
explaining how science and
engineering are evolving
rapidly in support of cleanup
measures.
A new research frontier at
EPA is featured in a piece
explaining how the everyday
world is being used as a
testing site to advance
environmental understanding.
Other articles describe the
current technological effort to
clean up coal, developments
in techniques to deal with
hazardous waste, and modern
approaches in municipal
waste-water treatment.
Congressman James H.
Scheuer explores the
changing relationship
between technology and the
environment. Scheuer is
Chairman of the House
Subcommittee on Natural
Resources, Agriculture
Research and Environment.
Also discussed in this
issue are the recent success
of EPA's mobile
incinerator—the Blue
Goose—in destroying dioxin,
and EPA's use of an airborne
laser system to help solve air
pollution problems. From
EPA's Environmental
Research Laboratory at
Corvallis, Ore., comes a
report on an innovative use
of computers.
In other stories, a feature
describes how the New
England states have gotten
together to stop evasion of
hazardous waste disposal
rules. A new approach to
dealing with nonpoint-source
water pollution is reported.
An article explains what
scientists know about the
mechanisms that cause
cancer.
A photo essay features a
recent dive by EPA scientists
in a submersible. The divers
revisited an old dump site for
low-level radioactive waste
in the Pacific Ocean, 40
miles off the California coast.
The issue concludes with
two regular features—Update
and Appointments.
EPA JOURNAL
Lee M. Thomas, Administrator
Richard E. Sanderson, Acting Assistant Administrator for External Affairs
Paul A. Schuette, Acting Director, Office of Public Affairs
John Heritage, Editor
Susan Tejada, Associate Editor
Jack Lewis, Assistant Editor
Margherita Pryor, Contributing Editor
EPA is charged by Congress to
protect the nation's land, air, and
water systems. Under a mandate of
national environmental laws, the
agency strives to formulate and
implement actions which lead to a
compatible balance between
human activities and the ability of
natural systems to support and
nurture life.
The EPA Journal is published by
the U.S. Environmental Protection
Agency. The Administrator of EPA
has determined that the
publication of this periodical is
necessary in the transaction of the
public business required by law of
this agency. Use of funds for
printing this periodical has been
approved by the Director of the
Office of Management and Budget.
Views expressed by authors do not
necessarily reflect EPA policy.
Contributions and inquiries should
be addressed to the Editor (A-107),
Waterside Mall, 401 M St., S.W.,
Washington, D.C. 20460. No
permission necessary to reproduce
contents except copyrighted photos
and other materials.
Solving Tough
Environmental Problems
by Lee M. Thomas

Charting New Research
Frontiers in Chattanooga
by Jack Lewis

The Promise of
Cleaner Coal
by Julian Josephson

Deadline: 1990
by Suellen W. Pirages

Treating Municipal
Wastewater: Tradition
and Innovation
by Carl A. Brunner

Environmental Technology:
Old Disappointments
and New Hope
by James H. Scheuer

The Blue Goose Flies!
by Susan Tejada

Lasers Help Unravel
Air Pollution Mysteries
by Donald T. Wruble

Using Computers to
Isolate Pollution Causes
by Karen Randolph

A Compact to Track Down
Waste Dumping Cheaters
by David Pickman

Speeding Water Cleanup
While Saving Money
by John Jaksch and
Diane Niedzialkowski

How Chemicals
Can Cause Cancer
by Ronald W. Hart
and Angelo Turturro

Vigilance in the
Deep Sea Environment
by Margherita Pryor

Update

Appointments at EPA
Front Cover: Sun behind clouds.
(See story on page 18 regarding
EPA's air pollution research with
laser beams.)
Design Credits:
Robert Flanagan;
Ron Farrah.
EPA Journal Subscriptions
The annual rate for subscribers
in the U.S. for the EPA Journal is
$20.00. The charge to subscribers
in foreign countries is $25.00 a
year. The price of a single copy of
the EPA Journal is $2.00 in this
country and $2.50 if sent to a
foreign country. Prices include
mail costs. Subscriptions to the
EPA Journal as well as to other
Federal Government magazines are
handled only by the U.S.
Government Printing Office.
Anyone wishing to subscribe to the
EPA Journal should fill in the form
at right and enclose a check or
money order payable to the
Superintendent of Documents. The
requests should be mailed to:
Superintendent of Documents,
GPO, Washington, D.C. 20402.
Name - First, Last (PLEASE PRINT)
Company Name or Additional Address Line
Street Address
City
Zip Code
☐ Payment enclosed (Make checks payable to Superintendent of Documents)
☐ Charge to my Deposit Account No.
Solving Tough
Environmental Problems
by Lee M. Thomas
It wasn't too long ago that the
American environment seemed
beyond repair. Downtown, the air was
so thick with smog that you couldn't see
through it. The rivers were cesspools of
floating sewage. Junk piles were
proliferating all over the landscape.
Pesticides were killing off irreplaceable
wildlife, and toxic substances in minute
but dangerous amounts could not be
properly monitored, let alone managed.
Now, 15 years after the founding of
EPA, conditions are much better. We
have a long way to go before our
surroundings become as clean and safe
and healthy as we know how to make
them, but we are on the road to
recovery. The public is solidly behind
the idea of scientific environmental
management.
Look how far we've come. Despite
substantial increases in population and
economic activity, despite millions
more cars on the road, despite the
continuing spread of exurban and
recreational communities, we have seen
steady progress in our efforts to combat
surface-water pollution. The fish have
returned to countless lakes and streams,
and thousands of miles of
once-contaminated rivers are open to
swimmers. The changes in the urban
atmosphere are even more obvious—the
levels of almost all of the major
pollutants have dropped, many
dramatically, during the last decade and
a half.

(Thomas is Administrator of EPA.)

These advances have been possible
because of EPA's strict enforcement of
environmental laws passed during the
1970s. However, EPA programs don't
work in a vacuum; behind them lies a
largely untold story of rapidly evolving
science and engineering. Without EPA's
health and environmental research,
sophisticated monitoring techniques,
and new control technologies, we could
not have accomplished as much in
protecting the health and heritage of the
American people. Our own findings are,
of course, amplified by the work of
academicians, environmentalists, and
industrial laboratories.
To take one example of our progress,
the widespread installation and gradual
refinement of flue-gas scrubbers for
industrial and utility smokestacks
helped cut ambient sulfur dioxide 36
percent from 1975 to 1983. Nitrogen
dioxide has come down more slowly,
but studies indicate that we may be able
to do better with computer-controlled
improvements in combustion
techniques. Meanwhile, stacks have
become less smokey in general due to
the installation of precipitators and
baghouses; particulates dropped 20
percent from 1975 to 1983.
Automobiles pose a far less serious
problem today because of the
introduction of the catalytic converter in
the mid-1970s. Complementing these
engineered systems, by January 1, 1986,
we will have removed by regulation
more than 90 percent of the lead
formerly used in gasoline.
The nation's rivers have been a
special challenge because of the volume
and extraordinary variety of substances
dumped into them from tens of
thousands of sources. Conventional
wastewater treatment technology has
always relied upon screening, settling,
aeration, and chlorination to skim off
solids, clarify the water, and kill
infectious bacteria. But that is no longer
enough; EPA must regulate the toxic
compounds entering the nation's
waterways and aquifers as well.
Intensive research is under way to
identify and quantify toxics so we can
tighten our control measures.
We must also look to science for the
ultimate solution to the hazardous waste
problem. Most communities are running
out of places to dispose of these wastes,
and under the 1984 amendments to the
Resource Conservation and Recovery
Act, they can no longer simply be
dumped, not even in controlled
landfills. Ironically, some wastes are
generated by systems installed to
prevent pollution of air and water. So
we are looking for breakthroughs in
safe, high-temperature incineration,
chemical treatment, and natural or
genetically engineered bacteria that can
digest toxics and excrete them in
harmless form. Initial signs are highly
encouraging.
It goes without saying, however, that
studying pollutants individually, though
necessary, is hardly sufficient. We need
to analyze entire metropolitan regions
over long periods to determine how
pollutants interact and how various
strategies can minimize their impact.
Philadelphia and Baltimore have
pioneered in such research. Now
Chattanooga, Tenn., is working with us
to establish a baseline for the major
contaminants so we can follow them
over a number of years. The city and its
suburbs will be a living laboratory for
an integrated study of social, biological,
and physical aspects of pollution in all
its complex forms. (See story on page 4).
At EPA's Air and Energy Engineering
Research Lab in North Carolina,
contract engineer Reggie Powell
operates a wet scrubber pilot system.
The system uses a mixture of water and
limestone to remove sulfur dioxide from
flue gases resulting from coal
combustion. Technology developed at
the lab and adapted by industry helps
limit gaseous pollutants in the
environment.

Environmental research is nothing if
not multi-disciplinary. It examines
everything from the fate of a small
biological community to the fluctuation
of climate. EPA scientists study how
pollutants are transported and
chemically transformed in air, soil, and
water; sometimes dispersed and
sometimes concentrated by industrial,
urban, agricultural, or purely natural
processes. Such observations determine
who and what is exposed to danger,
how intensively, and how long. With
the facts in hand, we can then work out
appropriate remedies.
We know much more about how the
environment works as a total
biophysical system than we did 15 years
ago. EPA scientists have been in the
forefront of this historic effort, both in
the lab and in the field. Indeed,
environmental research is one of the
best examples of American leadership
in world science generally. In applied
environmental engineering we are, on
the whole, far ahead of other industrial
nations. Yankee ingenuity is not only
not dead, it's alive and well. It is
solving tough problems and, not
coincidentally, creating hundreds of
thousands of jobs.
We still have a long way to go before
the picture is complete, but in the
course of the next decade or two we
may learn enough to design a system of
environmental management that is less
reactive and more proactive than any
possible today. That, in turn, would
permit major advances in the protection
of human health and the preservation of
resources. From my vantage point, the
prospects seem highly favorable, and we
in EPA are anxious to get on with the
job.
Charting New Research
Frontiers in Chattanooga
by Jack Lewis
Aerial view of Chattanooga. Lookout
Mountain is in the background.
Most people think of scientific
testing as something very far
removed from everyday life. They
picture a scientist performing arcane
experiments in a laboratory filled with
test tubes and beakers.
But what if the everyday world were
to become a lab? That is precisely what
is happening in Chattanooga, Tenn. EPA
has chosen this city from among 56
candidates to become the nation's first
Environmental Methods Testing Site
(EMTS).
This is not the first time a city has
served as a site for environmental
testing. What makes Chattanooga—and
EMTS—unique is the duration of testing
EPA has in mind, and the
unprecedented foundation of basic data
that will be gathered before testing
begins.
For at least 10 years, and possibly as
long as 15, Chattanooga will be the
scene of a series of experiments, each
one adding another vital link to our
understanding of the environment.
Before experimentation begins, large
quantities of data will be gathered to
"characterize" the Chattanooga site.
Scientists need such benchmark data
before they can set up their field tests,
and gathering it takes both time and
money.
The EMTS will offer scientists a set of
site characterization data unsurpassed
in the United States. Having exact data
on pollutant loadings will do more than
save scientists time and money. It will
also improve the design of field tests by
indicating:
• How much new data should be
gathered by projected field tests.
• Whether computer mapping
procedures can be constructed to
predict the best locations for field
testing.
• What statements about exposure can
be made on the basis of the data
collected.
(Lewis is Assistant Editor of the EPA
Journal.)
Improved experimental design will
enhance the quality of field testing
methods and procedures. A fixed
location for experiments will also lead
to greater consistency.
All the experimental advantages the
EMTS offers are certain to improve
EPA's understanding of the best means
of monitoring human exposure to toxic
substances. The Toxic Substances
Control Act of 1976 charges EPA with
developing and improving methods for
monitoring such exposure. The agency's
Office of Toxic Substances—in
conjunction with the Office of Research
and Development—decided that
selecting and characterizing a single
research site would greatly advance that
mission.
EPA will have priority access to the
EMTS, but other government
agencies—federal, state, and local—will
also have a chance to conduct field tests
in Chattanooga once the site is initially
characterized. One outside organization
has already been assured access to the
EMTS: the United Nations. The U.N.
Environment Program and World Health
Organization's Human Exposure
Assessment Location (HEAL) project
will be conducted in Chattanooga by
EPA. The Chattanooga HEAL site will
be one of only four such sites in the
world.
What factors have made Chattanooga
the focus for such intense scientific
research? EPA's EMTS Project
Coordinator Robert Jungers emphasizes
that Chattanooga was not selected as
any "dirty city showcase." Far from it,
in fact. Although Chattanooga once had
severe pollution problems, in recent
years the city has made commendable
progress on all environmental fronts.
Chattanooga does have measurable
pollutant loadings of the type scientists
need to study, but other American cities
have them, too.
The main reasons for Chattanooga's
selection were altogether positive:
• A dynamic economy: Although still
heavily industrialized, Chattanooga is
forging a new economic position as a
major distribution center in the
Southeast. As EMTS Steering Committee
Chair Michael Dellarco puts it,
Chattanooga is making "an ambitious
and coordinated effort to improve its
economy and quality of life."
• Good location: Chattanooga's relative
isolation from other major population
centers makes it easier to get
measurements specific to the city. That
isolation does not cause inconvenience.
Transportation in and out of the city is
excellent.
• A strong public health network at the
county level: Hamilton County, home
county of Chattanooga, has done its own
air pollution monitoring since the
1920s. The county also gained
experience as a site for EPA field testing
in the 1970s when the agency was
studying nitrogen oxides as part of the
Community Health and Environmental
Surveillance System program.
• Support facilities: EMTS experiments
can draw on the skills of commercial
engineers and technicians in
Chattanooga as well as those of
academic experts at the University of
Tennessee's Chattanooga campus.
And last but not least:
• Available data: Chattanooga's biggest
edge over its competitors was its
inclusion in a veritable treasure trove of
computerized Tennessee Valley
Authority (TVA) data. EPA estimates
that the existence of this vast computer
data base will shave an entire year off
the time it takes to characterize the
Chattanooga site.
The long process of characterizing the
Chattanooga site is already under way.
Even with the headstart from TVA, this
demanding project will take a full year.
EPA has contracted the site
characterization to the University of
Nevada's Environmental Research
Center (ERC). The Center will work
closely with EPA's Environmental
Monitoring Systems Laboratory in Las
Vegas. Computer experts at EPA's lab in
Research Triangle Park, N.C., will keep
track of the growing data base in the
federal mainframe computers at the
National Computer Center.
The most up-to-date computer
technology will be used to speed the
EMTS site characterization. Consider
the ARC/INFO Georeferenced
Information System, better known as
"GIS." CIS gives scientists the power to
put spatial data into a computer data
base. That is a great leap forward,
because nearly all environmental data
derive their significance from their
position on a map.
As they are entered into the data base,
GIS data are categorized according to
"theme" and assigned to one of many
layers in the computer's memory. Each
"theme" layer corresponds to a specific
data type: land use, hydrology, soil
type, etc. GIS permits scientists to
interact with all these stored data in an
endless variety of ways.
Never-before-possible correlations
involving data from many different
"themes" can now be done quickly and
printed out in handy map form. No
wonder words like "revolutionary" are
often used to describe the potential
impact of GIS.
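To make the notion of "theme" layers concrete, here is a minimal sketch, written in Python purely for illustration; ARC/INFO itself is a commercial system, and none of the names below come from it. The sketch treats each theme as a separate layer keyed by map-grid cell and correlates layers cell by cell, a toy version of the cross-theme overlay described above.

# Hypothetical illustration only -- not ARC/INFO code.
# Each theme (land use, soil type, ...) is its own layer: a mapping
# from a map-grid cell to the value recorded there.
land_use = {(3, 7): "industrial", (3, 8): "residential", (4, 7): "park"}
soil_type = {(3, 7): "clay", (3, 8): "loam", (4, 7): "loam"}

def overlay(*layers):
    """Correlate several theme layers cell by cell.

    For every grid cell present in all layers, return the tuple of
    values drawn from each theme -- a toy version of the cross-theme
    correlations a GIS can print out in map form.
    """
    common = set(layers[0]).intersection(*layers[1:])
    return {cell: tuple(layer[cell] for layer in layers)
            for cell in sorted(common)}

for cell, values in overlay(land_use, soil_type).items():
    print(cell, values)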
A quick look at the types of data
needed to characterize the Chattanooga
site bears out its spatial dimension:
• Political boundaries
• Transportation systems
• Natural drainage patterns
• Streams, rivers, lakes
• Topography
• Land use
• Soil type
• Census geography
• Sewage systems
• Drinking water distribution systems
• Location of large public buildings
• ZIP code boundaries
• Location of industries
• Location of environmental monitoring
sites
• Demography
• Agricultural practices
• Climatology
• Concentrations of environmental
pollutants
Each of these "themes," once
programmed for Chattanooga on the
Georeferenced Information System, will
be a magnet for thousands of bits of
data: far more information than any
human brain can correlate, let alone
retain.
The EMTS site characterization team
will have to locate all available data sets
pertaining to the Chattanooga Standard
Metropolitan Statistical Area, a 6-county
region that spills over the Tennessee
border into Georgia. A large portion of
this information exists only on paper,
buried in the voluminous files of these
counties and states, as well as in the
national files of EPA. To simplify the
process, no data earlier than 1980 will
be entered into the computer data base.
When no data exist for the years 1980 to
1985, the best available set of data will
be used.
Compiling computer data bases will
also require a massive effort. The
Tennessee Valley Authority data base is
the prime candidate for inclusion, but
other needed information may come
from the computers of the Bureau of the
Census, the National Weather Service,
the National Oceanic and Atmospheric
Administration, the U.S. Geological
Survey, the Department of Agriculture,
and the Department of Health and
Human Services, as well as from
existing data bases at EPA. Fortunately,
the Georeferenced Information System is
readily compatible with other computer
data bases.
EPA's Office of Toxic Substances
(OTS) has a valuable data base to
contribute to the Chattanooga site
characterization. This is GEMS, the
Graphical Exposure Modeling System.
GEMS already contains site
characterization data extracted from
many EPA data bases. OTS and many
other GEMS users across the country
use such data together with chemical
fate and exposure models available in
GEMS to predict levels of exposure to
toxic substances. This innovative system
has attracted national, and even
international, attention over the past
five years.
Many EMTS experimenters will have
to wait until every last detail needed for
the site characterization has been
pinned down, and that can't be
expected before October 1986. Other
EMTS projects have the green light,
however, because the data they are
gathering will make valuable
contributions to the site characterization
itself.
A case in point is the Office of Toxic
Substances' Pilot Geocoding Project.
Scheduled for completion by the end of
November, this project entails locating
emission sources in Chattanooga using a
special means of analyzing aerial
photographs.
Another high priority project is the
prestigious Human Exposure
Assessment Location (HEAL) project for
the United Nations. The HEAL project
will generate an international data base
of exposure and environmental
monitoring data. The initial HEAL
experiments at the Chattanooga site will
generate data on several chemicals.
EPA's Office of Research and
Development and Office of Toxic
Substances are jointly leading the HEAL
effort in the United States. HEAL
projects are already underway in Japan,
Yugoslavia, and Sweden. It is expected
that 12 to 15 nations will eventually be
part of the international HEAL network.
Other projects already approved for
EMTS testing include:
• An Office of Toxic Substances pilot
test for monitoring levels of toxic
chemicals in fatty tissue. This national
survey will later be extended to human
blood and possibly mothers' milk.
• Phase 2 testing of the Total Exposure
Assessment Methodology (TEAM).
EMTS testing in Chattanooga will be
aimed at clarifying the differences
between outdoor and indoor toxic air
exposure.
• Diesel engines and other sources emit
particulate matter into the air, and
samplers are needed to monitor their
exact level. The EMTS will test particle
samplers developed by various
manufacturers.
• The National Oceanic and
Atmospheric Administration (NOAA)
plans to conduct an in-depth
climatological study at the EMTS.
• Analyses of gases detected during soil
testing. Gases often emanate from
landfills and hazardous waste sites, of
which Chattanooga has its share.
• An evaluation of the merits of
Chattanooga's new drinking water
system technology.
The final item on this list should
assure the citizens of Chattanooga that
EPA intends to be a good neighbor. To
promote better understanding and
communication, the agency has
included local as well as state
representatives on the EMTS Steering
Committee. This committee will screen
all proposals for projects to be
conducted in Chattanooga.
But an EMTS experiment doesn't have
to have Chattanooga in its project
prospectus to be pertinent to
Chattanoogans. Because of the special
nature of the EMTS, all experiments run
at the Chattanooga site—even those
originating from abroad—will be
generating Chattanooga data, valuable to
Chattanooga residents, businessmen,
and government leaders. And EPA will
make every effort to tap the extensive
support facilities of Chattanooga in all
aspects of EMTS testing.
The project also promises to bring
prestige to Chattanooga. Dr. Michael
Bruner, Assistant Commissioner for
Environment in the Tennessee
Department of Health and Environment,
believes that the EMTS designation
gives Chattanooga "the potential of
developing into the national—and even
the international—center for
environmental methods testing."
Most important of all, the EMTS
project could provide data crucial to the
formulation of new theories about the
effects of toxic substances on humans:
how best to measure them, how best to
reduce or eliminate them.
The daily life of Chattanooga can
proceed unchanged as scientists look for
new insights into these problems. At no
risk or expense, and at little or no
inconvenience, Chattanooga has the
chance to take environmental science
out of the laboratory and into real life,
as a whole city becomes a lab of great
value to scientists everywhere.
The Promise of
Cleaner Coal
by Julian Josephson
Coal is America's "good news-bad
news" energy source.
The good news is that it can be
expected to remain a reasonably priced
fuel over the next several decades—the
time needed to develop renewable
sources of energy on a scale large
enough to meet the U.S. economy's
needs. The bad news is that the
expanded use of coal could cause
additional air pollution because of
increased emissions of particulate
matter (fly ash and soot) and oxides of
sulfur and nitrogen.
Coal combustion is a source of air
pollution because of the composition of
the coal and the manner in which it
burns. Coal contains varying amounts of
mineral matter, including pyritic
sulfur—the crystals of "fool's gold" one
often sees in lumps of coal. It also
contains organic sulfur bound
up in its molecular structure. During
combustion, the mineral matter fuses to
become fly ash and clinker.
Incompletely burned coal forms soot.
The pyritic and organic sulfur oxidize to
sulfur dioxide. Atmospheric nitrogen
and oxygen combine under the high
temperatures of coal combustion to form
oxides of nitrogen.

(Josephson is Associate Editor of
Environmental Science and Technology,
a publication of the American Chemical
Society.)

These materials do not affect only air
quality and visibility. Atmospheric
chemists now believe that oxides of
sulfur and nitrogen combine with
moisture in the air to form acid
precipitation. Oxides of sulfur also form
sulfates which contaminate surface
water and soils, and can damage forests
and crops. The fly ash and soot are
thought to contain trace amounts of
toxic and cancer-causing substances.
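In simplified overall form, the chemistry being described can be written as follows; these schematic equations are standard textbook summaries, not taken from the article.

$$2\,\mathrm{SO_2} + \mathrm{O_2} \rightarrow 2\,\mathrm{SO_3}, \qquad \mathrm{SO_3} + \mathrm{H_2O} \rightarrow \mathrm{H_2SO_4}$$
$$4\,\mathrm{NO_2} + \mathrm{O_2} + 2\,\mathrm{H_2O} \rightarrow 4\,\mathrm{HNO_3}$$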
Because of these threats, existing
limits on emissions of particulates and
sulfur dioxide (SO2) from stationary
sources, such as power plants, will
eventually be reduced further. Rules
sharply restricting emissions of oxides
of nitrogen (NOX) can be expected sooner
or later.
To comply with present and future
emission control regulations, the user of
coal has two options. One, presently the
most widely used, is to remove
particulate matter, SO2, and NOX from the flue
gas generated when coal burns. The
other is to remove pollutants before they
are taken up in the flue gas. The first
option requires an elaborate installation
of gas-cleaning equipment; the second
consists of capturing pollutants during
the burning process or taking potential
pollutants out of the coal before it is
burned. Cleaning coal before or during
burning could reduce or eliminate the
need for gas-cleaning installations.
Cleaning Emissions
Before the coal is burned, it can be
pulverized and washed to remove
some of the mineral matter that
forms ash and pyrites that oxidize
to SO2. After the coal is burned,
particulate matter is captured with
either an electrostatic precipitator (ESP)
or a fabric filter. An ESP charges the
particles electrically and catches them
on plates of opposite charge. A fabric
filter collects the particles as though it
were a giant vacuum cleaner bag.
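Collection efficiency for an ESP is commonly estimated with the Deutsch-Anderson relation, a standard textbook formula (not cited in the article) that shows why efficiencies above 99 percent are attainable when enough plate area is provided:

$$\eta = 1 - \exp\!\left(-\frac{wA}{Q}\right)$$

where $\eta$ is the fraction of particles collected, $w$ the particles' migration velocity toward the plates, $A$ the total plate area, and $Q$ the gas flow rate.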
The gas is then passed through a
"scrubber" in which a wet alkaline
material, usually lime, removes the SO2.
Fabric filters and ESPs can capture more
than 99 percent of the particulate
matter. Scrubbers can achieve more than
90 percent SO2 removal. Most do not
yet remove NOX, although technology
to accomplish this is developing
rapidly.

At this power plant in the southeastern
United States, work is underway on
installing a flue gas desulfurization
(FGD) system to the left of the chimney
and, to the right, an electrostatic
precipitator. This particular FGD system
produces commercial-grade gypsum as
a byproduct.
Although gas-cleaning equipment has
performed successfully, it still has
several problems. For instance, an ESP
or fabric filter cannot catch all of the
solid particles. Small amounts of finer,
inhalable particles can escape to the air.
The solid material that is captured must
be disposed of in some way. This
problem could become more serious if
these solids are designated as a
hazardous waste. A major problem with
scrubbers is that they are expensive and
energy-intensive. A scrubber can add as
much as 40 percent to the cost of a new
coal-fired power plant. Its operation can
consume up to eight percent of the
plant's output.
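To put those percentages in perspective, take a hypothetical 500-megawatt plant; the figures are illustrative only, not drawn from the article.

$$0.08 \times 500\ \mathrm{MW} = 40\ \mathrm{MW}$$

Up to 40 megawatts of the plant's own output would go to running the scrubber, and if the plant itself cost, say, $1 billion to build, the scrubber could add as much as $400 million on top of that.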
Another difficulty is that scrubbing
SO2 with wet lime produces large
quantities of sludge which must be
disposed of somewhere. At a coal-fired
power plant in western Pennsylvania,
the sludge is submerged in a large
artificial lake in which it settles to the
bottom and is said to cause no
contamination problem. At another
plant, proprietary materials are mixed
with the sludge to "fix" it in a
concrete-like material that can be used
for paving.
Newer technologies in use or in
advanced stages of development bypass
the sulfite sludge disposal problem.
They can also produce marketable
byproducts whose sale partially offsets
the high capital and operating costs of
flue gas cleaning equipment.
In one process developed in Japan,
SO2 is scrubbed with limestone over a
catalyst made of silver. The limestone is
converted to calcium sulfate (gypsum)
which is used to make plaster and other
building products.
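The overall limestone-to-gypsum chemistry can be summarized schematically (the equation is a standard one, not given in the article):

$$\mathrm{CaCO_3} + \mathrm{SO_2} + \tfrac{1}{2}\,\mathrm{O_2} + 2\,\mathrm{H_2O} \rightarrow \mathrm{CaSO_4}\cdot 2\,\mathrm{H_2O} + \mathrm{CO_2}$$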
Other techniques do not use lime or
limestone. For example, at several
plants, flue gas is scrubbed with a
solution of caustic soda to produce
sulfur, a marketable byproduct. In
another process, SO2 is captured with
magnesium oxide. After the sulfur is
removed, the magnesium oxide can be
recycled to the flue gas cleaning system.
This process also produces sulfur of
marketable quality.
Flue gas scrubbing technology for
removing both SO2 and NOX is coming
on line. One technique uses black
copper oxide to capture SO2. This
reaction produces copper sulfate which,
in turn, catalyzes the destruction of NOX
by ammonia and the formation of water
and nonpolluting nitrogen gas. Heating
the copper sulfate restores the copper
oxide and drives off SO2 in
concentrated form. This concentrated
SO2 is easily converted to sulfuric acid
or sulfur, both of which are always in
demand in many industries. A West
German process produces ammonium
sulfate, a fertilizer base, as the flue
gas cleaning byproduct.
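The copper oxide route described above can be sketched in three simplified steps; the stoichiometry is schematic and follows the description in the text rather than any particular commercial design.

$$\mathrm{CuO} + \mathrm{SO_2} + \tfrac{1}{2}\,\mathrm{O_2} \rightarrow \mathrm{CuSO_4}$$
$$4\,\mathrm{NO} + 4\,\mathrm{NH_3} + \mathrm{O_2} \xrightarrow{\;\mathrm{CuSO_4}\;} 4\,\mathrm{N_2} + 6\,\mathrm{H_2O}$$
$$\mathrm{CuSO_4} \xrightarrow{\;\text{heat}\;} \mathrm{CuO} + \mathrm{SO_2} + \tfrac{1}{2}\,\mathrm{O_2}$$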
Cleaning Coal during Combustion
Fluidized-bed combustion (FBC)
removes SO2 during combustion by a
process known as sorption. Coal and
limestone are crushed, then burned
together in a bed suspended in midair
by updrafts from below. As the coal
burns, the limestone captures, or sorbs,
the SO2. Suspending the coal-limestone
mixture on the updraft "cushion" makes
it look like a flowing, bubbling liquid.
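In simplified terms, the sorption amounts to calcining the limestone and then sulfating the resulting lime; the overall equations below are standard, not quoted from the article.

$$\mathrm{CaCO_3} \rightarrow \mathrm{CaO} + \mathrm{CO_2}, \qquad \mathrm{CaO} + \mathrm{SO_2} + \tfrac{1}{2}\,\mathrm{O_2} \rightarrow \mathrm{CaSO_4}$$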
FBC takes place at lower temperatures
than those needed for conventional or
"fixed-bed" combustion. Typical
temperatures for FBC are 750-950°C
(1382-1742°F); fixed-bed combustion
requires temperatures of 1400-1500°C
(2552-2732°F).
Because of these lower FBC
temperatures, no slag or clinkers are
formed from the mineral matter in the
coal. That makes the ash soft and easy
to handle. Lower FBC temperatures also
lead to substantial reductions in the
formation of NOx as well as the highly
corrosive vapors of alkali and metal
salts.
Between next year and 1989, several
utilities in Colorado, Kentucky, and
Minnesota expect to start up
fluidized-bed combustion boilers to
generate 100-160 megawatts of power.
About 20 U.S. and 32 foreign companies
are vying to sell FBC systems in what
they see as a rich market, and a number
of engineering firms are competing for
orders to design such systems.
One problem with FBC is that most
systems use the limestone sorbent only
once to capture SO2, and then dispose
of it. However, scientists and engineers
are developing sorbents that can be
regenerated and recycled. They are also
testing materials which can sharply
increase the efficiency of these sorbents.
Cleaning Coal before Combustion
This approach involves removing the
mineral matter that would otherwise
become fly ash and SO2 before the coal
is burned. Among techniques in use or
being developed are simple washing,
conversion of coal to gases and liquids,
and exotic procedures such as chemical
treatment and microwave separation.
One coal-cleaning technology has
completed one year of successful
performance at a 100-megawatt power
plant in California. Coal is reacted with
steam and oxygen under very high
pressure, to form a mixed gas consisting
of carbon monoxide and hydrogen. This
gas drives a turbine to generate
electricity; it is then burned in a boiler
to make steam which drives another
turbine to produce more electricity. The
coal's sulfur combines with the
hydrogen formed by the gasification
process. The sulfur can be recovered for
possible sale. NOX formation is
inhibited because of lower flame
temperatures and oxygen concentration,
and new burner and flame-shaping
technology.
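The core gasification chemistry can be written schematically as follows (simplified overall reactions, not taken from the article):

$$\mathrm{C} + \mathrm{H_2O} \rightarrow \mathrm{CO} + \mathrm{H_2}, \qquad \mathrm{C} + \tfrac{1}{2}\,\mathrm{O_2} \rightarrow \mathrm{CO}$$
$$\mathrm{S}\ (\text{in coal}) + \mathrm{H_2} \rightarrow \mathrm{H_2S}$$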
Solvent refining is another means of
cleaning coal. The coal is dissolved in
an organic solvent and heated to
750-850°F under pressures of 1800-2800
pounds per square inch. This process
forms hydrocarbon liquids and gases,
hydrogen sulfide, and water. The
hydrogen sulfide removes some of the
organic sulfur bound up in the coal.
Ash-forming minerals and inorganic or
pyritic sulfur are also taken out. Pilot
studies of the process are being
conducted in Alabama under the
sponsorship of the U.S. Department of
Energy, and several industry
associations and companies.
Why Not Full Speed Ahead?
The development and
commercialization of cleaner ways to
burn coal is going forward, and shows
promise for reduced air pollution and
acid deposition. But why are these
activities not moving ahead at full
speed? One reason may be the present
glut and price softness of oil. The
economic situation has dampened
incentives to improve coal cleaning and
combustion technologies. However, the
petroleum oversupply cannot last more
than several years, at best. When oil and
natural gas supplies become short once
again, environmental regulations
governing the use of coal will not be
relaxed. Clean coal research,
development, and commercialization
must be accelerated so that coal and its
products can tide the U.S. over between
the time oil and gas become scarce once
again and renewable sources of energy
become technically and economically
feasible for use on a large scale.
Deadline: 1990
by Suellen W. Pirages
Deadline: 1990. That is the year when
land disposal of all hazardous waste
will come to a complete stop unless. . .
Unless EPA has, by then, evaluated
all listed hazardous wastes, determined
which ones should not be disposed on
land, and identified alternative
ways—either by incineration or
treatment—to manage those wastes.
Congress imposed the 1990 deadline
on EPA last year, when it reauthorized
the Resource Conservation and Recovery
Act (RCRA).
What will happen if the land disposal
prohibition takes effect before sufficient
capacity to treat or incinerate hazardous
waste exists?
Manufacturers will either have to
store their hazardous waste until such
capacity does become available, or they
will have to stop producing goods that
generate hazardous waste.
Unfortunately, these goods include not
only exotic chemicals with which the
average American is little concerned,
but also such essential items as
medicines and medical supplies and
such common everyday items as
automobiles, household plumbing
fixtures, paint, home computers, and
clothing. Obviously, if production of
these goods were to stop, the effect on
both manufacturers and consumers
would be drastic.
Can the congressional requirement be
implemented by 1990? Perhaps. But
several factors will limit the ability of
both government and industry to restrict
certain wastes from land disposal.
Two inescapable realities confront
any attempt to ban land disposal of
hazardous waste. First, the residues of
some industrial wastes will require land
disposal even after treatment.
Both organic compounds and
inorganic elements are found in
industrial wastes. Organic compounds
can be destroyed using existing
technology. But according to the laws of
physics, inorganic elements—which
occur naturally in the environment and
include lead, mercury, chromium,
cadmium, and sodium—can never be
destroyed. The best option for managing
these wastes is to solidify them to
minimize the possibility of future
migration, and to place them in a secure
landfill.

(Pirages is Director of the National
Solid Wastes Management Association's
Hazardous Waste Program.)
Second, even though alternatives to
land disposal are available now, limited
capacity for treating hazardous waste
could still make it difficult to meet the
1990 deadline.
A recent national survey conducted
for EPA indicated that, even as early as
1981, 66 percent of the total national
volume of hazardous waste generated
was already being treated. Furthermore,
an informal survey of selected
commercial waste service firms revealed
that, of the untreated waste accepted at
disposal facilities, 30 to 60 percent was
pretreated at the facility before disposal.
In addition to conventional treatment
options in use today (see Figure 1), new
developments to improve treatment
technology are emerging. Some of these
are refinements of existing technologies
that lead to more efficient destruction of
hazardous chemicals. Others involve the
development of "high tech" innovations
that can reduce the hazard of chemicals
that previously were difficult to destroy.
Figure 2 illustrates some innovative
technologies that have advanced beyond
the experimental stage and are ready for
investigation in larger demonstration
projects.
But while commercial capacity for
treating industrial waste exists, it is not
evenly distributed throughout the
country. EPA data suggest that unused
treatment capacity nationwide stands at
30 to 45 percent. Some regions of the
country use almost all their treatment
capacity, but others do not. The greatest
unused capacity apparently exists in the
Midwest.
Unavailability of nearby commercial
treatment facilities can force large
generators to ship their industrial
hazardous waste long distances for
treatment or to treat the waste
themselves; and it puts a real crunch on
medium and small generators who
cannot afford an in-house treatment
capability.
Barriers to
Increasing Capacity
There are three major barriers to
increasing commercial capacity for
treating greater volumes of industrial
waste.
Lack of adequate regulatory standards
inhibits both commercial expansion of
conventional treatment capacity and
capital investment in new technologies.
Within the RCRA program, strict
standards have been developed only for
incineration and land disposal of
hazardous waste. Other treatment
alternatives are regulated only by the
Clean Water, Clean Air, and Safe
Drinking Water Acts, laws which do not
regulate a diverse range of hazardous
industrial chemicals.
EPA plans to develop health-based
criteria with which to evaluate wastes
for restrictions from land disposal. Once
established, these criteria can be used to
identify treatment standards for a broad
range of land disposal alternatives.
Until then, however, no prudent
businessman in today's economy will
expand investments into an area
plagued by uncertainty.
Figure 1.
Conventional Treatment
Chemical
• precipitation
• oxidation
• reduction—dechlorination
• photolysis
• stabilization/solidification
Biological
• aerobic/anaerobic
• land treatment
Incineration
• liquid injection
• rotary kiln
• cement kiln
Constraints on commercial markets
arise from governmental actions at all
levels. For example, many states and
communities are attempting to restrict
the movement of industrial wastes,
either by preventing hazardous material
transport through a city or by imposing
high taxes on importation of hazardous
waste for treatment and disposal. Some
states attempt to impose differential tax
rates on waste treated commercially and
waste treated by a generator. New
mandates by Congress to minimize
waste generation can lead to uncertainty
about future volumes requiring
management. Expansion of commercial
treatment technologies will continue to
be constrained until such trends in
taxation, transport, and composition and
volume of industrial waste can be
identified with greater certainty.

Figure 2.
Emerging Technologies
• Water oxidation
• Vertical-tube reactor
• Pyrolizing rotary kiln
• Penberthy pyro-converter
• Plasma arc
• High-temperature fluid wall reactor
• UV photolysis
• Pyroplasma processes
Another major constraint to
development of commercial treatment
capacity is the present slow pace of
granting government permits for
operation of a facility. RCRA permits
must be obtained before construction of
new facilities or expansion of existing
facilities. To date very few permits have
been granted. Some have been finalized
for small treatment facilities in rural
areas, but virtually no final permits
have been granted for large
multipurpose management facilities
(i.e., facilities with incineration,
treatment, and land disposal capacity
for treatment residues).
Although EPA has stated that new
treatment facilities will receive high
priority in the permitting process,
evidence of this promised change is not
yet apparent. Since it takes time to raise
capital to invest in better treatment
processes and new equipment, the
longer the permit process, the greater
the delays that can be expected in
implementing land disposal restrictions.
An American ritual: watching TV on
Saturday morning. Production of many
of the items in this room generates
hazardous waste. Those items include
the carpeting, paint, wood stain,
television, and clothing, drapery, and
upholstery fabrics.

Siting problems—the Not In My
Backyard, or NIMBY, syndrome—persist
today as strongly as ever. Various
models for an effective siting process for
hazardous waste management facilities
have been proposed, but to date none
has been successful. Community
resistance continues regardless of the
type of facility being proposed. Between
1983 and 1985, for example, 15 different
proposals were made around the
country for development of treatment
and incineration facilities and small
landfills for disposal of treatment and
incineration residue. None of the
proposals succeeded.
But unless the United States returns
to "caveman living standards," there
will always be some volume of
hazardous waste requiring treatment
and disposal; it will not magically
disappear. Each state and community
must be willing to take some
responsibility for its proper
management.
It is relevant to recall the experience
of California. In 1983, California became
the first state to prohibit land disposal
of hazardous waste. But this prohibition
has yet to be implemented. In fact, state
officials recently called for an extension
of the legislative deadline for
implementation. A major reason for the
delay is a lack of alternatives for
managing waste. In California, as in the
rest of the nation, permitting and siting
problems have hampered the
development of facilities capable of
using alternative technologies.
The immediate future is not rosy
regarding restricting certain hazardous
wastes from land disposal, but it
certainly is not hopeless. The ability to
treat hazardous waste using
incineration, and chemical and biological
destruction processes exists. Barriers to
a timely implementation of land
disposal restrictions are primarily
institutional, not technological.
EPA has a very complex and difficult
task before it. It may be necessary for
Congress to assist EPA and industry by
providing more time to promulgate
restrictions and develop alternatives for
land disposal.
Treating Municipal
Wastewater: Tradition
and Innovation
by Carl A. Brunner
A rotating biological contactor is a
cost-effective municipal wastewater
treatment system for smaller plants.
"Technologies" is the name of the
game as municipal wastewater pollution
control moves into its second century.
Although the history of mankind's
efforts to get rid of liquid wastes dates
back to the sewer systems of ancient
Rome, it was not until well into the
nineteenth century that concern over
the detrimental public health impact of
such pollution led to significant
attempts to develop wastewater
treatment methods. Since that time,
there has been a continuing recognition
of new problems and of the need for
new approaches. EPA research has been
an important source of innovative,
cost-effective solutions.
The earliest identified municipal
wastewater problems were the presence
of biologically degradable organic
materials and disease-causing
(pathogenic) microorganisms. In 1913,
for example, outbreaks of typhoid in the
Detroit area were traced to sewage
pouring into Lake Erie, Lake St. Clair,
and the Detroit River which joined
them. The solution then was to relocate
the sewer outlets in relation to drinking
water intakes. Today, wastewater
treatment technology is the answer.
In addition to the health problems
caused by disease-related
microorganisms, scientists became
concerned about biologically degradable
organic materials which caused a
number of problems in the lakes and
streams into which they poured. These
organics reduced dissolved oxygen in
the water to levels that caused
unpleasant tastes and odors and
prevented the existence of fish and
other aquatic plants and animals. These
problems are still important today.
(Dr. Brunner is Chief of the Systems
Engineering Evaluation Branch of the
Wastewater Research Division in EPA's
Water Engineering Research Laboratory in
Cincinnati, Ohio.)
By the 1960s, scientists had expanded
the list of problem substances in the
concern over clean water to include
nutrients—mainly phosphorus and
nitrogen—and refractory, or
treatment-resistant organics. The
nutrients stimulate excess growth of
algae and other plants which not only
cause aesthetic problems but also
"choke" the affected waters by reducing
dissolved-oxygen levels to the point
where fish can not live. The refractory
organics were suspected of including
chemicals that were toxic or otherwise
harmful to humans. Continuing
improvements in analytic techniques in
the 1970s and 1980s reinforced these
concerns.
Added to the problems of pollution
itself is the cost of providing adequate
municipal wastewater treatment.
Progress towards the Clean Water Act
goal of maintaining the "chemical,
physical and biological integrity of the
nation's waters" has already cost
billions of dollars. The technology
required to curb pollution from
household and commercial sewage is
one of the major costs involved. EPA
construction grants presently help
communities pay the price of
state-of-the-art wastewater treatment
plant construction, but escalating energy
charges and other rising expenses keep
increasing the cost of operating the
plants. EPA scientists and engineers
working on the development of
innovative technologies must
continually seek ways of getting more
clean water for the EPA buck.
Responding to the Problem
Many of the wastewater treatment
techniques in use today are modified
forms of approaches developed by early
researchers, who took advantage of the
ability of natural microorganisms to
degrade organic matter when oxygen is
present. Today's activated sludge
process, in which air is blown through
tanks containing the wastewater and
microorganisms, and the process
whereby wastewater is trickled over
beds of high-surface material covered
with microorganisms, are the
"traditional" biological processes
considered conventional for removal of
organics and reduction of pathogenic
microorganisms.
The battle against eutrophication—
the excess fertilization that almost killed
Lake Erie before massive wastewater
treatment intervention—was first fought
by adding iron or aluminum compounds
to the biological treatment processes to
remove phosphorus. Phosphorus was
targeted because it was cheaper to
remove than nitrogen. Most wastewater
treatment plants requiring phosphorus
removal still use this chemical
approach.
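The chemical approach works by simple precipitation: the added metal ion ties up dissolved phosphate as an insoluble solid that settles out with the sludge. Schematically (standard chemistry, not spelled out in the article):

$$\mathrm{Al^{3+}} + \mathrm{PO_4^{3-}} \rightarrow \mathrm{AlPO_4}\!\downarrow, \qquad \mathrm{Fe^{3+}} + \mathrm{PO_4^{3-}} \rightarrow \mathrm{FePO_4}\!\downarrow$$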
Because it was found that some
microorganisms can, under certain
conditions, absorb abnormally large
amounts of phosphorus, there has been
a growing interest in utilizing biological
methods. Some activated sludge
treatment plants have been modified by
adjustments in dissolved oxygen levels
to make possible substantial phosphorus
removal without requiring chemicals.
Ammonia, another nutrient found in
wastewater, is a problem because it
reacts with oxygen and can deplete the
dissolved oxygen in surface waters.
Conventional biological processes have
been modified to allow the ammonia to
be oxidized to nitrate. This method is
already being widely used.
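The overall nitrification reaction these modified processes carry out is, in simplified form (a standard summary, not quoted from the article):

$$\mathrm{NH_4^+} + 2\,\mathrm{O_2} \rightarrow \mathrm{NO_3^-} + 2\,\mathrm{H^+} + \mathrm{H_2O}$$

which also makes clear why untreated ammonia exerts a heavy oxygen demand on receiving waters.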
In the 1970s, there were a number of
new developments in response to the
need for more cost-effective,
energy-effective technology. Use of pure
oxygen rather than air in the
activated sludge process was one. In
new or retrofitted plants, the size of the
needed aeration tanks is halved.
Because of the cost and complexity of
producing the pure oxygen, this process
is better suited to large plants. For
smaller plants, a more appropriate new
system has been the rotating biological
contactor (RBC). This uses a series of
rotating plastic disks covered with
microorganisms, and functions much
like a trickling filter.
Technologies Program
Clean Water Act amendments in 1977
led to the creation of the Innovative and
Alternative (I/A) Technologies program,
which accelerated the pace of
technology development and full-scale
evaluation. The amendments
encouraged use of not fully proven
systems that were potentially beneficial
and cost- and energy-effective. Increased
federal sharing in the costs of
construction and modifications—if the
new technology didn't work
properly—was a major new incentive.
The I/A program has produced
important changes in municipal
wastewater treatment and accelerated
utilization of research results. A very
significant change has been the
widespread adoption of such alternative
technologies as land disposal techniques
for wastewaters and sludges. Such
treatment allows the wastewater to
infiltrate the soil or run over sloped
surfaces. Infiltration can produce a very
high quality water for aquifer recharge
or other uses. Overland flow lets
biological pollutant removal occur at the
ground's surface and produces water
similar in quality to that from
conventional treatment. Such systems
are being used at more than 150 sites
under the I/A program.
More than 200 alternative collection
projects have been funded under I/A,
resulting in combined savings of more
than $100 million to small
communities. The majority of these are
pressure sewers, but there are also
small-diameter gravity sewers which
accept septic-tank effluent, and vacuum
sewers.
The program has also helped
accelerate adoption of energy-saving,
fine-bubble diffuser technology in
activated sludge processing. Replacing
large, or coarse, bubble aeration systems
with fine-bubble units equipped with
individual air flow control devices can
save as much as 50 percent on aeration
energy requirements. Virtually all new
aeration construction is expected to
involve some form of fine-bubble
diffuser.
Disinfection technology, too, has
advanced significantly under the I/A
program. Until recently, chlorination
was virtually the only method in use,
but concern about the appearance of
harmful chlorinated organics in
municipal wastewater led to the quest
for other approaches. Over the last 10
years, EPA has played a major role in
the development of ultraviolet light
disinfection as a low-cost alternative to
chlorination by supporting research and
demonstration projects and funding
facility construction. Ultraviolet
disinfection systems are being utilized
at over 50 wastewater treatment plants.
The treatment and disposal of sludge
left over from biological wastewater
treatment plants can represent as much
as half the total treatment cost. What's
more, the amount of sludge being
treated—currently about seven million
dry tons annually—is steadily
increasing and there are significant
changes in the way it is being handled.
About a quarter of the residual sludge is
being applied to the land as fertilizer
instead of being incinerated or dumped
in landfills. Properly done, the
agricultural application of sludge
reduces the need for chemical
fertilizers. Composting of sludge to
produce a soil enhancer for lawns and
gardens is also increasing. The number
of treatment plants composting sludge
has increased from 10 to 50 in the past
decade.
Although much of EPA's effort in
developing wastewater treatment
technology focuses on communities
with centralized systems, agency
research and development has not
overlooked the 25 percent of our
population not served by sewers. Some
of the new collection methods
developed under I/A are making
practical the use of sewers in many
such areas. Improvements have been
made in the septic tank soil absorption
system. Where even septic tanks are not
practical, research has developed
improved methods of wastewater
disposal such as:
• Mound systems, where the
ground-water level is too high or the
aquifers unprotected.
• Evapotranspiration beds in arid
regions with unsuitable soils.
• Improved sand filter designs which
permit high quality on-site treatment for
direct disposal to surface water where
soils are unfit for infiltration.
• Wastewater segregation and
conservation techniques which can
extend the lives of marginally failing
conventional backyard systems.
Toxics Control
Treatment systems have always
coincidentally removed toxics from
wastewater. Recently, however,
municipal wastewater treatment
objectives have begun to move beyond
control of "traditional" pollutants to
encompass specific individual toxics
and overall biological toxicity caused by
the complex mixture of materials found
in wastewater. This added dimension,
which is evolving as a national policy
for the development of water
quality-based permit limitations for
toxic pollutants, recommends an
integrated approach using both specific
chemical analyses and biomonitoring
with appropriate bioassays.
Pollution control based upon toxics
leads logically to a broad systems
engineering approach in which tradeoffs
between municipal and industrial
treatment to attain water quality goals at
minimum cost must be considered.
Modern, automated process controls
will be a necessary tool in applying this
approach. While such effective
Environmental Technology
Old Disappointments
and New Hope
by James H. Scheuer
In the early 1970s, the relationship
between technology and the
environment seemed simple and
obvious. Bad technologies were the ones
that polluted the environment; good
technologies were the ones that did not
pollute, or ones which cleaned up after
the bad technologies.
The solution to pollution seemed
equally obvious: we ought to set clean
air and clean water standards, and
replace the polluting technologies with
ones that could meet those standards.
This simple faith in the ability of
technology to deliver us painlessly into
a pristine future resulted in provisions
requiring industries to install the "best
available technology" to control
pollution.
We clearly underestimated the
difficulty in cleanup.
That early faith in technology was a
product of the boundless optimism in
our economy that pervaded the nation
in those years. Americans had the sense
that we could solve all our problems by
simply calling on American science and
engineering. After all, as the cliché
went, if we could send a man to the
moon, why couldn't we clean up our
planet? In a time when double-digit
inflation and the Japanese trade
invasion were unimaginable, we had
faith in the ability of American industry
to absorb any necessary costs that might
result from pollution controls. Certainly
it should not be too difficult or
expensive for industry to find a way to
stop spewing pollutants into the air and
water. Certainly the engineering talent
in our automotive industry could come
up with an innovative way to cut
emissions from auto exhausts.

(Scheuer is Chairman of the House Subcommittee on Natural Resources, Agriculture Research and Environment, in the Committee on Science and Technology. He is a Democratic Congressman representing the 8th District of New York.)
Fifteen years and over $500 billion
later, we remain far from the
vision of fishable and swimmable
waters and clean, healthy air that
guided Congress' actions in the early
1970s. We clearly underestimated the
difficulty in cleanup. Deadlines in
several of our environmental statutes
have been reached, extended, and
reached again. Making a bad situation
worse, the very nature of the problems
we are now facing has changed. Today,
we must deal with a veritable "alphabet
soup" of toxic chemicals, heretofore
unheard of by the public.
Our early faith in the bright promise
of technology was not altogether
misplaced. Much of the progress made
in cleaning up our air and water is
indeed due to significant technological
developments and improvements, such
as scrubbers and catalytic converters. It
is also clear that developments in
technology will play an even more
important role in the future in
addressing critical environmental issues
such as acid rain and hazardous waste
contamination.
But plainly our early assumptions
were faulty. Part of the problem
stemmed from our failure to take to
heart the lessons of ecologists that the
environment must be viewed as a total
system.
The distinction between "good"
technology and "bad" technology
rapidly eroded as we began to realize
that, in reality, technology often created
new problems or presented a variety of
trade-offs. We embraced technologies
which cleaned up one part of the
environment at the expense of another.
Tall stacks meant to meet Clean Air Act
requirements improved local air quality,
but contributed to long-range acid
deposition. Treatment technologies used
to remove pollutants from water at
sewage treatment plants cleaned the
water, but released the pollutants
directly back into the air. To clean up
surface waters, hazardous wastes
formerly discharged into water were
stored at dumps, which subsequently
leaked and contaminated ground-water
sources across the country.
Technology also had the confounding
effect of uncovering "new" subtle,
pervasive, and threatening forms of
pollution. Thanks to the rapid
development of analytic technology, we
became able to detect and measure toxic
substances at ever lower levels of
concentration, down to a few parts per
trillion. As detection limits dropped
ever lower, the apparent dimensions of
the toxic contamination problem
grew as well.
Technology also had the
confounding effect of
uncovering "new" subtle,
pervasive, and threatening
forms of pollution.
Ground water, which many scientists
had believed only a few years ago to be
protected from pollution, instead was
found in many instances to be
contaminated with a wide range of
pollutants. The Congressional Office of
Technology Assessment has identified
over 200 substances found in the
nation's ground water. Similarly,
improvements in monitoring devices
helped reveal the potential hazards of
exposure to contaminants, such as
formaldehyde and radon, found in
indoor environments.
Over the last 15 years, our faith in
technology has waned. While
Americans have been quick to embrace
the comforts and protections afforded by
OCTOBER 1985
15
-------
technology, we have nursed a quiet
democratic distrust of elites of any sort,
including the technological elite. As
technology has proliferated over the last
20 years, touching every aspect of our
everyday lives, it is not surprising that
many citizens are increasingly
distrustful and suspicious of technology
which they cannot understand or
control. Cars stuffed with byzantine
emission control equipment can't be
fixed even by the most ardent
do-it-yourselfer; consumers erroneously
dunned by bill-collecting computers
can't find a "real person" to set the
matter straight.
That distrust is not altogether
unwarranted. Compiling a list of
incidents which engineers and scientists
said could not happen is not a difficult
task.
Three Mile Island was an accident
which engineers performing elaborate
risk analyses concluded was statistically
impossible. Yet despite numerous
mechanisms designed to ensure the
safety of the plant, recent investigations
show that the core of the Three Mile
Island reactor came perilously close to a
meltdown, closer than any scientist or
engineer thought even theoretically
possible at the time of the accident.
Similarly, for decades, scientists
believed that ethylene dibromide (EDB)
was safe to use as a pesticide because it
was volatile and would leave no residue
on food. Only in the late 1970s did
researchers discover that EDB did not in
fact dissipate, and that large quantities
of flour and other grain were
contaminated. The discovery that EDB
could also leach into ground water came
as a further surprise a few years later.
More recently, a sophisticated
computerized system installed at Union
Carbide's Institute, W. Va., plant to
warn the community in the event of a
toxic chemical release failed to work
when toxic methylene chloride was
accidentally released at the plant, despite
the high priority given to the system by
the company in the wake of the Bhopal
disaster.
It is not astonishing that the public is
skeptical of statements from scientists
working in the area of biotechnology
that the development of new
genetically engineered organisms poses
no threat to the environment or human
health.
While engineers and scientists are
loath to admit it, the fact is that they
have no special exemption from
Murphy's Law. Yet the failures
mentioned above, and numerous others,
have not stemmed from a failure of
technology per se. By and large, the
tools, machines, and computers—the
hardware, if you will—have behaved as
they were designed or programmed to
do.
Rather, the failures have often come
from human error, such as the case of
Three Mile Island, or from a tendency of
scientists and engineers to
underestimate the complexity of the
natural ecosystems within which they
are working. Perhaps part of this
tendency is caused by the nature of the
scientific method, which requires the
careful observation of a few variables at
a time under controlled conditions.
Nature, however, is rarely content with
the simplicity of laboratory conditions,
and scientists and engineers sometimes
fail to appreciate potential interactions
or uncertainties about the particular
ecosystem involved.
And yet it is precisely technology
itself which holds out the best hope for
our ability to understand and analyze
the environment from an integrated and
holistic view. Remarkable developments
in satellite remote sensing, for example,
can help build a much better
understanding of the interplay of
environmental factors over large regions
of the earth. The truly global nature of
many environmental problems—and the
global nature of any solution—is being
documented by this new tool. At the
other extreme of the scale, sophisticated
new research tools make it possible for
us to measure and manipulate biological
and ecological activities at the cellular
level, helping us to understand the basic
processes which must be known and
understood to create an effective and
total view of the environment.
The Three Mile Island Nuclear Station, located on the Susquehanna River south of Harrisburg, Pa. According to Congressman Scheuer, the 1979 accident at Three Mile Island, an accident that supposedly was "statistically impossible," is one reason why Americans have become "distrustful and suspicious of technology."

Finally, and perhaps most
importantly, the development of new
generations of supercomputers creates
the possibility of building complex
models of the environment which can
integrate the new wealth of information
and begin, for the first time, to
approximate nature's own complexity.
Such developments will be necessary
if we are to solve the great global
environmental issues—acid rain,
desertification, deforestation, loss of
species, and the depletion of natural
resources—that will surely demand our
attention well into the next century.
In recent years, Congress' view of
technology and its ability to bring us
effortlessly to a pristine environment
has been tempered with a more accurate
vision of the limits and perils of
technology. We certainly know now that
technology by itself will not solve our
environmental problems; what is
perhaps most needed is an improvement
in our human vision of the
interrelationship of the vast array of
component elements which make up
our environment. But it is equally
evident that without technology, even
inspired human vision will be unlikely
to create the cleaner and healthier
environment we all seek.
16
EPA JOURNAL
-------
The Blue Goose Flies!
by Susan Tejada
In the January-February 1985 issue of
the EPA Journal, Rowena Michaels,
Public Affairs Director of EPA Region 7,
reported on an upcoming experiment in
southwest Missouri. In a field test in
Barry County, Mo., EPA's mobile
incinerator, known as the "Blue Goose,"
would be used to burn some of the
dioxin-contaminated waste that had
been spread years earlier at more than
40 locations throughout that part of the
state.
"Upon successful completion of the
project," Michaels wrote, "we will have
demonstrated that no harmful
contaminants entered the environment
by any route from the process . . . We
will have successfully, safely destroyed
dioxin."
Agency officials declared the
tests were a "major
breakthrough" in efforts to
find solutions to the dioxin
problem.
This summer, EPA announced that
the trial burns had succeeded. Final
results showed that the system
destroyed or removed 99.9999
percent of wastes contaminated with
2,3,7,8-TCDD, the most toxic form of
dioxin, and that there were no
detectable traces of 2,3,7,8-TCDD in the
system's flue gas emissions, kiln ash, or
scrubber-water effluent streams.
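The 99.9999 percent figure is the familiar "six nines" destruction and removal efficiency, which compares the mass of 2,3,7,8-TCDD fed to the system with the mass escaping in its emissions. A minimal sketch of that calculation, using hypothetical masses rather than the trial-burn data:

```python
def destruction_removal_efficiency(mass_in_g, mass_out_g):
    """Destruction and removal efficiency, expressed as a percent.

    mass_in_g  -- mass of the contaminant fed to the incinerator
    mass_out_g -- mass of the contaminant detected in the emissions
    """
    return 100.0 * (mass_in_g - mass_out_g) / mass_in_g

# Hypothetical example: 1 kg of contaminant fed, 1 mg escaping,
# gives the "six nines" performance reported for the trial burns.
print(destruction_removal_efficiency(1_000.0, 0.001))  # 99.9999
```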
Michaels' prediction had come true.
The Blue Goose had worked. Agency
officials declared the tests were a "major
breakthrough" in efforts to find
solutions to the dioxin problem.

(Tejada is Associate Editor of the EPA Journal.)
The mobile incinerator was conceived
in 1976 by the EPA Office of Research
and Development, Hazardous
Waste Engineering Research Laboratory,
in Edison, N.J., and built by
outside contractors. Its effectiveness had
been documented in a series of trial
burns with fuel oil, iron oxide, carbon
tetrachloride, o-dichlorobenzene, and
PCBs in 1982 and 1983, and at the end
of 1984 it was moved for dioxin testing
to Missouri.
Four dioxin trial burns took place in
the state between February and April
1985. EPA's Environmental Monitoring
System Laboratory in Las Vegas
compiled results of the tests from more
than 15,000 pages of analytical data.
The incinerator processed 1,750
gallons of liquids and more than 40 tons
of soil contaminated with dioxin,
destroying a total of 3.84 pounds of
2,3,7,8-TCDD. Emissions from the
system met and exceeded all federal and
state requirements for the incineration
of the dioxin-contaminated material.
Because the mobile incinerator
burned the 2,3,7,8-TCDD so completely,
EPA has proposed to de-list, or
designate as not hazardous, the residues
from future burns in the incinerator.
Despite the incinerator's success, use
of the system is not practical at certain
sites such as Times Beach, Mo., where
dioxin contaminated an estimated
400,000 tons of soil. Since the unit was
developed for mobility, its capacity is
restricted by design requirements that
had to satisfy "over-the-road"
limitations. Generally, the unit can
process up to one ton of contaminated
solids per hour and up to one gallon of
contaminated liquids per minute.
Although not suited for Times Beach,
the Blue Goose can be used at other
sites around the country with more
limited amounts of dioxin
contamination. The unit cuts waste
transportation and storage costs and,
since it operates on-site, it eliminates
the possibility of accidental spills
occurring in transit to a landfill.
Generally, the unit can
process up to one ton of
contaminated solids per hour
and up to one gallon of
contaminated liquids per
minute.
In an article on the development of
the incinerator that appeared in 1982,
the EPA Journal stated: "The mobile
incinerator was the first of its
kind . . . EPA is counting on the
ingenuity of American industry to
produce future generations of this
technology." In fact, a number of major
waste management companies have now
indicated that they are interested in
building similar units.
Scientific data, reports, and permit
materials relating to the mobile
incinerator are available on request from
James Yezzi, U.S. EPA, Releases Control
Branch, Woodbridge Avenue, Edison,
N.J. 08837.
OCTOBER 1985
-------
Lasers Help Unravel
Air Pollution Mysteries
by Donald T. Wruble
Lidar "slice" of the atmosphere above a Southern California mountain range. Darker portions show areas of greater concentration.
smoke, and gas molecules absorb and
scatter the light. A portion of the light
pulse is scattered directly back to a
telescope. The telescope, pointed along
the laser light beam path, focuses the
scattered light on electronic detectors.
The detectors convert the scattering
intensity into a "picture" of the air
contaminants the beam has
encountered.
Various government, university, and
private groups around the world are
using ground-based lidar to conduct air
pollution research. EPA is one of the
few organizations to use airborne lidar.
(Only two other groups in the United
States employ such systems. SRI
International operates an airborne
system for various public and private
organizations, including the Electric
Power Resources Institute. The National
Aeronautics and Space Administration
(NASA) conducts research to develop
lidar systems for global air monitoring
from satellites.)
The EPA airborne lidars are dedicated
to research studies to help EPA and the
states understand the sources, chemical
and physical transformation, and
transport processes of air pollutants.
The research data are used to develop
pollution-control plans as well as add to
our understanding of air pollution.

Curt Edmonds, EPA computer electronics engineer, examines video display of airborne lidar collection technique.
During operation of the EPA lidar,
every second or two a laser emits
extremely short pulses of light (lasting
17 billionths of a second) toward the
ground through a hole cut in an
airplane floor. The beam spreads out as
it travels downward to ensure that the
laser light energy at ground level will
not cause eye damage to an observer
who might be looking straight up at that
instant.
As each short pulse of the laser beam
light travels downward, the amount of
particulates or gases the light pulse
encounters affects the degree of
absorption or scatter. When the beam
strikes the ground, an even greater
portion is reflected back to the aircraft.
The light that travels back to the aircraft
18
EPA JOURNAL
-------
is "collected" through the telescope, and
the beam intensity and time-for-return
are measured continuously with a
complex ultra-high speed electronics
system.
Since light travels at about 186,000
miles per second, super-fast electronics
and onboard computers are required to
measure how long each portion of the
beam traveled outside the aircraft and
thus to determine how far from the
aircraft a given group of scattering
particles was encountered by the beam. This system
was developed by EPA engineers and
computer scientists to enable handling
the immense amount of electronic data
produced at "the speed of light." In the
airplane cabin, the electronic results of
each laser pulse are displayed
side-by-side on a video screen. The
operators see a "slice" of atmosphere
showing lighter and darker areas that
portray particle or gas concentration and
distribution below the aircraft.
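Because each pulse must travel out and back, the distance to a group of scatterers is simply half the round-trip time multiplied by the speed of light. A minimal sketch of that conversion (the example return time is illustrative, not taken from the EPA system):

```python
SPEED_OF_LIGHT_M_PER_S = 299_792_458.0  # about 186,000 miles per second

def range_from_return_time(elapsed_s):
    """Distance from the aircraft to the scatterers that produced a
    return arriving elapsed_s seconds after the pulse was fired.
    The division by two accounts for the round trip."""
    return SPEED_OF_LIGHT_M_PER_S * elapsed_s / 2.0

# A return arriving 10 microseconds after the pulse leaves corresponds
# to scatterers roughly 1.5 kilometers below the aircraft.
print(range_from_return_time(10e-6))  # about 1,499 meters
```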
Without this system, an armada of
airplanes flying simultaneously at
various altitudes, collecting air samples
one after another, could only begin to
gather a comparable number of air
samples to describe the aerosol
distribution, and years of laboratory
analysis would then be required to
determine the sample constituents. With
the airborne lidar, all this is
accomplished within a few
microseconds. Multiple flight paths
across an urban or regional area can
produce data that describe the
transport and transformation of particles
and gases found in that atmosphere.
Importantly, the data provide an
extensive three-dimensional picture of
an air mass, not just an estimate such as
would be gathered with only ground
sampling stations and weather
information. The "slices" can be studied
either visually or with much greater
sophistication using computer analysis
techniques. In this manner, EPA
scientists develop detailed mathematical
models of air pollutant transport over
mountains, cities, and countryside, or
assist state air pollution officials in air
pollution control planning.
Research studies in 1985 include:
• Mapping smoke plume from burning
rice fields in the Sacramento Valley of
California. After harvest, the rice fields
are burned to prepare them for the next
crop. This burn-off can create extensive
smoke through a large area.
Understanding smoke transport paths in
different weather conditions enables
farmers to plan burning periods during
weather that will carry most of the
smoke away from residents in the area.
• Mapping particulates associated with
photochemical oxidants in air over
Ventura and Santa Barbara counties in
California. Lidar mapping allows
assessment of the contribution of
hydrocarbon gases from offshore oil
drilling to onshore air pollution.
Understanding the amount of pollution
from this source versus pollutants from
automobile exhausts is important for air
pollution control planning.
• Mapping transport and diffusion of
airborne particles in the Midwest
(Kentucky, Ohio, Indiana). Tracking
these manmade particles helps EPA
meteorologists develop mathematical
models describing movement of air
contaminants that may combine with
moisture to create acid rain.
In order to study acid rain origins
from air pollutants, scientists at the Las
Vegas laboratory are working on the
next generation lidar, called an
ultraviolet differential absorption lidar.
This lidar will use a new technology
laser, called an excimer laser. NASA
scientists are cooperating in this
development which will enable
simultaneous measurements of ozone
and sulfur dioxide, as well as
particulate aerosols in the atmosphere.
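In a differential absorption lidar, returns are compared at two closely spaced wavelengths, one strongly absorbed by the gas of interest and one only weakly absorbed; the gas concentration over a range interval follows from how much faster the absorbed return falls off. A minimal sketch of the standard two-wavelength retrieval, using placeholder numbers rather than real instrument values:

```python
import math

def dial_concentration(p_on_near, p_on_far, p_off_near, p_off_far,
                       delta_sigma_m2, delta_range_m):
    """Average number density (molecules per cubic meter) of the target
    gas between two range bins, from the classic DIAL ratio of returned
    powers at the absorbed ('on') and reference ('off') wavelengths.

    delta_sigma_m2 -- difference in absorption cross-section between
                      the two wavelengths (placeholder value below)
    delta_range_m  -- separation between the near and far range bins
    """
    ratio = (p_off_far * p_on_near) / (p_off_near * p_on_far)
    return math.log(ratio) / (2.0 * delta_sigma_m2 * delta_range_m)

# Illustrative numbers only: returned powers are in arbitrary units.
print(dial_concentration(1.00, 0.70, 1.00, 0.80,
                         delta_sigma_m2=1e-22, delta_range_m=500.0))
```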
Dr. Jim McElroy, who heads the
laboratory's lidar research program, is
enthusiastic about the use of lidar to
help answer many of the research
questions about air masses producing
acid deposition. "We have an
opportunity to capitalize on a space
age technology development and apply
it to an environmental problem that can
affect us all. Airborne lidar can help us
locate sources of air pollutants leading
to acid rain, help us follow pollutant
paths across the country, and help us
understand the air flow patterns that
determine areas of lakes and forests that
are affected."
We have an opportunity to
capitalize on a space age
technology development and
apply it to an environmental
problem that can affect us all.
Airborne lasers can be used to test
lake waters for changes resulting from
acid rain. Dr. Mike Bristow, an optical
physicist at the Las Vegas laboratory,
has pioneered development of an
airborne laser fluorosensor. This system
uses different laser wavelengths that
create fluorescent light when the beam
strikes dissolved matter in water in
lakes below the aircraft. This fluorescent
light travels back to the aircraft to be
electronically processed in a similar
fashion to that of the lidar. As the
aircraft flies several paths across the
lake, the distribution of dissolved
organic material across the lake is
mapped. These and other data are
computerized to assess acidity changes.
Dr. Bristow is working with Cornell
University scientists who have made
recent discoveries in spectral analysis of
laser fluorescence using laboratory
water samples. By marrying this
technique to EPA's airborne laser
fluorosensor, Dr. Bristow hopes to equip
EPA with an advanced technology
system for monitoring acid rain impact
on lakes across the United States.
OCTOBER 1985
19
-------
Using Computers to
Isolate Pollution Causes
by Karen Randolph
About 20 years ago, people began
noticing a deterioration of
vegetation in the path of emissions of
pollutants such as sulfur dioxide,
nitrogen dioxide, and ozone. But these
emissions were only a few among scores
of variables that could be causing the
damage. How to isolate the real
culprits?
At EPA's Environmental Research
Laboratory in Corvallis, Ore.,
answering this question is a major
priority. Four years ago, the scientists
there set out to design a system that
would enable them to study plant
responses under realistic air quality
conditions. The result of their efforts is
not only a sophisticated facility of plant
growth chambers, greenhouses, and
outdoor exposure chambers, but also a
state-of-the-art computerized process
control technology.
Thanks to their computer, the
Corvallis scientists can go beyond the
constraints of the 9 to 5, Monday to
Friday work week to achieve continuous
control and monitoring of
environmental conditions in exposure
chambers. Where once it took years of
tedious observation to accumulate data
under different growing conditions, they
can now experiment with several
variables simultaneously. And because
the computer can dispense and control
hourly concentrations of pollutants so
precisely, scientists can also enter actual
ambient site data from specific
geographic regions to reproduce
real-world conditions in the exposure
chambers.
In studying plant response to ozone,
for example, scientists use ozone data
obtained from actual sites. The
computer is programmed to replicate
the hourly ozone concentrations over 30
days. Sample air lines feed back into
monitors, providing continuous reading
and adjustment of pollutant
concentrations delivered to each
chamber. Sensors in the chambers also
feed back data on light, air, soil
temperatures, and relative humidity.
These data are displayed so that
operators can monitor each chamber at a
glance. Every 24 hours, the collected
data are transmitted to the laboratory's
mainframe computer and printed out.

(Randolph is Technical Information Manager at EPA's Environmental Research Laboratory in Corvallis, Ore.)
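In outline, the chamber control loop replays an hourly ambient record and nudges the ozone dispenser toward each hour's target using the reading fed back through the sample line. A minimal sketch of that idea; the proportional gain, the stand-in chamber model, and the hourly values are illustrative assumptions, not the Corvallis system's actual parameters:

```python
def replay_hourly_ozone(hourly_targets_ppb, read_monitor, set_dispenser, gain=0.5):
    """Replay an hourly ambient ozone record, adjusting the dispenser
    toward each hour's target with simple proportional feedback from
    the chamber's sample-line monitor."""
    setting = 0.0
    for target_ppb in hourly_targets_ppb:
        error = target_ppb - read_monitor()      # monitor feedback
        setting = max(setting + gain * error, 0.0)
        set_dispenser(setting)

# Illustrative stand-ins for the chamber hardware.
chamber = {"ozone_ppb": 0.0}
replay_hourly_ozone(
    hourly_targets_ppb=[20, 35, 60, 80, 55, 30],          # made-up ambient record
    read_monitor=lambda: chamber["ozone_ppb"],
    set_dispenser=lambda s: chamber.update(ozone_ppb=s),  # crude chamber model
)
```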
During the first year of operation, the
scientists looked at the effects of ozone
on two important forage crops: alfalfa
and tall fescue. Using exposure
schedules based on air-quality data from
a midwestern hay-producing state, they
tested response under two conditions.
One delivered ozone in varying peaks,
frequencies, and durations; the other
provided it in a consistent pattern.
Throughout the growing season, plants
in both groups were exposed to equal
amounts of ozone.
In both groups, alfalfa growth
decreased as ozone levels increased. But
varying ozone concentrations reduced
growth more than did consistent
concentrations. Tall fescue, however,
was only slightly affected under both
conditions. Experiments last year with
timothy hay showed the same response
as the alfalfa studies: reduced growth
associated with varying ozone levels.
Another experiment involves the
These field exposure chambers at EPA's lab in Corvallis, Ore., are designed to mimic natural conditions.
A Compact to Track Down
Waste Dumping Cheaters
by David Pickman
The greatest obstacles in the path of
safe and economical waste
management are the "waste cheats" who
dump hazardous waste in city sewers or
in convenient swamps, eventually
poisoning water supplies. Some cheats are
deliberate, but many are cheating from
ignorance. According to EPA Regional
Administrator Michael R. Deland,
"Relentless tightening of regulations and
computer storage of information by state
governments are closing in on the
deliberate cheats, and education is
bringing the others into the system. In
New England, it's a regional effort."
The Resource Conservation and
Recovery Act (RCRA) was designed to
provide cradle-to-grave security for
hazardous waste. As most recently
amended, it requires manifesting of
waste shipments by all generators of 100
kilograms (220 pounds) or more of
hazardous waste per month. Shipments
must be accompanied by a manifest
giving the name, address, and EPA
identification number of the generator,
transporter, and receiving facility. The
law requires the generator to report to
the state or EPA if the signed manifest is
not returned within 45 days.
The six New England states have gone
a step further in a regional compact
designed both to track down the waste
cheat and to give waste managers in
government and industry the data base
they need to plan the future. When the
final RCRA hazardous waste regulations
were just beginning to take effect in
1981, the New England Regional
Commission (later absorbed by the New
England Governors Conference) devised
a uniform manifest for the six states and
their western neighbor, New York. The
system was in place in 1982 and needed
only minor adjustment when EPA came
up with its uniform manifest in
September 1984.
(Pickman is on the staff of the Office of
Public Affairs of EPA Region I.)
The idea of a single computer and
seven terminals was seriously proposed,
discussed and debated, and eventually
rejected for lack of agreement on where
the computer would reside. The five
computer systems now in existence are
compatible, although operated
independently by Connecticut,
Massachusetts, New Hampshire, New
York and Maine (which runs a
three-state system for itself, Rhode
Island, and Vermont). The three states
in the Maine system comprise about 20
percent of the New England population
of 12 million.
"When the New England states began
to adopt programs so as to receive
RCRA authorization, they recognized
that this region was unique in its heavy
reliance on out-of-region disposal
facilities," says Mel Hohman, Director of
Region 1's Waste Management Division.
"The fact that a large portion of the
waste is transported long distances and
through multiple state jurisdictions led
our states to focus on the need for
tighter controls. Hence, a load-by-load
tracking system was developed."
Unlike the national regulations, the
state rules require that manifest copies
be sent to the states when waste is
shipped and again when it is received at
the licensed facility. If waste is shipped
first to storage and then to permanent
disposal, the state gets two copies of
each of two manifests.
Manifest forms are numbered so that
the copies are easily matched and
disappearance or destruction of a
manifest is easily detected by the
computers. If the receiver's copy does
not join the generator's copy in a
reasonable time, the computer "sends"
an inspector to investigate.
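The check the state computers perform amounts to a join on the manifest number, with any generator copy that has waited too long for its matching receiver copy flagged for follow-up. A minimal sketch of that matching, with illustrative field names and a 45-day window standing in for whatever deadlines the state rules actually set:

```python
from datetime import date, timedelta

def overdue_manifests(generator_copies, receiver_copies, today, window_days=45):
    """Return manifest numbers whose receiver copy has not arrived
    within the allowed window after the shipment date.

    generator_copies -- {manifest_no: shipment_date}
    receiver_copies  -- set of manifest numbers received back
    """
    deadline = timedelta(days=window_days)
    return [no for no, shipped in generator_copies.items()
            if no not in receiver_copies and today - shipped > deadline]

# Illustrative data: manifest "1002" should trigger an inspection.
shipped = {"1001": date(1985, 7, 1), "1002": date(1985, 6, 1)}
received = {"1001"}
print(overdue_manifests(shipped, received, today=date(1985, 8, 30)))  # ['1002']
```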
Officials report sharp increases in
reported "deficiencies" since the
computerized system went into effect.
Previously, many generators simply
forgot about shipments once they were
off the premises, and there was no
check on transporters who might dump
OCTOBER 1985
21
-------
The "milk run" pickups cost
about 20 percent of what a dry
cleaner has to pay for
independent waste hauling.
At a dry cleaning shop, a driver for the Safety Kleen Corp. picks up waste fluids for recycling.
-------
States believe they can steer
the largest quantity of waste
into safe disposal by a
judicious blend of education,
hand holding, and
enforcement.
under federal regulation.
The states are not relying exclusively
on enforcement at this stage. The goal
is to educate dry cleaners (chlorinated
hydrocarbons), auto body shops (paint
removers and solvents), paint shops
(paint waste), and other small
enterprises. Printed instructions on
waste identification, the use of the
manifest, and the penalties for
mismanagement are being mailed to
thousands of small businesses. Seminars
are planned. Slide tapes will be
produced and circulated to trade
associations. EPA has provided about
$300,000 in grants to states for outreach
to small generators. EPA's Hohman
warned against underplaying
enforcement. "It is the responsibility of
industry large and small to comply with
the regulations. If they cannot, they
should ask EPA or the state for advice
and direction. Non-compliers will be
identified and enforcement action will
be taken."
Trade associations have taken steps to
bring their members into compliance.
Northeast Fabricare Association
organized a New England-New York
"milk run" for dry cleaners. Safety
Kleen Corporation of Elgin, Ill., picks up
the waste cleaning fluids, helps the
generators fill out manifests, hauls the
waste to a recycling plant, and delivers
the recycled product to many of the
same customers. The "milk run"
pickups cost about 20 percent of what a
dry cleaner has to pay for independent
waste hauling.
Only about 60 percent of the New
England-New York dry cleaners are on
the "milk run," according to Fabricare,
because the minimum pickup has been
three 55-gallon drums a year. Safety
Kleen has now agreed to include all
interested cleaning shops, and
participation is expected to rise sharply.
Safety Kleen is also working with the
states to bring more auto body shops
into the system. Like dry cleaners, auto
body shops produce fairly uniform and
consistent waste which, if properly
managed, can be recycled with a
minimum of waste analysis and
separation of incompatible substances.
As with the dry cleaners, the object is to
help with the paper work and cut costs
for the proprietors.
The states have stressed education
—the helpful approach—rather than
total reliance on enforcement.
Flagrant violations have been
penalized, sometimes with fines of
more than $20,000, but many violations
are routine and result in notices of
violation in which the generators are
instructed in correct internal procedures
to avoid future slip-ups. Every effort is
made to communicate with generators
through their trade associations and
trade publications. The states agree that
this will yield more returns at this stage
than all-out enforcement against routine
violations.
Enforcement is time-consuming. Even
with administrative penalties that
bypass court action, there is a lot of
paper work. States believe they can
steer the largest quantity of waste into
safe disposal by a judicious blend of
education, hand holding, and
enforcement.
Russell Sylva, Commissioner of the
Massachusetts Department of
Environmental Quality Engineering, told
an inquiring State Senate committee
that waste load tracking was not
primarily an enforcement tool, but was
being used to "identify patterns of
hazardous waste management." He
added that compliance was at a high
level. "We are finally seeing most of the
large quantity generators come into
compliance" and "we are now able to
focus more resources on small quantity
generators."
EPA is also interested in studying
these patterns of hazardous waste
management. Given the relatively strong
data base in Region 1, the agency's
Integrated Environmental Management
Division (IEMD) is running a pilot
project there. Regional data have been
entered into a computerized model
developed by EPA's Office of Policy
Analysis. The purpose is to correlate
existing information on waste volumes,
constituents, transportation routes,
waste management facilities, exposure
probabilities, populations, and potential
health and environmental effects. IEMD
plans to work with state, local, and
industry representatives to formulate
waste management strategies for
analysis by the model.
The model will calculate costs for
each potential strategy and indicate
each strategy's relative impact on
regional waste management systems.
The states are particularly interested in
the model's use in demonstrating the
need for waste management facilities.
The region has no integrated waste
treatment, storage, and disposal facility
and depends heavily on out-of-region
facilities. It is hoped that the pilot
project will help the states and the New
England Congressional Institute develop
a truly regional waste management
system with the right balance of waste
reduction by industry, solvent recovery,
incineration, chemical and physical
treatment and, last and preferably least,
land disposal.
Whatever success is achieved will be
dependent on the region's wasteload
tracking systems and the detailed
information the states provide on the
intricacies of the problem and the
viability of various management
strategies. The presence of an adequate
waste management system in the region
would also reduce the cost to the
individual generator and, with it, the
temptation to cheat.
OCTOBER 1985
-------
Speeding Water Cleanup
While Saving Money
by John Jaksch
and Diane Niedzialkowski
Water pours over an overflow structure.
-------
to zero by complex and expensive
methods, nonpoint phosphorus
discharges from development activities
would cause continued algae growth.
Control of nonpoint sources was
necessary to avoid a sewer tap
moratorium that would effectively
freeze growth and severely restrict
Summit County's booming economy.
Faced with a potential crisis, the
Colorado Water Quality Control
Commission asked local agencies to
help develop a comprehensive
management plan for addressing
phosphorus pollution in the Dillon
Reservoir Basin. The Northwest
Colorado Council of Governments
became the lead agency for what was
known as the "Phosphorus Club." This
consisted of representatives from the
state, county, surrounding
municipalities, environmental groups,
local industry, and other parties with a
significant stake in Dillon's water
quality. The Club developed a
consensus approach which took
point source pollution into account, but
fundamentally relied on systematic
nonpoint source control to achieve
water quality goals.
Several factors helped this
multi-government trading approach
develop and coalesce at Dillon. There
were sufficient water quality data to
evaluate the effects of various nonpoint
source control strategies. All interested
parties had continuing input. And
effective, low-cost nonpoint source
controls were available.
Previously, EPA's National Urban
Runoff Project and other studies had
indicated that low-technology "best
management practices," such as settling
ponds and percolation pits, could
remove large amounts of phosphorus
from urban runoff with far less cost,
energy use, and sludge generation than
advanced point source treatment. But
these results had not been widely tested
under real world conditions.
In 1982, the Northwest Colorado
Council of Governments asked EPA to
help fund and evaluate a pilot control
facility at Dillon. At the pilot facility,
urban runoff from an 81-acre watershed
was collected in a plastic-lined basin,
which overflowed into a settling pond.
In eight major runoff events, this facility
removed 68 percent of incoming
phosphorus, at a cost of only $67 per
pound removed. Available
treatment plant improvements would
have cost from $824 to nearly $8,000 for
each pound of phosphorus removed.
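The comparison behind those figures is simply cost divided by the pounds of phosphorus kept out of the reservoir. A minimal sketch with hypothetical inputs, not the actual Dillon pilot accounting:

```python
def cost_per_pound_removed(total_cost_dollars, influent_pounds, removal_fraction):
    """Dollars spent per pound of phosphorus removed."""
    return total_cost_dollars / (influent_pounds * removal_fraction)

# Hypothetical numbers: a basin that receives 150 lb of incoming
# phosphorus per season and removes 68 percent of it.
print(cost_per_pound_removed(6_800.0, 150.0, 0.68))  # about $67 per pound
```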
The trading system ultimately
developed at Dillon requires that
existing nonpoint sources be controlled
while phosphorus from future nonpoint
sources is minimized through
state-of-the-art controls. This allows for
point source (and municipal) growth in
the future, through compensating
nonpoint source control. Dillon's
phosphorus control strategy has five
major elements:
• 1982 levels of phosphorus were set as
the water quality target for Dillon. Each
municipal sewage treatment plant was
given a share of the available load,
providing a "growth margin" through
1990.
• In addition to installing state-of-
the-art phosphorus controls, new
developments must contribute to a
Nonpoint Source Facilities Investment
Fund, which will be used to construct
controls for pre-1984 nonpoint sources
and help finance administration of the
trading program;
• A "trading ratio" of 2:1 was
established to assure environmental
progress. For each pound of phosphorus
a treatment plant is allowed to
discharge above 1982 levels, two
pounds of phosphorus must be removed
from a nonpoint source existing before
1984;
• Both point and nonpoint dischargers
receive Clean Water Act permits which
define their phosphorus limits and their
responsibilities for maintaining
nonpoint source control devices. Failure
to operate and maintain the devices will
result in direct federal or state
enforcement action.
• The Summit County Water Quality
Committee was established to monitor
the trading program and provide
long-term water quality management.
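The arithmetic of the 2:1 ratio noted in the list above is straightforward: any increment a plant seeks above its 1982 baseline must be matched by twice that amount of documented nonpoint removal. A minimal sketch of that check, offered as a hypothetical illustration rather than the permit language:

```python
def nonpoint_offset_required(point_increase_lb, trading_ratio=2.0):
    """Pounds of pre-1984 nonpoint phosphorus that must be removed to
    cover a point-source discharge increase above 1982 levels."""
    return trading_ratio * point_increase_lb

def trade_is_adequate(point_increase_lb, nonpoint_removed_lb, trading_ratio=2.0):
    """True if the documented nonpoint removal covers the increase."""
    return nonpoint_removed_lb >= nonpoint_offset_required(point_increase_lb,
                                                           trading_ratio)

# A plant seeking 50 lb/yr above its 1982 share needs 100 lb/yr of
# nonpoint removal; 80 lb/yr would not be enough.
print(nonpoint_offset_required(50))   # 100.0
print(trade_is_adequate(50, 80))      # False
```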
The State of Colorado held public
hearings on Dillon's proposed trading
plan in May, 1984, and formally
approved the plan in June, 1984. With
approval by EPA Region 8 the next
month, Dillon Reservoir became the first
operating point/nonpoint source trading
system in the United States.
Can this trading approach be used for
other locations and types of nonpoint
source pollution? Is Dillon unique? The
quality of virtually all lakes is
controlled by a delicate balance of
nutrients such as phosphorus. Many
coastal rivers and bays are also affected
by phosphorus pollution. Trading offers
potential control of nutrient pollution
on all such water bodies, in ways which
are non-intrusive, save tax dollars, and
allow regulatory programs to operate
more smoothly.
While trading shows great promise,
Dillon left several questions
unanswered. For example, will trading
work on free-flowing streams or
estuaries rather than lakes or bays, or
for other nonpoint sources such as
farms? Can the frameworks developed
at Dillon be adapted to other locations?
Will other types of nonpoint source
control prove as cost-effective?
EPA headquarters, together with EPA
Region 3 and the states of Pennsylvania,
Maryland, and Virginia, is currently
examining application of "Dillon type"
approaches to the Chesapeake Bay,
which has a major nonpoint source
phosphorus problem due to agricultural
and other activity in its stream
drainages. In addition, EPA is looking
for other sites where trading may apply,
and which can also help answer these
questions.
OCTOBER 1985
25
-------
How Chemicals
Can Cause Cancer
by Ronald W. Hart
and Angelo Turturro
That the average lifespan of humans
has increased, there can be no doubt.
Even in our lifetime, the improvement
of public sanitation, reduction in famine
(current conditions in Africa
notwithstanding), and control of
infectious diseases have contributed to
an increase in average longevity
worldwide. There is little doubt that our
increased, and relatively newfound,
scientific knowledge of the genetic
differences in, and the biochemical
complexities of man also have played
an important role.
We all make daily contact
with a considerable number
and variety of chemicals,
many of which are known
carcinogens.
Attendant to this newfound
knowledge is an increasing awareness
and concern about the possible adverse
effects of our exposure to
chemicals—adverse effects like the
production of cancer, or carcinogenesis.
No matter who we are, no matter
what our station in life, no matter where
we live, we all make daily contact with
a considerable number and variety of
chemicals, many of which are known
carcinogens, others only suspected as
being carcinogenic. Some occur
naturally in the food we eat, some are a
result of our lifestyles, some are
manmade products which have benefits
desired by members of our society, and
some represent naturally produced
compounds found in molds, fungus, and
other plants and animal species.
Educated estimates indicate that there
now may be more than 65,000 manmade
chemicals in everyday use. Our food
contains tens of thousands of natural, or
"wild," chemicals which help comprise
the flavor, aroma, or nutritional value of
our daily fare.
(Dr. Hart is Director of the National Center for Toxicological Research.)
-------
-------
Vigilance in the Deep Sea Environment
by Margherita Pryor
This summer, EPA's Office of
Radiation Programs (ORP) led a team
of scientists on a cruise from California
to some Pacific islands. Idyllic? Not this
trip. Their port of call was almost a mile
under water at a disposal site about 40
miles southwest of San Francisco . . .
pitch dark, icy cold, subject to
pressures of thousands of pounds per
square inch, and, incidentally, home to
the largest congregation of Great White
sharks in U.S. waters.
The bleak Farallon Islands area is a
bird refuge. Until 1954, it was also
an ocean disposal site for low-level
radioactive waste, and scattered over the
sea floor are about 3500 steel drums,
dating back over 30 years.
Since 1974, ORP has monitored this
disposal area on several occasions. Last
June, ORP again directed an underwater
survey of the site. Using a U.S. Navy
state-of-the art deep submergence vessel
and a highly sophisticated satellite
navigation system, 11 specialists
descended 900 meters (3,387 ft.) to
observe conditions and take samples of
sediment and marine organisms.
Did they find two-headed fish?
Glow-in-the-dark tube worms?
No way, according to Bob Dyer of
ORP, head of the scientific
expedition. As in previous surveys at
the Islands, there appeared to be no
adverse impacts to either man or the
marine environment from these early
disposal operations; the surveyors found
the normal complement of marine
organisms at those depths and
conditions. They also found, with the
help of direct observations from the
submersible, that over time, some of the
drums have produced an artificial reef
effect. Several species of marine
organisms have attached themselves to
the drums or established residence in
the immediate drum areas.
Submersibles have radically improved
our ability to study deep sea
environments. Manipulator arms and
cameras can provide immediate and
accurate information, and the new
generation of manned
"mini-submarines" allows scientists to
observe marine conditions and
organisms as they really are. "There's no
better way," says Dyer, "to get that far
down and see what's actually
happening." D
(Pryor is Contributing Editor of the EPA Journal.)
28
Members of the scientific team and Navy support crew, including Bob Dyer of EPA; crew members of the DSRV Avalon; and representatives of the California Department of Fish and Game, the University of Washington School of Oceanography, Brookhaven National Laboratory, Interstate Electronics Corp., the Marine Technology Society, the California Academy of Science, and EPA Region 9.
•••
A drum packed with low-level radioactive waste and sealed with concrete. The drum has been identified as one that was left at the site between 1951 and 1954.
-------
The DSRV [Deep Submergence Rescue Vehicle] Avalon on board its support ship, the U.S.S. Pigeon. EPA used the Avalon during its June survey of the Farallon Islands 900-meter disposal site. Because of its shape, Navy crew members refer to it as "the pickle"; a launch is called a "pickle dump." Rescue vehicles like the Avalon are much larger than typical research submersibles, carrying more passengers and crew.
Inside the Avalon's observation capsule. Observers are required to go shoeless to protect the capsule interior, even though temperatures can get as low as 40°F during a dive.
OCTOBER 1985
29
-------
A 24-cm sediment core from the disposal site.
-------
Update: A review of recent major EPA activities and developments in the pollution control program areas
AIR
HAZARDOUS WASTE
Four Substances Reviewed
The agency has announced
its intent to list carbon
tetrachloride as a hazardous
air pollutant under the Clean
Air Act.
This action triggers the
development of emission
standards for significant
sources of this pollutant.
The agency also has
completed its evaluation of
manganese, chlorinated
benzenes, and vinylidene
chloride and has decided not
to regulate these chemicals
under the act at this time.
EPA has reviewed studies
on carbon tetrachloride, a
volatile organic liquid used
in making refrigerants,
pesticides, and other
chemicals, and has
concluded it is a probable
human carcinogen. Because
carbon tetrachloride is
extremely stable in the
atmosphere, emissions from
all countries contribute to
gradually increasing
concentrations that can be
measured virtually anywhere
in the world.
Heavy Duty Vehicle
Emissions
EPA has issued final
regulations allowing
manufacturers of heavy-duty
engines that do not have the
technological capability to
meet future, more stringent
emission standards to pay
penalties instead.
Without such a regulatory
mechanism, some
manufacturers unable to meet
future standards might be
forced out of the
marketplace.
These rulemaking actions
are the result of a unique
process called regulatory
negotiation, which allows
industry, states, and public
interest groups an
opportunity to participate in
the regulation's development
through face-to-face
negotiations.
Small Quantity Waste
Generators
Many producers of small
quantities of hazardous waste
will be required to send their
wastes to federally approved
disposal facilities starting
next year, under regulations
proposed by EPA.
In addition, these small
quantity generators will be
required to label their waste
with the hazardous waste
manifest form to ensure that
it is sent to either an EPA or
state-approved facility. This
requirement and the
proposed rule are both
authorized under the
Resource Conservation and
Recovery Act.
Liability Insurance
Alternatives
The agency is considering
alternatives to current
requirements for third-party
liability insurance that
hazardous waste facility
owners and operators must
now have to stay in business
under federal law.
Regulations published on
April 16, 1982 under the
Resource Conservation and
Recovery Act {RCRA) require
facilities to demonstrate
liability insurance coverage
for bodily injury and
property damage to third
parties resulting from both
accidental sudden and
nonsudden releases during
the operating life of a facility.
However, such third-party
liability insurance is
becoming increasingly
unavailable to segments of
the industry. In addition,
new amendments to RCRA
require that all disposal
facilities certify that they
meet all financial
responsibility requirements
when submitting an
application for a permit.
Under the amendments, all
facilities must apply for a
final permit by November 8,
1985.
To respond to the dilemma
posed by the growing
shortage of third-party
insurance available to
facilities which need to
certify compliance with
financial responsibility
requirements by November 8,
EPA is considering, and
seeking public comment on
the advisability of,
alternatives to the current
requirements.
PESTICIDES
Continued Use of Dicofol
EPA is proposing to allow
the continued use of the
pesticide dicofol under
certain conditions after
determining that the
substantial benefits of using
this product outweigh the
risks.
Dicofol is used to control
various species of mites on
cotton and citrus as well as
other crops.
This announcement
modified the agency's
proposal in October 1984 to
cancel dicofol. The earlier
proposal was based on the
high levels of the
manufacturing impurities
found in this insecticide,
including DDT and the
related compounds DDD,
DDE, and tetrachloro-DDT
(collectively known as DDTr).
The agency had determined
in the earlier proposal that
these chemicals could result
in unreasonable adverse
effects on fish and aquatic
bird populations, particularly
certain endangered species.
In response to the earlier
proposal, the registrants have
indicated that the DDTr
levels in technical dicofol
ranging up to approximately
10 percent can be reduced in
incremental stages up to 0.1
percent by July 1987. These
small amounts will be
indistinguishable from
current background levels of
DDTr and are not expected to
pose any significant risk to
the environment. According
to agency risk estimates,
these lower levels of DDTr
will not cause eggshell
thinning or other
reproductive problems in
birds or fish or otherwise
represent a threat to
endangered species or to the
environment.
Daminozide Notice
EPA has announced that it
will be sending its Science
Advisory Panel a draft
notice of intent to cancel the
use of daminozide, a
pesticide used primarily on
apples, as well as on peanuts
and other fruits and
vegetables. EPA is seeking
the panel's review of the
scientific basis for the
agency's determination that
lifetime exposure to food
residues of this product may
result in unreasonable risk to
public health.
Under the Federal
Insecticide, Fungicide, and
Rodenticide Act (FIFRA), the
agency is required to submit
cancellation actions to the
Science Advisory Panel for
peer review before final
cancellation actions are
taken. The agency also will
be submitting this notice to
the U.S. Department of
Agriculture as required by
FIFRA.
RADIATION
High-Level Radioactive
Waste
The agency has issued final
standards for the
management and disposal of
high-level radioactive waste
from both commercial and
defense sources. The rules
provide public health
protection for future
generations from
radioactivity from spent
nuclear reactor fuel and
high-level waste products
generated by atomic energy
defense activities.
The standards require
isolation of these nuclear
wastes far from man's
environment. Current
national law requires they be
placed in mined geologic
repositories several thousand
feet below the earth's surface.
The standards are expected
to provide the regulatory
framework and public
confidence necessary for the
federal government to
proceed in developing and
demonstrating geologic
repositories for disposing of
these radioactive materials.
OCTOBER 1985
31
-------
Appointments at EPA
WATER
Offshore Oil and Gas
EPA has reported that it is
proposing rules to control the
discharge to the ocean of
substances such as
drilling fluids, drill cuttings,
well treatment fluids, and
sanitary wastes from offshore
oil and gas facilities such as
platforms and drilling rigs.
The rules would govern the
quality of such wastes from
all existing and future
facilities located offshore in
the Gulf of Mexico, the
Atlantic and Pacific Oceans,
and Alaskan waters. Oil and
gas exploration, well drilling,
and oil and gas production
activities are the primary
operations conducted by the
affected facilities.
The rules would require
the nearly 4,000 existing
facilities to control the
amounts of oil and grease,
mercury, cadmium, chlorine,
floating solids, and various
oils discharged to ocean
waters. The requirements are
based upon treatment of the
wastes by the best available
treatment technology. In
addition, the rules would
limit the toxicity of drilling
fluids being discharged.
These fluids are mixtures of
clays, minerals, oil, special
chemicals, and water used in
drilling an oil or gas well.
Jennifer Joy Manson, Lawrence J. Jensen, Michael J. Quigley, Timothy Fields, Jr.
Jennifer Joy Manson has been
nominated to the post of Assistant
Administrator for EPA's Office of
External Affairs. She will be responsible
for managing the agency's public affairs,
Congressional relations, and liaison
with other federal agencies, state and
local governments, and environmental
and other private organizations. She
will be the national program manager
for dredge-and-fill oversight under
Section 404 of the Clean Water Act, and
will be responsible for coordination of
federal facilities compliance and the
Indian policy efforts of the agency's
Office of Federal Activities.
Since 1975, Manson has held policy
and management positions with the
White House, the Virginia governor's
office, the U.S. Senate, and several
political campaigns. Most recently, she
managed the successful re-election
campaign of Senator John Warner (R-Va.).
Manson received a B.A. in Speech from
the University of North Carolina in 1974.
Lawrence J. Jensen has been
nominated to be Assistant Administrator
for EPA's Office of Water. The position
includes responsibility for all of the
agency's water-quality programs,
including drinking water standards; the
development of effluent guidelines for
industrial facilities and municipal
wastewater treatment plants; the
construction grants program; and the
protection of ground water and marine
and estuarine resources.
Jensen currently is Associate Solicitor
for Energy and Resources at the U.S.
Department of the Interior. From
October 1981 to June 1983, he served as
the department's Associate Solicitor for
Indian Affairs, and from 1976 to 1979,
he was a trial lawyer in the Civil Division
of the U.S. Department of Justice. Before
coming to Washington again in 1981,
Jensen was an associate with the law firm
of Jones, Waldo, Holbrook and
McDonough in Salt Lake City, Utah.
Jensen received a B.A. in History from
the University of Utah in 1973, and
earned his law degree from Brigham
32
Young University in 1976. He is a
member of the Utah State Bar.
Michael J. Quigley has been named
Deputy Director of the agency's Office of
Municipal Pollution Control in the
Office of Water. His major
responsibilities involve management of
the construction grants program,
including the development of
regulations, policy, and guidance for
municipal treatment facilities. Quigley
had been Acting Deputy Director of the
office since December 1984.
Quigley has been with EPA since
1971, primarily in the water program.
His previous experience includes five
years with NASA, and three years of
service in the U.S. Air Force.
Quigley received his B.A. from
Trinity College (Conn.) in 1961, and his
law degree from Georgetown University
in 1969. He also holds a master's degree
in public administration from Harvard
University, which he attended under
EPA's executive development program.
Quigley is an associate certified
financial planner and a member of the
Virginia Bar Association.
Timothy Fields, Jr., has been
appointed director of the Emergency
Response Division of EPA's Office of
Solid Waste and Emergency Response.
His major responsibilities include the
development and implementation of
emergency response program policies
for uncontrolled hazardous waste sites
and releases of hazardous substances
and oil into the environment. He has
been Acting Director of the division
since January of this year.
Fields has been with EPA since 1971,
with most of his experience in the
Office of Solid Waste.
He received his B.S. in Industrial
Engineering from the Virginia
Polytechnic Institute and State
University in 1970. Under EPA's
Long-term Graduate Training Program,
he also attended George Washington
University and received an M.S. in
Operations Research in 1975.
EPA JOURNAL
-------
EPA Administrator Lee Thomas, left, hikes Camel's Hump mountain in Vermont to study forest damage. Accompanying Thomas on the trek, which took place in August, are Sen. Patrick Leahy (D-Vt.), center; Prof. Hubert Vogelmann of the University of Vermont, right; and several government officials and environmentalists from Vermont and New Hampshire.
Back Cover: Leaves in the fall. Photo by Michael Philip Manheim, Folio, Inc.
-------
United States
Environmental Protection
Agency
Washington DC 20460
Official Business
Penalty for Private Use $300
Third Class Bulk
Postage and Fees Paid
EPA
Permit No. G-35
------- |