United States
Environmental Protection
Agency
Office Of
Communications, Education,
And Public Affairs
Volume 19, Number 1
January/February/March 1993
EPA175-N-93-014
Profiles In Risk Assessment
New Science, New Contexts
EPA JOURNAL
United States Environmental Protection Agency
A Magazine on National and Global Environmental Perspectives
Carol M. Browner
Administrator
Communications, Education,
and Public Affairs
Loretta M. Ucelli
Associate Administrator
Charles Osolin
Director of Editorial Services
Karen Flagstad
Associate Editor
Teresa Opheim
Assistant Editor
Gregg Sekscienski
Assistant Editor
Ruth Barker
Photo Editor
Nancy Starnes
Assistant Editor
Marilyn Rogers
Circulation Manager
Francheska Greene
Intern
Front cover: Triple exposure
photomicrograph of melanoma and
carcinoma cell tissues (skin cancer
tissues) with indirect fluorescent
staining, enlarged 400 times.
Gail MacKenzie, ImmunoGen, Inc.
This photograph won second place in the 1992
Nikon International Small World Competition.
For more information or an entry form for this
annual competition, contact: Small World
Competition, Nikon Inc. Instrument Group, 1300
Walt Whitman Road, Melville, NY 11747-3084.
January/February/March 1993 • Volume 19, Number 1 • 175-N-93-014
From the Editors
Let's face it. When the subject is risk assessment, controversy is
part of the picture.
What is the risk of Pollutant X? Even with a single pollutant as
our scenario, opportunities abound for misunderstanding and
disagreement. Scientists may disagree in their interpretation of the
available scientific studies. Science policy decisions can be highly
controversial, especially when "data gaps" and other uncertainties are
involved—as generally they are. There may be uncertainty and
disagreement about risk "numbers," the quantitative aspect of risk
assessment. The working distinctions between risk assessment (what
scientists know about risk) and risk management (what regulators
decide to do about it) entail further controversies. This issue of EPA
Journal explores such current controversies in risk assessment and
profiles recent developments in the science that supports it.
Of course, Pollutant X is not the whole story. How does the health
risk posed by Pollutant X compare to the ecological risk presented by
Pollutant Y? Moreover, how does the risk of X rank in terms of all the
other health and environmental risks faced by the nation? by a
particular state? by an individual community? These are the kinds of
real-world questions that gave birth to comparative risk analysis, a
relatively new (and still controversial) practice which was endorsed by
EPA's Science Advisory Board in its 1990 report Reducing Risk.
How should environmental priorities be set in these times of
constricted resources? Through comparative risk analysis and risk-
based priority-setting? Some participants in the debate reflected in this
issue of the Journal say yes, definitely so; others suggest alternative
paradigms, such as environmental justice and pollution prevention.
It's a high-stakes debate, and it deserves to be conducted
in the public eye.
Postscript: As this issue of the magazine went to press, a number of key
EPA appointments were announced, including Loretta M. Ucelli as
Associate Administrator for the Office of Communications, Education,
and Public Affairs. Profiles of the new appointees will appear in the
next issue of EPA Journal.
EPA JOURNAL Subscriptions
The annual rate for subscribers in the U.S. is $10. The annual rate for subscribers in foreign countries is $12.50. The price of a single copy of EPA
Journal is $3.50 in the U.S. and $4.38 if sent to a foreign country. Prices include mailing costs. Subscriptions to EPA Journal as well as other federal
government magazines are handled only by the U.S. Government Printing Office. To subscribe to EPA Journal, send a check or money order
payable to the Superintendent of Documents. The requests should be mailed to: P.O. Box 371954, Pittsburgh, PA 15250-7954. To change address,
call or write the U.S. Government Printing Office, Public Documents Department, Superintendent of Documents, Washington, DC 20402;
(202) 512-2262.
EPA JOURNAL is printed on recycled paper.
CONTENTS
Profiles in Risk Assessment: New Science, New Contexts

Meet the New Administrator
An interview with Carol M. Browner

The "Basics"

The ABCs of Risk Assessment
by Dorothy E. Patton

Uncertainty and the "Flavors" of Risk
by Robert J. Scheuplein

The Role of Comparative Risk Analysis
by Wendy Cleland-Hamnett

EPA's IRIS Data Base: Accessing the Science
by Linda Tuxen

Changing Profiles

A Flagship Risk Assessment
by Peter W. Preuss and William H. Farland

Breakthroughs in Cancer Risk Assessment
by Stephen Nesnow

Update on Noncancer Assessments
by V. L. Dellarco and C. A. Kimmel

The Lessons of Commencement Bay
by Patricia Cirone and Matthew Coco

Points of Debate

Relating Risk Assessment and Risk Management
Complete separation of the two processes is a misconception
by Sheila Jasanoff
If risk management is broke, why fix risk assessment?
by Bernard D. Goldstein

The Delaney Clause Dilemma
by Victor J. Kimm

The Delaney Clause: Point/Counterpoint
Let's reform a failed food safety regime
by Al Meyerhoff
An obscure EPA policy is to blame
by Clausen Ely

A Legislative Proposal
by Senator Daniel Patrick Moynihan

What's a City to Do?
by Edward F. Hayes

Alternative Paradigms
by Adam M. Finkel and Dominic Golding

Departments

Newsline
News and Comment on EPA

Cross Currents: Going Electric
A book review by Karen Flagstad

Featuring EPA: Clearing the Air
by Rich ten Wolde

Habitat: Peregrine Falcon
An excerpt from Refuge
by Terry Tempest Williams

Chronicle: The Felling of the Great Lakes Forests
by Teresa Opheim
The U.S. Environmental Protection Agency is charged by Congress to protect the nation's land, air, and water systems. Under a mandate of national
environmental laws, the Agency strives to formulate and implement actions which lead to a compatible balance between human activities and the ability
of natural systems to support and nurture life.
EPA JOURNAL is published by EPA. The Administrator of EPA has determined that the publication of this periodical is necessary in the transaction
of the public business required by law of this Agency. Use of funds for printing this periodical has been approved by the Director of the Office of
Management and Budget. Views expressed by authors do not necessarily reflect EPA policy. No permission necessary to reproduce contents except
copyrighted photos and other materials.
Contributions and inquiries are welcome and should be addressed to: Editor, EPA JOURNAL (A-107), Waterside Mall, 401 M Street, SW,
Washington, DC 20460.
NEWSLINE
Chemical Plants to Control Toxic Emissions
EPA has proposed a rule under
the Clean Air Act that would
reduce toxic emissions from the
chemical industry by 80 percent.
Such emissions are estimated to
contribute to 1,500 to 3,000
cancer fatalities a year. The rule
would also reduce emissions of
volatile organic compounds
(VOCs) from the industry by
more than 71 percent; VOCs are a
prime ingredient in urban smog.
Then-EPA Administrator Reilly
commented: "This is the first
major air toxics rule ever issued
controlling the chemical industry.
It will improve public health by
reducing air pollution
everywhere, providing companies
the flexibility to find the most
cost-effective way of controlling
air toxic emissions."
The Los Angeles Times
reported: "... The proposal
will now be subject to public
comment and possible
revision. Consequently, it is
expected to be another year
before it is put into final form.
The rule will take effect three
years after that.... The
environmental agency's
assistant administrator for air
and radiation ... hailed the
proposal as the most
significant step that will be
taken in the next 10 years to
abate toxic air pollutants. The
beneficiaries, he said, will be
human health, agricultural
production, ecosystems
harboring endangered species,
and streams and forests.
Provisions of the complex
draft, he said, will reduce
releases of 149 of the 189 toxic
chemicals explicitly mentioned
in the Clean Air Act, including
chemicals cited as possible
contributors to 1,500 to 3,000
annual cancer deaths. Most
significantly affected will be
areas of Louisiana, Texas, and
New Jersey where the
petrochemical industry is
heavily concentrated, although
the new rules will have
bearing on facilities in 38
states. Officials from the most-
affected areas joined
environmentalists and
industry representatives,
including the Chemical
Manufacturers Assn. and the
American Petroleum Institute,
in lengthy negotiations over
the proposed rule. As the EPA
touted the outcome, critics
characterized it as weak in
several important respects. A
joint statement by the State
and Territorial Air Pollution
Program Administrators and
the Assn. of Local Air
Pollution Control Officials said
flaws in the proposal not only
threaten its control efforts, but
also could 'set a dangerous
precedent for future air toxic
rules.'... Specifically, critics
attacked the proposal because
it provides for cost-benefit
analyses before control
standards can be imposed that
are more strict than the 'floor'
levels written into the Clean
Air Act. It also enables
operators of huge industrial
complexes to average
emissions from control points.
If one controlled aspect of their
operation is emitting excessive
toxics, they could compensate
for it by exceeding control
requirements at another
location ...."
The Wall Street Journal
commented: "... The EPA's
proposal to cut cancer-causing
pollutants would require 370
chemical manufacturing plants
to install devices and
techniques to control leaks and
other emissions of pollutants,
resulting in an estimated 80
percent cut in toxic fumes by
1996. Chemical manufacturers
emit the greatest amount of
toxic pollution of any industry
group in the nation. The
proposal will serve as a
precedent as the first in a series
of source-by-source controls
that will be issued under the
1990 Clean Air Act's program
for reducing cancer risks
around plant sites .... The EPA
also issued final rules for its
program to reward companies
that voluntarily make earlier
reductions. Thirty-two
companies have promised to
cut their toxic air pollution at
47 chemical-manufacturing
plants 90 percent to 95 percent
by Jan. 1, 1994. In exchange,
they will get six extra years to
comply with new technology
standards, which most likely
will require them to install
additional equipment. Among
those promising early
reductions are plants operated
by Allied-Signal Inc.,
American Cyanamid Co., Du
Pont Co., Hoechst AG's
Hoechst Celanese Corp.,
Monsanto Co., and Union
Carbide Corp. The EPA
estimated the chemical
industry will spend $347
million in capital costs and
$182 million a year in
operating expenses to meet the
proposal's new technology
requirements for cutting fumes
that escape from leaks,
processing vents, storage
tanks, loading operations, and
wastewater facilities."
National Standards Set for Sewage Sludge Contaminants
One of the many uses of sludge
is to fertilize home gardens. (USDA photo)
A rule issued by EPA under the
Clean Water Act sets pathogen
standards and pollutant limits
for sewage sludge. It also spells
out the practices to be followed
in using sludge or in disposing
of it. The rule is multimedia in
that it seeks to protect surface
water, ground water, air, and
land. It is also the first rule
issued by EPA that considers
ecological effects.
Sewage
sludge is a
by-product of treating
wastewater from homes and
businesses. It is made up
mostly of water, but also of
solids and dissolved substances.
It contains nutrients—nitrogen,
phosphorus—and pathogens—
bacteria, viruses, parasites. It
may also contain small amounts
of organic chemicals, such as
chloroform, and inorganics,
such as iron. A typical family of
four generates up to 400 gallons
of wastewater a day. Treatment,
including removal of the water,
yields about one pound of
sludge on a dry weight basis.
Sludge must be treated before it
is disposed of or used.
Currently, most sewage
sludge is disposed of as waste;
about 30 percent is put to use.
EPA's rule is intended to assure
that public health and the
environment are protected from
any contaminants in sludge, and
thereby to promote its use.
Alone, or combined with other
materials, it can be used to
reclaim land damaged by strip
mining or clear cutting, and it
can be used to cover landfills.
Biosolids derived from sludge
can be used on farms, home
gardens and lawns, golf courses,
and in forests to increase the
ability of soil to store water and
nourishment for vegetation.
The rule applies to publicly,
privately, and federally owned
facilities that generate or treat
sewage sludge, as well as any
person who uses or disposes of
sewage sludge or sludge from
septic systems. It sets pollutant
limits for sludge applied to land,
disposed of in a landfill, or fired
in an incinerator, and it sets
requirements for reducing
pathogens that might cause
disease.
ENFORCEMENT
Craven Laboratories Suspended
from Federal Contracts
EPA has suspended Craven
Laboratories and four of its
employees from participating
in federal contracts, grants,
loans, and assistance
programs. The suspension
derives from a 20-count felony
indictment alleging that
Craven and the employees
concealed and falsified
information on pesticide
residue tests. Manufacturers,
who had contracted for the
tests, were thereby defrauded,
and EPA was given false
information for setting
allowable residues in food and
feed. The indictment was
brought by a federal grand
jury in the U.S. District Court
of the Western District of
Texas, Austin Division.

Pennzoil and Quaker State Fined for
Violating Clean Water Act
EPA and the Department of
Justice have announced
settlements with two
Pennsylvania companies for
illegally discharging oilfield
brine into public waterways.
Pennzoil Exploration and
Production Co., of Bradford
will pay $1.15 million in fines;
the settlement with Quaker
State Corp., of Oil City
includes a fine of $447,694.
The Pennsylvania Fish
Commission, party to the
Pennzoil case, will receive
$150,000 of that company's
fine for environmental
restoration projects.
Brine is a toxic wastewater
that is formed when oil is
pumped from a well and
processed through a separator.
Discharging brine into public
waterways without a
permit violates the federal
Clean Water Act and the
Pennsylvania Clean
Streams Law.
High-Tech Inspections for
Cars and Trucks to be Required
Under a rule issued by EPA under
the Clean Air Act, the more than 80
cities with the most serious ozone
and carbon monoxide problems will
set up high-tech emissions testing
programs for cars and trucks.
Manufacturers already meet
increasingly stringent standards for
new vehicles. However, cars and
trucks in actual use emit three to
four times the pollutants allowed
for the new ones. The primary
reason is improper maintenance.
In announcing the rule, then-EPA
Administrator Reilly said: "The
high-tech testing program will
provide the largest emission
reduction of any pollution control
strategy EPA has thus far
identified. We project that vehicle
emissions in the most polluted cities
around the country will be reduced
by 28 percent for hydrocarbons, 31
percent for carbon monoxide, and 9
percent for oxides of nitrogen. The
program also will be more
convenient for car owners since
testing will only have to be done
every other year."
The Wall Street Journal
reported: " ... Beginning in 1995,
vehicles in 83 urban areas with
the worst smog and carbon-
monoxide problems will face
more sophisticated tests of their
car exhaust as well as new tests
that for the first time will check
under the hood for leaks of
gasoline vapors. And in a blow
to independent service-station
operators, the EPA decided to
bar gasoline stations and repair
shops from performing the
high-tech inspections .... Audits
have shown that when shops
conduct inspections and sell
other services, such as repairs or
gasoline, about half the vehicles
that pass the tests should have
failed, and thus they continue to
pollute. The agency contends
that 'test-only centers' will help
avoid quality problems, conflicts
of interest that could lead to
unnecessary repairs or, on the
other hand, a tendency to try to
please a customer by fudging
test results. The result is that
test-and-repair shops now
operating in areas that must
begin the more sophisticated
tests will be forced to give up a
part of their business. The
change will affect portions of
California, Colorado, Georgia,
Louisiana, Massachusetts,
Nevada, New Hampshire, New
York, Pennsylvania, Rhode
Island, Texas, and Virginia.
Test-and-repair shops, though,
can continue operating in 98
other urban areas with more
moderate pollution. They will
conduct simpler exhaust tests of
vehicles to reduce air
pollution .... The tests are
designed to catch problems in
today's highly computerized
cars. They will use new
$150,000 computer-assisted
treadmills, now found only in
test laboratories, that will
analyze car exhaust during a
four-minute cycle of idling,
accelerating, and braking ...."

In cities like Denver, with serious ozone and carbon monoxide problems, new high-tech emissions inspections may help clear the air.
The Rocky Mountain News
said: " ... If a proposal outlined
Monday becomes state law, the
900 Denver-area service stations
that inspect cars for pollution
will be replaced by 20 to 40
high-tech garages. Each would
be equipped with at least one
$100,000 dynamometer to
conduct emissions tests. Each
test would cost $20, but they
wouldn't be required every
year. Currently, car owners pay
up to $9 each year for a quick
test to determine the amount of
carbon monoxide and
hydrocarbons the vehicle is
emitting. Under the proposal,
inspection wouldn't be required
of cars 6 years old and less on
the theory that their engines
usually are so clean that they
contribute little to pollution.
Older cars would be inspected
every two years. Any car, old or
new, would be inspected as a
condition of resale .... The plan,
to begin in 1995, calls for a
private management contractor
to lease 20 to 40 stations,
scattered around the metro area,
to independent operators. Each
year, the stations would test
more than 500,000 cars. Several
service station owners said they
don't like the proposal, saying
only companies with a lot of
money could afford the
overhead. 'It's taking away free
enterprise,' said Clay E. Carr of
Lakewood. 'You're creating a
monopoly.' But Jerry Gallagher,
head of the mobile sources unit
of the Health Department, said
the high-tech, four-minute test
on the dynamometer is best
because it simulates real
driving, braking, and
accelerating.... "
NEWSLINE
Air Pollution Down
Over Last 10 Years
For the period 1982 through
1991, nationwide levels of air
pollution have declined,
according to the latest trends
report published by EPA.
Under the Clean Air Act, the
Agency has set air quality
standards for six major
pollutants—ozone (urban
smog), carbon monoxide
(CO), sulfur dioxide,
particulates (dirt, dust, and
soot), lead, and nitrogen
dioxide. For the 10-year
period, all six were down.
In releasing 1991 data for
the first time, the report also
showed that 41 of 97 areas
designated "non-attainment"
for smog and 13 of 42 for CO
now meet the air quality
standards for those
pollutants. EPA believes that
these improvements resulted
in part from federal limits on
gasoline volatility and the
replacement of older cars with
newer, cleaner ones. The
Agency believes that weather
also played a role, since heat
and sunlight can exacerbate
smog, and weather patterns
over the past few years have
been different from those
occurring in the late 1980s.
The report also showed that
air pollution is still a serious
problem for many
communities. More than 69
million people live in counties
that exceed the air quality
standard for smog, over 19
million in counties that exceed
the CO standard, and over 21
million in areas that violate
the particulates standard.
Nitrogen Oxides from
Power Plants to Be Cut
Rules proposed by EPA would cut
emissions of nitrogen oxides from
existing power plants 1.5 to 2
million tons annually; this is the
first time existing plants have been
proposed for controls. Electric
utility boilers that burn coal, gas,
or oil account for nearly 30 percent
of nitrogen oxide emissions; by
comparison, cars and trucks
account for roughly half.
Nitrogen oxide contributes to acid
rain and urban smog, among other
environmental problems.
The Baltimore Sun reported:
"... The Environmental
Protection Agency yesterday
proposed controls on smog-
causing pollutants from
factories and electric power
plants. But state officials said
the rules are too weak and may
hinder efforts to improve air
quality in urban areas ....
Nitrogen oxide, which is
produced by the burning of
fossil fuels, is a component of
urban smog as well as a cause
of acid rain. While the EPA
long has controlled the
chemical's release from
automobiles, it has never before
restricted its release from
electric power plants and
industrial boilers. Under the
regulation proposed yesterday,
nitrogen oxide emissions from
about 200 coal-fired utility
boilers will have to be cut by 2
million tons a year by 2000,
according to the EPA. A final
rule is expected to be issued
soon .... The regulations,
however, raised concerns from
other state pollution control
officials, who called the EPA
standards too weak to ensure
that cities will be able to comply
with federal air quality
requirements. 'The EPA's
[pollution] limits are
disappointing,' said Bill Becker,
executive director of two
associations representing the
country's state and local air
pollution control officials. He
said the EPA ceiling on
allowable nitrogen oxide
releases from industrial boilers
is nearly a third greater than
levels recommended by the
state and local officials as
necessary to meet federal air
quality improvement goals.
The EPA's latest air quality
assessment showed that more
than 86 million people live in
urban areas that do not meet
federal health standards for
smog."
The Wall Street Journal said:
" ... Environmentalists accused
the EPA of caving in to utilities
and the Energy Department by
failing to require boiler
technologies that would get
even deeper cuts in nitrogen
oxides. But utility interests also
complained. A spokeswoman
for the Edison Electric Institute,
a trade group of major utility
companies, said two alternative
technologies proposed by the
EPA still would require more
costly and extensive
modifications than industry
thinks is required by the Clean
Air Act of 1990 .... The rules are
in two parts. One, dealing with
acid rain, requires coal-fired
utilities to reduce nitrogen-
oxide emissions; the cost of
compliance is estimated at $300
million a year when the rules
are fully in force by the year
2000. The other part of the rules
provides guidelines for
reductions in nitrogen-oxide
emissions in urban areas with
unhealthy smog levels; the
guidelines would apply to
utility and non-utility boilers
beginning in June 1995, unless
polluters gain extensions, and
would cost an estimated $500
million a year .... Utilities
welcomed an EPA proposal that
the new standards apply to
average emission levels of
companies, instead of requiring
that each boiler meet the new
requirements. The EPA says the
averaging provision will give
companies greater flexibility
and will cut compliance costs."
130 Cities Exceed "Action Levels"
for Lead in Drinking Water
Initial tests of high risk homes in
communities served by 660 so-
called large public water systems
reveal that 130 systems exceed the
lead action level of 15 parts per
billion (ppb) set by EPA under the
Safe Drinking Water Act. Large
systems are those that serve more
than 50,000 people; high risk
homes are those whose service or
interior pipes are made out of lead,
or those whose interior pipes are
made from copper with lead-
soldered connections and that
were installed after 1982. In
announcing the results of the
tests, then-EPA Administrator
Reilly commented: "Lead is a
major public health concern.
Although we believe the majority
of American homes have safe lead
levels in drinking water that are
below 15 ppb, we are concerned
about the high levels found in
some homes. While systems with
elevated levels are required to
reduce their lead levels through
corrosion control measures over
the next six years, people served
by these systems can act now to
reduce their exposure. They
should contact their water
supplier for information, have
their tap water tested for lead,
and take some simple steps
in the home."
The Wall Street Journal said:
" ... The drinking water in 130
cities, including New York,
Detroit, and Washington, DC,
contains excessive levels of
lead, the Environmental
Protection Agency said. The
EPA's initial sampling of 660
large public water systems
found that about 32 million
Americans drink water from
systems that flunked the
federal test of 15 parts per
billion. In 10 cities, lead was
above 70 parts per billion, with
Charleston, S.C., posting the
worst level of 211 parts per
billion .... Over time, the
consumption of excessive lead
in water, along with exposure
from lead-based paint and
contaminated soil and dust,
can elevate blood levels of the
toxic metal. This, in turn, can
delay physical and mental
development in babies and
young children and impair
mental abilities in children.
Drinking water contributes 10
to 20 percent of the total lead
exposure in young children ....
The EPA cautioned that the
findings don't represent
average lead levels. The
samples were taken only from
'high risk' homes, those that
are served by lead service lines
or that contain lead interior
piping or copper piping with
lead solder. The tests also
were based on first-draw
water, before the system
was flushed ...."
The Washington Post
commented: "... The EPA
recommends that people
served by the water systems
cited in its survey can reduce
lead exposure by having their
water tested, letting tap water
run for several minutes before
using it and using cold water
for drinking or cooking ....
Under rules adopted last year
by the EPA, all water systems
in the nation serving more
than 50,000 people are to begin
taking measures to reduce lead
by Jan. 1 and complete
installation of systems to limit
pipe corrosion by 1997 by
adding chemicals such as lime
and calcium carbonate to their
water. Municipal systems that
still do not comply with
recommended lead levels after
the installation of anti-
corrosion systems must
replace lead service pipes over
a 15-year period, a remedy that
could cost $7 billion. EPA's
announcement yesterday was
criticized by some
environmentalists, who said
the agency has incorrectly
suggested that lead levels are
safe in most large water
systems. The EPA's data
'imply that lead levels below
the action level of 15 parts per
billion are safe, but the truth is,
they are not,' said Erik Olson,
an attorney with Natural
Resources Defense Council."
Negotiated Rule Set for Coke Ovens
An agreement reached by
members of an advisory
committee is the basis for a rule
proposed by EPA to control
toxic emissions from coke
ovens. The committee, which
represented state and local
agencies, environmental and
citizen groups, labor unions, the
steel industry, as well as EPA,
agreed to requirements that not
only meet but in some cases
exceed the coke oven provisions
of the 1990 Amendments to the
Clean Air Act. At the same
time, they provide the steel
industry with the flexibility
needed to protect jobs and to
encourage new investment.
The agreement, and the rule
proposed by EPA, affects 30
coke oven plants in 11 states.
Coke ovens convert coal to coke,
which, in turn, is used in blast
furnaces to convert iron ore to
iron. The iron is further refined
to make steel. Coke oven
emissions are among the most
toxic of all air pollutants;
they include a mixture of
polycyclic organic matter,
benzene, and other chemicals
that can produce cancers of the
respiratory tract, kidney, and
prostate.
The proposed rule would
allow industry to choose
between two methods of
compliance, neither of which
would require the purchase of
new control technology. The
first would cut current
emissions by 66 percent through
better maintenance and repair of
existing equipment.
Nationwide, it would require
capital costs of $66 million and
annualized costs of $25 million.
The second, tougher method
would cut emissions 90 percent
by carrying out the provisions of
the first method plus, in many
cases, rebuilding existing
equipment. Its capital costs
would run $510 million, its
annualized costs $84 million. A
coke oven plant that chooses to
apply the tougher method of
compliance may delay meeting
other provisions of the law,
which require EPA to evaluate
any "residual risks" eight years
after the rule is imposed. If the
Agency discovers that risks
remain, it must issue additional
rules.
Meet the New Administrator
An interview with
Carol M. Browner
Q: We would like to begin
by asking you to talk a few minutes
about your priorities and objectives as
EPA Administrator.
A: In terms of how we look
at the far-reaching and very
complicated environmental problems
facing this country, we need to have a
framework in which to work, and that
framework needs to be pollution
prevention. We must focus our resources
upstream, look at new technology, look at
source reduction. If we use pollution
prevention as a framework, we can decide
on the methods that get us to where we
need to be in the long term—namely to
sustainable development and
environmental protection. To that end, I
look forward to working closely with
EPA's staff, as I will on all issues. There is
impressive talent here, and I want all of
our work—and particularly our scientific
analysis—to be recognized nationally as
the very best.
Q: What is your prognosis on whether and when
EPA might be elevated to Department and
Cabinet status?
A: The President is very committed to the elevation of
EPA to Cabinet status. There is significant support
among members on the Hill for it to move quickly. One can
never fully anticipate what will happen, but I am encouraged
by the support that we have. What is important about the
elevation is that it will bring the environment to the table as a
full participant. President Clinton has treated the Agency and
me thus far as a full member of the Cabinet. But it is important
for the future to give environmental issues the high visibility
that Cabinet status implies. It is important to keep the
environment at the top of the national agenda and at the
forefront of public consciousness, and Cabinet status will help
us accomplish these goals.
Q: You have said that during your tenure,
environmental policies will finally move beyond the
"either/or" tradeoff between jobs and the environment
toward a sustainable economic future. Two questions: How
will this goal be reflected in future decision making at EPA?
And how does your thinking on this compare with past EPA
policies?
A: It is important to look at the history of environmental
regulation in this country. It has been decades since the
enactment of the original Clean Water Act and the other federal
legislation that followed thereafter. The initial focus—and
rightly so because we didn't have the technology to do
otherwise—was on controlling end-of-the-pipe emissions and
discharges. Back then, we had a lot of significant, obviously bad
practices taking place. We began a series of regulatory schemes
that sought to resolve those problems. Oftentimes those
schemes did not look beyond the immediate impact; they did
not consider possibilities for preventing an activity from
happening in the first place or allowing it to happen only within
certain parameters. As the regulatory programs developed,
they inevitably have had economic impacts.
We are now at a crossroads. Everyone in this country shares
the idea of a clean environment for future generations, and they
want to achieve that. We are, however, going to have to be
mindful of the consequences of environmental protection. We
need to use economic incentives to encourage businesses to
make the right decisions. We need to look at the true cost of the
actions that we take—not just what it costs today—but what it is
going to cost over, let's say, the next 10, 20, 40 years and
beyond. Factoring in all these considerations is important so
that when we make decisions under the regulatory scheme, we
make them using all of the information available.
Q: You are coming to EPA from Florida's Department of
Environmental Regulation (DER). Do you see a
prominent role for state and local governments in
environmental initiatives during your administration? If so,
how can EPA best help states and localities deal with
increasing environmental responsibility?
A: The states have a lot to offer. There is a lot to be done.
There is more to be done than any one agency could ever
hope to achieve. We at EPA need to facilitate. We need to
encourage those states where the state legislature, the governor,
or the citizens of the state have decided that they are going to
put resources into environmental protection. We have other
states which have not made that decision. We need to work
with them, to help them understand their environmental
commitments.
We must respect the states' prerogatives. There are many
issues where states really are best suited to decide how to strike
that final balance. As a national agency looking at the entire
country, we must come up with rules that fit for Florida, Maine,
Hawaii, Alaska, and Iowa, to pick random states. There are
similarities between those states, but there are also great
dissimilarities. A state can look more closely at the situation
that exists within its boundaries. We have to respect that, and
we have to encourage that.
Q: What do you view as your major
accomplishments as head of Florida's DER?
A: We brought an integrity to environmental protection
and environmental regulation in Florida. We worked
very hard, and I'm proud of the efforts we made. We started
looking broadly at ecosystem protection, which helped us get
beyond some of the kinds of problems that we're seeing all over
the country in a lot of the applications of environmental
regulation. I'm proud of the clean air act that we passed and
the coalitions we were able to forge between business,
environmental groups, and state agencies. In almost every
instance, our successes were because we worked in a
cooperative manner. We brought everyone to the table from the
beginning and received input on how we were going to address
each issue.
Steve Delaney photo.
"There is impressive
talent here, and I want
all of our work—and
particularly our scientific
analysis—to be recognized
nationally as the very best."
Q: In the past, you have worked closely with
Vice-President Al Gore. What lessons have you learned
through your working relationship with him?
A: I served as Senator Gore's legislative director for
almost two years. It was an opportunity to work in an
important way in the U.S. Senate in formulating policy on a
variety of issues of significance, including environmental issues.
Along with the understanding of policy that comes from having
worked for the Senate, I have hands-on regulatory experience.
It's not often that you get both of those experiences. If I were to
go back to the policy side with the experience I have now, I
would probably do things differently. I'm forever thankful to
Senator Gore for the experience with him.
Q: Relations between EPA and the corporate world have
been said to be improving. Does your agenda involve
working toward less adversarial roles for business and
regulators?
A: Absolutely. Most businesses recognize that, for a variety
of reasons, a clean environment is important, and they are
willing to work cooperatively toward that goal.
Q: President Clinton has been quoted as saying that one
reason he wanted you to be EPA Administrator is
"because she knows ... what it's like to be governed by EPA
and how awful [the result] can be when it is done wrong;
what it is like to see an agency take two contradictory
positions at the same time and put the heat on that state and
the private sector; what it is like to see the EPA take one
position, then another, and then another." Can you elaborate
on what he meant here?
A: Well, when a state is subject to federal government
regulations—whether the agency is EPA or any other
federal regulator—inconsistencies occur. And they are
extremely frustrating. The inconsistencies are at times
understandable; at other times they are not. The more we, as a
federal agency, respect that states are attempting to do the right
thing, give them clear guidance in what we will accept, and
then adhere to the guidance that we gave them, the better our
relationship with them will be.
We've certainly had the experience in Florida of being in
touch with our regional U.S. EPA office and being told, "See if
you can make 'X' happen," only to find out later that it was "Y"
we needed to accomplish. There may have been legitimate
reasons why it was now "Y," and "X" was no longer right, but
something got lost in the communication. For some reason, we
didn't find out until we were so far down the "X" path that the
change involved going back to the state legislature or going
back through the rulemaking process. That experience is not
unique to state government. It is something that businesses
legitimately complain about.
There needs to be consistency in the decision making, and
there needs to be timeliness. I believe you can have both of
these without foregoing environmental standards and
environmental quality. For a long time, people in the
environmental community argued that streamlining the
permitting process would result in the degradation of the
environment. I don't think that is true. I don't think, in the end,
that's what the business community is advocating. What they
are saying—and rightly so—is that time is money to them. And
they want answers. That is what we found the business
community saying in Florida: "You can say no; just don't take
four years to do it. Tell us no in a timely manner."
A New Era ...
A Profile of the New Administrator
Browner testifies at her confirmation hearing.
Steve Delaney photo
To Carol Browner, the greatest challenge of environmental
regulation is clear: to balance environmental protection
with the need for economic growth.
"I hope my tenure will mark a new era in communication
between the EPA and America's business community, between
environmentalists and business people," Browner said during
her January 11, 1993, confirmation hearing before the U.S.
Senate Committee on Environment and Public Works. EPA's
seventh Administrator, Browner brings to her new position
experience in a variety of roles that will aid her as she works
toward that era.
Before serving as legislative director to then-Senator Al Gore
from 1989 to 1991, Browner, 37, was the chief legislative
aide for environmental issues to then-Senator Lawton Chiles
(D-Florida). During that three-year stint, she helped negotiate a
complicated land swap that expanded Florida's Big Cypress
National Preserve.
Browner became director of Florida's Department of
Environmental Regulation (DER) in 1991. At the Florida DER,
she negotiated a deal with the Walt Disney Co. that allowed
continued development in the Orlando area, while also
transforming an 8,500-acre ranch, Walker Ranch, into a wildlife
refuge. She streamlined Florida's permitting process, cutting
red tape by decreasing the number of state agencies involved.
She won passage of Florida's clean air act, which included
provisions allowing the Florida DER to obtain delegation from
EPA to implement the federal clean air program. Browner also
established a coalition of business leaders who implemented an
annual fee structure for major sources of air pollution in
advance of the federal deadline.
President Bill Clinton has cited Browner's innovative
approach and strong administrative background as among her
qualifications for the EPA post. Sierra Club Chairman J.
Michael McCloskey said that special interest groups will not be
favored over middle-class Americans under her aegis.
The Wall Street Journal opined, in an editorial in part on
Browner, that "it's good to see some real-worlders coming to
Bill Clinton's Washington."
Browner is "a tremendous choice considering the massive
undertaking states face in implementing environmental
regulations over the next two years," said William Becker,
executive director of the State and Territorial Air Pollution
Program Administrators and the Association of Local Air
Pollution Control Officials. Said former-Administrator William
K. Reilly, in lauding Browner's valuable state government
experience, "So many of the best ideas for the environment have
come from the states."
Browner graduated from the University of Florida at
Gainesville in 1977 and earned a degree from the university's
law school in 1979. She began her career as general counsel for
the Florida House of Representatives Government Operations
Committee. From 1983 to 1986, she was associate director of the
Washington environmental group Citizen Action, where her
husband, Michael Podhorzer, is employed. Browner and
Podhorzer have a five-year-old son, Zachary.
"Our families—and particularly our children—have inspired
many of us in our work to protect this nation's environment,"
Browner said at her confirmation hearing. "I grew up in South
Florida, in a house from which I could bicycle into the
wilderness of the Everglades, and I want my son, Zachary, to be
able to grow up and enjoy the natural wonders of the United
States the same way I have. I believe that it will now be possible
to make the investment in our economy that we so desperately
need, yet preserve our air, land, and water."
The ABCs of Risk Assessment
Some basic principles can help people
understand why controversies occur
by Dorothy E. Patton
Risk assessment is a cornerstone of
environmental decision making.
Despite this role as the scientific
foundation for most EPA regulatory
actions, risk assessment means different
things to different people—a point that
comes across in subsequent articles in
this issue of EPA Journal—and is thus a
source of misunderstanding and
controversy. Some points of controversy
involve the interpretation of scientific
studies. Others have to do with science
policy issues. Still others center on
distinctions between risk assessment and
risk management.
The scope and nature of risk
assessments range widely—from broadly
based scientific conclusions about an air
pollutant such as lead or arsenic affecting
the nation as a whole to site-specific
findings concerning these same chemicals
in a local water supply. Some
assessments are retrospective, focusing
on injury after the fact—for example, the
kind and extent of risks at a particular
Superfund site. Others seek to predict
possible future harm to human health or
the environment—for example, the risks
expected if a newly developed pesticide
is approved for use on food crops.
In short, risk assessment takes many
different forms, depending on its
intended scope and purpose, the
available data and resources, and other
factors. It involves many different
disciplines and specialists with different
kinds and levels of expertise,
representing many different
organizations. Moreover, risk assessment
approaches differ somewhat in line with
differences in environmental laws and
related regulatory programs. (See box on
statutory mandates, page 15.)
Even with these differences, some
features of the risk assessment process
stand out as instructive principles that
clarify and demystify the process for
expert and novice alike. This article
highlights these principles.

(Patton is Executive Director of EPA's Risk Assessment Forum.)

Assessing the safety of drinking water is one possible use of risk assessment. (Mike Brisson photo)
Risk Assessment
and Risk Management
Risk assessment and risk management
are closely related but different processes,
with the nature of the risk management
decision often influencing the scope and
depth of a risk assessment. In simple
terms, risk assessment asks, "How risky
is this situation?" and risk management
then asks, "What shall we do about it?"
(For a feature on the interface between
risk assessment and risk management,
see page 35.)
Also, it is especially important to
understand that risk assessment and
comparative risk analysis for ranking
environmental problems are not the
same. (On distinctions between the two,
see box on page 19, and for a more
comprehensive discussion of
comparative risk analysis, see article
beginning on page 18.)
I use the term "risk assessment," as the
National Academy of Sciences (NAS) and
EPA risk assessment guidelines have
defined it for almost 10 years, to mean the
process by which scientific data are
analyzed to describe the form, dimension,
and characteristics of risk—that is, the
likelihood of harm to humans or the
environment. Risk management, on the
other hand, is the process by which the
risk assessment is used with other
information to make regulatory decisions.
Contributing Disciplines
What specific kinds of information are
used for risk assessment? For risk
management?
Environmental risk assessment is a
multidisciplinary process. It draws on
data, information, and principles from
many scientific disciplines including
biology, chemistry, physics, medicine,
geology, epidemiology, and statistics,
among others. The feature distinguishing
risk assessment from the underlying
sciences is this: After evaluating
individual studies for conformity with
standard practices within the discipline,
the most relevant information from each
of these areas is examined together to
describe the risk. This means that
individual studies, or even collections of
studies from a single discipline, are used
to develop risk assessments, but they are
not in themselves generally regarded as
risk assessments, nor can they alone
generate risk assessments.
One way to highlight differences
between risk assessment and risk
management is by looking at differences
in the information content of the two
What's In a Number?
Risk values are often stated,
shorthand-fashion, as a number.
When the risk concern is cancer, the risk
number represents a probability of
occurrence of additional
cancer cases. For example,
such an estimate for
Pollutant X might be
expressed as 1 x 10⁻⁶, or
simply 10⁻⁶. This number
can also be written as 0.000001, or one in
a million—meaning one additional case
of cancer projected in a population of
one million people exposed to a certain
level of Pollutant X over their lifetimes.
Similarly, 5 x 10⁻⁷, or 0.0000005, or five
in ten million, indicates a potential risk
of five additional cancer cases in a
population of 10 million people
exposed to a certain level of the
pollutant. These numbers signify
incremental cases above the
background cancer incidence in the
general population. American Cancer
Society statistics indicate that the
background cancer incidence in
the general population is one in
three over a lifetime.
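To make the arithmetic concrete, here is a minimal sketch in Python; the risk values and population sizes are the hypothetical ones discussed above:

    # Illustrative only: convert a lifetime excess-risk value into the
    # expected number of additional cancer cases in an exposed population.
    def expected_additional_cases(lifetime_risk, population):
        # Expected incremental cases above the background incidence.
        return lifetime_risk * population

    print(expected_additional_cases(1e-6, 1_000_000))   # 1.0 case per million exposed
    print(expected_additional_cases(5e-7, 10_000_000))  # 5.0 cases per 10 million exposed

These figures are increments above background: at an incidence of one in three, a population of one million would already expect on the order of 330,000 lifetime cancer cases from all causes.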
If the effect associated with
Pollutant X is not cancer but
another health effect, perhaps
neurotoxicity (nerve damage) or birth
defects, then numbers are not typically
given as probability of occurrence, but
rather as levels of exposure estimated
to be without harm. This often takes
the form of a reference dose (RfD). A
RfD is typically expressed
in terms of milligrams (of
pollutant) per kilogram of
body weight per day, e.g.,
0.004 mg/kg-day. Simply
described, a RfD is a
rough estimate of daily exposure to
the human population (including
sensitive subgroups) that is
likely to be without appreciable
risk of deleterious effects
during a lifetime. The
uncertainty in a RfD may be
one or several orders of magnitude
(i.e., multiples of 10).
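One common use of such a number, sketched below in Python, is to compare an estimated daily dose against the RfD; the ratio is sometimes called a hazard quotient, and a value at or below 1 suggests exposure within the level judged to be without appreciable risk. The dose and RfD values here are hypothetical, not drawn from any EPA assessment:

    # Hedged illustration: compare an estimated daily dose (mg/kg-day)
    # against a reference dose (RfD). All values are hypothetical.
    def hazard_quotient(daily_dose, rfd):
        # Ratio of estimated dose to the RfD.
        return daily_dose / rfd

    hq = hazard_quotient(daily_dose=0.001, rfd=0.004)  # both in mg/kg-day
    print(f"hazard quotient: {hq:.2f}")  # 0.25, below 1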
What's in a number? The
important point to remember is that
the numbers by themselves don't tell
the whole story. For instance, even
though the numbers are identical, a
cancer risk value of 10⁻⁶ for the
"average exposed
person" (perhaps
someone exposed
through the food
supply) is not the
same thing as a cancer risk of 10⁻⁶ for a
"most exposed individual" (perhaps
someone exposed from living or
working in a highly contaminated area).
It's important to know the difference.
Omitting the qualifier "average" or
"most exposed" incompletely describes
the risk and would mean a failure in
risk communication.
A numerical estimate is only as good
as the data it is based on. Just as
important as the quantitative aspect of
risk characterization (the risk numbers),
then, are the qualitative aspects. How
extensive is the data base supporting the
risk assessment? Does it
include human
epidemiological data as
well as experimental
data? Does the
laboratory data base include test data on
more than one species? If multiple
species were tested, did they all respond
similarly to the test substance? What
are the "data gaps," the missing pieces
of the puzzle? What are the scientific
uncertainties? What science policy
decisions were made to address these
uncertainties? What working
assumptions underlie the risk
assessment? What is the overall
confidence level in the risk assessment?
All of these qualitative considerations
are essential to deciding what reliance to
place on a number and to characterizing
a potential risk. —Eds.
Disciplines Contributing to Environmental Decisions

Laboratory and Field Work (discipline-based):
  Chemistry, Biology, Geology, Toxicology, Epidemiology

Risk Assessment (multiple scientific disciplines):
  Chemistry, biology, etc.; Statistics; Medicine; Models; Science policy

Risk Management (multiple disciplines: natural, physical, and social sciences):
  Risk assessment; Economics; Politics; Law; Social values, concerns
processes. What kinds of information,
then, are used for risk management but
not for risk assessment? In general EPA
practice, data on technological feasibility,
on costs, and on the economic and social
consequences (e.g., employment impacts)
of possible regulatory decisions are
critically important for risk management,
but not for risk assessment. To the extent
called for in various statutes, risk
managers consider this information
together with the outcome of the risk
assessment when evaluating risk
management options and making
environmental decisions. (See chart on
how disciplines are used, this page.)
The NAS Paradigm
The risk assessment paradigm put
forward by NAS in a 1983 publication
called Risk Assessment in the Federal
Government: Managing the Process (or
more colloquially, the "Red Book,"
alluding to its cover) provides a useful
system for organizing risk science
information from these many different
sources. Moreover, in the last decade,
EPA has used the basic NAS paradigm as
a foundation for its published risk
assessment guidance and as an
organizing system for many individual
assessments. The paradigm defines four
"fields of analysis" which describe the
use and flow of scientific information in
the risk assessment process. (See process
chart, page 13.)

  Rarely is there a single "answer" to an environmental risk assessment question.
One virtue of this system of analysis is
clarity: The paradigm makes the risk
assessment process accessible so that
scientists, regulators, lawyers, journalists,
educators, and committed laypersons can
use the paradigm as a relatively simple
frame of reference for understanding
where and how the data, scientific
principles, and science policies have been
used in any risk assessment developed in
line with the paradigm. (Even where the
paradigm is not explicitly used—e.g.,
certain climate issues—the same kinds of
questions are studied to evaluate
potential risk.)
The following discussion walks
through the four fields of analysis. Note
at the outset that each phase employs
different parts of the information base.
For example, hazard identification relies
primarily on data from the biological and
medical sciences. The dose-response
analysis then uses these data in
combination with statistical and
mathematical modeling techniques, so
that the second phase of the risk analysis
builds on the first.
• Hazard Identification. The objective of
hazard identification is to determine
whether the available scientific data
describe a causal relationship between an
environmental agent and demonstrated
injury to human health or the
environment. In humans, the observed
injury may include such effects as birth
defects, neurologic effects (nerve
damage), or cancer. Ecological hazards
might result in fish kills, habitat
destruction, or other effects on the natural
environment.
Information on the agent responsible
for the effects may come from laboratory
studies in which test animals were
deliberately exposed to toxic materials, or
from other sources such as chemical
measurements in the workplace. In
addition, studies on a pollutant's effects
on genetic material or metabolism, and
comparison of such effects in humans
and experimental test systems, may be
part of the analysis.
The principal question is whether data
from populations in which effects and
exposure are known to occur together
suggest a potential hazard for other
populations under expected conditions of
exposure to the agent under study. If a
potential hazard is identified, three other
analyses become important for the
overall risk assessment, as discussed
below.
• Dose-Response Relationships. The dose-
response analysis is designed to establish
the quantitative relationship between
12
EPA JOURNAL
-------
exposure (or dose) and response in
existing studies in which adverse health
or environmental effects have been
observed. The dose-response analysis is
based mainly on two extrapolations. One
extrapolation uses the relatively high
exposure levels in most laboratory
studies (or, for example, human studies
at relatively high workplace levels) to
estimate the probable magnitude of the
effect in the same population at lower
environmental levels where little or no
data are available.
The other extrapolation entails looking
for the expected level of response in
humans, or in animals or plants in nature,
based on comparisons of data from
laboratory and natural test systems. As
explained later, each extrapolation
involves numerous scientific
uncertainties and assumptions, which in
turn involve policy choices.
The number produced in the
dose-response analysis—perhaps a
cancer risk value or a reference dose
(see article on noncancer effects on page
30)—is sometimes regarded as a risk
assessment because it describes
important information from animal and
human studies. Under the NAS
paradigm and in most EPA practice,
however, risk assessment is complete
only when human exposure assessment
information is joined with dose-response
analysis and all relevant information to
characterize the risk.
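The first extrapolation can be made concrete with a deliberately simplified sketch in Python. A linear, no-threshold model is assumed here purely for illustration (it is one of several models an assessor might choose), and the study numbers are hypothetical; the response rate observed at a high experimental dose is scaled down in proportion to a much lower environmental dose:

    # Simplified linear, no-threshold extrapolation, one of several
    # possible dose-response models; all data values are hypothetical.
    observed_dose = 50.0      # mg/kg-day in a high-dose animal study
    observed_response = 0.10  # fraction of test animals affected

    slope = observed_response / observed_dose  # extra risk per mg/kg-day

    environmental_dose = 0.001  # mg/kg-day, far below the tested range
    estimated_extra_risk = slope * environmental_dose
    print(f"{estimated_extra_risk:.1e}")  # 2.0e-06, about two in a million

Which low-dose model to use is itself a science policy choice; plausible alternatives can yield low-dose estimates that differ by orders of magnitude.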
• Exposure Analysis. The exposure
analysis moves the assessment from the
study of known populations (laboratory
or epidemiologic) in which dose
(exposure) and response occur together,
to the task of identifying and
characterizing exposure in other
potentially exposed populations. These
populations may be as general as the
nation as a whole for certain widely
distributed materials (e.g., contaminated
food), or as limited as certain occupation
or user groups (e.g., pesticide
applicators). Questions raised in the
exposure analysis concern the likely
sources of the pollutant (e.g., incinerator
discharge, factory effluent, pesticide
application), its concentration at the
source, its pathways (air, water, food)
from the source to target populations,
and actual levels impacting target
organisms.
The exposure analysis relies on many
very different kinds of information, some
based on actual measurements and some
developed using mathematical models.
Measurements of the kind and quantity
of a pollutant in various environmental
media and, when available, in human,
plant, and animal tissues are used to
project expected exposure levels in
individuals, populations, or both. The
exposure analysis also develops
"lifestyle'' data to identify and describe
populations likely to contact a pollutant.
For example, if a chemical that causes
birth defects in test animals contaminates
tomatoes, the exposure analysis would
consider such "lifestyle" information as
the number of women of childbearing
age who eat tomatoes, how often they eat
this food, and in what quantities. To
complete the exposure analysis, the
lifestyle information is combined with
information on how much chemical,
probably measured at very low levels,
remains in tomatoes when sold for
consumption.
If the estimated exposure for an
environmentally exposed population is
significantly smaller than the lowest dose
producing a response in the study
population, the likelihood of injury to
exposed humans is smaller; if the
estimated exposure is significantly
greater than the lowest dose, then the
likelihood of injury is greater.
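The logic of the tomato example can be reduced
to a few lines. The sketch below, with entirely
hypothetical numbers, combines "lifestyle" data
with residue measurements and then compares the
result against the lowest dose producing a
response in the study population:

    # Hypothetical exposure estimate for the tomato example, compared
    # against the lowest effective dose from the study population.

    residue = 0.05          # mg of chemical per kg of tomatoes at sale
    intake = 0.1            # kg of tomatoes eaten per day
    body_weight = 60.0      # kg, typical consumer

    dose = residue * intake / body_weight      # mg/kg/day, ~8.3e-05
    lowest_effect_dose = 5.0                   # mg/kg/day in animals

    print(f"margin: {lowest_effect_dose / dose:.0f}x")   # ~60000x

The wider this margin, the smaller the expected
likelihood of injury, which is the comparison
described above.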
• Risk Characterization. Although each of
the preceding analyses examines all
relevant data and information to describe
hazard or dose-response or exposure,
under the 1983 paradigm none reaches
conclusions about the overall risk. That
task is reserved for the final analysis,
where important information, data, and
conclusions from each of the preceding
analyses are examined together to
characterize risk—that is, to fully describe
the expected risk by examining the
exposure predictions for real-world
conditions in light of the dose-response
information from animals, people, and
special test systems.

[Figure: Risk Assessment Process. Data feed
three parallel analyses (hazard identification,
dose-response evaluation, and human exposure
evaluation), which combine in risk
characterization to yield the level of
potential risk to humans.]

[Herblock cartoon: "Hold the meat - no, the
chicken - wait - hold the fish - what have you
got in the way of unsprayed veggies?"
Copyright 1992 by Herblock in The
Washington Post.]
Risk characterization—the product of
the risk assessment—is much more than a
number. (See box on page 15.) While the
risk is often stated as a bare
number—for example, "a risk of 10⁻⁶" or
"one in a million new cancer cases"—the
analysis involves substantially more
information, thought, and judgment than
the numbers express. These factors take
us behind the simple structural
framework that the NAS paradigm
provides into a complex world of
scientific uncertainties, assumptions, and
policy choices. As discussed below,
revisiting the NAS paradigm with these
conceptual principles in mind sheds new
light.
Uncertainties and Policy Choices
Scientific uncertainty is a customary and
expected factor in all environmental risk
assessment. Measurement uncertainty
refers to the usual variance that
accompanies scientific measurement such
as the range (10 ± 1) around a value.
Another kind of uncertainty refers to data
or information gaps—that is, information
needed but unavailable for any particular
assessment. Sometimes the data gap
exists because specific measurements or
studies that would complete an
assessment are missing; sometimes the
data gap is broader, referring to a
fundamental lack of understanding about
a scientific phenomenon.
The 1983 paradigm and EPA risk
assessment guidelines stress the
importance of identifying uncertainties
and presenting them as part of risk
characterization.
In ordinary scientific practice, scientific
uncertainties describe new data needs
and stimulate further research, with
questions remaining open until research
provides needed information. Like
traditional science, environmental risk
assessment invariably identifies new data
needs and generates recommendations
for additional research.
However, "state-of-the-art" limitations
on risk methods, resource limitations,
and statutory timetables for regulatory
decisions often require EPA as well as
other participants in the regulatory
process (other governmental agencies,
industry, environmental groups) to
complete risk assessments in the face of
data gaps and other scientific
uncertainties. As a result, "science
policies"—that is, technically reasonable
positions assumed in lieu of scientific
data—may be developed to address
some of these uncertainties. Some
familiar policies relate to use (or nonuse)
of animal data to predict human risk,
models used to quantify or project cancer
risk, and the size of uncertainty factors
for health effects other than cancer.
Variability, Misunderstanding,
and Controversy
Variability is an often overlooked but
important feature of the risk assessment
process. Reasons for variability in risk
assessment should be obvious from the
preceding discussion. The need to use
data from many different disciplines,
characterized by data gaps and
uncertainties, is one source of variability.
Assumptions and policy choices
spanning a spectrum of scientific theses
about the nature of incompletely
understood biological processes are
another. These diverse elements can lead
to diverse results, an outcome that leads
to misunderstanding and seeds many
risk assessment controversies.
Controversy might be less strident if
practitioners and observers recognized
that varying interpretations of the
scientific information may lead to a range
of science-based descriptions of risk for
any particular situation. In addition,
depending on data selected, scientific
assumptions, policy calls and
perspectives, different experts or
organizations may describe risk
differently. For example, a single data
set, applied to different populations with
different assumptions, may result in
different numerical risk estimates for a
single chemical. However, if the risk
characterization identifies data and
science policy choices, apparently
inexplicable inconsistencies may be
recognized as responsible, reasonable
descriptions of different aspects of the
same problem. The risk characterization
process can also aid identification of less
responsible, less reasonable descriptions
of the problem.
Perhaps this clarifies some of the
reasons for misunderstanding and
controversy. Rarely is there a single
"answer" to an environmental risk
assessment question. The risk
assessment process has an enormous
capacity to expand and contract in line
with the available data, science policies,
and problems. When risk management
information, options, and decisions are
examined along with the risk assessment,
opportunities for variability,
misunderstanding, and controversy are
even greater.
The task is to look behind the process,
always keeping in mind the multiple
sources of information, the several kinds
of scientific analyses, and the related
uncertainties and science policy choices
that shape each assessment. A related
task is to remember that risk assessment
and risk management are equally
important but different processes, with
different objectives, information content,
and results.
Some Statutory Mandates on Risk
EPA is responsible for implementing
roughly a dozen major
environmental statutes. These laws
generally do not prescribe risk
assessment methodologies. However,
many environmental laws do provide
very specific risk management
directives, and these directives vary
from statute to statute. Moreover, in
certain statutes (such as the Clean Air
Act) different sections of the law set
forth different risk management
mandates.
Statutory risk management mandates
can be roughly classified into three
categories: pure risk; technology-based
standards; and reasonableness of risk
balanced with benefits.
Pure-Risk Standards
Pure-risk standards (sometimes termed
"zero-risk") are mandated or implied by
only a few statutory provisions. Two
examples in this category:
• The "Delaney clause" of the Federal
Food, Drug, and Cosmetic Act prohibits
the approval of any food additive that
has been found to "induce cancer" in
humans or animals. (See articles
beginning page 39 on the ongoing
controversy concerning the Delaney
clause.)
• The provisions of the Clean Air Act
pertaining to national ambient air
quality standards call for standards for
listed pollutants that "protect the public
health allowing an adequate margin of
safety"—i.e., that assure protection
of public health without regard to
technology or cost factors.
Technology-Based Standards
Technology-based environmental
standards direct the Agency to focus
on the effectiveness and costs of
alternative control technologies
rather than on how control actions
could affect risks. Technology-based
controls are considered appropriate
to certain kinds of problems, such as
industrial water pollution, where the
installation of a single control system
can reduce risks from a variety of
different pollutants.
Consider the several technology-
based standards in the Clean Water
Act: The Act requires industries to
install several levels of technology-
based controls for reducing water
pollution. These include "best
practicable control technology,"
"best conventional technology,"
and "best available technology
economically achievable" for
existing sources. New sources are
subject to the "best demonstrated
control technology." Total costs,
age of equipment and facilities,
processes involved, engineering
aspects, environmental factors
other than water quality, and
energy requirements are to be
taken into account in assessing
technology-based controls.
"No Unreasonable Risk"
A number of statutes require a
balancing of risks against benefits in
making risk management decisions.
Two examples in this category:
• The Federal Insecticide, Fungicide,
and Rodenticide Act requires EPA to
register (license) pesticides which, in
addition to other requirements, it finds
will not cause "unreasonable adverse
effects on the environment." The phrase
refers to "any unreasonable risks to man
or the environment taking into account
the economic, social, and environmental
costs and benefits of the use of any
pesticide."
• Under the Toxic Substances Control
Act, EPA is mandated to take action if it
finds that a chemical substance
"presents or will present an
unreasonable risk of injury to health or
the environment." This includes
considering the effects of such substance
on health and the environment and the
magnitude of the exposure of human
beings and the environment to such
substance; the benefits of such substance
for various uses and the availability of
substitutes for such uses; and the
reasonably ascertainable economic
consequences of the rule, after
consideration of the effect on the
national economy, small businesses,
technological innovation, the
environment, and public health.
—Eds.
THE "BASICS".
Uncertainty and the
"Flavors" of Risk
Let's give risk assessment a reality check
by Robert J. Scheuplein
If you are an average person (or even
above average), you're exposed to a
large amount of risk. Like ice cream,
risk comes in a variety of flavors, and
some people like to add nuts and berries
to suit their tastes. So it is with risks. But
the basic ones—the vanilla, chocolate,
and strawberry of risks, if you will—are
the following three: personal activities,
natural disasters, and chemical
exposures.
Consider first certain dangerous
personal activities like firefighting, coal
mining, skiing, motorcycling, driving
automobiles, etc.—things you do for a
living or for fun. Here the danger in the
activity may be self evident, and the risk
is ordinarily self imposed. As a class,
these risks are the highest, along with
risks from ordinary diseases, which
personal behavior can also affect. For
example, the annual risk associated with
motorcycling is about 2 percent, or about
2,000 deaths per 100,000 persons at risk.
Firefighting is much safer, only about 80
annual deaths per 100,000, or 0.08
percent. The annual death rate from
motor vehicles is around 24 per 100,000,
or 0.024 percent. This is just the death
rate; of course, the accident rate and
injury rate are much higher.
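The arithmetic behind these actuarial figures
is a plain rate conversion; a few lines of
Python (using the article's numbers) make the
units explicit:

    # Convert annual deaths per 100,000 persons at risk to a percentage.
    def annual_percent(deaths_per_100k):
        return deaths_per_100k / 100000 * 100

    for activity, rate in [("motorcycling", 2000), ("firefighting", 80),
                           ("motor vehicles", 24)]:
        print(activity, annual_percent(rate))
    # motorcycling 2.0, firefighting 0.08, motor vehicles 0.024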
The risks above might be described as
part of the price we pay for living in a
civilized world, but the next category of
risks, natural disasters, are not wholly our
fault. The risks from floods, hurricanes,
earthquakes, lightning, meteorite hits,
etc., are the price we pay for our 70-year
or so lease on the planet. Of course, if
you want to live on an earthquake fault,
you share some responsibility. These
risks in the aggregate are quite small. But
try telling that to the folks in South
Florida who experienced Hurricane
Andrew. Lightning kills 0.05 people per
year per 100,000, or about 0.00005
percent. The risks from meteorite hits are
about 0.000006 per 100,000, or
0.000000006 percent.
Now the last major category of risks,
the "strawberry of risks," derives from
chemical exposures. Before we discuss
them, let me emphasize an important
point. The risks above are real, obtained
by counting victims. They are actuarial
risks. They depend only on how
accurately the deaths and the populations
at risk were attributed and recorded.
They are not based on inferences from
animal data, nor on prudent
extrapolations of adverse effects in
animals. This distinction is essential to
make because the risks from chemical
exposures are for the most part based on
such inferences and extrapolations.

(Scheuplein is Director of the Office of Special
Research Skills at the Food and Drug
Administration.)
Everyone knows about poisons, drugs,
and acute occupational exposures to
industrial chemicals. For many of these
exposures there is human data. But for
chronic low-level risk from chemicals in
the environment we live in, in the air we
breathe, in the water we drink, or in the
food we ingest, we need to depend on
animal data. These risks are ordinarily
small. For example: The cancer risk from
chlorinated drinking water has been
estimated as 0.8 per year per 100,000
persons exposed or 0.0008 percent. While
this number looks the same as the others
and can be expressed in the same units, it
is not the same and can be compared
with the actuarial risks only if the
differences are kept in mind.
What are these differences?
First, as stated above, chemical risk is
based on the finding of an adverse effect
in an animal study. In the case of
chlorinated drinking water, it is based on
several carcinogen bioassays conducted
in mice using various chlorinated
compounds. It is inferred that humans
will be similarly susceptible to these same
compounds. But this is not necessarily
true.
Second, the quantitative result is a
worst-case estimate sometimes called an
upper-bound estimate. It is based on a
mathematical extrapolation of adverse
effects in animals, exposed at high dose
levels, to the much lower levels
anticipated for humans. Why is this
done? Why not just expose the animal to
the appropriate lower doses? The reason
is there would be no effect at low doses,
unless the number of animals in the
experiment were increased
dramatically—say to several thousand.
The problem lies in trying to detect in a
population of 100 a disease incidence that
you might believe to be one in 1,000. So
toxicologists need to exaggerate the
animal doses and extrapolate
downwards—hopefully.
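The detection problem is easy to verify. If
the true incidence is one in 1,000, the chance
that a 100-animal study sees even a single
case is small; a short binomial sketch
(illustrative only):

    # Probability of seeing at least one affected animal among n,
    # when the true incidence is p: 1 - (1 - p)**n.
    def p_detect(p, n):
        return 1 - (1 - p) ** n

    print(p_detect(0.001, 100))   # ~0.095: a 1-in-1,000 effect is usually missed
    print(p_detect(0.20, 100))    # ~1.0: a strong high-dose effect is obvious

Hence the exaggerated doses: only at high
doses is the response large enough to measure
with affordable group sizes.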
To continue with our particular
example, the actual amount of
chlorinated hydrocarbon chemicals in
drinking water, the chemical
byproducts of the
chlorination process, is very
small, typically a few parts
per billion. The doses to
which the animals were
exposed are very high, many
thousands of times higher
than human exposure levels.
The high-to-low dose
extrapolation is used to
estimate the effect of the
lower dose using various
conservative assumptions.
The most important of these
assumptions is that there will
inevitably be some cancer risk
no matter how small the dose.
Third, the risk is an average
attributed risk; it applies to no
one in particular and to
everyone on the average. If
you're an average person and
you drink the expected
amount of water with the
expected concentration of
chlorinated compounds, your
possible risk is no greater
than the given risk number.
But in no way is this intended to be a
predicted risk for you individually; your
particular pattern of exposure, your
exposures to other carcinogens, your
genes, your diet, and other factors
determine your particular susceptibility.
This is the way carcinogenic risks are
determined for most regulated chemicals:
foods and cosmetics, pesticides,
household chemicals, most industrial and
workplace chemicals, air and water
pollutants, and toxics and waste site
contaminants. (Drugs and biologics are
usually regulated with human data.)
Of course, not all chemicals present a
cancer risk but they can pose other risks.
There are chemical substances that affect
developmental, reproductive,
neurobehavioral, and other body
functions. Typically, such substances
are regulated by determining "no-effect
levels" in animals and applying safety
factors. Numerical risk estimates are
not made because thresholds are
assumed. In other words, unlike for
carcinogens, risk is not assumed to be
present at all doses.
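A minimal sketch of that safety-factor
arithmetic, using the common tenfold defaults
purely for illustration:

    # "No-effect level" divided by safety (uncertainty) factors gives an
    # acceptable daily dose. The tenfold factors are illustrative defaults.
    noael = 10.0               # mg/kg/day, highest dose with no observed effect
    uf_animal_to_human = 10    # animal-to-human uncertainty
    uf_human_variability = 10  # variation among people

    reference_dose = noael / (uf_animal_to_human * uf_human_variability)
    print(reference_dose)      # 0.1 mg/kg/day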
Downhill skiers take chances voluntarily,
while risk assessment generally is concerned
with characterizing involuntary risks.
Crested Butte Mountain Resort photo.

Cancer risks of less than 10⁻⁶ per year—one
in a million per year, or roughly one in
14,000 per lifetime (7 per 100,000, or
0.007 percent)—are usually not considered
worth regulating. (Lifetime risks are
approximately 70 times higher than annual
risks if the risks are similar from year to
year for a lifetime.)
The inherent conservatism in estimates
of the cancer risk may be illustrated the
following way. Suppose you work for a
regulatory agency and you are asked for
the agency's official estimate of the
average height of a person. You
remember that the average height of an
American male is, say, 5 feet, 10 inches.
But that applies only to American men,
and probably doesn't include modern
American basketball players, some of
whom are over 7 feet.
Getting the data on all the people in
the world is impractical, but unless you
do, you can't really give a figure without
some margin of error. And the size of
that error is also impossible to
obtain. So in order to be absolutely clear
and correct in your response you decide
to give a worst-case estimate. You will
cast your response in the form: "The
average height of a man will in no case
exceed ...."
This is a very strong statement, so to
hedge your bet and to be sure you're
right, you will have to make conservative
assumptions. One you might make is
that the average height of a person will in
no case exceed the tallest person in the
world. This contains the
inherent reliability one likes to
have when called upon to
defend the regulatory
decision against tall activists.
Now, the tallest people you
know about from your
research are all less than 8
feet. But there may be giants
somewhere, and there is some
anecdotal evidence.
(Remember the stories about
"Bigfoot.") Let's assume you
find a record of a 12-foot giant
now deceased. On the
possibility that he might have
left living relatives, you
assume a maximum height of
15 feet because there is plenty
of data indicating that better
nutrition over the last 50 years
has increased the average
body size by about 20 percent.
So your official response,
supported by several pages of
data, reads:
"The average height of a
person will in no case exceed
15 feet."
This statement has all the required
regulatory qualities needed for the Federal
Register. It is impeccably correct. It will
withstand any legal challenge. It is
prudent and does not underestimate the
height. It also has at least two
undesirable qualities: It is not very
helpful. And it discriminates against
short people. (Translation: The recasting
of the regulatory problem away from
probable risk (average height) to worst-case
risk results in the underappreciation of
risk-lowering factors.)
The linear extrapolation of rodent
bioassay data embodies the regulator's
credo ("It's better to be safe than sorry")
far more than it does the scientist's
("It's better to be right than wrong").
Currently the regulatory objective
is often fulfilled at the expense of the
scientific one.
When carcinogens were few, biological
understanding of mechanisms more
primitive, and analytical sensitivity in the
parts per million range, the differences in
these two points of view were not large
and didn't really matter much. Today,
for many substances, the situation has
changed in each of these areas, and we
face an ever-growing separation between
the application of good science and
credible, efficient regulation.
The Role of Comparative
Risk Analysis
Without risk-based
priorities, we are
shooting blind
by Wendy Cleland-Hamnett
EPA's support for using comparative
risk analysis to help set the
Agency's priorities has been no
secret. Building on the lessons and
insights gained in the 1988 Unfinished
Business report, the Science Advisory
Board's 1990 Reducing Risk report, and
our experience in implementing strategic
initiatives, we have seen how valuable it
can be to have a grasp of the relative risk
of various problems in narrowing our
focus to the most important ones—
especially as fiscal reality has dictated
that we must.
Of course, other forces play critically
important roles in directing policy,
including statutory mandates, traditional
considerations of costs and benefits, the
state of technology, environmental
equity, and, above all, public values and
concerns. But comparative risk analysis,
and its promise of objective, relevant, and
even-handed guidance, has definitely
"made it to the table" at EPA. The
challenge for the Agency and its
stakeholders will be in deciding the
precise role it will play in delineating our
priorities.
The particular ways we have tried to
use relative risk and the conditions under
which we operate are not universally
understood. Some think risk ranking
affects our entire budget, and some think
it derives from a backroom dialogue with
cloistered scientific gurus. Perhaps most
often it is viewed as the only factor we
intend to include in our decision making.

(Cleland-Hamnett is Acting Deputy
Assistant Administrator for Policy,
Planning, and Evaluation at EPA.)

Enlightened public participation in risk-based
decision making is a goal in the tradition of
Jeffersonian democracy. Steve Delaney photo.
It is important to note that we have never
understood priority setting to be one
dimensional, where comparative risk
analysis is the last word. In Reducing
Risk, the SAB made it clear that
establishing the relative risks of different
environmental problems was only "one
tool" that could help make integrated
and targeted national environmental
policy a reality. It also stressed that the
"dichotomy" that exists between the
perceptions of the public and the
"experts" on which risks are important
"presents an enormous challenge to a
pluralistic, democratic country."
As good as our intentions have been
over the last few years, there is ample
room for EPA to do a better job in
meeting the challenge of piloting the
doctrine of risk through a democratic
society. A quotation from Thomas
Jefferson provides valuable insight:
I know of no safe depository of the
ultimate powers of the society but
the people themselves; and if we
think them not enlightened enough
to exercise their control with a
wholesome discretion, the remedy
is not to take it from them, but to
inform their discretion.
This piece of wisdom implies, among
other things, that a democratic
government operates at its peril if it
becomes so arrogant that it makes
important decisions without informing,
involving, and taking guidance from
average citizens, and it should never
underestimate the citizens' ability to
understand. Jefferson is warning us not
to lose touch. He is not recommending
that all technical decisions of a
Two Faces of Risk
The terms risk assessment and
comparative risk analysis are
sometimes confused. Actually, they
have very different meanings.
Risk assessment, which in
rudimentary form, at least, is older than
EPA itself, is a complex process by
which scientists determine the harm
that an individual substance can inflict
on human health or the environment.
For human health risk assessment, the
process takes place in a series of steps
that begins by identifying the particular
hazard(s) of the substance. Subsequent
steps examine "dose-response" patterns
and human exposure considerations,
and the conclusion is a "risk
characterization" that is both
quantitative and qualitative. The risk
characterization then becomes one of
the factors considered in deciding
whether and how the substance will be
regulated.
Risk assessments are not infallible.
For one thing, information on the effects
of small amounts of a substance in the
environment is often not available, and
data from animal experiments must be
extrapolated to humans. Such
extrapolations cannot be made with
absolute certainty. As described by
several authors in this issue of EPA
Journal, considerable research is being
focused on improving the risk
assessment process.
Unlike risk assessment, which for
years has provided regulators the
basis for deciding whether or not an
individual substance needs to be
controlled, comparative risk
analysis and its derivative relative
risk have arrived on the scene only
recently. Very simply described,
comparative risk analysis is a
procedure for ranking
environmental problems by their
seriousness (relative risk) for the
purpose of assigning them program
priorities. Typically, teams of
experts put together a list of
problems then sort the problems by
types of risk—cancer, noncancer
health, materials damage, ecological
effects, and so on. The experts rank
the problems within each type by
measuring them against such standards
as the severity of effects, the likelihood of
the problem occurring among those
exposed, the number of people exposed,
and the like. The relative risk of a
problem is then used as a factor in
determining what priority the problem
should receive. Other factors include
statutory mandates, public concern over
the problem, and the economic and
technological feasibility of controlling it.
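As a purely illustrative sketch of such a
ranking (the problems, scores, and weights
below are invented for the example, not an
EPA scheme):

    # Toy comparative risk ranking: score each problem on severity,
    # likelihood among the exposed, and number exposed, then sort.
    problems = {
        "problem A": {"severity": 3, "likelihood": 2, "exposed": 3},
        "problem B": {"severity": 2, "likelihood": 1, "exposed": 1},
        "problem C": {"severity": 3, "likelihood": 3, "exposed": 2},
    }
    weights = {"severity": 0.5, "likelihood": 0.3, "exposed": 0.2}

    def score(attrs):
        return sum(weights[k] * v for k, v in attrs.items())

    for name, attrs in sorted(problems.items(), key=lambda p: -score(p[1])):
        print(name, round(score(attrs), 2))

Real exercises rest on expert judgment rather
than fixed numerical weights, which is part of
what the critics quoted below object to.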
Not unexpectedly, comparative risk
analysis has its critics. As one skeptic
asked in the pages of EPA Journal two
years ago: "How does one compare a
case of lung cancer in a retired
petrochemical worker to the loss of
cognitive function experienced by an
urban child with lead poisoning? How
do we make choices between habitat and
health?" Nonetheless, in its September
1990 report Reducing Risk, EPA's Science
Advisory Board urged the Agency to
order its priorities on the basis of
reducing the most serious risks. The
board argued, in part: "... There are
heavy costs involved if society fails to set
environmental priorities based on risk. If
finite resources are expended on lower
priority problems at the expense of
higher priority risks, then society will
face needlessly high risks. If priorities
are established based on the greatest
opportunities to reduce risk, total risk
will be reduced in a more efficient way,
lessening threats to both public health
and local and global ecosystems ...."
—Eds.
Scientists and the public tend to see risks differently. For example, toxic waste sites such as
Kentucky's "Valley of the Drums" rank as a high-risk problem in public opinion polls.
Experts have rated the same problem as medium to low in risk.
Copyright 1979. The Courier-Journal, Louisville, Kentucky.
Reprinted with permission.
government agency be made only
through town meetings. But his
statement is a persuasive argument for
broader inclusion of the public in the
basic decisions that determine the
direction of all the policy minutiae that
follow.
It is becoming clearer to those involved
in this debate that risk-based decision
making should be based on a synthesis of
inputs broader and deeper than was
envisioned in the past. Risk-based
priority setting will be a major element of
the kind of informed and effective
dialogue that raises the quality of
environmental action across the board,
especially in the state and federal
legislatures. To achieve this, though, we
need a more participatory model of
prioritization—a risk system much
broader than the stereotypical one in
which "experts" make their
pronouncements about risks with clinical
dispassion; one which is an organic part
of a broad-based, decision-making
process in which equity, social concerns,
fiscal feasibility, technological innovation,
and legislative mandates are fully
considered alongside the science.
To ensure a proper place for
comparative risk in developing
environmental priorities, we must build
the strongest possible foundation of
individual risk assessments. I see three
basic guiding principles in the building of
that foundation. The first involves a
key step in the risk assessment process,
the characterization of risk. A
memorandum on the subject issued in
February 1992 provided that EPA needs
to offer more useful information when
characterizing a given risk—we need to
give more accurate predictions than a
single point estimate would allow, and
we need to evaluate more realistic
exposure situations than the unlikely
worst-case scenarios sometimes used as
the basis for policy. We must
characterize individual risks using
straightforward, consistent terminology
identifying uncertainties and data gaps
so that both experts and citizens can more
easily compare one risk to another.
This challenge remains enormously
important as more attention is focused on
the need to take into account both hard
science factors and societal elements in
the comparison of risks. The question
"What is really at stake here?" will need
to be answered realistically and usefully,
again and again, in terms that all can
understand.
The second guiding principle is the
need to bring varied expertise into the
risk assessment process from the earliest
stage. Our work in relative risk stands
much less chance of acceptance if the
common perception persists that
assessments of specific risks emerge from
a black box. Therefore, just as the whole
enterprise of priority setting needs to be
broadly inclusive, the work of our
Agency professionals in working through
the important issues of specific risks
needs to be exposed to the critical eye of
independent experts, peers, and
colleagues in their fields. This both
enhances the quality of the work and
maximizes the number of people who
understand what the work attempts to
accomplish.
The Agency's existing peer review
process should be expanded as far as
possible into the earliest segments of the
life cycle of our risk-related work, and
active peer involvement in the
characterization and assessment of
individual risks should become standard
procedure. We are implementing the
recommendations made by the SAB and
an independent panel in the March 1992
report Credible Science, Credible Decisions
by establishing science advisors for the
Administrator and Assistant
Administrators, and I hope future
administrations build on this collegial
network.
We have also participated extensively
in interagency organizations such as the
Risk Assessment Working Group of the
Federal Coordinating Council on Science,
Engineering, and Technology (FCCSET),
mindful that cross-pollinating expertise
and real coordination on cross-cutting
issues with other parts of the government
can improve the quality of our work.
This cooperation must continue and must
extend not only to specific risk
assessments but also to the important
guidelines that are establishing the state
of the art in process and methods for
cancer, noncancer, and ecological risk
end points.
The third guiding principle
we must observe in building a
foundation of credible risk assessment
is the need for basic research and
state-of-the-environment data. One of
the most fundamental reasons for the
controversy surrounding the uses of
relative risk is the persistent belief that
our risk assessments are based on default
assumptions rather than on hard facts.
Simply put, facts and hard conclusions
from data are better than estimates based
on extrapolations and interpolations.
Facts are what our research operations
must give our risk assessors if their work
is to have dependable credibility at the
priority-setting table. These facts can
then be brought to life through advanced
computer visualizations in geographic
information systems, which will allow us
to target risks and develop more
meaningful geographic strategies.
Even if these principles are followed
and our risk assessments become more
widely accepted, there will remain major
legislative barriers to the widespread use
of relative risk, which makes it
imperative that Congress be an integral
part of the dialogue. As it stands now,
EPA policy makers implementing risk-
based priority setting can have an impact
only at the margins of funding. The two
funds set up for construction of
wastewater treatment plants and the
cleanup of abandoned hazardous waste
sites under Superfund dwarf all other
EPA spending areas, accounting in fiscal
year (FY) 1990 for over 70 percent of the
Agency's $6 billion budget. Only 16
percent of the full budget is allocated
toward the higher risk areas identified by
the SAB in Reducing Risk. In FY 1992, for
example, indoor radon, indoor air,
stratospheric ozone, and climate change
accounted for a little more than 2 percent
of our total budget, although they were
listed by the SAB as high risk.
Adding to the pressure is spending for
congressional projects: In FY 1993,
Congress added about 100 specific items
while approving an essentially flat
budget from FY 1992. These new
responsibilities have to be met at the
expense of both existing Agency
priorities and new initiatives. To be sure,
Monitoring for radon in soil. Radon is another example of a "disconnect" between expert
judgment and public opinion. Experts have characterized radon and other indoor air pollution
as a high-risk problem, whereas the public regards radon as comparatively low in risk.
there is not always a direct correlation
between funding levels and results—EPA
programs such as "Green Lights" prove
that rich results can be achieved through
small budgetary investments. These
budgetary facts do indicate, however,
that comparative risk has a long way to
go before it becomes a dominant element
of priority setting at EPA.
Nevertheless, there are a number of
things going on within EPA to prepare
the way for a more inclusive and more
credible role for comparative risk in
priority setting. The current dioxin
reassessment (see article on page 24),
sparked by new findings on the
mechanisms of dioxin toxicity, has been
widely praised as evidence that the
Agency will practice what it preaches
concerning dedication to good science
and meaningful risk assessment, and has
involved both extensive peer review and
public participation. Agency
professionals used the techniques of
inclusion in the development of the
forthcoming neurotoxicity and
immunotoxicity guidelines, and EPA
EPA's IRIS Data Base:
Accessing the Science
by Linda Tuxen
"hat is the potential human
^health hazard of exposure to
benzene? What are the possible
cancer and/or noncancer effects?
One source of information on
questions such as these is EPA's
Integrated Risk Information System
(IRIS). IRIS is a data base containing
EPA consensus scientific positions on
potential adverse human health effects
that may result from exposure to
environmental pollutants. Currently,
IRIS contains information on
approximately 500 specific substances.
IRIS was created for EPA staff as the
official repository of consensus
information in 1986; in 1988 it was
made available to the public.
Background
In the 1980s, as health risk assessment
became more widely used across
Agency programs, the need for
consensus and consistency in the areas
of hazard identification and dose-
response assessment became clear. In
1986, EPA work groups were
convened to establish consensus
Agency positions on a chemical-by-
chemical basis for those substances of
common interest and to develop a
(Tuxen is EPA's IRIS Coordinator in the
Office of Health and Environmental
Assessment.)
system for communicating the
positions to EPA risk assessors and
risk managers. IRIS currently
contains summaries of EPA human
health hazard information that
support two of the four steps—hazard
identification and dose-response
evaluation—of the risk assessment
process.
Since IRIS was developed in 1986,
and made available to the public in
1988, its use by EPA and by the
environmental health community, in
general, has grown substantially.
EPA uses the data base to provide
consistent risk information across
programs and regions. States,
national and international
organizations, and other public and
private organizations involved with
assessing potential health hazards of
exposure to a variety of
environmental contaminants use IRIS
as a source for EPA scientific opinion.
EPA's goal is that IRIS contain high
quality human health information,
based on credible science.
Development of IRIS
Health Hazard Information
Two EPA work groups—the
Carcinogen Risk Assessment
Verification Endeavor (CRAVE) and
the Oral Reference Dose/Inhalation
Reference Concentration Work
Group—develop the consensus health
hazard information for IRIS. Each
group consists of EPA scientists from a
mix of disciplines and EPA program
areas. The work groups serve as the
Agency's final review for EPA health
hazard information.
When the work groups reach
consensus for a particular substance,
they add a descriptive summary to
IRIS. The information may become
part of the supporting materials used
to develop EPA health hazard
assessments. Combined with specific
assessment information on situational
exposure, the data may be used in
evaluating potential public health
risks to environmental contaminants.
The summary directs users to the
underlying animal and human data on
which this risk information is based.
IRIS risk information is not a risk
assessment or a risk management
judgment; it should be used carefully
and with scientific judgment.
Data Base Contents
IRIS comprises three sections:
noncancer health effects resulting from
oral exposure, noncancer health effects
resulting from inhalation exposure,
and carcinogen assessment for both
oral and inhalation exposure.
Generally, the information is for
chronic effects, those that may result
from lifetime exposure to a given
substance or mixture.
IRIS contains full bibliographic
citations for each substance file,
directing the user to the primary cited
studies and pertinent scientific
literature. In addition, IRIS substance
files may contain one or more of three
supplementary information sections: a
summary of the Office of Water's
Drinking Water Health Advisory, a
summary of EPA regulatory actions, and
a summary of physical/chemical
properties.

EPA Review of
Some Aspects of IRIS

As part of an Agency-wide effort to
improve the quality of science used to
evaluate and manage risks, EPA has
initiated a review that seeks ways to
improve the way IRIS information is
developed and the way it is used by risk
managers. The intent of the review team
is to study the entire IRIS process, from
nomination of substances through
delivery of information.
As a first step, the team is focusing on
public involvement and external peer
review. Currently, peer review of the
technical bases for IRIS descriptive
summaries is undertaken by EPA
scientists familiar with the particular
substance. Many undergo external peer
review as well, from groups ranging
from the Agency's Science Advisory
Board and the Office of Pesticide
Programs' Scientific Advisory Panel to
specially convened panels and
workshops.
EPA seeks ways to increase and
improve public involvement and external
peer review. The Agency wishes to
identify mechanisms that can involve
qualified outside scientists and members
of the public in improving the quality of
information in IRIS, while not unduly
delaying the process of adding critical
new information to the data base.
As part of the information gathering
effort supporting the Agency's review
of IRIS, a Federal Register notice was
published on February 25, 1993
(58 FR 11490) that, among other things,
solicits public comments on the
review. The notice also announces the
availability of a background paper
describing IRIS, its contents, and the
current processes used by the two
Agency work groups responsible for
developing the IRIS information.
Interested persons are encouraged to
obtain a copy of the IRIS Federal
Register notice and companion
background paper.
IRIS will continue to be an
important EPA resource and a
successful public product of the
Agency. The consequence of that
success is a substantial responsibility
to provide the highest possible quality
and credible information to IRIS users.
That responsibility is taken seriously
by the Agency as evidenced by the
current self-appraisal activities.

(Editors' Note: For further information on
how to access IRIS and for a copy of the
Federal Register Notice on IRIS and the
IRIS Background Document, contact:
IRIS User Support (staffed by Computer
Sciences Corporation), EPA Office of
Research and Development,
Environmental Criteria and Assessment
Office (MS-190), Cincinnati, OH 45268,
Telephone: (513) 569-7254, Fax: (513)
569-7916.)

demonstrated its willingness to reach out
to peers outside the Agency by involving
a professional association, the Society for
Risk Analysis, in issues associated with
the cancer risk-assessment guidelines.
Successful development of an inclusive
system of risk assessment at EPA in the
future will require sustained attention to
some very ambitious and large-scale
initiatives. The principles in the February
1992 memorandum on risk
characterization must be fully
implemented. Development of
information for EPA's computerized
health risk-assessment database—the
Integrated Risk Information System, or
IRIS—is currently subject to very little
peer review or public involvement, and
this vulnerability must be addressed if
we are to bring the credibility of the
information up to par with the influence
that this very important database has
developed since it became public in
1988. (See box above on IRIS and its
present reassessment.)
The Environmental Monitoring and
Assessment Program, or EMAP, is a
centerpiece of the Agency's enhanced
focus on risks to the ecological health of
regions and ecosystems. Yet the
massive amounts of information gained
from the environmental indicators it
monitors form so broad a cut of cloth
that there is a real challenge to link this
information to new, concrete
understandings about risk, and then to
develop indicators that measure the
progress of our prevention programs.
And, in the wake of the June 1992
United Nations Conference on
Environment and Development in Rio
de Janeiro, risk assessment will
necessarily be an international issue as
well, and the Agency will have a direct
stake in the attempt to build upon the
work of the Organization for Economic
Cooperation and Development (OECD)
in coordinating risk
assessment activities by scientists and
governments around the globe.
Government cannot do everything,
and the question of which things matter
the most is inevitable. Environmental
decision making in a democracy is not a
math problem. As a result, comparative
risk will never be the only criterion for
setting priorities. But if EPA's findings
are built upon a foundation of good
science and the public is fully informed
and involved in the dialogue, then
comparative risk will be an increasingly
important factor. By building
integrated strategies based upon solid
facts, and by harnessing the power of
communities and markets, I am
absolutely confident EPA can stimulate
entire new generations of clean
production and give new expression to
the concept of "sustainable
development." D
CHANGING PROFILES
A Flagship
Risk Assessment
EPA reassesses dioxin in an open forum
by Peter W. Preuss and William H. Farland
Every day brings new advances in
toxicology, biochemistry, molecular
biology, and other sciences that add
to our understanding of the interactions
between human activity and the
environment. Occasionally, these
advances give us much insight into
complex issues, insight whose
implications may greatly alter the way
we conduct our work.
Such advances have occurred with
respect to scientific data on the effects of
dioxin, one of the most prominent
environmental health issues of the past
decade. EPA began to assess the risks of
dioxin in the early 1980s; these efforts
resulted in a 1985 risk assessment that
classified dioxin as a probable human
carcinogen, primarily based on findings
from animal studies available at that
time.
Two recent scientific advances caused
us to consider reexamining our position.
The first was an October 1990 meeting at
the Banbury Center in Cold Spring
Harbor, New York, at which some 30
prominent experts reached consensus as
to the probable processes through which
dioxin causes toxic effects in humans and
animals. The second was the January
1991 publication of a major National
Institute for Occupational Safety and
Health (NIOSH) study of cancer
mortality in U.S. chemical workers
exposed to dioxin.

(Preuss is Director of the Office of Science,
Planning, and Regulatory Evaluation, and
Farland is Director of the Office of Health and
Environmental Assessment in EPA's Office
of Research and Development.)
The implications of these advances
were uncertain. Some argued that they
meant the risks weren't as high as
previously estimated. Some made the
opposite argument. A significant amount
of new data had to be obtained and
analyzed to reach a conclusion.
After much discussion, we in EPA
concluded that we should proceed with a
reassessment. We also concluded that
since the question of dioxin's risk had
been marked by considerable controversy
for more than a decade, we should
pursue a process that would achieve
scientific consensus on this issue.
As a rule, our risk assessments—such
as the original assessment of dioxin—had
been written by our own scientists. The
public and the broad, outside scientific
community were not involved until the
assessment was sent to the Science
Advisory Board (SAB) for peer
review—the last step before the risk
assessment became a final document. By
that time, the process had advanced too
far for the Agency to seek a consensus by
the mainstream scientific community.
We decided from the outset to make
the dioxin reassessment as open and
participatory as possible. As a first step,
it would be conducted as a cooperative
effort, written by both EPA scientists and
external scientists and peer-reviewed by
scientists outside the Agency who were
experts on dioxin. We hoped this would
help ensure not only that the most
current, most scientifically accepted
information was used, but also that all
scientific views would be heard and
debated. We also deemed it critical to
keep the public informed and involved
during the process by announcing the
reassessment, holding public meetings,
and using peer-review workshops to
evaluate the reassessment's progress.
Specifically, we asked seven prominent
outside scientists to author chapters
assessing the potential effects of dioxin
24
EPA JOURNAL
-------
on human health. These chapters would
reflect the latest, peer-reviewed, scientific
information; they would address general
toxicity, reproductive and developmental
effects, carcinogenicity, immunotoxic
effects, disposition and pharmacokinetics
(what happens to dioxin in the body),
epidemiology, and dioxin's mechanisms
of toxic action (the biochemical reasons
why dioxin is toxic).
At the same time, we enlisted other
outside scientific expertise to develop a
biologically based dose-response model
that would advance our understanding
of how dioxin acts at the cellular level at
various doses. And, we asked our own
EPA scientists to update our
understanding of environmental
exposures of dioxin to humans.
Work began in May 1991. We held
public meetings in November 1991 and
April 1992 to report progress. Draft
chapters were completed by the outside
authors and made available to the public
in late summer and fall of 1992. At a
public meeting in September 1992, we
convened a panel of outside scientists to
hear comments by the scientific
community in a peer review of the draft
exposure document. That same month,
we called a similar meeting of other
outside experts to review the draft
health assessment chapters and
to review progress in developing the
dose-response model.
Immediately thereafter, members of
the health panel and the chairman of the
exposure panel—all of whom by this
time had the benefit of understanding the
latest science in these areas—began to
formulate a summary statement
regarding the potential health impacts of
dioxin. As of this writing, they had not
completed the statement; however, the
salient aspects of the public deliberation
were as follows:
• Risk characterization should
encompass the broad range of health
effects attributable to dioxin exposure,
not focus primarily on cancer as was the
case in the previous dioxin assessment.
• Certain noncancer effects—including
changes in endocrine function associated
with reproductive function in animals
and humans, behavioral effects in
offspring of exposed animals, and
changes in immune function in
animals—have been demonstrated.
Some data suggest that these effects may
be occurring in people at body burden
levels that can result from exposures at or
near current background levels.
• Although recent epidemiology studies
indicate that dioxin and related
compounds may be carcinogenic in
humans, a focused review of those
studies by a panel of epidemiologists is
required. EPA should then reconsider its
current classification of dioxin, which is
based primarily on the results of
laboratory animal studies.
• Based on the key role played by a
cellular protein, called the "Ah receptor,"
in the sequence of biochemical reactions
that may lead to toxic effects from dioxin
and related compounds in the body, the
dioxin risk characterization should
consider the full range of compounds
that bind to this protein after exposure.
Additional work will be needed to better
understand the impact of dioxin-like
polychlorinated biphenyls (PCBs).
• Additional data are needed for
application of biologically based
statistical models for predicting
carcinogenic effects of dioxin and related
compounds. Studies under way at the
National Institute of Environmental
Health Sciences and EPA may provide
the needed data in early to mid-1993.
• Available data on early steps in the
responses to dioxins in human cells are
largely consistent with the results of
mathematical calculations used to predict
the effects of exposure at low doses.
However, predictions cannot be made
with certainty about cancer effects at low
doses.
• Risks from everyday background
levels of dioxin in the general population
need to be carefully considered.
We want to emphasize that these
points are our interpretation of the
discussions and should not be viewed as
final. These views may be modified as
the summary report is completed.
The Agency is considering comments
from the reviews of the individual
chapters, as well as the above preliminary
advice, and the chapters are being
modified as appropriate. An additional
public peer review of the epidemiology
chapter is being organized. Our schedule
for forwarding a draft risk
characterization to the SAB has been
moved back to May 1993 in the interests
of making sure that the report represents
the best possible estimate of the health
effects of dioxin.
Once the draft risk characterization has
been reviewed and approved by the SAB,
it will become EPA's scientific position on
this subject and will form the basis for
changes, if any, by the Agency in its
regulatory policies. We plan to make the
draft available in May 1993.
This open scientific process has thus
far been well received by the public.
More than 70 people attended the
exposure assessment peer review,
and more than 130 participated in the
health assessment peer review.
Scientists have applauded our
extensive effort to solicit information
and comment from outside EPA.

Soil sampling in Times Beach, Missouri. Times
Beach was evacuated in 1983 following the
discovery of dioxin contamination after the
Meramec River flooded the town. Dioxin-tainted
oil had previously been sprayed on roads and
parking lots to control dust.
Once the reassessment is complete, we
will evaluate the success of this new
approach. On the basis of the results that
we've seen so far, we are enthusiastic that
we have developed a procedure that will
be key to ensuring public confidence in
the soundness of our science in major
future risk assessments.
The Search for Answers: A Dioxin Timeline
What degree of risk does dioxin pose to human health? What are the intricate molecular and cellular processes that may
lead to biologic effects from dioxin in the human body? How prevalent is dioxin in the environment? These questions
have been widely debated by scientists, policy makers, and the general public ever since dioxin emerged as a leading health
issue with the evacuations of Love Canal in Niagara Falls, New York, in 1980 and Times Beach, Missouri, in 1983. Here, briefly, is
a chronology of EPA's efforts to find the answers:
1980-1985
EPA issues its first dioxin risk assessments.
These focus primarily on data pertaining to
carcinogenicity and, based largely on animal
studies, classify dioxin as a probable, highly
potent, human carcinogen.

1988
EPA issues a draft revised assessment,
based primarily on scientific judgment, to
suggest that dioxin may be less potent
than indicated by the 1985 assessment.
The draft also reports general agreement
in the scientific community that standard
procedures are inadequate to assess
dioxin's human health risks. However, the
draft also notes a lack of agreement
among scientists as to a better alternative.
Later in 1988, an SAB panel finds that
no scientific basis exists for revising the
1985 dioxin potency estimates. However,
to provide a better understanding of
potential risks, it recommends that a new
model be developed for assessing dioxin,
based on the multiple biological responses
that can be detected after exposure.

October 1990
Leading scientific experts meet at the Banbury
Center of the Cold Spring Harbor Laboratory.
They agree that effects of dioxin in humans
can be predicted from its effects in animals,
and that a risk assessment model based on
dioxin binding to a specific cellular "receptor"
in the body should be developed.

January 1991
NIOSH publishes a major cancer mortality
study of dioxin-exposed workers that adds
new data for assessing human health risks.

April 1991
Citing the results of the Banbury
Conference and the NIOSH study, the
EPA Administrator directs EPA to
work closely with the broader scientific
community in reassessing the full
range of dioxin risks. The task will
include the development of a
biologically based dose-response
model, laboratory research in support
of that activity, update of EPA's health
assessment, update of the Agency's
exposure assessment document, and
support for research to characterize
risks in aquatic ecosystems.
The health and exposure
assessments will largely be written
and peer-reviewed by prominent
scientists from outside EPA.

November 1991 and April 1992
EPA convenes public meetings to report on
progress and to receive public comment.

August 1992
EPA begins to issue draft health and
exposure assessment documents and
announces two public meetings in
September for scientific review of the drafts.

September 10-11, 1992
EPA holds a peer-review workshop
on the exposure draft document in
which more than 70 members of the
public participate.

September 22-25, 1992
EPA convenes a peer-review workshop
on the health assessment draft document;
more than 130 members of the public
attend. On September 24-25, in the same
public forum, the participating scientists
discuss and summarize their thoughts on
key features that should be included in a
subsequent EPA document characterizing
the risks of dioxin.

Next Steps
EPA expects to submit a risk characterization and
revised drafts of the health and exposure
assessment documents to SAB in spring 1993.
The public will also be asked to comment on the
revised drafts. The Agency anticipates that the
reassessment will be completed by fall 1993, and
that risk management issues and options will then
be taken up within the Agency and in public forums.
CHANGING PROFILES
Breakthroughs in
Cancer Risk Assessment
With new tools, scientists are learning
more about how cancer occurs
by Stephen Nesnow

(Nesnow is Chief of the Carcinogenesis and Metabolism Branch at EPA's Health Effects Research Laboratory in Research Triangle Park, North Carolina.)
Scientists are continually trying to
improve cancer risk assessment by
incorporating new information on
the cancer process and on how different
carcinogens affect the process.
Preferably, cancer risk assessments
would be based on epidemiological
studies, studies that link actual human
cancer cases with human exposure to
specific agents. More often than not,
however, such information is not
available, and risk assessments are made
by extrapolating from experimental
results on laboratory animals to the
human situation.
Many uncertainties are inherent in
both approaches. Epidemiology studies
depend heavily on accurate assessments
of human exposure. Generally, these
assessments rely on external exposure measurements, which do not account for what happens to a carcinogen once it enters the body, a factor that can greatly
influence the quantitative exposure-to-
tumor relationship. Extrapolations from
laboratory data entail even greater
uncertainties. For example, we cannot be
entirely sure that the carcinogen-induced
process in the animals, the so-called
mechanism of action, has relevance to
humans.
Recent scientific discoveries are
helping us to refine our estimates of
human exposure to carcinogens and to
better understand the mechanisms of
action of carcinogens both in
experimental animals and in humans.
While there is still much to learn, we
know that the induction of cancer by
chemicals is a very complex process.
Cancer, a set of diseases characterized by
uncontrolled cell growth, is thought to be
the result of a process involving multiple
steps. Carcinogens can initiate the
process; they can also influence the
development of cancer in several of the
steps. Each alteration can trigger a new
cascade of events that may eventually
lead to tumor formation.
At one time, it was thought that the
process of cancer induction by chemicals
was similar for many carcinogens. We
know now that this is probably not
correct: Carcinogens vary in how they
initiate, alter, and otherwise affect the
steps of the cancer process.
Moreover, research has now identified
many cellular targets and biochemical
and biological processes that carcinogens
can affect in such a way as to result in the
eventual formation of tumors. One target
is DNA, which contains the genes that
control cell growth; another can be the
biochemical processes involved in cell
growth, cell growth regulation, cell
signaling, and cell-to-cell communication.
Other targets of chemical carcinogens
may be processes involved in cell toxicity
and death or processes that alter
hormone levels. Still others may be
receptors involved in cell growth,
enzymes that metabolize carcinogens, the
immune system, and the systems that
allow cells to repair damage caused by
carcinogens.
One major scientific advance lies in
clarifying the body dose absorbed by
humans following exposure to a
carcinogen. Recently, mathematically
based computer models have been
developed that can help predict the dose
of a carcinogen or its metabolites in
specific animal and human organs and
tissues after exposure. These models are
used mainly to clarify tissue dose at the
site of tumors resulting from external
exposure. Referred to as physiologically
based pharmacokinetics (PB-PK) models,
they incorporate anatomical,
physicochemical, biochemical, metabolic,
and physiological parameters specific for
the route of administration, frequency
and duration of exposure, the carcinogen,
and the test animal.
The models are initially developed
using animal exposure data; after their
validation in animals, they are applied to
humans to obtain target organ doses.
They may then be used to predict the
magnitude and time course of human
organ or tissue target doses under
different exposure scenarios—for
example, acute or chronic exposure
conditions. The models are also used to
extrapolate from animal data (usually
obtained at high doses) to humans
(generally exposed at much lower doses)
and to extrapolate from data obtained
from one route of exposure (e.g.,
ingestion) to another (e.g., inhalation).
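To make the idea concrete, here is a minimal two-compartment sketch in Python. It illustrates the PB-PK concept only, not any of the Agency's actual models; the volumes, flows, partition coefficient, and clearance value are all hypothetical.

```python
# Minimal two-compartment PB-PK sketch: blood plus a metabolizing
# liver compartment, integrated by forward Euler. All parameter
# values are hypothetical, chosen only to illustrate the idea.

def simulate(dose_mg=10.0, hours=24.0, dt=0.001):
    v_blood, v_liver = 5.0, 1.8   # compartment volumes (L)
    q_liver = 90.0                # liver blood flow (L/hr)
    partition = 4.0               # liver:blood partition coefficient
    k_met = 15.0                  # hepatic metabolic clearance (L/hr)

    a_blood, a_liver = dose_mg, 0.0   # amounts (mg); bolus into blood
    for _ in range(int(hours / dt)):
        c_blood = a_blood / v_blood
        c_out = (a_liver / v_liver) / partition  # venous conc. leaving liver
        uptake = q_liver * (c_blood - c_out)     # net flow into liver (mg/hr)
        metabolized = k_met * c_out              # loss to metabolism (mg/hr)
        a_blood -= uptake * dt
        a_liver += (uptake - metabolized) * dt
    return a_blood / v_blood, a_liver / v_liver

blood, liver = simulate()
print(f"After 24 hr: blood {blood:.4f} mg/L, liver {liver:.4f} mg/L")
```

A real PB-PK model would add compartments for fat, kidney, lung, and other tissues and would calibrate every parameter against measured data, but the bookkeeping is the same: mass moving among compartments by blood flow, partitioning, and metabolism.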
While predicting the tissue or organ
dose of carcinogens is essential to
assessing risk, we must also measure the
binding of carcinogens to the DNA.
There is ample evidence to suggest that
some carcinogen-bound DNA forms,
called carcinogen-DNA adducts, can
cause heritable changes (e.g., mutations
or chromosomal alterations) in the DNA,
and that these changes can eventually
result in tumor formation. Since DNA
can be a primary target for carcinogens,
the ability to identify and measure carcinogen-DNA adducts will assist in better characterizing human exposure.

Cancer Risk Guidelines Under Review

EPA, working in cooperation with the Agency's Science Advisory Board, is in the process of revising and updating its 1986 Guidelines for Carcinogen Risk Assessment. Several activities are part of this process:

• Currently, a discussion, or "working," paper on cancer risk assessment issues prepared by Agency scientists is being circulated among outside scientists for comment.

• Last December, EPA, California, and the Society for Risk Analysis held a workshop on carcinogen risk and guideline development. The workshop focused on the following issues: statistical "meta-analysis" of human epidemiological data, risk characterization methodology, and the use of new advances in molecular biology in identifying and quantifying cancer risks.

• This May, the National Academy of Sciences will issue a report to Congress—required under the 1990 Clean Air Act—on EPA's risk assessment methods, emphasizing cancer hazard and air exposure assessment.

EPA is aiming to publish the proposed revisions in the Federal Register for public comment by fall 1993.

—Eds.
There is a new tool for this purpose: the 32P-postlabeling assay for DNA adducts, developed by scientists at Baylor College of Medicine, which can measure carcinogen-DNA adducts obtained from exposed humans at extremely high sensitivity:
Approximately one carcinogen-DNA
adduct per single human cell can be
detected. This exquisitely sensitive
technique significantly advances our
understanding of the basic chemical
carcinogenesis process as it relates to
alterations in DNA. Studies are now in
progress in which cigarette smokers or
occupationally or clinically exposed
individuals are being monitored for the
presence of specific DNA adducts. Their
purpose is to explore the sensitivity and
selectivity of this assay, to further refine
it, and to examine the relationships
between exposure and DNA adducts.
Not only can the 32P-postlabeling assay
be used to further refine the estimate of
human exposure to chemical carcinogens,
but it can be applied to studies of
experimental animals. Similarities in
carcinogen-DNA adduct patterns, or
"fingerprints," obtained from animals
and humans exposed to the same
carcinogen can be used to strengthen the
links between animal cancer data, its
extrapolation to humans, and its use in
risk assessment. In a related approach,
techniques to detect, identify, and
quantitate carcinogens bound to proteins,
particularly blood proteins (e.g.,
hemoglobin), are being developed and
evaluated for their ability to more
accurately assess exposure in humans.
To relate carcinogen exposure to
biological effects in humans, we need to
know the biological consequences of the
exposure on DNA and chromosomes.
Fundamental advances in the
understanding of the cancer process have
come from studies in molecular biology.
Scientists have discovered two classes of
cancer genes—protooncogenes and
tumor suppressor genes—that are
involved in the normal control of cell
growth and cell differentiation
(conversion of one cell type to another).
When mutated or genetically altered by
carcinogens, they can cause abnormal cell
growth and differentiation, and they are
therefore implicated in the pathogenesis
of tumors.
Genetically altered protooncogenes
and tumor suppressor genes have been
found in the DNA from several types of
human tumors, as well as in the DNA of
tumors induced in experimental animals.
Identifying such changes in tumors of
workers in high risk populations,
smokers, or individuals exposed to high
levels of dietary carcinogens may set the
stage for their use in cancer risk
assessment.
A new molecular technique has had a significant impact on this area of
research. The polymerase chain reaction,
or PCR, is a semi-automated technique
whereby very small amounts of DNA (or
DNA fragments) obtained from almost
any tissue (e.g., blood, skin, tumors) can
be amplified as much as a million-fold.
This allows detailed and complex
investigations to be performed on the
genetic changes that the DNA has
undergone, including changes in specific
protooncogenes and tumor suppressor
genes.
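The million-fold figure follows from simple doubling arithmetic; a few lines of Python (an idealized calculation, since real reactions fall short of perfect doubling) make the point:

```python
# Idealized PCR arithmetic: each thermal cycle doubles the template,
# so n cycles yield at most 2**n copies of the starting DNA.
for cycles in (10, 20, 30):
    print(f"{cycles:2d} cycles -> up to {2 ** cycles:,}-fold amplification")
# 20 cycles already pass a million-fold: 2**20 = 1,048,576
```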
New molecular tools can also be used
to help identify high risk groups as part
of the risk assessment process. We
already know that certain population
groups can be more susceptible than
others to certain risks. In the risk
assessment supporting the National Ambient Air Quality Standard for lead, for example, EPA identified children as a high-risk
group that was especially vulnerable to
neurological and hematological
impairment. We know that factors such
as genetics and nutrition play an
important role in determining the
susceptibility of population groups to
environmental carcinogens and that these
factors contribute to variability in the
risks posed to different groups.
Knowledge of the mechanisms of action of carcinogens in experimental animals is important not only in identifying carcinogens that may pose a hazard to humans, but also in deriving the methods by which animal data are extrapolated to predict human tumor rates. Data from experimental animals are usually obtained at high exposures to maximize the probability of an effect;
human exposure is often many orders of
magnitude lower. The extrapolation of
high to low dose has been accomplished
either through statistical models, which
assume that humans vary in their
susceptibility to carcinogens, or through
stochastic (random event) models, which
assume that everyone is equally
susceptible and that cancer occurs after
one or more randomly occurring
independent events. Neither approach is
grounded in the biology of the cancer
process as we know it today.
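A sketch of the stochastic approach shows how such an extrapolation works in practice. The one-hit model below, applied to a hypothetical bioassay result, is one of the simplest members of this family:

```python
import math

# One-hit stochastic model: P(d) = 1 - exp(-q*d). Everyone is assumed
# equally susceptible, and cancer follows a single random event.
# Hypothetical bioassay point: 50% tumor incidence at 100 mg/kg/day.
d_animal, p_animal = 100.0, 0.50
q = -math.log(1.0 - p_animal) / d_animal   # fitted potency parameter

# Extrapolate four orders of magnitude down to a human-scale dose.
d_human = 0.01   # mg/kg/day
p_human = 1.0 - math.exp(-q * d_human)
print(f"Estimated risk at {d_human} mg/kg/day: {p_human:.2e}")
# At low doses the model is nearly linear: P(d) is approximately q*d.
```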
A new class of biologically based dose-
response models is being developed that
incorporates the ability of carcinogens to
alter DNA and the effects of carcinogens
on the processes that control cell growth
and cell death. Although these models
are in the early stages of development,
they can already explain the increase in
human cancer rates with age, the
decrease in tumor rates after cessation of
smoking, and the effects of hormones in
breast cancer. These phenomena are not
readily explained by the non-biologically
based statistical or stochastic models.
In short, the complexity of the cancer process continues to pose challenges to the practice of risk assessment. However, recent exciting advances in our basic knowledge of how carcinogens induce cancer and the development of new tools and models that can assist in characterizing exposure and effects will undoubtedly improve cancer risk assessment in the future. D
CHANGING PROFILES
Update on
Noncancer Assessments
New initiatives are improving
the science base and the process
by V. L. Dellarco and C. A. Kimmel

(Dellarco is Chief of the Genetic Toxicology Assessment Branch in EPA's Office of Health and Environmental Assessment within the Office of Research and Development. Kimmel is a senior-level developmental toxicologist within the same office.)

Although public attention tends to be drawn to cancer, many other serious health effects with long-term consequences may result from exposure to environmental pollutants. In its health assessments of pollutants, EPA considers the potential
for adverse effects such as reproductive
impairment, birth defects, genetic
damage, neurological and immunological
disorders, and respiratory and liver
diseases. For example, in the control of
air pollutants—ozone, carbon monoxide,
lead, and oxides of sulfur and
nitrogen—the Agency has long been
concerned with potential adverse
respiratory and neurological effects.
(See box, below, for other examples of
risk management decisions that were
driven by health risk concerns other
than cancer.)
Health risk assessment is an integral
part of making regulatory or risk
management decisions. As explained in
the article beginning on page 10, risk
assessment combines an evaluation of the
health effects data with the human
exposure estimates to determine the
potential risk to humans of exposure to a
chemical substance. The evaluation of
health effects is usually based on the
results of laboratory tests, but may also
Actions Based on Health Effects Other Than Cancer
• Acrylamide has been observed to
cause cancer in animals at high levels of
exposure; neurotoxic effects have been
observed at lower levels. To protect
workers from the neurotoxic risks
associated with dermal and inhalation
exposure, EPA recently proposed a ban
on grouts containing acrylamide and N-
methylolacrylamide. The grouts are
used, for example, in sealing water
mains.
• Lead has been regulated on the basis
of its developmental neurotoxic effects.
Exposure of fetuses, infants, and young
children to very low levels of lead can
have a subtle but long lasting effect on
the child's IQ. In the late 1970s, as a
consequence of lowering lead levels
in gasoline, both ambient air levels
and blood levels showed a correlated
decrease. Another major source of
lead is paint used in older housing.
EPA, the Department of Housing and
Urban Development, and the
Department of Health and Human
Services are actively pursuing risk
reduction programs for lead, since it
is estimated that one out of every six
children in this country has a blood
lead level that exceeds the Centers for
Disease Control's current
intervention level of 10 micrograms
per deciliter.
• Dinoseb-containing pesticide
products, used to control broadleaf
weeds and vegetation growth on
certain crops, were suspended in 1986
and canceled in 1988 after EPA
reviewed data showing that dinoseb
was acutely toxic to humans and had
the potential to cause birth defects and
reduce male fertility.
• The Agency is reassessing the data
on dioxin, a potent animal carcinogen
and suspected human carcinogen. It is
also considering health effects such as
immunotoxicity and developmental
and reproductive toxicity, which have
been observed at very low exposure
levels in animals.
include human data, when available.
Many types of health effects must be
considered in the assessment of
noncancer risks. The term "noncancer
risk" encompasses a wide range of
responses, including adverse effects on
specific organs or organ systems, effects
on reproductive capacity, the viability
and structure of developing offspring,
and survival. To manage the
complexities of these systems, EPA has
developed guidelines that define the toxicity "end points" of concern in each area, establish the methodology and assumptions used in assessing risk, and promote a consistent approach to data evaluation and estimation of risk. (See box on guidelines.)
EPA's risk assessment guidelines are
meant to be "living documents," that is,
flexible enough to be revised to keep pace
with advances in science. Currently,
health risk assessment at EPA is evolving
on a number of fronts. The following
discussion touches on some of the major
issues in risk assessment for health effects
other than cancer and on some of the
Agency's research efforts for developing
new risk assessment methods.
Scientists seldom have at hand direct
evidence that pollutants cause particular
adverse effects in humans. It is usually
necessary to rely on laboratory animal
data to identify potential hazards and to
estimate the amount of risk associated
with exposure to a given pollutant. This
approach has several shortcomings. For
example, responses to a toxic agent may
vary between laboratory animals and
humans because of differences in
metabolism and genetically determined
sensitivity, or differences in the route,
timing, and duration of exposure.
Moreover, to ensure that adverse effects
can be observed, exposures in animal
studies are set high, usually higher than
humans normally encounter. In addition,
the exposure regimen in animal studies
may not replicate the conditions of
human exposure.
To address the uncertainties inherent
in extrapolating from high to low doses
and from animals to humans, EPA has
established a program known as
Research to Improve Health Risk
Assessment (RIHRA). RIHRA, in concert
with other initiatives, is providing the
foundation for developing improved
exposure, pharmacokinetic, and dose-
response models by generating such
information as mechanisms of toxicity
Health Risk Assessment Guidelines
EPA has published or has under development several specific guidelines for
assessing health risks other than cancer. These guidelines are meant to be
"living documents," that is, flexible enough so that they can be revised to keep
pace with advances in science:
• Mutagenicity (published 1986)
• Developmental toxicity (published 1986, amended in 1991)
• Male/female reproductive toxicity (publication expected in 1993)
• Neurotoxicity (proposal publication expected in 1993)
• Immunotoxicity (under development)
• General Quantitative Methodology (under development).
and relative sensitivity across species.
Following are a few examples of current
research initiatives.
Adverse effects can be elicited in some
cases after only one or a few periods of
exposure; in others, longer term exposure
is required. An important challenge is to
develop an understanding of how
variation in exposure scenarios affects the
nature of toxicological outcomes and the
amount of risk to humans. Research is
underway to examine the effects of short-duration exposure and of the relationship between exposure level and exposure duration. This research includes the development of models to predict effects at exposure levels and durations other than those evaluated experimentally.
The current practice of estimating risk
for health effects other than cancer is to
begin with a "no-observed-adverse-
effect-level" (NOAEL), which is usually
determined from animal studies. This
NOAEL is divided by numerical
"uncertainty factors" to reflect
uncertainties inherent in the
extrapolation from laboratory animal
data to humans and in the variations in
sensitivity among members of the human
population. The result is a reference dose
(RfD)—or reference concentration for
inhalation exposure—which is assumed
to be without appreciable risk for adverse
health effects. There are several
limitations to this approach, one of which
is that it does not provide a basis for
estimating risk at exposures above the
RfD. This limitation is particularly
problematic when one asks the question: How much residual risk, if any, remains after enforcement of control technologies to reduce emissions of pollutants? This is
the approach to controlling air toxics
called for under the Clean Air Act
Amendments of 1990.
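The arithmetic of the current practice is straightforward; the sketch below uses illustrative numbers, not values for any actual chemical:

```python
# RfD = NOAEL / (product of uncertainty factors), illustrative values.
noael = 50.0                 # mg/kg/day, from a hypothetical animal study
uf_interspecies = 10.0       # animal-to-human extrapolation
uf_intraspecies = 10.0       # variation in human sensitivity
rfd = noael / (uf_interspecies * uf_intraspecies)
print(f"RfD = {rfd} mg/kg/day")   # 0.5 mg/kg/day
# The limitation noted above: this yields a single reference level and
# says nothing about how much risk attaches to exposures above it.
```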
EPA is supporting a number of
projects to advance quantitative methods
of estimating risk, including both
statistical and biologically based dose-
response models. For example, as a first
step, the Agency is exploring, under the
RIHRA program, the use of a
"benchmark dose" to replace the
NOAEL. Unlike the NOAEL, the
benchmark dose is a quantitative
estimate that applies a mathematical
model to take into account all of the dose-
response information provided by a
study. The benchmark dose for a specific
level of response (risk) is then divided by
uncertainty factors to calculate the RfD.
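Here is a rough sketch of the benchmark-dose idea, using a simple one-hit curve and a hypothetical data set; the Agency's actual work would use more sophisticated models and fitting methods:

```python
import math

# Hypothetical dose-response data: (dose in mg/kg/day, fraction responding).
data = [(0.0, 0.02), (10.0, 0.10), (30.0, 0.28), (100.0, 0.65)]
p0 = data[0][1]   # background response rate

# Fit P(d) = p0 + (1 - p0) * (1 - exp(-b*d)) by least squares over a
# coarse grid of b values, using every data point rather than a NOAEL.
def sse(b):
    return sum((p0 + (1 - p0) * (1 - math.exp(-b * d)) - p) ** 2
               for d, p in data)

b = min((i * 1e-4 for i in range(1, 2001)), key=sse)

# Benchmark dose for 10% extra risk, then divide by uncertainty
# factors (10 x 10 here) to obtain an RfD-like value.
bmd10 = -math.log(1.0 - 0.10) / b
print(f"BMD10 = {bmd10:.1f} mg/kg/day; BMD10/100 = {bmd10 / 100:.3f} mg/kg/day")
```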
Finally, to improve hazard
identification and quantification of health
risks, we must develop and incorporate
an understanding of the molecular
EPA is advancing research on health risks other than cancer,
including reproductive effects and effects on the unborn.
Steve Delaney photo.
mechanisms of toxicity. For example,
it is becoming increasingly clear that
certain pollutants have the ability to
interfere with some developmental and
cell regulatory events that are controlled through hormones. Scientists are finding that organochlorines and other chemicals
can harm a fetus and cause reproductive
impairment or other toxicities by
mimicking natural hormones. Even
minute exposures to these chemicals may
produce toxic responses. Also, major
advances are being made in identifying
the genes that control embryonic
development. Recent work has shown
that there are natural agents
(e.g., vitamin A and its derivatives) that
can alter the way these genes control the development of the embryo; it is conceivable that environmental pollutants may also alter the activity of these genes.
EPA is supporting efforts to find ways
to incorporate such mechanistic
information into the risk assessment
process for both noncancer and cancer
end points. (See article on page 27.) Our
emerging understanding of the
mechanisms of carcinogenesis and other
health effects suggests that the
underlying basis for certain noncancer
and cancer end points may have several
commonalities. For example, chemically
induced toxicity can cause cell death and
tissue degeneration. Surviving cells may
then compensate for that injury by
increasing cell proliferation (hyperplasia),
which may underlie many types of toxic
responses. If this proliferative activity
continues unchecked, it may result in
tumor formation. Thus, the same basic
toxic mechanism may be related to both
the cancer outcome and to other types of
toxic effects. This could ultimately result
in use of similar quantitative approaches
for risk estimation for certain cancer and
noncancer health effects.
EPA is pursuing a variety of initiatives
to improve the science base for assessing
risk. This is being accomplished by
requiring that appropriate biological data
be collected on pollutants, establishing
consistency in data evaluation and risk
assessment, and developing better
estimates of human exposure. To continue
to promote the upgrading of current
methodologies and improved approaches
for risk assessment, an understanding of
basic mechanisms and their relationship to
toxic responses must be pursued. D
(The authors wish to acknowledge Jeanette Wiltse, David Reese, Elaine Francis, Sue McMaster, Eric Clegg, Annie Jarabek, and John Vandenberg for their reviews, contributions, and helpful discussions.)
CHANGING PROFILES
The Lessons of
Commencement Bay
A pioneering study in Puget Sound
helped advance ecological risk assessment
by Patricia Cirone and Matthew Coco

(Cirone is Associate Director of Investigation and Evaluation Programs and Coco is a program analyst in EPA's Region 10 office.)
The lobes of the glacier that carved Puget Sound thousands of years ago created numerous bays that became natural locations for port cities.
Unfortunately, many of these bays then
became natural sumps for the
accumulation of toxic chemicals that
those cities produced. One example is
Commencement Bay on the shoreline of
Tacoma, Washington.
More than 280 point sources, including
a pulp mill, petroleum refineries,
aluminum processors, sewage treatment
plants, and an active ocean port, have
polluted Commencement Bay. Many
nonpoint sources also drain into it.
Concerns about the potential ecological
and human health effects of hazardous
substances in sediments of the
nearshore/tidal flats area of the bay led
to its addition, in September 1983, to the
National Priorities List (NPL) for cleanup
under EPA's Superfund program.
These concerns grew from earlier studies
by the National Oceanic and
Atmospheric Administration and others,
which had identified chemical
contamination in nearshore sediments
and abnormalities in fish.
Once Commencement Bay became an
NPL site, a Remedial Investigation/
Feasibility Study (RI/FS) was initiated, as
required under the Superfund law, the
Comprehensive Environmental
Response, Compensation, and Liability
Act (CERCLA). The study was intended
to define the risks to public health and
the Commencement Bay environment
and to prioritize areas for remedial
action. As part of this RI/FS, a
pioneering ecological assessment was
conducted.
In the absence of a formal framework
or guidelines for conducting the risk
assessment, scientists working at
Commencement Bay grappled with the
methodology problems confronting them
and inductively developed a working
approach. As a practical matter, it was
not possible for the assessment to focus
on chemical contamination in the water
column, because the chemicals dispersed
with water movement and became
diluted; instead, the study focused on the
benthic community, the flora and fauna
at the bay's bottom. In general, the task
was made more difficult by a lack of
existing protocols for sampling sediments
and benthic organisms or for performing
bioassays and other procedures.
Moreover, because complex mixtures of
chemicals were found in the sediments of
contaminated areas, it was simply not
possible to ascertain cause-effect
relationships between exposure and the
effects of particular chemicals.
Two methods were developed for
characterizing ecological effects for
Commencement Bay:
• Comparison of conditions at
contaminated sites to benchmark
locations, or "reference sites." Two
reference sites were selected: Carr Inlet,
A Step Toward Ecological Risk Guidelines
The Framework for Ecological Risk
Assessment, published in February
1992 by EPA's Risk Assessment Forum,
is a first step in the development of risk
assessment guidelines for ecological
effects. Lessons learned at
Commencement Bay and elsewhere
contributed to the development of the
Framework, which provides a
three-stage process for analyzing
ecological risk:
• Problem formulation. The goals,
breadth, and focus of the risk
assessment are established. The end
product of this step is a conceptual
model that identifies the environmental
values to be protected (the assessment
"end points"), the data needed, and the
analyses to be used.
• Analysis. Exposure is characterized
to determine the extent of
contamination by a stressor and its
relationship with species.
Concurrently, ecological effects are
characterized to measure the adverse
effects associated with the stressor and
to identify cause-and-effect
relationships.
• Risk characterization. The results of
the analyses of exposure and ecological
effects are evaluated to determine the
likelihood of environmental harm
associated with a stressor.
—Eds.
A pioneering ecological assessment was done at Puget Sound's Commencement Bay after it was added to the National Priorities List for Superfund cleanup. The bay was contaminated by point and nonpoint sources of pollution.
which had the lowest detection limits for
most substances of concern in Puget
Sound embayments, and Blair Waterway,
which was the least chemically
contaminated of the seven waterways of
Commencement Bay.
• Use of an Apparent Effects Threshold
(AET) approach. Since data on biological
effects were not available for all portions
of the study area where chemical data
were available, the AET method was
developed to estimate "threshold"
concentrations of contaminants, above which biological harm would be expected.
These threshold values were generated
for each of three biological indicators:
amphipods, oyster larvae, and benthic
macroinvertebrates. For amphipods, the
AET value was the level just below that
at which specimens began to die. For
oyster larvae, it was the concentration
just below that at which abnormalities
developed. For benthic
macroinvertebrates, the threshold effect
was diminished abundance.
The assessment as a whole was
designed to examine three aspects of the
benthic community: biological (through
field measurements of the number and
type of organisms), physical (through
examination of sediment grain size and
depth), and chemical (through chemical
laboratory analyses of sediments).
To begin the Commencement Bay
study, researchers compiled and
evaluated existing data and performed
extensive field sampling to collect
additional data. The field sampling
involved 50 field surveys, 500 sampling
stations, and 2,000 samples of water,
sediments, and biota. Close to 120,000
individual specimens belonging to 407
species of bottom-dwelling organisms,
including marine worms, round worms,
clams, crustaceans, sea cucumbers, and
brittle stars, were collected and analyzed.
These species were considered
appropriate indicators of the extent and
magnitude of environmental stress,
because they interacted extensively with
the sediment. If they failed to reflect
harm from the chemicals in the sediment,
impacts on other species that fed on them
were considered unlikely.
The field surveys of population
abundance and diversity of benthic
species were used to characterize areas
where pollutants might have eliminated
ecologically significant organisms. Also,
species of fish and crab that spend at least
part of their life cycles in Commencement
Bay and feed on the benthic organisms of
the Bay were tested for toxic substances
and multiple abnormalities. In addition,
sediments were analyzed for a host of
chemicals commonly termed "priority
pollutants," including arsenic, lead, zinc,
and cadmium.
Results of the testing indicated that the
abundance of major benthic taxa
increased with increasing distance from
the four major sources of contamination
that scientists had identified. Tests
showed bioaccumulation of toxic
substances and multiple abnormalities in
fish and crab samples. In particular,
PCBs were detected in muscle and liver
tissue of English sole throughout the
study area at concentrations substantially
elevated above those found at the
reference sites.
High levels of toxic chemicals were
found in the sediments of many areas of
the bay. Ten chemicals even had
concentrations that were 1,000 times
greater than reference area
concentrations.
As indicated earlier, scientists
conducting the RI/FS found that
establishing a cause-and-effect
relationship between contamination and
measurable harm would be too resource-
intensive and time-consuming. They
chose instead to follow a preponderance
of evidence approach: Any identifiable
site in Commencement Bay that failed the
AET test for any of the three biological
indicators cited above would require
remedial action.
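In code, that decision rule amounts to a simple screen. The sketch below is ours, with invented station names, a single pooled contaminant measure, and made-up threshold values; the actual study derived chemical-specific AETs for each indicator:

```python
# Preponderance-of-evidence screen: a site needs remedial action if it
# exceeds the Apparent Effects Threshold (AET) for ANY indicator.
# All names and numbers below are hypothetical.
AET_PPM = {"amphipod": 57.0, "oyster_larvae": 93.0, "benthic": 120.0}

stations = {"Station A": 150.0, "Station B": 40.0, "Station C": 75.0}

for name, conc in stations.items():
    exceeded = [ind for ind, limit in AET_PPM.items() if conc > limit]
    verdict = "remedial action" if exceeded else "no action"
    print(f"{name}: {conc} ppm -> {verdict} {exceeded}")
```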
Application of the Apparent Effects
Threshold/preponderance of evidence
approach identified nine "hot spots" in
Commencement Bay. Eight of these were
eventually chosen for remedial treatment.
The Superfund Record of Decision
prescribed four clean-up strategies:
• Site use restrictions—local health
advisories and educational programs to
reduce potential exposure to site
contamination during the estimated 15-
to 20-year remedial period.
• Source controls—development and
implementation of regulations and
requirements; monitoring for compliance
with water and sediment quality
standards; and permit requirements.
• Natural recovery—Sediments would
be allowed 10 years to recover by means
of chemical degradation, diffusion across
the sediment-water interface, burial, or
mixing of contaminated surface sediments with recently deposited clean sediments.
• Sediment remedial action—options include capping in place with clean fill,
confined aquatic disposal, dredging and
dumping contaminated sediments in the
nearshore area, and upland disposal
above tidal influences.
In implementing the Record of
Decision, EPA's Region 10 and the
Washington Department of Ecology are
continuing the joint effort that began with
the Remedial Investigation at
Commencement Bay. D
POINTS OF DEBATE
Relating
Risk Assessment
and Risk
Management
Should risk assessment be considered distinctly separate from risk
management? A 1983 National Academy of Sciences report
concluded that it should and government agencies since then
have generally operated on that assumption. Recently, however,
some experts have been reevaluating the relationship between the
two. Presented here are two views on the controversy.
Complete separation of the two processes is a misconception
by Sheila Jasanoff

(Jasanoff is professor and chair of the Department of Science and Technology Studies at Cornell University.)
Should risk management (what we
wish to do about risk) be allowed
to influence risk assessment (what
we know about risk)? The very idea is
anathema to environmental policy
makers who came of age in the 1980s. It
is like asking whether politics should
control science. We are reminded of
Galileo bowing to his inquisitors or of
Lysenko delivering to Stalin the genetics
that served the ideology of the Soviet
state. Trained to think of science as
value-free, we believe that the inevitable
result of subordinating knowledge to
politics must be the corruption of both.
Closer to home, EPA watchers may
recall the sorry events of the early 1980s
that helped give rise to the doctrine of
separating risk assessment from risk
management. In one especially
unfortunate incident, the Agency's Office
of Pesticides and Toxic Substances
exempted formaldehyde from
designation as a priority chemical under
Section 4(f) of the Toxic Substances
Control Act, even though this widely
used compound had been definitively
shown to cause cancer in rats. Legally
and scientifically flawed, the underlying
analysis seemed to have been unduly
influenced by the concerns of the
formaldehyde industry. It was a blatant
case of risk management objectives
overriding the risk assessor's impartial
evaluation of scientific data. Later EPA
administrators were determined not to
see this error repeated.
D
Judgment, moreover,
must remain sensitive to
the policy context.
D
The specter of improper interest
group influence was one of the concerns
that guided the National Academy of
Sciences' (NAS) authoritative study of
risk assessment practices in the federal
government. In its "Red Book" report of
1983 (so named because of the report's
red cover), the NAS espoused the now-
classic position that regulatory agencies
should clearly separate risk assessment
from risk management. Even the
perception that risk management
considerations were influencing risk
assessment, the "Red Book'' authors
asserted, would diminish the credibility
of the assessments themselves and of
management decisions based upon them.
But careful practitioners of risk
assessment have recognized from the
start that theirs is not a purely scientific
activity. Indeed, risk assessment is often
described as an "art" rather than a
"science." This formulation emphasizes
that risk assessment, like any artistic
endeavor, requires the exercise of
subjective judgment. It cannot be done
by mechanically following the rules.
Judgment, moreover, must remain
sensitive to the policy context.
Risk assessment operates in the
ambiguous borderland between
systematic observations of the physical
world ("science") and politically
accountable decisions about public health
and welfare ("policy"). Even the NAS
report recognized that such a process
must be conditioned by factors deriving
from both the scientific and political
domains. The choices involved in risk
assessment, the report states, rest "on a
mixture of scientific fact and consensus,
on scientific judgment, and on policy
determinations."
The need for judgment in risk
assessment is often attributed to scientific
Risk management at work. New York City
workers install the framework for a plastic
encasement over an apartment building
splattered with asbestos-laced mud as a
result of a steam pipe explosion in
August 1989. The building could then
be cleaned without contaminating
the neighborhood.
David A. Cantor photo. Wide World.
uncertainty. If uncertainty were the only
problem, then risk assessors would not
have to look much beyond science for
answers to their questions. More
knowledge would automatically make
for better risk assessments, because
additional facts would reduce
uncertainty. But the focus on uncertainty
misses the mark. The choices that must
be made in risk assessment are not due
only to gaps in existing knowledge; they
have their origin in the very methods by
which we assess uncertainty, and they
cannot be resolved simply by asking
science to fill in more facts as needed.
Risk assessors explore the physical
world through necessarily stripped-
down models of reality. In order to
estimate the probability of bad events,
risk assessment has to reduce the
immense variability of natural and social
systems to dimensions that can be easily
mapped, measured, and modeled. Many
simplifying assumptions are needed to
construct a microcosm whose risks can
reliably be investigated. Thus, in models
conventionally used for health risk
assessment, adult human beings live
exactly 70 years, stay indoors all day in
radon-contaminated homes, drink
precisely seven cups of water per day,
smoke heavily or not at all, and exercise
while inhaling abnormal quantities of
airborne pollutants.
In other risk scenarios, water and
smoke plumes flow along
mathematically exact pathways, dense
population clusters are located
immediately downwind from polluting
factories, pregnant women and small
children eat steady diets of pesticide-
laden foods, and acid rain falls
relentlessly on forests of red spruce. We
know that nature and society actually
behave in more complex and
unpredictable ways, but we cannot begin
to estimate the magnitude of particular
risks except by building little model
worlds where all variation is artificially
restricted.
If risk assessment proceeds by
mapping facts onto such approximate
versions of "reality," then risk
management is the guide that tells us
when these approximations are
acceptably accurate. Is it appropriate, for
example, to set standards for hazardous
air pollutants based on the exposure of
the so-called "porch potato"—that
mythical being who spends 70 years
immobile on the porch of a house at the
fence line of the emitting factory? Or is
this a "worst case" that risk assessors
should ignore in favor of a more
behaviorally realistic scenario? The
answers have to come from risk
management, for only the risk manager
can say how conservatively we should
draw up our policies for protecting
public health and the environment.
Should default assumptions be
chosen so as to safeguard the most highly
exposed, the most vulnerable, or the most
"normal" individual? Do we want to
eliminate risks that are unacceptably high
for some subpopulations or only to
reduce those that occur at too high a
frequency for the entire population?
Questions like these require the regulator
to cross the borderline between risk
assessment and risk management. The
answers bear on core elements of the risk
assessor's work: how to select among
competing models, how to balance
conflicting scientific inputs, when to
revise prior assumptions, how to register
and represent uncertainty, and when to
hold out for more scientific information.
Yet, the questions themselves are firmly
planted in the policy domain.
Too rigid a separation between risk
assessment and risk management seems
in the light of this analysis to be both
naive and misguided. Risk assessment
does indeed offer a principled way of
organizing what we know about the
world, particularly about its weak spots
and creaky joints. But the principles by
which we organize the "facts" of risk
have to derive, at least in part, from the
concerns of risk management. In the end,
we cannot order, arrange, or supplement
our knowledge about risk without a
clear, framing vision of the social and
natural order that our risk-management
policies are seeking to create.
If risk management is broke, why fix risk assessment?
by Bernard D. Goldstein

(Goldstein is Director of the Environmental and Occupational Health Sciences Institute at Rutgers University and the University of Medicine and Dentistry of New Jersey-Robert Wood Johnson Medical School.)
It has been interesting to follow the
twists and turns of the fortunes of risk
assessment in the less than a decade
since the National Academy of Sciences'
"Red Book" thrust this nascent paradigm
toward the center of environmental
regulation. Predictably, the initial burst
of enthusiasm was followed by a cautious
reappraisal when it became clear that risk
assessment could not solve all the
problems faced by risk managers, and
that there was a limit to its accuracy
imposed both by scientific uncertainties
and by policy directives.
Some critics of risk assessment now are
arguing that risk assessment is really not
separate from risk management. Not
surprisingly, this argument is primarily
being made by risk managers who
appear to have forgotten that a major
impetus for the adoption of the "Red
Book" formulation was the public
perception that EPA's science was being
manipulated for policy purposes.
Risk assessment is one of the major
recent advances in moving forward the
environmental regulatory process,
rivaled perhaps only by the "bubble
concept." (EPA's "bubble policy" allows
a plant with several emission points to be
treated as if it were in a "bubble." The
total emissions are averaged for the entire
plant, not each emission point, allowing
the plant's operators flexibility in meeting
emission standards.) Risk assessment is
enshrined in various laws as the driving
force for regulatory actions and has
proved to be a valuable tool in making
decisions and in assigning priorities for
the eradication of existing environmental
problems. It improves the credibility of
the entire regulatory process, particularly
in comparison to regulatory approaches
that violate the basic laws of toxicological
science by treating all chemicals as if they
are equal. In addition, risk assessment is
particularly valuable in developing a
research agenda that is responsive to
crucial uncertainties underlying the
decision-making process.
Organizations seemingly devoted to
advancing the field of risk management
spend much time and resources on
critiques of risk assessment. Why the
preoccupation with risk assessment?
One possible reason is that risk
assessment has been oversold. The
tendency toward overuse of risk
assessment by risk managers partly
represents an attempt to achieve
credibility for preconceived policy goals,
but unquestionably it also reflects the
failure of risk assessors to adequately
explain the basics of risk assessment,
including its inherent limitations. Two
examples are the misuse of risk
assessment as the sole determinant for
establishing priorities and in setting
so-called "bright lines"—a bright line is a
specific point, under any and all
circumstances, where a particular level
of risk is "acceptable" or not—for
regulatory decisions.
Those who argue that risk
assessment and risk management
should not be independent say that risk
assessors—scientists—have values and
therefore risk assessment cannot be an
apolitical effort. The deconstructionist
approach, which argues that there is no
absolute truth, may hold for the policy
world but not for science. There is a
knowable law of nature that describes the
risk of dioxin or benzene, but no
immutable law as to the best
management approach to deal with these
risks. At the very least, risk management
is contextual, with the best decision being
related to time and place, while risk
assessment inherently embraces the
concept that there is a single right
assessment for all time.
As with any area of science, the fact
that there is a natural law with one right
answer which eventually will become
known is very inhibiting to the risk
assessor whose scientific reputation is
continually at stake. The scientist in
essence wrestles with formulas
attempting to depict the real world in the
full knowledge that, if the formula is
wrong, the real world will win the match.
In contrast, it often seems to the risk
assessor that the policy maker wrestles
with the real world in order to contort it
into fitting into a preconceived risk-
management formula. While the risk
manager has every right to attempt to
amend the various social, economic,
legal, and political pieces that go into
making a regulatory decision, it is bad
public policy to allow the manager the
seeming opportunity to amend the laws
of nature.
Perhaps risk assessment should be
considered analogous to one of the many
economic statistics supplied to us by the
government, such as the unemployment
figure or the amount of money in
circulation. The latter includes what you
and I have in our wallets and clearly
must be an estimate with a whole variety
of uncertainties built into it. Although
such economic estimates are frequently
given as preliminary figures and revised
later, I have never heard of them given in
such a way as to include a statement of the extent of uncertainty.

In extrapolating from laboratory animal test results to real-world conditions, practitioners of risk assessment confront controversial questions of science policy. Monsanto Company photo.
I wonder why economists and other
policy makers involved in
environmental regulation ask for the
laying out of all the uncertainties in the
biological and mathematical aspects of
risk assessment. Risk assessment should
be viewed as a useful approximation of
risk. Similar to economic statistics, a risk
number should be particularly valuable
for comparisons, should be arrived at in a
way that is free of the inference of
political interference, and should be
subject to reevaluation based on new
data, but not on policy preferences.
Much of the attention given by risk
managers to risk assessment has been on
the risk characterization step. To some
extent this is reasonable, but the amazed
horror with which some view the revelation
that how one characterizes risk can affect
the public's perception often seems either
naive or strained. Surely, this is a basic
fact of life in public health, and, although
frequently forgotten, most environmental
laws are aimed at protecting public
health. Consider how the lack of
available jobs could be trivialized if we
were presented with employment rather
than unemployment figures. A fall in the
employment rate from 96.8 percent to
95.2 percent does not sound as bad as the
corresponding increase in unemployment
from 3.2 percent to 4.8 percent.
I suggest to those who are concerned
with the policy aspects that are built into
risk assessment, such as the extent of
conservatism, that the appropriate step at
which the issue should be dealt with is
not at the risk characterization step, but
rather in the science policy step that
precedes risk assessment. We really are
dealing with a three-step process. The
first step, before risk assessment and
management, is the establishment of
generic risk assessment guidelines
combining both science and policy.
These guidelines should cover every step
in the calculation and characterization of
the risk. They should be as explicit as the
procedures for estimating unemployment
or the gross national product, but
sufficiently flexible to allow alterations in
the performance of an individual risk
assessment when justified by scientific
data as supported by an open peer
review process.
It is time for risk assessors to stop
being defensive. Because risk
management is broke is no reason to fix
risk assessment. D
POINTS OF DEBATE
The Delaney Clause
Dilemma
EPA is calling for a public dialogue on the issues
by Victor J. Kimm

(Kimm is Acting Assistant Administrator for EPA's Office of Pollution Prevention, Pesticides, and Toxic Substances.)
On July 8, 1992, a federal appeals
court made a decision in
Les v. Reilly that has unfortunate
implications for pesticide decision-
making and food-safety policy in the
United States. Specifically, the court
ruled that EPA may not allow any level
of pesticide residues in processed food
greater than the level permitted in the
raw food if the pesticide presents any
carcinogenic risk, however negligible.
This decision was based on a strict
interpretation of a provision of the
Federal Food, Drug, and Cosmetic Act
(FFDCA) that was enacted in 1958 and
has come to be called the "Delaney
clause" after its Congressional sponsor.
EPA believes the court's recent
interpretation of the Delaney clause could
adversely affect the nation's food supply
by making it difficult for farmers to use
the safest pesticides on their crops. In
fact, the impact of the court's
interpretation of the unusually complex
statutory framework in which the
Delaney clause is embedded could be,
paradoxically, to discourage the use of
safer new pesticides in favor of more
risky chemicals.
The court acknowledged that EPA's
interpretation of the FFDCA's statutory
structure—namely its policy of applying
a negligible risk standard across the
board to all potentially carcinogenic
pesticides—might produce a more
"enlightened" scheme than does a literal
interpretation of the Delaney clause. But
it indicated that while Congress might
have created a defective—or, at least,
outdated—framework, only Congress
could correct or update the law. Since
the Supreme Court declined to review
this case in February 1993, the law stands;
therefore, barring legislative action by
Congress, EPA will have to move to
implement the court's ruling.
Consequently, the Agency believes
that this court decision intensifies the
need for all interested parties to work
together with federal agencies and
Congress to reform pesticide regulation.
Our goal must be to bring U.S. pesticide
regulation into conformity with current
science and ensure that the nation's
pesticide safety laws provide the best
available protection to consumers.
A Complicated Statutory Framework
In order to understand the court's ruling
and its impact on pesticide use, one needs
to have some idea of the complicated
interplay between different provisions of
the FFDCA. Even the undisputed basics
of that scheme are difficult to understand.
The FFDCA authorizes EPA to set legal
limits through "tolerances" and "food
additive regulations" on the amounts of
pesticide residues allowable in raw food
commodities and processed foods,
respectively. Tolerances and food
additive regulations are generally
expressed in parts per million (ppm).
Under FFDCA section 408, EPA
sets—and the U.S. Food and
Drug Administration (FDA)
enforces—tolerances for pesticides in or
on raw agricultural commodities, such as
vegetables, fruit, and grains. For
example, a tolerance of 5 ppm of
Pesticide X on apples allows residues of
that pesticide to remain on harvested
apples up to the level of 5 ppm. EPA
issues tolerance regulations in
conjunction with pesticide registration
decisions governed by the Federal
Insecticide, Fungicide, and Rodenticide
Act (FIFRA); a pesticide registration
under FIFRA constitutes a license for use
of the pesticide under specified
conditions.
Under FFDCA section 409, a pesticide
residue is considered a food additive.
This provision authorizes EPA to issue
(again, with FDA enforcement) food
additive regulations for pesticides on the
processed food forms of raw
commodities—such as, for example, fruit
and vegetable juices, flour, and tomato
and almond pastes. Section 409 is also
the provision under which FDA sets
limits on other food additives—such as,
for example, the amount of preservatives
or artificial sweeteners that can safely be
added to foods.
The FFDCA "Flow-Through"
Provision
So far, the scheme is relatively
straightforward. But complications arise
in setting tolerances for processed
commodities under FFDCA section 409.
Section 408, the raw food provision, is
Crops that are sold raw, such as fresh apples, are not subject to the controversial "Delaney clause" of the Federal Food, Drug, and Cosmetic Act.
Steve Delaney photo
linked to section 409, the processed food
provision, by a section of the statute
known as the "flow through" provision.
Under this provision, pesticide residues
in processed foods will not be considered
unsafe under two conditions:
• First, good manufacturing procedures
must be used to remove as much
pesticide as possible during the
processing of the food.
• Second, and more important for our
purposes, the processed food will not be
considered unsafe if the residues
remaining after processing do not exceed
the tolerance limit set for residues on the
raw commodity under section 408.
Hence, this provision allows section
408 raw food tolerances to "flow
through" to cover pesticide residues in
processed food. Consequently, EPA's
policy concerning when a section 409
food additive regulation is needed turns
on whether there is a possibility that the
processing of a raw food containing
pesticide residues would result in
residues in the processed food at a level
greater than the raw food tolerance.
EPA believes there is a possibility of
over-tolerance (illegal) residues in
processed food in cases where a
processing study shows that significant
concentration of residues occurs through
the processing of raw commodities. If
significant concentration does occur
during processing, then, theoretically, if
residues in the raw food are at or near the
tolerance level when the raw food is
processed, the processed food may
exceed the level of the raw food
tolerance.
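Reduced to its logic, the policy described above is a single comparison. This sketch is ours, not Agency code; the function name and numbers are invented:

```python
# When is a separate section 409 food additive regulation needed?
# Only when processing concentrates residues above the raw tolerance.
def needs_section_409(raw_tolerance_ppm: float,
                      processed_residue_ppm: float) -> bool:
    return processed_residue_ppm > raw_tolerance_ppm

# Pesticide X with a 5 ppm tolerance on raw apples (the example above):
print(needs_section_409(5.0, 3.5))   # False: the tolerance "flows through"
print(needs_section_409(5.0, 8.0))   # True: concentration occurred
```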
EPA's Coordination Policy
Because the raw food and processed food
provisions of FFDCA relate to each other
as explained above, it has been EPA's
policy to consider pesticide registrations
and related raw food and processed food
tolerances in holistic decision scenarios.
In other words, it has been EPA's policy
to revoke or refuse to issue section 408
raw food tolerances or FIFRA
registrations for pesticide uses on foods
that might become processed foods in
cases where a section 409 tolerance
cannot be allowed. Largely because it is
often difficult at the time a pesticide is
applied to predict whether the crop will
be eaten raw or processed, EPA refuses
otherwise acceptable raw food tolerances
if for some reason a food additive
regulation cannot be issued for residues
of the pesticide in processed food.
The policy just described is known as
the Agency's "coordination policy,"
sometimes called its "concentration
policy." It is based on the rationale that
EPA should not approve uses of a
pesticide under FIFRA and FFDCA
section 408 if the use could result in
illegal residues in processed food.
EPA's coordination policy has been
challenged by the National Food
Processors Association (see article on
page 44). The NFPA argues that linking
action under sections 408 and 409 is
illegal, and that data show that EPA's
concerns with residues in processed food
exceeding the raw food tolerance are
unsubstantiated.
Inconsistent Standards in FFDCA
Sections 408 and 409
At this point, the statutory scheme
appears to be complicated yet logical.
Not true: There is a serious problem in
the framework because the standards for
setting tolerances under section 408 and
for establishing food additive regulations
under section 409 are fundamentally
different. On the one hand, section 408
provides that in setting tolerance levels,
EPA must balance any dietary risks to the
public health against the benefits of the
pesticide in producing an adequate,
wholesome, and economical food supply.
On the other hand, the Delaney clause,
which appears in section 409 only,
provides that no substance found to
cause cancer in man or animals may be
added to food.
In Les v. Reilly, the court agreed with
the Natural Resources Defense Council's
interpretation of the Delaney clause as
prohibiting the addition to food of any
substance carrying any carcinogenic risk,
regardless of how small that risk might
be. This means that even negligible
carcinogenic risks are prohibited by the
Delaney clause. It follows that this
interpretation precludes any risk/benefit
analysis where even the tiniest risk of
cancer is involved. The court disagreed
with EPA's interpretation of the statute as
containing an implicit exception for
negligible risks, although such de minimis
exceptions have been recognized for
other FFDCA provisions.
Results of the Court's Interpretation
The court's interpretation will produce
some strange results. For example,
because the Delaney clause does not
affect FFDCA section 408, a pesticide
carrying a negligible risk of cancer could
be used on crops that will be consumed
raw, such as apples and tomatoes. And
because the Delaney clause, which is in
section 409, does not affect the "flow-
through" provision, pesticide residues
carrying a negligible risk of cancer could
appear in processed food—as long as the
residue level does not exceed the
tolerance limit for the raw commodity.
Thus, for example, a grocery store
could legally sell raw apples containing
Pesticide X residues that carry a
negligible risk of cancer. And the store
could also sell applesauce containing
residues of Pesticide X, as long as the
residues in the sauce were not higher
than the tolerance for residues in the raw
apples (calculated in both cases as the
parts per million of residue in the food).
But if the residues of Pesticide X in apple juice happen to exceed the tolerance for residues in the raw apples, the Delaney clause would prohibit sale of the apple juice. The apple juice would be legally "unsafe" even if the carcinogenic risk presented by the residues in the juice were less than the risk in the raw apples or the applesauce. This is an entirely plausible scenario because risks from pesticide residues in processed food may be less than those from the same pesticide in raw foods: The magnitude of risk depends on both the level of residue and the amount of food consumed.
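A back-of-the-envelope calculation makes the point concrete. The numbers below are hypothetical (the potency value, residue level, and consumption figures are invented for illustration), but the structure mirrors the standard risk equation: risk scales with potency, residue concentration, and the quantity of food eaten.

```python
# Hypothetical illustration: identical residue levels (ppm) can yield
# different risks when the foods are eaten in different amounts.
POTENCY = 1e-3        # assumed cancer potency, per mg/kg body weight/day
RESIDUE_PPM = 2.0     # same residue level in raw apples and applesauce (mg/kg food)
BODY_WEIGHT_KG = 70.0

def lifetime_risk(grams_eaten_per_day: float) -> float:
    kg_food = grams_eaten_per_day / 1000.0
    dose = RESIDUE_PPM * kg_food / BODY_WEIGHT_KG  # mg/kg body weight/day
    return POTENCY * dose

print(f"raw apples (100 g/day): {lifetime_risk(100.0):.1e}")  # ~2.9e-06
print(f"applesauce (20 g/day):  {lifetime_risk(20.0):.1e}")   # ~5.7e-07
```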
Considered in conjunction with EPA's
policy regarding the coordination of its
various pesticide regulatory authorities,
the court's interpretation of the FFDCA
statutory framework produces even more
striking anomalies. As mentioned above,
it is EPA's policy not to issue a section
408 raw crop tolerance or FIFRA
registration if a section 409 food additive
regulation is needed but cannot be
granted for a processed food that could
be made from the crop.
Under the Les v. Reilly decision and
EPA's coordination policy, to continue
the example above, EPA would prohibit
pesticide residues in apples, applesauce,
and apple juice, and consequently, the
pesticide could not be used on apples.
All applications to apples would be
prohibited, even though only the
residues in apple juice, a processed food,
could not be approved under the
governing statutes.
Under this scenario, whether or not a
pesticide concentrates during processing
is more important than the level of
resulting risk, if any. The fact that
residues concentrate in one processed
food—rather than the level of
carcinogenic risk in any raw or processed
food—eliminates the pesticide from U.S.
markets for use on that food. Contrast
this outcome with the outcome of the
statutory and regulatory scheme on use
of Pesticide Y, which is also applied to
apple crops and has a carcinogenic risk similar to or greater than that of Pesticide X, but which does not concentrate. Because the
"non-concentrator" does not need a
section 409 food additive regulation, its
use is not governed by the Delaney
clause.
Consequently, if Pesticide Y can meet
the standards in FIFRA and FFDCA
section 408, the Delaney clause has the
effect of increasing the public's dietary
cancer risk from pesticides by keeping
the lower-risk pesticide off the market.
Not surprisingly, in its report entitled
Regulating Pesticides in Food: The Delaney
Paradox (1987), the National Academy of
Sciences concluded that the FFDCA's
focus on the concentration-during-
processing criterion "make[s] no
discernible sense in terms of public health
protection."
Conclusion
The current framework is illogical in that
it results in opposite outcomes for
pesticides having similar risks, benefits,
costs, and efficacy. The situation
becomes even more troublesome when
the anomalies in the law result in the
elimination of safer pesticides in favor of
more risky pesticides.
Thus, the appeals court's ruling that the Delaney clause contains no implicit exception for negligible or de minimis carcinogenic risks, and the Supreme Court's recent decision not to consider the case, highlight the need for all
parties to begin a working dialogue to
resolve the controversy. The goal should
be to come to an agreement on how to
proceed to ensure the nation's pesticide
safety laws provide the best available
protection to consumers.
In the meantime, EPA will necessarily
begin action to rescind the food additive
regulations for the four pesticides that
were at issue in the appeals court
decision. Moreover, on February 2, 1993,
the Agency publicly released a list of over
30 pesticide chemical/crop combinations
that might also be affected by the court
decision and EPA's current policies. In
addition, on February 5, EPA published
in the Federal Register a notice soliciting
public comment on pesticide regulatory
reform. The notice invites public
comment on issues related to the Delaney
clause and the Les v. Reilly decision,
including the petition of the National
Food Processors Association. The Federal
Register notice requests comments to help
the Administrator formulate policy on
how to deal with these complex and
serious policy issues. D
POINTS OF DEBATE
The Delaney Clause:
Point/Counterpoint
Let's reform a failed food safety regime
by Al Meyerhoff
The essential premise of the Delaney
clause of the Food, Drug, and
Cosmetic Act is as simple as it is
powerful: What we understand best
about carcinogens is the limited extent of
our knowledge. Accordingly, the famous
clause is grounded in a policy of
prevention: prohibiting the addition of
carcinogens to the food supply to prevent
avoidable cancers in humans. This
approach was deemed necessary by
Congress since the entire nation's
population would otherwise be routinely
exposed to carcinogens in their
daily diet.
More than three decades after the
Delaney clause was enacted in 1958, the
public policy issue now presented is
whether science has evolved to the point
where this premise is no longer valid.
Are the principles and practice of cancer
risk assessment up to the task of
providing complete protection from
cancer to consumers exposed to
carcinogens in their food? Or, to put the
question differently, Can those industries
responsible for that exposure
affirmatively demonstrate that it does not
jeopardize public health?
Perhaps unfortunately, but
demonstrably, the answer remains no.
Inevitably, to quote from a Federal Register notice published by EPA in 1991 (56 FR 7750, 7757), "There are inherent uncertainties in quantitative risk assessment because, among other things, of the necessity of relying on data from animal studies to predict human risk."

(Meyerhoff is a Senior Attorney in the San Francisco office of the Natural Resources Defense Council.)
As a result, the same starting data in
risk assessments can yield predictions
that vary over several orders of
magnitude, depending on the
assumptions that go into the model. As
FDA Deputy Director Robert Scheuplein
warned in 1987, in a paper called "Risk
Assessment and Food Safety: A
Scientist's and Regulator's View,"
agencies like EPA "risk losing the
integrity of the science and objectivity
they need from it by continuing to
suggest risk assessments are better than
they are and that cancer risk can be so
clearly self-evidently dismissed as de
minimis solely on a scientific basis. We
have not seen a scientific breakthrough
which now permits the precise
assessment of low-level cancer risks."
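The "orders of magnitude" point can be shown with a toy calculation. In the sketch below (hypothetical bioassay numbers, not from any actual study), two standard dose-response forms are fitted to the same observed data point; at a low environmental dose they disagree by roughly four orders of magnitude.

```python
# Toy illustration of model-dependence in low-dose extrapolation.
# Hypothetical bioassay: 10% extra tumor incidence at 1.0 mg/kg/day.
import math

d_obs, p_obs = 1.0, 0.10

# One-hit model, linear at low dose: p(d) = 1 - exp(-b*d)
b = -math.log(1.0 - p_obs) / d_obs

# Weibull model with shape k = 2, sublinear at low dose: p(d) = 1 - exp(-c*d**k)
k = 2.0
c = -math.log(1.0 - p_obs) / d_obs**k

low_dose = 1e-4  # mg/kg/day, a low environmental exposure
print(f"one-hit:       {1 - math.exp(-b * low_dose):.1e}")    # ~1.1e-05
print(f"Weibull (k=2): {1 - math.exp(-c * low_dose**k):.1e}") # ~1.1e-09
```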
Existing risk assessments also virtually
ignore the special risks toxic chemicals
may pose to infants, young children, or
various subpopulations, yet they may be
at greatest risk from dietary exposure to
carcinogens. Moreover, taking a
chemical-by-chemical, use-by-use
approach as the long-term policy
objective, rather than an interim step to
overall pesticide use reduction, ignores
the fundamental deficiency of risk
assessment—that it looks at only one
chemical at a time.
The reality of life is that we are
exposed to a multiplicity of toxic
substances. Calculating the combined
risks of these exposures is problematic;
some 300 pesticidal active ingredients are
used on food, as well as a large and imperfectly examined number of "inert" ingredients. For the most part, existing
EPA pesticide tolerances for allowable
pesticide residue levels do not even
attempt to calculate the aggregate
human health risks presented, nor do
they address the cumulative and
synergistic effects on multiple pathways
of exposure. For this reason, ultimately,
the overall policy underlying the Delaney
clause—that we should avoid
unnecessary and involuntary exposure to
cancer-causing agents—remains as valid
today as when enacted.
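To see what an aggregate calculation would even look like, consider the sketch below. It is purely illustrative (the risk numbers are invented) and uses the simplest possible assumption, additivity across chemicals; the synergistic and cumulative interactions the author notes are unaddressed could make the true total larger.

```python
# Purely illustrative: aggregate dietary risk under a simple additivity
# assumption. Existing tolerances, per the article, do not attempt even this;
# synergistic effects across chemicals are not captured here at all.
individual_lifetime_risks = {
    "pesticide_A on apples":   2.0e-7,  # invented numbers
    "pesticide_B on corn":     1.0e-7,
    "pesticide_C on tomatoes": 9.0e-7,
}

aggregate = sum(individual_lifetime_risks.values())
print(f"additive aggregate lifetime risk: {aggregate:.1e}")  # 1.2e-06
```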
In the subsequent commentary,
perhaps seeking a "quick fix," industry
representative Clausen Ely focuses on a
rather arcane question: whether EPA's
policy that pesticides that concentrate
during processing must be treated as
food additives subject to the Delaney
clause is lawful and correct. It is both.
But the far more important question is
how what most acknowledge to be a
failed food-safety regulatory regime can
be reformed. The importance of this
cannot be overstated. During the Reagan
administration, in describing dietary risk,
the former Director of EPA's Pesticide
Program said, "Pesticides dwarf the other
environmental risks the Agency deals
with. Toxic waste dumps may affect a
few thousand people who live around
them. But virtually everyone is exposed
to pesticides."
Yet, to date, EPA's consistent approach
to carcinogens in food has been to ignore
or evade the Delaney clause, the most
health-protective statute governing
pesticides. This approach is no longer
legally permissible. Last July, a U.S.
Court of Appeals held in Les v. Reilly that
pesticides present in processed foods,
either due to concentration during
processing or post-harvest application,
are subject to the strict proviso of the
Delaney clause that "no [food] additive
shall be deemed to be safe if it is found to
induce cancer when ingested by man or
animal." The Agency's Ac miniinis policy,
allowing carcinogens based on the
purported low level of cancer risk, was
also rejected by the court: "The language
of the Delaney clause, its history, and
purpose all reflect that Congress intended
the EPA to prohibit all additives that are
carcinogens, regardless of the degree of
risk involved." EPA has now issued an
initial list of 32 pesticides subject to the
Les rule. Since EPA is unable to
determine in most cases which raw
commodities will or will not be
processed, the presence of these
carcinogenic pesticides in raw
commodities is foreclosed as well.
Since the Delaney clause was enacted,
pesticide use has increased dramatically
in the United States, now approaching 3
billion pounds per year. There is also an
emerging consensus that the existing food
safety scheme, with its different and
contradictory legal standards, long
delays, and inadequate monitoring and
enforcement procedures, is in need of
serious reform. In this regard, as EPA
Administrator Browner recently noted,
the Clinton administration and newly
elected Congress will be confronted with
important policy choices. However, the
fundamental principles reflected in the
W\ X^g, , ^
Vf^ ^
Delaney clause—of preventing
unnecessary exposure to cancer-causing
substances while providing incentives to
reduce pesticide use—must be the
centerpiece of any new legislation.
Given its vagaries and ambiguities,
risk assessment should not, by itself,
serve as the linchpin for pesticide reform.
Instead, it should be used as an interim
tool to achieve a deserved result—overall
reduction of the use of pesticides in
agriculture and the promotion of
alternative technologies. Thus, any
legislation reopening Delaney clause
issues should necessarily include a
scheduled phaseout of food-use
pesticides that EPA has found are
"probable" human carcinogens, together
with incentives for reducing overall
pesticide use.
Since the Delaney clause became law,
much new scientific knowledge has been
developed. Yet we still do not know
whether humans are more or less
sensitive than experimental animals to
various carcinogens. We don't know
how to assess the contribution of one
carcinogen in relation to the impacts of
exposures to other carcinogens. We don't
know the cumulative impact of dozens of
carcinogens now permitted in the food
supply. We should, therefore, follow
Rachel Carson's advice three decades
ago: "The ultimate answer
is to use less toxic
chemicals. This system of
deliberately poisoning our
food and then policing the
result—is too reminiscent
of Lewis Carroll's 'white
knight' who thought of a
plan to dye one's whiskers
green and always use so
large a fan that they could
not be seen."
California public health
chemists test apple juice
and applesauce for pesticide
residues. Under a strict
interpretation of the Delaney
clause, any residue of a
carcinogenic pesticide in
processed foods is simply
"unsafe"—no matter if EPA
finds the risk negligible.
George Olson photo Wide World
The Delaney Clause:
Point/Counterpoint (continued)
An obscure EPA policy is to blame
by Clausen Ely
EPA's current dilemma with respect
to application of the Delaney clause
to pesticide tolerances is a self-
generated problem for which there is a
readily available solution. The dilemma
is not really caused by a conflict or gap in
the law. It stems from EPA's adoption of
an obscure administrative policy. EPA is
free to rescind that policy, and the
National Food Processors Association
(NFPA) and a coalition of food trade
associations have petitioned the Agency
to do so. However, if EPA declines to
modify its administrative policy, the Les
v. Reilly Appeals Court decision of last
July is likely to have far-reaching adverse
consequences.
The administrative policy in question,
EPA's "coordination policy," sometimes
called its "concentration policy," requires
application of the Delaney clause to
pesticide residues in many processed
foods. The EPA policy also disallows the raw product tolerances for any pesticide that EPA concludes may concentrate in a processed food but that, because it triggers the apparent zero-risk standard of the Delaney clause, does not qualify for a separate processed food tolerance. This
policy is at odds with governing law,
inconsistent with pesticide residue
survey data, and contrary to the public
interest.
(Ely is a partner in the law firm of Covington and Burling and serves as Counsel to the National Food Processors Association.)
Every food-use pesticide must be
registered by EPA under the Federal
Insecticide, Fungicide, and Rodenticide
Act (FIFRA); at the same time, a tolerance
must be established for that pesticide use
under section 408 of the Federal Food,
Drug, and Cosmetic Act (FFDCA) for
each raw commodity for which use is
approved. Both FIFRA and section 408
of FFDCA embody a general safety
standard that authorizes EPA to ignore
de minimis or negligible risks, require a
risk-benefit evaluation of each pesticide
use, and contain no special safety
requirement for carcinogens.
Under the "flow-through" provision of
FFDCA, a pesticide residue in a
processed food is lawful as long as the
residue in the processed food when
ready to eat is not greater than the
tolerance prescribed for the raw
commodity from which the processed
food was made. The clear purpose of the
flow-through provision was to apply the
same safety standard for pesticide
residues in raw and processed foods and
to avoid the necessity of establishing
separate tolerances for pesticide residues
in processed food. Where a pesticide
residue in a processed food exceeds the
raw product tolerance, the food is legally
adulterated and the Food and Drug
Administration (FDA) has ample
authority to seize or enjoin shipment of
the product.
EPA's so-called coordination policy
ignores the language and intent of the
flow-through provision by requiring
separate tolerances for pesticide residues
in processed food. Because processed
food tolerances must be issued under the
food additive provision of FFDCA
(section 409), EPA's concentration policy
imposes the restrictive standard of the
Delaney clause on pesticide residues in
processed food. Moreover, by linking
section 408 and 409 tolerances, EPA's
policy unlawfully incorporates the
Delaney clause into the standard for raw
product tolerances.
Under EPA's policy, a separate
processed food tolerance is required
whenever EPA concludes, on the basis of
conservative processing studies, that it is
hypothetically possible for the processed
food residue to exceed the raw product
tolerance. However, extensive residue
studies conducted by the U.S.
Department of Agriculture, the FDA, and
the food industry demonstrate—and EPA
has conceded—that, in fact, pesticide
residues in processed food rarely, if ever,
exceed raw product tolerances. This is
because raw commodity tolerances are
set at levels to ensure that all legal
applications of a pesticide will result in
residues below the tolerance level, and
residues in raw crops are generally
reduced through processing and
otherwise greatly dissipated prior to sale
as processed products. Thus, not only is
EPA's concentration policy inconsistent
with the statutory scheme, but it does not
reflect marketplace reality.
In view of the importance of EPA's
concentration policy, it is surprising that
the policy is not explicitly set forth in any
regulation. Nor has it been legitimized
through notice-and-comment
rulemaking. Although the Appeals
Court held in Les v. Reilly that EPA has no discretion to permit de minimis risks
under the Delaney clause, the court did
not rule on EPA's concentration policy.
Nothing in the court's decision indicates
an endorsement of EPA's policy; nothing
indicates any agreement that there is a
reasonable legal or factual basis for the
policy.
Clearly, however, retention of the EPA
coordination policy, coupled with the
court's restrictive interpretation of the
Delaney clause, will lead to the loss of
numerous valuable food-use pesticides.
Although the Les v. Reilly case directly
involved only four pesticides, EPA has
acknowledged that more than 30, and
probably more than 60, food-use
pesticides may have to be banned as a consequence of the decision. This
will be the case if EPA continues to
require separate tolerances for processed
food and to revoke the raw commodity
tolerances for pesticides that are ineligible
for processed food tolerances. The result
will be multiple adverse public health
impacts.
• First, the reduced number of effective
pesticides will disrupt agricultural
production, diminish food quality,
increase food costs, and reduce the
availability of nutritious fruit, vegetable,
and grain products. Farmers will be hurt,
consumer choice will be restricted, and
the public health will suffer, without any
countervailing social benefit.
• Second, EPA will be required to
undertake lengthy and burdensome
administrative proceedings to revoke
existing tolerances for pesticides now
conceded to pose negligible cancer risks.
This will divert already strained EPA
resources.
• Third, the risk to the public will be
increased by forcing EPA to ban
beneficial pesticides that pose de minimis
cancer risks and requiring the
substitution of pesticides posing greater
health risks which are not associated with
cancer.
• Finally, EPA prohibition of pesticides
approved and used in other countries,
which export food products to the United
States, will have an adverse impact on
international trade.
EPA's pesticide program is constrained
by many unavoidable burdens and
challenges. It is difficult to understand
why EPA has elected to saddle itself with
the enormous additional burdens
associated with an inappropriate
administrative policy. It is unnecessary
and counterproductive for EPA to
superimpose its so-called coordination
policy on statutory
tolerance
requirements that
were designed to
work perfectly well
without it.
Accordingly, as the
NFPA et al. have
petitioned, EPA
should effectuate
Congressional intent
and avoid the current
Delaney clause
impasse by
rescinding this
policy. D
A Food and Drug
Administration official
inspects imported fruit.
FDA is empowered to
seize or enjoin
shipments of foods that
do not comply with
pesticide residue limits
set by EPA.
Wide World photo
POINTS OF DEBATE
A Legislative Proposal
Why not enact a law that would
help us set sensible priorities?
by Senator Daniel Patrick Moynihan
Environmental decisions are difficult
in any circumstance, and especially
so when the economy is weak and
there is competition for resources both in
the private sector and in government.
People want a clean environment, but
they don't want to pay more than is
necessary. We can't do everything at
once. The question of priorities arises,
almost unbidden. We would rather not
have to think this way but there you are.
The choice is not between having
priorities or not having them. Rather it is
between setting them consciously or
setting them by default. An informed
and involved public is critical to setting
workable priorities and acting effectively.
Easier said than done!
I am convinced that risk ranking and
cost-benefit analyses are valuable tools
for making environmental decisions.
They are not our only tools, but they do
offer the means to set priorities and to
measure success. It took over half a
century for the dedicated public servants
in the Bureau of Labor Statistics and the
Council of Economic Advisors to learn
how to fashion economic indicators. It
will no doubt take a long time to develop
a reliable methodology to assess costs and benefits of environmental regulations, and to reliably measure the risks associated with environmental exposures. But this is no reason not to begin. If we don't start now, we will never learn. And it must not be forgotten that there are some things that science cannot tell us. Values, for instance. Do we care more about this species or that one? Or fairness. Some questions are a matter for the legal system, not for scientists.

(Sen. Moynihan (D-NY) is the former chairman of the Committee on Environment and Public Works and chairs its Subcommittee on Water Resources, Transportation, and Infrastructure.)
But we do have problems. During the
presidential campaign and transition,
much discussion focused on the
economy, getting and keeping it going.
At a hearing last September on the bill I
proposed as the Environmental Risk
Reduction Act, Dr. Paul Portney of
Resources for the Future testified that
compliance with environmental laws
costs about $130 billion per year—
something like 2.2 percent of the U.S.
gross national product (GNP). Our next
closest competitors in terms of money
spent on environmental protection,
perhaps not surprisingly, are Germany
and Japan, which each spend about 1.6 to
1.8 percent of GNP. While this may not
be too much money to spend on
environmental protection, it is too much
to spend unwisely.
Obviously, we are seeing a new trend.
Federal environmental laws are being
questioned by state and local
governments, which say they can't afford
to comply with all environmental laws.
Their resources are finite and must also cover competing needs such as maintaining roads and providing social
services. For example, a report done for
Columbus, Ohio, indicates that new
environmental initiatives will cost
Columbus more than $1 billion over the
next decade—an extra $856 per year in local fees or taxes for every household in
the city by 2000. (See article on page 48.)
Another study showed a similar impact
in eight other Ohio cities.
As far as we can tell, this pattern is
being repeated in other places. An
editorial in the January 8, 1993, issue of
Science magazine alerts us to the
"growing questioning of the factual basis
for federal command-and-control
actions," all because of concerns over
regulatory costs. The message is clear.
State and local governments will hold
Congress and EPA more accountable in
the future about obligating them to spend
their resources on federal requirements.
They will want "proof" that there is a
problem and confidence that the
legislated solution will solve it.
California's threat to return enforcement
of its drinking water program to EPA last
spring speaks volumes. The most
environmentally advanced state in the
Union close to rebellion—a sobering
prospect. The Science editorial suggests
we are seeing the "beginning of a revolt."
Clearly risk ranking and cost-benefit
analyses aren't the sole factors for
decision making. Social concerns (who
should bear risk for whose benefit?),
public preference, and basic fairness must
be considered too. Truth be told, I
suspect that environmental decisions
have been based more on feelings than
on facts. This is understandable in a
prescientific field, but we are being
overtaken by events. The questions that
are key to making decisions about the
environment are slowly but surely being
identified.
There are those who dismiss relative
risk ranking because the assumptions
needed to assess risk are myriad. Facts
often seem scarce. What are we exposed
to? What effects does this produce?
What portion of the population is
affected? Or might be affected? There
are no precise answers to these questions.
And so risk assessments are
controversial, at times very controversial.
But we cannot forget that knowledge
need not be precise to be useful.
Relative risk ranking and cost-benefit
analyses are tools. Crude tools today,
yes, but perhaps they are sufficient in
some cases to rank activity "A" as riskier
than activity "B." Costs or political
realities may dictate that we should
control "B" before "A." Let us have the
courage and foresight to make a
conscious decision. In public. With
people watching. History shows that
crude tools give way to more refined
ones. Experience teaches.
Last session, I sponsored the
Environmental Risk Reduction Act to
help advance the practice of
environmental risk assessment and cost-
benefit analysis. The bill would put into
law the major findings of the 1990
Reducing Risk report by EPA's Science
Advisory Board. I introduced the bill
because I agree with former EPA
Administrator William Reilly's belief that
science can lend much needed coherence,
order, and integrity to costly and
controversial decisions.
In September 1992, we held a hearing
on the Act before the Committee on
Environment and Public Works. There
wasn't time to modify the bill and bring it
to the floor in the last Congress, so I have
introduced S.110, the Environmental Risk
Reduction Act of 1993, as a first-day bill
in the 103rd Congress.
America's environmental laws are a
large and diverse lot. We have only two
decades of experience on this subject, and
we are still learning, feeling our way.
The relative risk ranking and cost-benefit
analyses called for in my bill provide
some common ground for looking at our
environmental laws. The bill also
provides the public and Congress with
access to the findings. To quote from
Reducing Risk, "relative risk data and risk
assessment techniques should inform
[the public] judgment as much as
possible." Not dictate it, but inform it.
All this will take time, decades
perhaps. But let us take heart. Questions
that seem difficult now can, with a
certain amount of effort, yield to the
scientific method. D
POINTS OF DEBATE
What's A City to Do?
Columbus, Ohio, documents
its problems with the feds
by Edward F. Hayes
With the recent flurry of federal
environmental laws, Columbus,
Ohio, like other cities, is facing
dramatically higher compliance costs.
These costs will compete with spending
on other priority needs such as education,
health care, and public safety. Columbus
wants to be sure that mandated spending
for environmental compliance yields the
greatest possible benefit. However, the
uncertainties in environmental risk
assessments and the failure of federal
laws to account for local variability lead
inevitably to inefficiencies. As
communities have begun to look at the
costs involved and at the competing
financial priorities, critical questions
have been raised about the particular
means that these laws have mandated to
achieve our environmental goals.
Environmental laws already on the
books will cost this city of 632,910 people
more than $1 billion (1991 dollars) in the
next decade, according to a recent study,
Environmental Legislation: The Increasing
Cost of Regulatory Compliance to the City of
Columbus. Columbus, with an annual
budget of $591 million, spent $62 million
on environmental compliance in 1991. In
1995, the figure will be about $107
million, in 1991 dollars. By 2000, each
Columbus household may pay $856 per
year in new costs to meet these
requirements. (The figures exclude
private sector compliance.) The Ohio
Metropolitan Report Work Group has found that other Ohio cities face even heavier costs.

(Hayes is Vice President for Research at Ohio State University and the chair of Columbus's Environmental Science Advisory Committee.)
Federal regulatory laws increasingly
require local and state action without
providing federal funds to help.
Congress has released a flood of such
environmental mandates in the past few
years, under the authority of several
laws: the Clean Air Act, the Clean Water
Act, the Safe Drinking Water Act
(SDWA), the Resource Conservation and
Recovery Act, underground storage tank
regulations, amendments to the
Superfund legislation, the Asbestos
Hazard Emergency Response Act of 1986,
and others. With these regulations have
come myriad state and federal agencies
to ensure compliance.
In some cases, there is room for doubt
that the risks or costs—especially at the
local level—are properly estimated in
preparing some federal regulations. The
risk assessments themselves are subject to
great scientific uncertainties. In addition,
the laws' requirements may offer too little
latitude for local adaptation. Thus, we
may be paying too much for small
benefits while larger ones go begging.
Today's fiscal conditions make such
inefficiency doubly unacceptable.
New amendments to the SDWA
illustrate both the rigidity and
uncertainty of some federal regulations.
Under the amendments, water utilities
must analyze drinking water for at least
133 specified pollutants, beginning in
1993. Many of the substances are not
present in significant quantities in Ohio.
One of them, DBCP, a chemical whose
use was discontinued 15 years ago, was
used almost entirely on pineapples in
Hawaii. EPA's promised guidelines on
the conduct of "vulnerability
assessments"—which project local
impacts of a particular pollutant—to
obtain local waivers have not appeared.
Regulations that let state and local
governments develop their own water
quality programs could produce better
results at lower cost.
The herbicide atrazine, used widely on
Ohio cornfields, is an example of the
costs of scientific uncertainty in SDWA
regulations. Because of certain effects on
the offspring of rats and the hearts of
dogs at very heavy doses, atrazine is
deemed dangerous for humans. Atrazine
levels, therefore, need to be monitored
quarterly at each surface water intake,
and twice yearly at ground water intakes.
If a yearly average of those
measurements exceeds 3 parts per billion
(ppb) of atrazine, the city will need to
take action.
From previous measurements we
know that Columbus's water supply
exceeds the standard every year or two,
in brief "spikes" triggered by rainfall in
spring and summer. Thus, although
typical levels are far below 3 ppb, the city
may be required to install costly EPA-
specified technology for atrazine
removal.
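The trigger described above reduces to a simple averaging rule. The sketch below is a minimal rendering of it (the sampling scheme and 3 ppb threshold are as described in the article; the measurement values are invented), showing how one spring spike can pull a yearly average over the limit even when typical levels are far below 3 ppb.

```python
# Minimal sketch of the yearly-average trigger described above.
# Threshold per the article: 3 ppb, averaged over quarterly measurements.
ATRAZINE_LIMIT_PPB = 3.0

def action_required(quarterly_measurements_ppb: list[float]) -> bool:
    average = sum(quarterly_measurements_ppb) / len(quarterly_measurements_ppb)
    return average > ATRAZINE_LIMIT_PPB

# Three quarters far below the limit; one spring runoff spike.
print(action_required([0.5, 10.5, 1.0, 0.7]))  # True: average is ~3.2 ppb
print(action_required([0.5, 2.0, 1.0, 0.7]))   # False: average is ~1.1 ppb
```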
In the face of such costs, the weakness
of the science underlying the standards is
troubling. The chain of reasoning by which the effects of massive short-term doses to rats are extrapolated to estimate the risks to humans of infinitesimal but long-term doses is tenuous. The
uncertainties are such that regulators add
a safety factor of 1,000 to their best
estimate of the risk: a factor of 10 to
account for the extrapolation from
animals to humans, another factor of 10
to protect the most susceptible people,
and a third factor of 10 to account for the
fact that atrazine causes mammary
tumors in rats at very high doses and is
therefore rated a "possible carcinogen."
Uncertainty is inherent in
environmental regulation, of course.
Society must be conservative in
estimating risks, or it may be taken by
surprise by unsuspected illness or death.
But attempts to reduce uncertainty
should accompany (if possible, precede)
any major regulatory legislation. A costly
regulatory program like the SDWA needs
to be guided by research to quantify
risks.
Consider the benefits of reducing
uncertainty, so that atrazine could be
regulated with a safety factor of, say, 100,
instead of 1,000. The upshot might be
higher or lower atrazine standards, but
the better defined risks would give a
clearer sense of priorities. Columbus
might find, for example, that money
spent on health clinics for poor
neighborhoods would save more lives
than the federally mandated technology
for controlling atrazine.
Consider, too, the benefits of better
local knowledge. A national data base of
agricultural chemical use by watershed
would let EPA tailor regulations locally.
Columbus would need to monitor only
the chemicals known to be present.
Instead of the rigid "best available
technology" requirement, Columbus and
federal regulators might negotiate
alternatives for safeguarding health, such
as reducing runoff from fields or finding
ways to alter atrazine use in the
watershed. There is no such data base.
Sen. Daniel Patrick Moynihan (D-NY)
has offered a plan for working toward
better environmental priorities in his
proposed Environmental Risk Reduction
Act. Moynihan's bill would require EPA
to seek independent advice in ranking
environmental risks and benefits, to fund
scientific research on risks, and to use this
information in managing regulatory
resources. Among other things, EPA
would develop and publish risk
assessment guidelines for the range of
environmental risks, and promote
consistency and technical quality in these
assessments—a requirement for setting
priorities.
No matter how sound EPA's risk
assessments are, of course, Congress will
remain the source and shaper of most
environmental regulation. Regulatory
priorities can benefit from better scientific
The herbicide atrazine is commonly used
on Ohio cornfields. Under a federal EPA
regulation, the city of Columbus must monitor
its water supplies for the presence
of atrazine and take action if yearly
average measurements exceed 3 parts per
billion—a costly prospect for the municipality.
Gene Alexander photo. Soil Conservation Service. USDA.
analysis but will continue to be set by
politics and public opinion. Still, the act
would make EPA a more potent force for
rationalizing regulatory programs by
giving legislators objective information
about risks and the costs and benefits of
abating them.
At the local level, we in Columbus
have embarked on an important
experiment in cooperation with federal
regulators. Mayor Greg Lashutka has
established the Environmental Science
Advisory Committee (ESAC) as a source
of independent scientific and engineering
advice about environmental risks and
remedies. ESAC, as requested by the
mayor, will assess the scientific rationales
of current and proposed federal and state
environmental regulations. Such
reviews, we hope, will give the city the
information needed to cooperate with
federal regulators and legislators in
shaping environmental requirements and
strategies that meet local needs. If the experiment succeeds, the state of Ohio may want to replicate the program and devise a network of advisory groups that could assist communities lacking the capacity for such a program.
The early landmark environmental
statutes, such as the Clean Water Act and
Clean Air Act, brought dramatic
improvements in relatively short order
through a cost-sharing partnership of
federal, state, and local governments.
The risks that regulators set their sights
on today are increasingly subtle,
uncertainties correspondingly great, and
gains more costly. Costs are more likely
to be borne locally.
The nation must be sure the greatest
risks are tackled first and in the most
cost-effective way. Cities have a unique
perspective on priority setting that needs
to be heard. They know that diverting
funds from schools, police protection,
and assistance to the poor to deal with
environmental mandates can have a
hidden "toxic" effect that can be more
lethal than the environmental toxicity
that needs to be eradicated. D
POINTS OF DEBATE
Alternative Paradigms
Comparative risk is not the only model
by Adam M. Finkel
and Dominic Golding
We suspect that one of the
enduring legacies of EPA in the
1990s will be the way it defined
its central problem and the solution it
advocated. The problem statement is
relatively noncontroversial: EPA's
budgetary and regulatory priorities, and
consequently the nation's overall
environmental agenda, have become so
haphazard that large amounts of
financial and human resources are being
wasted, while at the same time, serious
problems are being ignored or glossed
over. The solution is not as widely
accepted—that EPA should use scientific
judgment to identify the best
opportunities for reducing risks to health
and the environment and should try to
shift resources and our national attention
accordingly.
Over the past decade, the practice of
assessing the risks of individual
substances has been applied routinely in
the regulatory process; it has gained
ardent supporters as well as detractors.
Both the strengths and the weaknesses of
tliis "plain vanilla" risk assessment are
magnified as it takes on the new task of
comparative risk assessment
(CRA)—ranking a portfolio of disparate
risks. Prompted by EPA's Unfinished
Business report in 1987 and spurred on by
the 1990 Science Advisory Board report
Reducing Risk, many observers have
stressed the advantages, if not the
necessity, of using CRA to set priorities.
"Our most reliable compass in a
(Fiukel and Golding arc Fellows in the Center
tor Risk Management
-------
The State of the Art
It was evident from the
presentations and discussions at
the conference that despite
widespread enthusiasm for using
empirical information to rank
environmental problems, CRA
has serious drawbacks.
Methodological problems thwart
all efforts to rank risks reliably;
the most prominent seem to be
how to address issues of
uncertainty, variability, and
commensurability, especially
given major data deficiencies. No
less daunting is designing a
ranking process that blends
expert judgment with public
values, while at the same time
avoiding other forms of elitism
and undue influence from special
interests. Finally, the biggest
obstacle to implementation seems
to be that of maintaining a logical
priority-setting process that still
gives states the flexibility to
respond to local concerns in a
timely fashion.
Criticisms and Alternatives
Many of the criticisms of CRA
raised during the first half of the
conference were serious and
potentially daunting, but none of
them really called into question two basic
tenets of the current mainstream of
thought about environmental priority-
setting:
• Risk reduction, broadly construed, is
the raison d'etre of environmental
programs, and hence of environmental
resource allocation.
• Risk assessment—or at least a "softer"
brand of analysis—is the right way to
gauge how large, or how socially
important, various problems are and is
the right guide for reducing risks
efficiently.
Clearly, EPA's efforts over the last four
years have squarely embraced both this
goal and this means—the Agency has
essentially defined "risk-based priority
setting" as encompassing a reordering of
priorities both for risk reduction and by
risk reduction.
The second half of the conference
opened up the debate to include
fundamentally different premises. Two
speakers questioned the advisability of
making risk reduction the fundamental
goal of environmental protection, while a
third embraced risk reduction as the goal
but asserted that the greatest advances
can come via a wholly different type of
analysis. (See boxes.)
Interestingly, although all three
speakers eschewed or downplayed CRA,
they did not seem to be viscerally
opposed to risk assessment per se or
ignorant of its details, which is the
caricature some observers draw of the
clash between "rationalism" and its
opponents. Rather, they clearly saw the
virtue of CRA as one tool to inform the
debate, but had concerns that were
unaddressed by (though not necessarily
antithetical to) the quantitative outputs of
risk assessment exercises. We see their
efforts as more enriching than divisive,
as we believe our national debate over
environmental protection should not
begin and end with discussions of the
consequences of our actions, but must
leave room for the discussion of moral
duties and proscriptions.
Some poor and minority
communities experience a
disproportionate pollution burden.
Proponents of environmental justice
say equity issues may fall by the
wayside in risk-based priority setting.
Steve Delaney photo
Conclusions
Our conference ended just as it
seemed to be poised to offer
practical answers to the central
questions of how our
environmental priorities ideally
should be set. As two of the
organizers, we believe we can
identify three emerging themes
in this debate where our
participants seemed to be
moving toward common
ground. First, wherever it is
being pursued, a "hard" version
of CRA needs to be "softened"
by incorporating public values
concerning the more elusive
qualities of risk, such as
voluntariness, dreadedness,
blame, and so forth; risk is a
complex, multi-dimensional
concept that cannot be
characterized adequately on the
basis of one or two attributes.
Second, citizens and state and
local governments need more
than mere access to the priority-setting
debates; they need to be "empowered" as
integral players in the process. Third,
although quantitative risk estimates
should not necessarily have the status of
"trump card" over other factors, we
should continue to refine them and give
them serious consideration.
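One way to picture a "softened" CRA is as a weighted blend of a quantitative risk estimate with scored qualitative attributes. The sketch below is purely hypothetical (the weights, attribute list, scores, and example problems are all invented) and stands in for the kind of multi-attribute scheme the conferees discussed, not any method they endorsed.

```python
# Hypothetical multi-attribute scoring: blend a normalized quantitative risk
# estimate with public-value attributes such as voluntariness and dread.
# Every number here is invented for illustration.
def soft_score(quant_risk: float, attributes: dict, weights: dict) -> float:
    total_weight = weights["quantitative"] + sum(weights[a] for a in attributes)
    weighted = weights["quantitative"] * quant_risk
    weighted += sum(weights[a] * score for a, score in attributes.items())
    return weighted / total_weight

weights = {"quantitative": 0.5, "involuntariness": 0.2, "dread": 0.2, "blame": 0.1}
problems = {
    "problem A": soft_score(0.9, {"involuntariness": 0.3, "dread": 0.4, "blame": 0.1}, weights),
    "problem B": soft_score(0.2, {"involuntariness": 0.9, "dread": 0.9, "blame": 0.9}, weights),
}
for name, score in sorted(problems.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {score:.2f}")  # rankings shift as the weights shift
```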
There was much less agreement,
however, over the question of whether
the paradigms that had been presented
were really in conflict or could be
pursued simultaneously. Some
participants seemed to resent being asked
to choose among "risk, prevention,
justice, and innovation," saying they
were for all these ideas equally. Others
found it ironic that, at a conference about
how society can't address every
environmental problem at once, some
"rationalists" were arguing that society
can embrace every priority-setting
framework at once.
The real conflict, then, if there is any,
consists in whether one or more of the
three non-risk approaches—or still other
approaches, such as targeting resources to geographic regions, setting priorities via "regulatory negotiation," or having Congress reconcile conflicts among statutory mandates—should be grafted on to a risk-based foundation, or whether they should flourish on their own, with CRA as the subsidiary consideration.

Pollution Prevention Paradigm

Barry Commoner, director of the Center for the Biology of Natural Systems at Queens College, distinguished between two fundamental strategies for solving environmental problems: "end-of-pipe" control versus pollution prevention. He argued in favor of pollution prevention as the preferred strategy, and in favor of public opinion as the only means for setting priorities that logically follows from that choice, for two reasons.

Commoner first argued that CRA is wedded inextricably to the "failed national enterprise" of pollution control, because it is designed to identify the largest risks and only reduce them down to the level of the rest of the "environmental landscape." Secondly, he observed that pollution prevention on a large scale, such as a shift to electric cars, would engender massive social and economic changes, which may make the environmental impacts "a subsidiary, though welcome, consequence rather than the prime mover." Logically, only public judgment is broad enough to determine if these changes are desirable, he said; CRA is not.

John Graham, professor of health policy at the Harvard School of Public Health, countered that pollution prevention and CRA are complementary, not mutually exclusive, because pollution prevention is just one strategy for tackling high-priority problems identified by CRA. In some cases, prevention may even be more expensive than control, or may present new risks.
Although we aren't prepared to
endorse any particular method (or
combination thereof), we strongly believe
that at the very least it does matter, both
symbolically and functionally, which
organizing principle for priority setting
emerges as the "default" or "the trump
card." For one example of how
important this distinction might be,
consider the same set of problems as
addressed by two different ranking
schemes. Among the long list of
environmental problems that EPA is
trying to prioritize are risks that are
exacerbated by single, nondurable
consumer products, such as the daily
newspaper. Production of this good
contributes to the nation's water
pollution problems, to air pollution, to
the shrinking capacity of solid waste
landfills, to deforestation, and to other
problems outside EPA's purview, such as
occupational hazards. If this problem is
seen as a collection of risks we should try
to control or prevent in some rational
sequence, the path towards amelioration
is clear: Eventually, the newspaper
producers would install pollution control
devices on their effluent points and on
their process and fugitive air-emission
points, change to less toxic inks,
encourage more recycling of newspapers,
plant more trees, etc.
Suppose, instead, that our first cut at
setting priorities used a "solution-based"
approach, such as Ashford's model.
(See box.) Government might then
consider, among a menu of actions,
subsidizing or otherwise promoting the
development and proliferation of user-
friendly electronic newspapers, wherein
ink never touches paper. A similar net
amount of risk reduction—that is, risks
reduced minus any new risks created by
the control devices or by the new
electronic product—might eventually
come by either path. But the speed and
efficiency of the transformation might be
far greater in the latter case, and the net
cost to the economy might be far less.
If space permitted, we could sketch
out an analogous example contrasting
the risk-reduction and social-
transformation consequences if risks to
particular minority and poor
communities were addressed either as
special priorities under a CRA
framework or as the centerpieces of a
"reduce first, assess later" scheme. Such
examples merely scratch the surface,
however. The point is that, as risk
assessment practitioners, we personally
have no desire to turn society's back on
this useful tool. Indeed, we would
encourage experts and representatives of
the public to come together to craft a
broader, less restrictive, and more
reliable version of CRA, whatever its
final role might be. We believe that the
unsettled end to the conference in
November augurs well for the future, as
it leaves open the door to a broader
national debate on the choice of priority
setting tools, strategies, and goals. D
(For a detailed synopsis of the conference
"Setting National Environmental Priorities,"
which was sponsored by CRM with financial
support from the Pew Charitable Trusts,
EPA, the Chemical Manufacturers
Association, and the U.S. Department of
Energy, contact the authors via fax at (202)
939-3460.)
Technological Innovation Paradigm
Nicholas Ashford, professor of
technology and policy at the
Massachusetts Institute of Technology,
asserted that strict regulation, properly
designed, can trigger technological
innovation and yield more risk
reduction at lower cost than can risk-
based priority setting schemes. He
argued that existing approaches have
ceded too much control to the polluting
industries, triggering only the diffusion
of existing technologies. Rather than
attacking the "worst risks first," he said,
EPA and other agencies should identify
which industrial sectors are most "ripe"
for innovation, perhaps to the extent of
adopting enforcement strategies that
give some leeway to companies
experimenting with new prevention or
control ideas. If agencies made these
areas their high priorities, he argued,
products and processes that create many
risks could be improved in a directed
rather than a happenstance manner.
James Wilson, regulatory issues
director for Monsanto Co., endorsed the
promotion of innovation but was
skeptical about the efficacy of Ashford's
proposal. He emphasized that
individual companies seldom know
which innovations will succeed
environmentally or economically and
said it is folly to believe that the federal
government would know better.
CROSS CURRENTS
Going Electric
A book review by Karen Flagstad
Thinking of buying a commuter car? Maybe you should think electric. No gas. No oil. No antifreeze. No guilt about heavy duty emissions during all that stop-and-go driving ....
Too futuristic, you say? Consider these facts
from transportation history:
• On the first day of this century, nearly half the
cars in the United States were electric. True, there
were only a few thousand motorcars in this
country then. But by 1910, the number had grown
to 458,000, and still nearly half of these were
electric.
• Ninety years ago, all the taxis in New York City
were electric vehicles.
• The first electric ambulance in the United States
went into service in February 1900. It delivered
patients to St. Vincent's Hospital in Manhattan.
• In 1903, eastern Massachusetts had a network of
42 battery charging stations, analogous to gasoline
filling stations, for servicing electric motorcars.
• Not gasoline powered cars but electric vehicles
were the first to feature power steering. As early
as 1897, electric taxis with power steering were
cruising around Paris.
In Solo: Life with an Electric Car (W. W. Norton,
1992; 191 pages), Noel Perrin, a professor of
Environmental Studies at Dartmouth College, tells
two stories—one historical concerning the rise and
decline of the electric car in the early decades of
this century; one personal, dating from his
decision in January 1990 to acquire an
environmentally friendly commuter car of his own.

(Flagstad is Associate Editor of EPA Journal.)

The several events mentioned above are just
a few pieces of the historical narrative that weaves
in and out of Perrin's tale of how he came to own
and operate the shiny red solar electric car that
earned the name "Solo." The eponym has a
double meaning: It refers to Solo's solitary
adventures on U.S. highways vastly dominated by
gasoline powered cars and trucks, and it alludes to
solar energy. (Solo has six solar electric panels,
four on the roof and two on the hood, and Perrin,
determined to have a clear conscience when he
walks into Environmental Studies class, has
equipped his Vermont farmhouse with 28 solar
panels that generate energy to recharge Solo's 18
batteries.)
Readers of Perrin's book may be surprised at
the early prominence that electric cars had in this
country. Yet early electric vehicles had several
advantages over their gasoline driven
competition: They started up easily with the turn
of a key (no cranking required); they operated
cleanly and quietly, whereas the early gas cars
were notoriously dirty and noisy; many were fully
enclosed and featured plate glass windows, thus
shielding their occupants from the elements,
whereas the early gas cars were characteristically
open to the air, presumably to keep gas fumes
from asphyxiating occupants; and electric vehicles
avoided the worrisome combustion dangers
associated with gasoline. Some disadvantages:
They lacked the increasing range and speed of
gasoline powered cars, and they were necessarily
limited to urban driving. (After all, at the turn of
the century, 97 percent of the United States lacked
electricity.)
Dartmouth professor Noel
Perrin, the solar electric
car he has named
"Solo," and his
pet cat.
Photography staff photo. Copyright Valley News 1991.
Ironically enough, the turning point in the
competition came when, in 1912, gasoline cars
began featuring an item borrowed from electric
vehicles: the electric starter. After that, starting a
gas car was no longer the dirty, chancy, and
undignified business it had been. In the post-
World War I era, with petroleum in plentiful
supply, electric cars ceased to be really serious
competition—even though they were
manufactured by various companies well into the
1920s, and by one company until 1940.
Then, in the 1970s, electric vehicles began to
seem potentially feasible again, although their
speed and range had scarcely changed since the
1920s. The reasons: their ability to do without gas
and oil, and their ability to reduce air pollution.
For these compelling reasons, if Perrin is correct,
electric vehicles are on the verge of a renaissance
in the 1990s and beyond.
But in January 1990, when the author resolved
to obtain an electric car, it was not easy to do so; in
fact, it was difficult enough to warrant writing a
book about the whole endeavor. We live in a
gasoline car culture. As an indication, notes
Perrin, "If you want a gas car, the whole world
conspires to help you get one. There are constant
ads on TV and in the magazines. There are
dealers in every town, special lots where you can
buy secondhand machines at low prices, whole
companies devoted to auto loans, classified ads in
every newspaper...."
Not so for electric cars. In early 1990, there were
only two small companies making electric cars in
the United States, and two companies abroad,
neither of which sold cars here. (Several U.S.
companies that began manufacturing electric
vehicles in the 1970s and 1980s had gone out of
business by 1990.) Professor Perrin learned of the
existing companies only through arduous research
and a syndicated "want ad" in the form of an op-
ed piece lamenting his frustrated quest for an
electric car. The want ad brought results.
The market is rapidly changing, however. The
epilogue of the book lists several current makes of
electric vehicles now for sale in this country as
well as a couple of used electric car dealers. And,
according to the prologue, "Almost every major
automobile manufacturer has experimental or
prototype electric cars on hand, and several are
already committed to mass production. General
Motors is, and so is Ford .... Abroad there are so
many it's hard to keep track." Peugeot, Fiat, and
BMW are mentioned.
In February 1991, Perrin contracted with Solar
Electric Engineering of Santa Rosa, California, for
a solar electric car in the form of a converted Ford
Escort Station Wagon (cost: $15,700 for the electric
car plus $1,800 for the six solar panels). The
following May, he flew to California, took some
informal lessons in the care and operation of an
electric car, and drove off in the 56th electric car
made by the company. His destination: home in
Thetford, Vermont, via an itinerary planned on the
basis of bicyclists' maps. Figuring on an average
range of 63 or 64 miles a day, perhaps double that
on some days, the intrepid Professor Perrin hoped
to get home on battery and solar power in roughly
a month or six weeks.
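A quick bit of arithmetic puts that hope in perspective (the route length is our assumption; the book gives only the daily range): a driving route from Santa Rosa, California, to Thetford, Vermont, runs on the order of 3,000 miles, and 3,000 miles at 63 or 64 miles a day works out to roughly 47 days, or close to seven weeks. Only the occasional double-range days could bring the trip down toward the hoped-for month or six weeks.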
Why drive an electric commuter car across the
continent against the practical advice of the
manufacturer? Two reasons: to avoid shipping
charges and to make a little history by being the
first to drive an electric car all the way across the
country. On the latter point, Perrin subsequently
found he was mistaken; the feat had already been
accomplished. (Of course, if Professor Perrin
hadn't given the trip the old college try, the book
might have lost much of its plotline.)
Solo didn't make it, at least not entirely on his
own, although he did pretty well until the
approach to Donner Pass in the California Sierras. But that seems not to be the point so much as what happens along the way. One thinks of Travels with Charley—with an electric car in tow playing roughly the same part as Steinbeck's pooch.
Moreover, once Solo is safely home in Vermont, he
performs admirably well in the role for which he
was intended: commuting between Perrin's farm
and Dartmouth College 13 miles away.
But are electric vehicles really so much more
environmentally sound than gas cars? There is
some debate on this question, and a chapter of Solo
is devoted to the controversy. After all, it doesn't
make sense simply to compare the emissions of
gas versus electric cars. You have to consider
where their electricity comes from and how much
of what kind of pollution is created in the process
of generating that electricity. (Not every potential
driver of an electric car can be expected to invest
in solar panels, as did Perrin, to supply the car
with pollution-free energy.) Perrin argues
convincingly in favor of figures from the
California Air Resources Board that suggest that
"mile for mile, [electric vehicles] that charge from
power plants cause less than one-eighth the
damage that gasoline cars do."
So what's the near-term outlook for electric cars
in the United States? According to this book, they
definitely have a future, but there are questions
and contingencies. One contingency concerns the
development of better batteries to boost the range
and power of electric vehicles. Questions include
whether federal and state government agencies
will encourage a transition to electric cars. (Right
now, two states actively encourage the purchase
of electric vehicles: California offers substantial
tax incentives to purchasers of electric vehicles,
and under the state's new zero-emissions law, by
1998, 2 percent of all new cars sold there must be
completely nonpolluting—graduating to 10
percent by 2003; Arizona offers dramatically
reduced vehicle registration fees to electric vehicle
owners.) And what about the general public?
Will substantial numbers of us opt for battery-run
electric cars in the interest of the environment?
Says Perrin, "Ask me again in two years."
Said to be the world's largest electric-powered vehicle, Southern California Edison's "clean air float" performed in the Tournament of Roses Parade on New Year's Day 1993 in Pasadena. Solar cells power the wizard's spinning crystal ball.
Clearing the Air
by Rich ten Wolde
The off-white haze lingered for days over the
Rocky Mountain town of Crested Butte,
Colorado. Days accumulated into weeks
and drifted into months, offering nothing but
sporadic views of the royal blue winter sky. With
the approach of spring, the disheartening haze
disappeared, but the following year it returned,
settling on the town of 1,200 for the entire winter.
At the time, the residents of Crested Butte
didn't realize that they, together with Mother
Nature, were the source of the problem. The haze
was actually pollution. Crested Butte's dilemma
was similar to that of other towns in which many
citizens rely on wood-burning stoves as a
principal heat source. Older stoves with no
pollution control equipment were emitting dense
smoke, laden with particulate matter, that became
trapped in the town during temperature
inversions. An inversion occurs when a layer of
warm air traps a cold one close to the ground,
inhibiting normal circulation. The town's
residents were breathing unhealthy levels of
smoke that violated EPA's air quality standards.
To rectify the problem, EPA's Region 8 joined
forces with the Crested Butte town council, the
Colorado Department of Health, and the Wood
Heating Alliance, representing private industry.
Together, they transformed the town into a living
laboratory for testing the emissions reductions that
new, high-tech stoves promised. And they
established programs to get homeowners to
replace their polluting stoves. The results of the
Crested Butte experiment will help solve a
problem that exists in many towns and
communities across the nation. In 70 small
American cities, levels of small particles exceed the
National Ambient Air Quality Standards, and in
about half of those towns, wood stoves are the
source for a portion of the pollution, according to
Jack Hidinger of EPA's Region 8.
(ten Wolde, who graduated from the University of
Maryland in December, was an editorial intern for
EPA Journal last fall.)
In the past, the environmental dangers of wood
smoke too often have been ignored, and
improvements in stove design have been slow to
come. Benjamin Franklin invented the first free-
standing metal stove in the 18th century. With
Franklin's design, wood burned more slowly, and
the stove emitted six times as much heat as
fireplaces. The slower burning time was made
possible by restricting the air supply, and the
increased heating was the result of the exposed
metal sides that were part of the stove's design.
The same basic stove design was followed until
the mid-1960s, when the first air-tight stove was
produced by the Fisher Stove Company. The
Fisher design permitted longer burning time of
fuel and increased heating efficiency. Unlike the
Franklin model, the new stoves also allowed the
homeowner to control the amount of heat given
off by adjusting the air intake.
Unfortunately, however, the Fisher design also
produced more pollution than previous models.
Subsequent studies of air-tight stove performance
concluded that, although the stoves were about 50
percent more fuel-efficient, they produced six
times as much particulate matter as fireplaces and
three times more than Franklin stoves.
Blaze King of Washington and Jøtul of Norway
developed the first "new-tech" or "high-tech"
stoves, which lowered particulate output about 60
percent compared to air-tight stoves. Pollution
was reduced by using an automobile-type
catalytic converter on the stoves, which filtered
out smoke particles. Companies also designed
non-catalytic stoves that burned more cleanly.
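Taken together, the article's ratios imply a rough emissions ladder (our inference; the article does not state these figures directly): with a fireplace as the baseline, an air-tight stove at six times fireplace output and three times Franklin output puts the Franklin stove at about twice the fireplace level, and a new-tech stove at roughly 40 percent of air-tight output lands near 2.4 times the fireplace level, a large improvement on air-tight models even if still above the open hearth on a per-fire basis.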
EPA had a number of objectives planned for the
Crested Butte study, in which the Agency
supplied crucial technical, coordination, and
financial support. According to Jack Hidinger, the
Agency wanted to characterize the emissions of
existing and new technology stoves and assess the
results of converting an entire town over to new
stoves. EPA also wanted to check the validity and
accuracy of a short-term test that measured
particulate emissions from stoves.
In order to succeed, the experiment, which was
organized by Hidinger and Bob McCrillis of EPA's
Office of Research and Development, needed
cooperation from the wood stove industry, whose
future sales would be affected by the study.
Hidinger and McCrillis negotiated an agreement
with the Wood Heating Alliance, which represents
stove manufacturers, that defined the roles the
participants would play in the study.
The State of Colorado and the Crested Butte
town council also joined the efforts. The town
council passed an ordinance outlawing the use of
older model stoves and provided the residents
with information on the hazards of wood smoke.
In addition, the local government and industry
representatives held a fair that promoted high-
tech stoves and offered them at discounted prices.
At the fair, information was circulated on clean
and efficient wood stove operation.
In-home emissions testing of old and new
stoves was another important aspect of the
program. Virginia Polytechnic Institute
contributed to the project by designing an in-home
emissions testing device that EPA has since
approved for emissions measurement.
During the 1988-89 winter, Crested Butte
residents had 349 conventional stoves and 85 high-
tech ones. The following year, as a result of the
changeover program, the town's residents had 276
high-tech stoves and 68 conventional models.
Some residents only occasionally used their stoves,
so they removed them rather than convert to
newer stoves; others decided to switch to an
alternate heat source.
The high-tech, EPA-certified stoves were found
to reduce particulate emissions by almost 70
percent over the old stoves. In total, airborne
particulate matter for Crested Butte decreased by
an average of 40 percent over the measuring
period. Visibility, which was measured hourly
during both years, improved about 60 percent for
1989-1990 over the previous year. Because
meteorological data revealed that the two years
were comparable, weather didn't significantly
affect data.
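A back-of-envelope check squares these figures (our illustration; the article reports only the measured results): if a certified stove emits about 30 percent of the particulates of a conventional one, the town's wood-stove emissions fell from roughly 349 + (0.3 × 85) ≈ 375 stove-units before the changeover to 68 + (0.3 × 276) ≈ 151 afterward, a cut of about 60 percent at the source. Because wood smoke was only one contributor to the particulate load, the measured 40 percent decline in total airborne particulates is consistent with that estimate.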
The Crested Butte program brought the town
into compliance with the 1990 Clean Air Act.
Because EPA and others carefully crafted a
politically and economically feasible plan, other
communities will be able to use the plan to
improve their air quality. Moreover, the residents
at the foot of Mount Crested Butte no longer have
a wintertime haze lingering over their town.
High-tech stoves were sold at discount at the Crested Butte, Colorado, wood stove fair in the summer of 1989.
Colorado Dept. of Health photo
Peregrine Falcon
From Refuge by Terry Tempest Williams
Not far from Great Salt Lake is the
municipal dump. Acres of trash heaped
high. Depending on your frame of mind,
it is either an olfactory fright show or a
sociological gold mine. Either way, it is best to
visit in winter.
For the past few years, when the Christmas Bird
Count comes around, I seem to be relegated to the
landfill. The local Audubon hierarchy tell me I am
sent there because I know gulls. The truth lies
deeper. It's an under-the-table favor. I am sent to
the dump because secretly they know I like it.
As far as birding goes, there's often no place
better. Our urban wastelands are becoming
wildlife's last stand. The great frontier. We've
moved them out of town like all other "low-
income tenants."
The dump where I count birds for Christmas
used to have cattails—but I can't remember them.
A few have popped up below the hill again, in
spite of the bulldozers, providing critical cover for
coots, mallards, and a variety of other waterfowl.
I've seen herons standing by and once a snowy
egret, but for the most part, the habitat now is
garbage, perfect for starlings and gulls.
I like to sit on the piles of unbroken Hefties,
black bubbles of sanitation. It provides comfort
with a view. Thousands of starlings cover refuse
with their feet. Everywhere I look—feathered
trash.
The starlings gorge themselves, bumping into
each other like drunks. They are not discretionary.
They'll eat anything, just like us. Three starlings
picked a turkey carcass clean. Afterward, they
crawled inside and wore it as a helmet. A carcass
with six legs walking around—you have to be
sharp counting birds at the dump.
(Tempest Williams is Naturalist-in-Residence at the
Utah Museum of Natural History in Salt Lake City.)
I admire starlings' remarkable adaptability.
Home is everywhere. I've seen them nesting
under awnings on New York's Fifth Avenue, as
well as inside aspen trunks in the Teton
wilderness. Over 50 percent of their diet is insects.
They are the most effective predators against the
clover weevil in America.
Starlings are also quite beautiful if looked at
with beginner's eyes. In autumn and winter, their
plumage appears speckled, unkempt. But by
spring, the lighter tips of their feathers have been
worn away, leaving them with a black, glossy
plumage, glistening with iridescences.
Inevitably, students at the museum will
describe an elegant, black bird with flashes of
green, pink, and purple.
"About this big," they say (holding their hands
about seven inches apart vertically). "With a
bright yellow bill. What is it?"
"A starling," I answer.
What follows is a dejected look flushed with
embarrassment.
"Is that all?"
The name precedes the bird.
I understand it. When I'm out at the dump with
starlings, I don't want to like them. They are
common. They are aggressive, and they behave
poorly, crowding out other birds. When a harrier
happens to cross over from the marsh, they
swarm him. He disappears. They want their trash
to themselves.
Perhaps we project onto starlings that which
we deplore in ourselves: our numbers, our
aggression, our greed, and our cruelty. Like
starlings, we are taking over the world.
The parallels continue. Starlings forage by day
in open country competing with native species
such as bluebirds for food. They drive them out.
In late afternoon, they return in small groups to
nest elsewhere, competing with cavity nesters
58
EPA JOURNAL
-------
such as flickers, martins, tree swallows,
and chickadees. Once again, they move
in on other birds' territories.
Starlings are sophisticated mimics
singing songs of bobwhites, killdeer,
flickers, and phoebes. Their flocks
drape bare branches in spring
with choruses of chatters,
creaks, and coos. Like any
good impostor, they confuse
the boundaries. They lie.
What is the impact of such a
species on the land? Quite
simply, a loss of diversity.
What makes our relationship to
starlings even more curious is that we
loathe them, calling in exterminators
because we fear disease, yet we do everything
within our power to encourage them as we
systematically erase the specialized habitats of
specialized birds. I have yet to see a snowy egret
spearing a bagel.
The man who wanted Shakespeare's birds
flying in Central Park and altruistically brought
starlings to America from England is not to
blame. We are—for creating more and more
habitat for a bird we despise. Perhaps the only
value in the multitudes of starlings we have
garnered is that in some small way they allow us
to comprehend what vast flocks of birds must
have felt like.
The symmetry of starling flocks takes my breath
away, I lose track of time and space. At the dump,
all it takes is the sweep of my hand. They rise.
Hundreds of starlings. They wheel and turn, twist
and glide, with no apparent leader. They are the
collective. A flight of frenzy. They are black stars
against a blue sky. I watch them above the dump,
expanding and contracting along the meridian of a
winged universe.
Illustration by
Stacey Stevenson
Suddenly, the flock pulls together like a winced
eye, then opens in an explosion of feathers. A
peregrine falcon is expelled, but not without its
prey. With folded wings he strikes a starling and
plucks its body from mid-air. The flock blinks
again and the starlings disperse, one by one,
returning to the landfill.
The starlings at the Salt Lake City municipal
dump give us numbers that look good on our
Christmas Bird Count, thousands, but they
become faceless when compared to one peregrine
falcon. A century ago he would have seized a teal.
I will continue to count birds at the dump,
hoping for under-the-table favors, but don't
mistake my motives. I am not contemplating
starlings. It is the falcon I wait for—the duckhawk
with a memory for birds that once blotted
out the sun.
(From REFUGE by Terry Tempest Williams.
Copyright © 1991 by Terry Tempest Williams.
Reprinted by permission of Pantheon Books,
a division of Random House, Inc.)
CHRONICLE
The Felling of the
Great Lakes Forests
by Teresa Opheim
"C
Breaking a jam.
If teams of horses
or dynamite failed,
workers ventured
out to loosen the piled
up mass on the river.
Stale Historical Society ot
Wisconsin photo
enturies will hardly exhaust the
pineries above us," a Minnesota booster
predicted in 1854 of the pines that
blanketed the North Woods of Michigan,
Wisconsin, and Minnesota. Fifty years later,
however, the pines were largely gone. Today, the
swiftness and thoroughness of that decimation
deserve to be remembered as a prime example of
unfettered forest exploitation in the days before the
practices of forest ecology, reforestation, and
sustained yields.
When white pioneers moved onto the grasslands
of the continent's mid-section in the early 1800s,
only one element rivaled soil in its importance:
lumber, which was scarce on the prairie but
essential for homes, barns, fence posts, and ties for
the railroad that was beginning to snake across the
continent. The settlers looked north for much of
their wood, and the lumber barons of the North
Woods offered a gem of
a supply in white pine,
which was easily
worked, straight-
grained, strong, and
durable. White pine also
floated well, an
invaluable trait as,
before the railroad, logs
were floated to market
via the region's small
rivers and streams, Lake
Michigan, and the
Mississippi River.
Settlers viewed the
steady supply of pine
coming from the north
as Providence. How
could "our munificent
Maker," one
contemporary asked,
"have left two thousand
miles of fertile prairies
down the river, without
Farewell
Departure has a sadness. For me, it is in the memory of 62 issues and more
than eight years as editor of EPA Journal. Back copies of the magazine are
my "scrapbook"—every issue has its story, the triumphs, the things that
could have been done better, the surprises, and yes, the mistakes. Every issue
meant something to me and, I believe, to somebody out there. We—the staff and
I—did succeed, I believe, in carrying out the magazine's tradition of objectively
and comprehensively addressing the big, current environmental issues with the
idea of contributing to the public dialogue about them.
What's next for the Journal? I might imagine that it won't be able to get along
without me. "You and the magazine are inseparable," a friend said recently. A
flattering thought, but not so by a long shot. The talented and energetic staff is
already taking charge. They are doing the work of the magazine now, and this
departing editor is experiencing the new, challenging feelings of "not being
needed." It is part of the sadness, but also, part of the joy—life rolls on, endlessly
finding new ways, making new choices.
But just as the American dream passes on from generation to generation, adapted,
amended, but still basically the same, on the rock of a society's conviction, I
believe the Journal will pass on, based on the rock of an idea that a government
magazine can be credible, forthright, balanced, and brave in reporting the issues
with which its agency is involved.
What's next for this editor? There will be his "fingerprint" in the magazine, in
the form of a regular, short essay in the Departments section. In these essays, I
will try to give the reader something to think about, a fresh, creative angle that
he or she may not have seen before.
And beyond this? I'll be trying to contribute to the public dialogue in other
ways, in an independent ministry of stewardship, utilizing the wonderful
experience and insights EPA has given me.
Thanks for these eight years.
Back cover: Aground off the Shetland Islands,
northeast of Scotland, the tanker Braer breaks
apart, spilling 25 million gallons of oil and putting
rich ecological resources at risk.
John Emmons photo, Photoreporters, Inc.