EPA JOURNAL
  Environmental Data:
  Providing Answers
  or Raising Questions?

-------
Environmental Data—
Providing Answers or Raising Questions?
  Environmental data—foundation for decision-making and source of controversy. This issue of EPA Journal explores the subject.
  In a lead-off article, the Agency's new Administrator, William K. Reilly, suggests a goal for EPA's widespread data-gathering efforts: measuring for environmental results.
  Next, John A. Moore, the Agency's acting Deputy Administrator as the Journal went to press, discusses the scientific evidence on Alar and apples in light of recent public controversy.
  A Journal forum follows, in which three outside observers answer the question: What kind of data does the public need to evaluate the safety of chemicals in the environment?
  Three articles follow on the
 major new supply of data
 being provided under
 the Emergency Planning  and
 Community Right-to-Know
Act. Included are a piece on Right-to-Know's potential usefulness and limitations for citizens, a piece on its uses for EPA, and a piece on the application of Right-to-Know data in local emergencies using the computerized CAMEO system.
  Next is an article by two
 opinion analysts on how  the
public gets its information
 and  develops its views on
 environmental problems.
  Then five articles feature different aspects of environmental data-gathering at EPA. They include:
  —How the Agency tracks air quality trends.
  —How researchers are measuring pollution in clouds.
  —The Agency's continued detective work on dioxin.
  —How EPA is combining different data sources into "pictures" the public can understand and decision-makers can use.
  —Finally, how Agency scientists are gearing up to better determine risks to ecological systems.

          The new Emergency Planning and Community Right-to-Know Act means homework for
          industry and citizens. Bob Boland, an environmental protection superintendent with
          Monsanto Chemical Company, spent 20 months assembling information needed for a
          Monsanto plant to comply with reporting requirements.
  Next, an article explores
how  society might more
successfully avoid
environmental problems by
looking more closely at
available data.
  Articles from an
international  perspective
include a piece arguing that
knowledge of the planet's
ecology is seriously
incomplete and a feature on
how the gases that contribute
to the Greenhouse Effect are
being traced.
  Also included in this issue
is a special report on China's
environment by an official
visiting here under a
U.S.-China bilateral exchange
agreement.
  The issue concludes with a regular feature—Appointments.

-------
                                United States
                                Environmental Protection
                                Agency
                                Office of
                                Public Affairs (A-107)
                                Washington DC 20460
                            EPA JOURNAL
                                Volume 15
                                Number 3
                                MAY/JUNE 1989
                                William K. Reilly, Administrator
                                Jennifer Joy Wilson, Assistant Administrator for External Affairs
                                R.A. Edwards, Acting Director, Office of Public Affairs

                                John Heritage, Editor
                                Karen Flagstad, Assistant Editor
                                Jack Lewis, Assistant Editor
                                Ruth Barker,  Assistant Editor
                                Marilyn Rogers, Circulation Manager
EPA is charged by Congress to
protect the nation's land, air, and
water systems. Under a mandate of
national  environmental laws, the
agency strives to formulate and
implement actions which lead to a
compatible balance between
human activities and the ability of
natural systems to support and
nurture life.
  EPA Journal is published by
the U.S.  Environmental Protection
Agency.  The Administrator of EPA
has determined that the
publication  of this periodical is
necessary in the transaction of the
public business required by law of
this agency. Use of funds for
printing  this periodical has been
approved by the Director of the
Office of Management and Budget.
Views expressed by authors do not
necessarily reflect EPA policy.
Contributions and inquiries should
be addressed to the Editor (A-107),
Waterside Mall, 401 M St., SW.,
Washington, DC 20460. No
permission necessary to reproduce
contents except copyrighted photos
and other materials.
Measuring for Environmental
Results
by William  K.  Reilly

Speaking of Data: The Alar
Controversy
by John A. Moore

What Kind  of Data Does The
Public Need?  A Forum

Right-to-Know: What It Can
Mean for Citizens
by Susan G. Hadden

Right-to-Know: What It
Means for EPA
by Charles L. Elkins
On the Scene
with CAMEO
by Jean Snider and Tony
Jover

What the Public Thinks
about Environmental Data
by David B. McCallum and
Vincent  T. Covello

Keeping Tabs on Air Quality
by Roy Popkin

There's More than Poetry
in the Clouds
by Gregg Sekscienski

Dioxin Pathways:
Judging  Risk to People
by Michael A. Callahan

Getting It Together with GIS
by Thomas H. Mace
Keeping a Closer Watch
on Ecological Risks
by Jay J. Messer

Our Record with the
Environmental Crystal Ball
by William R. Moomaw

Taking the Pulse of the
Planet
by Francis Bretherton

Tracking the Greenhouse
Gases
by Pieter P. Tans

China's Environment:
A Special Report
by Changsheng Li

Appointments
Price Change
Because EPA Journal has become
a bimonthly publication, the
subscription price has been
changed. Now, the annual rate for
subscribers is $8. The charge to
subscribers in foreign countries is
$10 a year. The price of a single
copy of EPA Journal is $2.25 in
this country and $2.81 if sent to a
foreign country. Prices include
mail costs. Previously, the
magazine  was being published 10
times yearly at an annual
subscription price of $11 in the
United States.
  Subscriptions to EPA Journal as
well as to other federal government
magazines are handled only by the
U.S. Government Printing Office.
Anyone wishing to subscribe to
EPA Journal should fill in the form
at right and enclose a check or
money order payable to the
Superintendent of Documents.  The
requests should be mailed to:
Superintendent of Documents,
GPO, Washington, DC 20402.
                                Front Cover: Apples have taken
                                their turn as the subject of
                                controversy in the intensifying
                                debate about environmental issues.
                                See article on page 5. Photo by
                                Don Carstens, Folio, Inc.
EPA Journal Subscriptions
                                The next issue of EPA Journal will
                                concern the institutional
                                challenges involved in tackling
                                environmental problems.

                                Beginning with this issue, the text
                                of EPA Journal will be printed on
                                recycled paper.
                                Design Credits:
                                Ron Farrah;
                                James R. Ingram;
                                Robert Flanagan.
Name—First, Last (PLEASE PRINT)

Company Name or Additional Address Line

Street Address

City, State, ZIP Code

Payment enclosed (Make checks payable to Superintendent of Documents)
Charge to my Deposit Account No. ____________

-------
Measuring  for
Environmental  Results
by William K. Reilly
   First, the good news:
   Based on my years in the
environmental movement, as well as my
first four months as EPA Administrator,
I believe EPA has the most talented,
most dedicated, hardest-working
professional staff in the federal
government. What's more, I think this
Agency does an exemplary job of
protecting the nation's public health
and the quality of our environment.
  Now the bad news: I can't prove it.
  I could cite facts and figures
telling how many regulations and
permits we've written since 1970, how
many enforcement actions we've taken,
how many lawsuits we've defended,
how many chemicals we've tested, how
many reports we've published. But what
do all those activities add up to? Do
they show that EPA is accomplishing its
mission—or do they just show that
we've managed to keep busy for the last
18 years?
  By listing our activities we do not
necessarily prove we're doing a good
job. When it comes to environmental
protection, the best measure of our
success must be results. Is the public
healthier than it was 18 years ago? Is the
environment cleaner? The limited
evidence we have is mixed and
inconclusive; we really don't know how
far we've come—or how far we still
have to go.
  Thus, as I begin a challenging
assignment as manager of this vital
agency, one of my first priorities is to
build on the  work of my predecessors in
strengthening our ability  to track the
nation's, and EPA's,  progress in
cleaning up the environment.
  A key element in any effort to
measure environmental success is
information—information on where
we've been with respect to
environmental quality, where we are
now, and where we want to go. Since its
beginning, EPA  has devoted a great deal
of time, attention, and money to
gathering data. We are spending more
than half a billion dollars a year on
collecting, processing, and storing
environmental data. Vast amounts of
data are sitting in computers at EPA
Headquarters, at Research Triangle Park,
North Carolina, and at other EPA
facilities across the country.
  But having all this information—about
air and water quality, about production
levels and health effects of various
chemicals, about test results and
pollution discharges and wildlife
habitats—doesn't necessarily mean that
we do anything with it. The unhappy
truth is that we have been much better
at gathering raw data than at analyzing
and using data to identify or anticipate
environmental problems and make
decisions on how to prevent or solve
them. As John  Naisbitt put it in his book
Megatrends: "We are drowning in
information but starved for knowledge."
  Our various  data systems, and we
have hundreds of them, are mostly
separate and distinct, each with its own
language, structure, and purpose.
Information in one system is rarely
transferable to another system. I
suspect that few EPA employees have
even the faintest idea of how much data
are available within this Agency, let
alone how to gain access to it. And if
that is true of our own employees, how
must the public feel when they ponder
the wealth of information lurking, just
out of reach, in EPA's huge and
seemingly impenetrable data bases?
  One of the main reasons for the
proliferation of compartmentalized data
bases can be found in our history.
Congress created EPA by linking
together several different agencies and
bureaus, each with its own
media-specific environmental
responsibilities. Rather than integrating
EPA's programs into a unified, cohesive
framework, the environmental
legislation of the 1970s only heightened
the fragmentation by assigning EPA's
pollution  control responsibilities
according to environmental medium or
category of pollutant—air, water, solid
and hazardous wastes, pesticides and
toxic substances, and so on. Each
program office, in pursuing its own
distinct legislative mandate, has created
and maintained its own unique
information systems,  geared to that
program's specific needs and  regulatory
approaches. Until recently, few attempts
have been made to generate or use data
across programmatic lines.
  This must change. EPA must take a
strategic, "big-picture" approach to the
collection and use of environmental
data. Our knowledge  and technology
have matured to the point where we can
not only integrate EPA's various data
systems, but we can also combine our
data with information from other
sources to create elegant,
information-rich pictures, or models, of
the environment as a whole.
  Using powerful new supercomputers,
for example, we can create global and
regional climate models that use

-------



                                                                             existing data to help predict the effects
                                                                             of heat-trapping "greenhouse" gases and
                                                                             ozone-depleting chlorofluorocarbons in
                                                                             the atmosphere. Within a few years, as
                                                                             the next generation of ultra-fast
                                                                             computers becomes available, these
                                                                             models will become more and more
                                                                             precise and useful in projecting future
                                                                             trends.
                                                                               On a somewhat more modest scale,
                                                                             EPA is developing geographic
                                                                             information and environmental
                                                                             modeling systems which combine many
                                                                             different kinds of data—population
                                                                             density, meteorologic and topographic
                                                                             data, location of drinking water aquifers,
                                                                             etc.—to identify sensitive populations
                                                                             and ecosystems which may be at risk
                                                                             from pollution. One example among
                                                                             many is the Office of  Research and
                                                                             Development's proposed Environmental
                                                                             Monitoring and Assessment  Program
                                                                             (EMAP), which would improve EPA's
                                                                             ability to assess the effects of specific
                                                                             pollutants  on ecosystems, as well as the
                                                                             impact of EPA programs on mitigating
                                                                             any adverse effects.
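
To make the overlay idea concrete, here is a minimal sketch in Python—invented grid data and thresholds, not a description of EMAP or any actual EPA system—showing how several map layers can be joined on a common grid to flag cells where a pollution source coincides with a sensitive population or a drinking-water aquifer:

# A toy GIS-style overlay. All layers, values, and thresholds are
# invented for illustration. Each layer maps a (row, col) grid cell
# to a value for that cell.
emissions = {(2, 3): 1200.0, (5, 1): 300.0}   # pounds released per year
population = {(2, 3): 8000, (4, 4): 12000}    # residents per cell
aquifers = {(5, 1), (5, 2)}                   # cells over drinking-water aquifers

POPULATION_THRESHOLD = 5000  # hypothetical cutoff for a "sensitive" cell

at_risk = {}
for cell, pounds in emissions.items():
    reasons = []
    if population.get(cell, 0) > POPULATION_THRESHOLD:
        reasons.append(f"{pounds:.0f} lb/yr near {population[cell]} residents")
    if cell in aquifers:
        reasons.append(f"{pounds:.0f} lb/yr over a drinking-water aquifer")
    if reasons:
        at_risk[cell] = reasons

for cell, reasons in sorted(at_risk.items()):
    print(cell, "->", "; ".join(reasons))

Real systems add map projections, interpolation, and far richer layers, but the flagging logic reduces to this kind of cell-by-cell join.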
                                                                               All EPA programs must begin looking
                                                                             for creative new ways to make use of
                                                                             the information gathered by their
                                                                             counterparts  in other  programs and
                                                                             agencies—as  well as new, multimedia
                                                                             data such as  the toxic chemical release
                                                                             information now  available through the
                                                                             Emergency Planning and Community
                                                                             Right-to-Know Act of  1986. This

-------
information can be invaluable in
helping to define total pollutant
loadings in this country—as well  as the
effectiveness of our efforts to minimize
them. It can also help us improve our
ability to anticipate and head off  future
environmental problems.
  The Agency's new  Information
Resources Directory,  which lists and
briefly describes all of EPA's data
systems as well as other information
sources, is a step in the right direction.
So is our new Comprehensive
Assessment Information Rule (CAIR),
which allows all EPA programs and
other federal agencies to obtain the
information they need on the
manufacture, importation, and
processing of chemicals of regulatory
interest. The strategic information effort
I have described, however, will require
a new attitude on the part of every EPA
program manager—a  willingness  to
break out of the traditional constraints
of media-specific and category-specific
thinking.
  A number of suggestions have been
made for ways to institutionalize  this
strategic approach to environmental
data, both to improve our existing
activities and to identify areas that are
not being addressed.  One  such  idea is a
proposal to create within  EPA a
semi-independent Bureau of
Environmental Statistics,  which would
be charged with overseeing the
collection, analysis, and dissemination
of environmental data. Another
proposal, made last fall by the Research
Strategies Committee of EPA's Science
Advisory Board, is the creation of an
Environmental Research Institute.
Among other things,  the Institute would
conduct ecological research and monitor
and report annually on overall
environmental conditions and trends.
These and other suggestions should be
given careful consideration as we look
for ways to focus our resources where
they can have the greatest  impact on
reducing environmental risk.
  Just as important, we must find ways
to share our data more effectively with
the people who paid for it in the first
place: the American public. Eventually,
as EPA makes progress in standardizing
and integrating  its information systems,


the information in those systems—apart
from trade secrets—should be as
accessible as possible. Such information
could be made available through on-line
computer telecommunications, through
powerful new compact disc (CD-ROM)
technologies, and perhaps a
comprehensive annual report on
environmental trends.
  Sharing information with the public is
an important step toward establishing a
common base of understanding with the
American people on questions of
environmental risk. As the recent furor
over residues of  the chemical Alar on
apple products shows, there can be a
wide gap between public perceptions of
risk and the degree of risk  indicated by
the best available scientific data. When
this happens, government can become
preoccupied with responding to public
outrage over sensational,
well-publicized hazards at the expense
of dealing effectively with  less obvious,
but perhaps more significant, risks to
public and environmental health.
  EPA must share and explain our
information about the hazards of life in
our complex industrial society with
others—with other nations, with state
and local governments, with academia,
with industry, with public-interest
groups, and with citizens. We need to
raise the level of debate on
environmental issues and to insure the
informed participation of all segments
of our society in achieving our common
goal: a cleaner, healthier environment.
  Environmental data, collected and
used within the strategic framework I
have described, can and will make a
significant contribution to
accomplishing our major environmental
objectives over the next few years.
Strategic data will  help us:

• Create incentives and track our
progress in finding ways to prevent
pollution before it is generated.
• Improve our understanding of the
complex environmental interactions that
contribute to international problems like
acid rain, stratospheric ozone depletion,
and global warming.
• Identify threats to  our nation's
ecology and natural systems—our
wetlands, our marine and wildlife
resources—and find ways to  reduce
those threats.
• Manage our programs and target our
enforcement efforts to achieve the
greatest environmental results.
  With respect to environmental data,
then, our long-term goal is clear. In the
future, when someone asks us if EPA
has done an effective job of protecting
the environment, we  should be able to
reply without hesitation: "You bet—and
we can prove it!"

(Reilly is Administrator of EPA.)

-------
Speaking  of  Data:
 The Alar  Controversy
 by John A. Moore
                                                                          American consumers have
                                                                          experienced a veritable media blitz
                                                                       concerning the pesticide Alar
                                                                       (daminozide), a plant growth regulator
                                                                       that may be used on apples to postpone
                                                                       fruit drop, enhance the color and shape
                                                                       of the fruit, and extend storage life. Not
                                                                       accidentally, this media attention
                                                                       coincided with the release of a report
                                                                       entitled Intolerable Risk: Pesticides in
                                                                       Our Children's Food, published by the
                                                                       Natural Resources Defense Council
                                                                       (NRDC).
                                                                         The public was alarmed to the point
                                                                       of panic. Worried mothers dumped
                                                                       apple juice down the drain, despite
                                                                       EPA's repeated assurances that such
                                                                       measures were unnecessary and
                                                                       inappropriate. Apples were banished
                                                                       from school cafeterias. Consumers were
                                                                       further unsettled and confused as points
                                                                       of disagreement between EPA and
                                                                       outside groups on the risks of Alar
                                                                       played out in the press and on TV. As one
                                                                       measure of the general confusion, it was
                                                                       necessary for EPA to issue a
                                                                       statement—responding to a query from
                                                                       "60 Minutes" following the program's
                                                                       feature story on Alar—saying that the
                                                                       dietary cancer risks of Alar had been
                                                                       overstated by the NRDC report on the
                                                                       one hand, but were understated on the
                                                                       other hand by the apple industry in its
                                                                       rebuttal.
                                                                         The initial panic has subsided
                                                                       somewhat, but an uneasy confusion
                                                                       remains, and it is still very difficult for
                                                                       the ordinary consumer to sort through
                                                                       the issues on Alar. Moreover, as a
                                                                       consequence of the Alar case, the
                                                                       public, sadly, is left with lingering
                                                                       doubts about the safety of our food

-------
supply and whether our pesticide
regulatory system is adequate to protect
it.
  In general, consumers, understandably
enough, have limited patience with
extended deliberations by scientists and
regulators over scientific data, and in
the case of Alar there have been
protracted scientific deliberations.
While scientists and regulatory officials
are concerned with questions of
scientific uncertainty and qualitative
versus quantitative aspects of risk
assessment, consumers—who generally
do not speak the language of risk
assessment—tend to ask very direct
questions. Is it safe to eat apples? Is it
safe for my child to consume apple
products? Does Alar cause cancer?
  The answers to the first two
questions, concerning the continued
consumption of apples and apple
products, are definitely yes in both
instances.
  Two questions are implicit in the
third question. First, is Alar a known
human carcinogen? The answer is no;
scientists do not have direct  evidence in
humans that traces actual cancer cases
to Alar exposure. In  fact, there are
comparatively few chemicals in the
world which have been demonstrated
beyond doubt, on the basis of
epidemiological data, to cause cancer in
humans.
  Second, does Alar cause cancer in
laboratory animals? The answer is yes;
Alar and its breakdown product called
unsymmetrical dimethylhydrazine
(UDMH) have increased the incidence of
tumors in mice. Moreover, EPA has
recently received interim data from new
cancer studies on Alar and UDMH; final
reports from these studies are due to
EPA in September 1989 and January
1990.  On analyzing these interim
results, Agency scientists found a direct
correlation between exposure to UDMH
and the development of malignant
tumors in test mice.
  Because of the implications of these
new cancer data, on  February 1, 1989,
EPA announced its intention to initiate
cancellation proceedings, through its
Special Review process, for all food
uses of Alar; a final decision, through
this formal process, will be forthcoming
by mid-1990. But again, from the
standpoint of concerned consumers,
what are these implications?
  It is difficult to understand cancer
risks—or any kind of risk, for that
matter—without a  meaningful frame of
reference. For perspective, one of the
key words consumers should keep in
mind in the Alar case, and generally in
cases of chemicals said to pose cancer
risks, is long-term. In evaluating the
risks of pesticides  to consumers, EPA
uses the working assumption that
dietary exposure to the pesticide occurs
over a lifetime (70 years). This is one of
a number of conservative assumptions
that EPA factors into its chemical risk
projections.
  Another key word in chemical risk
assessment is  incremental. Using widely
accepted quantitative risk assessment
"models," EPA calculates
"upper-bound" (worst-case) estimates of
incremental (increased) risks due to
pesticide exposure, above the
background cancer risk in the general
population. The background (lifetime)
cancer  risk in  the general population is
roughly one in four, or 0.25 (2.5 x 10⁻¹).
It is also important to note that these
incremental risk estimates represent  the
upper bound of  theoretical risks and
reflect highly  conservative assumptions
used in the risk  extrapolation  process.
                 "Pass it along. Apples are back."
                 Drawing by Modell; © 1989
                 The New Yorker Magazine, Inc.

-------
 Since theoretical risks are, by definition,
 upper-bound estimates, actual risks may
 be lower or even zero.
   As an index for regulatory decisions,
 EPA's stated policy is that lifetime
 incremental cancer risks from exposure
 to a pesticide in the diet should not
 exceed one in one million or 0.000001
 (1 x 10⁻⁶)—meaning a
 one-in-one-million risk over and above
 the background risk of one in four. This
 is the definition of "negligible risk"
 applied by EPA and the Food and Drug
 Administration.
   Based on the new cancer data that we
 now have, EPA has preliminarily
 estimated cancer risks from  dietary
 exposure to Alar over a 70-year lifetime
 as an increased risk of 4 cases per
 100,000 persons (4 x 10⁻⁵). EPA deems
 this lifetime, 70-year risk to be
 unreasonable under the meaning of the
 Federal Insecticide, Fungicide, and
 Rodenticide Act (FIFRA). This
 conclusion, which is consistent with the
 Agency's negligible-risk policy, has led
 the Agency to start Alar cancellation
 proceedings under the law.
   The term unreasonable is important
 here and has a specific legal  meaning
 under FIFRA, which sets a risk/benefit
 standard for the registration (licensing)
 of pesticides. Under FIFRA, a pesticide
 fails the risk/benefit test for  initial or
 continued  registration if, when  used
according to label directions, it presents "any unreasonable risk to man or the environment, taking into account the economic, social, and environmental costs and benefits of the use of the pesticide." In other words, since there is really no such thing as a risk-free pesticide, FIFRA requires EPA to balance the risks of a pesticide against its socio-economic benefits in the pesticide decision-making process.

                                                          A store sign reads: "We want you to know it has
                                                          been our long-standing policy to buy apples only
                                                          from growers who certify to us that they
                                                          DO NOT USE ALAR."
  In addition to the regulatory option of
cancellation when a pesticide is found
to pose unreasonable risks, the  law
gives EPA the authority to suspend the
registration of the pesticide immediately
if its continued, short-term use  poses an
"imminent hazard" during the time
required  to complete cancellation
proceedings. In separate calculations,
the Agency has also estimated lifetime
cancer risks from exposure to Alar over
the 18-month period EPA needs to
receive final study results and conclude
cancellation proceedings  on Alar. For
the general population this risk is
estimated at an additional 8 cases per
10 million persons (8 x 10⁻⁷). For
children, the estimated lifetime risk
from exposure for this period is 8
additional cases in one million (8 x
10⁻⁶). These risk projections, for either
children or adults, do not in EPA's best
judgment represent an imminent hazard
to consumers and thus do not warrant
suspension action.

  Again, it should be understood that
the short- and long-term risk projections
EPA has calculated for Alar represent
theoretical risks. Because of highly
conservative assumptions built into
EPA's risk extrapolation process, actual
risks may be lower or even zero.
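  To see what those exponents mean in human terms, the arithmetic can be laid out directly. The sketch below is illustrative only; the inputs are the upper-bound estimates quoted above, which actual risks may not reach:

# Translates the upper-bound lifetime risk estimates quoted in this
# article into additional cancer cases per million people exposed.
# Illustrative only: these are theoretical upper bounds, and actual
# risks may be lower or even zero.

NEGLIGIBLE_RISK = 1e-6  # EPA/FDA benchmark: one in one million

risk_estimates = {
    "70-year dietary exposure to Alar": 4e-5,       # 4 per 100,000
    "18-month exposure, general population": 8e-7,  # 8 per 10 million
    "18-month exposure, children": 8e-6,            # 8 per 1 million
}

for scenario, risk in risk_estimates.items():
    print(f"{scenario}: up to {risk * 1_000_000:g} cases per million exposed")

long_term = risk_estimates["70-year dietary exposure to Alar"]
print(f"Long-term estimate is {long_term / NEGLIGIBLE_RISK:g} times the "
      "one-in-one-million negligible-risk benchmark")

The 40-fold exceedance of the benchmark is what makes the long-term risk "unreasonable" under EPA's negligible-risk policy, while the separate, much smaller 18-month figures fall under the different "imminent hazard" test for suspension.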
  EPA's assessment of Alar-related risks
to children and other consumers is
severely at odds with the conclusions
presented in the report released on
February 27, 1989, by the NRDC and
widely promoted through a coordinated
publicity campaign. In  this report, the
NRDC presented cancer risk projections
for children, from dietary exposure to
UDMH, that are up to 100 times higher
than EPA's estimates. The report alleges
that Alar and UDMH thus account for
90 percent of the pesticide-related
cancer risk to children  and concludes
that American children face a "massive
public health problem" from pesticide
residues in foods.
  How could the cancer risks posed by
Alar, as extrapolated from scientific
data, be open to such drastically
different projections? The primary and
greatest difference between EPA's and
the NRDC's estimates arises from the
fact that the NRDC used cancer potency
estimates from data on  UDMH which
were rejected in 1985 by a FIFRA

-------
Scientific Advisory Panel. Mandated by
FIFRA, the Scientific Advisory Panel is
comprised of outside experts convened
by EPA to review scientific questions
related to major pesticide decisions or
regulations. Following a public meeting
held in September 1985, the panel
issued a formal opinion stating that the
existing cancer studies  on Alar and
UDMH, while raising concerns about
possible cancer risks, were inadequate
for the purpose of either quantitative or
qualitative risk assessment.
  Following the panel's vote of "no
confidence" in these cancer studies,
EPA decided to postpone its final
regulatory decision on Alar and, as an
interim regulatory measure, took steps
to lower the tolerance (maximum legal
residue limit allowed under the Federal
Food, Drug, and Cosmetic Act, or
FFDCA) for Alar on apples and to
reduce the application rates.
Meanwhile, the Agency used its
data-collection authorities under FIFRA
to require the manufacturer to sponsor
and submit new cancer studies for both
Alar and UDMH. EPA's risk assessment
of Alar is based on data from these new
cancer studies.
  In addition, the NRDC used  recent
(1985) data on food consumption from  a
small survey  (2,100 people) which had a
relatively poor completion rate (65
percent), and these data were
inappropriately used  in the  NRDC
calculations. For dietary risk
assessment, EPA used data from a  much
larger survey of over 30,000 persons
conducted by the USDA in 1977-78,
which had a 95-percent completion rate.
(This USDA survey is currently being
updated to reflect 1987-88 data.) These
factors together with a number of
differences in procedures account for
the  vast  differences between EPA's and
the  NRDC's assessment  of Alar-related
dietary risks—including the 100-fold
difference in the estimated risk that Alar
poses for children.
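  The arithmetic behind such a gap is worth spelling out. In quantitative risk assessment of this kind, the lifetime dietary risk estimate is essentially the product of a cancer potency factor and a dietary exposure estimate, so a disagreement in each input multiplies through. A minimal sketch follows; the numbers and the 20 x 5 split are invented purely to show the compounding, not the actual EPA or NRDC inputs:

# Invented inputs, for illustration only. In linearized quantitative
# risk assessment, lifetime cancer risk is approximately potency * dose.
def lifetime_risk(potency, dose):
    """potency: risk per (mg/kg/day); dose: lifetime average daily dose."""
    return potency * dose

baseline = lifetime_risk(potency=0.5, dose=2e-5)

# An analysis that adopts a 20x higher potency estimate (say, from studies
# rejected in peer review) and a 5x higher consumption estimate arrives
# at a figure 100x larger -- with no new toxicological evidence at all.
inflated = lifetime_risk(potency=0.5 * 20, dose=2e-5 * 5)

print(f"baseline: {baseline:.1e}  inflated: {inflated:.1e}  "
      f"ratio: {inflated / baseline:.0f}x")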
  In summary, the NRDC report is
gravely misleading for a number of
reasons, including the NRDC's use of
data that were rejected  in scientific peer
review together with food consumption
data of unproven validity. The report
also misleadingly alleged that EPA's
analyses of pesticides in the diet fail to
take into account that children and
infants typically eat more food in
relation to their body weight, and more
of certain types of food, than the
average adult. Generally the Agency
bases its pesticide tolerance decisions
on a composite average lifetime risk,
which includes a proportionately greater
exposure occurring  in childhood.
However, EPA also  routinely calculates
exposure values for the two most
sensitive subpopulations  identified by
our computerized Tolerance Assessment
System; these calculations allow us to
ensure that no particular  group—such as
infants and children—receives exposure
that is likely  to cause unreasonable
risks.
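  One way to picture a composite average that still weights childhood more heavily is a time-weighted average of dose over the assumed 70-year lifetime. The sketch below uses invented life stages and consumption figures purely to illustrate the weighting; it is not the Tolerance Assessment System:

# Invented life-stage exposures, in mg of residue per kg of body weight
# per day. Intake relative to body weight is highest in early childhood.
life_stages = [
    (5, 0.010),   # ages 0-5
    (13, 0.004),  # ages 5-18
    (52, 0.002),  # ages 18-70
]
LIFETIME_YEARS = 70  # standard lifetime assumed in dietary risk assessment

# Time-weighted lifetime average daily dose (LADD)
ladd = sum(years * dose for years, dose in life_stages) / LIFETIME_YEARS
print(f"Composite LADD: {ladd:.4f} mg/kg/day "
      f"(adult-only rate would be {life_stages[-1][1]:.4f})")

Because the childhood years here carry five times the adult dose rate, the composite comes out nearly 50 percent above the adult-only figure, which is how heavier childhood consumption raises the lifetime average.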
  EPA is also concerned  about the
possibility that children and infants
may be more sensitive to  toxic effects of
pesticide residues in their diets than are
adults. Scientific data to resolve this
uncertainty are limited and
inconclusive, and available studies
actually show mixed results. EPA has
commissioned the National Academy of
Sciences to study this issue and make
recommendations as to whether
modifications are needed in the
Agency's pesticide risk assessment
process. We expect the report of this
study in 1990.
  Apropos of this issue of EPA Journal,
and in the wake of the apple panic of
1989, what does the Alar case tell us
about the applications of scientific data
in the context of pesticide
decision-making?  On evaluating the
overall weight of evidence that is now
available concerning the dietary cancer
risks associated with continued use of
Alar, EPA has found the long-term risks
to be unreasonable and, consequently,
has taken steps  to effect cancellation of
the pesticide. Yet, to return  to a point
mentioned earlier, there  are no "hard
facts" on hand that directly  link Alar
with human cancer cases.
  Moreover, Alar is not unusual in this
respect. The truth is that hard evidence
on the effects of pesticide chemicals is
generally limited to those cases where
short-term pesticide exposure has
caused acute toxic poisoning in
humans, or killed important non-target
organisms in significant numbers. Such
acute toxic effects are  immediately
apparent. Most risk scenarios are not so
easy to assess, and this is especially true
of chronic or delayed health effects such
as cancer, reproductive dysfunctions,  or
effects on the unborn.
  Such chronic or delayed effects do
not become apparent for a long time,
and when they do occur, it  is almost
always impossible to trace them with
certainty  to exposures to specific
chemicals. Instead, the evidence at hand
consists of the raw materials of risk
assessment: animal data tabulations,
cancer potency estimates based on
animal study results, food consumption
statistics, exposure estimates. As  the
Alar case has brought  home, such data
are susceptible to  manipulation and
may be used selectively and
inappropriately  to make calculations
that misrepresent pesticide risks.

-------
                           Cartoon: "THEY'RE SAFE! CAN'T YOU TRUST ME?"
                           Copyright © 1989, The Miami Herald.
                           Reprinted with permission.
on the basis of informed judgments. As
implied by this regulatory framework,
we have decided, as a society, not to
wait for conclusive evidence of
widespread damage to health and the
environment so that we can be
absolutely certain of all the facts before
we weigh the risks of a  pesticide against
its benefits.
   Dealing with uncertainties  in the
regulatory process can be especially
problematic for existing pesticides, such
as Alar, which  may remain on the
market until EPA has gathered enough
evidence to justify regulatory action
under FIFRA. Under the law, the same
health and ecological standards apply to
both new and old pesticides. However,
in dealing with new pesticides, EPA
may withhold registration approval and
keep the pesticide off the market as long
as necessary to resolve uncertainties
concerning its potential risks. With an
existing pesticide, EPA  must  build a
scientifically and legally defensible case
as to whether the pesticide poses
unreasonable risks before the Agency
can take regulatory action.
   For EPA, the Alar case has been
problematic from a  scientific  and
legal/procedural standpoint, and the
Agency has tended  to focus its attention
primarily on resolving these problems,
generally failing to anticipate a
mounting problem in public perception.
For the consumer public, on the other
hand, I think it is fair to say that the
Agency's ongoing review of Alar has
been a lingering question mark without
an answer. Public opinion,
understandably, tends to be impatient
with extended deliberations by
scientists and regulators while health
effects questions have been raised, but
not resolved. Regarding  Alar and the
apple panic of 1989, my own  view is
that the NRDC report struck a chord
among consumers by providing
"answers" in a case that EPA, facing
scientific uncertainties, had not yet been
able to bring to final closure under
FIFRA.
  What can be done to avoid  recurrence
of the kind of confusion that prompted
the Alar apple panic? I believe part of
the answer lies in streamlining the
cancellation process under FIFRA, and I
have repeatedly stated the need for
changes in the law to allow EPA to get a
pesticide off the market more quickly
and easily once a  problem has been
identified. This is particularly important
as a follow-up to the 1988 amendments
to FIFRA.
  Under FIFRA '88, the review and
"reregistration" of existing pesticides
will be proceeding on an accelerated
basis, to be completed in approximately
nine years. Ultimately, this accelerated
program will mean increased public
confidence in pesticide regulation and
the safety of the food supply, since all
pesticides that remain registered must
be supported by a complete data base.
However, it is only  realistic to expect
that over the next few years, the
accelerated review of existing pesticides
will uncover risk concerns  that will
require resolution through EPA's
Special Review process. As a result, it
will not be surprising if pesticide issues
are in the news with increasing
frequency.
  Finally, the Alar apple panic suggests
a strategic failure of risk communication
on the part of EPA and underscores the
need for more comprehensive, more
proactive public education  and risk
communication initiatives on pesticide
issues than EPA has carried out in the
past. In the Alar case, the public was
very prone to give credence to the
selective and inappropriate use of data
regarding consumer risks and to believe
"the worst" despite counterstatements
from EPA. Clearly, consumers need a
basic frame of reference for
understanding pesticide issues if they
are to become less vulnerable to
alarmist publicity in the future.
  Is it really EPA's  job, as a regulatory
agency, to promote  this kind of
broad-scale consumer education effort? I
believe the answer is yes, if pesticide
decision-making is to take place in an
atmosphere that is relatively free of fear,
confusion, and unnecessary economic
disruption. Of necessity, I see consumer
education and risk communication as
increasingly an integral part of EPA's
job.

(Dr. Moore has served EPA as Assistant
Administrator for Pesticides and Toxic
Substances and as Acting Deputy
Administrator.)

Editor's Note: As EPA Journal went to
press, Uniroyal Chemical Co.
announced that all sales of Alar for
food-crop uses are being voluntarily
halted in the U.S. Uniroyal's voluntary
stop-sale action  will remain in effect
until EPA has reached a final
regulatory decision.

-------
What  Kind  of  Data
Does  The  Public  Need?
What kind of data does the
pubJic need to evaluate the
safety of chemicals in the
environment? This issue
comes up with increasing
frequency in discussions of
risk communication and
recently enacted federal and
state community
right-to-know laws. EPA
Journal asked three
experienced observers to
address the  question. Their
comments follow.
Elizabeth  M. Whelan
   Consider, for a moment,
   these chemicals: safrole,
hydrazine, tannin, and ethyl
carbamate. We ingest them
every day when we consume
pepper, mushrooms, tea, and
bread. Now consider this:
each of these chemicals is a
naturally occurring
carcinogen. Do they
jeopardize human health?
Should this information lead
to a movement to eliminate
tea, outlaw mushrooms,
condemn  pepper, and banish
bread from our tables?
  Of course not.  Yet that's
where a manipulation of the
numbers,  and a
misinterpretation of the facts,
can take us. The  numbers can
be made to show that a
substance is killing us—even
when there isn't  the remotest
possibility. How, then, can a
mother be sure that her
food-shopping purchases will
nourish her family and not
contribute to its morbidity?
  If you listen to every
restrictive environmental
report that has received
media attention,  you know
that in addition to apples,
you shouldn't eat most other
fruits, not to mention meats,
fish, fowl, vegetables, eggs, or
milk products. You shouldn't
even drink the water. This
raises the question: how can
we make intelligent choices
about  risk?
  Determining levels of
safety in the environment,
which is broadly defined to
include lifestyle, must start
with some basic premises.
The first is that public health
means preventing premature
disease and death. The
second is that public health
policy should ensure safety,
not  harass industry or
needlessly terrify the public.
  What Americans suffer
from is not a lack of data. It's
something else entirely. The
malady that needs immediate
attention is called
nosophobia. It's akin to
hypochondria, but different.
  Hypochondriacs think they
are sick. Nosophobics think
they will be sick  in the
future because of lurking
factors in their diet and
general environment. They
fixate  on an array of allegedly
health-threatening gremlins.
Due to this phobia, they
believe that living—and
eating and drinking—in
America in 1989  is
inherently hazardous to their
health. They are sure there is
a death-dealing carcinogen
on every plate, a life-sapping
toxin under every pillow.
They see salvation only in
ever-increasing federal
regulations and bans.
  The nosophobics' fears of
Alar and other agricultural
chemicals used in the United
States are obviously purely
emotional. These are fears of
"invisible hazards," which
have always played a special
role in the mass psychology
of paranoia, according to
Park Elliott Dietz, Professor
of Law and Psychiatry at the
University of Virginia.
Yesterday's invisible hazards
gave rise to monster legends,
claims of witchcraft, and
vampire myths. Today, notes
Dr. Dietz, we see the same
phenomenon among those
who exaggerate the hazards
of radiation, chemicals, toxic waste, and food additives.
  The most deadly public health issues that threaten our lives have been obscured in the face of trumped-up charges against the food we eat, the water we drink, and the air we breathe. They fall under the category of hazardous lifestyles. And the data detailing the toll they take on human lives—not the lives of laboratory rats and mice—are compelling and truly frightening.
  Let's take a ride to Marlboro country. Cigarette smoking claims 1,200 lives a day. In just one year, over 400,000 will perish because they'd rather die than switch. Another obvious example of a hazardous lifestyle habit is excessive or abusive alcohol consumption, which claims 100,000 lives annually. Add to this the use of addictive substances, such as heroin, cocaine, and crack, which claim some 50,000 lives by each year's end.
  These numbers are in. They aren't hypothetical. They aren't based on probability theories that require one to suspend disbelief. These data detail a real loss of life. Clearly, our focus should be on environmental lifestyle issues that, left unchecked, are systematically and prematurely killing our population. As a society, however, we seem more willing to assume the enormous and deadly risks of smoking or not wearing seatbelts—risks that are within our power to prevent. Ironically, what we appear to be unwilling to tolerate are the minute, infinitesimal risks we perceive to be outside our control. Today's prime example is the risk the
public perceives when
chemicals are married to
food.
  What most don't
understand is that food  is
100-percent chemicals. Even
the foods on our holiday
dinner tables—from
mushroom soup to roasted
turkey to apple pie—contain
naturally occurring chemicals
that are toxic when taken in
high doses. Undoubtedly,
there are some who may
think we should start
worrying about levels of allyl
isothiocyanate in broccoli,
because this naturally
produced chemical is, in
high doses, an animal
carcinogen. Where does it
end? Worrying about more
numbers to focus more
attention on non-issues
accomplishes absolutely
nothing.
                                                                                      (Dr. Whelan is President of
                                                                                      the American Council on
                                                                                      Science and Health.)

-------
 David Roe


      What's true of statistics is
      also true of chemical
 safety data: there are lies,
 there are damn lies, and then
 there are risk calculations.
   The public emphatically
 does not need to be deluged
 with "the data" on health
 risks from chemical
 exposures, general  or
 specific, and told to make up
 its own mind. This, in effect,
 is too often what happens
 now by default, particularly
 in controversial cases. The
 public is not interested in
 government's abandoning the
 responsibility for deciding
 where chemical regulatory
 limits lie.
   What the public  does want
 and need is a system that
 delivers a clear signal when a
 chemical exposure crosses a
 boundary from the trivial to
 the significant, like the red
 light above a hockey net that
 flashes when the puck enters
 the goal. The public also
 needs assurance  that the
 system is hooked up and
 operating, so that the light
 goes on when the line is
 crossed, no matter  which
 teams are on the ice. And
 people need to know that the
 line itself is not being curved
 back into the net, or even
 erased, just before the
 playoffs.
   In other words, the public
 wants assurance that clear,
 consistent, and meaningful
 standards are in  place and
 that those standards are being
 obeyed. This
 simple-sounding system is
 exactly what Congress,  EPA,
 other agencies like the Food
 and Drug Administration,
 and their equivalents at the
 state level have been
 promising for the last 20
 years, in the specific context
 of toxic chemical regulation.
 Part  of the promise of such  a
 system is that the data that
 ordinary citizens will be
 provided to evaluate the
 safety of chemicals in their
environment will be a simple
set of yes-or-no answers. Is
there a red light and a goal
line  for that chemical? Is the
system hooked up? And is
the light flashing?
  Of course, all this is easier
said  than done, as the last 20
years have shown. Progress
on standard-setting has been
excruciatingly slow.
Enforceable numerical limits
have been set at a rate
averaging  less  than one
chemical per year under key federal laws—the Toxic Substances Control Act (TSCA), the Safe Drinking Water Act, and the toxics section of the Clean Air Act. The standards that do exist have varied enormously in the amount of calculated health risk that they allow.
  In part, the reason is that calculation of health risks from toxic chemical exposures, based on the usual data, is genuinely uncertain and depends heavily on calculating conventions as well as direct results of laboratory experiments or epidemiological studies. As with the calculation of Gross National Product statistics, there is room for honest disagreement as to where the line should be drawn.
  But another factor is at least as important in explaining the wholesale failure of chemical standard-setting over the last 20 years. Built into the structure of all of the major federal laws on toxic chemicals is a powerful disincentive to resolving disagreements and setting actual standards. In effect, under the prevailing legal structure, no enforcement (and thus no protection) whatsoever takes place with regard to a specific chemical until after a standard is set.
  Industries responsible for the greatest excesses with a particular chemical, and the ones most worried about enforcement action, therefore have a strong incentive to stall the process as long as possible. This means drawing out not only every scrap of honest potential debate over risk calculations but also trumped-up disputes and elaborate delaying tactics as well. The practical results of
this incentive structure are
only too apparent.
  As long as  clearly
delineated, health-based
standards for toxic chemicals
are the rare exception rather
than the rule, it is somewhat
misleading to talk about the
kinds of data that the public
needs to evaluate the safety
of chemicals  in the
environment. The answer is
either "standards" or "all the
elements necessary to
calculate standards and
consensus on the calculating
methods." If the latter are
available, of course, then the
former will not be far behind.
  Fortunately, the
disincentives that inhibit
standard-setting under
conventional toxics laws are
not immutable. A new law in
California, designed with
exactly this problem in mind,
has created a structure in
which industry is as eager as
the public to  succeed in
setting standards for
particular toxic chemicals.
Passed by direct voter
initiative in 1986, with
first-stage  enforcement
beginning approximately one
year ago, the  law is
commonly known by its
ballot name: Proposition 65.
  Under Proposition 65's
new incentive structure,
California's regulators
managed to draw clear,
numerical, health-based
standards for more chemicals
in 12 months than EPA had
managed to address under
TSCA in 12 years—in fact,
more than twice as many.
Most of the data on which
the California lines were
based came directly from
EPA and other  national and
international bodies, which
had long since  completed
critical evaluation of the
research results; the
difference was  that, for once,
there was a premium on
getting to  the bottom line.
  Proposition 65 has also
produced  hundreds of  pages
of regulatory consensus on
technical issues such as
low-dose extrapolation
models, exposure
assumptions, and other
elements of risk calculation,
all now being applied in
uniform fashion to some 280
different carcinogens and
reproductive toxins. Perhaps
most impressive to insiders is
the fact that, despite intense
controversy over these  rules,
not a single word of them has
been challenged in court by
any of the potentially
affected industries. The
contrast with federal agency
experience concerning  the
same issues is dramatic.
  Doing no more than
adopting the methodological
consensus and  numerical
standards  that Proposition 65
has already generated would
be a breathtaking leap of
progress for EPA.  But
understanding why the new
California  law works the way
it does, and incorporating its
lessons into federal toxics
laws, could have much
greater impact.  Proposition
65 has shown that gridlock
over standard-setting is not
inevitable, either
scientifically or politically.
To meet the public's needs,
the top priority  on the
national level should be to
catch up.

(Roe is Senior Attorney with the Environmental Defense Fund.)
Ronald F. Black

   Chemical companies are
   currently perceived by
the public as
"problem-creators" more
often than problem-solvers.
Some of the problems
attributed to our
industry—like the accident in
Bhopal, India—are real; some
are imagined. The mistrust,
however, is very real.
  This mistrust of industry is
complemented by a general
mistrust of government,
particularly regulators, at the
federal, state, and local
levels. Public frustration over
the perceived lack of progress
in solving environmental
problems has spawned
"regulation by information"
as the new environmental
battle cry.
  The first of these new laws
are on the books, and
volumes of information flow
to EPA, state environmental
agencies, and local
emergency groups. The
submissions provide page
after page of data about
chemical releases into the
environment.
  The  news media already
have begun dutifully
reporting "the numbers." The
Subcommittee on Health and
Environment of the U.S.
House of Representatives has
fired its first salvo
demanding emissions
reductions. Government
officials have expressed
shock at emissions levels.
The chemical industry has
attempted to place the
numbers in perspective. How
the public will react remains
to be seen.
  What kind of information,
beyond raw numbers, does
the public need to be able to
draw its own conclusions
regarding chemical releases?
The question goes to the
heart of regulation by
information. Several kinds of
supplemental  information
come to mind:
• Health-Based Criteria:
Health-based criteria are the
foundation upon which risk
evaluations can be made.
Merely knowing the amount
of material released by a chemical plant does little to
enlighten either risk
managers or the public. An
annual discharge of 1,000
pounds of a carcinogen, for
example, is meaningless
without some reference to the
increased chances of
contracting cancer.
  Unfortunately, little
information concerning the
relationship between the
volume of chemical releases
and its potential impact on
human health has been made
available to the public. Without this information, people are likely to react to the sheer magnitude of the numbers.

•  Exposure Assessments:
Another factor that would
assist the public's ability to
make judgments is actual
exposure to chemicals
released into the
environment. Any  release of
chemicals from a facility into
the air,  water, or land poses a
potential risk to people and
the environment. However,
the amount and nature of
substances being released are
meaningful only when
translated into public
exposure levels: the amount
of these materials that can
reach people.
   The importance of the
exposure factor holds true for
all environmental media:
ambient air, drinking water,
ground  water, etc.  Most
exposure projections come
from computer  models that
predict  potential exposure to
the public. These predictions
use the traditional  strategy of
"overestimating" exposure
and are designed to provide
margins of safety. The
models are, by  their very
nature,  complex, difficult to
understand, and even more
difficult to communicate.
However, if people are to
judge their exposure to
chemicals, this information is vital.

• Incident Information: In addition to statistics on routine chemical emissions, the public needs information when something goes wrong—when an accident happens and an instantaneous release occurs. When accidents happen, and they will, there is little time for industry and regulators to compute human exposure; immediate answers are demanded.
  Information on accidental
releases should be shared
with the public, regardless of
whether there is an impact on
human health. This includes
the "raw" numbers and some
explanation of their impact
on the community. To
prepare for these events,
chemical facilities should
routinely perform accident
simulations in order to
predict  possible impacts. The
information gleaned from
such simulations should be
shared with the public.
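  Such simulations rest, at bottom, on dispersion estimates of the kind a screening model produces. The sketch below, in Python, is a deliberately simplified ground-level Gaussian plume calculation; the release rate, wind speed, and dispersion coefficients are invented for illustration, not values from any facility's analysis.

import math

def ground_level_concentration(q, u, sigma_y, sigma_z, y):
    # Gaussian plume estimate (g/m^3) for a ground-level release and a
    # ground-level receptor: q is the release rate (g/s), u the wind
    # speed (m/s), sigma_y and sigma_z the horizontal and vertical
    # dispersion coefficients (m) at the downwind distance of interest,
    # and y the crosswind offset (m) from the plume centerline.
    return (q / (math.pi * u * sigma_y * sigma_z)) * math.exp(-0.5 * (y / sigma_y) ** 2)

# Hypothetical screening case: 100 g/s release, 3 m/s wind, coefficients
# roughly appropriate for a kilometer downwind in neutral conditions.
c = ground_level_concentration(q=100.0, u=3.0, sigma_y=70.0, sigma_z=30.0, y=0.0)
print(round(c * 1e6), "micrograms per cubic meter at the plume centerline")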
• Emissions Reduction: The
chemical industry must also
continue to communicate,
not only concerning its
criteria  for determining what
it deems to be acceptable
levels of exposure, but  its
plans for reducing releases.
Several companies have
announced
emissions-reduction goals,
some after soliciting
community input.
Information about what the
industry is doing to reduce
releases helps  provide a
realistic context for
evaluating potential
exposure. It can also
engender public support for
the efforts being made.
• Risk Management Systems:
Finally, the public needs to
see how industry has
structured its risk
management systems. These
systems are not new, but they
haven't been talked about.
There is one possible
explanation for the industry's
traditional lack of
communication on these
systems: in order to explain
how a facility manages risk,
the existence of risk must
first be acknowledged. This
remains a very uncomfortable
concept for many facility
managers. However, the risks
do exist, they are being
managed, and we must learn
to talk about them.
  Maintaining the confidence
of our constituencies is a
major challenge for the
chemical industry.
Stewardship implies a
responsibility that transcends
legal requirements: it means
earning trust and cooperation
from others;  it means
listening to concerns and
sharing information.
  In industry's defense, let
me say that major strides
have been made in recent
years to break down
communication barriers. A
good example is the
Chemical Manufacturers
Association's Community
Awareness and Emergency
Response Program, which has
improved communication
concerning many chemical
facilities in this country. The
Association's new
Responsible  Care Program
will build on this initiative,
heralding a new era of
openness.
  If we can demonstrate that
we can operate facilities
safely, and if we continue to
look outward to the
community and seek
opportunities to
communicate effectively, we
can begin to gain social
legitimacy. By meeting the
communication challenges of
"regulation by information,"
we may find a better way to
live and work together.

(Black is Corporate
Environmental  Manager for
the Rohm and Haas
Company, a U.S. chemical
firm.)

 Right-to-Know:
 What  It  Can  Mean  for  Citizens
by Susan G. Hadden
                              Photo provided courtesy of Washington Convention and Visitors Association
                                                                        In Thomas Jefferson's vision of
                                                                        democracy, an informed citizenry
                                                                      participates actively in the political
                                                                      process. Since Jefferson's time, however,
                                                                      society has become considerably more
                                                                      complex, and it has become increasingly
                                                                      difficult for citizens to keep informed
                                                                      about public policy-related issues,
                                                                      especially those that require an
                                                                      understanding of scientific or technical
                                                                      information.
                                                                        In  the twentieth century, government
                                                                      has gradually assumed much greater
                                                                      responsibility for making information
                                                                      available to citizens, particularly
                                                                      information concerning risks to health
                                                                      and the environment. In fact,
                                                                      information has become  an important
                                                                      means of regulating risks in the
                                                                      workplace and from consumer products.
                                                                      For example, many consumer products
                                                                      carry requisite labeling that provides
                                                                      information about safe storage and use,
                                                                      and many food products must display
                                                                      ingredient labels.
                                                                        In 1986, Congress enacted the
                                                                      Emergency Planning and Community
Right-to-Know Act—also known as Title
                                                                      III of the Superfund Amendments and
                                                                      Reauthorization Act (SARA)—which
                                                                      extends a late twentieth-century
                                                                      Jeffersonian approach to the risks posed
                                                                      by hazardous chemicals in our
                                                                      communities. Title III, as the new
                                                                      statute is often called, contains a
                                                                      number of new provisions related to
                                                                      emergency  planning, emergency
                                                                      notification, and a Toxic Chemical
                                                                      Release Inventory as well as Community
                                                                      Right-to-Know reporting on chemicals.
Thomas Jefferson believed that democracy rests on an informed citizenry. Today it is difficult for citizens to stay informed on complex issues such as environmental quality.
Responding to community concern, Monsanto Company's Chief Executive Officer Richard J. Mahoney has pledged to reduce the company's air emissions 90 percent by 1992. At right, the company's W. G. Krummrich plant at Sauget, Illinois, across from downtown St. Louis.
  The chemical reporting requirements
of Title III create a vast new resource of
data about the presence of potentially
hazardous chemicals in communities,
and this new resource opens up a vast
opportunity for citizens to assume a
stronger role in environmental affairs.
However, the new law also raises
important questions about the respective
roles  of government, citizens, and the
private  sector in monitoring,
disseminating, and  interpreting data.
For example, who should be responsible
for analyzing Title III data? Who should
ensure that the analysis is balanced?
  Title  III is a complex statute that
differentiates among three categories of
hazardous chemicals, mandates three
different kinds  of reports and several
reporting formats, and embodies
multiple goals.  Since it was passed in
part as a response to the accidental
release  of a highly toxic chemical in
Bhopal, India, in 1984, one important
purpose of Title III  is contingency
planning by state and local governments
as well  as commercial facilities for
emergencies involving hazardous
chemicals. Facilities subject to Title III
requirements include such  diverse
establishments  as warehouses,
drycleaning and manufacturing
establishments, and hardware stores.
  Title  III requires facilities to report to
new Local Emergency Planning
Committees concerning the storage on
their  premises of "extremely hazardous"
chemicals in amounts over a
"threshold"  quantity. These local
committees are charged with developing
emergency response plans,  using this
storage  information as well as other
information  they may request from
facilities. Emergency reporting, a second
purpose of Title III, is achieved by
requiring facilities to report whenever
they accidentally  release these
hazardous chemicals.
   Facilities must also submit annual
 hazardous chemical  inventories,
 covering a much greater number of
 chemicals at higher thresholds, both to
 assist local planning committees and to
 enable citizens to learn about the
 chemicals present in their communities.
 Finally, manufacturing facilities must
 annually submit an emissions inventory
 detailing emissions into air, water, and
 ground of certain toxic chemicals.  EPA
 is charged with compiling these
 emissions  data in its Toxic Chemical
Release Inventory and making this
information available to the public in
electronic as well as other forms.
  Since it has previously been difficult
to learn the identities of chemicals
stored and emitted by local facilities,
Title III has given citizens access to
important new data. The first chemical
inventories were submitted in March
1988, and the first emissions inventories
in July 1988. The sheer complexity of
the law's reporting requirements may
adversely affect the consistency, quality,
or utility of the data, at least in the first
few years of the program. Nevertheless,
citizens have already begun to use the
information in a variety of ways.
  It is not unusual for community and
environmental groups to band together
temporarily to combat a perceived
hazard in the community. In many
instances, such community action
committees  have worked with their
Local Emergency Planning Committees
to conduct assessments of the hazards
posed by particular facilities. Focusing
first on the facilities storing  the most (or
the most toxic) chemicals, citizens have
asked industry to create scenarios
illustrating what would happen if an
accident occurred.
  Such scenarios typically include, for
example, the plotting of plumes
showing how chemicals would disperse
in air and an analysis of whether and
how especially vulnerable persons like
children and the elderly would be
affected in an emergency. Appropriate
emergency plans are also developed.
Many citizens have taken tours of local
facilities and learned how chemicals are
being stored. They have begun working
with facilities to achieve reductions in
inventories of hazardous chemicals
stored in large quantities.
  Although the chemical inventories
constitute the largest part of the data
collected under Title III, the Toxic
Chemical Release Inventory  emissions
data have  drawn the greatest attention
from the press and the  public.
  Anticipating  public concerns about
the quantities of emissions that they
would be reporting on July 1, 1988,
several major companies previously
announced plans to reduce their
emissions over several years by amounts
ranging from 50 to 90 percent.
Conversely, in one neighborhood near
Houston, Texas, citizens are  working
directly with a local plant to develop an
emissions-reduction plan, using the
emissions report filed in July 1988 as
the basis for their negotiations.
  Other citizens are more interested in
using Toxic Chemical Release Inventory
data to develop a picture of conditions
area-wide or industry-wide.  The
Massachusetts Public Interest Research
Group, for example, compiled
state-wide emissions data based on the
following factors: location by city, potential adverse health effects, and
 disposal sites. The group then helped
 draft a bill, introduced  in the state
 legislature, designed to  accelerate
 adoption of pollution-prevention
 strategies by industry.
  In another example, a national public interest group, OMB Watch, used the
 emissions inventory data to  obtain an
 overview of routine emissions of heavy
 metals, which can cause a range of
 adverse health effects in humans.
  Citizens for a Better Environment examined the chemicals emitted in Richmond, California, identifying facilities responsible for the greatest amounts of emissions and the most hazardous chemicals both stored and emitted. This group noted that lower income and minority citizens are most at risk, because they live nearest to the facilities.
  Another California group,  the Silicon
 Valley Toxic Coalition,  is focusing on
 the semiconductor industry, examining
 industry-wide patterns in emissions. All
 reports of these groups call for citizens
 to work with industry to obtain
reductions in both use and routine
emissions of  toxic chemicals as well as
to develop strong accident-prevention
 programs.
  Citizen groups are also investigating
how Title III  data can be used in local
enforcement efforts. For example, many
of them have suggested correlating
emissions inventory data with the air
and water permits of each facility. In
many cases, however, the permits may
not mention the particular chemicals that are reported under Title III. An
alternative approach, now being
explored in Texas, is to compare the
emissions data with the chemical
inventories, in order to identify any
possible inconsistencies.  Apparent
inconsistencies would then lead citizens
to question facilities about substances
stored in large quantities but not filed
in the emissions data; even more
concern might be aroused by substances
reported to the emissions data base but
not included on  the chemical
inventories. Obtaining answers to such
questions  will require citizens to work
closely with industry and government.
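  In outline, the Texas cross-check is a comparison of two reported sets. The sketch below, in Python, uses invented chemical lists standing in for one facility's actual Title III filings.

# Hypothetical reports for one facility: chemicals on the annual
# storage inventory versus chemicals in the emissions report.
inventory = {"toluene", "ammonia", "chlorine"}
emissions = {"toluene", "epichlorohydrin"}

stored_but_not_in_emissions = inventory - emissions
emitted_but_not_on_inventory = emissions - inventory  # often the greater concern

print("On inventory, absent from emissions report:", sorted(stored_but_not_in_emissions))
print("In emissions report, absent from inventory:", sorted(emitted_but_not_on_inventory))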
  These kinds of activities, which  are
going on throughout the nation, suggest
the important opportunities that the
new data available under Title III
provide citizens to become  involved in
monitoring and reducing risks from
hazardous chemicals in their
environments. However, raw data and
statistics do not constitute useful
information. Such data must be
analyzed and placed in  context to
provide information that can be the
basis for citizen participation in
decision-making.  Title III and other
similar laws raise a number of questions
about who should have  responsibility
for turning data into information.
  First,  citizens who wish to take an
active role in reducing risks from
hazardous materials in their
communities need information beyond
that submitted under Title III. The
precise nature of this supplementary
information is beyond the scope of this
article, but  it is clear that at the very
least, citizens need detailed information
about the potential  health effects of
chemicals and information  about
appropriate storage, use, and disposal
techniques. Should government,
industry, or someone else be responsible
for providing this supplementary
information?
  For citizens who are in potential
danger,  it is unquestionably worthwhile
to obtain the necessary information to
reduce their risks; on the other hand, it
is a waste of resources for many
different citizens to duplicate the  same
searches for information. Another factor
to be considered is that citizens do not
trust all sources of information equally;
they generally prefer information that
comes from environmental groups or, in
the case of health effects information,
physicians.
  In many cases, however, industry is
likely to have a monopoly on the
necessary  information. Should
government step in to evaluate the
quality of  supplementary  information
and ensure its availability? If the answer
is "no," and if citizens are unable to
acquire or use the supplementary
information, they will not be able to
participate fully in the decision-making
process.
  Second, although Congress ensured
that Title III emissions inventory data
would be  computerized, it did  not
require that the inventories of stored
chemicals also required under the
statute be  computerized. The same
advantages derived from computerized
emissions data would also apply if the
other data were computerized.  These
advantages include expanded capability
for data analysis, more effective
community-wide emergency planning,
and better, speedier emergency
response.  In short, computers can help
turn data into information by sorting out
data based on the needs of the
particular user, analyzing the data
selected, and even providing needed
context.
  At present, computerization at the
state and local levels depends on the
availability of resources, and there is no
way to ensure that local data are
compatible with data compiled by
neighboring constituencies. In cases
where Local Emergency Planning
Committees cover small geographic
areas, such as those near  Boston Harbor
or in New Jersey, citizens and
emergency response planners are likely
to need information from neighboring
jurisdictions because they could so
easily be affected by events at plants in
adjacent areas. Should an effort be made
to link existing emergency response
networks to Title III data and to each
other to ensure that statewide or even
national data are available to everyone?
Who would pay for such an effort, and
how should such  a data base be
constructed to be  useful and
meaningful?
  Third, Title III provides data so that
citizens may participate more fully in
decisions  concerning hazardous
materials  in their  communities.
However,  our society does not presently
have many institutions that encourage
interactions between citizens and
private industry. Existing institutions
for citizen participation are usually
intended to foster direct access to
government rather than industry. The
Community Awareness and Emergency
Response  program (CAER) sponsored by
the Chemical Manufacturers
Association—part of which includes an
effort to remedy this institutional
deficiency—has, in practice, focused
more on reducing risk  and developing
emergency plans than on establishing
ongoing relationships between member
companies and citizens other than
elected officials.
   Local Emergency Planning
Committees, which by  law must include
representatives from all three
sectors—citizens, government, and
industry—could serve as forums in
which decisions are made about risks
from hazardous chemicals. At present,
most local committees are absorbed
with their primary statutory tasks of
emergency planning and emergency
response, but with some encouragement
and assistance, their responsibilities
could be expanded to include
negotiation about emissions reduction
and the substitution of less hazardous
for more hazardous chemicals. If
appropriate channels are not developed,
these decisions are likely to become
subject to an adversarial process that
will be costly and time-consuming for
all parties.
  Title III has provided citizens,
emergency managers, and regulators
with a rich  new source of data. So far
the reports concerning citizen initiatives
around the  country indicate that the
data are likely to be used to a greater
extent once this new resource has been
available longer and citizens have had
an opportunity to become familiar with
its strengths and weaknesses. Even these
early activities are evidence, however,
that the role citizens play in decisions
about the acceptability of risks from
hazardous chemicals in the  community
is changing: a change from ignorance
and impotence to knowledge and
power. Fully realized, this change will
have widespread effects on both our
environment and our polity.
(Dr. Hadden is Associate Professor at
the Lyndon B. Johnson School of Public
Affairs, University of Texas at Austin.
She is author of A Citizen's Right to
Know: Risk Communication and Public
Policy (Westview Press, 1989).)
Right-to-Know:  What  It
Means  for  EPA
 by Charles L. Elkins
   Epichlorohydrin is a caustic,
   flammable chemical used in the
production of epoxy resins, solvents,
plastics, and other products. Breathing
its vapors can irritate your eyes, nose,
and lungs. High-level or repeated
exposure can damage your liver and
kidneys and could cause a fatal buildup
of fluid in your lungs.
  What's more, breathing
epichlorohydrin has been shown to
cause nasal cancer in laboratory rats.
Based on these and other animal
studies, EPA has classified
epichlorohydrin as a "probable human
carcinogen."
  Sound like a good candidate for
regulation by EPA? Not necessarily.
With effects such as these, the key
question  is: how extensive is public
exposure to the substance? Until
recently, data available to EPA did not
indicate that significant numbers of
people were being exposed to
epichlorohydrin.
  Now, however, thanks  to information
in a new EPA data base called the Toxic
Chemical Release Inventory (TRI), EPA's
Office of Air and Radiation is taking
another look at  epichlorohydrin. The
reason: TRI data show that there are at
least three times as many manufacturing
plants releasing epichlorohydrin into
the air in the United States as the
Agency had previously estimated.
According to the data base, 70 facilities
in 24 states emitted a total of 363,300
pounds of epichlorohydrin into the air
in 1987. Before  the TRI data were
available, EPA had identified only 20
sources of epichlorohydrin emissions.
  Locating previously unknown sources
of toxic chemical releases is only one of
dozens of potential uses of the TRI that
are being identified by EPA's  various
programs. Other uses include:
• The Air Office has  used TRI data to
support the development of
administration proposals to amend the
 air toxics provisions of the Clean Air
 Act. In addition, the Air Office and the
 Office of Solid Waste will use the data
 to help set their regulatory agendas.

 •  The Office of Water plans to use the
 TRI to spot possible violations of
 water-pollution discharge permits; to
 target enforcement activities; to help in
 reviewing permit requests; and to set
 water quality standards.




 •  The Office of Toxic Substances is
 screening TRI data to locate candidates
 for regulatory investigation under its
 existing chemicals program and to
 verify production estimates for asbestos
 and other regulated chemicals.

 •  The Pollution Prevention Office
 expects to use the TRI in developing its
 strategy for assessing progress in
 pollution prevention; to determine
 research needs; and to identify
 industries or facilities that need
 technical assistance.

   The toxic  chemical release data,
 which must be submitted to EPA and
 the states every year by thousands of
 manufacturing facilities across the
 country, are providing EPA with an
 unprecedented national "snapshot" of
 toxic chemical emissions from some
 industries to all environmental
 media—air,  water, and land.
   The reporting is required by Section
 313 of the Emergency Planning and
 Community Right-to-Know Act of 1986
 (Title III of the Superfund Amendments
 and Reauthorization Act). The Act also
 requires  industries to participate in
contingency planning for chemical
emergencies and to notify their states
and communities of the presence and
accidental release of hazardous
chemicals.
  As envisioned by Congress, a primary
purpose of the Emergency Planning and
Community Right-to-Know program is
to inform communities and citizens of
toxic chemical hazards in their own
localities, so they can work together to
reduce risk. Used in this way, TRI and
other Title III data can be a potent force
for environmental change.
  A unique aspect of the TRI is that it
is made available to the public directly,
without analysis or interpretation by
EPA or any other intermediary (see
box). As discussed in another article
(see page 13), citizens already are using
the data to lobby for stronger federal
and state regulation of  toxic chemicals.
They also are using this new
information to pressure local industries
to implement pollution prevention
programs in order to cut back on
unregulated releases. Several
companies, after reviewing their own
TRI reports, have announced ambitious
plans to voluntarily reduce their toxic
chemical emissions within the next few
years.
  Because of its multi-media nature,
however, the TRI has potential value
that extends well beyond the boundaries
of individual facilities and local
communities. It can also be a valuable
source of information for environmental
regulators and  public health officials at
all levels of government.
  EPA and the states can, for example,
use the information to better understand
what toxic chemicals are released and
where, in order to get a more complete
picture of the total toxic loading in a
given geographic area. With this
information, regulatory agencies will be
able to set priorities, focus their
activities, identify gaps in regulatory
coverage, and integrate their programs
more effectively. EPA has prepared a
"risk screening" guidebook to help state
and  local officials use the TRI data for
these purposes. Other documents,
including toxicity fact sheets on the TRI
chemicals and "roadmaps" to other
sources of information, also are being
distributed.
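  Screening of this kind generally weights the size of a release by some measure of the chemical's toxicity to rank candidates for follow-up. The sketch below, in Python, is a simplification for illustration; the facilities, releases, and weights are invented, not figures from the guidebook.

# Invented data: (facility, chemical, pounds released per year).
releases = [
    ("Plant A", "epichlorohydrin", 20000.0),
    ("Plant B", "toluene", 90000.0),
    ("Plant C", "epichlorohydrin", 5000.0),
]

# Invented toxicity weights: higher means more hazardous per pound.
toxicity_weight = {"epichlorohydrin": 50.0, "toluene": 1.0}

# Score each release as pounds times weight, then rank for follow-up.
scored = sorted(releases, key=lambda r: r[2] * toxicity_weight[r[1]], reverse=True)
for facility, chemical, pounds in scored:
    print(facility, chemical, pounds * toxicity_weight[chemical])

Note that the smaller epichlorohydrin release outranks the much larger toluene release; that is the point of weighting pounds by toxicity.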
  The TRI also will make it easier  to
monitor pollution trends from year to
year, as well as shifts of pollutants
among air, land, and water (to
determine, for example, if restrictions
on land disposal cause greater releases
to air, or vice versa). Before the TRI, much of this information had never been collected, and what was collected
was  scattered in separate, mostly
incompatible, EPA program files.
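  Tracking such trends is essentially a matter of grouping release records by year and medium. A minimal sketch in Python, over invented records:

from collections import defaultdict

# Invented TRI-style records: (year, medium, pounds released).
releases = [
    (1987, "air", 1200.0), (1987, "land", 800.0),
    (1988, "air", 900.0),  (1988, "land", 1150.0),
]

totals = defaultdict(float)
for year, medium, pounds in releases:
    totals[(year, medium)] += pounds

# Did restrictions on land disposal shift releases to air, or vice versa?
for (year, medium) in sorted(totals):
    print(year, medium, totals[(year, medium)])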
  The first industry reports under
Section 313  were due to EPA and the
states last July 1, covering both
accidental and routine releases of more
than 300 reportable chemicals during
1987. Manufacturing facilities with 10
or more employees that used more than
10,000 pounds of one of the chemicals,
or manufactured or processed more than
75,000 pounds of a reportable chemical,
were required to report. The
75,000-pound threshold drops to 50,000
pounds for 1988 releases and to 25,000
pounds for 1989 and thereafter.
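  The reporting test just described reduces to a few comparisons. The sketch below, in Python, captures only the thresholds stated here; the actual regulation carries definitions and qualifications this ignores.

def must_report(year, employees, pounds_used, pounds_made_or_processed):
    # Manufacturing facilities with 10 or more employees report if they
    # used more than 10,000 pounds of a listed chemical, or manufactured
    # or processed more than a year-dependent threshold
    # (reporting years 1987 and later).
    if employees < 10:
        return False
    threshold = {1987: 75000, 1988: 50000}.get(year, 25000)  # 25,000 from 1989 on
    return pounds_used > 10000 or pounds_made_or_processed > threshold

print(must_report(1987, 40, 5000, 60000))  # False: under both 1987 thresholds
print(must_report(1989, 40, 5000, 60000))  # True: over the 25,000-pound threshold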
  EPA received about 75,000 reports
from some 18,000 facilities for
1987 — one for each chemical reported
by each facility. The reports showed
that  at  least  2.7 billion pounds of toxic
chemicals were emitted into the  air in
1987, 9.7 billion pounds were released
to streams and other bodies of water, 2.4
billion pounds were placed in landfills
or otherwise disposed of on land, and
3.2 billion pounds were injected
underground. In addition, an estimated
1.9 billion pounds of toxic chemicals
were sent to municipal wastewater
treatment plants for processing and
disposal, and 2.6 billion pounds were
transported off site to other treatment
and  disposal facilities.
  While these numbers are large and clearly indicate the need for additional efforts by both government and industry to reduce toxic chemical emissions, they do not suggest an immediate public health crisis. In fact, the overall risk to the public health from these releases is probably low. It is likely that only a few facilities are exposing the public to toxic chemicals at a rate that could warrant immediate action; others, however, may be creating risks due to long-term, low-level exposures, and those must be dealt with as well.
  From a regulatory standpoint, the value of the TRI data lies primarily in their ability to pinpoint specific facilities, industries, or geographic areas of particular concern for further investigation and follow-up action. For example, the information can call attention to a facility or category of facilities that may be releasing excessive amounts of toxic chemicals.
  It is important to note that the data
are only annual  estimates, not
measurements of actual releases. No
additional monitoring by reporting
companies is required by the law. Nor
does the TRI show the relative toxicity
of the chemicals or the  rate at which
they are released, although EPA will
consider adding reporting  on "peak
releases" to the inventory in the future.
(Some high-volume chemicals, such as
sodium sulfate, are relatively harmless
and are scheduled for removal from the
TRI list of reportable chemicals.) If an
inventory user is concerned, for
instance, about acute exposures rather
than bioaccumulation, he or she would
want to know whether most of the
emissions were discharged in a few
days or over the course of the entire
reporting year.
  Despite these limitations—which will
diminish over time as companies
                                             How To Obtain
                                             Emissions Data
     EPA's Toxic Chemical Release
     Inventory (TRI) data base is being
     made available directly to the
     public through computer
     telecommunications and other
     means. Here are some of the ways
     members of the public can obtain
     information from the TRI:

     •  If you have a home computer
     and a telephone modem,  you can
     "dial up"  the data base, which is
     housed at the National Library of
     Medicine (NLM) in Bethesda,
     Maryland, and review the data on
     your monitor or "download" it
     onto a computer disc or printer. A
     nominal access fee  will be
     charged. For  information on
obtaining an account with NLM, call 301-496-6531.

     •  If you have access to a
     microcomputer, you can obtain
     TRI data for each state on
     diskettes.

     •  You can review microfiche
     copies of data on TRI releases in
     your state either at the
     Government Printing Office (GPO)
     federal depository library in your
     Congressional district or at a
     designated public library in your
     county. The complete national
     data base will be available at EPA
     libraries and at regional and state
     depository libraries. Call the
     toll-free Emergency Planning and
     Community Right-to-Know Hotline
  at 800-535-0202 (in Washington,
  DC, 202-479-2449) for the address
  of the library with TRI data nearest
you.
  • The national data base will also
be available in compact disc (CD-ROM) format at 400 federal
  depository libraries, 200 other
  research and academic libraries,
  and all EPA libraries. The Hotline
  can help you locate these libraries
  as well.
  • EPA is publishing a National
  Report with detailed summaries
  and analyses of the TRI data. The
  report will be available at the
  federal depository libraries or can
  be purchased from  GPO or the
  National Technical Information
  Service (NTIS). In addition, all
  versions of the TRI data
  base—magnetic tape, CD-ROM,
  microfiche, and diskettes—can be
  purchased from GPO
  (202-275-2091) or NTIS
  (703-487-4650). Call the Hotline
  for ordering information.

  • You can obtain copies of TRI
  reports for individual facilities
  from your State TRI Coordinator.
  The Title III Hotline can tell you
  who to contact.
  • Finally, the TRI Reporting
  Center in Washington, DC, will
  make data from individual
  facilities available in  its reading
  room, will mail out limited
  numbers of TRI reports, and will
  conduct limited searches of the
  data base and  provide printouts on
  request. The Reporting Center's
  address is: Title III Reporting
  Center, P.O. Box  70266,
  Washington, DC 20024-0266 (Attn:
  Public Inquiry).
improve their estimation techniques and
as EPA takes steps to improve the
accuracy and usefulness of the data
base—Title III  is nothing less than a
revolutionary approach to
environmental protection.  It
fundamentally challenges the notion
that decisions  about the control of toxic
chemicals should be left to the
"experts" in government, industry, and
academia.
  Since Title III data are made available to the public and EPA at the same time, the program creates a new opportunity for a working partnership between the public and the Agency. By sharing information, the public and EPA can also share in finding solutions: for example, by developing new programs to identify options for reducing the production and use of toxic chemicals. The development of a creative new partnership involving EPA, state and local governments, and the public may, in fact, be the most important benefit of all to flow from the Emergency Planning and Community Right-to-Know program.
(Elkins is Director of EPA's Office of Toxic Substances.)
On  the  Scene with  CAMEO
 by Jean Snider and Tony Jover
    Not long ago, fire department
    personnel responding to fires or
other incidents involving hazardous
materials were severely hampered by a
lack of information. More often than
not, they didn't know either the nature
of the chemicals involved or the
problems they would face on arrival at
the scene.
  Today, such lack of information need
no  longer be a problem. Computerized
emergency  and chemical information
data systems are available to provide the
vital information, even before the
response team gets to the fire or
hazardous material spill.
  CAMEO—which stands for
Computer-Aided Management of
Emergency Operations—is one of these
systems. A  computer program
developed by the National Oceanic and
Atmospheric Administration (NOAA)
and EPA to help firefighters and
emergency  managers respond effectively
to HAZMAT incidents, CAMEO is
already being used by about 3,000 fire
departments and emergency
management agencies. The Macintosh
version of CAMEO contains response
recommendations for over 2,600
chemicals,  an air dispersion model, and
the capability to access local maps and
information stored  in the
community—information required to be
provided to local government and
response personnel under the Title III
Emergency Planning and Community
Right-to-Know amendments to
Superfund.
  How  does this emergency information
system  work? Just suppose you are
fictional fire lieutenant Joe Sackler
when the firehouse bells sound at 2:35
A.M. . . . Sackler jumps from his
bunk and wipes the sleep from his eyes.
It was only a short catnap, but it sure
has helped; in the last 10  hours,  he and
his crew have been through several fire
runs and one hazardous material
incident. Now the bells and the public
address system  are signifying another
HAZMAT problem: a strange sulfur-like
smell being reported by people living
near the Freeland Chemical Company.
  As the fire officer pulls his boots on,
he thinks about the way HAZMAT runs
used to be—and how they have changed
over the past two years. Formerly,
firemen responding to a hazardous
materials incident or a fire involving
chemicals had no idea what they might
encounter when they arrived on site.
This was especially true when the
incident was at one of the smaller,
marginally profitable companies. On
"pre-fire" visits to such facilities to
determine what chemicals might be
stored on the site and the location of
fire hydrants, fire department inspectors
were often rebuffed by owners who
said, in effect, "Trust us, we are safe
operators and will take care of any
spills on our property. The people who
live around here won't be  affected."


  Although most plant operators are
responsible and cooperative, one bad
incident involving a "fly-by-night"
operator was enough to convince the
lieutenant that his department must
have all available  information in its
possession and readily accessible when
the alarm sounds. In the past, it was
simply too uncertain and
nerve-wracking to depend on others to
provide it after the firefighters reached
the scene—assuming, of course, that
someone was there on site with the
necessary information.
  But now things  are different.
Lieutenant Sackler has his Mac
(nickname for the Macintosh computer)!
He jumps into the back  of the HAZMAT
van as the driver pulls out of the
firehouse. While the driver switches on
the siren and flashing lights, Sackler
turns  on his computer and calls up his
CAMEO system. The sooner he knows
what  problems they  face, the better off
they'll be.
  As the van races down the street and
the sirens wail outside,  Sackler hears
the familiar sound of the computer
warming up and sees the smiling face
on the Macintosh before CAMEO's
opening screen comes up. This is the
"Navigator," which allows him to select
the data base he needs by a simple click
of the mouse,  pointing to the picture
representing the data base he wants.
First, he reads what chemicals Freeland
Chemical has stored on its premises.
Next, he learns the name of the
company contact person and how to
reach him if he is not already at the site,
in order to verify the chemical
identification.
  Fortunately, his Captain  previously
insisted on stepping up efforts to survey
the chemical plants in the community,
especially since new federal laws
provide additional leverage to collect
critical information from chemical
facilities on what hazardous chemicals
were stored in the community, and to
plan for possible accidents. As a result,
the information  is in his CAMEO
program, organized in a  logical retrieval
form, including recommendations for
response actions. The new
law—popularly known as SARA Title
III—and the computer program have
certainly reduced much  of the
uncertainty associated with past
HAZMAT runs.
  From the CAMEO screen, Sackler
learns that Freeland has  a number of
nasty substances that could produce a
sulfur smell. He checks out methyl
disulfide and sulfur tetrafluoride to see
which would be the more likely culprit
and what types of problems these
particular chemicals might cause
firemen trying to control the situation.
  The van sways as the driver races
over potholes and  around corners. The
lieutenant wishes his boots were bolted
down, like the computer. CAMEO has
more to tell him: only sulfur
tetrafluoride is a gas and likely to give
off a sulfur smell. And, says CAMEO, to
control a spill the  firemen are going to
have to suit up in  full gear with
protective breathing apparatus AND
NOT USE  WATER!
  Next question: where is the stuff
stored (and what would  be a good
staging area)? Click, and the screen
shows the facility site plan. More
questions: Who would be affected by
the fumes? The worst-case scenario run
several months earlier had shown
several schools in  the area, although
they would not be in session at this
hour, and hospitals are out of range of
the airborne plume, given the amount of
the chemical stored by Freeland. But a
  rest home is close by. What kind of
  ventilation does it have? Can it be shut
  off for a few hours? Click: the answer.
    Now, as the van nears the scene,
  Sackler and his crew are ready for what
  they have to do. What a difference from
  the old days, when they spent precious
  time on arrival to get the same
  information they are ready with as the
  van rocks to a stop ... thanks  to CAMEO!
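  The lookups in this scenario amount to filtering a facility's chemical records by observed properties and reading off stored response recommendations. The sketch below, in Python, uses invented records as stand-ins for entries in a CAMEO-style data base.

# Invented stand-ins for chemical records; not CAMEO's actual data.
chemicals = [
    {"name": "methyl disulfide", "gas_at_ambient": False, "sulfur_odor": True,
     "response": "Full protective gear; contain liquid spill."},
    {"name": "sulfur tetrafluoride", "gas_at_ambient": True, "sulfur_odor": True,
     "response": "Full gear with breathing apparatus; DO NOT USE WATER."},
]

def likely_culprits(records):
    # A sulfur-like smell in the neighborhood air suggests a gas
    # with a sulfur odor, not a liquid held on the premises.
    return [r for r in records if r["sulfur_odor"] and r["gas_at_ambient"]]

for match in likely_culprits(chemicals):
    print(match["name"], "->", match["response"])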
                          CAMEO's Macintosh version (15
                        diskettes and a manual) is available
                        from the National Safety Council
                        (312-527-4800) for $115. An MS-DOS
                        compatible version on 9-track tape
                        including only the chemical data base
                        and vulnerability calculation (and
lacking graphics capability and Title III information) is available from NOAA (206-526-6317) at no cost.
(Snider is with the Hazardous Materials Response Branch of the National Oceanic and Atmospheric Administration. Jover is Director of the Information Management and Program Support Staff in EPA's Chemical Emergency Preparedness and Prevention Office.)
Screen display: FREELAND CHEMICAL: Sulfur Tetrafluoride: Plume radius=2.1 miles: Map Scale 1:70000
What the  Public  Thinks
about   Environmental  Data
by David B. McCallum and
Vincent T. Covello
   Through state and federal community
   right-to-know laws, vast amounts of
data about toxic substances are newly
available to the public. In research
jointly conducted by Georgetown
University's Institute for Health Policy
Analysis and Columbia University's
School of Public Health, public
knowledge, attitudes, and behavior were
examined at the community level with
respect to this new resource. Our survey
results indicate that the capacity of
citizens and many local officials to
understand and effectively discuss these
data is generally not keeping pace.
  Recognizing their own lack of
technical knowledge, many citizens
depend on others to interpret
environmental data and provide them
with information in terms they can
relate  to. Who does the public trust as
sources of information about chemical
risks?
  In general, doctors and environmental
groups are the most trusted sources of
information about chemical risks. Our
respondents identified news reporters
and Title III local emergency planning
committees as the next most
trustworthy. Government officials are
only moderately trusted, and industry
officials least trusted.
  Many people fear that government
and industry may withhold information,
and they are skeptical that the
information they routinely receive
reveals the full magnitude of
environmental problems. For example,
85 percent of our survey respondents
agreed with the following statement:
"The only time you hear about a
chemical release is when it  is so big it
can't be covered up."
  The sources the public most trusts are
not necessarily those it considers most
knowledgeable. In fact, industry
officials, the least trusted of information
sources, are widely considered to be the
most knowledgeable about chemical risk
issues. Next most knowledgeable in
public opinion are environmental
groups, followed by the government.
Respondents felt that physicians, cited
by many as the most trustworthy, were
less knowledgeable than other sources;
however, physicians are among the least
frequently used sources of information
about chemical risks.
  In practice, the public depends most
of all on the mass media for information
about chemical risks. Our respondents
cited local television and newspapers as
their primary sources. Ironically, the
source people rely on most often is
neither the most knowledgeable nor the
most trustworthy in their opinion.
  According to our survey results,  most
citizens  do not follow environmental
issues on a daily basis. We found that
only about 25 percent had read or heard
something about toxic substances in the
past week. The public attention tends to
be captured primarily by sensational
events like the chemical release at
Bhopal,  India, or the intense media
coverage of Alar in apples, rather than
routine reports concerning
environmental  topics.
  Most citizens express only moderate
interest in gathering information on
environmental  issues. In this context,
many people cited unrewarding prior
attempts to obtain  information. In many
cases, contacts  with local sources have
been unsatisfactory because only a  few
local government officials, teachers,
librarians, health professionals, and
industry personnel are prepared to
respond  to public inquiries. A majority
indicated that their personal motivation
to pursue environmental data under the
new right-to-know provisions would
depend on whether they or their
families were directly affected by an
environmental hazard.
  On the general subject of risk
communication, a majority of our
respondents did not understand the
concepts of exposure, dose-response,  or
fundamental concepts of probability.
For instance, the majority equate release
of a toxic substance in the community
with exposure for citizens and assume
the exposure has an adverse effect. This
may in part explain the incredulous
reaction to the concept of "permitted
releases" (releases allowed by air quality
standards): "You mean that they are
releasing toxic substances and the
government knows it and is not doing
anything?" Participants clearly
recognized, on the one hand, that zero
risk is not achievable; on  the other
hand,  they thought it should be the
goal.
  Citizens did express concern about
the long-term and interactive effects of
pollutants. In many cases, adequate data
are not available to answer their
questions, and this erodes their overall
confidence in risk assessment
information.
  Clearly there is room for improvement
in the process of communicating risks to
the public. What can be done? In the
short term, it seems imperative for
environmental managers and
policymakers to understand the current
flow of information and to use existing
channels to deliver clear messages that
speak  to the needs of various publics
who have different kinds of skills,
interests, and needs. This will require
everyone concerned with  risk
communication to be aware of the needs
and concerns of various audiences.
  Perhaps alliances among various
information sources could serve as a
way of pooling knowledge and
credibility in order to better serve the
public's need for information. Moreover,
new channels for information
dissemination in the community, in
addition to existing information sources,
should be developed and supported.
  To date, public hearings have been
emphasized over mass media strategies
in environmental risk communication
programs. If carefully planned and run
to avoid confrontation, such hearings
can establish contact with the public at
an early stage in the  communication
process before or during risk
assessment. Public hearings provide
forums for expression of the public's
concerns, apprehensions,
misconceptions, and  information needs.
They give citizens the satisfaction of
"being heard" and can build trust in
effective risk communicators and focus
public attention on the issues.  However,
public hearings and town meetings
reach only those citizens who are most
actively concerned about an
environmental problem. Only 25
percent of those we surveyed had
attended a town meeting.
  The  long-term challenge will be to
create more  active interest and better
skills among a greater percentage of the
general public to foster better
understanding of environmental data.
This will require institutionalized
educational  efforts. As a beginning,
environmental issues could be
addressed in public schools, preferably
allied with science and math curricula.
Also, substantial benefits might be
gained from  greater emphasis on
chemical risk issues in the training of
physicians and increased involvement
of physicians and medical organizations
in communications about chemical
risks.
(McCallum is Associate Professor in the
School of Medicine, Georgetown
University, and Director of
Georgetown's Program on Risk
Communication.)

Keeping Tabs on Air Quality
by Roy Popkin
    Air quality nationwide continues to
    show considerable progress over the
years, offset by concerns that many
areas still do not meet applicable air
quality standards. Despite the overall
progress, almost 102 million people in
the United  States reside in counties
which exceeded at least one air quality
standard in 1987. Ground-level ozone,
in particular, continues to be a
problematic air pollutant in many urban
areas.
  These conclusions are presented in
EPA's recently issued National Air
Quality and Emissions Trends Report
covering the 10-year period January
1978 through December 1987. But how
does EPA know what air-pollution
trends are from day to day and year to
year across  the country? The answer is
the national air quality monitoring
system that keeps track of the nation's
progress in  cleaning up the air we
breathe.
  The national monitoring system
focuses on the six criteria air pollutants
for which National Ambient Air Quality
Standards have been established by EPA
under the Clean Air Act. These are:
airborne particulate matter, sulfur
dioxide, carbon monoxide, nitrogen
dioxide, ozone, and lead.
  The daily, sometimes hourly, data
that are collected, compiled,
transmitted, and analyzed through this
system have very specific practical
applications:

• The data  provide feedback to state
and local governments regarding their
success, under individual State
Implementation Plans, in meeting
national air quality standards.
• The monitoring data alert EPA as well
as state and local officials when
individual metropolitan areas are in
violation of air quality standards.
• The data  also serve as a warning
system for local environmental agencies
concerning  pollution problems that may
require emergency response measures in
order to protect the public health.
Accidental releases, air inversions, or
brush fires,  among other things, may
trigger emergency air pollution
conditions.

  The national air quality monitoring
system is a  network involving millions
of data points gathered by hundreds of
"environmental meter readers" from
thousands of instruments in over 5,000
locations across the country. The
data-gathering process  typically begins
when a technician pulls a filter or reads
a tape from  a measuring device at  one of
the many sites maintained by state or
local governments. Some pollutants are
measured hourly using continuous
monitors, while others are measured as
daily averages. The data-gathering
process continues as these technicians
gather  up and replace filters, collect
taped data, maintain the equipment and
check calibrations, and do whatever else
is necessary to keep the system
functional and accurate, 365 days a
year.
  The  monitoring sites must meet
uniform criteria for siting, instrument
selection, quality assurance, analytic
methodology, and sampling intervals;
the sites must also satisfy annual
"completeness criteria" appropriate to
pollutant and measurement
methodology. This assures data of
consistent quality across the United
States.
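  To make the idea of "completeness criteria" concrete, here is a minimal sketch of how such an annual screen might be computed. The 75-percent threshold and the data layout are illustrative assumptions for this sketch, not EPA's actual criteria.

    # Illustrative annual completeness screen for one monitoring site.
    # The 75-percent threshold is an assumed value, not EPA's.
    def meets_completeness(observations, scheduled, threshold=0.75):
        """True if enough scheduled samples produced valid data."""
        valid = sum(1 for obs in observations if obs is not None)
        return valid / scheduled >= threshold

    # A site scheduled for 365 daily samples, 310 of them valid:
    readings = [12.0] * 310 + [None] * 55
    print(meets_completeness(readings, scheduled=365))  # True (310/365 is about 0.85)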
  State and  local monitoring stations
and special-purpose monitors must meet
the same strict criteria. Data from only
those locations with sufficient historical
data are included in the annual trends
analysis to ensure that trends are in fact
due to changes in air quality, and not
simply the result of using data from
different sites.
  Monitoring site instrumentation  must
meet EPA specifications and standards.
The Agency works closely with
universities  and manufacturers to
improve the technology used to gather
air  trends report data.
  Recently, for example, it was
necessary to adopt a new indicator for
airborne particulate matter in
conjunction with revisions to the
national air quality standards for
particulate matter. The original
standards, established in 1971, treated
all  particles  as the same, regardless of
their size or chemical composition.
Since then, however, new studies have
shown that the smaller particles
penetrate more deeply into  the human
respiratory tract, thus posing a
particularly  significant health risk.  In
1987, in light of this new information,
EPA revised the National Ambient Air
Quality Standards and replaced the
original instrument (called a
"high-volume" sampler) for measuring
overall particulate matter levels with a
new indicator that measures particles
that are 10 micrometers or smaller.
  In general, a practical problem that
confronts air pollution agencies is
where to place the monitors to get an
accurate idea of what the air pollution
levels are. They need to know what
levels people are breathing, what levels
are coming into the area from other
locations, and what normal ambient air
levels are. At the same time, they must
have ready access to the monitor sites.
  These logistical considerations
sometimes  lead to the creative
placement of monitoring sites. Many air
sampling instruments are located in or
on top of schools and suitably located
government buildings. Others are placed
in more exotic locations, such as the top
of the World Trade Center in New York
and the  ninetieth floor of the Sears
Tower in Chicago. A specially rented
apartment on Waikiki Beach in Hawaii
houses instruments measuring
pollutants originating from the busy
beach road. Others are atop water intake
"islands" in Lake Michigan,  several
miles from  the Chicago lakefront; in
heavily  trafficked midtown city areas
and equally busy suburbs; or in areas
where power stations or heavy industry
emit a variety of pollutants. The
monitoring station closest to the Office
of Air Quality Planning and  Standards
in Durham, North Carolina, is on top of
the county jail in downtown Durham.
  It is important to realize that the
tremendous volume of data that feed
into EPA's air trends reporting system
come from state and local agencies.
EPA itself does very little of this
monitoring.
  Maryland, for example, has one of the
nation's most sophisticated
air-monitoring systems. It extends from
Cumberland, an old industrial city in
the state's western mountain region, to
southern Maryland's Chesapeake Bay
coastal region. Some of Maryland's sites
are checked twice daily year round;
others less frequently. Some report their
information automatically via telemetry.
In addition to monitoring for the six
criteria pollutants, the state measures
various toxic air pollutants and acid
rain. Moreover, its laboratory also
analyzes air toxics samples from  a
number of  northeastern states.
  The state agency's three laboratories,
where Maryland's daily pollutant levels
and statistics for the air trends report
are compiled and analyzed, are in a
section of a long building that was once
a wire factory, near the Dundalk  Marine
Terminal and across the harbor from a
steel plant  at Sparrows Point. One
laboratory unit analyzes monitor
samples, dust, soil particles,  lead,
sulfates, and pyrene derivatives found
in filters and other manual recording
devices.
  Another deals with "dry surveillance"
(levels recorded on  tapes or printouts by
recording devices) of carbon  monoxide,
sulfur dioxide, nitrogen oxide, and
ozone. The third lab analyzes
non-methane organic compounds and
other  toxics. This unit also has its own
workshop,  built by the technicians,
where instruments are regularly
checked, and recalibrated if necessary;
motors are  rebuilt every six months. The
workshop is dubbed the "music room"
because of the shrill sound of new
motor bushings being tested.
The Maryland "meter readers" also
monitor the effectiveness of the state's
own air quality improvement program,
and they watch for anomalies that could
necessitate immediate regulatory action.
For example, smoke from a recent series
of brush fires raised particulate matter
levels almost to "the point where we
might have had to shut down
Baltimore's industry," according to Dick
Wies, head of the monitoring analysis
group.
The Illinois monitoring network
likewise satisfies both state and federal
reporting and regulatory requirements,
and state and EPA Region 5 personnel
have collaborated in the design of
instruments that meet both state and
federal criteria for valid data collection.
Illinois owns 250 monitoring devices,
including 39 intended to measure ozone
levels. One monitor is "hidden" in the
Chicago Museum of Science and
Industry. Five of the state's 18 carbon
monoxide stations are installed around
O'Hare Airport to monitor emissions
from both planes and cars.
  Data from Illinois flow, almost
continuously, into the capital city of
Springfield,  where a staff of 28 validates
the information and oversees the
maintenance of equipment. The data are
fed directly into EPA's new Aerometric
Information and Retrieval System
(AIRS).
  Illinois and other states not only
provide raw monitoring data to EPA but
can also serve as valuable sources of
supplemental information. For example,
some initial studies by Dave Kolaz of
the Illinois Environmental Protection
Agency have provided insight into the
effects of meteorological conditions on
ozone trends in the midwest. According
to Kolaz, the reduction in carbon
monoxide levels can be attributed to "a
tremendous reduction in carbon
monoxide in Chicago because of tax
incentives in Illinois to use ethanol."
On the other hand, ozone levels have
been going up in Illinois,  "so our
technicians take note of temperature
levels and what's coming  into our air
from other states."
  Raw data from the states can be held
in AIRS computer  system  screening files
for review by state analysts before being
made available to EPA. Quality
assurance checks by  state  agencies flag
unusual readings, such as those caused
by forest fires or temperature inversions.
In addition,  the monitoring sites are
visited periodically by state and EPA
staff to double check not only technical
performance but also the continued
validity of the location itself. For
instance, a monitoring site may need to
be relocated if a  heavily trafficked street
has been replaced by a pedestrian mall,
or if fast-growing trees have partially
blocked the air flow to the instruments.
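  As a rough illustration of what such a screening step can look like, the sketch below flags hourly readings that jump far outside a site's recent history so an analyst can review them before release. The three-standard-deviation rule and the data are invented for the example; they are not the actual AIRS screening logic.

    from statistics import mean, stdev

    def flag_unusual(readings, window=24, n_sigmas=3.0):
        """Indexes of readings far outside the recent history."""
        flags = []
        for i in range(window, len(readings)):
            history = readings[i - window:i]
            mu, sigma = mean(history), stdev(history)
            if sigma > 0 and abs(readings[i] - mu) > n_sigmas * sigma:
                flags.append(i)  # hold for review: a fire? an inversion?
        return flags

    hourly_ozone = [0.04, 0.05, 0.04, 0.05] * 6 + [0.19]  # spike at the end
    print(flag_unusual(hourly_ozone))  # [24]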
  The tremendous volume of
information gathered each year winds
up at the Office of Air Quality Planning
and Standards (OAQPS) in North
Carolina, where it is subject to
detailed analysis. The data currently
reach the OAQPS compilers and
analyzers by two  routes: the "old path"
from localities and states through the
EPA regions, where they are processed
and relayed to the EPA National
Computer Center at Research Triangle
Park; or directly from the states to the
Computer Center by way of AIRS.
  As its acronym implies, AIRS is EPA's
new computerized information
management system for data on ambient
air quality  and  emissions; AIRS also
tracks compliance with data
requirements. By the end of 1988, 28
states and Puerto Rico were plugged
into the AIRS system, and at least 14
more state-level agencies are scheduled
to go on line in 1989.
  Throughout the entire air trends
monitoring and analysis process, there
is a scrupulous regard  for accuracy.
Tom Curran, Chief of the Data Analysis
Section at OAQPS,  stresses the integrity
and cooperative aspects of the
monitoring system at all levels, even
when work loads are heaviest. "We're
dealing with professionals  who know
the importance of their work," he says.
"When the data reflect downward
pollution trends,  we take pride in the
accomplishments of EPA and state air
pollution agencies. When  the trend  is
up, we know there's a  lot of work to do
to clean up the air."
(Popkin is a Writer-Editor in EPA's
Office of Public Affairs.)

There's More than Poetry
in the Clouds
by Gregg Sekscienski
A man-made spider web of teflon
    string sits atop a tower on Mt.
Mitchell in North Carolina's Mt.
Mitchell State Park. As clouds roll in,
cloud droplets deposit on the teflon
strings. Slowly, the water drips off the
strands of the web and into a small
trough. Every hour, a man climbs up the
70-foot tower and empties the trough.
He climbs other towers to empty other
troughs. Whenever clouds appear he
performs this ritual.
  The  man works for the Mountain
Cloud  Chemistry Project, which is
managed by EPA's Office of Research
and Development. The project is
designed to measure the impact clouds
have in distributing chemicals and other
pollutants—primarily sulfates and
nitrates—to high-altitude forests.
Catching clouds may sound unscientific,
an act better left to poets and dreamers,
but for the Mountain Cloud Chemistry
Project it is serious business.
Acid rain damage was first studied
extensively in Europe, where the effect
known as Waldsterben, or "tree
die-back," began destroying the forests
of Germany, especially the legendary
Black Forest. But something puzzled
researchers. They knew rain carries
airborne pollution to forests in a process
called wet deposition. They also knew
that some chemical particles find their
way to forested areas by wind and air
currents, a process known as dry
deposition. But pollutant levels
measured in the forests were greater
than these two sources could  produce.
Therefore, pollutants must be reaching
the forests by some other means. Clouds
were suspected as the culprits.
As clouds move through a line of
mountains,  they collide with the
high-altitude, spruce-fir areas  of the
forests. Chemicals contained  in the
clouds  are deposited  in the forest
canopy. This phenomenon is known as
cloud interception and is believed to be
responsible  for a considerable amount of
the pollutants that are being measured
in these areas. According to Volker A.
Mohnen, the Principal Investigator for
the project,  the areas  of the forest that
clouds  regularly collide with can have
deposition levels twice as high as areas
just below the cloud-impact level.

This moisture collection tower must be
checked at least hourly. The cloud moisture
collects on teflon strings and drips into a
trough; researchers check the amount of water
that has collected and analyze its contents.
This site is one of six staffed by EPA's
Mountain Cloud Chemistry Project.

  With clouds fingered as possible
culprits, the idea was born, in 1984, for
a project designed to collect reliable,
accurate data that might prove clouds
were contributing to the Waldsterben
problem here in the United States. The
project would need to situate
site-monitoring stations in areas
representative of the problem, transport
reliable monitoring equipment to those
areas, and catch, collect, and analyze
the clouds.
  In choosing the monitoring areas, the
project directors decided that three
northern sites and three southern sites
would best represent the different
weather systems that affect East Coast
forests. The northern forests receive
clouds primarily from the industrial
Midwest and Canada,  while the
southern forests receive clouds from the
Midwest and the southern states.
  Also, to help insure the accuracy of
the data, the sites had to be located near
people qualified to collect and analyze
the data. So each site has an associated
lab close by, where the data can be
reliably analyzed. In addition, a central
analytical  laboratory has been
designated for performing overall
quality checks.
  By 1986, the six sites were selected
and data collection began. Howland
Forest, Maine, Mt. Moosilauke, New
Hampshire, and Whiteface Mountain,
New York, make up the northern half of
the study. Shenandoah Forest and
Whitetop Mountain, both in Virginia,
and Mt. Mitchell, North Carolina,
comprise the southern forest sites of the
project. Most of the sites are relatively
remote, but all must be accessible by
vehicles. Power lines were strung to
provide electricity for the monitoring
equipment. Towers were installed so
instruments could be positioned at  the
same  height as the forest canopy level.
At the Shenandoah site, a  helicopter
was needed to fly the  monitoring towers
in.
  At each site monitoring devices were
installed. Some were adapted from
lab-based equipment when possible, but
others had to be designed specifically
for the project. They had to withstand
the hazards of the mountain
environment: average wind speeds of
15-25 miles per hour, and gusts over 80,
as well as cold and precipitation.
  A method of detecting cloud presence
was needed. To catch, collect, and
analyze clouds, the researchers needed
to know when clouds were present.
They decided to use a variety of
methods, including visual  observations
by on-site technicians, video recordings,
and humidity measurements, and to
compare  the results to a newly
developed optical cloud detector that
uses an infrared beam system to detect
clouds. A 95-percent agreement between
the detector and the other  methods has
convinced the project directors to use
the new device at all its sites.
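  The comparison itself is straightforward: each method's hourly cloud or no-cloud calls are lined up and the fraction of matches is computed. The series below are invented simply to show the arithmetic.

    # Percent agreement between the optical detector and another
    # method (True = "cloud present"); the observations are invented.
    detector = [True, True, False, False, True, False, True, True, False, True]
    observer = [True, True, False, True,  True, False, True, True, False, True]

    matches = sum(d == o for d, o in zip(detector, observer))
    print(f"agreement: {matches / len(detector):.0%}")  # 90%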
  Once a cloud is detected, the amount
 of water in the cloud must be measured
 and a sample of the cloud water must be
 taken. A cloud liquid-water-content
 monitor measures the amount of water
 in the cloud. This device collects water
 in a cylinder filled with a honeycomb of
 polypropylene mesh. A blower  is used
 to draw cloud air through the cylinder,
 and the exact amount of air drawn is
 measured. The exact amount of water
 collected is also measured. With both
 these figures known, the liquid  water
 content of the cloud can be calculated.
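  The calculation is a simple ratio, as in this sketch (the volume and mass are made up for illustration):

    def liquid_water_content(water_grams, air_volume_m3):
        """Grams of liquid water per cubic meter of cloud air."""
        return water_grams / air_volume_m3

    # Say the blower drew 1,200 cubic meters of cloud air and the
    # polypropylene honeycomb yielded 540 grams of water:
    print(liquid_water_content(540.0, 1200.0))  # 0.45 g per cubic meter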
  Two  devices are used to collect cloud
 water. The first is a "passive" collector.
 It looks like a cylinder standing on end,
 approximately two feet high and one
 foot in  diameter. The cylinder's wall is
 actually made of spider-web-like strands
 of teflon string. Cloud  water collects on
 these strings and drains into a trough at
 the bottom of  the cylinder, where it is
 collected.
  The second device, an "active"
 collector, looks like two window
 screens, again made of teflon string.
 Teflon  is used  because it is
 chemically inert and will  not affect the
 chemical composition  of the cloud
 water. Cloud air is drawn through the
 screens by a blower. Again, water is
 deposited on the strings and collected.
 These samples, from either device, are
 stored for later chemical analysis.
  The on-site data and cloud water
samples, collected in a May-to-October
measuring season each year, are
analyzed and verified at each site's
associated lab and sent  to the project's
data management center in Albany, New
York. Here the data are computerized
and stored. This data base can now be
accessed by anyone in the assessment
community. The project even shares
data with Canada and Germany.
  The hypothesis that gave rise to the
Mountain Cloud Chemistry  Project has
been proved correct. The data gathered
since 1986  have shown that clouds are
indeed culprits in the acid deposition
damage to the nation's mountain forests.
Analyses of cloud water have indicated
that significant  amounts of sulfate,
nitrate, ammonia, and hydrogen ions are
deposited to the forests through the
cloud water. In particular, the project
has found that:
• Clouds are between 5  and 20 times
more polluted than rain.

• Cloud water is typically 1/2 to 1 pH
unit more acidic than rainfall (see the
short calculation after this list).
• The chemistry of clouds varies
according to their pathways. In the
northern forests, for example, clouds
from a southwestern direction are
higher in pollution content than clouds
from the northwest. This is  due in part
to the heavy concentration of industrial
emissions from areas including the Ohio
River Valley.

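  Because pH is a logarithmic scale, the second finding is larger than it may sound: each full pH unit is a tenfold difference in hydrogen-ion concentration. A short calculation makes the point (the rain pH of 4.3 is only an example value, not a project measurement):

    def hydrogen_ion(pH):
        return 10 ** (-pH)  # moles per liter

    rain_pH = 4.3  # example value
    for cloud_pH in (rain_pH - 0.5, rain_pH - 1.0):
        ratio = hydrogen_ion(cloud_pH) / hydrogen_ion(rain_pH)
        print(f"cloud pH {cloud_pH:.1f}: {ratio:.1f}x the hydrogen ion of rain")
    # half a unit lower -> about 3.2x; a full unit lower -> 10.0x
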
  The data from the Mountain Cloud
Chemistry Project join the growing body
of knowledge about the acid deposition
problems throughout the world. This
new knowledge about the intensity and
amounts of cloud deposition will be
crucial in sorting out the relative
contribution of  regional  air pollution to
the damage of spruce-fir forests.
(Sekscienski, a journalism student at
the University of Maryland, is an intern
with EPA Journal.)

Dioxin Pathways:
Judging Risk to People
 by Michael A. Callahan
   Since the 1970s, when the
   controversy surrounding Vietnam
veterans' exposure to the defoliant
Agent Orange first attracted national
media coverage, the word "dioxin"  has
crept into the vocabulary of average
Americans. After Vietnam, we heard
about dioxin at Times Beach.  Missouri,
where property was contaminated by
flood waters containing dioxin. We
heard about dioxin again when an
industrial plant exploded  in Seveso,
Italy, about a decade ago. A couple  of
years ago, we heard  it connected  with
some forms of paper products. It  is  said
to be present at some Superfund sites.
More recently, dioxin has been
mentioned in connection with
incinerators.
What is dioxin? More importantly,
how can we get a handle on dioxin
exposure? Are we likely to see more
data on dioxin, in more and more places
and things, in coming years?
  Dioxins, or CDDs (see boxes), are
formed as unwanted byproducts in
chemical reactions involving
hydrocarbons and chlorine, usually at
elevated temperatures. From studying
data on the amounts of CDDs formed in
different reactions, scientists today
know a lot more about what conditions
are favorable to CDD formation than
they did in the 1970s.
  During the 1960s and 1970s, pesticide
manufacturing processes and other
industrial reactions sometimes were
unknowingly performed under
conditions favorable to production of
CDDs, and the byproducts or wastes
from these reactions have become  the
Agent Oranges, Times  Beaches, Sevesos,
and Superfund sites of later years. Many
industrial processes  have since been
changed to avoid the formation of  CDDs.
  More recently, analytical data
indicating that minute quantities of
dioxin were present in certain bleached
paper products and sludges have again
caused some rethinking and  revision of
manufacturing processes. The tipoff in
this case was the analysis of data from
aquatic organisms downstream from a
few paper plants. Puzzling at first, the
levels of CDDs found downstream led to
collection of more data, by both EPA
and the paper industry, which  showed
that some processes being used to
bleach paper involved conditions which
resulted in trace-level CDDs.
What about incinerators? Evidently,
the small amounts of chlorine present
from various materials in municipal
trash—coupled with hydrocarbons and
the heat of the incinerator—are enough
to produce tiny quantities of various
CDD compounds, depending on the
exact conditions present in the
incinerator's burn chamber. Only a
small percentage of the CDD compounds
(typical data indicate less than  10
percent) formed in an incinerator appear
to be 2,3,7,8-TCDD. This dioxin can
leave the incinerator adsorbed onto
particles ("soot"), or perhaps as a vapor,
or in "fly ash" (trapped participate
matter that is filtered out before the
exhaust from the incinerator leaves the
stack).
  Once it enters the environment, the
chemical properties of the CDD
molecules, the conditions  in the
environment, and the activities of the
persons involved determine potential
human exposure to dioxin. Typically,
laboratory data  for CDD molecules
indicate very low water solubility, very
low vapor pressure, and a  high
tendency to dissolve in lipid (fatty)
material. CDD molecules also have a
tendency to stick to surfaces,
especially surfaces with high organic
content such as soot or soil. Both
laboratory and field data indicate that
CDD molecules do not degrade rapidly,
although some CDDs degrade much
faster than others in the environment.

  What Is "Dioxin"?

  The term "dioxin" is chemical
  shorthand for a large family of
  compounds more correctly termed
  chlorinated dibenzo-p-dioxins
  (CDDs). One of these compounds,
  2,3,7,8-tetrachlorodibenzo-p-dioxin
  (2,3,7,8-TCDD), is a very potent
  animal carcinogen; based on this
  evidence in laboratory animals, it
  is classified as a probable human
  carcinogen.
     Although no one is absolutely
  sure of its potency as a human
  carcinogen, based on its animal
  potency—and following the
  customary policy on implications
  of animal evidence—EPA has
  treated 2,3,7,8-TCDD as potentially
  one of the most potent human
  carcinogens known. There are
  more than 100 other CDDs, and all
  are less toxic than 2,3,7,8-TCDD,
  some considerably less toxic.
     It is unusual to find
  2,3,7,8-TCDD alone, without other
  CDDs present also. Scientists have
  therefore devised a system of
  "toxic equivalency factors" (TEFs),
  which converts the toxicity of a
  mixture of many CDD compounds
  into the amount of 2,3,7,8-TCDD
  that represents the equivalent
  toxicity of all the CDDs in a
  mixture.
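  A minimal sketch of the toxic-equivalency arithmetic the sidebar describes: each congener's concentration is weighted by its TEF and the products are summed into a single 2,3,7,8-TCDD-equivalent value. The factors and concentrations below are illustrative placeholders, not an official TEF table.

    TEFS = {
        "2,3,7,8-TCDD": 1.0,     # reference compound, by definition
        "1,2,3,7,8-PeCDD": 0.5,  # placeholder value
        "OCDD": 0.001,           # placeholder value
    }

    def tcdd_equivalents(mixture):
        """Collapse a CDD mixture (all in the same units) into TCDD equivalents."""
        return sum(conc * TEFS[congener] for congener, conc in mixture.items())

    sample = {"2,3,7,8-TCDD": 2.0, "1,2,3,7,8-PeCDD": 6.0, "OCDD": 400.0}
    print(tcdd_equivalents(sample))  # 2.0 + 3.0 + 0.4 = 5.4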
  In the case of incinerators, these and
other data allow scientists and engineers
to estimate where the dioxin molecules
may go after leaving the incinerator and
how people might ultimately be
exposed to it. The dioxin in the exhaust
of an incinerator stack, whether emitted
in vapor or particulate form, becomes
diluted and may eventually be inhaled
by persons  in the vicinity of the
incinerator. The dioxin may also be
deposited on soil, crops, or water.
  Risk assessors must combine the
insights gained by analyzing data from
the laboratory with analytical results
from measurements from the field in
order to begin to evaluate exposure. For
example, dioxin  in soil can be
inadvertently ingested (say, by children
playing), blown around as dust, or
washed into waterways. Soil analysis
data can show how much dioxin is in
the soil, but experimental data on how
much soil is inadvertently ingested—as
well as data on how often children
might play in a contaminated
area—must be gathered and  interpreted
before estimates can be made as to how
much exposure children might get from
contaminated soil.
  Because of its properties, scientists
believe that dioxin is unlikely to
dissolve in water to such an extent that
it will  contaminate ground water
through leaching. However, in cases
where organic solvents are present,
there have been some data indicating
that small amounts of dioxin may move
to ground water. Persons who
inadvertently ingest or inhale
contaminated dust, or drink
contaminated drinking water (a less
likely event), would thus be exposed to
dioxin.
  Dioxin falling on crops or waterways
presents a different sort of problem.
Although data on whether dioxin is
taken up through the roots of plants are
controversial, many scientists believe
that transport of dioxin from soil
through roots to edible portions of a
plant above ground is unlikely to  be
significant.
  Less  is known about whether root
crops such as potatoes may take up
dioxin  from the soil. Plants can also be
contaminated by particles falling on the
plant, which can then be eaten by either
livestock or humans. Because of
dioxin's preference for fatty material, it
can then accumulate in the animals. If
livestock are used for human food, this
can be  another exposure pathway.
  Although these food-related
"pathways" can be logically surmised
from laboratory test results,
actual  data tracing these potential
exposure pathways are sparse. On the
other hand, once dioxin contaminates  a
waterway, its preference for soil
(sediment) or lipid means that it is very
likely to turn up in those substances.
Fish bioconcentration is a well-known
phenomenon in areas where dioxin has
contaminated waterways, and some
areas have been closed to fishing due to
measured levels of dioxin
contamination in fish.

   Measuring Dioxin Concentrations

   Because of its toxicity,
   2,3,7,8-TCDD is perhaps one of the
   most well-studied organic
   chemicals. The current analytical
   capabilities for CDDs, and
   especially 2,3,7,8-TCDD, are
   nothing short of remarkable. For
   example, the well-known tests for
   the 129 priority pollutants usually
   allow detection limits in the low
   parts-per-billion (i.e., one part
   pollutant in a billion parts of the
   material being tested for
   contamination). It is not unusual
   for analytical tests for
   2,3,7,8-TCDD to be a thousand
   times more sensitive, and
   laboratory researchers have
   discussed getting to
   parts-per-quadrillion levels, or a
   million times more sensitive than
   more routine pollutants. With this
   sort of analytical capability, we
   can be much more aware of CDD
   contamination than we are of
   contamination from many other
   pollutants.
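  The sidebar's sensitivity figures are easier to grasp as absolute masses. The sketch below converts each detection limit into the amount of analyte it represents in a one-gram sample; it is pure unit arithmetic, nothing laboratory-specific.

    PARTS_PER = {
        "billion (ppb)": 1e9,
        "trillion (ppt)": 1e12,      # ~1,000 times more sensitive than ppb
        "quadrillion (ppq)": 1e15,   # ~1,000,000 times more sensitive than ppb
    }

    sample_grams = 1.0
    for name, parts in PARTS_PER.items():
        print(f"1 part per {name}: {sample_grams / parts:.0e} g of analyte")
    # ppb -> 1e-09 g, ppt -> 1e-12 g, ppq -> 1e-15 g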
  Finally, to continue the incinerator
exposure assessment, the dioxin in the
fly ash must be tracked down. Usually,
fly ash is landfilled,  and if precautions
are taken to contain the ash and
eliminate solvent  leaching, scientists
believe the dioxin will be relatively
immobile, although long-lived. This
conclusion is mainly based on
laboratory data, although  field data tend
to back this up with  "non-detects" in
ground water. Improper care of the fly
ash could lead to  the same types of
exposures as outlined above for
contaminated soil.
  What does all this  mean? Dioxins,
especially 2,3,7,8-TCDD, are treated as
very potent toxicants that may cause
health effects at quite low levels of
exposure. Risk assessors must use a
combination of laboratory and field data
to make estimates of exposure and risk.
Analytically, we find dioxin in many
places; however, our analytical
techniques allow us to "see" it much
better than many other pollutants.
  Dioxins are even found in human fat
tissue. At least one estimate has been
made, from data on adipose tissue levels
in the U.S. population, that virtually
everyone in the United States has a very
small "background" level of dioxin
exposure, perhaps on the order of one
millionth of one millionth of a gram of
dioxin ingested per kilogram of body
weight per day. Even using conservative
assumptions,  this  level indicates a low
level of risk as "background"; however,
many other exposure and risk factors
need to be considered in assessing
dioxin-related risks for any  individual's
specific situation.
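  In more familiar units, "one millionth of one millionth of a gram" is a picogram (10^-12 gram), so the estimate works out as follows for an adult; the 70-kilogram body weight is an assumed round number, not part of the published estimate.

    dose_g_per_kg_day = 1e-12   # one picogram per kilogram per day
    body_weight_kg = 70.0       # assumed round number for an adult

    daily_intake_g = dose_g_per_kg_day * body_weight_kg
    print(f"{daily_intake_g:.0e} g/day")  # 7e-11 g, i.e. 70 picograms a day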
  Tracing exposure to persons around a
site such as a landfill or incinerator is a
complicated  matter involving the types
of CDDs produced, the environmental
conditions, and the activities of the
people involved. Preliminary
calculations show that in  these cases,
food-chain exposures or inadvertent soil
ingestion may ultimately be the most
important pathways,  but much depends
on site-specific factors. In fact, the
environmental and population activity
factors are important enough to
preclude categorical statements on the
risk levels from landfills or spill sites.
  As a closing thought, there is some
reason for optimism about the future
concerning dioxin. As time goes by,  we
discover more about this class  of
compounds known as dioxins, and the
more we know, the more  likely we are
to be able to put bounds on what we
might see in the future.
  We cannot  rule out a future surprise
or two as to where dioxin may be
discovered. (Paper products certainly
caught many  people unaware a couple
of years ago, and just recently CDDs
have been found in certain
petroleum-refining process streams.)
However, we now have a general idea
how CDDs are formed and how they're
not formed. Although dioxins have been
detected in many places, the resulting
exposure and risk have in many cases
been low. And the active interest in
research on dioxin means we will
continue to learn more about these
compounds and there will be fewer data
gaps in our knowledge. We may then
truly get a handle on dioxin.
(Callahan is Director of the Exposure
Assessment Group in the Office of
Health and Environmental Assessment
in EPA's Office of Research and
Development.)

Getting It Together with GIS
 by Thomas H. Mace
   EPA collects, processes, and interprets
   massive amounts of data on the
 environment. Such data come in the
 form of tables, maps, or images from
 space-sensing systems and reflect
 everything from water quality, air
 emissions, and soil gas measurements to
 the results produced by models of the
 global environment. How can the
 Agency's scientists, managers, and
 decision-makers absorb this influx
 without being paralyzed by
 "information overload?"
   Now there is an important new tool
 that makes it possible for computers to
 integrate diverse,  multi-media
 information into a common data base. It
 is called a Geographic Information
 System (GIS), and it has the potential to
 revolutionize the  way EPA analyzes
 environmental  data and significantly
 improve the Agency's ability to make
 complicated environmental decisions.
   GIS in its most  rudimentary form can
 be a series of transparent overlays to a
 map showing land use, soils, land
 ownership, surface elevation, and other
 information about a particular portion of
 landscape. Following a visual analysis
 of a set of overlays, conclusions can be
 made, for example, concerning where
 airborne emissions are coming from—or
 even about  the suitability of actions
 such as siting a landfill. But using
 overlays without  computer assistance
 has obvious limitations. It is difficult to
 integrate more than a  few layers at  a
 time and  still know what one is looking
 at. Also, compiling data onto a common
 map base is a very time-consuming
 process.
   The modern GIS, as it was developed
 first by Canadian geographers and
 adapted for various uses in the United
 States, is a computer and software
 system that permits the automated
 overlay and analysis of multiple data
 layers (called "themes") for data
 management, mapping, and
 decision-making. The GIS used by EPA
 has a  "relational" data base that stores
themes such as land uses, soils,
population, and well logs for a
particular area and enables users to
explore the interrelationships among
them. Another computer file contains
"earth-coordinate" data—latitude and
longitude—and other information on the
relationships between a specific location
and surrounding areas. The GIS
software  enables coordinates and their
associated themes to be related to other
sets of coordinates and themes—e.g.,
where drinking water sources are
located in relation to pollution sources.




  This capability has a number of
practical  applications. Suppose an
environmental agency wishes to know
the population served by a well that has
been found to be contaminated. GIS can
graphically overlay the well location
with a map of the subsurface
contaminant plume and then add an
overlay of census data for the area; the
potentially affected population can then
be determined by census category. An
alternative would be to plot the well
locations for an area, point out a well,
and simply ask for a printout of the
measurements that have been taken
there.
  GIS also allows users to "create"
buffer zones around a well, a stream, or
the habitat of an endangered species,
look at circumstances within these
special zones, and then take action
having considered several possible
scenarios. GIS can, for example, create a
200-meter zone around all streams
within a  particular metropolitan area.
Then, using  census street corner address
information, service station locations
can be inserted into the data base. By
extracting stations within the 200-meter
buffer, planners can develop an
emergency response strategy for
potential spills or leaking underground
storage tanks. In much the same way,
GIS can aid the analysis  of an air
pollution problem and who is being
affected by it.
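  A toy version of the buffer-and-extract operation just described, written with the shapely geometry library (a modern tool used here purely for illustration; the stream and station coordinates are invented and assumed to be in meters in a projected system):

    from shapely.geometry import LineString, Point

    stream = LineString([(0, 0), (500, 100), (1000, 150)])
    buffer_zone = stream.buffer(200.0)  # 200-meter zone around the stream

    service_stations = {
        "Station A": Point(450, 250),  # roughly 160 m from the stream
        "Station B": Point(900, 600),  # well outside the zone
    }

    at_risk = [name for name, point in service_stations.items()
               if point.within(buffer_zone)]
    print(at_risk)  # ['Station A']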
  Various environmental models can
"interact" with the GIS by directly using
the functions it provides. Alternatively,
data can be extracted from the GIS
data base for external model use, and the
results then placed back  into the GIS for
use in further analyses.
  More than just a mapping system, the
GIS functions as a window on
data bases, allowing users to interrelate
and manage data, models, and maps. It
enables users to develop  scenarios and
visually shows the results in either
permanent paper map form or as
temporary presentations on a color
computer screen. GIS not only helps
users answer site-specific questions,
offering new perspectives on complex
environmental interactions, but also
facilitates the use of such data in
environmental decision-making.
  The Superfund Program's approach to
dealing with problems in the San
Gabriel Basin is a good illustration of
how GIS can be used to solve complex
environmental problems, in this case
dealing with ground-water
contamination by industrial chemicals
and solvents whose sources are still
unknown.
  The San Gabriel Basin  site is one of
the largest and most complicated sites
on EPA's National Priorities List.
Encompassing over 200 square miles, it
is located in the heavily urbanized Los
Angeles Basin of EPA Region 9.
  Region 9 is  faced with  the problem of
reaching a documented, systematic
decision on a  clean-up strategy that
takes into account not only
environmental data from  all media, but
also human and ecological risk and the
tremendous costs associated with
ground-water cleanup over so large and
populous an area. With hundreds of
thousands of people potentially affected,
and with millions of dollars at stake, the
Region 9 final Record of Decision must
be supported by the best available
information and presented in a way that
the public can understand and support.
The pilot study, begun in 1985 by EPA's
Environmental Monitoring Systems
Laboratory at Las Vegas and EPA Region
9, is the Agency's first extensive
application of GIS technology by a
mainstream EPA program.

San Gabriel Basin: Census Tracts and
Contaminant Plume. One of EPA's largest
and most complicated Superfund sites is
the 200-square-mile San Gabriel Basin
near Los Angeles, California. The Agency's
Geographic Information System has been
put to a number of uses in dealing with
the San Gabriel site, including the
strategic monitoring of underground
water pollution.
  The study began with the assembly of
 a spatial data base containing
 information about the characteristics of
 the aquifer and the Basin's physical
 environment, locations of potential
 pollution sources, and information on
 current pollution, land use, and
 population. The data came from other
 federal agencies, Region 9, and state,
 local, and private sources. The data base
 integrated both what was  known
 through environmental measurements
 and  what could be reliably surmised
 through the use of environmental
 models. Immediately successful, the GIS
 data base is still in use today by Region
 9 as an operational part of the Remedial
Investigation; it is continually being
expanded and updated as our
understanding of the San Gabriel Basin
increases. Eventually, it will be a
valuable tool for enforcement and
compliance actions.
  Among the steps already taken
through the use of GIS in the San
Gabriel situation has been the mapping
of a number of significant factors related
to the present and future pollution of
the Basin's water supply. These factors
include: the area's generalized geology;
locations of drinking water wells; the
movement, actual and projected, of
underground waters; observed and
projected  underground volatile organic
chemical pollutant plumes; wells
actually within the plume areas;
location of water supply district within
the plumes; census tracts; and data
about the potentially affected census
tracts. GIS has also aided in the
determination of the principal
responsible  parties involved.
  The combinations of overlays that are
being developed for this project have
already enabled Region 9 to pinpoint
sites for continuous monitoring so that
the advance of underground water
pollution can be detected if and when it
happens. In addition, Region  9 will be
able to tell if large numbers of children
or older persons are at risk in specific
areas, and where potentially serious
pollution sources are located  in relation
to such populations. The region will
also have access to other information
critical to planning for long-range
prevention—or cleanup where
necessary.
  Use of GIS is increasing within the
Agency. The Office of Information
Management Systems has developed
and is actively managing a plan of
action for its implementation. Seven
regional offices already have GIS
systems, and the others are conducting
pilot studies to assure that the
technology applies to their programs.
The Office of Research and
Development has developed GIS
capabilities  at several laboratories and is
providing research and technical
support to the regions through the
Environmental Monitoring Systems
Laboratory at Las Vegas. The payoff will
be better use of data  in decisions and
the integration of multimedia
information  supporting environmental
management. The San Gabriel
Superfund program is just the
beginning.

(Dr. Mace is  Chief of the Remote and
Air Monitoring Branch at EPA's
Environmental Monitoring Systems
Laboratory in Las Vegas. He is
responsible for the research program in
Geographic Information Systems in
EPA's Office of Research and
Development.)
Keeping a Closer Watch on Ecological Risks
 by Jay J. Messer
 It seems that almost every week, the
 news media confront us with accounts
of  new ecological disasters. Some result
from localized accidents, spills, or
system failures; others result from
intentional breaches of sensible
environmental housekeeping, such as
dumping medical wastes in our coastal
waters.
  Even more troubling than these
events, however, are reports that our
coastal resources are slowly but
inexorably wasting  away—casualties of
a combination of air and water
pollutants acting on global and regional
scales. Bolstered by an alarming rate of
habitat loss and fragmentation, the
resulting rate  at which species are being
lost may rival the mass extinctions in
the fossil record generally associated
with catastrophic meteor impacts. As a
barometer of this widespread  concern,
Time magazine replaced its "man of the
year" in 1988 with the "planet of the
year": Earth.
  How can society best respond to this
emerging environmental crisis?
Currently, the United  States alone
spends $70 to $80 billion annually on
environmental programs, most of them
targeted at protecting  human health.
Protecting ecological resources while
controlling acid rain, reducing
emissions of greenhouse gases, replacing
chlorofluorocarbons that destroy
stratospheric ozone, reducing inputs of
nonpoint source pollutants to surface
waters, and preserving critical habitats
will further increase the cost of
environmental protection.
If the nation's economic resources are
insufficient to address all of these
concerns, we certainly cannot afford to
proceed on anything less than a solid
understanding of environmental
processes and effects.
EPA must have a sound basis for
targeting its limited research and
regulatory resources at the most critical
threats to our ecological resources.
  EPA sets its standard-setting priorities
for  pesticides and toxic substances
released to the environment based on a
formal risk assessment procedure. The
results of toxicological tests on
laboratory organisms are coupled with
pollutant exposure models to  calculate
estimated rates of illness or mortality in
humans. This procedure is sometimes
backed up using human health statistics
obtained in surveys of  clinical records,
where available. The most toxic or
carcinogenic chemicals with the highest
exposure levels are given priority for
regulatory attention.
  This risk assessment procedure is also
useful in comparing the relative toxicity
to plants and animals of single
chemicals, but it has limitations in
assessing risks from multiple pollutants
to processes that operate at the
community and ecosystem level. Instead
of relating animal test results to one
species (Homo sapiens), toxicological
results must be extrapolated from a few
species that can be reared in the
laboratory  to hundreds of thousands of
species that may be exposed to a
pollutant in the natural environment.
Unknown effects  of pollutants on
reproduction, competition for resources,
and susceptibility to predators and
disease that are not fully testable under
laboratory  conditions further reduce our
confidence that we are  truly protecting

ecosystems from harm. Furthermore,
decreased finfish and shellfish harvests
in near-coastal systems, dying
high-elevation forests, and loss of
biodiversity may stem from a
combination of human-induced causes
not restricted only to toxic pollutants.
  Assessing current and future risks to
our ecological resources requires two
components in addition to the
traditional toxicological approach. The
first is ecological field data to allow us
to determine which problems are the
most widespread or most rapidly
becoming worse. The second is a
sufficient understanding of complex
ecological processes and effects to allow
us to adequately predict the response to
regulatory alternatives. The focus of this
article is the first issue: the need for
data on status and  trends in the
condition of our ecological  resources.
  We are certainly not without
environmental data. Although no
official statistics are kept, the federal
government spends more than $500
million each year on environmental
monitoring. State and private
organizations more than double this
figure. The majority of environmental
monitoring dollars are spent on
compliance monitoring: making sure
polluters obey regulations. The majority
of ambient monitoring is targeted at
urban air quality and  contamination of
food and drinking water. The remaining
programs provide us with what we do
know about the condition of our
ecological resources.
  Ambient water quality is  monitored
by the U.S. Geological Survey (USGS)
NASQAN and Benchmark networks, as
well as by EPA and at least 10 other
federal agencies. Air quality is
monitored in metropolitan areas, and
acid deposition rates have been
measured in rural areas by a consortium
of USDA Agricultural Experiment
Stations, EPA, USGS, the National Park
Service, and others since 1983.
  Levels of toxic and  carcinogenic
organic compounds and certain heavy
metals have been measured in bottom
fish and shellfish by the National
Oceanic and Atmospheric
Administration at 200 near-coastal sites
since 1984. The U.S. Fish and Wildlife
Service currently monitors contaminant
levels in fish in 200 rivers, as well as
pesticide and metal residues in bird
tissues.
  State and private monitoring also add
to what we know about environmental
quality. The states  monitor air and
water pollutants, and many conduct
biological  surveys of surface-water and
terrestrial  systems. The Audubon
Society conducts an annual Christmas
bird count, and the Nature Conservancy
tracks changes in availability of wildlife
habitat. University researchers have
provided ecological monitoring data that
alerted society to the threats of
surface-water eutrophication, acid rain,
stratospheric ozone depletion, and
global warming.
  Despite  the apparent "glut" of
monitoring data, the most recent
Conservation Foundation report on the
"State of the Environment"  in 1987
contains fewer than 10 figures (out of
150) that describe field data on
ecosystem contamination or conditions.
Most of the other figures describe
pollutant  releases,  industrial and
economic  activity levels, and population
and transportation statistics that serve
only as surrogates for pollutant releases.
The authors of the report noted with
some frustration that more data on the
actual condition of ecosystems simply
were not readily available.
  What is missing?
  To meet our ecological  field data
requirements, two  needs must be met.
The first is better coordination and
communication among the agencies that
collect environmental data.  The second
is a framework for data interpretation
and reporting that  meets our ecological
assessment needs and for identifying
and filling the critical data gaps. The
second need  is a prerequisite if the first
is to result in more than a few scientific
workshops and exchanges of data tapes.
  In order to be useful in the risk
assessment process, monitoring data
must affect decisions concerning
whether or where to target research or
regulatory resources. Such decisions are
facilitated when the data  provide
answers to certain  specific questions:


• What is the resource of concern (e.g.,
the number of lakes subject to
acidification  or the acres of wetland
subject to loss)?
• What fraction of this resource appears
to have suffered damage, and where is
the  problem most pronounced?

• Are the magnitude, extent, and
location of the damage  changing over
time?
• Are patterns of damage related to
patterns in pollutant exposure or other
disturbances?
• What level of uncertainty is
associated with each of these
assessments?

  Many monitoring programs provide
vital clues to the condition of the
environment, but were  never meant to
characterize a particular resource of
concern. Some programs are based on
an "early warning" or sentinel concept
in which highly sensitive monitoring
sites are chosen. Other  program designs
focus on resources that are of particular
management interest, such as national
parks, commercial timber, or fishery
landings. The NASQAN network
monitors the quality of 90 percent of the
major riverine water discharges in the
nation, and the data  have been used to
document decreases  in  lead
concentrations in runoff resulting from
declining use of leaded gasoline. The
network is not designed, however, to
describe the distribution of water
quality or biotic conditions in the
thousands of miles of stream that  make
up the aquatic habitat.
  Despite the hundreds of millions of
dollars spent on monitoring, we appear
unable to determine  with confidence if
the conditions of our resources are
getting better or worse. In order to
provide an estimate of  the percentage of
a particular resource that appears  to
have suffered some damage, the sites
selected for a monitoring program must
be representative of the overall resource.
In other words, if 20 percent of the
resource is damaged or experiencing a
trend, approximately 20 percent of the
monitoring sites should show the  same
pattern or trend. Such a correspondence
would be expected if the sampling sites
were randomly selected.
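  A small simulation shows why random selection matters; all the numbers are invented for illustration. With sites drawn at random, the damaged share of the sample tracks the damaged share of the whole resource:

    import random

    random.seed(1)
    lakes = [{"damaged": i < 2_000} for i in range(10_000)]  # 20% damaged

    network = random.sample(lakes, 200)  # a randomly sited monitoring network
    estimate = sum(lake["damaged"] for lake in network) / len(network)
    print(f"estimated damaged fraction: {estimate:.0%}")  # close to 20%

    # A network sited only near known problem areas would oversample
    # damaged lakes and overstate the extent of the damage.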
  In most cases, however, monitoring
sites are not selected randomly but
deliberately and justifiably located to
determine the effects of a known
pollutant source or to serve as a
background or an experimental control
for  such a site. In other cases,
monitoring sites placed at "convenient"
locations (e.g., near roads or bridges or
in parks with unusual geology or
landforms) may unknowingly introduce
a bias into the sampling network. Such
bias in status or trends estimates may
result in  incorrect or inefficient
management decisions.
Incomparability among data collected
by different organizations, or by the
same organization at different times,
makes  it  difficult to clearly separate
patterns and trends  from differences in
techniques and sampling schemes. For
example, there is no question that
wastewater treatment plants reduce
inputs  of pollutants to receiving waters
and that  many lakes and rivers have
undergone marked improvements in
water quality since  the early 1970s.
In spite of hundreds of thousands of
water quality measurements, however, it
has so far proved impossible to
document unequivocally, on a national
basis, overall changes in the condition
of aquatic life or the extent of
eutrophication in lakes, streams, and
estuaries due to the billions of dollars
spent on water-related pollution  control
programs. The biennial reports on the
status of surface waters  required by the
Clean Water Act cannot document
trends  because of changing station
locations and  differences in
measurement and reporting procedures
among states and from year to year.
Could we be winning the battles but
losing the war?
  Finally, few of the current programs
were ever meant to relate changes in  the
ecological resources to changes in
pollutant exposure. Programs that
monitor natural resources typically
cannot distinguish between effects of
harvesting (e.g., commercial fishery
landings), management (e.g., crop and
forest planting and cultivation), and
actual environmental change (e.g.,
effects of climate, pollutants, and
disease).
  This problem could be partially
solved if organizations with overlapping
programs  could coordinate reporting of
monitoring results. Unfortunately,
monitoring programs often take a back
seat to more urgent issues, and
resources are diverted away from
interpretation, coordination, and
reporting. This situation has led Paul
Portney, Director of the Center for Risk
Management, Resources for the Future,
to call for a Bureau of Environmental
Statistics  to coordinate and facilitate the
communication of ecological monitoring
data to decision-makers and the public.
  What can EPA do to address this
critical data gap? In response to an
extensive review by the  Administrator's
Science Advisory Board, EPA's Office of
Research and Development is
undertaking a research program to
bolster the Agency's ability to assess the
effects of individual pollutants at the
population, community, and ecosystem
level; to assess the extent and causes of
damage where it already appears to have
occurred; and to determine whether the
sum total of EPA regulatory programs and
policies is having the desired or
predicted effects on our ecological resources.
  The Environmental  Monitoring and
Assessment Program (EMAP) represents
the first step  in this program and  is
aimed at filling the critical gaps in our
ability to assess the status and trends  in
the condition of our ecological
resources, particularly as they relate to
EPA's commitment to protect the
environment.
  EMAP will begin pilot testing in 1990
a set of interlocking monitoring
networks that will monitor changes  in
indicators of  ecological conditions and
environmental progress  in terrestrial
and aquatic ecosystems, including
estuaries and near-coastal systems. The
program will  focus on both biological
resources and exposure  to pollutants
and will meet the criteria outlined
above to assure its usefulness in  making
policy and management decisions.
  EMAP will be highly coordinated
with ongoing monitoring efforts, both
within and outside EPA, to add value to
existing monitoring data and to avoid
duplication of effort. The program will
provide regular statistical summaries
and interpretive reports  on its activities
and will serve as a means to focus the
ecological research needed to
understand and prevent the  most
serious future ecosystem-level impacts.
  Only through the development of
scientifically  sound and rational
methods of comparing environmental
risks  can we  hope to protect the  planet's
life support system with the resources
that are available.
(Messer is Program Manager for EPA's
Environmental Monitoring and
Assessment Program in the Office of
Research and Development.)
Our Record with  the
Environmental  Crystal  Ball
 by William R. Moomaw
   As the articles in this issue of EPA
   Journal clearly demonstrate, data
have become a major commodity in our
society. Quantitative information is
needed to understand any
environmental problem, evaluate its
seriousness, and develop a control
strategy by establishing regulatory
standards. It is virtually unthinkable
that we could manage our vast
environmental programs today without
the wealth of analytical and statistical
information that has been collected and
is now available to us instantly at the
touch of a few key strokes from a
computer terminal.
  Yet, how often when working on a
particular environmental problem are
we brought up short by the need  for
additional data? One need only
remember how  often in recent years we
have hesitated in our response to such
major issues as  acid deposition, urban
and regional air pollution, stratospheric
ozone depletion, ground-water
contamination by toxic chemicals, or
the Greenhouse Effect because someone
argued that we  needed more data.
  I certainly do not intend to argue that
we should act on these or any other
environmental problems without
adequate information; I firmly believe
that effective solutions must  be
grounded on a solid scientific basis.
Instead,  I propose to raise the question
of why we have allowed ourselves to
create some of these environmental
dilemmas in the first  place, sometimes
even when we possessed knowledge
that should have forewarned us of the
consequences.
  It appears to me that we have made at
least three major kinds of mistakes in
the past that have led us into difficulty.
The first is implicitly assuming that the
law of the conservation of matter has
somehow been revoked instead of
remembering that whatever we put into
the environment ends up somewhere.
The second is ignoring the complexity
of  biological systems and (in our
ignorance about them) being surprised
when they respond in an unexpected
way. Our  third type of mistake is failing
to  include, within the boundaries of our
concern, all ecological and social
systems affected by our proposed
actions.
The modern era of environmental
concern is often said to have its origins
in the 1962 publication of Silent Spring.
This book created a storm of outrage
among chemists and the agricultural
community with its attack on
then-current practices of pesticide use.
  Rachel Carson was a gifted writer, but
with only a master's degree in biology,
she was not considered by most to be a
real scientist. Yet, despite her somewhat
problematic opening chapter and an
accusatory tone, she did what no other
scientist had done. She simply asked
what had happened to the 300 million
pounds of chlorinated pesticides and a
comparable amount of other agricultural
chemicals being sold each year. Her
answer, documented by the wealth of
data available in the specialty literature,
demonstrated that persistent chemicals
do not simply disappear, but due to
such phenomena as bioconcentration,
often end up in surprising places in
large and damaging amounts.
  In our well-intentioned desire to
promote agricultural productivity and
protect public health through the use of
pesticides, we had  ignored all three
principles: we failed to ask where these
vast quantities of sprayed pesticides
might go; we ignored the  question of
how they might interact with living
organisms; and we narrowly defined our
region of concern to a particular sprayed
field or forest.
  In the decade and a half following the
publication of Silent Spring, several
other issues  involving the massive
release of substances to the environment
were the focus of environmental policy
concern and debate.
  In 1974, Sherwood Rowland and
Mario Molina examined the fate of the
nearly one million tons of
chlorofluorocarbons (CFCs) that  were
being produced annually. These
amazing and versatile chemicals were
inert, non-toxic, and non-flammable,
which made them suitable for a
remarkable variety of industrial and
commercial  uses. Because of their
benign properties, there was virtually no
concern that the majority of each year's
production was being released directly
into the atmosphere.
  What Rowland and Molina  found was
that good data existed on the  production
of CFCs and that new measurements of
their atmospheric concentration (of the
order of several parts per trillion, I might
add) suggested that most of the released
CFCs  remained in the atmosphere.
Having asked the conservation-of-matter
question, these researchers then jumped
to principle number three and asked
questions about the proper bounds of
the  system and of our concern. They
realized from the work of others that
chlorine might cause depletion of
stratospheric ozone and, from  their own
research, that CFCs, while stable in the
lower atmosphere, could be broken
down by high-altitude ultraviolet
 radiation to release chlorine right in the
 midst of the ozone layer. In this case,
 the biota are affected by the biologically
 damaging ultraviolet radiation that can
 penetrate a depleted ozone layer.
  Subsequent  data-gathering has largely
confirmed the principal concern that
 chlorine released into the upper
 atmosphere by CFCs can deplete the
 earth's protective ozone shield. What
has come as a surprise is the sudden
appearance of the Antarctic ozone hole,
produced by a previously unanticipated
chlorine pathway, and mid-latitude
depletion occurring much more rapidly
than was previously predicted.
  Our heavy reliance on fossil fuels has
 presented us with a plethora of
 unanticipated  environmental problems
that all arise because we implicitly
assume that there is no cost, nor any
problem, involved in our release of vast
quantities of carbon dioxide and other
gases into the atmosphere. From
 an engineering perspective, we quite
 reasonably drew the boundary of our
 concern around the technology we were
 developing. We treated the environment
 as a continuing source of fuels and
 materials and a limitless dump for the
 products of combustion and the waste
 heat that must be released in order for
 our engines to run.
  Would we have made different
 technology choices had we known a
 century ago what we know today? I am
 certain that "automobility" would  still
 have occurred. But would  we have
 chosen the gasoline-powered internal
 combustion automobile had we realized
that, even when meeting today's U.S.
 fuel efficiency  standards,  the average
 new American car releases
 approximately its own weight in carbon
to the atmosphere each year? Would we
have at least favored a different
 propulsion technology that could avoid
the large amounts of carbon monoxide,
nitrogen oxides, and volatile organic
compounds that—despite impressive
pollution control technology—acidify
precipitation and prevent approximately
70 metropolitan regions from  meeting
air quality standards?
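  The car-weight comparison can be checked with rough arithmetic. In the sketch below, the annual mileage, fuel economy, and fuel carbon content are round illustrative assumptions, not figures from this article:

    # Back-of-the-envelope check: does a car emit roughly its
    # own weight in carbon each year? All inputs are assumed.
    miles_per_year = 10_000    # typical annual driving
    miles_per_gallon = 28      # near the late-1980s standard
    kg_per_gallon = 2.8        # gasoline weighs ~6.2 lb/gallon
    carbon_fraction = 0.87     # gasoline is mostly carbon by mass

    gallons = miles_per_year / miles_per_gallon
    carbon_kg = gallons * kg_per_gallon * carbon_fraction
    print(round(carbon_kg))    # ~870 kg, about the weight of a small car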
  In our enthusiasm to solve the
remaining vehicle air pollution problem
by shifting to alternative fuels such as
methanol, have we once again drawn
the boundaries of the problem too
narrowly, ignoring both the local air
and water problems associated with any
synfuels program and the greenhouse
(and hence biological) implications of
large increases in carbon dioxide release
should we manufacture methanol from
coal?
  Given the increase in  the average
number of miles driven  per year and the
large amount of highly polluting idling
time arising from traffic congestion, a
new fuels strategy may slow the
deterioration of local air quality a bit,
but in my view this is unlikely to be a
real solution to the problem.
  Let me close with an example, one we
are just beginning to address, that
illustrates my point.
Human-induced global climate change
is believed  to be caused by the trapping
of the earth's radiant heat by increasing
quantities of carbon dioxide, methane,
CFCs, and other gases. Of these, the
only one whose concentration can
potentially be lowered once it is "out of
the bottle" and in the atmosphere is
carbon dioxide.
  Since green plants absorb carbon
dioxide during photosynthesis, several
proposals have been  made to increase
the growth rate of plants in order to
offset carbon releases from fossil  fuel
combustion. The more conventional
plan is to halt net deforestation to
ensure the existence of large carbon
stocks on land and to replant areas that
have already been cut in order to
remove atmospheric carbon dioxide by
sequestering it in new growth.
   More recently, some have called for
 increasing the oceanic photosynthetic
 rate through ocean fertilization. This
 would have the advantage of
 automatically sequestering large
 amounts of dead organic matter that
 sinks to the ocean floor.
   Before such a massive  project is
undertaken, I would suggest that we
 carry out an environmental impact
 assessment since we have learned the
 hard way from  inadvertent
 eutrophication  experiments on a more
 limited scale that the response of the
 biota can often surprise us.
   One question that  particularly needs
 to be addressed is whether some of the
 microorganisms in the oxygen-deficient
 waters into which these dead plants
 might sink are capable of converting
 biomass into methane. Were this the
 case, it would have the unfortunate
 consequence of ultimately converting
 one molecule of atmospheric carbon
dioxide into a molecule of methane,
 which, after bubbling up  and entering
 the atmosphere, is capable  of producing
 20 to 30 times the global  warming of the
 carbon dioxide it replaced.
  It is clear from these examples, as well
as many others, that the development of
a sound, high-quality data base has been
critical to our understanding and
treatment of environmental problems.
What is also illustrated, however, is that
if we fail to analyze our actions
properly, possessing data will not by
itself avert major environmental problems.
Data may provide us with answers, but
only if we can formulate the right
questions.
(Dr. Moomaw directs the Climate,
Energy, and Pollution Program at the
World Resources Institute in
Washington, DC. He is a physical
chemist who was formerly on the
faculty of Williams College, where he
was Professor of Chemistry and directed
the Center for Environmental Studies.)
Taking
the  Pulse
of  the  Planet
by Francis Bretherton
A second dust bowl in the nation's
   heartland, sweltering humidity all
summer long in New York City, balmy
days in Alaska: these are examples of
climate changes that might be in store
for our children and
grandchildren—consequences of the
burning of coal and oil worldwide.
Other symptoms of the ever-increasing
impact of human activities on the global
environment include acid rain, the
ozone hole, desertification (productive
land turning to desert), and the
destruction of tropical forests.
  Scientists know that these changes are
all interrelated. They cannot be
understood or reliably predicted
without considering the earth as a
complete system of interacting parts: the
atmosphere, the oceans, the ice  sheets
and glaciers, the solid earth, the soils,
and the biota. Yet  our knowledge of
how this system functions is woefully
incomplete.
  For example, there is  consensus that
the global  average temperature will
increase over the next century, but there
is no agreement about the changes this
will bring in individual regions. Thus,
we must develop new programs of
research to acquire new knowledge. We
must monitor the vital signs of this
planet that is our home. The earth must
be placed  in "intensive  care."
  Two years ago, the World
Commission on Environment and
Development issued its  aptly titled
report, Our Common Future, which
underscored the mutual
interdependence of world population,
environmental problems, and economic
imperatives worldwide. The common
future of the world's citizens, according
to the Commission's findings, will
depend on the success of internationally
coordinated efforts to achieve
"sustainable development."
  The basic message of  Our Common
Future, concerning the need for
solutions that take into account  the
global interrelationships of ecological
and economic concerns, is more urgent
than ever.  We need to develop new
ways of thinking about basic national
policies on the environment, energy,
industrial  development, and foreign aid.
We need to recognize the synergisms
and conflicts of our choices on a global
scale and work with other nations to
minimize the adverse consequences of
our common actions.
  To feed the world population and
satisfy its needs for a decent way of life,
many compromises and
accommodations will be necessary, as
well as resolute action, where feasible,
to mitigate the most undesirable
impacts. Protecting our global
environment cannot be successfully
accomplished in isolation but must be
approached as an integral part of the
economic and technological realities
and value systems of the peoples of the
world.
  Central to any dialogue about these
issues must be reliable environmental
information. We need to document the
changes that are actually occurring on a
global scale, understanding how much
is natural variability and how much is
due to human activities. We need
proven models that can be used to
examine the effects on the environment
of different policies, and to make
credible predictions about specific
regions and nations. Developing such
information will be a tremendous
challenge, requiring new modes of
cooperation among scientists in
different disciplines, among various
federal agencies, and among the nations
of the world. Although the United
States and other nations are  beginning
this effort, a sustained long-term
program will be required, with few
quick returns.
  Monitoring the earth's vital signs
must start with observation.
Measurements are needed of such
variables as the temperature and rainfall
all over the globe, together with the
action of clouds, winds,  ocean currents,
the extent of sea ice, vegetation cover,
and many other factors that  determine
climate. Also important is the
measurement of increasing
concentrations in the atmosphere of
"greenhouse" gases that act to warm the
earth by retaining  heat that would
otherwise be radiated into space.
  Also crucial, but more difficult to
document, is the state of health of
ecosystems such as forests and
grasslands as climate changes and
nutrients in the soil are exhausted.
Fires, droughts, pest epidemics, and
wind storms are major influences on
what species flourish and how the
system reacts to change; the frequency
of these catastrophic events is sensitive
to land management practices like
harvesting trees and controlling fires.
  Measuring these variables requires
scientific instruments at many sites
around the globe, combined with the
perspective  that can be obtained from
earth-orbiting satellites, which are
maintained  consistently over many
decades. Some of the systems required
are already deployed for other purposes,
including  weather prediction. However,
modifications are necessary to achieve
the accuracy needed to document global
change. In other cases, the  most critical
information can be obtained by small
groups of dedicated scientists, provided
they are given suitable encouragement
and support. In some situations,
research is required to develop suitable
methodology.
  In all cases a major effort is needed to
establish adequate monitoring systems.
Accomplishing this will be difficult and
expensive, requiring an indefinite
commitment. But we must make this
investment because our future depends
on  it.
  Measurements by themselves are not
enough. They must be  integrated into  a
coherent framework of information of
established reliability. Enormous
amounts of complex data have to be
organized  and retained for  future use
because we  are certainly not wise
enough to know just what will turn out
to be critical in the years to come.
Currently, we cannot even  evaluate the
quality of  some data. Our successors 20
years from now will have to decide
whether the changes they observe are
real or artifacts of either the way the
measurements were made or the way the
data were processed. We continually find
that critical information from the past
was not recorded or has been lost. We
must not repeat these mistakes.
  Computer models play an essential
part in this integration process. These
models  range from simple conceptual
relationships among variables  in some
part of the earth system calculated on a
personal computer spreadsheet to
models  providing comprehensive
simulations of the weather systems and
ocean currents of the world. Models
have many different roles, each
requiring different input information.
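  At the simple end of that spectrum, a zero-dimensional energy-balance model fits in a few lines. The sketch below (illustrative Python; the albedo and effective-emissivity values are textbook round numbers, not from this article) balances absorbed sunlight against outgoing infrared radiation:

    # Zero-dimensional energy balance at equilibrium:
    #   (S0 / 4) * (1 - albedo) = epsilon * sigma * T**4
    SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
    S0 = 1361.0       # solar constant, W m^-2
    ALBEDO = 0.3      # planetary albedo, textbook value
    EPSILON = 0.61    # effective emissivity; below 1 because
                      # greenhouse gases return some radiation

    absorbed = S0 / 4 * (1 - ALBEDO)
    T = (absorbed / (EPSILON * SIGMA)) ** 0.25
    print(round(T, 1))   # about 288 K, roughly 15 degrees C

Lowering EPSILON, a crude stand-in for a thicker greenhouse, and rerunning the calculation warms the computed surface; comprehensive models elaborate this same balance across the whole globe.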
  Models encapsulate what is  known
about how the different parts of the
earth system function, serving as the
common language in which specialists
in different scientific disciplines can
communicate. They can also be used to
assimilate many different kinds of
measurements into a self-consistent
analysis, just as daily weather patterns
are inferred from isolated observations
of atmospheric pressure and
temperature. They are used to  simulate
observed phenomena, as a test of the
model itself. Once tested, they are used
in experimental mode to examine cause
and effect relationships—for example,
showing what would happen if the
burning of fossil fuels were reduced by
one quarter.
  Finally, models are used to predict
what actually will happen, including
the effects of natural variability, given
our best available estimate of the
present state of the system and of future
inputs. At present, we have only
separate subsystem models, each
describing an isolated piece of the earth
system. A major effort will be required
over the next decade to  integrate these
pieces into a tested comprehensive
model that can be used  for specific
predictions.
  Caring for the earth will require
translating this predictive capability
into effective policies. For example,
estimates of regional changes of
temperature and rainfall will  have to be
combined with specialized models of
river flow and economic development to
estimate the impact upon  water
resources. The impact on agriculture of
changes in climate and of enrichment in
atmospheric carbon dioxide depends
also on the breeding  of new crop
varieties and prices on the world
market.
  Analyses of the processes of industrial
societies will be required to project
plausible scenarios for the emission of
greenhouse gases and other air
pollutants,  including the secondary
effects of various control strategies. It is
essential  to develop methodologies for
making these analyses objective,
relatively uncontaminated by the  value
system of one particular social group or
nation. As the interrelationships
between populations, economic
development, and the environment are
taken seriously, evaluating policy
options will become  much more
difficult.
  Placing the earth in intensive care  is
not just a scientific problem,  although
science has a crucial role to play. We all
have to develop an awareness of the
consequences of our  collective actions,
of the cumulative effect  of the simple
choices we make in our everyday life as
they are multiplied by the billions of
people on this planet. We will never
know the consequences with certainty.
However, we must act now where
action is  clearly needed, while
continuing to develop a better basis of
knowledge and understanding for the
future. It will be a long and difficult
road.  Yet, facing this challenge might
possibly draw the peoples of  the world
closer together as they face their
common future.
(Dr. Bretherton is the Director of the
Space Science and Engineering Center
and a professor in the Department of
Meteorology at the University of
Wisconsin-Madison.)
Tracking  the
Greenhouse
 by  Pieter P. Tans
    As a graduate student, I ran across a
    book on climate that grabbed my
imagination. Written in 1971, its
shocking message was that we humans
were busily changing the earth's
climate. Since then, my research has
focused  on trying to find  out how man
has apparently been able  to "compete
with the sun" in influencing the global
temperature and other properties of our
world climate.  Part of this work
concerns keeping track of the so-called
"greenhouse gases," which have been
increasing with the expanding scope of
human activity.
  Energy from the sun amounts to more
than 10,000 times the heat liberated by
coal, oil, natural gas, and wood that
people burn worldwide. Nevertheless, it
is the combustion that impacts the
earth's global heat balance. Once in the
atmosphere, the combustion product,
carbon dioxide, stays there for a long
time. Year after year it absorbs the
infrared heat radiation emanating from
the earth, thereby causing the surface to
get warmer.
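  The 10,000-fold figure is simple to verify. A rough sketch, using the standard solar constant and an assumed ten terawatts or so of worldwide fuel burning (a typical late-1980s estimate):

    import math

    # Sunlight intercepted by the earth versus heat released
    # by human fuel burning.
    S0 = 1361.0          # solar constant, W per square meter
    R_EARTH = 6.371e6    # earth's radius, meters
    solar_watts = S0 * math.pi * R_EARTH ** 2   # ~1.7e17 W

    human_watts = 1.0e13   # ~10 TW of fuel burning, assumed
    print(round(solar_watts / human_watts))     # over 17,000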
  In the last 10 years, scientists have
come to realize that it is not solely
carbon dioxide that is making the
earth's surface warmer. Other
greenhouse gases doing the same
include, most importantly, methane,
nitrous oxide, and the
chlorofluorocarbons (also held
responsible for the enormous springtime
stratospheric ozone decreases over
Antarctica). Like carbon dioxide, these
greenhouse gases also remain in the
atmosphere for a long time; they are all
increasing in concentration due to
human activities.
  Paradoxically, without greenhouse
gases the earth would be too cold for
life to exist. The Greenhouse Effect
seems to have kept the earth warm
enough for life to develop over billions
of years, but now we may be creating
too much of a good thing. Studies of air
bubbles buried deep underneath the
Antarctic and Greenland ice show that
our present atmosphere contains higher
levels of greenhouse gases than during
the last 150,000 years. It is virtually
certain that disturbance of the
atmospheric heat balance resulting from
these increases will lead to some
profound changes in climate because
the heat balance is the ultimate driving
force of climate elements like
temperature, humidity, and wind. One
climate element in particular,
cloudiness, has, in turn, a great impact
on the heat balance.
Carbon dioxide levels recorded at this monitoring installation show that the increase is accelerating.
   The changes being produced by
 humans are competing with natural
 fluctuations that appear to be, so far, at
 least as large. At this time, we are still
 not able to explain many important
 features of today's observed climate or
 predict precisely which regions might
 become warmer, wetter, or drier during
 the next decades.
   Part of the climate prediction puzzle
 is that scientists don't have precise
 information about the amount of
 methane produced from various sources,
 which include rice paddies, wetlands,
 livestock, landfills, and fossil fuel
 burning. We can only begin to estimate
 the range of future methane
 concentrations. At the same time, we
 don't have all the answers  about
 controlling methane. Likewise, the
 separate roles of the oceans and the
 land plants in determining the carbon
 dioxide concentration have still not
 been defined in quantitative terms.
   To find answers to such questions, an
 international scientific effort is making
 precise measurements of greenhouse
gases all over the world. There are
 atmospheric observatories ranging from
 Alert at 81 degrees north latitude in the
 high Canadian  Arctic all the way to the
 Amundsen-Scott station exactly at the
 South Pole. Additional air samples are
 collected regularly in glass flasks at
 many more sites and sent back to
 laboratories for analysis. For example,
 the Geophysical Monitoring for Climatic
 Change division of the National Oceanic
 and Atmospheric Administration
 (NOAA) operates four observatories and
 collects air samples from over 20
 additional sites. Thousands of flask
 samples per year are analyzed in the
 division's Boulder, Colorado, laboratory.
 In a tightly choreographed sequence, the
 flasks are hooked up to three different
 analyzers,  then prepared again for
 shipping, so that they can be  used for
 the next air sample.
   In earlier times, after chemists
 discovered that the atmosphere is made
 up primarily of a mixture of nitrogen
 and oxygen, scientists climbed into
 balloons to find out that the
 composition does not change with
 altitude. As a precaution, they took with
 them a small bird in a cage, hoping the
 animal would pass out before they
would if the upper atmosphere turned
 out to be less than healthy for breathing.
 Determinations of carbon dioxide were
 first made around 1880. The famous
 Swedish chemist Svante Arrhenius
 hypothesized before the turn of the
 century that carbon dioxide played an
 essential role in keeping the earth
 warm.
  Today observatories and sampling
 sites are carefully located to avoid local
 contamination of the air measurements.
 The oldest greenhouse-gas measuring
 installation is at NOAA's observatory
 high on the Mauna Loa volcano in
 Hawaii. It is surrounded by many miles
 of bare lava rock, which minimizes the
 effects of local vegetation on the
 measurements.
  The modern measurements of carbon
 dioxide were started at Mauna Loa in
 1958 by David Keeling of the Scripps
 Institution of Oceanography. The
measurements recorded there reflect
repeated seasonal oscillations
superimposed on a steady increase, which
has accelerated since the measurements
began. The oscillations result from
photosynthesis and respiration of land
plants in the northern hemisphere; the
increase is mainly due to the growing
combustion of coal, oil, and natural gas.
  The Mauna Loa station records "the
 breathing of earth." During the growing
 season, plants take up carbon dioxide
 from the air and with sunlight convert
 it into organic material, while in the
 other seasons respiration and decay
 predominate.
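  The record is thus a seasonal cycle superimposed on a rising trend, and the two are easy to separate. A minimal sketch (illustrative Python run on synthetic monthly values, not the actual Mauna Loa data):

    import math

    # Synthetic monthly CO2 record: a rising baseline plus a
    # 12-month oscillation, mimicking the Mauna Loa curve.
    months = 240   # 20 synthetic years
    co2 = [340 + 0.12 * m + 3.0 * math.sin(2 * math.pi * m / 12)
           for m in range(months)]

    # A 12-month running mean removes the seasonal cycle,
    # leaving the long-term trend.
    trend = [sum(co2[m:m + 12]) / 12 for m in range(months - 12)]
    seasonal = [co2[m] - trend[m] for m in range(len(trend))]

    print(round(trend[-1] - trend[0], 1))           # ppm gained
    print(round(max(seasonal) - min(seasonal), 1))  # ppm swing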
  The South Pole station is farther from
 human civilization than any other. The
 buildings are continuously being buried
 by unrelenting snow drifts.  A few
 hundred thousand years from now, the
 slowly moving glacier will eventually
 dump the original buildings of the
 station, now abandoned and buried in
 the ice, into the ocean. The  present
 station, the second, is already much
 deeper into the ice than when it was
 first built. During the brief Antarctic
 summer, 80 scientists perform special
 experiments and install equipment, but
 in  the winter only about 20  scientists
 remain.
  Since the South Pole has the cleanest
 air on earth, the worldwide  increasing
 trends  of many greenhouse gases show
up very clearly. Records of its methane
concentration, for example,  show a
 seasonal cycle due to photochemical
 destruction of methane in the
 atmosphere. There is good evidence that
 the steady upward trend  is due mainly
 to human activities. Methane is about
 three times higher today  than a few
 hundred years ago, and six times higher
 than during the last ice age.
  One of the important developments in
 the last decades is the growing
 realization that the chemical
 composition of the atmosphere bears the
 heavy imprint of the existence of life on
 the surface of the earth. Living
 organisms in the sea and  on  the land are
 responsible for the presence  of oxygen.
 They lower the concentration of carbon
 dioxide, and they emit a  host of minor
 atmospheric constituents  of which
 methane is one. We have  learned by
 analyzing gas bubbles in  ice  that levels
 of many gases in the atmosphere varied
 considerably with the coming and going
 of ice ages.
  Yet there is still much  to learn from
 these measurements and  other  research.
 At present, the Greenhouse Effect from
 these variations in the gas
 concentrations is calculated to  be much
 weaker than the temperature
 fluctuations actually observed in the ice
 core record. Either the greenhouse
 warming is amplified many times by
 changes in circulation and cloudiness,
 or the variations in the greenhouse gases
 themselves are relatively  unimportant
 compared to other processes  that are
 still not sufficiently understood. In the
 latter case, the gas concentrations would
 have primarily responded to  the new
 conditions that living organisms were
 experiencing when the world slipped
 from an ice age into a warm period, or
 vice versa.
  Scientific understanding of the
 climate, and the role played by
changing greenhouse gas concentrations,
 is still far from complete.  The pressure
 is on environmental scientists to attain a
much better grasp of what controls the
earth's climate and the greenhouse gas
"budgets" in a relatively short time.
Decisions made today will continue to
have an impact on greenhouse gases a
hundred years from now.  I hope the
measurement of greenhouse gases can
contribute to decisions that will be both
rational and protective.
(Tans is a scientist at the Cooperative
Institute for Research in Environmental
Sciences, University of Colorado.)
China's  Environment:
A   Special  Report
 by Changsheng Li
    Chinese civilization has existed for
more than 4,000 years. It has created a
remarkable cultural history. However,
the use of natural resources in the
development of the nation has sometimes
had negative impacts. In fact, soil
erosion and desertification caused by
deforestation and land misuse affected
economic growth in the basins of the
Yellow River and Liao River in China
even during ancient dynasties.
  In the last 100 years, social
instability in China has accelerated the
degradation of the environment. During
wars, natural pest infestations, or
political movements, the conservation of
natural resources and the environment
was frequently minimized as a priority.
  China missed an opportunity to start
controlling its exploding population in
the 1950s. Due to the misdirection of
population policy in the 1950s, 1960s,
and early 1970s, the population
increased from less than 500 million at
the beginning of the 1950s to one billion
in the early 1980s. The pressure of
population growth accelerated the
consumption of natural resources and
the deterioration of the environment,
resulting in soil erosion, desertification,
deforestation, shortages of fresh water,
and pollution. The pollution of air,
rivers, and lakes was apparent in the
industrial areas of China by the late
1950s, but the problem was not
recognized by our society until the early
1970s.
  In China, we have inherited a thorny
environmental legacy that we must
grapple with. In 1973, to initiate the
management of environmental protection
in China, the central government set up
a specific agency, the Office of
Environmental Protection, in the State
Council to coordinate the relevant
ministries on affairs concerning
environmental protection. During the
last 15 years, the office expanded and
improved, and in 1988 it finally became
an independent government agency, the
National Environmental Protection
Agency (NEPA), in charge of policy
analysis, developing and implementing
regulations, and monitoring
environmental management in the country.
  It is not hard to imagine the feeling
within the ranks of the agency when it
began confronting China's
environmental problems. There were a
variety of serious problems and limited
financial resources. Mr. Qu Geping, the
director of NEPA, and his colleagues
made pollution their priority and
coordinated with other agencies to deal
with this and other environmental
problems. The agency has played  an
important role during the last 15 years
in monitoring environmental quality,
organizing research programs, increasing
public awareness, setting regulations,
and initiating other aspects of
environmental management  in China.
The perspective on environmental
problems in China has become clearer
and broader through the agency's work.
  Visitors to China, especially in winter
or springtime, can scarcely fail to  be
aware of the air pollution  in urban or
industrial areas. In fact, air pollution
has been designated a priority of
pollution control in China by NEPA.
Epidemiological studies have shown
significant differences in the incidence
of respiratory diseases, including lung
cancer, between urban and adjacent
rural areas around most of the cities in
China.
  The main source of air pollution in
China is coal combustion. The Chinese
consume about 580 million tons of coal
annually as fuel, including 430 million
tons for industrial use and 150 million
tons for domestic use. The pressure of
market demand is so high, and facilities
for processing coal are so deficient, that
75 percent of the raw coal flows directly
into plant boilers or home stoves
without washing or other processing.
  The "dirty" coa) contains, on average.
23 percent ash and 1.7 percent sulphur.
Dust and sulphur dioxide are the major
air pollutants in most of the urban  or
industrial areas in China. Although
participate removal devices have been
installed in most  of the modern plants.
there are still great numbers of sources,
including small factories and  domestic
stoves, which emit dust into the air.
Sulphur dioxide emissions control  is
progressing somewhat slowly because of
the high cost involved. To reduce
serious air pollution during heating
seasons, emphasis is placed on
developing central heating systems to
replace the hundreds  of thousands  of
small  heating boilers or stoves in urban
areas.
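  Those figures hang together arithmetically. A back-of-the-envelope sketch, assuming only that each ton of sulphur burned yields two tons of sulphur dioxide (the ratio of their molecular weights):

    # Rough sulphur balance for Chinese coal combustion,
    # using the figures given in the text.
    coal_tons = 580e6      # tons of coal burned per year
    sulphur_frac = 0.017   # average sulphur content of raw coal
    so2_per_s = 64 / 32    # SO2 weighs twice as much as its S

    so2_ceiling = coal_tons * sulphur_frac * so2_per_s
    print(round(so2_ceiling / 1e6))   # ~20 million tons if all
                                      # sulphur escaped to the air

The roughly 15 million tons actually emitted, cited below, would imply that about three quarters of the coal's sulphur reaches the air, plausible when so little of the coal is washed.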
  Meanwhile, working in concert with
other agencies in charge of energy
resources, NEPA is going to set up a
long-term program to encourage the
processing of raw coal, including
washing, briquetting, and gasification.
Coal is and will continue to be the
major source of energy in China for
some time, even though new projects
using nuclear or hydropower are being
planned or considered.
  With large amounts of sulphur
dioxide (15 million tons) emitted every
year, it is not surprising to find acid
precipitation in China. Monitoring
during the last 10 years has found that
acid rain occurs mainly in southern
China, although there is no obvious
difference between the southern and
northern cities in terms of sulphur
dioxide emissions. The high content of
ammonium and alkaline particles,
including clay and weatherable
minerals, in the air plays a role in
buffering atmospheric acidity in the
northern part of China.
  Acid rain is most common in the
Sichuan and Guizhou Provinces of
southwest China, where high-sulphur
coal, a humid climate, and acidic soils
coincide. The ecological impact of acid
rain is not yet clear, and studies will
continue. Like most countries in the
world, China is inclined to wait and see
whether new scientific evidence emerges
before taking serious action to reduce
emissions of sulphur dioxide.
  Recently, there  has been a new
awareness of the impact of carbon
dioxide and other "greenhouse" gases
on the global climate. Several groups of
senior scientists have been organized
within the Chinese Academy of
Sciences, as well as the NEPA system,
to  initiate interdisciplinary studies in
China.  China will be joining the
relevant international programs. Since a
large portion of the world's fossil fuel  is
consumed in China, its action should be
significant.
  Water pollution is another big issue in
China.  An investigation of the total
length of 55,000 kilometers of rivers in
China was made in 1982 and 1983. The
study found 85.9 percent of our river
length unsuitable for drinking or
fishing; 47 percent did not meet
national standards; 23.7 percent was
unsuitable for irrigation; and 4.3 percent
was found to be severely polluted. The
study also showed that pollution was
more serious in the branches than  in the
main streams of the big rivers. The
dominant pollutants—ammonium,
phenol, oil,  and other organic
pollutants—were found in most of the
polluted rivers, especially in the
sections around big cities.
During the last five years,
small factories have
proliferated in the
countryside.
  The lack of adequate facilities for
municipal sewage treatment is a
significant  problem  for most of the cities
in China. Non-point source pollution
makes the problem  worse, especially in
southern China, where many lakes and
estuaries are threatened with
eutrophication.
  The application of chemical fertilizers
and pesticides has rapidly increased in
farming areas in order to maintain high
yields during the last two decades.
According  to our statistics,  40 million
tons of chemical fertilizers  and 500,000
tons of pesticides were applied to farm
fields in 1985. Legislation and
regulations are being initiated to
encourage the recycling of water
resources in industry and agriculture, to
reduce the total  amount of waste water.
In the northern part of China, certain
recycling practices  have been
encouraged since the 1970s, as a
strategy for solving both the problems of
sewage treatment and the shortage of
irrigation water.
  Ground water is the  main source of
drinking water for most cities in
northern China,  and overdraft is a
common problem. For  example, the
water table has been lowered at the rate
of about 0.5 to 1 meter per year in  the
Beijing area during the last 20 years. At
the same time, there is a  shortfall of
clean water supplies to meet the
demand of our cities, and the shortage
of clean water is casting a dark shadow
over urban development plans. The
concentration of nitrites and nitrates in
the ground water is also increasing. For
a number of reasons, regulations are
needed to control the consumption of
ground water.
  About 480  million tons of solid waste
are produced annually by industries.
Only 20 percent of  the solid waste is
recycled; most of it is disposed of by
landfilling. There are few incinerators in
China to treat hazardous waste.
Landfilling is also the major approach to
disposing of hazardous waste.
  To handle future pollution problems,
it will be necessary to establish a data
base to collect information on the
location, hydrogeology, and other
characteristics of existing landfill sites,
and on measures for preventing
leaching. A specific office in NEPA will
be set up soon to implement the
management of hazardous waste,
including the registration of chemical
production, use, and disposal; the new
office will also collect relevant
information for computerized data bases.
  During the  last five years,  small
factories have proliferated in the
countryside. As a result, China's gross
national product has been boosted
considerably. (The industrial output of
township enterprises increased  from $20
billion in  1983 to $08 billion in 1986.)
However, some of those small
enterprises have inadvertently caused
serious pollution around their locations.
For example, sulphur dioxide pollution
from small factories which produce
sulphur from pyrite nearly destroyed all
of the vegetation on the surrounding
hills. Moreover, the incidence of
occupational diseases is quite high
 among workers in these factories. To
 monitor and handle this new situation,
 NEPA is going to extend its
 management to the town level in areas
 where township enterprises are
 prevalent.
  Environmental pollution has depleted
China's natural resources at an
accelerated rate. For example, total
farmland area in China has decreased at
the rate of 822,000 acres per year as a
result of erosion, desertification,
industrial or municipal construction, and
pollution—including landfilling, which
has taken 1,310,000 acres of farmland
during the last 30 years.
 rivers, lakes, reservoirs, and ground
 water has reduced our capacity to
 supply clean water for industrial  and
 domestic uses. Further degradation of
 key resources would have profound
implications for sustainable
 development and would have negative
 impacts  on  the economy. These negative
 economic effects would reduce the
 revenue available to improve
 environmental quality.
  China faces a shortage of funds on the
 one hand and a  variety of severe
 environmental problems on the other.
 To handle this situation, the
government and the public recognize
that environmental management must
be enhanced through legislation and
regulation. The Constitution of  the
People's Republic of China states:

  The State protects and improves the
  living environment and ecological
  environment, and controls
  pollution and other public
  hazards. (Item 26 of the Constitution)

  Based on this statement, four acts
were issued by the People's Congress:
• The Environmental Protection Act
(September 13, 1979)
• The Marine Environmental Protection
Act (March 1,  1983)
• The Water Pollution Control  Act (May
11, 1984)
• The Atmospheric Pollution Control
Act (June 1, 1988)
The environmental regulations  China
has issued so far have covered air,
surface  water, marine water, irrigation
water, fishing  water, sludge for
agricultural use, pesticides, and noise.
NEPA is the main government agency in
charge of  implementing these
regulations, through its headquarters in
Beijing  as well as bureaus or divisions
at the province, city, district, and
county levels.
  The environmental laws and
regulations have played an important
role in moderating the pollution trend
in China during the last 10 years,
although there is  still some resistance,
possibly due to a traditional skepticism
regarding legislation. Educating the
public, industry, and government on
environmental legislation should be a
major future  priority. Environmental
policy analysis also needs to be
improved. Risk assessment, benefit-cost
analysis, and environmental and
ecological impact analysis are just
beginning to  be incorporated into the
process  of environmental regulation.
  "Prevention First" has been adopted
as the primary principle of
environmental policy in China. The
national social-economic development
plans reflect  this principle. For
example, during the sixth 5-Year-Plan
(the first half of the 1980s), $3.22 billion
was allocated for pollution control
facilities in new construction (0.36
percent of the total industrial output in
that period). In 1983, the
State Council issued a regulation
requiring every industrial ministry to
target 7  percent of its total investments
to reduce pollution  through  technical
innovation. The State Council also
required all provinces to  consider
environmental protection issues  in their
city reconstruction plans.
Environmental impact  statements have
been used in approving individual
engineering projects for years, resulting
in reduced pollution from new
enterprises.
  To increase the resources for
environmental protection, the principle
of "the polluter pays" is followed by
NEPA, which is in charge of enforcing
environmental regulations, collecting
fines, and managing funds, including
disbursements to industry to enhance
pollution monitoring and control.
  In facing environmental problems in
China, there is still a long way to go,
with many difficulties as well as
challenges to meet. But we have made a
beginning.

(Changsheng Li, Ph.D., is Senior
Scientific Adviser of the National
Environmental Protection Agency of
China (NEPA) and Deputy Director of
the Research Center for
Eco-Environmental Sciences, Chinese
Academy of Sciences (Academia
Sinica). He is currently visiting EPA
under a U.S.-China bilateral exchange
agreement.)
  Appointments
William G. Rosenberg is the
new Assistant Administrator
for Air and Radiation at EPA.
Before coming to EPA,
Rosenberg was chairman of
The Investment Group of
Ann Arbor, Michigan, and
Washington, DC, which is
engaged in the acquisition,
development, and financing
of income-producing real
estate. From 1977 to 1982, he
was" president of Rosenberg,
Freeman and Associates, an
Ann Arbor real estate
development and syndication
firm, specializing in low- and
moderate-income housing.
   Rosenberg was Assistant
Administrator, Energy
Resource Development, Federal
Energy Administration, from 1975 to
1977. He was a presidential
appointee on the Project
Independence Advisory
Commission, formed in 1974
to establish a national energy
policy.
   Rosenberg is a graduate of
Syracuse University and
holds a law degree and MBA
from Columbia  University.
He practiced  law in Detroit
from 1965 to  1969.
Edwin B. ("Ted") Erickson is
the new Regional
Administrator for Region 3.
  Erickson was Delaware
County, Pennsylvania,
Council Chairman,  with
responsibility for a  $150
million county budget and a
workforce of 2,500  people.
He joined the Council in
1982 and currently is
chairman of the Delaware
Valley Regional  Planning
Commission, founder and
chairman of Delaware County
Human Services Partnership,
and a member of the advisory
committee for EPA's Folcroft
Landfill/Tinicum Marsh
Environmental Study, which
is measuring the impact of a
landfill on the Tinicum
Wildlife Refuge.
  Erickson holds a doctorate
in biochemistry and
microbiology from Bryn
Mawr College, Bryn Mawr,
Pennsylvania. He joined the
faculty at Drexel University,
Philadelphia, in 1962. In
1969, he became an assistant
professor of biology at
Hamilton College in Clinton,
New York.
  He was the Director of
Public Health from  1973 to
1976 and Chief
Administrative Officer from
1976 until 1982  in Upper
Darby Township,
Pennsylvania. In 1983 and
1984, he served as liaison
between the Governor's
Office in the Commonwealth
of Pennsylvania  and the EPA
for the Chesapeake  Bay
Program.
Gordon L. Binder is the new
Chief of Staff in the Office of
the Administrator.
  A long-time associate of
William K. Reilly's, he had been
the Administrator's Assistant
at The Conservation
Foundation since 1974, and
at World Wildlife Fund since
the two affiliated in 1985. He
also served as a member of
Reilly's transition team.
  Binder earned a master's
degree in architecture from
the University of Michigan in
1972. From 1972 to 1973,
Binder was a staff member on
the Rockefeller Brothers
Fund Task Force on Land
Use and Urban Growth for
the Citizen's Advisory
Committee on Environmental
Quality.
  Binder also worked for the
Federal Architecture Project
of the National Endowment
for the  Arts.  He was a Loeb
Fellow in Advanced
Environmental  Studies,
Graduate School of Design,
Harvard University, from
1979 to 1980.
James P. Moseley has been
named as Agricultural
Consultant to the
Administrator.
  Moseley is owner and
general manager of Jim
Moseley Farms Inc., AgRidge
Farms, Moseley Genetics Plus
Inc., and Moseley Land Corp.
He is also chairman of the
Indiana Institute of
Agriculture, Food, and
Nutrition, a non-profit
organization that  promotes
agribusiness development,
and a member of  the board of
directors of the Farm
Foundation, a national
non-profit organization
which addresses agricultural
and rural problems.
  He is currently  a  member
of the Dean's Advisory
Council,  School of
Agriculture, Purdue
University, and chairman of
the Indiana Agricultural
Leadership Program. A 1970
graduate  of Purdue
University with a B.S. in
horticulture, Moseley has
served on EPA's
Ground-Water Workshop
Committee and is a past
president of the Indiana Farm
Management Association.
Clarice E. Gaylord has been
named the new Deputy
Director for Policy, Programs,
and Executive Resources in
the Office of Human
Resources Management.
  She came to EPA in 1984
as Director of the Research
Grants Program in the Office
of Research and
Development.
  In  1987, she was selected
for the Senior Executive
Service Candidate
Development Program. While
in the program she worked  as
Chief of the Risk Analysis
Branch in the  Office of Toxic
Substances, Chief Executive
Officer of the Office of
Compliance Monitoring in
the Office of Pesticides and
Toxic Substances, and
Director of the Policy  and
Management Staff in the
Office of Ground-Water
Protection.
  Gaylord implemented
EPA's Minority Summer
Intern Program, under which
minority college honor
students  work as researchers
in EPA labs. She received a
B.S. in zoology from UCLA
in 1965.  She earned a
master's  degree in zoology
from Howard University in
1967, and a doctorate  in the
same field from Howard in
1971.
  Gaylord then worked for
the National Institutes of
Health as a health scientist
administrator before joining
EPA in 1984. She has been
awarded two EPA bronze
medals for exceptional
service, as well as the Special
Achievement and Public
Service Recognition awards.
Dr. Gary J. Foley has been
appointed Director of EPA's
Atmospheric Research and
Exposure Assessment
Laboratory (AREAL) in
Research Triangle Park,
North Carolina.
  As Director of the
Environmental Monitoring
Systems Laboratory since
1988, Foley reorganized and
combined the lab with the
Atmospheric Sciences
Research Laboratory to create
AREAL. From  1982 to 1988,
he worked in the Acid
Deposition Research Program,
leaving the program as
Division Director.
  He started his career with
the Agency in  1973 with the
Control Systems Laboratory,
moving on to the Office of
Energy, Minerals, and
Industry in 1974. In 1976,
Foley left the Agency to work
for the Organization for
Economic Cooperation and
Development. He returned to
the Office of Energy,
Minerals, and Industry in
1979 and went on to become
Division Director in the
Office of Environmental
Processes and Effects
Research.
  Foley holds a master's and
a doctorate in chemical
engineering from the
University  of Wisconsin at
Madison. He has received
three EPA bronze medals for
exceptional service.
Martha R. Steincamp is the
new Regional Counsel for
Region 7. She previously
served as Acting Regional
Counsel since January 1988.
  Steincamp joined the
Agency in 1977 as a staff
attorney. She became
Associate Regional Counsel
in 1983 and Deputy Regional
Counsel in  1985.
  Steincamp received a B.A.
in political  science at Fort
Hays State University. She
earned a Juris Doctorate from
Washburn University in
1971. After completing law
school, she was Assistant
General Counsel for the
Kansas Corporation
Commission until 1975.
Before joining EPA, she was
an assistant professor of Law
and Society at the University
of Nebraska at Omaha for two
years.
Marcia E. Mulkey is the new
Regional Counsel for Region
3. She had been Chief of the
Air and Toxics Branch in the
Office of the Regional
Counsel for Region 3.
  Mulkey joined EPA  in 1980
as a General Attorney  in the
Pesticides and Toxic Substances
Division in the Office  of
General Counsel. She worked
for the U.S. Nuclear
Regulatory Commission as an
Attorney-Advisor from 1976
until joining EPA.
  Mulkey received a B.A.
from the University of
Georgia in 1967, and a
master's degree in 1968. She
was an assistant professor at
Western Illinois University
and debate coach from 1970
to 1973.
  A 1976 graduate of Harvard
Law School, she coached the
university's debating team
while earning her degree. She
has been awarded EPA's
silver medal for exceptional
service.
Editor's note: While this
issue was at the printer, EPA
Journal learned that F.
Henry (Hank) Habicht had
been confirmed as the
Agency's new Deputy
Administrator. A full report
will follow in the next issue.
Thinking. (Lake Mendota, Madison, Wisconsin)    Photo by Mike
Back Cover: The computerized CAMEO system
developed by EPA and the National Oceanic
and Atmospheric Administration provides
on-the-spot information about how to deal
with chemical emergencies. See article on
page 20.
