PROCEEDINGS
   EPA SCIENCE FORUM 2004:
Healthy Communities and Ecosystems
           June 1-3, 2004
  United States Environmental Protection Agency
        Mandarin Oriental Hotel
          Washington, DC

Table of Contents
Acronyms	  vii

Executive Summary	  x

Section I: Overview	  1

Section II:  Plenary Session	  2
Opening Remarks	  3
AAAS Environmental Fellows Session	  3
Plenary Addresses	  5
   EPA Administrator Plenary Address	  5
   EPA Region IV Administrator Plenary Address	  6
   OEI Assistant Administrator Plenary Address	7
   EPA Science Advisor and ORD Assistant Administrator Plenary Address	  7
   Environmental Council of the States Plenary Address	 8
   International Business Machines Corporation Plenary Address	  10
   Office of Science and Technology Policy Director Plenary Address	  12
   Question and Answer Session	  13
Closing Remarks	  13

Section III: Science and Innovation to Protect Health and Environment	  14
Advanced Remote Sensing	  16
   The Status of the 2001 National Land Cover Data	  16
   Evaluating Environmental Quality Using Spatial Data Derived from Satellite Imagery	  16
   Development of Landscape Indicators for Potential Nutrient Impairment of Streams
   in EPA Region VIII	  18
   Multi-Scale Remote Sensing Mapping of Anthropogenic Impervious Surfaces:
   Spatial and Temporal Scaling Issues Related to Ecological and Hydrological
   Landscape Analyses	  18
   LIDAR:  A Remote Sensing Tool for Determining Stream Channel Change?	  19
   The Use of Remote Sensing in the Detection and Removal of Chemical Weapons
   in Spring Valley	  21
   Questions and Answers	  22
Innovations in Risk Assessment: Improving Data Resources	  22
   The Need for Scientific Data in Regulatory Decision Making	  22
   The ATSDR Experience in Using the Supplemental Documents Database in
   Developing Toxicological Profiles	  23
   Distributed Database Approach to Sharing Data	  24
   The Chemical Effects in Biological Systems Knowledgebase	  26
   Questions and Answers	  27
Human Data in Risk Assessment	  27
   The Ethics of Research Involving Human Subjects	  27
   EPA Clinical Research:  Implications for Air Quality Standards	  29
   Research with Human Subjects:  Future Challenges and Opportunities	  30
   Questions and Answers	  32
Supporting Innovations in Science to Identify Children's Vulnerability to Environmental
Exposures	  32
   Children's Health and Environmental Exposures: The Most Important Unanswered
   but Answerable Questions	  33
   Highlights from the Columbia Center for Children's Environmental Health:
   Studying Air Pollution in Community Context	  34
   The National Children's Study	  35
   Wrap Up and Discussion	  37
Sustainability - Educating for the Future	  37
   Education for Sustainability Initiatives	  37
   Principles and Practice of Sustainability  Education in Schools	  39
   National Efforts in Sustainability Education	  40
   Building Partnerships for Sustainable Science Education	  41
   Questions and Answers	  42
Partnering with New York on Air Quality and Human Health: Issues, Challenges,
and Perspectives	  42
   Federal-State Partnerships for Enhanced Understanding of Air Quality and Health
   Relationships	42
   Environmental Public Health Tracking and the Public Health Air Surveillance
   Evaluation Project	  43
   NOAA-EPA's National Air Quality Forecast Capability	  44
   Air Quality:  A Regional Perspective	  45
   Air Quality Management and Challenges in New York State	  46
   Health Activities Related to Air	  46
State-of-the-Science Research on Swimming-Associated Health Effects and the
Translation of Health Data to Water  Quality Guidelines for Bathing Beaches	  47
   The National Environmental and Epidemiologic Assessment of Recreational Water:
   The Relationship Between Novel Indicators of Water Quality and Health	  47
   Epidemiology Study of Swimmers in Nonpoint Source Polluted Marine
   Recreational Waters from San Diego, California	  48
   Partnerships: Linking EPA Beach Research with State and Local Beach Programs	  50
   Questions and Answers	  51
Section IV: Using Science to Make a Difference	 52
Can You Hear Us Now? EPA's Role in Invasive Species Research and Management	 53
   Snakeheads, Green Crabs, and Other Nasty Things: An Overview of Invasive Species.... 53
   Office of Water and Aquatic Nuisance Species: What's Underway and What's Planned.. 54
   Rapid Assessment Surveys in Northeast National Estuaries: Identifying Marine
   Bioinvaders in Fouling Communities	 55
   Targeted Screening for Invasive Species in Ballast:  Genomic Approaches	 56
   Non-Native Oysters in Chesapeake Bay	58
Monitoring and Assessment to Protect Tribal Health and Ecosystems	 59
   Protection of Tribal Cultural Practices Through the Development of Native
   American Exposure Pathways	 59
   Towards a Better Understanding of Mercury Fate and Transport on the Fond du Lac
   Reservation: Monitoring Air, Water, Sediments, and Biota	 60
   Primary Production Study of Coastal Waters of the Bay of Fundy	 62
   Panel Discussion/Questions and Answers	 63
R-EMAP: The Application of EMAP Indicators	 63
   The Past, Present, and Future of the Regional Environmental Monitoring and
   Assessment Program	 63
   Southeastern Wadeable Streams R-EMAP:  Overview, Interim Findings, and Status	 64
   Maryland Biological Stream Survey: Science for Streams	 65
   Questions and Answers	 67
Great Places Demand Great Science	 67
   Defining Restored Water Quality, Allocating Load Caps, and Implementing
   Reduction Actions: Chesapeake Bay Lessons Learned	 67
   The Great Lakes:  Collaborative Science to Inform and Help Frame Policy	68
   Ecological Sustainability of the Gulf of Mexico: The Role of Science, Management,
   and Activism	69
   Wrap-Up and Discussion	 70
Looking into the Future of a Region	 70
   Ecological Forecasting:  An Introduction	 70
   A Weight-of-Evidence Approach to Projecting Land-Use Change and Resulting
   Ecological Vulnerability	 71
   Alternative  Scenarios and Land-Cover Change: Examples Using Nutrient Export	 72
   Statistical Modeling of Groundwater: Vulnerability in the Mid-Atlantic Region -
   Present and Future	73
   Forecasting Species' Distributions:  The Shape of Things to Come	74
   Putting it All Together:  Implications for the Mid-Atlantic Region in 2020	 74
   Questions and Answers	 76
Regional Research Partnership Program	 76
   Microbial Source Tracking: the Application of a DNA-Based Molecular Approach to
   Identify Sources of Fecal Contamination 	 76
   Land Cover Diversity Measured by Satellite as a Proxy for Biodiversity	 77
   The Relationship of Terrestrial Ecosystems to Manganese Emissions from
   Wood Burning	 77
   Panel Discussion/Questions and Answers	 78
Community Air Toxics Projects	 78
   Addressing Air Toxics at the Local Level	79
   Developing a Local HAP Inventory and Reduction Strategy in New Haven, CT	 79
   St. Louis Community Air Project	80
   Louisville 2004: Risk Management Actions	81
   Mobile County, Alabama Air Quality Study	82
   Questions and Answers	 83
Science to Support Decisions: Climate Change	 83
   Climate Vulnerability and Impact Assessments Can Provide Useful Insights
   and Guidance-Now	83
   The Feasibility of Conducting Climate Change Impacts Assessments:
   Opposing Viewpoints	85
   Use of Science in Gulf of Mexico Decision Making Involving Climate Change	86
   Alternative Approaches to Climate Change Impacts Assessments: Success Stories	87
   Questions and Answers	88

Section V: Delivering Science-Based Information to Decision Makers	 89
The Future of EPA's Environmental Indicators Initiative and Report on the Environment	 91
   Indicators of Healthy Communities and Ecosystems	 91
   ROE: Focus on Human Health and Ecological Condition Chapters (Overview of the
   Outcome Chapters)	92
   Environmental Public Health Tracking: Moving Into the New Millennium
   (Human Health Trends and Outcomes)	92
   CADDIS:  The Causal Analysis/Diagnosis Decision Information System	 94
   Questions and Answers	 94
Using Geospatial Tools to Make Program Decisions	 95
   OEI Support for EPA HQ Emergency Operations Center: Emergency Response
   Analyzer	 95
   Assessing Urban Growth and Land Cover Trends Using Remote Sensing Imagery and
   Landscape Metrics Gulf of Mexico Address	 96
   NEPAssist: A Web-Based Mapping Application for Environmental Review	 97
   Questions and Answers	 98
Delivering Consistent Information on Health and the Environment	 98
   National Biological Information Infrastructure: Collaborative Opportunities
   in Ecoinformatics	 98
   Environmental Information Exchange Network	  100
   EPA's System of Registries, A Foundation for Consistent Environmental Information...  101
   Questions and Answers	  102
Developing Science-Based Information for Coastal Systems	  102
   From Tropical Beaches to Fjords, An Overview of Western Coastal EMAP,
   Western Pilot Study	  102
   The Utility of NCA-type Monitoring Data for EPA Decision Making	  104
   Florida's Inshore-Marine Monitoring and Assessment Program	105
   National Coastal Assessment: A Successful State-Federal Collaboration in
   New Hampshire	105
   National Coastal Assessment: Approach and Findings in the Northeast	106
   National Coastal Assessment: Monitoring and Modeling in Support of TMDL
   Calculations	107
Scientific Computing	  108
   The Center of Excellence for Environmental Computational Science	  108
   Current Projects and High Performance Computing and Visualization Direction	  109
   Growing the Environmental Science Portal	  110
   Closing and Questions	  111
Healthy Communities—One Building at a Time	  111
   Indoor Air Quality: Knowledge Base and Gaps	112
   Indoor Environmental Research Base	113
   Delivering Technical Assistance	114
   EPA's New Indoor Air Quality Label	115
   Indoor Environments Program Strategy	  116
   Questions and Answers	  117
Net Environmental Benefit Analysis	  117
   Developing Consensus for Environmental Decision-Making in Emergency Response	  117
   Case Study of Isle Royale	  121

Appendix A: Meeting Agenda	  123
Acronyms
AAAS        American Association for the Advancement of Science
ADA         Americans with Disabilities Act
ATSDR       Agency for Toxic Substances and Disease Registry
ATtILA       Analytical Tools Interface for Landscape Assessments
CAA         Clean Air Act
CADDIS      Causal Analysis/Diagnosis Decision Information System
CDC         Centers for Disease Control and Prevention
CMAQ       Community Multi-Scale Air Quality
CREM        Council on Regulatory Environmental Modeling
DNA         deoxyribonucleic acid
DOD         Department of Defense
DOE         Department of Energy
DSSTox      Distributed Structure-Searchable Toxicity
ECOS        Environmental Council of the States
EIS          Environmental Impact Statement
EMAP        Environmental Monitoring and Assessment Program
EPA         Environmental Protection Agency
EPHT        Environmental Public Health Tracking
FDA         Food and Drug Administration
FIFRA       Federal Insecticide, Fungicide, and Rodenticide Act
GARP        Genetic Algorithm for Rule-Set Prediction
GIS          geographic information system
GPS         global positioning system
HAP         hazardous air pollutant
I-BEAM      Indoor Air Quality Building Education and Assessment Model
IBM         International Business Machines
IMAP        Inshore-Marine Monitoring and Assessment Program
IRB          institutional review board
IRIS         Integrated Risk Information System
ISO          International Organization for Standardization
LCD         liquid crystal display
Acronyms  (continued)

LIDAR      light detection and ranging
LOAEL      lowest observed adverse effect level
MACT       Maximum Achievable Control Technology
MBSS       Maryland Biological Stream Survey
MMT        methylcyclopentadienyl manganese tricarbonyl
MOUs       Memoranda of Understanding
NAAQS      National Ambient Air Quality Standards
NASA       National Aeronautics and Space Administration
NBII        National Biological Information Infrastructure
NCEA       National Center for Environmental Assessment
NCER       National Center for Environmental Research
NEBA       Net Environmental Benefit Analysis
NEPA       National Environmental Policy Act
NERL       National Exposure Research Laboratory
NESHAPS    National Emission Standards for Hazardous Air Pollutants
NHEERL     National Health and Environmental Effects Research Laboratory
NIEHS       National Institute of Environmental Health Sciences
NIH         National Institutes of Health
NLCD       National Land Cover Data
NOAA       National Oceanic and Atmospheric Administration
NOAEL      no observed adverse effect level
NPDES      National Pollutant Discharge Elimination System
NRMRL      National Risk Management Research Laboratory
NSRC       National Science Resource Center
NWS        National Weather Service
OEI         Office of Environmental Information
OMB        Office of Management and Budget
ORD        Office of Research and Development
ORIA       Office of Radiation and Indoor Air
OTOP       Office of Technology Operations and Planning
Acronyms  (continued)

OWOW     Office of Wetlands, Oceans, and Watersheds
P3         People, Prosperity, and the Planet
PAH        polycyclic aromatic hydrocarbons
PBPK       physiologically based pharmacokinetic
PCB        polychlorinated biphenyl
PCR        polymerase chain reaction
PHASE     Public Health Air Surveillance Evaluation
PM         particulate matter
PNEIR      Program Needs for Indoor Environmental Research
QPCR       quantitative polymerase chain reaction
ReVA       Regional Vulnerability Assessment
R-EMAP    Regional Environmental Monitoring and Assessment Program
ROD        Record of Decision
SAR        structure-activity relationships
SCCWRP    Southern California Coastal Water Research Project
SoR        System of Registries
STAR       Science to Achieve Results
TMDL      Total Maximum Daily Load
TRI        Toxics Release Inventory
USACE      United States Army Corps of Engineers
USDA       United States Department of Agriculture
USGS       United States Geological Survey
Executive Summary

The Environmental Protection Agency (EPA) presented the 2004 Science Forum: Healthy Communities
and Ecosystems on Tuesday, June 1, through Thursday, June 3, 2004, in Washington, DC. This Science
Forum highlighted EPA's scientific accomplishments, showcased EPA's commitment to quality science,
and demonstrated, through examples, the use of science in decision making and policy making. The
Science Forum also provided an opportunity for dialogue and interaction among EPA scientists, clients,
stakeholders, and colleagues with over 1,000 attendees at this event, including EPA program, research,
and regional staff; members of other Federal agencies; the scientific community; and the public.

The Science Forum consisted of a full-day session of plenary speakers and a review of the relationship
between the American Association for the Advancement of Science (AAAS) Environmental Fellows
Program and EPA, and three two-day breakout sessions. Each breakout session examined a theme area—
science and innovation to protect health and environment, using science to make a difference, and
delivering science-based information to decision makers. The Science Forum included 223 posters and
demonstrations on current EPA research activities and speaker-specific topics, with EPA scientists and
engineers present to discuss their research efforts; 11 exhibits and demonstrations of EPA and other
Federal agency scientific and educational programs; and a demonstration of canine scent capability to
detect vapor intrusion using an EPA-trained dog.

AAAS  Session

The Science Forum opened with a panel discussion about the fellowship program sponsored by AAAS.
Dr. Fran Sharples, Dr. Venkat Rao, and Dr. Terry Keating discussed their experiences with the different
program stages (from the initial fellowship program to the present) and lessons learned through their
experiences at EPA about the role of science in decision making and policy making.

Plenary Session

The purpose of this session was to provide plenary addresses on the role and value of science at EPA, new
research and technology directions,  and the partnerships supporting all of these activities. EPA
Administrator Mike Leavitt provided a perspective on the successful growth of EPA into an
internationally respected scientific organization and new directions such as the integration of social
science and communication, networking of people and resources to overcome boundaries and solve
challenging issues, and growth beyond the historic experience of environmental cleanup. The Regional
Administrator for EPA Region IV, Jimmy Palmer, discussed the regional perspective on the role of
science in environmental decision making and the importance of partnerships in how science is
accomplished within EPA. EPA Science Advisor and Assistant Administrator for the Office of Research
and Development (ORD), Dr. Paul Gilman, provided highlights of numerous science-related initiatives to
address the science needs of EPA, strengthening of science within  EPA, and emerging areas of
computational toxicology  and sustainability. Executive Director of the Environmental Council of the
States, R. Steven Brown, addressed 10 science needs of the states, key areas of concern to states over the
next 5 to 20 years, and the role of EPA and the states in developing the necessary science to support
regulatory action. Chief Technology Officer for International Business Machines Corporation Federal
services, Dr. David McQueeney, provided insights on the creation  of market value from research,
exploration of emerging high performance computing capabilities,  and the use of E-business to harness
value in unstructured data and support real-time decision making.  Director of the Office of Science and
Technology Policy, Dr. John Marburger, discussed the role and limitations of science in decision making
and the need to consider potential consequences of emerging technologies.


Science and Innovation to Protect Health and Environment

This two-day session focused on innovative scientific approaches for protecting human health and the
environment: specifically, advanced remote sensing techniques, data resource and acquisition/use
improvements, human exposure vulnerability, and correlations between air and water quality and human
health. A key theme was the importance of partnerships and cross-collaboration to develop robust data
sets, analysis tools, and data management systems.

Advanced Remote Sensing.  Terrence Slonecker, with the National Exposure Research Laboratory
(NERL), led this session addressing applications of remote sensing technology in landscape analysis,
indicator development, and remediation. James Wickham, with NERL, discussed the continued
development of the National Land Cover Data (NLCD).  K. Bruce Jones, with NERL, discussed the
development of landscape indicators using the NLCD. Karl Hermann, Environmental Monitoring and
Assessment Program (EMAP) Coordinator for Region VIII, discussed the application of EMAP in the
Western United States to predict regional level landscape conditions. S. Taylor Jarnagin, with NERL,
provided an overview of research involving the use of human-made surfaces as indicators. David
Jennings, with NERL, discussed the use of light detection and ranging as a local-scale remote sensing
tool. Mr. Steven Hirsh, Remedial Project Manager with EPA Region III, discussed the application of
remote sensing for the detection and removal of chemical weapons.

Innovations in Risk Assessment: Improving Data Resources.  George Woodall, Jr., with NCEA,
led this session addressing improvements in data resources and data organization supporting risk
assessment. Roy Smith, with the Office of Air Quality Planning and Standards, discussed the
organization of data for use in regulatory decision making. Henry Abadin, with the Agency for Toxic
Substances and Disease Registry, discussed the development of toxicological profiles. Ann Richard, with the
National Health and Environmental Effects Research Laboratory (NHEERL), discussed the development
and application of a toxicological and structural database. Michael Waters, with the National Institute of
Environmental Health Sciences, discussed the development of a toxicogenomics database on chemical
effects in biological systems.

Human Data in Risk Assessment.  John Vandenberg, with the National Center for Environmental
Assessment (NCEA), led this session addressing the acquisition and use of human data in risk assessment.
James Childress, with the University of Virginia, discussed the ethics of research involving human
subjects. Bill McDonnell, with NHEERL, discussed the role of human subject research in developing air
quality standards.  Richard Sharp, with Baylor College of Medicine, discussed the ethical issues
associated with genetic-based research.

Supporting Innovations in Science to Identify Children's  Vulnerability to Environmental
Exposures. Mr. Nigel Fields, with the National Center for Environmental Research, led this session
addressing children's vulnerability to environmental exposures, how these exposures impact their health
and development, and how these impacts differ from those seen in  adults. Michael Weitzman, with the
American Academy of Pediatrics, Center for Child Health Research, discussed the differences in health
effects from environmental exposures in children and adults.  Virginia Rauh, with the Columbia Center
for Children's Environmental Health, discussed social and environmental conditions having pre- and
post-natal health effects.  Carole Kimmel, with NCEA, provided an overview of the upcoming National
Children's Study.

Sustainability - Educating for the Future. Alan Hecht, Director of Sustainable Development in
ORD, led this session on current initiatives in sustainability education, and discussed the concept of
sustainability as well as education and capacity building within the environmental sector. Jaimie Cloud,
President of the Sustainable Education Center, Inc., described her organization's efforts in educating
Kindergarten through Grade 12 students on sustainability, providing leadership training of administrators
and teachers, and developing curriculum materials and outreach tools. Alan Elzerman, with Clemson
University, discussed national efforts in sustainability education within higher education programs and
the efforts of the National Council for Science and the Environment Council of Environmental Deans and
Directors. Sally Shuler, Director of the National Science Resource Center, discussed mechanisms to
incorporate environmental education and sustainability awareness into the educational curriculum.

Partnering with New York on Air Quality and Human Health: Issues, Challenges, and
Perspectives.  Val Garcia, with NERL, led this session on state and Federal initiatives addressing
linkages between air quality and human health. S.T. Rao, with NERL, discussed EPA efforts in
partnering with EPA Regions, states, tribal governments, and local governments to enhance the
understanding of air quality and its relationship to human health. Vickie Boothe, with the Centers for
Disease Control and Prevention, introduced an Environmental Public Health Tracking Program that tracks
hazards, exposures, and human health effects and a Public Health Air Surveillance Evaluation project.
Paula Davidson, with the National Oceanic and Atmospheric Administration, presented the National Air
Quality Forecast Capability program to predict ground-level concentrations of ozone and develop 1-day
forecast guidance for ozone. Kenneth Colburn, with Northeast States for Coordinated Air Use
Management, summarized regional challenges in using technology and innovations to better public health
and the environment and to bridge the gap between these two areas of science.  Robert Sliwinski, with the
New York State Department of Environmental Conservation, described New York's air quality
management program and its initiatives. Nancy Kim, with the New York Department of Health,
summarized efforts in providing outreach and education, responding to health concerns, conducting
research, and establishing an environmental public health tracking system.

State-of-the-Science Research on Swimming-Associated Health Effects and the Translation of
Health Data to Water Quality Guidelines for Bathing Beaches.  Alfred Dufour, with NERL, led
this session on  the health effects from human use of recreational waters.  Timothy Wade, with NHEERL,
summarized ORD's National Environmental and Epidemiologic Assessment of Recreational Water
project, which focuses on research efforts to define any associations between human illnesses and
recreational water quality as measured using rapid analysis methods.  Kenneth Schiff, Deputy  Director of
the Southern California Coastal Water Research Project, and Mr. Jack Colford, with the University of
California at Berkeley, discussed the complexities of non-point source pollution (i.e., animal
contamination) in marine recreational waters in Southern California.  Rick Hoffmann, with the Office of
Water, discussed EPA efforts to address the requirements of the Beach Act to improve quality within the
United States' beach waters.

Using  Science to Make a Difference

This two-day session focused on regional efforts to use science to make real differences: specifically,
invasive species, research collaborations, and ecological forecasting.  A key theme in these presentations
is that the sound science needed to make wise decisions is best obtained through collaboration.

Can You Hear Us Now? EPA's Role in Invasive Species Research and Management.  Michael
Slimak, with NCEA, led a session addressing the control and ecological consequences of invasive
species. Henry Lee II, Chair of EPA's Nonindigenous Species Working Group, discussed sources of
invasive species; their direct and indirect ecological, economic, and regulatory effects; recent research
findings; and areas for future work.  Diane Regas, head of the Office of Wetlands, Oceans, and
Watersheds, discussed ongoing and planned work within the Office for the control and prevention of the
introduction of aquatic nuisance species. Judy Pederson, with the Massachusetts Institute of
Technology's Sea Grant Program, discussed the results and management implications of a marine

bioinvasion rapid assessment survey.  Mike Blum, with NERL, discussed genomic approaches to targeted
screening of invasive species in ballast water. Michael Fritz, with EPA's Chesapeake Bay Program
Office, discussed options for the management of non-native oysters in the Chesapeake Bay.

Monitoring and Assessment to Protect Tribal Health and Ecosystems.  Valerie Bataille, with EPA
Region I, led a session addressing monitoring and assessment projects sensitive to tribal-specific
concerns.  Fred Corey, Environmental Director of the Aroostook Band of Micmacs in Presque Isle,
Maine, discussed the development of Native American-specific exposure pathways. Nancy Cost, Fond du
Lac Water Project Coordinator, discussed monitoring of air, water, sediments, and biota to develop a
better understanding of mercury fate and transport on the Fond du Lac Reservation. Steve Crawford, with
the Pleasant Point Passamaquoddy Environmental Department, discussed the Primary Production Study
of Coastal Waters in Maine to measure and monitor impacts from aquaculture and non-point sources.

Regional Environmental Monitoring and Assessment Program (R-EMAP):  The Application
of EMAP Indicators.  Brian Hill, with NHEERL, led a session addressing the application of R-EMAP
to various regional studies, and provided an overview of R-EMAP.  Peter Kalla, with  EPA Region IV,
discussed the goals and initial findings of the Southeastern Wadeable Streams R-EMAP project.  Daniel
Boward, with the Maryland Department of Natural Resources, discussed the Maryland Biological Stream
Survey and habitat assessment.

Great Places Demand Great Science. Rochelle Araujo, with NERL, led a session addressing the use
of collaboration to solve environmental problems. Richard Batiuk, Associate Director for Science in the
EPA Chesapeake Bay Program Office, discussed research outcomes from the Chesapeake Bay Program.
John Lyon, with NERL, discussed collaborative research efforts in the Great Lakes. Quenton Dokken,
Executive Director of the Gulf of Mexico Foundation, discussed ecological sustainability issues and
concerns associated with the Gulf of Mexico.

Looking into the Future of a Region. Betsy Smith, with NERL, led a session addressing current and
future regional ecological risks. K. Bruce Jones, with NERL, discussed the approaches, goals, and
applications of ecological forecasting. Laura Jackson, with NHEERL, discussed models for assessing the
effects of urbanization. James Wickham, with NERL, discussed the use of land-cover change in the
determination of changes in non-point source pollution, its link to nutrient export and vulnerability
assessments, and nutrient modeling. Earl Greene, with the United States Geological Survey, discussed
the relationship between land use and groundwater vulnerability, and the results of statistical modeling of
groundwater in the Mid-Atlantic Region. Daniel Kluza, with NCEA, discussed a model to forecast the
distribution of native and non-indigenous species. Betsy Smith, with NERL, discussed future
implications of land  use change for the Mid-Atlantic Region.

Regional Research Partnership Program. Tom Baugh, with EPA Region IV, led a session
addressing projects within the Regional Research Partnership Program.  Bonita Johnson, with EPA
Region IV, discussed the use of microbiological indicators to assess water quality. David Macarus, with
EPA Region V, discussed the potential for using satellite data as a proxy for biodiversity.  Dan Ahern,
with EPA Region V, discussed exposure, health effects, and future research areas for manganese.

Community Air Toxics Projects. Henry Topper, with the Office of Pollution Prevention and Toxics, led a
session addressing the control and prevention of air toxics at the local level and EPA initiatives in this
area.  Madeleine Weil, with the City of New Haven, CT, discussed the development of a local hazardous
air pollutants inventory and a risk reduction strategy for the City. Emily Andrews, Managing Partner for
the St. Louis Community Air Project, provided an overview of this program and its partnerships. Jon
Trout, with the Louisville Metro Air Pollution Control District, discussed the basis of the West Louisville
Air Toxics Study, findings, and actions taken as a result of the findings. Steve Perry,  with The Forum,

Industry Partners in Environmental Progress, discussed the purpose, participants, organization, and scope
of the Mobile County, AL, Air Quality Study.

Science to Support Decisions:  Climate Change.  Michael Slimak, with NCEA, led a session
addressing climate change assessment and issues. Michael MacCracken, with The Climate Institute,
discussed the climate change issue, the factors that complicate the issue, and the potential impacts of
climate change.  William O'Keefe, with the George C. Marshall Institute, discussed issues for
consideration in making policy decisions, the limitations of the current knowledge base of climatic
effects, and actions to be taken to promote a broader knowledge base. Arnold Vedlitz, with Texas A&M
University, discussed the use of science in Gulf of Mexico decision making involving climate change
under an EPA cooperative agreement. Joel Scheraga, National Program Director of ORD's Global
Change Research Program, discussed the feasibility of conducting regional and place-based climatic
impact assessments.

Delivering Science-Based Information to Decision Makers

This two-day session focused on EPA development of environmental indicators, the use of geospatial
tools to support decision making, mechanisms for environmental health information exchange,
development of science-based information for coastal systems, scientific computing applications,
improving the indoor environment, and tools for net environmental benefit analysis.  These presentations
included several pilot projects and public information/outreach activities, as well as partnerships among
federal, state, and local governments.  Key themes in all of the discussions were the development of
sophisticated, computer-based tools, approaches, and systems to assist in evaluating large volumes of data
supporting research, analysis, and decision making for environmental and human health.

The Future of EPA's Environmental Indicators Initiative and Report on the Environment.
Michael Flynn, with the Office of Environmental Information (OEI), led this session addressing the
development of environmental indicators and analytical tools to link environmental conditions and public
health outcomes. Heather Case, with OEI, provided an overview of the Draft Report on the
Environment, which highlights the conditions of air, water, and land in the United States and
demonstrates their effects on life, health, and ecological conditions. Denice Shaw, with OEI, summarized
the outcome  chapters (human health and ecological condition) in the Draft Report on the Environment
and trends identified from the large amount of collected information. Judy Qualters, with the Centers for
Disease  Control and Prevention, provided highlights of efforts to develop a National Environmental
Public Health Tracking Program. Susan Norton, with NCEA, presented the Causal Analysis Diagnosis
Decision Information System project, its goals to help investigators in states and tribes to identify causes
of biological impairments, and its use as a web-based system providing guidance, examples, and links to
information.

Using Geospatial Tools to Make Program Decisions. Brenda Smith and Wendy Blake-Coleman,
with OEI, led this session addressing geospatial tools and their applications to decision making.  Joe
Anderson, with OEI, described the Emergency Response Analyzer—a geographic information system-
based tool to aid in visualization and data integration/mapping for an emergency situation. Gary Roberts,
with OEI, discussed projects in five urban areas that focused on the analysis of remote sensing data  when
determining trends of urban growth.  Julie Kocher, with OEI, presented a Web-based tool to support
analyses under the National Environmental Policy Act and to streamline data access as well as
review/approval of environmental assessments and impact statements.

Delivering Consistent Information on Health and the Environment.  William Sonntag, with OEI,
led this session addressing the use of information technology to provide greater access to health and
environmental information.  Mike Frame, with the United States Geological Survey, provided an
overview of the National Biological Information Infrastructure program and its associated tools and
services to provide access to data/information on biological resources in the United States. Molly O'Neill,
with the Environmental Council of the States, introduced the Environmental Information Exchange
Network, which is intended to promote data sharing to support better decision making among Federal
agencies and regulators as well as to improve the data that are available.  Larry Fitzwater, with OEI,
introduced EPA's System of Registries to address the challenges of data access for EPA and other
government agencies.

Developing Science-Based Information for Coastal Systems. Kevin Summers, with NHEERL, led
this session addressing the development of science-based information for coastal systems. Henry Lee II,
with NHEERL, discussed the Western EMAP coastal research activities and preliminary findings in
California, Washington, Oregon, Hawaii, and Alaska. Diane Regas and Darrell Brown, with the Office of
Wetlands, Oceans, and Watersheds, provided an overview of EPA's National Coastal Assessment
program and the challenges of integrating local estuary program data into a national summary. Kevin
Madley, with the Florida Fish and Wildlife Conservation Commission, summarized Florida's Inshore
Marine Monitoring and Assessment Program and ongoing sampling activities.  Phil Trowbridge, with the
New Hampshire Department of Environmental  Services, described ongoing coastal assessment activities
and results for the shortest coastline in the United States. Henry Walker, with NHEERL, provided an
overview of National Coastal Assessment Program activities in the Northeastern United States as well as
research findings and their applications.

Scientific Computing. Rick Martin, with the Office of Technology Operations and Planning (OTOP),
led this session addressing EPA's high performance computing mechanisms/tools and their respective
program applications.  Joseph Retzer, with OTOP, discussed the role of the Center of Excellence for
Environmental Computational Science in meeting EPA's scientific computing capability need. John
Smith, with OTOP, discussed EPA initiatives to expand data acquisition, storage, and manipulation by
internal and external users.  Terry Grady, with NERL, discussed the development and potential
applications of EPA's Science Portal.

Healthy Communities—One Building at a Time. Elizabeth Cotsworth, with the Office of Radiation
and Indoor Air (ORIA), led this session addressing scientific information exchange as a means of
influencing public action and promoting healthy buildings and indoor environments. John Girman, with
ORIA, provided an overview of indoor air pollution sources and effects as well as principles for
managing its prevention and control. Jim Jetter, with the National Risk Management Research
Laboratory, discussed current ORD research focusing on indoor environments.  David Mudarri, with
ORIA, discussed strategies for public outreach  and involvement in improving indoor air quality.  Sam
Rashkin, with the Office of Atmospheric Programs, discussed the development and implementation of an
Indoor Air Quality Label for new housing construction. Tracy Enger, with ORIA, discussed the
mechanisms of turning research and guidance into action and the importance of social marketing.

Net Environmental Benefit Analysis. Ann Whelan, with EPA Region V, and Bill Robberson, with
EPA Region IX, led this session addressing the Net Environmental Benefit Analysis (NEBA) approach.
Bill Robberson discussed the use of NEBA in environmental decision making for emergency response,
and Ann Whelan presented a case study of the application of NEBA to emergency response planning.
Section  I:    Overview

The Environmental Protection Agency (EPA) presented a Science Forum at the Mandarin Oriental Hotel
in Washington, DC, on Tuesday, June 1, through Thursday, June 3, 2004.  The EPA 2004 Science Forum:
Healthy Communities and Ecosystems was an opportunity to showcase the activities of EPA and other
organizations in key areas of environmental research and to spotlight new initiatives and recent successes.
As the third in a series of annual events, this Science Forum built upon the first two Agency-wide Science
Forums held in May 2002 and May 2003, and was co-sponsored by the Office of Research and
Development (ORD), the Office of Environmental Information (OEI), and EPA Region IV.

The Science Forum highlighted selected high priority topics and EPA's scientific accomplishments,
showcased EPA's commitment to quality science, and demonstrated, through examples, how science
influences Agency decisions. The Science Forum also provided an opportunity for dialogue and
interaction among EPA scientists, partners, clients, stakeholders, and colleagues with over 1,000
attendees at this event. Attendees included EPA program, research, and regional staff; members of other
Federal agencies; stakeholders; the scientific community; and interested members of the public. The
Science Forum included 223 posters addressing current EPA research activities and specific topics
addressed by speakers, discussions of research efforts by EPA and external scientists and engineers, 11
exhibits of EPA scientific and educational programs, and demonstrations of canine scent capability to
detect vapor intrusion using an EPA-trained dog.

EPA Administrator Mike Leavitt opened the plenary session of the Science Forum with a perspective on
the scientific credibility established by EPA in the past 30 years and important technical and cultural
directions that will have a major impact on future EPA science initiatives.  Other plenary speakers
provided highlights of the regional perspective of EPA's science assets and future scientific needs,
examples of the integration of information management and technology with sound science to solve
environmental problems, ongoing initiatives to address the science needs of EPA and the quality of its
scientific products, the science needs of the states, future trends in information technology and the
impacts on data management and analysis for regulatory agencies, and upcoming challenges in addressing
the new types of technologies under development. The opening day session also included a review of the
American Association for the Advancement of Science (AAAS) fellowship program and the experiences
of its alumni in supporting EPA activities.

Three two-day breakout sessions each examined a theme area—science and innovation to protect health
and environment, using science to make a difference, and delivering science-based information to
decision makers. The audience had an opportunity in each session to ask questions of the speakers.
Poster sessions followed  the plenary session and each breakout session addressing session-specific and
related topics.  EPA engineers and scientists were available at these poster sessions to provide additional
information and to address attendee questions.
Section  II:      Plenary  Session
                                                     Tuesday, June 1, 2004
The purpose of this session on the first day of the meeting was to provide plenary addresses on the role
and value of science and partnerships to support environmental decision making and policy making,
future research directions, and implications of information technology to data management and sharing.
The plenary session also provided an overview of the AAAS fellowship program and the experience of
several participants in supporting EPA initiatives.

The Science Forum opened with a panel discussion about the fellowship program sponsored by the
AAAS. Dr. Fran Sharples, Dr. Venkat Rao, and Dr. Terry Keating discussed the different program stages
(from the initial program to the present), the fellowship activities conducted at EPA, and the lessons
learned about the role of science in decision making and policy making gained from this experience.

EPA Administrator Mike Leavitt opened the plenary session of the Science Forum with a perspective on
the scientific credibility established by EPA in the past 30 years and important technical and cultural
directions that will have a major impact on future EPA science initiatives. The Regional Administrator
for EPA Region IV, Mr. Jimmy Palmer, discussed the ongoing regional review of science usage and the
role of science in all EPA actions and decisions. EPA Science Advisor and Assistant Administrator for
ORD, Dr. Paul Gilman, provided highlights of ongoing initiatives to address the science needs of EPA as
well as the quality of the scientific products.  The Executive Director of the Environmental Council of the
States, R. Steven Brown, addressed the science needs of the states and the role of EPA in addressing those
needs. Chief Technology Officer for International Business Machines (IBM) Corporation Federal
services, Dr. David McQueeney, discussed future trends in information technology and the impacts on
data management and analysis for regulatory agencies. The Director of the Office of Science and
Technology Policy, Dr. Jack Marburger, addressed upcoming challenges in addressing the new types of
technologies under development.
Opening Remarks

EPA Science Advisor and Assistant Administrator for the Office of Research and Development (ORD),
Dr. Paul Gilman, welcomed all the attendees to this third annual EPA-wide Science Forum: Healthy
Communities and Ecosystems.

AAAS Environmental Fellows Session
Dr. Paul Gilman, Assistant Administrator for ORD and the EPA Science Advisor, introduced a panel
discussion by three former participants in the AAAS fellowship program about their experiences and
contributions.

Dr. Fran Sharples was one of six pioneering members of the AAAS fellowship program class of 1981.  At
that time, the program involved 10 weeks in Washington and it was not until 1996 that the fellowship
became a year in length.  The program has grown significantly over 23 years, involving more than 200 fellows
who have participated in an array of projects related to policy and the environment. Areas in which the
fellows have participated include air and radiation, children's health/protection, environmental policy
innovation, safeguarding the environment,  and statutory responsibilities, among others. These AAAS
fellows have supported numerous EPA program offices and laboratories. A key program goal is to
demonstrate the value of science, technology,  and economics in solving problems.

Dr. Sharples participated in the AAAS fellowship program about 3 years after graduate school and while
working at Oak Ridge National Laboratory. Instead of pursuing an academic track, Dr. Sharples became
involved in highly technical environmental work that in turn introduced her to environmental policy. The
AAAS fellowship program was an opportunity to learn more about this topic and she was able to work on
a project that had an unexpectedly large impact on her career. She supported the EPA Office of
Exploratory Research (which no longer exists) on an effort to determine what problems might occur from
the release of modified organisms (bacteria, plants). At the time, there was little literature on this
particular topic but there was much analogous literature regarding the introduction of non-native species
that might serve as a basis for extrapolation (prediction). The report she wrote from this effort was one of
the first to appear on this  topic  and she was asked to testify at a Congressional hearing on the release of
engineered organisms.  The National Institutes of Health (NIH) subsequently asked her to participate on a
research advisory committee, as few had considered this problem.  She then found herself being publicly
interviewed and being requested by AAAS and international  organizations to write papers on this topic.
This was a life-altering experience that introduced Dr. Sharples to a new world in which science was a
tool for decision makers to select from an array of choices rather than an end unto itself.

Dr. Venkat Rao participated in the AAAS fellowship program in 1992. This provided an opportunity for
a scientist to interact with the policy process, which in turn changes the way we think and orient ourselves
as well as how we grow and mature in our  own careers. Until participation in the fellowship program, Dr.
Rao was involved in scientific work regarding chemical carcinogens and how combinations in multiple
exposures could produce  different types of effects, with an emphasis on building models. The fellowship
program was an opportunity for him to sit and work with EPA staff, become acquainted with the Agency
leadership, and to be exposed to the policy aspects of science.

At that time, many of the Clean Air Act (CAA) Amendment issues were engineering-related and the
office he supported felt that health should be key and was attempting to bring health into the decision
making. As a board-certified toxicologist,  he  was able to look at the health dimensions of National
Emission Standards for Hazardous Air Pollutants (NESHAPs) through a case study of a neighborhood,
developing a regression model using multiple data sets, and using the model as a baseline to show how
NESHAPs in this environment would be addressed. This helped in understanding how policy issues come
into play.  Another great experience during the fellowship was the opportunity to have lunch with Senator
Al Gore, and to participate in a discussion of Senator Gore's recently published book and sharing of
ideas. Dr. Rao has continued to support the AAAS fellowship program as a member of the selection
committee and sees a great spectrum of talent applying for this opportunity as well as increasing
participation at EPA.

Dr. Terry Keating is a recent participant in the AAAS fellowship, in one of the first year-long program
classes, and after a one-year extension became a permanent EPA employee. His experience prior to the
fellowship was focused on academia although his education in air quality emissions and modeling had a
policy component.

For his fellowship, Dr. Keating was assigned to the EPA Office of Policy Analysis and Review within the
Office of Air and Radiation. He developed an eclectic portfolio of issues such as intercontinental
transport and linkage of air quality and global climate change. Interdisciplinary thinking and problem
solving was encouraged and this experience was an opportunity to learn new things.  In his current
position, this experience continues and offers the opportunity to teach scientific concepts to lawyers,
economists, and Congressional staff as well as Agency  leadership. The fellowship and the continuing
position have provided opportunities to observe scientific policy making, educate decision makers, and
make new connections between different disciplines.

Dr. Keating has continued his participation in the fellowship program for the past two years by serving
as a mentor to an AAAS fellow. The fellows are talented, knowledgeable, enthusiastic, and bring fresh ideas
and a critical eye to the Agency.  This  also helps to bring a connection to the outside for the Agency.

A brief question and answer period addressed a range of topics, including the following:

•   An important experience in the fellowship program is to move from the academic/research setting for
    communicating peer-to-peer to a policy  setting that requires communication in ways someone outside
    your discipline can understand. This is learning to  communicate with people who are not
    knowledgeable in the specific field or in science in general, as well as how to communicate
    complicated, technically-oriented information at a simpler level. A science policy analyst must be
    able to analyze, interpret, and present information in a simplified way because most decision makers
    in government are not scientists or may be specialists in another field.

•   The AAAS fellowship program is  only one source of personnel for EPA, which also has its own
    internship program and hosts presidential management fellows. Similarly, EPA is one of many hosts
    for AAAS fellows.  Of the 90 AAAS fellows in Dr. Keating's class, approximately nine were at EPA.

•   EPA places a high value on science as well as understanding what is and is not known. For each
    issue, there will be much debate on all sides and it is important to understand how credible the
    scientific arguments are  on all sides by asking  good questions, checking the literature, and discussing
    issues  with knowledgeable persons. This helps to distinguish between what science is becoming
    accepted and what is considered "on the edge"—an important aspect for decision makers to
    understand. EPA is often criticized for not relying completely on science in making rules and
    policies; however, science is only one element of policy making. Other factors include cost, who is
    impacted, where they are, whether they are vulnerable, and whether there is disproportional impact.
    Often the policy makers desire a yes/no framework, yet it is difficult to explain science issues in that
    format. In addition, policy does not consider weight of evidence and policy decisions are made from a
    different set of issues and from a different way of thinking than science. A related point is that there
    is always scientific uncertainty and there is never enough science to determine "the answer."

•   The AAAS selection process picks participants who can learn quickly, adapt, think well on their feet,
    and can absorb, synthesize, and recap a lot of information. Each fellowship class has a range of
    experiences, background, and career stages. The actual fellowship experience also varies a lot in the
    exposure received to policy, value judgments, and balancing viewpoints.

•   Many graduate programs do not prepare scientists to work in and contribute to the public policy
    arena. There is a great deal of work involved in making this transition, and academic programs often
    do not value this. An area of improvement is to help the academic world understand that this is
    something of value; for example, for someone in the academic world, it is useful to understand where
    future funding is coming from and how funding for an academic program can be obtained. Science
    education has changed little in 30 years despite calls to reform curricula, to recognize legitimate
    science careers outside of academia and research, and to make academic programs more
    multidisciplinary.  One approach may be to get more people who have had this experience to speak with
    academic departments about the experience and its value.

•   Training a policy student in science/engineering is also difficult, and it may be harder to develop
    sufficient understanding of a scientific discipline. Examples do exist in the legal community where
    science and statistics may be used to support cases in the courtroom.

•   There are no formal alumni activities, but AAAS tracks, maintains, and publishes a directory of
    former fellows. There is also an email listserv, sessions at the annual AAAS meeting, and other
    venues to stay in communication.

Plenary Addresses
Dr. Jack Puzak, Acting Director of the National Center for Environmental Research (NCER), opened the
Plenary Session and welcomed attendees to the third annual Science Forum. Dr. Puzak also
acknowledged the three partners in developing this event: the EPA program offices led by OEI, the
Regional offices led by EPA Region IV,  and the state environmental agencies.  The EPA Administrator
and six additional speakers provided opening addresses to Science Forum attendees on the role of
science at EPA and in  environmental decision making, current initiatives,  and future directions.

EPA Administrator Plenary Address

EPA Administrator Mike Leavitt welcomed all attendees and discussed a recent opportunity to meet with
15 outside scientists (Nobel Prize laureates, distinguished academics, etc.) and hold discussions on
identifying the leading edge scientific questions that EPA should focus on as an Agency. These free
form, collaborative discussions addressed genomics, molecular biology, chemical climatology, disease
registries, computational toxicology, and risk assessment. The Administrator then shared four
observations from that experience.

The first observation involved the question of how EPA is doing, to which the response was: "You've
come a long way baby." This is reflected in the fact that in 34 years EPA has moved from formation as
an Agency to being one of the most respected scientific bodies worldwide. It is a real privilege to attend
an international meeting and to see the respect for EPA throughout the world for environmental
protection.  Over the next 5 to 7 years, most of EPA's scientists will move on to other efforts. This is a
great challenge and a focus for the next few years—to make sure that the tradition continues.

The second observation involves the importance, in the future, of social science and communication as an
integrated discipline.  This is best illustrated by a recent experience in New Bedford, MA, where he went
to see a Superfund site and met an elderly  gentleman, who was fishing for stripers in front of a sign that
said "Don't Fish Here." Asked how the fishing was, the gentleman said he had not caught any keepers
lately. When asked if he ate what he caught because there was a need to limit the intake of fish taken
from this area, he replied that he did not catch many keepers, but he does eat what he  catches, so he was
limited in his intake because he does not catch a keeper very often.  This demonstrates the importance of
social science in knowing why people do what they do.

A third observation is that we are in a networking era.  When the Agency started, there were mainframe
computers; now there are networked personal computers. Interdisciplinary approaches to science are
the new frontier of human productivity. Solutions need to transcend political boundaries, not only those
such as United States-Canada, United States-Mexico, Virginia-Maryland, and California-Nevada, but also
boundaries between EPA and the Department of Energy (DOE), between DOE and the Department of
Commerce, between the Office of Air and the Office of Water, and between cubicles in the same office
building. There are many different kinds of boundaries. The 21st century will be defined by how we
transcend those boundaries and collaborate. We have learned to make machines work together and the
question now is whether we can get people to work together.

The fourth and last observation involves new thinking and old thinking. Sometimes values are interpreted
as being partisan interest when they may really just be differences in new/old thinking. There is an
opportunity through the course of the next 30 years to shape the Agency to be collaborative and to be
preventative. We have learned how to  clean up the environment and contamination, so efforts now
become focused on how to prevent it, become informative, and become a key repository of scientific
information and facts—to be focused on the big, leveraged questions of science and most important, to
know what they are and maintain the global perspective.

EPA Region IV Administrator Plenary Address

The Regional Administrator for EPA Region IV, Mr. Jimmy Palmer, presented the regional perspective
on the role of science in environmental decision making and the importance of partnerships in how
science is accomplished within the Agency. EPA Region IV is a co-sponsor of this Science Forum, and is
a lead EPA Region for research and information involving partnerships with ORD and OEI.

An initiative began last year for all EPA Regions to look at how science is being used, the obstacles to
making the science work, and how to address those obstacles. This effort is now in its final stages and
has involved both large and small teams focused on answering those questions, with preliminary findings
anticipated to be presented by the end of the summer.

During 34 years in the environmental field in varying capacities (regulator, engineer, lawyer), he has seen
much abuse of what many call science. One example involved a public hearing over a contentious
proposition to issue permits to a landowner to start a swine operation. The opposition brought in an
employee of a state agency to tell concerned citizens that if the operation can be smelled at the property
line, people are being harmed. Many facilities (e.g., wastewater treatment plants, industrial facilities, and
landfills) emit various odors at various concentrations that are not harmful.  Another example involved
plans to build a shopping center in a coastal region that required relocation of wetlands/streams on the
property. Wetlands scientists had evaluated the site with the engineers and determined that the wetland
systems on the property would remain  functional but their functionality would decrease.  However,
citizens at the public meeting felt that any drop in functionality equated to destruction of the ecosystem.

Science drives all EPA actions and underpins all EPA decisions, whether approving a Total Maximum
Daily Load (TMDL), making a decision on an attainment area, or approving a Record of Decision (ROD)
for a Superfund site specifying a particular remedy. That is one reason for having this type of event,
which provides a means to improve our approach to scientific issues and to grow the body of work that is
the science in what EPA does.
OEI Assistant Administrator Plenary Address

The Assistant Administrator for OEI, Kim Nelson, discussed three key projects currently underway with
ORD. The EPA mission is to protect human health and safeguard the natural environment, and this has
always relied on sound science, which today requires sound information management and technology.
These three projects focus on collaboration, help to improve EPA's strategic planning and ability to
measure results, and demonstrate the use of leading edge information technology applications to solve
environmental problems.

The first project is the Environmental Indicators Initiative, which has resulted in a product that has been
widely and well received by the public, the decision-making community, and the scientific community.
The first phase, the Report on the Environment, provides a strong foundation for moving forward. Efforts
are underway to determine how to display this information electronically for everyday use rather than
retaining a traditional report format.

The second project is the Environmental Science Portal, which provides a mechanism for sharing and
accessing information to support collaborative work. This Portal will become the gateway that provides
the capability EPA needs to bring science, scientific data, and advanced information
technology solutions to environmental issue decision making.  This involves collaboration,
communication, and interdisciplinary partnership to help researchers come together and work together.

The third project involves high performance and grid computing. EPA is making progress in upgrading
its information technology with a goal of using tomorrow's information technology solutions to address
environmental problems today. High performance computing will help to address critical air quality
issues that are computationally and scientifically challenging, central to the EPA mission, and critical to
EPA partners (e.g., states, regions).  This will reduce the time required to run the more complex models
that EPA uses today, and the resulting data will support improved modeling and forecasting of air quality. EPA and DOE are
also collaborating to utilize grid computing capacity for the most difficult and challenging environmental
decisions and to investigate linkages between environmental condition and health condition with partners
such as state agencies, the National Oceanic and Atmospheric  Administration (NOAA), and the Centers
for Disease Control and Prevention (CDC).

These advancements in information technology will help to  support the environmental decision makers
with the tools, computational resources, and the models they need no matter where they are located. This
is a watershed opportunity to help EPA deliver on its mission. The environmental threats we face
every day do not recognize political or organizational boundaries, and the use of innovative information
technology and tools is necessary to be able to work beyond those traditional boundaries.

EPA Science Advisor and ORD Assistant Administrator Plenary Address

Dr. Paul Gilman, EPA Science Advisor and Assistant Administrator for ORD, discussed science
achievements and future directions across many EPA programs. Complementing this discussion was a
slide show in the background providing covers of journals in which EPA scientists have published their
work as well as awards given to EPA and its extramural researchers.

The estimates of the consequences of EPA's work in air regulation represent state-of-the-art analyses
and require input from scientists and engineers. It is a challenging technical accomplishment to
estimate the mortalities avoided or hospital admissions decreased due to a particular regulatory action.
Today, the quality of such estimates is rarely questioned, and there is no better
example of the progress that can be made when the science is done right. Five to seven years ago, the EPA
position on particulate matter (PM) was viewed with a great deal of skepticism; it has since gained
credibility through identification of the mode of action by which PM affects our health, morbidity, and mortality.
We have moved from understanding results from ambient air monitors and contaminants to which we are
exposed in our homes to a more fundamental understanding of the mechanisms of exposure.  Yet there are
many areas for which such understanding must still be developed.

There are 3.7 million miles of streams and rivers in the United States and 4 million miles of roads. There
are 1.8 billion trips to the beach each year and coastal resources supply $54 billion in goods and services.
It is easy to say that we are loving these water resources to death. Key to progress in this area is the
collaboration among state and Federal agencies to understand the health state of coastal waters.  In
February 2004, the second version of the coastal assessment was released, enabling the Agency to
make specific statements about coastal health relative to a specific point in time.  EPA progress in assessing
all of our water resources is an emerging story for this decade.

Since the initiation of the "45-day" study of the use of science in EPA decision making, EPA has
increased outreach and sharing of resources for collaborative work with other scientists and engineers.
There is now a pamphlet that describes the EPA facilities that are available to support the research of
others. Opening up the doors to these research resources will benefit not only EPA but also state/local
government and the academic community.

Strengthening of science within the EPA is a continuing effort.  A science inventory is underway to make
available more than 4,000 science projects in the regional and program offices as well as ORD.  This will
include contact points, peer review information, etc. There is also a Council on Regulatory
Environmental Modeling (CREM) that is addressing the computer modeling needs of EPA, which
currently uses over 60 models.  CREM is addressing how to develop, verify, and peer review models. In
addition, peer review continues as a major initiative with over 650 work products (not journal articles)
undergoing external peer review. Furthermore, efforts continue to develop new measurement methods,
to fix ones that may not be working well, and to draw on the post-doctoral program to begin incorporating
new scientific areas.

Computational toxicology is another example of collaboration at many levels. There are growing
relationships between the Office of Pesticides and others to determine how to accomplish required
assessments faster and at less cost and with less use of animal testing. EPA has gone from learning what
new areas  such as genomics can do to having the National Institute for Environmental Health Sciences
(NIEHS) and other partners look to EPA to bring them tools in such areas.

Another emerging area, perhaps more of a resurgence, is sustainability, which draws on both science and
technology.  The EPA Web site has over 50 links on this topic.  Key components being addressed are how
to pull this together within the Agency—across cubicles, on the research side, how to make it happen in
the regions—as well as external collaboration with the academic community. Activities include proposals
seeking ways for engineering schools to incorporate green science into their curriculum and a student
design competition entitled People, Prosperity, and the Planet (P3) for engineering students to put
together projects  for the next generation of prevention.

EPA has come a long way in 34 years, as noted by the Administrator, and has developed the foundation
for a future based on science and technology.

Environmental Council of the States Plenary Address

The Executive Director of the Environmental Council of the States (ECOS), R. Steven Brown,  addressed
the science needs of the states, for both current and future issues. Together, the states and EPA spend
over $20 billion each year to protect the environment.  The states typically have well-equipped and
well-staffed programs for air, water, waste, etc., and while they may have some laboratories, they are less
likely to have basic scientific research capability. A few years ago, only California, Illinois, and New
Jersey had science programs, but those may not have survived recent budget cuts. Therefore, the states
must rely on basic science support from EPA and state universities.  This appears to be an appropriate
division of roles—states having responsibility to implement delegated Federal programs and EPA
supporting basic scientific research and development.

Mr. Brown presented 10 areas that are currently of interest to ECOS members as identified from a
meeting this spring, as follows (in no particular order):
1.   Small drinking water systems—there is a need to develop inexpensive, effective treatment
     technologies suitable for small communities.  Otherwise, these small systems are hard pressed to
     meet drinking water standards such as the arsenic standard.
2.   Monitoring and remote sensing—there is a lack of technologies to acquire data remotely.  EPA
    support of research on motes and other small computer systems that can float down streams and
    report on their journey may be beneficial to meet this need.
3.   Information management—states generate about 94 percent of the data in EPA's six largest
    databases. Problems have been experienced over the years with timeliness, quality, and standards for
    this information.  States are currently working with EPA to develop a network to provide more
    accurate data and to do so more quickly. Data transfers have already begun and this will help EPA
    scientists get more reliable data from the field and  do so more quickly.
4.   Mercury—emissions and transport remain among our most important problems.  Developing transport
    models for this may help with other pollutants as well, and may also help to develop an understanding
    of long-range transport from other countries to the United States.
5.   Military facility impacts on the environment—this continues to be a concern for ECOS members and
    EPA assistance in helping the military to address this, as well as the growth around these facilities,
    will be advantageous.
6.   Hazardous waste cleanup technologies—the Interstate Technology and Regulatory Council supports
     the implementation of cleanup technologies by reviewing their applicability and efficacy. EPA, the
     Department of Defense (DOD), and DOE are all supporting this, and it is hoped that EPA support will
    continue.
7.   Cross-media pollutant transport—this continues to be an area of concern.  Large municipal sewage
    treatment plants may have to install special equipment to remove fine PM emissions from the
    treatment facilities. This would result in treatment facilities on top of treatment facilities, a redundancy
     that may need to be reduced.
8.   Environmental tracking and  footprint—more help may be needed to track the source(s) of pollutants.
    Areas of assistance include broad issues such as interstate transport as well as local concerns. The
    footprint is the impact on the environment and technologies may be needed to lessen the impacts.
9.   Toxicology—states rely heavily on EPA's toxicology work and it is important that EPA hears that
    message and continues to do that important work.
10. Environmental indicators—in shifting from outputs to results, we find a lack of indicators for many
    areas. This is of concern to both EPA and the states.

Mr. Brown also identified a number of state concerns for the next 5 to 20 years:

•   Waterborne nutrients—the effects and contributions will continue  to rise in importance.
    Technologies are needed to control area pollutant sources. In addition, contributions of nutrients and
    other pollutants to problems in the Chesapeake Bay and the Gulf of Mexico need to be delineated as
    the existing problems are expected to get much worse.

•   Biotechnology—growth in this area is reminiscent of the growth of the chemical industry 50 years
    ago without thought to environmental consequences. Biotechnology offers great promise towards
    cleanup and other applications, but the use needs to be thought out in great detail, which may be a
    major role for EPA.

•   Nanotechnology—concerns about the growth in this area are the same as for biotechnology, but
    applications in this field are perhaps 10 years behind biotechnology. A program was created at NIH
    on the ethical, legal, and social implications of biotechnology, and recent legislation authorized such a
    program for nanotechnology.  It is hoped that EPA will be active in this program.

A final recommendation was to continue the use of state government scientists on review panels that
consider research proposals.

International Business Machines Corporation Plenary Address

Chief Technology Officer for IBM's Federal services, Dr. David McQueeney,  discussed insights on
developing and applying good science from his experience in research for IBM and others as well as
exploiting technology to meet EPA's changing needs. The presentation addressed three primary topics:
(1) evolution of the research value creation process, (2)  exploration of supercomputing, nanotechnology,
and pervasive computing, and (3) E-business on demand, which encompasses web services, autonomic
computing, grid computing, and harnessing value in unstructured data for business optimization.

A major challenge is getting hard-core science through the delivery chain from the laboratory to the
customer.  In the period from the 1940s to the 1970s, such investment in scientists/facilities was a
business decision focused on making an impact on customers, with an emphasis on curiosity-driven
research and technology transfer of an idea to a product. However, the yield from this investment was
low because innovations did not get to customers or took too long to move from research into
development, manufacture, and sales. In the 1980s, IBM began to connect these aspects more tightly in
order to move the research to customers, to bring customer needs to the research, and to get rid of the
chain that slowed down this process. In the 1990s,  information technology made the first big transition
from "something that happens in the back room" to being more involved in the front office (e.g., customer
preferences, market segmentation). Some of the most interesting, cutting-edge applications were
occurring with the customers, so if the researchers were not in those organizations, they were losing
perspective of what was cutting edge. Now in the 2000s, the emphasis is on the use of information
technology to produce results (e.g., business outcomes)  rather than the details of how the technology
performs.  If researchers are to contribute value, they must understand what customers consider valuable.

Early research in information technology involved innovations in hardware (i.e., faster, better).  In the
1980s, software began to emerge as a science in its  own right and information technologies had to be able
to merge the two together. From the 1990s to the present, the emphasis has been on ways to support the
customer's business process or policy objective, and involves a more people- and services-oriented
business. The question  of how researchers can support  a services industry is still a deep topic of
discussion with the focus today on impacting end users  and their customers.

Many of the information technology performance metrics (e.g., memory, storage, bandwidth) follow
exponential growth curves, and eventually technology performance becomes "good enough." For
example, when resolution on a screen reaches the limit of resolution detectable by the human eye, further
improvements in resolution will not provide product differentiation in the market place. Therefore,
efforts now focus on liquid crystal display (LCD) screens, which are less expensive, since the pixel count
is no longer important. Another example is that the functions of a personal computer have improved to
the point where the price now begins to decrease, and storage technology is becoming "good enough" for
consumer personal computers because it does not fill up by the time the computer is replaced. Yet, the
storage capacity suitable for general consumers may not be good enough for managing environmental
data or other applications.

These examples all illustrate how a commodity technology crosses a threshold that is "good enough,"
which involves the question "When is faster, cheaper, better, good enough?" Continued efforts to drive
technology to new levels are noble, but may not be useful. For many customers, the important issue is
not the absolute cost for the equipment, but how easy this equipment is to use/administer, which in turn
depends on the skills and expertise of the users.  An important engineering policy is to use some of the
power of the system to make it less complex to users.

In the second topic area, advances in computing power can be applied to science, business, and policy
challenges by taking off-line computing tasks into real-time, which supports continuous operations and
provides for real-time business responses to changing conditions. We have seen a huge increase in
calculation capability—more than exponential. For example, the airlines began scheduling equipment
and manpower based on customer utilization. In  1992, the run time to identify projected demand took
multiple days. Using today's equipment, this process takes about 10 minutes.

Scale is another factor—yesterday's supercomputers are today's desktop computers.  As computer
components get smaller, more can be included in the computer or on the components. However, there are
limits to our ability to scale down the components. We are approaching atomic scale in some areas, such
as new transistors.

A tremendous increase in computing power is underway, estimated at five to six orders of magnitude
over the next 10 years. Networking of computers, such as grid
computing, really increases the computing power. The transition has involved movement from
networking to network sharing (e.g., the Internet), which in turn moved to the World Wide Web, and now
moves  to a computational power increase via the  "grid." This is leading to a huge change in how
computing is used in business and its applications in optimization.

Nanotechnology moves us from using atoms to make computers to moving atoms to make computations.
Advances in fabrication enable placement of atoms individually in carbon nanotubes.  To date, there is an
experiment, a theory, and a new scientific capability, but we do not yet have the manufacturing capability.
If deoxyribonucleic acid (DNA) replication is the manufacturing process, how do we utilize this?

Another example involves pervasive computing enabled by integration to obtain data in real time. Once
all of the sensors can be placed on a chip, it is easy to scale it down.

The third topic area—E-business on demand—is  IBM's vision for the next step in computing, which can
help businesses to respond to  changing conditions in real-time and make the infrastructure more resilient.
At EPA, this approach would tie vertical systems together horizontally with the value coming from the
connection of air-water, regulators-users, and state-Federal.  Such horizontal ties and our ability to
manage them is the next step forward for the advancement of science.

Everyone is facing the need to master the explosion of unstructured data resulting from information
technology advances. Most data are unstructured and grow at a higher rate than structured data. A key
research area involves improvement of search capabilities  to move internet-scale searching beyond words
and simple parsing of sentences toward more relational queries. A very promising area for collaboration
is how to apply unstructured data mining technology to EPA's unstructured data.
In closing, base computing technology will continue to increase at current rates for the next 10 years, but
we are close to the point where advancements will involve individual atoms and molecules, and to do this
we are looking to the life sciences for solutions. EPA faces challenges similar to those in the commercial
world—horizontal integration and managing complexity.  The focus is shifting from having data to being
able to harness it for a specific purpose.

Office of Science and Technology Policy Director Plenary Address

The Director of the Office of Science and Technology Policy, Dr. John Marburger, discussed his
experience with environmental issues, policy, and science. As a university president for 14 years, he had
responsibility for over 1,100 acres, a hospital, sewage plant, co-generation plant, nuclear physics
accelerator, and agricultural animals all located on Long Island over a sole-source aquifer with
environmentally conscious neighbors.  He then became the Director of DOE's Brookhaven National
Laboratory, which was facing closure due to a number of environmental issues. In this capacity, he found
himself facing activists who were more technically knowledgeable than he and this was effective in
giving credibility to deep-seated public fears even when the knowledge was inaccurate. With every
environmental decision, he was always reminded that property values were at stake. In addition, a 1988
Memorandum of Understanding (MOU) between DOE and EPA established stringent requirements and
milestones to accomplish for evaluating industrial processes and laboratory experiments as well as
developing an environmental management system.  Within 3 years, Brookhaven developed an
International Organization for Standardization (ISO) 14001-certified environmental management system, was
recognized as the good neighbor of the year locally, and received a national award, among  other
achievements. This was accomplished by learning lessons, listening to the community, and trying to
accomplish all that  DOE and EPA set forth.

The United States Government is subject to much public scrutiny. Reporters look for hidden meanings
and ulterior motives. Citizen groups and activist groups have much information and they use the
processes of democracy to make their points and ask for action. In addition, the G8 environmental
ministers have observed that Americans are more likely to accept new technologies (e.g., genetically
engineered food) than the citizens of their own countries. This reflects public confidence that our government, e.g., the
EPA, the United States Department of Agriculture (USDA), and the  Food and  Drug Administration
(FDA), regulates responsibly, which is a major factor in American attitudes. These public processes have
direct consequences on our quality of life and economics, and must rest on sound science.

The industrial revolution centered on new energy sources, such as coal, steam, and oil, for electricity.
Today, the emphasis is on the atomic understanding of matter, such as quantum mechanics and all of the
extraordinary breakthroughs post-World War II and during the Cold War including new areas of
biotechnology, information technology, and nanotechnology. These new technologies pose similar
challenges to those  of the industrial revolution—unintended consequences on health and the
environment—and also provide new tools, such as improved detection and increased control over
chemical/life processes (e.g., green chemistry and reduction of potentially hazardous by-products). The
explosive development of information technology is magnified by the effectiveness of our  government's
engagement in public processes, including the media and Web-based information resources.  Individuals
who in the past may have had unrecognized concerns are now able to band together, and the World Wide
Web is adding new dimensions to this effect.

New technologies have immense consequences. The only possible constructive government response to
this is to link regulation as strongly as possible to science. There are too many variables for controlled
experiments, and there is too much complexity and variability to determine a definitive guide to
regulatory reaction  in every case.  Science is a way to test ideas about how nature works, but does not tell
us when our ideas are incomplete or incorrect, nor does it tell us truths about nature, nor how to change a
condition that we have observed but do not like. We must invent plausible approaches and responses to
what we find in nature.

The pace of science is slow and may reveal that a past decision was too conservative; such a decision,
having been built into people's perceptions, is difficult to change. Similarly, if a past decision is found to not be
conservative enough, there will be public outcry. Both outcomes can cause a negative reaction toward the
regulators, but the second one is worse. Therefore, we tend to make conservative decisions.  Some say
changes should not be made if we do not know the science, but that also involves big risks. Societal
change is faster than that and sometimes we must guess as to the right actions to take when the science is
not yet understood. This is very difficult and those who make such decisions must put themselves on the
line because of the risks to human health and the economy from these decisions.

EPA has done an excellent job of recruiting the best scientists to its cause.  The EPA peer review process is a
model for other agencies, as demonstrated by recent efforts by the Office of Management and Budget
(OMB) to promulgate peer review best practices for all agencies, largely based on what EPA does.

Question and Answer Session

A brief question and answer period addressed a range of topics. These included: (1) assisting the public
to come to their own conclusions on technically complex issues by making available the correct science
(on Web-sites, in journals, and other venues) and having credibility in the supporting science; (2) how,
when apprised of issues that may affect their health, the public focuses on the impact to themselves and
wants reassurance that the people addressing this issue are taking it seriously and doing something about
it; (3) the role of risk communication to listen to and address concerns rather than focusing on "getting out
the science"; (4) the importance of forming an ongoing dialogue with the public to establish the
trustworthiness of the government; (5) the importance of partnerships, coalitions, alliances, and access to
other funding sources to accomplish the research that EPA will need for its programs, which cannot all be
funded by EPA; and (6) how to think proactively about the impacts of the new technology revolution and
conduct the research that may be needed, including changes in university training of chemical engineers
and other disciplines to provide a greater understanding of the need for sustainability, environmental
stewardship, green chemistry, and environmental considerations.

Closing Remarks

Dr. Puzak concluded the plenary sessions by thanking all of the speakers for taking time out of their
schedules to address the Science Forum. Dr. Puzak reminded participants of the poster sessions, exhibits,
and additional presentations throughout the Science Forum.
Section III: Science and Innovation to Protect Health and Environment

Wednesday and Thursday, June 2-3, 2004
The purpose of this breakout session on the second and third days of the meeting was to focus on
advanced remote sensing techniques, improvements in data resources for risk assessment, acquisition and
use of human exposure data, vulnerability of children to environmental exposures, sustainability
initiatives, relationships between air quality and human health, and water quality issues for recreational
waters. Each session included opportunities to respond to audience questions that provided additional
information and insight on a variety of science, health, and environmental topics.

Terrence Slonecker, with the National Exposure Research Laboratory (NERL), led a session addressing
applications of remote sensing technologies and data analysis for landscape evaluation, indicator
development, and remediation. Presentations included updates to the National Land Cover Data (NLCD),
development of indicators from NLCD analysis, landscape analysis at multiple spatial and temporal
scales, applications of light detection and ranging (LIDAR) for hydrological landscape analysis, and use
of photographic  analysis combined with a geographic information system (GIS) to detect and remove
chemical agents and weapons from a residential area.

George Woodall, Jr., with the National Center for Environmental Assessment (NCEA), led a session
addressing improvements in data resources and data organization supporting risk assessment.
Presentations included data organization for regulatory decision making, development of Toxicological
Profiles, modifications of existing databases to enhance searchability and access, and development of a
toxicogenomics  database on chemical effects in biological systems.

John Vandenberg, with NCEA, led a session addressing the acquisition and use of human data in risk
assessment. Presentations included the ethics of research involving human subjects, the use of human
subject research in setting air quality standards, and the ethical issues associated with genetic-based
research.

Nigel Fields, with NCER, led a session addressing exposure in children and differences in the impacts on
their health and development from that in adults. Presentations included understanding differences
between children and adults in order to understand exposure mechanisms and responses,  social and
environmental conditions with pre- and post-natal effects, and the National Children's Health Study
anticipated to begin data collection in 2006.

Alan Hecht, Director of Sustainable Development in ORD, led a session introducing the  concept of
sustainability. Presentations included efforts by the Sustainable Education Center, Inc., to educate
students from Kindergarten through Grade 12 on sustainability, national efforts to address sustainability
in higher education programs, and National Science Resource Center program goals to improve learning
and teaching of sciences in the nation's 16,000 school districts.

Val Garcia, with NERL, led a session addressing state and Federal initiatives to address air toxics and to
understand their health effects.  Presentations included EPA efforts in partnering with EPA regions,
states, tribal governments, and local governments to enhance the understanding of air quality and its
relationship to human health, an overview of the CDC's Environmental Public Health Tracking Program
and the Public Health Air Surveillance Evaluation project, NOAA's National Air Quality Forecast
Capability Program, regional challenges in using technology and innovation to better public health and
the environment, and experiences of the State of New York's air quality management program and
Department of Health initiatives including an environmental public health tracking  system.

Alfred Dufour, with NERL, led a session addressing health effects from human use of recreational waters.
Presentations included research efforts to identify associations between human illnesses and recreational
water quality using rapid water quality analysis techniques, complexities of non-point source pollution in
marine recreational waters and associated health risks, and the EPA Beach Program developed in
response to the Beach Act of 2000.
Advanced Remote Sensing
Following opening remarks by Terrence Slonecker, with NERL, six speakers addressed applications of
remote sensing technology in landscape analysis, indicator development, and remediation. An audience
question and answer period followed the presentations.

The Status of the 2001 National Land Cover Data

James Wickham, with NERL, discussed the continued development of a data set to provide consistent
national land coverage information that in turn supports assessment and use of this important indicator.
Many Federal agencies are involved in this mapping effort either by contributing labor or buying images,
including: United States Geological Survey (USGS), EPA, NOAA, U.S. Forest Service, USDA, Bureau
of Land Management, National Park Service, National Aeronautics and Space Administration (NASA),
and the U.S. Fish and Wildlife Service. There is also some state participation by Illinois and Kentucky.

A database approach is being adopted for mapping the continental United States and Alaska. Image data
are being used to obtain derivatives such as imperviousness and tree canopy, generate land cover maps,
and develop ancillary data such as a confidence estimate in the classification, a node map, digital
elevations, and decision rules. Classification is being conducted using a regression tree format, which is
supported by the node map, and is the step involving cross-validation.  Density maps and impervious
surface maps are also developed using very high resolution imagery, such as digital ortho quarter quads,
in conjunction with regression techniques to map across an entire area.
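
The calibration idea described above can be sketched in a few lines: fit a regression between a coarse-resolution predictor and the impervious fraction measured from high-resolution imagery, then apply the fit scene-wide. The predictor values and impervious percentages below are invented for illustration, not actual NLCD training data.

```python
# Sketch: calibrate a coarse-pixel predictor (e.g., a brightness value) against
# impervious percentages measured from high-resolution imagery such as digital
# ortho quarter quads, then map the relationship across the whole scene.
# All numbers are hypothetical.

def fit_simple_regression(x, y):
    """Closed-form least squares for y = a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx
    return a, b

# Hypothetical training samples: predictor value vs. measured impervious %.
predictor = [0.10, 0.25, 0.40, 0.55, 0.70, 0.85]
impervious_pct = [2.0, 11.0, 22.0, 33.0, 41.0, 52.0]

a, b = fit_simple_regression(predictor, impervious_pct)

# Apply the fitted relationship to every coarse pixel in the scene,
# clamping predictions to the physically meaningful 0-100 percent range.
scene = [0.05, 0.30, 0.60, 0.95]
mapped = [max(0.0, min(100.0, a + b * v)) for v in scene]
print([round(m, 1) for m in mapped])
```

The real NLCD work uses regression trees over full rasters; this one-predictor linear fit only illustrates the calibrate-then-extrapolate pattern.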

An example of a confidence map was presented in which a map is color coded for percent classification
confidence. This technique is not new to image processing and this type of map shows how well the data
fit into the envelope for each particular pixel.

A change detection product also is being prototyped that will help to compare the 1992 and 2001 versions
of the National Land Cover Data (NLCD), but this is difficult because of technology changes. The
original 1992 data are being overlaid with the 2001 map products, and a decision tree is being trained to
reclassify the 1992 data using the 2001 methods. The result is a reclassified map, termed the adjusted
1992 NLCD, which enables more residential features to be seen. This helps to evaluate change by
enabling comparisons between the 2001 NLCD and the adjusted 1992 NLCD, with areas of change  color
coded on the map.
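
The comparison step above reduces to a per-pixel test between the two maps. The class codes and tiny grids below are made up for the sketch; real NLCD rasters use numeric Anderson-style class codes over millions of pixels.

```python
# Illustrative per-pixel change detection between an adjusted 1992 map and a
# 2001 map. Grids and class codes are hypothetical.

adjusted_1992 = [
    [41, 41, 21],   # forest, forest, developed
    [41, 81, 21],   # forest, pasture, developed
]
nlcd_2001 = [
    [41, 21, 21],   # one forest pixel converted to developed
    [41, 81, 22],   # one developed pixel intensified
]

def change_map(before, after):
    """Return a grid flagging pixels whose class code changed (1) or not (0)."""
    return [[int(b != a) for b, a in zip(row_b, row_a)]
            for row_b, row_a in zip(before, after)]

changes = change_map(adjusted_1992, nlcd_2001)
print(changes)
n_changed = sum(map(sum, changes))
print(f"{n_changed} of 6 pixels changed")
```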

The continental United States is divided into 66 segments for completion of this analysis, each at a
different stage of completion.  Three segments are currently complete: an area in Minnesota, an area
in the Mid-Atlantic, and an area in the West.

Information dissemination methods are still in development as needs have changed since the original
effort.  In 1992, the only product available was the NLCD map. Now information users wish to acquire
subsets of the data, which will require data bundling. Currently, data can be obtained via
http://www.mrlc.gov.

Evaluating Environmental Quality Using Spatial Data Derived from Satellite
Imagery

K. Bruce Jones, with NERL, discussed the development of landscape indicators using the NLCD.
Landscape indicators and models enable retrospective risk analysis using archival information to identify
current conditions and risk, such as the Environmental Monitoring and Assessment Program (EMAP) and
TMDL prioritization. These also may be used to forecast and/or evaluate proposed management actions
under the Regional Vulnerability Assessment (ReVA) or in evolving areas such as evaluation of
management and policy effectiveness at the community level or through TMDL action plans.

Combining geospatial layers provides fine-scale information that can be used to develop landscape
metrics (e.g., riparian, aquatic) at watershed scale. Information from a roads database can be used to
calculate road density. It is also possible to downscale wet nitrate deposition by understanding the
relationship between elevation and deposition using EPA data and a network model. Other metrics of
interest include forest and agricultural land cover as well as characteristics of specific catchments.

In the mid-1990s, EPA started to develop a set of metrics and indicators and to apply these metrics to
specific catchments.  An example is the development of the Mid-Atlantic atlas that color coded ecological
quality across the region. The challenge now is how to develop such metrics into an indicator of some
outcome and to create models, which requires quantification of relationships.  Empirical (multivariate),
Bayesian, and process-based models can take land cover information and generate information about an
endpoint such as water quality.

The general approach is to select a specific endpoint of interest, collect and  acquire field samples through
existing monitoring programs, filter collected data based on selection criteria to obtain a consistent
temporal data set, assemble spatial data at various scales on various land units (functional and arbitrary),
generate metrics and/or other measures, then conduct statistical analysis. An example product is the
density and type of land cover in a watershed.  From this process, it is possible to develop a set of
functions, for example, multiple step-wise regressions to analyze percent agriculture and nitrate
deposition.

Another analytical technique,  logistic regression, provides cross-validation and assesses the probability of
exceeding a threshold based on a set of independent variables, such as landscape metrics and biophysical
measures.  An example showed the application of logistic regression analysis for exceeding TMDL points
for fecal coliform and the use  of a model to predict and map the probability  of exceeding the TMDL for
each watershed in South Carolina.  In this case, the landscape metrics that were important included:
percent urban, agricultural areas with more than 9 percent slope, and roads crossing streams.
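
The logistic step described above can be sketched directly: a linear combination of landscape metrics passed through the logistic function gives a per-watershed exceedance probability. The coefficients below are hypothetical placeholders, not the fitted values from the South Carolina analysis.

```python
# Sketch of logistic-regression scoring for TMDL exceedance probability from
# landscape metrics. Coefficients are invented for illustration.
import math

def exceedance_probability(pct_urban, pct_ag_steep, road_stream_crossings,
                           b0=-4.0, b_urban=0.08, b_ag=0.05, b_road=0.10):
    """Logistic model: P = 1 / (1 + exp(-(b0 + b1*x1 + b2*x2 + b3*x3)))."""
    z = b0 + b_urban * pct_urban + b_ag * pct_ag_steep \
        + b_road * road_stream_crossings
    return 1.0 / (1.0 + math.exp(-z))

# A lightly developed watershed vs. a heavily developed one.
low = exceedance_probability(pct_urban=5, pct_ag_steep=2,
                             road_stream_crossings=3)
high = exceedance_probability(pct_urban=40, pct_ag_steep=15,
                              road_stream_crossings=20)
print(round(low, 3), round(high, 3))
```

Mapping these probabilities per watershed yields the color-coded prediction maps mentioned in the talk.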

An additional example involved a classification and regression analysis study of 177 watersheds in the
Mid-Atlantic region to identify the important variables in watershed condition. This involves a layer-by-layer
analysis (e.g., percent forest, then nitrogen deposition, then another layer, etc.) until the watersheds
are broken down into terminal nodes to determine patterns. Classification and regression analysis also
enables the distribution (spatially) of terminal nodes to be evaluated. This is a powerful inductive tool to
generate hypotheses and identify patterns.

This analysis process can also integrate multiple endpoints. For example, grid cells were used to analyze
changes in bird habitat quality from the 1970s to the 1990s.  Analyses were  separately conducted for
nitrogen loadings to streams in the same areas. The two analyses were then integrated in a  GIS to look at
the spatial patterns.

Ongoing and future activities involve the development of landscape models that address horizontal
interactions (cell-to-cell flow networks and continuing distance metrics) and development of Web-based
analysis tools for decision support.
Development of Landscape Indicators for Potential Nutrient Impairment of
Streams in EPA Region VIII

Karl Hermann, EMAP Coordinator for EPA Region VIII, addressed the application of EMAP to predict
landscape conditions at the regional level in the Western United States. The landscape indicator concept
in this study is that ecological stream condition is often a function of watershed disturbance.

The use of GIS enables derivation of landscape metrics. A landscape model can be developed once
catchments for surface water monitoring and metrics for the catchments are developed. The NLCD is
very important to this effort, which largely relies on the 1992 NLCD, and eventually the 2001 NLCD, in
conjunction with monitoring data from 2000 to 2001.

Currently, all monitoring sites in the study region are incorporated, some Regional EMAP (R-EMAP)
projects are also being added (Montana northern plains and southern Rocky Mountain areas), and
catchments for the Montana sites have been generated. One question of interest is how far the influence
of an identified watershed disturbance extends. The farther the influence is from an existing
monitoring location, the harder it is to evaluate.  So, efforts are underway to identify the clipping
distance for each monitoring location in order to address landscape characteristics in just that area. This
involves mapping areas for which metrics can be generated. A number of distances from monitoring sites
are being evaluated, ranging from 0.5 to 15 kilometers. Buffer distances from streams are also clipped
with distance from study areas of interest.

To date, about 40 catchment definitions have been developed through clipping and buffering, and the
NLCD land cover information is modified by adding roads, potential grazing impacts (from a region-wide
model), and nitrogen deposition estimated from an Office of Water model, each in different GIS layers.
The Analytical Tools Interface for Landscape Assessments (ATtILA), developed by EPA in the Las
Vegas laboratories, is used to evaluate all of this information.  Some of the catchments are still being
processed.

Since grazing  is important in the West, a potential grazing impact model was developed to support this
analysis. Inputs to the model include weighted land cover for grazing (i.e., where cattle may
preferentially be located), weighted administrative uses (to eliminate areas where grazing would not be
allowed such as residential areas), density of cattle, and proximity to water.
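
The inputs listed above suggest a weighted-overlay computation: combine per-cell layers with weights, then zero out cells where grazing is administratively disallowed. The layer values and weights below are invented for illustration, not the Region VIII model's actual parameters.

```python
# Toy weighted-overlay sketch of a potential grazing impact model.
# Layer values, weights, and the mask are hypothetical.

def grazing_impact(cover_weight, cattle_density, water_proximity, allowed,
                   w_cover=0.4, w_density=0.4, w_water=0.2):
    """Weighted sum of layers per cell; masked cells score 0."""
    impact = []
    for c, d, p, a in zip(cover_weight, cattle_density,
                          water_proximity, allowed):
        score = w_cover * c + w_density * d + w_water * p
        impact.append(score if a else 0.0)   # e.g., residential cells masked out
    return impact

cover_weight   = [0.9, 0.5, 0.8, 0.2]       # cattle preference for cover type
cattle_density = [0.7, 0.6, 0.9, 0.1]       # normalized density
water_prox     = [1.0, 0.3, 0.6, 0.9]       # nearer water -> higher value
allowed        = [True, True, False, True]  # False = grazing not permitted

print([round(v, 2) for v in grazing_impact(cover_weight, cattle_density,
                                           water_prox, allowed)])
```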

For nutrient impairment, anthropogenic influence is one of the major factors. This involves various land
cover classes,  land use, and modeled atmospheric data. To assess nutrient impairment, these tools are
used to relate nutrient landscape information to land metrics followed by application of a regression
model.

The value of this analysis process is that it helps to target areas to be assessed for the TMDL process.

Multi-Scale Remote  Sensing Mapping of Anthropogenic Impervious Surfaces:
Spatial and Temporal Scaling Issues Related to Ecological and Hydrological
Landscape Analyses

S. Taylor Jarnagin, with NERL, provided an overview of research involving human-made surfaces (e.g.,
roads, rooftops, driveways, swimming pools) as an indicator.  These surfaces are easy to quantify and
measure using remote sensing platforms, and they also act as indicators of changes associated with their
presence, such as changes in topography, water runoff, and sewer sheds.  This is important because the
human impact on runoff to streams may be greater than what is seen from the surface topography or
modeling.


This research involves different scales of study and their impact.  For example, spatial scales range from
the catchment (first order watershed) level up to the regional scale, while temporal scales range from hourly,
daily, monthly, and yearly to decades. When looking at the impact of an impervious surface on a stream,
the primary change is the method for delivering water to the stream—by increasing the peak discharge
rate, water delivery occurs in a shorter period of time and moves off at a faster rate, which reduces the lag
time.  This also affects groundwater recharge in which the impacts are typically delayed and may appear
on a different time scale. The effects are most apparent at the first order subwatershed scale and at a daily
time scale, but to address impacts it is necessary to look at this from the decades scale.

An illustrative example involved the Upper Accotink in Vienna, VA in which historical aerial
photographs were used for analysis. In that area, impervious surfaces increased from 3 percent to 33
percent between 1945 and 2000. The stream flow per unit of precipitation increased and the median flow
increased, but the amount of precipitation per event did not change. Long term effects included an
increase in the number of times per year that both low and high flows occur. This has an ecological effect
on stream biota as well as a physical effect on stream morphology.

A problem for this type of analysis is the lack of such extensive historical data sets in many places. Often
data that would help in evaluating impacts over shorter timeframes are not available for the areas where
development is occurring.

A current study in a special protection management area in Clarksburg, MD, involves the USGS,
Montgomery County, and the University of Maryland Baltimore County Center for Urban Environmental
Research and Education. Research questions include the effect of an urban riparian area and the
effectiveness of best management practices such as storm water collection and diffusion back into the
environment rather than capture and release. This effort includes the use of remote sensing platforms
(LIDAR, aerial photography, satellite imagery) and ground sensing platforms (for stream flow,
precipitation, water quality, and biological indices). This will enable analysis of before and after
conditions.  The study includes a rapidly developing area as well as an area that will not be developed.

Another project underway involves an accuracy assessment at medium spatial/temporal scale.
Collaborators on this project are USGS and the Chesapeake Bay Program. The research question being
addressed is the accuracy of remotely-sensed estimates of impervious surfaces, including how, when, and
where remotely-sensed estimators of impervious surfaces can be used and how good they are.  Remote
sensing includes LIDAR, aerial photography, and satellite imagery.

In addition, historical impervious data studies are continuing to use imagery data.  These efforts focus on
correlating historical time-series estimates of impervious surface change and development with changes
in stream flow over time.

The context for impervious surface analysis is the pattern of urban growth.  In Virginia, from 1900 to
the present, there has been linear growth starting at about 1950, driven by a combination of population
growth and immigration. If the population increases, then there is an increase in the impervious area.  Another
evaluation examined changes in urban, suburban, and rural populations since 1950 in the same area. The
largest growth area is suburban and that is where the largest environmental impacts are occurring.

LIDAR: A Remote  Sensing Tool for Determining Stream Channel Change?

David Jennings, with NERL, provided background on LIDAR, examples of topographic LIDAR,
accuracies and advantages of this local-scale tool, and an example of LIDAR application.  LIDAR is an
integrated, aircraft-mounted system that pulses energy (in the form of light) to the ground that returns to a
sensor. The system includes a global positioning system (GPS) and an inertial measurement system that
compensates for the aircraft and the laser.

The energy return provides topographic (vertical) data and represents various above ground components,
such as forest, underbrush, and buildings. The system can deliver multiple returns, and other research is
underway that involves waveform analysis to assess multiple returns from the same pulse. The intensity
of the return pulse is also of interest and may provide useful information.

LIDAR data are exported as simple ASCII x, y, and z data points. The data set is simple and involves
easting, northing, elevation, and intensity information.
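
Because the exported format is so simple, working with it takes only a few lines. The sketch below parses hypothetical ASCII records and keeps the lowest return per grid cell as a crude bare-earth proxy; the points, cell size, and min-z rule are illustrative assumptions, not the production bare-earth algorithm.

```python
# Minimal sketch of handling LIDAR points in the simple ASCII form described
# above (easting, northing, elevation, intensity). Points are invented.

def parse_lidar(lines):
    """Parse 'easting northing elevation intensity' records into tuples."""
    return [tuple(float(v) for v in ln.split()) for ln in lines if ln.strip()]

def min_elevation_grid(points, cell=10.0):
    """Keep the lowest return in each cell as a crude bare-earth surface."""
    grid = {}
    for e, n, z, _intensity in points:
        key = (int(e // cell), int(n // cell))
        grid[key] = min(grid.get(key, z), z)
    return grid

raw = [
    "300001.5 4100002.0 85.2 120",   # ground return
    "300003.0 4100004.5 97.8 200",   # canopy return in the same cell
    "300015.0 4100001.0 84.9 110",   # ground return, neighboring cell
]
points = parse_lidar(raw)
grid = min_elevation_grid(points)
for key in sorted(grid):
    print(key, grid[key])
```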

LIDAR image products provide more actual landscape information than aerial photography (ortho-image)
products. An example demonstrated how the LIDAR image product (bare earth) provided greater ability
to view a stream running underneath the Baltimore beltway than that shown in an ortho-image product.
In addition, LIDAR images enable evaluation of vertical exaggeration. Example products included a 2-
dimensional plan view (from above) and a morphological view with vertical exaggeration showing
variation in stream channel side depths and the stream slope from upstream to downstream. Horizontal
accuracy is 1 meter or less and vertical accuracy is 15 to 60 cm depending on slope and vegetative
conditions.  LIDAR imagery is especially good for open areas such as asphalt, but is less accurate in
densely foliaged areas.

The promise of topographic LIDAR includes the following:

•   An accurate, high resolution terrain model to drive hydrologic prediction models at multiple scales

•   The ability to identify subtle terrain/drainage features such as depressional wetlands, vernal pools, and side
    channels

•   A means to assess morphological change of stream channels and topography

•   The ability to provide assessments of large areas rather than traditional field surveys.

These potential advantages are being confirmed through research. The ability to provide large area
assessments may be the most important aspect of LIDAR applications to evaluate morphological change.

A comparison of LIDAR at a resolution of 0.5 meters against the National Elevation Dataset at a
resolution of 30 meters demonstrated that LIDAR provided much better resolution and therefore is
believed to be a much better input for modeling.

Another example showed a side view of a LIDAR image with vertical elevation changes. In general, this
aspect of LIDAR is replicating actual field conditions quite well.  However, LIDAR does not provide a
return in deep water.

LIDAR is currently the only method for determining change in a  large geographic area. Montgomery
County, MD, needs a quick, effective way to assess the effectiveness of Best Management Practices to
mitigate impacts of development on stream channel and landscape topography.  Currently the County
relies on field surveys, but it is not possible to do enough of these. Therefore, LIDAR is being
incorporated into the Clarksburg, MD study, discussed in the previous presentation, for evaluation as a
potential replacement for field surveys.

This research effort involves the development of a conceptual model that relates stream channel change to
landscape development. This analysis requires both original and  changed land cover and hydrograph
data, with LIDAR being used to obtain channel information.  Best Management Practices come into play
between the land cover changes and the stream hydrographs, and therefore should mitigate changes in
channels. Because of this relationship, it may be possible to derive some sort of quantitative assessment of
the effectiveness of Best Management Practices. To date, LIDAR data have been collected in December
2002 and March 2004, land use and land cover change data have been collected for both periods, and four
out of five USGS stream gauge stations are in place. Results may be available in the next year.

The Use of Remote Sensing in the Detection and Removal  of Chemical Weapons
in Spring Valley

Steven Hirsh, a Remedial Project Manager with EPA Region III, presented an example of remote sensing
applications for a high profile and difficult site area in Northwest Washington, DC involving buried
munitions from World War I that are in poor condition and still contain toxic materials.  Special aspects
of this project involve very old data, new techniques, and operations in a residential area (1,600 homes) as
well as the use  of historians and photo analysts to reduce the invasiveness of the investigations and
remediation activities to the homeowners.

In 1917, the Army conducted research on chemical agents at a 50-acre facility owned by American
University in Northwest Washington, DC. The area involved lots of open farmland and a reservoir
among other features. There are many pictures taken at the time of that Army research effort as well as
information from that time that enables us to know precisely what was going on in specific buildings.
Efforts are currently underway to identify  where specific operations occurred and where the wastes were
placed.

Chemical agents in World War I included  mustard gas, lewisite, phosgene, ricin,  and arsenicals. Research
at this site involved laboratory testing of these chemical agents on mice, then larger animals, and then
humans.  Other research activities involved the development of offensive capability (e.g., ordnance,
delivery) and countermeasures (e.g., personnel protection). Efforts are underway to trace where bullets
and other ordnance landed from field firings on the site. Many shells  did not explode and efforts are also
underway to find the unexploded ordnance.

The challenge is using the photographs taken at the time of operations to identify where specific activities
occurred, then translating that location information into the present time. Researchers are able to use
calculation and geometry techniques to find sites from historic pictures that have  buildings whose
locations are known. Subsequent efforts involve translation of this information to identify current
locations so as  to minimize disruption of homeowner's land as remediation activities proceed.

Historical aerial photograph research, acquisition, and interpretation are key aspects of this effort.  This
information has been placed into  a GIS in  order to map physical feature locations as well as elevated
contaminant levels to support "what if" analysis. This involves the creation of hundreds of layers, for
utilities, cut/fill, and land surface in 1918, among others. The project has access to an aerial photograph
from 1918, many terrestrial photographs from the 1910s and 1920s, and documents that specify locations
where activities occurred.  Aerial photographs from earlier years are also being reviewed to see where
vegetation is not growing.

As an example of the GIS application, persistency tests were conducted in 1918 to determine how long
mustard gas would last in the environment. These locations can now be overlaid with current residential
areas in order to identify sampling locations.  Ground scars are also being evaluated to identify trenches
and other land features that were constructed to help with the chemical agent research at that time.

A research activity underway by the U.S. Army Corps of Engineers (USACE) Vicksburg District
involves the hyper-spectral identification of contaminants and/or affected vegetation.  This includes the
planting of many ferns to determine if contaminants of interest can be concentrated, which will also help
to identify locations of contaminants and former site activities.  Hyper-accumulation is also of great
interest for contaminant removal in lieu of more invasive remediation techniques as it may help to avoid
removing trees or otherwise disturbing the landscape of this mature residential neighborhood.

Future activities include continued support for ongoing removal operations, GIS support for partners in
this remediation and research effort, and completion of the hyper-spectral study.

Questions and Answers
The speakers had an opportunity to address questions from the  audience.

A brief question and answer period addressed a range of topics. These included: (1) the importance of
urban riparian buffers in regulating flow through sewer sheds and minimizing impacts to aquatic life; (2)
the need to determine whether engineering solutions (such as partially impervious surfaces), point source
controls,  and other Best Management Practices are having the desired impacts on watershed quality; (3)
the need for multi-temporal accuracy assessment for NLCD data points; (4) availability of information
from these projects, some of which is available now (1992 NLCD) while other data, such as those being
developed for Montgomery County, may not be available for 4 or 5 years; and (5) sources for more
recent photographic data for chemical agent remediation activities and the challenges of obtaining recent
aerial photographs resulting from the difficulties of accessing Washington, DC airspace since the events
of September 11th, 2001.

Innovations in  Risk Assessment:  Improving Data Resources
Following introductory comments by George Woodall, Jr., with NCEA, four speakers discussed methods
to manage and assess large volumes of data for decision making. An audience question and answer
period followed the presentations.

The Need  for Scientific Data in Regulatory Decision Making

Roy Smith, with OAQPS, discussed the ways to better organize dose-response data for use in regulatory
decision making. The Air Toxics "universe" can be viewed as n-dimensional space, consisting of many
aspects that are related to each other in different ways. GIS can be used to map such information density
in n-dimensional space as presented in this session.

The multi-dimensional view of the Air Toxics "universe"  includes the following dimensions:

•   "Width" (first dimension) consists of 174 source categories and 96 NESHAPs developed to date. The
    National Air Toxics Assessment activities include a monitoring network, an inventory of emissions,
    assistance to communities in conducting local risk assessments (e.g., guidance, interaction), and a
    national-scale assessment every 3 years, as well as completion of a one time set of "boutique
    assessments" (e.g., mercury, power generation).

•   "Length" (second dimension) involves the needs of every assessment, such as emissions, dose-
    response, and exposure analyses.  The exposure analyses alone take significant effort. Therefore,
    dose-response may receive less consideration.

•   "Depth" (third dimension) involves the 188 hazardous air pollutants (HAPs), of which 20 are
    "categories" whose members vary widely in toxicity.  EPA activities include identifying substances
    that are not currently  on the list and perhaps should be, as well as conducting assessments to remove
    substances from the list of HAPs.

•   Dimensions four through six, for the 188 HAPs, involve inhalation and multi-pathway exposures,
    chronic and acute exposure time scales, and human and ecological receptors.

There are many ways to cope with these assessment needs through tiering. This may range from
screening using lookup tables or a simple dispersion model to more refined approaches ranging up to full
blown probabilistic analysis. The overall approach involves a series of steps and the use of multiple
iterations:

•   Initial screen - toxicity-weighted scoring

•   Tier 1 - simple, conservative screen focused on important stressors/sources

•   Tier 2 - more complex models, real receptors

•   Tier 3 - best available analysis including human behavior (moving in/out of area).
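
The initial screen in the tiering above can be sketched as a toxicity-weighted ranking that lets the costlier tiers focus on only a few stressor/source combinations. The emission rates and toxicity weights below are hypothetical, not values from any EPA assessment.

```python
# Sketch of an initial toxicity-weighted screen for a tiered assessment.
# All names, emission rates, and toxicity weights are invented.

def toxicity_weighted_screen(sources, keep_top=2):
    """Score = emission rate x toxicity weight; keep the highest scorers."""
    scored = [(name, emission * tox_weight)
              for name, emission, tox_weight in sources]
    scored.sort(key=lambda item: item[1], reverse=True)
    return scored[:keep_top]

# (name, emission rate, toxicity weight) - illustrative values only
sources = [
    ("facility A / benzene",  120.0, 0.8),
    ("facility B / toluene",  500.0, 0.05),
    ("facility C / arsenic",   10.0, 9.0),
    ("facility D / methanol", 800.0, 0.01),
]

for name, score in toxicity_weighted_screen(sources):
    print(name, score)
```

Note how a low-emission, high-toxicity source can outrank a high-emission, low-toxicity one, which is the point of weighting by toxicity before the more refined tiers.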

At each stage of the assessment, resources focus on the small number of stressors and sources that really
affect risk and eliminate the assessment of aspects that are not relevant to exposure. For example, dose-
response assessments are typically generic until Tier 3 analysis, which uses only the newest and best
existing assessments. If newer data are available, they must be considered credible. Note that many
HAPs lack such assessments.

Existing dose-response values are useful for most, but not all, risk assessments. However, if it is
necessary to add/remove a HAP from the regulatory list, reduce emissions, or take similar actions, the
best possible information must be used in the assessment. Better organized toxicological data would help
with this analysis. An emission reduction example was the need to use risk estimates based on the
physiologically based pharmacokinetic (PBPK) model developed for the plywood Maximum Achievable
Control Technology (MACT) emissions standard. Other examples were offered for residual risk analysis,
removal of a HAP from the regulatory  listing (i.e., demonstrate the  absence of risk), and addition of a
HAP to the regulatory listing (i.e., demonstrate the presence of risk).

The Integrated Risk Information System (IRIS) develops 10 assessments each year for all programs. The
universe of HAPs under the CAA includes hundreds of substances. Therefore, a well-organized
toxicological database/system could help to prioritize which substances to address next, either for IRIS
assessment or for toxicological research studies.

The ATSDR Experience  in Using the Supplemental Documents Database in
Developing Toxicological Profiles

Henry Abadin, with the Agency for Toxic Substances and Disease Registry (ATSDR) Division of
Toxicology, addressed the Toxicological Profiles Program, supplemental documents, and a related
database.  The Toxicological Profiles developed by ATSDR succinctly characterize the toxicological and
adverse health effects information for specific substances, determine levels of exposure (e.g., acute,
intermediate, chronic) that present a significant risk to human health, and identify research areas needed
to fill data gaps.  These profiles undergo independent peer review, typically by three to seven peer reviewers,
and are made available for public comment.

The contents of each Toxicological Profile cover a wide spectrum of topics, including a public health
statement that addresses in layman's terms what is in the profile, health effects (route of exposure, acute
measures, system affected), toxicokinetics, mechanisms of action, biomarkers, chemical and physical
properties, production/import/use/disposal, environmental fate, analytical methods, regulations/advisories,
identified data needs, children's health, PBPK, methods to reduce toxic effects, endocrine disruption, and
wildlife impacts (as sentinels for human exposure). ATSDR also prepares the public health statement in
both English and Spanish as a stand-alone portion. The Toxicological Profiles have evolved over the
years to bring in more information. Therefore, those completed earlier will look different than those
completed more recently.

ATSDR developed a list of chemicals found at hazardous waste sites, and this provided the basis for
preparing the Toxicological Profiles.  ATSDR must update published profiles every 3 years.  ATSDR has
also developed Toxicological Profiles for DOE that address ionizing radiation and uranium; efforts are
currently under way to address americium, cesium, cobalt, iodine, and strontium.

The Toxicological Profiles have widespread application in public health practice, such as emergency
response, developing public health assessments, consultations, health advisories, and environmental
alerts, among others. The Toxicological Profiles are available on the Web and there is a search engine to
help users find information.  ToxFAQs is a public summary of the profiles and is also available on the
Web. These profiles also are distributed to about 70 countries worldwide, based on requests received. In
2002, ATSDR began placing the Toxicological Profiles on CD-ROM.

The profile development process includes literature search, article retrieval, and preparation of a
supplemental document to pull together all the information that someone may need about the quality of
the studies and data to extract into the profile. This includes the number of animals, species, exposure
duration, route of exposure, parameters monitored (histology, clinical signs, other things), doses, no
observed adverse effect level/lowest observed adverse effect level (NOAEL/LOAEL) values,
calculations, study description, and comments (such as limitations of the study).  This helps to determine
the level of confidence in the information. ATSDR developed a set of criteria that are given to
contractors to conduct the literature search.
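
The supplemental-document fields listed above map naturally onto a structured record. The sketch below is a possible schema for such records, with ATSDR's standard duration bins (acute ≤ 14 days, intermediate 15-364, chronic ≥ 365); the field names and the sample entry are illustrative, not ATSDR's actual database design.

```python
# A possible record structure for supplemental-document study data
# (species, route, duration, NOAEL/LOAEL, etc.). Hypothetical schema.
from dataclasses import dataclass
from typing import Optional

@dataclass
class StudyRecord:
    substance: str
    species: str
    n_animals: int
    route: str                # e.g., oral, inhalation, dermal
    duration_days: int
    noael: Optional[float]    # mg/kg/day; None if not identified
    loael: Optional[float]
    comments: str = ""

    def exposure_class(self):
        """ATSDR duration bins: acute / intermediate / chronic."""
        if self.duration_days <= 14:
            return "acute"
        if self.duration_days <= 364:
            return "intermediate"
        return "chronic"

rec = StudyRecord("substance X", "rat", 20, "oral", 90,
                  noael=5.0, loael=25.0,
                  comments="hypothetical example entry")
print(rec.exposure_class())
```

Records like these could then be grouped by exposure class and organ system to build the "levels of significant exposure" table and figure described above.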

Data are then extracted into a table addressing levels of significant exposure. ATSDR typically uses
studies that have the most data on acute, intermediate,  and chronic exposure; organ system affected;
NOAEL/LOAEL values; what occurred, etc. These data are used to prepare a "levels of significant
exposure" figure that shows all of the data in graphical form.  This database of information is developed
to enable data extraction to support HazDat (which contains detailed data from Toxicological Profiles,
site data, and toxicological and health information); the TopHat database (formerly the Federal facilities
information management system developed for DOD sites); the Toxicology Profile and Health
Assessment ToolKit; and IRIS.

ATSDR and EPA are working to develop an MOU to continue collaborations in this area.

Distributed Database Approach to Sharing Data

Ann Richard, with the National Health and Environmental Effects Research Laboratory (NHEERL), discussed the
development and application of a toxicological and structure database now available on the Web. A
challenge faced by EPA is that there are too many chemicals requiring testing and a lack of sufficient data
on those chemicals.  A potential solution for this is to use computational toxicology, which is the
application of mathematical and computer models and molecular biological approaches, to improve
prioritization of data requirements and perform risk assessments.

Gathering relevant information is one of many steps in the risk assessment process.  The first step is the
conduct of chemistry-based data mining and exploration to look for chemical-specific data. Since there  is
typically little data  found, the search expands to structural or chemical analogs that have similar
properties (biological or mechanistic) to the chemical of interest. This involves the establishment of
structure-activity relationships (SARs), in which activity is a function of structure, which may be
identified through analogy, heuristics, machine-learning inference, and statistical correlation.
24                          EPA SCIENCE FORUM 2004 PROCEEDINGS

An existing model can be applied, if one exists, to accomplish structure-based screening and
prioritization.  If enough information exists, it also may be possible to develop a new SAR model or to
mine existing data for analogs. Each of these steps requires data.  However, there are limitations: data
sources are scattered, formats are nonstandard, information content is diverse, and chemical structure
annotation is often absent. The missing structure annotation is a significant issue because structure is
essential to analog searching and to access to the full database needed for model development, yet it is
the element most often lacking in the literature.
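The analog-mining step described above can be illustrated with a minimal sketch: each chemical is represented by a structural fingerprint (here, simply a set of "on" bits), and the untested chemical is matched to its most similar tested neighbor by Tanimoto similarity. The fingerprints and chemical names are invented for illustration, not taken from any real dataset.

```python
def tanimoto(fp1, fp2):
    """Tanimoto similarity between two fingerprints given as sets of 'on' bits."""
    shared = len(fp1 & fp2)
    return shared / (len(fp1) + len(fp2) - shared) if (fp1 or fp2) else 0.0

def nearest_analog(query_fp, library):
    """Return (name, fingerprint) of the library chemical most similar to the
    query: a crude read-across step, where the activity of an untested chemical
    is inferred by analogy from its nearest tested neighbor."""
    return max(library.items(), key=lambda kv: tanimoto(query_fp, kv[1]))

# Hypothetical library of tested chemicals and their fingerprints.
library = {
    "tested analog A": {1, 3, 5, 8},
    "tested analog B": {2, 4, 6},
}
name, fp = nearest_analog({1, 3, 5, 9}, library)
```

Production SAR tools use much richer descriptors and statistical or machine-learned models, but the nearest-neighbor comparison above is the core of analogy-based inference.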

Development of a Distributed Structure-Searchable Toxicity (DSSTox) database is underway to address
these limitations.  This effort involves the standardization of chemical structures to aid in data searches, to
open public access and partnerships to help advance this database concept, and to bridge diverse toxicity
disciplines to bring in many potential information sources.  This will help to improve coordination and
collaboration, access to and utilization of toxicity data, and toxicity prediction modeling.

Currently, data files are found in many locations and have diverse content. This effort involves the
annotation of public toxicity databases with chemical structure information and the creation of SDF files
that can be imported to other uses (e.g., models, database).  The data files are found at
http://www.epa.gov/nheerl/dsstox, which is both a resource and portal in that it provides links to outside
resources. Four databases have been published to date: carcinogenic potency, water disinfection by-
products with carcinogenicity estimates, EPA fathead minnow toxicity, and estrogen receptor binding.
Each database is a separate repository and associated with each database is a series of modules that can be
downloaded.
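SDF is a plain-text format in which each record pairs a molfile structure block with named data fields. A minimal sketch of reading the data fields from one record follows; the molfile block (atoms and bonds) is skipped, and the field names and values shown are invented for illustration rather than copied from a DSSTox file.

```python
def read_sdf_fields(record_text):
    """Return a dict of data-field name -> value for a single SDF record.
    Data fields follow the '> <FieldName>' header convention; '$$$$'
    terminates the record."""
    fields = {}
    lines = iter(record_text.splitlines())
    for line in lines:
        if line.startswith("> <") and line.rstrip().endswith(">"):
            name = line.strip()[3:-1]
            fields[name] = next(lines, "").strip()
    return fields

# Hypothetical single-record SDF excerpt.
record = """\
benzene
  (molfile atom/bond block would appear here)
M  END
> <CASRN>
71-43-2

> <Carcinogenic_Potency_TD50>
77.5

$$$$"""
fields = read_sdf_fields(record)
```

Standardized field names across databases are what make the cross-database searches described above possible: the same parser yields comparable dictionaries for every file.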

The database is designed to integrate toxicological data with chemical structures and properties.
Activities are underway to create files that bridge these two user communities. Toxicological data are
typically compiled to support risk assessors and do not usually involve a chemistry component. So, this
project is trying to provide some general context such as chemical structure and property fields useful to
both user communities.

The database development effort is incorporating certain standards, such as data file format, file naming
convention, chemical structure information fields, documentation requirements, and publishing
requirements (i.e., how to get a database onto this Web site).  The standard chemical fields are extensive
and enable searches across the four databases.  Minimum structure annotation is also provided in order to
cover the diversity of databases of interest.  Structure data include the tested form simplified to parent,
general form, active ingredient of formulation, other ingredients (typically found in pesticides and
pharmaceuticals), and monomeric form of the polymer. The standard fields are well defined and well
characterized for others to successfully reproduce these fields in order to publish their data on this
database.

The next steps are to develop general toxicological fields as was done for chemistry, such as species, sex,
and strains, to facilitate data searches; expand the databases being converted to standard format; and
provide resources to encourage the use of chemical relational databases, especially among those who are
not chemists or toxicologists.

Collaboration with the Chemical Effects in Biological Systems (CEBS) database is also underway to
develop useful relational tools to search through this diverse data. An example is how to establish a
linkage from a chemical structure standpoint for historical data. If the same standard fields are applied to
genomics, proteomics, and other new data types, natural linkages may be found that are applicable to
older data, and this may be the only method to bind this information together in a useful way.  This would
improve relational searching capabilities and provide improved understanding of biofunctional
classification, which in turn improves the ability to predict toxicity.
Adoption of this approach will lead to increased use of chemical structure searching, improved public
access to toxicity data, and improved structure-based predictions of chemical activity.

The Chemical Effects in Biological Systems Knowledgebase

Michael Waters, with NIEHS, discussed efforts by the National Center for Toxicogenomics to develop
the CEBS Knowledgebase, which relies on toxicogenomics and systems toxicology.  Toxicogenomics is
the study of the response of a genome to environmental stressors and toxicants, and combines genetics,
transcriptomics, proteomics, metabonomics, and bioinformatics with conventional toxicology. Systems
biology is a complete description of how the components of a biological system work together, while
systems toxicology is a complete description of the toxicological interactions within a system. The goal
is to be able to describe a biological system, perturb it, measure changes, and then develop a better model.

The CEBS Knowledgebase uses data and information to carry out tasks that create new information and
new understanding. This is heuristic in that the system learns from relationships and develops new data.
CEBS aims to be dynamic in order to integrate large volumes of disparate information in a framework
that continually changes. CEBS will evolve both in content and capabilities to become a system for
predictive toxicology.

CEBS will enable comparison of the toxicogenomic effects of chemicals/stressors across species. In
order to phenotypically anchor these changes with conventional toxicology data (i.e., classify effects as
well as disease phenotypes), it must be possible to bring together these two data sets, identify signatures,
relate  these to known phenotypes, understand what is adaptive, and define biomarkers,  sequences of key
events, and modes/mechanisms of action.

Two hallmarks of the CEBS Knowledgebase are: (1) sequence anchoring (anchoring the genomic
sequence to chromosome coordinates) to help understand the genome and to interpret resulting data sets;
and (2) phenotypic anchoring in which toxicological effects (expression profiles or outcomes) are
anchored in phenotype using a controlled vocabulary.

The interpretive challenges for building such a knowledge base are formidable. Chemical structure is
one, as described in the previous presentation, and CEBS will be able to link up to those resources.
Annotation information will be brought in from multiple genes/proteins using CABio developed by the
National Cancer Institute with a focus on gene/protein categories (functional characterization).  Efforts
are also underway to address sequential events (pathways and processes). These three efforts lead to an
understanding of integrated responses by networking this information and using toxicology/pathology
(e.g., adverse effects) to map chemical structure.

Immediate objectives are to capture, store, and analyze gene expression data produced from
toxicogenomic experiments in different laboratories; interrogate gene expression data using queries from
the genomic, experimental, and toxicological domains to understand the molecular interface; and gain
knowledge of relationships between gene expression changes and toxicological endpoints.  The main
challenge is to provide internally consistent data to enable comparisons among the many data sets.

NIEHS is looking at potential data sources for CEBS and is developing intramural and  extramural
partnerships including government agencies such as EPA, private industry, and international
organizations to facilitate data transfer. EPA collaborations include Metabonomics Center of Excellence
with NERL, SAR interface with DSSTox with NHEERL, and toxicogenomics applications in risk
assessment with NCEA. In addition, the Environmental Science Portal may be, in the future, a way to
bring  CEBS into the EPA system.
Data acceptance began in 2003 and there are about 25 gene expression experiments included at this time.
CEBS provides data processing options and resources to help view data sets and do gene analysis via data
tables and various pathway diagrams.

Toxicogenomics will change the way toxicology is performed and will contribute new methods, new data,
and new interpretations to environmental toxicology.  CEBS will be a key component in toxicological
interpretation, providing the ability to link transcriptomics, proteomics, metabonomics, and toxicology to
generate new knowledge and assist in evaluating various dose-response paradigms.

Questions and Answers
The speakers had an opportunity to address questions from the audience.

A brief question and answer period addressed a range of topics.  These included: (1) the need to integrate
a vast amount of disparate dose-response information and to be up-to-date on what is already available;
(2) the  potential to expand DSSTox to include chemical fate/exposure aspects such as toxicity and
bioavailability, which are linked to chemicals and structures; (3) attempts to address dose-response in
CEBS and the challenges of projecting gene expression to identify signatures, chemical structures, and
mode of action; (4) the impact of new technologies such as carbon nanotubes; (5) how to verify
toxicological outcomes extrapolated  from structures or other analogies, how to develop confidence in
such extrapolation when such data may not be available, and the need to use/leverage existing data; (6)
the need for an intelligent way to query an entire dataset as to the gene or mechanism through gene
annotations, integration of array data, and other mechanisms; and (7) the need for methods to use all of
the classical toxicological data in risk assessment that are already available.

Science and Innovation to  Protect Health and Environment
Following opening remarks by John  Vandenberg, NCEA, three speakers addressed evolving research
guidelines for human participants and diverse ethical considerations involving existing and new research
methods.  An audience question and answer period followed the presentations.

The Ethics of Research Involving Human Subjects

James Childress, with the University of Virginia, discussed the current ethical framework for research
involving human participants such as third party dosing studies conducted by EPA.  The Belmont
principles were established in the Federal Register in 1979 and called for beneficence, respect for
persons, and justice. These principles resulted from a Congressional request of the National Commission
for Protection of Human Subjects of Biomedical and Behavioral Research. Respect for persons involves
the respect for autonomous agents, the protection of persons with diminished autonomy (e.g., cannot
make own choices), and use of informed consent.  Beneficence involves not doing harm, maximizing
possible benefits, and minimizing possible harm. Justice involves fair distribution of the benefits and
burdens of research including fair selection of research subjects.

The Belmont principles are a general framework, and Federal agencies and other groups have come up
with additional guidelines. Yet, there was widespread negative reaction to intentional dosing studies
conducted for EPA regulatory purposes  even though analogies exist between such studies and Phase  I
testing of new drugs by pharmaceutical companies. The hope was for human testing to address the
difficulties encountered in conducting interspecies extrapolations of effects.

In 2004, the National  Research Council  set up an interdisciplinary committee to prepare a report and
make recommendations.  The report  stressed an integrated review of science and ethics. The committee
applied existing standards including  the  Belmont report and the Common Rule (Federal Policy for the
Protection of Human Subjects) adopted in 1981 by the government, and did not try to create or invent

ethical standards.  The existing guidelines cannot simply be applied because they require interpretation
and there are gaps in areas addressed.

Subject protection generally involves review of proposed research by an institutional review board (IRB)
and informed consent. The Common Rule provides guidance for IRB review, but does not require
approval.  Key elements are the importance of minimizing risks to subjects, determining that risks to
subjects are reasonable in relation to anticipated benefits, equitable selection of subjects, informed
consent and its documentation, and maintaining confidentiality.

Criteria for evaluating research proposals/subjects, in terms of scientific and ethical acceptability
standards, include:

•   Prior research including animal studies and, if available, human observational studies

•   Whether there is a demonstrated need for the knowledge

•   Adequacy of research design and statistical analysis to address an important scientific or policy
    question, which is where ethics and science merge, since exposing subjects to risk for research of
    little benefit is bad science

•   Whether there is an acceptable balance of risks and benefits as well as minimization of risks to
    participants.

Other risk-benefit considerations include:

•   Possible societal benefits from improving accuracy of reference doses and providing public health or
    environmental benefits

•   Justification of research to improve scientific accuracy  only when there is reasonable certainty that
    human subjects will experience no adverse effects
•   Justification of greater risks to human participants if the research will produce a benefit, but only if
    additional ethical conditions are also met, including a precondition requiring a protocol related to
    ethics.

Participant selection must be equitable (fair, just). Use of persons from vulnerable populations (i.e., at
risk of exploitation) must be convincingly justified, and the research protocol must include additional
protective measures; likewise, use of individuals at increased risk for harm must be convincingly
justified, and the researchers must have protective measures in place to reduce that risk. These factors
come together for children, who are vulnerable in both senses: they are unable to give informed consent,
and their developmental stage places them at greater risk.  A recommendation to EPA is to adopt, or at
least adhere to, Subpart D of the Regulations for Protection of Human Research Subjects when children
are used in research, as these standards are quite stringent.

Payment for participation is an unsettled area. A variety of perspectives indicate the appropriateness of
receiving payment for taking part in a research project, but care must be taken as to how much and why
(e.g., time, inconvenience, and/or level of risk).  The amount cannot be so high that it constitutes undue
inducement, nor such that participation is attractive only to socio-economically disadvantaged persons.

With regard to compensation for research-related injuries:

•   Participants should receive needed medical care for research-related injuries, without cost to them

•   EPA should study whether broader compensation for research-related injuries should be required,
    such as for lost wages, death, job loss, etc.
In addition, the Common Rule states that participants should be informed if compensation will not be
provided for research-related injury.

Best practice for informed consent means that the participant understands the information provided about
the research protocol.  Recommendations to EPA in this area are to develop and disseminate a list of best
practices in this area, encourage their adoption in third-party studies, and require their adoption in studies
it sponsors or conducts.

IRB review of intentional dosing studies is useful. There are precedents for a human studies review board
for an integrated science and ethics review. This helps to ensure public trust and build experience.
Making such requirements mandatory would be preferable, but legal and logistical issues may make this
difficult. At a
minimum, research protocols should be reviewed in advance (through voluntary submission) and study
results should be reviewed after completion.

If ethically problematic studies are conducted after the new standards are adopted, EPA may face
challenges in regulatory decisions.  However, there may be exceptions where studies provide valid data to
support a standard that provides greater public health protection.  It may be useful to have these evaluated
by a special, outside panel with public members as well as  experts. If ethically problematic studies are
conducted before the new standards are adopted, EPA can accept them if the studies provide valid
information that leads to greater health benefit.

Ethical principles for research can justify some intentional dosing studies if several conditions are met as
described above, but these conditions should apply to both third party and EPA-conducted research.

EPA Clinical Research:  Implications for Air Quality Standards

Bill McDonnell, with NHEERL, discussed the role of clinical research in developing air quality
standards. In controlled human exposure studies, human volunteers are assigned to exposure groups,
undergo exposure to a pollutant under controlled conditions, and health effects are measured and
compared to a control group. This information supports the development of effective standards, which
have a high probability of meeting the requirements of the  law, and optimal standards, which are both
effective and are not unnecessarily restrictive.

Health information that is needed to identify an optimal standard  includes whether the pollutant causes a
health effect and accurate estimates of human health effects and the associated uncertainty. The more
uncertainty there is in the human health data, the lower the probability of identifying an optimal standard;
if there is too much uncertainty, the standard may be set at a level that is more restrictive than
necessary.  Sources of uncertainty in health data include extrapolation from one species to another,
individual variability in effect, difficulties in establishing causality in epidemiological studies, and issues
of precision or accuracy in measurements. If there is limited data, there will be uncertainty about the
effects measured.

Some of the strengths of clinical studies involving human participants are that the  subjects are the species
of interest and it is possible to control and accurately  measure exposure, so modeling becomes much
easier. Ethical limitations include the ability to study only pollutants with limited, acute, and reversible
effects; difficulty studying the populations of greatest interest, who are the most susceptible and thus
unlikely to be included in a study because the health effects will be more pronounced in them; and
limited health endpoints (e.g., few people are willing to undergo a brain biopsy). Logistical limitations
include small sample sizes in which interactions and rare outcomes are difficult to study, short duration,
and volunteers who may not be representative of the population.
Examples were presented where clinical studies directly contributed to standards setting efforts.  The first
example involved the relationship between ozone and eye irritation; clinical studies showed that ozone
alone (without other air pollutants) did not irritate the eye. Another example is the relationship between
ozone and asthma attacks.  A number of epidemiological studies showed a relationship between ozone
levels and hospital admissions for asthma. Since it was not possible to study asthma attacks (rare in
occurrence and should not be induced), researchers investigated whether ozone acts through mechanisms
such as promoting a response to allergens. These studies yielded information that increased the
certainty regarding the epidemiological observations. A third example addressed whether long-term
exposure to chlorine causes nasal lesions; clinical studies compared chlorine uptake in the human nose to
that in animals in order to quantify interspecies differences and to determine differences in tissue
sensitivity.

The final example involves the National Ambient Air Quality Standards (NAAQS) for ozone in which the
effect of interest can be directly measured. The first NAAQS for ozone was set in 1971 at 0.08 ppm with
a 1-hour averaging time based on experience in Los Angeles where quick peak spikes and drops were
encountered.  In  1979, the 1-hour averaging time was retained, but the standard was changed to 0.12 ppm.
Then in the 1980s, it became evident that the pattern encountered in Los Angeles was not the predominant
pattern, particularly on the East Coast where the pattern was slower and broader. Questions then arose
whether the standard was adequate to protect people in those situations. As a direct result of those
studies, EPA promulgated a revised standard at 0.08 ppm ozone with an 8-hour averaging time.  This has
gone through the courts for 6 years and is now being implemented.

A clinical study looked at a 2-hour ozone exposure at various concentrations (up to 0.4 ppm) in healthy
young adults. This study alternated rest and heavy exercise, then looked at symptoms.  Elevation of
coughing was identified in all cases, but at increased ozone concentrations there were more significant
changes in lung function that were uncomfortable but reversible. The study was repeated a year
later and the individual ozone responsiveness remained consistent.  In addition, younger adults were more
responsive to the effect than older persons. In another study using 6.7 hour exposures in healthy young
adults with differing ozone concentrations up to 0.12 ppm with alternated moderate exercise and rest,
ozone clearly had an effect well below the standard.  These studies  demonstrated causality, provided
accurate estimates of effect, and identified individual variability and some sensitive subpopulations.
Researchers have been able to use these data to create exposure-response models and to develop estimates
of precision. These studies served as the foundation for regulations we have today.
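The exposure-response models such clinical data support can be sketched as a simple logistic function giving the probability of a defined lung-function decrement at a given ozone concentration. The intercept and slope below are invented for illustration; they are not the coefficients fitted in the studies described.

```python
import math

def p_response(ozone_ppm, intercept=-4.0, slope=30.0):
    """Predicted probability of a defined FEV1 decrement at a given ozone
    concentration (ppm), under a simple logistic exposure-response model.
    Coefficients are hypothetical placeholders."""
    return 1.0 / (1.0 + math.exp(-(intercept + slope * ozone_ppm)))

# The model predicts a monotonically increasing response probability with
# concentration, which is what such data are used to quantify.
p_low, p_high = p_response(0.08), p_response(0.12)
```

Real analyses also model covariates such as exposure duration, ventilation rate, and age, and report confidence intervals around the fitted curve.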

Conclusions from this experience are that, under the right circumstances, clinical studies can directly
establish causality and provide accurate and precise estimates of effect. In less optimal circumstances,
clinical studies can complement animal and epidemiological data and can decrease uncertainty.

Research with Human Subjects: Future Challenges  and Opportunities

Richard Sharp, with the Baylor College of Medicine, discussed genetics research, current efforts to
decipher gene-environment interactions, and several ethical issues associated with these activities. The
general public does not realize that the vast majority of genetic research is not on genetic diseases. The
human genome project warranted investment because of the resulting ability to address more generalized
susceptibility, rather than genetically based diseases (strong, dominant alleles found to be associated with
disease, such as the breast cancer gene, are very useful for prediction). In addition, some genes have
limited use in predicting disease.

Environmental exposure is another pathway to disease. Actions within the human body in response to
exposure include taking up contaminants, repairing cellular damage, and getting bad materials out of the
body. If there are defects in the enzymes that perform these functions, they will play a role in the
exposure process. This is a potentially important area for EPA. Some environmental response genes
identified to date include the CYP2E1 gene, which in the presence of benzene leads to an increased risk
of leukemia, and the TGF-alpha gene, which in the presence of maternal smoking leads to an increased
risk of facial clefts in newborns.

His definition of toxicogenomics is the use of genomics and genetic resources to identify potential human
and environmental toxicants, and their mechanisms of action. Specifically relevant to EPA work in this
area is the use of toxicogenomic tools to identify biomarkers of susceptibility, biomarkers of exposure,
and biomarkers of early clinical effect (early disease processes).

The promise of toxicogenomics is in the potential ability to accomplish the following:

•   Develop less expensive ways to assess chemical and other agent toxicity

•   Improve understanding of toxicity mechanisms

•   Provide more precise estimates of exposure levels

•   Measure biological effects earlier, perhaps before there is evidence of toxicity

•   Provide measures of unknown toxins by using patterns to determine exposure

•   Identify individuals and subpopulations with increased sensitivity to specific substances.

There are many ethical issues associated with toxicogenomics, particularly regarding how scientists
should present the promise/limitations of emerging scientific technologies to the public. Much of the
current literature includes many statements with "hype" and promotion, and these are largely found in
peer-reviewed journals rather than the lay press. This representation does not appear to provide an
appropriately balanced discussion that includes some of the limitations of these toxicogenomic
technologies. Without balanced discussions, the nonscientist is likely to misunderstand the information.
For example, the suggestion of genetic predisposition may lead to conclusions about predetermination for
depression, intelligence, criminal behavior, infidelity, etc., when there in fact may be multiple causal
factors.

A second ethical issue is the potential to use genetic information in ways that are discriminatory. The
genetic influence on exposure-induced disease processes may depend upon the level or timing of the
exposure, whether there are concurrent exposures, or whether other genetic mutations combine to increase
the likelihood of disease occurring. The effects of environmental response genes may be altered by
changes in behavior or environmental conditions; for example, eliminating exposure to benzene will
eliminate the action of the benzene-sensitive CYP2E1 gene discussed above. Since variability in
sensitivity genes is common, the biological implications of these genetic variants are often unclear and may
be more complicated than simply having an altered gene.

This situation offers more opportunity for misuse of such genetic information because it is easy to find
differences, but difficult to interpret their meaning such as risk. Concerns have been raised regarding
health insurance and the use of this information, and there are an increasing number of statutes being
enacted to address this. In addition, businesses may choose to remove genetically sensitive workers
rather than address a workplace hazard.  As an example, Burlington Northern Santa Fe Railroad required
workers with work-related carpal tunnel claims to provide blood samples for genetic testing. While there
is a gene associated with a hereditary palsy, there is not one for carpal tunnel syndrome. This testing was
not intended to benefit the
health/welfare of tested employees, there was no informed consent, and the tests themselves were
scientifically questionable. Yet the company was not doing anything illegal. However, a lawsuit under
the Americans With Disabilities Act (ADA) resulted in this practice being banned.


A third ethical issue is how knowledge of genetic sensitivities leads to informed decisions about health
impacts.  An example involves a person applying for a job at a DOE facility where beryllium exposure
will occur.  Genetic testing was performed that revealed the individual was at risk for developing a
beryllium-related disease. The individual still took the job, and eventually acquired the disease. The
ethical question is whether this person is responsible for his condition. In general, there are inherent
problems in assigning responsibility for poor health outcomes. There are excusing conditions, such as the
choice may not have been fully  voluntary because of limited employment options, or the choice may not
have been fully informed. There also are justifying conditions such as a greater benefit to his family to
take the risk.  It is easier to blame someone than to put responsibility on broader social issues that channel
a person to a specific path.

A fourth ethical issue involves donor attitudes regarding the use of stored biological materials for genetic
research. It is very difficult to do certain studies without large numbers of data sources. An example is a
children's study involving 100,000 participants in which the genes will be defined over the course of the
study, not at the beginning. As  a result, consent provided by the parents of these children will be generic
and open-ended.  IRBs find this problematic and want the consent to specify the genetic test for those
involved in a study. In Mr. Sharp's studies of participant attitudes, he is finding that many are happy to
have their biological sample support a number of studies.

There are many types of genetic research available today that may help EPA.  This will introduce a
number of ethical challenges. How to respond to those challenges and also develop quality science is still
an open question.

Questions  and Answers
The speakers had an opportunity to address questions from the audience.

A brief question and answer period addressed a range of topics. These included:  (1) the range of
considerations to take into account in deciding whether to use scientific information that is unethically
obtained while discouraging companies from engaging in unethical research that promotes their interests
over that of the public/environment or with deliberate intention to harm participants or violate human
rights; (2) how decision making on ethical issues is a process of reasoning rather than yes/no; (3) how an
ethics group is working with consent protocols and consent renewals for the EPA children's study; (4)
how Johns Hopkins obtains new consent for each new study using stored biological materials; (5)
potential indemnification issues arising out of EPA review of established university research protocols or
requiring university protocols to provide additional measures such as further compensation for injury or
loss of life;  and (6) an example of a manufacturer that allows prospective workers to have a genetic test
and receive genetic counseling,  while the company chooses not to see the data and the offer is made after
the employment offer is issued but before employment occurs thereby addressing ADA requirements.

Supporting Innovations in  Science to Identify Children's Vulnerability
to Environmental Exposures
Following opening remarks by Nigel Fields, with NCER, three speakers addressed health and exposure
differences between children and adults, a study of environmental exposure impacts on young children,
and the upcoming National Children's Study. An audience question and answer period followed the
presentations.
32                          EPA SCIENCE FORUM 2004 PROCEEDINGS

Children's Health and Environmental Exposures:  The Most Important
Unanswered but Answerable Questions

Michael Weitzman, with the American Academy of Pediatrics, Center for Child Health Research,
discussed the differences between children and adults in terms of health effects and how genetics and
exposures in children set the stage for chronic illnesses that beset adults and the elderly. Children are
different from adults, from animals and juvenile animals, from each other, and from themselves at
different ages/stages. The differences are age-, stage-, and substance-specific.

Age-specific drug effects include tetracycline staining of teeth, fluorosis (leading to enamel problems),
blindness from administration of oxygen to premature babies, grey baby syndrome from chloramphenicol
administration, ciprofloxacin-related cartilage damage, and agitation from phenobarbital, which acts as a
sedative in adults.

At this time, we are at a critical juncture to begin to understand the interface between children's health
and development. We know more about changing exposure rates, but we often jump to the conclusion
that the exposure causes the problem.  We also have much exposure information, but little information on
consequences, especially in children. We have had increasing rates of asthma that appear to have leveled
off, increasing rates of obesity and autism, and decreasing age of puberty onset in girls. We are also
beginning to understand differences in long-term effects from low-level exposure and acute poisoning,
such as childhood lead poisoning, which leads to subtle but serious dysfunctions; there is little in the
literature on long-term effects other than this. We also are beginning to understand childhood antecedents
of adult disease.  However, the medical community does not yet recognize that there are different forms
of substances, which may have different medical effects, such as the many different forms of mercury.

Much more is known about drugs than chemical exposures and effects. There is much more clinical trial
data for drugs, yet there is concern with acute poisoning and about long-term effects of the use of drugs in
young children that were originally developed for adults. Examples are the use of statin drugs to control
obesity-related high cholesterol and what a lifetime of altered liver metabolism resulting from these drugs will do. For
environmental exposure, there are no clinical trials, yet there is concern about the subtle or long-term
effects of low-levels of exposure.

The vast majority of medications and prescriptions are subject to regulations that restrict the types of
research that can be done in children.  As a result, we face the same type  of interspecies extrapolation
problems in human form—extrapolating from adult studies to children. This is difficult because physical
growth is non-linear and children are closer to the  ground, explore constantly, have greater surface area,
and have a greater metabolic rate than adults.  This may lead to greater exposure potential, but their
bodies may be more resilient as well.

The ontogeny of drug disposition also is nonlinear and therefore it is not possible to equate maturation of
a specific organ system with a specific pathway for metabolism of a particular drug or environmental
exposure. For example, different fields of research indicate that there is a window of vulnerability to
infant botulism in children (30 days to 4 years of age) relating to digestive tract development.  A
completely different pattern is found for the drug diazepam, which is most efficiently metabolized at ages
1 to 30 days and the ability to metabolize it subsequently decreases with age.  In addition, there are some
drugs that will damage the  liver in an adult but not in a child. There also are differences in disease
impacts, with diseases such as chicken pox, polio,  and Severe Acute Respiratory Syndrome being worse
in adults than children.

There are many diseases of unknown etiology or mechanism, and some or all of these may involve
environmental exposures. Examples include Sudden Infant Death Syndrome, pyloric stenosis from
treatment for chlamydia infections with systemic erythromycin, infantile colic, Kawasaki's disease,
autoimmune diseases, nephrotic syndrome, intussusception, appendicitis, asthma, and cancer.
In case reviews of children, etiology, mechanisms of action, and influencing factors are rarely
discussed. These may be environmental, and EPA work may help to address this issue.

Several examples were offered of the subtle effects of childhood exposure.  These included potential
changes in the Intelligence Quotient for each increase in blood lead concentration, behavioral problems in
children who are prenatally exposed to tobacco if a certain allele is present, and cognitive impacts from
mercury and other chemicals. Another example involved how cigarette smoke and other factors affect
retention and performance in Kindergarten through the third grade, which correlated with findings that
those who have problems in high school are already having  difficulty in the third grade.

Genes, health services, environment, and behavior all contribute to children's health and ultimately to
adult health. Key to understanding the impacts to children's health are:  (1) children are different and
sometimes are more sensitive, resistant, or resilient than adults, (2) generalities do not work, (3) there are
critical windows of exposure, (4) animal models and juvenile models are critical to understanding things
like tobacco exposure, (5) there are many uncertainties, (6) there are fewer data on environmental exposures
than drugs, and (7) children in low socio-economic conditions are more vulnerable and may show greater
effects for the same environmental exposure.

Highlights from the Columbia Center for Children's Environmental Health:
Studying Air Pollution in Community Context

Virginia Rauh, with the Columbia Center for Children's Environmental Health, discussed research
activities that examine the juncture of environmental and social sciences in partnership with NIEHS,
EPA, and a number of private foundations. A primary interest of this Center is the health response from
social and physical "toxicants."

Evaluations of exposure and susceptibility indicate that children and fetuses are quite susceptible, more so
than adults. Exposures include air pollutants from fossil fuel, pesticides, environmental tobacco smoke,
allergens, and social stressors.  Susceptibility includes inadequate nutrition, genetic factors, and social
stressors. Key questions involve what can cross the placenta to cause in utero effects and what factors
make whole groups of children more susceptible.

Neural development and asthma are two endpoints being evaluated in a study of 730 mother and newborn
pairs involving non-smoking African American and Dominican participants resident in northern
Manhattan and the South Bronx in New York City. A high proportion of the mothers are single parents
who are on Medicaid, have little education, lack basic necessities, have low incomes, and have an
average age of 28 ± 5.1 years. While the study group does not have much variability in income,
there is a range in the levels of deprivation—some went without food, housing, clothing, and/or health
care during pregnancy while others did not. This is material hardship, and it demonstrates that people do
different things with the money they have. The study involved the acquisition of air samples (via a backpack)
and biologic samples (placental, blood, and urine, among  others), as well as measurements of a variety of
insecticides.

This study found significant pre-natal and post-natal exposure to polycyclic aromatic hydrocarbons
(PAHs) from fossil fuels, pesticides (resulting from cockroach and other inner city type issues), and
environmental tobacco smoke. A significant decrease in the air and blood levels of insecticides was found
after an EPA regulatory action was taken in 2000. A significant number of participants reported at least
one material hardship.  This study shows that impacts occur from social conditions and chemical
exposures, and those who can least afford it carry the biggest burden. In addition, there may be some
race/ethnic differences in patterns of socially stressful conditions that were found to exist despite
similarities in educational level and income.  These may be due to the support network or other factors.

The study also found, using biomarkers to indicate fetal exposure and differential susceptibility, that
exposure occurs across the placenta. Prenatal PAH exposure is associated with decreased birth weight
and head circumference among African Americans in this study. Lower birth weight, even in the normal
range, is associated with adverse effects on health as well as physical and cognitive development.

The EPA pesticide ban implemented in 2000 appears to have had a measurable effect. A study reported
in the news media indicated that birth weights increased after the EPA pesticide ban, and similar results
were seen in the cohort study data. There were also some associations found between fetal growth and
cord blood organophosphorus levels that also underscored the success of the EPA ban. Of note is that the
City of New York has an integrated pest management program, which has a tremendous commitment to
intervention, including intervention on an entire building to reduce pest populations thereby reducing
antigen levels and morbidity in children.

Post-natal assessments are finding markedly delayed development in cognitive abilities among cohort
children, as well as a rise in the proportion with mild/moderate delay from 12 to 24 months. Also, a study by
Bellinger showed increased effects of tobacco smoke exposure in low income situations.  Other studies
showed similar outcomes  as a result of material hardships. This implies that socio-economically
disadvantaged situations may have fewer mechanisms to offset the exposure than in a more affluent
environment. In addition, material hardship may be a marker for exposure to other types of toxicants.

Conclusions from this initial study are that 100 percent of the babies in the study had prenatal exposure to
multiple neurotoxicants; the children had increased risk including a significantly heightened fetal
susceptibility to PAH-induced DNA damage (on the order of 10-fold); there were adverse effects from
prenatal exposure to PAH, pesticides, and environmental tobacco smoke on birth outcomes; there are direct
benefits of pesticide regulation on fetal development; and there is evidence for interaction between PAH-
DNA adducts and environmental tobacco smoke on fetal growth. In addition, conclusions are:
environmental pollutants are disproportionately distributed in society; pollutants rarely occur in isolation,
so environmental risk is cumulative; processes thought to link social conditions and health frequently
involve  adverse conditions; socially and physically toxic exposures are stressful; and there is emerging
evidence that such interactions are biologically based.

Future directions for this research include expansion of the size of the cohort and following the children
through age 7. Researchers are also trying to incorporate additional biomarkers using genomics and
proteomics, and to incorporate additional biologic measures of response to psychosocial stress.

The take-home messages are that researchers must begin to consider the social context when framing
environmental research so that these other measures are accessible, that biomarkers of social stress need
to be considered, and that genomics may help to obtain some of these answers. There also may be a need
for a national cohort study that examines all social conditions in order to understand the finer gradations.

More information on the Center for Children's Environmental Health, Columbia University, Mailman
School of Public Health can be found at http://www.cumc.columbia.edu/dept/sph/ccceh/index.html.

The National  Children's Study

Carole Kimmel, with NCEA, provided an overview of the National Children's Study, its current status,
and planned  activities. This effort began with a Presidential Executive Order in 1997 that led to the
President's Task Force on health risks and safety risks to children and the planning of a national
longitudinal study of environmental influences on children's health and development. Passage of the
Children's Health Act of 2000 authorized the Director of the National Institute of Child Health and
Human Development, together with EPA, CDC, and later NIEHS, to conduct this study.

The National Children's Study is a high quality longitudinal study of children, their families, and their
environment, involving about 100,000 participants from before birth to adulthood.  This study is national
in scope and will look at a common range of environmental exposures; less common  outcomes (such as
autism); chemical, biological, and psychosocial factors; and the basic mechanisms of developmental
disorders, environmental risk, and genetic expression.  State-of-the-art technology will be used to track,
measure, and manage data.

This study will address issues important to health risk assessment, including understanding of the
following:

•   Ranges/types of exposures throughout development

•   Role of environmental factors in children's health

•   Contribution of exposure to the burden of disease in children

•   Long-term health effects from early exposure of children and their parents, such as asthma, cancer,
    cardiovascular disease, obesity, diabetes, and neurologic conditions that are all linked to
    environmental factors in early development.

The study will also address factors that alter susceptibility to environmental agents, immune deficiencies
and increased risk of asthma, early allergen exposures enhancing immune function, differences in
responses to environmental exposures by age or life stage, effects of aggregate exposures to a chemical or
cumulative exposures, and disparities in health outcomes due to race, ethnicity, poverty, housing, etc.

Study efforts began in 2000 with a pilot study and methods development work.  A 5-year planning effort
is underway with the full study anticipated to begin at vanguard centers by mid-2006. Current activities
involve the finalization of specific hypotheses and development of the study design.  Once the full study
begins, data analysis will occur regularly, with data distribution anticipated to begin by 2009-2010. The
study will continue to collect data through 2030.

Protocol development has focused on when to contact participants and collect samples, what measures to
look at, sample design, and selection of geographic locations and participants. Timing of recruitment is
another aspect of protocol development. The desire is to acquire participants at the preconception stage
(helpful to establish conditions before pregnancy), early in pregnancy (less than 4 weeks), and later in
pregnancy.

A number of workshops have been held on specific issues to consider in this study and how to measure
them, such as mild traumatic brain injury incidence/outcome, placental measurements, psychosocial stress
effects on pregnancy and the infant, physical activity, herbal and dietary supplements, effects of media on
activity/behavior, impact of rurality, sampling design, estimating date of conception,  possible roles for
inclusion of a study of cancer markers,  measurement of maternal/fetal infection and inflammation,
questionnaire and diary-based methods for early assessment of asthma-related health outcomes, gene-
environment interaction, and the regulation of behavior. These activities are ongoing and there will be
additional workshops.

Methods development activities  and pilot studies are focused on developing low-cost, low-burden
methods and alternative exposure measurement (validation) designs, methods for newborn assessment,
utility of frozen breast milk, feasibility  of using 3-dimensional ultrasounds for fetal assessment, methods
to elicit community involvement, subject recruitment and retention, and lessons learned from
EPA/NIEHS children's environmental health center research on outcomes and exposure useful to this
study.

The information technology infrastructure is in development and the scientific support from NIH and
EPA is in place. Procurements are anticipated to be issued and awarded during the next year for a
clinical/data coordinating center, biological/environmental sample studies, laboratory services, and
vanguard sites.

Approximately $20 million was allocated for the planning phase through FY03.  Startup (FY04-05) has a
budget of approximately $12 million for this year with $26 million estimated to be needed for FY05.
Implementation during FY05 through FY29 is estimated at $2.7 billion over the 24-year period.
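As a rough cross-check of the figures above, the implied average annual cost of the implementation phase can be computed directly; this is a simple arithmetic sketch using only the totals and the period stated in the text.

```python
# Budget figures quoted in the text (USD).
planning_through_fy03 = 20_000_000
startup_fy04 = 12_000_000
startup_fy05_estimate = 26_000_000
implementation_total = 2_700_000_000   # FY05 through FY29
implementation_years = 24              # period as stated in the text

# Average annual implementation cost implied by the stated total.
avg_annual = implementation_total / implementation_years
print(f"~${avg_annual / 1e6:.1f} million per year")  # ~$112.5 million per year
```

This makes plain that the implementation phase dwarfs the roughly $58 million planning and startup phases combined.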

Additional information may be found at the Web site at http://NationalChildrensStudy.gov.  Interested
parties may join the listserv for news and communication.

Wrap Up and Discussion
The speakers had an opportunity to address questions from the audience.

A brief question and answer period addressed a range of topics. These included:  (1) inclusion of indoor
and outdoor air in the Children's Health Study; (2) the correlation of vaccinations to children's health and
potential side effects, which is not a primary question of the Children's Health Study; (3) opportunities
for involvement in the Children's Health Study and its workshops with suggestions to contact the study
developers via the Web site; and (4) an August time frame for the Science Forum proceedings to become
publicly available.

Sustainability - Educating for the Future
Following opening remarks by Alan Hecht, Director of Sustainable Development within ORD, four
speakers addressed current initiatives in sustainability education.  An audience question and answer
period followed the presentations.

Education for Sustainability Initiatives

Alan Hecht, Director of Sustainable Development within ORD, discussed education and capacity building
within environmental sectors. According to Dr. Paul Gilman, ORD Assistant Administrator and EPA
Science Advisor, "There is nothing more essential in moving toward the long-term goal of sustainability
than teaching the next generation how to incorporate sustainability principles into their work."

Part of ORD's mission is to further information and education. Training the next generation and sharing
information today is essential and requires that today's decisions be based on sound science  and research.
ORD education and sustainability initiatives are based on the themes of capacity building, professional
training, Kindergarten through Grade 12 foundations, and public education and communication. One
huge step in achieving educational and sustainability goals was ORD's participation in the World Summit
on Sustainable Development in Johannesburg, South Africa, in 2002. This was an important event
because the Institute for Sustainable Development was created during the summit, an effort  led by the
Smithsonian Institution, NOAA, USDA, NASA, and EPA. While Congressional participants and policy
makers determined action plans during the summit, other participants attended educational workshops and
training.  Training manuals and educational materials that could serve as model materials for many
organizations were available to participants.

The world faces many challenges in sustainability:


•   2.8 billion people live on less than $2 per day
•   1.1 billion people lack safe drinking water

•   2 billion people live in areas without access to modern energy supplies

•   2.3 billion people live in areas that lack access to sanitation

•   3 billion additional people by 2050 with 99 percent of these in developing countries.

These statistics result from the lack of information and education, not just technology.

Social marketing can enable improvements in these situations; therefore, non-governmental
organizations, CDC, and other agencies are educating communities around the world on how to maintain
clean water in the home. In Zambia, social marketing resulted in a disinfectant sales increase from 3,558 in
1998 to 1,107,168 in 2002.
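The scale of that increase can be checked with quick arithmetic, using only the two sales figures given above:

```python
# Disinfectant sales figures from the Zambia example in the text.
sales_1998 = 3_558
sales_2002 = 1_107_168

fold_increase = sales_2002 / sales_1998
print(f"Roughly a {fold_increase:.0f}-fold increase in sales")  # Roughly a 311-fold increase in sales
```

In other words, sales grew by more than two orders of magnitude over four years.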

Another initiative involves the United Nations' Decade of Education for Sustainable Development whose
goals include:

•   Promote and improve high quality, relevant basic education

•   Reorient existing education policies and programs to address social, environmental, and economic
    knowledge

•   Develop public understanding and awareness of the principles of sustainable development

•   Develop training programs to ensure that all sectors of society have the skills necessary to perform
    their work in a sustainable manner.

To further these efforts, EPA is promoting the following initiatives for the next generation: student
design competition, engineering and chemical school curricula, information and tools kits, university and
schools as centers for sustainable operations, and a Report on the Environment.  One effort in supporting
these next generation goals is the P3 award, which ORD is promoting by sending requests to schools to
design new technologies  to address sustainable development issues.  The United States Peace Corps and
the National Academy of Engineering participated in this effort as well. Submissions from students
have just been received, and the first competition to select award winners will be hosted sometime next
year. This has served as both a learning experience and an educational tool. More information on the
P3 award initiative is available at http://www.epa.gov/P3. Other outreach efforts include:

•   The EPA and the National Science Foundation sponsored workshop at the National Academy of
    Engineering, The Engineer of 2020, Visions of Engineering in the New Century

•   ORD's request for proposals in supporting the effort, Benchmarking the Integration of Sustainability
    into Curricula at Colleges and Universities, to stimulate more discussion on sustainability at United
    States universities and schools

•   ORD's report Science and Technology for Sustainability, P3: Promoting prosperity, benefiting
    people, protecting the planet.

More information on ORD efforts in achieving and promoting sustainability awareness can be found at
http://www.epa.gov/sustainability. This Web site provides a road map for all program offices to provide
information on practicing sustainability, planning for sustainability, and measuring sustainability efforts.

All EPA program offices should be credited for their efforts in education and sustainability awareness.
Their research and data gathering efforts can be seen at the EPA Envirofacts Web site, the one-stop
source for environmental information. Capacity building, training the professionals, and educating the
public are three essential efforts in ensuring sustainability.

Principles and Practice of Sustainability Education in Schools

Jaimie P. Cloud, President of the Sustainability Education Center, Inc., discussed the education of
students on sustainability. The Sustainability Education Center provides a foundation to educate for a
sustainable future. When considering students, the Center hopes to answer the questions:

•   What would students know, be able to do, and be like if they were educated for a sustainable future?

•   What habits of mind will they demonstrate?

•   What core content will they study?

When considering schools, the Center hopes to answer the questions:

•   What are our schools already doing?

•   What do our schools need to do differently?

Environmental and sustainability awareness and education have grown slowly but steadily over the
years, beginning with the publication of Silent Spring in 1962. When considering student education
on sustainability, the Center is considering what habits of mind students will demonstrate, such as:
understanding of systems as the context for decision making, intergenerational responsibility,
understanding the implications and consequences of actions, protecting and enhancing that which we all
hold in common or for which we have common responsibility, awareness of driving forces and their
impacts, assumption of strategic responses (we want youngsters to take responsibility for their actions),
and paradigm shifts.

In order to integrate sustainability awareness, the following core content areas are recommended for
inclusion in the school curricula:

•   Ecological literacy (i.e., mimicking the way the world works)

•   System dynamics and "systems thinking"

•   Multiple perspectives

•   Place (i.e., what lives here with me)

•   Sustainable economics

•   Citizenship (e.g., participation and leadership)

•   Creativity and visioning of the future and the bigger picture.

Within these core content areas, it is important to include environmental science and education,
sustainable economics, and social awareness (e.g., global, ecological design and architecture education,
holistic education, future studies, organizational learning and change, environmental ethics, and
philosophy).

Currently, many schools in the United States are concentrating on state content and performance
standards, applied research on instructional methodologies, improving student experience in the
classroom, and utilizing the community as a resource. Also, there is a growing movement toward authentic
instruction, which enables students to feel that they are doing work and learning skills for a real reason
or for their own education.


To support these efforts, the Sustainability Education Center provides leadership training for school
administrators and teachers, helps to develop curriculum materials, builds capacity, and assesses teacher
and student outcomes resulting from sustainability outreach efforts and research and development activities.
The Sustainability Education Center first based its efforts and activities on global education, but began to
concentrate on sustainability in 1995.  Currently, the Center provides sustainability and other training to
teachers in order to connect standards and the real world in the classroom and to provide information on
the mathematics of global change.  As motivation for teacher participation, the Center's training classes
also provide education credits for teachers, which should positively impact teacher salaries. Another step
for the Center is to develop partnerships with other schools, communities, government, and industry, as
well as a partnership with the Society for Organizational Learning.

Selected teacher learning outcomes resulting from these training classes include changes in knowledge
and attitudes about sustainability, teaching practices, and behaviors related to consumption patterns,
materials cycling, and political involvement. In the future, the Center hopes that selected student learning
outcomes resulting from sustainability awareness and teacher training will include:

•  Understanding the concept of sustainability and its application in business practices

•  Understanding and applying systems thinking into business plans

•  Recognizing the moral and ethical, social, and ecological reasons for sustainable business practices

•  Developing the ability to think critically and systematically

•  Thinking creatively in terms of problem solving and decision making

•  Demonstrating collective respect for oneself and the commons

•  Developing an awareness of human choices and their consequences.

This new method of education needs to cross over to other disciplines (e.g., business), not just science,  in
order to get sustainability to the forefront.  Also, the Federal government is encouraged to provide
continued forums for discussion and mechanisms to coordinate and encourage Federal agencies to support
education for sustainability in Kindergarten through grade 12; create education for sustainability centers
around the  country to  serve as clearing houses for information, research, practitioners,  and to host
conferences and trainings; and provide legislation for funds to encourage sustainability education.

Many organizations currently are supporting these efforts within school systems across the country,
including the Foundation for Our Future, Northwest Environment Watch, New Jersey Sustainable
School's Network, Vermont Department of Education, Creative Change Educational Solutions, Creative
Learning Exchange, National Science Teachers Association, and the Lawrence Hall of Science
(University of California, Berkeley).

More information on the Sustainability Education Center can be found at http://www.sustainabilityed.org.

National  Efforts in Sustainability Education

Alan Elzerman, with Clemson University, discussed  sustainability education, which includes participation
by educational institutions, government agencies, non-government organizations, not-for-profit
organizations, businesses, professional societies, and others. Sustainability goes beyond environmental
education.  There is no single prescription for sustainability, and there is no one curriculum.
Sustainability education is analogous to environmental protection.  There is a need for a global
perspective and, therefore, there is a need to look beyond the United States. In addition, scientists and
engineers must look at systems approaches to address questions about quality of life and how to evaluate
it, as well as whether risk assessment is necessary or sufficient.

The term sustainability means "to meet the needs of the present without compromising the ability of
future generations to meet their own needs" (World Commission on Environment and Development, Our
Common Future, Oxford University Press, 1987). There are a number of other definitions of
sustainability, such as understanding the consequences of our decisions and actions, giving thoughtful
consideration to input from and impact on all systems and people affected, and making rational choices.

In order to achieve sustainability, it is necessary to understand its links with economic security, ecological
integrity, and social equity, which differs from environmental education.

Highlights of sustainability efforts in higher education include:

•   A tremendous variety of structures, characteristics, components, approaches, and compositions of
    education programs

•   Diversity, which requires understanding commonalities, goals, needs, and how to benefit from one
    another

•   Curriculum, core competencies, and professional issues that require discussion, definition, and
    focused efforts.

The objectives of the Council of Environmental Deans and Directors are to ensure that members of
universities understand the environmental, social, and economic impacts of their actions on the world.  It is
intended that the universities will serve as models for sustainability.  There are a number of state
programs supporting these efforts, as well as not-for-profit organizations and non-government groups,
including the National Council for Science and the Environment.

In order to achieve all of these goals, scientists, engineers, professors, teachers, and decision makers must
use creativity, encouragement, guidance, cross-fertilization and coordination, empowerment, and
metadisciplinary education.  They must also communicate sustainability concepts to the public and foster
business leadership of sustainable practices.

Building  Partnerships for Sustainable Science Education

Sally Shuler, with the National Academies  of Science and the Smithsonian Institution, Director of the
National Science Resource Center (NSRC), provided background on the NSRC and suggested several
mechanisms to incorporate environmental education and sustainability awareness into the education
curriculum. The goal of NSRC is to improve the learning and teaching of science in the nation's 16,000
school districts. All students should have access to a research-based program that will lead to improved
attitudes about science.

Approximately 760 school districts have participated in NSRC-supported activities in order to develop
infrastructure changes. Infrastructure changes begin with the teachers and administrators to achieve
improvements in teacher performance and student achievements.  For example, in the State of
Washington, effective  changes in the infrastructure are improving student education in science because of
their involvement with NSRC. Within all school districts, there are several stages for integrating
environmental education and sustainability awareness into  their programs.  The first step is initiation (1
year), then  implementation (5 years), and then institutionalization (10 or more years).

The current state of science education in schools is poor. In the past, most children grew up on farms and
had real interaction with life and death, hard work, etc. and, therefore, understood how science and the
environment affected everyday life. Now, children have no concept of life cycles, science, the
environment, or the general idea of how things work in the world.  The number of children and young
adults pursuing careers as scientists and engineers is steadily declining. Students in the United States rank
last in the world in science education; even the nation's best advanced placement students rank last in the
world when comparing test scores. Therefore, many improvements are needed, starting with
education.

The answer to moving forward and pursuing the vision of improving science education and sustainability
awareness is valuing science and teaching in this country. Partnering with the business community to
bring science to the forefront in the educational curriculum is the first step to achieving this goal. There is
a need to form private and public sector partnerships. Students must increase their interest in science,
teachers must demonstrate the wonder of discovery while helping students master the rigorous content of
science education, and education boards must acknowledge the professionalism of teachers.

What is being suggested is a revolutionary, comprehensive system change: placement of science
education programs in Kindergarten through grade 12, and development of thinking skills and sustainability
themes in student work. There is no longer a need for pilot projects and curriculum building. Efforts
need only focus on what is working now and scale it up to a national level.

Questions and Answers
A brief question and answer period addressed a range of topics.  These included: (1) the need to include
both science education and engineering education in improved curricula in the nation's schools; and (2)
improvements in economics education in Kindergarten through grade 12 curricula.

Partnering with New York on Air Quality and Human Health:  Issues,
Challenges, and Perspectives
Following opening remarks by Val Garcia, with NERL, six speakers addressed the relationship between
air quality and human health.

Federal-State Partnerships for Enhanced Understanding  of Air Quality and Health
Relationships

Dr. S.T. Rao, with NERL, summarized the Agency's current need to understand air quality and human
health relationships. NERL partnered with EPA regions; state, tribal, and local governments; and other
organizations in a collaborative effort for air quality  studies in New York, North Carolina, and the
Western United States. Through these partnerships, NERL hopes to facilitate the use of air quality
planning applications through grid computing, prototype air quality forecasting for PM 2.5 (New York
only), and assist in surveillance of human health-air quality relationships (New York only). This work
also is being supported by other agencies within the Federal government, as well as internal EPA
partnerships.

For the New York pilot study, NOAA and EPA will provide remote access to daily air quality forecast
guidance for ozone. The State of New York will develop local forecasts and inform the public about
mitigation actions, and will apply the Community Multi-Scale Air Quality (CMAQ) model to prototype
predictions of PM 2.5 and other pollutants. In addition, NASA and NOAA will work to improve air
quality modeling and forecasting of PM 2.5.

The actual data used in these modeling efforts are the most critical pieces. In support of these pilot
studies, EPA has a substantial number of air monitoring sites to garner actual data. However, there are
limitations with the data. For example, some measurements are taken weekly for PM 2.5, and the
networks are sparse rather than dense. There are approximately 1,000 stations for monitoring ozone
across the United States, but this is not the case for PM 2.5. Also, the PM 2.5 monitoring sites that do
exist are located only in urban areas because of the higher levels of pollutants expected in these areas.

There are tools and methods that are statistically acceptable to use for modeling with actual data, but
problems exist because of the lack of data stations. Also, there are issues with different data sources (i.e.,
data from different networks) that have been gathered using different methodologies and protocols. In
addition, there are uncertainties such as clouds and other media that interfere with data collection from
satellites. Modeling tools are useful in simulating air quality levels.  However, the usefulness and
efficiency of the modeling tools depend upon model input and data quality.

Enhanced tools (e.g., optimized CMAQ and satellite data) and information technology methodologies
(e.g., grid computing) should be used to improve air quality applications. Air quality applications that
could benefit from such enhancements include policy analysis, state implementation plans, air quality
forecasts, public health tracking, and detecting and tracking progress of the air quality applications.
Gathering and publishing air quality enhancement program information on a routine basis will help the
public as Federal, state, local, and tribal governments, as well as other stakeholders, can then assess the
progress made by the Agency and its partners.

Environmental Public Health Tracking  and  the Public Health Air Surveillance
Evaluation Project

Vickie Boothe, with the CDC National Center for Environmental Health, described CDC's
Environmental Public Health Tracking Program and the Public Health Air Surveillance Evaluation
(PHASE) Project. In response to findings of the Pew Environmental Health Commission stated in the
Environmental Health Review 2001 Report, CDC established the National Environmental Public
Health Tracking Program, two Centers of Excellence (Johns Hopkins University and Tulane University),
created planning and capacity building activities, and established infrastructure enhancement and data
linkage demonstration projects in several states. Most of the funding went to the state partners to obtain
community input on priorities, implement programs, and track progress.

The Environmental Public Health Tracking Program is  an ongoing, systematic evaluation used to track
hazards, exposures, and health effects, and to get this information back to the stakeholders and decision
makers. The Program focuses on chronic diseases and  others with possible environmental etiology (non-
infectious), provides information to address the effects  of exposure and disease, and provides for
surveillance (tracking) of the data rather than data research. The Environmental Public Health Tracking
Program will provide the results, not the causes driving the data.

The conceptual model by which an environmental agent produces adverse effects is as follows:  (1) an
environmental contaminant affects air, water, food, and soil; (2) humans are exposed to these media by
external exposure, absorbed dosage, or targeted organ dosages; and (3) once exposed, humans may suffer
from subclinical effects, morbidity, or mortality related consequences. The Environmental Public Health
Tracking Program will provide hazard tracking (i.e., tracking environmental contamination and
contamination of media types), exposure tracking (i.e.,  tracking how humans are exposed to these
environmental agents), and health effect tracking (i.e., tracking human health effects resulting from
exposures to environmental agents). With this Program, CDC hopes to identify populations at risk;
respond to clusters, outbreaks, and emerging threats; examine relationships between health effects and
hazards; guide and evaluate interventions and prevention efforts; identify, reduce, and/or prevent harmful
environmental risks; and develop and disseminate information to policy makers and the public.

CDC also determined the need to evaluate whether different air quality characterization methods improve
capabilities for tracking environmental public health. Therefore, CDC established the PHASE project to
develop and evaluate alternative air quality characterization methods for environmental public health
tracking.  This involves the study of air pollutants (e.g., ozone and PM 2.5) and health endpoints (e.g.,
acute cardiovascular diseases and asthma). Three state partners support the PHASE project—Maine,
New York, and Wisconsin.

EPA is seeking better ways to measure the success of its programs and these projects  offer new
possibilities for improving the characterization of air quality data.  Such new approaches may improve
our ability to understand relationships between air quality and public health.

NOAA-EPA's National Air Quality Forecast Capability

Paula Davidson, the National Weather Service (NWS) Program Manager for Air Quality Forecasting at
NOAA, described efforts to predict ground-level concentrations of ozone. The initial goal of the National
Air Quality Forecast Capability project is to develop 1-day forecast guidance for ozone.  The guidance
will be developed and validated in the northeastern portion of the United States by September 2004 and
deployed nationwide within 5 years. The intermediate goal (5 to 7 years) is to develop and test
capabilities to forecast PM 2.5 concentrations. The longer-range goal (within 10 years) is to extend the
air quality forecasting range to 48 to 72 hours and to include a broader range of significant pollutants.

The National Air Quality Forecast Capability project is based  on an initial operating system that includes
a linked numerical prediction system, gridded forecast guidance products, verification basis, and customer
outreach and feedback from state and local air quality forecasters as well as public and private sector air
quality constituents. The linked numerical prediction system utilizes data from NOAA and the National
Center for Environmental Prediction that are drawn from weather observations and EPA emissions
inventory data, as well as data from the EPA Data Management Center.  The prediction of ground-level
ozone concentrations will be based on 1-hour averages.

During 2003 to 2004, NWS developed an end-to-end integrated weather-air quality forecast model
system; conducted real-time test runs through September; analyzed system performance and identified
upgrades needed for pre-deployment testing in 2004; and has been testing potential upgrades for
improved performance utility. In the summer of 2004, NWS is preparing to deploy initial operational
capabilities, engage in parallel testing or evaluation of an expanded "developmental" domain, and
experiment with coordination between NWS forecasters and air quality forecasters outside of NWS and
EPA.

Overall, NOAA hopes to produce air quality forecast guidance twice daily.  The experimental forecast
products should be placed on NWS and EPA data servers in the summer of 2004 with gridded data (grids
approximately 12 kilometers  in size) evaluated on an hour-by-hour basis through midnight, providing
data the next day. In September 2004, the products will be provided and tested in the Northeastern United
States, and in 5 years, expanded to a nationwide level.

Further information on the status and progress of the NWS National Air Quality Forecast Capability can
be found at http://www.nws.noaa.gov/ost/air_quality.

Air Quality:  A Regional Perspective

Kenneth Colburn, with the Northeast States for Coordinated Air Use Management, provided a regional
perspective on the use of technology and innovations to better public health and the environment.
Historically, environmental protection grew out of public health concerns. As science and engineering
progressed, environmental protection and public health systems grew further apart and have almost
become stand-alone areas. The outcome of this separation is the lack of traceable environmental public
health tracking. Currently, there are trends in combining and relating these two global concerns, and
Federal, state, tribal, and local governments, as well as not-for-profit organizations and other
stakeholders, have been closing this gap.

The overall vision for EPA is to bridge the gap between the two disciplines by building infrastructures
and partnerships, increasing access to air quality standards data, and providing more
surveillance and accessibility to the states and  air quality forecasters. Implementation is not simple and
requires communication and partnerships among Federal, regional, and state agencies.

For regions, there are issues that must be addressed in order to support the partnerships between Federal
and state agencies. Many air quality problems are increasingly regional, and few states find it possible to
run the CMAQ model and address air quality standards.  Also, greater in-house expertise is needed in
order to analyze, interpret, and apply monitoring and modeling data to exposure assessments. The regions
will need to address:

•   Economies of scale and leveraging scarce resources into broader geographic coverage

•   Consistent, coordinated building blocks of monitoring, modeling, and public health data to  be
    developed under one roof

•   Research, development, and coordination of regional regulatory response strategies

•   Need for some original research.

There is also the concern of being able to identify all areas geographically.  An example involved a
carbon black incident that took place in a small town near Boston, MA.  Air quality forecasters  could not
provide the public with the appropriate information regarding environmental and public health effects of
the incident because the town was not included in air quality surveillance. This brings up the question of
how to interpret, track, and make available environmental and public health data when a specific location
is not on the map in the first place.

In order to address these challenges, the following recommendations were offered:

•   Monitoring sites for environmental public  health tracking and NAAQS

•   Identifying and prioritizing the best regulatory strategies and workplace practices to reduce exposures
    and improve public health

•   Improving the effectiveness of environmental public health messaging

•   Demonstrating public health benefits.

It is important to involve regional organizations to expedite the improvements and advancements desired.
However, government agencies and other parties must target education and intervention programs in
order to ensure success of the effort.  Also, there is the need to develop and integrate public health
accountability practices  "up front" into air quality regulatory processes. Finally, there must be  effective
public health communication that provides information to the public in an understandable manner.

Air Quality Management and Challenges in New York State

Robert Sliwinski, with the New York State Department of Environmental Conservation, provided an
overview of the air quality management program in New York administered by the Division of Air
Resources, which has about 280 technical and scientific staff in nine regional and central offices. The
Division of Air Resources reports air inventories to EPA, maintains and manages a monitoring network of
more than 50 sites that are located in urban and remote rural areas, and conducts measurements of air
toxics and acid deposition as well as criteria pollutant monitoring.  Throughout the State, the Division of
Air Resources controls 25 continuous PM 2.5 monitors and 32 ozone monitors.

The State's criteria pollutants, including lead, sulfur dioxide, carbon monoxide, nitrogen dioxide, and PM
10, all meet NAAQS. Ozone and PM 2.5 are the only pollutants exceeding national standards.
Designation of the attainment/non-attainment areas is ongoing with the State Implementation Plans due in
2007 and 2008.

The State of New York has many air pollution control programs, including Reasonably Available Control
Technology, Low Emission Vehicle, Inspection/Maintenance, Title IV, NOx State Implementation  Plan
Call, Summer Fuels for Upstate, Governor's Acid Rain Initiative (2005 to 2006 timeframe),  Regional
Greenhouse Gas Initiative, and the Ozone Transport Commission model rules for volatile organic
compounds and NOx.  Another big initiative involves programs targeting dry cleaners to help use closed
loop systems, decrease the amounts  of harmful chemicals used, and apply better chemical disposal
methods.

The State of New York also faces a few challenges with their air quality work. Currently, the State has
recognized that pollutant transport into New York is a major concern. This requires an assessment of the
8-hour ozone concentrations and PM 2.5 issues. There is a lack of data on PM 2.5, which makes it
difficult to control.  In addition, there is an increased effort in gathering toxics emissions to address acute
and epidemiological problems.  The Division of Air Resources needs to maintain a balanced approach
between monitoring air toxics and criteria pollutants under the proposed regulatory framework.
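
The 8-hour ozone assessment mentioned above is a running-average calculation. The sketch below shows how a daily maximum 8-hour average is obtained from hourly measurements and compared against the 0.08 ppm 8-hour ozone standard; the hourly values are hypothetical, not New York monitoring data.

```python
# Hypothetical hourly ozone concentrations (ppm) for one day; the 8-hour
# assessment uses the highest 8-hour running average of the day.
hourly_ozone = [0.021, 0.025, 0.030, 0.041, 0.055, 0.068, 0.074, 0.081,
                0.086, 0.089, 0.084, 0.078, 0.069, 0.058, 0.047, 0.039,
                0.033, 0.028, 0.024, 0.022, 0.021, 0.020, 0.019, 0.019]

window = 8
averages = [sum(hourly_ozone[i:i + window]) / window
            for i in range(len(hourly_ozone) - window + 1)]
daily_max_8hr = max(averages)

exceeds = daily_max_8hr > 0.08   # 8-hour ozone NAAQS level, 0.08 ppm
print(f"daily max 8-hour average = {daily_max_8hr:.3f} ppm, exceeds: {exceeds}")
```

In the actual designation process the daily maxima are further averaged over multiple years to form a design value, so a single day's result like this one is only the first step.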

In the future, the State of New York may see a reduction in funds for air quality monitoring programs,
which would require the Division of Air Resources to focus on only a few  criteria pollutants. Therefore,
New York, as well as other states, may not be able to maintain current or future monitoring systems.
With this in mind, modeling, not monitoring, will be the answer to air quality concerns.

Health Activities Related to Air

Dr. Nancy Kim, with the New York Department of Health, provided an overview of the Department's
activities in outreach and education, responding to  health concerns, conducting research, and establishing
an environmental public health tracking system.

Outreach and education activities include efforts with CDC over several years to determine and
communicate exceedances of standards. The New York Departments of Environmental Conservation and
Health provide this information in press releases and notifications to the public through State, regional,
and local health department contacts; the New York Department of Education; and other interested
individuals.  The Department also provides Web publication of health effects and health trends so that the
public has access to this information. Other State agencies can use this information as well;  for example,
the New York Department of Transportation uses this information to change work practices  if there are
air quality warnings to consider.

Research activities include a study of air contaminants and acute asthma attacks in urban areas, the
Breathe Easy Project, and studies of childhood asthma hospitalization trends and ambient air sulfur
dioxide concerns in the Bronx in New York City.

The Department of Health's efforts in responding to health concerns involve addressing unscheduled air
releases, odor and visible emissions complaints, and requests for health studies.
For example, in one incident of reported fog and visible emissions from a nearby manufacturing plant,
residents of a small town participated in a community study and used the Bucket Brigade method (5-
gallon buckets, pumps, and Tedlar bags) to obtain samples to document emissions episodes.  The results
of the study showed that concentrations in samples taken with the Bucket Brigade method were higher than the annual
average values for methylene chloride, toluene, ethylbenzene, and xylenes. Another example involves
concerns that exposure to operations at a nearby stone quarry could result in cancer, asthma, lupus, and
other auto-immune diseases. The Department of Health conducted two incidence studies and found
trends in cancer incidences and hospital admissions for asthma and other respiratory diseases.

The Department of Health's environmental public health tracking activities include the following:

•   Linking ambient air pollution data to reproductive outcomes, asthma development, and childhood
    death in focused study areas

•   Investigating the association between health outcomes and air pollution throughout the State

•   Tracking the space-time patterns in asthma and air pollution throughout the State

•   Using maps, graphs, and spatial statistics to identify patterns and trends in time and space

•   Testing for associations between air pollution and asthma hospitalizations using regression and other
    statistical models.
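
The last bullet above can be illustrated with a minimal sketch. The code fits a log-linear Poisson regression of daily hospitalization counts on a pollutant level, a standard model form for this kind of association testing; the data, effect size, and variable names are hypothetical illustrations, not the Department's analyses.

```python
import numpy as np

# Hypothetical data: one year of daily PM 2.5 levels and asthma
# hospitalization counts (NOT actual New York data).
rng = np.random.default_rng(0)
n_days = 365
pm25 = rng.uniform(5.0, 40.0, n_days)             # daily PM 2.5, ug/m3
true_beta = np.array([1.0, 0.03])                 # assumed intercept, slope
counts = rng.poisson(np.exp(true_beta[0] + true_beta[1] * pm25))

# Fit log(E[count]) = b0 + b1 * pm25 by iteratively reweighted least
# squares, the standard fitting algorithm for Poisson regression.
X = np.column_stack([np.ones(n_days), pm25])
beta = np.array([np.log(counts.mean()), 0.0])     # reasonable starting point
for _ in range(25):
    mu = np.exp(X @ beta)                         # current fitted means
    z = X @ beta + (counts - mu) / mu             # working response
    beta = np.linalg.solve(X.T @ (mu[:, None] * X), X.T @ (mu * z))

rate_ratio = np.exp(10.0 * beta[1])               # risk per 10 ug/m3 increase
print(f"slope={beta[1]:.4f}, rate ratio per 10 ug/m3 = {rate_ratio:.2f}")
```

A rate ratio above 1 would indicate more hospitalizations on higher-pollution days; a real analysis would also adjust for weather, season, and day-of-week effects, as the bullet on regression and other statistical models implies.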

State-of-the-Science Research on Swimming-Associated Health
Effects and the Translation of Health Data to Water Quality Guidelines
for Bathing  Beaches
Following opening remarks by Alfred Dufour, with NERL, three speakers addressed research and
epidemiology studies of human health impacts from the use of recreational waters. An audience question
and answer period followed the presentations.

The National Environmental and Epidemiologic Assessment of Recreational
Water: The Relationship Between Novel Indicators of Water Quality and Health

Tim Wade, with NHEERL, introduced the National Environmental and Epidemiologic Assessment of
Recreational Water project supported by ORD, with participation by NHEERL, NERL, and CDC. This
project hopes to determine whether there is an association between illness  and recreational water quality
as measured by rapid methods of determining water quality (i.e., methodologies supplying results in less
than 2 hours).

 Prompted by the BEACH Act of 2000, NHEERL was tasked to determine microbial indicators for beach
water quality, develop efficient protocols for monitoring, assess human health risks, and provide guidance
to beach managers. The overall approach to the National Environmental and Epidemiologic Assessment
of Recreational Water project includes water sampling during the summer weekends, and interviewing
and surveying beach-goers to establish background health assessments. Participants were contacted by
telephone 12 days after initial  contact in order to evaluate their health conditions.

 Two fresh water beach sites on the Great Lakes were involved in the 2003 study: one near the Great
 Lakes National Park on Lake Michigan and one near Cleveland, OH, on Lake Erie. The sites are
 examples of fresh water sources vulnerable to point-source human contamination. They also have large
 populations.  Researchers did not target any specific population or ethnic group during interviews and
 surveys.

 Testing of the fresh water occurred at 8am, 11am, and 3pm, and samples were taken at two different
 depths at each site—0.3 meters and 1.0 meters. Six samples were collected at each time during the day.
 Methods used for water quality testing included Enterococci Method 1600 and a DNA-based quantitative
 (real-time) polymerase chain reaction (QPCR) method for enterococci and bacteroides. Both methods
 used intestinal tract bacteria.

 Researchers also categorized human risk exposures as "any contact" with water, "body contact" meaning
 that the body is immersed in water, and "head under" meaning that the total body, including the head, is
 under water.  Health outcomes identified in this study include gastrointestinal illness, skin rash, ear ache,
 eye irritation, and respiratory illness.

 Event sampling, as well as interviewing and sampling took place from May 31, 2003 to August 3, 2003 at
 Lake Michigan  and from July 27, 2003 to September 14, 2003 at Lake Erie. The  Lake Michigan study
 resulted in 20 days of surveying and 2,877 completed interviews, while the Lake Erie study resulted in 16
 days of surveying and 2,840 completed interviews. The studies had 67 percent and 60 percent
 completion rates, respectively.

 Data analysis showed that for most days of the study the indicator concentrations  were well below the
 geometric means. At Lake Michigan, there were only 3 out of 20 days where exceedances of the current
 fresh water standards (33 colony forming units/100ml) occurred. At Lake Erie, the water quality was
 worse. A total of 6 out of 16 days of sampling had exceedances of the current standards.  Also, results
 from the QPCR (measuring bacterial DNA) and enterococci Method 1600 (measuring cultured enterococci
 colonies) laboratory protocols were well correlated, but not exact.
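
The exceedance test described above reduces to a geometric mean of enterococci counts compared against the 33 CFU/100 mL fresh water criterion, which can be sketched in a few lines. The sample values here are hypothetical, not measurements from the study.

```python
import math

# Hypothetical enterococci counts (CFU/100 mL) from six daily samples.
samples = [12, 48, 7, 95, 21, 33]

# Geometric mean: exponential of the mean of the log-transformed counts.
geo_mean = math.exp(sum(math.log(c) for c in samples) / len(samples))

exceeds_standard = geo_mean > 33.0   # current fresh water criterion
print(f"geometric mean = {geo_mean:.1f} CFU/100 mL, exceedance: {exceeds_standard}")
```

The geometric mean damps the influence of a single high sample (here 95), which is why it is preferred over the arithmetic mean for bacterial count data.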

 Finally, survey results  of the two sites showed that swimmers engaged in recreational activities at very
 different levels.  For example, at Lake Michigan, 75 percent of the  swimmers had "any contact" with the
 fresh water, whereas only 46 percent of the swimmers did at Lake Erie. "Body contact" with the fresh
 water differed too.  At Lake Michigan, 58 percent of the swimmers had "body contact" with the fresh
 water, whereas only 27 percent of the swimmers did at Lake Erie.

 This study found that there is increased risk for illness for swimmers over non-swimmers, with swimmers
 2.2 times more likely to have gastrointestinal illness.  Rash was also commonly associated with
 swimming.
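
The 2.2 figure above is a relative risk, the ratio of the illness rate among swimmers to the rate among non-swimmers. As a sketch, with counts chosen only to reproduce the reported ratio (not taken from the study data):

```python
# Hypothetical illustrative counts, NOT the study's actual data.
ill_swimmers, total_swimmers = 110, 2000
ill_non_swimmers, total_non_swimmers = 25, 1000

# Relative risk = illness rate among the exposed (swimmers)
# divided by the rate among the unexposed (non-swimmers).
relative_risk = (ill_swimmers / total_swimmers) / (ill_non_swimmers / total_non_swimmers)
print(f"relative risk = {relative_risk:.1f}")   # -> 2.2
```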

 There is preliminary evidence that QPCR appears to be a promising predictor of gastrointestinal illnesses
 from fresh water exposure. Trends were not observed for respiratory illnesses or for rash, ear ache, or eye
 ailments, but more data may be necessary to evaluate  this further.

 Epidemiology Study of Swimmers in Nonpoint Source Polluted  Marine
 Recreational Waters from San Diego, California

 Kenneth Schiff,  Deputy Director of the Southern California Coastal Water Research Project (SCCWRP),
 and Jack Colford, with the University of California at Berkeley, discussed the complexities of non-point
 source pollution (i.e., animal contamination) in marine recreational waters. Southern California has a
tremendous amount of beach use with approximately 175 million beach-goers visiting the area's beaches
yearly. The growing tourist population brings in money, and money brings more industry and
business. Therefore, there is a constant effort to keep the waters clean. The State of California spends
$3 million each year on beach monitoring.

Another consequence of densely populated areas is a large volume of sewage.  Most of the treated
sewage is discharged more than five miles offshore and has little effect on the marine recreational
waters. The questions are where the contamination is coming from and whether it is from animal waste.

Because there are a number of announced beach closings, there is an effort to find out the causes and
effects of the contamination seen in Southern California beaches. One beach highlighted in this
discussion was Mission Bay, which is a heavily used aquatic park that has no discharges. However,
Mission Bay had more than 100 days of beach postings for contamination in 1998. Mission Bay has
numerous storm  water drains,  and wildlife (e.g., birds) serve as another possible non-point source of
contamination.

SCCWRP aims to answer the following  questions:
•   Is there a health risk of swimming in Mission Bay? This question can be answered by  comparing the
    exposure level and health  of swimmers versus non-swimmers at Mission Bay.

•   Can we relate the health risk to traditional health indicator (e.g., bacterial indicator) concentrations
    and non-traditional health indicators (e.g., virus and phage, as well as bacteroides and enterococci)?

The first step in this research involved a pilot study to determine when and where most swimming occurs.
The design of this coastal water project mimics the National Epidemiology Study and requires use of
intensive water quality measurements to describe exposure.  SCCWRP utilizes an almost identical
questionnaire in  order to compare study results to other beach studies.

The pilot study included helicopter surveys of swimming activity and a focus on six beaches to represent
over 75 percent of the swimming activity.  Because historical water quality trends are unpredictable in
space and time, sampling was conducted at all times of the day. At each of the six sites,
project researchers set up study centers and approached beach-goers with a short questionnaire. After
obtaining contact information for study volunteers, project researchers contacted the volunteers within 2
weeks to ask about illnesses possibly resulting from their recreational water exposure and to complete a
30-minute questionnaire.

At each site, project researchers visited every weekend and holiday during the summer in order to get a
good number of people to participate. The goal was to involve 8,000 beach-goers during the summer.
Researchers achieved about 70 percent participation from the volunteers from beginning to completion of
the entire project.

Of the study participants, approximately 5,000 were swimmers and approximately 600 were non-swimmers.
The participants were largely Hispanic, followed by Anglo-Saxon, and then others.

Study results showed that peak swimming times occurred between 12 p.m. and 4 p.m. Therefore, peak sampling
events occurred during the same period. The researchers sampled hourly from 12:30 p.m. to 3:30 p.m., and
also obtained grab samples (e.g., single beach composite samples) at 12:30 p.m.

Preliminary water quality analysis showed that most of the study beaches were clean, based on
enterococci concentrations. Also,  researchers found that beach water quality suffered during peak
vacation or holiday times (e.g., July 4) and that there was a random generation of contamination from non-
point source pollutants.

The next steps for the SCCWRP are to examine other factors contributing to the random generation of
contamination from non-point source pollutants and to attempt to determine if an actual point source leads
to contamination.  For example, results show that there is an increase in enterococci concentrations that
directly correlates to an increase in the number of people at the beach.  In the next phase of this project,
researchers will address whether higher values of enterococci only result when there are more people at
the beach, and whether the higher values result from accidental fecal releases by humans or from
more animal waste (e.g., seagull waste), since humans bring more food to the area, which could lead to
more food for the seagulls.

Also, researchers must compare the uses of indicators in this marine recreational water study and address
whether study results indicating human health effects and illnesses depend on traditional bacterial
indicators, such as E. coli and enterococci, or newer bacterial indicators, such as bacteroides.  In the next
phase, researchers will attempt to determine whether using different sampling and laboratory methods affects
study outcomes. Questions to consider include whether the laboratories should rely on membrane filter or
IDEXX methodologies and whether researchers would benefit from using composite samples.

Additional steps in this research project include the commencement of epidemiological data analyses in
August 2004, a presentation of SCCWRP at the National Beach Conference in October 2004, and
completion of the final SCCWRP report in December 2004. More information on SCCWRP may be
found at http://www.sccwrp.org.

Partnerships: Linking EPA Beach Research with State and Local Beach Programs

Rick Hoffmann, with the Office of Water, reviewed EPA efforts to address the requirements of the Beach
Act to improve quality within the United States beach waters. EPA was prompted to tackle the beach
water issues because of inconsistent recreational water monitoring and notifications provided to the
public. The Agency was also concerned that there were inconsistent standards among the
nation's beaches and trends in illnesses reported due to human exposure to pathogens during
swimming activities.

The Beach Act amended the Clean Water Act on October 10, 2000, by adding Section 406 to strengthen
the existing water program at the Agency. As a result, EPA created its Beach Program to ensure that
there are consistent standards for pathogens, make available grants and funds to enable states to
participate in studies and conduct research, provide guidance and performance criteria for state programs,
and improve data collection, electronic data transfer, and public reporting.  In general, the Beach  Program
is tasked to reduce the risk of infection to users of recreational waters through improvements in
recreational water programs, communication, scientific advances, and research.

All coastal states of the United States and its territories must adhere to regulations under the Beach Act.
Therefore, states must adopt water quality criteria for coastal recreation waters  as published by EPA for
E. coli and enterococci. EPA also must propose regulations for those states that did not adopt the
standards or did not submit more stringent standards to EPA for approval by April 10, 2004.

The Beach Program uses an indicators concept to correlate pathogens and acute gastrointestinal illnesses.
Fecal indicator levels (such as levels of bacteria E. coli and enterococci) are measured and used to
determine the pathogens that cause acute gastrointestinal illnesses, drawing on historic trends that link
specific bacteria and pathogens to illness. Study findings and results will soon become available. Based
on this research and these results, EPA is tasked with publishing the new criteria by November 10, 2005.

With regard to the Beach Program's monitoring and notification requirements, states and tribes are eligible
to receive Beach Program grants to monitor local beaches and to notify the public when water quality
standards are exceeded.  The Office of Water also is planning to develop program guidance using ORD
and other research to include recommended sampling depths, frequency of sampling events, etc.  Public
notification plans and procedures also will be included in the program guidance documents.

In addition, the Beach Act requires EPA to collect, store, and display beach data in a public database.
Information technology  developments to achieve this requirement include the creation of a national
database to store beach program monitoring, notification, standards, and grants information, and the
creation of an Internet display that will allow users to navigate through and understand this information
easily. Data integration is a necessary process in this step.  Database functions will rely on other EPA
program links, and the Office of Water will use the Waters Architecture Internet display to house this
information. For example, the Beach Program database can utilize data from National Pollutant
Discharge Elimination System (NPDES) permits, assessments and listings of criteria data (e.g., general
monitoring guidance, TMDL, and 303(d) and 305(b) monitoring reports), planning documents, and local
beach information in real time.

More information on EPA's Beach Program, including the EPA Clean Beaches Plan for 2004, grant
information, water quality standards, and local beach data, can be found at http://www.epa.gov/beaches.

Questions and Answers
The speakers had an opportunity to address questions from the audience.

A brief question and answer period addressed a range of topics. These included:  (1) requests for Office
of Water guidance or information regarding beach closings and advisories by states and local
communities; (2) concerns with quality  data for recreational waters that may be prompting additional
epidemiological study research; (3) recommendations on getting states to adhere to new water quality
criteria and guidelines, from an  EPA Regional perspective;  (4) composite and batch sampling sites
utilized in the SCCWRP; (5) use of IDEXX, culture laboratory testing, and plate  laboratory testing in
SCCWRP; and (6) consideration of younger populations (i.e., children) when predicting or evaluating
illnesses, such as earaches, resulting from exposure to recreational waters.

Section IV: Using Science to Make a Difference
Wednesday and Thursday, June 2-3, 2004
The purpose of this breakout session on the second and third days of the meeting was to focus on regional
research being conducted regarding invasive species, monitoring and assessment to protect tribal health,
the application of indicators, ecological forecasting, air toxics, and climate change issues.  Each session
included a panel discussion or opportunities to respond to audience questions that provided additional
information and insight on a variety of regional topics.

Michael Slimak, with NCEA, led a session addressing EPA's role in invasive species research and
management. Presentations included overviews of invasive species  and aquatic nuisance species
research, marine bioinvader rapid assessment surveys, descriptions of genomic approaches to screening
for invasive species, and an evaluation of non-native oysters in the Chesapeake Bay.

Valerie Bataille, with EPA Region I, led a session addressing monitoring and assessment projects
sensitive to tribal-specific concerns. Presentations included identification of Native American exposure
pathways, monitoring as a means to better understand mercury fate/transport, and a coastal waters study.

Brian Hill, with NHEERL, led a session addressing the Regional Environmental Monitoring and
Assessment Program and the application of EMAP indicators. Presentations included an overview of the
regional program and regional studies in the southeastern United States and in Maryland.

Rochelle Araujo, with NERL, led a session addressing the use of collaboration to solve environmental
problems. Presentations included lessons learned from the Chesapeake Bay Program, an overview of
collaborative science efforts in the Great Lakes, and an evaluation of the role of science, management,
and activism in sustainability of the Gulf of Mexico.

Dr. Betsy Smith, the ReVA Program Director at NERL, led a session addressing current and future
regional risks.  Presentations included ecological forecasting and its applications, approaches for
projecting land use change and resultant ecological vulnerability, a statistical groundwater model,
methods to forecast species distribution, and forecasting land cover changes to assess risk/vulnerability.

Tom Baugh, with EPA Region IV, led a session addressing the Regional Research Partnership Program.
Presentations included the use of microbial source tracking, the use of land cover diversity as a proxy for
biodiversity, and the relationship of terrestrial ecosystems to manganese emissions from wood burning.

Henry Topper, with the Office of Pollution Prevention and Toxics, led a session addressing air toxics at
the local level. Presentations included development of an air toxics  emission inventory and reduction
strategy, and descriptions of air toxics programs in St. Louis, MO, Louisville, KY, and Mobile, AL.

Michael Slimak, with NCEA, led a session addressing climate change issues. Presentations included an
evaluation of the feasibility of conducting climate change impact assessments, an overview of the role of
science in decision making in the Gulf of Mexico, and alternative approaches to  climate change
assessments.

Can You Hear Us Now?  EPA's Role in Invasive Species Research and
Management
Following opening remarks by Michael Slimak, with NCEA, five speakers provided an overview of
invasive species, ongoing and planned aquatic nuisance species research, marine bioinvader rapid
assessment surveys, genomic approaches to screening for invasive species, and non-native oysters in the
Chesapeake Bay.

Snakeheads, Green Crabs, and Other Nasty Things:  An Overview of Invasive
Species

Dr. Henry Lee II, Chair of EPA's Nonindigenous Species Working Group, discussed sources of invasive
species; their direct and indirect ecological, economic, and regulatory effects; recent research findings;
and areas for future work.  Snakehead fish invaded the Maryland area last year. Green crabs from Europe
invaded the East Coast over 100 years ago and the West Coast 10 years ago; on the West Coast, they have
now spread from San Francisco to Puget Sound. Thinking about invasive species requires breaking out of
the pollutant-centric mindset since pollutants break down while invasive species increase rapidly over
time. The presence of an invasive species can increase the abundance of other invasive species.

Ballast water discharge is a major source of invasive species, and an International Treaty on Ballast Water
Treatment is underway. Other sources of invasive species include ship fouling and drilling platforms,
recreational boating, aquaculture, stocking of fish and shellfish, canals and water diversions, aquarium
and horticulture industries, the live seafood industry, research facilities and public aquaria, and habitat
restoration and dredging.

Invasion rates in the San Francisco Estuary are still rising, which is not surprising given the rate of
international trade over the last 10 years. Although aquatic invaders are harder to control, some sites have
had at least partial success with eradication or management of aquatic invaders. In some cases, such as
the case of Caulerpa taxifolia in southern California, the rapid response was completed within a month.
Other response efforts took significantly longer.

Invasive species have direct and indirect ecological effects. The general consensus is that they are the
second most important cause for declines in both biodiversity and endangered species, and that they may
be the most important cause for declines in lake biodiversity by 2100. Furthermore, invasive species can
fundamentally alter ecosystem processes, such as nutrient fluxes and sedimentation.  For example, plant
invaders in riparian zones are nitrogen fixers.

The economic costs of invasive species totaled $97 billion in damages from 1901 to 1991. The current
predictions are for $137 billion per year in damages and losses in the United States. A significant amount
of damage and loss will be agricultural, but some will also be aquatic. Given the costs of invasive
species, it is economically sensible to control them.

EPA regulations and goals already address invasive species under Executive Orders; NPDES; TMDLs;
the Federal Insecticide, Fungicide, and Rodenticide Act (FIFRA); the Clean Water Act; and the National
Environmental Policy Act (NEPA). For example, the use of an invasive species in a laboratory may
constitute noncompliance with the Invasive Species Executive Order. Another example is the ability to use
the pesticide emergency exception under FIFRA against a new invader.

Spatial pattern research in the San Francisco Estuary and the Small Pacific Coast Estuaries has revealed
that San Francisco Bay subtidal benthos are numerically dominated by non-indigenous species. This
research used cluster analysis to examine assemblages. With the exception of a small, sandy
section, 45 to 90 percent of the species were alien. EMAP data on the relative abundance of non-indigenous
species in small estuaries by biogeographic province show that northern California rivers
appear to be less invaded than San Francisco. Additionally, EMAP data on small estuaries in Oregon,
Washington, and California have shown that non-indigenous species are more widespread stressors than
contaminants or low dissolved oxygen. These small estuaries have never been exposed to ballast water or
agriculture, and yet they are invaded. This is evidence of the need to have a regional, as opposed to a
local, view of such species, and NHEERL is currently working with the USDA on this issue.  Data from
the Pacific Coast Estuarine Information system show similar distributions of native and non-native
species, and also show a pool of invasive species that can potentially invade the rest of the West Coast.

Invasive species directly and indirectly affect the ability to achieve EPA's mandates and goals. However,
EPA has a unique  niche in invasive species research and management. Steps to be taken by the Agency
to develop an understanding of the patterns of invasion in Pacific Coast estuaries at multiple spatial scales
include the development of local, estuary, and regional-scale indicators of invasion; prediction of
vulnerable ecosystems and potential distributions of new invaders; and prediction of the impact of
invasive species on estuarine goods and services.

Office of Water and Aquatic  Nuisance Species: What's Underway and What's
Planned

Diane Regas, head of the Office of Wetlands, Oceans, and Watersheds (OWOW), discussed how
OWOW, other EPA offices, and other Federal agencies are working to develop a strategy to control and
prevent the introduction of aquatic nuisance species.  In addition to invasive species, OWOW
responsibilities include TMDLs and non-point source issues. The U.S. Fish and Wildlife Service, NOAA,
and others are members of a task force created to address aquatic nuisance  species.

The aquatic nuisance species issue is tremendously challenging, because these species have already had a
dramatic impact on our ecosystems.  There are over 298 non-indigenous vertebrates and algae, 109 non-
indigenous species of fish, and 200 species of non-indigenous vascular plants that have become
established in the coastal waters of the United States. Known significant impacts include lower water
tables and reduced water flow.  Other impacts include significant economic effects, potential health
threats, and both economic and  quality of life recreational industry impacts.

OWOW has developed an invasive species action plan to serve as a management tool, and is actively
conducting work on vessels, one of the major sources of invasive species' introductions. Vessel
regulation negotiations are being held on an international scale.  In 1991, the Marine Environment
Protection Committee of the International Maritime Organization began  discussions and developed
guidelines to prevent the introduction of unwanted organisms and pathogens from ship ballast water and
sediment discharges.  In 1997, this group began to develop legally binding requirements for ballast water
management and,  in 2004, this group developed text for a treaty. Ratification will require the support of
30 countries, representing 35 percent of the world's shipping tonnage; however, the position of the United
States on this treaty has not yet been established. The draft treaty involves a control scheme based on
ballast water discharge standards, but the standards are almost three orders of magnitude less stringent
than those preferred by the United States.  Therefore, there is concern that the standards will not be
protective enough. Another drawback to the draft treaty is the limitations placed on the ability of
individual parties to approve the use of treatment systems that use active substances, such as biocides, to
achieve those standards. A better understanding of the effects of those limitations is needed before the
United States government can take a position on the treaty.

OWOW is working with the United  States Coast Guard and Navy on vessel-related rulemakings. The
United States Coast Guard is the lead on developing ballast water standards for vessels, and OWOW is
supporting that effort by lending expertise and helping to develop the Environmental Impact Statement
(EIS).  The United States Navy is working to develop discharge standards for their vessels; this provides
OWOW with the opportunity to work with cutting-edge engineers and technologists to experiment with
strategies and to get the successful ones built into standards.

Once established, invasive species are nearly impossible to eradicate.  Therefore, it is essential to take all
necessary actions to try to prevent or reduce their introduction, limit their spread, and respond rapidly
when they are introduced. To address this, OWOW is identifying barriers to rapid response and is
developing guidelines for states and localities to use when they are trying to navigate Federal guidelines.

A good estimate of the potential impacts of aquatic nuisance species is needed and should include
estimates of national and watershed level economic impacts as these have not been defensibly quantified.
The Office of Policy, Economics, and Innovation will be working with OWOW to develop a framework
for such an economics impact assessment, which is awaiting funding.

Currently, OWOW is investigating which of their projects are funding the use of invasive species, and is
reviewing their regulatory authorities and non-regulatory tools to determine if there are other means not
yet being explored that could help reduce the spread of invasive species.  OWOW is investing in
education and outreach through a number of projects, including traveling exhibits, and the Aquatic
Nuisance Species Task Force has developed an overall strategy for education outreach. OWOW's goal is
to support that task force rather than develop its own path.

There is not a single government entity to address the invasive species issue. Instead, this must be a
collaborative effort on the local, state, and Federal levels, with most of the work conducted at the state
and local levels.  Approximately 15 states have already taken aggressive approaches.  Other areas to
address in the future include:

•   Supporting the needs of those at the forefront of this effort, with tool development considering the
    end user

•   Understanding economic impacts, which is crucial

•   Investing in predictive models

•   Developing tools to provide an understanding of the issue to communities and to understand the
    relative risks in order to convey this information to the public and decision makers

•   Developing new tools for prevention, treatment, and control of ballast water

•   Understanding treatment effectiveness and determining which approaches are going to deliver the
    desired difference.

In addition, scientific and technological analyses are needed to support requests made to invest in and
implement technologies to put onto ships.

Rapid Assessment Surveys in Northeast National Estuaries:  Identifying Marine
Bioinvaders in Fouling  Communities

Dr. Judy Pederson, with the Massachusetts Institute of Technology's Sea Grant Program, discussed a
marine bioinvasion rapid assessment survey, its results, management implications, future directions, and
the need to more effectively communicate the bioinvader issue.  Additional research is needed in order to
understand the life histories and quantify the vectors and potential uses of biological control.

Many vectors are responsible for introduction of bioinvaders.  Intentional vectors include aquaculture,
agriculture, ornamentals, and aquaria pets. Unintentional vectors include ballast water, shipping, escapes,
and spillage. Invasive species create problems that are usually measured in human terms. However, they
may also impact ecosystems.

The Rapid Assessment Survey is used to identify native and non-native species in floating dock
communities. Taxonomic experts visited approximately 20 sites to identify species on the dock and in the
laboratory.  This information is used to support management actions to prevent, reduce, and manage
invasive species. Taxonomic expertise used in the Rapid Assessment Surveys in 2000 and/or 2003
included experts on mollusks, hydroids, peracaridans, tanaids, polychaetes, barnacles, sponges,
bryozoans, sea slugs, tunicates, nemerteans, flatworms, and crustaceans. An interactive map that
identifies the locations where species were identified can be accessed at http://massbay.mit.edu. Surveys,
scientists, and divers provided the information contained in the map.

As a result of the survey, 260 to 300 species were identified.  The Massachusetts and Rhode Island
surveys were conducted consecutively in 2000,  while the New England survey was conducted in 2003.
The percent of total introduced and cryptogenic species ranged from 10 to 20 percent.

Styela clava is an example of one of the invasive species identified as growing in Prince Edward Island
where everything growing on the island is an introduced species.  This organism has major influences and
has caused major problems, and is evidence that species can become something different in different
environments.

Didemnum is a compound tunicate from the Pacific that has been found in Georges Bank gravel beds,
which are prime scallop beds. The affected area is six and a half square miles, 80 percent of which is
covered with the organism. A rapid response could not be conducted at the site.  This example is the first
case of an invasive organism being found in waters of that depth.

Approximately $130 billion is spent annually in the United States to address invasive species in all
ecosystems.  However, there is poor documentation for the marine environment.

Future  research directions include economic studies, ecological studies, communication and outreach,
early detection,  rapid response, regional ballast  water management, and general management issues.

Targeted Screening for Invasive Species in Ballast: Genomic Approaches

Mike Blum,  with NERL, discussed traditional approaches for identifying species found in ballast water, a
targeted screening research project, potential applications of the research, the design of allele-specific
polymerase chain reaction (PCR) primers, and future directions of the targeted screening research.
Ballast water is  the dominant transport vehicle for invasive species.  The frequency or number of
introduction events required for a species to become established is not known. Many of the invasive
species are unknown or cannot be recognized. Green crab and zebra mussels are two species of great
concern in the United States.

Morphological taxonomy is the traditional approach for identifying species found in ballast water.
Classification is dependent on adult traits and requires broad knowledge of major taxonomic groups with
identification typically limited to the family or genus level. Such data do not provide a basis for
comparison across studies, and, therefore, the data have very limited applicability. In addition, larval and
egg forms in ballast water pose a very difficult question since the adult forms do not present themselves
for identification during ballast water examinations.

Bioinformatics coupled with DNA taxonomy is an alternative approach to identify species found in
ballast water. Classification is dependent on genomic variation and does not require broad knowledge of
major taxonomic groups with identification at the species or subspecies level. This approach provides an
objective standard for comparison across studies and, therefore, the data have broad applicability.

Targeted screening research, supported by the Regional Methods Program, is a novel application of allele-
specific PCR methods and DNA sequencing technology.  The research includes the development and
application of bioinformatic databases. The research objectives are the exploratory characterization of
species diversity in ballast water, and targeted screening of ballast water for invasive  species. Potential
applications of the research include:

•   Early detection and monitoring of non-indigenous species of concern, cryptic invasions, and
    introgressive hybridization between non-indigenous and endemic species

•   Assessing compliance with treatment requirements

•   Risk assessment
•   Characterization of invasion events.

This research project was initiated in December 2003 with laboratory work underway on ballast water
samples from the Great Lakes secured in February and May 2004.  Contracts have been drafted to sample
ship traffic between San Francisco, the Columbia River, and Puget Sound. Sampling of Pacific Coast ship
traffic will begin in late summer 2004 and will continue through Spring 2005. Collaboration is necessary
to the success of the project because of the difficulty in gaining access to ballast water samples.

The research is designed to derive sequences from sludge samples by the following sequence of activities:
eggs → purification → amplification → bacterial cloning → sequence development → analysis. Designing
allele-specific PCR primers  for preferential amplification of targeted species or groups of species requires
identification of:

•   Primer binding sites that are identical among individuals within a target group or absent/ineffective
    among members of excluded groups

•   Amplicon gene regions that are consistent within the target group and variable among members of
    different target groups.
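
These two criteria amount to a set-based screen over candidate binding sites: a site must be shared by every member of the target group and absent from every member of the excluded group. As a rough illustration only (the sequences, site length, and function name below are hypothetical, not data or code from the project), the first criterion can be sketched in Python:

```python
# Sketch of the allele-specific primer-site criteria described above:
# keep subsequences identical across all target-group sequences and
# absent from all excluded-group sequences. Sequences are hypothetical.

def candidate_primer_sites(target_seqs, excluded_seqs, site_len=8):
    """Return fixed-length subsequences shared by all target sequences
    and found in none of the excluded sequences."""
    def kmers(seq):
        # All subsequences of length site_len in the sequence
        return {seq[i:i + site_len] for i in range(len(seq) - site_len + 1)}

    shared = set.intersection(*(kmers(s) for s in target_seqs))
    present_elsewhere = set().union(*(kmers(s) for s in excluded_seqs))
    return shared - present_elsewhere

target = ["ACGTACGTTTGACCA", "ACGTACGTTTGACCG"]   # target group
excluded = ["TTTTACGTACAAAAA"]                     # excluded group
print(sorted(candidate_primer_sites(target, excluded, site_len=6)))
```

A real screen would of course also weigh melting temperature, amplicon placement, and the variable gene regions noted above; this sketch only captures the shared-versus-absent site logic.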

Diagnostic markers can be used to differentiate between sister species and hybrids.

Much of the data generated has been deposited into a repository that is accessible on  the Internet.

Molecular approaches such as these can provide powerful tools for exploratory characterization of ballast
water content and targeted screening for species of concern. Additionally, molecular data function as a
common denominator and, therefore, have broad applicability. Future directions for  genomic approaches
to targeted screening  for invasive species in ballast water include:
•   Application of techniques to support early  detection and monitoring programs, and to assess
    compliance with ballast water treatment regulations

•   Development of streamlined molecular tools for detection and monitoring

•   Development and implementation of non-indigenous-species-focused bioinformatics databases

•   Further integration of multidisciplinary data to support risk assessment and vulnerability analyses of
    coastal regions.

Non-Native Oysters in Chesapeake Bay

Michael Fritz, with the EPA Chesapeake Bay Program Office, discussed three options for the
management of non-native oysters in the Chesapeake Bay, guidance for choosing the appropriate
management option, myths associated with the introduction of non-native species, ongoing EIS and field
trials, risk management guidance, and issues for EPA consideration. Over 90 percent of the oysters in the
Chesapeake Bay are from the Gulf of Mexico. In the late 1940s and 1950s, as technology improved,
diseases were discovered in the Chesapeake Bay, which have been attributed to the introduction of an
Asian oyster.  In addition, current restoration efforts are not working.

The first of the three management options addresses the outright prohibition of the use of non-native
oysters in the Chesapeake Bay, either for controlled aquaculture or for deliberate release into open waters.
The long-term risk of such a prohibition is dependent upon the potential success of restoration programs
for the native Eastern oyster. Researchers also addressed the perceived risk of a rogue introduction of a
non-native oyster, a practice that has been compared to bootlegging.

The second management option addresses the contained aquaculture of triploid C. ariakensis.
Containment such as this provides an opportunity to research the potential effects on the ecology of the
Bay of either extensive triploid-based aquaculture or the introduction of reproductive non-native oysters.
The utilization of this sterile oyster management option offers research options as well as economic
benefits for the oyster industry and watermen.

The third management option addresses the  introduction of diploid oysters into the Chesapeake Bay.
Research has concluded that it is impossible to predict the impacts that a controlled introduction of
reproductive C. ariakensis would have on either the oyster fishery or the ecology of the  Chesapeake Bay.

Overall, the research has suggested that the second management option, contained aquaculture, be
utilized, but that it should be considered as only a short-term or interim action undertaken to provide
researchers with the opportunity to obtain the biological and ecological information on the non-native
oyster necessary for risk assessment. This option allows for more management flexibility in the future
depending on the status of the native oyster  and the success of restoration efforts. However, stringent
regulations will be necessary to oversee this kind of development to ensure that it does not result in the
establishment of a self-reproducing population in the Chesapeake Bay region. Such a result would have
significant effects up and down the East Coast.

There is no quick fix for the oyster problem in the Chesapeake Bay, the oysters will not improve the water
quality in the Bay, and the problems are much bigger than any oyster restoration program can resolve.
The research has concluded that native oyster restoration has not failed, but rather has not been given
enough effort. Therefore, it is premature to give up on native oyster alternatives. Additionally, the
research has indicated that the existing regulatory and institutional framework is inadequate.

An EIS is needed before field trials can be furthered. The U.S. Army Corps of Engineers is currently
developing a proposal in collaboration with the State of Maryland and EPA. The purpose of conducting
the EIS is the economic recovery of the fishery.  However, additional benefits include water quality
improvement and reef habitat restoration.  The States of Maryland and Virginia are applying pressure to
complete the EIS in 1 year. However, scientists are predicting a 5-year timeline for the completion of the
research.

Ongoing  field trials include the introduction of 800,000 triploids in Virginia and small-scale in-water
research conducted in Maryland.  Many parameter assumptions were made in the Virginia project due to
58                           EPA SCIENCE FORUM 2004 PROCEEDINGS

the lack of life history parameters, but permit conditions have been set.  Fluorescent dye dispersion
projects will be conducted to determine the dispersal potential from spawning.

The focus of the risk assessment is on the adult oyster population.  A population of greater than two
adults per square meter is necessary for establishment of the species. The areas of greatest uncertainty in
the assessment involve size-specific fecundity varying within the genus, the use of fertilization efficiency
from another phylum, and the possibility that larval dispersal may be non-random.

The Chesapeake Bay Program Office strives to balance accommodation of the experimentation with these
species and keeping the risks at a minimum. The EIS is needed before good judgments about risk
management needs in the future can be made.  Risk will continue to be modeled as it has been up to this
point, using empirical data on size (length/size ratios) and through continued gametogenesis monitoring.
Additionally, the hydrodynamics of dispersal and dispersal evaluations will be applied in the adaptive
risk management.

Issues for EPA consideration include jurisdiction under the Clean Water Act, determination of the
adequate amount of science, determination of an acceptably low level of risk, and long-term restoration in
a short-term world.

Monitoring and Assessment to Protect Tribal Health and Ecosystems
Following comments by Valerie Bataille, with EPA Region I, three speakers addressed the development
of Native American exposure pathways,  monitoring as a means of developing a better understanding of
mercury fate and transport, and a coastal waters study. An audience question and answer period
followed the presentations.

Protection of Tribal Cultural  Practices Through the Development of Native
American Exposure Pathways

Fred Corey, Environmental Director of the Aroostook Band of Micmacs, discussed the reasons that
necessitate the development of Native American-specific exposure pathways, the project approach, and
the expected project results.  The project represents a Direct Implementation Tribal Cooperative
Agreement between EPA and the following five tribes: Aroostook Band of Micmac Indians,  Houlton
Band of Maliseet Indians, Passamaquoddy Tribe Indian Township, Passamaquoddy Tribe Pleasant Point,
and Penobscot Indian Nation. Undertaking this project enabled EPA to fulfill its trust responsibility to
protect the tribal resources, and ensure that tribal lands are suitable for tribal usage.

There are more than 6,000 enrolled tribal members in Maine inhabiting tribal land holdings in excess of
250,000 acres. These land holdings represent different types of ecosystems, including wetlands, uplands,
farmland, developed land, and frontage on rivers, streams, lakes, ponds, and the Atlantic Ocean.  In
Maine, tribal  food, medicinal, spiritual, and recreational practices are linked to water resources.
Therefore, water resource protection is essential to ensure the health and safety of tribal members
engaging in their cultural practices. The use of plants and animals must be safe to ensure the  preservation
of cultural practices.

The purpose of this project is to document the cultural practices and resource utilization patterns of the
five Native American tribes in Maine through the development of multi-pathway exposure scenarios to
support the development of appropriate water quality standards for tribal lands. Typically, EPA focuses
on uses such as drinking water and recreation, and does not consider special tribal uses such as plants,
animals,  and sweat lodges.  Standards are needed that will be protective of such tribal uses, and
development  of exposure scenarios is essential to human health protection. The project seeks to protect
the most vulnerable portions of the tribal population, which are the tribal members who live off the land,
ingest a lot of resources, have a lot of contact with the environment, or are children.

Although the project approach combines some elements of consumption surveys, it relies heavily upon
anthropological research to determine historic tribal natural resource utilization patterns.  To demonstrate
that the approach is scientifically sound and legally defensible, experts will assist in assembling the
information, which will undergo peer review by a tribal panel and EPA risk assessment experts.
However, personal and confidential information will remain proprietary because it is unique to a given
tribe.  There must be a balance between protecting confidential information and providing adequate
information to risk assessors to demonstrate that the approach is scientifically sound.

Consumption survey issues include suppressed consumption associated with fish consumption advisories,
land use constraints, depleted natural resources, social oppression, and economic factors. Specific causes
of suppressed consumption include over-harvesting, land mismanagement, and the presence of
contaminants. The survey does not consider increased exposures associated with exposure from living off
of the land. The goal of the tribes in Maine is to re-establish tribal fisheries and to use the resources in the
way that they were used 500 years ago.

Tribal scenarios or exposure factors intersect in three areas: anthropology or ethnography, ecology, and
toxicology and risk assessment. Anthropology or ethnography describes the use of natural resources in
the context of traditional lifeways. Ecosystems provide insight into what is important in terms of
environmental restoration and what is culturally important. Toxicology and risk assessment are essential
for determining exposure routes and understanding the implications of diet and the frequency, duration,
and intensity of environmental contact.  Examining these three areas enables the calculation of actual
exposures.

The expected results of this project are exposure scenarios indicative of fresh water and marine natural
resource utilization patterns. These scenarios will be used to  develop water quality standards to sustain
the cultural traditions  of the Maine tribes.

Towards a Better Understanding of Mercury Fate and Transport on the Fond du
Lac Reservation: Monitoring Air, Water, Sediments, and Biota

Nancy  Cost, the Fond du Lac Water Project Coordinator, discussed the reasons why mercury is
problematic in the lakes of the Fond du Lac Reservation, the tribal air monitoring program,  sediment
assessments, results of these assessments and other studies pertaining to cultural uses of the natural
resources on the reservation, and areas for future research.  The Fond du Lac Reservation, located
approximately 15 miles inland from Lake Superior, encompasses approximately 40,000 acres of wetlands.
Because these wetlands support the most important natural resources to the tribe, water resource
protection is critical.

Although there are no significant point sources of pollutants,  the boreal forest/wetland ecoregion is
especially sensitive to mercury deposition.  In this ecoregion, ionic and elemental forms of mercury are
more likely to be methylated, creating greater bioavailability to the aquatic food web.  Bioaccumulation in
higher trophic levels has been seen in piscivorous fish, eagles, osprey, loons, kingfishers, mink, otters,
and people.  The aquatic food web is of increasing importance due to the recent resurgence of tribal
members moving back to the Reservation to practice their cultural traditions.

The tribal community relies upon natural and cultural resources such as wild rice, fish, waterfowl, and
game for sustenance.  The use of these resources is compromised by the health concerns associated with
exposure to environmental contaminants. Therefore, it is important that monitoring and protection efforts
acknowledge risks posed by mercury.

Tribal air monitoring has included participation in the National Atmospheric Deposition Program since
January 1997.  Participation in this program provides information on acid deposition and chemistries, and
allows the tribe to review any air permit within 50 miles of the Reservation, including those for some
nearby power plants. Additionally, the tribal air monitoring program includes monitoring for mercury
and methyl mercury in precipitation. The close proximity of the Reservation to academic and Federal
laboratories enables the tribe to take advantage of their expertise and laboratory capabilities.

Two sediment assessments were conducted on the Fond du Lac Reservation.  The first studied 12 lakes in
an effort to characterize sediments; assess contaminant levels in the bioavailable portion for mercury,
polychlorinated biphenyls (PCBs), and lead; and conduct toxicity tests.  Each sample was analyzed for
total mercury and other parameters. However, funding was not adequate to sample for all parameters in
every sample.  An outcome of the study was the development of a sediment quality database. The results
of the study indicate higher total mercury values associated with organic sediments, one-third of the sites
sampled were in the zone where the possibility exists for effects to aquatic biota, and shallow lakes had
consistently higher mercury levels.  Data collected in the database were used to rule out PCBs as a
contaminant of concern, but did not support the elimination of lead from consideration.

The second sediment assessment project studied 12 St. Louis River sites using the same parameters of the
first study with the addition of methyl mercury. Archived samples taken during the first assessment were
also evaluated for methyl mercury and the results were added to the sediment quality database.

Graphical interpretation of the data revealed the following:

•   A significant relationship exists between the presence of high volatile solids and high total mercury

•   Shallow lakes continued to have high mercury levels, and lakes that had both shallow and deep ends
    showed lower mercury levels in the shallow end sediment samples than in the deep end sediment
    samples

•   A relationship exists between the size of the watershed and total mercury in sediment, with larger
    watersheds showing consistently higher mercury levels

•   Lakes that had higher total suspended solids levels also had higher mercury levels in sediment

•   Lakes that were well buffered had low mercury concentrations

•   Mercury levels have no relationship with conductivity and pH.
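
Relationships such as those listed above can be checked numerically as well as graphically. The sketch
below computes a Pearson correlation for volatile solids versus total mercury; the data values are
hypothetical illustrations, not the tribe's measurements:

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical per-site values: volatile solids (%) and total mercury (ng/g dry wt.)
volatile_solids = [5.2, 12.8, 31.0, 44.5, 58.3, 61.0]
total_mercury = [22.0, 61.0, 118.0, 155.0, 198.0, 210.0]

r = pearson(volatile_solids, total_mercury)
print(f"r = {r:.2f}")  # a strongly positive r would mirror the reported relationship
```

A rank-based (Spearman) correlation would be more robust if the relationship is monotonic but
nonlinear, as is often the case with sediment chemistry data.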

In analyzing the data, it is important to note that water quality chemistries differ from site to site. Color is
currently being measured, and there are plans to conduct dissolved oxygen concentration measurements to
ensure that color is a good proxy for dissolved oxygen concentration.

The tribe partnered with the Minnesota Department of Health to study fish  contaminants in an effort to
develop culturally sensitive guidelines for fish consumption. The study targeted species commonly
collected from Reservation waters and eaten by tribal members. Results of the study indicated that
mercury drives consumption restrictions, and PCBs, organochlorine pesticides,  and toxaphene could be
ruled out as consumption restriction drivers.

In partnership with the University of Minnesota, and funded by the Minnesota Sea Grant, the tribe also
conducted a study to determine if the cultural and nutritional benefits of wild foods, as compared to
market alternatives, offset contaminant exposure. One driver for this study is the fact that tribal members
have seen an increase in nutritional and metabolic disorders (e.g., diabetes) since moving away from the
more traditional diets.  The study concentrated on food harvested from the aquatic environments on the
Reservation. Food sources including wild rice, waterfowl, and moose were analyzed for mercury and
lead. The study revealed that waterfowl and fish have comparable mercury levels. Therefore, waterfowl
should be considered in risk assessments.

Recommendations for future research studies include additional waterfowl sampling, continued fish tissue
analysis, continued atmospheric deposition monitoring, and investigations into potential sediment
mercury mitigation techniques.

Primary Production Study of Coastal Waters of the Bay of Fundy

Steve Crawford, with the Pleasant Point Passamaquoddy Environmental Department, discussed the goals
of the Primary Production Study, sources of major impacts on the Quoddy region of the Gulf of Maine,
the resulting impacts to aquatic life and farming,  and methods to measure primary production.  The goals
of the Primary Production Study of the Quoddy Region are to measure and monitor photosynthesis and to
monitor algal species biomass on clam flats to establish a baseline for nutrients.

Aquaculture, non-point sources, industry, and sewage treatment can all be sources of major impacts on
coastal waters, yet there is no baseline. Although species composition data exist, there are no data on
primary production in the area.  Another problem is the philosophy that an impact does not exist if there is
no science to support an impact determination. Defining any environment as being impacted is based on
perspective.

Aquaculture can be a good thing in moderation. However, there are many salmon farms in concentrated
areas with 22,000 metric tons of salmon in the Quoddy Region. This generates 4,000 metric tons of feces
and 2,000 metric tons of uneaten food. Another impact from salmon farming is Slice, a sea lice treatment
that is specific to arthropods; researchers have found this material in shellfish (mussels) at a location 1
mile away from the nearest salmon farm. This discovery of such accumulation in mussels raises the
question of the severity of its impact on copepods.

Sewage treatment has the potential to cause major impacts in coastal waters. For example, 30 percent of
the sewage coming in from St. John is untreated.

Impacts seen thus far include suffocating green slime, increased red tides, altered ecosystems, unknown
chemical pollution (Slice), and eutrophication.  Green slime, which grows in the flats, is made up of over
20 species.  The suffocating effect of the slime has impacted the waters to the point that tribal members
can no  longer make a living harvesting clams because there are too few to harvest. Salmon farms are not
believed to be the cause of green slime. A clam restoration project was undertaken involving the planting
of clams in flowerpots. Although the data are still being analyzed, preliminary results indicate that the
area is no longer useful for growing clams.

Although red phalaropes traditionally migrated into the Quoddy Region by the millions in the month of August,
this migration has not been seen since 1989. Another indication of ecosystem alteration is the
disappearance of cod greater than 24 inches in length, the legal limit for cod tagging. All of the
specimens that have been caught and examined have been healthy but none have been over 24 inches. It
is not known why the large cod  have disappeared.

Another problem exists with the approaches used to measure primary production, which involve the use
of light-dark bottles and chlorophyll A. Each approach has limitations. For example, when using a
spectrophotometer in the laboratory, some plants retain chlorophyll much better than others.  Other
interferences include organic matter.

Panel Discussion/Question and Answers
The speakers had an opportunity to participate in a brief panel discussion drawing upon questions from the
audience.

A brief question and answer period addressed the following topics: (1) the length of the research
programs; (2) potential use of a specific fish species as an indicator because of its tendency to change
color in certain water quality conditions; and (3) the use of exposure pathways in guideline development.

R-EMAP: The Application of EMAP Indicators
Following opening remarks by Brian Hill, with NHEERL, three speakers addressed EMAP applications
to regional projects. A panel discussion including an audience question and answer period followed the
presentations.

The Past,  Present, and Future of the Regional Environmental Monitoring and
Assessment Program

Brian Hill, the National Coordinator of the Regional Environmental Monitoring and Assessment Program
(R-EMAP), discussed the program's approach, design components, regional program histories, and future
research directions. EMAP has been in existence for over a decade, and the goal of R-EMAP is to build
the scientific basis as well as the local, state, and tribal capacity to monitor for status trends in the
condition of the nation's aquatic ecosystems in a cost-effective, scientifically defensible, and
representative manner, utilizing quantifiable trends and supporting performance-based management.

The United States spends more than $650 million per year on environmental monitoring, most of which is
targeted to individual chemicals and to physical conditions at specific sites. Point source problems have
been greatly  reduced; however, conventional monitoring does not address all issues.

The EMAP approach has proven to be cost-effective. An example is the study of eutrophication of lakes in
the Northeastern United States, in which 2,756 non-randomly selected lakes were censused using conventional
monitoring.  EMAP research reached the same conclusions by sampling only 344 lakes.
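
This efficiency follows from survey statistics: the uncertainty of an estimate from a probability sample
depends on the sample size, not on the size of the lake population being characterized. A minimal sketch,
using a hypothetical proportion rather than a figure from the presentation:

```python
import math

def proportion_ci(p_hat, n, z=1.96):
    """Approximate 95% confidence interval for a proportion estimated
    from a simple random sample of size n."""
    se = math.sqrt(p_hat * (1 - p_hat) / n)
    return p_hat - z * se, p_hat + z * se

# Hypothetical: 30% of 344 randomly sampled lakes classed as eutrophic
low, high = proportion_ci(0.30, 344)
print(f"Estimated 30% eutrophic, 95% CI: {low:.1%} to {high:.1%}")
```

With 344 lakes, the margin of error is already under 5 percentage points; censusing thousands of
additional lakes narrows it only marginally.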

The EMAP approach has a sound scientific basis.  More than 600 peer reviewed EMAP publications
exist. While there once was a lot of criticism, the publications are now viewed as valid.

Examples of environmental decisions made using EMAP science include mountain top removal mining
impacts in EPA Region III, the State of Streams Report in Maryland, the revised coho salmon assessment
program in Oregon, and the fish consumption advisory for mercury in Maine.

Although the EMAP approach has been successful, unanswered monitoring questions remain including:
•   How much of our state/national aquatic ecosystems are healthy?

•   Are we targeting the right problems to make a difference?

•   How do we measure trends in the condition of aquatic ecosystems?

•   How do we determine that this is a cost-effective, scientifically-defensible, and credible  approach?

•   How do we aggregate this information from the local to the state and then to the national levels?

The EMAP approach is the only statistically valid approach to determining state and national aquatic
ecosystem conditions.  It has stood the test of time and is scientifically defensible. Reasons for the
validity of the EMAP approach are that it uses biological indicators as integrators of aquatic ecosystem
conditions, establishes measurable baselines for health of aquatic ecosystems and assesses trends in
condition, reduces costs and identifies the most important areas and stressors, and provides monitoring
designs for consistent aggregation of data from local to national levels.

EMAP utilizes multi-tier monitoring designs. The three  tiers are landscape characterization, regional
surveys, and index sites. This scale-defined design allows aggregation and interpretation of monitored
data in a tiered manner.

EMAP research is divided into four categories: index sites, indicators, remote sensing, and geographical
surveys. Index site research includes acid rain effects and Science to Achieve Results (STAR) program
research. Indicator research focuses on biocriteria and STAR program research. Remote sensing research
includes Multi-Resolution Land Characteristics, landscape atlas, and STAR program research.
Geographical surveys include R-EMAP, Mid-Atlantic Integrated Assessment, Western Pilot, Coastal
Initiative, and Great Rivers. The Western Pilot and Coastal Initiative are in their last year of data
collection and analysis. The Great Rivers survey will begin this year.

Since 1993, R-EMAP projects have been conducted in all 10 EPA Regions, with funding of nearly $20
million. Other funding for these projects has amounted to over $27 million. Partnering on these projects
has proven to be very successful for all those involved.

Future directions of EMAP and R-EMAP include adjusting designs to support the 303(d) listing, TMDLs,
and restoration; new efforts to support 305(b) assessments of large, floodplain rivers including designs,
methods, and establishing reference conditions; conducting a National Stream Survey anticipated to begin
in the summer of 2004; and developing designs and methods, and establishing reference conditions for
wetlands and the Great Lakes.

Southeastern Wadeable Streams R-EMAP: Overview,  Interim Findings, and
Status

Dr. Peter Kalla, with EPA Region IV, discussed the Southeastern Wadeable Streams R-EMAP project,
the basis of the sampling process, aquatic parameters, media and analytes,  data interpretation, and
landscape factors. The goals of the Southeastern Wadeable Streams R-EMAP project are to develop
statistical characterization of wadeable streams region-wide; complement state random and non-random
sampling; and be a component of the Regional Ecological Assessment Program. This project supports the
update of the Fish Consumption Advisory database, assessments of beneficial use, and assessments for
watershed restoration action strategies.

The process for this project included the acquisition of R-EMAP sample points; expanding the scope of
data; adding ecoregional reference sites; using contractor support to obtain permission for access,
reconnaissance, sampling, and identification; developing descriptive, spatial, and temporal statistics; and
developing multivariate analyses. Additional sampling parameters were added once the project was
underway. Ecoregional sites recommended by state partners were also examined.

Over the course of this project, researchers developed a better understanding of what should not be
targeted. Out of the 96 streams selected for sampling in 1999, 37 sites were actually sampled.  In 2001,
52 sites out of the originally selected 120 streams were sampled.

Aquatic community parameters included benthic macroinvertebrates, periphyton, and fish.

Graphical interpretation of the data indicates that approximately 55 percent of the streams are below
suboptimal habitat levels.

Aquatic media and analytes included water, forage fish, and periphyton. Water was analyzed for:
conductivity, dissolved oxygen, pH, temperature, nitrogen series, total and dissolved phosphorus, algal
growth potential, and total suspended solids. Forage fish were analyzed for total conventional and whole-
body mercury. Periphyton was analyzed to determine Autotrophic Index levels.

The researchers used low water grab samples, and sampling was conducted during the summer because it
was logistically easier to do so.  Mercury was examined in forage fish (bluegills) and paraffin levels also
were examined. Almost all stream miles showed no conductivity effect.  Specifying sample sources is
important because every random sample taken will have an inflection point. This could be a starting point
for setting a standard.

Mercury values in samples were comparable to mercury values observed in the Everglades.  The data also
indicated with a high degree of certainty that 45 to 65 percent of watersheds are at risk for mercury
contamination.

It is important to consider landscape factors in risk determinations since a stream network integrates
everything that happens on the landscape. Landscape factors for consideration include sub-watershed
land use, riparian cover, channel conditions, road and bridge density, the percentage of impervious area,
and drainage ditch density.

Maryland Biological Stream Survey: Science for Streams

Daniel Howard, with the Maryland Department of Natural Resources, discussed the Maryland Biological
Stream Survey (MBSS), survey designs and methods, management, science for the public, and
information dissemination. The MBSS used both its own methods and those used by the State of
Maryland. Both methods were comparable.  Thirteen sites will be examined this year using EPA and
standard MBSS methods as part of the assessment.

The MBSS relies heavily on GPS. Field crews mark off 75-meter sampling ranges around midpoints
chosen by consultants. Ninety percent of the landowners agreed to sampling on their land in 2000.
However, that percentage has since decreased for reasons unknown.

The MBSS has a random design, and involves sampling of 1st to 4th order fresh water streams stratified by
medium-sized watersheds. Over 2,000 sites are sampled each year, and the sampling cycle is 5 years.
Fish and benthos are the two indicators used in the survey.  Field crews use backpack-based
electroshocking units to conduct a two-pass electrofishing effort in the 75-meter sampling area.  Every
fish caught over one inch is counted and measured. Benthos sampling occurs at the same time as fish
sampling, using an EPA accepted protocol. Volunteers are used in sampling activities.

The physical habitat assessment was conducted using an EPA recommended approach, which included
examination of the condition of banks, the riparian zone,  and the amount and variety of habitat in the
stream from a fish and invertebrates perspective. A limited suite of water chemistry sampling consisted
primarily of acid-related constituents  and nutrients. The majority of the project funding is obtained from
the surcharge  on electric bills in Maryland (i.e., the Environmental Trust Fund). The University of
Maryland's Appalachian Laboratory was used to analyze the samples.

Upstream landscape factors were examined by delineating the catchments upstream and evaluating land
use.  The percent of impervious surface located upstream of all of the sites was estimated to understand
stressors. Land use in Maryland is urban, agricultural, forest, wetlands, and water.

A great amount of effort was spent in developing indicators. The intent of this effort was to relay
information about stream health to managers and the public in a non-technical way. Multimetric
indicators are fish, benthic macroinvertebrates, and physical habitat.

Metrics based on indices of biotic integrity (IBIs) for fish are grouped into three categories, depending on the
location within the State: coastal plain, eastern Piedmont, and highland. Several of the metrics repeat
across the IBIs.  Maryland benthic macroinvertebrate IBIs are divided into the two categories of coastal
plain streams and non-coastal plain streams.  Both fish and benthic IBI scores are rated on a scale from
one to five, with one being very poor and five being good.
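
As an illustration of how multimetric scores of this kind combine, the sketch below scores three
hypothetical metrics at 1, 3, or 5 against reference thresholds and averages them into a site IBI. The
thresholds and narrative cutoffs are invented for the example, not taken from the MBSS:

```python
def score_metric(value, poor_max, good_min):
    """Score one metric as 1, 3, or 5 against reference thresholds
    (higher metric values are assumed better in this sketch)."""
    if value <= poor_max:
        return 1
    if value >= good_min:
        return 5
    return 3

def ibi(scores):
    """Combine metric scores into an IBI: the mean, on the 1-to-5 scale."""
    return sum(scores) / len(scores)

def rating(ibi_score):
    """Map a 1-5 IBI score to a narrative rating (illustrative cutoffs)."""
    if ibi_score >= 4.0:
        return "good"
    if ibi_score >= 3.0:
        return "fair"
    if ibi_score >= 2.0:
        return "poor"
    return "very poor"

# Three hypothetical metric values for one site, each with invented thresholds
scores = [
    score_metric(12, poor_max=5, good_min=10),
    score_metric(0.4, poor_max=0.2, good_min=0.6),
    score_metric(8, poor_max=3, good_min=9),
]
print(ibi(scores), rating(ibi(scores)))
```

A real IBI also handles metrics where lower values indicate better condition (for example, the
proportion of pollution-tolerant individuals) by reversing the thresholds.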

Stream salamanders are good indicators of environmental health due to their wide distribution and
abundance. Knowledge  of their life history, their physiology, and their responsiveness to multiple
stressors are other factors that make them good indicators.

Water quality, physical habitat, and aquatic life were also examined.  Streams were sampled for acid
related constituents, and  the results indicated that Western Maryland streams are not very well buffered.
Artificial wetlands containing buffering materials were constructed to aid in the restoration of these
systems.  This information has been used to guide others in the construction of acid mitigation projects.

MBSS also included nutrient sampling, given interest in the impacts of agriculture  land use.  Many of the
Maryland streams have elevated nitrate levels that seem to be related to the amount of agricultural activity
located upstream from the sampling sites. The Maryland Department of Natural Resources is defining an
elevated nitrate  level as being any value over one.

Physical habitat assessments  included the examination of the quality of the stream channels, stream
banks, and riparian zones.  A large number of stream miles in major river basins are channelized.
Stream banks and the vegetation (or lack thereof) that holds the stream banks together were assessed. The
best approach for restoring stream banks is to plant trees. MBSS data may be useful to those involved in
stream restoration projects.

Aquatic life studies focused on fish.  Long-time resident species, such as brook trout, have declined in
number because they have lost the competition for food; urbanization is believed to be the cause.  It is
vital to identify  potential signs of stress prior to population decline. If researchers can identify the
locations where the fish are most unhealthy, they can focus on those areas.

The Stream Waders program is an adult volunteer program in which aquatic invertebrate samples are
collected using  MBSS protocols.  Response to this program has been  overwhelming, with 700 volunteers
collecting nearly 3,000 samples from 2000 to 2004.  Through this volunteer program, 75 percent of all
subwatersheds in Maryland have been sampled.  Data collected by the volunteers will be used in the
MBSS, and volunteer sites are being plotted on the annual report.

Many groups in the Maryland Water Monitoring Council are using the MBSS  methods including seven
counties, the City  of Baltimore, two State agencies, MNCPPC, three colleges,  the Smithsonian Institution,
the U.S. Fish and Wildlife Service, the National Park Service, and the U.S. Army Corps of Engineers.

The searchable  MBSS database contains photographs as well as habitat and stream information. It can be
accessed at http://www.dnr.state.md.us/streams/mbss/index.html.

Questions and Answers
The speakers had an opportunity to address questions from the audience.

A brief question and answer period addressed a range of topics. These topics included: (1) examination
of the uncertainty, variability, and variance of data; (2) the use of habitat restoration as opposed to
concrete to channelize streams; (3) prioritization of different types of restoration actions; and (4) the
difficulty in identifying trends in the stream surveys.

Great Places  Demand Great Science
Following introductory remarks by Rochelle Araujo, with NERL, three speakers addressed lessons
learned in the Chesapeake Bay Program, collaborative science efforts in the Great Lakes, and the role of
science, management,  and activism in sustainability of the Gulf of Mexico. An audience question and
answer period followed the presentations.

Defining Restored Water Quality,  Allocating Load Caps, and Implementing
Reduction Actions:  Chesapeake  Bay Lessons Learned

Richard Batiuk, Associate Director for Science in the EPA Chesapeake Bay Program Office, discussed
the basis of the Chesapeake Bay Program, the program drivers, the importance of providing a strong
motivator for change, and the development of the Chesapeake Bay document.  The Chesapeake Bay
Program is not limited to  EPA as it includes six states, 26 Federal agencies, and other partners.  Effective
research must begin with  comprehensive scientific studies and the synthesis of existing knowledge.  Key
lessons learned from 1978 to 1983 are the need to define the research driver, to define what to achieve,
and to build science as an underpinning for policy commitments.

Underwater grasses are the driver in the Chesapeake Bay because they affect species' survival.  Over 90
percent of the Chesapeake Bay and its tidal rivers are impaired due to low dissolved oxygen levels and
poor water clarity, which  are related to nutrient and sediment pollution.  Without oxygen and grasses,
crabs, oysters, and fish cannot survive and thrive in the Chesapeake Bay.

A strong science basis and a strong motivator are needed to convince people, including farmers, to do
what is best for the  Chesapeake Bay. It is important to define what to achieve and to provide the science
objectives behind the reasoning. The Chesapeake Bay Program desires to achieve a "water quality that
supports abundant fish, crabs, oysters, and underwater grasses in the bay and its rivers." This
achievement could not have been defined 20 years ago because the program lacked the necessary
scientific basis.

Building science as the underpinning for policy commitments is accomplished through three steps, which
are simple questions that  serve as the basis  of the research program:

•   What is the water quality of a restored Bay?

•   How much pollution  do we need to reduce?

•   What actions do we need to take to reduce pollution?

The Chesapeake Bay Program Office answered these three questions in  1983 and 1987, and is still
building upon the base that those answers provided.

Two decades of science have gone into defining clean water as having fewer algae blooms and better fish
food, clearer water and more underwater Bay grasses, and more oxygen and improved habitat for more
fish, crabs, and oysters. Seven states agree on this definition of clean water. Defining restored water
quality involved developing dissolved oxygen criteria, refining designated uses for the Chesapeake Bay
and tidal tributary waters, developing a series of models to measure nitrogen loads, and estimating
Chesapeake Bay responses to pollution reductions.

All of this science was drawn together to develop a Chesapeake Bay document. As of 2002, progress had
been made in nitrogen loads in the tidal Chesapeake Bay. Although point sources are shrinking,
agriculture is still the primary source of nitrogen loads. The research has indicated that nitrogen and
phosphorus reductions improve oxygen levels in the Chesapeake Bay.  Sediment was similarly analyzed,
and findings indicated that as sediment loads were reduced, the underwater grasses increased. The goal is
to restore the Chesapeake Bay to its 1950s condition.

A tremendous amount of science went into developing the Nutrient and Sediment Cap Load Allocations
chart. The driver was a 55 percent total reduction in nitrogen loads. Tributary strategies associated with
nitrogen load reduction include river-specific, local-action-driven nutrient/sediment pollution cleanup
plans; local stakeholder refinements; actions and schedules for reducing point sources and agricultural,
urban, and septic loads to  the Chesapeake Bay; and funding strategies.

Implementation of these reduction actions will require unprecedented involvement of the farming
communities and significant point source reductions.  The State of Virginia is asking that its farmers
increase the amount of crop land under conservation tillage from 56 percent to 96 percent by 2010,
amounting to 74,000 additional acres. Point sources have been reduced by approximately 20 million
pounds per year, but the 2010 goal is to reduce them by an additional 30 million pounds per year.

Science must be used as the foundation for decision making, such as determining where thresholds should
not be exceeded and determining the safe level. The bottom line is to synthesize information in the
beginning, find the drivers, employ people who can bridge the pieces, adopt new science, and lock in new
policy opportunities.

The Great Lakes: Collaborative Science to Inform and Help Frame Policy

John Lyon, with NERL, discussed environmental issues within the Great  Lakes, the benefits of and need
for research collaboration, the  Great Lakes Observing System, invasive species studies, and lessons
learned. Collaboration is  essential to doing anything  ambitious. Constituents and clients want practical
solutions that they can understand. Research efforts are complex and multi-jurisdictional. Because
existing funds and resources are limited, it is vital to leverage funds. Collaboration fills in the gaps of
participation, fosters dialogue and understanding with scientists and decision makers, and results in high
quality science, engineering, and technical support.

Cooperative efforts in the Great Lakes region have been ongoing for years.  Thus far, these cooperative
efforts have identified critical uncertainties, projected alternative futures,  formulated management
strategies, and evaluated trade-offs for regional sustainability. In addition, these collaborations address
issues vital to human health and the environment including invasive species; water quality and land cover
issues; water quality, sediments, and toxics issues; recreational waters;  air quality and toxics; and
community growth.  The collaborative network includes multi-national groups, Federal agencies,
Canadian groups, the Canadian Provinces of Quebec  and  Ontario, the states bordering the Great Lakes,
and others. Collaborations have included the Great Lakes Environmental Research Laboratory, the Great
Lakes Water Resources Management Decision Support System Project, the Great Lakes Observing
System, and invasive species analyses.  The long-term goal of these collaborative research programs is
the development of a scientifically defensible, reliable ecosystem forecasting capability for the Great
Lakes.
The focus of the Great Lakes Observing System is the coordination of data collection between the United
States and Canada; integration of large data holdings and data archives; facilitated discovery, evaluation,
and access to data (including access to the public); and new product development. The user community
includes interest groups such as commercial and sports fishing groups, recreational boaters, emergency
responders, national security groups, restoration management groups, coastal researchers, and the
commercial shipping industry.

Lessons learned from the Great Lakes collaborations include that science, innovation, and collaboration
protect human health and the environment; that place-based assessments demonstrate true potential; that
science-based information dissemination is critical to facilitating dialogue; and that collaborative
processes generate management strategies.

Ecological Sustainability of the Gulf of Mexico:  The Role of Science,
Management, and Activism

Dr. Quenton Dokken, Executive Director of the Gulf of Mexico Foundation, discussed population and
coastal growth, economic and resource trends, environmental quality  issues, and the role of science in
sustaining the Gulf of Mexico. Historically, the Great Lakes and Chesapeake Bay regions have received a
great amount of attention. However, the Gulf of Mexico is 20 years behind those regions in terms of
receiving the attention required to address its ecological problems.  EPA's Gulf of Mexico program has
grown and matured and is starting to provide substantial management to the region, but much work is still
needed. The Gulf of Mexico is the largest economically successful water body, and strong science and
activism are needed to sustain the ecological quality of the region.

The Gulf of Mexico Foundation takes the approach of managing the people responsible for sustaining
the resources, as opposed to managing the resources themselves. This approach is necessary due to the
increase in coastal population. Quality of life and industrialization have transformed coastal cities into
international centers of trade and commerce. Thirty-four percent of the population in the United States
lives in a state adjoining the Gulf of Mexico.  Coastal communities, including Brownsville, Houston, and
Galveston, TX; New Orleans, LA; and Cape Coral and Key West, FL, continue to experience significant
population increases.

Although population growth improves economic conditions, it also places an enormous burden on the
environment.  The once held belief that the ocean and coastal resources were inexhaustible and, therefore,
could be exploited, has been replaced in recent years with a heightened understanding  of the environment
and the impacts that increased economic development has on the coastal region and its environmental
resources.

Marine-related economic activities including outdoor recreation and tourism, waterborne commerce,
energy and mineral resources production, fisheries resources, and food supply account for two percent of
the gross national product in the United States. Recreation and tourism activities, which include
residential and commercial development, account for 50 percent of the economic activity in coastal areas.
Six of the 10 major ports in the United States are located in the Gulf of Mexico. Fifty  percent of the
nation's undiscovered oil and gas supply is thought to be located in the Gulf of Mexico, which is expected
to lead to a 60 percent increase in offshore activity by 2010. Overfishing, including both commercial and
recreational fishing activities, has resulted in the ocean and coastal waters reaching the maximum
capacity for the production of fish.

A host of environmental quality issues are impacted by coastal population increases. Estuaries and
coastal waters are being increasingly stressed by point source pollution, non-point source pollution, and
habitat loss and degradation.  Municipal and industrial waste discharge and ocean/coastal dumping are the
two major sources of point source pollution in the Gulf of Mexico. Four of the five states most
responsible for the greatest amount of toxic chemical discharge to surface waters are Gulf coast states
(Alabama, Louisiana, Mississippi, and Texas). Recent attempts to control and limit point source pollution
have included the construction of wastewater treatment facilities and enforcing limitations on the
dumping of dredged material.  The majority of non-point source pollution is attributed to urban and
agricultural activity. Contaminants of non-point source pollution include sediments, nutrients, animal
wastes, pesticides, and toxins. Powering industrial processes and motor vehicles produces most of the air
pollution in the region, and coastal habitats have been changed, degraded, and destroyed by
anthropogenic activities. As a result, many important species occupying these habitats are threatened.
Sensitive and important Gulf coast habitats include wetland, upland, dune, beach, oyster/coral reef,
mangrove, and pond/stream habitats.

Good science produces accurate facts from which truth can be  determined. It is essential to ensure that
the science produced is used and reported as it was intended. Pressure from political, judicial, and special
interest groups as well as practitioner biases can impact science. Additionally, political, economic, and
time constraints make producing good science more challenging.  The science produced is typically at a
95 percent confidence level; this matters because, five percent of the time, the scientific results
could be incorrect.

Management of the Gulf of Mexico requires great science, which must include long-term monitoring.
EPA is a key player in the production of the required science and successful management of this region.

Wrap-Up and Discussion
The speakers had an opportunity to address questions from the audience.

A brief question and answer period addressed a range of topics including:  (1) mercury levels in the Great
Lakes as compared to the levels found in the fish; (2) salinization in coastal wetlands; and, (3) selling
local benefits as well as downstream benefits when trying to convince communities to make changes.

Looking into the Future of a Region
Following introductory comments by Dr. Betsy Smith, the ReVA Program Director at NERL, six speakers
addressed ecological forecasting, ecological vulnerability resulting from land use change, statistical
groundwater monitoring, forecasting species distributions, and alternative scenarios for land cover
change. An audience question and answer period followed the presentations.

Ecological Forecasting:  An Introduction

K. Bruce Jones, with NERL, discussed the approaches, goals, and applications of ecological forecasting.
There are two general approaches used in forecasting: to attempt to predict the future using predictive
models that carry with them a level of uncertainty, and to derive a best estimate from a method, model, or
individual, which has the benefit of public understanding that this estimate may not be true.

Ecological forecasting has broad goals that primarily focus on conducting vulnerability/risk assessments
based on both current and potential future environments and adopting management approaches to reduce
vulnerability and risk.  Accomplishing these goals entails  the extensive use of models that relate
ecological endpoints and important processes to biophysical variables (conditions) and natural and
anthropogenic stress; development of scenarios of future conditions of biophysical variables and stressors
to evaluate the consequences of certain types of changes; and development of scenarios of change as
opposed to prediction.
Applications of ecological forecasting include assessment of vulnerability to the spread of invasive
species, ecological resource vulnerability (e.g., consequences of landscape change projections/scenarios
on ecological endpoints and processes), and vulnerabilities of important estuary and coastal water
processes and resources to future climate change scenarios. Other applications of ecological forecasting
include disaster forecasting, assessment of the consequences of alternative future urbanization,
assessment of the impact of alternative landscape futures on multiple resources, and use by the
Interagency Working Group on Earth Observations.

Although very  helpful to many assessments, ecological forecasting has its limitations and issues,
including:

•   Many assumptions that are made about future changes

•   "Space-for Time" construction of models for determining consequences of future change

•   Change models based on historic patterns of change that may not accurately reflect future changes

•   Complexity of ecosystems

•   Scaling

•   Limited number of fixed monitoring sites to establish the pattern of historic change and relationships
    to drivers of change.

A Weight-of-Evidence Approach to Projecting Land-Use Change and Resulting
Ecological Vulnerability

Laura Jackson, with NHEERL, discussed the results of urbanization and models for assessing the effects
of urbanization. The idea of examining land use change across an area is daunting, and this project is a
result of the cumulative effects of citizens and governments. Urbanization is the most rapidly increasing
driver of environmental degradation in the Mid-Atlantic Region. Examples of its direct and indirect
effects include habitat conversion/fragmentation, polluted and  excessive runoff, polluted air and
deposition, increased invasive species, longer commute times,  and over use of natural areas.

Alternative futures of land use change is a scaled approach to evaluating risk that involves a region-wide
analysis, study of large-scale sensitive resources, and high-resolution land use models. Developing this
approach began with an examination of underdeveloped models that had undergone some scientific
review. These models were then overlaid with what was already known about sensitive resources, and R-
EMAP hexagons were also considered. Finally, finer scale models that were more specific to localities
were applied.

The first model examined includes census projections and formulas for estimating how much land the
future populations will use and how many total acres will be consumed by new development. Using these
formulas, projections were made through 2010 focusing on the growth in metropolitan areas such as
Philadelphia, PA, Washington, DC, and Research Triangle Park, NC. The second model examined, the
Research Economics Model, is more complex and is used to determine whether or not land will have a
high value for urban development or agricultural uses. The model indicates that urban areas are losing
more land and are becoming more metropolitan.  Confidence in these two models is limited, and,
therefore, the certainty of their projections is unknown.
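A census-based projection of land consumption of the kind the first model uses can be sketched as follows (a minimal illustration; the population figures and per-capita acreage are invented, and the model's actual formulas are more involved):

```python
def projected_new_acres(pop_now, pop_future, acres_per_capita):
    """Acres consumed by new development, assuming developed land
    scales linearly with population growth (an illustrative assumption)."""
    return max(pop_future - pop_now, 0) * acres_per_capita

# Hypothetical metropolitan area: 150,000 new residents, each consuming
# a quarter acre of newly developed land.
print(projected_new_acres(1_500_000, 1_650_000, 0.25))  # 37500.0
```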

SLEUTH is a cellular model.  Once guidelines are entered, it produces urban cells in a Monte Carlo
simulation. Each pixel is assigned a percentage, and 51 percent was chosen as the cut off point for this
assessment.  Cells assigned percentages of 51 percent or higher were the ones identified to be urban by
2010. Pittsburgh, PA and Charleston, WV continually fell into that category. The model, based on the


location of current urbanization, was run against a past date then run again against a future date to
generate the projections.
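The 51 percent Monte Carlo cutoff described above can be sketched as follows (a minimal illustration, not SLEUTH itself; the function name and hit counts are invented for the example):

```python
def classify_urban(monte_carlo_hits, runs, cutoff=0.51):
    """Classify pixels as urban-by-2010 when the fraction of Monte Carlo
    runs that urbanized the pixel meets the cutoff (51 percent here)."""
    return [hits / runs >= cutoff for hits in monte_carlo_hits]

# Each entry: number of runs (out of 100) in which the pixel became urban.
hits = [88, 51, 50, 12]
print(classify_urban(hits, 100))  # [True, True, False, False]
```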

Another model examined utilized two methods for prediction.  The first method examined Department of
Transportation road plans to capture outlying areas.  Because road building and growth reinforce each
other, new roads result in new construction such as gasoline stations and strip malls.  This method
estimated that any county
having 10 kilometers or more of new or widened roadway would have urbanization by 2010. The second
method examined the number of projected new jobs and the unemployment rates by county. Counties
were ranked on a sliding scale based on their current unemployment rates and expected numbers of new
jobs.  Using this ranking, half of the counties in the Mid-Atlantic Region were forecasted for urbanization
by 2010. Because this estimation did not sufficiently narrow down the areas, overlays including forest
cover, native birds, fish, reptiles, and exotic species populations were applied to rank each county as
being rich or low in these areas.

In the end, counties that were projected to have land use change by at least two of the models were
ranked. Another outcome of this project was the identification of outlying areas (e.g., mountains and
beaches). This is important considering that these areas represent what is left to lose to urbanization, and
the purpose of the project was to help guide development, not to impede it or sacrifice the natural resources.
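The "projected by at least two of the models" ranking can be sketched as a simple vote count (county names and model labels here are purely illustrative):

```python
from collections import Counter

def flag_counties(model_predictions, min_models=2):
    """model_predictions: model name -> set of counties that model projects
    to urbanize. A county is flagged when at least min_models agree."""
    votes = Counter()
    for counties in model_predictions.values():
        votes.update(counties)
    return {county for county, n in votes.items() if n >= min_models}

preds = {
    "census": {"Fairfax", "Wake", "Allegheny"},
    "economics": {"Wake", "Kanawha"},
    "cellular": {"Allegheny", "Kanawha", "Wake"},
}
print(sorted(flag_counties(preds)))  # ['Allegheny', 'Kanawha', 'Wake']
```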

Alternative Scenarios and Land-Cover Change:  Examples Using Nutrient Export

James Wickham, with NERL, discussed the use of land cover change in the determination of changes in
non-point source pollution, its link to nutrient export and vulnerability assessments, and nutrient
modeling. Alternative scenarios use forecasts of future regional urbanization patterns to distinguish
between risk and vulnerability.  Land cover change is linked to nutrient export and vulnerability using
more traditional EPA endpoints in order to use land use change to determine resulting changes in
non-point source pollution.

Forecasting links land use change to risk. Land cover change is one of the major factors for regional
vulnerability forecasting in EPA because it is an indicator of the occurrence of temporal change. EPA
examined the relationship between land cover change and nutrient export in an effort to develop export
coefficients. Associated risk can be estimated by choosing a distribution in the study region.

Two model performance results were examined.  The first performance result, labeled "A" for adequacy,
is the ability of the model to replicate the entire range of observed data. The second result, labeled "R"
for reliability, is a measure of how often the model produces results unobserved in the monitoring data.
Reliability is an interesting issue because there may be value in learning more about unobserved results.

A small area of the Mid-Atlantic Region was divided into 5-kilometer cells.  A nutrient model was first
run using the current land cover map, and then land cover change was imposed upon it by examining the
distribution of roads. The relationship between the proportion of urbanization in the land cover map and
the road density was used to determine future land cover change.
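The relationship between urban proportion and road density could, for instance, be captured with a simple least-squares line and then applied to a projected road density (all numbers below are illustrative, not the study's actual coefficients):

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x (one predictor)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

# Observed urban proportion vs. road density (km of road per 5-km cell);
# numbers are invented for illustration.
density = [1.0, 2.0, 3.0, 4.0]
urban = [0.05, 0.15, 0.25, 0.35]
a, b = fit_line(density, urban)
future_density = 5.0
print(round(a + b * future_density, 2))  # 0.45, the projected urban proportion
```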

The results indicated that the largest changes are occurring in some of the major cities along the I-95
corridor. Nitrogen risk was more predominant in the Ohio watersheds that drain through the Mississippi
River, and the phosphorus risk was more predominant in the Mid-Atlantic watersheds that drain through
the Chesapeake and Delaware Bays. Vulnerability was defined as a change in risk that exceeded the
model error in both the nutrient export and the forecast.

Land cover change effects on variance are an important issue.  Because an increase in variance is an
increase in sensitivity to outside factors, it makes environmental management more difficult. Pfiesteria
variance is of major concern because this organism has been implicated as the primary cause of major fish
kills and fish disease events in many Atlantic and Gulf Coast states. Research has indicated that human
activity, such as excessive nutrient loading, increases the activity of Pfiesteria, leading to greater fish
fatalities.

Land cover changes impacting nitrogen and phosphorus export variances were also examined to
determine the location of gaps between the mean and the variance. The steps taken in these examinations
include:

•   Compiling forest, agriculture, and urban proportions by watershed for early- and late-date land cover
    data

•   Running nitrogen and phosphorus export simulation models on temporal land cover data to estimate
    mean and variance

•   Repeating the simulations 150 times to generate confidence intervals for means and variances

•   Comparing confidence intervals and declaring significance when a positive difference (gap) is seen
    between mean and variance ranges over time.
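The steps above can be sketched as a small Monte Carlo simulation (the export coefficients, areas, and distributional assumptions are invented for illustration; the actual models were far more detailed):

```python
import random

def export_ci(land_cover, coeff_dists, n_sims=150, alpha=0.05):
    """Simulate total watershed nutrient export n_sims times by drawing an
    export coefficient (kg/ha/yr) for each land cover class, then return a
    simple percentile confidence interval on the total.
    land_cover: class -> area (ha); coeff_dists: class -> (mean, sd)."""
    totals = []
    for _ in range(n_sims):
        totals.append(sum(area * random.gauss(*coeff_dists[cls])
                          for cls, area in land_cover.items()))
    totals.sort()
    return (totals[int(alpha / 2 * n_sims)],
            totals[int((1 - alpha / 2) * n_sims) - 1])

random.seed(0)
# Early- and late-date land cover for one hypothetical watershed (ha).
early = {"forest": 800.0, "agriculture": 150.0, "urban": 50.0}
late = {"forest": 700.0, "agriculture": 150.0, "urban": 150.0}
coeffs = {"forest": (2.0, 0.3), "agriculture": (15.0, 2.0), "urban": (9.0, 1.5)}
ci_early, ci_late = export_ci(early, coeffs), export_ci(late, coeffs)
# A positive gap between the two intervals would indicate a significant
# change in export over time.
print(ci_early, ci_late)
```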

A study conducted in eastern Maryland showed that, on average, an 11 percent loss in forest was required
to change the mean and variance. The amount of forest loss required to significantly change the variance
increased as the percentage of forest decreased. These study results indicate that watersheds with more
forest (i.e., more than 70 percent forest) are more vulnerable to increased nutrient export than watersheds
with less forest. Vulnerability is distinguished from risk by means of statistical significance tests.

Statistical Modeling of Groundwater:  Vulnerability in the Mid-Atlantic Region -
Present and Future

Earl Greene, with the USGS, discussed the relationship between land use and groundwater vulnerability,
the benefits of using input functions in modeling to increase their predictive power, and applications of
model predictions. Land use change predictions can be used to make predictions for groundwater
vulnerability. The goal is to build statistical models for predicting the probability of any kind of
constituent in groundwater for which data exist. Nitrate was used in this study because it is considered
an overall indicator of groundwater health. This kind of modeling can be adjusted to any specified
management threshold, and can define and predict areas where groundwater is most vulnerable. This
information is important to managers in determining which areas require the implementation of best
management practices and in collaborating with county governments for protection of groundwater
resources.

Land use variables, soil variables, and geologic variables are used to determine and predict what is going
to happen with groundwater quality.  An input function was added to the model to improve its power of
prediction.  The input function can be sources such as inorganic fertilizer, organic fertilizer, or
atmospheric deposition. Other potential input functions that have not yet been tested include septic tanks
and home septic systems.

The study analysis consisted of identifying the thresholds, determining the appropriate management
concentration level, and characterizing the response variable (nitrates) as either below or above the
specified concentration. The nitrate concentration  data obtained from the laboratory  is converted into a
binary value, and  a logistic regression model is used for the modeling work. Logistic regression is used
to identify a relationship between a categorical dependent variable (nitrates) and independent variables
(geology, land use, etc). Parameters developed for the logistic regression model are computed on the
samples. The model, based on those samples, is then applied to the entire region. Use of the model is
tedious and time intensive.
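The threshold-exceedance modeling described above might look like the following minimal sketch: a one-predictor logistic regression fit by plain gradient descent (the predictor, data values, and learning-rate choices are assumptions for illustration; the study used many predictors and standard statistical tools):

```python
import math

def fit_logistic(xs, ys, lr=0.5, steps=5000):
    """One-predictor logistic regression fit by gradient descent.
    ys are binary flags (1 = nitrate above the management threshold)."""
    b0 = b1 = 0.0
    for _ in range(steps):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            g0 += p - y
            g1 += (p - y) * x
        b0 -= lr * g0 / len(xs)
        b1 -= lr * g1 / len(xs)
    return b0, b1

def prob_exceed(x, b0, b1):
    """Predicted probability that the threshold is exceeded."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))

# Illustrative data: fraction of cropland near each well vs. whether
# nitrate exceeded the chosen management threshold.
crop = [0.05, 0.10, 0.20, 0.35, 0.50, 0.60, 0.70, 0.80]
exceed = [0, 0, 0, 0, 1, 1, 1, 1]
b0, b1 = fit_logistic(crop, exceed)
print(prob_exceed(0.80, b0, b1) > 0.5)  # True: high-cropland cells are flagged
```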

Model results have indicated that manure patterns are very important, and show the impacts of high
cultivation on nitrates. Geology is also a very significant variable (e.g., limestone is very vulnerable).

This type of modeling is useful for increasing knowledge of groundwater quality and developing
vulnerability maps for mangers to use at the regional, watershed, and county scales. This study has been
investigating the use of the groundwater model to conduct site-specific analyses or applications of this
work. Maps developed using the model are used to determine future monitoring.

Forecasting Species' Distributions: The Shape of Things to Come

Daniel Kluza, with NCEA, discussed the Genetic Algorithm for Rule-set Prediction (GARP) model, the
results of a case study utilizing the model to predict the locations of areas in the Mid-Atlantic that may be
vulnerable to non-indigenous species habitation, and additional applications of the model results.
Forecasting results in indications of what might happen, and predictive modeling uses a few different sets
of rules, the end result of which is a basic rule set.  GARP describes relationships between occurrence and
environment using multiple rules and demonstrates excellent predictive ability through the use of a
genetic algorithm and an artificial intelligence application for generating rules.

GARP is a very robust model that looks for non-linear relationships between data.  The model determines
a potential distribution, and it is important to note that individual species do not always inhabit all of their
potential habitats because of factors such as predators, parasites, competitors, barriers to dispersal, or the
dispersal ability of the species itself. However, GARP has been proven to be a strong predictor of the
distributions of non-indigenous species.
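GARP's genetic algorithm itself is beyond a short example, but applying a finished rule set to a grid cell's environmental variables can be sketched as follows (the rule contents and variable names are hypothetical, not GARP's actual representation):

```python
def predict_presence(cell, rules):
    """Apply a GARP-style rule set: each rule is (conditions, prediction),
    where conditions maps a variable name to an (lo, hi) range. The first
    rule whose conditions all match decides; the default is absence."""
    for conditions, present in rules:
        if all(lo <= cell[var] <= hi for var, (lo, hi) in conditions.items()):
            return present
    return False

# Hypothetical rules for a warm-climate aquatic invader.
rules = [
    ({"temp_c": (20, 35), "precip_mm": (1000, 2500)}, True),
    ({"temp_c": (-10, 10)}, False),
]
print(predict_presence({"temp_c": 25, "precip_mm": 1400}, rules))  # True
print(predict_presence({"temp_c": 5, "precip_mm": 1400}, rules))   # False
```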

A case study was conducted involving the application of GARP to determine which parts of the Mid-
Atlantic Region are potentially inhabitable by a particular non-indigenous species, the giant salvinia,
which grows in the extreme southern part of North Carolina.  Graphical interpretation of the data
indicates that the potential distribution of the  giant salvinia depends upon dispersal, management action,
and/or inaction. Changes in climate, precipitation, and temperature are also potential factors in the
distribution of giant salvinia.

GARP has other regional applications such as modeling distributions of native species and assessing the
effects of climate change.  Modeling the distributions of native species, conducted to assess threatened
and endangered species, can reveal biodiversity (i.e., hotspots and coldspots) and can provide insight into
the potential impact of future changes in land cover and use.  The effects of climate change on native
species, the ecosystem (including nutrient cycling), and the economy can also be effectively predicted by
GARP application. Such predictions allow scientists to anticipate the changes and decision makers to be
proactive rather than reactive.

Putting it All Together:  Implications for the  Mid-Atlantic Region in 2020

Dr. Betsy Smith, the ReVA Program Director, discussed future scenarios, the determination of watershed
conditions and vulnerability now and in the future, the patterns of watershed vulnerability, and the
application of the ReVA approach to decision making. One of the most important aspects of the ReVA
program is that it examines all available information to determine which areas are in need of additional
work or finer-scale models. Thus far, the program has focused on integrating all of the information.
However, different methods are needed to integrate spatial data to develop useful information. ReVA
collected information on more than 150 coverages, but this amount of information was too large to be
utilized by decision makers. Therefore, a Web-based tool was built to synthesize the data for use.

The following were examined: drivers of land use change (e.g., migration scenarios, groundwater
vulnerability, landscape indicators, and nitrogen, phosphorus, and sediment loadings); mining; risk of
timber harvest; "Clear Skies" scenarios for ozone, PM, nitrogen, and sulfur; human population
demographics; and risk of non-indigenous species (with and without climate change).

Because ReVA is intended to be an integrated assessment, it is essential to examine human health
impacts.  Census data were used to identify vulnerable humans (i.e., the young, old, and economically
deprived) and to examine the health impacts they could expect in the future.

The assessment tool is portable and Web-based, and NERL is collaborating with EPA Region III to
ensure they agree with the future scenarios.  The assessment tool will be publicly available by the end of
this fiscal year. NERL is also collaborating with Pennsylvania, Maryland, and local governments to
assess how future changes might affect their areas and what actions they might need to take in response.

Simple methods involving the use of low environmental quality values were used to determine which
watersheds are currently in the worst condition in the overall ecological region.  Currently, the
Washington, DC, and Baltimore, MD, area watersheds are in the worst condition. However, an urban
area location does not necessarily mean that a watershed will be in bad condition. For example, some
watersheds actually improved when land use changed from agricultural to urban. This improvement is
attributed to nitrate removal.

The watersheds in the best condition had abundant resources, which makes them of particular
significance. Improvements seen in some watersheds are largely attributed to the Clear Skies
program. The areas that fall out of the "best"
category are the ones to watch closely.

Determining which watersheds will be the most vulnerable in the future includes assessing ecological
conditions in the current scenario. This type of assessment provides information for restoration and
protection activities.  Most changes within a watershed are the result of increasing growth in the region.

Determining patterns of vulnerability in watersheds involves the assessment of both resources and
stressors. The watersheds examined are those that currently have abundant resources, whose future can be
confidently predicted, and that have a number of stressors acting upon them. This type of assessment reveals
cumulative effects of the stressors on watershed resources.

Irreversible change is a critical point that ecosystems typically reach once they have been under stress for
a while; it indicates that the ecosystem will collapse and never revert to its previous state.
Determining how the pattern  of watersheds vulnerable to irreversible change will change in the future is
essentially a measure of how  far ecosystems have already declined and how far they are likely to decline
in the future.

The next step of the process involves application of the ReVA approach and information to decision
making.  Applications include the evaluation of alternative "Smart Growth" strategies, identification of
where to set aside land for conservation, assessment of impacts of alternative incentives for pollution
prevention, investigation of solutions for "cross boundary" issues associated with air and water quality,
estimation of impacts of new road development, and tracking of progress and performance.
Questions and Answers
The speakers had the opportunity to address questions from the audience.

A brief question and answer period addressed a range of topics. These topics included: (1) the application
of land use models at the local and municipal levels; (2) examples of where the land use change
information is being used in decision-making processes; (3) choosing the appropriate thresholds for
modeling; and (4) the use of an overlay approach in modeling.

Regional Research Partnership Program
Following introductory remarks by Tom Baugh, with EPA Region IV, three speakers addressed microbial
source tracking, the use of land cover diversity as a proxy for biodiversity, and the relationship of
terrestrial ecosystems to manganese emissions from wood burning. An audience question and answer
period followed the presentations.

Microbial Source Tracking: The Application of a DNA-Based Molecular Approach
to Identify  Sources  of Fecal Contamination

Bonita Johnson, with EPA Region IV, discussed the use of microbiological indicators to assess water
quality, the significance and purpose of using a DNA-based approach, and the PCR and Agarose Gel
Electrophoresis approach. Pathogens are known to cause harmful, potentially deadly, infectious diseases.
Primary sources of pathogens include agricultural operations, septic tank systems, drinking water
systems, wastewater treatment systems, and recreational waters. They typically enter and contaminate
surface and groundwater via flood events, abandoned and poorly constructed wells, and spills. Indicator
organisms are used to detect the presence of pathogens in water because they are easily isolated, have a
longer survival rate than most disease-producing organisms, and are expected to be present in water
containing enteric pathogens.

Total coliform is the indicator primarily used for potable water assessment. Fecal coliform, E. coli, and
enterococci are used to assess wastewater and other non-potable water quality, fresh water quality, and
marine water  quality, respectively. Studies are typically looking for the presence of fecal coliform.
However, E. coli and enterococci are better indicators than fecal coliform.

Microbial Source Tracking is an innovative approach based on the assumption that specific species and
strains of bacteria are associated with specific hosts. Microbial Source Tracking was introduced in 1999,
and is considered to be the best available tool for identifying sources of fecal pollution in water.  There
are currently about nine other technologies being explored for the  same use.

Bacterial source tracking is conducted using two methods:  molecular (genotype) and biochemical
(phenotype).  Molecular methods are referred to as  "DNA fingerprinting" because they are based on
genetic makeup of fecal bacteria. Biochemical methods are based on the biochemical substances that an
organism's genes produce. The intended end result of this research is to construct a library or
database of isolates obtained from samples of known sources such as humans, cows, and deer. The size
of the library  will be contingent upon the number of potential major sources of fecal pollution in the target
area.

Enterococci is the targeted indicator and was selected because it persists longer in the environment than
E. coli, survives in adverse  conditions, is more source specific than E. coli, and is found in 80 to 90
percent of clinical isolates taken from infected humans. The sampling plan collected surface water and
cow manure samples from a farm in Athens, GA that is located in  the Broad River Watershed, which is
impaired due  to excessive nutrients and pathogens.  Samples were taken once a month at sites above and
below the farm. EPA Method 1600 (mEI agar) was used to isolate enterococci.


PCR is a technique that replicates DNA through a cycling reaction that typically takes about 2 hours.
Because the project is ongoing, the actual reaction mixture and cycling parameters cannot be revealed. The final PCR
product is then subjected to electrical voltage to cause DNA fragments to migrate. The molecular weight
and number of base pairs of the DNA fragments are related to the distance that they migrate. Enterococci
isolates identified thus far in water and/or manure include  E. faecalis, E. faecium, E. columbae, E. avium,
E. casseliflavus, E. hirae, and E. raffinosus.
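The migration relationship described above is commonly modeled as log-linear: over a gel's working range, the distance a fragment travels is roughly proportional to the logarithm of its length. The sketch below is illustrative only; the ladder values are hypothetical, not data from this study.

```python
import math

# Hypothetical molecular-weight ladder: (migration distance in mm, size in bp).
# On an idealized gel, migration distance is linear in log10(fragment size).
ladder = [(10.0, 2000), (18.0, 1000), (26.0, 500), (34.0, 250), (42.0, 125)]

def fit_log_linear(points):
    """Least-squares fit of log10(size) = a * distance + b."""
    xs = [d for d, _ in points]
    ys = [math.log10(bp) for _, bp in points]
    n = len(points)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

def estimate_size(distance_mm, a, b):
    """Estimate an unknown band's fragment size (bp) from its migration distance."""
    return 10 ** (a * distance_mm + b)

a, b = fit_log_linear(ladder)
print(round(estimate_size(22.0, a, b)))  # → 707
```

A band at 22 mm sits midway between the 1,000 bp and 500 bp markers, so the fit returns their geometric mean, about 707 bp.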

Ongoing related work in EPA Region IV includes continued determinations of the species of enterococci
isolates obtained for the Broad River Watershed, maintenance of in-house PCR/Electrophoresis and
Microbial Source Tracking capabilities, and the development of a national Microbial Source Tracking
guidance document.

Land Cover Diversity Measured  by Satellite as a Proxy for Biodiversity

David Macarus, with EPA Region V, discussed research on the potential for using satellite data as a proxy
for biodiversity.  Dr. Mary White, with EPA Region V, and others developed a model for identifying
ecosystems. The research driver was the question of whether land cover diversity can be used as a
proxy for vegetative community diversity.

Sites in Wisconsin, Arkansas, and Texas were chosen to compare land cover diversity to vegetative
community diversity. The research used data from the national land cover database because satellite data
already exist and are relatively inexpensive, whereas airplane flyover data are more expensive to acquire
and ground-based data can be very expensive and are not available everywhere.

To evaluate the model, the community diversity data it produced were compared to data from actual
communities. Ten such comparisons were made with the Shannon-Wiener Diversity Index and, with the
exception of the Wisconsin data, the agreement was poor. The model is based on shape correlation
(i.e., the circularity of the area being examined), which is a possible explanation of the results.
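The Shannon-Wiener Diversity Index mentioned above is computed from the proportional abundance of each class, whether species in a field survey or pixels per land cover class. A minimal sketch, with made-up class counts:

```python
import math

def shannon_wiener(counts):
    """Shannon-Wiener diversity index H' = -sum(p_i * ln(p_i)), where p_i is
    the proportion of observations (species or pixels) in class i."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

# Hypothetical pixel counts for four land cover classes. A more even
# distribution yields a higher H'; the maximum for 4 classes is ln(4) ≈ 1.386.
print(shannon_wiener([70, 15, 10, 5]))   # skewed: lower diversity (≈ 0.914)
print(shannon_wiener([25, 25, 25, 25]))  # even: maximum diversity (≈ 1.386)
```

Comparing H' computed from land cover pixels with H' computed from ground surveys is one way to state the land-cover-as-proxy question quantitatively.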

An important aspect of the regional research partnership program is its flexibility. For example, Dr.
White spent  1 week per month at the EPA laboratory in Las Vegas and completed computer work at her
home laboratory. A flexible schedule such as this may encourage regional scientists to participate in the
Regional Research Partnership Program.

Lessons learned as a result of this project include:

•   ORD scientists are eager to collaborate with the EPA Regions

•   Use of the land cover diversity index is valid in the EPA Region V geographical area

•   ORD computing facilities offer advantages over those available in the regional facilities.

The Relationship  of Terrestrial Ecosystems to Manganese Emissions from Wood
Burning

Dan Ahern, with Region V, discussed the medical effects and exposure routes of manganese, the
element's relationship to plants, its emissions from burning, and areas for future manganese research.
Manganese is the number one relative risk for EPA Region V.  Discovered in 1774, manganese is the
twelfth most abundant element. More  than 90 percent of all manganese is used for steelmaking and in
other metallurgic processes.

Although considered an essential nutrient for plants and animals, manganese is a neurotoxin at high
levels. It affects the central nervous system causing manganism, a disease with symptoms similar to
Parkinson's disease. Safe levels are uncertain, but the safe inhalation dose is approximately one-fifth
of the safe inhalation level for mercury. Manganese appears to have effects on children similar to
those caused by lead, although the supporting data are uncertain.  Manganese exposure has been linked to
violent behavior. An example of this link is the increase in the rate of violent crime in Canada after the
country introduced methylcyclopentadienyl manganese tricarbonyl (MMT) in gasoline. However, there is
no correlation between ambient manganese levels and violent crime statistics.  The impacts of manganese
exposure are irreversible and persist after exposure has ceased.

Studies on workers who were tracked for 10 years after exposure to manganese showed that while their
manganese concentrations decreased over time, their symptoms continued to increase. Additionally,
Parkinson's disease studies are showing onset of the disease 15 to 20 years earlier in welders than in other
workers. The critical manganese exposure route is inhalation.  The liver and blood will protect the body
from ingestion exposures.

Manganese is an essential nutrient to plants. The major factor in manganese levels in plants is the pH of
the soil. Manganese levels in plants range from 10 to 7,000 ppm, with an average of about 600 ppm. Low soil pH can
lead to high manganese levels that can be toxic to plants. Manganese levels in boiler wood vary from 4 to
more than 100 mg/kg.  Most of the research conducted on manganese levels has studied leaves and stems
of plants.  Bark has been shown to have much higher levels of manganese than wood.

This research project considered three emission sources: wood-fired boilers, residential fireplaces and
stoves, and wildfires and prescribed fires. Of the three sources, wood-fired boilers produced the highest
levels of manganese emissions, typically two orders of magnitude greater than emissions from stoves or
wildfires. The difference is attributed to the fact that boilers burn out most organics, so the metals end
up in the fly ash, whereas wood stoves and wildfires are less efficient and most of the metals, including
manganese, end up in the bottom ash. The new MACT regulation is aimed at controlling particulate
emissions, and therefore manganese emissions. It has been estimated that wood-fired boilers complying
with MACT will reduce their manganese emissions to levels emitted by stoves and wildfires. MACT will
require the use of electrostatic precipitators and scrubbers.

Areas for further research include Toxics Release Inventory (TRI) and ambient manganese air
concentrations. Facilities with CAA Title V permits are reporting different emissions in their permits than
in TRI. This may result from underreporting due to the TRI 25,000-pound threshold, or from
understatement of anthropogenic burning in material flows, mobility issues, and gasoline additives, such
as MMT.

Panel Discussion/Questions and Answers
The speakers  had an opportunity to address questions from the audience.

A brief question and answer period addressed a range of topics. These topics included: (1) the effect of
pH soil levels on manganese availability; and (2) using equipment in research studies that can feasibly be
used in the EPA Regions and states.

Community Air Toxics  Projects
Following introductory comments by Henry Topper, with the Office of Pollution Prevention and Toxics,
five speakers  addressed support for community air toxics programs, the development of an air toxics
emission inventory and reduction strategy, and community air toxics programs in St. Louis, MO,
Louisville, KY, and Mobile, AL. An audience question and answer period followed the presentations.


Addressing Air Toxics at the Local Level

Henry Topper, with the Office of Pollution Prevention and Toxics, discussed EPA actions to address air
toxics at the local level. The CAA categorizes air pollutants as criteria pollutants, mobile source
pollutants, or HAPs. The six criteria pollutants are ozone (O3), NO2, SO2, lead, carbon monoxide,
and PM. The mobile source pollutant category consists of 21 chemicals and mixtures, while the HAPs
category contains 188 chemicals and compounds.

EPA has been working on air toxics for some time.  Recent research efforts have revealed that many of
the problems originate at the local level. Therefore, in addition to government-mandated approaches,
EPA is working on community-led approaches at the county, city, and neighborhood levels. It is believed
that the community-led approach will be the key to the overall success of this project.

Available resources include a national database, guidance materials, training courses, and emission
reduction information for mobile, stationary, and indoor sources. Ongoing work to develop additional
resources includes the development of a reduction activities matrix for indoor, stationary, and mobile
sources; creation of a publicly available library of reference information and Web sites; and an effort to
coordinate and pool resources from all Agency programs for integrated assistance to communities.

EPA and its partners have a new vision to help communities which includes:

•   Identifying the appropriate mix of analysis and risk reduction efforts at the local level

•   Understanding that each community has different needs, abilities, and interests

•   Helping communities to identify and prioritize risks and risk reduction options, educate the public,
    and find the needed resources.

The Air Toxics Program is part of a larger effort within EPA that includes the Communities for Renewed
Environment Program, the Environmental Justice Collaborative Problem Solving Grant Program, the new
Administrator's emphasis on collaboration and local solutions, and the Agency Framework for
Cumulative Risk Assessment. Additionally, the new Agency focus includes resources and tools for
addressing environmental health issues at the local level.

Developing a Local HAP  Inventory and Reduction Strategy in New Haven, CT

Madeleine Weil, with the City of New Haven, CT, discussed the development of the HAP inventory, its
results, and lessons learned as well as  the development of a risk reduction strategy. The purpose of this
project was to develop an inventory of local HAPs emissions from point, area, and mobile sources, and to
design and implement an air toxics reduction strategy focused on priority pollutants and sources identified
by the inventory. The project began with an $80,000 Air Toxics Inventory grant from EPA Region I.
The Comprehensive Plan of Development (2003) emphasizes environmental health and sustainability
as components of quality of life, provides guidance for development policies and regional planning
initiatives, and addresses environmental justice.

The primary project questions addressed the identification of HAPs emission sources, pollutants of
concern, and the location of HAPs concentrations.  Secondary project questions addressed the
accessibility of air toxics data by local governments that lack emissions inventory expertise, the
identification of technical and systemic challenges to data gathering, and innovative methods resulting
from a "rookie" project.

Point sources were inventoried individually, and 33 facilities reported 114 tons of total HAPs emissions
per year. Sources of the HAPs emissions included surface coating, degreasing, petrol tank farms, and
power plants. Thirteen of the 33 facilities accounted for 96 percent of all of the emissions. The following
are lessons learned from the point source assessment: up-front National Emissions Inventory training
would be useful, existing inventories are inconsistent, Connecticut lacked a HAP inventory, and
partnerships with record-keeping staff at regulatory agencies are essential.

Area source emissions were assessed through the use of surveys, experts, utility requests, existing
research, per capita information, and employee information. Graphical  interpretation of the data revealed
that architectural and industrial surface coating and solvent cleaning produced more than 75 percent of the
emissions.  The following are lessons learned from the area source assessment: groundwork improves the
accuracy of activity data, the method still relies on EPA emission factors, and the  factors were developed
for national-level inventories.

Mobile source emissions were assessed in two categories: on-road and  non-road.  Methods to assess on-
road source emissions included the 1999 National Emissions Inventory  county-level emissions; variables
such as vehicle mix, traffic patterns, and speed; and a local vehicle classification mix. The results
indicated that light duty gas vehicles, light duty  gas trucks, and heavy duty diesel  vehicles produce the
vast majority of on-road source emissions. The  following are lessons learned from the mobile on-road
source assessment: heavy-duty diesel vehicle emissions were potentially underestimated, high ambient
levels of PM 2.5 exist, and port-related traffic, including idling trucks, slow  speeds, and heavy loads, is a
factor for consideration.

Methods to assess non-road source emissions included the use of aircraft landing-takeoff data, locomotive
fuel consumption data, waterborne commerce statistics for commercial marine vessels, and National
Emissions Inventory county data. Results indicated that construction and landscape activities and
commercial marine vessels produced the vast majority of non-road source emissions.

Overall, the results indicated that point, area, on-road, and non-road sources are responsible for 11, 22,
39, and 28 percent of the total emissions, respectively.
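As a quick arithmetic check on the figures above: the 114 tons per year reported by point sources, taken as 11 percent of the total, implies an inventory of roughly 1,000 tons of HAP emissions per year. The sketch below only rearranges the reported numbers; the rounding is illustrative.

```python
# Reported figures: point sources emit 114 tons/yr, which is 11 percent of
# the total inventory; area, on-road, and non-road make up the rest.
point_tons = 114.0
shares = {"point": 0.11, "area": 0.22, "on-road": 0.39, "non-road": 0.28}

implied_total = point_tons / shares["point"]
by_category = {name: round(implied_total * share, 1) for name, share in shares.items()}

print(round(implied_total, 1))  # → 1036.4 (tons/yr)
print(by_category)
```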

Health risk prioritization is an important part of the assessment because it evaluates relative risk;
examines cancer, chronic, and acute health risks; and focuses the reduction strategy on high ranking
pollutants.  Lessons learned in the health risk ranking include:  assistance from toxicologists  and air
pollution experts is essential, health risk analyses must follow inventory development, and the risk
reduction strategy should reflect a reduction in risk as opposed to amounts of emissions.

The risk reduction strategy examined mobile, stationary, and indoor risks, and was funded by a $50,000
EPA Healthy Communities Grant in 2003. The  strategy targeted diesel and passenger vehicles, gasoline
stations and fossil fuel combustors, and indoor tobacco smoke.  Reduction strategy lessons learned
included: the strategy should dovetail with other priorities, there are pre-existing constituencies, and the
"Big-Tent" approach leverages the City's power to catalyze change.

St. Louis Community Air Project

Emily Andrews, Managing Partner for the St. Louis Community Air Project, provided an overview of this
program, the partnership team and advisory board, and solutions developed by the program. The goal of
the project is to ensure healthier air; the project grew out of the community's interest in knowing more
about air pollution and its health effects. The project involved close data collection collaborations with the
state, city, and region.

Data were collected from three monitoring stations located south of St. Louis for a period of 1 year. Data
collected on 250 pollutants and diesel particulate matter were compared to previously set health
benchmarks for cancer and non-cancer incidents to examine long-term health effects. The analyses
identified six pollutants of concern:  acetaldehyde, arsenic, benzene, chromium, formaldehyde, and diesel
particulate matter.

The partnership team met every 3 months for 3 years. The team believes that collaboration among EPA,
scientists, the community, and local, state, and Federal agencies is the way to solve air pollution problems.
Modules developed to educate the public on air toxics will soon be used as part of an outreach program.

One of the challenges of providing information to the partnership team is that not all members of the
group are familiar with scientific terminology. Therefore, it is important to present scientific data in such
a way that all partners can comprehend it and participate in the decision-making process. As a result, the
project developed tools to facilitate data presentation. Visuals were found to be particularly helpful for
information presentation.

Another challenge is determining the amount of information required in order for individuals to begin
making  better decisions regarding their health and the environment. Too much information can be
overwhelming.  Yet, people ask questions that science cannot answer such as:

•  What is in the air that is causing my child's asthma?

•  What is the cumulative effect of all these pollutants on our health?

•  What is causing my friend's cancer?

"In the Air" provides tools for conveying information about airborne toxics.  More information about
these educational modules can be found at http://www.earthwayshome.org/intheair.

Ultimately, the goal is to increase knowledge on air pollution and to make correlations between behaviors
and air quality.

Louisville 2004: Risk Management Actions

Jon Trout, with the Louisville Metro Air Pollution Control District, discussed the basis of the West
Louisville Air Toxics Study, findings, and the actions taken as a result of the findings. The West
Jefferson County Community Task Force is comprised of citizens,  industry, academia, and government.
Funded  by resources from EPA, the State of Kentucky, the University of Louisville, and the Louisville
Metro Air Pollution Control District, the West Jefferson County Community Task Force has chosen
monitoring site locations and air toxics to be monitored,  and has developed a risk assessment work plan
and a risk management plan.

The West Louisville Air Toxics Study is a 1-year study conducted  from April 2000 to April 2001 that
monitored for volatile organic compounds, semi-volatile organic compounds, formaldehyde, hydrochloric
acid, hydrogen fluoride, and metals. The study results indicated that there were 17 carcinogens posing a
risk greater than one in one million. The carcinogen with the greatest risk was 1,3-butadiene.

The analysis portion of the risk management plan involves source identification, option selection, and
implementation. Options included public awareness, education of  sources, education of health providers,
technical assistance, pollution prevention, political action, economic assistance, public health initiatives,
and regulatory and legal actions. To fulfill the public awareness option, The Courier-Journal reported
2001 emissions of 1,3-butadiene by three companies. Political actions taken under the risk management
plan included meetings between the Mayor of Louisville and the three companies. All three of the
companies promised to implement emission reduction actions in response to the Mayor's request for
voluntary emissions reductions.

The following issues are raised under the regulatory response option:

•   Which compounds should be included in draft regulations?

•   What is the acceptable level of emissions?

•   Who sets the standards and how?

•   Which sources should be regulated?

•   How should multiple pollutants be considered?

•   How is acceptability determined?

Mobile County, Alabama Air Quality Study

Steve Perry, with The Forum, Industry Partners in Environmental Progress, discussed the purpose,
participants, organization, and scope of the Mobile County, Alabama Air Quality Study.  The mission of
this study was to evaluate the existing air quality of Mobile County, determine community-based
expectations for the county's air quality, and propose and implement the necessary actions to achieve and
maintain the community-based air quality expectations.

The study was designed by a group of local citizens to address local issues, only considered air toxics (not
criteria pollutants), and was not designed to be a regulatory study, a health study, or an ongoing
monitoring program.  Key participants in the study included the City of Mobile, Mobile County, Mobile
Bay Watch/Baykeeper, the Mobile Area Chamber of Commerce, and The Forum, Industry Partners in
Environmental Progress. Participation of local partners was very important to the study.  Mobile Bay
Watch/Baykeeper is the primary environmental group in the area. All partners came to the table with
different agendas.

The study organization consisted of a steering committee, a fiscal agent, a management group, a citizen
panel, a technical task force, and contractors. The technical task force contributed to the study up to the
point of defining the scope.

The initial cost estimate of the study was $750,000 to $800,000. However, the final funding costs are
estimated to be approximately $1.2 million. EPA and the State of Alabama provided the  additional
funding.

The scope of the study is air monitoring, air modeling, and community-based expectations.  Air
monitoring included volatile organic compounds, carbonyls (formaldehyde and acetaldehyde), metals,
and PAHs, among others. Monitoring locations included industrial, traditional, and high  population areas
as well as a background site.  Sampling occurred on randomly selected dates for a period of 1 year, and 60
samples were taken at each location.

Modeling was conducted using the Assessment System for Population Exposure Nationwide model,
which was chosen because EPA had previously used this model for some other assessments.  Coastal
zones are difficult to model and, therefore, modeling was a challenge.

Community based expectations were the core of this project. A work group of 25 to 30 people spent 6 to
9 months being educated on monitoring and modeling. There were no pre-conceived ideas of what form
the expectations would take (e.g., qualitative, quantitative, risk based, or concentration based). The
process for determining community-based expectations included establishing rules, agreeing on
educational needs, setting targets, establishing subcommittees and considering their reports, issuing
recommendations to the steering committee, and sharing information via a Web site.

The approach for carcinogens was determined as follows:

•   Risk at less than 1 in 1,000,000 is acceptably low

•   If the sum of all risks in an area is between 1 in 10,000 and 1 in 1,000,000, then some evaluation
    will be necessary

•   If the sum of all risks in an area is greater than 1 in 10,000, the cause should be examined and
    reductions should be considered based on political, social, economic, and engineering implications.

The approach for noncarcinogens was determined as follows:

•   Exposure to any individual chemical at less than its reference concentration is acceptable

•   Exposure to multiple chemicals with a hazard index of less than 1 is acceptable

•   If a hazard index for an individual chemical is greater than 1, risk evaluation and risk management
    will be conducted.
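Taken together, the two sets of rules above amount to simple screening thresholds: summed cancer risk is compared against the 1-in-1,000,000 and 1-in-10,000 levels, and the noncarcinogen hazard index (the sum of exposure-to-reference-concentration ratios) is compared against 1. The following is a hypothetical sketch of those decision rules, not a tool from the study:

```python
def screen_carcinogens(total_risk):
    """Classify an area's summed lifetime cancer risk per the study's thresholds."""
    if total_risk < 1e-6:
        return "acceptably low"
    if total_risk <= 1e-4:
        return "some evaluation necessary"
    return "examine the cause and consider reductions"

def screen_noncarcinogens(exposures_and_rfcs):
    """Hazard index = sum of exposure / reference-concentration ratios;
    an index below 1 is considered acceptable."""
    hazard_index = sum(exposure / rfc for exposure, rfc in exposures_and_rfcs)
    verdict = "acceptable" if hazard_index < 1 else "risk evaluation and risk management"
    return hazard_index, verdict

print(screen_carcinogens(5e-7))                         # → acceptably low
print(screen_noncarcinogens([(0.2, 1.0), (0.3, 1.0)]))  # → (0.5, 'acceptable')
```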

Monitoring results are just now becoming available and will be used in modeling. Lessons learned are
that it is important not to be bound by convention, to take time to educate, and to embargo data.

Questions and Answers
The speakers had an opportunity to address questions from the audience.

A brief question and answer period addressed a range of topics.  These topics included: (1) the nuances
of regulation; (2) industry involvement in the Louisville project; (3) the completeness and accuracy of
TRI self-reporting; (4) use of actual source data to ensure the accuracy  of results; and (5) the likelihood of
the success of voluntary emission reduction programs.

Science  to Support Decisions: Climate Change
Following comments by Michael Slimak, with NCEA, four speakers addressed the feasibility of assessing
climate change impacts, decision making involving climate change in the Gulf of Mexico, and alternative
approaches to conducting climate change impact assessments. An audience question and answer period
followed the presentations.

Climate Vulnerability and Impact Assessments Can Provide Useful Insights and
Guidance - Now

Michael MacCracken, with The Climate Institute, discussed the climate change issue, the factors that
complicate the issue, and the potential impacts of climate change.  The issue of climate change is divided
into three key questions:

•   How is the climate expected to change and are we already seeing the early signs of these changes?

•   What are the environmental  and societal impacts that are expected, and to what extent can adaptation
    ameliorate the projected negative consequences?

•   What are the options for limiting the human-caused factors and how rapidly and economically can
    they be implemented?

Answering these questions is complicated by the following factors:

•   The changes in climate, the impacts of the changes, and the implementation of options all have
    century-long time horizons

•   Projection of a range of possibilities is all that can be expected

•   This is a global issue that is international in scope.

Under the guidance of the United Nations Framework Convention on Climate Change, the
Intergovernmental Panel on Climate Change assesses expert understanding of these three key questions.
The United Nations Framework Convention on Climate Change includes representatives of more
than 150 countries.

Climate models provide consistent projections of the expected climate changes. However, they are better
at predicting temperature than precipitation. The Intergovernmental Panel on Climate Change projects
that there will be intensifying changes in climatic measures during the 21st century, including increases in
surface temperature, precipitation, and evaporation rates as well as a rise in sea level.

Impact assessments evaluate the potential vulnerability to scenarios of projected change in climate. The
scenarios are not predictions. A range of approaches can be used to generate plausible scenarios.
However, two important points stand out: the wide range of possible changes in climate does not prevent
impact assessments of potential vulnerability, and only generalized projections are possible.

Key findings of the United States' National Assessment include the following:

•   Increased warming is projected across the United States

•   Climate change and impacts will vary regionally

•   Many ecosystems are highly vulnerable and their goods and services will be costly if not impossible
    to replace

•   Water is an issue in every region, but the nature of the vulnerability varies

•   The agriculture sector is likely to be able to adapt to climate change

•   Forest productivity will likely increase for a few decades and then possibly decrease over the long-
    term

•   Increased damage is projected for coastal and permafrost areas

•   Adaptation is likely to help protect much of the population in the United States from adverse health
    outcomes

•   Climate change is likely to magnify other stresses, such as degraded air and water quality

•   Significant uncertainties remain and surprises are likely.

The United States Government is responsible for reporting likely consequences to the United Nations
every 4 years.  Even with the limitations in available information, particular regions and sectors can
enhance the basis of their long-term planning using these assessment results.  For example, global
warming affects snowmelt in the West, which affects the water resources that are very important in
California.

Although the impact assessments can provide useful  information, they cannot provide details.

It is important to communicate science to the stakeholders so that they have a basis for their decisions on
whether to take action. Climate change should always be considered when making long-term decisions.

The Feasibility of Conducting  Climate Change Impacts Assessments:  Opposing
Viewpoints

William O'Keefe, with the George C. Marshall Institute, discussed issues for consideration in making
policy decisions, the limitations of the  current knowledge base of climatic effects, and actions to be taken
to promote a broader knowledge base. Historically, too little attention has been paid to providing the
public and policy makers with information that would be useful in making wise decisions.
The public has not been properly informed on the issues, which has led to confusion.  Wise policy
requires wise decisions and an understanding of the trade-offs and their consequences.

In order for information to be valuable to policy makers, it is important to be clear on the following:

•   The limits of data, models, and analyses

•   What is known, what is unknown but knowable, and what is unknown and may not be knowable any
    time soon

•   Science can illuminate our understanding, but cannot solve climate change problems

•   Policy should flow from our state of knowledge, reflect the reality that actions have consequences,
    and be capable of being adjusted one way or another as knowledge increases

•   Creating new knowledge has a high priority and should be driven by the value of the information and
    the likelihood of being able to produce it

•   Hedging strategies reflecting the enormous uncertainties in our understanding of the climate system
    have great value but have not been adequately explored

•   Scenarios have value to the extent that they help to illuminate implications for capital investment,
    changes in capital stock, new technology, and economic and population growth

•   The time horizon for planning and action is inversely related to the extent of uncertainty.

The first National Assessment for the United States was a noble effort, demonstrating the limits of
knowledge and the limits of models, but it did not help the policy-making process. Two of the best
climate models in the world produced conflicting climate forecasts in many regions. These conflicting
estimates should have led to a serious discussion of model limitations, why long-term regional assessment
simply cannot be done at this time, and what can be done about the regional impacts of today's climate.
In spite of the severe limitations of the climate models, they are being made more complex rather than
improved in their capability to accurately capture variables.  A better approach is to limit their use and
focus more on building the knowledge base and the data that would eventually enable these models to be
relevant to policy making.

Models that cannot be validated have limited value as policy tools. Calibrating them to replicate past
temperature is appropriate for research purposes, but inappropriate for forecasting the future and for
decision making.  As a result, modeling should have a lower priority in the government's research agenda
and a higher priority should be given to research on key climate variables such as water vapor, feedback,
cloud formation, solar variability, ocean currents, and aerosols. A better understanding of those variables
is critical to gaining insights about natural variability and climate sensitivity.

Limits on knowledge and constraints on our ability to radically alter either our economic or energy
systems in the short-term should be matched by constraints on our planning horizon and actions. Actions
driven by apocalyptic visions of the future rob us of needed flexibility. Large organizations have a hard
time achieving and sustaining flexibility and creativity. Problems associated with incentives,
communications, and cohesion lead to processes that promote efficiency and order.  The inertia of large
organizations makes it difficult to quickly recognize the value of new information, alternative approaches,
or the need to change direction.

Effective planning should address these problems at the outset.  Providing mechanisms to encourage
creative tension without promoting chaos  is one way. Promoting healthy and constructive dissent is the
hallmark of a healthy organization, but it requires an understanding of the need to compensate for
organizational inertia.

The way to invest wisely is to be clear about priorities, objectives, measures of success, and mechanisms for
bringing an end to work that does not meet expectations. Until there is a better understanding of natural
variability, climate feedback, climate sensitivity, and the like, there cannot be understanding of the extent
of human influence.

Policy proposals that fly in the face of economic and  energy realities have little hope for long-term
survival. In the end, climate policy is energy policy,  and has economic impacts.

Actions that can be taken to create options and deal with adaptation include mitigating the effects of
current climate extremes, investing in infrastructure, and curbing subsidies that encourage excessive
water use and development. It is important to stop pretending that we can accurately forecast future
climate and its impacts.

Use of Science in Gulf of Mexico Decision Making Involving Climate Change

Arnold Vedlitz, with Texas A&M University, discussed the purpose of an EPA cooperative agreement
project, its framework, and the anticipated results. The ongoing project is multi-disciplinary, employing
the expertise of representatives of Texas A&M University, EPA, the University of New Orleans, the
University of Louisiana at Lafayette, Florida A&M University, and stakeholder advisors and
informants, and attempts to address the issues of how science can be useful as a tool for decision making.
The team interacts on a weekly basis.

The purpose of this project is to educate decision makers and the public on the uncertainties surrounding
complex scientific information so they can make or influence informed policy decisions. The project
goals are to:

•   Investigate the salience of climate change for Gulf of Mexico stakeholder groups

•   Examine how stakeholder groups use  climate change science information in decision making

•   Describe unfilled information needs on this topic

•   Recommend strategies for making climate change information more useful to decision makers.

The conceptual framework of the project includes social construction of problems, setting agendas, and
social amplification of risk.  Data sources  include unstructured interviews, document analysis,
observation of group processes, and focus groups.

The project is divided into four phases. Phase I involves a research team workshop, a stakeholder
workshop, and the selection of research locations, endpoints, and stressors.  Phase II involves the
collection of documentary evidence, preliminary analyses of media coverage, and field work preparation.
Phase III involves field work and continuing document collection. Phase IV involves data analysis and
preparation of the project report.

The plan is to conduct interviews with 600 major decision makers. Wave 1 is a completely undirected set
of interviews, the first 100 of which did not focus on climate change.  Wave 2 of the interviews will focus
attention on potential events that might occur and how climate change will influence the results of those
potential events.

Preliminary findings of the Wave 1 interviews identified several problems: the effects of population
growth on organizations' budgets, variability in problem definition between organizations, and difficulty
in linking environmental problems to economic considerations.  The interviews also provided insight into how Gulf
Coast stakeholders acquire and use scientific information as well as links between scientists and
stakeholders.

Once completed, the project will:

•   Explain how issues such as climate change become identified as problems

•   Describe how information relevant to climate change is received and processed

•   Identify valued and trusted information sources

•   Identify the most accessible, usable, and understandable information types and formats

•   Describe how information providers can best frame, package, and deliver objective science and
    technological information for most effective consumption and utility by policy makers and the public.

Alternative Approaches to Climate Change Impacts Assessments:  Success
Stories

Joel Scheraga, National Program Director of ORD's Global Change Research Program, discussed the
debate regarding the feasibility of conducting regional and place-based climatic impact assessments with
much of the debate coming from the modeling community. A broader understanding of which tools can
be used can be obtained by examining the tools from a user's perspective. The objective is to determine
which model will provide answers to the questions surrounding the issue. Frequently asked questions
include:

•   Is climate change an issue of concern?

•   Is it possible to develop a better understanding of the vulnerability of a system to climate change?

•   Do opportunities exist for increasing resilience to both  climate variability and climate change?

•   Are there actions that will foreclose future options?

•   Can potential maladaptive practices be identified?

Five categories of insight are used to answer these questions: effects of concern, potential vulnerabilities,
win-win opportunities, preventing foreclosure of future options, and potential maladaptive practices.  Use
of these insight categories has proved to be successful in projects regarding drinking water, heat wave
mortality risks, riparian buffer zones, rolling easements, and sea level rise as well as shipping industry
changes necessitated by water level changes.

The climate and science community should recognize that models are not the only way to assess potential
effects of climate change. Using these categories of insight is another way to successfully link sound
science and sound decision making.

Questions and Answers
The speakers had an opportunity to address questions from the audience.

A brief question and answer period addressed a range of topics. These topics included: (1) the best
approaches for communicating scientific knowledge to the public; (2) the local impacts of climate change;
(3) making the effort to engage stakeholders from the beginning of the research process; (4) testing the
validity of models; (5) educating the public, through organizations that hold its trust, on the true state
of knowledge about climate change; (6) basing predictions on information other than what has happened
historically; and (7) the difficulties associated with predicting years into the future.

Section V: Delivering Science-Based Information to Decision Makers
Wednesday and Thursday, June 2-3, 2004
The purpose of this breakout session on the second and third days of the meeting was to focus on the
development of environmental indicators, the use of geospatial tools to support decision making,
mechanisms for environmental and health information exchange, development of science-based
information for coastal systems, scientific computing applications, improving the indoor environment,
and tools for net environmental benefit analysis. Each session included opportunities to respond to
audience questions that provided additional information and insight on a variety of science-based
information, analysis methods, and tools.

Michael Flynn, with the OEI Office of Information Access and Analysis, led a session addressing the
development of environmental indicators and analytical tools to link environmental conditions and public
health outcomes.  Presentations included an overview of the Report on Environmental Indicators and its
status, examples of integrated environmental monitoring and public health data systems, and a causal
analysis diagnosis decision information system to identify causes of biological impairments.

Brenda Smith and Wendy Blake-Coleman, with OEI, led a session addressing the use of geospatial tools
in support of decision making. Presentations included the development and implementation of an
emergency response analysis system, analysis of remote sensing data to determine trends of urban growth
and impacts of urbanization, and a Web-based tool enabling interested agencies and organizations to
obtain information on how specific projects will affect surrounding communities and the environment.

William Sonntag, with OEI, led a session addressing the use of information technology to provide greater
access to health and environmental information. Presentations included an overview of the National
Biological Information Infrastructure that provides access to data and information on biological resources
in the United States, an overview of the Environmental Information Exchange Network that was created
to share environmental information and promote information exchange over the Internet in a secure
network environment, and an overview of the EPA System of Registries, which supports metadata and
serves as a gateway for searching diverse EPA metadata repositories.

Kevin Summers, with NHEERL, led a session addressing the development of science-based information
for coastal systems.  Presentations included the development of an integrated network within the states and
tribes in the Western United States to assess environmental conditions of coastal areas, an overview of the
National Coastal Assessment Initiative to improve the overall health of coastal aquatic ecosystems
nationally and regionally, a State of Florida in-shore marine monitoring program, State of New
Hampshire efforts to assess estuaries and develop indicators, and National Coastal Assessment Program
activities in the Long Island Sound and Narragansett Bay areas.

Rick Martin, with the OEI Office of Technology Operations and Planning (OTOP), led a session
addressing enhancements in EPA's high performance computing capability. Presentations included the
implementation of a new high performance computing system; EPA initiatives to expand data acquisition,
storage, and manipulation by internal and external users; and a desktop of the future for information
access and analysis for decision making in the form of an Environmental Science Portal.

Elizabeth Cotsworth, with the Office of Radiation and Indoor Air (ORIA), led a session addressing how
science can be shared to influence public action for healthy buildings. Presentations included sources of
indoor air pollution and their prevention/control, research underway to understand and document the
health effects of indoor air pollutants, guidance and other materials developed to help the public take the
actions necessary to improve indoor air quality,  development and implementation of an Indoor Air
Quality Label for new housing construction, and strategies for public outreach and engagement.

Ann Whealan, with EPA Region V, and Bill Robberson, with EPA Region IX, led a session addressing
net environmental benefit analysis. Presentations included the use of this tool in environmental decision
making for emergency response and a case  study of its application in planning.

The Future of EPA's Environmental Indicators Initiative and Report on
the Environment
Following opening remarks by Michael Flynn, Director of the Office of Information Access and Analysis,
four speakers addressed indicators of healthy communities and ecosystems, the EPA Report on the
Environment, environmental public health tracking, and a causal analysis and diagnosis decision
information system. An audience question and answer period followed the presentations.

Indicators of Healthy Communities and Ecosystems

Heather Case, with OEI, provided an overview of the EPA Report on the Environment, which highlights
the conditions of air, water, and land in the United States and demonstrates their effects on life, health,
and ecological conditions. The OEI Environmental Indicators Team is working to report on what is
known and not known about the condition of the environment, improve the indicators and information
available to report on the condition of the environment, and support the use of indicator information for
EPA decision making (e.g., strategic planning, budget decisions, policy).

The Report on the Environment is a step in the overall initiative of the President's Management Agenda
of improving environmental indicator uses and relating the science behind indicator results to human
health and ecological conditions. The main goal of the Report on the Environment is to demonstrate how
EPA activities to protect the environment actually lead to positive outcomes, which are defined as
changes in emissions, ambient concentrations, exposure, disease trends, or the condition of an ecosystem.
The key to enhancements of environmental indicator activities is to understand the  degree to which EPA
program activities support these improvements. The Report identifies indicators (measures of
environmental results), describes the status and trends in the environment and in human health, and
describes what EPA knows about the current state of the environment at the national level and how it is
changing. A shorter report is available to the public and a longer technical report is available to
environmental professionals.

The data and information in the Report on the Environment underwent peer review and are supported by
sound experimentation, data management systems, and quality assurance procedures.  This Report uses
data and results from EPA databases, as well as other sources outside of EPA, including other Federal
agencies, states, tribes, and non-governmental organizations.

The format of the Report on the Environment follows a hierarchy of indicators, as described below:

•   Administrative Indicators
    o    Level 1 - EPA, state, tribal, or other government regulations and activities
    o    Level 2 - actions and responses by regulated and nonregulated parties

•   Environmental Indicators
    o    Level 3 - changes in pressure or stressor quantities
    o    Level 4 - ambient conditions
    o    Level 5 - exposure or body burden and uptake
    o    Level 6 - ultimate impacts or changes in human health and/or ecological condition.
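The six-level hierarchy can be sketched as a simple lookup table. This is a hypothetical illustration
only; the Report on the Environment describes the hierarchy in prose and does not define a data format:

```python
# Hypothetical encoding of the Report on the Environment indicator hierarchy.
# Level numbers, categories, and descriptions follow the list above.
INDICATOR_HIERARCHY = {
    1: ("Administrative", "EPA, state, tribal, or other government regulations and activities"),
    2: ("Administrative", "Actions and responses by regulated and nonregulated parties"),
    3: ("Environmental", "Changes in pressure or stressor quantities"),
    4: ("Environmental", "Ambient conditions"),
    5: ("Environmental", "Exposure or body burden and uptake"),
    6: ("Environmental", "Ultimate impacts or changes in human health and/or ecological condition"),
}

def classify(level: int) -> str:
    """Return a one-line description of an indicator hierarchy level."""
    category, description = INDICATOR_HIERARCHY[level]
    return f"Level {level} ({category}): {description}"
```

Note that the split between administrative indicators (Levels 1-2) and environmental indicators
(Levels 3-6) falls directly out of the table, so a tool built on it could, for example, filter a set of
candidate indicators to the environmental levels only.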

Some findings from the Report on the Environment include the following:

•   Air emissions of six criteria pollutants and their precursors have decreased by 25 percent in 30 years

•   The percentages of days in which at least one criteria pollutant exceeds an air quality index of 100,
    which is known as Code Orange, were at their lowest levels in more than 10 years in 2000 and 2001

•   Acid rain continues to decline in the East and Midwest

•   Stratospheric ozone levels declined over Seattle, Los Angeles, and Miami between 1979 and 1994.
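The Code Orange statistic in the findings above is an exceedance fraction: the share of days on which
at least one criteria pollutant's Air Quality Index exceeds 100. A minimal sketch of that calculation
follows; the pollutant names and daily values are invented for illustration:

```python
# Percentage of days on which at least one criteria pollutant's Air Quality
# Index (AQI) exceeds 100, the threshold for "Code Orange" conditions.
def percent_exceedance_days(daily_aqi):
    """daily_aqi: list of dicts, each mapping pollutant name -> AQI for one day."""
    exceed = sum(1 for day in daily_aqi if any(v > 100 for v in day.values()))
    return 100.0 * exceed / len(daily_aqi)

# Hypothetical three-day record: only the second day exceeds an AQI of 100.
days = [
    {"ozone": 80,  "pm25": 55},
    {"ozone": 120, "pm25": 60},
    {"ozone": 95,  "pm25": 98},
]
```

For this invented record, one day of three exceeds the threshold, so the function returns roughly
33.3 percent.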

ROE:  Focus on Human  Health and Ecological Condition Chapters (Overview of
the Outcome Chapters)

Denice Shaw, with OEI, summarized the outcome chapters in the Report on the Environment, which
focus on air, water, and land. These chapters answer the question "How do we understand and interpret
the information received from environmental indicators when assessing the status of our air, water, and
land?"  A human health chapter focuses on the health of the American public, which is generally good
and improving  when compared to statistical health data from other nations.  However, there is no direct
relationship between the trends found in human health and disease assessments and trends in exposure
to specific pollutants.

EPA has an abundance of information about the environment and human health, but linking trends and
noting cause and effect relationships cannot be done with confidence. Historically, there were
circumstances where the correlation between trends was clearer. For example, reductions in lead in
gasoline resulted in a clear reduction in blood lead levels, but there is no clear connection between blood
lead levels and human health.  The incidence of waterborne disease is declining, but uncertainty exists as
to whether the decline reflects reduced exposure through drinking water, well water, or recreational
contact such as swimming.  Similar examples include studies of chronic obstructive pulmonary disease,
cancer, and asthma.  Asthma attacks can be the result of exposure to air pollutants, as well as
other causes. When studying cancer deaths, many pollutants are carcinogens, but cancers also are
associated with many other factors.

When considering human health, the EPA has a real opportunity to more clearly identify the missing links
and to successfully overcome these issues with new goals, initiatives, and activities. When considering
ecological conditions, there are significant gaps in the availability of environmental indicators and data
that make it impossible to report on status and trends nationally. For example, there are data on the
ecological condition of forests in the United States based on the Forest Inventory and Analysis program,
which provides nationwide, representative snapshots of tree conditions and ozone damage to trees in the
United States.  After studying these data, EPA realized that ozone information was not collected in the
same way as the data for the Forest Inventory and Analysis.  Similar findings were discovered when
looking at coastal waters in which NOAA and EPA use an EMAP probability design to provide
nationwide, representative data for coastal water assessments.

In summary, the chapters on human health and ecological conditions show that there is a wealth of
information being  collected, and some trends have been found.  However, there is a great need to improve
the links and the assessments.

Environmental Public Health Tracking:  Moving Into the New Millennium (Human
Health Trends and Outcomes)

Dr. Judy Qualters, Chief of the Environmental Health Tracking Branch at CDC, introduced recent trends
in tracking environmental public health. Currently,  we do not understand how environmental hazards
relate to health effects in the United States.  Scientists have more data than ever, but there is little sound
knowledge of how to link environmental and human health assessments to current trends. There also is a
lack of data and data access, as well as a lack of tools. Therefore, there is a need to build capacity, as well
as standards to improve the data.

Congressional funding was provided to develop and implement a nationwide environmental public health
tracking network and to improve capacity in environmental health at state and local health departments.
The goal of this CDC National Environmental Public Health Tracking (EPHT) Program is to provide
information from a nationwide network of integrated environmental monitoring and public health data
systems so that all sectors may take action to prevent and control environmentally-related health effects.
The CDC National EPHT Program goals include building a sustainable, national environmental public
health tracking network, increasing environmental public health tracking capacity, disseminating credible
information, advancing environmental public health science and research, and bridging the gap between
public health and the environment.

Environmental tracking means surveillance.  This initiative requires looking at hazards, exposures, and
health effects; linking data within a tracking network; disseminating information to stakeholders; and then
improving the environment and human health.  Ideal characteristics of the National EPHT Network
include:

•   High quality, timely mortality and morbidity data with high resolution geographic coordinates

•   A wide range of information on exposures based on biomonitoring, personal monitors, or exposure
    modeling

•   Relevant, high quality, timely emissions data and monitoring data for air, water, soil, and food (all
    based on temporally and spatially appropriate sampling schedules)

•   Updated population data for denominators to calculate rates with adjustment for migration and socio-
    demographic factors

•   Ability to link geographically and in some situations, individually

•   Resolution that is fine enough to enable evaluation of effects from localized environmental exposures
    in small areas.
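Two of the characteristics above, population denominators for rate calculation and the ability to link
data geographically, can be illustrated with a small sketch. All county names, case counts, and
monitoring values below are invented, and the EPHT Network itself does not prescribe this structure:

```python
# Hypothetical sketch of geographic data linkage in a tracking system:
# join county-level health counts to county-level monitoring data, then
# compute a crude rate per 100,000 using a population denominator.
health = {"county_a": {"asthma_cases": 42, "population": 1_200_000}}
exposure = {"county_a": {"mean_ozone_aqi": 88}}

def linked_rate(county: str) -> dict:
    """Link health and exposure records for one county and compute a crude rate."""
    h, e = health[county], exposure[county]
    return {
        "county": county,
        "rate_per_100k": 100_000 * h["asthma_cases"] / h["population"],
        "mean_ozone_aqi": e["mean_ozone_aqi"],
    }
```

The county key is the linkage variable here; as the list above notes, a real network would also need
finer geographic resolution and individual-level linkage in some situations.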

Tracking data will enable scientists, analysts, and decision makers to quantify the magnitudes of
environmental issues, evaluate trends and risk groups, present hypotheses and data to support the
hypotheses, develop information for better clinical care and individual health actions, and facilitate
planning.

Program components include information technology and standards, communications, training, research,
legislation and policy,  and scientific methods. The National EPHT Program also requires partnerships
among stakeholders and government agencies, and these partnerships and collaborations are key.  For
CDC, these partnerships have involved state and local health departments, academic centers of
excellence, national public health and environmental professional organizations, advocacy groups, EPA,
and NASA.

CDC has funded 21 states at different levels for demonstration or pilot projects of environmental public
health tracking. Thus far, each state has its own initiative or program for gathering information and
conducting research. Also, to support the goal of partnering and collaboration, the states were not eligible
for funding unless the state health department worked with the local health and environmental
departments.  As a result of these pilot projects, CDC is hoping to create a state model that can be
followed by other states in the future.

Also, EPA and the Department of Health and Human Services have partnered with CDC and ATSDR to
advance efforts to achieve mutual environmental public health goals and strengthen the bridge between
the environmental and public health communities. These efforts include linking agency databases,
sharing timely and reliable environmental and public health data from agency networks, and increasing
interaction and enhancing collaboration between the agencies.

An example of partnering projects between CDC and the states is the New York Pilot Data Exchange
Project. The purpose of this project is to implement and test a system for exchange of air monitoring data
between the New York State Departments of Environmental Conservation and Health, examine
interoperability issues between CDC and EPA databases, and provide lessons learned to CDC and other
partners.  Other examples include the Wisconsin Environmental Public Health Tracking Data Linkage
Demonstration Project; the Public Health Air Surveillance Evaluation Project with the states of Maine,
New York, and Wisconsin; and the  Health and Environmental Linkage for Information Exchange with the
city of Atlanta.

All activities focus on moving into an implementation phase in late 2005.  More information can be found
at http://www.cdc.gov/nceh/tracking.

CADDIS: The Causal Analysis/Diagnosis Decision Information System

Susan Norton, with NCEA, provided an overview of the Causal Analysis/Diagnosis Decision Information
System (CADDIS), which is supported by NCEA, NERL,  several EPA offices, the Idaho  Department of
Environmental Quality, the Connecticut Department of Environmental Protection, the Maine Department
of Environmental Protection, Ohio State University, and the Minnesota Pollution Control Agency.
CADDIS helps investigators in states and tribes to identify causes of biological impairments and serves as
a Web-based system providing guidance,  examples, and links to information.

Determining the cause of a biological impairment is a continuous planning process, and the causal
analysis approach is based on the EPA 2000  Stressor Identification Guidance. The causal analysis
approach provides a logical method for analyzing evidence, making a case, and identifying useful
information. CADDIS supports state and tribal communities and users of the stressor identification
guidance with the goals of bringing together  guidance, tools, information, and case experiences.
CADDIS provides links to relevant information, helps to analyze and interpret evidence, and helps to
organize, quantify, and share results.  Also, CADDIS can be used for causal evaluations of real cases.

CADDIS, which can be found at https://cfpub.epa.gov/caddis, includes a step-by-step guide, an
interactive flowchart of the process, and examples of CADDIS activities.  Each step ends  with an output
statement so that the user knows what to expect in the next step. CADDIS also provides a complete listing of all blank
worksheets as word processing documents, a conceptual model, example projects and full case studies,
external data and information links, a list of references, site map, and search glossary.

In the near future, CADDIS also will include databases of empirical stressor-response studies, analytical
tools for users, a conceptual model library, and case study  examples. In addition, a new exposure-
response database will bring in field data and research, not just laboratory data or research reports. Case
studies with states and tribes are continuing,  and these results will be included on their Web sites so that
other groups can review the cases and model their own approach accordingly. The overall goal is to help
EPA link its efforts in biological assessment  and biocriteria to environmental outcomes by working with
states and tribal partners.

Questions  and Answers
The speakers had an opportunity to address questions from the audience.
94                          EPA SCIENCE FORUM 2004 PROCEEDINGS

A brief question and answer period addressed a range of topics. These included:  (1) improving outcome
measures in order to address the lack of broad data, data access, and data standards; (2) harmonizing data
collection efforts and initiatives among Federal, state, local, and tribal agencies and workgroups; and (3)
providing users of databases with the correct results and conclusions.

Using Geospatial Tools to Make Program Decisions
Following opening remarks by Brenda Smith and Wendy Blake-Coleman, with OEI, three speakers
addressed the use of geospatial tools for emergency operations, urban growth and land cover trend
assessment, and environmental impact reviews. An audience question and answer period followed the
presentations.

OEI Support for EPA HQ Emergency Operations Center:  Emergency Response
Analyzer

Joe Anderson, with OEI Office of Information Access and Analysis, described efforts to develop custom
software to support the EPA Headquarters Emergency Operations Center.  After the September 11, 2001
terrorist attacks, EPA recognized a need to renovate and improve the Emergency Operations Center. A
main goal is to visualize an emergency situation and assess its consequences. GIS technology is therefore
crucial for determining the location and effects of emergency situations and for integrating tabular data
with mapping. The outcome of
this initiative is the  Emergency Response Analyzer.

A pilot exercise for the Emergency Response Analyzer was completed in EPA Region II. As part of this
pilot exercise, scientists and engineers simulated a warehouse fire involving injuries and firefighter
fatalities that resulted in mobilization of the regional operations center. EPA and firefighters were on the
scene, and the goal was to coordinate mapping and simulations to
establish occurrences and results of the fire.  The mapping technology used during the pilot exercise was
demonstrated during this presentation.

OEI and EPA Region II integrated aerial photography and various map layers in order to visualize the fire
scene. The warehouse was located in an urban area, and major highways were nearby.  The scenario also
included closure of nearby highways that prompted other routes to be used during the rescue effort.

Another layer added to the visual, aerial photography was a view of other, nearby, EPA-regulated
facilities to determine and assess effects of the fire. For example, with a top tier risk facility nearby, the
goal may be to evaluate the chemicals at the facility and determine ways these chemicals may be affected
by the fire. Also, effects on the community would be evaluated based on the facility and its chemicals.
An additional goal was to visualize the size of objects, buildings, land areas, etc.  When measurements are
determined, scientists and engineers can better assess how many workers and rescuers are needed, as well
as the logistics of a rescue effort.

The Emergency Response Analyzer also can be used to predetermine any land area and to evaluate what
is already there.  For instance, a polygon is drawn over a land area with the mapping tool to represent a
zone of concern. This allows users to look at the weather, wind, and other conditions in order to
determine the effects of an emergency situation. Users also could retrieve data on the number of homes,
EPA-regulated facilities, ecological areas, and populations within this area. Sensitive areas can then be
protected as well as possible.
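
The zone-of-concern query described above reduces to a standard spatial operation: testing which known features fall inside a user-drawn polygon. The Emergency Response Analyzer itself is an internal EPA application, so the sketch below only illustrates the underlying idea with a ray-casting point-in-polygon test; the facility names and coordinates are invented.

```python
# Hypothetical sketch of a "zone of concern" query: which known points
# (e.g., regulated facilities) fall inside a user-drawn polygon?
# Uses the standard ray-casting point-in-polygon test.

def point_in_polygon(x, y, polygon):
    """Return True if point (x, y) lies inside the polygon (list of vertices)."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Count crossings of a horizontal ray extending right from (x, y).
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# Invented zone of concern and facility layer.
zone = [(0.0, 0.0), (4.0, 0.0), (4.0, 3.0), (0.0, 3.0)]
facilities = {"warehouse": (1.0, 1.0), "refinery": (3.5, 2.5), "school": (6.0, 1.0)}

affected = [name for name, (x, y) in facilities.items()
            if point_in_polygon(x, y, zone)]
print(affected)  # → ['warehouse', 'refinery']
```

A production system would run the same containment test against facility, population, and ecological layers at once, then join the hits to detailed facility reports.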

Currently, OEI has  an abundance of data and detailed information to determine the facilities located
within a particular area, a detailed facility  report for each site, and a list of hazardous wastes for each site.


This information is accessible depending upon which EPA database receives the facility-submitted
information (e.g., Biennial Reporting, TRI). Users also can determine whether there are environmental
regulatory non-compliance problems. This integration is extremely useful when evaluating the
effects of an emergency situation.

Future goals are to include more modeling that will enable scientists and engineers to pull in parameters
of the release associated with an emergency situation and to predict behavior and results. Currently, EPA
can determine wind and weather conditions at an emergency site situation, but a better capability would
be to use an interactive, real-time feed of weather conditions at a specific site.

The Emergency Response Analyzer (at http://intramapl7.rtpnc.epa.gov/era/em4er.asp) is only accessible
within EPA.

Assessing Urban Growth and Land Cover Trends Using Remote Sensing Imagery
and Landscape Metrics

Gary Roberts, with OEI Office of Information Access and Analysis, discussed projects in five urban areas
that applied remote sensing data analysis to determine trends in urban growth, or urbanization, and that
produced new technology and imagery science. Urbanization is commonly defined as
an increase in human habitation, combined with increased per capita consumption and extensive
modification of the landscape. Determining urbanization requires examining direct and indirect impacts of
the urban environment, including loss of natural resources, environmental degradation, land
consumption, and fiscal constraints. Assessments of urban growth and land cover trends are designed to
provide a historical perspective on urban growth, to evaluate growth, and to develop  indicators.

These five projects provide a historical perspective of land use change in urban areas; assess spatial
patterns, rates, trends, and impacts of urban growth using remote sensing and GIS; and illustrate urban
growth as a pressure on environmental resources (e.g., source water, air quality, and habitat) that can be
measured by indicators. The five areas of interest include Chicago, IL; Detroit, MI; Minneapolis, MN;
Phoenix,  AZ; and Raleigh-Durham, NC. Data resources and reference data used in these project areas are
extensive, and include NLCD, North American Land Cover imagery, multi-resolution land characteristics
imagery,  USGS digital orthophoto quarter quads,  United States population (census) data, Texas
Transportation Institute for transportation data, and state and local government economic data sources.

The projects also require the following image processing and analysis steps:

•   Acquire, subset, and re-project imagery

•   Geo-register images

•   Mosaic imagery using histogram matching

•   Clip imagery to final study  area

•   Classify land cover for 1970s, 1980s,  1990s, and 2000 data

•   Produce land cover change analysis products

•   Calculate land cover metrics and indicators.

Crosswalks among data sets also were necessary in order to classify data where needed.  Visual editing
was used to identify likely errors and provide quality control checks.
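
One concrete step in the list above, mosaicking imagery using histogram matching, can be sketched compactly. The idea is to remap pixel values in one scene so their cumulative distribution matches an adjacent scene's, so the mosaic has consistent brightness. The pure-Python sketch below works on tiny integer "images" for illustration; a real workflow would run equivalent operations in remote sensing software on full rasters.

```python
# Illustrative histogram matching on small integer pixel lists.

def cdf(values, levels=256):
    """Cumulative distribution of pixel values over the given number of levels."""
    hist = [0] * levels
    for v in values:
        hist[v] += 1
    total = len(values)
    out, running = [], 0
    for count in hist:
        running += count
        out.append(running / total)
    return out

def match_histogram(source, reference, levels=256):
    """Remap source pixels so their distribution approximates the reference's."""
    src_cdf, ref_cdf = cdf(source, levels), cdf(reference, levels)
    # For each source level, find the reference level with the closest CDF value.
    lookup = []
    for level in range(levels):
        target = src_cdf[level]
        best = min(range(levels), key=lambda r: abs(ref_cdf[r] - target))
        lookup.append(best)
    return [lookup[v] for v in source]

# Two hypothetical overlapping scenes with different brightness.
scene_a = [10, 10, 20, 30, 30, 30]
scene_b = [100, 100, 120, 140, 140, 140]
matched = match_histogram(scene_a, scene_b)
print(matched)  # → [100, 100, 120, 140, 140, 140]
```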

Landscape indicators are quantitative measurements of environmental condition or vulnerability. ATtILA
was used to determine landscape indicators/land metrics over the last 30 years for each metropolitan area.
ATtILA is an ArcView extension that requires input from land use/land cover, elevation and slope,
streams, roads, population, and precipitation. ATtILA outputs results into spreadsheets and as ArcView
shape files. Metric data were combined with census, transportation, and air and water quality data to
generate urban environmental indicators in each project.
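
ATtILA itself is an ArcView extension, but the kind of metric it produces can be illustrated simply: for example, the percentage of each land cover class within a reporting unit such as a watershed. The grid codes and watershed assignment below are invented for illustration.

```python
# Sketch of a basic landscape metric: percent land cover by class
# within one reporting unit (e.g., a watershed).

from collections import Counter

def land_cover_percentages(cells):
    """cells: iterable of land cover class codes for one reporting unit."""
    counts = Counter(cells)
    total = sum(counts.values())
    return {cls: 100.0 * n / total for cls, n in counts.items()}

# Hypothetical codes: 1 = urban, 2 = forest, 3 = agriculture.
watershed_cells = [1, 1, 1, 2, 2, 2, 2, 3, 3, 3]
metrics = land_cover_percentages(watershed_cells)
print(metrics)  # → {1: 30.0, 2: 40.0, 3: 30.0}
```

Tracking such percentages across the 1970s-2000 land cover classifications is what yields the 30-year trend for each metropolitan area.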

Other activities include:

•   Developing site-specific descriptive stories using the ATtILA landscape metrics

•   Assessing relationships among the metrics and land use development trends and patterns

•   Evaluating water quality and aquatic indicator correlations with landscape indicators

•   Expanding external partnerships and developing a refined methodology to share with partners

•   Continuing development of the urbanization and land cover trends Web site (at
    http://www.epa.gov/urban), which provides access to data and analytical tools for data visualization.

The main Web site provides links to all five projects, a general background of each city, animated
changes within the urban area, and review summaries of the overall trends.

NEPAssist: A Web-Based Mapping Application for Environmental Review

Julie Kocher, with OEI Office of Information Access and Analysis, discussed the development of a Web-
based tool to support analyses under NEPA, which requires all government agencies to assess their land
use projects by gathering data on the potential environmental impacts associated with each project. These
assessments are time-sensitive, very costly, and require EPA approval. There are many requests for
environmental project assessments from Federal offices, and there is a great need to simplify the process
of filing and reviewing EISs and environmental assessments, provide better access to core geo-data,
conduct environmental screenings of all proposed projects, and streamline the review process.

EPA created the NEPAssist GIS tool to assist with this streamlining. NEPAssist is a Web-based tool that
requires no licensing, data loading, desktop configuration, or training. NEPAssist is a distributed
application (via Web services), includes consistent data sets, and is available to any government agency
user.

A pilot project was conducted in EPA Region II with study areas selected using the NEPAssist Web site.
When a Federal,  state, local, or tribal project group is interested in a particular area, these government
agency users are  able to select or indicate their area of interest with the NEPAssist GIS tool and to obtain
information on how particular projects will affect their surrounding community, etc.  Users are able to
review strategic planning and multimedia program options, as well as look at proposed projects and
planning.

Generally, the NEPAssist Web service provides the following data for any predetermined area of interest:

•   Area measurements

•   Nearby rivers within 400 meters

•   Nearby drinking water sources and associated drinking water contaminants of concern

•   Wetland information

•   Critical habitats for endangered species


•   Flood protection

•   Environmental justice issues

•   Nearby coral reefs

•   National ambient air quality data

•   TRI screening data

•   National-scale air toxics data

•   Nearby state and local parks, fish and wildlife service refuges, national forests, and national estuary
    program study sites.
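
Several of the bullets above come down to a within-distance query: given an area of interest, return features (here, river points) within a fixed radius such as 400 meters. NEPAssist's actual data services are not public API calls, so the sketch below is purely illustrative; the coordinates and names are invented, and the flat-earth distance approximation is adequate only at this small scale.

```python
# Hypothetical within-distance query for a project area of interest.

import math

METERS_PER_DEGREE_LAT = 111_320.0  # approximate

def distance_m(lat1, lon1, lat2, lon2):
    """Approximate ground distance in meters for small separations."""
    dlat = (lat2 - lat1) * METERS_PER_DEGREE_LAT
    dlon = (lon2 - lon1) * METERS_PER_DEGREE_LAT * math.cos(math.radians(lat1))
    return math.hypot(dlat, dlon)

def features_within(site, features, radius_m):
    """Return names of features within radius_m of the site."""
    return [name for name, (lat, lon) in features.items()
            if distance_m(site[0], site[1], lat, lon) <= radius_m]

site = (40.7, -74.0)  # invented project location
rivers = {
    "river_a": (40.702, -74.0),  # ~220 m away
    "river_b": (40.71, -74.0),   # ~1.1 km away
}
print(features_within(site, rivers, 400.0))  # → ['river_a']
```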

Users of NEPAssist also can obtain environmental justice reports that provide a socio-economic and
population-driven background of the area, based on 2000 census data.

In addition, users can submit an application form via the NEPAssist Web service in order to seek approval
of a project or planned area and to obtain status reports of the review process. Once users have completed
the  NEPA review, an EPA reviewer can submit the approval letter and send it to the applicant via the
NEPAssist Web service.

NEPAssist also helps to raise important environmental issues at the early stages of land use project
development, results in performance improvements, provides easy access to region-specific geo-data and
customized regional assessments, and streamlines the review process. OEI is reviewing feedback and
comments from the EPA Regional reviews and Federal partners, is planning for a national expansion with
EPA Regions III, V, VI, and VIII, and is establishing other regional partnerships.

Questions and Answers
The speakers had an opportunity to address questions from the audience.

A brief question and answer period addressed a range of topics. These included: (1) useful links to assist
with the Emergency Response Analyzer; (2) partnerships with other agencies, such as NOAA and CDC,
to gather other associated data that would be useful in spill response scenarios; (3) current uses of remote
sensing and ATtILA within communities; (4) data standards and commonality among outside data sets
when using NEPAssist; (5) cumulative effects analysis using NEPAssist; (6) actual compliance history
within NEPAssist predetermined review areas; (7) use of state-specific data within NEPAssist; and (8)
initiatives to identify historic backgrounds affecting urban growth, as well as use of historic backgrounds
to make better decisions for watersheds, environmental communities, etc.

Delivering Consistent Information on Health and the Environment
Following opening remarks by William Sonntag, with OEI, three speakers addressed several initiatives to
manage and array metadata on biological resources, to promote environmental information exchange,
and to improve data access.  An audience question and answer period followed the presentations.

National Biological Information Infrastructure: Collaborative Opportunities in
Ecoinformatics

Mike Frame, with the USGS, provided an overview of the National Biological Information Infrastructure
(NBII) program and introduced NBII tools and services.  NBII is a broad, collaborative program to
provide  access to data and information on biological resources in the United  States. This information is
needed for environmental projects and initiatives. There are similar efforts underway in Mexico, Canada,
and other countries to promote data sharing.

There are a number of collaborative efforts underway between the USGS NBII program and EPA
including the Ecoinformatics Working Group, metadata standards promotion and propagation, Integrated
Taxonomic Information System, interagency biotechnology efforts, BioEco Interagency Group, EMAP
Partnership meeting, EPA representation on the NBII Science Advisory Committee, and biodiversity and
ecosystem grants and workshops.

NBII nodes include thematic, regional, and infrastructure project efforts. These nodes yield products,
standards, tools, and services for biodiversity data management and delivery. The goal of the NBII
program is to use thematic data, regionally and locally, and to provide and share the data nationally. The
USGS has produced a national Web site for NBII at http://www.nbii.gov.  The NBII Web site and its
associated links provide large quantities of information, as well as on-line mapping tools that allow users
to conduct trend analyses. The Web site also provides a searchable clearinghouse on principal nodes.  For
example, with the fisheries node, users can obtain fishing resource data by state, fishing conditions by
state, National Fish Strain Registry data, Pennsylvania Fisheries Explorer data, and Delaware River
Mapping data. With the bird conservation node, users can obtain population and habitat data on
migratory birds. Users of the NBII Web site also have access to a biocomplexity thesaurus that provides
a broader and more complete list of definitions of terms used by different agencies.

Content management is a key initiative.  The standard for cataloguing Web resources is based upon the
Dublin Core Metadata Initiative, with modifications for the biological community, and an NBII-wide
standard.
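
The Dublin Core Metadata Initiative mentioned above defines a small set of standard elements (title, creator, date, identifier, and so on) for describing Web resources. As a sketch of what cataloguing a resource against that standard looks like, the snippet below builds a minimal Dublin Core XML record; the record content is invented, and NBII's biological-community extensions are not shown.

```python
# Minimal Dublin Core record built with the standard element namespace.

import xml.etree.ElementTree as ET

DC = "http://purl.org/dc/elements/1.1/"
ET.register_namespace("dc", DC)

def dublin_core_record(fields):
    """Build a <record> element containing dc:* children from a field dict."""
    record = ET.Element("record")
    for element, value in fields.items():
        child = ET.SubElement(record, f"{{{DC}}}{element}")
        child.text = value
    return record

record = dublin_core_record({
    "title": "Hypothetical fisheries data set",
    "creator": "Example agency",
    "date": "2004-06-01",
    "identifier": "http://www.nbii.gov",
})
print(ET.tostring(record, encoding="unicode"))
```

A clearinghouse then indexes records like this one, which is what makes cross-node searching possible.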

In the future, NBII is expected to host an individualized Web site, my.nbii.gov, which will allow users to
alter information available on the main Web site  to fit individual needs and interests. The individualized
Web site will support remote offices and the NBII network (e.g., network operations, partner intranet,
information sharing and collaboration, leveraging of resources, and private information). The Web site
will also enable research (e.g., scientific collaboration, integration of data, peer review, and data analysis)
and deliver useful information to the public. With the use of GIS tools, the individualized Web site will
support access to the BioSafety project, Open Mapping  Application, and BioBot search.

Future NBII program priorities include the following:

•   Reporting trend information

•   Implementing a public portal

•   Fully implementing Web resources standards

•   Adding more nodes, such as Gazetteer and Address Finder

•   Developing simple data capture for reporting of events

•   Ongoing peer review of current and new applications

•   Hosting  semi-annual geospatial and information technology workshops

•   Expanding museum records, partnerships for content creation, and data holdings.

The USGS NBII program hopes to continue supporting  ecoinformatics and GIS/mapping activities;
collaborate with the NBII Fisheries and Aquatic Resources Node; host interagency biodiversity and
ecosystems workshops; and provide metadata integration, tools sharing, and standards support.

Environmental Information Exchange Network

Molly O'Neill, with ECOS, introduced the Environmental Information Exchange Network, which is
intended to promote data sharing to support better decision making among Federal agencies and
regulators as well as to improve the data that are available. Currently, EPA and state agencies require
better access to environmental information among partners, better approaches to information exchange
because current stovepipe approaches are inefficient and burdensome, and better integrated information
technologies and approaches. Also, states are modernizing their information systems and are migrating
away from EPA's national systems.  States and EPA are using different information to assess
environmental conditions, the status of land areas, and human health.

As a result, there is a need to collaborate and improve data standards, which is supported by the shared
vision between EPA and states:  "The States and EPA are committed to a partnership to build locally and
nationally accessible, cohesive, and coherent environmental information systems that will ensure that both
the public and regulators have access to the information needed to document environmental performance,
understand environmental conditions, and make sound decisions that ensure environmental protection."
The Environmental Information Exchange Network is an approach to move towards this goal and to share
data among EPA, states, and the tribes. Efforts began in 2000 to design the implementation of this
network and in July 2002, the first stage of implementation began.

The Environmental Information Exchange Network takes data and results from several partners and
shares this information via the Internet over a secure network. The information available to government
partners is based on data standards and the use of XML as the universal language for the data and
information exchange.

There are network nodes that represent Web services. These network nodes provide the hardware and
software used to exchange information on the Environmental Information Exchange Network. The nodes
can work with any type of data in order to share information.
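
The role of XML and shared data standards in this exchange can be sketched concretely: a partner serializes results into an agreed XML structure, and any node can read them back without loss. The element names below are invented for illustration and are not an actual Exchange Network schema.

```python
# Illustrative XML round trip for a data exchange between partners.

import xml.etree.ElementTree as ET

def to_payload(results):
    """Serialize monitoring results to a simple XML payload."""
    root = ET.Element("WaterMonitoringSubmission")
    for r in results:
        el = ET.SubElement(root, "Result")
        ET.SubElement(el, "StationID").text = r["station"]
        ET.SubElement(el, "Analyte").text = r["analyte"]
        ET.SubElement(el, "Value").text = str(r["value"])
    return ET.tostring(root, encoding="unicode")

def from_payload(xml_text):
    """Parse the payload back into result records."""
    root = ET.fromstring(xml_text)
    return [{"station": el.findtext("StationID"),
             "analyte": el.findtext("Analyte"),
             "value": float(el.findtext("Value"))}
            for el in root.findall("Result")]

results = [{"station": "NY-001", "analyte": "nitrate", "value": 1.8}]
payload = to_payload(results)
assert from_payload(payload) == results  # round trip preserves the data
```

Because both sides agree on the structure, machine-to-machine exchange replaces duplicate manual data entry, which is one of the cost savings cited later in this summary.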

Trading partner agreements support data exchange efforts and are provided in order to protect the
partners.  These agreements detail the data that partners agree to exchange and how often.  States, tribes,
and EPA can submit data or acquire the data from one another.  Currently, there are 37 states building
their nodes and systems, and approximately 10 states have operational staged nodes and systems.

The following are a few examples of key partnership projects and data sharing initiatives of the
Environmental Information Exchange Network:

•  eDMR Challenge Grants - developing electronic Discharge Monitoring Reports with states and EPA
•  Pacific Northwest Surface Water Quality Exchange Challenge Grant - exchanging surface water
   monitoring data between states with a focus on multi-state boundary watersheds and bringing in
   community data for the first time

•  Drinking Water Laboratory Challenge Grant - delivering drinking water laboratory results directly
   into state systems, providing timely and better quality data.

Future goals and applications for the Environmental Information Exchange Network include the
following:

•  Better quality data exchange

•  Data standards embedded in XML

•   Close coordination with the state and EPA Data Standards Council for new data standards to
    incorporate

•   Use of machine-to-machine technology to minimize data entry errors

•   Additional data exchange among new partners

•   Application of the infrastructure to new data exchange types (such as new partnerships involving
    health-related information from CDC) while connecting this information to environmental indicators

•   More state-to-state data exchange

•   More timely data, where data can be published and exchanged as soon as partners agree.

The Environmental Information Exchange Network already is successful in reducing the effort to verify
that exchange systems are communicating (i.e., interoperability  of Web services), reducing and
eliminating the costs of duplicate data entry, and providing infrastructure and mechanisms to support new
data flows between existing and new partners. More information on the Environmental Information
Exchange Network is available at http://www.exchangenetwork.net.

EPA's System of Registries, A Foundation for Consistent Environmental
Information

Larry Fitzwater, with OEI, discussed the challenges of data access for EPA and other government
agencies and a tool, the EPA System of Registries (SoR), that addresses them. The SoR provides a gateway
to EPA registries and search capabilities for linked Agency registries. The SoR also provides
identification information for objects of interest to EPA (e.g., data elements, XML tags, data standards,
substances of concern, terms, facilities, regulations, and data sets), and supports the Agency's data
standards program and technology initiatives.

An abundance of data and metadata exists within EPA, and there is a need to document and keep track of
this data. OEI has done this from a standards-based perspective with SoR, which physically integrates
application records in the Registry of EPA Applications and Database, data elements in the
Environmental Data Registry, data in the Substance Registry  System, and terms in the Terminology
Reference System. Also, SoR provides links to data sets in the EPA Environmental Information
Management System and facility records in the Facility Registry System.  An overview of each of these
systems is as follows:

•   Registry of EPA Applications and Database - an information system inventory that also integrates
    registries, forms the basis for physical or virtual linkages  among EPA metadata collections, and links
    to the Environmental Data Registry, Terminology Reference System, and the Enterprise Architecture
    Repository, among others.

•   Environmental Data Registry - based on data element metadata, code sets and value domains, and
    data standards, and provides a source of well-formed data elements, related XML tags, and value
    domains. This system also promotes the reuse of data in EPA systems; enables data sharing, data
    integration, and data comparability; and supports data standards development processes, metadata
    access, and distribution.

•   Substance Registry System - contains information about 85,605 chemicals and biological organisms
    from more than 1,000 different information resources, relates chemicals to regulations, and provides a
    business object registry. This system enables searches across chemicals, biological organisms, and
    physical properties; records metadata in one place; links standard nomenclature with the way
    substances are named in regulations; and relates substances found in a variety of information
    resources.  In addition, the Consolidated Health Informatics Initiative selected this system to store
    non-medicine chemicals. This system requires updating because of the multiple Chemical Abstracts
    Service (CAS) numbers assigned to chemicals.

•   Terminology Reference System - houses EPA definitions and terms (11,977 terms from 229 sources),
    data from the General Multilingual Environmental Thesaurus, and terminology from EPA program
    offices, information systems, regulations, and state collections. This system serves as a single
    resource of environmental terminology for the Agency.

•   Environmental Information Management System - includes data set information, geographic data
    sets, and ORD information products.

•   Facility Registry System - includes information about places of interest, implements EPA's facility
    identification standard, and provides a business object registry.


There is also an XML registry that stores trading partner agreements between states and EPA.

EPA's SoR relies on consistent environmental information; semantic management, including Web-based
tools for data management, documentation of meanings, and structured terminology; content
management of Web pages, documents and records, data in application systems, and data sets;
knowledge management; grid computing; and data and metadata management.  More information on the
SoR can be found at http://www.epa.gov/sor.

Questions and Answers
The speakers had an opportunity to address questions from the audience.

A brief question and answer period addressed a range of topics. These included: (1) ways to assess the
credibility of data being integrated into all of these new databases and system networks; (2) preparing for
the enormous growth of data that is going to be available and the ability to handle it; and (3) criteria to
determine when Environmental Information Exchange Network systems are ready for use or if a state is
in operational mode.

Developing Science-Based Information for Coastal Systems
Following opening remarks by Kevin Summers, NHEERL, six speakers addressed the National Coastal
Assessment Program, and coastal assessments for the Pacific  Coast, Florida, New Hampshire, and the
Northeastern United States.

From Tropical Beaches to Fjords, An Overview of Western Coastal EMAP,
Western Pilot Study

Henry Lee II, with NHEERL, summarized the Western EMAP, which is a part of the National Coastal
Assessment Program.  Researchers supporting the Western EMAP hope to build capacity within the
western states and tribes of the United States to monitor for status and trends in the condition of the
nation's coastal ecosystems, and to complete a second National Coastal Condition Report.

The sampling program supporting the coastal component of the Western EMAP utilized sites in small
estuaries of Washington, Oregon, and California (1999); large estuaries of Washington, Oregon, and
California (2000); coastal systems of Hawaii and south central Alaska (2002); estuarine tidal areas of
Washington, Oregon, and California (2002); continental shelf of Washington, Oregon, and California
(2003); and additional estuaries in Washington, Oregon, California, Hawaii, and southeast Alaska (2004).
Sampling activities have involved many partnerships for this long-term effort.

Based on the sampling efforts in 1999 and 2000, coastal sites were ranked as poor, fair, or good if a
predetermined number of contaminated sediment samples fell within certain ranges when compared to
values for "effects range median" (where adverse effects occur 50 percent of the time) or "effects range
low" (where adverse effects occur 10 percent of the time). Sediment contamination appears to be
localized with only about three percent of the sites ranked "poor," and these involved the San Francisco
and Los Angeles harbors, the Columbia River near Portland, OR, and some areas in Puget Sound.
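
The ranking logic described above can be sketched as a small classifier: count how many sediment measurements at a site exceed the "effects range median" (ERM) or "effects range low" (ERL) benchmarks, then bin the site. This is a hedged illustration only; the cutoffs here (any ERM exceedance is "poor", any ERL-only exceedance is "fair") and the two-analyte benchmark table are simplifications of the actual EMAP criteria.

```python
# Hypothetical sediment quality ranking against ERL/ERM benchmarks.

ERL = {"cadmium": 1.2, "lead": 46.7}   # illustrative benchmarks (mg/kg dry wt)
ERM = {"cadmium": 9.6, "lead": 218.0}

def rank_site(measurements):
    """measurements: dict of analyte -> sediment concentration."""
    erm_exceed = sum(1 for a, v in measurements.items()
                     if v > ERM.get(a, float("inf")))
    erl_exceed = sum(1 for a, v in measurements.items()
                     if v > ERL.get(a, float("inf")))
    if erm_exceed >= 1:   # above ERM: adverse effects expected ~50% of the time
        return "poor"
    if erl_exceed >= 1:   # above ERL only: adverse effects ~10% of the time
        return "fair"
    return "good"

print(rank_site({"cadmium": 0.5, "lead": 20.0}))   # → good
print(rank_site({"cadmium": 2.0, "lead": 20.0}))   # → fair
print(rank_site({"cadmium": 12.0, "lead": 20.0}))  # → poor
```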

Only one percent of the sites received a "poor" ranking for dissolved oxygen; however, 36 percent of the
sites received a "poor" ranking for water clarity. The poor water clarity may be a natural phenomenon
involving high rainfall, steep coastal terrain, and high tides. These studies are also addressing amphipod
toxicity, chlorophyll levels, nitrogen levels, and invasive species.

In 2002, the study included sampling of intertidal areas, which are important on the West Coast, where
estuaries have extensive tidal flats and roughly 50 percent of estuarine area is intertidal. This required
development of new
sampling strategies to address these challenges. Two pilot studies evaluated wetlands and used a
landscape approach in addition to a biomass sampling approach.  These studies evaluated landscape
conditions based on the ratio of tidal flat to tidal marsh, patch size frequency distribution of tidal marsh,
connectivity to tidal marsh patches, marsh edge area ratios, and percentage of land border undeveloped.
Data from these studies are just now becoming available.

Also in 2002, studies were initiated in Hawaii and Alaska. A coastal survey was conducted of all the
Hawaiian Islands. This again required modification of the sampling approach to address issues associated
with coral and the need to use scuba equipment; this study also analyzed samples for bacteria, which are
of particular interest to the State of Hawaii given problems encountered in certain areas. The effort in
Alaska presented a particularly challenging study environment: the coastal area is quite large, the water is
deep, the sampling season is short, and sampling site selection was complicated by the presence of
glaciers.

In 2003, a pilot study sampled areas of the continental shelf from the border of the State of Washington to
the United States border with Mexico.  This involved sampling in 30 to 120 meters of water as well as
sampling inside and outside of marine sanctuaries.

One of the advantages of probabilistic sampling design as is  used in EMAP is the ability to evaluate the
area of a resource meeting certain conditions.  Thus, many of the results are presented in mapped form.
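
The point above can be made concrete: with an equal-probability sample of sites, the fraction of sampled sites in a condition class estimates the fraction of the resource area in that class. The sketch below uses an invented 50-site sample and a simple binomial standard error; a real EMAP analysis would apply design weights and variance estimators appropriate to the actual survey design.

```python
# Estimating the area fraction of a condition class from an
# equal-probability (probabilistic) sample of sites.

import math

def area_fraction(rankings, category):
    """Estimated fraction of resource area in `category`, with a simple
    binomial standard error."""
    n = len(rankings)
    p = sum(1 for r in rankings if r == category) / n
    se = math.sqrt(p * (1 - p) / n)
    return p, se

# Hypothetical 50-site sample of condition rankings.
sites = ["good"] * 30 + ["fair"] * 15 + ["poor"] * 5
p, se = area_fraction(sites, "poor")
print(f"estimated poor area: {100*p:.0f}% "
      f"(±{100*1.96*se:.0f}% at ~95% confidence)")
```

Mapping these per-class estimates by region is what produces the condition maps mentioned above.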

The Western Coastal EMAP is providing the first regional-scale assessment of ecological conditions of
coastal ecosystems in California, Oregon, Washington, Hawaii, and Alaska. Preliminary findings of these
coastal sampling and assessment efforts include the following:

•   By sampling from the intertidal to the continental shelf areas, Western Coastal EMAP provides a
    spatially-comprehensive assessment of the coastal conditions for three states

•   Results from the 1999-2000 sampling efforts indicate that ecological conditions are generally good,
    but there are some exceptions for water clarity and invasive species

•   Sampling new habitats and environments presented a number of challenges and required developing
    new techniques.

Partnerships were key to the successful data collection and research efforts of the Western EMAP coastal
sampling program.
                             EPA SCIENCE FORUM 2004 PROCEEDINGS                          103

The Utility of NCA-type Monitoring Data for EPA Decision Making

Diane Regas, with OWOW, described efforts to assess and evaluate coastal conditions of the nation, and
Darrell Brown, with OWOW, provided an overview of EPA's coastal management program and
associated goals. EPA funds 28 National Estuary Programs to develop the National Coastal Assessment,
and these efforts are supported by partnerships with local and state governments, Federal agencies, and
local fishermen residing in or utilizing these coastal areas.

EPA's Strategic Plan provides the basis for these efforts, as indicated in the ocean and coastal elements
of Goals 2 and 4:

•   Goal 2, Clean and Safe Water
    o   Ensure drinking water is safe
    o   Restore and maintain oceans, watersheds, and their aquatic ecosystems to protect human health
    o   Support economic and recreational activities
    o   Provide healthy habitat for fish, plants, and wildlife.

•   Goal 4, Healthy Communities and Ecosystems - protect, sustain, or restore the health of people,
    communities, and ecosystems using integrated and comprehensive approaches and partnerships.

By 2008, the National Coastal Assessment Program will enable EPA to improve the overall health of
coastal aquatic ecosystems nationally and regionally as well as aquatic system health for the 28 estuaries
that are part of the National Estuary Program. Progress will be measured using the National Coastal
Condition Report indicators.

OWOW is using two response indicators and three stressor indicators in the first National Estuary
Program report being developed for publication in 2006.  The two response indicators are the benthic
invertebrates index and the fish contaminants index.  Stressor indicators include the water quality index,
sediment quality index, and coastal habitat index. Research efforts will help to answer questions such as:

•   How are local, regional, and national plans connected?

•   How do the goals and activities at the National Estuary Program watershed (or sub-watershed) level
    compare with the national goals?

A current program need is to aggregate and inter-relate the information being developed at the local
estuary level, through the National Estuary Program, to the national level. The Lower Columbia River
Program was offered as an example of assessing estuary condition at multiple scales: at a broad scale the
assessment involves the entire Pacific Coast, and at an even broader scale it involves large marine
systems. It may be necessary to relate the story of coastal health at all of these levels, which is important
to the National Coastal Assessment and its mission to identify targets for improving coastal and estuary
waters. This involves the aggregation of 2,000 national sampling sites under the National Coastal
Assessment along with the sampling sites from the 28 National Estuary Programs and other programs.

The next steps for the National Coastal Assessment Program involve inter-relating the National Strategic
Plan, Regional and Great Waterbody Strategic Plans, and state and local strategic plans.  The National
Estuary Program Report is anticipated to be published in 2006. OWOW is partnering with local
scientists,  NOAA, USGS, the U.S. Fish and Wildlife Service, and coastal states to complete these
activities.
Florida's Inshore-Marine Monitoring and Assessment Program

Kevin Madley, with the Florida Fish and Wildlife Conservation Commission, summarized the Inshore-
Marine Monitoring and Assessment Program (IMAP), which is Florida's statewide initiative under the
National Coastal Assessment Program. The primary focus of IMAP is coastal sampling; the current
season is the last of the 5-year process.

The sampling design is a two-tiered, probabilistic design following the EMAP criteria. In the sampling
design, researchers divided Florida's coastal areas into two scales—coastal/statewide and regional. IMAP
samples annually, addressing several locations throughout the State to cover a wide range of areas.
Sampling occurs in late summer each year, with a total of 180 stations sampled annually because of
Florida's long coastline.

A total of 30 sites were sampled each year (2000 to 2003), and the same number will be sampled in 2004.
Five sites are fixed, and the remaining sites are randomly selected. There are 19 different estuaries within
the sampling locations, and researchers utilize indicators such as water, sediments, benthic infauna,
nekton, and submerged aquatic vegetation to determine coastal conditions. These indicators are the same
for each sampling  site within Florida.

Typical water quality measurements are taken at each station, and sediment sampling is also completed.
Researchers also use structural indicators (dynamic indicators are not used in these sampling efforts), as
well as light measurements to characterize the water.

Although there have been large data collection efforts in the State of Florida, IMAP is the first program
in which national and state data are reported and evaluated together. IMAP covers only coastal areas, but it is a part of
the Integrated Water Resource Monitoring Network, which is an umbrella study under which marine,
fresh water, and groundwater monitoring are all conducted within a probabilistic framework. IMAP also
can be used to increase data availability for future National Coastal Condition reports and to fill in gaps of
the National Coastal Condition Report II (e.g., fish tissue contaminant concentrations).  IMAP data from
2000 to 2004 will be available for the next round of TMDL assessments, and can be used in annual 303(d)
reports and biennial 305(b) reports to the Florida Department of Environmental Protection.

National Coastal Assessment:  A Successful State-Federal Collaboration in New
Hampshire

Phil Trowbridge, with the New Hampshire Department of Environmental Services, described ongoing
coastal assessment activities and results. The New Hampshire Department of Environmental Services
developed a partnership with EPA in 2001 to support their efforts in the National Coastal Assessment
Program, enabling the Department of Environmental Services, for the first time, to assess 100 percent of
the estuaries within the State.  This partnership also includes the University of New Hampshire to support
data collection.

The National Coastal Assessment Program provides standardized indicators and methods, probabilistic
sampling design, national coverage, and the flexibility to add other indicators or designs as needed at the
state level.  The New Hampshire Department of Environmental Services utilizes the national coastal
assessment to improve 305(b) reports and aquatic life use assessments. Use of probabilistic surveys,
which are unbiased, provides better coastal assessment data and more useful results.

The national coastal assessment data also are being used to optimize study designs so that they include
more random samples and, therefore, provide better data quality. This random sampling approach was
utilized in studies of the Cochoco River, Southeast Great Bay, Salmon Falls River, and  Piscataqua River,
where New Hampshire national coastal assessment data were compared to actual samples of mercury.
The New Hampshire coastline is only 18 miles long, which benefited this study because sampling sites
were close together and the data points were tightly clustered. This enabled researchers to look at spatial
correlation and its effects on coastal assessments.

The National Coastal Assessment Program provides valuable insight into the bigger picture of the effects
of mercury in the Northeastern Region. In the past, New Hampshire ranked at the top of the region for
mercury concentration evaluations. However, upon visual representation of the national coastal
assessment data, researchers realized that the mercury results for New Hampshire directly result from the
distribution of the data.

In the field of technology transfer, New Hampshire's role has expanded over the last 5 years. Previously,
New Hampshire would collect data and send it to others to perform the interpretation. Now, most data
interpretation is done in-house.

In summary, the National Coastal Assessment Program has been a very successful partnership between
EPA and the New Hampshire Department of Environmental Services.  National coastal assessment data
has been used to meet the State's needs to complete 305(b) reports in a cost-efficient manner, and national
coastal assessment technology has built capacity in the State for coastal monitoring.

National Coastal Assessment: Approach and Findings in the  Northeast

Henry Walker, with NHEERL, provided an overview of the National Coastal Assessment Program
activities in the Northeastern United States, which are supported by partnerships between EPA, EPA
Regions, and several states.  The goals of these program activities are to assess ecological conditions of
estuarine resources, based on unbiased data of known quality; determine reference conditions for studies
on ecological responses and stressors; and build capacity in states and EPA regions.  In order to achieve
this, Federal and state agencies must partner in collecting, processing, and analyzing data samples;
developing state and regional infrastructure; and providing for communication and education of findings.
The value of this effort will be seen more clearly one or two decades from now. The baseline data used
in the National Coastal Assessment is based on sound science, and an effort of this large magnitude has
not been done before.  Benefits of this program are illustrated by the  experience of the State of Maine,
which did not have a coastal assessment structure to test waters when this program first started, and now
has methodologies and an approach unique to the State to evaluate their waters.

The National Coastal Assessment Program approach uses consistent, measured indicators and a
probability survey design that allows extrapolation and addresses 305(b) requirements. This Program
includes water quality  indicators, sediment quality indicators, habitat indicators, and biota indicators.
The coastal assessment approach also has the potential to incorporate existing monitoring programs and
hybrid monitoring designs. When considering a merger of existing monitoring data and thereby using
both predetermined site and random site selections, researchers faced the question as to whether the states
would be able to use both data sets or discard the results of past sampling efforts (i.e., lose previous
monitoring investment).  Additional questions related to the merger of existing monitoring data include:

•  How to use pre-existing sites without being biased to historical data?

•  If all existing sites are replaced with randomly selected stations, do we risk the ability to track trends
   and therefore lose valuable data?

In order to answer these questions, researchers used a combined set of data taken from the Long Island
Sound site, and incorporated the combined data set into the national coastal assessment probability
design. Comparison of the two data sets indicated that they were statistically comparable and provided
essentially the same answers. Therefore, using both data sets—historic data from predetermined sites and
data from randomly selected sites—would provide more reliable, accurate reporting of coastal
assessments based on percent area and "good, fair, and poor" area condition evaluations.
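The proceedings do not name the statistical comparison that was used; one plausible sketch is a two-sample Kolmogorov-Smirnov distance between an indicator measured at the historic fixed sites and at the randomly selected sites. All data values below are invented for illustration.

```python
# Illustrative sketch (not necessarily the presenters' actual method):
# compare the empirical distributions of an indicator at historic fixed
# sites versus randomly selected sites.

def ks_statistic(a, b):
    """Maximum distance between the empirical CDFs of samples a and b."""
    a, b = sorted(a), sorted(b)
    values = sorted(set(a) | set(b))
    d = 0.0
    for v in values:
        cdf_a = sum(x <= v for x in a) / len(a)
        cdf_b = sum(x <= v for x in b) / len(b)
        d = max(d, abs(cdf_a - cdf_b))
    return d

# Hypothetical indicator values (e.g., a sediment quality index score).
fixed_sites = [4.1, 4.4, 5.0, 5.2, 5.6, 6.0, 6.3]
random_sites = [4.0, 4.5, 4.9, 5.3, 5.5, 6.1, 6.4]
print(f"KS distance: {ks_statistic(fixed_sites, random_sites):.3f}")
```

A small distance suggests the two sampling schemes describe the same underlying distribution, which is the sense in which the combined Long Island Sound data sets "provided the same answers."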

Preliminary findings of the coastal assessments in the Northeast are summarized in Chapter 3 of the Draft
National Coastal Condition Report II. The Northeast Region is a very urbanized area, with a population
density of more than 125 people per square mile, and has a number of legacy pollutants because of its
many industrial areas.
The spatial distributions of water quality components, including dissolved inorganic phosphorus,
dissolved inorganic nitrogen, and surface chlorophyll, among others, are being mapped to look for
patterns, gradients, and percentage of area affected.

An interactive tool has also been developed to visualize and analyze data found in the Northeast Region.
This tool enables researchers to pull up data in Excel, map the data using a GIS component, and evaluate
data on a percent area basis, with ratings of "good, fair, and poor." This tool also enables researchers to
evaluate the effects of changing the rating thresholds on the outcome of the analysis.

In the near future, researchers will be able to use electronic, Web-based reporting with possible links to
National Coastal Assessment data, analysis tools, National Estuary Programs, and state environmental
management programs.

National Coastal Assessment: Monitoring and Modeling in Support of  TMDL
Calculations

Henry Walker, with NHEERL, presented research findings from the National Coastal Assessment
Program.  An important goal of this Program is to report, in 2008, additional  research and findings to
address current problems that have been identified in the coastal regions of the United States.

The assessment process begins with a survey of present conditions and, if applicable, requires follow-up
monitoring. Questions faced by researchers in performing the assessment include the following:

•   Where is follow-up monitoring needed?

•   Will opportunities to address other important issues be missed by supporting follow-up monitoring in
    a designated area?

•   Should sampling occur during another vulnerable period  and where?

An example involved the issue of low levels of dissolved oxygen  in the stratified waters of Narragansett
Bay, RI. In the summers of 1997 and  1998, sampling efforts showed that the dissolved oxygen content of
the upper Bay area was lower than expected. Researchers thought that this could be caused by nutrient
loadings from the rivers flowing through, as well as the large number of wastewater treatment facilities.
Although there were low levels of dissolved oxygen in the Narragansett  Bay, this did not cause the Bay to
be placed on the list of impaired waters of the Northeast at that time. However, researchers wondered
whether continuous, follow-up monitoring efforts should be conducted to avoid problems in the future. In
August 2003, an episodic fish kill confirmed that there was a problem.

This is a great example of the potential need for diagnostic monitoring. National coastal assessment data
are used only to explain the present conditions of coastal areas. The data alone cannot be used to predict
future problems or future problem areas, nor do the data provide suggestions for regulatory changes or
management actions to protect areas of concern.  However, diagnostic monitoring can assist in these
efforts.
 In the case of Narragansett Bay, diagnostic monitoring was used to anticipate, to a spatial extent, an acute
 dissolved oxygen event.  This prediction was based on observations of low oxygen content in the surface
 layer being less than the chronic criterion for a 10-day period, and low oxygen content at 0.5 meters
 above the bottom being less than the acute criterion for a 5-day period.
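The duration-based screening logic described above can be sketched as follows. The criterion values and daily series are hypothetical, since the actual chronic and acute dissolved oxygen criteria are not given in this summary.

```python
# Assumed criterion values (mg/L) for illustration only; the real
# chronic/acute dissolved oxygen criteria are not stated in this summary.
CHRONIC_DO = 4.8
ACUTE_DO = 2.9

def sustained_exceedance(series, threshold, days):
    """True if `series` (one value per day) stays below `threshold`
    for at least `days` consecutive days."""
    run = 0
    for value in series:
        run = run + 1 if value < threshold else 0
        if run >= days:
            return True
    return False

# Invented daily dissolved oxygen readings (mg/L).
surface_do = [5.1, 4.7, 4.6, 4.5, 4.4, 4.3, 4.2, 4.1, 4.0, 4.2, 4.1, 4.6]
bottom_do  = [3.5, 3.1, 2.8, 2.7, 2.6, 2.5, 2.4, 3.0, 3.2, 3.4, 3.6, 3.8]

# Flag when the surface layer is below the chronic criterion for 10 days
# AND the near-bottom layer is below the acute criterion for 5 days.
flag = (sustained_exceedance(surface_do, CHRONIC_DO, 10)
        and sustained_exceedance(bottom_do, ACUTE_DO, 5))
print("Acute event anticipated" if flag else "No flag")
```

The point of the sketch is that the prediction rests on the persistence of low oxygen, not on any single low reading.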

 The National Coastal Assessment data can also be used for problem solving such as for the issue of
 nitrogen overloading. Using the  SPARROW model, the product of a partnership between  EPA Region I
 and the USGS, researchers were able to input National Coastal Assessment data that detailed atmospheric
 deposition and different types of nutrient fluxes from several agricultural areas and forested lands. The
 model provided estimates of conditions in lakes and streams as well as fluxes of nutrient loadings in a
 predetermined area, and enabled researchers to determine point sources of nutrient loadings. Efforts have
 also been successful in integrating data from the SPARROW model and the Estuary Nitrogen model to
 determine nitrogen levels in Narragansett Bay. When comparing the combined model output with actual
 sampling data from a past sampling event, the results  showed that model output  for nitrogen
 concentrations was within 0.2 mg/L of sampling results.

 Use of diagnostic monitoring as described above can help to identify when impairments are expected and
 why.  From this information, management and regulatory decisions can then be determined.

 Scientific Computing
 Following opening remarks by Rick Martin, with OTOP,  three speakers addressed mechanisms and tools
for high performance computing  and their applications to EPA programs.

 The Center of Excellence for Environmental  Computational Science

 Joseph Retzer, with OTOP, discussed the need for scientific computing capability at EPA and the
 evolving role of the Center of Excellence for Environmental Computational Science.  Scientific
 computing is the provision of high performance computing  services and infrastructure on demand. Major
 drivers for enhanced scientific computing capability at EPA include:

 •   Increased internal and external collaboration between scientists and the associated need to interface
    with other computing systems and data repositories with different data security requirements

 •   Tremendous shift toward in silico science

 •   Creation of enormous amounts of new data (up to terabytes in size) with associated management,
    storage, and transfer issues

 •   Evolution of technology including opportunities to cluster computers (i.e., link together a number of
    different computers) and to develop the scientific desktop of the future, which links high-end desktop
    computers across a grid.

 The Center of Excellence for Environmental Computational Science involves an EPA team (from OEI,
 ORD, and other Offices) that brings together cutting-edge science with information technology solutions
 to upgrade and enhance EPA's scientific capabilities. This  initiative has three goals:

 •   Build a network for environmental research with improved collaboration tools, a research
    subnetwork, and grid computing

 •   Develop a science portal to provide the tools and  information needed by EPA decision makers and
    scientists

 •   Upgrade high-end computational capability.
Information technology collaboration is essential to EPA science and this Center of Excellence. EPA has
developed an efficient, effective, and tight system that now receives positive recognition from outside the
Agency.  EPA now faces the need to enhance collaborative capability, which may require adjustments
beyond just the firewall or how EPA handles information security. Therefore, in the fall of 2003, EPA
began to investigate the information technology impediments to scientific collaboration, and identified a
number of specific issues:

•   Difficulty with electronic file transfer (i.e., those files not suitable for email)

•   Web seminar attendance/hosting

•   Difficulty connecting to databases outside the EPA firewall

•   Log-in capability to other systems

•   Difficulty for others to log into the EPA system

•   Difficulty for employees to log in from home or on travel

•   Help desk abilities/understanding to be able to support scientific applications

•   Obtaining EPA security practices supportive of scientific needs and outside interface/collaboration.

Decisions and actions taken in early 2004 to address these issues include the establishment of a scientific
server for electronic file transfer, adding a Scientific Access Coordinator to work with scientists who are
having access difficulties, and establishing a work group to specifically address collaboration issues. The
Scientific Electronic File Transfer Server Project has developed a system without "per use" or other
access fees that is accessible to outside collaborators while maintaining adequate security. The new
server has 3.5 terabytes of storage to support large files and became available internally to ORD scientists
in May 2004; access by all EPA users is anticipated in the summer of 2004. The
Scientific Access Coordinator responds to firewall and other external collaboration issues. This
Coordinator understands the information technology access needs of the scientists as well as the limits
imposed by EPA's computer security systems, and can work with the EPA security team to find solutions
for scientific access needs.

EPA also has begun a process to identify desirable design features of a separate EPA scientific
subnetwork. Some of the issues being addressed include how to use a secure shell for data encryption,
access to computers outside of EPA, connecting scientific applications to databases from outside EPA,
providing remote access for scientific users, and obtaining approval for passive file transfer protocols
through the EPA firewall.

The next steps include the addition of features such as electronic signature to initiate accounts, building
out the science grid and portal that will become major collaboration tools, and further defining
requirements for the scientific subnetwork.

Current Projects and High Performance Computing and Visualization Direction

John Smith, with OTOP, discussed the new computing system installed in February 2004, the growth in
data, and grid computing. EPA has several hundred servers and a mainframe at Research Triangle Park.
The new computing system has multiple nodes dedicated to interactive processing, input/output, and
batch processing that will enable enhanced support to long-running calculations, with a processing
capacity of 768 Gflops. The new system also has a high performance file system (16 terabytes) to
ensure that the file system is not limiting the computing speed. This new computer system is the latest in
a series of high performance computing expansions that began in 1992 with the acquisition of EPA's first
supercomputer.


This recent computer acquisition will ease computing constraints encountered by EPA.  There has been a
significant increase in computation processing demand, which is currently running at over 40 percent of
the new system's capacity, and is anticipated to be at nearly 100 percent of system capacity later this
summer. Projected mass data storage requirements are anticipated to increase dramatically over the next
2 years. The costs of storing and managing the data are significant.

Grid computing is a form of distributed computing that involves coordinating and sharing computing,
application, data, storage, or network resources across dynamic and geographically dispersed
organizations.  The new computer system will be part of a grid to help share data and resources. There is
a strategic long-term plan that includes the use of clusters that can support certain types of processing.
Currently EPA has islands of computing capability with data that few can access and the intention is to
integrate all of that.

There are two types of grid technology—data and computing. The data aspect focuses on moving data,
provides the ability to integrate distributed data (e.g., from clusters), and provides mechanisms for data
access as well as external access to multiple data sources. The computing aspect involves the sharing of
computing resources across a grid. Not every application is amenable to that type of architecture and
security (e.g., access, policy,  authentication, authorization) is important. This approach helps to use
resources efficiently.

The first phase of this effort was to deploy a data and computational grid across no more than two
locations.  This has been completed and was extremely successful. Both locations are in Research
Triangle Park, and it has been possible to access the grid from other locations to process computing jobs.
Efforts are also underway to identify applications that may be suitable for the grid. Commercial-off-the-
shelf and open source grid technologies are being used in order to avoid new development and custom
techniques/equipment as much as possible.

The second phase addresses the management aspects of grid computing (e.g., performance, obtaining
certificate authority, funding after August, MOUs), identifies other grid-worthy applications, extends the
grid beyond the EPA firewall to external partners, and provides access to the grid through the Science
Portal without any security incidents.

Growing the Environmental Science  Portal

Terry Grady, with NERL, discussed the development of the Science Portal and its potential applications.
There is a need for better informed environmental decisions and improved science products. Both of
these depend on collaboration, knowledge sharing, better use of available data, and more widely available
tools/models. The overall goal is better decisions, uses, and outcomes for the science.  The Science Portal
is anticipated to be a significant tool to advance and implement the Agency's science.

What is a portal? "Applications that enable (organizations) to unlock internally stored information,
and provide users with a single gateway to the personalized information and knowledge to make
[collaborative] informed business decisions."—Merrill Lynch, Inc.

What does a portal do?  OEI developed a plan for an Environmental Science Portal with a certain basic
set of capabilities that encourages a self-directed work environment, allows users to leverage content and
integrate data, provides an easy-to-use interface across applications supporting manipulation on the same
screen, and provides smart, effective, secure access to internal and external systems. This type of portal
has management systems for content, identity, and data sorting/sifting beyond application integration and
information delivery.
The Environmental Science Portal will provide many capabilities such as shared work spaces, document
and data repositories, real-time interactions between workers, desk-top visualization, discussion forums,
and audio/video conferencing. This Portal will also provide access to a network of high-end scientific
computing resources such as the EPA computer grid, shared control of instrumentation via the Web, and
use of advanced models, tools, and applications via the Web (e.g., GIS, visualization).  The emphasis in
its design is on users such as decision makers and risk managers.

Advantages of the Environmental Science Portal  include:

•   Business or science on demand that provides access to EPA information through a single access point

•   Customization to individual users to organize information for their ease of use

•   Personalization that enables users to tailor pages to user preferences, job functions, characteristics,
    and use history

•   Content management

•   Collaboration tools

•   Modular structure to provide a framework for rapid integration of independent applications

•   Efficiency through a single point of entry to display everything at once to users.

The Portal also provides identity management and access control that will support EPA employee access
from work, home, or travel as well as access by Regions, states, tribes, and other trusted partners.

The prototype Environmental Science Portal has been developed and is being demonstrated at this
Science Forum. The first test version is anticipated to be released in September 2004, with the first public
version anticipated in May 2005.

Closing and Questions
The speakers had an opportunity to address questions from the audience.

A brief question and answer period addressed a range of topics. These included: (1) EPA participation in
the overall government revitalization efforts in the areas of supercomputing and computer networking; (2)
significant decreases in high performance computer system cost in conjunction with significantly
increased processing capacity (e.g., the Cray supercomputer cost $10 to 12 million in 1992 and the new
EPA computing system cost just under $2 million); (3) establishment of an interdisciplinary team within
EPA to bring together security, policy, contracting, contractors, and others to address the many issues
raised by the scientific community, which includes data storage; (4) movement toward a storage area
network or a storage grid strategy to make better  use of existing data storage and to ensure that EPA data
are carefully protected; and (5) the need to communicate the availability of this additional high
performance computing capability across EPA, as there are potential users beyond those in ORD.

A demonstration of the Environmental Science Portal followed the audience question period.

Healthy Communities—One Building at a Time
Following opening remarks by Elizabeth Cotsworth, with ORIA, five speakers addressed the sharing of
science to influence public action and to promote healthy buildings and indoor environments.
Indoor Air Quality:  Knowledge Base and Gaps

John Girman, with ORIA, provided an overview of indoor air pollution, its sources, and its effects.
There are many sources of indoor air pollution, including building materials, paints/finishes, products
(ranging from cleaning to personal care), human activities, and outdoor air. All building types are
affected, including residences, offices, and schools.

There are a variety of health effects from indoor air pollution, such as lung cancer, asthma, irritation
effects, and neurotoxic effects.  These effects are similar to those experienced from outdoor air exposure,
but indoor exposure levels may be different.  For example, indoor exposures to pollutants are higher
(often 2 to 5 times higher) than outdoor exposures, and people spend up to 90 percent of their time
indoors. In addition, risks from indoor air pollution are high as shown in risk assessments for radon and
environmental tobacco smoke.  Risks from other pollutants without risk assessments (such as air toxics,
mold) are also believed to be significant.

The basic principles for managing indoor air quality, in order of priority, include source control (i.e.,
remove, substitute, or modify the source), ventilation (general and spot), and air cleaning, which is
somewhat less effective than the first two methods.

Research is underway to understand the various types and sources of indoor air pollution. A recent
Building Assessment Survey and Evaluation (BASE) study examined the activities that occurred in 100 buildings,
which produced a very rich data set that is just beginning to undergo evaluation.  The study examined the
types of rooms and usage that occur within buildings, and found unexpectedly high occurrences of
laboratories and graphic arts/print shops (around 25 percent).  Office renovations were also examined and
the study found that painting occurred almost continuously in about one-third of the buildings; in
addition, renovations also include installation of new carpet and partition/wall work, which require
different strategies for air quality.

Another area of investigation involved the occurrence of water damage and leaks (which support the
potential for mold to occur), and the findings indicated  that 45 percent of the buildings evaluated have
current water damage and about one-third of the buildings have leaks in occupied spaces.  This indicates
deficiencies in how buildings are built/operated, and either products/systems are being used that are not
easily maintained or there is a problem in how the buildings are put together.

Building maintenance activities were also examined; 44 percent of the buildings surveyed did not perform
heating, ventilation, and air conditioning system balancing. There also were many instances of monthly
(or more frequent) pesticide applications of which occupants may not be aware.

The study also examined median indoor-to-outdoor concentration ratios, which indicate whether specific
substances originate outside or inside a building. Initial findings indicate that the buildings
themselves are contributing toluene, limonene (a lemon scent that is also an irritant), and other
contaminants.
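The indoor-to-outdoor ratio is a simple diagnostic: a median ratio well above one suggests an indoor
source, while a ratio below one suggests an outdoor origin. A minimal sketch of the calculation (the
pollutant names and concentration values here are hypothetical illustrations, not data from the study):

```python
from statistics import median

# Paired (indoor, outdoor) concentrations in ug/m3 -- hypothetical illustration values
samples = {
    "toluene": [(12.0, 3.1), (9.5, 2.8), (14.2, 3.5)],
    "ozone": [(8.0, 30.0), (6.5, 28.0), (7.2, 33.0)],
}

for pollutant, pairs in samples.items():
    io_ratio = median(indoor / outdoor for indoor, outdoor in pairs)
    # Median I/O ratio > 1: the building itself is likely contributing the pollutant
    origin = "indoor source likely" if io_ratio > 1 else "outdoor origin likely"
    print(f"{pollutant}: median I/O ratio = {io_ratio:.2f} ({origin})")
```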

There also have been a number of studies linking dampness and mold to respiratory illness, which is a
leading cause of school and work absences. These studies have shown that high ventilation rates and low
pollutant concentrations help improve health and productivity. A Danish field study in 2000
examined three offices in two countries, and found a two to six percent increase in the Intelligence
Quotient through changes in ventilation and removal of old carpeting.  This has been confirmed in other
field studies.
112                          EPA SCIENCE FORUM 2004 PROCEEDINGS

ORIA has developed a research planning document entitled Program Needs for Indoor Environments
Research (PNEIR). This is a list of topics (such as pollutants, sources, health effects) that support
planning for additional activities in this area.

The ORIA indoor environment programs are largely voluntary and there is much emphasis on outreach
and guidance.  Ongoing programs address radon testing/mitigation, smoke-free homes, asthma, indoor air
quality tools for schools, design tools for schools, and the Indoor Air Quality Building Education and
Assessment Model (I-BEAM) for office buildings. Additional information may be found at
http://www.epa.gov/iaq.

Indoor Environmental Research Base

Jim Jetter, with the National Risk Management Research Laboratory (NRMRL), discussed ongoing
research within ORD that is directly related to the indoor environment.  These research projects measure
sources and levels, and develop models to evaluate mitigation and prevention options. NERL uses these
results to more extensively develop exposure models and measure exposure in field studies, while
NHEERL uses the information from this research to determine effects and to assess risk.

Research facilities include a small chamber laboratory for testing small samples and a large chamber
facility to test emissions from larger items and to study pollutant interactions in a controlled space. There
is also a research house for real world testing, and a  Biological Chamber laboratory to study mold and
other biocontaminants. Research projects usually start with small chamber laboratory testing followed by
development of models that are subjected to large chamber testing.  The models are further improved and
then tested in the research house.

Pollutant sources addressed in this research include carpeting, interior paint, furniture, cabinets, and office
equipment. Activities include collaboration with industry and EPA program offices, and the development
of standard test methods for voluntary adoption by industries and other programs.

Mold is an important indoor environment issue. Research in this area includes evaluating treatment of
building materials to inhibit mold growth, developing methods to evaluate mold growth on porous
surfaces, testing products for mold remediation and bacterial decontamination, and developing
identification techniques for mold. Asthma and allergens are another important area, which includes
research on mold and biocontaminants that exacerbate asthma and collaboration with NHEERL in an
asthma study.

There are also research activities underway through cooperative agreements with Syracuse University
and others. These include quantifying the productivity/performance benefits of
indoor environmental quality to help promote improvement of the indoor environment, investigating the
resuspension of fine PM from indoor human activities, and evaluating the transport of air, moisture,
pollutants, and energy in buildings.

NERL research in this area includes multi-media method development (e.g., air, dust, residues, water,
diet), laboratory- and pilot-scale  studies such as pesticide fate and transport at the research house,
measuring exposure in field studies, and modeling exposures and dose (e.g. SHEDS model).  NERL is
also conducting research involving PM, air toxics, pesticides, and persistent organic chemicals with a
focus on measuring exposure concentrations/factors, sources, routes, pathways, and other indoor
environment variables for susceptible populations and the general population.  Examples include several
recently completed studies such as the PM Panel studies and the Children's Total Exposure to Persistent
Pesticides and Other Persistent Organic Pollutants study.
NHEERL is conducting studies of mold allergens that may induce allergic asthma and indoor
contaminants that induce or exacerbate asthma (e.g., dust mites, cockroaches, and mold allergens).
Identification of biomarkers of exposure and effects is of specific interest.

There are several future research strategy drivers, including the ORD Strategic Plan and Multi-Year
Plans, the Healthy Buildings document, the Healthy People document, and PNEIR. EPA also participates
in an Interagency Committee on Indoor Air Quality that provides for information exchange among
Federal agencies involved in research on indoor air quality. In the future, other clients, such as the EPA
regions, will also drive research activities.

Delivering Technical Assistance

David Mudarri, with ORIA, provided an overview of how the information, developed from the research
described in the two previous presentations, is shared with other organizations and helps to encourage the
public to take the voluntary actions necessary to improve their indoor air quality. An important aspect in
developing these outreach products is to maintain a certain level of scientific leadership in this field.  This
presentation focused on three different product/subject areas: office and institutional buildings, green
buildings, and schools.  All of the products discussed may be found at http://www.epa.gov/iaq.

Office and Institutional Buildings

ORIA has developed guidance and tools to address mold and building management for a healthy indoor
environment. The goals were to develop quality  documents that sell themselves, offer practical and
feasible guidance, and respond to user needs. A guidance document on mold has been very well received,
downloaded from the ORIA Web site 150,000 times every month for the last 18 months; its popularity
speaks well for EPA. Moisture and mold in existing buildings and new construction are currently topics
of great public interest.

I-BEAM is another ORIA product, and is a self-contained guidance software package that is accessible
via the Web. This tool provides information to building managers and building professionals on healthy
and cost-effective building management techniques. I-BEAM provides, in one place, a series of
educational tools, management tools, detailed maintenance tools, and budgeting tools. I-BEAM can be
used as guidance or as a training tool. The presentation included a brief demonstration of the I-BEAM
software, which involves a series of windows, like chapters in a book, and helps the user to diagnose a
problem, develop an energy retrofit program, develop a management program, or to figure out heating,
ventilation, and air  conditioning system needs for indoor air quality protection.  I-BEAM is also available
in Word format so that users can modify it for their own uses.

Green Building Guidance

The green building field is growing rapidly and much interest has developed in  this type of guidance.
Some state and local governments have guidance, but EPA desired  to take a leading role in order to
improve the quality of the information being disseminated. As a result, ORIA has developed very
comprehensive guidance that is undergoing review, with posting on the Web anticipated to occur in the
fall of 2004.

Program for Schools

ORIA has developed an extensive and very popular indoor air quality program for schools. This includes
design tools in the form of Web-based guidance with a focus on school systems and those who design
schools.  The program materials provide background  information, core guidance, and a tool kit that links
to EPA tools. The program materials also provide links to guidance and other information that is
available from EPA and others such as the State of California, which tends to be a leader in this area.

EPA's New Indoor Air Quality Label

Sam Rashkin, with the Office of Atmospheric Programs, discussed the Energy Star program (that
addresses documented energy savings) and a joint cooperative partnership to develop a similar label for
indoor air quality. This is an example of how to make science work in the market place.

The platform for developing the indoor air quality label is the existing Energy Star program for homes.
This is a voluntary program that provides credibility from a government-backed program, sets a standard
for an energy-efficient home, and includes third party verification. Such a program must be profitable;
otherwise it will not work on a voluntary basis. The program involves a home energy rating system that
includes field inspections and diagnostics.  Every builder claims to build energy efficient homes, and the
Energy Star symbol indicates that the claim is verified.

Typical measures of an Energy  Star home include properly installed insulation (which is more  important
than the R-value), a continuous air barrier, advanced  windows that trap heat in the winter and keep heat
out in the summer,  efficient equipment with more guarantees and quieter/better performance, and tight
ducts to reduce the amount of heated/cooled air that is lost to the outside. These measures do not include
indoor air quality components, but they do provide indoor air quality benefits.

The Energy Star program is showing significant popularity as evidenced by a huge growth in homes
labeled with Energy Star—from nearly  zero to 115,000 homes between  1996 and 2003 with the number
doubling in the last 2 years. The key to this success is involving the production builder whose
participation then steers the rest of the industry. This program is particularly popular in the Southwest,
Northeast, and Texas with participation also seen in Florida and the Midwest.  Nearly 20 to 40 percent of
housing starts in these areas may have the Energy Star label.

The recently developed indoor air quality label indicates that the home is constructed with the following
features:
•   Moisture control through proper grading away from the home and measures such  as foundation
    sealing and drain tile installation to keep moisture from coming into the home

•   Drainage to remove water from behind siding

•   Continuous air barrier to prevent moisture from entering wall assemblies

•   Radon  control systems and barriers

•   Screens and barriers, including a foundation shield, for pest detection and control

•   Properly sized heating, ventilation, and air conditioning systems with tightly sealed (with mastic) and
    balanced ducts as well as proper air filtration

•   Combustion safety measures, such as vents for the furnace and water heater, to contain and remove
    fossil fuel fumes

•   Use of nontoxic materials.

Moisture control and ventilation are important to controlling water damage and mold. Important elements
are barriers to prevent attraction of moisture to a low moisture home, draining every surface on the
exterior of  the home, and protecting building materials before they are placed in new construction to
prevent the introduction of moisture, mold, and insects. Once constructed, interior doors should not be
kept shut, as this prevents proper ventilation and forces air into other areas.

The new label is very simple for the consumer:  Energy Star with Air+.  This new label system adds
quality to Energy Star homes and will help to inform consumers about indoor air quality.  This is also
expected to provide consumer savings, value, and payback as the Energy Star approach and other
programs have demonstrated that consumers will pay for lower risk (e.g., buy bottled water rather than
drink tap water).

Lessons learned from implementation of the Energy Star program will be used in the implementation of
the indoor air quality label. The message to consumers is that an investment of less than 75 cents a day
will yield over 100,000 cubic feet of fresh, filtered outdoor air and additional protection against
mold/mildew problems, radon exposure, harmful pests and termites, harmful formaldehyde and volatile
organic compounds, combustion hazards, and wet basements. The incentive for builders includes reduced
callbacks for problems, reduced liability, increased revenues, ability to meet a growing buyer preference,
and,  possibly, in the future, reduced insurance costs.

The indoor air quality label will be finalized in the fall of 2004, with a pilot program anticipated to be
launched in November 2004 in three test markets. Subsequent activities will involve negotiating reduced
rates with the insurance industry and evaluating pilot program results with the desire to have the new
label in place and working by November 2005.

Indoor Environments Program Strategy

Tracy Enger, with ORIA, discussed how research and guidance are turned into action, because without
taking action, there is no protection. This is the basis of social marketing, which is a series of offers and
requests to obtain a social benefit.  Social marketing is therefore important to voluntary programs, such as
those sponsored by ORIA, because voluntary programs do not have a "stick" such as enforcement, only
"carrots."

Such initiatives start with the best science possible to develop guidance and policy, then development of a
system for achieving results that is packaged in a way to attract interest and voluntary implementation.
Who delivers the message to the public can be important, and sometimes that may not be EPA; other
sources may be considered more credible, or people may be more likely to follow them. Also, not
everyone is compelled to action by the same kinds of information: for some individuals, health concerns
rather than productivity may be of more interest.

Several examples were provided of ORIA products that demonstrate this process. One example  involves
I-BEAM, which is an interactive, user-friendly package that has been well-received by building operators.
Another example involves the indoor air quality products for schools. ORIA developed a package that
compiled all of the research information and provided the package to school administrators/staff to
convince them to take the necessary actions. The package was designed to be low cost, presented
information in an appealing manner, was adaptable to individual school/district needs, did not require
specialized training to implement, and presented a common sense approach for voluntary actions. Also,
to address current interest in mold and the public desire to take action, ORIA developed a guidance
document entitled Mold Remediation in Schools and Commercial Buildings. There are also many  ORIA
products developed in the area of radon.

Another example of this process involves the  asthma issue. EPA commissioned a report to identify the
components of indoor air that were causing asthma; this report also addressed exposures and asthma
onset, exposures and worsening asthma, and effectiveness of interventions. From this research, EPA was
able to identify five things on which the public could take action that might make a difference—second
hand smoke, dust mites, pets, molds, and pests. EPA then created guidance on how to clear the home of
these types of asthma triggers. To achieve voluntary action, each request must also involve an offer. In
this case, taking action on these five asthma triggers may help to improve health.

To help deliver these types of messages to the public, EPA has developed interrelationships (formal
associations) with other organizations that are respected, credible sources of information. EPA may ask
these organizations to present the information to the public, as the public may be more likely to take
action if this information comes from them.

A public information campaign may need other tools. For example, one way to help deliver a message to
the public is to develop symbols that represent a cause, such as showing a goldfish out of water to
represent asthma; many children with asthma describe the difficulty in breathing as being like a fish out
of water. Another  example involves the development of kits, brochures, a media message, and a "take the
smoke-free home pledge" to address concerns with second hand smoke.

Questions and Answers
The speakers had an opportunity to address questions from the audience.

A brief question and answer period addressed a range of topics.  These included: (1) EPA interactions
with the Green Building Council, LEED Homes, and the Association of Home Builders to promote indoor
air quality and the  challenges of interfacing with programs that have a different mission or may not be as
comprehensive as desired; and (2) the availability of guidance to help homeowners address indoor air
quality in existing  homes and in renovations.

Net Environmental Benefit Analysis
Following opening remarks by Ann Whelan, with EPA Region V, two speakers addressed the Net
Environmental Benefit Analysis decision-making tool and its application to emergency response
planning.

Developing Consensus for Environmental Decision-Making in Emergency
Response

Bill Robberson, with the EPA Region IX Regional Response Team, discussed the Net Environmental
Benefit Analysis (NEBA) process, which helps develop consensus in emergency response or in any
complex planning process with an environmental component. NEBA is an advance-planning-oriented resource
management tool designed to improve the quality and results of environmental decision making.  An
actual emergency is not the time to get to know everyone involved and is not the time to address differing
viewpoints about what to protect and when.

The NEBA process is about bringing science to decision makers. Every time an action is taken to solve a
problem, other problems are created. The NEBA process helps decision makers to look at trade-offs and
the  impacts to resources from the various options.  This tool has been used successfully in EPA Region
IX, and has proven very helpful in developing information about possible emergencies/responses in
advance of their occurrence.

The National Contingency Plan, whose authorities come from the Comprehensive Environmental
Response, Compensation, and Liability Act and the Clean Water Act, defines and establishes the
Regional Response Teams. These Teams have responsibility for ensuring that resources are available in
an emergency and  that these resources exist prior to an emergency.  The San Francisco Bay Area
Committee, one of six committees for the Southern California Regional Response Team, sponsored the


first ecologic risk assessment for San Francisco Bay as a means for preparing for emergency response.
This effort took about 6 months and involved academia as well as local, state, and Federal agencies.
Participants included resource managers, responders, and emergency managers who had to identify the
concerns, their resources for natural resource management/protection, and spill response expectations.

NEBA was used in this planning effort and the process was as much the product as the resource ranking
and response option comparison outcomes. The goal was to create a culture among all the resource
trustees and responders about how to honor someone else's opinion. A key finding is that to achieve
consensus at a meeting, it must be built beforehand.

The framework for this ecological risk assessment was an oil spill.  The realities of an oil spill are that
there will be injury to the environment and the response question becomes how to minimize or eliminate
the damage. Oil spreads and oil can be hard to see, so it is not possible to remove all of the spilled oil
from the environment. In addition, wildlife will get into the oil, and birds, for example, can die from very
small amounts; so, the question becomes what damage and injury can be avoided in consideration of both
short- and long-term impacts to habitat and species. All decisions regarding oil spill response—whether
to use mechanical recovery, chemical countermeasures, in-situ burning, or no response at all (natural
attenuation)—have inherent trade-offs. Therefore, it is important to have as many options identified in
advance as possible to minimize the damage.

The overall goals in responding to an oil spill are to protect human  life, prevent additional/continuing loss
of oil, and to prevent/mitigate environmental damage. The assessment team established a consensus that,
in responding to such a spill, it was important to protect sensitive habitats and species, with habitat
possibly being more important (a trade-off).  The assessment team also decided that, in the open ocean,
the goal was to get to the spill while it was small in order to stop oil leakage, release, and spread. This
was deemed necessary to protect birds, which the NEBA process had determined were a real resource
driver and a huge priority on the California coast. An important observation is that there will be different
priorities in different geographic areas. A major challenge in such activities is to achieve consensus
among stakeholders on what damage is likely to occur, and the best ways to avoid or minimize it.
Contributing factors include the lack of scientific information, bias or misinformation, inadequate
communication and information dissemination, and differences in ecological reference framework.

A risk-based approach is implicit in response planning, with risks coming from a stressor. There are
different risk analysis frameworks, such as comparative risk and ecological risk assessment. Under a
comparative risk framework, the response is contingent upon response options in a tool box; selection of
the response option depends on the nature of the spill, resources to protect, route of exposure, and how to
protect the resources. In addition, selection of resources to be protected is based on evaluating the risk to
each habitat and its species compared to all others.

An ecological risk assessment, on the other hand, evaluates possible ecological consequences of a
disturbance, and emphasizes comparison of exposure stressors with an ecological effect, and involves
problem formulation, analysis, and risk characterization.  Problem formulation involves:  (1) the selection
of a scenario for analysis (such as worst case) that occurs in a specific time of year (which may be
limiting in scope since flora, fauna, and environmental conditions vary by time of year), (2) identification
of resources of concern and associated assessment thresholds, and (3) preparation of a conceptual matrix
to guide subsequent analyses. Analysis characterizes exposure, relates the exposure to ecological
concerns for each identified response option, and determines relative risk. Risk characterization
addresses other contributing factors such as political issues, social factors, economics (e.g., cost-benefit),
regulatory and legal requirements, technological feasibility, etc. Because the consequences of risk
management decisions are large, it is important to agree on their impacts and on the resources at stake.
The NEBA process examines more factors than those described for ecological risk assessment.  The two
processes are very similar, but the NEBA process is more focused on dialogue to identify stressors and to
define interactions with resources, endpoints, and habitats/resources of concern.  Since response planning
involves the development and consideration of options that can be controversial, the NEBA process
provides a format for conflict resolution, opportunities to identify critical issues for discussion, and an
interactive education tool that encourages full participation. In addition, the NEBA process provides a
basis for comparing and prioritizing risk.

Lessons learned from past experiences with spill response decision making include the following:

•   Collection of spilled oil (removal) is preferred to chemical countermeasures, but it is rarely
    successful; the main objective therefore becomes managing the impacts of the spill

•   It is not possible to plan for everything that might occur

•   No matter who is trained to respond, someone else will show up in their place

•   Resource and management conflicts seem inevitable.

Therefore, the goal becomes having a framework for constructive discussion and consensus decision
making, such as NEBA. This decision making includes being prepared for the skills needed in response,
developing common ground, and examining the ecological trade-offs associated with potential solutions.

How is this process unique? The emphasis is on potential rather than actual risk and on brainstorming
"what if" scenarios. A facilitator often helps the process build on what the assessment team has
previously done together and frame outcomes in black-and-white terms to help the response decision maker.

Steps to accomplishing a NEBA for an oil spill include the following:

•   Assemble the project team, involving specifically identified participants/specialists.  The team
    generally involves 25 to 35 individuals, which can be broken into smaller groups, and includes all
    stakeholder groups (e.g., spill response managers, natural resource managers, subject matter experts,
    non-governmental organizations). This is an important step in the process and should involve those
    with relevant knowledge who will actively participate. A robust discussion from multiple
    representatives of specific areas (i.e., different perspectives) is important to the outcome.

•   Develop a realistic scenario, specifying, for example, the type of oil, volume of oil, time of year,
    location, and tidal situation.

•   Gather data and participants. Have team members pull together all the data that the project team may
    need in the analysis, and have participants conduct the analysis together.

•   Define response options for consideration, including the "no response" option. Identify those
    responses that are commonly used and those that may be viable. Define each option further to
    determine if action A is taken, what is involved and what are the impacts. Note the limitations and
    outcomes if nothing is done; for example, if no  action is taken, the emulsion of oil can result in
    volumes two to three times more than originally spilled.

•   Estimate the fate of the  oil and the potential for exposure  to resources of concern.

•   Define environmental resources of concern. This is extremely important and may include distribution
    data, population data, and species of special concern. Other aspects include which resources are
    drivers, such as essential fish habitat, and consider the multiple ways in which these resources can be
    valued.  Also, eliminate resources from consideration that are not in the exposure chain.
•   Consider all of the important relationships and develop a conceptual model.  Identify exposure
    pathways and stressors (e.g., air pollution, aqueous exposure) in tabular form to create a conceptual
    model matrix with habitat across the top and stressors down the left side.  Add to this a list of hazards
    at the bottom and fill in the hazard for each habitat-stressor box.

•   Connect response options to resources and develop an understanding of their interrelationships.

•   Define effects and develop thresholds to estimate the sensitivity to oil of the resources at risk. For
    example, if toxicity is the concern, identify which species and how much is toxic for each. Possible
    thresholds include the proportion of organisms in a population potentially within the trajectory of the
    spill. It is also necessary to include the laboratory toxicity data, data from field studies and related
    experiments, and data from real spills that will help to evaluate specific thresholds.

•   Conduct the analysis. This is the heart of the process, which is to create a risk ranking matrix and to
    determine the level of concern about potential effects.  A tabular format is useful to organize the
    information and to show the relationships between hazards, data, and possible thresholds.

•   Prepare the Relative Risk Summary that pulls together what was done and what it means.

•   Document the risk assessment and complete the Relative Risk Summary.
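The conceptual model matrix described in the steps above, with habitats across the top, stressors down
the left side, and a hazard noted for each habitat-stressor pair, can be sketched as a simple nested
table. The habitat, stressor, and hazard names below are illustrative assumptions, not entries from the
actual assessment:

```python
# Illustrative habitats, stressors, and hazards (not from the San Francisco Bay assessment)
habitats = ["open water", "salt marsh", "rocky shore"]
stressors = ["aqueous exposure", "surface oiling", "air pollution"]

# Conceptual model matrix: habitats as columns, stressors as rows, one hazard per cell
matrix = {s: {h: None for h in habitats} for s in stressors}
matrix["aqueous exposure"]["open water"] = "toxicity to fish and plankton"
matrix["surface oiling"]["salt marsh"] = "smothering of vegetation"
matrix["surface oiling"]["rocky shore"] = "coating of intertidal species"

# Render in the tabular form described: habitat columns, stressor rows
print("stressor".ljust(18) + "".join(h.ljust(30) for h in habitats))
for s in stressors:
    print(s.ljust(18) + "".join(str(matrix[s][h] or "-").ljust(30) for h in habitats))
```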

Each step and analysis activity must be a consensus decision among the participants.  It is also important
to identify specific endpoints. Examples include: prevent/minimize taking of protected species,
prevent/minimize degradation of water quality, prevent/minimize degradation of sensitive habitats, or
prevent/minimize long-term disturbance of relative abundance and diversity of communities within
habitats (i.e., the "no net loss" statement for chronic effects).

Determining the level of concern about potential effects is the key step to the risk ranking process. An
example showed a four box risk matrix where the x-axis is recovery (e.g., irreversible,
reversible/negligible) with ranges of time (in years) and the y-axis is magnitude such as percent of
resource (e.g., trivial, severe). This results in four possible risk relationships—1A, 1B, 2A, and 2B—that
can be applied to the analysis. Color coding, a consensus decision of the working group, can also be used
to overlay the risk matrix with levels of concern (e.g., high, medium, low concern).
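The four-box matrix and color-coded levels of concern described above can be sketched as a small
classification function. The recovery and magnitude cutoffs and the concern assignments below are
hypothetical placeholders, since the actual values are consensus decisions of the working group:

```python
def risk_cell(recovery_years: float, percent_of_resource: float) -> str:
    """Map an impact onto the four-box matrix: 1/2 = magnitude, A/B = recovery."""
    # Hypothetical cutoffs; a real working group sets these by consensus.
    magnitude = "1" if percent_of_resource >= 30 else "2"  # 1 = severe, 2 = trivial
    recovery = "A" if recovery_years >= 5 else "B"         # A = slow, B = reversible
    return magnitude + recovery

# Color-coded concern overlay on the four cells (hypothetical assignments)
concern = {"1A": "high", "1B": "medium", "2A": "medium", "2B": "low"}

cell = risk_cell(recovery_years=10, percent_of_resource=60)
print(cell, concern[cell])  # prints: 1A high
```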

Such a risk matrix helps to compare the hazard or threat to different resources and to identify areas where
impacts are not clearly defined (i.e., uncertainty or lack of knowledge).  This in turn allows for
comparison of possible response options, definition of likely consequences  of the spill and response, and
management of expectations.

This type of approach requires the assessment team to come to conclusions  about the planning, response
options, and information needs, and helps participants understand what is and is not known. The  process
works well using a workshop approach and focus groups, but this must be well managed.

NEBA is both a planning tool and an education tool that can be completed as a planning exercise  in real
time. This process will not be useful if applied for the first time during a spill, but it does improve spill
response if the analyses have been completed beforehand.

There is always an element of uncertainty in this analysis, and it is important to look at sources of
variability. If this example had been a large-scale, detailed risk assessment, quantitative estimates might
have been developed. As described here, the process is based on expert opinion, but it is well
documented.

A brief question and answer period addressed a range of topics.  These included:  (1) defining consensus
as "can you live with it" rather than agreement by all participants; (2) the use of facilitation to ensure
participation by nonspeakers, particularly in consensus-building; (3) use of risk communication and
message mapping to present relative risk determinations of this process to the public; (4) primary
responsibilities of organizations other than the Regional Response Teams for earthquake response, which
is another potential application for NEBA; and (5) how this process fits into an Environmental
Management System or other larger process as addressed in the next presentation.

120                          EPA SCIENCE FORUM  2004 PROCEEDINGS

Case Study of Isle Royale

Ann Whelan, with EPA Region V, presented a case study illustrating the application of NEBA to
emergency response planning for Isle Royale, Michigan. The Isle Royale area involves a very complex
international biosphere including a series of islands, a main island with a lake, an international boundary
(with Canada), a national park, extremely sensitive natural resources, Federally listed endangered species
(e.g., eagles, gray wolf), and limited local response capabilities. This area is very isolated, predominantly
wilderness, occupied only part of the year, and is home to the longest-running predator-prey study, dating
from 1949 when wolves crossed to the main island via an ice bridge.

The NEBA process was initiated through an area committee already in existence and with jurisdiction
over the national park. EPA was asked to collaborate with this committee regarding problems in
responding to Isle Royale in an emergency. The nature of the location and the overall situation resulted in
much more Federal agency involvement in this process than might be found for other situations.
Preliminary work involved the development of a sensitivity atlas that identified and mapped sensitive
species, since the NEBA process at this location is species driven.

The initial meeting was held in Duluth in January 2004. This involved a group of experts, including
biologists, ecologists, response contractors, EPA, United States Coast Guard, Department of Interior,
National Park  Service, Great Lakes Commission, Michigan Department of Natural Resources, and
Minnesota Pollution Control Agency, among others. The group first identified potential threats, which
included the following:

•   International shipping lane within 1 mile of Isle Royale, involving approximately 600 ships (1,200
    round trips) annually
•   Fuel loads of approximately 200,000 gallons per vessel, with approximately three percent of the
    cargoes involving liquids

•   Heaviest shipping traffic during the late fall and early winter, which is a known hazard period for
    ships/boats in Lake Superior (e.g., the sinking of the Edmund Fitzgerald and several groundings on
    Isle Royale)

•   Oil storage on Isle Royale itself.

The potentially impacted, high priority species identified as drivers for this assessment included the gray
wolf, common loon, bald eagle, coaster brook trout, arctic shoreline plants, boreal chorus frog, and fresh
water mussel beds.

The scenario selected for analysis was not the worst case oil spill; it involved a grounded freighter with a
fuel release of 30,000 gallons (not the entire fuel capacity) that impacts the northeastern tip of Isle Royale
and occurs in late April/early May. That time of year was selected in order to maximize the number of
species that might be harmed so as to drive the dialogue about the worst possible resource impacts (e.g.,
fish had spawned but fingerlings were still around, birds were nesting, etc.). Of note, this is a less
politicized environment than the coastal California example in the previous presentation, and the
public tends to think of oil spills as a coastal/ocean problem rather than something that may occur in the
Great Lakes.
The assessment team identified in advance the following impacted habitat zones: terrestrial, coastal
wetlands, shorelines, nearshore (there are reefs—a shipping hazard), reefs, and open water. Impacted
resources (species categories) included vegetation, mammals, birds, herpetiles, fish, macro-invertebrates,
and micro-invertebrates.

The assessment team discussed and ranked specific response strategies, including natural recovery (no
action), mechanical/manual recovery, shoreline chemical cleaners, and in-situ burning of shoreline only;
dispersants were not considered because they do not work in fresh water. Since  there are inherent
problems with each response technique, it was necessary to determine which response options were
optimal.  For example, mechanical recovery requires a lot of people, a lot of equipment, a place to stage
the equipment, and a place to store recovered oil. An important consideration was the proximity of
Canada, coupled with the inability to access Canadian equipment.  Another example of response-specific considerations
involved in-situ burning.  Isle Royale is located in Michigan, which does not have pre-approval for in-situ
burning.  However,  there is a U.S. Forest Service equipment cache for fire fighting located 3 hours away.
This appeared advantageous because the U.S. Forest Service might be able to respond quickly and has the
knowledge and capability to conduct a burn in this environment. Drawbacks included U.S. Forest Service
inexperience with the ignition and control of petroleum fires. Therefore, there was a need to have an
interagency agreement to train personnel as they move in/out of the area, etc.

The assessment team developed the Relative Risk Matrix, as described in the previous presentation, and
assigned risk in tabular form  for each species group under each of the four identified response options.
These participants did not like using percentages to define impact, as done in the California example,
because some of the animal populations (i.e., wolves) were very small and even  a small percentage could
be more catastrophic in impact to some species than to others.  Instead, this team opted for a wider
range of color coding options, for example, the use of three different reds.
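The tabular assignment described above can be sketched as a small lookup structure. The species, cell entries, and ranking order below are invented for illustration; the workshop's actual rankings are not reproduced here:

```python
# Illustrative sketch of the tabular risk assignment: each species group
# receives a matrix cell under each response option, and the option with
# the lowest-concern cell is the candidate optimal response for that
# species. All cell values here are hypothetical.

risk_table = {
    "gray wolf":   {"natural recovery": "2B", "mechanical/manual recovery": "1B",
                    "shoreline chemical cleaners": "2A", "in-situ burning": "1A"},
    "common loon": {"natural recovery": "2A", "mechanical/manual recovery": "1A",
                    "shoreline chemical cleaners": "2B", "in-situ burning": "1B"},
}

ORDER = {"1A": 0, "1B": 1, "2A": 2, "2B": 3}  # lower value = less concern

def best_option(species):
    """Pick the response option with the lowest-concern cell for a species."""
    cells = risk_table[species]
    return min(cells, key=lambda opt: ORDER[cells[opt]])

print(best_option("gray wolf"))  # -> in-situ burning
```

Comparing the chosen options across species then surfaces exactly the conflicts the assessment team had to negotiate, since no single response is optimal for every species.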

The assessment results were graphed in three dimensions to visually demonstrate how optimal response
techniques were derived for each  species. For example, the graph showed that leaving the oil there (no
response) and aggressively cleaning up the oil both resulted in a higher spike for damage to resident
species than other response options.  The assessment outcomes included a map of optimal recovery that
showed what techniques might work best in which geographic location, with the recognition that those are
not the only actions that can be taken.

The information developed from this process must be put together with a lot of other work about what is
possible/practicable for any given location in order to come up with what can or should be done to protect
specific species.  Recognizing how limited the response options were enabled the Park
Superintendent to begin looking at other options such as prevention, seed banking, and gathering more
equipment.

A brief question and answer period addressed a range of topics. These included: (1) the ability to use
both species-based and habitat-based approaches since some response techniques do not work in every
environment; (2) the greater importance of habitat in coastal/ocean area studies;  and (3) the challenges of
solely using habitat  for inland zones where habitat is more integrated, different species use the same
habitat in different ways, and species are much more compacted, which results in either small habitat
changes that are not worth considering or the need to do a NEBA for each habitat.
Appendix A:      Meeting  Agenda


   EPA 2004 Science Forum: Healthy Communities and Ecosystems
                        June 1-3, 2004, Washington, DC
                               FINAL AGENDA


TUESDAY, JUNE 1, 2004

10:00 AM -   AAAS Session (Oriental Ballroom A/B)
11:30 AM
            The American Association for the Advancement of Science (AAAS) has organized a
            session that kicks off the Science Forum events on Tuesday morning. The session is
            designed to review the relationship between the AAAS Environmental Fellows Program
            and the U.S. Environmental Protection Agency. The session will be chaired by Fran
            Sharples, Director of the National Research Council's Board on Life Sciences.
            Presentations by current and recent Environmental Fellows will address the history of this
            AAAS/EPA relationship, its accomplishments, and prospects for its future activities.

            Chair:        Fran Sharples, Director, Board on Life Sciences, National Research
                         Council

            Panelists:      Terry Keating, Senior Environmental Scientist, EPA/OAR and
                         Venkat Rao, Director, Health Research & Informatics, Computer
                         Sciences Corporation

10:00 AM     Poster and Exhibit Room Opens  (Grand Ballroom and Foyer)
            Plenary Session (Oriental Ballroom)

1:00 PM      Mike Leavitt, EPA Administrator

1:15 PM      Michael Steele, Lieutenant Governor, Maryland

1:30 PM      Jimmy Palmer, Regional Administrator, EPA Region 4
1:45 PM      Kim Nelson, Assistant Administrator, Office of Environmental Information, EPA

2:00 PM      Break

2:20 PM      Paul Gilman, Assistant Administrator, Office of Research and Development, EPA

3:00 PM      R. Steven Brown, Executive Director, Environmental Council of the States

3:15 PM      Jack Marburger, Director, Office of Science and Technology Policy

3:30 PM      David McQueeney, IBM
5:00 PM -    Poster Session & Awards Reception (Grand Ballroom)
7:00 PM
            The Poster Session will highlight EPA research related to the three themes, allowing
            presenters the opportunity to quickly and efficiently communicate their research in an
            easy-to-view format conducive to walk-through traffic. The Poster Session will include
            over 220 posters and will allow participants to study the information and discuss the
            posters one-on-one with the presenters.
                                   List of Acronyms
             ATSDR       Agency for Toxic Substances and Disease Registry
             AAAS         American Association for the Advancement of Science
             AKC          American Kennel Club
             BEACH       Beaches Environmental Assessment and Coastal Health Act
             CA           California
             CDC          Centers for Disease Control
             CHAMACOS   Center for the Health Assessment of Mothers and Children of Salinas
             CT           Connecticut
             DEC          Department of Environmental Conservation
             DOH          Department of Health
             DNA          Deoxyribonucleic Acid
             DNR          Department of Natural Resources
             ECOS         Environmental Council of the States
             EMAP         Environmental Monitoring and Assessment Program
             EPA          Environmental Protection Agency
             FL           Florida
             GIS           Geographic Information Systems
             HQ           Headquarters
             IAQ           Indoor Air Quality
             MD           Maryland
             MIT           Massachusetts Institute of Technology
             NAS          National Academy of Sciences
             NBII          National Biological Information Infrastructure
             NCA          National Coastal Assessment
             NCEA         National Center for Environmental Assessment
             NCER         National Center for Environmental Research
             NEPA         National Environmental Policy Act
             NERL         National Exposure Research Laboratory
             NESCAUM     Northeast States for Coordinated Air Use Management
             NH           New Hampshire
             NHEERL      National Health and Environmental Effects Research Laboratory
             NIEHS        National Institute of Environmental Health Sciences
             NOAA         National Oceanic and Atmospheric Administration
             NRMRL       National Risk Management Research Laboratory
             NYS          New York State
             OAQPS       Office of Air Quality Planning and Standards
             OAR          Office of Air and Radiation
             OEI           Office of Environmental Information
             OIAA         Office of Information Analysis and Access
             OPPT         Office of Pollution Prevention and Toxics
             ORD          Office of Research and Development
             ORIA         Office of Radiation and Indoor Air
             OST          Office of Science and Technology
             OTOP         Office of Technology, Operations and Planning
             OW           Office of Water
             OWOW       Office of Wetlands, Oceans, and Watersheds
             REMAP       Regional Environmental Monitoring and Assessment Program
             TMDL         Total Maximum Daily Load
             USGS         United States Geological Survey
WEDNESDAY, JUNE 2, 2004

Science and Innovation to Protect Health and Environment (ORD) Sessions
(Oriental Ballroom A)

Advanced Remote Sensing
   Introduction - Terrence Slonecker, EPA/ORD/NERL
   The Status of the 2001 National Land Cover Data - James Wickham, EPA/ORD/NERL
   Evaluating Environmental Quality Using Spatial Data Derived from Satellite Imagery -
      K. Bruce Jones, EPA/ORD/NERL
   Development of Landscape Indicators for Potential Nutrient Impairment of Streams in
      EPA Region 8 - Karl Hermann, EPA Region 8
   Multi-Scale Remote Sensing Mapping of Anthropogenic Impervious Surfaces: Spatial
      and Temporal Scaling Issues Related to Ecological and Hydrological Landscape
      Analyses - S. Taylor Jarnagin, EPA/ORD/NERL
   LIDAR: A Remote Sensing Tool for Determining Stream Channel Change? - David
      Jennings, EPA/ORD/NERL
   The Use of Remote Sensing in the Detection and Removal of Chemical Weapons in
      Spring Valley - Steven Hirsh, EPA Region 3

Using Science to Make a Difference (Region) Sessions (Oriental Ballroom B)

Can You Hear Us Now? EPA's Role in Invasive Species Research and Management
   Introduction - Michael Slimak, EPA/ORD/NCEA
   Snakeheads, Green Crabs, and Other Nasty Things - An Overview of Invasive Species -
      Henry Lee II, EPA/ORD/NHEERL
   OW and Aquatic Nuisance Species - What's Underway and What's Planned - Diane
      Regas, EPA/OW/OWOW
   Rapid Assessment Surveys: Marine Bioinvaders in the Northeast - Judith Pederson,
      MIT Sea Grant College Program
   Targeted Screening for Invasive Species in Ballast: Genomic Approaches - Michael
      Blum, EPA/ORD/NERL
   Non-native Oysters in Chesapeake Bay - Michael Fritz, EPA/Chesapeake Bay
      Program Office

Delivering Science-Based Information to Decision Makers (OEI) Sessions
(Oriental Ballroom C)

The Future of EPA's Environmental Indicators Initiative and Report on the Environment
   Introduction to the Draft Report on Environmental Indicators - Michael Flynn,
      EPA/OEI/OIAA
   Overview of the Outcome Chapters - Denice Shaw, EPA/ORD
   Human Health Trends and Outcomes - Judy Qualters, CDC
   Questions and Discussion - Heather Case, EPA

10:00 AM - 10:30 AM   Break

Innovations in Risk Assessment: Improving Data Resources (ORD Sessions,
Oriental Ballroom A)
   Introduction - George Woodall, Jr., EPA/ORD/NCEA
   The Need for Scientific Data in Regulatory Decision-Making - Roy Smith,
      EPA/OAR/OAQPS
   The ATSDR Experience in Using the Supplemental Documents Database in Developing
      Toxicological Profiles - Henry Abadin, ATSDR
   Distributed Database Approach to Sharing Data - Ann Richard, EPA/ORD/NHEERL
   The Chemical Effects in Biological Systems Knowledgebase - Michael Waters, NIEHS

Monitoring and Assessment to Protect Tribal Health and Ecosystems (Region Sessions,
Oriental Ballroom B)
   Introduction - Valerie Bataille, EPA Region 1
   Protection of Tribal Cultural Practices Through the Development of Native American
      Exposure Pathways - Fred Corey, Aroostook Band of Micmacs
   Towards a Better Understanding of Mercury Fate and Transport on the Fond du Lac
      Reservation: Monitoring Air, Water, Sediments and Biota - Nancy Costa, Fond du
      Lac Reservation Environmental Program
   Primary Production Study of Coastal Waters of the Bay of Fundy - Stephen Crawford,
      Passamaquoddy Tribe at Pleasant Point

Using Geospatial Tools to Make Program Decisions (OEI Sessions, Oriental Ballroom C)
   Introduction - Brenda Smith, EPA and Wendy Blake-Coleman, EPA
   OEI Support for EPA HQ Emergency Operations Center: Emergency Response
      Analyzer - Joe Anderson, EPA/OEI/OIAA
   Assessing Urban Growth and Land Cover Trends Using Remote Sensing Imagery and
      Landscape Metrics - Gary Roberts, EPA/OEI/OIAA
   NEPAssist GIS Tool to Support Work on the National Environmental Policy Act -
      Julie Kocher, EPA/OEI/OIAA

Noon - 1:00 PM   Lunch
WEDNESDAY, JUNE 2, 2004 continued

Science and Innovation to Protect Health and Environment (ORD) Sessions
(Oriental Ballroom A)

Using Human Data in Risk Assessment
   Introduction - John Vandenberg, EPA/ORD/NCEA
   The Ethics of Research Involving Human Subjects - James Childress, University of
      Virginia
   EPA Clinical Research: Implications for Air Quality Standards - Bill McDonnell,
      EPA/ORD/NHEERL
   Research with Human Subjects: Future Challenges and Opportunities - Richard Sharp,
      Baylor College of Medicine

Using Science to Make a Difference (Region) Sessions (Oriental Ballroom B)

R-EMAP: The Application of EMAP Indicators
   The Past, Present, and Future of the Regional Environmental Monitoring and
      Assessment Program - Brian Hill, EPA/ORD/NHEERL
   Southeastern Wadeable Streams R-EMAP - Peter Kalla, EPA Region 4
   Maryland Biological Stream Survey: Science for Streams - Daniel Boward, MD DNR

Delivering Science-Based Information to Decision Makers (OEI) Sessions
(Oriental Ballroom C)

Delivering Consistent Information on Health and the Environment
   William Sonntag, EPA/OEI
   Collaborative Opportunities in Ecoinformatics - Mike Frame, USGS NBII
   Environmental Information Exchange Network - Molly O'Neill, ECOS
   Larry Fitzwater, EPA/OEI

2:30 PM - 3:00 PM   Break

Supporting Innovations in Science to Identify Children's Vulnerability to Environmental
Exposures (ORD Sessions, Oriental Ballroom A)
   Introduction - Nigel Fields, EPA/ORD/NCER
   Children's Health and Environmental Exposures: The Most Important Unanswered But
      Answerable Questions - Michael Weitzman, American Academy of Pediatrics
      Center for Child Health Research
   Highlights from the Columbia Center for Children's Environmental Health: Studying
      Air Pollution in Community Context - Virginia Rauh, Columbia Center for
      Children's Environmental Health
   The National Children's Study - Carole Kimmel, EPA/ORD/NCEA
   Wrap-Up and Discussion

Great Places Demand Great Science (Region Sessions, Oriental Ballroom B)
   Introduction - Rochelle Araujo, EPA
   Next Generation Chesapeake Bay Nutrient and Sediment Loading Caps: Two Decades
      of Estuarine Science at Work - Richard Batiuk, EPA Region 3
   The Great Lakes: Collaborative Science to Inform and Help Frame Policy - John Lyon,
      EPA/ORD/NERL
   Sustainability of the Gulf of Mexico: The Role of Science, Management, and Activism -
      Quenton Dokken, Texas A&M University
   Wrap-Up and Discussion

Developing Science-Based Information for Coastal Systems (OEI Sessions,
Oriental Ballroom C)
   Introduction - Kevin Summers, EPA/ORD/NHEERL
   From Fjords to Tropical Beaches, EMAP's Assessment of Coastal Conditions on the
      Pacific Coast - Henry Lee II, EPA/ORD/NHEERL
   The Utility of NCA-type Monitoring Data for EPA Decision-making - Diane Regas,
      EPA/OW/OWOW
   Use of Coastal Monitoring Data in Management Decision Making in Florida - Gil
      McRae, FL Fish and Wildlife Conservation Commission
   National Coastal Assessment: A Successful State-Federal Collaboration in New
      Hampshire - Phil Trowbridge, NH Department of Environmental Services
   National Coastal Assessment Approach and Findings in the Northeast - Henry Walker,
      EPA/ORD/NHEERL
   National Coastal Assessment Monitoring and Modeling in Support of TMDL
      Calculations - Henry Walker, EPA/ORD/NHEERL
THURSDAY, JUNE 3, 2004

Science and Innovation to Protect Health and Environment (ORD) Sessions
(Grand Ballroom A)

Sustainability - Educating for the Future
   Education for Sustainability Initiatives - Alan Hecht, EPA/ORD
   Principles and Practice of Sustainability Education in Schools - Jaimie Cloud, The
      Sustainability Education Center, Inc.
   National Efforts in Sustainability Education - Alan Elzerman, Clemson University
   Building Partnerships for Sustainable Science Education - Sally Shuler, NAS &
      Smithsonian Institution
   Summary and Open Discussion

Using Science to Make a Difference (Region) Sessions (Grand Ballroom B)

Looking into the Future of a Region
   Ecological Forecasting - K. Bruce Jones, EPA/ORD/NERL
   A Weight-of-Evidence Approach to Projecting Land-Use Change and Resulting
      Ecological Vulnerability - Laura Jackson, EPA/ORD/NHEERL
   Land-Cover Change, Alternate Future Scenarios and Nutrient Export in the
      Mid-Atlantic Region - James Wickham, EPA/ORD/NERL
   Statistical Modeling of Ground-Water Vulnerability in the Mid-Atlantic Region:
      Present and Future - Earl Greene, USGS
   Forecasting Species' Distributions: The Shape of Things to Come - Daniel Kluza,
      EPA/ORD/NCEA
   Putting It All Together: Implications for the Mid-Atlantic Region in 2020 - Betsy
      Smith, EPA/ORD/NERL

Delivering Science-Based Information to Decision Makers (OEI) Sessions
(Grand Ballroom C)

Scientific Computing
   Introduction - Rick Martin, EPA/OEI/OTOP
   The Center of Excellence for Environmental Computational Science - Joseph Retzer,
      EPA/OEI/OTOP
   Current Projects and High Performance Computing and Visualization Direction -
      John Smith, EPA/OEI/OTOP
   Growing the Environmental Science Portal - Terry Grady, EPA/ORD/NERL
   Closing and Questions - Joseph Retzer, EPA/OEI/OTOP

10:00 AM - 10:30 AM   Break

Partnering with New York on Air Quality and Human Health: Issues, Challenges and
Perspectives (ORD Sessions, Grand Ballroom A)
   Federal-State Partnerships for Enhanced Understanding for Air Quality and Health
      Relationships - S.T. Rao, EPA/ORD/NERL
   Tracking Public Health - Vickie Boothe, CDC
   NOAA-EPA's Air Quality Forecast Capability - Paula Davidson, NOAA/National
      Weather Service
   Air Quality: A Regional Perspective - Kenneth Colburn, NESCAUM
   Air Quality Management and Challenges in New York State - David Shaw, NYS DEC
   Health Surveillance - Nancy Kim, NYS DOH

Regional Research Partnership Program (Region Sessions, Grand Ballroom B)
   Introduction - Tom Baugh, EPA Region 4
   Microbial Source Tracking: The Application of a DNA-Based Molecular Approach to
      Identify Sources of Fecal Contamination - Bonita Johnson, EPA Region 4
   Land Cover Diversity Measured by Satellite as a Proxy for Biodiversity - David
      Macarus, EPA Region 5
   The Relationship of Terrestrial Ecosystems to Manganese Emissions from Wood
      Burning - Dan Ahern, EPA Region 4

Healthy Communities - One Building at a Time (OEI Sessions, Grand Ballroom C)
   Introduction - Elizabeth Cotsworth, EPA/OAR/ORIA
   John Girman, EPA/OAR/ORIA
   Indoor Environment Research Base - Jim Jetter, EPA/ORD/NRMRL
   Delivering Technical Assistance - David Mudarri, EPA/OAR/ORIA
   EPA's New IAQ Label - Sam Rashkin, EPA
   Tracy Enger, EPA/OAR/ORIA

Noon - 1:00 PM   Lunch
  THURSDAY, JUNE 3, 2004 continued

POSTERS & EXHIBITS (Grand Ballroom and Foyer)
The Poster Session will highlight EPA research related to the three themes, allowing presenters
the opportunity to quickly and efficiently communicate their research in an easy-to-view format
conducive to walk-through traffic.  Each Forum poster will provide a broad perspective of an
environmental issue, the scientific approach to resolving the issue, partnerships in both
conducting the work and applying the results, and the impact that EPA science has made or
expects to make on the issue.

The  Poster Session  will include over 220 posters  and  will allow participants to study the
information and discuss the posters one-on-one with the presenters.  Posters and organizational
exhibits will be on display from Tuesday morning through Wednesday afternoon (breakdown to
occur after the afternoon break at approximately 3:00 PM).

DRINKING WATER RESEARCH TRAILER (MOBILE LAB) (Maine
Avenue Entrance)
The  Drinking Water Research Trailer will  be on  display outside the Maine Avenue entrance
(Ballroom Level) of the Hotel from Tuesday through Wednesday.

DOGS & POLLUTION PREVENTION (Grand Ballroom)
Demonstrations  on how  dogs  can  be  used in pollution prevention will be held in the Grand
Ballroom on Tuesday and  Wednesday.  In this demo, an AKC registered Swedish Vallhund
demonstrates his unique skill of vapor intrusion detection.

PRODUCT EXPO (Grand Ballroom)
This year's Forum will include a Product Expo, which will highlight several "ready-to-use" EPA
Products. EPA developed a list of health and environmental questions for which we have one
or more "ready-to-use" EPA science products, such as models, maps, databases or guidance
documents.

We asked a group of Regional, State, and Tribal Senior Managers to select 8-10 questions that
they would like to see addressed, related to high-priority environmental science issues.
Their input was used to make the final selection of Product Expo Exhibit questions and
products, as follows:

       Water Quality
       •   How do I prioritize where to focus my TMDL Program? How do I locate impaired
          waters and develop targets to remove their impairments?
       •   What rapid-response technologies are available to determine if our beaches are
          safe?
       •   What help  is available for small drinking water systems?
       •   What technologies are available to help small drinking water plants meet the 10 ppb
          arsenic drinking water standard?

       Air Pollution
       •   Are technologies available for my electric utility to remove mercury from their air
          emissions?
       •   Can a national mold standard be developed using EPA-patented technology?
       •   How can area source emissions of air pollutants be measured in near real-time to
          support state and regional air quality objectives?
          Ecosystem Protection
          •   What ecological resources should be evaluated in an ecological risk
             assessment?

          Human Exposure
          •   What exposure databases and models are available to help me determine
             human exposures to specific pollutants and how age (i.e., children vs. adults)
             and human activities affect these exposures?