100R05008
PROCEEDINGS
EPA SCIENCE FORUM 2003:
PARTNERING TO PROTECT HUMAN HEALTH
AND THE ENVIRONMENT
May 6-8, 2003
United States Environmental Protection Agency
Ronald Reagan Building and International Trade Center
Washington, DC
Table of Contents
Acronyms viii
Executive Summary xiii
Section I: Overview 1
Section II: Plenary Session 2
Opening Remarks 3
Keynote Addresses 3
EPA Administrator Keynote Address 3
EPA Science Advisor and ORD Assistant Administrator Keynote Address 4
EPA Region 4 Administrator Keynote Address 5
White House Council on Environmental Quality Keynote Address 6
Science Forum Overview 8
Plenary Addresses 8
Homeland Security 8
Moving Science Into Action 10
Year of Water 13
Emerging Technologies 14
National Homeland Security Research Center and Office of Homeland Security 16
Closing Remarks 17
Section III: Homeland Security 18
Anthrax: Response and Research 20
Anthrax Response and Recovery: Applied Science and Technology
and Future Needs 20
EPA's Homeland Security Research Program 21
Secondary Aerosolization of Viable Bacillus anthracis Spores
in an Office Environment 22
Anthrax: Detection, Sampling, and Analysis 23
Environmental Sampling of Bio-Aerosols 23
Panel Discussion/Questions and Answers 26
Anthrax: Fumigation and Re-Occupancy 26
Fumigating Anthrax-Contaminated Sites: Building on Experience 26
Clearance Determinations: Judging Remediation Success and
Readiness for Re-Occupancy 28
Panel Discussion/Questions and Answers 30
Anthrax: Decontamination Technologies 31
The Hunt for Anthrax Decontamination Chemicals 31
Laboratory Support for Evaluating Decontamination Technologies 32
Efficacy Testing Science Issues and Follow-up Research 33
Panel Discussion/Questions and Answers 34
EPA SCIENCE FORUM 2003 PROCEEDINGS
Building Partnerships Towards Homeland Security 35
Security: The Business of Chemistry's Action 35
Homeland Security, Emergency Management, and a Water Utility 35
A Public Utility Manager's View of Our World Post-9/11/2001 36
The EPA Safe Buildings Program 39
Panel Discussion/Questions and Answers 40
BioWatch—Nationwide Early Detection of Airborne Biological Agents 40
World Trade Center: Lessons Learned, and Personnel Protection and Training 41
World Trade Center Lessons Learned and the Interagency Collaboration 41
9/11 Lessons Learned for Worker Protection 42
Immediate Response and Collaboration: EPA Region Perspective 42
Immediate Response and Collaboration: ATSDR Perspective 43
Longer Term Response and Collaboration: NIEHS Perspective 43
Evaluation of Health Effects of Cleanup and Recovery Workers
at the World Trade Center Disaster Site 44
World Trade Center Assessment Report 44
Panel Discussion/Questions and Answers 45
Preparing for Bioterrorism Threats in Water 45
NHSRC's Water Security Research and Technical Support Program 46
Potential Technologies for Detection of Biological Threats in Water Supplies 47
"Early Warning Monitoring" and Sensor Technology Development 48
Detection of Biological Agents in Water 48
Panel Discussion/Questions and Answers 50
Section IV: Moving Science Into Action 51
Regional Vulnerability Assessment (ReVA): Improving Environmental
Decisionmaking Through Client Partnerships 53
ReVA's Client Partnerships: Improving Environmental Decisionmaking
Through Applied Research 53
ReVA's Web-Based Application: A Tool for Regional, State, and Local
Decisionmakers 55
The Sustainable Environment for Quality of Life (SEQL) Program: A Partnership
Between EPA ORD and OAQPS, and State and Local Governments 55
ReVA's Partnership with the Maryland Department of Natural Resources:
Opportunities to Optimize the Future 56
Panel Discussion/Questions and Answers 58
Partnership with State and Local Government 58
Delta Cross Channel Gate Operation on Water Quality and Migration of
Juvenile and Adult Salmon in Northern California 58
Integrated Environmental Planning Across Two States, 15 Counties, and 36
Municipalities: Do You Believe in Miracles? 59
Michigan Environmental Science Board and Protecting Children's Health 60
Panel Discussion/Questions and Answers 61
Advancing Science Through Environmental Monitoring and Assessment Program
(EMAP) Partnerships 61
EMAP-West: Introduction 61
The EMAP Western Pilot in Region 8 62
Perspective from the State of California 62
EMAP Tribal Perspectives 63
National Coastal Assessment: Past, Present, and Future 63
The Interactions of EMAP and SCCWRP: Help in the Past, Necessity
for the Future 64
The Role of the National Coastal Assessment in Developing a Continuing
South Carolina Estuarine Monitoring Program 65
The Application of EMAP and REMAP in the EPA Regions 66
Panel Discussion/Questions and Answers 67
Working with Tribes: Cultural Values and Tribal Lifeways Inform Health Assessments 67
Tribal Partnerships in Pesticide Management to Protect Human Health 67
Establishing Self-Sufficiency in Alaska Native Communities to Minimize
Exposure to Environmental Contaminants 68
Bioaccumulative Toxics in Native American Shellfish 69
Moving Science Into Action - Step One: Get the Data! 70
Uses of Toxics Release Inventory Data 70
Integration of State and County Stream Monitoring Programs: A Maryland
Case Study 71
Effects of Urban Growth on Fish Assemblages in a North Carolina Metropolitan
Area, 1970-2000 72
Dynamic Choropleth Maps 73
Emerging Innovations in Regional Ecosystem Protection 74
Regional Ecosystem Protection: A Pattern, An Opportunity, A Challenge? 74
Use of Geospatial Tools to Identify High Quality Midwest Ecosystems (Landscape-
Scale Characterization of Ecosystem Health in the Upper Midwest) 74
Synoptic Model to Rank Wetland Ecosystems for 404 Permitting: An Application
of Regional Critical Ecosystems Protection 75
Southeastern Ecological Framework's GeoBook - Software for Mapping
Partnerships and Ecosystem Protection 76
The Mid-Atlantic Highlands Action Program: Transforming the Legacy 77
Panel Discussion/Questions and Answers 78
Site Characterization and Decision Analysis of Contaminated Sediments 78
Introduction of Concepts and Tools 78
Initial Sample Designs 79
Spatial Estimation 81
Decision Analysis 82
Section V: Year of Water—30 Years of Progress Through Partnerships 84
Waterborne Disease in the United States 86
Drinking Water Related to CWA Endemic and Epidemic Waterborne Disease:
An EPA and CDC Partnership 86
Using Randomized Trials to Study Waterborne Pathogens Among
Susceptible Populations 88
Maintaining Microbiological Quality of Drinking Water in the Distribution System 89
Panel Discussion/Questions and Answers 90
Mississippi River Basin Hypoxia 90
Hypoxia in the Gulf of Mexico 91
Action Plan for Reducing, Mitigating, and Controlling Hypoxia in the Northern
Gulf of Mexico 92
Panel Discussion/Questions and Answers 93
The Millennium Challenge: EPA's Response to Invasive Species 93
The Office of Water Perspective 94
United States Coast Guard Research: Research in Support of the Coast Guard's
Program To Prevent the Introduction of Nonindigenous Species by Ships 95
International Treaty Effort to Address the Transfer of Invasive Species Via
Ballast Water 96
A "Shocking" Solution for Controlling the Spread of Asian Carp into the
Great Lakes 97
Introduced (Invasive) Species and Pesticide Control Programs 98
Environmental Perspectives on Invasive Species Control:
Precaution and Prevention 99
Panel Discussion/Questions and Answers 100
Social Science and Resistance to Water Fluoridation 100
EPA Drinking Water Regulations for Fluoride 100
Fluoridation: An Undefendable Practice 102
Panel Discussion/Questions and Answers 104
Development of Biological Indices for Coral Ecosystem Assessments 104
Assessing the Consequences of Global Change for Coral Reef Ecosystems 104
Applying Biocriteria for Coral Reefs in Water Programs 106
Development of a Coral Reef Index of Biotic Integrity 106
Panel Discussion/Questions and Answers 108
The Impacts of Urban Drainage Design on Aquatic Ecosystems in the United States 108
OWOW and Smart Growth: Integration of Water Programs through Smart
Growth Principles 108
Two Tools for Smart Growth 109
Panel Discussion/Questions and Answers 111
Innovative Monitoring Techniques 111
30 Years of Progress Through Partnerships: Biological Indicators 112
Innovative Monitoring: Probabilistic Monitoring Approaches 113
The Next Generation of Wetlands Assessment - The Upper Juniata Watershed 114
Panel Discussion/Questions and Answers 115
Volunteer Monitoring—Ten Years of Progress 115
Volunteer Monitoring: 10 Years of Progress, What's the Future? 115
Wetland - Volunteer Monitoring: Ten Years of Progress 116
Volunteer Monitoring: A Coastal Perspective 117
What's in the Future? 118
Section VI: Emerging Technologies 119
Applying Computational Toxicology to Solving Environmental Problems 121
Computational Toxicology: Bolstering the EPA's Mission 121
Toxicogenomic Predictive Modeling 121
EPA's Research Program on Computational Toxicology 123
Novel Informatics and Pattern Recognition Tools for Computational Toxicology 124
Computational Toxicology and Genomics: The Next Wave of Drinking
Water Research 126
The Genomic Path from Exposure to Effects in Aquatic Ecosystems 127
Structure-Activity Tools for Assessing Pesticides and Toxic Substances—Past,
Present, and Future 128
NIEHS Toxicogenomics Centers: Model for Partnerships 129
Panel Discussion/Questions and Answers 130
Innovation to Advance the Detection of Threats and Optimize
Environmental Decisionmaking 131
Information Technology Science: Bolstering the EPA's Mission 131
Meeting National Environmental Goals: Coordinated Federal Information
Technology Solutions 132
Application of Advanced Information Technology to Promote, Educate, and
Address Environmental Concerns 133
Monitoring Stressors to Human and Ecosystem Health from Space 134
ASPECT: Protecting Americans Through Rapid Detection of
Atmospheric Contaminants 135
Simulation and Visualization of the Smoke/Dust Plume from the
World Trade Center 136
Real-Time Monitoring and Communication of Air Quality 137
Panel Discussion/Questions and Answers 139
Applying Biotechnology to Achieve Sustainable Environmental Systems 139
Molecular Farming for Sustainable Chemistry 139
EPA Biotechnology Program Overview 141
Regulatory Perspective on Bioengineered Crops 143
Linking Strategic Environmental Monitoring to Risk Assessment of Biotechnology
Products with Plant Incorporated Protectants 144
Remote Sensing for Bioengineered Crops 145
Environmentally-Benign Polymeric Packaging from Renewable Resources 146
Science-Based Opportunities for Interagency Interactions Through the USDA
Biotechnology Risk Assessment Research Grants Program 147
Panel Discussion/Questions and Answers 148
Applying Nanotechnology to Solve Environmental Problems 149
Nanotechnology and the Environment: Keeping an Emerging Technology Green 150
The Future of the National Nanotechnology Initiative 151
Nanostructured Porous Silicon and Luminescent Polysiloles as Chemical
Sensors for Carcinogenic Chromium (VI) and Arsenic (V) 152
Nanoscale Biopolymers for Decontamination and Recycling of Heavy Metals 153
Molecular-Dynamics Simulation of Forces Between Colloidal Nanoparticles 155
Development of Nanocrystalline Zeolite Materials as Environmental Catalysts:
From Environmentally Benign Synthesis to Emission Abatement 156
Panel Discussion/Questions and Answers 157
Appendix A: Meeting Agenda 158
Acronyms
ACAT Alaska Community Action on Toxics
ACC American Chemistry Council
AOC Assimilable Organic Carbon
APR air purifying respirator
AQI air quality index
ASPECT Airborne Spectral Photographic Environmental Collection Technology
ATSDR Agency for Toxic Substances and Disease Registry
BDOC Biodegradable Dissolved Organic Carbon
CABW California Aquatic Bioassessment Workgroup
CBEN Center for Biological and Environmental Nanotechnology
CDC Centers for Disease Control and Prevention
CHPPM Center for Health Promotion and Preventive Medicine
CWA Clean Water Act
DENR Department of Environment and Natural Resources
DEP Department of Environmental Protection
DHHS Department of Health and Human Services
DHS Department of Homeland Security
DNA deoxyribonucleic acid
DNR Department of Natural Resources
DOD Department of Defense
DOE Department of Energy
Executive Summary
The Environmental Protection Agency (EPA) presented the 2003 Science Forum: Partnering to
Protect Human Health and the Environment on Monday, May 5, through Wednesday, May 7,
2003, in Washington, DC to kick off May 2003 as "EPA Science Month." This Science Forum
highlighted EPA's scientific accomplishments, showcased EPA's commitment to quality
science, and demonstrated, through examples, the use of science in decisionmaking and
policymaking. The Science Forum also provided an opportunity for dialogue and interaction
among EPA scientists, clients, stakeholders, and colleagues with over 1,100 attendees at this
event, including EPA program, research, and regional staff; members of other Federal agencies;
the scientific community; and the public.
The Science Forum consisted of a full day plenary session with keynote and plenary speakers
and four two-day breakout sessions. Each breakout session examined a theme area—homeland
security, moving science into action, year of water, and emerging technologies. The Science
Forum included 189 posters on current EPA research activities and speaker-specific topics, EPA
scientists/engineers present to discuss their research efforts, 16 exhibits of EPA scientific and
educational programs, and an awards ceremony for scientific accomplishment.
Plenary Session
The purpose of this session was to provide keynote addresses on the role and value of science
and partnerships to support environmental decisionmaking and policymaking, plenary addresses
on each of the four topic areas (homeland security, moving science into action, year of water,
and emerging technologies), and to introduce the newly-created EPA Office of Homeland
Security and National Homeland Security Research Center.
Keynote Addresses. EPA Administrator Christine Todd Whitman opened the Science Forum with
a perspective on the role of sound science and research in public policy as well as an overview of
several EPA research program initiatives such as personnel retention, collaboration with other
agencies and organizations, and communicating research results and directions externally. EPA
Science Advisor and Assistant Administrator for the Office of Research and Development
(ORD), Dr. Paul Gilman, provided highlights of numerous science-related initiatives to address
the science needs of EPA as well as the quality of the scientific products. The Regional
Administrator for EPA Region 4, Mr. Jimmy Palmer, presented examples illustrating the
regional perspective of the EPA's science assets and future scientific needs. Chairman of the
White House Council on Environmental Quality, Mr. James Connaughton, discussed
overarching science needs and issues facing Federal agencies and the United States government
internally and internationally. Director of the Office of Science Policy, Dr. Kevin Teichman,
provided an overview of the three-day Science Forum.
Plenary Addresses. Director for the Biological and Chemical Portfolio, Dr. John Vitko, provided
an overview of the new Department of Homeland Security, identified key initiatives in the
biological threat area, and discussed current activities and research initiatives. Secretary of the
North Carolina Department of Environment and Natural Resources, Mr. William Ross, Jr.,
discussed the Federal/state partnership and provided examples illustrating how the relationship
between North Carolina and EPA is moving science forward in ways that contribute to the
environment and human health. Director of the Haudenosaunee Environmental Task Force, Mr.
James Ransom, discussed how cultural issues affect science and provided an understanding of
the role of traditional knowledge in conjunction with Western science in problem solving.
Marine Biologist and Explorer-in-Residence with the National Geographic Society, Dr. Sylvia
Earle, discussed the importance of scientific exploration and its role in understanding water, the
environment, and human impacts. Director of the Foresight and Governance Project with the
Woodrow Wilson International Center for Scholars, Mr. David Rejeski, discussed the current
technology revolution and the changes in thinking, approaches, and organizations necessary to
address the environmental challenges posed by these emerging technologies.
National Homeland Security Research Center (NHSRC) and Office of Homeland Security.
EPA Deputy Administrator, Ms. Linda Fisher, discussed the newly-created NHSRC and its role
in supporting EPA responsibilities for homeland security, including water infrastructure
protection, safe buildings, rapid risk assessment, and incident response. The Director of the EPA
Office of Homeland Security, Ms. Mary Kruger, discussed the role and mission of this new
Office both within and external to the Agency.
Homeland Security
This two-day session focused on Homeland Security efforts, specifically response and
remediation efforts, threat detection, and incident preparedness. A key theme in these
presentations is that interagency collaborations are essential to research programs, successful
implementation of security measures, and response and remediation activities.
Anthrax: Response and Research. Dr. Lee Hofmann, with the Office of Solid Waste and
Emergency Response (OSWER), led this session addressing lessons learned in responding to the
anthrax attacks and new EPA research initiatives in support of homeland security. Mr. Thomas
Voltaggio, Deputy Regional Administrator for EPA Region 3, discussed the response and
remediation actions taken at the Hart Senate Office Building to address the anthrax
contamination. Mr. Timothy Oppelt, Director of NHSRC, discussed the founding of the
Homeland Security Research Program and the direction of the Program's current research. Dr.
Chris Weis, with the National Enforcement Investigation Center, discussed the importance of
science support and coordination in environmental emergency situations, and the practical safety
and risk assessment challenges encountered by On-Scene Coordinators (OSC).
Anthrax: Detection, Sampling, and Analysis. Dr. Hofmann, with OSWER, led this session
examining sampling methods, protocols, and challenges encountered in responding to the
anthrax attacks. Captain Kenneth Martinez, with the National Institute for Occupational Safety
and Health (NIOSH), discussed the strategies and procedures involved with the decontamination
of the Hart Senate Building. Mr. Mark Durno, EPA Region 5, discussed details of the sampling
activities at the Hart Senate Building.
Anthrax: Fumigation and Re-Occupancy. Ms. Anna Treinies, with OSWER, led this session
reviewing lessons learned regarding fumigant selection, remediation activities, and developing
clearance determinations to re-occupy public spaces contaminated by anthrax. Dr. Dorothy
Acronyms (continued)
DOJ Department of Justice
DOT Department of Transportation
ECBC Edgewood Chemical and Biological Center
ECC Environmental Clearance Committee
EDC endocrine disrupting chemical
EMAP Environmental Monitoring and Assessment Program
EPA Environmental Protection Agency
ETV Environmental Technology Verification
FBI Federal Bureau of Investigation
FEMA Federal Emergency Management Agency
FIELDS Field Environmental Decision Support
FIFRA Federal Insecticide, Fungicide, and Rodenticide Act
GIS geographic information system
GLNPO Great Lakes National Program Office
GPRA Government Performance and Results Act
GPS global positioning system
GSA General Services Administration
HEPA high efficiency particulate air
HCGI highly credible gastrointestinal illness
HIV human immunodeficiency virus
HVAC heating, ventilation, and air conditioning
IBI Index of Biological Integrity
IRIS Integrated Risk Information System
JSAWM Joint Service Agent Water Monitor
LANL Los Alamos National Laboratory
MAC Mycobacterium avium complex
MCL maximum contaminant level
MCLG maximum contaminant level goal
MDEQ Michigan Department of Environmental Quality
Med-Fly Mediterranean fruit fly
MESB Michigan Environmental Science Board
MOUs Memoranda of Understanding
NASA National Aeronautics and Space Administration
NCEA National Center for Environmental Assessment
NCER National Center for Environmental Research
NEIC National Enforcement Investigation Center
NERL National Exposure Research Laboratory
NHEERL National Health and Environmental Effects Research Laboratory
NHSRC National Homeland Security Research Center
NIEHS National Institute of Environmental Health Sciences
NIOSH National Institute for Occupational Safety and Health
NNI National Nanotechnology Initiative
NOAA National Oceanic and Atmospheric Administration
NPDES National Pollutant Discharge Elimination System
NRMRL National Risk Management Research Laboratory
NSF National Science Foundation
OAQPS Office of Air Quality Planning and Standards
OPP Office of Pesticide Programs
OPPTS Office of Prevention, Pesticides, and Toxic Substances
ORD Office of Research and Development
OSC On-Scene Coordinator
OSHA Occupational Safety and Health Administration
OSWER Office of Solid Waste and Emergency Response
OWOW Office of Wetlands, Oceans, and Watersheds
PAH polycyclic aromatic hydrocarbons
PBT persistent, bioaccumulative, and toxic
PCB polychlorinated biphenyl
PLA polylactides
PM particulate matter
PPE personal protective equipment
PVC polyvinyl chloride
QA quality assurance
QA/QC quality assurance/quality control
QSAR quantitative structure-activity relationship
ReVA Regional Vulnerability Assessment
RNA ribonucleic acid
SBIR Small Business Innovation Research
SCCWRP Southern California Coastal Water Research Program
SCECAP South Carolina Estuarine and Coastal Assessment Program
SDWA Safe Drinking Water Act
SEQL Sustainable Environment for Quality of Life
SIP State Implementation Plan
SMCL secondary maximum contaminant level
SNL Sandia National Laboratories
SSC Science Support Coordinator
STAR Science to Achieve Results
TIO Technology Innovation Office
TRC Toxicogenomics Research Consortium
TRI Toxics Release Inventory
TSCA Toxic Substances Control Act
USCG United States Coast Guard
USDA United States Department of Agriculture
USGS United States Geological Survey
WTC World Trade Center
Clark, Chief Scientist for Bioterrorism Issues in OSWER, discussed the remediation of multiple
sites as a result of the anthrax mail attacks. Mr. Matt Gillen, a Senior Scientist with NIOSH, and
Mr. Jack Kelly, with EPA Region 3, jointly discussed post-remediation re-occupancy focusing
on the organizational, scientific, and communication issues related to building clearance for re-
occupancy.
Anthrax: Decontamination Technologies. Mr. Marty Powell, with EPA Region 3, led this
session addressing the identification and evaluation of chemicals for use in the treatment of
anthrax contamination. Mr. Jeff Kempter, Senior Advisor to the Office of Pesticide Programs
(OPP), discussed the crisis exemption for pesticides and the challenges involved with cleanup
efforts at anthrax sites. Mr. Jeff Heimerman, with the OSWER Technology Innovation Office,
discussed the challenges faced in evaluating emerging anthrax decontamination technology. Ms.
Rebecca Schultheiss, with the EPA Environmental Science Center, discussed the duties and
capabilities of this laboratory as they relate to the evaluation of decontamination chemicals. Dr.
Stephen Tomasino, also with the EPA Environmental Science Center, discussed the complex
nature of determining the effectiveness of anti-microbial chemicals, the EPA role in strategy
development and advancement of the science, and a research plan for this scientific area.
Building Partnerships Towards Homeland Security. Mr. Craig Mattheson, with the EPA
Chemical Emergency Preparedness Office, led this session examining critical infrastructure
protection and homeland security, and the interconnectivity of these industries/sectors. Mr.
Marty Durbin, Security Team Leader for the American Chemistry Council, discussed security
measures and communications tools for consideration in the development of a security plan for a
chemical facility. Mr. Paul Bennett, Director of Emergency Management at New York City
Department of Environmental Protection, discussed issues related to the response of water
utilities in the event of an attack, including the formation of essential partnerships with other
agencies/organizations. Mr. Michael Marcotte, with the District of Columbia Water and Sewer
Authority, discussed security measures taken at the Blue Plains Water Treatment Facility and
security issues for wastewater collection and treatment. Mr. Gordon Smith, Manager of the
Public Safety and Technologies Department at Sandia National Laboratories, discussed the
development, design, and evaluation of risk assessments as this process relates to vulnerability
assessment. Ms. Janet Pawlukiewicz, Director of the EPA Water Protection Task Force
(WPTF), discussed accomplishments and ongoing activities in support of the EPA Homeland
Security Strategic Plan. Dr. Nancy Adams, with NHSRC, provided highlights of EPA activities
under the Safe Buildings Program to address detection, containment, decontamination, and
disposal issues.
BioWatch. Mr. Thomas Coda, the lead for Homeland Security Programs in the Office of Air
Quality Planning and Standards (OAQPS), provided an overview of the development,
implementation, and capabilities of the BioWatch surveillance network. Also discussed were
the multi-agency responsibilities in this program to rapidly recognize the releases of biological
agents before the onset of illness.
World Trade Center: Lessons Learned and Personnel Protection and Training. Mr. Larry
Reed, with the National Institute of Environmental Health Sciences (NIEHS), presented lessons
learned regarding the importance of interagency collaborations in the aftermath of the World
Trade Center (WTC) attack. Mr. Joseph Hughes, Jr., with NIEHS, and Mr. Bruce Lippy, with
the National Clearinghouse for Worker Safety and Health Training, discussed the conditions and
problems encountered at the WTC site with regard to worker safety and training. Dr. Mark
Maddaloni, with EPA Region 2, discussed the physical and chemical challenges encountered by
EPA Region 2 in supporting the response actions after the WTC attack. Mr. Sven Rodenbeck,
Section Chief for the Superfund Site Assessment Branch with the Agency for Toxic Substances
and Disease Registry, described the challenges encountered in conducting sampling at the WTC
site and the development of the WTC Exposure Registry. Dr. Claudia Thompson, Program
Administrator for the NIEHS Superfund Basic Research Program, discussed WTC-related
research activities encompassing exposure, modeling, and health effects. Dr. Alison Geyh,
Assistant Professor in the School of Public Health with Johns Hopkins University, discussed
partnerships that aided in the evaluation of health effects resulting from exposures during the
WTC site cleanup. Mr. Herman Gibb, with the National Center for Environmental Assessment
(NCEA), discussed the focus and principal findings of the WTC Assessment Report including
lessons learned for future responses.
Preparing for Bioterrorism Threats in Water. Dr. Jafrul Hasan, with the Office of Science and
Technology (OST), and Mr. Chris Zarba, with the National Center for Environmental Research
(NCER), led this session addressing security and detection technology research as they relate to
bioterrorism threats in water. Mr. Jonathan Herrmann, with the NHSRC, provided an overview
of the key principles, scope, and approach of the Water Security Research and Technical Support
Program to provide, within three years, appropriate, affordable, reliable, tested, and effective
technologies and guidance for preparedness, detection, containment, decontamination, and risk assessment
of chemical/biological attacks on buildings and on water systems. Ms. Grace Robiou, with the
EPA WPTF, discussed the role of the WPTF in water infrastructure security and provided
highlights of current research projects for agent prioritization, a response protocol for
contamination threats to drinking water, and assessment of laboratory capabilities and capacity.
Dr. John Ezzell, a Senior Scientist with the United States Army Medical Research Institute of
Infectious Diseases, presented an overview of various technologies and approaches used to
detect biological threat agents in water. Ms. Janet Jensen, Project Manager with the United
States Army Soldier and Biological Chemical Command, discussed the concept and design of the
Joint Service Agent Water Monitor program to develop advanced capabilities to detect, identify,
and quantify chemical and biological contaminants in source, treated, and distributed consumer
water supplies. Dr. Alan Lindquist, Technical Lead for Detection of Contaminants of Concern in
the NHSRC Safe Buildings and Safe Water Programs, discussed the ongoing initiatives and
future plans for developing the necessary approaches and protocols for the detection of
biological agents in water.
Moving Science Into Action
This two-day session focused on ongoing projects and activities involving the use and
development of environmental models, data management systems, and interactive tools on a
national, regional, state, local, and tribal level. This session presented several pilot projects and
communication efforts supported by EPA as well as the goals and uses of scientific data to assess
environmental conditions and human health risks. Key themes in the presentations are the need
to communicate with decisionmakers in government and industry, and the importance of
developing partnerships between Federal, state, local, and tribal governments and organizations.
Regional Vulnerability Assessment (ReVA): Improving Environmental Decisionmaking
Through Client Partnerships. Dr. Betsy Smith, with the National Exposure Research
Laboratory (NERL), led this session and provided an overview of ReVA as a flexible
framework, including a web-based tool and modeling system, promoting partnerships to build,
sustain, and improve community planning, while protecting the environment and human health.
Dr. Michael O'Connell, President of the Waratah Corporation, demonstrated the features of the
ReVA web-based integration tool, and provided example maps and histograms depicting
environmental assessment data for use by decisionmakers. Ms. Rebecca Yarbrough, Project
Manager with the Centralina Council of Governments, provided highlights of the Sustainable
Environment for Quality of Life (SEQL) Program, a partnership between EPA, state, and local
governments to incorporate environmental considerations into local and regional decisionmaking
using ReVA tools. Mr. William Jenkins, Director of the Watershed Management and Analysis
Division in the Maryland Department of Natural Resources (DNR), discussed the partnership
between ORD and Maryland in the use of ReVA to increase the effectiveness and efficiency of
watershed enhancement and restoration activities.
Partnership With State and Local Government. Mr. Gilberto Alvarez, with EPA Region 5, led
this session presenting ongoing projects that are prime examples of partnerships between EPA,
state or regional agencies, and other organizations. Dr. Bruce Herbold, with EPA Region 9,
described the studies of water quality, based on the migration of salmon, at the Delta Cross
Channel Gate in California. Dr. Linda Rimer, with EPA Region 4, discussed efforts to promote
quality of the environment in land use planning and political decisionmaking as well as the
threats to the environment and human health resulting from urban sprawl and land development.
Mr. Keith Harrison, Director of the Office of Special Projects with the Michigan Department of
Environmental Quality, provided highlights of efforts by the Michigan Environmental Science
Board to protect children's health, including an environmental standards investigation.
Advancing Science Through Environmental Monitoring and Assessment Program (EMAP)
Partnerships. Dr. Michael McDonald, Director of EMAP, led this session providing highlights
of EMAP and its applications as well as current and future initiatives involving this web-based
tool. Dr. Roger Blair, with the National Health and Environmental Effects Research Laboratory
(NHEERL), provided an overview of the EMAP and EMAP-West efforts and tools to estimate
the current status/trends of selected environmental indicators on a regional basis, seek
associations between indicators and stressors, prepare periodic assessments, and define
quantitative biocriteria among other goals. Mr. Karl Hermann, with the EPA Region 8
Ecosystem Protection Program, discussed the EMAP Western Pilot project to produce a regional
assessment of the ecological conditions of streams in EPA Region 8, and stakeholder
partnerships important to project success. Mr. James Harrington, with the California Department
of Fish and Game, discussed collaboration of the EPA and the State of California to develop
biocriteria and improve water quality monitoring programs, methods, and protocols. Mr.
Jefferson Davis, a Scientist with the Nez Perce Tribe, described the role of EMAP in assessing
current conditions of streams within the reservation and using the bioassessment applications to
develop water quality standards and criteria, complete a Clean Water Act Section 303(d)
listing of impaired areas, and develop total maximum daily load values.
In a second session on EMAP, Dr. Kevin Summers, with NHEERL, provided highlights of the
National Coastal Assessment Program and initiatives to build the scientific basis as well as the
local, state, and tribal capacity to monitor the status and trends in the condition of the Nation's
coastal ecosystems. Dr. Stephen Weisberg, with the Southern California Coastal Water Research
Program, discussed how EMAP has supported and influenced the Southern California coastal
monitoring programs and current activities to develop cooperative regional monitoring surveys.
Dr. Robert Van Dolah, with the South Carolina Department of Natural Resources, provided
highlights of the South Carolina Estuarine and Coastal Assessment Program to monitor and
report on the conditions of biological habitats, including tidal creeks and open water. Ms.
Darvene Adams, the EPA Region 2 Monitoring Coordinator, described EMAP and Regional
EMAP objectives in EPA Region 2 to support state monitoring programs and address regional
priorities, including an EMAP design for monitoring based on probability, approaches for
indicator development, and approaches for water quality standards.
Working with Tribes: Cultural Values and Tribal Lifeways Inform Health Assessments. Mr.
Thomas Baugh, with EPA Region 4, led this session providing examples of partnering between
tribes and government agencies to maintain healthy environments, acquire new or better data,
develop data analysis tools, and communicate environmental risks and conditions. Ms. Sarah
Ryan, with the Big Valley Rancheria, explained the traditions and goals of the reservation to
improve pesticide management, community recycling, and communication of diverse
environmental and human health effects within their community. Ms. June Gologergen-Martin
with the Alaska Community Action on Toxics (ACAT) discussed the ACAT Program and tribal
efforts to address environmental health issues prevalent on St. Lawrence Island, including
funding, investigation of the nature and extent of contamination resulting from the United States
military and other sources, and incorporating their input into decisionmaking efforts of
surrounding areas. Mr. Larry Campbell and Ms. Jamie Donatuto, with the Swinomish Indian
Tribal Community, presented the issues of contamination of subsistence-harvested shellfish and
their project to study the bioaccumulative toxics in shellfish on the Swinomish reservation to
address environmental and human health concerns.
Moving Science into Action — Step One: Get the Data! Ms. Pamela Russell and Mr. Mike
Flynn, with the Office of Environmental Information (OEI), led this session presenting the
acquisition and analysis of data critical to completing environmental and human health risk
assessments as well as current initiatives and partnerships. Ms. Gail Froiman, with OEI,
provided an overview of the EPA Toxics Release Inventory (TRI) program and discussed uses of
the data to support communication and decisionmaking efforts for government agencies,
community organizations, industry, and international organizations. Dr. Ron Klauda, with the
Maryland DNR, and Mr. Keith Van Ness, with the Montgomery County Department of
Environmental Protection, presented highlights of an EPA, Maryland, and Montgomery County
partnership using TRI data to improve stream monitoring and watershed assessment. Dr.
Jonathan Kennan, with the United States Geological Survey, discussed ongoing projects with
OEI to address adverse effects of urbanization; evaluate the relations among land use, extant fish
species composition, and stream water quality; and determine if there are significant relations
between fish assemblage structure and environmental quality across a disturbance gradient. Dr.
William P. Smith, with OEI, discussed the use of TRI data in creating dynamic choropleth maps
to visually depict trends in human health and environmental conditions.
Emerging Innovations in Regional Ecosystem Protection. Mr. Doug Norton, with the Office of
Water, provided highlights of an EPA workshop on critical ecosystem assessment and the use of
geospatial modeling to support such assessments. Dr. Mary White, with EPA Region 5,
discussed the use of geospatial analysis to characterize ecosystems, including the composite
assessment of diversity, sustainability, and land cover rarity factors. Ms. Brenda Groskinsky,
with EPA Region 7, described synoptic modeling as a method to rank and prioritize wetland
ecosystems to support decisionmaking on resource allocation and wetland protection. Dr. John
Richardson, with EPA Region 4, provided highlights of the Southeastern Ecological
Framework's GeoBook project that uses geographic information system (GIS) modeling to
determine appropriate ways to study and protect ecosystems as well as to support decisionmakers
in identifying issues important to surrounding communities. Mr. Tom DeMoss, with the Canaan
Valley Institute, described the Mid-Atlantic Highlands Program for collaborative monitoring,
research, management, and restoration activities within the Mid-Atlantic Highlands.
Site Characterization and Decision Analysis of Contaminated Sediment. Dr. John Bing-Canar,
with EPA Region 5, led this session that illustrated new tools and techniques supporting
scientific analysis and decisionmaking at a contaminated sediments site. Mr. Brian Cooper, with
EPA Region 5, provided an overview of a collaborative project between EPA, academia, and the
National Oceanic and Atmospheric Administration to apply GIS tools for three-dimensional
visualization, characterization, and decision analysis of contaminated sediments. Dr. John Kern,
with Kern Statistical Services, Inc., discussed sampling design and procedures for the same
contaminated sediments study. Dr. Bing-Canar discussed exploratory data analysis and other
spatial estimation methods used to determine chemical mass and volume at the contaminated
sediments study site. Mr. Charles Roth presented the methods used for spatial estimation that
result in defensible, repeatable, and accurate data, and the creation of a risk analysis tool to
support decisionmaking.
Year of Water—30 Years of Progress Through Partnerships
This two-day session focused on human impacts on water systems, ecological and human health
implications of impaired systems, tools and techniques for improved tracking and monitoring of
water system degradation, improvement in overall water quality, the relationship between
drinking water and waterborne disease, and specific challenges involving invasive species and
coral reef management. All presentations highlighted EPA partnerships with state, local, and
tribal governments as well as the role of volunteer monitoring in addressing water issues. Key
themes in all of the discussions were the need for diverse bioindicators, increased understanding
of water habitat stressors, techniques for information sharing, and challenges in reversing
impairment that has already occurred.
Waterborne Disease in the United States. Dr. Fred Hauchman, with NHEERL, led this session
on waterborne disease trends and factors affecting microbiological contamination of drinking
water. Dr. Rebecca Calderon, Chief of the Epidemiology and Biomarkers Branch at NHEERL,
discussed the challenges in detecting and determining the causes of waterborne diseases and the
related research conducted by EPA and the Centers for Disease Control and Prevention (CDC).
Dr. Jack Colford, Associate Professor of Epidemiology at the University of California, Berkeley,
presented the results of a study of drinking water intervention in human immunodeficiency virus
(HIV)-sensitive populations and the frequency of gastrointestinal illnesses resulting from impaired
drinking water in these populations. Dr. Mark LeChevallier, Director of Research at American
Water, discussed monitoring and control techniques to maintain the biological integrity of
drinking water during distribution to users.
Mississippi River Basin Hypoxia. A panel discussion provided an overview of the complex
hypoxia issue involving the Mississippi River basin and the northern Gulf of Mexico. Mr. Lee
Mulkey, Associate Director for Ecology at the National Risk Management Research Laboratory
(NRMRL), discussed the relationship of nonpoint source nutrient loading and hypoxia in the
Gulf of Mexico as well as current interest in free market solutions to address this area of concern.
Dr. Mary Belefski, with the Office of Prevention, Pesticides, and Toxic Substances, provided
highlights of six reports examining the science and economic aspects of the hypoxia issue, and
research involved in developing analysis tools and potential resolutions. Ms. Katie Flahive, with
the Office of Wetlands, Oceans, and Watersheds (OWOW), presented an overview of an Action
Plan to reduce, mitigate, and control hypoxia in the northern Gulf of Mexico as well as
partnerships among Federal agencies, states, and tribes to implement Action Plan goals to reduce
nutrient loading and reduce the size of the hypoxic zone.
The Millennium Challenge: EPA's Response to Invasive Species. Mr. Michael Slimak,
Associate Director for Ecology at NCEA, and Ms. Marilyn Katz, with OWOW, led this session
and provided background on the impacts and economic cost associated with invasive species and
efforts underway to combat this complicated issue. Assistant Administrator of the Office of
Water, Mr. G. Tracy Mehan III, provided an overview of the extent of the invasive species issue,
and key initiatives to control their introduction. Dr. Richard Everett, with the United States Coast
Guard, presented highlights of research related to invasive species, and the use of new
technologies, regulations, and best management practices to actively combat invasive species
entry routes. Ms. Kathy Hurld, with OWOW, presented the international perspective of invasive
species and the progress toward the development of an international ballast water treaty. Dr.
Marc Tuchman, Team Leader for both the Sediment Assessment and Remediation Team and the
Invasive Species Team at the Great Lakes National Program Office, discussed the development
and introduction of an electrical barrier to prevent the spread of Asian Carp into the Great Lakes.
Mr. Daniel Rosenblatt, Team Leader for the Emergency Response Team at the Office of Pesticide Programs (OPP), discussed how
Federal pesticide laws and insect control programs help to control the spread of invasive species
and health considerations related to pesticide use. Ms. Jacqueline Savitz, Pollution Campaign
Director and Senior Scientist for Oceana, discussed the importance of preventing the
introduction of invasive species as a primary management control and the need for careful
consideration and precaution before using toxic chemicals for invasive species control as
evidenced by unintended consequences of past chemical use.
Social Science and Resistance to Water Fluoridation. Mr. Bill Hirzy, with the National
Treasury Employees Union, and Ms. Roberta Baskin, a Senior Reporter, introduced this session
and the intent to host a debate about the science and national policy of water fluoridation, which
is considered a controversial issue. Dr. Ed Ohanian, Director of the Health and Ecological
Criteria Division in the Office of Science and Technology (OST), presented an overview of drinking water
regulations and the health benefits of fluoride addition as well as current initiatives to review
new health effects and exposure data. Dr. Paul Connett, a professor at St. Lawrence University,
presented research, data, and other information in support of a counter viewpoint on the necessity
for a national water fluoridation policy and the health consequences of fluoride ingestion.
Development of Biological Indices for Coral Ecosystem Assessments. Mr. Kennard Potts, with
OWOW, led this session and introduced the goals for developing indicators for coral reef health,
the EPA role and partnerships in coral reef management, and the current research initiatives. Dr.
Jordan West, with NCEA, discussed the importance of coral reefs, local and global stressors
affecting reefs, and efforts of the United States Coral Reef Task Force to provide interagency
collaboration on the issue of coral bleaching. Dr. Richard Zepp, Senior Research Scientist with
NERL, discussed the linkage between increasing irradiance (light) and increasing water
temperature as well as El Nino effects and coral reef decline. Mr. William Swietlik, Program
Manager for the Biocriteria Program in the OST, discussed the utility of biocriteria to assess
coral reefs and the benefits of incorporating such biocriteria into water quality standards. Dr.
Steven Jameson, President of Coral Seas, Inc., discussed the development and use of an Index of
Biological Integrity (IBI) as a more accurate method to monitor and assess coral reefs drawing
on the success of IBIs in freshwater environments and the transferability of IBIs as indicators to
marine environments.
The Impacts of Urban Drainage Design on Aquatic Ecosystems in the United States. Mr.
Jamal Kadri, with the Office of Water, led this session on the use of Smart Growth tools and
initiatives for watershed protection and management, and described the EPA role and interest in
urban drainage design. Ms. Diane Regas, with OWOW, discussed Smart Growth principles,
their use in addressing major land development and other threats to estuaries and watersheds, and
emphasized the need for EPA to continue to foster partnerships with local governments in this
endeavor. Ms. Hye Yeong Kwon, Executive Director of the Center for Watershed Protection,
discussed the use of impervious cover as an indicator for watershed quality and a roundtable
approach to introduce Smart Growth concepts to community leaders seeking to protect their
watersheds.
Innovative Monitoring Techniques. Ms. Susan Holdsworth, with OWOW, led this session and
discussed the importance of developing innovative monitoring techniques to fill the monitoring
gaps that exist for the majority of the waters of the United States and that in turn challenge
our ability to understand which waters are impaired or in danger of being impaired. Ms. Susan
Jackson, with the Biocriteria Program, discussed the use of biological indicators to assess water
quality, demonstrated the added value of their use, and noted partnerships important to
addressing important monitoring questions. Mr. Barry Burgan, a Senior Marine Scientist in
OWOW, presented probabilistic monitoring approaches as a cost-effective, innovative technique
to assess wetland and estuarine quality. Ms. Denise Wardrup, Assistant Director of the Penn
State Cooperative Wetlands Center, discussed the use of GIS, land use, and landscape
information to conduct a variety of assessments of watershed condition at the desktop level prior
to conducting onsite surveys.
Volunteer Monitoring—Ten Years of Progress. Mr. Joe Hall, with OWOW, led this session
addressing diverse examples of volunteer monitoring efforts and their contribution to
environmental programs. Ms. Alice Mayio, with OWOW, provided an overview of volunteer
monitoring in the past and present, EPA sponsorship of volunteer monitoring, partnerships, and
the applicability of volunteer data. Ms. Kathleen Kutschenreuter, with OWOW, discussed how
volunteer monitoring relates to wetlands protection and quality data collection, drawing on
examples of successful volunteer wetlands cooperative projects and partnerships. Mr. Joe Hall,
with OWOW, described how volunteers support coastal and estuarine monitoring initiatives with
examples from the National Estuary Program. Ms. Mayio concluded the session with a review
of the future challenges and opportunities for volunteer monitoring programs.
Emerging Technologies
This two-day session focused on the application, use, and research directions for diverse
emerging technologies, including computational toxicology, genomics, advanced information
technology for simulation and modeling, biotechnology, and nanotechnology. Key themes in all
of the discussions were the fast pace of development and introduction of these technologies; the
increasing ability to understand toxicity, chemical reactions, and other mechanisms at the
molecular and genetic levels; the great promise for more environmentally-benign manufacturing;
the need for interdisciplinary and interagency collaboration on research programs supporting the
development of these emerging technologies; and the need to understand the effects and future
implications of these new technologies on human health and the environment to support
decisionmaking and regulation.
Applying Computational Toxicology to Solving Environmental Problems. Dr. William Farland,
Acting Assistant Administrator for Science and Research and Development for ORD, led this
session, defined computational toxicology, and noted potential applications to reduce animal
testing for understanding biology and risk. Dr. Donna Mendrick, with Gene Logic, Inc.,
discussed current efforts to build and apply a toxicogenomic database to predict pharmaceutical
and chemical effects as well as the mechanisms of toxicity. Dr. Lawrence Reiter, Director of
NHEERL, provided an overview of computational toxicology research at EPA and the use of a
conceptual or science framework for guiding research. Dr. William Welsh, with the Robert
Wood Johnson Medical School and the University of Medicine and Dentistry of New Jersey,
discussed the development and application of computational tools useful to risk assessment and
regulatory control with emphasis on quantitative structure-activity relationship models. Dr.
Douglas Wolfe, with NHEERL, presented applications of computational toxicology and
genomics to risk assessment for drinking water. Dr. David Lattier, with NERL, discussed the
use of computational toxicology in conjunction with genomics, proteomics, and metabonomics
to assess the path from stressor to exposure to effect in aquatic ecosystems. Mr. Joseph
Merenda, Jr., with the Office of Science Coordination and Policy, provided the program office
perspective on the use of structure-activity tools, the types of tools in use, and gaps or needs to
fill. Dr. Bennett Van Houten, Chief of the Program Analysis Branch at the National Institute of Environmental Health Sciences (NIEHS), provided
highlights of the recently established National Center for Toxicogenomics and its research
initiatives.
Innovation to Advance the Detection of Threats and Optimize Environmental
Decisionmaking. Dr. Gary Foley, with NERL, led this session and discussed the role of advanced
information technology and modeling to gather, integrate, and interpret environmental data to
improve risk assessment and develop decision tools to support multi-stressor regional
decisionmaking. Dr. David Nelson, with the White House National Coordination Office for
Information Technology Research and Development, provided highlights of the Federal
Networking and Information Technology Research and Development (NITRD) Program, including examples of
information technology applications to environmental issues. Ms. Ramona Trovato, with OEI,
discussed EPA use of information technology to acquire and manage incoming data and the use
of such data to make sound decisions. Mr. David Williams, with ORD, and Mr. Jim Szykman,
with the Office of Air Quality Planning and Standards (OAQPS), addressed the use of satellite-based remote sensing systems to evaluate human and
ecosystem health issues. Dr. Mark Thomas, with EPA Region 7, presented an airplane-based
technology in use by EPA to assist with emergency response to incidents involving chemical
releases. Dr. Steven Perry, with NERL, discussed the use of computer imaging and wind tunnel
testing to characterize the temporal and spatial patterns of contaminant movement and deposition
from the World Trade Center (WTC) collapse. Mr. Timothy Hanley, with the Office of Air and Radiation, addressed
the use of real-time monitoring data to communicate air quality conditions to the public.
Applying Biotechnology to Achieve Sustainable Environmental Systems. Dr. Hugh McKinnon,
Director of the National Risk Management Research Laboratory, led this session on diverse
types and applications of biotechnology. Dr. Barry Marrs, with the Fraunhofer Center for
Molecular Biotechnology, discussed the use and implications of molecular farming to replace
traditional chemical manufacturing with emphasis on the use of enzymes as catalysts. Dr.
Lawrence Reiter, Director of NHEERL, presented highlights of the EPA biotechnology research
program to improve understanding of the health and environmental effects from agricultural
biotechnology products and to support the EPA mandate to regulate such products. Dr. Janet
Anderson, with OPP, discussed the role of science in the regulation of biotechnology. Dr.
Robert Frederick, with NCEA, provided an overview of monitoring strategies to support risk
assessment and decisionmaking with regard to bioengineered crops. Dr. John Glaser, with
NRMRL, presented potential applications of satellite-based remote sensing systems to support
compliance monitoring and evaluation of the development of insect resistance in bioengineered
crops. Dr. John Dorgan, with the Colorado School of Mines, discussed the environmentally-
friendly production and use of biopolymers to create biodegradable plastics and other products.
Dr. Deborah Hamernik, with the United States Department of Agriculture (USDA), provided
highlights of the USDA Biotechnology Risk Assessment Research Grants Program to support
multiple Federal agencies in making science-based decisions about the safety of introducing
genetically-modified organisms into the environment.
Applying Nanotechnology to Solve Environmental Problems. Dr. Jack Puzak, Acting Director
of NCEA, led this session, provided highlights of EPA and other Federal research initiatives for
nanotechnology, and identified potential nanotechnology applications and research directions.
Dr. Vicki Colvin, with the Center for Biological and Environmental Nanotechnology, discussed
current research for the creation of nanomaterials to improve membrane filter performance and
to remove specific contaminants from wastewater. Dr. Mike Roco, a Senior Advisor with the
National Science Foundation, presented highlights of the National Nanotechnology Initiative,
nanotechnology applications, and future research directions. Dr. William Trogler, with the
University of California-San Diego, presented recent research results for the production and use
of polysiloles as nanotechnology chemical sensors for arsenic and hexavalent chromium. Dr.
Wilfred Chen, with the University of California-Riverside, addressed current research results for
the production, modification, and use of biopolymers to selectively remove specific heavy metals
from wastewater. Dr. Kristen Fichthorn, with Pennsylvania State University, discussed new
understandings of molecular dynamics of colloidal nanoparticles derived from simulations and
experimental research. Dr. Vicki Grassian, with the University of Iowa, presented techniques to
generate zeolite nanoparticles and potential applications of the resulting properties in catalysis
and optically transparent films and coatings.
Section I: Overview
The Environmental Protection Agency (EPA) presented a Science Forum at the Ronald Reagan
Building and International Trade Center in Washington, DC, on Monday, May 5, through
Wednesday, May 7, 2003, to kick off May 2003 as "EPA Science Month." The EPA 2003
Science Forum: Partnering to Protect Human Health and the Environment was an opportunity
to showcase the activities of EPA and other organizations in key areas of environmental research
and to spotlight new initiatives and recent successes. As the second of what is anticipated to be
an annual event, this Science Forum built upon the first-ever Agency-wide Science Forum held in
May 2002, and was co-sponsored by the Office of Research and Development (ORD), the Office
of Water, the Office of Solid Waste and Emergency Response (OSWER), and EPA Region 4.
The Science Forum highlighted selected high priority topics and EPA's scientific
accomplishments, showcased EPA's commitment to quality science, and demonstrated, through
examples, how science influences Agency decisions. The Science Forum also provided an
opportunity for dialogue and interaction among EPA scientists, partners, clients, stakeholders,
and colleagues with over 1,100 attendees at this event. Attendees included EPA program,
research, and regional staff; members of other Federal agencies; stakeholders; the scientific
community; and interested members of the public. The Science Forum included 189 posters
addressing current EPA research activities and specific topics addressed by speakers, discussions
of research efforts by EPA and external scientists and engineers, and 16 exhibits of EPA
scientific and educational programs.
EPA Administrator Christine Todd Whitman opened the first day of the Science Forum with a
perspective on the role of sound science and research in public policy and provided an overview
of several EPA initiatives to maintain and promote the Agency's commitment to quality science.
Other keynote speakers provided highlights of ongoing initiatives to address the science needs of
EPA and the quality of its scientific products, the regional perspective of EPA's science assets
and future scientific needs, and overarching science needs and issues facing Federal agencies and
the United States government internally and internationally. Subsequent plenary presentations
addressed each of the four theme areas emphasized in this Science Forum as well as EPA's
newly created National Homeland Security Research Center and Office of Homeland
Security. At a reception following the keynote and plenary addresses, Level 1 Scientific and
Technological Achievement Awards were presented to four EPA personnel in recognition of
their high quality research of national significance or impact.
Four two-day breakout sessions each examined a theme area—homeland security, moving
science into action, year of water, and emerging technologies. The audience had an opportunity
in each session to ask questions of the speakers. Poster sessions followed the plenary session
and each breakout session addressing session-specific and related topics. EPA engineers and
scientists were available at these poster sessions to provide additional information and to address
questions of attendees.
Section II: Plenary Session
Monday and Tuesday, May 5-6, 2003
The purpose of this session on the first day and the beginning of the second day of the meeting
was to provide keynote addresses on the role and value of science and partnerships to support
environmental decisionmaking and policymaking, provide plenary addresses on each of the four
topic areas (homeland security, moving science into action, year of water, and emerging
technologies), and introduce the newly created EPA Office of Homeland Security and National
Homeland Security Research Center.
EPA Administrator Christine Todd Whitman opened the Science Forum with a perspective on the
role of sound science and research in public policy as well as an overview of several EPA
research program initiatives. EPA Science Advisor and Assistant Administrator for ORD, Dr.
Paul Gilman, provided highlights of ongoing initiatives to address the science needs of EPA as
well as the quality of the scientific products. The Regional Administrator for EPA Region 4, Mr.
Jimmy Palmer, presented examples illustrating the regional perspective of the EPA's science
assets and future scientific needs. Chairman of the White House Council on Environmental
Quality, Mr. James Connaughton, discussed overarching science needs and issues facing Federal
agencies and the United States government. Director of the Office of Science Policy, Dr. Kevin
Teichman, provided an overview of the three-day Science Forum.
Director for the Biological and Chemical Countermeasures Portfolio, Dr. John Vitko, provided
an overview of the new Department of Homeland Security (DHS), identified key initiatives in
the biological threat area, and discussed current activities and research initiatives. Secretary of
the North Carolina Department of Environment and Natural Resources (DENR), Mr. William
Ross, Jr., discussed the Federal/state partnership and provided examples illustrating how the
relationship between North Carolina and EPA is moving science forward in ways that contribute
to the environment and human health. Director of the Haudenosaunee Environmental Task
Force, Mr. James Ransom, discussed how cultural issues affect science and provided an
understanding of the role of traditional knowledge in conjunction with Western science in
problem solving. Marine Biologist and Explorer-in-Residence with the National Geographic
Society, Dr. Sylvia Earle, discussed the importance of scientific exploration and its role in
understanding water, the environment, and human impacts. Director of the Foresight and
Governance Project with the Woodrow Wilson International Center for Scholars, Mr. David
Rejeski, discussed the current technology revolution and the changes in thinking, approaches,
and organization necessary to address the environmental challenges posed by these emerging
technologies.
EPA Deputy Administrator, Ms. Linda Fisher, discussed the newly created National Homeland
Security Research Center (NHSRC) and its role in supporting EPA responsibilities for homeland
security. The Director of the EPA Office of Homeland Security, Ms. Mary Kruger, discussed the
role and mission of this new Office.
Opening Remarks
Director of the Office of Science Policy within the Office of Research and Development (ORD),
Dr. Kevin Teichman, welcomed all the attendees to this second annual EPA-wide Science
Forum: Partnering to Protect Human Health and the Environment, and introduced the co-chair
of the development committee, Ms. Megan Grogard. Dr. Teichman introduced the keynote
addresses for the morning plenary session.
Keynote Addresses
The EPA Administrator, the EPA Science Advisor, and the Regional Administrator for EPA
Region 4 provided opening addresses to Science Forum attendees on the role of science at EPA,
current initiatives, and future directions. The Chairman of the White House Council on
Environmental Quality provided a national and international perspective on how changes in
scientific understanding of the environment and natural resources impact government
decisionmaking.
EPA Administrator Keynote Address
EPA Administrator Christine Todd Whitman acknowledged the unique opportunities provided by
this Science Forum to bring the EPA science community together with other partners and to
showcase a broad range of cutting-edge research. EPA relies on sound science to understand
environmental problems and the risks they pose to quality of life, to identify potential solutions
and analyze which will work best, and to determine the effectiveness of current programs and
public policies in meeting the overall goal of cleaner air, purer water, and better protected land.
Such strong reliance requires a strong foundation, and that foundation is a hallmark of EPA
science.
Nonetheless, efforts continue to look for ways to strengthen this vital program and to increase
the role of science in decisionmaking all the way through to the final product (e.g., rule,
guidance). At the beginning of her tenure, Administrator Whitman established a commission to
review the use of science at the Agency and a number of the recommendations have been
implemented, including the improvement of resources that determine the quality of EPA science,
partnering with external organizations to address important scientific research common to many
programs, and enhanced communication and coordination internally and externally.
While EPA attracts some of the best and brightest minds in the United States, efforts continue to
retain EPA scientific personnel and to support a strong post-doctoral program to continue their
contributions to EPA programs and produce future leaders in science and engineering.
Recognizing that recruiting great scientists is not sufficient and that their work must be
incorporated into regulations and other programs, EPA has increased the number of laboratory
engineers and scientists supporting regulatory programs and is holding the regulatory programs
to the same scientific standards as the research programs.
Important scientific work is being conducted outside of the EPA and the United States
government that is useful to EPA activities. In addition, no one agency or organization has
sufficient resources to conduct all of the research necessary, therefore all interested organizations
benefit from sharing of information as well as resources. Examples of such collaboration include
a joint initiative with the American Chemistry Council to understand chemical impacts on
children's immune systems, and the Science to Achieve Results (STAR) program that has
supported outside agency research by providing more than $700 million through more than 800
grants since 1995, a true example of meaningful investment.
However, having "best in the world science" is of little value if the results are not communicated
or do not establish public confidence. Through the leadership of Dr. Paul Gilman, EPA Science
Advisor and Assistant Administrator for ORD, EPA is continuing to improve internally the
application and use of science, and communicating externally the value and strength of the
science. From the Science Policy Coordinating Council, which helps to direct the use of
scientific and technical information in policy decisions, to the information quality guidelines
that set high standards for scientific information, there is a much clearer understanding today
of how to use EPA science and what the Agency expects from its science.
Multiple initiatives continue to ensure that this understanding is communicated to the outside
world and is coordinated across the Agency and its regions, partners, and the public.
Administrator Whitman also noted that this Science Forum provides the opportunity to visit
exhibits showcasing EPA research efforts, meet with top Agency scientists and engineers,
participate in panel discussions on pressing environmental topics, and develop ideas on how to
expand both science and partnerships. Administrator Whitman thanked Dr. Gilman and
everyone who helped put this Science Forum together, and noted the need to continue to rely on
the sound science that has led to the achievements thus far in order to address environmental
challenges that are more complicated today than those in the past.
EPA Science Advisor and ORD Assistant Administrator Keynote Address
EPA Science Advisor and Assistant Administrator for ORD, Dr. Paul Gilman, provided
highlights of ongoing initiatives to address the science needs of the EPA, with titles of EPA
scientist awards and publications projected in the background. The number of personnel
involved in the regulatory planning process has doubled and the number of different projects that
have involved EPA scientists and engineers increased by about 35 percent in the last year. More
efforts focus on retaining the core research that provides the tools necessary to identify and
address emerging issues. At this time, EPA research is equally split between basic science and
problem-driven science, and expectations are that this will be maintained in the future.
Examples of science-related initiatives at EPA include:
• Creation of an inventory of all research activities, involving more than 4,500 entries,
anticipated to be publicly available on the Internet in the next year
• Preparation of information quality guidelines drawing on the quality assurance/quality
control (QA/QC) program already in effect at EPA, which guides how to do the work well
and thoroughly
• Compilation of the first comprehensive list of bio-indicators with assistance from the Council
on Environmental Quality, anticipated to be available in the next few months
• Development of a genomics policy to address the implications of a new set of technologies
across ORD and the EPA Program Offices as well as tools acceptable to the regulators and
the regulated community
• Implementation of research initiatives on asthma and aging
• Enhancement of the Integrated Risk Information System (IRIS) database, a flagship database
receiving thousands of queries annually
• Re-invigoration of regulatory environmental modeling, including an inventory of computer
models used by the Agency, and working with the National Academies of Science to address
a future vision for model use and QA/QC
• Conduct of a cross-agency forum on environmental measurements (including methods,
validation processes, and training to disseminate new methods) to ensure development of
sound, reliable measurements to support sound and reliable decisionmaking
• Revamping the Government Performance and Results Act (GPRA) process and reducing the
goals from 10 to five, with each goal highlighting research and cross-cutting initiatives
involving the EPA Programs, Regions, and ORD
• Implementation of homeland security research, an integrated effort of the Offices of Water,
Research and Development, and Prevention, Pesticides, and Toxic Substances in conjunction
with a vast interagency effort.
Congressional concern for improving QA programs at Federal agencies is also a key issue for
EPA. Ten years ago, EPA actively responded to criticism of its peer review process and solicited
input on the conduct of peer review for scientific and other types of EPA products. As a result of
this effort, the number of products undergoing peer review increased from 112 in 1995 to 895 in
2002, of which 450 underwent external peer review, 225 underwent journal peer review for
publication, and 75 underwent internal review. An EPA Science Advisory Board member
recently testified positively to the House of Representatives on the status and improvements of
the EPA peer review program, noting that little more needs to be done.
EPA Region 4 Administrator Keynote Address
The Regional Administrator for EPA Region 4, Mr. Jimmy Palmer, presented the regional
perspective of the science assets and future scientific needs. EPA Region 4 is a co-sponsor of
this Science Forum, and is a lead EPA Region for research and information. Many of the day-to-
day regulatory and compliance issues faced in Region 4 cross all of the topic areas addressed in
this Science Forum.
Mr. Palmer presented a number of examples illustrating the role and use of science in regulatory
decisionmaking, how scientific understanding changes over time, the associated implications in
regulation, and the importance of science in addressing public concerns. Across all these
examples are three common elements: the science, data availability, and the public. These
examples included: (1) the need for legislative change regarding models used for stream flow
(that controlled the ability of farmers to remove water to irrigate their fields) given existing
drought conditions no longer met the assumptions of the model (i.e., presence of a stable
hydrogeologic regime); (2) the provision of multiple, conflicting scientific opinions to residents
near a large polychlorinated biphenyl (PCB)-contaminated area resulting in confusion and
concern; and (3) the challenges in communicating to distraught families and a concerned public
valid scientific information that did not support their assumptions regarding an environmental
cause for an asthma attack leading to a child's death. The latter two examples demonstrate the
public's desire to hear and give credence to scientific understanding, yet that trust is at risk from
others with their own agendas who may not be as dedicated to the integrity of the science.
Examples of emerging technologies within Region 4 included: (1) the use of phytoremediation
(trees and bulrushes) to clean up contaminated groundwater and trap heavy metals from surface
water at the Savannah River Site; (2) following up on a contact via a defense contractor to become
involved in cutting-edge technology development for measurement of biological agents in
surface water as an early warning system for contamination at Oak Ridge National Laboratory;
(3) successful collection and transfer of landfill gas to an automobile manufacturing plant for
use; and (4) the promise of a plasma technology developed by the Georgia Institute of
Technology and a private partner to convert municipal solid waste to glass, which can be ground
and reused for paving material, for building material, and to entrain pollutants including
radionuclides.
Mr. Palmer noted that the discussions at this Science Forum will address both pure and applied
science. Mr. Palmer also announced a new initiative from Administrator Whitman to begin a 45-
day self-assessment (beginning with EPA Region 4) to identify how to better marshal the
Agency's scientific assets to meet the practical needs at the Regional level and to identify the
research needs at the Regional level.
White House Council on Environmental Quality Keynote Address
Chairman of the White House Council on Environmental Quality, Mr. James Connaughton,
discussed overarching science needs and issues facing Federal agencies as well as the United
States government. The past 30 years have seen great advancements in risk assessment and in
understanding underlying scientific principles such as uptake and how systems interact. The
next 30 years will build on this foundation and associated tools with a key question being how to
fit this understanding into a policy framework. Historically, advisory and policymaking
personnel often waited to hear the scientific message and were less involved in moving the
science forward; with the current need for faster decisionmaking, risk communication is now
very important. Eight examples illustrated the role of science, risk assessment, and risk
communication in environmental policymaking and decisionmaking.
First, forest management decisions made over 100 years ago, based on the science of the time,
resulted in a build up of fuel that now causes very intense fires. This illustrates the need to
revisit policy decisions as scientific understanding changes over time. Challenges faced in the
current healthy forests initiative include understanding the truly natural condition, how to
effectively return to this condition, and the need to interface with scientific input, the Federal
sector with forest management responsibilities, and the contractors who conduct the actual work.
This combines understanding of risk assessment, practical field knowledge, and accomplishing
projects with scientific basis and merit.
Second, improved understanding of the environmental condition has resulted in global interest
in research into technologies to better support sustainable development, yet the areas of desired
emphasis vary. For example, the fundamental needs of developing countries are for energy,
agriculture, and water, with expectation of assistance in these areas from the developed
countries. The world's first zero pollution, zero greenhouse gas emission, coal-fired power
plant will be built in the United States in the next 10 years and this construction will coincide
with the next round of investment in developing countries; this represents a future opportunity to
transition such a technology. Also, renewed interest in fusion as an energy source is a result of
computer advances enabling improved understanding of the risks in development to support
better decisionmaking on the timeframe for fusion development.
Third, a very rich debate is currently underway worldwide regarding biotechnology applications
for future agricultural practices. Many of these discussions occur at high government levels and
with superficial assertions about the science. Future discussions must stay centered in the
scientific enterprise, since the benefits to the starving and the risks of introduction cannot be
ignored. There is a need to explore ways to make such products available and to develop the
framework for other countries to bring this technology forward with a sound basis in science as
well as helping to ensure that the regulatory capacity of these countries is sufficient to offset the
risks of biotechnology introduction.
Fourth, the recent Kyoto Water Forum addressed the goal of providing substantially safer water,
involving both low and high technology and infrastructure perspectives as well as considering
both classic water infrastructure and new technology perspectives. Traditionally, the focus is on
large infrastructure investments, yet science must provide the full array of possibilities to
decisionmakers to meet diverse needs worldwide.
Other examples included: (1) the need to build in-country scientific capacity to support
sustainable development and to advance environmental protection and better human health; (2)
improving integration of global observation systems and ground-truthing data to support
sustainable development, global climate change, and long-term planning; (3) the 50-year
commitment required for global climate change research to help with long-term planning and to
deliver better information to decisionmakers; and (4) the need to continue to improve the
effectiveness of marine protection (e.g., coral reefs), advance scientific understanding to expand
management options and to assess whether the management initiatives are accomplishing the
goals, and improve the effectiveness of market-based initiatives such as fish quotas. Expanding
the scientific understanding of the management- and market-based techniques and their role in
healthier fish stocks, ecosystems, and coral reefs, will support informed policymaking in both
developed and developing countries.
A question and answer session with the speaker and audience addressed the following topics: (1)
the importance of understanding both the cultural and scientific aspects to support successful
introduction of a new initiative; (2) the change in emphasis at the Kyoto Water Forum from large
to small systems as well as an exchange of financing ideas, such as the use of a revolving fund
that is prevalent in the United States but not in the rest of the world; (3) the status of
decisionmaking on an appeal from the State of Oklahoma regarding a request for
assistance from the Council on Environmental Quality to address a complex Superfund site; and
(4) the increased understanding of relative risks to improve funding decisions (i.e., help focus
funding on highest risk scenarios) as an outgrowth of recent homeland security initiatives.
Science Forum Overview
Dr. Kevin Teichman, Director, Office of Science Policy, presented highlights of the Science
Forum activities and presentations.
Director of the Office of Science Policy in ORD, Dr. Kevin Teichman, provided an overview of
the three-day Science Forum and the comprehensive information packet received upon
registration. Dr. Teichman noted the opportunity for participants to view more than 200 posters
on the four themes of the Science Forum (homeland security, moving science into action, year of
water, and emerging technologies), and more than 16 exhibits located in the conference area and
outside the building. The afternoon sessions involve a series of plenary talks on each of the four
themes featuring speakers from outside the EPA. In the late afternoon poster session, the
Science Advisory Board will recognize the recipients of the Level 1 Science Achievement
Awards. The second and third days of the Science Forum will involve breakout sessions on each
of the four themes.
Dr. Teichman noted that of the 1,100 registrants, approximately 70 percent were from EPA, and
thanked the EPA co-sponsors of this event—EPA Region 4 (moving science into action),
OSWER (homeland security), the Office of Water (year of water), and ORD (emerging
technologies).
Plenary Addresses
Following introductory remarks by the Associate Assistant Administrator for OSWER, Mr.
Thomas Dunne, five speakers addressed each of the four Science Forum theme areas. On the
second day of the meeting, the plenary session continued with two additional speakers on the
newly-formed EPA National Homeland Security Research Center and the EPA Office of
Homeland Security, with speaker introductions provided by Dr. Teichman.
Associate Assistant Administrator for OSWER, Mr. Thomas Dunne, introduced the plenary
session for the four Science Forum theme areas. The last 18 months have seen a growing change
in the EPA regarding the types of emergencies addressed and the growing responsibilities in the
area of homeland security. Since the terrorist attacks of September 11, 2001, OSWER has
responded to emergencies that were very different from those of the past; examples included the
collapse of the World Trade Center (WTC), anthrax in the Senate office building, searches for
pieces of the Space Shuttle Columbia, and the recent "tractor man" incident in Washington, DC.
These represent significant changes in EPA's involvement with emergency response.
Homeland Security
Director for the Biological and Chemical Countermeasures Portfolio, Dr. John Vitko, provided
an overview of the DHS, described the identification of needs in the biological area, and
discussed current activities and research initiatives. Federal legislation created the DHS in 2002
with three mission elements: (1) to prevent terrorist attacks in the United States, (2) to reduce
vulnerability to terrorist attacks at home, and (3) to minimize damage and assist in recovery. The
DHS organization consists of four Directorates: Border and Transportation Security,
Information Analysis and Infrastructure Protection, Emergency Preparedness and Response, and
Science and Technology (to fill new needs). Creation of the DHS merged all or part of 22
government departments and also involves over 80 external agencies. Therefore, coordination
and partnering with other Federal, state, and local government organizations is critical.
The DHS Science and Technology Directorate provides for a full range of research and
development including testing, acquisition, and implementation with a focus on responsibility for
deployment and acquisition. The internal organization reflects an emphasis on research and
applications, including both intramural and extramural research.
A unique organizational feature is the emphasis on portfolios, which integrate across all
organizational lines, e.g., biological, radiological/nuclear, and information. The Portfolio
Manager formulates the overall vision, sets priorities, and "contracts" with "agents" to manage
and execute the work to meet specific objectives and priorities. The Portfolio Manager
integrates feedback from interagency working groups, strategic direction from the Homeland
Security Council, and the broad science and technology community vision to produce intramural
and extramural research, development, testing, evaluation, and systems studies. The Portfolio
Manager delegates broad responsibilities and funds for producing the desired end product.
An example is the Biological Countermeasures Portfolio with a mission "to deter, detect, and
mitigate possible biological attacks on this nation's population, infrastructure, or agriculture."
This portfolio is broad ranging with biological threats involving low, medium, and high
sophistication, and must work in an integrated fashion from intelligence through response. An
early effort to assess potential catastrophic events and their consequences served to focus current
activities on several specific scenarios with selected planning cases to be used to guide and
measure initial activities. Efforts are also underway to construct a "report card" to assess or
guide an integrated, end-to-end response; this helps to ascertain current status, mid-range goals
(in three years), long-range goals (five to seven years), and progress towards achieving these
goals. This provides for a top-down, systems-driven picture of the current status and how both
threat and technology/response are being addressed. This will also help to focus attention on
areas that may prove more difficult to accomplish than originally thought.
Biological attacks differ from all other kinds of attacks and natural disasters, which have
immediate and obvious effects. Biological attack can result in exposure over time or may require
time after exposure for the biological agent to act. This points to the need to be able to intervene
before a biological agent gets too much of a hold in the body. Current efforts in this area are
being performed in conjunction with EPA for safe buildings, including detection, air flow
control, and evacuation. More challenging to address is a broad city exposure event because
response and consequences will be different than for a single building.
Integrated bio-surveillance and environmental monitoring are key to dealing with a broad range
of biological attacks. A biowarning system helps to identify the range, how many are exposed,
and how broad an area is involved. Environmental monitoring looks for exposure of people or
animals and, once the biological agent is detected, wide area monitoring can be implemented. A
nested urban environmental monitoring system is also being pursued that will include wide area
monitoring (detect-to-treat), facility monitoring (detect-to-warn), and critical support
technologies.
Collaborations occur at multiple levels in accomplishing the DHS mission. At the highest level,
DHS is charged to work with other agencies to develop national policy and a strategic plan.
DHS set up an interagency committee to accomplish this, including milestones to measure
progress. Other interagency working groups are addressing food and chemical analysis
laboratories, and EPA is involved in these. In addition, DHS is developing agency-to-agency
Memoranda of Understanding (MOUs) for individual collaboration, for example in developing
detection and decontamination technologies. DHS will be issuing two major "calls for
proposals" in the next few months, which represent additional opportunities for direct partnering
and collaboration.
Examples of DHS partnering with EPA at the program-to-program and researcher-to-researcher
level include:
• Critical partner in "BioWatch" air monitoring systems for major cities and metropolitan areas
• Programmatic coordination with the EPA Homeland Security Research Center addressing
safe buildings, water security, and rapid risk assessment
• Programmatic coordination with the EPA Environmental Technology Verification (ETV)
Program on testing and standards
• Diverse research interactions.
Moving Science Into Action
Secretary of the North Carolina DENR, Mr. William Ross, Jr., discussed the Federal/state
partnership and how the relationship between North Carolina and EPA is moving science
forward in ways that contribute to the environment and human health. Three examples
illustrated the Federal/state partnership and demonstrated several themes including the power of
science, the power of partnerships, and the power of leadership. Specifically:
• Science provides a better understanding of the life support systems upon which we depend,
how we exceed the ability of ecosystems to absorb what we give them, and how things
impact our health and well-being
• Partnerships join spheres of influence and provide the ability to draw on resources to
accomplish more than one might do alone
• Talented leaders step up to a challenging situation, even though they are not required to do
so, and seize the opportunity to move a situation forward.
The first example involved increasing air pollution problems in the Southern Appalachian
Mountains ranging from Alabama to West Virginia. Several states, Federal agencies, Federal
land managers, private citizens, universities, and other interested parties formed the Southern
Appalachian Mountains Initiative to gain better understanding of the air quality in the
Southeastern United States. This initiative was voluntary, consensus-driven, and led by the eight
states involved. To understand the air quality, an advanced, integrated assessment model was
developed and showed that the greatest benefit of NOx and SOx reductions would be in the
regions where those were emitted. These results fed into an ongoing debate in North Carolina
about whether coal-fired plant emissions should be reduced. While the study itself was not
focused on health effects, the modeling helped to achieve better understanding of health effects
information coming from other sources. The model results helped to demonstrate that there was
a clear need for action and the question then became how to pay for these reductions. North
Carolina Governor Easley was able to bring together the utilities and other private entities to
accept a rate freeze and accelerated amortization.
The second example involved improvements in wetlands compensatory mitigation for highway
construction projects. Various sources indicated that compensatory mitigation was not providing
adequate offsets to impacts of road construction on wetlands and streams. The North Carolina
Department of Transportation noted that performing such mitigations was also a major cause of
transportation project delay and expense. Thus, the existing compensatory mitigation process
was not delivering satisfactorily on either the ecologic or process sides. Therefore, the North
Carolina Department of Transportation, the North Carolina DENR, the United States Army
Corps of Engineers, and various Federal and other agencies involved in permitting highways
examined how to improve the process and the benefits. This initiative drew on many programs
and information sources about the existing natural environment, and found ways to focus on the
functions of a watershed and how to protect those functions by first identifying impacts of the
construction then determining how to change stream/wetlands values or functions to compensate
for those impacts.
The third example addressed the challenge of maintaining both environmental protection and the
military mission at Fort Bragg, which faced limitations on important resources due to a drought
affecting the Little River, air-related issues, and an endangered red-cockaded woodpecker. The
solution was to find ways to make both the military base and its surrounding region sustainable.
An Army officer brought together the North Carolina DENR, Fort Bragg representatives, and
other agencies to examine specific issues and to determine how to achieve the necessary
sustainability. This process is ongoing, but the enthusiasm of all parties to try to come up with
the answers is promising.
Director of the Haudenosaunee Environmental Task Force, Mr. James Ransom, addressed how
cultural issues affect science and provided an understanding of the role of traditional knowledge
in conjunction with Western science in problem solving. Mr. Ransom focused on three things:
use of traditional teachings as a guide to bettering relationships between tribes and EPA,
traditional knowledge as a science, and initiatives to create a health and well-being model as a
tribal alternative to EPA risk assessment.
The Haudenosaunee Environmental Task Force was created by the traditional governments, but
does not work for a specific tribal government. This Task Force provides an opportunity to
create environmental programs based on tribal teachings. The principles that underlie tribal
teachings on how to live in harmony with the natural world, which contributed to Native
American survival for thousands of years, may apply today. To illustrate this point, Mr. Ransom
presented and described an historic wampum teaching belt that incorporates the concepts of the
need for good communication, mutual co-operation, and positive contributions to form good
relationships. Several examples illustrated how the concepts in this wampum teaching belt
provide an analogy for environmental work as well as for relationships between tribes and the
Federal government.
The EPA Indian policy was first put in place in 1984 for administration of environmental
programs on Indian Reservations. EPA was the first Federal agency to have such a policy,
before the Indian Health Service and the Bureau of Indian Affairs. The policy focused on
working with tribes as representatives of another government, recognized that tribal governments
have the authority to set their own programs, and indicated the willingness to help the tribal
government develop their programs.
Culture encompasses government, language, lifestyles, and knowledge systems. Attendees of
this Science Forum are very familiar with Western science, but may be less familiar with
traditional knowledge. When any people live in a particular area for a long time, they gain a
knowledge of the interactions of all the parts of the natural area. Traditional knowledge is the
collective knowledge of a people and is transferred from one generation to the next. The Native
Americans consider this a science and the information in this collective knowledge dates back
over a thousand years.
Western and traditional knowledge are different but both have value. Traditional knowledge is
more holistic (e.g., interest in the big picture), spiritual, and qualitative, while Western
knowledge is more analytical, tends to look at small pieces, emphasizes the physical world, and
is more quantitative. In addition, traditional knowledge has an ecosystem approach (e.g.,
interactions of all the parts), while Western knowledge tends to break the ecosystem down into
air, water, and other components.
Experience has shown that combining the two knowledge systems results in a powerful problem-
solving tool. One example involved a disease outbreak in the Navajo Nation in the early 1990s.
Navajo traditional knowledge of precipitation events and their impact on the environment and
wildlife enabled the Centers for Disease Control and Prevention (CDC) to identify the
hantavirus as the health problem. Both sides had important knowledge, yet neither alone had the
solution; joining this knowledge enabled the problem to be solved.
The EPA Tribal Science Council is developing a health and well-being concept. The EPA risk
assessment process often fails to consider the relationship of humans and the natural world,
cannot measure the spiritual/mental connection between us and the rest of creation, and does not
consider the values tribes place on the relationship with the natural world nor the cultural ties or
the cost of avoidance in terms of these relationships. Finally, the EPA model is based on the
risk of exposure to toxic chemicals, which measures illness and death. From the Native
American perspective, the model should instead focus on the health and well-being concept: we
cannot be healthy if the environment is not. Therefore, it is necessary to identify indicators,
whether they relate to health or to the need to make the environment healthy. An indicator can be as
simple as the number of picnics held as a community. This is a concept that is just starting to
gain momentum.
Year of Water
Marine Biologist and Explorer-in-Residence with the National Geographic Society, Dr. Sylvia
Earle, discussed the importance of scientific exploration and its role in understanding water, the
environment, and human impacts. The National Marine Sanctuaries Act came into being around
the same time as the Clean Water Act (CWA) and the Endangered Species Act among other
environmental legislation, and this series of legislation changed both this country and how we
perceive ourselves.
Dr. Earle recently viewed a videotape of explorers of underground rivers in Florida. This
provides a totally different perspective about groundwater and what influences such places. The
videotape also showed how the condition of the water in sinkholes has changed since she last
explored them. Such exploration helps to engage agencies in the understanding of water and
how it is recycled again and again in the environment. Thus, what we drink may have been
contaminated somehow, somewhere, and in some way.
While many believe that space and the oceans are the last great unexplored regions, the
conclusion of all of the National Geographic Society Explorers-in-Residence is that we are on
the edge of the greatest exploration yet. Emerging technologies are providing us the ability to
sense the world around us in new ways, and are enabling us to scientifically explore oceans,
rivers, etc. In addition, we have learned more in the past 25 years than at any other time in
history, while at the same time we may also have lost a great deal through our own actions on
the environment, despite the many progressive actions taken and our growing awareness of
ourselves as part of the natural system. One example of such loss, illustrated in a video clip,
is the damage to the wildlife riches and ecosystem of Kuwait from the Iraqi attacks and oil field
fires of the 1991 Gulf War: one of the largest ecological disasters of all time, and a deliberate
attack on the ecology that affected whether the air was safe to breathe, the water safe to drink,
and the fish safe to eat. Another example involved Iraqi actions to
control the "marsh people" of southern Iraq by reducing the flow of the Euphrates River to
destroy the marshes, which can severely impact migratory waterfowl.
Water is becoming a major international issue. Turkey is considering the diversion of some of
the water from the Euphrates River upstream of the areas already impacted by Iraqi actions.
There have also been many water wars. Water sells for more per ounce than oil in the United
States and in the rest of the world.
The amount of water in the world is finite. While most is in the oceans, the issue is availability
of water and the ability to access it. We have a water-based planet and our life depends on this.
In years past, water was readily available and drinkable; this is no longer the case, and the
situation is changing rapidly. Water distribution and contamination are changing dramatically in
our lifetime, and therefore sound, safe water is what so many are working on.
A 1991 trip to Kuwait also demonstrated the relationships between human condition and
environmental sustainability. The need to restore the Kuwaiti economy involved questions of
food, air, water, and places to live: environment and security. If there is not much to hope for,
then desperate actions may occur. Thus, maintaining the integrity of our environment, having a
place to live, and having food to eat leads to the need to sustain our systems and ourselves. This
is a long-term view at a time when gains are increasingly measured on a short-term horizon. It
points to the need to embrace and protect our cultural, historic, and natural heritage. An example
illustrating this long-term view is the action taken by President Teddy Roosevelt to preserve
special areas within the United States in the wake of the massive forest destruction typical of that era.
From an international perspective, all waters eventually connect with one another. The United
States has enjoyed a leadership role in this area, yet there are limits to what we can continue to
do. This includes taking care of environmental areas for the future as well as new policies for
protection and enforcement. Scientific exploration is helping in this area. For example, deep
ocean submersibles enable scientists and teachers to go into the ocean environment and observe
the dead zones that show the consequences of our actions.
Key questions are what the world environment will be like in 30 years and what the
consequences of today's actions will be. We have the power to make a difference. Too many
decisions are terminal and it is not possible to get back what is lost. Key to this is the
understanding that everything is connected and that the knowledge gained from new
technologies can make a profound difference for all that follow. A film clip showed how unique
conditions came together to form life, how all things are connected, and how anything we do
affects our home and ourselves.
Emerging Technologies
Director of the Foresight and Governance Project with the Woodrow Wilson International Center
for Scholars, Mr. David Rejeski, discussed the current technology revolution and anticipated
future directions. The first industrial revolution began around 1850 to 1860 and was all about oil,
coal, and the internal combustion engine. EPA, created in 1970, focused on cleaning up over
100 years of environmental pollution from this industrial revolution and getting the continued
pollution under control. The second industrial revolution began around 1975 to 1980, and involves
information and biotechnology.
EPA is currently straddling two different industrial revolutions, and the legal methods used to
address the first industrial revolution may not be the right tool to address the issues of the
second. In addition, the first 30 years of EPA's existence primarily addressed the by-products of
production. By 1990, EPA began to switch its emphasis to the products of production (green
design, etc.). The next great challenge to EPA is to address production itself.
The current industrial revolution involves how and where things are made, and will involve great
changes such as creation of biological chips instead of silicon chips, making products via
transgenic animals, and the creation and use of nanomaterials. How such new materials will
behave in a landfill is not yet understood. In addition, manufacturing is becoming a mobile,
nonpoint source; so, new approaches are necessary since control of such sources with traditional
methods may not be possible. This revolution is also about whether things are made (e.g.,
download onto a CD rather than manufacture) or whether things make themselves via
autonomous, computer-based evolutionary design. Since the code inside the computer now
matters (the computer is making decisions), the question becomes what is EPA's role in a virtual
world?
EPA is an adaptive agency and must become a shaping agency in the second industrial
revolution. Changes in science and technology necessitate changes in the Agency's approaches.
The first industrial revolution involved atoms, sharp boundaries, incremental change, and the
science of discovery. The second industrial revolution for the near future involves atoms and
bits (e.g., digital and physical converge), interconnectivity (fluid, mobile), exponential change,
and the science of disruption.
Speed and convergence are new challenges with everything moving much faster. Examples
included the doubling of the logic density of silicon integrated circuits every 18 months and the
halving of the cost of sequencing deoxyribonucleic acid (DNA) base pairs every 27 months.
Different organizations have different clock speeds, with government organizations typically
requiring longer reaction time to change than private industry. To overcome this, government
agencies may have to fund external entities. One example is the CIA, which could not develop
information technology internally at a sufficiently fast pace and overcame this by creating an
internal venture arm that funded external entities to meet CIA needs.
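The scale of these clock-speed differences is easy to quantify. The sketch below is a hypothetical illustration, using only the doubling and halving periods quoted in the talk, of how large such exponential changes become over a decade:

```python
def growth_factor(years: float, doubling_period_years: float) -> float:
    """Factor by which a quantity grows if it doubles every doubling_period_years."""
    return 2 ** (years / doubling_period_years)

# Logic density of silicon integrated circuits: doubles every 18 months.
density_gain = growth_factor(10, 18 / 12)

# Cost of sequencing DNA base pairs: halves every 27 months, so over the
# same span the cost falls by the accumulated halving factor.
cost_drop = growth_factor(10, 27 / 12)

print(f"After 10 years, logic density grows roughly {density_gain:.0f}x")
print(f"After 10 years, sequencing cost falls to roughly 1/{cost_drop:.0f} of its start")
```

Even modest differences in doubling period compound into order-of-magnitude gaps within a decade, which is why organizations with slow reaction times fall so far behind the technologies they are meant to oversee.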
Early warning becomes critical in a fast-paced world because there is less time to intervene and
reverse damages at reasonable cost. Therefore, it is necessary to spend time and energy in
developing the science of early warning for ecosystems, etc.
The sciences are converging—bio-info-nano-cogno. Information moves across all these areas
with much potential for societal/social impacts. Fluidity in the work force is necessary to move
across these areas. There is much talk about the need for multidisciplinary personnel; but the
workforce rewards are still focused on staying within one's discipline.
Addressing this second industrial revolution requires a change in thinking and stresses the
importance of what is going on all around, not just in a specific field. Without such "peripheral
vision" there is no early warning of changes, the loss of context results in unintended
consequences, and the response to the changes can be shock, surprise, and the inability to act.
There are a number of different peripheries to consider in this industrial revolution, including
geographic, idea (accepted knowledge versus new concepts), temporal (thinking several years
forward), and intellectual (different styles of thinking). Six ways to destroy the peripheral vision
include leadership failure, "not invented here" attitude, goal obsession, adversarial relationships,
workforce monoculture with no variety in thinking, and impermeable boundaries. Of great
importance is the need to shape the public dialog regarding the ideas from the "peripheral vision"
around science and technology.
Effectively addressing the speed of change means moving from learning about consequences
"after the fact" to "learning before doing" (e.g., designing molecules in a computer, modeling air
issues). This must operate in the science and technology research area, since research and
development and product/process design are the "learning before doing" scenario. In slow
learning/adaptation, environmental impacts are an unintended consequence of technology
development and deployment with regulation applied to reduce the impacts. In fast
learning/shaping, the environment is co-optimized as part of technology development and
deployment, or may be the primary goal. This is a change that puts researchers, not regulators, in
the driver's seat.
Another adaptive scenario is the "leap." Future thinking tends to be based on the present and the
past, and such cognitive processes may interfere in the development of breakthrough
technologies. The National Aeronautics and Space Administration (NASA) applied a different
"future thinking" approach that considered how to achieve goals unattainable with today's
knowledge/technology to help lead to breakthrough ideas and technology. Institutionalizing the
ability to take incredible leaps in thinking and to consider previously unattainable capabilities
and performance enables extrapolation to achieve breakthrough technologies that result in new
instruments, tools, and techniques.
To address this second industrial revolution, new organizational behaviors are needed: continual
situation awareness (peripheral vision), capacity to recognize emergent systems and
opportunities, high organizational clock speed, and breakthrough thinking. Accomplishing this
requires new budget priorities, such as establishing and funding an environmental-legal-social
implications program as part of the total environmental research budget, focusing 40 to 50
percent of environmental research on shaping the emerging technological infrastructure
(a change from mission support to mission control), and establishing venture funds (similar to
the CIA approach) to support high-risk, high-value, game-changing technologies.
National Homeland Security Research Center and Office of Homeland Security
EPA Deputy Administrator, Ms. Linda Fisher, discussed the newly created NHSRC and its role
in supporting EPA responsibilities for homeland security. Ms. Fisher noted the importance of
enhancing the Agency's science and scientists, and how this forum provides an opportunity to
see how science supports Agency policies and to understand the underlying science.
EPA involvement in homeland security predates the terrorist attacks of September 11, 2001, but
EPA's role became more obvious in the post-attack support provided to New York City and the
Pentagon as well as in EPA and CDC teaming together to clean up the anthrax in the Hart
Building. This is a tribute to the creativity of the team to determine how to address the problem
and to obtain acceptance of the building's occupants (Senators and their staff) for the proposed
approach, resulting in a building that is open again to the public.
EPA has two primary areas of responsibility for homeland security: protection of the nation's
water infrastructure and serving as the lead agency for cleanup of a chemical attack (with a
support role for a radiological attack). For water infrastructure protection, EPA is working with
municipal drinking water facilities around the country, has committed $100 million to assist
them in performing vulnerability assessments to identify weaknesses, and is supporting efforts
currently underway to address those weaknesses.
To support these efforts, EPA established the NHSRC in Cincinnati, Ohio, with a budget of $50
million and 80 scientists nationwide. NHSRC personnel work side-by-side with Office of Water,
ORD, OSWER, and Office of Prevention, Pesticides, and Toxic Substances (OPPTS) personnel
also involved with homeland security. NHSRC is a temporary organization with a three-year
commitment to focus on homeland security needs, drawing on EPA's historic research role as
well as cutting-edge science.
EPA developed a research plan in conjunction with its Program Offices and other Federal
agencies whose missions EPA shares or complements, including the Department of Energy
(DOE), the Department of Defense (DOD), and CDC. This research plan includes three major
research program areas: safe buildings, water security, and rapid risk assessment. The safe
buildings component focuses on protecting buildings and their occupants as well as cleanup of
building contamination with the goal of communicating information on how to protect the
occupants/buildings and what to do in the event of contamination. The water security
component is identifying contaminants in drinking water to improve drinking water monitoring
and analytical methods, including the development of contingency planning to support water
suppliers in providing alternative water supplies. The rapid risk assessment component is
developing risk assessment techniques to provide fast answers for first responders and
policymakers; this includes the identification and prioritization of different scenarios that might
be faced, improving understanding of short-term (acute) and long-term (lifetime) exposure, and
examining risks of attacks to understand the types of exposures and what they may mean.
A key question from the public following the September 11th attacks was what effect the
contaminants may have for exposures lasting several weeks. This points to the importance of
being able to explain after an attack what the exposures and risks are. EPA research will identify
current knowledge and gaps in knowledge, then bring together the scientific community to
address these gaps to identify scientifically sound actions to protect the American public in the
event of another attack.
In addition, the EPA Administrator created an Office of Homeland Security as a focal point to
develop policy within the Agency and to serve as the primary liaison between EPA, the new
DHS, and other Federal agencies that are working together on these issues. Ms. Fisher
introduced Ms. Mary Kruger, Director of the EPA Office of Homeland Security, who briefly
discussed the role and mission of this new Office. Emergency response and homeland security
include cross-media issues, interagency and intra-agency interactions, and cutting-edge science
involving multitudes of stakeholders. After the September 11th attacks, each EPA Program Office
was involved with homeland security issues as well as interacting across the government in this
area. The EPA Administrator formed a working group, which evolved into the new Office of
Homeland Security, operating since February 2003 with a staff of five to six persons. Activities
include reviewing the EPA strategic plan on homeland security, assessing whether activities are
on the right track, and helping the Program Offices address both the ongoing programs and their
additional homeland security activities.
Closing Remarks
Concluding the plenary sessions on the second meeting day, Dr. Kevin Teichman noted that
while this is ORD's second annual Science Forum, this is the first one to include co-sponsors
within EPA, specifically the Office of Water, OSWER, and EPA Region 4. Dr. Teichman
thanked the partner organizations as well as specific ORD personnel for their efforts in
successfully planning, organizing, and conducting this major event.
Section III: Homeland Security
Tuesday and Wednesday, May 6-7, 2003
The purpose of this breakout session on the second and third days of the meeting was to focus on
homeland security as it applies to response to past scenarios (e.g., anthrax and the WTC), lessons
learned from those events, the building of partnerships among the various agencies, and research
and other initiatives to prepare for future threats. Each session included a panel discussion or
opportunities to respond to audience questions that provided additional information and insight
on a variety of homeland security topics.
Dr. Lee Hofmann, with OSWER, led a session addressing the responses to the anthrax attacks
and the research that is being done to develop better response systems. Presentations included
descriptions of the remediation activities at the Hart Building and other anthrax contamination
sites, an overview of the EPA Homeland Security Research Program, and an evaluation of the re-
aerosolization capability of anthrax spores.
Dr. Hofmann also led a session addressing the detection, sampling, and analysis of anthrax
involving presentations on the procedures for collecting and analyzing bio-aerosol samples and
the anthrax sampling procedures used at the Hart Building.
Ms. Anna Treinies, with OSWER, led a session addressing the fumigation and re-occupancy of
buildings contaminated with anthrax. Presentations included a description of the procedures for
anthrax decontamination and issues in the determination of building safety for re-occupancy.
Mr. Marty Powell, with EPA Region 3, led a session addressing anthrax decontamination
technologies. Presentations included the crisis exemption evaluations for anthrax
decontamination chemicals, laboratory evaluation of chemicals for use in the treatment and
decontamination of anthrax-contaminated materials, and a research program to further develop
efficacy testing.
Mr. Craig Mattheson, with the EPA Chemical Emergency Preparedness and Prevention Office,
led a session addressing the partnerships that are being built to support homeland security and
terrorism response. Presentations included security plans and communication tools being
developed by private industry, emergency response lessons learned for water supplier and
wastewater systems, threat reduction actions taken by a water utility, risk assessment for
vulnerability assessment methodologies, highlights of the EPA Water Protection Task Force
activities, and an overview of the EPA Safe Buildings Program.
Mr. Thomas Coda, with the Office of Air Quality Planning and Standards (OAQPS), led a
session focused on the development and nationwide implementation of the Bio-Watch Early
Detection System.
Mr. Larry Reed, with the National Institute of Environmental Health Sciences (NIEHS), led a
session addressing lessons learned from the WTC response with respect to exposure and personal
protection. Presentations included challenges encountered with interagency collaboration,
descriptions of the health and safety issues encountered at the WTC site, the physical and
chemical challenges encountered in the response, health effects for cleanup and recovery
workers, and an overview of the World Trade Center Assessment Report addressing exposures
and health effects for the exposed public and recovery workers.
Dr. Jafrul Hasan, with the Office of Science and Technology in the Office of Water, led a session
addressing activities underway to prepare for bioterrorism threats in water. Presentations
included a perspective on the wide-ranging impacts of a biological attack, an overview of the
EPA Water Security Research and Technical Support Program, technologies potentially
applicable for detection of biological threats in water, development of "early warning
monitoring" and sensor technology, and highlights of an initiative to develop a protocol for
detection of biological agents.
Anthrax: Response and Research
Following opening remarks by Dr. Lee Hofmann, with OSWER, three speakers addressed the
remediation activities at the Hart Building and other anthrax contamination sites, the EPA
homeland security research program, and evaluations of the re-aerosolization capability of
anthrax spores.
Anthrax Response and Recovery: Applied Science and Technology, and Future
Needs
Deputy Regional Administrator of EPA Region 3, Mr. Thomas Voltaggio, discussed the
response actions taken at the Hart Building immediately following the anthrax contamination,
the reasoning behind those actions, the cleanup methods, and the results of those efforts. The
anthrax contamination in the Daschle suite is the first large-scale bioattack ever to occur in the
United States, and was addressed through a multi-agency response effort aided by the support of
EPA, CDC, the National Institute for Occupational Safety and Health (NIOSH), and the United
States Capitol Police.
An unprecedented level of effort was required to clear the entire Capitol Hill Campus, which
consisted of 30 to 40 buildings and 15,000 employees. The initial role of the EPA was to aid in
the sampling efforts. The sampling strategy was to sample the areas of the initial anthrax hits
(the Daschle suite), then to follow the trail of the mail to determine how the anthrax spread from
the location where the letter was initially opened. The owners of the building subsequently put
EPA in charge of the cleanup efforts.
The cleanup was divided into two sites: the Daschle suite (where the letter was opened) and
cross-contamination sites with efforts focused on the mail room. A command post was set up in
the Botanical Gardens, where the response was organized. Teams from Health & Safety,
Sampling & Analysis, Contracts & Resources, and Disposal were onsite to aid in the
organizational efforts. Chlorine dioxide liquid was selected for spot cleaning lightly
contaminated areas. However, a different mechanism of contamination in the Daschle suite
necessitated the use of a different remediation method. Chlorine dioxide gas was chosen as the
fumigant for the Daschle suite based on pilot testing, which indicated that the gas breaks down
rapidly (reducing residual risks), no breakdown products of concern are formed, and it has been
used safely in other areas of commerce. The initial plan to fumigate the entire building
underwent peer review, which resulted in a decision to fumigate only the Daschle suite and the
heating, ventilation, and air conditioning (HVAC) system.
Establishing a cleanup goal was difficult due to uncertainty in defining a safe level of anthrax spores. A
risk management decision was made that areas would be sampled and, if anthrax growth was
found, those areas would be spot cleaned. In addition, materials would be removed from the
building and cleaned with aqueous chlorine dioxide.
Fumigation of the Daschle suite took place on December 1 and 2, 2001. Chlorine dioxide was
mixed in an onsite generator outside of the building. The HVAC system was cleaned from the
basement of the building. The fumigation had to be done twice. A high efficiency particulate air
(HEPA) vacuum was used in the areas that were porous. Post-cleanup restoration included the
replacement of carpet and ceiling tiles. A room-by-room review was conducted to evaluate the
cleanup efforts, and the efficacy of the treatments was tested by sampling for growths and by
using thousands of spore strips in the Daschle suite. Aggressive air, swab, and wipe sampling
was conducted prior to reoccupation and, upon clearance by the Assistant Physician for the
Capitol, the Hart Building was reopened on January 22, 2002.
There were many lessons learned from this cleanup effort. There were command structure issues
in that there was no model for legislative branch roles as well as uncertainty as to who was in
charge, considering it was really a police incident. The project had challenging schedule
demands, and credibility is lost when such schedules are not kept. The stretch on resources
concerning the unique health and safety issues of this undertaking demonstrated the need for
improvements in contracting support. In addition, early coordination is critical for disposal of
such unique cleanup wastes.
EPA's Homeland Security Research Program
Director of the NHSRC, Mr. Timothy Oppelt, discussed the Homeland Security Research
Program's foundation and current activities. The NHSRC opened in October 2002 and, due to
the great sense of urgency to address this type of work, was established drawing from existing
staff from EPA research organizations in order to be operational as soon as possible with staff
experienced in such areas as indoor air pollution, site remediation, analytical methods, and water
supply.
The research program consists of three components: the protection of water systems, the
protection of buildings, and rapid risk assessment in the aftermath of events. The research
program goal is a three-year process to explore ways to provide methods and guidance for
preparedness, detection, containment, and decontamination of facilities as well as an
understanding of the risks with an emphasis on chemical and biological attacks. The scope of
the program, in terms of the hazards, involves:
• Pathogenic bacteria (whether weaponized or not)
• Viruses and bacterial toxins
• Chemical warfare agents that have been developed
• Toxic industrial chemicals produced in large volumes that could be used in attacks (e.g.,
chlorine, anhydrous ammonia, etc.)
• Toxins that could be used in attacks on water systems
• Radiological contamination in drinking water.
The success of the research program depends on partnerships between NHSRC, other EPA and
government organizations, and the private sector. Key NHSRC internal collaborations include
the Office of Water, Water Protection Task Force, OSWER, Office of Pesticides Program (OPP),
and ETV. External collaborations include the United States Army, specifically the Edgewood
Chemical and Biological Center; CDC; United States Air Force; DHS, DOE, and DOD. These
collaborations will focus on key knowledge gaps using a two-pronged approach: the use of
lessons learned in identifying immediate research needs, and the identification of key threat
scenarios that could result in large impacts and are technically feasible and probable. The
examination of attacks on chemical production facilities is the responsibility of EPA rather than
NHSRC at this time.
The results of screening level simulations and risk analyses currently being conducted will drive
priorities for technological needs. The results will also be used to produce final verification tools
towards the end of the program, and to provide technical guidance to the Agency and
decisionmaking officials. The seven key pieces of the research program are characterization,
detection, prevention, containment, decontamination, risk assessment, and scientific backup.
Mr. Oppelt provided examples of the research, and the questions driving that research, associated
with each of these key program areas noting that there is no emphasis on new technology
development. Instead, the primary emphasis is on the application of commercial technology.
This research is short-term, high intensity, and highly focused on user needs (i.e., first
responders, owners and operators of buildings, and water system operators).
Secondary Aerosolization of Viable Bacillus Anthracis Spores in an Office
Environment
Toxicologist and Coordinator for Homeland Security at EPA's National Enforcement
Investigation Center (NEIC), Dr. Chris Weis, discussed the role of Science Support Coordinators
(SSCs) for environmental emergencies and the practical safety and risk assessment problems
facing On-Scene Coordinators (OSCs). Dr. Weis also discussed the procedures used in the
decontamination of the Hart Building after the anthrax attack, and the rationale behind those
procedures.
An SSC is any scientist who provides onsite support, and is most often the OSC. The SSC will
coordinate science support needs and reach out to whomever is available to answer practical
questions. Examples were provided of situations where the SSC was used successfully in the
aftermath of environmental events. Rapid risk assessment as well as chemical and biological
knowledge are essential to onsite coordination efforts.
The anthrax decontamination of the Hart Building was a collaboration among scientists from
various agencies, specifically the United States Army Center for Health Promotion and
Prevention Medicine (CHPPM), EPA Region 8, EPA Region 5, and the Navy Biological Defense
Directorate. Practical questions surrounding the decontamination efforts included:
• Would the personal air tanks have to be carried with the cleanup personnel?
• Would it be possible to alternatively use air-purifying respirators (APRs)?
• What were the principal and secondary pathways of exposure in the Hart Building?
• Is it possible for the anthrax spores to re-aerosolize?
Formulation of the study design was based on decisions about what was to be measured, how it
was going to be measured, and what the endpoints of the study would be. The study design involved
stationary (surface, dust, and swab) sampling and personal air sampling under both minimal and
simulated active office activity. Nominal spore sizes and airborne concentrations were
measured. Anthrax colonies were grown and the particle diameters were measured using a
cascade impactor. In an effort to avoid a secondary extraction step, spores were pulled
directly from the desk onto gelatin filters, then transferred to sheep's blood agar. This
procedure allowed the quantification of the surface contamination.
The study revealed that viable anthrax spores do re-aerosolize under both quiet and active office
conditions. Greater than 80 percent of the spores were measured within the respirable range
(under three microns). However, no spores were measured at a diameter small enough to pass
through the APRs (smaller than about one micron). Therefore, it was determined that APRs
could be used safely in the cleanup efforts. EPA assessed the risk of inhalation of spores, and
determined that routine activity in the area of spore contamination could cause re-aerosolization.
An important aspect is to not expect a bioterrorism agent to behave according to a pre-existing
understanding or dogma.
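The size thresholds in the study summary above (respirable below roughly three microns, APR penetration below roughly one micron) can be expressed as a small calculation. This is an illustrative sketch only; the cutoff values are taken from the summary, and the function is not part of the study's actual analysis.

```python
# Illustrative sketch: given measured spore aerodynamic diameters (in
# microns), estimate the fraction in the respirable range and the
# fraction small enough to penetrate a notional APR. Thresholds are
# assumptions drawn from the study summary above.

RESPIRABLE_CUTOFF_UM = 3.0  # respirable range per the summary
APR_PASS_CUTOFF_UM = 1.0    # approximate APR penetration cutoff

def size_fractions(diameters_um):
    """Return (respirable_fraction, apr_penetrating_fraction)."""
    n = len(diameters_um)
    if n == 0:
        raise ValueError("no measurements")
    respirable = sum(1 for d in diameters_um if d < RESPIRABLE_CUTOFF_UM)
    penetrating = sum(1 for d in diameters_um if d < APR_PASS_CUTOFF_UM)
    return respirable / n, penetrating / n
```

For example, a hypothetical set of five measured diameters with four below three microns and one below one micron yields fractions of 0.8 and 0.2.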
Anthrax: Detection, Sampling, and Analysis
Following introductory comments by Dr. Hofmann, with OSWER, two speakers discussed the
procedures for sample collection and analysis of bio-aerosols. A panel discussion including an
audience question and answer period followed the presentations.
Environmental Sampling of Bio-Aerosols
Captain Kenneth Martinez, an Industrial Hygienist with NIOSH, discussed the strategies and
procedures involved with the decontamination of the Hart Building. At NIOSH, the
understanding of organisms is broken down into two categories (obligate parasites and
facultative saprophytes), and much of what is known is based on recognition, evaluation, and control.
Exposure assessment of the anthrax contaminated Hart Building was difficult because most of
the existing exposure assessment tools are designed for clinical operations. Difficulties were
encountered in adapting these analytical techniques for use in the environmental arena. Onsite
health concerns included infections, immunologic effects, and toxic effects. A characterization of
the size of the anthrax spores needed to be conducted. It was known that the spores were respirable,
very resistant to environmental extremes, and that electrostatic properties have an effect on the
spores. Testing determined that the spores were not endemic to the area. Of the 10,000 samples
taken, there were no positive background samples. Each contaminated letter contained a very
high concentration of anthrax spores, approximately 1 to 2 grams in each.
Surface and air sampling characterized what had settled and what had been re-entrained into the
air. Factors to consider in understanding aerosol particle behavior include the settlement of
particles, impaction of the particles, charge effects, particle releases from surfaces, and
agglomeration/de-agglomeration of particles in the air. Anthrax spores behave like a gas in that
they remain in the air for a very long period of time, and the spores settle differently in stagnant
and turbulent air. Pathways for particle transport included doorways, vents, and people moving
from one location to another. Determination of what the samples meant was a challenge given
the absence of numeric criteria for interpreting such environmental measurements.
Preparation for sampling included training of NIOSH and other sampling personnel, safety
precautions (posted on the CDC website), and appropriate record keeping and documentation.
Investigative strategies involved following the trail of the mail, examining high traffic areas and
ventilation systems, and examining all areas that could potentially collect dust. Sampling
considerations included:
• How the spores will be disseminated through air or materials
• Sampling methods based on the porosity of the surfaces
• Validated sampling protocols
• How the methods of analysis will be applied.
Also important was whether the purpose of the environmental sampling was to determine the
presence of spores or the extent and degree of contamination, whether the data from the sampling
supported medical treatment and cleanup decisions, and whether the results provided guidance
on re-occupancy.
Furniture, floors, the ventilation system, vehicles, and clothing were sampled. Issues
surrounding the sampling efforts included collection efficiency of instrumentation, recovery
efficiency, limits of detection, confirmatory testing, and sample shipping. Most of the samples
taken were either bulk or surface samples, although some were air samples. An important aspect
is to determine which sampling method is going to be most effective in each area. The surface
sample study resulted in the following conclusions:
• Swabs are effective for use in cracks and crevices
• Wipes are effective for light dust loading on non-porous surfaces
• Vacuums are effective for heavy dust loading and for large areas.
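The surface sampling conclusions above can be encoded as a simple decision helper. This is a hypothetical sketch; the category names and the area cutoff are illustrative assumptions, not an official EPA or NIOSH scheme.

```python
# Hypothetical decision helper encoding the conclusions listed above:
# swabs for cracks and crevices, wipes for light dust on non-porous
# surfaces, vacuums for heavy dust loading or large areas. The 10 m^2
# "large area" threshold is an assumed placeholder.

def choose_surface_method(location, dust_loading, porous, area_m2):
    if location == "crack_or_crevice":
        return "swab"
    if dust_loading == "heavy" or area_m2 > 10:
        return "vacuum"
    if dust_loading == "light" and not porous:
        return "wipe"
    return "vacuum"  # conservative default for porous/ambiguous cases
```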
Andersen devices were the most consistent/sensitive for air sampling. Andersen samplers have lower
processing risks and quicker turn-around times, and there is less laboratory bias resulting from
the reduced amount of processing. Recommendations for event responders are available on the
CDC website and include the importance of decontaminating the samples as well as people
before they leave a contaminated area and that sampling strategies will differ from place to place.
The strategic plan developed by CDC has most of the sample analysis being conducted through
the multi-level (A through D) Laboratory Response Network. Level A laboratories are clinical
laboratories used to rule out potentially dangerous substances. If a substance is determined to be
a potential danger, it will be sent to a level B, C, or D laboratory, as applicable, for further
analysis.
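The multi-level routing described above can be sketched as a small function: a Level A clinical laboratory rules substances in or out, and anything judged potentially dangerous is forwarded up the network. The result strings below are illustrative, not CDC's actual terminology.

```python
# Sketch of the Laboratory Response Network routing logic described
# above. Level A laboratories rule out potentially dangerous
# substances; potential dangers go to a Level B, C, or D laboratory
# for further analysis. Labels here are illustrative assumptions.

def route_sample(level_a_result):
    """level_a_result: 'ruled_out' or 'potential_danger'."""
    if level_a_result == "ruled_out":
        return "released at Level A"
    if level_a_result == "potential_danger":
        return "forwarded to Level B/C/D laboratory"
    raise ValueError("unknown Level A result")
```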
In addition, procedures for shipping samples are extremely important. Packaging must be rigorous,
and all samples must be appropriately labeled as infectious substances. Triple packaging is
required with a primary container to hold the samples, a secondary container to provide
waterproof protection, and the outer packaging providing durability. The contents of the
package must be documented as being hazardous. Training is required for those who will be
handling the packages.
Collaborations with the industrial, agricultural, and environmental communities are likely in
future research efforts. Much of what has been learned in the last 18 months regarding anthrax is
now being applied to the current Severe Acute Respiratory Syndrome epidemic.
Mr. Mark Durno, an On-Scene Coordinator (OSC) in the Emergency and Response Branch of
the EPA Region 5 Superfund Program, discussed details of the sampling procedures used at the
Hart Building.
The first responders made a number of mistakes resulting in cross-contamination of the building
by leading Daschle suite employees through hallways and stairwells. EPA led the initial review
team, and NIOSH, CDC, CHPPM, and DOD aided in the initial sampling assessment. The
National Institute of Standards and Technology and the EPA Analytical Operations/Data Quality
Center helped with air modeling and mapping of the site. The pathways were followed and 124
samples were collected, of which 12 were positive for anthrax spores. Contamination pathways
also included the trail of the mail and foot traffic areas. Sampling all components of the HVAC
system tested the air contamination pathways. Air sampling methods included gelatin filters,
Andersen cascades, dry filter units, and open agar plates. Sample characterization results identified anthrax
contamination in 11 suites, three committee rooms, one bathroom, three hallways, one elevator,
and three stairwells.
Full characterization involved sampling every desk and mail bin, every monitor screen, and high
traffic floor areas. The results of the full characterization revealed that very few rooms were
contaminated. Composite sampling of all horizontal work spaces was conducted in every room
so that individuals could return to their workstations knowing that there was no residual
contamination in their areas. Of the four suites that had positive anthrax hits (in addition to the
Daschle suite), only one was found to have any additional contamination. This sampling effort
involved the collection and management of 10,000 samples and associated data by four to eight
people.
Post-remediation sampling was the biggest challenge in the Hart Building response. Sampling
was conducted on floors, every horizontal surface, all drawers, and ceiling plenums. Tried and
true methods of air sampling were used, with final decisions made by best professional judgment.
This bio-aerosol sampling approach is described in the Anthrax Technical Assistance Document,
Chapter 6, available on the following website: www.nrt.org. Considerations for pre-remediation
sampling include the goals for the sampling efforts, data objectives, keys for a successful
strategy, lessons learned in addressing the absence of current standards, and sampling plan
development. Objectives of pre-remediation sampling should be developed in consultation with
professionals (e.g., medical, public health, industrial hygiene, laboratory, building experts, and
local, state, and Federal agencies). The sampling approach should consider monitoring,
screening, bulk material, questionable articles, extent of contamination, effectiveness of
decontamination, clearance for re-occupancy, transitional sampling, and consideration of every
sample as potential crime scene/forensic evidence. In addition, the sampling approach should be
logical and systematic, scheduled, and risk-based. The decision to use a targeted or statistical
approach should be made depending on the available information. Other aspects of site
remediation discussed in the Anthrax Technical Assistance Document include: methods,
analysis, transportation, coordination, and interpretation of data.
EPA Region 5 also developed Regional Sampling Guidance that provides equipment use
guidance. This booklet is currently in draft form, and is not yet available for distribution.
Panel Discussion/Questions and Answers
The speakers had an opportunity to participate in a brief panel discussion drawing upon
questions from the audience.
A brief question and answer period addressed a range of topics. These included: (1) speculation
on effective doses of anthrax for healthy and susceptible persons and the minimal number of spores
required for determining a test to be positive; (2) the question of a second contaminated letter;
(3) the New York City subway incident; (4) sampling at the outlet port; (5) best guesses for
exposure at a site, given limited sampling capabilities; (6) suggestions for outside sampling
areas; (7) making the distinction between cultured and naturally occurring spores, and the
possibility that persons infected by naturally occurring spores are not diagnosed; and (8) advice
for planners.
In closing, Dr. Hofmann thanked Captain Martinez and Mr. Durno for their presentations and
encouraged the audience to visit the displays in the courtyard.
Anthrax: Fumigation and Re-Occupancy
Three speakers addressed the procedures for anthrax decontamination and evaluation to
determine the safety of re-occupancy.
Fumigating Anthrax-Contaminated Sites: Building on Experience
Chief Scientist for Bioterrorism Issues in the OSWER, Dr. Dorothy Clark, discussed the
remediation of multiple sites as a result of the anthrax mail attacks. The 1999 Consensus
statement on anthrax as a biological weapon, produced by the Department of Health and Human
Services (DHHS) Working Group on Civilian Biodefense, was tough to sell to Congress. The
2002 Consensus statement was updated and suggested that only experienced personnel should
participate in remediation efforts.
Anthrax-contaminated sites included media offices in New York City and in Boca Raton, Florida
(AMI), postal facilities, the Capitol Hill complex, and private residences. The NBC, ABC, CBS,
and New York Times media offices received letters that caused cutaneous anthrax, which is not
nearly as dangerous as inhalation anthrax; treatment with antibiotics produces an 80 percent
survival rate in persons infected with cutaneous anthrax. However, the Capitol Hill complex and
the AMI building in Florida were contaminated by letters that caused inhalation anthrax.
Therefore, these latter two sites required site remediation (i.e., fumigation).
Anthrax site remediation processes include site assessment, isolation of contaminated areas,
artifact/critical item removal, source reduction (via HEPA vacuum), post-remediation sampling,
further remediation (as needed), and disposal of decontamination materials such as
decontamination water, expendable personal protective equipment (PPE), and debris. Associated
environmental sampling processes involve:
• Confirmation of the existence of contamination
• Characterization of the nature/extent of the contamination
• Aid in selecting a remedial approach
• Determination of the effectiveness of remediation
• Contributions to decisionmaking on re-occupancy.
The New York City media offices, the Capitol Hill complex, and the Department of Justice
(DOJ) mail facility have undergone complete remediation. Remediation of General Services
Administration (GSA) Building 410 is currently underway. The Annex-32, Hamilton, and AMI
building remediations are in the planning stage.
Sites requiring fumigation remedies are those that are contaminated with inhalation anthrax or
that have areas with high concentrations of dangerous spores. These sites include the Hart
Senate Office Building, DOJ mail facility, Brentwood Post Office, Hamilton, GSA Building 410,
Annex-32, and the AMI building. Fumigants being used at these sites include:
• ClO2 - at the Hart Senate Building, Brentwood, and Hamilton
• Vaporized hydrogen peroxide - at GSA Building 410 and Annex-32
• Paraformaldehyde (historically used by medical, academic, and army labs to cleanup
anthrax) - at the DOJ mail facility.
Fumigant selection is a site-specific decision. A match between the fumigant chosen and the
agent of contamination, the decontamination process, and the site requiring treatment is essential.
Drivers for use of gaseous treatment include the concentration of the contaminant, exposure
time, relative humidity, and temperature. When using ClO2 gas, it is important that the relative
humidity be greater than 70 percent, and it is best to have a certain conditioning time prior to
introduction of the gas to ensure that enough moisture gets to the spores.
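The go/no-go conditions the speaker described for ClO2 fumigation can be sketched as a simple precondition check. The 70 percent relative humidity threshold comes from the discussion above; the minimum conditioning time is a placeholder assumption, not a published requirement.

```python
# Minimal sketch of ClO2 fumigation preconditions described above:
# relative humidity above 70 percent, plus a conditioning period
# before gas introduction so moisture reaches the spores. The
# 12-hour conditioning value is an assumed placeholder.

MIN_RELATIVE_HUMIDITY_PCT = 70.0
MIN_CONDITIONING_HOURS = 12.0  # assumption, not a published value

def ready_for_clo2(relative_humidity_pct, conditioning_hours):
    return (relative_humidity_pct > MIN_RELATIVE_HUMIDITY_PCT
            and conditioning_hours >= MIN_CONDITIONING_HOURS)
```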
An Anthrax Fumigation Evaluation Project is funded by ORD as part of the Homeland Security
Research Program. This project is slated to begin July 1, 2003 and will involve an in-depth
analysis of a group of representative anthrax fumigations. The analysis will include three
different fumigants and will draw on a multidisciplinary team of experienced personnel. Aspects
of fumigation remedies considered in this study include:
• Environmental sampling—nature and extent of pre-remediation sampling and sampling
during remediation
• Pre-fumigation source reduction activities—removal of materials from the site and the nature
and extent of surface treatments
• Safety—containment of the space to be fumigated, extent of pre-fumigation testing for key
equipment, action levels for ambient concentrations of agent, monitoring for leakage of the
fumigant, removal of the fumigant at the end of the fumigation process, and adequacy of
emergency response plan
• Efficacy—selection of fumigant based on penetrability (toxic properties, materials
compatibility, history of usage/success), post-treatment aeration to address residues, cost
considerations, generation of agent onsite, control of process variables (temperature, relative
humidity, concentration, and exposure time), minimum conditions necessary to continue in
each phase of the fumigation process and to progress to the next stage, and site-specific
considerations (e.g., fumigate the entire building or just a section, need for redundancy of key
equipment, containment of area, distribution of gas, circulation of gas, measurement of
process variables, and aeration)
• Cost/downtime—cost issues, national security issues, and public opinion
• Output—a report for each site in the study to include an evaluation of alternative mechanisms
for maintaining containment, an evaluation of cost, and documentation of methods to
enhance future fumigation.
Clearance Determinations: Judging Remediation Success and Readiness for Re-
Occupancy
Mr. Matt Gillen, with NIOSH, and Mr. Jack Kelly, with EPA Region 3, discussed a joint
CDC/EPA project, focusing on the general approach, scientific issues, lessons learned, and
information from published reports, regarding clearance determinations for post-remediation
building re-occupancy.
Mr. Jack Kelly, an OSC with EPA Region 3, provided an overview of the remediation process.
The entire remediation process includes the following: a contamination event, an initial
response/outbreak investigation, facility closure and isolation, a characterization sampling phase,
development and approval of remediation and clearance plans, remediation activities, clearance
verification sampling, evaluation of the clearance/technical determination, a period of
refurbishment, and re-occupancy.
Clearance comes at the end of the remediation process, and involves an expression of interest or
request from the facility owner, creation of a cross-disciplinary Environmental Clearance
Committee (ECC), ECC briefing by facility owner and remediation team, ECC input into the
sampling plan, ECC review of clearance data upon completion of remediation activities,
preparation of technical determination statement, and assistance with risk communication (if
requested).
The three key clearance issues are organization, technical/scientific, and communication.
Organizational issues include:
• Determining the need for and benefits of using an ECC.
• Usefulness of multidisciplinary peer review when there is a significant contamination event,
a complex cleanup, a need for an independent opinion, a need for external assurances, or a
novel situation with no established cleanup procedures.
• Selection considerations for the ECC chair person and committee candidates. EPA and local
health department representatives have served as co-chairs. Participants can be drawn from
all levels of government, military, and private sector/academia, and selection should cover
pertinent disciplines. Generally, the ECC has not been used as a mechanism for stakeholder
involvement.
• ECC reporting, including the need to present a recommendation that either remediation has
been successful or that additional actions are needed, with reports provided to the facility
owner.
• The scope of the ECC review, which is all information necessary to reach a decision on
re-occupancy.
• The need for the ECC to be an independent function with ECC chair person(s) providing a
liaison to facility owner through at least two coordination meetings; cooperation of the
facility owner/operator is crucial for the ECC to perform its duties.
• The usefulness of a charter to define goals and responsibilities of the ECC and to address
expectations of both the ECC and the facility owner/operator. This helps to prevent surprises
and avoids misunderstandings. The charter can be structured as a "charge" to the ECC or in
the form of a possible list of questions.
Another organizational consideration is the role of a technical working group and its relationship
to an ECC. A technical working group provides input earlier than the ECC on aspects such as the
remediation plan during the EPA crisis exemption evaluation. In some cases, technical working
group members may also serve on the ECC. There are advantages and disadvantages to each of
these approaches.
Related organizational issues include whether ECC members serve as individuals or as
representatives of their Agency, and defining the relationship between the ECC and the facility
owner/operator.
Mr. Matt Gillen, a Senior Scientist with NIOSH, discussed the technical/scientific issues. The
overall goals of this joint project include the use of the best science available; the acquisition and
use of high quality data; the use of valid, effective methods; incorporation of new scientific
developments; conduct of thorough and rigorous clearance sampling; and use of "negative
growth" as the clearance criterion for judging the success of remediation efforts.
Technical challenges and research gaps faced in this project include the lack of a detection limit,
formally validated methods, and risk-based cleanup criteria to address the question of "how clean
is clean?" CDC testimony states (in regard to the Brentwood Post Office), "... it is the goal of
CDC to minimize illness and disease to the greatest extent possible ..."
The clearance sampling test sequence involves the following steps: spore strip testing
(biological indicators), surface sampling, then aggressive air sampling. Sampling strategies used
for clearance include focused sampling to target previously positive locations, biased sampling to
target other most likely locations, and grid/random sampling to systematically check other areas
of the facility.
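The three clearance sampling strategies above (focused, biased, and grid/random) can be sketched as the construction of a combined sampling plan. Location names, the grid fraction, and the fixed seed are illustrative assumptions, not part of the described protocol.

```python
# Illustrative construction of a clearance sampling plan combining
# the strategies described above: focused (previously positive
# locations), biased (other likely locations), and grid/random
# coverage of the remaining areas. Parameters are assumptions.

import random

def build_clearance_plan(previous_positives, likely_locations,
                         all_locations, grid_fraction=0.1, seed=0):
    covered = set(previous_positives) | set(likely_locations)
    remainder = [loc for loc in all_locations if loc not in covered]
    rng = random.Random(seed)  # fixed seed for a reproducible sketch
    k = max(1, int(len(remainder) * grid_fraction)) if remainder else 0
    grid = rng.sample(remainder, k)
    return {"focused": list(previous_positives),
            "biased": list(likely_locations),
            "grid": grid}
```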
Mr. Jack Kelly, with EPA Region 3, discussed the communication issues. One issue involves the
independence of the ECC, which is dependent on others for information but independent in its
evaluation. Another issue is how the ECC handles a lack of consensus in decisionmaking, with
consensus being the preferred route, but with the use of minority opinion if necessary. A third issue
involved the ECC's role with regard to returning workers. Recommendations were for such
deliberations and discussions to occur in a "closed" venue with ECC members available to the
public for question and answer.
A further communications consideration is the nature of the formal statements provided by the
ECC and to whom those statements are provided. Short statements addressed to the Incident
Commander or other relevant authority that serve as technical determinations are preferred,
should be written for a general audience with the inclusion of technical terms and caveats where
needed, and should be suitable for public release.
Highlights of this joint project include:
• Organizational—the first use of the ECC concept, recognition that public health agencies
should be involved in re-occupancy and reuse decisions, and how the ECC members pulled
together
• Technical/Scientific—sample analysis, composite samples, surface sampling, first use of
aggressive air sampling, room-by-room clearance approach, and the importance of outreach
sessions for returning workers
• Communication—the "no-growth" cleanup standard for the Brentwood Post Office, and the
interim statement provided by the ECC; public availability of the decision document is being
assessed.
Lessons learned in conducting this joint project include the importance of determining whether
an ECC is needed or wanted, the need for all parties to know their roles and responsibilities,
confidentiality issues, consideration of forming an ECC of experienced personnel from non-
regulatory government agencies and academia/private industry, debate over ECC members
serving on a technical working group, and the make up of the ECC membership.
Panel Discussion/Questions and Answers
The speakers had an opportunity to participate in a brief panel discussion drawing on questions
from the audience.
A brief question and answer period addressed a range of topics. These included: (1) the
obligation of the sampler to describe expected risks; (2) sampling work done in the Hart Building
following re-occupancy; (3) application of the ECC concept to privately owned buildings; (4)
technical documentation of these approaches; and (5) clarification of who provides clearance,
clearance policies, and experiences with local governments.
Anthrax: Decontamination Technologies
Following opening remarks by Mr. Marty Powell, with EPA Region 3, four speakers addressed
the determination and evaluation of chemicals for use in the treatment of anthrax contamination.
A panel discussion including an audience question and answer period followed the
presentations.
The Hunt for Anthrax Contamination Chemicals
Mr. Jeff Kempter, Senior Advisor to the Antimicrobial Division of OPP, discussed the crisis
exemption and the challenges involved with cleanup efforts at anthrax sites. The Federal
Insecticide, Fungicide, and Rodenticide Act (FIFRA) requires that a pesticide be registered
before it can be distributed or that it be exempted for emergency purposes. FIFRA defines a
pesticide as any substance intended to prevent, destroy, repel, or mitigate any pest, and defines a
pest as any form of plant or animal life or virus, bacteria, or other micro-organism, except on or
in living man or animals. In regulating decontamination chemicals, anthrax has been defined as
a pest.
The standard for EPA approval of a product as a pesticide requires demonstration that the
product will not cause unreasonable risk to humans or the environment, demonstration of the
benefits or efficacy of the product, and demonstration that the product will not cause unacceptable
human dietary risks.
Pesticide registration requirements include the provision of information on the product and its
composition (chemicals, inert components, production, source of active ingredients, etc.), the
toxicity of the product, efficacy data, and product labeling. The registration may be new or
amended.
Sterilant efficacy is currently tested using the AOAC Sporicidal Activity Test. This is
a qualitative carrier test for both hard and porous surfaces. Success is measured as "zero
growth" on all 720 carriers in the test.
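The "zero growth on all 720 carriers" pass criterion stated above amounts to a strict all-or-nothing rule, which can be sketched directly. Modeling carrier results as booleans is an illustrative simplification of the qualitative test.

```python
# Sketch of the pass/fail rule stated above for the AOAC Sporicidal
# Activity Test: a product passes only with zero growth on all 720
# carriers. Each carrier result is modeled as a boolean
# (True = growth observed), an illustrative simplification.

TOTAL_CARRIERS = 720

def aoac_sporicidal_pass(carrier_growth):
    """carrier_growth: iterable of 720 booleans, True if growth observed."""
    results = list(carrier_growth)
    if len(results) != TOTAL_CARRIERS:
        raise ValueError("expected results for all 720 carriers")
    return not any(results)
```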
The crisis exemption has been set up for anthrax decontamination chemicals. Approval of a
crisis exemption requires submission of safety and efficacy data as well as sampling and
monitoring plans. To date, 54 crisis exemption requests have been received from Federal
agencies, companies, and registrants resulting in 21 exemptions issued, 27 requests rejected, 4
requests pending, and 2 requests withdrawn.
Liquid chemicals approved for crisis exemption (for hard surfaces only) include: aqueous
hydrogen peroxide/peracetic acid, sodium hypochlorite, and hydrogen peroxide/quaternary
ammonium compound (foam). Gases approved for crisis exemption include ClO2 gas, ethylene
oxide, paraformaldehyde, vaporized hydrogen peroxide, and methyl bromide.
There are six major challenges involved in the cleanup of anthrax contamination. The first
challenge is the actual cleanup of the contaminated sites and involves the following issues: the
building owner, EPA, and other agencies must work together at each site, technical working
groups and ECCs should provide expert guidance, and crisis exemptions need to be approved for
chemicals to be used onsite. The second challenge is in the documentation and evaluation of
cleanup efforts and involves the following: comprehensive reviews of major site cleanups; the
evaluation of different decontamination chemicals as to their effectiveness, safety, and cost; and
the objective comparison of cleanup methods and distillation of lessons learned. Other
challenges involve the validation of efficacy test methods, research and development of
decontamination technologies, preparation for other biological agents, and preparation for new
and emerging pathogens.
Mr. Jeff Heimerman, with the OSWER Technology Innovation Office (TIO), discussed the
evaluation of new and emerging technology. To relieve the burden of the emergency center, a
clearinghouse website (www.epatechbit.org) was established. The EPATechBit Helpline aids in
answering questions on efficacy data and related information. To date, 51 decontamination
devices and various other technologies are included on the website.
Meetings have been held by a "Red Team" in an effort to focus on developing a systematic
approach for examining these new technologies, since the quality of information received to date
has not been good. All input has been unsolicited and included broad claims but no supporting
data. To address this, an example building scenario was developed to test these technologies.
The evaluation model was based on a schematic weighted scale, which included environmental
conditions. An evaluation chart was also developed.
The Red Team noted during their review process that the lack of test data is a hindrance to
evaluation, that this evaluation process may be a useful tool for technology triage, and that
a cross-agency body of people who believe in this type of technology is needed.
Gaps in this process include the need for a triage function to determine when vendors should be
directed to other agencies and how others should be prioritized into the ETV program. Also
needed is the development of a more permanent vendor tracking system.
Laboratory Support for Evaluating Decontamination Technologies
Ms. Rebecca Schultheiss, with the EPA Environmental Science Center at Fort Meade, Maryland,
discussed the duties and capabilities of this laboratory in evaluating decontamination technology
and techniques. The Environmental Science Center is a state-of-the-art, Biosafety Level 3
microbiology laboratory, which provides support to OPP, conducts efficacy testing of
antimicrobials, and participates in other projects such as genetically modified plant methods,
method development, and semi-quasi mode research.
The Environmental Science Center supported the Hart Building and Brentwood Post Office
decontamination evaluations. For example, spore strip analysis was conducted prior to
fumigation for the Hart Building, and this analysis supported the development of PPE
requirements during remediation. Line 17 analysis was conducted for the Brentwood Post
Office. In addition, testing was also conducted to assess contamination of chemical
decontamination solutions and to assess the success of fumigation.
One of the roles of the Environmental Science Center is to provide guidance. This included
conducting trial tests of the decontamination capability of bleach, ClO2, and decontamination
foam. Eleven trials were conducted to test a range of bleaches, contact times, and unadjusted pH
solutions. pH was found to be a significant factor in successful use on nonporous
surfaces. Since porous surfaces treated with bleach did not pass the test, the Agency decided
to take a conservative approach to the use of this product.
Four liquid ClO2 trials were conducted that involved different contact times as well as porous and
nonporous surfaces. The ClO2 passed the tests for nonporous surfaces. After several failures,
the evaluation of ClO2 for use on porous surfaces was halted.
Decontamination foam was originally given a crisis exemption. Decontamination foam comes in
two parts, liquid and powder, that must be mixed together. The trials conducted at the
recommended 1-hour contact time for both porous and nonporous surfaces failed. As a result,
the Section 18 Crisis Exemption was lifted.
Issues encountered during these trials included:
• The need for range-finding studies
• The critical role of pH in a successful decontamination outcome
• The need to neutralize the active ingredient
• Refinement of the carrier count method, resulting in the development of a more accurate
method.
In addition, the qualitative test only evaluated growth or non-growth.
On-going activities of the Environmental Science Center are focused on developing an improved
evaluation method. Future challenges include:
• Dealing with the organism and its surrogates
• Re-evaluating the use site, as it has now changed to large buildings and airplanes
• Developing methods to determine the effect of spore size
• Identifying materials that will work on porous surfaces.
Efficacy Testing Science Issues and Follow-up Research
Dr. Stephen Tomasino, Team Leader at the Environmental Science Center in Fort Meade,
Maryland, discussed the complex nature of determining the effectiveness of antimicrobial
chemicals, EPA's role in developing an efficacy testing strategy to advance the science, and the
research plan. Measuring efficacy is complex and involves the following issues:
• Micro-organisms may or may not grow (depending on their nature), making them more
difficult to deal with than chemicals
• Some of this information/technology has not been updated for decades
• Difficulty in simulating porous/hard surfaces
• Absence of textbook knowledge, requiring technical expertise to be handed down and
developed over time
• Controlled conditions are necessary and expensive to simulate
• Recovery of both viable and damaged spores is difficult.
Factors affecting efficacy testing include concentration, formulation, application method,
application rate, diluent, contact time, temperature, organic burden, treated surface, product age,
pH, and test microbe. The Environmental Science Center is attempting to design and evaluate a
method that combines formulations and materials. Important decisions include the formation of
a test subgroup and formation of subgroups for surrogates and gases and vapors. In addition, test
method evaluation attributes include cost, readily available equipment, expertise, flexible contact
times and temperatures, method sensitivity, adequate controls, enumeration method, percent
recovery, deactivation of product, reproducibility, turnaround time, and validation.
The goals of the research plan are to:
• Replace or improve the current qualitative method (AOAC Sporicidal Test)
• Study quantitative methods for liquids on hard surfaces
• Perform comparative side-by-side testing, since comparative efficacy data are essential to
the development of future regulatory guidance
• Develop expertise in conducting multiple sporicidal tests
• Perform statistical analyses to determine means, variances, etc.
• Select two quantitative methods for further investigation and validation testing.
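The statistical summaries named in the research plan (means, variances) could be sketched as follows for side-by-side sporicidal test results; the colony counts and inoculum level are invented for illustration only.

```python
# Illustrative sketch of comparative efficacy statistics for two hypothetical
# sporicidal test methods. All numbers are invented for illustration.
import math
import statistics

# Viable spores recovered per carrier after treatment, for two methods
method_a = [120, 95, 143, 110, 88]
method_b = [310, 275, 402, 290, 350]
inoculum = 1.0e6  # assumed spores per carrier before treatment

for name, counts in [("A", method_a), ("B", method_b)]:
    mean = statistics.mean(counts)
    var = statistics.variance(counts)            # sample variance across trials
    log_reduction = math.log10(inoculum / mean)  # quantitative efficacy measure
    print(f"Method {name}: mean={mean:.1f}, variance={var:.1f}, "
          f"log reduction={log_reduction:.2f}")
```

A quantitative log-reduction summary of this kind is what distinguishes the proposed methods from the current qualitative (growth/no-growth) test.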
Activities to be conducted in the remainder of 2003 involve the development of standard
operating procedures and a QA document, training and practice, performance of pre-
collaborative studies, initiation of research in late summer, compilation of data and performance
of statistical analyses, and reporting of findings to Federal agencies. Future plans for this
research effort include the addition of surrogates to the testing matrix, exploration and testing of
a variety of materials, pursuit of a screening method, conduct of validation testing, and
evaluation of field test methods.
Panel Discussion/Questions and Answers
The speakers had an opportunity to participate in a brief panel discussion drawing in questions
from the audience.
A brief question and answer period addressed a range of topics. These included: (1) the need for
registration of a product for multiple uses; (2) tests conducted with anthrax and concrete; (3) the
possibility of using a 14-day spore strip analysis; and (4) the incubation time period necessary
for successful germination of cells.
Building Partnerships Towards Homeland Security
Following opening remarks by session moderator, Mr. Craig Mattheson, with the EPA Chemical
Emergency Preparedness and Prevention Office, six speakers discussed critical infrastructure
protection and homeland security, and the interconnectivity of these industries/sectors. A panel
discussion including an audience question and answer period followed the presentations.
Security: The Business of Chemistry's Action
Dr. Marty Dubin, Security Team Leader for the American Chemistry Council (ACC), discussed
security plans and communication tools. The ACC has over 160 member companies comprising
more than 90 percent of the chemical industry in the United States, whose products support
diverse sectors including medicine, telecommunications, and defense. In October 2001, the ACC
finalized guidelines for site and transportation security and used these as a foundation to add
mandatory security requirements (site, cyber, and transportation) for member organizations.
This resulted in a New Responsible Care Security Code enacted by the ACC Board in June 2002.
This includes an overall plan, actions, checks, and improvements as well as provisions for
independent third-party audit. Priorities were based on the attractiveness of the target and the
potential severity of an attack. Four threat categories are assessed,
including uncontrolled releases, theft, product contamination, and significant economic
disruption. Any of the four categories of threats could cause offsite impacts.
The guidelines address physical, value chain, and cyber security. The physical security plan may
include perimeter barriers, access control, inventory control, surveillance, and process control
systems and equipment. Value chain security involves issues of communication, storage, and
transit. In addition, guidance is provided for cyber security. An important aspect to consider is
that no two chemical facilities are alike when evaluating security measures.
Facility response plans have been developed. Responses to security threats are color-coded, and
are based on general and specific threats. A 24/7 hazardous material response capability is
important, and the Federal Bureau of Investigation (FBI) built onto the existing 24/7 hazardous
material response system to include a two-way response system with the DHS.
The intention of the New Responsible Care Security Code is to work very closely with the local
communities and law enforcement. Performance and partnerships are extremely important
within the chemical community. However, there are some challenges with jurisdiction.
Different agencies/industries claim jurisdiction over areas of transport (e.g., United States Coast
Guard [USCG], DOD, railways, etc.). The ACC is working very closely with the railroad
industry on security issues. The industry has developed its own security plans post-9/11, and the
Department of Transportation (DOT) is working to assure that the distribution chain is secure.
Homeland Security, Emergency Management, and a Water Utility
Mr. Paul Bennett, Director of Emergency Management at the New York City Department of
Environmental Protection (DEP), discussed issues related to the response of water utilities in the
event of an attack, including both drinking water and wastewater responses. DEP is responsible
for the water supply to New York City, and manages, operates, and protects the wastewater
system. Thus, the DEP is both a regulator and a regulatee.
Emergency management is relatively new to the DEP. Prior to 1996, there was little
coordination between agencies responding to an incident. In 1996, the Office of Emergency
Management was moved from the New York Police Department to the Mayor's office, where
aggressive interagency planning and response began.
The DEP plan is one of coordination with all interagency plans and resources available with a
focus on protection of the water sector. There are currently partnerships between the DHS, DEP,
and various other agencies. DEP provides pro-active training and is currently working with the
United States Army Corps of Engineers in the development of a response protocol and early
warning and detection.
The DEP is a first responder to many types of incidents including fires, building collapses, and
raids, and is therefore familiar with the key people and concerns at various organizations. The
DEP response to the WTC attack was well coordinated based on already developed plans and
experience gained post-1996. Mr. Bennett discussed some of the activities of the DEP at the
WTC site including the need to shut down broken water mains without interfering with
firefighting efforts, debris blocking manhole access to underground utilities, coordination with
firefighters to restore water supply operation in needed areas, and working with the FBI to
recover documents from the WTC from catch basins and other wastewater system areas.
Emergency response issues associated with drinking water include:
• The need for fast and accurate detection
• An understanding of what the findings and results mean
• Clear response plans supported by local, state, and Federal agencies
• Clear decontamination protocols supported by local, state, and Federal agencies.
In addition, discharge of decontamination water is a major issue and the agencies involved must
come to an agreement on the management of such wastes.
Emergency response issues associated with wastewater include the need to:
• Ensure clear action by wastewater treatment plants receiving runoff and discharge from
hospitals
• Determine pre-chlorination effectiveness
• Consider the potential use of sewers to facilitate attacks
• Establish clear decontamination protocols.
A Public Utility Manager's View of Our World Post-9/11/2001
Mr. Michael Marcotte, with the District of Columbia (DC) Water and Sewer Authority, Mr.
Gordon Smith, with Sandia National Laboratories (SNL), and Ms. Janet Pawlukiewicz, with the
EPA Water Protection Task Force, provided a joint presentation on security issues, security
approaches, and risk assessments in regard to protecting the water supply.
Mr. Marcotte, Deputy General Manager and Chief Engineer of the DC Water and Sewer
Authority, explained that operation of the Blue Plains Water Treatment Facility, which serves
2,000,000 people daily, is governed by an 11-member regional board. This water treatment
facility is highly visible and many municipal entities are on the "front line" as candidates for
attack.
Chlorine disinfection products have historically been stored in large quantities onsite in rail cars
with a master plan to change this by 2005. Alternative disinfectants to chlorine products have
been investigated and, in December 2001, the last of the railcars holding the chlorine products
was removed from the facility. These products are now stored in discrete one-ton containers as
opposed to the 90-ton rail cars that previously sat by the Potomac River in plain view. As a
result of these actions, methanol is now the most dangerous chemical stored onsite in large
quantities, and the potential for disruption or sabotage is low.
The wastewater collection system feeding to the treatment plant involves 1,800 miles of piping.
This is a concern because large mains provide access that can be used to introduce biohazards,
flammable materials, etc. Monitoring devices have been added to some pumping stations due to
the serious concern about explosives and fires.
Despite all these actions to reduce threats or provide early warning of unsafe situations,
additional security concerns remain as a result of an internal worker culture that "does not see"
(i.e., does not report) suspicious items or activities.
Water security issues at the treatment facility include:
• Water pumping and storage—perimeter security (high), reliability/redundancy (high), and
supervisory control, data acquisition, and related cyber issues (medium)
• Water distribution—remote quality monitoring (med/high) and hydrant/cross-connection
control (medium)
• Culture/internal security (high)—training/personnel selection issues.
Security approaches at the treatment facility involve vulnerability analysis; fences, barriers, and
walls; alarms, cameras, and sensors; and law enforcement involvement.
Mr. Gordon Smith, Manager of the Public Safety and Technologies Department at SNL,
discussed the risk assessment process and its design and evaluation. SNL is a multi-program
research and development laboratory under DOE. The counter-terrorism laboratory, run by
Lockheed-Martin, is the lead laboratory for the physical security of nuclear sites that have
materials or devices that could be fashioned into nuclear weapons.
Risk assessment in the context of this discussion is defined as a systematic approach to
determining relative risk and is the backbone for vulnerability assessment methodologies. SNL
developed a risk assessment methodology for chemical facilities and is currently developing a
risk assessment methodology for communities as well. These are available for access by Federal
agencies and by non-Federal persons (after signing certain agreements).
The risk assessment process involves the evaluation and consideration of the following:
• Planning to identify areas of vulnerability and support
• Quantifying consequences and effects
• Identifying targets using fault tree analysis
• Defining threats (highest threat now is the eco-terrorist), the likelihood of an attack, and a
table or list of "most likely threats."
Design and evaluation of security procedures includes detection, security systems (protection),
and risk comparison. The response force must be able to neutralize any adversary; this requires
the ability to eliminate any threat. Security cameras detect but do not eliminate threats. In
addition, security systems can be modeled to evaluate their effectiveness; many facilities have
security systems that have a very low likelihood of stopping a terrorist attack. A risk comparison
helps to assess the utility of upgrades in conjunction with changes in risk level to achieve the
desired effectiveness.
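The risk comparison described here follows the general form of Sandia-style relative-risk formulations, often written R = PA × (1 − PE) × C; the sketch below uses hypothetical scenario numbers to show how a security upgrade changes the risk level.

```python
# Minimal sketch of the relative-risk calculation behind Sandia-style
# vulnerability assessment methodologies, R = PA x (1 - PE) x C.
# The scenario numbers are illustrative assumptions.

def relative_risk(p_attack: float, p_effectiveness: float, consequence: float) -> float:
    """p_attack: likelihood of attack (0-1); p_effectiveness: probability the
    security system detects and defeats the adversary (0-1);
    consequence: normalized severity of a successful attack (0-1)."""
    return p_attack * (1.0 - p_effectiveness) * consequence

# Same threat and consequence; only system effectiveness changes with an upgrade
baseline = relative_risk(p_attack=0.3, p_effectiveness=0.2, consequence=0.9)
upgraded = relative_risk(p_attack=0.3, p_effectiveness=0.8, consequence=0.9)
print(baseline, upgraded)
```

Comparing the two values shows the utility of an upgrade in exactly the sense described above: a system that merely detects (low PE) leaves most of the risk in place, while one that can neutralize the adversary reduces it substantially.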
Ms. Janet Pawlukiewicz, Director of the EPA Water Protection Task Force, discussed the EPA
role in water security, accomplishments of the Water Protection Task Force, the EPA Homeland
Security Strategic Plan, and the Public Health Security and Bio-terrorism Preparedness and
Response Act of 2002 (hereafter referred to as the Bio-Terrorism Act).
Major areas of accomplishment related to the Homeland Security Strategic Plan include the
development of tools, training, financing, research, technology development, building security,
and information exchange. Funding includes $90 million in grants for vulnerability assessments
and security planning among other areas. EPA also is developing protection strategies for large
and small water systems, workshop and consultation services for medium-sized drinking water
systems, emergency response guidance, and guidance on how to change security practices as the
threat levels change. Research and technology development activities include the development
of a comprehensive research plan for the entire water sector to be implemented by various
agencies and organizations, and developing models through the ETV program to show the fate
and transport through distribution systems and watersheds in the event of an incident. Funds
have been allocated to table-top incident response exercises. Information exchange includes the
creation of a Water Information Sharing Analysis Center, coordination with interdependent
infrastructure (e.g., electricity, transportation, telecommunications), and coordination with
emergency responders, public health officials, and law enforcement.
The Bio-Terrorism Act covers approximately 9,000 systems within the United States and amends
the Safe Drinking Water Act (SDWA). Requirements under this Act pertinent to water utilities
include the conduct of vulnerability assessments, development or revision of emergency
response plans, and submission of certifications to the EPA. These activities are to be conducted
in accordance with a phased schedule from March 2003 to June 2004 (depending on the size of
the utility).
EPA is required to establish protocols for protecting the vulnerability information that is
submitted. In addition, EPA is examining attack vulnerabilities and vulnerability assessment
methodologies through collaborations and partnerships with a variety of agencies and
organizations.
The Water Protection Task Force has a very detailed website (www.epa.gov/safewater/security)
where additional information and updates on initiatives are provided. The activities being
conducted by the Water Protection Task Force provide multiple benefits, such as emergency
response plans and improvements in water quality, as well as an emphasis on a security-oriented
culture, which is very important. The "Four Ps" for Task Force activities are partnering,
planning, protection, and practicing.
The EPA Safe Buildings Program
Dr. Nancy Adams, with the NHSRC, discussed the Safe Buildings Program and associated
research activities. The Safe Buildings Program is one of three areas of emphasis at the NHSRC,
with the other two involving water security and rapid risk assessment. The Safe Buildings
Program is focused on answering three questions:
• How to protect the occupants of built structures (schools, subways, etc.) and prevent
purposeful contamination of these structures
• How to decontaminate built structures (the focus of the research)
• How to distribute this information to those who need it.
The four NHSRC sections are detection, containment, decontamination, and disposal. The
NHSRC research approach involves:
• Separate consideration of detection, containment, decontamination, and disposal issues
• Selection of the most difficult to treat threat for initial decontamination studies
• Grouping materials as aerosols or gases for studies on prevention and containment
• Grouping gases by chemical and physical characteristics for containment.
Detection involves the testing and verification of existing detection devices (via the ETV
program), development of new devices or sampling and analysis methods, and the design of
sampling and detection networks. Current detection activities include surface sampling for
spores and the use of four multi-analyte detection systems.
Containment involves HVAC improvements, development of gas and particle filters,
specifications for safe havens, retrofit improvements, and economic considerations. Current
containment activities include modeling indoor releases and emergency responses, and
developing a building owner's guide.
Decontamination efforts involve efficacy testing, safety issues, cost issues, and regulatory
support to OPP. Current decontamination activities include technology evaluations by the
Army's Edgewood Chemical Biological Center (ECBC), the EPA ETV program, and the EPA
Small Business Innovation Research (SBIR) projects.
Disposal efforts involve testing of incineration and landfill application methods, selection of
appropriate disposal facilities, and assessment of residuals. Current disposal activities include
carpet incineration and workshops with waste managers.
Panel Discussion/Questions and Answers
The speakers had an opportunity to participate in a brief panel discussion drawing on questions
from the audience.
A brief question and answer period addressed a range of topics. These included: (1) the use of
plastic sheeting and duct tape; (2) suggestions for working out the differences between
methodologies and actual decontamination efforts; (3) the needs of onsite responders; (4)
background concentration determinations; (5) coordination of state, local, and Federal agencies
in the development of guidelines; (6) public access to information regarding water and air
pathogens; and, (7) the attack of critical facilities.
BioWatch - Nationwide Early Detection of Airborne Biological Agents
Mr. Thomas Coda, the lead for Homeland Security Programs in OAQPS, discussed the
development and implementation of the Bio-Watch surveillance network for early detection.
Bio-Watch provides rapid recognition of releases of biological agents before the onset of
illness, and measures the extent of those releases.
The concept began with a system called BASIS, which was designed for the Salt Lake City
Olympic Games. After the September 11th terrorist attacks, EPA met with other agencies to
determine how this system could be deployed nationwide.
The Bio-Watch system consists of over 3,000 monitoring systems in highly populated areas
across the United States. Locations for monitors were chosen based on the same kind of
meteorological modeling used to select other types of monitoring sites. The structure of the
system is as follows:
• DOE has the responsibility for selectivity, technical support, and acceptability of the
equipment
• EPA has the responsibility for sampling
• CDC has the responsibility for performing analyses and data management.
Mobile training teams sent out across the Nation set up the monitors and provided hands-on
training to local, state, and other environmental personnel. The monitors consist of 47-
millimeter filter units that draw high volumes of air over them. The filter units are collected
every 24 hours and taken to CDC, where a series of tiered assays are run. These laboratory
assays take approximately 6 to 8 hours to run with results returned in as little as 36 hours. This
system is used for the types of releases that cover large areas and has been validated using live
agents. Initial samples were used to determine background levels.
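The collection and assay times given above imply simple bounds on detection latency; in the sketch below the transport time from monitor to laboratory is an assumed value, not a figure from the presentation.

```python
# Back-of-the-envelope sketch of Bio-Watch detection latency using the
# collection and assay times described above. TRANSPORT_H is an assumption.

COLLECTION_INTERVAL_H = 24    # filter units are collected every 24 hours
ASSAY_MIN_H, ASSAY_MAX_H = 6, 8  # tiered laboratory assays take 6 to 8 hours
TRANSPORT_H = 4               # assumed courier time from monitor to laboratory

# Best case: release lands on a filter just before its scheduled collection
best_case = 0 + TRANSPORT_H + ASSAY_MIN_H
# Worst case: release occurs just after a filter swap, so it sits a full cycle
worst_case = COLLECTION_INTERVAL_H + TRANSPORT_H + ASSAY_MAX_H

print(f"detection latency: {best_case}-{worst_case} hours after release")
```

With these assumed figures the worst case comes to 36 hours, which is consistent with the return-time figure quoted above; the point of the arithmetic is that latency is dominated by the 24-hour collection interval rather than by the assays themselves.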
Use of this system could result in much quicker isolation of areas and earlier treatment because
the exposure time would be reduced. The Bio-Watch system currently has a
robust capability deployed and operating across the United States. The system provides
information back to the local communities, which are responsible for developing a consequence
management plan. The CDC provided a template for such a plan, but because every city is
unique, it is left up to the cities to develop their own plans.
The Bio-Watch system is not perfect, but it is an appropriate technology for what it is intended
to accomplish. The system will be reconfigured as necessary based on lessons learned and table-
top exercises.
World Trade Center: Lessons Learned, and Personnel Protection and
Training
Session moderator Mr. Larry Reed, with NIEHS, spoke briefly about some of the lessons learned
in the aftermath of the WTC attack regarding collaborations, after which five additional
speakers discussed worker protection issues and the importance of immediate response
collaborations. A panel discussion including an audience question and answer period followed
the presentations.
World Trade Center Lessons Learned and the Interagency Collaboration
Mr. Larry Reed, with NIEHS, began this session with a discussion of some of the lessons learned
regarding collaboration in the context of the responses to the WTC attacks. Federal agencies
involved in the aftermath of the September 11th terrorist attacks included EPA, DHHS, the
Agency for Toxic Substances and Disease Registry (ATSDR), NIEHS, NIOSH, the Occupational
Safety and Health Administration (OSHA), the Federal Emergency Management Agency
(FEMA), and their state and city government counterparts.
Difficulties encountered with interagency collaboration stem from differing agency cultures,
communication methods, stakeholders, and missions as well as agency "tunnel vision." The
benefits of collaboration include leveraged resources, faster transfer of knowledge, and more
supportive stakeholders through consistent communications (i.e., the public received the same
information from all agencies involved). An Interagency Task Force was established to assist in
data analysis and interpretation. The Task Force was led by representatives from EPA, ATSDR,
and OSHA, and shared information as well as database and website development. Other areas
for collaboration include evaluation of longer-term impacts, assessment of health impacts, and
first responder training. Of final note was that all disaster response should be conducted in
compliance with the requirements of OSHA under 29 CFR 1910.120.
9/11 Lessons Learned for Worker Protection
Mr. Joseph Hughes, Jr., with NIEHS and Mr. Bruce Lippy, with the National Clearinghouse for
Worker Safety and Health Training, discussed the conditions and problems encountered at the
WTC site with regard to worker safety and training.
Mr. Joseph Hughes, Jr., Director of the NIEHS Worker Education and Training Program, noted
that being onsite at the WTC presented an opportunity to be in a position to examine site safety
and health plans, and to be able to examine the use of PPE and environmental monitoring
practices. An onsite safety and health training program was put together for approximately 4,000
workers. Since the WTC experience, NIEHS has conducted workshops regarding lessons
learned from that disaster as well as workshops focused on preparedness. The NIEHS Worker
Education and Training Program was also involved in training the anthrax cleanup crews who
worked onsite at the Hart Building and the Brentwood Post Office.
Mr. Bruce Lippy, with the National Clearinghouse for Worker Safety and Health Training, discussed
worker conditions at the WTC. Arriving about one week after the WTC collapse, his focus was
to ensure protection of equipment operators. OSHA distributed over 120,000 respirators yet only
about 20 percent of the personnel wore the respirators, and there was no appreciable amount of
respirator fit testing until about 36 days later. This pointed to the need for skilled support
personnel.
Other lessons learned included the need to:
• Improve command, control, communications, and coordination during disaster response
• Provide better protection of support personnel at disaster response sites
• Provide pre-incident training
• Establish effective injury and illness surveillance and exposure monitoring at disaster
response sites.
In addition, training input for disaster response needs to be continuous.
Immediate Response and Collaboration: EPA Region Perspective
Dr. Mark Maddaloni, with EPA Region 2, discussed the physical and chemical challenges
encountered after the WTC attack.
The initial hurdles after the response were physical challenges, such as the facility building being
closed for three weeks following September 11, 2001. There were also staffing constraints,
including the availability of key personnel. Re-location of people to Edison, New Jersey was
difficult and chaotic. In addition, communication was very difficult as a result of limited
computer access and telephone service.
The chemical challenges encountered included:
• Management of real-time data, QA/QC, limited laboratory capacity, and data interpretation
• Difficulty in assessing acute toxicity criteria
• Ability to address chemical mixtures
• Limited sampling data and experience with certain types of sampling such as particulate
matter (PM)
• Anticipated exposure duration, since tens of thousands of people were allowed to come within
one block of a disaster area.
Solutions to these challenges included the use of cell phones, the use of libraries, reaching out to
sister agencies, and conference calls to communicate results.
Immediate Response and Collaboration: ATSDR Perspective
Mr. Sven Rodenbeck, Section Chief for the Superfund Site Assessment Branch at ATSDR,
discussed the difficulties encountered in conducting sampling at the WTC. Multiple national
response teams were deployed, and at the four first aid stations positioned around the WTC site,
over 9,500 responders were seen and/or treated. There were questions about what was inside of
the buildings (especially the residential buildings) that was now being blown around. While
there was limited residential sampling conducted, a limited investigation was performed to assess
the need for follow-up sampling. A total of 34 buildings were sampled in November and
December 2001.
ATSDR assisted with this sampling effort, which was focused on the main constituents of the
buildings such as concrete, wallboard minerals, asbestos, and fibers. There were higher levels of
mineral and fibrous materials in the settled material, and it was estimated that there would be a
higher risk of lung cancer as a result of exposure if the exposure concentrations remained the
same. This was not the case since the site was cleaned up.
Current directions for ATSDR in this process involve the WTC Exposure
Registry. This registry will be used to compare exposure with background conditions and also
will follow up on mental health issues. In addition, a Rapid Response Registry will support
response to future terrorist events by identifying those who may have been exposed in an attack
within hours of the event. This will also provide a mechanism for later follow-up on health and
mental health effects.
Longer Term Response and Collaboration: NIEHS Perspective
Dr. Claudia Thompson, Program Administrator for the NIEHS Superfund Basic Research
Program, discussed NIEHS WTC research activities. Investigators were at the WTC site very
soon after the attack, and within the first few months afterwards were collecting onsite dust
samples and beginning studies. This involved a collaborative effort on the part of different
universities, the EPA, and state and local Departments of Health.
WTC-related activities involve exposure, modeling, and health effects (e.g., respiratory effects,
pregnancy outcomes, and developmental effects). Examples of collaborative efforts include the
Public WTC Exposure Database, WTC brochures, community forums, publications, and joint
scientific planning meetings. Also, publications of the activities at the WTC site have begun to
be released.
NIEHS received an additional $4.5 million for research. This includes future opportunities for
collaboration on homeland security research, including:
• Developing a program for public health preparedness and physician and nurse training on
environmental medicine to include the creation of preparedness teams and development of
training courses
• Developing a basic and applied research program in chemical terrorism as it impacts human
health and the environment.
Evaluation of Health Effects of Cleanup and Recovery Workers at the World Trade
Center Disaster Site
Dr. Alison Geyh, Assistant Professor in the School of Public Health at Johns Hopkins
University, discussed the partnerships that aided in the evaluation of health effects resulting from
exposure to the WTC site. The initial exposure assessment led to many questions, which led in
turn to a very large study being conducted on those individuals involved in the WTC cleanup
efforts. This study, funded by the NIEHS since October 2002, is being conducted through strong
partnerships with New York and Columbia Universities, labor unions (teamsters, engineers,
laborers international), state government agencies such as the New York City Departments of
Health and Sanitation, the EPA, and health care facilities.
Monitoring stations were placed around the WTC site and on people to generate a data set that
includes data from right after the terrorist attack through several months later. This database will
be made available soon and will include data from the partner organizations. Health assessments
of cleanup and recovery workers are being conducted by Johns Hopkins School of Public Health
and Columbia University.
World Trade Center Assessment Report
Mr. Herman Gibb, with the National Center for Environmental Assessment (NCEA), discussed the
WTC Assessment Report. This report was compiled at the request of EPA Region 2, which has
responsibility for New York City, and was reviewed and released on the EPA website in
December 2002. The WTC Assessment Report focuses on outside measurements of exposure
with some discussion of indoor exposure, general population exposure with some discussion on
worker exposure, and relationships to air concentration benchmarks. Data sources include the
EPA website, EPA's national health and environmental effects research, and background
concentrations among others.
There were three principal findings of the report. First, persons exposed to high levels of
ambient PM and its compounds are at risk for immediate acute (and possibly chronic) respiratory
and other symptoms, such as cardiovascular effects. Second, some health effects cannot be
determined effectively for the September 12-23, 2001 time period because some of the
contaminants were not measured until September 23, 2001. Third, the surrounding community is
unlikely to suffer short- or long-term effects, except for the first few days after the attack when
the concentrations were at their highest.
Lessons learned from this assessment include:
• Health guidance for acute and sub-chronic exposures is needed
• Beginning sampling as early as possible after an event is important
• Earlier and more extensive indoor sampling is helpful
• Monitoring objectives need to be clearly defined
• Measurement techniques need to be identified (for example, dioxin).
Another lesson learned is that risk communication is a major issue for such a response.
Comments were made that it would have been helpful for the government to speak with one
voice rather than having different agencies say different things.
Panel Discussion/Questions and Answers
The speakers had an opportunity to participate in a brief panel discussion drawing on questions
from the audience.
A brief question and answer period addressed a range of topics. These included: (1) the
examination of brominated contaminants; (2) the compilation of a responder/worker competency
list; (3) determination of adequate training time; (4) the use of privately collected data; (5) a
process for compiling data and other information from government agencies; and (6) the
potential to normalize data.
Preparing for Bioterrorism Threats in Water
Following opening remarks by Dr. Jafrul Hasan, with the Office of Science and Technology, and
Mr. Chris Zarba, with the National Center for Environmental Research (NCER), five speakers
addressed security and detection technology research as they relate to bioterrorism threats in
water. A panel discussion including an audience question and answer period followed the
presentations.
Dr. Jafrul Hasan, with the Office of Science and Technology in the Office of Water, provided a
brief session overview and noted the EPA contributions to homeland security. Of particular note
is the need to improve analytical monitoring and detection in drinking water systems as well as
the need to define existing technologies to support this. Gaps and opportunities exist, such as
addressing bacteria, viruses, and toxins, with the potential challenge of facing a disinfectant-
resistant virus introduced into the water supply.
Mr. Chris Zarba, with NCER, provided a perspective on biological effects, including the need to
draw not only on in-house expertise but also to reach outside of EPA. The anthrax scare of 2001
alone involved four contaminated letters, potentially three other letters never recovered, 23
contaminated locations in the United States, over 30,000 samples taken in the Hart Senate Office
Building alone, over 30 tons of decontamination waste that had to be managed as hazardous
waste, about $130 million spent to clean up the Brentwood Post Office alone, and almost
$1 billion spent so far for all anthrax site cleanups. Thus, there are good reasons for biological
threats to be high on the list of threat considerations.
NHSRC's Water Security Research and Technical Support Program
Mr. Jonathan Herrmann, with the NHSRC, discussed the operating principles and the scope of
the Water Security Research and Technical Support Program. The goal of this research program
is to provide, within three years, appropriate, affordable, reliable, tested, and effective
technologies and guidance for preparedness, detection, containment, decontamination, and risk
assessment in response to chemical and biological attacks on buildings and on water systems. The key principles of this
research program are short-term, high-intensity, applied efforts; an understanding of and focus
on user needs; targeting key knowledge gaps; producing high quality, useful products quickly;
and partnering with ORD, EPA, other agencies, and the private sector. The overall homeland
security strategy involves critical infrastructure protection, preparedness, response, recovery,
communication, and information as well as protection of EPA personnel and infrastructure.
The scope of the Water Security Research and Technical Support Program encompasses physical
threats as well as biological, chemical, and radiological contaminants in drinking water and
wastewater systems. Facilities available to support this research include the Biological
Containment Facility (a level-3 facility), a small drinking water pilot plant, the Test and
Evaluation Facility in Cincinnati, Ohio, and a water distribution simulation facility.
The overall approach of this research program involves identification, detection, containment,
treatment, decontamination, disposal, risk, and information sharing. An Action Plan has been
developed for this research program; it is five chapters long, with Chapter 3, focused on
drinking water, being the heart of the plan. The Action Plan also addresses physical and cyber
protection as well as wastewater systems, rapid risk assessment, and technology verification.
The three areas of emphasis in the rapid risk assessment approach are rapid risk assessment
following an event, risk assessment of defined threat scenarios, and long-term risk assessment
research. The roles of the rapid risk assessment are to help prepare responders to address health
issues related to water threats and to participate in the response with expert risk assessment
information.
Technology verification activities in terms of water security include monitoring and detection
(e.g., cyanide and toxicity), point of use treatment (such as reverse osmosis filtration and
ultraviolet irradiation), drinking water system decontamination, and treatment of
decontamination waters.
Key collaborators supporting this research program include:
• Office of Water, OPPTS, Office of Radiation and Indoor Air, and OSWER
• EPA Regional Offices
• United States Army's Edgewood Chemical and Biological Center
• Food and Drug Administration's Forensic Chemistry Center
• Air Force Research Laboratory
• Metropolitan Water District of Southern California
• United States Geological Survey (USGS)
• CDC
• United States Army Corps of Engineers
• DOE's National Laboratories
• National Science Foundation (NSF).
Ms. Grace Robiou, with the EPA Water Protection Task Force, discussed the role of the Water
Protection Task Force in water infrastructure security, which includes:
• Assisting the water sector in understanding the threats to water security
• Helping utilities assess their vulnerabilities to possible attack
• Providing tools based on the best scientific information and technologies to assess risk and
respond in the event that an incident occurs.
The types of threats that are of concern to the Water Protection Task Force include biological,
chemical, and radiological attacks; physical destruction or damage; cyber attack; and interruption
of interdependent activities such as fire suppression, electricity, and/or transportation.
The three projects highlighted for consideration are agent prioritization, the development of a
response protocol for contamination threats to drinking water, and the assessment of laboratory
capabilities. The objectives for agent prioritization are to conduct technical activities in support
of the development of procedures for analysis of unknowns in water and of the water security
research plan. Agents posing a threat to water include pathogens (e.g., protozoa, bacteria, viruses),
biotoxins (e.g., plants, algae, bacteria), chemicals (e.g., pesticides), and radionuclides (sealed
sources).
The objectives for developing a response protocol for contamination threats to drinking water are
to provide a framework of considerations and procedures to guide the response to a water
treatment threat, and to focus on what utilities, laboratories, or emergency responders need to
consider in preparation for an event.
The objectives for the assessment of laboratory capabilities are to identify laboratories able to
implement analytical response protocols for unknown contaminants in water and laboratories
with basic capabilities to support water utilities in an emergency.
Potential Technologies for Detection of Biological Threats in Water Supplies
Dr. John Ezzell, a Senior Scientist with the United States Army Medical Research Institute of
Infectious Diseases, discussed various technologies used in the detection of biological threat
agents in water. The currently recognized biological warfare agents were selected many years
ago because of the ability of these agents to be stabilized for weaponization. These agents are
infectious as aerosols and can be produced in mass quantities. "Fear" is the key word in
bioterrorism.
Waterborne infectious diseases (acquired by ingestion) include viruses, bacteria, and protozoa
such as Salmonella, E. coli, and Cryptosporidium. It is very difficult to monitor for a broad range
of pathogens, and small facilities do not have the staff, equipment, or trained individuals to
monitor any better than they are currently doing.
When samples are in the fluid state, they can be moved into other types of technology such as
polymerase chain reaction, immunoassays, and cultures. Automated sample processing systems,
such as GeneXpert, are useful, and instrumentation currently being used by the CDC includes the
Threshold and the Bio Threat Alert test strip. Immunological assays that may be applied to water
testing for biological threats feature dried-down chemistries, broad dynamic ranges, and 30-
minute assay times. Other tests that may be applied include the enzyme-linked immunosorbent
assay and the fluorescent antibody assay.
In addition, sentinel types of approaches may be necessary. For example, if a change in pH
results in a change in a protein, this may serve as an indicator of the need for further
testing. Research efforts need to look for such common denominators.
"Early Warning Monitoring" and Sensor Technology Development
Ms. Janet Jensen, Project Manager for the Joint Service Agent Water Monitor (JSAWM)
Program with the United States Army Soldier and Biological Chemical Command, discussed the
JSAWM program, which was designed to develop advanced capabilities to detect, identify, and
quantify chemical and biological contaminants in source, treated, and distributed consumer water
supplies. Research program components include sensor technology, new models, working with
USGS in surface and ground waters, and working with EPA in product waters.
The goals are rapid results and the ability to work in different kinds of waters. The program
involves the systems concept with technological modules that are "plug and play" as well as easy
to update. Currently, there is no single commercially available product that meets all of these
needs.
The program plan is currently in the testing phase for biological agents, and candidate
technologies are being sought. The program plan has successfully passed peer review and is
scientifically sound. Upcoming efforts include sensor simulation, proof-of-concept for
reagentless detection, and development of a database of processes entitled Tech Watch.
Detection of Biological Agents in Water
Dr. Alan Lindquist, Technical Lead for Detection of Contaminants of Concern in the Safe
Buildings and Safe Water Programs at NHSRC, discussed the detection of agents of biological
terrorism in water and an approach for moving the science forward.
Currently, there is no written approach for detection of biological agents in water and protocols
have not been tested. The analytical technology currently available includes concentration
followed by use of modified classical models, concentration followed by use of molecular
methods (polymerase chain reaction-based), antibody tests, and black boxes.
What should be available includes:
• Written protocols that cover all aspects of detection from sampling to the interpretation of
results
• All pertinent information, including but not limited to supplies and suppliers, standards, and
training requirements
• Appropriate quality control
• Protocols that have been tested in multiple laboratories using realistic challenges.
Moving the science forward involves the development of a draft protocol that undergoes peer
review and laboratory testing. The protocol should address such topics as large volume
sampling, field concentration, and laboratory testing for bacteria, viruses, and protozoa of
interest, including both presumptive molecular testing and classical methods.
There are a number of advantages and disadvantages associated with various types of
approaches, such as:
• Rapid field screening cannot yet be recommended because it does not include analytes
specific for water.
• Molecular assays give presumptive results, yet benefits of this technique are the potential for
automation and specificity to the organisms of interest. Weaknesses include viability and
unknown sensitivity.
• Classic assay techniques consist of cultures, antibody detection for protozoa, and cell cultures
for viruses. Advantages include the availability of validated protocols, which are generally
"reference" methods. Disadvantages include limited sensitivity, the potential for low specificity,
long time requirements, and the lack of assay methods for some organisms.
Immediate actions and future directions include:
• Drafting, testing, laboratory validation, peer review, and dissemination of the draft protocol
• Evaluating commercial technologies
• Research to fill weak areas of the protocol
• Research on alternative methods including rapid assays, diffuse monitoring systems, and
biologically based monitoring.
Panel Discussion/Questions and Answers
The speakers had an opportunity to participate in a brief panel discussion drawing on questions
from the audience.
A brief question and answer period addressed a range of topics. These included the availability
of information on the ETV website, and concerns about mailing tests.
Section IV: Moving Science Into Action
Tuesday and Wednesday, May 6-7, 2003
The purpose of this session on the second and third days of the meeting was to focus on ongoing
projects, activities, and initiatives of current or anticipated databases, models, and decision
support tools on a national, regional, state, local, and tribal level. This included several pilot
projects, partnerships, and communication efforts supported by EPA as well as pertinent needs
and uses of scientific data to assess environmental conditions and human health risks. Each
session included a panel discussion and opportunities to respond to audience questions that
provided additional information and insight on a variety of emerging technology topics.
Dr. Betsy Smith, with the National Exposure Research Laboratory (NERL), led a session
presenting the Regional Vulnerability Assessment (ReVA) tool and its applications to
environmental analysis and decisionmaking. Presentations included an overview of the ReVA
tool, partnerships in ReVA development, the functions of the web-based tool, an overview of a
regional and intergovernmental Sustainable Environment for Quality of Life (SEQL) project, and
examples of ReVA application at the state level focusing on initiatives underway in the State of
Maryland.
Mr. Gilberto Alvarez, with EPA Region 5, led a session addressing scientific projects involving
partnerships with state and local governments to provide scientific insight to support
decisionmaking. Presentations included determinations of water quality from salmon migration
in the San Francisco Bay area, conducting integrated environmental planning to address
urbanization and land use expansion efforts in the Central North Carolina area, and efforts of the
Michigan Environmental Science Board (MESB) in protecting children's health.
Dr. Michael McDonald, Director of the Environmental Monitoring and Assessment Program
(EMAP), led a two-part session providing highlights of EMAP and its application. Presentations
included an overview of EMAP and the EMAP Western Pilot program, EMAP capabilities and
uses in state programs featuring examples from California, EMAP uses by the Nez Perce tribe,
an overview of the National Coastal Assessment Program, application of EMAP to the Southern
California Coastal Water Research Program (SCCWRP), the role of the National Coastal
Assessment Program in supporting South Carolina Estuarine and Coastal Assessment Program
activities, and the application of EMAP and REMAP to CWA compliance initiatives within EPA
Region 2.
Mr. Thomas Baugh, with EPA Region 4, led a session addressing diverse scientific initiatives
and other tribal activities to understand and address environmental issues. Presentations
included investigations into pesticide use and exposure at the Big Valley Rancheria, tribal
partnerships to investigate and minimize exposure to environmental contaminants from military
sites and other sources at the St. Lawrence Island in Alaska, and Swinomish tribal initiatives to
address contamination of shellfish, a critical subsistence food item.
Ms. Pamela Russell and Mr. Mike Flynn, with the EPA Office of Environmental Information, led
a session addressing diverse Federal/state partnerships for data acquisition and analysis, the use
of data collected from the Toxics Release Inventory (TRI), and web-based tools for
environmental data analysis to determine environmental and human health impacts.
Presentations included diverse applications of data collected for the Toxics Release Inventory
(TRI) program, a Maryland program to integrate state- and county-level stream monitoring
programs, an evaluation of the effects of urban growth on environmental health, and a mapping
tool to visually depict trends in human health and environmental conditions.
In another session, five speakers provided overviews of several regional projects to determine
and protect the environmental conditions of ecosystems. Presentations included the application
of geospatial tools in EPA Region 5 to characterize and rank ecosystem quality, synoptic modeling
in EPA Region 7 to prioritize and rank ecosystems, a partnership for ecosystem study and
protection in EPA Region 4, and the findings and benefits of the Mid-Atlantic Highlands Action
Program.
Dr. John Bing-Canar, with EPA Region 5, led a session addressing tools used in a contaminated
sediments study supporting decisionmaking for remediation. Presentations addressed the
contaminated sediments study and the approaches and tools used for site characterization, initial
sampling design, spatial estimation of contamination, and decision analysis.
Regional Vulnerability Assessment: Improving Environmental
Decisionmaking Through Client Partnerships
Following opening remarks by Dr. Betsy Smith, with NERL, four speakers addressed the ReVA
project, its web-based analysis tool, and applications of the tool to support scientifically-based
decisionmaking. A panel discussion including an audience question and answer period followed
the presentations.
Dr. Betsy Smith, with NERL, provided opening remarks, including an overview of ReVA and its
features and capabilities as a web-based tool available to ORD's client partners. Dr. Smith then
introduced the other speakers in this session.
ReVA's Client Partnerships: Improving Environmental Decisionmaking Through
Applied Research
NERL scientist, Dr. Betsy Smith, described the ReVA project and its benefits to client partners.
This project receives funding from ORD and is a sister program to ORD's EMAP. ReVA uses
monitoring data from EMAP, as well as other programs, to support risk management actions.
ReVA is being utilized in cross-agency laboratories and interagency programs, with support
provided by ORD and its partners, including the USGS, United States Forest Service, and the
Tennessee Valley Authority.
ReVA is an applied research program designed as a flexible framework for use and fine-tuning
by various national, regional, and local decisionmakers responsible for building, sustaining, and
improving their communities while protecting the environment and human health. Current client
partners of the ReVA program include the Centralina Council of Governments, EPA Region 3
Air Protection Division, EPA Region 4 Air Toxics Assessment and Implementation Section, the
Pennsylvania DEP, the Maryland Department of Natural Resources, Baltimore County (in
Maryland), and the Canaan Valley Institute.
ReVA was initiated for the Mid-Atlantic region of the United States. Government agencies and
other organizations in the Mid-Atlantic region have historically maintained various types of data
as a result of many research efforts and environmental initiatives. A primary feature of ReVA is
the ability to manipulate and handle varied types of data; therefore, the Mid-Atlantic region
provided a strong basis for project initiation. ORD has completed its first assessment of ReVA
in the Mid-Atlantic region and will soon begin a second phase of assessment in another region.
ReVA offers a web-based tool and modeling system to help decisionmakers protect the
environment and the human health of their communities by:
• Estimating conditions and exposures for every point on the map, including watersheds
• Identifying current and future vulnerabilities by providing ecological forecasting with the use
of a modeling system
• Enabling trade-off analyses through "what if" scenarios and evaluating different alternatives
with the use of a modeling system
• Linking environmental health with economic and human health for a truly integrated
assessment
• Synthesizing data to determine vulnerabilities, manage like units, and track the completion of
program tasks and goals.
Although EPA continues to clean up historic problems to allow for a healthy environment, ORD
has seen declines in biological populations despite compliance with environmental regulations
set to strengthen those populations. Such declines could be the result of population growth,
changes in land use for mining and timber activities, extraction of rich resources, an increase in
pollutants, and the invasion of exotic species, as well as the cumulative and aggregate impacts
of all of these influences. ReVA enables users to evaluate current problems such as these
and to project future problems using regional modeling.
ReVA also can help to project land use changes, which are the result of economic influences,
planned roads and developments, and rural and urban transformations. These same land use
changes that can benefit a community may also negatively affect pollution, pests and pathogens
in forests, conservation of native biodiversity, flood risk, nonpoint source pollution, urban
sprawl, drinking water quality, and economic opportunities. When evaluating problems and land
use changes, the ReVA program considers air deposition of pollutants, such as sulfates, nitrates,
ozone, and PM; sediment loadings; agricultural chemicals used across the region; total maximum
daily loadings; and forest health and biodiversity.
As an example, the ReVA program can help leaders in the Highlands area of West Virginia
make decisions on economic priorities while considering the environmental effects of increasing
coal mines and chip mills in the region. The Highlands of West Virginia is known as a
globally unique area because of its abundant intact, temperate deciduous forest, which provides
generous habitat for large migratory species. The Highlands area also has one of the highest
unemployment rates within the Mid-Atlantic region, and is targeted by resource extraction
industries because of its coal mining and hardwood forest areas. ReVA can provide
decisionmakers opting for an increase in resource extraction activities with information on the
resulting potential impacts on native biodiversity, water quality, and overall quality of life as
well as on employment rate increases and economic benefits.
ReVA provides a way to integrate and combine data to facilitate understanding of the multiple
criteria that need to be considered in making regional decisions. Decisionmakers can look at
scientific data, but also can consider other aspects such as stakeholders, water quality, effects on
employment rates, changes in the environment, politics, and economics. With its web-based
integration tool, ReVA turns spatial data into easy-to-understand and useful information for
decisionmakers, and enables them to integrate criteria and prioritize using selectable subgroups
of data.
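The kind of multi-criteria integration and prioritization described above can be pictured with a small sketch. This is not ReVA's actual algorithm or data; the indicator names, values, and weights below are hypothetical, chosen only to illustrate how weighted criteria might rank areas for a decisionmaker:

```python
# Hypothetical sketch of weighted multi-criteria prioritization,
# in the spirit of ReVA's integration tool (not its actual method).

def normalize(values):
    """Rescale a list of raw indicator values to the 0-1 range."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) if hi > lo else 0.0 for v in values]

def rank_watersheds(watersheds, indicators, weights):
    """Score each watershed as a weighted sum of normalized indicators
    and return them ordered from highest to lowest score."""
    normalized = {name: normalize(vals) for name, vals in indicators.items()}
    scores = []
    for i, ws in enumerate(watersheds):
        score = sum(weights[name] * normalized[name][i] for name in indicators)
        scores.append((ws, round(score, 3)))
    return sorted(scores, key=lambda pair: pair[1], reverse=True)

# Hypothetical indicator values for three watersheds.
watersheds = ["Upper Branch", "Mill Creek", "Stony Run"]
indicators = {
    "nitrate_loading": [2.1, 8.4, 5.0],    # mg/L
    "impervious_cover": [5.0, 31.0, 12.0], # percent
}
weights = {"nitrate_loading": 0.6, "impervious_cover": 0.4}

ranking = rank_watersheds(watersheds, indicators, weights)
```

With these hypothetical inputs, Mill Creek ranks first because it scores highest on both indicators; changing the weights (the "selectable subgroups" idea) changes the ranking without changing the underlying data.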
ReVA's Web-Based Application: A Tool for Regional, State, and Local
Decisionmakers
President of the Waratah Corporation, Dr. Michael O'Connell, demonstrated the capabilities of
ReVA's web-based integration tool that extracts information from spatial data. ReVA extracts a
signal from noise in the background of spatial data to determine land plots, watersheds, rivers
and lakes, and any other geographical area to be considered.
Users of the ReVA web tool can specify map types to be used in the information gathering
process, including maps that depict water and air influences, communities and residents of
geographical areas, and terrestrial categories. These maps also can be manipulated to create
"weighted maps" that are more influenced by certain indicators than by others. The spatial
map types selected by the user are accompanied by a histogram to show additional information
(e.g., number of watersheds).
The ReVA web tool also has on-line diagnostics to aid users in looking at distributions in more
detail and in integrating data. Users also are able to create a radar plot for the data.
The Sustainable Environment for Quality of Life Program: A Partnership Between
EPA ORD and OAQPS, and State and Local Governments
Project Manager with the Centralina Council of Governments, Ms. Rebecca Yarbrough,
described the SEQL project and its benefits to politicians, land builders, and other leaders in the
Charlotte, North Carolina region. The SEQL project is a team-sponsored initiative between EPA
ORD, the Centralina Council of Governments in Charlotte, North Carolina, and the surrounding
areas. Program partners also include EPA OAQPS and the North Carolina Department of Health
and Environmental Control.
The SEQL project was initiated when Charlotte, North Carolina Mayor Pat McCrory and
other elected officials considered ways to improve the environment in particular regions of the
Charlotte area. Through the SEQL project, leaders and elected officials plan to make a
difference in the quality of life in the Charlotte region; influence intergovernmental collaboration
and cooperation; promote involvement, innovation, and change; and implement regionally-
endorsed environmental initiatives, such as improvements in air and water quality as well as
smart (or sustainable) growth. SEQL project team members also hope to institutionalize
environmental considerations in local and regional decisionmaking.
In the past, no one considered how land use changes, economic growth, and urbanization within
one area impacted another area. Also, as communities grow closer through expansion, there are
more adverse impacts on air and water quality, human health, forests, watersheds, rivers and
lakes, and other environmental areas of concern. The SEQL project addresses several action
items, including environmental education, tree planting ordinances, smoking vehicle
enforcement, stream buffering, retrofitting public vehicles with less-polluting energy sources,
open burning, and ozone awareness. These goals can be achieved by developing and distributing
educational tools (e.g., toolboxes and how-to documents), providing regional and peer support in
implementation, engaging governmental and non-governmental partners in developing
consensus solutions, and establishing a database that permits measuring and reporting successes.
Toolboxes and how-to documents are designed as educational items that explain the goals of the
SEQL project, benefits and costs, and ordinances. The toolboxes and how-to documents also
include required forms for land use and environmental restoration activities, and provide users
with simple step-by-step instructions on how to address action items and goals of the SEQL
project.
Science plays an important role in the SEQL project, and ReVA provides the body of evidence
that is needed to defend decisions on future land use changes, smart growth, and environmental
restoration activities. For example, a politician is asked to develop land in an area near a
watershed, and the new residential community will include one-acre lots. The same politician
also has been asked to approve a budget that includes an extra $0.12 per gallon to switch the
public school buses to ultra-low sulfur diesel fuel, which reduces transportation emissions. The
SEQL project, along with ReVA, can help with these decisions by providing the politician with
scientific data to compare the environmental benefits and costs of ultra-low sulfur diesel fuel
with the effects of land use changes. ReVA is a framework for looking at cumulative impacts
and alternative growth scenarios and can permit leaders to analyze cumulative impacts on a
multi-county basis.
ReVA's Partnership with the Maryland Department of Natural Resources:
Opportunities to Optimize the Future
Director of the Maryland Department of Natural Resources (DNR), Watershed Management and
Analysis Division, Mr. William Jenkins, described specific State programs and activities that can
benefit from ReVA and its web-based tools. Maryland is a relatively small state, with 6.2
million acres. Approximately 70 percent of the State consists of privately owned farmland and
forests, and 18.5 percent is developed. However, projections indicate future development of
approximately 15,000 acres per year for residential and commercial land uses.
With the increased land use, the Maryland DNR hopes to increase the effectiveness and
efficiency of their enhancement and restoration activities, and ReVA can help to achieve these
objectives by providing the best technology and technically sound information. ReVA also
enables Maryland DNR scientists to create desktop and web-based decision support tools,
including automatic geographic information system (GIS)-based analytical functions, simplified
user interfaces, and automatic report generation capabilities. ReVA enables Maryland DNR
researchers to analyze all scenarios in order to address thresholds, vulnerable areas, data
sensitivity, and use of specific indicators for different thresholds. ReVA also enables the
Maryland DNR to focus on emerging management issues and the future impacts of exotic and
invasive species.
ReVA can help the Maryland DNR consider how much watershed restoration or protection is
sufficient, based on the current conditions and future threats, as well as to prioritize watersheds
requiring immediate attention. In 1996, the Maryland DNR developed watershed-based
indicators to assess regional watersheds using environmental and socioeconomic indicators of:
• Resource conditions—defined as one or more aspects of existing environmental quality
• Landscape stress (or vulnerability)—defined as the extent or magnitude of human-induced
activities
• Programmatic response—defined as the extent or effectiveness of programmatic activities.
The Maryland DNR used a comparative watershed assessment program based on a GIS
(ArcView) application in order to combine these indicators and establish threshold values to
produce watershed aggregations for each priority watershed. The ArcView application enabled
researchers to combine and assign weights to indicators, but was unable to provide a statistical
analysis and could not be accessed via an internal network or web browser. Use of ReVA will
help to address these drawbacks.
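The combine-and-weight step described above can be sketched in a few lines. This is a hypothetical illustration: the indicator names, weights, and threshold value below are invented for the example and are not the Maryland DNR's actual parameters.

```python
# Hypothetical sketch of a weighted-indicator watershed assessment.
# Indicator names, weights, and the threshold are illustrative assumptions.

def watershed_score(indicators, weights):
    """Combine normalized indicator values (0-1) into a weighted score."""
    total_weight = sum(weights[name] for name in indicators)
    return sum(indicators[name] * weights[name] for name in indicators) / total_weight

WEIGHTS = {"resource_condition": 0.5, "landscape_stress": 0.3, "programmatic_response": 0.2}
PRIORITY_THRESHOLD = 0.6  # scores above this flag a priority watershed

watersheds = {
    "Watershed A": {"resource_condition": 0.9, "landscape_stress": 0.7, "programmatic_response": 0.2},
    "Watershed B": {"resource_condition": 0.3, "landscape_stress": 0.4, "programmatic_response": 0.8},
}

# Rank only the watersheds whose combined score exceeds the threshold.
priorities = {
    name: watershed_score(vals, WEIGHTS)
    for name, vals in watersheds.items()
    if watershed_score(vals, WEIGHTS) > PRIORITY_THRESHOLD
}
```

A statistical layer and web delivery, which the ArcView application lacked, would sit on top of an aggregation like this.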
The Maryland DNR also initiated a GreenPrint Program to preserve the State's green
infrastructure and to safeguard the State's most valuable ecological lands. With the green
infrastructure assessment program, Maryland DNR created a funding mechanism to support
protection of ecological components using a landscape model, but the program was limited to a
coarse-scale analysis and an incomplete range of ecosystem elements and features with
available GIS data. The
GreenPrint Program focused on hubs (large contiguous blocks of natural resource lands) and
corridors (ecological routes between hubs). ReVA can help to evolve the model to allow for
analyses at various scales with combined and weighted parameters. ReVA also can help the
Maryland DNR to combine the current watershed and landscape assessment tools.
The Maryland DNR also hopes to use ReVA applications to enhance the Surf Your Watershed
project, a cooperative effort with the Maryland Department of the Environment. Surf Your
Watershed is a tool to catalog important environmental, socioeconomic, and programmatic
information on a watershed basis. The catalog provides a list of selected watershed indicators for
Maryland and allows the user to select an indicator in order to view a map that represents the
data.
In working with the ReVA project, the Maryland DNR hopes to:
• Identify and develop analytical tools and indicators necessary to interpret stressor-receptor
relationships at different spatial scales
• Develop scientifically defensible methods for assessing watershed and landscape sensitivity,
condition and function, and threshold values for indicators at different spatial scales
• Create the capability to make analytical tools and data available to the resource
decisionmakers and the public via the Internet
• Enhance DNR's capability to integrate economic and ecological information through the
creation of decision support systems
• Establish a long-term working relationship with other regional, state, and national
environmental protection programs.
Panel Discussion/Questions and Answers
The speakers had an opportunity to participate in a brief panel discussion drawing on questions
from the audience.
A brief panel discussion addressed a range of topics. These included: (1) approaches for
reaching out to and educating students and the public, (2) allowing easy access to the ReVA web
tool so that universities and smaller organizations can use the tool, (3) including either
qualitative or quantitative predictions as well as statistical methods, (4) data quality and data
quality guidelines, and (5) utilizing the ReVA web tool to enforce or influence environmental
regulations, specifically total maximum daily loadings of pollutants in watersheds and
waterways.
Partnership With State and Local Government
Following opening remarks by Mr. Gilberto Alvarez, with EPA Region 5, three speakers
presented ongoing projects that are prime examples of partnerships between EPA, state or
regional agencies, and other organizations. A panel discussion including an audience question
and answer period followed the presentations.
Mr. Gilberto Alvarez of EPA Region 5 provided the opening remarks for this session, along with
an overview of the three examples of successful regional and state projects presented in this
session. Mr. Alvarez then introduced the other speakers.
Delta Cross Channel Gate Operation on Water Quality and Migration of Juvenile
and Adult Salmon in Northern California
Dr. Herbold, with EPA Region 9, described studies of water quality and the migration of
salmon. The Delta Cross Channel is a controlled diversion channel that diverts water from the
Sacramento River into the Snodgrass Slough, near the San Francisco Bay area in California. The
opening of the Delta Cross Channel protects water quality at export pumps from salinity
intrusion.
In Fall 2001, scientists from EPA Region 9, the United States Fish and Wildlife Service, the
USGS, and the California Department of Fish and Game, along with other collaborators, released
large groups of adult Chinook salmon with individual tracking devices into the Delta Cross
Channel. Scientists conducted this release with an understanding of how the status of the
channel gates (open or closed) affects migration: when the channel gates are closed, salmon
smolts remain on track to the ocean; however, closed gates also cause up-migrating salmon
adults to stray. Therefore, the scientists worked with the channel gates both open and closed,
and on both ebb and flow tides.
The study worked to answer the following questions:
• How does the cross channel affect interior delta water quality?
• How does the cross channel affect adult migration through the delta?
• How does the cross channel affect smolt passage?
A series of experiments were conducted to answer these questions. The first study examined the
migratory pathways of salmon along the Sacramento River and the San Joaquin River. Salmon
were released at the Montezuma Slough along the Sacramento River and at Jersey Point along
the San Joaquin River. Study results showed that 84 percent of the tagged salmon traveled along
the Sacramento River, while 16 percent traveled along the San Joaquin River.
Another study, the Delta Cross Channel Fish Passage Study, used hydrodynamic instruments to
detect the pathway and numbers of fish moving down the Sacramento River and into the Delta
Cross Channel. Researchers released 120,000 adult Chinook salmon in the Sacramento River,
approximately three miles upstream from the Delta Cross Channel. After the gates of the
channel opened, scientists collected 1,282 of the released fish, and approximately 91 percent
were captured in the Sacramento River downstream from the channel. The majority
(approximately 99 percent) of the salmon were captured at night when the tides were high.
Therefore, the researchers concluded that nearly all water quality benefits were obtained when
the channel gates were open and that the fish followed the tides of the Sacramento River. The
location of the captured salmon reflected past hydrodynamic study results.
Another team led by Dave Vogel, with Natural Resource Scientists, Inc., also studied juvenile
salmon in the Delta Cross Channel and those experiments resulted in the same conclusions. In
addition, the use of radio equipment to study the salmon, which were injected with antennas, led
to more confident, multi-dimensional data because depth locations of the fish were also
identified.
Integrated Environmental Planning Across Two States, 15 Counties, and 36
Municipalities: Do You Believe in Miracles?
Dr. Linda Rimer, with EPA Region 4, discussed new threats to the environment and human
health resulting from urban sprawl. The mission and goals of EPA Region 4 include
conducting research, monitoring, and modeling; establishing policies and setting standards;
developing rules; writing permits; and conducting inspections. Science has taught regulators
and decisionmakers that the approaches used for growth and urban development have adversely
affected the environment, in part because most decisions are made at the local level. Direct and
indirect effects of local land use decisionmaking lead to reductions in water quality, water
quantity, and air quality. Recent research documents threats to human health, as
well as the natural environment, as a result of poor, localized decisionmaking.
Generally, the quality of human health is a focus of many politicians; however, the quality of the
environment often is not. EPA Region 4 has initiated some early steps to highlight the
importance of the environment for decisionmakers. For example, EPA Region 4 established a
smart growth network, the Sustainable Urban Environment (SUE) Program, and State
Implementation Plan (SIP) credits for land use planning and energy conservation.
Many of these integrated, regional-based programs were piloted in the Centralina regional area
(i.e., central North Carolina). Mecklenburg County, North Carolina, is a non-attainment area for
the 1-hour ozone standard, and the rivers and waterways of the City of Charlotte discharge into
the South Carolina region. Region 4 helps to support a Charlotte/Rock Hill Project that utilizes a
"toolbox" to educate leaders and decisionmakers on air, water, and land use (or smart growth)
initiatives and approaches. Region 4 also supports the SEQL Program.
With these programs, EPA Region 4 works towards its goals of creating a model to be replicated
across the United States to influence politicians and decisionmakers to consider the quality of the
environment as a critical priority when initiating land use changes. The science to support these
goals can be established by documenting the impact of the built environment on the natural
environment, documenting the impact of the built environment on human health, modeling the
impacts of the interventions, and providing the model-based decision support tools to local and
state governments.
Michigan Environmental Science Board and Protecting Children's Health
Dr. Keith Harrison, Director of the Michigan Department of Environmental Quality (MDEQ)
Office of Special Projects, provided an overview of the Michigan Environmental Science Board
(MESB) and the goals of partnering with state and local governments in order to protect
children's health. The sole MESB mission is to
provide the Governor of Michigan with advice and recommendations on environmental issues
based on sound science. The MESB is comprised of a variety of scientists and researchers who
are called upon at the request of the Governor. To date, the MESB has submitted 17 reports on
specific issues, such as human health and environmental impacts of mercury, chlorine, and lead
contaminants; the human health impact of low-level hydrogen sulfide exposure; and cancer
trends among firefighters. The MESB also has proposed a uniform fish advisory for use with the
Great Lakes; a list of environmental indicators to be used to assess the overall state of the natural
environment in Michigan; and Michigan-specific generic cleanup criteria for indoor air
inhalation at sites of environmental contamination, low-level radioactive waste isolation facility
siting criteria, and environmental standards as they relate to children's health.
An example is the February 2000 document on the Children's Environmental Standards
Investigation. This document addressed the MESB's challenges from the Governor: (1) to
identify and prioritize the environmental standards that may need re-evaluation as a result of
either outdated and/or limited scientific data; and (2) to indicate, where possible, the nature of
the type of research to be undertaken to address any identified deficiencies. The children's health
document is available at www.michigan.gov/mesb.
The MESB found that risk assessment methodology currently used by the MDEQ to evaluate the
level of risk from exposure to specific environmental contaminants closely corresponds to that
currently used by EPA. The methodologies of both agencies explicitly consider children when
data are available for the specific contaminant under consideration. However, neither
methodology incorporates a standardized process to account for possible increased risks in
children. Instead, the two agencies rely on scientific judgment based on available information
and literature. A large body of data exists in relation to adult exposures to contaminants, but
few data distinguish infants and children from adults. Considering these results, the
MESB determined that there is not a compelling scientific rationale for an additional, distinct
safety factor to account for exposures of infants and children.
The MESB also determined that the public health goals of specific MDEQ standards are difficult
to maintain when exposures are beyond MDEQ regulatory authority (e.g., indoor air pollution),
either because they are currently unregulated or because similar exposures are allowed under
other State or Federal regulations. The MESB recommended that MDEQ and EPA re-evaluate
current risk analysis methodologies for addressing and communicating risk to the public. Finally, a
process was recommended to keep abreast of pertinent scientific literature and research relating
to children, as well as cancer risk assessment, uncertainty factors in non-cancer risk assessment,
contaminant mixtures, cumulative risk, indoor and outdoor air contamination, and soil exposures.
Panel Discussion/Questions and Answers
The speakers had an opportunity to participate in a brief panel discussion drawing on questions
from the audience.
A brief panel discussion addressed a range of topics. These included: (1) approaches to get local
governments to act regionally, (2) ways to educate politicians and elected officials on the
importance of the environment and programs or tools available to them, (3) conflicts of interest
or differences of opinion within the MESB, and (4) other groups throughout the United States
that are similar to the MESB.
Advancing Science Through Environmental Monitoring and
Assessment Program (EMAP) Partnerships
Following opening remarks by Dr. Michael McDonald, Director of EMAP, eight speakers
provided highlights of EMAP and its applications as well as current and future initiatives
involving this web-based tool. The Environmental Monitoring and Assessment Program (EMAP)
is an ongoing EPA project that supplies scientists and researchers with tools to better estimate
regional environmental indicators in order to assess environmental conditions. Panel
discussions, including an audience question and answer period, followed the early and late
afternoon sessions.
Dr. Michael McDonald, Director of EMAP, provided opening remarks and an overview of
EMAP, its features, and capabilities as a web-based tool. Dr. McDonald then introduced the
other speakers in this two-part session.
EMAP-West: Introduction
Dr. Blair, with the National Health and Environmental Effects Research Laboratory (NHEERL),
provided an overview of EMAP and the EMAP-West initiatives. EMAP aims to estimate the
current status and trends of selected environmental indicators on a regional basis, estimate
geographic coverage and extent, seek associations between indicators and stressors, and provide
the tools to prepare annual statistical summaries and periodic assessments. Additionally, EMAP-
West hopes to establish a framework for designated uses, develop indicator estimates that can be
used in data sets critical to defining quantitative biocriteria, and provide data for models to
support the 303d listing/delisting process. EMAP-West also offers surface water tools to support
sample survey design, estimation of ecological indicators, and establishment of reference conditions.
When establishing indicators, researchers must individually evaluate indicator criteria by
addressing the following questions:
• How can we realistically get this sampling done?
• How can we best measure it (how far upstream or downstream)?
• How responsive is it (e.g., is it going to react to the stressors)?
• How variable is it?
• Can we score it?
Scientists must also consider data management while collecting data for indicators and/or
stressors. Researchers should consider the importance of having full and open sharing of data,
continuously updated systems that support environmental assessments, and consistent databases
ready to accept data from coastal, surface water, and landscape components across the country
(i.e., using the STORET archival system).
EMAP and EMAP-West house the tools to address indicators and stressors, data management,
and many other issues. EMAP-West partnerships between EPA, the states, and the Native
American tribes strive to create unbiased estimates of the condition of ecological resources, such
as streams and rivers, to establish comparative rankings of stressors, to produce tools for
biocriteria, and to support a framework for the 303d process.
The EMAP Western Pilot in Region 8
Mr. Karl Hermann, with the EPA Region 8 Ecosystem Protection Program, discussed the EMAP
Western Pilot and associated activities involving surface waters, landscapes, stakeholder
engagement, and ecological assessment of condition. The objectives of the EMAP Western Pilot
project include producing a regional assessment of the ecological conditions of streams in EPA
Region 8; developing partnerships with EPA ORD, states, tribes, and the USGS; and improving
technology transfer to the states and tribes.
In 2004, members of the EMAP Western Pilot project will work with the USGS to collect data
from the Yellowstone National Park basin. Other future EMAP assessment efforts will include
studies in the Southern Rocky Mountains (in Montana), other areas of Yellowstone National
Park, and other rivers.
Perspective from the State of California
Dr. James Harrington, with the California Department of Fish and Game, discussed the long
history of collaboration with EPA involving studies of biocriteria. The current goal of the
California Department of Fish and Game is to make a conscious effort to include both biology
and toxicology in the State's water programs and in methods of water quality monitoring.
California has several water quality boards; however, there is little interaction between them.
The California Department of Fish and Game is developing biocriteria for California. Efforts
include building an aquatic bioassessment laboratory and establishing the California Aquatic
Bioassessment Workgroup (CABW). The Department also hopes to develop and promote
standardized field and laboratory protocols, and to promote the development of an Index of
Biological Integrity (IBI). In order to achieve these goals, the Department works with the Chico
State Taxonomy Laboratory, Chico State Research Foundation, and the Rancho Cordova
Laboratory. These laboratories work together to conduct research, taxonomy, EMAP and field
studies, and enforcement support.
The CABW was established in 1994 to create a forum to communicate and exchange aquatic
bioassessment information. Since its inception, the CABW has finalized the California Stream
Bioassessment Procedures manual, formulated the process for developing biocriteria in
California, and provided a forum for updating attendees on bioassessment.
To support the efforts of developing and promoting standardized and consistent field and
laboratory protocols, the California Department of Fish and Game used the California Aquatic
Macroinvertebrate Laboratory Network, developed an inter-laboratory QA/QC program, and
modeled the CalEDAS Database development.
In the future, the California Department of Fish and Game hopes to develop a new cooperative
agreement with EPA to work with EMAP-West data, start a new California Regional Ecological
Assessment Program, complete regional reference condition and IBI development, and work on
tiered aquatic life uses standards and use attainability analysis guidelines.
EMAP Tribal Perspectives
A scientist with the Nez Perce Tribe, Mr. Jefferson Davis, described the goals of the Nez Perce
Tribe and the role of EMAP in their local science objectives. The Nez Perce are one of the first
tribes to adopt an EMAP approach with their own funds.
The Nez Perce Tribe is located in north central Idaho. The reservation involves approximately
750,000 acres across four counties. Approximately 30 percent of the reservation is tribally
owned. The remainder of the reservation has been sold to industrial companies or other
businesses. The tribal reservation has diverse landscapes, and therefore, requires diverse
approaches when managing their environment. Most of the reservation's land is used for cultural
activities, agriculture, recreation, timber management, and livestock management.
The Nez Perce developed an interest in the EMAP project when Mr. Davis attended an EMAP
training session in June 2001. Mr. Davis encouraged the tribe to fund its own EMAP, and in
2002, Mr. Davis and other scientists started a training review of EMAP field sampling protocols.
In 2003, the tribe initiated its own sampling for EMAP data, and will complete a final report of
its findings and results in 2005.
The Nez Perce Tribe will use the EMAP bioassessment applications to develop water quality
standards and criteria, complete a 303d list of impaired areas based on the state of their aquatic
community, and create total maximum daily loadings, among other project goals. EMAP will
play a major role in assessing the current condition of streams within the reservation.
National Coastal Assessment: Past, Present, and Future
Dr. Kevin Summers, with NHEERL, discussed the National Coastal Assessment Program and
the use of EMAP to achieve goals to build the scientific basis as well as the local, state, and
tribal capacity to monitor the status and trends in the condition of the Nation's coastal
ecosystems. The National Coastal Assessment Program also hopes to integrate states and tribes
in utilizing status and trend data to obtain a national perspective.
There remain some uncertainties and questions regarding EMAP. For example, researchers are
still unclear about the status, extent, and geographic distribution of ecological resources. Other
questions include: (1) what proportions of these resources are declining or improving, where,
and at what rate; (2) what factors are likely to be contributing to declining conditions; and (3)
whether pollution control, reduction, mitigation, and prevention programs are achieving overall
improvement in ecological condition.
Through the National Coastal Assessment Program, scientists will complete a probability survey,
estimate the ecological extent and conditions of resources, and characterize trends and the
conditions of resources to represent spatial patterns with known certainty. Another activity
includes training states and tribes in the sampling methodology and in the use of sampling
devices; this will require a QA document. States that will benefit from this survey include
Maine, New Hampshire, New York, New Jersey, Pennsylvania, Delaware, Maryland, Virginia,
North and South Carolina, the southeastern corridor, Texas, California, Oregon, Washington,
Alaska, Hawaii, and Puerto Rico.
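The probability survey concept underlying this program can be illustrated with a minimal sketch. The station frame, sample size, and condition scores below are invented for illustration; the point is that a random, equal-probability draw lets the resulting proportion estimate carry a known standard error.

```python
# Illustrative sketch of a probability-based survey design of the kind EMAP uses:
# stations are drawn at random so condition estimates carry known uncertainty.
# The frame, sample size, and condition scores here are invented for illustration.
import math
import random

random.seed(42)

frame = [f"station_{i}" for i in range(500)]  # all candidate coastal stations
n = 50
sample = random.sample(frame, n)              # equal-probability sample, no replacement

# Pretend field crews scored each sampled station as degraded (1) or not (0).
scores = {s: random.choice([0, 1]) for s in sample}

p_hat = sum(scores.values()) / n              # estimated proportion degraded
se = math.sqrt(p_hat * (1 - p_hat) / n)       # simple-random-sampling standard error
```

Because every station in the frame had a known chance of selection, the estimate and its standard error apply to the whole resource, not just the visited sites.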
According to some survey results, indicators such as water clarity, dissolved oxygen, sediment
contamination, and fish tissue show that overall conditions around the country are very poor.
Also, no data were collected in the Great Lakes; therefore, estimates of indicator values were
used. Future reports will include more data, but will not yet include data from the Great Lakes. As
researchers of the assessment program draw conclusions on overall environmental conditions,
the step of integrating the monitoring framework still needs to be addressed.
EPA is currently helping states to develop sampling strategies to determine the overall condition
of aquatic resources. Research from the National Coastal Assessment Program will help states to
determine where sampling is needed in order to confirm impairment. A recommendation of the
National Research Council on the CWA Section 303d listing process was the use of preliminary
lists of impaired waters, and the work of this Program should provide a rationale for developing
such preliminary listings. Future activities of the National Coastal Assessment Program
include supporting water quality goals, completing condition research to develop probability of
impairment models based on landscape characteristics and their relation to water quality, using
these impairment models to assist the states in completing integrated 305b and 303d reports, and
determining the cause of impairment in order to develop total maximum daily loads for impaired
waterbodies.
The Interactions of EMAP and the Southern California Coastal Water Research
Program: Help in the Past, Necessity for the Future
Dr. Stephen Weisberg, with the Southern California Coastal Water Research Program
(SCCWRP), discussed EMAP's evolving influence on Southern
California's coastal monitoring programs. The California coast is the most monitored coast in
the country, and agencies complete routine monitoring of its rivers, lakes, and other waterways,
excluding estuaries. EMAP has influenced California's monitoring protocols over the last 10
years.
Currently, California has nine water monitoring boards, and they each follow different protocols
and have different assessments. To address this, the SCCWRP developed three cooperative
regional monitoring surveys involving studies with 12 organizations completed in 1994, with 62
organizations completed in 1998, and with 68 organizations completed in 2003. This research
effort stemmed from the need to look at all coastal discharges using the same approach. EMAP
has had a big influence on this program.
The SCCWRP program seeks to answer the following questions with the help of EMAP:
• What is the spatial extent of chemical contamination and the associated biological effects?
• What probability-based sampling design could be used to evaluate potential impact areas?
• What multiple indicators should be evaluated at each site?
Answering these questions with EMAP will help the SCCWRP to identify sediment totals, hot
spots, priority sites in the State, and concentration levels in priority sites, as well as other sites in
California.
In order to determine sediment concentrations in waterways, SCCWRP researchers study fish
contamination. Researchers also have expanded the study areas to include beaches and wetlands.
With the use of EMAP, the SCCWRP can bridge the gap between indicator research and
implementation in state monitoring programs.
The Role of the National Coastal Assessment in Developing a Continuing South
Carolina Estuarine Monitoring Program
Dr. Robert Van Dolah, with the South Carolina DNR, described the role of the National Coastal
Assessment in developing a continuation of South Carolina's estuarine monitoring program. The
South Carolina estuarine monitoring program is supported by two State agencies (DNR and
Department of Health and Environmental Control). The Department of Health and
Environmental Control conducts ambient surface water monitoring by routine sampling of the
state's freshwater and estuary sites, and monitors for water quality and sediment quality. The
DNR monitors fishery resources through numerous inshore and offshore monitoring programs on
state-wide and regional scales. The South Carolina environmental monitoring programs indicate
that pollution and habitat alterations are driving the poor conditions. However, new monitoring
programs are needed in order to integrate water and sediment monitoring with biological
response measures, increase spatial coverage, and expand monitoring to other critical habitats.
South Carolina is working towards these goals with the help of the South Carolina Estuarine and
Coastal Assessment Program (SCECAP). SCECAP objectives are to monitor the conditions of
biological habitats and report those findings to the public. Biological habitats monitored include
tidal creek habitats and open water habitats, such as rivers, bays, and sounds. Tidal creeks are
primary nursery habitats and are very important because they serve as the first point of input into
estuaries. In addition, water sediments and pollutants found within the tidal creeks will influence
the water quality of estuaries.
The monitoring approach used for the biological habitat studies consists of a probability-based
sampling design that ensures an unbiased sampling protocol and sampling of 60 stations each
year. Habitats are monitored for salinity, dissolved oxygen, pH, temperature, total nutrients,
dissolved nutrients, biological oxygen demand, fecal coliform bacteria, metals, sediment
contaminants and toxicity, phytoplankton composition, and finfish and crustacean community
measures. According to the 1999-2000 studies of estuarine habitat, integrated water quality
scores showed that 38 percent of the State's creeks had poor conditions, and 11 percent of the
open water habitat was in poor condition. Integrated sediment quality scores showed that 38
percent of the State's creeks had poor conditions, and 30 percent of the open water habitat was
in poor condition. When merging water quality, sediment quality, and biological quality, 12
percent of the State's creeks had poor conditions, and 8 percent of the open water habitat was in
poor condition.
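The merging of water, sediment, and biological quality scores into a single integrated rating can be sketched as follows. The two-of-three rule and the station data are assumptions for illustration only, not SCECAP's actual scoring rule.

```python
# Illustrative merge of per-station condition scores into an integrated rating.
# The rule here (a station is "poor" overall when at least two of the three
# component scores are poor) is an assumption, not SCECAP's published method.

def integrated_condition(water, sediment, biology):
    poor_count = sum(score == "poor" for score in (water, sediment, biology))
    return "poor" if poor_count >= 2 else "good"

stations = [
    ("creek_01", "poor", "poor", "good"),
    ("creek_02", "good", "good", "good"),
    ("bay_01", "poor", "good", "good"),
]

ratings = {name: integrated_condition(w, s, b) for name, w, s, b in stations}
percent_poor = 100 * sum(r == "poor" for r in ratings.values()) / len(ratings)
```

A merge like this explains why integrated percentages (12 and 8 percent) can be lower than any single component's percentage: a station must fail on multiple measures to be rated poor overall.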
SCECAP is providing good data on the conditions of the South Carolina coastal habitat. Data
improvements in recent studies include the addition of nursery habitat and estimates of
proportion of the State's water that meets or fails expected conditions. Also, the data have been
very useful to South Carolina by showing the need to continue with research efforts.
The Application of EMAP and REMAP in the EPA Regions
EPA Region 2 Monitoring Coordinator, Ms. Darvene Adams, described EMAP and Regional
EMAP (REMAP) activities. EPA Region 2 provides technical assistance, funding for special
programs such as REMAP, and training. The goals of the Regional Office are to support state
monitoring programs and address regional priorities. The state monitoring programs strive to
comply with the CWA, and therefore, study spatial coverage, indicators, and water quality
standards.
Previously, state compliance with the CWA within EPA Region 2 was target-based and focused
on only one chemical or pathogen. Also, some water resources were not addressed because
funding was not available. In the 30 years since the inception of the CWA, states within EPA
Region 2 have not fulfilled their compliance obligations for any water resource. To address
these issues, EPA Region 2 has enlisted support from EMAP and hopes to achieve state
goals with the use of EMAP tools, including the EMAP design based on probability, approaches
for indicator development, and approaches for water quality standards.
Also, EPA Region 2 developed REMAP in order to provide status or trends, address broad scale
or high interest issues, make associations, demonstrate the probability approach, and provide
management support at different scales. To date, REMAP has provided baseline data for the
New York/New Jersey harbor areas (including data on toxicity, chemistry, and benthos), and has
been used to study watersheds of high interest (or impaired watersheds). REMAP also
incorporates groundwater in the assessment data for state watersheds. Future REMAP goals
include:
• Additional studies on wetlands, groundwater, arid lands, large rivers, offshore, and Alaskan
ecosystems
• Complete transfer of capability to the states and tribes
• Reference conditions
• Integrated monitoring for status/trends and impaired water identification.
Panel Discussion/Questions and Answers
The speakers had an opportunity to participate in a brief panel discussion drawing on questions
from the audience.
A brief panel discussion after the early afternoon and late afternoon sessions addressed a range
of topics. Topics from the early afternoon discussions included: (1) the importance of factoring
groundwater into the EMAP-West data; (2) use of wetland density as a critical stressor; and
(3) the Nez Perce tribe's future goals and use of EMAP data. Topics from the late
afternoon discussions included: (1) the goals of the National Coastal Assessment program
beyond 2006; (2) pathogens in California; (3) the use of groundwater to determine sediment
contamination; (4) existing work with California officials on the Southern California Coastal
Water Research Program; (5) studies involving invasive and exotic species; and (6) the cost of
monitoring programs.
Working with Tribes: Cultural Values and Tribal Lifeways Inform
Health Assessments
Following opening remarks by Mr. Thomas Baugh, with EPA Region 4, five speakers addressed
examples of partnering between tribes and government agencies to maintain healthy
environments within the reservations, to gather new or better data, to develop models or
modeled approaches, and to communicate environmental risks and conditions to members within
the reservations.
Mr. Thomas Baugh, with EPA Region 4, discussed the overview and goals of this session. Then,
Mr. Baugh introduced the session speakers.
Tribal Partnerships in Pesticide Management to Protect Human Health
Ms. Sarah Ryan, with the Big Valley Rancheria, explained the traditions of the Big Valley
Rancheria reservation and its goals for environmental improvement. The environmental goals
of the Big Valley Rancheria are to gather as much information as possible on health hazards and
to provide outreach to the community. In order to meet traditional and environmental goals, the
Big Valley Rancheria uses income from its tribal-owned casino and also relies on grants from the
United States Department of Housing and Urban Development, EPA, and the Bureau of Indian
Affairs.
Pesticide management and community recycling are priority goals. Hunting and gathering from
the lands of Big Valley Rancheria is a long-standing tradition of the community. Big Valley is a
descendant of the Xa-ben-na-po Band of Indians, and Xa-ben-na-po means hunters and
gatherers. The tribal members occupy 375 acres and are committed to protecting their lands.
However, pesticide use in Lake County has resulted in adverse environmental and human health
effects within their community.
Pesticides are used to protect pear, walnut, and apple trees as well as wine grapes. Some
residents live as close as 60 feet from pear orchards and 100 feet from wine grape vineyards.
Tribal schools are less than 40 feet from pear orchards, and residents along the Soda Bay Road
are less than 50 feet from pear orchards. Tribal elders also have documented pesticide use
outside of their homes. These repeated exposures have resulted in asthma in five family
members.
In 2001, the Big Valley Rancheria completed a pesticide history investigation and report, which
focused on the pesticides 2,4-D, paraquat, azinphos-methyl, chlorpyrifos, methyl bromide, and
petroleum oils. The Big Valley Rancheria also has investigated 2002 data on exposure from
pesticide spray drift, including chlorpyrifos (Lorsban).
It is vital that Big Valley Rancheria address pesticide use and pesticide drift in the areas lying
between Lake County and the reservation. Tribal members use plants for food and medicine,
weave baskets from plants, use plant products for cooking utensils and ingredients, and even use
plants for baby rattles. These plant uses result in exposure to pesticides. Other tribal
environmental issues include native plants, repatriation of items, fish warnings, mercury, and
pesticides found in Clear Lake.
Establishing Self-Sufficiency in Alaska Native Communities to Minimize Exposure
to Environmental Contaminants
Ms. June Gologergen-Martin, with the Alaska Community Action on Toxics (ACAT), explained
the goals of the St. Lawrence Tribe and support received from the ACAT Program. There are
229 tribes in Alaska, and St. Lawrence is a tribal-owned island. St. Lawrence Island has worked
to address issues of limited funding, information gathering on the nature and extent of data on
island contaminants, the exclusion of their local input into decisionmaking efforts regarding
surrounding areas, and contaminants resulting from United States military sites. St. Lawrence
Island also recognizes trends in local and traditional knowledge and wisdom not being
adequately integrated into younger generations. In order to address some of these challenges,
members of the St. Lawrence Island community work with the ACAT Program. The initial
ACAT support resulted from a meeting with Annie Alowa, formerly with ACAT, and the receipt
of a grant to help the St. Lawrence community address the health issues prevalent on the island.
With ACAT assistance, St. Lawrence partners with the communities of Gambell and Savoonga,
Norton Sound Health Corporation, and the State University of New York to achieve several
environmental goals, including:
• Acquisition of more information about environmental health clinics
• Education and involvement of young people
• Development of tribal abilities to interpret data
• Arrangements for advance planning and creation of information management protocols
• Development of strategies to increase funding allocated to the DOD for cleanup statewide
and nationally
• Development of strategic partnerships for policy advocacy needs
• An increase in elder input.
On a project basis, St. Lawrence is working with other organizations to identify sources of
contamination affecting the communities of St. Lawrence Island, including military sites and
distant sources; to determine health problems that may be linked to environmental
contamination; and to develop cleanup protocols for contaminated sites. The tribe also hopes to
create a training program about prevention and treatment of environmental health problems, and
to develop a model of communication that might be helpful for other Alaska Native communities
in addressing environmental contamination.
To date, the tribe has established an advisory committee with representation from the tribal
government, city council, and village corporation of the Savoonga and Gambell communities;
held leadership and community meetings in Gambell and Savoonga; completed portions of a
pilot study to determine environmental exposures to contaminants, as well as other
environmental studies; and held planning meetings with community leaders.
Bioaccumulative Toxics in Native American Shellfish
Mr. Larry Campbell and Ms. Jamie Donatuto, with the Swinomish Indian Tribal Community,
discussed their current project of studying bioaccumulative toxics in subsistence-harvested
shellfish on the Swinomish reservation. This project is supported by an EPA grant.
The Swinomish reservation is located 75 miles north of Seattle, Washington, and has 750 tribal
members currently living on a reservation covering approximately 7,400 acres of which 2,900
acres are tribal-owned. Their reservation is unique in that 90 percent of their land is surrounded
by water. Therefore, shellfish are vital to their community and are a key subsistence food of the
Swinomish tribe. Shellfish are incorporated into the common diet and sold to produce funding
for the tribal families. The community has environmental and human health concerns because
heavy metals, PCBs, lead, mercury, dioxins, and furans are common contaminants found in the
nearby waters and in the shellfish.
To address these concerns, the Swinomish tribe uses their grant funding toward the following
goals:
• Determine whether Swinomish people who eat shellfish harvested from the reservation or
other nearby areas are exposed to bioaccumulative toxics by testing sediment, clams, and
crabs
• Effectively communicate those risks in a culturally appropriate manner
• Develop mitigation measures
• Confirm major health problems on the reservation that may be related to eating contaminated
shellfish
• Develop hypotheses linking the health problems to the toxics found.
Testing of the shellfish, as well as land, involves the collection of sediment, clam, and crab
(shellfish) samples, as well as developing additional protocols to prevent further contamination.
The reservation scientist will collect data to determine concentrations and other information on
heavy metals, such as arsenic, copper, cadmium, selenium, mercury, lead, and nickel; PCBs;
polycyclic aromatic hydrocarbons (PAHs); dioxins and furans; chlorinated pesticides; and
butyltins. Sampling locations were chosen based on historic and present frequencies of
subsistence food gathering.
The reservation also is completing their Tox in a Box ambassador's guide to educate school age
children on toxics in the community and common health effects determined from their studies.
Tribal members also participate in community gatherings where reservation scientists
disseminate environmental and human health information. Finally, the tribe provides public
service announcements on the Swinomish cable channel to communicate findings and risks.
Activities also include the development of mitigation methods, since stopping the harvest of
shellfish is not an option and moving shellfish harvesting to another area is also not an option
given the treaty agreements required for such an action. Alternatives must be found to clean up
the shellfish contamination and to prevent further exposure and contamination.
The Swinomish tribe also is working closely with EPA to build capacity. To move forward, the
tribe has begun to hire scientists to capture more scientific information and to gain credibility
with government and state scientists. The focus is on an agenda of empowerment: to understand
the issues, move forward, and take action to fix problems, as well as to provide their own
funding to support these initiatives independent of Federal monies.
Moving Science into Action - Step One: Get the Data!
Following opening remarks by Ms. Pamela Russell and Mr. Mike Flynn, with the EPA Office of
Environmental Information, five speakers addressed the acquisition and analysis of data critical
to completing environmental and human health risk assessments as well as current initiatives
and partnerships.
Uses of Toxics Release Inventory Data
Ms. Gail Froiman, with the Office of Environmental Information, introduced the EPA Toxics
Release Inventory (TRI) program and discussed general uses of the data collected under this
program. EPA initiated the TRI program after a chemical release at the Union Carbide plant in
Bhopal, India, resulted in irrecoverable environmental damage. The TRI program, which falls under the
Emergency Planning and Community Right-to-Know Act, requires the reporting of toxic
chemicals from industrial facilities in the United States. Companies must report the uses of toxic
chemicals, the approximate amounts maintained onsite, the management and media releases of
toxic chemicals, reductions, and pollution prevention activities used on a facility-by-facility
basis. All data are made public, and this requirement has recently resulted in some controversy.
Companies reporting toxic chemical data must meet predetermined criteria in order to report
their toxic chemical data to TRI. The reporting of toxic chemical data is required if the facility
employs 10 or more full-time equivalent personnel; is listed under specific groups of Standard
Industrial Classification codes; and manufactures, processes, or otherwise uses toxic chemicals
meeting or exceeding specific thresholds established by EPA. In 2001, EPA added its persistent,
bioaccumulative, and toxic (PBT) chemical rule to the TRI regulations. Under the additional
rule, facilities must report on PBT chemicals that meet or exceed reduced reporting thresholds.
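The three reporting criteria above can be expressed as a simple predicate. The covered SIC codes and the pound thresholds below are illustrative placeholders, not the actual regulatory values, which are set by EPA rule (with reduced thresholds for PBT chemicals):

```python
# Hedged sketch of the TRI reporting test described above. The SIC code
# set and chemical thresholds are hypothetical; real values come from
# EPA regulation, with lower thresholds under the 2001 PBT chemical rule.

COVERED_SIC_CODES = {"2800", "3312"}     # hypothetical covered code groups
STANDARD_THRESHOLD_LBS = 25_000          # hypothetical standard threshold
PBT_THRESHOLD_LBS = 100                  # hypothetical reduced PBT threshold

def must_report(fte_employees, sic_code, pounds_used, is_pbt=False):
    """A facility reports to TRI only if all three criteria are met."""
    threshold = PBT_THRESHOLD_LBS if is_pbt else STANDARD_THRESHOLD_LBS
    return (fte_employees >= 10
            and sic_code in COVERED_SIC_CODES
            and pounds_used >= threshold)

print(must_report(12, "2800", 30_000))            # True
print(must_report(12, "2800", 500, is_pbt=True))  # True: reduced PBT threshold
print(must_report(8, "2800", 30_000))             # False: fewer than 10 FTEs
```

The point of the sketch is the conjunction: failing any one criterion (workforce size, industry classification, or chemical quantity) exempts the facility from reporting.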
The data collected from TRI reporting can help states to set priorities. These data may also be
used by community organizations. Examples of such use include the following:
• A community organization in Louisiana used TRI data to refute the claims of a commodity
chemicals company seeking to locate its facility in Louisiana (see http://www.leanweb.org)
• The Oneida Tribe of Wisconsin used TRI data to inform a local labor union about health
risks at their workplace, and the union in turn lobbied their employer to reduce the emissions
at contract renewal (see http://www.oneidanation.org)
• The Ecology Center, working with residents of Flat Rock, Michigan, discovered from TRI
reports that emissions from an automobile assembly plant had increased over time, and the
environmental organization and residents together negotiated an agreement with Auto
Alliance International to lower the emissions (see http://www.ecocenter.org)
• Communities for a Better Environment combined 1996 TRI data with GIS mapping data to
develop environmental justice conclusions for Los Angeles County, California (see
http://cbecal.org).
TRI data are often used by industry, financial services, government, and international affiliations
for diverse purposes. For example, the Haartz Corporation uses their TRI estimates to determine
cost savings of $200,000 per year resulting from improved control of methyl ethyl ketone
emissions. In another example, Governor O'Bannon influenced Indiana companies to voluntarily
agree to reduce TRI emissions by 50 percent. In addition, Green Century Funds uses TRI data
as a measure of corporate environmental performance.
More information on TRI can be accessed at the TRI Explorer website at
www.epa.gov/triexplorer/. TRI data also can be accessed via the EPA ENVIROFACTS database.
Integration of State and County Stream Monitoring Programs: A Maryland Case
Study
Dr. Ron Klauda, with the Maryland DNR, and Mr. Keith Van Ness, with the Montgomery
County DEP, described a partnership with EPA involving the use of TRI data to improve stream
monitoring and watershed assessments. The Maryland DNR began its stream monitoring
program in 1993, and in 1995 expanded the program to include state-wide streams and
waterways. Montgomery County initiated its stream monitoring program in 1994.
With EPA funding, the Maryland DNR is conducting a Biological Stream Survey to monitor
over 1,000 stream sites over several years; approximately 200 to 300 sites are being monitored
each year. The goal of the Biological Stream Survey is to address key issues for program
integration and to develop approaches that can be applied elsewhere. The survey uses a
probability-based design in which sampling sites are randomly selected. The Maryland
DNR is focusing on water chemistry, physical habitats, and biological communities. The
Maryland DNR surveys also will help to determine reference-based indicators of integrity for
fish, benthic invertebrates, and physical habitats; hotspots of biological diversity; and 305(b)
reporting and 303(d) listing.
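A probability-based selection of each year's monitoring sites can be sketched as a random draw from the full frame of candidate sites, so every site has a known, nonzero inclusion probability. This is a minimal sketch; the actual survey may use a stratified or rotating design, and the site IDs and sample size here are illustrative.

```python
import random

# Minimal sketch of probability-based site selection: draw this year's
# monitoring stations at random from the full frame of candidate sites.
# Site IDs, frame size, and annual sample size are illustrative.

def select_annual_sites(all_sites, n_per_year, seed=None):
    """Simple random sample without replacement from the site frame."""
    rng = random.Random(seed)
    return rng.sample(all_sites, n_per_year)

candidate_sites = [f"site-{i:04d}" for i in range(1000)]  # >1,000 sites in the survey
this_year = select_annual_sites(candidate_sites, 250, seed=2003)
print(len(this_year))  # 250
```

Because selection is random rather than targeted, the resulting estimates of stream condition generalize to the whole frame with quantifiable confidence, which is what enables the 305(b) and 303(d) uses noted above.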
The Montgomery County Stream Monitoring Program involves the study of over 400 sites
county-wide, with approximately 90 sites monitored each year. This monitoring program will
help the Montgomery County DEP to characterize stream conditions, develop a county-wide
Stream Protection Strategy, target and assess watershed restoration effectiveness, meet watershed
National Pollutant Discharge Elimination System (NPDES) monitoring requirements, assess
impacts of development, and determine 305(b) reporting and 303(d) listing.
As a result of integrating the monitoring projects of these two Maryland agencies, researchers
can obtain an increased confidence in estimates of stream condition, maintain consistency in
public statements about stream conditions, reduce the costs of sampling programs, and increase
the use of local information in state estimates of stream conditions.
The integrated monitoring study supports the contention that the Montgomery County and State
stream monitoring efforts can be effectively integrated. Researchers also suggest that EPA
consider similar integration of other programs through comparability analyses across states in the
Mid-Atlantic and other regions.
Effects of Urban Growth on Fish Assemblages in a North Carolina Metropolitan
Area, 1970-2000
Dr. Jonathan Kennen, with the USGS, discussed ongoing projects related to urban growth in
partnership with the EPA Office of Environmental Information. Urbanization is an increase in
human habitation, combined with increased per capita consumption and extensive landscape
modification. An approach to address adverse effects of urbanization is to use aquatic
communities to evaluate the influence of changes in the environment. Aquatic communities are
susceptible to multiple physical, chemical, and biological influences (or stressors). Researchers
can use these stressors to determine changes and trends in the environment.
The USGS and EPA have teamed together to: (1) evaluate the relationships among land use,
extant fish species composition, and stream water quality, and (2) determine if there are
significant relationships between fish assemblage structure and environmental quality across a
disturbance gradient. Study locations include Phoenix, Arizona; Detroit, Michigan; Minneapolis
and St. Paul, Minnesota; Chicago, Illinois; Milwaukee, Wisconsin; and the Raleigh-Durham
area, North Carolina.
The Raleigh-Durham area represents a contiguous metropolitan area surrounded by Cape Fear
and two other lakes. A 20-mile buffer of major drainage basins also surrounds the area and was
chosen for inclusion in the study area to aid in projecting urban growth of the metropolitan area
beyond 2000.
The fish community studies involve the use of the double pass assessment method in upstream
and downstream directions. Land use data from the National Land Cover Database, North
American Land Cover, aerial photos, and other GIS data layers also are used in the study of the
fish communities. Population, transportation, and economic data are gathered from the United
States Census, the Texas Transportation Institute, and state and local governments to help
determine the effects of land use and urban sprawl on fish communities, and therefore, the
environment.
The preliminary results from the Raleigh-Durham area study indicate that urban influences, such
as urban land cover, population density, and other related variables, impose the strongest
negative influence on the fish communities. Also, increasing forest fragmentation and patch
diversity, as well as a decrease in contiguous forest corridors, adversely affect the fish
communities in the Raleigh-Durham area. For example, the loss of a protective stream buffer
can destabilize stream banks and eventually leave streams more vulnerable to erosion and stream
bank damage. Finally, the preliminary study results also showed that edge forest, percent forest
cover, and the portion of large forest patches in a basin are important for maintaining healthy
streams.
Dynamic Choropleth Maps
Dr. William Smith, with the Office of Environmental Information, described the uses of TRI data
in creating dynamic choropleth maps to visually depict trends in human health and
environmental conditions. TRI Explorer, accessed from the EPA website, and a Java-based
web tool are used to create the dynamic choropleth maps. The map tool can be manipulated
using any standard web browser. The dynamic choropleth maps are created using several data sets of
environmental, human health, statistical, and geographic information, and the data tree in the tool
accesses 10 data cubes and 300 indicators that determine trends in the environment, human
health, demographics, and economics.
Dynamic choropleth maps have user controls and sliders that allow real-time interaction with the
database and selection of the specific indicators to consider in determining environmental
conditions and human health trends in county locations across the United States. On each map,
each county is color coded for the values of the principal data component with a separate
designation for a county with missing data, or with data that have been filtered out.
Dr. Smith demonstrated the various levels of data sets, a maximum of three, that a user can
choose to display trends in human health risks or diseases among ethnic groups, geographical
areas, and age groups. The latest development of the dynamic choropleth map web tool can be
accessed at http://users.erols.com/turboperl/dcmaps.html or www.epa.gov/triexplorer.
Emerging Innovations in Regional Ecosystem Protection
Five speakers provided examples of regional projects to determine the environmental conditions
of ecosystems and goals in order to protect these ecosystems from further exposure to hazards
and toxics. A panel discussion including an audience question and answer period followed the
presentations.
Regional Ecosystem Protection: A Pattern, An Opportunity, A Challenge?
Mr. Doug Norton, with the Office of Water, provided highlights of an EPA workshop on critical
ecosystem assessment and the priorities determined from the workshop. EPA Regions 3, 4, 5,
and 7 have partnered to develop ecosystem modeling and other tools, and have become a
network in geospatial modeling. The current partnerships allow the EPA Regions to be EPA's
best unit for implementing cross-program ecosystem protection.
Geospatial modeling can help researchers analyze ecosystem health. Geospatial modeling
involves the use of GIS to show the contributions of different ecosystems. Ecosystem protection
assessment is a priority in Region 7. Goals of the geospatial modeling programs and networks
are to: (1) apply the concept of regional-scale ecosystem protection, (2) determine appropriate
national roles, and (3) protect, sustain, or restore the health of people, communities, and
ecosystems using integrated and comprehensive approaches and partnerships. Applying the
concept of regional-scale ecosystem protection requires that regional dialogues with partners
identify widely valued ecological endpoints, that regions and states identify their ecological
identity and heritage, that the regions identify places and processes critical to regional
ecological health, and that researchers maintain mainstream cross-program use of
centralized data and tools.
The second goal of geospatial modeling, determining appropriate national roles, includes the
development of core data sets as well as decision or priority setting tools, the provision of core
information and tools, support to regional initiatives that align with national program goals, the
conduct of research and technology transfer activities, and the generation of an introspective look
at national program ecosystem effects.
The goal "to protect, sustain, or restore the health of people, communities, and ecosystems using
integrated and comprehensive approaches and partnerships" is taken directly from the EPA
Strategic Plan Goal 4, Healthy Communities and Ecosystems. This includes efforts to maintain
regional and state ecological identities, create more partnerships and program integration,
place the geospatial data and tools in the mainstream, and protect critical ecosystems.
Use of Geospatial Tools to Identify High Quality Midwest Ecosystems (Landscape
Scale Characterization of Ecosystem Health in the Upper Midwest)
Dr. Mary White, with EPA Region 5, discussed the use of geospatial tools by EPA Region 5 over
the last several years to assess ecosystem condition. High quality ecosystems have three
priorities: (1) diversity, (2) sustainability, and (3) rarity of endangered species. These priorities
have been used to characterize ecosystems, and the characterizations serve as indicators.
Using geospatial analysis to characterize ecosystems, researchers developed a diversity
composite layer to determine land cover diversity calculated by ecoregion; temperature and
precipitation maximums by ecoregion; appropriateness of land cover; and contiguous sizes of
undeveloped areas.
The sustainability composite layer consists of a fragmentation layer and a stressor layer. The
fragmentation layer includes area and perimeter calculations, waterway impoundments per water
body, road density, contiguous sizes of individual land cover types, and appropriateness of land
cover. The stressor layer includes airport noise and aerosols, Superfund sites, hazardous waste
cleanup sites, water quality summaries from the BASINS model, air quality from the OPPT air
risk model, waterway obstructions, and urban disturbance.
Finally, a rarity composite layer consists of land cover rarity by ecoregion, species rarity, the
number of rare species, and the number of rare species per taxa.
Examples were provided of individual composite maps and maps created by combining the three
composites with color coding to represent areas with poor scores, which in turn indicate poor
ecosystems. Combining the three composite layers, each representing a key component in a
healthy ecosystem, enabled researchers to develop an overall "score" useful in evaluating
ecosystem condition. Within EPA Region 5, the average ecosystem score was 143, and
individual scores ranged from 10 to 288.
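The combination of the three composite layers into a single per-cell score can be sketched as below. The equal weighting, the simple sum, and the raster values are assumptions for illustration, not Region 5's actual scoring method.

```python
# Hedged sketch of combining the diversity, sustainability, and rarity
# composite layers into one ecosystem score per grid cell; higher means
# a healthier ecosystem. Equal weighting and cell values are illustrative.

def ecosystem_score(diversity, sustainability, rarity):
    """Sum the three composite-layer values for one cell."""
    return diversity + sustainability + rarity

# Per-cell composite values for a tiny hypothetical raster.
cells = [(60, 50, 33), (5, 3, 2), (100, 90, 98)]
scores = [ecosystem_score(d, s, r) for d, s, r in cells]
print(scores, sum(scores) / len(scores))  # [143, 10, 288] 147.0
```

Mapping such per-cell scores with color coding is what produces the combined maps described above, where low-scoring areas flag poor ecosystem condition.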
A triage model, in which sustainability scores are plotted against diversity scores to reveal
scattering and triage effects, was also described.
Synoptic Model to Rank Wetland Ecosystems for 404 Permitting: An Application
of Regional Critical Ecosystems Protection
Ms. Brenda Groskinsky, with EPA Region 7, presented a specific analytical process, synoptic
modeling, to rank wetland ecosystems. National surveys indicate that a large proportion of
historic wetlands have been replaced by other land cover and use types. Studies show that
states within EPA Region 7 have experienced pre-settlement wetland area losses: in Iowa,
researchers estimated a 95 percent loss; in Missouri, an 87 percent loss; in Kansas, a 48 percent
loss; and in Nebraska, a 35 percent loss.
There is a need for prioritization to fairly address problem ecosystems. EPA Region 7 aims to
reduce the loss in wetland species biodiversity and will need to focus on existing activities (e.g.,
CWA Section 404 permit reviews). The goal is to obtain and use a defensible, rigorous, and
repeatable framework.
A synoptic framework was chosen for this effort. Variables are the environmental indicators and
models produce indices enabling results to be ranked. An indicator is a direct measurement that
provides information about other conditions. An index is a numerical quantity, usually
dimensionless, denoting the magnitude of some effect and can also be used to denote quantity.
The conceptual model focuses on wetland prioritization and the prioritization methodology is
analogous to a cost-benefit ratio. The prioritization looks at changes in risk rather than values of
risk, and considers marginal increases in impacts to regional wetland species and the risk
avoided.
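The cost-benefit-style prioritization described above can be sketched as a dimensionless index of marginal risk avoided per unit cost, by which candidate wetlands are then ranked. The formula and all values below are illustrative assumptions, not the actual synoptic model.

```python
# Illustrative sketch of a synoptic prioritization index: rank wetlands
# by the marginal risk avoided (the change in risk to regional wetland
# species if the site is protected) relative to the cost of protection.
# All wetland IDs and numbers below are hypothetical.

def priority_index(risk_avoided, cost):
    """Analogous to a benefit-cost ratio: change in risk per unit cost."""
    return risk_avoided / cost

# (risk avoided, cost) per candidate wetland.
wetlands = {"W1": (0.30, 2.0), "W2": (0.10, 0.5), "W3": (0.25, 1.0)}
ranking = sorted(wetlands, key=lambda w: priority_index(*wetlands[w]), reverse=True)
print(ranking)  # ['W3', 'W2', 'W1']
```

Because the index uses changes in risk rather than absolute risk values, a modest site that is cheap to protect can outrank a high-value site whose protection is costly, which is the trade-off a 404 permit reviewer must weigh.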
Such a synoptic framework must be quantitative and appropriate in scale. EPA Region 7 hopes
to use information that is not too costly and to take advantage of data collection opportunities to
gain ecological knowledge and develop theories. The synoptic model will help to ensure that
relevant data are included when providing ecosystem protection.
As a result of the ecosystem prioritization effort, EPA Region 7 researchers created a
management tool that can assist in making decisions about resource allocation; enable resource
managers to place wetland site-specific decisions within a regional context; and protect wetlands.
The EPA Region 7 researchers also expanded the synoptic processes to include terrestrial and
aquatic areas. The resultant rankings are appropriate environmental measures.
Southeastern Ecological Framework's GeoBook - Software for Mapping
Partnerships and Ecosystem Protection
Dr. John Richardson, with EPA Region 4, presented highlights of the Southeastern Ecological
Framework's GeoBook project. A team of scientists, including Dr. Richardson, is looking at the
limited natural resources that remain to determine appropriate ways to study and protect these
resources. The methodology involves the use of GIS modeling to identify ecological hubs,
determine model parameters, identify linkages, and complete an ecological framework. This
holistic systems approach is necessary to avoid fragmentation of studies or exclusion of pertinent
data.
The EPA Region 4 researchers modeled the Florida Ecological Network, which uses a GIS
approach to protecting the environment. In developing the ecological framework, EPA Region 4
must consider the varied types of ecological areas, and therefore, must select an ecosystem
consisting of optimized hubs. Biodiversity, ecosystem services, threats to ecological integrity,
and recreation potential are prioritization categories in the Southeastern Ecological Framework.
The EPA Science Advisory Board reviewed the proposed Southeastern Ecological Framework,
and recognized and praised the significant efforts undertaken to create such a framework, which
is useful for integrating EPA programs in regions, as well as for providing a landscape context
for decisions by states, local governments, and private landowners in a region. The Science
Advisory Board recommended that the Southeastern Ecological Framework be enhanced to
include a wider range of ecological attributes that are important to regional ecological integrity.
The Science Advisory Board also recommended that the process for setting criteria to select
priority lands be made more explicit and that the criteria and individual data layers used in the
Framework receive additional peer review. With these caveats noted, the Science Advisory
Board agreed that application of the Southeastern Ecological Framework approach would be
beneficial in other regions of the United States, although different data layers and/or different
criteria for selecting priority areas likely would be needed.
As a part of this project, EPA Region 4 also is developing a Southeastern Ecological Framework
GeoBook. The GeoBook enables researchers to answer the following questions:
• How can our green space protection efforts support clean drinking water for our community?
• What funding sources are available to support water quality protection in our community?
• Which land is most at risk from development pressures?
The GeoBook also supports decisionmakers in identifying issues that are important to the
surrounding communities, and supports easy access to natural resource data and information.
Decisionmakers using the GeoBook also receive support in prioritizing locations, identifying
potential funding for water quality protection and improvements, protecting drinking water
sources in surrounding communities, and identifying potential threats to the critical ecosystem
that serves a community.
EPA Region 4 is working towards a national ecological framework that will provide a baseline
ecological framework assessment that is consistent across the United States, is temporally
repeatable with new National Land Cover Data, and is linked to programmatic needs and
activities. The national framework would have to be adaptable to individual regions.
The Mid-Atlantic Highlands Action Program: Transforming the Legacy
Mr. Tom DeMoss, with the Canaan Valley Institute, described the Mid-Atlantic Highlands
Action Program, which provides support and training for end users as well as the underlying
hardware and software for data-gathering tools. The Mid-Atlantic Highlands Action Program
is a citizens-inspired program for collaborative monitoring, research, management, and
restoration activities within the Mid-Atlantic Highlands. EPA commits two, full-time senior
staff to the project; contributes outreach, science, and technical staff to assist with project
activities; and provides education, training, and demonstration sites for users.
EPA ORD has focused on the Highlands for the last 10 years. Highland areas are prime
examples of biodiversity; they have terrestrial and aquatic diversity and serve as the home to
national hotspots or areas of concern. These diverse areas also contribute to the Nation's
economy, generating earnings of approximately $26 billion per year from the tourism industry
and $11 billion per year from the forestry industry. These areas also contribute to the energy
economy as sources of coal, oil, natural gas, and wind.
However, landscape indicators show that 47 percent of the Highlands region is rated fair or poor.
Based on an assessment of the types of birds found, 57 percent of the landscape is not in good
condition. Fish assessments indicate that 67 percent of the stream miles in the region fail to meet
a good rating, and assessments of aquatic insects indicate that 75 percent of the stream miles in
the region are in fair or poor condition.
Economic conditions in the Mid-Atlantic region are poor. There are high rates of children in
poverty, low educational attainment, high unemployment rates, low labor force participation
rates, and low per capita income. These conditions result in environmental justice issues and
migration of the working-age population out of the region to obtain better living conditions.
The Mid-Atlantic Highlands Action Program will improve economic conditions and provide
opportunities for stewardship. Through this initiative, researchers may be able to revitalize
18,000 stream miles and 5,000 square miles of forest in the Mid-Atlantic region and reduce
sediment loadings by approximately 105,954,000 tons per year. The Mid-Atlantic Highlands
Action Program also may create 3,000 jobs in the Mid-Atlantic region.
Panel Discussion/Questions and Answers
Drawing on questions from the audience, a brief panel discussion addressed a range of topics,
including: (1) the use of geospatial data in environmental regulations; (2) support from EPA to
provide funding and other needs in ecosystem protection, including the use of geospatial
modeling; and (3) the role of economics in ecosystem protection to ensure that the economic
benefits can be understood by decisionmakers.
Site Characterization and Decision Analysis of Contaminated
Sediment
Dr. John Bing-Canar, with EPA Region 5, provided opening remarks and introduced four
speakers, whose presentations addressed diverse scientific and analytical aspects involved in the
assessment and remediation of contaminated sediments.
Introduction of Concepts and Tools
Mr. Brian Cooper, with EPA Region 5, provided an overview of an EPA Region 5 Field
Environmental Decision Support (FIELDS) System to complete site characterization and
decision analysis for a study of contaminated sediment. This is an important goal for EPA
Region 5 because the use of models, tools, and methods shortens the time required to complete
projects and studies in the Superfund remedial investigation/feasibility study process and
provides the tangible benefits of an integrated characterization approach.
The University of Tennessee, contractors, and other government agencies such as the National
Oceanic and Atmospheric Administration (NOAA) support the development of this system.
No fully integrated system existed previously, so the team worked with On-Scene Coordinators
(OSCs) and remediation managers to understand the needs of remediation efforts. This strongly
supported the development of
decision analysis and visualization tools for scientifically-based decisionmaking that is
reproducible and defensible. FIELDS addresses projects predominantly in Region 5, but
supports projects elsewhere in the United States, China, and Latvia to address contaminated soil
and sediment.
The FIELDS technology combines GIS, global positioning system (GPS) for accurate spatial
coordinates, visualization (two- and three-dimensions), and database access and query capability
to support field planning, create data storage, and provide for spatial analysis and modeling for
remediation decisionmaking. Benefits of this technology include increased speed of analysis
through automation, the ability to visualize contaminant location and movement, and tools for
spatial analysis and geostatistics integrated with the database of information. Use of this
technology also provides standardization in the analyses to obtain results that are reproducible
and defensible.
Typical steps in site characterization at sediment sites involve compilation of historical data,
acquisition of new data, and data analysis followed by decision support analyses such as
contaminant concentration and movement patterns, multi-dimensional visualization, and
remediation scenario development and assessment. The FIELDS system serves as a repository
for all of the data for the project. A Sample Design Module assists in designing new data
collection efforts, including considerations of the number of samples to be taken, sampling
locations, data quality objectives, and funding constraints; the system also includes a variety of
sampling approaches including judgmental, random, systematic, and linear as well as the ability
to assess the likelihood of missing a "hot spot" based on the sampling approach selected. The
sampling design can be exported to a GPS unit for subsequent use in the field to find the pre-
selected locations.
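The hot-spot assessment mentioned above can be illustrated with a short Monte Carlo sketch. This is a hypothetical stand-in, not the FIELDS Sample Design Module itself; the function name and the square-grid/circular-hot-spot model are assumptions made for illustration.

```python
import math
import random

def miss_probability(grid_spacing, hotspot_radius, trials=100_000, seed=1):
    """Estimate the probability that a square systematic sampling grid
    contains no sample point inside a circular hot spot.

    By symmetry, it is enough to place the hot-spot center uniformly
    within a single grid cell and measure the distance to the nearest
    grid node (one of the cell's four corners).
    """
    rng = random.Random(seed)
    misses = 0
    for _ in range(trials):
        x = rng.uniform(0.0, grid_spacing)
        y = rng.uniform(0.0, grid_spacing)
        # Distance from the hot-spot center to the nearest grid node
        nearest = math.hypot(min(x, grid_spacing - x),
                             min(y, grid_spacing - y))
        if nearest > hotspot_radius:
            misses += 1
    return misses / trials
```

For example, a 20-foot grid misses a 5-foot-radius hot spot roughly 80 percent of the time (about 1 − π·5²/20²), which is why grid spacing must be matched to the smallest deposit worth finding.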
The FIELDS system includes a number of query tools to extract data for analysis regardless of
storage form. The system also includes a variety of statistical and other data analysis techniques
to assess the collected data, including the ability to estimate the total amount that is contaminated
and the average contaminant concentration distribution across a specified depth, as well as more
advanced techniques such as variogram correlation models and kriged interpolation models.
The system also supports the design of secondary sampling efforts such as adaptive fill or radial
nested design. Data analysis and interpolation tools for decision support include algorithms to
delineate "hot spots" and to estimate contaminant mass/volume, define remediation areas, and
evaluate which areas to address based on cleanup goals. There are also a number of quality
assurance checks including estimation error and cross-validation of the modeling and other
interpolation methods.
Final products from the decision support analyses include screening level risk analyses; maps,
analyses, and reports; and pre- and post-remediation comparisons regarding achievement of
cleanup goals. These outputs support communication with decisionmakers.
Initial Sample Designs
Dr. John Kern, with Kem Statistical Services, Inc., provided an overview of sampling designs
developed and/or implemented on various contaminated sediment projects as well as those
strategies that appear to work best for decisionmaking. A number of reports challenge the
effectiveness of dredging as a remediation technique; however, failure to sample properly or to
properly delineate the contamination may contribute to this outcome, in addition to the dredging
technique itself. A 2001 National Research Council report noted that a contributing factor is
leaving untargeted sediment deposits in place. Sampling generates data that support delineation
of the target areas requiring remediation, therefore, proper sampling design will minimize the
potential for this problem to occur.
Drawing on an analogy in mineral exploration, there are two sampling phases—a broad
exploration phase and more focused (smaller scale) sampling of specific deposits intended for
development using 20-foot borehole spacings. Extrapolation of this approach to contaminated
sediment studies raises the concern that to identify contaminants present in much lower
concentrations than found in mineral exploration, it may be necessary to sample on a smaller
scale or to change the decision process. In mining, the decision area is small and contains many
samples, while in sediment remediation, the area tends to be large with fewer samples. Where
large transects are used in sampling design, the lack of resolution may limit the utility of the
collected data.
Sampling design objectives include ecological and human health risk assessment, presence or
absence of contamination, nature and extent of contamination, baseline quantification, volume
and mass estimation, and fate and transport modeling. The initial sampling phase is important
because it tends to form the basis for all future investigations. Investigations beyond the
remedial investigation/feasibility study stage, such as for remedial design and implementation,
require more detailed information. This is similar to the second phase of mineral exploration and
likened to a high density, in-fill sampling approach. Finally, long-term monitoring is a
completely different objective; however, these data must be comparable to the initial sampling data
in order to determine if the remedial action is performing as intended on a long-term basis.
Often the first phase of design is haphazard, biased, and judgmental, and the question later arises
of how to compare monitoring data to initial results in a meaningful way.
Remediation projects often stall between the remedial investigation/feasibility study and
remedial design phases, and this often involves disagreements over data adequacy. One
approach may be to use the more "global" data to complete the remedial investigation/feasibility
study process, then include short-term sampling to delineate the areas to address during the
design phase. A decision chart illustrated such an approach. Use of a systematic sampling
design will also help with subsequent data analysis and comparability with future data. In
addition, addressing the question of whether the data are sufficient to conduct a cost-
effectiveness analysis may be another decision point on whether to proceed to design; this
translates into a "denser" sampling design at this stage of the remediation process.
An overarching recommendation was to invest in the initial sampling phases because the sample
data collected early in the Superfund process affect discussions for the life of the project, and the
impact of a poor design will be multiplied with time. The long-term cost of arguing over data
adequacy and validity is likely to be greater than the cost of collecting good data at the outset.
Of final note, each project may require a different design approach.
Several examples were explored to demonstrate the sampling design principles discussed in this
session. A contaminated sediments project on the Kalamazoo River in Michigan showed that
more than doubling the sampling (data) density confirmed earlier estimates and increased
resolution for the feasibility analysis. Optimization of the remediation decision should consider
the cost of sampling, moving clean material, and failure to properly target contaminated areas.
Another example addressed the use of a double sampling technique on a Michigan project.
Some physical variables are easier and less expensive to measure than contaminants and may be
used to correlate or extrapolate to contamination if a relationship between the two exists.
This can minimize analytical cost and generate data adequate to estimate large-scale parameters
such as volume/mass of contaminated sediments and average surface concentrations.
A third example involved infill sampling of contaminated soil at a lead smelter in Dallas. One
consideration was where to use this technique. Since the actions necessary to address areas
where the contaminant concentration is above cleanup goals or very low are known, infill
sampling is most useful in areas where concentrations are near thresholds for risk.
General recommendations were to use biased sampling initially to confirm if it is worth the time
and expense of doing something significant at a site, followed by broad-based, systematic,
unbiased sampling to get even coverage over the site and to get at the large scale parameters such
as mass and volume. In conjunction with statistical evaluation of data adequacy, targeted infill
sampling can then be used to refine and update the final design and implementation, followed by
confirmation sampling (which may be the densest sampling network of any remediation phase)
and broad-based long-term monitoring to evaluate performance, similar to the initial sampling.
Spatial Estimation
Dr. John Bing-Canar, with EPA Region 5, discussed the methods used for spatial estimation
(e.g., contouring) in the study of contaminated sediments. Spatial estimation involves a number
of methods to generate estimates for unsampled locations, and these estimates are functionally
dependent upon the surrounding data. The estimated values from spatial estimation support the
development of visualizations, aid in secondary sampling design, estimate mass and volume of
contaminated material, support decisions on areas to cleanup, and evaluate changes in sediment
surface over time including post-dredging evaluations.
Dr. Bing-Canar also discussed exploratory data analysis (EDA), which uses descriptive statistics
to examine extreme values and outlier tests to determine data acceptability. EDA also addresses
how to handle spatial duplicate values, the limit of detection (e.g., the value to use for "not
detected"), and coordinate transformation (e.g., "straightening" to represent a relationship in true
space). The FIELDS
system includes an algorithm to check and address values for spatial duplicates and has
transformation algorithms.
Many spatial estimation methods involve interpolations such as weighted or moving averages.
In general, these apply some weight to original data to create an estimate for a particular
sampling location. Commonly used spatial estimation methods include Inverse Distance
Weighting, Natural Neighbor, and Kriging among others. The differences are in the weighting.
For Inverse Distance Weighting, weights are inversely related to distance; points close by are
given greater weight than those farther away and directionality is not a factor. For Natural
Neighbor, weights depend on the geometric area around each data point. For Kriging, weights
derive from spatial correlation.
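The difference in weighting can be made concrete with a minimal Inverse Distance Weighting interpolator. This is an illustrative sketch, not the FIELDS implementation; the function name and the sample tuple format are assumptions.

```python
import math

def idw_estimate(x, y, samples, power=2):
    """Inverse Distance Weighting: estimate the value at (x, y) from
    measured samples [(xi, yi, value), ...].  Weights fall off as
    1 / distance**power, so nearby points dominate and directionality
    is not a factor.
    """
    num = 0.0
    den = 0.0
    for xi, yi, vi in samples:
        d = math.hypot(x - xi, y - yi)
        if d == 0:
            return vi  # exact hit on a sample: return the measured value
        w = 1.0 / d ** power
        num += w * vi
        den += w
    return num / den
```

Halfway between two samples the weights are equal, so the estimate is their simple average; raising `power` makes the estimate hug the nearest sample more tightly.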
A consideration in conducting spatial estimation or interpolation is cell size. The smallest cell
size is the size of the sampling core. It is best to use as small a cell size as possible because
larger cell sizes smooth the data and may misrepresent the information.
Data transformation can also influence the data representation unless the proper correction is
applied.
Geostatistics are a set of mathematical tools that describe (model) the spatial correlation of data
and make predictions about these data using Kriging, which can create a standard error to
understand the precision of the estimate. Geostatistics can also be used to look for directionality.
Often, especially in a riverine or harbor environment, the sediment contamination tends to follow
water flow. Another use of geostatistics is to evaluate the "dis-similarity" between data
separated by a distance. The natural logarithm is often taken of the original data and used for
analysis because it tends to "tighten" the information and make it easier to fit curves.
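Measuring "dis-similarity" against separation distance is what an experimental semivariogram does. The sketch below is a hypothetical helper, not FIELDS code: it bins sample pairs by separation distance and averages half the squared difference of their log-transformed values, following the log-transform practice noted above.

```python
import math

def experimental_semivariogram(points, bin_width):
    """Average half-squared difference ("dis-similarity") between
    log-transformed values, grouped into separation-distance bins.

    points: [(x, y, value), ...] with value > 0.
    Returns {bin_center_distance: gamma} for each occupied bin.
    """
    sums, counts = {}, {}
    n = len(points)
    for i in range(n):
        xi, yi, vi = points[i]
        for j in range(i + 1, n):
            xj, yj, vj = points[j]
            h = math.hypot(xi - xj, yi - yj)   # separation distance
            b = int(h // bin_width)            # distance bin index
            diff = math.log(vi) - math.log(vj)
            sums[b] = sums.get(b, 0.0) + 0.5 * diff * diff
            counts[b] = counts.get(b, 0) + 1
    return {(b + 0.5) * bin_width: sums[b] / counts[b] for b in sums}
```

A variogram that keeps rising with distance in one direction but not another is one way directionality, such as contamination following water flow, shows up in the data.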
Model validation helps to determine confidence in the interpolation. Cross-validation is a pre-
estimation method used for validation. Another method is to estimate error by determining the
difference between each original value and its respective interpolated value. Most important to
evaluate is where estimates are made between measurements rather than the measurement itself.
Other methods include data splitting and bootstrapping.
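Leave-one-out cross-validation, one common form of the cross-validation mentioned above, can be sketched in a few lines; the function name and the `estimator` callback signature are assumptions for illustration.

```python
import math

def loo_cross_validation(samples, estimator):
    """Leave-one-out cross-validation: withhold each sample in turn,
    re-estimate its value from the remaining samples, and return the
    root-mean-square estimation error.

    estimator(x, y, remaining_samples) can be any interpolator
    (inverse distance weighting, kriging, ...).
    """
    errors = []
    for i, (x, y, v) in enumerate(samples):
        rest = samples[:i] + samples[i + 1:]
        errors.append(estimator(x, y, rest) - v)
    return math.sqrt(sum(e * e for e in errors) / len(errors))
```

Because each sample is re-estimated from its neighbors, the reported error reflects performance at locations between measurements rather than at the measurements themselves.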
Take home messages are that it is just as possible to "lie" in data interpretation/analysis with
interpolation as with statistics. To avoid this, it is important to know exactly how the spatial
estimation was performed, what was done with the data before it was interpolated (e.g., data pre-
processing such as duplicate handling and data transformation), what interpolation method
(algorithm) was used, what parameters were used (e.g., number of neighbors, search radius), and
any other assumptions such as which areas were not interpolated or were excluded from analysis.
Spatial estimation, regardless of technique, is subject to all the limitations inherent in
prediction; no method gives an exact answer.
Of final note, insufficient data or highly clustered data lead to poor, misleading, and/or
inaccurate estimates; therefore, when to say "no" to proceeding with insufficient data for analysis
and requiring additional data collection is important. In addition, use of secondary sampling is
recommended such as adaptive fill and radials to quantify spatial correlation.
Decision Analysis
Mr. Charles Roth, with EPA Region 5, discussed decision analysis tools and their application to
support remediation decisionmaking. There is a need to start using more sophisticated
methodologies such as FIELDS not only to help with data analysis and decisionmaking, but also
because the potentially responsible parties are using more of these techniques. The decision
analysis tools of FIELDS provide information to decisionmakers in a useful form and the results
are repeatable by others. All the work is based on generating information that someone will use.
There are several decision analysis tools available. Maps are one such tool and are useful for
understanding the extent of contamination and the areas requiring remediation, which are easier
to see than to explain. Thus, maps can be very powerful communication tools. In one glance, the
problem and its extent can be understood. This is useful in public presentations as well as in
support of more detailed analysis of vertical contamination.
Data analysis focuses on important end products such as average concentration, volume of
contaminated soil/sediment, contaminant mass, and cleanup curves for decisionmaking as well as
pre- and post-remediation comparisons (e.g., were goals met). This enables identification of
areas above specified concentration levels and the conduct of mass/volume comparisons to
determine the need for different concerns/approaches to address contaminant variation by layer
in the subsurface. Ecological and human risk assessments provide a range of values that are
protective, and cleanup curves can be generated to help the decisionmaker assess action levels
and which techniques are most effective in consideration of cost and other factors. In addition,
how much a site costs to clean up depends upon the volume that must be cleaned up, and the
contaminant mass helps in understanding the extent of cleanup achieved and in evaluating
reductions in average contaminant concentration. Also, mapping mass estimates can help decisionmakers
address how to remove a specified percentage of contaminant mass.
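Removing a specified percentage of contaminant mass can be sketched as a greedy ranking of mapped cells. This is an illustrative algorithm only, not the FIELDS decision tool; the cell tuple format and function name are assumptions.

```python
def cells_for_mass_fraction(cells, target_fraction):
    """Select remediation cells that together contain at least
    `target_fraction` of the total contaminant mass, taking the most
    contaminated cells first.

    cells: [(cell_id, concentration, mass), ...]
    Returns (selected_cell_ids, fraction_of_mass_removed).
    """
    total = sum(m for _, _, m in cells)
    selected, removed = [], 0.0
    # Rank cells by concentration, highest first
    for cid, _conc, mass in sorted(cells, key=lambda c: c[1], reverse=True):
        if removed >= target_fraction * total:
            break
        selected.append(cid)
        removed += mass
    return selected, removed / total
```

Because the most contaminated cells typically carry a disproportionate share of the mass, a large fraction of mass can often be removed by remediating a comparatively small area.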
Tools available to help risk assessors with the human health and ecological risk assessments
include computer models to support human risk analysis, and algorithms to evaluate
contamination data in conjunction with levels that produce harm mapped to habitats to determine
impacts. There are also tools to estimate exposure over time.
There are also tools useful for pre- and post-remediation comparisons to follow cleanup
progress, assure that cleanup goals are met, and support continued monitoring of ecosystem
recovery. These tools
provide a standardization of the process and demonstrate that any variability found is not due to
analysis methods.
Additional information on FIELDS and the associated analytical tools is available at:
www.epa.gov/region5fields and www.tiem.utk.edu/~fields.
Section V: Year of Water—30 Years of Progress Through Partnerships
Tuesday and Wednesday, May 6-7, 2003
The purpose of this breakout session on the second and third days of the meeting was to focus on
human impacts on water systems, ecological and human health implications of impaired systems,
improved tracking and monitoring of water system degradation, improvement in overall water
quality, the relationship between drinking water and waterborne disease, and EPA partnerships
with state, local, and tribal governments on a variety of local and overarching water issues. Each
session included opportunities for a panel discussion and to respond to audience questions that
provided additional information and insight on a variety of water-related topics.
Dr. Fred Hauchman, with NHEERL, led a session addressing the implications and prevention of
waterborne disease in drinking water. Presentations included efforts to estimate the occurrence
of waterborne disease, implications for high risk, susceptible subpopulations, and the sources and
control measures for microbial contamination during drinking water distribution.
Ms. Katie Flahive, with the Office of Wetlands, Oceans, and Watersheds (OWOW), led a session
addressing the science of hypoxia and steps being taken to address the hypoxic region in the Gulf
of Mexico. A panel discussion presented an overview of hypoxia, the contribution of freshwater
rivers feeding into the Gulf of Mexico, and recent reports on the causes and potential solutions.
A separate presentation addressed the creation and implementation of an Action Plan to reduce,
mitigate, and control hypoxia in the Northern Gulf of Mexico.
Mr. Michael Slimak, with NCEA, led a session addressing the threats posed by invasive species
and diverse actions underway to prevent their introduction and control those already present.
Presentations included regulatory initiatives and partnerships to address the pathways for
invasive species introduction, highlights of USCG research activities, efforts to develop an
international treaty, the use of an electric barrier to control the spread of Asian carp, pesticide
control programs related to the management of invasive species, and precautions for prevention
actions derived from historic toxic chemical usage.
Mr. Bill Hirzy, with the National Treasury Employees Union, and Ms. Roberta Baskin, a Senior
Reporter, led a session addressing the arguments for and against the national policy for fluoride
addition to drinking water. Presentations included an overview of drinking water regulations and
the health benefits of fluoride addition to drinking water as well as a counter viewpoint on the
necessity for a national water fluoridation policy and health consequences of fluoride ingestion.
Mr. Kenneth Potts, with OWOW, led a session addressing current issues in coral reef
management and initiatives to develop biological indices to reduce ecosystem stressors locally
and globally. Presentations included human-induced and natural stressors on coral reef health,
regulatory strategies to develop and incorporate biocriteria pertinent to coral reefs, and the
development and application of indicators to provide early warnings of adverse changes to coral
reef health.
Mr. Jamal Kadri, with the Office of Water, led a session addressing initiatives and tools to
protect and restore watersheds from the impacts of population growth. Presentations included
Smart Growth principles and their application to watershed planning and restoration, impervious
cover as an indicator for watershed quality, and the use of partnerships and community
roundtables to aid in watershed protection.
Ms. Susan Holdsworth, with OWOW, led a session addressing innovative sampling methods and
data analysis techniques to identify stressors and to assess watershed condition in support of
decisionmaking. Presentations included the utility of biological indicators, the use of
probabilistic sampling and monitoring strategies, and a desktop tool for rapid wetlands
assessment.
Mr. Joe Hall, with OWOW, led a session addressing the increased use of volunteer monitoring
efforts in support of wetlands, coastal, and estuarine programs. Presentations included an
overview of volunteer monitoring activities, partnerships and tools supporting volunteer
monitoring for wetlands, volunteer monitoring and information sharing in support of estuary and
coastal condition evaluations, and future challenges and opportunities.
Waterborne Disease in the United States
Following opening remarks by Dr. Fred Hauchman, with NHEERL, three speakers addressed
waterborne disease trends and factors affecting microbiological contamination of drinking
water. Question and answer sessions occurred at the end of each presentation and at the end of
the session.
Dr. Fred Hauchman, with NHEERL, noted that safe drinking water can only occur if the source is
protected. Dr. Hauchman outlined how this session on waterborne disease would progress,
mentioning that this was one of the few drinking water programs scheduled, and discussed
EPA's strong partnership with the CDC in the areas of waterborne disease and drinking water
monitoring.
Drinking Water Related to CWA Endemic and Epidemic Waterborne Disease: An
EPA and CDC Partnership
Chief of the Epidemiology and Biomarkers Branch at NHEERL, Dr. Rebecca Calderon,
reiterated the importance of the EPA-CDC collaborations, which include waterborne disease
surveillance and outbreak investigations. EPA also maintains an official database of waterborne
diseases dating back to 1971. Dr. Calderon differentiated between endemic and epidemic
diseases, defining an epidemic as an increase above the background rate of illness. An outbreak
usually occurs when at least two persons are diagnosed with a similar illness; however, cases of
legionellosis are not included in the identification of an outbreak.
Epidemiological evidence must implicate drinking water in order to classify an outbreak as a
waterborne disease. Common drinking water micro-organisms that may cause gastrointestinal
illness include Giardia, Shigella, and the Norwalk virus, which is famous for its effects onboard
cruise ships.
Detection of an epidemic occurs when someone is in the right place at the right time and puts all
of the pieces together. Waterborne disease surveillance is a joint partnership with the states, but
is a passive surveillance system in that EPA must wait for states to report back. States are the
ones investigating and reporting incidences of disease, which leads to inconsistency of reporting
and underreporting.
Examination of drinking water disease outbreak trends over the last 30 years reveals a big
increase in drinking water-related disease outbreaks in the early 1980s. During this time, a
tremendous amount of infrastructure was put into place in state and local systems, which may
have led to contamination. As infrastructure systems are updated in the next decade, we may see
another such increase. The trends show drinking water disease outbreaks declining as the
drinking water regulations work to zero out contamination of drinking water supplies. Within 10
years, a similar trend is expected to be seen for groundwater as a result of upcoming regulations.
Drinking water intervention can occur either at the community level through the upgrade of
treatment before the water is distributed or at the household level by increasing the quality of
water in the home. EPA has typically conducted household intervention studies. Three studies
involve 300 families recording daily diaries of incidences of diarrhea and vomiting, as follows:
• Community I: October 1995 to January 1999, completed
• Community II: September 1999 to December 2001, currently evaluating data
• Community III: January 2003 to June 2003, in the last stages of the experiment
The results of the Community I study indicate a decrease of approximately 30 percent in
gastrointestinal illness rates that is most likely attributable to increased quality of water. Water
quality testing for parasites showed no real change with parasite population counts remaining
negative the entire study. However, there was a drop in heterotrophic plate count and turbidity.
There was also a significantly lower rate of gastrointestinal illness in children, but hospitals did
not report lower rates of illness. Cryptosporidium showed no change, leading researchers to
believe that Cryptosporidium is not a significant drinking water contaminant in that community.
An analysis of household intervention studies indicates that all systems met United States
drinking water standards after treatment. The first study reported that 35 percent of microbial
gastroenteritis is attributable to water, and the second study reported that 14 percent of microbial
gastroenteritis is attributable to water. An Australian study showed similar results. The results
from an Iowa study are still pending.
There is a need for a national estimate of waterborne diseases. However, this has not been easy
to identify. Two components of a national estimate include the population attributable risk
(illness due to drinking water exposure) and the incidence of acute gastrointestinal illness. In
1998, Foodnet conducted telephone surveys to identify background rates of illness for diseases
from all causes. Three rounds of surveys were conducted for a total of 32.2 million people
surveyed (12 percent of United States population). This survey found that 1 in 12 persons seek
medical care.
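The two components of a national estimate combine multiplicatively, which can be shown with a trivial sketch; the population and incidence figures in the example below are placeholders for illustration, not survey results.

```python
def national_waterborne_cases(population, agi_episodes_per_person_year,
                              attributable_fraction):
    """Annual waterborne illness cases = population
    x background acute gastrointestinal illness (AGI) episodes per
    person-year x fraction of AGI attributable to drinking water."""
    return population * agi_episodes_per_person_year * attributable_fraction

# Placeholder population and AGI rate; the 14 and 35 percent attributable
# fractions are the range reported by the two intervention studies noted
# earlier in this session.
low = national_waterborne_cases(280e6, 0.7, 0.14)
high = national_waterborne_cases(280e6, 0.7, 0.35)
```

The wide spread between the low and high bounds illustrates why pinning down the population attributable risk is the critical step in producing a usable national estimate.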
Epidemic studies continue to examine waterborne disease outbreaks, emerging micro-organisms
(e.g., Cryptosporidium), and water treatment technology/operations. Endemic intervention
studies show that water treatment improvements can reduce diarrheal disease. Future research
collaborations include:
• Continuing to investigate and evaluate epidemic levels
• Preparing for water security
• Developing a national estimate of waterborne diseases for endemic levels
• Examining the distribution system
• Focusing on susceptible populations.
When asked about any international considerations/partnerships, Dr. Calderon explained that
EPA has had meaningful discussions with Mexico. Mexico is better at collecting and reporting
data for diarrhea and vomiting. The challenge is obtaining access to the data, as Mexico is
currently unwilling to provide such access. EPA will continue to work with the Mexican
government to improve information sharing. There also is a tremendous effort underway to
EPA SCIENCE FORUM 2003 PROCEEDINGS 87
-------
improve the infrastructure in Mexico with EPA sponsorship of supporting programs, such as the
Border 2012 program.
A question also arose regarding sensitive subpopulations in the first community intervention
study. While the study did not specifically consider individuals with compromised immune
systems, the trends for the elderly looked similar to those for children. Both the elderly and
children are high-risk populations. However, the risk for children decreases as they approach
adolescence.
Using Randomized Trials to Study Waterborne Pathogens Among Susceptible
Populations
Associate Professor of Epidemiology at the University of California, Berkeley, Dr. Jack Colford,
presented a study of drinking water intervention in susceptible subpopulations, specifically
human immunodeficiency virus (HIV)-positive populations, to develop an estimate of the risk of
gastrointestinal illness attributable to drinking water. Prior research indicates that anywhere
from 0 to 40 percent of gastrointestinal illness can be attributed to drinking water
and up to 85 percent of the cases of cryptosporidiosis could be attributable to drinking water.
Prior to this study, randomized drinking water intervention trials had not been conducted for HIV
subpopulations even though drinking water is a concern for this subpopulation. This was a
randomized, triple blinded home intervention study that included 50 individuals. Participating
individuals were provided with a water filtration device installed in their home (attached to the
faucet) and participants filled out a daily diary for 16 weeks. The study was a triple blinded trial
in that neither the participants, the investigators, nor the statisticians/data evaluators knew which
home received an active or dummy water filtration device. The "blinding" of the participants
was done in order to prevent any personal sense of what their own health should be when
reporting. The success of blinding is measured by a blinding index, with scores ranging from
zero (worst case) to 1.0 (perfect) and 0.5 being an acceptable score.
Ninety percent of the initial participants completed the study (89 percent who had an active
device and 97 percent who had the dummy device). A highly credible gastrointestinal illness
(HCGI) requires that there be six disease-free days between episodes before an illness can
qualify to be measured. The results are as follows:
• Dummy devices—31 HCGI episodes, 6.3 episodes/person/year
• Active devices—16 HCGI episodes, 4.0 episodes/person/year
• Odds ratio of illness (dummy vs. active)—3.3.
The results had a blinding index of 0.62, indicating that the majority of the individuals were
unable to identify which device they had been given.
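The episodes-per-person-year figures above are rates of the form episodes divided by person-time. A sketch of that calculation, using hypothetical group sizes since the study's actual denominators are not reported in this summary:

```python
# Sketch of the rate calculation behind "episodes/person/year" figures in a
# trial like this one. The group size and follow-up period below are
# hypothetical; the study's actual denominators are not given here.

def episodes_per_person_year(n_episodes, n_participants, weeks_followed):
    """Illness rate as episodes per person-year of follow-up."""
    person_years = n_participants * weeks_followed / 52.0
    return n_episodes / person_years

# Hypothetical group: 31 episodes among 25 participants followed 16 weeks.
rate = episodes_per_person_year(n_episodes=31, n_participants=25, weeks_followed=16)
print(round(rate, 1))
```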
While this study is not large enough to support recommendations based on its results, it does
demonstrate the feasibility of conducting these types of trials; its findings are consistent with
those presented in prior research; and, if large enough, future studies could have an
impact on drinking water habits.
When questioned about future drinking water research priorities, Dr. Colford
recommended a focus on (1) smaller, well-designed studies made as large as feasible rather than
larger time-series studies, (2) more specific pathogen identification, and (3) targeting susceptible
populations. When asked what study size would prove large enough to support
recommendations, a tentative number of about 150 was suggested.
Maintaining Microbiological Quality of Drinking Water in the Distribution System
Director of Research at American Water, Dr. Mark LeChevallier, discussed monitoring to
maintain the biological integrity of water distribution systems to ensure that high quality water
from a treatment plant maintains that quality in distribution and receipt. The first concern is how
microbes enter the system - by growth or through distribution system contamination. Studies
based on coliform control show a complex interaction of factors related to regrowth:
• Filtration
• Temperature (the summer months are more problematic)
• Disinfectant type and residual
• Assimilable Organic Carbon (AOC) and Biodegradable Dissolved Organic Carbon (BDOC)
levels
• Corrosion (a potentially large factor, as it provides an area that may protect bacteria from
disinfectants)
• Characteristics of the distribution system.
Mycobacterium avium complex (MAC) occurs in water supplies and includes M. avium and M.
intracellulare. MAC is resistant to disinfection and regrows in biofilms. Disinfection studies
show that MAC is hundreds of times more resistant to chlorine than E. coli. Individuals with
compromised immune systems are at the greatest risk, and MAC has been documented in Boston
and San Francisco, both of which are areas with large immunocompromised subpopulations.
Current water treatment is capable of removing MAC. Research indicates lower MAC levels in
the distribution system near the treatment plant with levels increasing later in the distribution
system. Research demonstrates that rates of gastrointestinal illness are lower near the treatment
plant.
There are three ways to control MAC: (1) reduce turbidity, (2) reduce nutrients, and (3) heat
treat the water. Increases in M. avium levels correlate with elevated AOC and BDOC levels;
therefore, reducing nutrients in the water would decrease the presence of M. avium.
Groundwater has lower levels of AOC because the water percolates through the ground,
removing much of the AOC. When the water temperature was increased to 52 degrees Celsius,
no M. avium were detected in the water (>99.9 percent inactivation). This indicates that a good
treatment option for M. avium may be to flush the system with hot water.
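The ">99.9 percent inactivation" figure corresponds to a reduction of more than three orders of magnitude ("3-log"), the conversion commonly used in disinfection work. A generic sketch of that conversion, not part of the study's methods:

```python
import math

# Generic converter between percent inactivation and log reduction, the
# usual disinfection metric. 99.9 percent inactivation equals a "3-log"
# reduction. This is an illustrative sketch, not the study's calculation.

def log_reduction(n_initial, n_final):
    """Log10 reduction between initial and surviving organism counts."""
    return math.log10(n_initial / n_final)

def percent_inactivation(logs):
    """Percent of organisms inactivated for a given log reduction."""
    return (1 - 10 ** -logs) * 100

print(log_reduction(1_000_000, 1_000))        # 3.0 logs
print(f"{percent_inactivation(3):.1f}")       # 99.9 (percent)
```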
Surge analyses show that anything with the potential to quickly stop water flow in the
distribution system (such as service interruptions, sudden changes in demand, and distribution
system operation) can cause pressure transients resulting in pressure waves traveling through the
distribution system. These surges may produce negative pressures, and hydraulic surge waves
may be additive, thereby increasing the stress on the distribution system. At times of negative
pressure, microbes may enter.
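The magnitude of such pressure transients can be estimated with the classical Joukowsky water hammer relation, delta-p = rho x a x delta-v. The sketch below assumes a typical pressure wave speed for water mains, a value not given in the presentation:

```python
# Order-of-magnitude sketch of a pressure transient using the Joukowsky
# relation (delta_p = rho * a * delta_v). The wave speed assumed here is a
# typical value for water in buried pipe, not a figure from this session.

def joukowsky_surge(rho_kg_m3, wave_speed_m_s, delta_v_m_s):
    """Peak pressure change (Pa) for an instantaneous velocity change."""
    return rho_kg_m3 * wave_speed_m_s * delta_v_m_s

# Water (1000 kg/m^3), assumed ~1200 m/s wave speed, 1 m/s sudden stoppage:
surge_pa = joukowsky_surge(1000, 1200, 1.0)
print(f"{surge_pa / 1e5:.0f} bar")
```

Even a modest 1 m/s flow interruption produces a transient on the order of 12 bar, which is why sudden valve closures and pump trips are a distribution system concern.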
The amount of separation between water distribution and sewer lines is also of concern due to
the potential for microbes to migrate between pipelines from leaks or breaks. Typical separation
distance is 10 feet, but standards allow for a minimum of 18 inches of separation. Research has
identified a correlation in the winter months between the number of leaks repaired and an
increase in coliphage concentrations.
Maintaining water quality in the distribution system will dominate drinking water concerns of
the next decade. More information is needed on pressure transients and the determinants of
regrowth. Also needed are ways of measuring residual effectiveness for intrusion control, and a
better understanding of the system characteristics leading to intrusion.
When asked about the importance of the home distribution systems (pipes, etc.), the ratio of
surface area to volume was noted as important for growth of pathogens. In the water distribution
system, a large ratio between volume and surface area decreases the likelihood of contamination.
Cross-connections typically found in households significantly increase the risk of contamination.
There is usually higher surface area in homes relative to the distribution system resulting in a
greater opportunity for contamination in the household systems. Older homes have copper
plumbing systems, while the current standard is polyvinyl chloride (PVC) piping. Both
materials have advantages and disadvantages relative to biofilm growth. The plastic surface of
PVC allows for disinfectants to be more effective. Using heat treatment with copper pipes
requires a lower heated water temperature to inactivate biofilm than for PVC. Therefore, the
relative importance of the material of the home distribution system depends on which treatment
methodologies are preferred.
Panel Discussion/Questions and Answers
The speakers had an opportunity to participate in a brief panel discussion drawing on questions
from the audience.
A brief question and answer period addressed a single topic involving potential differences
between rural and urban areas in waterborne disease outbreaks and the absence of studies on
small populations.
Mississippi River Basin Hypoxia
Following opening remarks by Ms. Katie Flahive, with OWOW, a panel discussion addressed
the science of hypoxia, and the session concluded with a presentation on the steps being taken to
mitigate hypoxia.
Ms. Katie Flahive, with OWOW, introduced the panel discussion participants: Mr. Lee Mulkey,
Associate Director for Ecology at the National Risk Management Research Laboratory
(NRMRL); Dr. Mary Belefski, with OPPTS; and Dr. Rochelle Araujo, Associate Director for
Ecology at NERL. Ms. Flahive also briefly introduced the concern of hypoxia in the Gulf of
Mexico as the basis for the panel discussion.
Hypoxia in the Gulf of Mexico
Mr. Lee Mulkey initiated the panel discussion by providing an overall perspective for the
concern over hypoxia using the Gulf of Mexico as an illustrative example. The Gulf of Mexico
drainage area is equivalent in size to two-thirds of the land area of the United States. After
dealing for many years with point sources of water pollution (e.g., water treatment plants,
industrial plants), the nation now faces a dominant non-point source problem, and identifying
the sources of nutrient loading into freshwater river basins is difficult.
Estuaries, in turn, are extremely vulnerable because freshwater rivers are discharging large
amounts of nutrients into these areas. The eutrophication process in estuaries is recognized as a
national problem. Increased nutrient loading leads to an increase in biological productivity and
results in an increase in organic matter, which in turn leads to dissolved oxygen depletion and the
hypoxia problem.
Non-point sources in the Midwest are contributing excess nutrients to the freshwater rivers
feeding into the Gulf of Mexico; the hypoxia problem therefore becomes a transboundary issue.
Because the sources do not internalize these costs, this is an area ripe for the social sciences
and for examining free-market solutions. Assistant Administrator Tracy Mehan is
very interested in examining the solutions that a free-market model may bring. Modeling and
sampling continue in the hypoxia region within the Gulf of Mexico, and EPA's EMAP is
collaborating with state and local programs to work towards solving the hypoxia problem. This
is the water quality challenge. The questions regarding who pays and how to resolve this
equitably remain unanswered.
Dr. Mary Belefski identified six key reports related to the science of hypoxia:
1. Committee on Environment and Natural Resources of the Office of Science Technology
Policy - Characterization of Hypoxia: Nutrient Loading ratios in the Gulf of Mexico. This
was a heavily peer-reviewed scientific study of the hypoxia zone.
2. Ecological and Economic Consequences of Hypoxia. This report examined the economic
consequences in other hypoxic areas in the world such as the Black Sea. An economic loss
due to hypoxia in the Gulf of Mexico is not being seen because there has not been a change
in fish harvesting rates. Current science does not enable identification of how current
hypoxic conditions in the Gulf of Mexico compare to the historic stages of hypoxia
development in the Black Sea, which makes it difficult to predict potential economic losses
for the Gulf of Mexico and when those may occur.
3. Flux and Sources of Nutrients in the Mississippi-Atchafalaya River Basin. This report
presented a study involving a mass balance analysis of nutrients (nitrogen and phosphorus)
and the flux of these nutrients into and out of this river basin. This study found that 90
percent of the nutrient sources now come from non-point sources.
4. Effects of Reducing the Loads to Surface Waters within the Mississippi River Basin and Gulf
of Mexico. A University of Minnesota professor in collaboration with others examined
nutrient loading models to answer the question as to what would happen if nutrient loads
were reduced.
5. Reducing Nutrient Loads, Especially Nitrate-Nitrogen, to Surface Water, Groundwater, and
the Gulf of Mexico. This study examined the methods to reduce the nutrient load with an
emphasis on restoring wetlands in the watershed to act as filtering mechanisms. This study
also considered the impacts of changing agricultural practices to reduce loading.
6. Evaluation of Economic Costs and Benefits of Methods for Reducing Nutrient Loads to the
Gulf of Mexico. This study evaluated the costs to implement the load reduction programs as
identified in the report in Item #5, above.
NOAA has published all the reports on its website at:
http://www.nos.noaa.gov/products/pubs_hypox.html.
Action Plan for Reducing, Mitigating, and Controlling Hypoxia in the Northern
Gulf of Mexico
Ms. Katie Flahive, with OWOW, presented the Action Plan for Reducing, Mitigating, and
Controlling Hypoxia in the Northern Gulf of Mexico (2001) that was developed through
collaboration of nine Federal agencies, nine states, and two tribes. The Action Plan is designed
to work on a basin-by-basin basis.
In 1985, the first measurements of the hypoxic zone in the Gulf of Mexico were recorded.
Published articles and public notifications increased public awareness of the problem. In 1996,
EPA convened the Federal Principals, which provided the foundation for a task force. The
task force has met nine times since 1997, and is open to public comments. In 1998, Congress
passed the Harmful Algal Bloom legislation and Hypoxia legislation. The Action Plan was then
developed and published in January 2001. The task force conducted an analysis of point source
loading to the region and estimated that annual nutrient discharges for total nitrogen and
phosphorus were occurring from 11,500 point source facilities. With nine river basins included
in the scope of concern, the Ohio River valley and middle Mississippi River valley are major
areas of concern for nitrogen.
The collaborative principles of the task force are to: (1) improve the scientific understanding of
hypoxia causes and solutions by increased monitoring throughout the basin, (2) maintain national
focus on the basin and increasing communication between the public and scientific community
for an overall behavior change in the basin using the Chesapeake Bay area as a model, (3)
address voluntary and practical "win-win" projects for local impairments and the larger hypoxia
problem, and (4) provide measurable outcomes, parallel sciences, and adaptive actions.
The size of the hypoxic zone is increasing each year. In 2001, the Mississippi-Atchafalaya
River Basin hypoxia zone was larger than the size of Chesapeake Bay. Trends indicate that the
hypoxic zone is growing as normal rainfall continues year by year, and that it is nearing the size
of Lake Erie. A 5-year running average estimates the size at 14,000 square kilometers.
The task force developed three goals to address these changes:
• Coastal goal to reduce the size of the hypoxic zone to less than 5,000 square kilometers by
2015
• Quality of life goal to improve community and economic conditions across the Basin
• Basin goal to restore and protect the waters of the 31 states and 77 tribes in the Basin.
The task force is committed to 11 actions, the most notable of which are to develop another
budget proposal this year; to get all sub-basin committees running (currently about half are); to
develop a pilot trading program to reduce nitrogen loading from point sources throughout the
Basin via voluntary actions; to use farm bill funds to support assistance to landowners and
agricultural producers; and to assess the results of these actions in 2005 and every five years
thereafter, leading to the goal of reducing the hypoxic zone to less than 5,000 square kilometers
by 2015.
More information on the trading program is available at www.nutrientnet.org and at
http://www.envtn.org/.
Panel Discussion/Questions and Answers
The speakers had an opportunity to participate in a brief question and answer session with the
audience.
A brief question and answer period addressed a range of topics. These included: (1) the
behavior changes that led to reduced nutrient loading in the Chesapeake Bay; (2) the utility of
mandatory nutrient management plans, required by a few states to reduce nutrient loading, and
the need for sub-basin areas to work together; (3) incorporation of lessons learned from trading
programs in Connecticut and New York and the challenges of integrating non-point sources into
such programs; (4) wastewater treatment plants as main sources of nutrients; and (5) credit
recipients of trading programs (farmers, producers).
The Millennium Challenge: EPA's Response to Invasive Species
Following opening remarks by Mr. Michael Slimak, with NCEA, and Ms. Marilyn Katz, with
OWOW, six speakers addressed diverse Federal and international initiatives to prevent, manage,
and control invasive species. A panel discussion including an audience question and answer
period followed the presentations.
Mr. Michael Slimak, with NCEA, provided opening remarks and defined invasive species as
non-native species (plants, animals, and microbial pathogens) that harm the environment, public
health, agriculture, and industry. Scientists, business leaders, and natural resource managers
acknowledge that invasive species are among the most serious ecological, human health, and
economic threats of the 21st century. Invasive species infest every state and affect everything
from biodiversity to grazing and agricultural lands. In 1993, a report by the Office of
Technology Assessment concluded that a lack of interagency cooperation was a significant
barrier to addressing the threat posed by invasive species. This report led to Executive Order
13112, which established the National Invasive Species Council. The cost of invasive species
exceeds $100 billion annually in the United States and ranks with habitat loss and fragmentation
as among the most critical threats to maintaining ecosystem integrity.
Partnerships are essential to deal with the problems posed by invasive species. EPA is an active
partner of the National Invasive Species Council, and has helped to establish the National
Invasive Species Management Plan. Additionally, EPA is a member of the Aquatic Nuisance
Species Task Force established to coordinate efforts to combat non-indigenous aquatic species in
the United States. EPA is an active partner with other agencies in the war on invasive species
and looks to use regulations, better information, research and monitoring, and other measures to
help combat this complicated problem that does not have an easy solution.
Ms. Marilyn Katz, with OWOW, introduced the session speakers.
The Office of Water Perspective
Assistant Administrator for the Office of Water, Mr. G. Tracy Mehan III, provided the Office of
Water's perspective on invasive species. Historically, EPA may not have had a large role in this
issue; however, this is changing as EPA looks to combat biological threats. Paraphrasing a
statement by Dr. Bill Cooper, the biggest risk to the integrity of the fauna and flora of the Great
Lakes is not toxic substances, but invasive species. A greater return may be achieved by
allocating investments to combating invasive species than to the incremental cleanup of toxic
substances. The future will focus on ecological risks rather than the human health side.
Controlling invasive species is a costly endeavor as evidenced by the 50 years of control of the
sea lamprey, which entered the Great Lakes through the St. Lawrence Seaway. This low-level
control program has cost tens of millions of dollars, and is the basis of the entire
fishery system in that region. Zebra mussels and the spiny water flea are two additional
examples of invasive species in the United States. Approximately 160 such invasive species
have entered since the opening of the St. Lawrence Seaway and from ballast water discharges.
Predictions based on shipping patterns identify the potential for 17 new introductions of invasive
species from the Baltic and Caspian Sea alone. The possible entry of the Asian carp into Lake
Michigan is of current concern. The massive scope of these challenges presents a tremendous
threat to the waters of the United States, and the threat of invasive species ranks second only to
the loss of habitat. Over 500 invasive species inhabit the coastal and marine habitats of North
America.
Efforts taken by the Federal government are encouraging. This is being viewed as a watershed
management issue with invasive species, both aquatic and terrestrial, as integral parts and threats
that need to be addressed. The "principle of prevention" may need to be applied; once these
species are established, they are here forever, and at that point the best that can be done is to
focus on integrated pest management. As a result, there is a need to focus on preventing the
introduction of invasive species through the key vectors for introduction, which is ballast water
for aquatic invasive species. A large toolbox or toolkit is needed due to the massive scope of the
challenges and because the threat is multifaceted. Five areas of activity are of immediate focus:
• Prevention—working with partners such as the United States Fish and Wildlife Service and
the USCG
• Rapid response and monitoring—the Australians have been leaders in this area
• Controlling and managing the invaders—necessary, but also the least preferable approach,
as it is after-the-fact management
• Education and outreach—to inform the public of the magnitude of this concern
• Leadership—show by example that invasive species control is a priority and place invasive
species near the top of the list of threats.
Ballast water is critical in the Great Lakes as well as the coastal areas, and this needs to be better
understood and controlled. The ETV Program is involved in the identification of new
technologies to assist in the fight against invasive species. EPA also plans for more research
investment in this area, as it is the "preeminent environmental issue."
United States Coast Guard Research: Research in Support of the Coast Guard's
Program to Prevent the Introduction of Nonindigenous Species by Ships
Dr. Richard Everett, with the USCG, provided highlights of research conducted by the USCG
related to invasive species, research results, and initiatives to actively combat invasive species
entry routes using new technology, regulations, and best management practices. Since the
USCG is not a basic science research agency, all of the USCG research directly supports basic
operations or its regulatory activities. Research has identified that the magnitude and relative
importance of ship-mediated invasions have increased, and an increase in regulations and controls
in this area should be expected in the near future.
The ship-mediated vector includes the following pathways for invasive species introduction:
ballast water discharge, hull fouling, prop fouling, sea chest fouling, and the chain locker.
The main types of ships responsible for ship-mediated invasions include passenger vessels/cruise
ships, container ships, and tankers. Each type of vessel discharges different quantities and
qualities of water. On average, tankers discharge the largest amount of ballast water (over
10,000 cubic meters), with passenger ships discharging the least amount (only a few hundred
cubic meters of water). Rates of discharge and the age of water differ by ship type as well.
Legislative directives such as the Nonindigenous Aquatic Nuisance Prevention and Control Act
(1990) and the National Invasive Species Act (NISA, 1996) have directed USCG research to
address the ship vector and invasive species. NISA created the National Ballast Information
Clearinghouse, which requires reporting from all vessels with foreign ballast water. USCG
regulatory activities require
all vessels to report outside of 200 miles, and require all vessels to conduct active ballast water
management. This may lead to establishing a quantitative ballast water discharge standard.
Verification of ballast water exchange is a research priority. In the Great Lakes region, salinity
is used currently to verify that ballast water exchange has occurred. Research partners include
the Smithsonian Environmental Research Center and international agreements.
There is both a national and international effort to develop a discharge standard. A percent
reduction standard is under consideration, but this may prove problematic due to the difficulty in
identifying the initial biotic composition of the water.
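The difficulty can be illustrated numerically: the same treated discharge corresponds to very different percent reductions depending on the unknown initial concentration in the uptake water. A sketch with illustrative numbers only:

```python
# Why a percent-reduction ballast water standard is problematic: the same
# treated discharge maps to very different reduction percentages depending
# on the unknown starting concentration. All numbers here are illustrative.

def percent_reduction(initial_org_per_m3, final_org_per_m3):
    """Percent of organisms removed relative to the uptake water."""
    return (1 - final_org_per_m3 / initial_org_per_m3) * 100

# Two ships discharge water treated to the same 10 organisms per cubic meter:
print(percent_reduction(10_000, 10))  # ship with organism-rich uptake water
print(percent_reduction(100, 10))     # ship with organism-poor uptake water
```

The first ship achieves 99.9 percent reduction and the second only 90 percent, even though their discharges are biologically identical, which is one argument for a concentration-based standard.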
Treatment of ballast water is a challenge because ballast water flow rates are very high
(approximately 2,500 cubic meters per hour) and a wide range of organisms in the water makes it
difficult to establish a universal standard. The USCG is working in partnership with the EPA
ETV Program to identify technologies that will assist in ballast water monitoring and treatment.
International Treaty Effort to Address the Transfer of Invasive Species Via Ballast
Water
Ms. Kathy Hurld, with OWOW, presented the international perspective on invasive species,
progress toward the development of an international ballast water treaty, and the challenges in
keeping the treaty in line with the goals of the United States. Currently, there are no standards,
no available technologies, and many questions that still need to be answered, yet the regulatory
process must move forward. An international ballast water treaty is necessary because ships
travel internationally and most ships are not registered in the United States, rendering any United
States regulations useless. A ballast water treaty will help to protect biodiversity and provide a
balance between commerce and the environment. There are many Federal agencies working
toward developing the treaty via an interagency working group to identify objectives important
to the United States, which include the following:
• Setting standards stringent enough to drive technology
• Basing the treaty on sound science
• Setting verifiable, measurable, and enforceable standards
• Having everyone on board because it is an international effort
• Preserving the right of states with ports to take more stringent measures
• Preserving the freedom of navigation since port delays due to monitoring would be costly
• Completing the treaty in a quick timeframe.
The International Maritime Organization faces several challenges as discussions toward an
international treaty move forward:
• Whether a standard should be based on discharge or best available technologies
• The need to set a numerical standard based on concentration not percent removal
• Whether to set an interim standard and then move toward a final standard that is biologically
meaningful
• How to sample in port without causing large delays
• The timing of when the treaty would enter into force (i.e., sooner vs. later)
• Applicability of the standard for existing ships and ships built in the future, and the
availability of technology to set a standard for all types of ships, including passenger ships (a
preference of the United States)
• Whether countries are ready for a diplomatic conference at which to sign a treaty.
The current timeframe includes a meeting in July 2003 of the Maritime Environmental
Protection Committee of the International Maritime Organization followed by a diplomatic
conference in February 2004. Uncertainty still exists as to when the United States will ratify the
treaty or when the treaty will enter into force. The principal objective is to have a treaty that
provides a vehicle compatible with United States goals and other Federal programs. The
United States has successfully pushed for a discharge standard in the treaty, and NISA is likely
to be the vehicle for implementing regulations. Every step of the process is being done through
partnerships that need to be maintained as efforts move forward.
A "Shocking" Solution for Controlling the Spread of Asian Carp into the Great
Lakes
Dr. Marc Tuchman, Team Leader for both the Sediment Assessment and Remediation Team and
the Invasive Species Team at the Great Lakes National Program Office (GLNPO), presented a
localized, species-specific approach to controlling the spread of invasive species. The GLNPO is
using an electric fence in the Illinois River to prevent the migration of the Asian carp into Lake
Michigan. If the carp were to reach Lake Michigan, then management of their populations may
be nearly impossible and the ecological damage would be significant.
Asian carp were imported into Missouri catfish farms for the purpose of cleanup, since they are
planktivores, and involve three species:
• Bighead carp, which is moving up the Illinois River towards Lake Michigan
• Silver carp from China, which can grow up to 100 pounds and have been documented as
being able to jump out of the water
• Black carp from China, which is used to clear snails out of catfish farms and can grow up to
70 pounds.
In the 1990s, Asian carp escaped the catfish farms as a result of accidental release and flooding,
and the populations exploded in size. In a Missouri River fish kill, 97 percent of the fish were
Asian carp and only 4 native species were found. Asian carp grow rapidly, consume large
amounts of plankton, outcompete the native species, and disrupt the food chain. The State of
Indiana has banned the transport, sale, and possession of the Asian carp. There have been
reports that the black carp has also escaped, but this has yet to be officially confirmed. If this
species were released, it may have the greatest impact, since it also feeds on mollusks, which
could potentially disrupt industries dependent on mollusks.
The Great Lakes are well suited to the Asian carp, since water temperatures there are similar to
those this fish species prefers. The question now is how to stop the Asian carp from entering the
Great Lakes. Various prevention alternatives were considered, and an electrical organism barrier
about 30 miles from Lake Michigan was selected as the most practical solution. The electrical
barrier was designed by the United States Army Corps of Engineers and spans a 50-foot stretch
of river. The sensitivity of the electrical field was of concern, since the field would need to be strong
enough for the fish to feel it and turn away but not strong enough to stun or kill the fish. Funding
constrained the scope of the project, providing for a barrier with only a three-year life span. The
barrier was constructed in April 2002 in an industrial area; therefore, barges serving the
industrial area may provide a pathway for carp through the electrical barrier causing the efficacy
of the barrier to be in question.
Monitoring activities in the area include the tagging of fish. Current predictions were that the
first tagged fish would encounter the barrier in about six months. However, one tagged carp
recently passed through the barrier, and uncertainty exists as to how this happened. As a result,
the electrical field of the barrier was increased. A power surge shut down the electrical field for
a day, so some fine-tuning of the barrier is required over the next four to five months before the
Asian carp reach the barrier. A second barrier, with a 25-year life span, is being constructed
about 1,000 feet downstream.
Introduced (invasive) Species and Pesticide Control Programs
Mr. Daniel Rosenblatt, Team Leader for the Emergency Response Team with OPP, discussed the
authorities of Federal pesticide laws and programs applicable to controlling the spread of
invasive species and associated health considerations related to pesticide use. The two major
pesticide laws are FIFRA and the Federal Food, Drug, and Cosmetic Act.
FIFRA is the primary pesticide law, focusing on licensing of pesticides. FIFRA Section 18
contains a key exemption provision that provides the EPA Administrator with the authority to
waive the pesticide use requirements in the event of an emergency. FIFRA Section 18 also
allows for quarantine pesticide programs, which are often pursued on Plant Protection Act
grounds by the USDA Animal and Plant Health Inspection Service. Some of the more common quarantine
exemption programs include the use of Spinosad for the Mediterranean fruit fly (Med-fly) in
Florida and California, the use of acetaminophen for the brown tree snake, the use of caffeine for
the Coqui frog in Hawaii, and the use of Diquat to combat the snakehead fish in Maryland.
The Med-Fly Program uses border control and field detection in the high-risk, warmer, citrus-
producing states such as Florida and California. In the 1980s, Tampa, Florida, and Los Angeles
County started using the preventive release of sterile flies to combat increasing Med-fly
populations. While this program has shown some degree of success in maintaining control over
Med-fly populations, it is a very costly program. Other pesticide programs target adults and
immature life stages. In the event of an outbreak, chemicals are applied that are specifically
targeted to each life stage. Regulatory control of host material is another measure used to
combat invasive species.
The application of pesticides under the Med-Fly Program differs from other invasive species
programs in that the objective is eradication rather than reduction. Outbreaks of the Med-fly are
typically seen in populated or high-traffic areas (airports, etc.), which increases the difficulty of
pesticide application. In addition, there is disagreement regarding the best response method—
ground versus air application. Additionally, community tolerance for pesticide exposure varies
by locality, and provisions for advance notification to protect natural resources and sensitive
persons are realities faced in the field.
The EPA role in exemption requests is to assess the aggregate risk, factoring in dietary risks, the
bystander risk analysis, the occupational risk for the individual applying the pesticide, the impact
on non-target species including endangered species, and the necessary response time. Key
accomplishments in the Med-Fly Program include the adoption of a reduced risk pesticide,
Spinosad, the ongoing sterile fly release in high-risk areas (California and Florida), and
community participation in prevention programs.
In summary, the FIFRA quarantine exemption process is a key provision for EPA use on
introduced and invasive species. EPA is responsible for preparing the risk assessment for
quarantine uses of pesticides and the EPA Region provides field presence and technical support.
There is an increasing national and bio-security aspect of the program as well. For example, the
cleanup of the Hart Building and other buildings in Washington, DC, was based on a Section 18
exemption for a variety of decontamination chemicals.
Environmental Perspectives on Invasive Species Control: Precaution and
Prevention
Ms. Jacqueline Savitz, Pollution Campaign Director and Senior Scientist for Oceana, reiterated
the need for invasive species control programs using prevention of introduction as the crux for
invasive species management and asked for careful consideration and precaution when
introducing toxic chemicals into the environment as evidenced by past chemical uses. Before
using toxic remedies, consider all unintended consequences to the extent that they can be
anticipated, including long-term effects, fate and transport, protecting the integrity of the
community structure, potential chemical interactions (both additive and synergistic), the
low-level effects on non-target species (immune and reproductive systems), and the ability for the
ecosystem to recover. Many examples were provided that illustrated the negative consequences
and our history of chemical use gone awry.
When taking a precautionary approach, consideration of the non-toxic alternatives, prevention
options, and remedy effectiveness are important. Prevention of invasive species introductions
should be the primary course of action. Ballast water is the leading cause of aquatic species
invasions, yet no mandatory controls exist except in the Great Lakes, and implementation of
those controls has not shown a change. Deliberate introduction of exotic species continues to
be a widespread problem. New Zealand and other countries are using the precautionary principle
to combat invasive species (e.g., flycatcher, anole, aphid-dogs, and seal pups-toads) and are not
allowing the introduction of any new species. Ballast water solutions include requiring
mid-ocean exchange, onboard technologies, and onshore technologies. Mandatory requirements are
also necessary to drive technology development and use, and these should already have been set. Oceana is
working on cruise ships with technology that eliminates the need for cruise ships to uptake or
discharge ballast water. This is an example of the "out-of-the-box" technology that is needed.
Two key take-home points are: (1) use precaution when applying chemicals because chemical
use is never entirely safe, and (2) prevention of introduction should be the baseline for
controlling invasive species. A credible prevention effort is needed along with careful
consideration and precaution in the introduction of toxic chemicals into the environment. The
overall recommendation is to not attack a millennium challenge with a bicentennial solution.
Panel Discussion/Questions and Answers
The speakers had an opportunity to participate in a brief panel discussion drawing on questions
from the audience.
A brief question and answer period addressed a sole issue regarding the funding of initiatives
necessary to address the complex problems posed by invasive species.
Social Science and Resistance to Water Fluoridation
Following opening remarks by Mr. Bill Hirzy with the National Treasury Employees Union, and
Ms. Roberta Baskin, a senior reporter for NOW, two speakers addressed the arguments for and
against the national policy on the addition of fluoride to drinking water. A question and answer
period followed the presentations.
Mr. Bill Hirzy, an officer of the National Treasury Employees Union Chapter 280, provided an
introduction to this session, noting that the original intent was for this session to be an
information exchange about the science and national policy of water fluoridation; however, the
CDC and the United States Public Health Service, which have a goal of fluoridation for 100
percent of the water supplies in the United States, declined to publicly defend this national
policy. The first involvement of the National Treasury Employees Union in fluoridation was in
1985, as a matter of professional ethics, when the recommended maximum contaminant level
(MCL) for fluoride was being developed.
An employee came to the Union expressing concern that the standard did not
protect public health against severe dental fluorosis, which many people view as a serious health
concern. The Union's most recent involvement has been to sign a statement of concern along
with hundreds of other organizations calling for a national review on the policy of water
fluoridation.
Ms. Roberta Baskin, a senior reporter for NOW, identified water fluoridation as a controversial
issue that needs to be discussed. Her interest in fluoridation was piqued in 1997 when the Food and
Drug Administration began requiring notices on toothpaste about fluoride as well as warnings
about what to do in case of accidental ingestion. This session provided a great opportunity to ask
tough questions of the experts.
EPA Drinking Water Regulations for Fluoride
Director of the Health and Ecological Criteria Division in the Office of Science and Technology, Dr.
Ed Ohanian, presented background information regarding fluoridation and the viewpoints in
favor of fluoridation regulations. The current primary drinking water regulations under the
SDWA require EPA to set a maximum contaminant level goal (MCLG), an MCL, and a
secondary MCL (SMCL). The MCLG is a non-enforceable health goal that allows for an
adequate margin of safety, the MCL is the enforceable health standard that is set as close to the
MCLG as feasible, and the SMCL is a non-enforceable standard in place to protect against
cosmetic and aesthetic effects from drinking water.
In the case of fluoride, the MCL is set at 4 mg/L to protect against skeletal fluorosis, a crippling
disease that can result either from a hardening of the bone (osteosclerosis) or a softening of the
bone density due to impaired mineralization (osteomalacia); skeletal fluorosis effects include
limitation of joint movement, calcification of ligaments, crippling deformities, and muscle
wasting.
The SMCL for fluoride is set at 2 mg/L to protect against objectionable dental fluorosis defined
as visible dark stains and pitting of teeth; this standard was set based on the incidence of
moderate and severe dental fluorosis, with a 0 to 15 percent incidence at this level. Distinct
increases in the incidence of moderate dental fluorosis have been documented at levels of 1.9
mg/L and severe dental fluorosis at levels above 2.5 mg/L. Public notification must be sent by
utilities if the SMCL standard is exceeded.
Fluoridation is the intentional addition of fluoride to drinking water to reduce the incidence of
dental decay and caries. The United States Public Health Service has established fluoridation
levels of 0.7-1.2 mg/L, which are less than both the MCL and SMCL and are not in violation of
EPA drinking water regulations. Of particular note is that the SDWA prohibits EPA from
requiring the addition of any substance to drinking water for preventative health care purposes.
In 1997, the status of fluoride was changed from a beneficial substance to a nutrient, citing that
fluoride inhibits the initiation and progression of dental cavities as well as stimulates new bone
formation. The Food and Nutrition Board of the National Academies of Science developed
recommended dietary guidelines for the intake of fluoride based on the age and sex of
populations, as follows:
• Infants: 0.1-0.5 mg/day
• Children: 0.7-2 mg/day
• Adolescents: 3 mg/day
• Adults: 3-4 mg/day.
In 1993, EPA requested a review of fluoride regulations by the National Research Council under
the National Academies of Science. This review concluded that the MCL of 4 mg/L was an
appropriate interim standard and recommended re-evaluation of the standard when additional
research results became available. EPA responded by publishing its intent to maintain the 4
mg/L MCL level. In 2002, an EPA review of the fluoride regulations examined all drinking
water regulations established before 1996 and identified new health effect studies published after
the 1993 review. This prompted EPA to request that the National Academies of Science update
their 1993 assessment through an independent review of the data. The EPA charge to the
National Academies of Science was to review the new health effects and exposure data, evaluate
the scientific and technical basis for the MCL and SMCL to identify if they are protective of
public health, and identify data gaps for future research. This review will be completed in 2005.
More information on this review (project name is Toxicological Risk of Fluoride in Drinking
Water and project number is BEST-K-Q2-02-A) is available at www.nationalacademies.org, and
this website also provides the opportunity for feedback on this project.
Fluoridation: An Undefendable Practice
Dr. Paul Connett, a professor at St. Lawrence University, defined fluoridation as the addition of
chemicals to drinking water to yield a fluoride ion concentration of 1 mg/L or 1 ppm. While
fluoridation began in 1939, the first trials started in the United States and Canada in 1945. None
of the chemicals used for fluoridation are pharmaceutical grade.
Fluoride is a naturally occurring element in the sea, in rocks, and in some groundwater samples.
Fluoride also occurs in mother's breast milk at 0.01 ppm, which is 100 times less than that added
to drinking water systems under fluoridation programs. Thus, a baby's first meals have
extremely low levels of fluoride; Nobel Prize winner Arvid Carlsson identified this trend and
used it as the basis of his charge against fluoridation in Sweden.
Fluoride is an extremely inert chemical, yet its thermodynamic potential lends itself to a very
active state that interferes with hydrogen bonds, forms complex ions, and mobilizes the
movement of metal ions such as aluminum into places that it normally would not travel. As an
example, fluoride accumulates in the human pineal gland, which produces melatonin, and is able
to access this gland because it is not protected by the blood-brain barrier. Calcium
hydroxyapatite crystals, which form in the pineal gland, were found to have a fluoride level of 9,000
ppm. In animals, fluoride lowers melatonin production by inhibiting enzymes; since there are
four enzymes required to produce melatonin, uncertainty remains as to the specific enzyme
affected by fluoride.
Opposition exists to a national water fluoridation policy because it is believed to be:
• Unethical by medical standards— fluoridation violates the individual's right to informed
consent to medication, does not allow for individual sensitivity to dose, does not control the
dose to the individual, and does not allow for the individual response to be monitored. There
are many unresolved issues and more research is needed to fill these data gaps.
• Unnecessary— children are already receiving overdoses of fluoride without water
fluoridation, and research indicates that 13.5 percent of children already have dental fluorosis
on at least two teeth.
• Inequitable— the wealthy can afford to avoid fluoridated water if they so choose. The poor
cannot afford bottled water or other avoidance measures, and are forced to receive
fluoridated water regardless of their preference. In India, it is well established that
fluoridation toxicity effects are the most severe in those with poor nutrition.
• Inefficient— the vast percentage of the added fluoride (~99.97 percent) will be flushed down
the drain and toilet, or washed away during car washing and other activities. Fluoridation is
only cost-effective because industrial grade fluoride is used rather than pharmaceutical grade.
• Ineffective—a 21 city study determined that there is an inverse relationship between tooth
decay and fluoride concentration in drinking water. Using the same data, another researcher
(Ziegelbecker) demonstrated that there is no correlation between fluoride and dental disease,
and identified a direct relationship between dental fluorosis and fluoride levels. In 1990,
Brunelle and Carlos conducted the largest survey ever in the United States (involving 39,000
children and 84 communities) on differences in tooth decay in fluoridated and non-
fluoridated communities; the results demonstrated only very small (not significant)
differences between fluoridated and non-fluoridated communities in the amount of tooth
surface "saved." An Australian study showed even smaller differences, and a 1998 study by De
Liefde concluded that the difference in decayed, missing, and filled teeth between fluoridated
and non-fluoridated populations is "clinically meaningless." Locker (2001) also noted that
the magnitude of the effect is not large, not statistically significant, and may not be of clinical
significance.
Boston has been fluoridated since 1978, yet the Boston Globe published a front page article
indicating a dental crisis in the metropolitan area.
The CDC listed fluoridation as one of the top 10 public health achievements of the 20th
century, meaning that the incidence of decayed, missing, and filled teeth for 12-year-olds has
decreased while the number of individuals drinking fluoridated water has increased. Yet data
from the World Health Organization over the same time period show that other countries,
both fluoridated and non-fluoridated, have exhibited a similar trend, thereby refuting the
CDC correlation with fluoridated water. CDC did note that the major benefits of fluoride are
topical, not systemic. Therefore, to this speaker, the addition of fluoride to toothpaste makes
more sense than adding it to drinking water.
• Unsafe—as proven by the incidence of dental fluorosis. In the United States, 29.9 percent of
the children already have dental fluorosis (mottling of enamel) on at least two teeth. Heller et al.
concluded that the severity of dental fluorosis increases with dose, and the daily dose
received by children in unfluoridated areas is already nearing 1 mg/L.
There also is a superlinear relationship between the incidence of bone fractures in children
and the increase in fluoride concentrations; 50 percent of the ingested fluoride is excreted in
the urine daily; the remainder accumulates in bone. Wix and Mahamedally (1980) studied
the increase of fluoride concentrations in bone over time, but the United States government
has not conducted such studies. The pre-clinical symptoms are similar to those of arthritis.
Fluoride affects different bones in different ways, and clinical trials found bone hardening in
the vertebrae and increased hip breakage, which is a particular concern for the elderly who
are most susceptible to this risk. Fluoride also accumulates in the cortical bone, which is
cause for concern.
Other studies report negative health effects related to fluoride, such as earlier onset of
menstruation in young girls (Hilleboe et al.). An Urbansky study stated that silica fluoride
would completely dissociate in water, yet this was disproven in a German PhD thesis.
Thus, the science is still not there to completely understand all of the effects of fluoridation,
such as the effects on hypersensitive populations and the appropriate margin of safety
to protect more vulnerable populations, which is ultimately what public health policy is
supposed to provide. Those who promote fluoridation have a formidable task to convince us
that the water system is an ideal way to deliver medication.
Of final note, the use of fluoridation distracts from the real causes of dental decay, which are
poverty, poor nutrition, and poor dental hygiene. There is a much greater correlation between
tooth decay and poverty than there ever will be between fluoridation and tooth effects. There
is a need to conduct independent scientific research on this topic without pressure and
intimidation from either the pro- or anti-fluoridation viewpoints.
More information is available at http://fluoridealert.org/reference.htm.
Panel Discussion/Questions and Answers
The speakers had an opportunity to participate in a brief panel discussion drawing on questions
from the audience.
A brief question and answer period addressed a range of topics. These included: (1) whether
EPA policy contends that fluoridation provides safe and effective prevention of dental disease;
(2) the potential for the generation of hydrogen fluoride fumes from fluoridated water; (3) the
incidence of water fluoridation in Europe and countries that have discontinued the practice; (4)
opportunities for public comment on the upcoming National Academies of Science data review;
(5) the need for a good exposure assessment as a basis for developing a good risk assessment;
and (6) potential health effects of fluoride, especially involving the elderly.
Development of Biological Indices for Coral Ecosystem Assessments
Following opening remarks by Mr. Kennard Potts, with OWOW, four speakers addressed the
global and local stressors that human activities, climate change, and El Nino place on coral
ecosystems as well as the incorporation of biological indicators for coral in water quality
standards. A panel discussion including an audience question and answer period followed the
presentations.
Mr. Kennard Potts, with OWOW, introduced the idea of developing indicators for coral reefs in
order to provide information on stressors, a 1996 workshop that analyzed methods for assessing
reefs, EPA's role and partnerships in coral reef management, and the research EPA is conducting
on reefs. Since the early 1990s, EPA has been engaged in coral reef issues with the International
Coral Reef Initiative and at a broader level with the Coral Reef Task Force and the Coral Reef
Conservation Act of 2000. The goal is to identify low-cost, low-technology methods to
monitor reefs that are simple, inexpensive, and easy to use for "technology transfer" to all
coral-bearing nations. In addition, ecosystem management is really people management, with the
presentations given in this session providing a framework for developing and applying biocriteria
indices.
Assessing the Consequences of Global Change for Coral Reef Ecosystems
Dr. Jordan West, with NCEA, discussed the importance of coral reef systems for tourism,
fishing, and biodiversity. Coral bleaching occurs when coral is exposed to an environmental
stressor. Coral bleaching disrupts the coral-algal symbiosis, resulting in the algae being expelled
from the coral host. Coral can recover unless the stressor is continuous and significant. The
consequences of coral bleaching include algal overgrowth, disease outbreaks, and a decrease in
biodiversity. Local stressors (e.g., pollution, salinity shock, sedimentation, disease) and global
stressors (e.g., increasing temperature, light, and CO2 levels) result in coral bleaching.
There is a linkage between elevated irradiance (light), increasing water surface temperatures, and
coral bleaching. Once water temperatures exceed the ocean's mean temperature by 1 degree
Celsius, bleaching will occur. Both photosynthetically active radiation and ultraviolet radiation
can contribute to bleaching. Light penetration into the water is linked to coral bleaching;
therefore, greater bleaching occurs in shallow waters, on top of the coral, and in less
cloud-protected areas.
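The 1-degree threshold described here amounts to a simple temperature-anomaly test. The sketch below assumes the long-term mean temperature is supplied from an external monitoring baseline; it is an illustration of the stated relationship, not a published bleaching model.

```python
def bleaching_expected(observed_temp_c, mean_temp_c, threshold_c=1.0):
    """Per the relationship described above, bleaching is expected
    once the observed water temperature exceeds the long-term mean
    by more than 1 degree Celsius. The mean is an assumed external
    input (e.g., from a regional monitoring program)."""
    return (observed_temp_c - mean_temp_c) > threshold_c

# Example: a 1.5 degree anomaly crosses the threshold.
print(bleaching_expected(29.5, 28.0))  # True
```

In practice, indices of accumulated heat stress are used as well, but the single-threshold form captures the relationship as stated in the presentation.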
Senior Research Scientist with NERL, Dr. Richard Zepp, expanded upon the science linking
irradiation and water temperature as well as El Nino effects on coral bleaching. Ultraviolet light
induces the formation of a thymine dimer in the DNA of the coral, and this inhibits normal coral
functions. This can be measured and used as an indicator of damage.
El Nino, mainly a Pacific Ocean phenomenon, affects many other bodies of water. During
El Nino years, surface water temperatures increase. The result is temperature stratification, in
which the water is hotter at the surface, with cooler layers as depth increases. The clarity of the
water increases during El Nino; this increases ultraviolet light penetration, resulting in increased
opportunities for coral bleaching. Research has demonstrated changes in water clarity
corresponding with the end of the El Nino event.
Human impacts on coral reefs include changes in seagrass and mangrove areas that contribute to
increased sediment and organic detritus. Uncertainties exist as to whether this increases or
decreases coral stress. Sediment issues can work both ways to attenuate damage and recovery.
The net effect still needs to be understood. The 2002 Black Water Event in Florida Bay
was a large toxic algal bloom resulting from a large amount of nutrient loading, possibly from
the Everglades. This event correlated with significant damage to the coral, but it is still unknown
whether this may only be a coincidence.
Efforts to address coral issues include the creation of the Coral Reef Task Force under an
Executive Order by former President Clinton, with a focus on climate change and coral
bleaching/disease, land-based sources of pollution, overfishing, public awareness, and
recreational overuse. The Task Force developed Resolution 5 (October 2002) in partnership with
the Department of the Interior, NOAA, and EPA. The first step is to hold a stakeholder
workshop in June 2003 on "Corals, Climate, and Coral Bleaching" in Hawaii. This workshop
plans to bring together all stakeholders (Federal agencies, non-profits, academia, etc.) for an
informational sharing session to develop a manager's toolkit for rapid response and program
development.
Lastly, Dr. West examined a framework for collaborative assessment that recognizes the
interconnection of the following:
• Physiochemical patterns—global climate, remote sensing, regional monitoring, and site
monitoring
• Coral bleaching—range map, monitoring, biochemistry, and physiology
• Conservation strategies—mitigation, testing strategies, and site assessment and management.
Applying Biocriteria for Coral Reefs in Water Programs
Program Manager for the Biocriteria Program in the Office of Science and Technology, Mr.
William Swietlik, discussed the role of biocriteria in the regulatory programs for the waters of
the United States. The adoption of biocriteria for all waters and incorporating biocriteria in
water quality standards has been a priority in the Office of Water for a number of decades.
There has been significant progress and success in developing biocriteria for streams and small
rivers, as well as beginning to develop regulatory standards based on these criteria. Once
biocriteria are incorporated into water quality standards, they will influence the rest of the
processes in the water quality management cycle. The opportunities for application of biocriteria
for coral reefs occur in the CWA Sections 305(b), 303(d), 301(h), 403(c), and 303(c).
A water quality standard has the potential to define designated uses, the criteria to protect the
uses, and an antidegradation policy for a water body. The designated uses can be either existing
uses or restoration goals. Water quality criteria may place limits on a particular pollutant or on a
condition of a water body, and the criteria can be designed to protect the designated use.
Biological information can be used in water quality standards to develop biological criteria to
protect aquatic life uses, describe existing uses, assign appropriate designated uses, refine and
subcategorize designated uses, and help make attainment decisions. Water quality standards also
can address human induced stressors.
Florida, the Virgin Islands, Puerto Rico, and the Hawaiian Islands all have coral reefs and have
established water quality standards, providing the opportunity to apply biocriteria for coral reefs.
The Virgin Islands may provide the best application of water quality standards, using the
criterion that "existing natural conditions shall not be changed."
What do states and tribes need to do to incorporate biocriteria into water quality standards?
States and tribes need tested bioindices, guidance on bioassessment methods and biocriteria
development, and program support in order to move forward on incorporating biocriteria into
water quality standards. Additional information is available at: www.epa.gov/OST/biocriteria
and www.epa.gov/owow/oceans/coral/index.html.
Development of a Coral Reef Index of Biotic Integrity
President of Coral Seas, Inc., Dr. Steven Jameson, discussed the development and use of an IBI
for coral reefs as a more accurate way to monitor and assess them, pointing to the success of IBIs
in freshwater environments and the transferability of IBIs as indicators to marine environments.
Traditional monitoring uses poor reference conditions with constantly shifting baselines and does
not provide for early warning capability.
IBIs are a better way to monitor and assess coral reefs, since they allow for the classification of
similar environments to support "apples to apples" comparisons. Properly designed IBIs would
be sensitive to water quality, habitat structure, flow regime, food, and biotic interactions.
Calibrated dose-response metrics capture the most important biological attributes because they
assess only the pollutants that are biologically available, assess synergistic and antagonistic
pollutant relationships, and reveal biological effects at contaminant levels below the current
chemical analytical detection limits. The most important advantage of IBI metrics is that they
are useful in detecting degradation by humans that is caused by factors other than chemical
contamination, such as temperature, turbidity, salinity, pH, light intensity, and disease. IBIs are
sensitive because they assess only pollutants that are biologically available and consider the
community assemblage structure, taxonomic composition, individual condition, coral breakage,
and biological processes.
Use of an IBI allows for the following:
• Inclusion of different metric types—different metrics can be combined (summed) to obtain a
total IBI score, which allows for the ranking of habitats
• Creation of indices for different types of organisms (e.g., focus areas)
• Detection of cumulative impacts through the use of fixed reference (baseline) conditions to
remove the effect(s) of shifting baseline condition
• Early warning capability determined based on a dose-response curve.
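The first point above, combining (summing) metric scores into a total IBI score to rank habitats, can be sketched as follows. The habitat names, metric names, and scores are hypothetical, for illustration only.

```python
def total_ibi(metric_scores):
    """Sum individual metric scores into a total IBI score, as
    described in the first bullet above. Each value is one
    calibrated metric's score for a site."""
    return sum(metric_scores.values())

# Hypothetical reef zones scored on hypothetical metrics (higher = better).
habitats = {
    "reef_zone_a": {"taxa_richness": 8, "coral_cover": 7, "condition": 9},
    "reef_zone_b": {"taxa_richness": 5, "coral_cover": 4, "condition": 6},
}

# Ranking habitats by total IBI score, best first.
ranking = sorted(habitats, key=lambda h: total_ibi(habitats[h]), reverse=True)
print(ranking)  # ['reef_zone_a', 'reef_zone_b']
```

Because each metric is calibrated against a fixed reference condition, the same summation supports the cumulative-impact detection described in the third bullet: a declining total score over time at a fixed site indicates degradation relative to the baseline.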
IBIs have a diagnostic capability, which uses specific response metrics and metrics sensitive to
mixtures of pollutants to monitor and assess coral reef health. IBIs can also use data to identify
what is causing the changes, which in turn supports trend and problem identification. IBIs also
can be used to certify marine protected areas or to identify whether progress in remediation is
being made.
Building upon previous IBI experience with freshwater systems, the addition of more metrics
may not produce greater resolution. IBIs require only one sampling method within the same
physical environment (coral reef zone), thereby minimizing the sampling effort, data volume,
environmental impacts, and cost.
There is a need to establish a classification system for the monitoring and assessment of coral
reefs, establish marine ecoregions and standard classification terminology, test metrics for
dose-response capabilities, and establish reference conditions for viable metrics. From these
efforts, IBIs can be created
and calibrated for specific conditions. These IBIs will in turn support the development of
biocriteria. Determining what is acceptable for coral reef biotic integrity will be challenging
scientifically because of the current limitations in our understanding of the natural system and
how far the system can be stressed while still sustaining the ecology. In addition, incorporating
effective biocriteria into regulation will be politically challenging because of their significant
economic impacts.
Coral reefs are threatened globally by rising CO2 levels and sea surface temperatures; therefore, it is
important to reduce other stressors and increase survivorship. Local success in preserving coral
reefs requires actions at both the local and global levels. Time is running out quickly for coral reefs
and progress must continue to be made through local, national, and international partnerships.
This will require strong national leadership and a sustained commitment to forge and maintain
these partnerships. Establishing a new paradigm for coral reef monitoring and assessment, using
IBIs and biocriteria, will provide critically needed early warning and diagnostic tools to help
reduce the impacts of controllable local stressors.
Panel Discussion/Questions and Answers
The speakers had an opportunity to participate in a brief panel discussion drawing on questions
from the audience.
A brief question and answer period following each presentation addressed a range of topics.
These included: (1) nutrient sources for the Black Water Event and whether color can be used as
an indicator of nutrient movement; (2) interagency programs to address nutrients in the
Everglades restoration; (3) whether elevated CO2 as a stressor of coral reef health is a direct or
indirect effect; (4) how to calibrate dose-response metrics for a coral reef; and (5) concerns about
releases of asbestos and PCBs from World War II ships and their potential impacts on coral
reefs.
The Impacts of Urban Drainage Design on Aquatic Ecosystems in the
United States
Following opening remarks by Mr. Jamal Kadri, with the Office of Water, two speakers
addressed the use of Smart Growth principles in urban areas and tools supporting Smart Growth
initiatives. A panel discussion including an audience question and answer period followed the
presentations.
Mr. Jamal Kadri, with the Office of Water, initiated the discussions by describing EPA's role and
interest in urban drainage design. Mr. Kadri then introduced the session speakers.
OWOW and Smart Growth: Integration of Water Programs through Smart Growth
Principles
Ms. Diane Regas, with OWOW, discussed Smart Growth principles as another approach to
watershed protection and management, and how these principles can be used to protect water
resources such as estuaries, which are extremely important ecologically and economically.
Estuaries generate $110 billion per year in revenue and help to maintain biodiversity.
Approximately 44 percent of estuaries are impaired, and estuary impairment is growing as
population increases.
The number of vehicle miles traveled and the rate of land development are increasing much
faster than population growth, and these changes correlate with increases in pollution seen in
water. A study showed that approximately 32 percent of the nitrogen loading to the Chesapeake
Bay is attributable to air deposition. Approximately 53 percent of the population in the United
States lives on the coasts. In addition, almost 10 percent of watersheds have 15 percent or more
of their total land area developed, particularly in the East Coast and Great Lakes areas. When
effective imperviousness exceeds 10 percent of a watershed's total acreage, large hydrological
changes in the watershed and a huge decline in biodiversity are seen. Intuitively, some would
then argue to maintain watershed imperviousness at a level below 10 percent.
Annual vehicle miles traveled per household decreases by 35 percent when residential density
increases from 2 units per acre to 10 units per acre. Since residents are closer to work, stores,
and neighbors, the number of miles traveled is reduced as population density increases. Yet
growth affects habitat integrity and quality, and land development can disrupt wetlands. For
every acre of redeveloped brownfields, it is estimated that 5 acres of greenfields are saved.
Watershed protection requires being concerned with a suite of goals that need to include
economic and biodiversity goals not just water quality goals.
EPA has worked closely with local officials since watershed planning is an important tool to help
local officials make informed decisions. As an example, the Office of Water recently developed
a Phase II storm water management guide. Yet EPA cannot protect water resources alone. EPA
has and must continue to foster partnerships with local governments to protect drinking water
resources. EPA is focused on investing in these partnerships and shifting funding to non-point
source projects and smart growth projects to protect drinking water and other water resources.
EPA has worked with the National Association of Counties on non-point source issues to help
municipal officials identify how their decisions will affect the water quality surrounding them.
Two Tools for Smart Growth
Executive Director of the Center for Watershed Protection, Ms. Hye Yeong Kwon, discussed two
tools for Smart Growth:
• Impervious cover as an indicator for watershed quality
• Use of local roundtables to introduce smart growth concepts to community leaders seeking to
protect their watersheds.
Impervious cover is defined as land cover that prevents rainwater from soaking into the ground,
such as parking lots, streets, and buildings. A model of impervious cover can be used to develop a scale for stream
degradation based on the percentage of impervious cover in the watershed. At 10 percent
impervious cover, impacts on the stream are visible and the stream is classified as sensitive. At
30 percent impervious cover, streams are labeled as impaired or non-supporting. Using an
example of a Piedmont stream, at 5 percent impervious cover, the stream appears healthy and
exhibits great diversity and quantity of biota. At 8 percent, the stream still has sinuosity, good
riparian cover, and still appears relatively healthy, but sediment deposition is beginning to occur.
At 20 percent, there is increased sediment deposition and bank degradation that has exposed a sewer
pipe. At 30 percent, tree roots are visible on the stream banks. At greater than 65 percent
impervious cover, this stream may still be labeled a "riparian corridor," but not in appearance.
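The degradation scale described above can be sketched as a simple classifier. The category boundaries below (10 and 25 percent) are assumptions drawn loosely from the figures cited in this talk; published classification schemes vary:

```python
def classify_stream(impervious_pct):
    """Classify a stream by its watershed's percent impervious cover,
    using rough thresholds based on the talk: sensitive below about
    10 percent, impacted from about 10 to 25 percent, and impaired or
    non-supporting above that. Boundary values are assumptions."""
    if impervious_pct < 10:
        return "sensitive"
    if impervious_pct < 25:
        return "impacted"
    return "non-supporting"

# Walk the Piedmont stream example from the talk up the scale.
for pct in (5, 8, 20, 30, 65):
    print(pct, classify_stream(pct))
```

As the talk cautions, impervious cover is not an exact measure but "a critical point," so such hard thresholds should be read as screening values, not precise breakpoints.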
The Center for Watershed Protection has conducted over 225 studies on the relationship between
impervious cover and aquatic quality. These studies identified other indicators of
watershed quality, including:
• Forest cover - leading candidate, has the opposite effect of impervious cover
• Cultivated land - parallels impervious cover
• Riparian forest community
• Turf cover in impacted watersheds (10 to 25 percent impervious cover).
The first three indicators are critical for sensitive streams (0 to 10 percent impervious cover).
Studies on the relationship between impervious cover and hydrology showed a high correlation
for annual runoff, peak discharge, and channel backflow frequency. In addition, as urbanization
increases, riparian buffer width decreases and the floodplain expands with increasing
impervious cover, which in turn results in channel enlargement. Furthermore, large
woody debris, a very good indicator of stream quality, decreases as impervious cover increases.
In the study entitled "Effect of Urbanization on the Natural Drainage Network in the Four Mile
Run Watershed," the Center for Watershed Protection examined the relationships between
impervious cover and biological indicators in that region. Such studies show a strong positive
relationship between an increase in impervious cover and an increase in bacteria, nutrient,
sediment, and pesticide levels. Other findings include:
• Fecal coliform levels in urban storm water average about 200; maintaining low coliform
levels is extremely important to maintaining fishable and swimmable waters
• As impervious cover increases, insects, fish diversity, and other biotic indicators decrease
and biological integrity also decreases
• Impervious cover impacts natural biota (sensitive species) negatively.
For sensitive streams, pre-development hydrology is a reasonable goal, but there may be other
concerns, such as swimmability or contact issues in more urban areas. Impervious cover is not
an exact measure but "a critical point" and should always be viewed as such. For example,
planning in the Goose Creek watershed used impervious cover to assess overall watershed
quality even before work commenced. The assessment found that barely any subwatersheds had
less than 10 percent impervious cover, 35 of 42 subwatersheds fell into the 10 percent or
sensitive category, and two subwatersheds fell into the impacted (greater than 25 percent
impervious cover) category. Using the "rural water quality impacted" criteria, almost half of the
watersheds are classified as impacted.
Since development happens locally, a local roundtable approach seeks to change local codes and
ordinances to align with smart growth objectives using a consensus building process. This
involves six steps:
1. Selecting a community—requires a local political jurisdiction that is willing to change,
has a significant current growth rate, and faces growth management and costs as
current and pressing issues. To date, about a dozen communities have participated in this
project.
2. Doing the background research—this is important to understand the basic tenets of Smart
Growth, to become familiar with local codes and ordinances, and to identify and contact
potential stakeholders by thinking beyond those that comprise the planning commission, such
as developers and environmental advocates.
3. Introducing the stakeholders to the process—hold meetings to get to know the community
leaders, introduce smart growth, develop a roundtable, review consensus-building process,
and divide into subcommittees by issue.
4. Facilitating consensus—this requires the organizer to be an active partner in facilitating
consensus through handholding and personal phone calls. Advocates for change must satisfy
community concerns, and this requires keeping an open mind (i.e., avoid being set on certain
expectations).
5. Holding a final roundtable meeting—to provide closure to the process.
6. Conducting "after" care—follow-up is essential so that people understand what went into this
consensus building process.
Online resources with more information on this roundtable process are available at:
www.cwp.org and www.stormwatercenter.net.
Panel Discussion/Questions and Answers
The speakers had an opportunity to participate in a brief panel discussion drawing on questions
from the audience.
A brief question and answer period following each presentation addressed a range of topics.
These included: (1) consideration of other indicators in Smart Growth, such as lifestyle, number
of vehicles per household, and gas efficiency that link to environmental impacts; (2) challenges
in consensus building with stakeholders that believe nothing is wrong in the community and the
need to make an economic link to smart growth as well as to break barriers through consensus
building; (3) how to balance scale in watershed protection by setting reasonable restoration goals
for impervious cover; (4) consideration of efforts necessary to restore developed watersheds to
achieve desired impervious cover goals; (5) the limited benefits of impervious cover data and
research for urban areas pointing to the need for multiple tools such as impervious cover
percentage for an entire watershed; and (6) the challenges in providing technical assistance to all
the communities that wish to undertake Smart Growth principles.
Innovative Monitoring Techniques
Following opening remarks by Ms. Susan Holdsworth, with OWOW, three speakers addressed
innovative monitoring techniques being developed to better understand which waters are
impaired or are in danger of impairment. A panel discussion including an audience question
and answer period followed the presentations.
Ms. Susan Holdsworth, with OWOW, noted the need for additional water quality monitoring and
data to fill holes and gaps in our current knowledge about which waters are impaired or in danger
of impairment. Only a fraction of the waters in the United States have been monitored, mostly
on streams and rivers, and this raises questions about our efficiency in resource utilization to
address water quality across the United States. EPA resources have been devoted to providing
guidance for state monitoring programs, training, technical assistance, developing, testing, and
refining of methods. Ms. Holdsworth then introduced the three speakers in this session.
30 Years of Progress Through Partnerships: Biological Indicators
Ms. Susan Jackson, an Environmental Scientist with the Biocriteria Program, discussed the use
of biological indicators as a means for assessing water quality and demonstrated the added value
of their use. Biological indicators are any organism, species, assemblage, or community
characteristic of a particular habitat or indicative of a particular set of environmental conditions,
such as the presence or absence of a species. Human activities are major drivers in the alteration
of water resource features and affecting the biological endpoint and responses.
An example illustrating the added value of biological indicators involves a 12-year effort in
Ohio. Bioassessment is the evaluation of the biological condition of a water body using
biological surveys of the structure and function of the community of resident biota. Research
comparing the results of chemical and biological assessments noted that:
• 58 percent of the time, the findings of the chemical assessment and bioassessment agree that
a problem exists
• 36 percent of the time, the chemical evaluation indicates no impairment when a biological
survey does indicate an impairment
• 6 percent of the time, the bioassessment indicates no impairment while the chemical
assessment does, representing a disagreement in findings when agreement was expected.
Thus, this comparison can act as a good internal quality check.
When selecting the community components (such as target species and taxa) for metrics, a key
question is whether tribes and states can implement the tools. Also important is the need to
develop more than one metric, such as fish, algae, etc. Good wetland indicators include fish,
birds, and plants.
Bioassessments are still required on larger rivers. The current focus is on intermittent streams,
and this is being conducted by working on a waterbody-by-waterbody basis across the states.
Stream bioassessment is conducted either by collecting fish through electric shock or using
insects as indicator organisms in a benthic macroinvertebrate community bioassessment. Other
methods include collection and analysis of artificial substrates colonized by insect larvae;
identifying sensitive organisms in streams such as caddisflies, dragonflies, and mayflies;
identifying tolerant organisms in streams such as leeches, midges, snails, and scuds; and
examining the metric behavior along the stressor gradient.
Biological information will help answer tough management questions at the global, Federal,
state, and local levels including:
• What is the condition of the resource?
• Is there a problem?
• What do we tackle first?
• What do we want to maintain or restore?
• How do we know when we get there?
Under the CWA, states have the primary responsibility to manage their water. Therefore,
different methods, indicators, and management practices increase the difficulty in sharing and
communicating data, but do allow for analysis of creative best management practices. However,
information sharing across state, tribal, and political boundaries is important. In addition, critical
questions for such monitoring programs include how to acquire comparable data, how to
aggregate data, and how to communicate the data to the public. Solutions to these challenges
include the development of performance-based monitoring programs.
An example illustrates the use of an aquatic life conceptual model to identify a Biological
Condition Gradient. The purpose of the Biological Condition Gradient exercise is to protect high
quality waters, produce scientifically defensible benchmarks, and create a common framework
for working with the public. This involved "blind" data exercises to "pick the brains" of
scientists in different regions to determine how they establish the reference condition and if there
are common decisionmaking patterns, terminologies, and scientific principles regardless of
method. Key findings were that the scientists rapidly built a consensus and used common
scientific principles. The draft aquatic life conceptual model includes six levels of ecological
change: Natural Structure and Function, Minimal Change, Evident Change, Moderate Change,
Major Change, and Severe Change.
EPA continues to work with diverse partners to promote the more frequent use of biological
indicators and to move the science forward. These partners include state and tribal scientists and
the scientific community (academic, agency, private), as well as internal EPA collaboration.
Innovative Monitoring: Probabilistic Monitoring Approaches
Mr. Barry Burgan, a Senior Marine Scientist in OWOW, discussed probabilistic monitoring
approaches as a cost-effective, innovative technique to assess wetlands and estuaries. This
approach incorporates randomized site generation and allows for a description of the whole by
sampling the parts, thus lending applicability to national assessments. Probabilistic monitoring is
not a substitute for compliance monitoring because it does not identify all impaired assessment
units; targeted, site-specific monitoring is better suited to that task. Instead, this
type of monitoring design enables evaluation of national, widespread issues in a cost-effective
manner, generates scientifically defensible, comprehensive assessments of water resources at all
scales and at less cost, and provides core surface water, estuarine, and wetland indicators for
comparable results.
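The core idea of describing the whole by sampling the parts can be illustrated with a small simulation. The population size, impairment rate, and sample size below are invented for illustration only:

```python
import math
import random

# Sketch of a probabilistic monitoring design: draw a random sample of
# assessment units, judge each one, and infer the condition of the whole
# population. The impairment data here are simulated, not real.

random.seed(1)
population = list(range(10_000))                        # all assessment units
truly_impaired = set(random.sample(population, 4_400))  # ~44% impaired (made up)

sample = random.sample(population, 200)                 # randomized site selection
hits = sum(1 for unit in sample if unit in truly_impaired)
p_hat = hits / len(sample)

# Standard error and a rough 95% confidence interval for the proportion.
se = math.sqrt(p_hat * (1 - p_hat) / len(sample))
print(f"estimated impaired fraction: {p_hat:.2f} +/- {1.96 * se:.2f}")
```

The estimate converges on the true population rate as the sample grows, which is why a modest number of randomly selected sites can support defensible statewide or national condition statements, even though the design cannot name every impaired unit.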
Many states use probabilistic design approaches and are still evaluating the data. Indiana is the
first state to have reported using probabilistic designs. A National Coastal Condition Report divided
the country into different regions, examined each region separately, and aggregated the data for
consolidated reporting. Two additional examples include a multi-year design study in Casco
Bay, Maine, under the National Estuary Program and an Indiana fish community probabilistic
assessment, which used IBIs to calculate a total score that related to the condition of the fish and
the quality of the watershed.
Benefits of probabilistic monitoring include a cost-effective approach to establish the baseline
and trends for national, regional, and state water quality; the ability to assess human health and
ecological program effectiveness at all scales; and the ability to support the equitable allocation
of CWA Section 106 resources among states.
Incorporating habitat differences into a probabilistic design model depends upon the scale of the
study. For a state-level model, habitat differences are less important due to the larger
scale. At the river or stream level, identifying the specific habitat of interest and targeting those
habitats is more important given the smaller scale of the analysis.
The Next Generation of Wetlands Assessment - The Upper Juniata Watershed
Assistant Director of the Penn State Cooperative Wetlands Center, Ms. Denise Wardrop,
discussed the use of GIS, land use, and landscape information to assess watershed condition
prior to an onsite survey, and drew on the Upper Juniata Watershed as an illustrative
example of this approach. Wetland assessment on a watershed basis is important because
watersheds are a more efficient unit from a financial, social, and ecological perspective. In
addition, the watershed level is conceptually attractive for local managers because it occurs at a
scale that people can manage.
Not all decisions call for the same level of information. Therefore, a multi-level assessment
methodology is needed that targets:
• Inventory—how do we find the wetlands?
• Condition—how do we assess their ecological integrity?
• Restoration—how do we use the information to improve conditions?
The Upper Juniata is a large tributary to the Susquehanna River, which is the largest tributary to
the Chesapeake Bay. The objective of assessing the conditions of the wetlands in the Upper
Juniata Watershed proved difficult because most of the wetlands were on private property.
Because intern and volunteer teams were used to assess the watershed, there was some concern
that EPA's quality control requirements for the data might not be met; however, the data did meet
those requirements, demonstrating that this may be a cost-effective approach for monitoring and
assessing wetlands. The reference standard is forested area; agriculture use is the major
conversion activity in the region and occurs at varying degrees.
Analyses of land conversion from forested to urban and agricultural land were conducted using
GIS software to analyze land use patterns such as percent forested area, mean forest patch size,
the diversity index, and road density. In the Upper Juniata Watershed, land use began as
forested, but is moving toward agricultural. Using an approach of correlating percentage of
forested area with ecological condition, 83 wetlands in the watershed were classified at the
desktop.
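Two of the landscape metrics mentioned above, percent forested area and mean forest patch size, can be sketched on a tiny hypothetical land-cover grid; real analyses use GIS software on classified raster imagery:

```python
# Toy land-cover grid: "F" = forest, "A" = agriculture. The grid and its
# values are invented purely to illustrate the two metrics.

GRID = [
    "FFAAA",
    "FFAAF",
    "AAAFF",
    "AAAFF",
]

def percent_forested(grid):
    """Percent of grid cells classified as forest."""
    cells = [c for row in grid for c in row]
    return 100.0 * cells.count("F") / len(cells)

def mean_forest_patch_size(grid):
    """Mean size of 4-connected forest patches, found by flood fill."""
    rows, cols = len(grid), len(grid[0])
    seen, patch_sizes = set(), []
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == "F" and (r, c) not in seen:
                stack, size = [(r, c)], 0
                seen.add((r, c))
                while stack:
                    y, x = stack.pop()
                    size += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and grid[ny][nx] == "F"
                                and (ny, nx) not in seen):
                            seen.add((ny, nx))
                            stack.append((ny, nx))
                patch_sizes.append(size)
    return sum(patch_sizes) / len(patch_sizes) if patch_sizes else 0.0

print(percent_forested(GRID))        # 45.0
print(mean_forest_patch_size(GRID))  # patches of size 4 and 5 -> 4.5
```

In a desktop assessment, metrics like these are computed per watershed or per wetland buffer and then correlated with ecological condition, as described above for the 83 Upper Juniata wetlands.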
A rapid assessment was also conducted. This requires the use of site-specific stressors for the
four main types of wetlands, as they differ by water source. Also necessary for rapid assessment
is a landscape profile, which, for the Upper Juniata wetlands, showed headwater complex as the
most important and most frequent class. Also required is a site visit to identify stressors, such as
dissolved oxygen levels, hydrological modification, sedimentation, contaminant toxicity,
turbidity, and eutrophication among others, that cannot be identified from aerial photography.
Analysis of these stressors when coupled with landscape information provides the following
results:
• At greater than 85 percent forested area, the only stressor evident is hydrologic modification
at approximately 33 percent of the sites
• At 50 percent forested, there are a large number of stressors and more than 60 percent of the
sites have hydrological modification.
Are stressors associated with land cover characteristics? At the reference site, 42 percent of the
headwater floodplains were affected by sedimentation. Certain human activities give rise to
stressors and those have important impacts on the systems.
When comparing landscape and rapid assessments, rapid assessment provides more information
because it draws on site-specific information. For landscape assessments, close to 70 percent of
the information can be explained if the wetland type is specified; however, for rapid assessments,
the amount of information that can be explained increases significantly. In addition, rapid
assessment is meant to be more predictive, not more prescriptive.
Panel Discussion/Questions and Answers
The speakers had an opportunity to participate in a brief panel discussion drawing on questions
from the audience.
A brief question and answer period following each presentation addressed a range of topics.
These included: (1) linking large-scale, random sampling to site-specific identification of
stressors; (2) incorporation of habitat differences in estuary evaluations; (3) different methods for
land use conversions; and (4) consideration of systematic bias in data reporting for landscape and
rapid assessments.
Volunteer Monitoring—Ten Years of Progress
Following opening remarks by Mr. Joe Hall, with OWOW, four speakers addressed the use of
volunteer monitoring in general and in specific support to wetlands, coastal, and estuarine
programs.
Mr. Joe Hall, an Environmental Scientist in OWOW, introduced this session and the speakers.
Volunteer Monitoring: 10 Years of Progress, What's the Future?
Ms. Alice Mayio, Environmental Protection Specialist in OWOW, provided an overview of
volunteer monitoring in the past and present, EPA sponsorship of volunteer monitoring,
partnerships, and the applicability of voluntary data. People become volunteers because of their
genuine concern about local water, their desire to know more about the surrounding
environment, and their desire to share whatever special localized knowledge they may have to
offer. The largest group of volunteers involves mid-life adults (40 percent) followed by youth
under 18 (28 percent) and seniors (25 percent). Currently, there are 830 volunteer programs
nationwide to monitor rivers, streams, lakes, reservoirs, estuaries, and wetlands, and these
programs collect "professional" data on physical conditions for habitat assessment.
EPA sponsors volunteer monitoring programs to support education, promote environmental
stewardship, promote involvement, support decisionmaking, and gather data. Some volunteers
may have collected data on streams and rivers for up to 25 years. Currently, only 19 percent of
all stream miles have been assessed; therefore, this information from volunteer programs is quite
valuable. EPA provides technical support, workshops for information exchange, and limited
funding under the National Estuary Program and non-point source funding programs.
Universities and other educational institutions, independent organizations, and environmental
organizations such as the Sierra Club all sponsor volunteer monitoring as well.
Volunteer monitoring data is currently used for local action. "Persistent eyes and ears" can
report invasive species first as well as loss of macroinvertebrates leading to local
decisionmaking. Volunteers also support education and outreach at local fairs, provide local
articles in the newspaper, or make presentations to children's classrooms. At the state level, the
data collected by volunteers supplement limited state data for reporting under CWA Sections
305(b) and 303(d) and serve as a screen for problem identification. The credibility of volunteer
data depends on (1) the use of an approved Quality Assurance Project Plan, (2) receiving training
in quality assurance in the field and the laboratory, (3) cooperation with data users, (4) strong
leadership, and (5) the involvement of scientists and professionals.
Two models were discussed as illustrating the volunteer approach: a "community workers"
model and a "science by the people" model. Under the "community workers" model, a
sponsoring agency determines the course of the project and uses the volunteers as labor. For
example, the Maryland DNR recruits and trains volunteers to use specific protocols at designated
sites, and the DNR conducts the analysis of the collected data. Under the "science by the
people" model, volunteers conduct the majority of problem identification and determine how to
collect, analyze, and use the data. An example program is the Alliance for Aquatic Resource
Monitoring under which individuals identify a topic for a study, scientist(s) assist in the setup of
the research model and training, and the volunteers ultimately collect, analyze, and use the data
in the way they want. The advantages of these approaches include citizen empowerment and
involvement as well as maintaining the interest of the volunteers. The disadvantages include the
need for the service provider to continuously come up with new programs, which is time
consuming.
A few success stories of partnership results include the Virginia Save Our Streams initiative that
developed a revised macroinvertebrate method and the Florida Lake Watch that analyzed
reliability protocols and produced 24 peer-reviewed articles.
Wetland - Volunteer Monitoring: Ten Years of Progress
Ms. Kathleen Kutschenreuter, with OWOW, discussed how volunteer monitoring relates to
wetlands protection and data collection. Wetland ecosystems provide an amazing diversity yet
over half of the wetlands in the United States have been drained or converted. Wetlands are a
vital link and are highly productive with unique hydrology and soils. Wetlands provide many
key functions such as flood reduction and recreation yet they tend to be overlooked when surface
water is in discussion. National Wetland Goals established under the CWA include no overall
loss of wetland acreage and a net increase of 100,000 acres annually by 2005.
Volunteer program support such as technical and financial assistance is provided under the
National Wetland Monitoring Strategy. While EPA could simply support volunteer monitoring
for the education and outreach benefits, EPA instead takes volunteer monitoring one step further
by using volunteers to help meet the objectives of the CWA. For example, a recently established
National Wetland Monitoring Working Group includes members from a majority of the states as
well as 5 tribes. EPA is looking for ways to include volunteers in this Working Group.
Grants under CWA Section 104(b)(3) are the biggest financial tool in developing wetland tools to
assist states and tribes, and involve an investment of about $15 million per year. Through these grants EPA
is seeking to increase the protection of vulnerable waters, adopt an ambient wetland monitoring
and assessment program nationwide, and improve successful mitigation of impacts on wetlands.
Cooperative projects and partnerships need to focus on transferable processes and products to
enable information exchange between regions on best practices. Examples of successful
cooperative EPA projects and partnerships include the Izaak Walton League (wetlands
conservation and sustainability), the Massachusetts Bay Volunteer Monitoring and Health
Assessment Group (Handbook for Monitoring New England Salt Marshes, potentially applicable
regionally or nationally), and the Earthforce Wetlands Project (state, tribal, and local
coordination case studies and trends to identify volunteer programs, quality assurance in
volunteer programs, and how states can use these data to fulfill their CWA obligations). These
examples demonstrate the usefulness of volunteer monitoring.
Volunteer groups are unique and vary in funding, background, quality assurance, and other areas.
As a result, Volunteer Wetland Monitoring: An Introduction and Resource Guide was developed
to maximize the benefits achieved by the volunteers and the states from volunteer monitoring
programs.
May is American Wetlands Month, and more wetland information is available from the EPA
Wetland Hotline at 1-800-832-7828.
Volunteer Monitoring: A Coastal Perspective
Mr. Joe Hall, with OWOW, explained how volunteer monitoring supports coastal and estuarine
monitoring initiatives. The National Estuary Programs use volunteer monitoring to collect data
on the condition of the Nation's estuaries. There are 28 National Estuary Programs that address
estuaries designated as areas of national significance. These areas are continuously monitored
and evaluated. In addition, EPA developed the Volunteer Estuary Monitoring Methods Manual,
which provides for the standardization of methods and pooling of resources for increased buying
power. EPA also provides technical support and coordination to volunteer groups.
Volunteer estuary monitoring workshops and a newsletter are two ways of sharing information
between volunteer groups. Two-thirds of the participants in the workshops are the volunteer
monitoring leaders themselves. Government organizations at all levels (state, local, Federal) and
non-governmental organizations also participate at these workshops. Examples of special topics
addressed at these workshops include:
• GIS
• Dealing with Hydrilla
• Observing field conditions
• Nutrients and pesticides
• International collaborations (Canada and Mexico) to look at border problems of pollution
• Equipment management
• Land use at Charlotte Harbor.
More information on volunteer monitoring resource ideas and other topics is available at:
www.epa.gov/owow/oceans and http://www.epa.gov/owow/estuaries/monitor/.
What's In the Future?
Ms. Alice Mayio, with OWOW, discussed future challenges and opportunities for volunteer
monitoring programs. Challenges include funding, improved quality assurance of data, data
management, and broader use of volunteer data. Future opportunities include invasive species
monitoring, creating regional networks to cost-effectively pool data, adding volunteer monitoring
data to STORET (through facilitation of data entry), and participation on monitoring councils.
Section VI: Emerging Technologies
Tuesday and Wednesday, May 6-7, 2003
The purpose of this breakout session on the second and third days of the meeting was to focus on
the application, use, and research directions for diverse emerging technologies, including
computational toxicology; advanced information technology, simulation, and modeling;
biotechnology; and nanotechnology. Each session included a panel discussion and opportunities
to respond to audience questions that provided additional information and insight on a variety of
emerging technology topics.
Dr. William Farland, Acting Assistant Administrator for Science and Research and Development
for ORD, led a session addressing the application of computational toxicology to solve
environmental problems. Presentations included a toxicogenomic predictive model of chemical
effects, highlights of EPA's research activities involving computational toxicology and its
applications for risk assessment, quantitative structure-activity relationship models and other
computational tools supporting evaluation of chemical effects on human health, applications of
computational toxicology and genomics to drinking water research, research initiatives using
genomics to assess risk to ecological sustainability from environmental stressors, historical and
future uses of structure-activity tools to assess pesticides and toxic substances, and highlights of
the recently established NIEHS National Center for Toxicogenomics.
Dr. Gary Foley, with NERL, led a session addressing innovations to advance the detection of
threats and to optimize environmental decisionmaking. Presentations included an overview of
the Federal Networking Information Technology Research and Development Program, diverse
information technology initiatives within EPA for enhanced data acquisition and analysis, use of
satellite-based remote sensing systems to evaluate human and ecosystem health issues, an
aircraft-based surveillance system to support first responders to chemical spills and other
emergencies, use of computer imaging and wind tunnel testing to characterize the temporal and
spatial patterns of contaminant movement and deposition from the collapse of the WTC, and the
use of real-time monitoring data to communicate air quality conditions to the public.
Dr. Hugh McKinnon, Director of the National Risk Management Research Laboratory
(NRMRL), led a session addressing developments and applications of biotechnology.
Presentations included the use and implications of molecular farming to replace traditional
chemical manufacturing, highlights of the EPA biotechnology research program, the role of
science in the regulation of bioengineered crops, monitoring strategies to assess the risks
associated with bioengineered crops, use of satellite-based remote sensing systems to support
compliance monitoring for bioengineered crops, the production and use of biopolymers to create
biodegradable plastics and other products, and highlights of the USDA Biotechnology Risk
Assessment Research Grants Program.
Dr. Jack Puzak, Acting Director of NCEA, led a session addressing developments in
understanding nanotechnology and its applications. Presentations included the use of
nanotechnology to enhance filtration membrane performance, highlights of the National
Nanotechnology Initiative, the production and use of polysiloles as chemical sensors for arsenic
and hexavalent chromium, the production and action of biopolymers to remove heavy metals
from wastewater, understanding the molecular dynamics of colloidal nanoparticles, and the
development and use of nanocrystalline zeolites as catalysts.
Applying Computational Toxicology to Solving Environmental
Problems
Following opening remarks by Dr. William Farland, Acting Assistant Administrator for Science
and Research and Development for ORD, seven speakers addressed the concepts, tools, and
applications of computational toxicology for understanding the human health and environmental
effects of chemicals. A panel discussion including an audience question and answer period
followed the presentations.
Computational Toxicology: Bolstering the EPA's Mission
Acting Assistant Administrator for Science and Research and Development for ORD, Dr.
William Farland, welcomed attendees, thanked Dr. Elaine Francis for organizing the speakers for
this session, and provided an overview of this half-day session on computational toxicology.
The general topic, computational toxicology, is defined as an emerging science that begins to
integrate genomics, quantitative-structure activity relationships (QSARs), informatics, and
systems biology, including pharmacodynamics, useful to understanding the mode of action and
screening chemicals in virtual systems (in silico), thus moving away from in vivo and in vitro
types of studies.
This emerging science brings together diverse disciplines through systems biology and
computational methods, which is anticipated to lead to more streamlined testing programs for all
chemicals that the EPA is charged to evaluate. At present, there is heavy reliance on animal
testing to understand potential impacts to humans. Computational toxicology will enable the
achievement of similar understanding through a systems approach to the biology and risk, with
the associated advantages that include reduction in use of animals for such testing purposes.
Toxicogenomic Predictive Modeling
Vice President for Toxicogenomics at Gene Logic, Inc., Dr. Donna Mendrick, discussed current
activities to build a toxicogenomic database, use of customer-generated data, and prediction of
pharmaceutical and chemical effects using the database and related toxicogenomic tools. While
classic in vivo toxicity studies monitor about 100 parameters, toxicogenomics allows monitoring
of the entire genome. This in turn enables prediction of toxicity before or in the absence of
classical signs, and provides clues as to the mechanisms of toxicity such as the genes, metabolic
pathway, or biological pathway involved.
The ToxExpress® Working Group is developing a toxicogenomic database that will serve as a
tool to improve chemical safety through predictive modeling. The development of this database
involves many firms and requires integration of expertise from bioinformatics, database
management/software, biostatistics, molecular biology, pharmacology, toxicology, and
microarrays among others. Efforts to date have looked at more than 130,000 microarrays. To
build a robust database, vehicle, multiple doses, and time points are included in each study to
enable evaluation of time effects and to address differences in animal feed that result in a drift in
results. Since each run generates 26,000 data points, this database becomes quite large.
Uses of toxicogenomics identified by their customers include in vitro screening to rank
compounds and determine which show the least toxicity; short-term in vivo experiments; and
mechanistic analysis of toxicity to understand the underlying mechanisms and to look for clues
in the genes to help refine further research and to determine whether the impacts are animal-
specific and therefore not relevant to humans.
The methodology examines the impacts on each gene in the normal population. This involves
graphing the frequency versus the average difference value, then comparing this information to
each gene's result when exposed to a toxicant. Each gene receives a different weighting (linear
discriminant analysis) depending upon its response to the toxicant. Note that there is often much
overlap between normal and exposed genes.
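The weighting scheme just described can be sketched as a simplified linear discriminant: each gene is weighted by how well its expression separates normal from exposed samples. The expression values and the pooled-variance weighting below are illustrative assumptions, not Gene Logic's actual algorithm.

```python
from statistics import mean, variance

def gene_weights(normal, exposed):
    """Weight each gene by its separation between normal and exposed
    samples (difference in means over pooled variance), a simplified
    form of linear discriminant analysis."""
    weights = []
    for n_vals, e_vals in zip(normal, exposed):
        pooled = (variance(n_vals) + variance(e_vals)) / 2
        weights.append((mean(e_vals) - mean(n_vals)) / pooled)
    return weights

def toxicity_score(sample, weights):
    """Weighted sum of a sample's expression values."""
    return sum(w * x for w, x in zip(weights, sample))

# Hypothetical expression values: rows are genes, columns are replicates.
normal =  [[1.0, 1.1, 0.9], [2.0, 2.1, 1.9]]
exposed = [[3.0, 3.2, 2.8], [2.0, 1.9, 2.1]]  # gene 1 responds, gene 2 does not

w = gene_weights(normal, exposed)
print(toxicity_score([3.1, 2.0], w))
```

Genes whose expression does not shift under exposure (gene 2 above) receive near-zero weight and contribute little to the score, mirroring the overlap between normal and exposed genes noted above.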
Gene Logic uses multiple methods to assess data compatibility. A large reference database such
as this one allows the selection of genes that exhibit the lowest variability due to biological and
processing differences. This facilitates comparison of data from different sites. Predictive
model validation is also conducted, involving extensive statistical cross-validation.
Models are also tested in a blind fashion using customer-generated data. The data are sent to
Gene Logic, which assesses gene compatibility, runs the predictive models, then issues a report
of whether the compound appears toxic or not. To date, Gene Logic has correctly identified
90 percent of the true hepatotoxins with a zero percent false positive rate. Gene Logic
subsequently determined that the data came from five different rat strains and that the majority
of the compounds tested (22 of 32) were not already in the predictive models, thus providing a
true test of the database as a predictive tool.
Dr. Mendrick provided examples of the application of the database and associated predictive
modeling involving diethylnitrosamine and ciprofibrate. These examples illustrated that
toxicogenomic predictive modeling can be applied 24 hours after dosage, thus saving time and
money, and can correctly identify both short- and long-term hepatotoxicity. This technique also
enables mechanisms of toxicity to be evaluated in conjunction with the pathology and
compound-match information as well as analysis of individual genes.
Dr. Mendrick also provided examples of predictive modeling for three acetylcholinesterase
inhibitors that are not toxic to animals but are toxic to humans. The toxicogenomic predictive
models showed results comparable to human clinical findings that classical clinical chemistry did
not detect. The key is to know what genes to look at. In this study, over 200 genes were
deregulated by the compounds being tested, but many were not hepatotoxicity markers.
Therefore, a change in a gene does not necessarily signify a toxic event.
This combination of animal testing, a large database of genes, and toxicogenomic predictive
modeling enables accurate toxicity predictions for compounds not already studied. While in a
minority of cases data compatibility issues have arisen, methods have been successfully
developed and implemented to enable the use of such data. In addition, building a large database
of genes helps to identify normal expressing ranges of genes that are not involved with toxicity,
supports statistical confidence in determining statistically significant events since many toxicity-
relevant genes exhibit small changes in gene expression, and enables the construction and
reliable use of robust toxicogenomic models. Because the database design incorporated
variability information, broader applicability was achieved such as the prediction of results in
different rat strains.
One question raised during the presentation involved model sensitivity to a chemical that has
more than one mode of action or responses to chemical mixtures. Where a chemical has multiple
effects, the more prominent effect may mask more subtle effects that may be of interest. A
second question concerned the false positive rate encountered in this approach. The model's
sensitivity can be adjusted, for example to reduce false negatives, which in turn would
increase the number of false positives. The database includes a number of negative controls, in
particular the elimination of genes that are responding but are not toxicity-relevant.
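The tradeoff raised in that answer can be made concrete: lowering the decision threshold calls more compounds toxic, which cuts false negatives at the cost of more false positives. The model scores below are invented for illustration.

```python
def rates(toxic_scores, clean_scores, threshold):
    """False-negative and false-positive rates when samples scoring
    at or above the threshold are called toxic."""
    fn = sum(s < threshold for s in toxic_scores) / len(toxic_scores)
    fp = sum(s >= threshold for s in clean_scores) / len(clean_scores)
    return fn, fp

# Hypothetical model scores for known toxic and non-toxic compounds.
toxic = [0.9, 0.8, 0.6, 0.4]
clean = [0.5, 0.3, 0.2, 0.1]

for t in (0.7, 0.5, 0.3):
    fn, fp = rates(toxic, clean, t)
    print(f"threshold={t}: false negatives={fn:.2f}, false positives={fp:.2f}")
```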
EPA's Research Program on Computational Toxicology
Director of NHEERL, Dr. Lawrence Reiter, provided an overview of the health toxicity program
at EPA, challenges facing the Agency, and the use of a conceptual or science framework for
guiding research. To carry out its mission, EPA relies on quantitative risk assessment and uses
that information to inform decisionmaking. There is a whole risk continuum beginning with
source/stressor formation that proceeds to exposure then to effect/outcome. This is typically
applied in research to one chemical at a time. The EPA programs have many lists of priority
chemicals including endocrine disrupting chemicals (EDCs), pesticide inerts, high production
volume chemicals, and Contaminant Candidate List chemicals, with no risk-based criteria for
setting testing priorities, yet EPA cannot possibly test all of these for every possible endpoint. In
addition, there are multiple regulatory authorities for testing and each has different testing
requirements with few options for flexible testing approaches. There is currently a lack of data
necessary to reduce uncertainties that exist in comprehensive risk assessment such as
extrapolation of toxicity data across species.
Better tools are needed to understand the risk continuum, and genomics shows much promise for
application in this area. Computational methods and bioinformatics may help to identify what
endpoints to measure and the quantitative models may help to address the challenge of
evaluating large numbers of chemicals. The overall goal is to integrate modern computing and
information technology with molecular biology and chemistry to improve EPA's ability to
prioritize data requirements and risk assessments for toxic chemicals. Overall program objectives
are to improve linkages in the source-to-outcome risk paradigm, improve predictive models for
screening and testing, and enhance quantitative risk assessment.
Better characterization of toxicity pathways supports better understanding of the exposure-to-
adverse-outcome portion of the risk continuum paradigm. Exposure to a xenobiotic chemical
results in altered organ function and adverse outcome. In vivo approaches help in understanding
these relationships at the molecular/subcellular, cell, organ/tissue, and individual organism
levels, which can be linked to an outcome that is relevant to EPA and to risk assessment. In vitro
methods can be used to understand cellular processes and to improve predictive models. Then,
in silico methods can be used to understand and/or predict the molecular/subcellular effects. An
example is current research into the potential use of diagnostic molecular indicators and whether
gene expression pathways can serve as indicators for a specific stressor or family of stressors.
Of the 193 genes detectable in both blood and uterus, only 18 are significantly
altered when exposed to estradiol; if these changes can be shown to occur in a dose-related
fashion, models can then be developed.
Use of QSAR approaches helps to improve predictive models for screening and testing, including
pollutant-driven and chemical-driven approaches. Estrogen receptor binding was offered as an
example of how to address the need to test 6,000+ chemicals by using predictive modeling to
identify the highest priority chemicals for testing, the lowest priority chemicals for testing, and
chemicals that may not require any testing. Another example involved the use of QSAR to
predict the relative potency of haloacetic acids and the concentrations necessary to produce
developmental malformations; experimental results tracked well with QSAR prediction and
efforts are now being extended to the pathway of effect - looking at changes in tissue level, gene
activation patterns, and proteomic approaches to evaluate protein phosphorylation, believed to
be the key mechanism leading to the birth defects. The overall desire is to expand the QSAR
approach to a wider range of chemicals.
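A regression-based QSAR of the kind described, relating a calculated descriptor to observed potency, can be sketched with a closed-form least-squares fit. The descriptor values and log potencies below are invented; a real haloacetic acid model would use measured endpoints and typically several descriptors.

```python
def fit_line(x, y):
    """Closed-form least-squares fit y = a*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return a, my - a * mx

# Hypothetical descriptor (e.g., an electronic parameter) vs. log potency.
descriptor = [0.5, 1.0, 1.5, 2.0, 2.5]
log_potency = [1.1, 1.9, 3.1, 3.9, 5.1]

a, b = fit_line(descriptor, log_potency)
print(f"predicted log potency at descriptor 3.0: {a * 3.0 + b:.2f}")
```

Once fitted, the line predicts potency for untested chemicals in the same class, which is how QSAR extends experimental results to a wider range of compounds.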
Cross-species extrapolation continues to be a major area of uncertainty when conducting risk
assessment. Genomics enables identification of similarities/differences among species at the
cellular level and to do the extrapolations. The default assumption in risk assessment is that we
can extrapolate across species, but the literature indicates that this may not be the case. To
address this, EPA is researching effects of exposure to endocrine disrupting chemicals in
phylogenetically diverse species. This research is identifying/isolating receptors, determining
receptor binding characteristics, then looking across species regarding the structure and function
of these receptors. Structural differences have been found between genes; whether this can
explain differences in binding characteristics is not yet known. When similar binding
characteristics are found, such data improve confidence in extrapolating across species.
The next steps in the research program are to complete the strategic plan for the computational
toxicology program, which is to be presented to the Science Advisory Board in July 2003. EPA
will continue to coordinate and collaborate with other research organizations to help guide
research in this area; these include NIEHS, DOE, and the CIIT Centers for Health Research.
Strong collaboration and coordination with other Federal agencies will be necessary to achieve
this research program's objectives as the amount of research to be done far exceeds EPA's
capacity. The ultimate goal, however, is to link the science to solving Agency-related
challenges.
Novel Informatics and Pattern Recognition Tools for Computational Toxicology
Dr. William Welsh, with the Robert Wood Johnson Medical School and the University of
Medicine and Dentistry of New Jersey, discussed the development and application of
computation tools useful to risk assessment and regulatory control, with emphasis on QSAR
models. The high throughput technologies are yielding prodigious amounts of information and
the mandate of the informatics tools is to digest this vast amount of data to yield meaningful
conclusions and interpretations. The apparent simplicity of QSAR is that a set of compounds
with common structural similarity or a toxicological endpoint can be constructed on a computer,
which can extract the various features (descriptors) that are in common and can correlate changes
in endpoints with these features. This combines activity data and molecular descriptors to make
predictions on a larger array of chemicals and to interpret changes in biological activity (how
changes lead to biological effect).
Creation of a QSAR model requires toxicological endpoints, chemical structures, calculated
properties, and the use of statistical techniques to build models for prediction and interpretation.
The value of QSAR models is that they are:
• Extremely fast, so they are amenable to large-scale screening
• Predictive, enabling existing data to be leveraged (if used correctly)
• Economical, enabling prioritization of expensive testing
• Informative, by yielding hidden patterns and insights into the mechanism(s) of action
• Humane, by reducing the extent of testing on animals.
Given the large number of untested chemicals (potentially more than 85,000), a tiered
computational approach is necessary because the biological data are not available to do QSARs
for each one. One approach is to consider the use of structural filters to determine which
chemicals might be EDCs to identify a potential subset of chemicals, then use classification
models to separate active and inactive compounds from this subset. For active compounds, the
approach would be to generate data and QSAR models, and use the results to design animal
studies. This approach prioritizes large numbers of chemicals with the outcome of testing the
most suspect ones.
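The tiered approach above amounts to a cheap structural filter followed by a classification ranking, so that expensive testing goes to the most suspect compounds first. A minimal sketch; the structural alert flags and classifier scores are hypothetical placeholders:

```python
# Stage 1: structural filter -- keep only chemicals carrying a
# (hypothetical) structural alert associated with estrogenic activity.
# Stage 2: rank the survivors by a (hypothetical) classifier score.

chemicals = [
    {"name": "chem-A", "has_alert": True,  "score": 0.92},
    {"name": "chem-B", "has_alert": False, "score": 0.88},
    {"name": "chem-C", "has_alert": True,  "score": 0.35},
    {"name": "chem-D", "has_alert": True,  "score": 0.74},
]

candidates = [c for c in chemicals if c["has_alert"]]                # stage 1
ranked = sorted(candidates, key=lambda c: c["score"], reverse=True)  # stage 2
print([c["name"] for c in ranked])  # → ['chem-A', 'chem-D', 'chem-C']
```

Note that chem-B never reaches the classifier at all: the structural filter is what keeps the expensive stages tractable for tens of thousands of chemicals.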
Under an NCER grant, various QSAR models were developed and applied to two estrogen
receptors to explore the action of EDCs, which mimic the effects of estrogen, androgen, and
other hormones and may give rise to possible adverse health effects. EDCs are highly
structurally diverse, pervasive, and include both agonists and antagonists; as a result, their effects
are harder to understand. The resulting data showed a high correlation between QSAR predicted
binding affinities and biological measurements of binding affinities.
Receptor-based approaches look at the chemical of interest in silico to examine the binding
geometry, use a homology model to build a structural model from crystal structure information,
then "dock" small molecule compounds into the pocket and calculate binding affinities (i.e.,
ligand-receptor affinity and energies). The resulting data again showed a high correlation
between calculated (modeled) and measured (experimental) findings for both estrogen and
androgen receptors. In addition, enhanced QSAR models can be developed by supplementing
QSAR models with binding energy information. Results from such models are even closer to
experimental results than QSAR alone.
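The agreement between calculated and measured binding affinities reported here is typically summarized by a correlation coefficient. A minimal Pearson r computation, with invented affinity values:

```python
from statistics import mean

def pearson_r(x, y):
    """Pearson correlation between two equal-length sequences."""
    mx, my = mean(x), mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) *
           sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

# Hypothetical binding affinities: model-calculated vs. laboratory-measured.
calculated = [6.1, 7.0, 5.2, 8.3, 6.8]
measured   = [6.0, 7.2, 5.0, 8.1, 7.0]

print(f"r = {pearson_r(calculated, measured):.3f}")
```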
Research efforts also investigated the effects of certain mutations that occur within the receptor
binding pocket. One example is prostate cancer in which a single mutation to alanine in one
location significantly reduces the effectiveness of chemotherapy by increasing affinity of the site
to a much broader range of compounds. The result changes a chemotherapy drug from an
inhibitor to an agonist, an undesirable outcome.
Another research area involved the activation/antagonism of the PXR/SXR receptor by
chemicals. For example, PCBs will bind to the PXR receptor and induce the generation of
enzymes that metabolize the PCBs. However, activation of the PXR receptor appears to vary
species by species; PCBs turn the PXR receptor "on" in rats and mice, but turn it "off" in humans.
This implies that the use of rat models for PCBs may not be appropriate. In addition, different
PCB compounds have different effects on this receptor and the level of chlorination appears to
be the source of the difference. As an outgrowth, a series of guidelines was developed as to
which PCBs may be antagonists.
Other computational tools of interest include shape signatures and polynomial neural networks
(artificial intelligence). A limitation of QSAR models is the need for biological data, which may
not exist. Shape signature approaches can be used to compare small molecules with one another
and these differences can be extrapolated to screen and identify EDCs for example. Polynomial
neural networks may be useful in developing nonlinear QSAR models for data sets that are very
noisy and very large.
Computational Toxicology and Genomics: The Next Wave of Drinking Water
Research
NHEERL scientist, Dr. Douglas Wolf, discussed applications of computational toxicology and
genomics to risk assessment. Computational toxicology is the intersection of computational
methods and the "-omics" technologies. The traditional risk assessment paradigm is that
exposure at some dose results in a response that may be measured. Toxicogenomics is a new
field examining how the entire genome is involved in biological responses to environmental
toxicants and stressors. By combining information from messenger ribonucleic acid (mRNA)
profiling (genomics), cell or tissue protein profiling (proteomics), and genetic susceptibility,
computational models can be developed to better understand stressors and disease. This results
in a new risk assessment paradigm in which, at the response level, effects arising in the genome
(what gene affected), transcriptome (which gene transcribed), and proteome (what protein is
transcribed) can be examined to determine whether or not an adverse health effect occurs; this
helps to better understand what the response is and whether it can turn into an adverse effect.
Such an approach helps to conduct better quantitative risk assessment through:
• Identification of biomarkers of exposure and response
• Better definition of the dose-response curve
• Definition of modes and mechanisms of action
• Evaluation of mechanisms across species
• Construction of biologically based dose-response models.
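Biologically based dose-response models of the kind listed above are often expressed as a Hill curve, where response rises sigmoidally with dose around a half-maximal concentration (EC50). A minimal sketch with arbitrary parameter values:

```python
def hill(dose, emax=1.0, ec50=10.0, n=2.0):
    """Hill dose-response: fraction of maximal response at a given dose.

    emax: maximal response; ec50: dose giving half-maximal response;
    n: Hill coefficient controlling curve steepness. Values are arbitrary.
    """
    return emax * dose ** n / (ec50 ** n + dose ** n)

for d in (1, 10, 100):
    print(f"dose {d}: response {hill(d):.3f}")
```

By construction the response at the EC50 is exactly half of emax, and the curve saturates at high doses, which is the shape such models use to link exposure to effect.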
There is a cascade of biological, chemical, and physiological changes that result from
interactions between a chemical and a biological system. For risk assessment, these toxicity
pathways must be understood. While the typical toxicity pathway involves exposure, intake into
organism, uptake into tissue, transfer to target cell population, cellular metabolism, cellular
response, tissue response, and perhaps even organism response, the process is not
straightforward and there are many pieces involved in each step.
Where genomics can have a big impact is in understanding the cellular/tissue response and
predicting the response and whether there is an organ response. This helps to differentiate
between different types of toxicants (carcinogenic, noncarcinogenic). The goal is to be able to
rapidly identify subsets of adverse health effects.
An example of the application of these techniques involved the understanding of the risk of
urinary bladder cancer from disinfected drinking water. Chlorinated water appears to increase
the risk of bladder cancer. In an experiment, bladder tissue responded to chlorinated water
through cellular and gene expression changes. This experiment demonstrated the importance of
moving beyond the recognition of the pattern of gene expression to turn that impact into
quantitative data as well as the importance of understanding the normal biology. About half of
the expressed genes involved cell structure, but other gene expression changes suggest changes
in the ability of the cells to break down chemicals in the urine, which will be determined through
future studies.
Another important consideration is to integrate the biology with the chemistry. Computational
toxicology supports this through integration of bioinformatics and chemoinformatics to compare
data on many levels, to understand altered toxicity pathways, and to compare across similar
chemicals or classes of chemicals. Bioinformatics generally refers to the analysis of arrays of
gene or protein expression data, and looks at gene expression patterns to determine mechanisms
of action. Chemoinformatics generally refers to the analysis of chemical activity databases (e.g.,
results from biological assays for many chemicals) to determine and quantify relationships
between chemical structure/property and activity.
Computational toxicology may also support coordinated, high throughput screening of chemicals
because it is designed to interact between response profiling (genomics), virtual models (systems
biology), and computational chemistry toxicity (QSAR). Thus, horizontal integration of
computational toxicology approaches across ORD will support prediction, prioritization, and
more quantitative risk assessment.
The Genomic Path from Exposure to Effects in Aquatic Ecosystems
NERL scientist, Dr. David Lattier, discussed research initiatives to assess risk to ecological
sustainability from environmental stressors by linking aquatic exposure to physiological and
reproductive effects in individuals and populations. Such data enable Federal, state, and local
environmental managers to diagnose causes and forecast future ecological condition in a
scientifically and legally defensible fashion, to more effectively protect and restore valued
ecosystems.
Normal cell functions involve gene transcription that results in generation of a protein (the
endpoint of a cellular event). Personal behavior or environmental conditions can change this
process resulting in over-expression of a protein, a damaged protein, or no protein generation;
this results in a non-normal endpoint that can affect reproduction, development, and/or overall
organism sustainability.
Gene expression studies are using the fathead minnow because it is easy to distinguish males and
females, the species is hardy and found throughout the lower 48 states, it has rapid generational
turnaround (reproductive maturity in 4 to 5 months), and over 30 years of toxicological data exist
for this species within EPA.
While many genes are expressed at equal levels in all cells (so-called housekeeping genes), a
subset of genes can be identified that are specifically activated by a certain environmental
stressor or class of stressors. This may be useful to link exposure to population and reproductive
effects as well as evaluating effectiveness of remediation technologies and characterizing
exposure of aquatic organisms to mixtures.
A single gene indicator for estrogenic aspects of water, the Vitellogenin gene, has been in use for
several years. While this gene is normally expressed only in egg-bearing females, gene
expression can be environmentally induced in male fish. Thus, this gene is an excellent indicator
of estrogenicity in aquatic ecosystems.
The genomic aspect gets involved when moving from consideration of one gene to the
consideration of many genes. By extracting and covalently binding hundreds of genes to a
microscope slide, which is then washed with a chemical of interest, microscopic examination can
identify changes in gene expression (up and down regulated) as well as those with no change in
expression. Bioinformatics is then used to look at the patterns to see which genes are grouped in
functionality. Proteomics can also be used to profile the proteins. If done correctly, genomics
can yield a genome-wide, hypothesis-free snapshot of dynamic biological systems.
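Up- and down-regulation calls of the kind described are commonly based on the log2 fold change between treated and control signal intensities. A minimal sketch, using invented gene names and intensities (vtg standing in for vitellogenin) and an arbitrary twofold cutoff:

```python
from math import log2

def regulation(control, treated, cutoff=1.0):
    """Classify each gene by log2 fold change: 'up' or 'down' if the
    magnitude exceeds the cutoff (1.0 = twofold), else 'no change'."""
    calls = {}
    for gene in control:
        lfc = log2(treated[gene] / control[gene])
        calls[gene] = ("up" if lfc > cutoff
                       else "down" if lfc < -cutoff
                       else "no change")
    return calls

# Hypothetical microarray intensities for three genes.
control = {"vtg": 5.0, "actb": 100.0, "cyp1a": 80.0}
treated = {"vtg": 40.0, "actb": 105.0, "cyp1a": 15.0}
print(regulation(control, treated))
```

Bioinformatics tools then group the up- and down-regulated genes by function, as described above, while stable housekeeping genes (actb here) drop out as "no change."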
Ecotoxicogenomics is another field in which microarrays can be used to determine
bioavailability, generate molecular signatures and patterns of gene induction by classes of
toxicants, and aid in structural characterization of exposure-specific up-regulated genes.
The overall goal is to enhance computational toxicology through the use of genomics,
proteomics, and metabonomics to assess single stressor exposure patterns, and to describe the
predictive and mechanistic biology of environmental stressors. The challenges faced in this area
are which genes become up regulated and the functional significance of that change, defining
individual variability and its relationship to gene expression, impacts of complex mixtures of
stressors, and cross-species extrapolation.
Structure-Activity Tools for Assessing Pesticides and Toxic Substances—Past,
Present, and Future
Mr. Joseph Merenda Jr., with the OPPTS Office of Science Coordination and Policy, provided
the Program Office perspective on the use of structure-activity tools, the types of tools in use,
and the gaps/needs to fill. Structure-activity tools are a core technology in OPPTS programs.
While not widely used, they are a key component of many activities, including:
• Hazard screening for new industrial chemicals, e.g., pre-manufacturing notifications under
the Toxic Substances Control Act (TSCA)
• Setting priorities for chemical testing (e.g., high production volume chemicals, EDCs),
including a mandate under the Food Quality Protection Act to screen and test a large number
of chemicals
128 EPA SCIENCE FORUM 2003 PROCEEDINGS
-------
• Promoting pollution prevention through selection and design of reduced-risk chemicals and
use of PBT profilers with the goal of better communicating technologies to the private sector
and encouraging their use as new products are developed.
Applications of structure-activity approaches include identifying properties of homologous
chemicals, correlating narcotic potency with partition coefficient, estimating chemical properties
using structural features (through correlative, mechanistic, and structural techniques), and
screening/discovering drugs. These involve three families of structure-activity approaches:
expert-based, regression-based, and molecular modeling.
Expert-based structure-activity approaches draw on past experience in evaluating structure,
analogs, and mechanisms to determine what hypotheses can be developed, to conduct qualitative
hazard prediction, to determine whether laboratory (in vivo) testing is necessary, and to
determine whether there is a concern that warrants regulatory activity. An example of the
application of this approach is in the health effects analysis for pre-manufacturing notifications.
Regression-based structure-activity approaches can be used to estimate ecotoxicity for new
chemicals, and require regression models for relevant chemical classes. This approach is
primarily applicable for non-specific toxicity. The octanol-water partition coefficient appears to be the
best predictor to date, and there are some techniques applicable to situations where there is little
to no data.
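As a rough illustration of the regression-based approach (invented training values, not an EPA model), non-specific toxicity within a chemical class can be regressed against log Kow, and the fitted line then used to screen a chemical with little or no test data:

```python
# Minimal QSAR regression sketch: hypothetical (log Kow, log 1/LC50) pairs
# for one chemical class; the fitted line screens an untested chemical.

def fit_line(xs, ys):
    """Ordinary least squares fit: returns (slope, intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Hypothetical training set for a chemical class
log_kow = [1.0, 2.0, 3.0, 4.0]
log_tox = [0.9, 1.8, 3.1, 4.0]

slope, intercept = fit_line(log_kow, log_tox)

def predict_log_toxicity(kow):
    """Estimated log(1/LC50) for a new chemical from its log Kow."""
    return slope * kow + intercept

print(round(predict_log_toxicity(2.5), 2))
```

The same pattern underlies real class-specific regression models; only the coefficients and the measured endpoints differ.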
Molecular-modeling structure-activity approaches require extensive knowledge of the toxicity
mechanism at the molecular level. The approach is promising but not yet in use, and it is being
evaluated for endocrine disrupter pre-screening and priority setting.
The reasons to invest in better structure-activity tools include: (1) the need for better, more
targeted testing to address huge data gaps, resource limitations, and animal welfare concerns, (2)
the need for tools that incorporate the rapidly advancing genomic knowledge, and (3) the
challenge of addressing real world exposures to multiple chemicals and other stressors.
NIEHS Toxicogenomics Centers: Model for Partnerships
Dr. Bennett Van Houten, Chief of the Program Analysis Branch at NIEHS, discussed the
Toxicogenomics Research Consortium (TRC) and the use of partnering to fully utilize the
potential of toxicogenomics. The NIEHS National Center for Toxicogenomics was established
about 2½ years ago to combine toxicology with gene expression profiling, proteomics,
metabonomics, and single nucleotide polymorphism analysis using a relational database. As part
of this Center, the TRC consists of cooperative research members (five academic centers and the
NIEHS microarray center), two large resource contractors, and extramural staff.
The primary TRC goals are to enhance research in environmental stress responses using
microarray gene expression profiling, provide leadership in toxicogenomics by developing
standards and practices for analysis of gene expression data, develop a robust relational database,
and improve public health through better risk detection and earlier intervention in the disease
process. TRC cooperative research members work on common problems, send RNA and other
data to the contractors, who provide microarray and bioinformatics support, then the consortium
develops practices and standards for common adoption for data generation and submission to the
database. The contractors deposit the data into the NIEHS chemical effects in biological systems
knowledge base that NIEHS hopes will become a resource for the research community.
Use of centralized contractors was key to information flow and facilitating the communal work
of scientists at different locations and academic centers. Each academic center has three funding
areas: core support, toxicology experiments, and basic research using gene expression profiling.
To build this standardization project, a common language (standard) was created for gene
expression experiments to generate high quality data and to compare/compile data across
multiple microarray platforms and laboratories. Otherwise, the many sources of variation in
microarray experiments and in the application of bioinformatics tools make it difficult to compile
and compare data across different databases. As part of this effort, an experimental protocol was
developed and implemented at the academic centers to determine variation in RNA labeling and
hybridization. Analysis is currently underway regarding data reliability, reproducibility, and
quality. The program used common genes across the platforms as well as genes on the standard
chips generated by each academic center. Other areas of interest include sources of technical
variation such as direct versus indirect labeling, background correction, image analysis and raw
image processing, normalization, and probe performance (for the same genes).
Preliminary findings show a high correlation of results within one microarray, but this
correlation decreases for the same sample analyzed on different microarrays, and decreases
even further when two different mouse species are involved. Correlation was generally good
within individual academic centers and lower across academic centers. Use of different
scanners also influenced results, and correlation improved when the different academic
centers used the same scanner.
These results indicate that development of standards for gene expression experiments is
necessary for a number of reasons. Large amounts of standard RNA (single tissue, mixed tissue)
are necessary and must be reliable, reproducible, and stable over time. In addition, quality
control genes need to be included on chips for analysis, a list of predictive genes is needed that
can hold up across platforms (species-specific, universal), and common gene annotation is
necessary (e.g., accession number/Unigene cluster, sequence information, commercial arrays).
The TRC projects are a partnering opportunity for academic centers to work together and
eliminate variation in gene experiments. Information gained by TRC will be used by individual
sites and 600 grantees to move the field forward in industry as data become available. This
effort envisions the generation of standards and best practices to enable data comparison across
platforms, to develop a microarray library, and to develop a Chemical Effects in Biological
Systems knowledge base. These will all support the use of toxicogenomics to improve public
health by:
• Enhanced efficiency in toxicity assessment
• Enhanced efficiency in drug design/safety assessment
• Individualized medicine for prevention, diagnosis, and treatment
• Individualized risk assessment.
Panel Discussion/Questions and Answers
The speakers had an opportunity to participate in a brief panel discussion drawing on questions
from the audience.
A brief question and answer period addressed a range of topics, including:

• Limitations of current technology to conduct high-throughput assays for analysis over time,
which results in "snapshot" results (one sample result in time) that may affect the ability to
distinguish between normal variability in gene expression and the response to a stressor

• The potential for more extensive use of quantum mechanics, associated limitations regarding
molecular features that may be evaluated through QSAR, proteomics, and genomics, and a
future vision of combining all of these techniques to understand the properties of chemicals,
especially small molecules

• Differences in the state of the technology as applied by drug designers/discoverers, who are
looking for a very specific therapeutic outcome and know what receptor to target, versus its
application in environmental toxicology, where the chemical decides the binding point

• Whether such techniques will be able to address large ecosystems where habitat issues may be
more important than chemical concentrations, and the use of historical approaches (such as
EMAP) in conjunction with the "-omic" techniques, stressing the importance of distinguishing
between adaptive responses and damage from stressors

• The expectation that chemical approaches and the various "-omic" approaches will merge over
time, with the example offered of the use of genetic approaches to understand population
dynamics and impacts of resistance incorporated into plants to establish a baseline for
population density, gene flow, and diversity to support analysis of the effects of
genetically-engineered crops on nontarget organisms

• The ability of large databases to examine normal as well as toxicological changes, including
the impacts of animal care and feeding on the genetic level and gene expression, and the use of
pathways approaches for an ecosystem rather than gene-by-gene approaches.
In closing, Dr. Farland noted that all of the session discussions presented examples of the values
of partnerships.
Innovation to Advance the Detection of Threats and Optimize
Environmental Decisionmaking
Following opening remarks by Dr. Gary Foley, with NERL, six speakers addressed innovative
technologies and tools for understanding the human health and environmental effects of chemicals.
A panel discussion including an audience question and answer period followed the
presentations.
Information Technology Science: Bolstering the EPA's Mission
NERL scientist, Dr. Gary Foley, welcomed session attendees and discussed the complexity of
issues to address in protecting human health and the environment. There are a number of aspects
to consider including development, wildlife, sensitive human populations, soil, air, a vast range
of scales (cellular to ecosystem and global), a breadth of alternatives, and a myriad of
uncertainties.
Given these complexities, advanced information technology and modeling are essential: to
gather, integrate, and interpret (layer) data; to combine data with modeling to predict changes in
exposures resulting from regulatory actions and to understand the underlying issues; to predict
outcomes by combining all of this information, such as evaluating the impacts of a PM2.5
regulatory strategy on contaminant levels and resulting changes in exposure; and to bring in
decision tools to help make multi-stressor regional decisions. These tools and approaches enable
EPA to make progress on critical issues such as clearer skies, cleaner water, and more vital
ecosystems, and to reduce impacts by understanding actions at the molecular level. All of this
can be combined to understand how to make people healthier around pollutants and to predict risks.
Meeting National Environmental Goals: Coordinated Federal Information
Technology Solutions
Dr. David Nelson, with the White House National Coordination Office for Information
Technology Research and Development, provided an overview of the Federal Networking
Information Technology Research and Development Program and provided several modeling
examples of Federal agency partnering. Since 1991, this Program has involved coordinated,
focused, long-term interagency research and development in information technology across 12
agencies and departments. Activities encompass high-end computing, large-scale networking,
high confidence software and systems, human-computer interaction, software design and
productivity, and social, economic, and workforce implications of information technology.
Examples of environmental-related information technology research include simulation of
aquaporin protein function in a cell to transport water molecules (NSF, NIH), environmental
modeling of the Chesapeake Bay (NOAA, EPA, DOD), high-resolution, long-term climate
modeling (DOE, NSF, NOAA), and "smart dust" (DOD and Intel).
Aquaporin is a common protein in mammals that controls water flow in and out of a cell. If this
protein mutates, it does not function correctly and can lead to serious health outcomes; the
protein may have a role in glaucoma and diabetes. Visualization was used to understand the
function of this protein and determined that the water molecules flip position, first entering the
cell membrane to a certain point via the oxygen atom and then flipping to complete entry via the
hydrogen atoms. Experiments would have been unable to obtain such information on
cellular phenomena at the molecular level.
Modeling and visualization were combined in evaluating the salinity in the Chesapeake Bay.
The computed salinity was visualized and the model used to compute salinity was checked
against measured data such as dissolved oxygen. This approach was useful for data analysis by
users that were not skilled computational scientists. The results indicated that a significant
portion of the nitrogen enters the Chesapeake Bay via air pollution from sources located
hundreds of miles away from the watershed. Therefore, to control nitrogen, both air and water
quality needed to be addressed. This illustrates the use of modeled information to inform and
advise regulators and policymakers.
One long-term climate modeling example included a 1,200 year control run of existing and
extrapolated climate data to evaluate the El Nino cycle. The model found large variability in the
cycle, identified periods where the cycle diminished in intensity, and determined that there are
natural cycles that must be considered in addition to greenhouse gas impacts for global climate
change analysis. A second climate modeling example involved the simulation of a tropical
cyclone near Madagascar that demonstrated the power of a Japanese earth simulator for better
resolution of local features. This effort found significant differences in resolution based on grid
size, with the smallest size providing sufficient resolution to clearly visualize the cyclonic
pattern. These results demonstrated that for climate modeling, a smaller grid size is necessary,
which in turn requires more data points and larger computers for analysis. This simulation
shows that computers can be used to conduct regional analysis using models run at global scale.
"Smart dust" is a research activity being conducted by the University of California, Berkeley
with a near-term goal of developing a millimeter-sized sensor and communication package using
radio frequency, laser, and modulated corner reflector for communications and sensors. These
would replace current sensors that are largely mechanical. The small size would facilitate broad
distribution for environmental monitoring or surveillance. In an experiment, an aircraft dropped
a number of such sensors equipped with magnetometers and radio frequency communications
along a road on a military base and the sensors were able to sense the passage of vehicles, spot
traffic patterns, and communicate the speed and size of vehicles. In the next 10 years, these may
be useful as pollutant sensors in the environment.
Other examples of information technology applications to environmental issues include
combustion modeling to reduce emissions, transport modeling of toxic plumes, hydrology
models of surface and groundwater, networks of real-time sensors to detect toxic chemicals or
biologicals, digital libraries of mass spectral prints for chemical compounds, models of
biological activity of toxic chemicals, and information on genetic mutations due to chemicals.
These examples illustrate how the use of information technology can confront environmental
problems perhaps faster than through other techniques and with a sound basis in science.
Application of Advanced Information Technology to Promote, Educate, and
Address Environmental Concerns
Ms. Ramona Trovato, with EPA's Office of Environmental Information, discussed EPA use of
information technology to manage incoming information and to use that information to make
good decisions. EPA has collected information for diverse programs for a long time.
Approximately half of EPA's $7.5 billion budget goes to states to carry out the environmental
programs, which in turn requires information to come back to the EPA. There is also the need to
share information with the USGS, NOAA, and other partners.
Key issues are the need to obtain and disseminate timely and quality data for EPA and its
partners, the need to disseminate data to the public in an easily understandable and usable form
to help make decisions and to get involved in issues; and the need to use the collected
information in sophisticated ways to better understand the environment.
The National Environmental Exchange Network is a recent initiative that is a different way of
sharing data via a standards-based, secure exchange environment. There are many legacy
databases, and much of the data may be old with data updates occurring as the states are able or
willing to do so. The new system enables rapid data access and looks across programs rather
than following the traditional statutory "stove pipes." The overall goal is to exchange more
information across more users by improving data quality and reproducibility, reducing the
burden for all partners, improving public and regulator access to data, ensuring data stewardship,
and improving the timeliness of quality data availability.
Approximately 40 states are currently participating in this network, which primarily involves air
data. The network includes data standards for data elements such as latitude and longitude, and
also includes metadata (critical associated parameters) that help modelers and others with data
analysis and in understanding data anomalies. The network will include partner network access
nodes as well as data exchange templates to facilitate data sharing and more frequent data
updates.
EPA also disseminates data to the public via its website, which is being modified to facilitate
searches on topical information. Pages are being added to the website to focus on topics such as
the Mercury Portal Project and these will form the basis for more in-depth topical searching.
Another change will be the addition of the ability to integrate, organize, and analyze information
from EPA, USGS, and others for geographic location(s) of interest to the user. This will enable
the public to look at their particular area of interest to see what is occurring, view aerial
photographs, and access information on water, air, and waste. Envirofacts is another tool for
information dissemination that is focused more on the scientists and data analysts.
EPA is also using information technology to analyze and use data. An example is the use of high
performance computing (i.e., supercomputers) that enable more variables and conditions to be
addressed and facilitate data visualization. An example is the Cyberlung Project in which EPA
developed a model of the lung and how it works, and uses computational fluid dynamics to
understand how the lung takes in and processes PM. This will contribute to greater
understanding of lung diseases such as asthma, which is the leading cause of children's absence
from school. Another example is a three-dimensional physical model of Manhattan built to
better understand human exposure to urban microenvironments from the aftermath of the
terrorist attacks on the WTC. High performance computing supports this effort and the
extrapolation of this understanding to other cities.
A final example is the EPA Indicators Initiative supporting the State of Environment Report.
The purpose is to answer questions about the environment and health by looking at air, water,
land, and ecosystems to define environmental indicators that in turn will help EPA to define the
issues and prioritize funding - a clear example of using scientific knowledge to help resolve
issues.
Monitoring Stressors to Human and Ecosystem Health from Space
Mr. David Williams, with ORD, discussed the use of remote sensing data acquired from
satellites to support EPA and global climate change issues as well as atmospheric scientists and
modelers. A number of space-based satellites exist today that are already collecting data that can
be used for determining air quality, understanding landscape change, measuring atmospheric
constituents, and monitoring natural and technological hazards (oil spills, fires). There are over
1,800 active earth orbiting satellites. While most are for communications or GPS, 56 have earth
observing remote sensing systems of which a few have high resolution reconnaissance
technologies or all weather capable imaging radar sensors.
Measurements obtainable from these satellites include earth surface imagery from 0.6 meters to 1
kilometer resolution, infrared to thermal energy, passive and active microwave, atmospheric
constituents (ozone, methane, aerosols), and the ocean. Examples of the use of such remote
sensing data include landscape change such as the urbanization of the Las Vegas valley over
time, which enables assessment of impacts on the ecosystem as well as health impacts of
urbanization. City light data can be used to map human populations; while the United States has
extensive census data, population sizes in other global locations are not as well known and this
information is important to understanding growth. Thermal imagery can assist with geological
mapping since different rock types will show as different colors. In addition, passive microwave
sensors assist with rainfall mapping, which in turn helps us to understand rainfall intensity.
Such remote sensing data imagery is a tool for scientific research and data analysis including
human and ecosystem health monitoring. However, a satellite image is an array of numbers and
computational techniques can analyze those underlying data for a variety of purposes. Possible
examples include:
• Impacts of rangeland on water quality in arid environments using satellite imagery to detect
vegetation changes over time and to develop a monitoring plan by combining climate data
and land use information
• Detection of invasive species drawing on differences in how plant species "green up" during
the year
• Effects of landscape imperviousness on stream biology using satellite maps of land cover,
determining impervious areas (estimation of percentage not available from traditional maps),
and combining this with water quality, hydrographic, and topographic data
• Monitoring natural and technological disasters by providing daily imagery from multiple
satellites to response teams, or combining the imagery with population data to map regions
where people have been exposed to hazardous materials
• Air quality monitoring using satellite observations (for daily aerosol optical depth
measurements) in conjunction with web-based maps to show regions of human exposure to
unhealthy levels of pollution.
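The "array of numbers" point above can be sketched with a toy change-detection example (all pixel values invented): a normalized difference vegetation index (NDVI) computed from red and near-infrared bands flags pixels where vegetation was lost between two dates, the kind of signal behind the vegetation-change and urbanization applications listed above.

```python
# Toy satellite change detection on hypothetical 2x2 "images" of
# (near-infrared, red) reflectance for the same area in two years.

def ndvi(nir, red):
    """Normalized Difference Vegetation Index for one pixel."""
    return (nir - red) / (nir + red)

year_1 = [[(0.50, 0.10), (0.48, 0.12)],
          [(0.45, 0.11), (0.52, 0.09)]]
year_2 = [[(0.20, 0.18), (0.47, 0.12)],
          [(0.19, 0.17), (0.51, 0.10)]]

def change_mask(img_a, img_b, drop=0.3):
    """Mark pixels whose NDVI fell by more than `drop` between dates --
    possible vegetation loss (e.g., urbanization)."""
    return [[ndvi(*a) - ndvi(*b) > drop
             for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(img_a, img_b)]

print(change_mask(year_1, year_2))
```

Operational analyses run the same arithmetic over millions of pixels and combine the result with climate and land-use layers, as the bullets above describe.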
Jim Szykman, with OAQPS, discussed current research efforts in conjunction with the NASA
Langley Research Center to incorporate data into remote sensing modeling. This example
involved a data fusion demonstration to bring together two data sets for September 2000 - one
from sensors on satellites and one that is state/local to measure aerosols - in order to relate
satellite data to ground level PM2.5 concentrations. This effort combined hourly PM2.5
measurements from ground level monitors, daily optical depth imagery (for total loading of
aerosols within the atmosphere) from a polar orbiting satellite, and daily cloud optical thickness
(from satellite). The overlay of these data and their animation over time visualized the transport
of the change in PM2.5 that moved from the Midwest into the Texas/Louisiana/Gulf of Mexico
area and demonstrated the correlation of the satellite and ground data, while providing additional
context to understand what is occurring.
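A toy version of this fusion step (all values invented) pairs daily-averaged ground PM2.5 with the satellite's daily aerosol optical depth (AOD) and checks how strongly the two track each other:

```python
# Hypothetical data fusion sketch: average hourly ground PM2.5 to daily
# means, pair with daily satellite AOD, and compute their correlation.
from math import sqrt

hourly_pm25 = {  # day -> hourly ug/m3 readings from a ground monitor
    "2000-09-01": [10, 12, 14, 12],
    "2000-09-02": [30, 34, 32, 36],
    "2000-09-03": [18, 20, 22, 20],
}
daily_aod = {"2000-09-01": 0.10, "2000-09-02": 0.42, "2000-09-03": 0.21}

days = sorted(hourly_pm25)
pm = [sum(hourly_pm25[d]) / len(hourly_pm25[d]) for d in days]
aod = [daily_aod[d] for d in days]

def pearson(a, b):
    """Pearson correlation of two paired series."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = sqrt(sum((x - ma) ** 2 for x in a)
               * sum((y - mb) ** 2 for y in b))
    return num / den

print(round(pearson(pm, aod), 3))
```

A high correlation in such a pairing is what lets satellite optical depth stand in for ground-level PM2.5 where no monitor exists.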
ASPECT: Protecting Americans Through Rapid Detection of Atmospheric
Contaminants
Dr. Mark Thomas, with EPA Region 7, presented an airborne technology in use by EPA to assist
with emergency response to incidents involving chemical releases. EPA is tasked with providing
a very rapid chemical detection capability to chemical emergencies as defined in the National
Contingency Plan and is developing tools to assist with this mandate. From the perspective of an
OSC, emergency situations require direct integration with the local incident commander, near-
real-time collection of data, aerial photography capability (to visualize the entire situation), basic
telemetry to transmit information, and the ability to detect chemical plumes coupled with
automatic data processing that does not require sophisticated scientific technology. EPA
developed such a capability drawing on a standoff battlefield chemical detector originally built
by the Army that evolved into the Airborne Spectral Photographic Environmental Collection
Technology (ASPECT). This system has assisted with EPA response to 12 incidents to date
including the Winter Olympic games and the recent space shuttle disaster (to look for
monomethylhydrazine).
The ASPECT system consists of a very stiff, high wing aircraft equipped with two primary
sensors: an infrared line scanner to image a plume and an airborne Bomem high speed
spectrometer. These systems are networked together in conjunction with a variety of GPS feeds
to provide spatial coordination. ASPECT also includes high-resolution aerial photography,
videography, and a special link to transmit information between the aircraft and the ground using
a telemetry unit that can be parachuted to a ground team to set up and operate with a 2-mile
coverage using wireless Ethernet. The unique infrared system is designed to obtain four sweeps
of the ground through one rotation with two calibration points for each sweep.
Gyroscopic response measured during flight is used to make corrections to the information. Data
processing includes radiometric calibration and correction capability for pitch/roll, yaw, band
overlay, vibration (jitter), and geo-rectification (to show north at the top of the image and for
compatibility with GIS packages).
To enable faster access to spectrometer data, signal processing approaches developed for radar
systems are used to create a band pass filter to look only at the analytes of interest using
information collected from the interferogram stage rather than continue the data processing from
the interferogram using Fourier transforms. This approach eliminates background interferences
such as ozone.
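A sketch of that idea (synthetic signal; not ASPECT's actual algorithm): correlate the raw interferogram against in-phase and quadrature references at just the analyte's frequency. This single-band filter recovers the analyte amplitude without computing the full spectrum, so an off-band interference such as a hypothetical "ozone" line contributes nothing to the response.

```python
# Single-band filtering of a synthetic interferogram: hypothetical
# frequencies and amplitudes chosen for illustration only.
from math import cos, sin, pi, sqrt

N = 256
analyte_f, ozone_f = 20, 55   # cycles per record (invented)

def tone(freq, amp=1.0):
    return [amp * cos(2 * pi * freq * n / N) for n in range(N)]

# Interferogram containing a weak analyte line plus a strong ozone line
interferogram = [a + o for a, o in zip(tone(analyte_f, 0.3),
                                       tone(ozone_f, 2.0))]

def band_response(signal, freq):
    """Correlate the signal with in-phase/quadrature references at one
    frequency -- a single-band filter, no full Fourier transform."""
    c = sum(x * cos(2 * pi * freq * n / N) for n, x in enumerate(signal))
    s = sum(x * sin(2 * pi * freq * n / N) for n, x in enumerate(signal))
    return 2 * sqrt(c * c + s * s) / N

print(round(band_response(interferogram, analyte_f), 2))  # ~0.3
print(round(band_response(interferogram, ozone_f), 2))    # ~2.0
```

Because the references are orthogonal to every other whole-cycle frequency, the analyte's 0.3 amplitude is recovered cleanly despite the much stronger interferent.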
The ASPECT approach is designed for addressing larger volume chemical spills rather than
parts-per-billion air releases from landfill caps. Ongoing ASPECT enhancements include sensor
upgrades to provide better throughput, expansion of the compound library, aerial photography
enhancements, and oil and radiation detection capability.
Simulation and Visualization of the Smoke/Dust Plume from the World Trade
Center
NERL scientist, Dr. Steven Perry, discussed laboratory simulation of smoke/dust plumes and air
pollutant transport from the collapse of the WTC in New York City. The purpose of this
simulation is to characterize the temporal and spatial patterns of contaminant
concentration/deposition to support risk assessment of potential human exposure from this
several month long pollution event, and to improve understanding of pollutant pathways in urban
areas. Computer simulations are necessary in these situations because of the complex nature of
the urban environment and laboratory efforts are necessary to validate such models. In turn, an
urban computer modeling tool can support improved air quality estimates from routine emissions
such as from urban traffic, risk assessment, and emergency response.
The laboratory simulation involved the creation of a three-dimensional digital representation of
the WTC and surrounding buildings in the Lower Manhattan area of New York City using data
collected prior to September 11, 2001. A time series computational fluid dynamics simulation of
the WTC north tower collapse was conducted to show the change in the pollutant plume flows
over time (during and after collapse). Comparison of the dispersion model simulations with
satellite imagery of the WTC plume showed many similarities.
A number of studies demonstrate the influence of urban structures on dispersion model results.
An EPA computational fluid dynamics simulation of carbon monoxide from vehicle traffic in
Manhattan showed complex mixing and differences from street to street. Los Alamos National
Laboratory (LANL) modeled a pollutant release in Portland, Oregon, that considered both the
presence and absence of buildings in the vicinity; the presence of the buildings resulted in
significant changes in pollutant plume flow and contaminant distribution. Another LANL
simulation involved the Urban 2000 experiment in Salt Lake City to determine plume transport
to help emergency responders identify safe areas; this simulation found that even after the source
was turned off, pockets of pollutants remained in downtown areas, and strong pockets of
pollutants remained elsewhere in the city even an hour or more after the "event."
To better evaluate these types of findings, EPA conducted a wind tunnel study of Lower
Manhattan in order to develop a controlled laboratory database that characterizes local flow and
pollutant dispersion regimes. Potential uses of such a database include computational fluid
dynamics model improvement and evaluation, characterization of urban concentration patterns
possibly as a source term for regional models, and development of emergency responder "rules
of thumb" and engineering approaches for quick response (e.g., quick response models).
This study involved the creation of a scale model of Lower Manhattan for use in a wind tunnel
for testing. The scale model was created from the digital building geometry data, satellite
photographs, and actual photographs to develop the geometry of the WTC rubble pile. Three
major components of the study included visible smoke visualization, flow characterization in
street canyons (measured velocity and turbulence using laser Doppler velocimeter), and tracer
concentration measurements in street canyons and above the city. The simulation for the first
component focused on the smoldering fires and fugitive dust that continued on the WTC site for
weeks after building collapse, and used both smoke visualization and lasers to see flow and
circulation. Preliminary results indicated that the remaining buildings caused the plume flow to
turn and eventually move up other streets in the opposite direction of the main plume flow,
which would affect occupants if such buildings have fresh air intakes on their rooftops (safe
buildings aspect).
These studies are underway and the results will be linked with air pollution monitoring and
regional scale modeling to develop potential human exposure patterns. This will serve as an
important component in EPA's urban scale and emergency response model development
programs.
Real-Time Monitoring and Communication of Air Quality
Mr. Timothy Hanley, with the EPA Office of Air and Radiation, discussed efforts to
communicate real-time air quality information to the public. Under EPA's Air Quality Index
(AQI) program, EPA provides daily reports on the AQI results for the previous day; these
include ozone, PM, carbon monoxide, SO2, and NOx. The AQI is important because it is the
single best tool to communicate air quality to the public for ozone, and soon for PM, via the
media. The AQI links air quality levels and a health message, increases the role of state and local
agencies, promotes voluntary forecasting, and is a useful public service. The AQI has color
codes ranging from green to red (good to very unhealthy, respectively) that focus on air quality
and health implications, and includes separate cautionary statements for ozone and PM in each
category; ozone cautions distinguish between indoor and outdoor considerations whereas PM
cautions do not. Each person is affected differently by the air quality. Individual effects can
occur even if air quality levels are in the moderate category. As a result, the system has been
enhanced to include cautionary statements for that category as well.
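The category mapping described above follows from EPA's piecewise-linear AQI formula, in which a measured concentration is interpolated between category breakpoints. A minimal sketch for 8-hour ozone follows; the breakpoint numbers here are illustrative placeholders, not the official tables.

```python
# Sketch of the AQI interpolation: a concentration falling in a breakpoint
# band maps linearly onto that band's index range.
# NOTE: breakpoint values below are illustrative, not EPA's official tables.

BREAKPOINTS_O3_8HR = [  # (conc_lo, conc_hi, aqi_lo, aqi_hi, category)
    (0.000, 0.064,   0,  50, "Good (green)"),
    (0.065, 0.084,  51, 100, "Moderate (yellow)"),
    (0.085, 0.104, 101, 150, "Unhealthy for Sensitive Groups (orange)"),
    (0.105, 0.124, 151, 200, "Unhealthy (red)"),
]

def aqi(conc_ppm):
    """Map an 8-hour ozone concentration (ppm) to an (index, category) pair."""
    for c_lo, c_hi, i_lo, i_hi, category in BREAKPOINTS_O3_8HR:
        if c_lo <= conc_ppm <= c_hi:
            index = (i_hi - i_lo) / (c_hi - c_lo) * (conc_ppm - c_lo) + i_lo
            return round(index), category
    raise ValueError("concentration outside illustrative table")
```

For example, `aqi(0.075)` lands in the Moderate band, the category for which the cautionary statements mentioned above were added.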
AIRNow is a voluntary program that provides a national vehicle for exchange of real-time air
quality data and forecasts using the AQI. The initial focus was on ozone and is now moving to
include fine PM (PM2.5) and may soon include PM10 data, which are important for forecasters
to predict the next day's AQI.
AIRNow is a cooperative effort between EPA, NPS, state, and local air agencies to collect,
quality assure, and transfer real-time and forecast air quality information to the public. This is
intended to provide the public with fast and easy access to understandable air quality information
to assist individuals in making health-based decisions about daily activities.
Visualization is an important tool for forecasting air pollution. The most important report is the
daily forecast because it affects how individuals might schedule their activities for the next day.
Data become available in the afternoon; scientists then make a prediction for the following day,
creating a forecast map that is sent out to the media as a public service. If enough large cities
participate, the big media services are willing to disseminate the information, and once one large
media organization begins publishing such data, the others will follow.
State, local, and other participating agencies gather the data from their monitoring networks and
transfer the data via dial-up modems or broadband telemetry systems to the AIRNow Data
Management Center. Quality control checks are performed using a variety of techniques and
mapping domains are produced on a defined schedule once a minimum number of states' data
are received. Current activities include more rapid transmission of data to the Data Management
Center, which will require some cities/states to invest in more modern transmission systems.
In 2002, AIRNow had nearly national coverage for real-time ozone mapping. In 2003, AIRNow
is moving towards overlaying fine PM (PM2.5) and ozone data with an educational/media
outreach program for PM2.5. Data are coming from many locations for nationwide coverage as
well as portions of Canada, particularly southern Ontario. In October 2003, EPA will launch a
national campaign to kick off year-round AQI reporting.
Longer-term goals include the development of a National Air Quality Forecast Model in
partnership with NOAA, providing web-based access to all data coming into the AIRNow
database, and providing stakeholder access (technical personnel rather than the general public) to
the AIRNow database. Supporting activities include the resolution of technical issues for real-time
reporting and mapping of PM2.5 (for example, averaging 3 to 6 hours of data rather than waiting
for 24-hour results), AQI forecasting activities and piloting efforts in 36 major cities in the
United States, and funding state/local government agencies to develop PM2.5 forecast tools.
AIRTomorrow is an initiative addressing forecast models to provide national coverage for both
urban and rural areas. This will be a long-range tool for use by experienced and inexperienced
forecasters that might be run as one of several forecast components in a large city but might be a
default forecast for smaller cities/areas. A visualization of PM2.5 concentrations was run for
August 2002 when a significant regional transport/stagnation pollution event resulted in high
levels of ozone, PM2.5, and other secondary pollutants. This visualization showed fairly good
correlation of resulting haze with that shown on an aerial photograph for the same time period.
Panel Discussion/Questions and Answers
The speakers had an opportunity to participate in a brief panel discussion drawing on questions
from the audience.
A brief question and answer period following the early afternoon sessions addressed a range of
topics. These included: (1) use of remote sensing information to aid in planning the national air
monitoring network and consideration of dead zones (e.g., areas with no monitoring); (2)
consideration of the use of airport visibility and other monitors in conjunction with remote
sensing to enhance estimates of air quality; (3) the relationship between the environmental
indicators project and the Report on the Environment (formerly entitled the State-of-the-
Environment Report); and (4) methods for enhanced, computer-based information exchange
including issues associated with firewalls, timeliness of future data accessibility, and ability to
distinguish between computer-based attacks and friendly user access.
A brief question and answer period also followed the later afternoon sessions and addressed a
range of topics. These included: (1) level of resolution for infrared imagery from the ASPECT
system; (2) viewpoints on use of unmanned vehicles for monitoring to support emergency
response activities; and (3) measuring success of AQI predictions, the level of public knowledge
about the AQI, and how individuals take action to change their activities to reduce exposures or
their contributions to air pollution during elevated levels.
Applying Biotechnology to Achieve Sustainable Environmental
Systems
Following opening remarks by Dr. Hugh McKinnon, Director of NRMRL, seven speakers
addressed various types of biotechnology, current research, and existing as well as potential
biotechnology applications. A panel discussion including an audience question and answer
period followed the presentations.
The Director of NRMRL, Dr. Hugh McKinnon, welcomed session attendees and provided a
general session overview.
Molecular Farming for Sustainable Chemistry
Executive Director of the Fraunhofer Center for Molecular Biotechnology, Dr. Barry Marrs,
discussed the use and implications of molecular farming, which is the use of plant biotechnology
to advance protein expression. The use of biotechnology to manufacture chemicals is advancing
faster than many realize and may bring significant changes to both the chemical industry and the
programs that ensure manufacturing is conducted safely. This represents the third wave of
biotechnology (industrial chemicals), with the first wave involving pharmaceuticals and the
second wave involving agricultural chemicals (currently in progress).
The industrial applications of biotechnology involve the use of industrial biocatalysis to
manufacture chemicals and materials. This is an attractive technology because its benefits
include cost-effectiveness (less expensive catalysts), cleanliness (less waste, therefore, less
expensive), and sophisticated chemistry. Challenges include the current limited availability of
enzymes for the purposes desired, the need to renew the enzymes (e.g., they have limited
lifetimes), and process development (a cross-disciplinary effort involving engineers and
biochemical scientists).
There are several powerful technology drivers that will accelerate the use of biocatalysis in the
chemical industry. First is finding better catalysts in nature through expression cloning and other
techniques; the vast majority of existing organisms cannot yet be cultivated and new technology
is developing to enable us to cultivate many more microbes than is currently possible. Second is
the use of directed evolution to improve upon catalysts already found in nature; nature designed
proteins to function within specific parameters (e.g., water, temperature) and to produce specific
proteins (enzymes) for specific functions. Current laboratory techniques are able to change the
enzyme produced; for example, to work faster. Third, low cost manufacturing through
development of large-scale production techniques; for example, molecular farming may be able
to make an enzyme for a portion of the cost of more traditional fermentation techniques.
The specialty chemical industry is experiencing the leading edge of this third wave of
biotechnology, which is a young science. Several examples were provided of specialty
chemicals produced using various biotechnology techniques. In one case, a comparison of the
chemical and biotechnology production of L-carnitine demonstrated significant reductions in the
quantities of waste to be managed as well as significant changes in the waste stream constituents.
Since biological proteins are designed by nature to create a specific product with few to no
byproducts, rather than a range of products, there is less waste and therefore lower associated
waste management costs.
This is a new, evolving technology and change comes rapidly. Countries and companies that do
not embrace the new technologies are vulnerable to replacement.
Industrial biocatalysis will have a smaller environmental footprint than conventional chemistry-
based manufacturing approaches. This may result in the need for more emphasis on ecology
(e.g., eutrophication) and less on toxicology, as well as different types of exposure analysis; for
example, the consequences of consuming a corn product containing an industrial enzyme, and
how to safely produce low-cost biocatalysts in field crops.
Bioremediation may also become more powerful, driving consideration of secondary
consequences to the environment, such as what happens to the biological agent once its "job" is done.
A major challenge is the lack of an ecological testing system, such as long-term ecological
testing parks, to support research in these new areas of consideration.
Directed evolution involves engineering an enzyme and engineering a plant to produce the
enzyme. An example is the use of oxidized guar to improve the wet strength of paper. Only one
enzyme in nature produces this compound, galactose oxidase. Since there are limited amounts in
nature, the enzyme is expensive. Using directed evolution, larger volumes could be produced
resulting in a dramatic price decrease. Directed evolution involves the selection of genes to
improve, creation of a library of variants (mutations), insertion of the gene library into an
expression vector, insertion of the library/vector into a bacteria to produce enzyme variants,
screening the resulting colonies for improvements in the properties of interest (e.g., more
productive), isolating the improved gene(s), and repeating the process until the desired improvement
is achieved. Multiple cycles can compound into a significant overall improvement in properties.
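The cycle just described amounts to an iterative search, which can be caricatured in a few lines of code. This is a hypothetical illustration: `mutate()` and the screening function stand in for the wet-lab steps of library construction and colony screening.

```python
import random

def mutate(gene, rate=0.01):
    """Create one variant by random point mutation (an error-prone-PCR analog)."""
    bases = "ACGT"
    return "".join(random.choice(bases) if random.random() < rate else b
                   for b in gene)

def evolve(parent, screen, library_size=100, rounds=5):
    """Repeat the directed-evolution cycle: build a variant library from the
    current best gene, screen every variant, and keep any improvement."""
    best, best_score = parent, screen(parent)
    for _ in range(rounds):
        library = [mutate(best) for _ in range(library_size)]
        for variant in library:
            score = screen(variant)
            if score > best_score:
                best, best_score = variant, score
    return best, best_score
```

Because only improvements are retained, the screened property can never regress across cycles, mirroring the select-and-repeat logic of the laboratory process.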
Molecular farming involves two techniques: transgenic plants (a method of direct expression of
foreign genes in plants) and plant virus vectors (a method for transient expression of foreign genes
in plants). The latter technique is faster (and therefore less expensive) than raising transgenic
plants, but the trait is not inherited and there are issues associated with virus transmission. These
techniques are most suitable for greenhouse environments using non-food crops rather than open
planting fields.
In summary, nature created an incredible array of enzymes that serve as catalysts. Tools of
directed evolution can quickly create new enzymes, including those suitable for industrial
applications. Molecular farming can produce enzymes at much lower cost than fermentation
techniques. Life cycle analyses for such plant techniques have not yet been performed and
would need to consider energy consumption as well as potential reductions in greenhouse gases
(plants consume CO2). Results are expected to be favorable as compared to fermentation.
EPA Biotechnology Program Overview
NHEERL Director, Dr. Lawrence Reiter, presented an overview of the EPA biotechnology
research program, the scientific framework that is shaping and driving the research, and research
approaches to address key scientific issues. Dr. Reiter also discussed the ORD research initiated
to address the growing use of agricultural biotechnology products and the EPA pesticide
program responsibilities to regulate them.
There has been a huge growth in acreage of genetically modified crops worldwide since 1996
with the number of countries involved in this effort doubling over this time period and the United
States continuing to be the major grower. EPA has a regulatory role in the area of genetically
modified crops through FIFRA, which establishes the requirements for EPA to review and
register all pesticides used in the United States including DNA used in plants for pesticidal
protection purposes. FIFRA includes an obligation to ensure that these pesticides will not pose
unreasonable risk of harm to human health and the environment. Many reports address modified
crops and three general recommendations form the basis for the EPA research program:
assessing allergenicity risks from genetically-modified crops, assessing the possibility for gene
transfer and ecological risks associated with such crops, and managing gene transfer and
resistance.
Food allergies are not uncommon and are more prevalent in children than adults. The novel
proteins introduced into food through bioengineering could be allergens. The research need is
for a rodent model to evaluate allergenicity of genetically-modified crops, explore susceptibility,
and understand age-related differences in allergy occurrence. Ongoing research is identifying
allergenic proteins in indoor mold, developing an animal model, and measuring the development
of the allergic antibody IgE and eosinophilic inflammation (the allergic response). Ultimately,
the model will be adapted to the study of food allergens and to identify the immunological
responses in the gastrointestinal system to understand mechanisms, potency, and vulnerability.
Areas of ecological risk research needs include understanding the likelihood and impact of gene
transfer, characterizing impacts of crops on non-target species, and determining/managing
pesticide resistance. Research to evaluate gene flow from genetically-modified crops includes the
potential for transfer of novel genetic material to non-target plants and whether gene transfer
confers traits onto non-target plants that change how the plant fits into an ecological system and
the potential for unintended ecological consequence. This involves the need to develop methods
to identify gene transfer, movement, and expression, which is occurring in the Corvallis, Oregon,
laboratory. Genomic techniques are being designed to study gene flow from transgenic plants
and to design molecular markers to detect transgenes in plants. The next step will be to identify
the target genes in the transgenic plants. Research efforts focus on canola and creeping bent
grass because they are grown in close proximity to a wild relative (therefore a potential exists for
gene transfer) and the modified plants are available for study. Greenhouse studies of the
modified plants and wild relatives will aid in understanding the biological and nonbiological
factors in gene flow, including evaluation of the fitness and ecological effects of crops and non-
crop hybrids on growth, seed production, and other factors affecting species survival and
diversity.
The goal of research to identify impacts on non-target organisms is to develop genetically-based
approaches to monitor non-target populations to determine if exposure has occurred and the
impact of such exposure. This will involve evaluation of sites near both traditional and Bt agro-
ecosystems (corn, cotton) to establish baseline measures for population structure followed by
monitoring over time to determine whether these structures are modified by the presence of the
Bt crop. A first step is to develop novel methods for sensitive exposure monitoring using gene
markers, development of genetic methods to evaluate population structure (size, density) and
gene flow, then measuring temporal patterns in population sizes and allele frequencies of
response genes to see if exposure occurred. Utility of the monitoring program will be assessed
after three years then turned over to the EPA Program Offices, Regions, and states for long-term
monitoring aspects.
The last area is resistance management. Resistance evolution in insects is common, and OPP
recognizes this concern and requires the development of resistance management plans to prolong
the usefulness of Bt crops, to avoid the need to return to broad-spectrum pesticides, and to avoid
rendering ineffective the Bt spray applications used in organic farming. For resistance management, EPA
requires implementation of a high dose, structured refuge strategy so that there is some portion of
the Bt crop field that is set aside to grow the non-Bt crop with enough dose in the modified
plants to kill the targeted insect. Resistance management is a model-based approach that is also
dependent on accurate biological information (such as local insect population sizes, dispersal
patterns, and mating patterns) collected using genetic markers. Models also depend on the
genetics of resistance (number and importance of genes), therefore laboratory studies will be
conducted to evaluate chromosomal distribution of resistant genes. Tools will also be used to
identify the emergence of resistance in the field in order to put in place corrective actions.
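The refuge logic above can be made concrete with a textbook single-locus model: resistant homozygotes survive on Bt plants, while heterozygotes and susceptibles survive only in the refuge. This is a teaching sketch with made-up parameter values, not EPA's or industry's actual resistance models.

```python
def next_allele_freq(q, refuge_fraction):
    """One generation of selection on resistance-allele frequency q,
    assuming random mating and a fully effective high dose (only rr
    genotypes survive on Bt plants; all genotypes survive in the refuge)."""
    f = refuge_fraction
    rr = q * q                  # resistant homozygotes: survive everywhere
    rs = 2 * q * (1 - q)        # heterozygotes: survive only in the refuge
    ss = (1 - q) ** 2           # susceptibles: survive only in the refuge
    mean_fitness = rr + f * (rs + ss)
    return (rr + f * rs / 2) / mean_fitness

def generations_to_resistance(q0=0.001, refuge_fraction=0.2, threshold=0.5):
    """Count generations until the resistance allele reaches threshold."""
    q, gens = q0, 0
    while q < threshold and gens < 10_000:
        q = next_allele_freq(q, refuge_fraction)
        gens += 1
    return gens
```

In this toy model a larger refuge markedly delays resistance, which is the qualitative behavior the structured-refuge requirement relies on.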
Regulatory Perspective on Bioengineered Crops
OPPTS/OPP scientist, Dr. Janet Andersen, discussed the role of science in the regulation of
biotechnology. Regulation of biotechnology products requires a strong scientific base, and
scientific research improves the regulation of such cutting-edge technology. For OPP, this draws
on scientific expertise from Program Office programs, research support from ORD, and outside
peer review by the FIFRA Scientific Advisory Panel.
Science supports regulation by identifying possible risks/benefits. OPP considered data
requirements, appropriate mitigation, the types of data and studies needed, scientifically-valid
protocols for conducting the studies, and standard protocols for conducting reviews, models, and
monitoring. The goal was to monitor for what is actually occurring in the field since
bioengineered crops are an important public policy issue worldwide. Decisionmaking based
upon the scientific data will vary from country to country due to cultural and environmental
differences. However, an international organization, CODEX, has adopted the United States'
approach to risk assessment of plant incorporated protectants and foods derived from
biotechnology, and this is a testament to EPA efforts to regulate these products. The final step is
to monitor conditions to see if the impacts are as expected, with risk and benefit assessments
refined as more information becomes available.
The Scientific Advisory Panel helped shape the EPA approach and to validate the scientific work
from the beginning in the 1980s regarding potential risks of pollen drift (an ecological rather
than worker issue) and the peer review in 1995 of the first risk assessment for a plant
incorporated protectant (Bt potato) to current activities involving non-target species and insect
resistance management. The purpose of this Panel is to advise on science with decisionmaking
on pesticide approval for use residing with the Program Office.
Early ORD research efforts were very important for setting the early regulatory approach. While
biotechnology research decreased in the early 1990s, the recent resurgence of research in this
area is promising. This partnership, between regulatory program offices and researchers, is
important to an effective program.
The role of science in regulatory decisionmaking was illustrated by two science issues: (1)
Monarch butterfly exposure and risk from Bt crops, and (2) insect resistance. The initial risk
assessment for Bt potato concluded that there were no significant effects to non-target butterflies
or their caterpillar precursors; subsequently, a letter to Nature presented research results
implying that the pollen from altered corn would kill Monarch butterfly caterpillars. Industry
and EPA immediately began research and scientific efforts (via USDA) that used science and the
scientific process to demonstrate that no significant exposure occurs. However, the impact on
regulation was to begin requiring the testing of a neutral relative of the target pest prior to
granting commercial product approval and led to ongoing efforts to evaluate non-target organism
testing for plant incorporated protectants.
As a second example, insect resistance management programs are only required for Bt crops.
Initial concerns for protecting naturally-occurring Bt evolved into concerns regarding crop
protection and the size of the refuge to plant. Answering these questions required exploration of
the basic biological mechanisms of insects such as the frequency of resistant alleles in
conjunction with modeling to predict years to resistance, with science improving methods to
monitor for resistance. The impact on regulation was the first-time involvement of entomology
researchers directly in the regulatory process.
Linking Strategic Environmental Monitoring to Risk Assessment of
Biotechnology Products with Plant Incorporated Protectants
Dr. Robert Frederick, with NCEA, discussed monitoring in the context of risk assessment, which
is at the heart of EPA's decisionmaking. NCEA focuses on how risk assessment is done, the
principles behind it, methods used, and how to prepare for what is coming in the future.
Monitoring is very directed, so it is important to understand what to monitor in order to set up a
successful program. Decisionmakers must understand what should be monitored, the reasons,
how monitoring is to be carried out, and the purpose for the collected data as well as
consequences and impacts of the decisions. All of these questions involve basic science.
Monitoring helps to improve the risk assessment process, provides information to
decisionmakers, identifies important scientific developments, and helps to build public
confidence. Yet, the public must understand that decisionmaking is not the endpoint and the
decisionmakers must understand that they will not always have all of the information they would
like.
The insect resistance management program for Bt cotton registration was cited as an example of
a targeted, well-defined monitoring program. The initial program underwent reassessment that
resulted in the inclusion of additional requirements for both Bt corn and cotton. This includes
evaluating the efficacy of the refuge strategy to determine if impacts correlate with modeling
results, determining if insect resistance is being delayed, and considering impacts on non-target
species.
The National Research Council has conducted many workshops and produced reports on these
issues, and has charged USDA and EPA to examine monitoring efforts in more depth. The
potential use of genomics in these efforts is very exciting and will improve the risk assessment
process in the future.
Challenges in monitoring include defining how much effort is sufficient, what to look for, how
long to continue the monitoring, resources, and the ability to gain useful information in a
timeframe that is useful to EPA. Therefore, monitoring methodologies should be aimed at
answering specific questions or concerns, must be appropriate and adequately targeted, consider
the cost of monitoring compared to the value of the information gained, and must strategically
define what information will be useful rather than collecting information solely for scientific
curiosity.
Current research needs to support monitoring and risk assessment include baseline data for
ecosystems and agro-ecosystems, fitness (defining and determinative characteristics), ecological
effects such as non-target impacts and consequences following gene flow, environmental
indicators, and mitigation technology. NCEA research is focusing on the agro-ecosystem
condition, specifically in-field and near-field situations. Areas of particular interest include
practical indicator(s) of change or impact, methods to address spatio-temporal issues, and
methods to accommodate natural biological variability. NCEA is working with interagency
panels and others to help develop this research program.
Remote Sensing for Bioengineered Crops
NRMRL scientist, Dr. John Glaser, discussed the potential applications of remote sensing to
support compliance monitoring and detection of the development of insect resistance in
bioengineered crops.
Remote sensing is used in agriculture to monitor crop state, crop condition, and landscape
characteristics. This information supports current applications in precision agriculture as well as
nutrient and pesticide management. An example aerial image demonstrated how this capability
can identify crop stress.
Since 1996, EPA has been involved with transgenic crops and their registration as pesticides.
Each registration activity requires development by the seed producer of specified information on
a variety of topics relating to the characteristics of the crops. New techniques are being
developed to address information gaps identified from analysis of these data. ORD held a series
of workshops in partnership with OPP to examine issues associated with pest simulation model
design/validation, monitoring/detection, resistance estimation and refuge consideration, and
remedial action strategies.
Bt corn is viewed as an environmental asset as a result of possible avoidance of pesticide
applications that affect human health and ecosystems. Therefore, the ability to sustain this crop
as long as possible is important. Risk management concerns for this crop include regulatory
success, damage analysis, and manageability of risks with resistance management (the delay or
prevention of adaptation) as a key consideration for sustainability.
Monitoring and surveillance are components of the Insect Resistance Management Plan for Bt
corn. The purpose of resistance monitoring is to understand susceptibility at a baseline level,
track changes in the frequency of resistance alleles, detect whether any resistant alleles develop into
control failures, and determine what to do in response to failures. The strategy put
forth by industry is to look at four distinct geographic areas of high Bt corn use and to look at
specific insects. This is the basis of an aggressive, ongoing monitoring approach in use today.
However, concerns have arisen as to whether the use of four limited geographic sections of the
Bt corn crop can really provide for resistance detection if insect infestations begin as local
phenomena. This in turn leads to consideration of remote sensing applications to aid in assessing
compliance, such as whether the grower is following requirements for refuges or as an early
warning system by examining crop stress (an indirect approach in that pests infest stressed crops
earlier than healthy crops).
Remote sensing sources include proximate sensing, aerial imagery, and satellite imagery
(multispectral, hyperspectral). Multispectral imagery involves band analysis with sensor
reception of specific bandwidths, while hyperspectral imagery focuses on distinct wavelengths.
These imagery data in turn are correlated with field survey information (i.e., ground truthing to
verify the image as received by the satellite), and then mapped for interpretation and analysis.
Use of remote sensing data from satellites requires understanding of the reflectance spectra since
the reaction is different for different objects and their constituents. Vegetation includes a limited
set of spectrally active compounds and their relative abundance can indicate vegetation
condition; radiant energy from leaves changes remarkably between healthy, sick, and dead
conditions. Vegetation structure also significantly influences reflectance. For example, leaf
pigments, such as chlorophyll, respond to visible light, and there is a distinct wavelength shift
from healthy green vegetation to stressed vegetation and to severely stressed (dying) vegetation
with yellow and red pigments becoming more prevalent later in the crop cycle (plant die off). In
addition, recent applications of satellite imagery have shown the ability to distinguish between
conventional corn and Bt corn using one of the chlorophyll infrared wavelengths, raising
questions as to whether the Bt corn is more photosynthetically prominent or more successful in
using the incident light than conventional corn.
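The reflectance contrasts described here are often condensed into a vegetation index such as NDVI, which combines near-infrared and red reflectance. The sketch below is generic, and the reflectance values are hypothetical, not data from the studies discussed.

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index: healthy green vegetation
    reflects strongly in the near-infrared and absorbs red light, so NDVI
    is high; stress and senescence pull it toward zero."""
    return (nir - red) / (nir + red)

# Hypothetical band reflectances for a healthy vs. a stressed canopy
healthy = ndvi(nir=0.50, red=0.08)
stressed = ndvi(nir=0.30, red=0.15)
```

Mapping such an index over a field highlights stressed patches, the kind of indirect early-warning signal discussed below.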
One potential application of this technique to refuge compliance monitoring is to acquire, ground
truth, and map remote sensing data for specific geographic locations to determine if the required
amount of refuge is actually in place. Candidate locations include areas of highest Bt corn crop
usage such as parts of Minnesota, Pennsylvania, and Kansas.
Another potential application of remote sensing is to use infrared imagery and visible light
imagery to detect pest infestation in a bioengineered crop. This approach uses indirect evidence
to infer the development of resistance from differences in the spectral crop signature with the
expectation that other types of crop impacts, such as herbivory, stress, and leaf senescence, can
be distinguished one from the other.
Environmentally-Benign Polymeric Packaging from Renewable Resources
Associate Professor with the Colorado School of Mines, Dr. John Dorgan, discussed the
applications of biotechnology to produce environmentally-benign plastics. Plastics production
has roughly doubled every 10 years, relies on oil as a precursor, and results in a product that is
predominantly for one time use and does not degrade in landfills. While there are many societal
benefits of plastics (protect food, sanitary medical applications, secure packaging), there are also
many sustainability issues given the geopolitical issues associated with oil, the production of
greenhouse gases such as CO2 from plastics production, and the ever-decreasing landfill capacity
worldwide.
Production of plastics and packaging materials (the single largest use of plastic) from biomass in
a biorefinery may realize some advantages such as environmental benefits (less toxic or less
volume waste streams), national security benefits (reduced dependence on foreign petroleum),
and rural economic growth (for raw materials). A biorefinery is analogous to a petroleum
refinery and would produce fuels and materials based on raw biomass such as crops and
agricultural byproducts.
Formidable technical challenges exist but advances in biotechnology and biochemical
engineering are moving this goal closer to reality. There are several emerging success stories,
including polylactides (PLA) produced by Dow and a polymer produced by DuPont. Key to
this is the ability to create a thermoplastic material that will break down in a landfill.
PLA has the lowest non-renewable energy content of many thermoplastic polymers. PLA is
produced from corn using a combination of chemical and biological processes, including wet
milling (to make corn starch and unrefined dextrose), fermentation (to generate lactic acid), and
reactive distillation (to produce monomer and polymer). While PLA was first developed in
1932, mass production did not occur until a continuous process for high lactide purity, reactive
distillation, was developed in the 1990s.
PLA is much like polystyrene and can go into many products. PLA can also be blended with
other plastics to improve strength and the blend is 90 percent biodegradable. Barriers to
acceptance include the need for test data for material properties and process evaluation as well as
acceptance to support manufacturing switch over from oil-based chemical production techniques
to combined biological and chemical techniques.
Current research efforts, many in partnership with EPA, are addressing the understanding of
PLA properties and improving manufacturing quality control. Efforts to date have been able to
reduce the QA/QC procedures to a single viscosity measurement to determine molecular weight,
which is a key factor. This research also developed a tool to predict viscosity as manufacturing
parameters, such as temperature, are changed; this can be used in planning for a change in a
production process to substitute a biodegradable plastic by comparing the curve in a graph of
viscosity versus shear flow for a given material with that of PLA.
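The single viscosity-to-molecular-weight mapping can be sketched with the Mark-Houwink
relation; the proceedings do not state that this particular relation was used, and the constants
K and a below are illustrative placeholders rather than measured values for PLA:

```python
# Sketch: estimating polymer molecular weight from a single intrinsic
# viscosity measurement via the Mark-Houwink relation [eta] = K * M**a.
# K and a are illustrative placeholders, not measured values for PLA.

def molecular_weight(intrinsic_viscosity, K=1.0e-4, a=0.73):
    """Invert the Mark-Houwink equation: M = ([eta] / K) ** (1 / a)."""
    return (intrinsic_viscosity / K) ** (1.0 / a)

# One viscosity reading maps to one molecular-weight estimate, which is
# why the QA/QC procedure can collapse to a single measurement.
eta = 0.85  # dL/g, hypothetical reading
print(f"estimated molecular weight: {molecular_weight(eta):.0f} g/mol")
```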
Future developments in this area include more biotechnology-agrotechnology collaboration to
produce new monomers to make biopolymers, development of nanocomposites to improve
biopolymers, creation of bioplastic blends using starch, and the use of future farms as
"macrobreweries" in which biorefineries evolve from a platform in plastics to fuels - the reverse
of the historical progression in petroleum refining.
Science-Based Opportunities for Interagency Interactions Through the USDA
Biotechnology Risk Assessment Research Grants Program
Dr. Deborah Hamernik, with the USDA, discussed a research grants program jointly
administered and funded by USDA organizations including the Cooperative State Research,
Education, and Extension Service, the Agricultural Research Service, and the United States
Forest Service (also under the USDA) to assist Federal regulatory agencies in making science-
based decisions about the safety of introducing genetically modified organisms into the
environment. This Biotechnology Risk Assessment Research Grants Program was authorized in
the 1990 Farm Bill and awards grants for extramural research, largely to land grant universities.
However, all public or private research or educational institutions in the United States are
eligible to compete for these funds. While Federal research laboratories are eligible, this
program awards only grants and not cooperative agreements.
Since 1992, approximately 100 grants totaling over $16 million have been awarded. From 1990
to 2002, a tax on biotechnology outlays funded this program. The 2002 Farm Bill increased this
tax resulting in program funding of approximately $3 million annually, and set forth several key
areas to address:
• Identify and develop appropriate management practices to minimize physical/biological risks
associated with genetically engineered animals, plants, and micro-organisms. Examples
include technology to reduce undesired spread of genetically engineered organisms,
modeling of management strategies, developing effective genetic containment strategies, and
identifying pests or pathogens that are developing resistance to transgenic resistance genes.
• Develop methods to monitor the dispersal of genetically-engineered animals, plants, and
micro-organisms. Examples include strategies for large-scale deployment of genetically
engineered organisms; the role of insects, birds, and other animals in distributing viable
transgenic seeds; and survivability profiles and/or fitness studies.
• Further knowledge of the characteristics of rates and methods of gene transfer between
engineered, wild, and agricultural organisms. Examples include potential for viral
recombination, impacts of gene flow, and fate/stability of genes introduced by outcrossing
into populations of nontransgenic organisms.
• Conduct environmental assessment research to compare relative impacts of genetically
modified organisms to other types of production systems. Examples include environmental
effects associated with changes necessary for optimal agricultural management of transgenic
crops, relative impacts of agricultural and forest management systems using transgenic
versus nontransgenic organisms on ecosystem biodiversity, and whether introduction of
transgenic organisms alters the impact of agriculture on the rural environment.
Other relevant areas of research include non-target effects and effects of genetically engineered
plants with "stacked" resistance genes or genes that confer broad resistance to insects and
diseases. Program efforts will also support conferences designed to bring together scientists,
regulators, and other stakeholders to review science-based data on risk assessment and
management of genetically modified organisms released to the environment. Areas not
supported by this research program include clinical trials, commercial product development,
product marketing strategies, food safety risk assessment, human/animal health effects, and
social/economic issues.
This program posts Requests for Applications on the USDA website to solicit investigator-
initiated research. Under this program, investigators design the projects and set the priorities. All
proposals undergo peer review by a panel of scientific experts and Federal regulatory agencies,
including the EPA, to help ensure that the funded research helps the regulatory process and
agency research needs. Review criteria include scientific merit, relevance to risk assessment/risk
management and Federal regulation of agricultural biotechnology, investigators, and institutional
capabilities. FY2002 research awards addressed a broad range of topics including gene flow,
recombinant fungus in soil, risk assessment on the use of fungal insecticides, and large-scale
ecological effects of herbicide tolerant crops on avian communities and reproduction. Abstracts
and progress reports are available on the website.
Panel Discussion/Question and Answer
The speakers had an opportunity to participate in a brief panel discussion drawing on questions
from the audience.
A brief question and answer period addressed a range of topics. One discussion area considered
additional issues associated with transgenic crops. These included: (1) methods for grower
education on transgenic crops including responsibilities of the companies providing the seed and
collaboration with county extension agents; (2) responsibilities for monitoring refuges to ensure
proper planting including third party telephone surveys, education programs, field surveys by
grower representatives, and company policies regarding sales to growers who refuse to comply
with refuge requirements; (3) the impacts of plant-manufactured enzymes on CO2 production and
on the overall chemical production process, noting that the change is largely confined to the front
end of manufacturing the catalyst (enzyme); (4) consideration of land cost, irrigation, fertilizer,
etc. in life cycle cost analysis for biopolymer production and use; (5) whether the energy cost is
negative for plant production as it is for food, which might render biotechnology commercially
nonviable, or whether commercial viability will be determined by the difference in the cost of
manufacture and the price at which the product can be sold; and (6) the status of current efforts
to determine if there are any soil changes resulting from transgenic crops and if the proteins of
interest do break down in the soil.
Another discussion area addressed the USDA grant process and interagency interactions. There
are often a large number of good research project proposals received in response to an RFA, but
there may be insufficient funding or the project may not fit within the RFA. A challenge is how to
pass good projects along to the grant processes of other government agencies. Potential
solutions included an interagency coordination effort initiated last fall by EPA for strategic
planning and ongoing discussions of opportunities to co-fund STAR grants by EPA and USDA.
A third discussion area considered gene flow and future research directions for nontarget species.
These included: (1) current research activities to understand baseline gene flow in order to
measure changes and to address non-governmental organization concerns that zero gene flow
should be the regulatory requirement; (2) use of genetic approaches to look at population
structure (density, size) and connectivity with subpopulations; (3) existence of genetic banks
developed by USDA for various crops to support gene flow studies; (4) questions about
transgene movement to nonmodified crops or sexually compatible species, which is a particular
concern in Europe where different crops are in much closer proximity than in the United States;
(5) differences in emphasis (European and American) on avoiding genetic pollution or
contamination versus prevention of gene flow with negative consequences; and (6) current
laboratory and in-field testing and population evaluations for biotechnology products involving
diverse nontarget species to determine positive and negative environmental impacts of these
products.
Applying Nanotechnology to Solve Environmental Problems
Following opening remarks by Dr. Jack Puzak, Acting Director of NCEA, six speakers addressed
the development and application of nanotechnologies. A panel discussion including an audience
question and answer period followed the presentations.
Acting Director of NCEA, Dr. Jack Puzak, welcomed session attendees and provided an
introduction to nanotechnology, which is the ability to work at the molecular level to create new
structures. An interagency initiative, the National Nanotechnology Initiative, began in 2001 and
doubled the government funding for nanotechnology research. EPA joined this effort in 2002
and also supports nanotechnology research through NCER's STAR grant and SBIR programs.
Last year, NCER issued over $16 million in grants under the first call for nanotechnology
research and is expected to issue another $5 million in grants later this year.
Nanotechnology may support revolutionary advances crossing all environmental areas from
pollution prevention, sensors, and waste treatment to remediation. Yet these technologies may
result in the release of new hazardous materials to the environment from their use or
manufacture. Research efforts are beginning to address these issues as well as today's concerns
and are enabling nanotechnology to be a vital new tool. Research planning efforts underway
include a session being held this week, with 50 participants from academia, industry, and
government, to develop guidelines for nanotechnology research.
Nanotechnology and the Environment: Keeping an Emerging Technology Green
Dr. Vicki Colvin, with the Center for Biological and Environmental Nanotechnology (CBEN),
defined nanotechnology and addressed the state of its development as well as potential future
applications. CBEN is one of six centers funded by the NSF for large groups of investigators.
The mission is to create sustainable nanotechnologies that improve human health and the
environment involving research in the basic sciences and engineering, partnerships with other
government and academic organizations, and education. CBEN focuses on three research areas:
biological engineering, nanostructures, and environmental engineering, including applications to
water treatment.
Nanotechnology is one of the biggest research investments (over $1 billion) in the United States
and is an emerging industry with many nanomaterials already in use today. Nanomaterials are
small and are typically developed for structural perfection with large surface areas available for
reaction, interaction, etc. The environmental applications and implications are only now
beginning to be assessed.
A research goal of CBEN is to develop nanomaterials to help solve environmental issues.
Research involves nanomaterials for environmental waste remediation and sensing as well as
proactive investigations of the impact of nanomaterials in the environment. Key to this research
is collaborative, cross-discipline communication and interaction.
A particular research focus is on membrane filter applications and enhanced performance.
Municipal use of membrane technologies for high efficiency removal of specific contaminants
has increased in the past 35 years. Such membranes offer cost-effective, high performance water
treatment, yet can be difficult to use given the pressures that must be generated for filtration.
These efforts involve the construction of membranes with nanoparticles using alumoxanes
(porous monoliths from alumoxanes) that have a highly uniform pore size, reject material above
30 nanometers, and form a ceramic useful in highly corrosive waste streams.
Existing challenges are the high pressures required for performance (>100 psi), fouling from
pore blocking, and ability of the filter to remediate and remove contaminants. Changes in
membrane architecture, by developing membranes with much more open structures, result in
lower pressures required for performance; in addition, reactive components can be included in
such structures to remediate the water stream as well.
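The uniform rejection behavior described above can be sketched as a simple size-exclusion
cutoff. The 30-nanometer cutoff is taken from the membranes discussed here; the particle
sizes in the feed are invented for illustration:

```python
# Sketch of size-exclusion filtering with a uniform pore-size cutoff:
# particles larger than the cutoff are rejected, smaller ones pass.
# The 30 nm cutoff follows the alumoxane membranes described above;
# the feed particle sizes are invented.

CUTOFF_NM = 30.0

def filter_stream(particle_sizes_nm, cutoff=CUTOFF_NM):
    """Split a feed into permeate (passes) and retentate (rejected)."""
    permeate = [d for d in particle_sizes_nm if d <= cutoff]
    retentate = [d for d in particle_sizes_nm if d > cutoff]
    return permeate, retentate

feed = [5.0, 12.0, 28.0, 45.0, 110.0]  # particle diameters in nm
permeate, retentate = filter_stream(feed)
print("passes filter:", permeate)
print("rejected:", retentate)
```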
The large surface areas found in nanoparticles provide large capacities for sorbing particles,
which led to the concept of generating materials that attract contaminants and effectively remove
them from solution. One example is the use of a magnetic removal process in which a
nanoparticle core and surface are engineered for different reasons - the core to support the
material that in turn supports the surface chemistry mode of action.
New technologies include both benefits and risks. Several past examples were presented of
technologies that achieved specific results yet had undesirable, unintended consequences; for
example, pesticides that improve crop yields yet are toxic to animals. Environmental
costs are expensive, are paid over the long-term, and can be social deal-breakers for new
technologies with the decisions on new technology use residing with the policymakers. The
economic realities of funding a new material without understanding the long-term liability are
significant and can include limited markets, recalls, and lawsuits.
Environmental and health risks associated with nanomaterial use are of concern recognizing that
it may not be possible to conduct a risk assessment on every nanomaterial developed. Exposure
issues include questions on the quantities involved, the effects from ingestion, and fate and
transport including how engineered nanoparticles may be naturally concentrated and degraded.
These are areas of CBEN research, as is bioaccumulation. Very little is known about the
toxicity of engineered nanomaterials, and many core toxicity issues may be different for these
types of very small particles developed to be perfect. The role of public policy will be to help
the public determine if these are safe.
The Future of the National Nanotechnology Initiative
NSF Senior Advisor, Dr. Mike Roco, discussed the development and implications of
nanotechnology as well as the creation and implementation of the National Nanotechnology
Initiative (NNI). Nanotechnology has not left any major field of science untouched worldwide
and involves increasing government and private sector investment. This is a promising
technology that is still in the exploratory phase with much emphasis on catalysts and computer
components.
The definition of nanotechnology varies around the world. In the United States, nanotechnology
involves work at the atomic, molecular, and supramolecular levels (scale measured in
nanometers) in order to understand and create materials, devices, and systems with
fundamentally new properties and functions because of their small structure. Areas of specific
interest include miniaturization, novel properties/phenomena/processes, and efficiencies in
manufacturing. Broad societal implications of this technology encompass improved
understanding of life and nature, new products, sustainable development, and improved
healthcare.
The history of nanotechnology begins over 1,000 years ago with the accidental discovery of
carbon black. However, most of the technology development effort has occurred since 1990
with isolated applications involving catalysts and composites. The first generation involved
passive nanostructures and this is evolving into a second generation of active nanostructures.
This is anticipated to be followed by a third generation of three-dimensional nanosystems with
heterogeneous nanocomponents, diverse assembling techniques, and nanoscale networking.
The NNI involves many Federal departments and independent agencies, with matching funds or
other investments by states beginning last year. NNI prepared and continues to prepare many
reports and evaluations of nanotechnology, its development, its potential future, and societal
implications. These documents, as well as a review of NNI, are available at
www.nano.gov. NNI also conducts a number of national and international workshops on various
application areas and research directions.
Key investment strategies are focusing on fundamental research and its transition to technical
innovation, addressing broad societal aspects, maintaining a long-term vision (over the next 20
years) of this technology's evolution, and preparing the nanotechnology workforce. All of these
initiatives involve partnerships for interdisciplinary and interagency collaboration.
Nanotechnology funding across Federal agencies has increased from $270 million in 2000 to
almost $600 million in 2002 with Federal agencies committing more in the future than NNI
originally anticipated. The number of research proposals is increasing faster than funding, and
the research is looking to more complex systems and applications.
Nanotechnology produces revolutionary technologies, products, and services. Growth areas
include materials, chemicals (catalysts), pharmaceuticals, and electronics. Emerging areas
include nanomedicine, energy conversion and storage, agriculture/food systems, molecular
architectures for manufacturing, realistic multiphenomena and multiscale simulations, and
environmental implications.
A September 2000 report noted the need to make social, ethical, and economic research studies a
priority to be able to communicate with the public and address unexpected consequences as well
as to develop a basic reference for interaction with the public. This in turn will support the
ability to take faster advantage of the benefits. At the international level, much well-funded and
vocal attention is being given to potential dangers. Lessons learned from
interacting with these organizations are the need to communicate what research has already been
conducted and its findings to address their potential issues of concern, to be conscientious in
addressing issues of all people, to be aware of the diverse public concerns, and to address the
serious issues. This points to the need for increased investment in societal, educational, and
environmental implications, and NNI is currently funding a number of research projects in these
areas.
Nanotechnology holds major, positive implications for the environment. For example,
nanotechnology provides a means for sustainable development through "exact" manufacturing
and methods to address current health and environmental issues using nanoscale sensors.
Key issues for 2003 and beyond are the need for coherent five- to ten-year programs (long-term
vision and investment), horizontal rather than vertical science and technology development to
spread the technology into different fields, and a research and development vision grounded in
those who will make the products.
Nanostructured Porous Silicon and Luminescent Polysiloles as Chemical
Sensors for Carcinogenic Chromium (VI) and Arsenic (V)
Dr. William Trogler, with the University of California-San Diego, discussed nanotechnology
applications of polysiloles to detect pollutants in aquatic systems. Polysiloles possess a chain of
silicon atoms surrounded by phenyl substituents, with a structure resembling a silicon wire with a
phenyl coating. Polysiloles are electroluminescent materials involving fluorescence rather than
phosphorescence. Chemicals that come into contact with the reactive silicon core change the
luminescence, which can be detected.
Polysiloles are easy to produce in a two-step process (as compared to other types of polymers
that require a 12-step process). The resulting polymer is photoluminescent, soluble in organic
solvents, and stable in air and water. These serve as excellent sensors for TNT with a detection
limit of approximately 1 part per trillion. The mechanism of detection is passive.
A catalytic dehydrocoupling method is used to produce polysiloles. Potential applications of this
redox coupling technique are to detect heavy metals such as hexavalent chromium (Cr+6) and
arsenic. Chromium analysis is commonly required for diverse EPA regulatory compliance
programs, and often relies on total chromium analyses when hexavalent chromium is the toxic
species of interest. Arsenate, the mobile form of arsenic in aerated water, is another toxic
species of even greater concern. Regulatory limits for both of these species are in the parts per
billion range. Inexpensive nanosensing elements could support wide application from remote
monitoring to process control to replace reliance on grab sampling followed by laboratory
analysis.
The luminescent polysiloles require treatment with amines in order to detect chromium. Early
research did not achieve the desired detection limit. Additional difficulties included undesirable
physical property changes that occur when adding water to an organometallic, which creates
colloids. By changing the technique from a polymer sensor to a nanoparticle sensor, the required
EPA detection limit could be met for hexavalent chromium. Efforts are underway to improve the
detection capability for arsenic.
These results led to consideration of organometallic colloids as sensors, termed quantum dot
sensing. The colloids scatter light in the visible range, but are also very luminescent. This is a
property that may warrant further exploration.
An advantage of nanotechnology is the ease of making modifications to molecules. Creation of
the aminosilole nanoparticles for hexavalent chromium detection improved selectivity for the
chromium even further because the molecular surface was loaded with more functionality. Thus,
this nanoparticle technology has the potential for use as a field test because changes in
luminosity that correlate to different hexavalent chromium concentrations are visible to the
human eye.
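A luminescence-based test of this kind is often interpreted with the Stern-Volmer quenching
relation, I0/I = 1 + Ksv[Q]; the presentation does not state that this model was used, and the
constant in the sketch below is hypothetical:

```python
# Sketch: mapping a luminescence quenching measurement to a Cr(VI)
# concentration via the Stern-Volmer relation I0/I = 1 + Ksv * [Q].
# K_SV is a hypothetical constant, not a value reported for
# aminosilole nanoparticle sensors.

K_SV = 2.0e4  # L/mol, hypothetical Stern-Volmer quenching constant

def quencher_concentration(i0, i, k_sv=K_SV):
    """Invert Stern-Volmer: [Q] = (I0/I - 1) / Ksv."""
    return (i0 / i - 1.0) / k_sv

# A halving of the luminescence intensity corresponds to [Q] = 1/Ksv.
conc = quencher_concentration(i0=100.0, i=50.0)
print(f"estimated Cr(VI) concentration: {conc:.2e} mol/L")
```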
Another example involved the use of a biomimetic approach to nanocoat a nanoparticle with
silica, which smoothed the surface of the material.
Nanoscale Biopolymers for Decontamination and Recycling of Heavy Metals
Dr. Wilfred Chen, with the University of California-Riverside, discussed the design, production,
and application of biopolymers to remove heavy metals from aqueous solutions and
environmental media. A challenge faced by industry is that conventional technologies can
significantly reduce heavy metal concentrations in waste streams, but secondary processes (such
as metal chelating polymers that are produced with toxic solvents) are often required to achieve
regulatory standards. The polymers are then removed by an ultrafiltration membrane, which is
energy intensive and subject to clogging. A potential solution is to develop metal-binding
materials that can be recovered by environmental stimuli.
Metal chelating biopolymers are based on biological building blocks such as amino acids. They
offer good control over composition and properties because they can be pre-programmed within
a DNA template, do not require chemical synthesis, can be produced economically in high
quantities by bacteria, and are environmentally friendly. An elastin biopolymer, for example,
is very simple with five peptides being the most frequently repeating units, is structurally similar
to the repeating elastomeric peptide sequence of the mammalian protein elastin, and undergoes a
reversible phase transition from water soluble forms into aggregates as temperature increases.
By creating both an elastin domain and a metal binding domain in the same molecule, the
properties can be controlled and the affinity to different metal species can be fine tuned.
Temperature is then used to remove captured metals from the polymer protein to regenerate the
material for reuse in metal removal.
Production involves preparation of a DNA template for the desired composition, then adding the
template to a bacterial cell, which in turn will generate the biopolymer through fermentation.
This process enables precise control of the length and composition as well as the metal binding
domain to produce customized polymers for the properties of interest. The desired transition
temperature can be obtained by controlling the length of the polymer and the metal binding
domain can be customized for specificity and capacity.
The transition property can be measured by monitoring turbidity changes with temperature. It is
possible to achieve a specific transition temperature within the range of 20 to 40 degrees
Centigrade by controlling the chain length and salt concentration. Regeneration is quite rapid,
and the transition temperature is sensitive to ionic strength.
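The turbidity measurement can be sketched as finding the temperature at which turbidity
crosses half its maximum; the data points below are invented to show the shape of the
transition, not actual results:

```python
# Sketch: locating an elastin biopolymer's transition temperature as the
# half-maximum point of a turbidity-versus-temperature curve. The data
# are invented for illustration, not experimental results.

def transition_temperature(temps_c, turbidity):
    """Return the temperature where turbidity first crosses half its maximum."""
    half = max(turbidity) / 2.0
    for i in range(1, len(temps_c)):
        a0, a1 = turbidity[i - 1], turbidity[i]
        if a0 < half <= a1:
            t0, t1 = temps_c[i - 1], temps_c[i]
            # linear interpolation between the bracketing points
            return t0 + (half - a0) * (t1 - t0) / (a1 - a0)
    raise ValueError("no transition found in the scanned range")

temps = [20, 24, 28, 32, 36, 40]             # degrees Centigrade
turb = [0.02, 0.05, 0.10, 0.60, 0.95, 1.00]  # relative turbidity
print(f"transition temperature ~ {transition_temperature(temps, turb):.1f} C")
```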
Experimental results using cadmium demonstrated that biopolymers do sequester metals from
solution. Research also addressed repeated regeneration and determined that the binding
capability remained fairly constant over four regeneration cycles (using acid for
regeneration). Current research involves injection of a biopolymer solution into cadmium-
contaminated soils and preliminary results indicate significant levels of cadmium removal.
First generation biopolymers as described above serve as simple metal binding domains. Nature
offers bacteria and other micro-organisms that can concentrate heavy metals. For example,
bacteria can produce an enzyme to change metallic mercury to another, much more volatile
form, yet a high affinity is required to respond to trace amounts of mercury. Research
activities are investigating whether biopolymers with such properties can be produced and
purified. Results indicate fairly consistent mercury binding capability across a pH range of 4 to
9, when most other binding proteins typically have a much narrower range. In addition, the
research demonstrates that the binding was very specific to mercury even in conditions where
zinc and cadmium were also present.
The ultimate goal is to develop an array of such metalloregulatory proteins to address many
heavy metals including arsenic and chromium. By tuning the elastin composition, differential
precipitation and recovery will be accomplished. It may be possible to design different transition
temperatures for different metals enabling differential metal removal coupled with protein
recovery.
Molecular-Dynamics Simulation of Forces Between Colloidal Nanoparticles
Dr. Kristen Fichthorn, with Pennsylvania State University, discussed new findings in molecular
dynamics from simulation of forces between colloidal nanoparticles, which are potential building
blocks for materials such as catalysts as well as optical, structural, and electronic materials. With
nanoparticles, changing a few atoms can change the action. This property can be used to create
specific patterns such as a hexagon or a square, yet it is difficult to assemble nanoparticles into
specific shapes or to disperse them, as there is a tendency toward aggregation in solution.
Dispersants added to solutions to prevent aggregation have been found to take up a much larger
molecular volume than found in conventional colloids. Nanoparticle behavior, and the forces
involved, is typically extrapolated from that of conventional colloids of microparticles; however,
current research findings indicate that there is a difference.
Molecular dynamics involves the following colloidal forces: van der Waals and electrostatic
forces (between micron-sized particles), solvation forces derived from ordering of molecules in a
solvent (which may be much stronger than previously thought), and depletion forces (entropic)
that occur in a mixture of different sizes of objects in a colloid. A major question to address
through theoretical and experimental research is how these forces work for colloidal
nanoparticles.
Molecular dynamics simulations model every molecule in a suspension and consider the force
between two nanoparticles separated by a specified distance. The simulation enables
understanding of the spatial location of each molecule to both better understand the forces and to
understand experimental data. EPA funding is supporting a large-scale parallel molecular
dynamics simulation in a Beowulf Cluster Cruncher using a Lennard-Jones liquid and both
solvophilic and solvophobic nanoparticles. Since molecular forces are sensitive to the shape and
size of the colloid, the research examines differences in behavior for various sized spheres as
well as cube-shaped nanoparticles. Two key questions are whether the van der Waals forces for
nanoparticles scale according to a specific formula relating to separation distance, and the role
that the solvent molecules have for colloidal nanoparticles.
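The Lennard-Jones liquid used in such simulations is defined by a standard pair potential; a
minimal sketch in reduced units follows, with epsilon = sigma = 1 as generic choices rather
than the parameters of the cited study:

```python
# Sketch: the Lennard-Jones pair interaction commonly used to model a
# solvent in molecular dynamics. Reduced units (epsilon = sigma = 1)
# are generic choices, not the parameters of the study described above.

def lj_potential(r, epsilon=1.0, sigma=1.0):
    """U(r) = 4*eps*((sigma/r)**12 - (sigma/r)**6)."""
    sr6 = (sigma / r) ** 6
    return 4.0 * epsilon * (sr6 * sr6 - sr6)

def lj_force(r, epsilon=1.0, sigma=1.0):
    """F(r) = -dU/dr = 24*eps*(2*(sigma/r)**12 - (sigma/r)**6)/r; > 0 is repulsive."""
    sr6 = (sigma / r) ** 6
    return 24.0 * epsilon * (2.0 * sr6 * sr6 - sr6) / r

# The potential minimum sits at r = 2**(1/6) * sigma, where the force vanishes
# and the well depth equals -epsilon.
r_min = 2.0 ** (1.0 / 6.0)
print(f"U(r_min) = {lj_potential(r_min):.3f}")
```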
Simulation findings indicate that current theories do not accurately describe van der Waals forces
for small particles. Findings also indicate that the solvation forces are comparable to van der
Waals forces and may have a significant impact in modeling as well as in understanding
nanoparticle colloids. Shape appears to have a significant influence on the solvation forces; for
example, solvation forces for nanoparticle cubes were estimated to be significantly higher than
for the nanoparticle spheres and were also stronger than the van der Waals forces. Simulation
results also indicated that solvophobic forces are much weaker than the solvophilic forces, the
solvent-solid interaction is greater than the solvent-solvent interaction, and the solid-solid
interaction is strongest.
Conclusions from simulation results to date are that current theories do not accurately describe
forces for small nanoparticles, solvation forces are important for colloidal nanoparticles, and
solvation forces are strongly dependent on particle size, shape, surface roughness, and particle-
solvent interactions.
Development of Nanocrystalline Zeolite Materials as Environmental Catalysts:
From Environmentally Benign Synthesis to Emission Abatement
Dr. Vicki Grassian, with the University of Iowa, discussed the manufacture of nanocrystal zeolite
materials, their properties, and potential applications. Environmental catalysis uses a catalyst to
make molecules in an environmentally benign manner. This minimizes the generation of
hazardous/toxic materials and their management, promotes waste minimization/reuse, and
supports pollution control. Of particular interest is the controlled synthesis and formation of
zeolite nanoparticles and nanostructures for catalysis and sensor technology.
Zeolites are well-known crystalline, aluminum silicate, nanoporous materials. Zeolite particle
sizes and aggregates are typically about 1000 nanometers in diameter and their nanoproperties
derive in part from the size of the internal cavities. There are both natural and synthetic zeolites
available with uses such as shape-selective catalysts, separation (through size exclusion),
adsorbent (drying agent), sensors, and ion exchange. Commercially available zeolites are crystal
aggregates, while commercial synthesis generates large particles.
Zeolite preparation via controlled synthesis (via hydrothermal or confined space methods) results
in smaller particle sizes with minimal aggregation. The hydrothermal method is the preferred
method for this research effort because of the ease of production and the shorter development
time. Changing the pH and temperature conditions enabled production of much smaller particle
sizes (to 40 nanometers) with additional modifications further reducing particle size to 10
nanometers. Techniques used to characterize the resulting particles are microscopy,
spectroscopy, and chemical and physical methods. Advantages of nanometer-sized zeolites
include: (1) more uniform and controlled size and site distribution, (2) ability to form dense,
uniform nanostructures (films), (3) optical transparency, (4) increased external surface area, and
(5) ease of adsorption and desorption.
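The increased external surface area noted in advantage (4) follows directly from geometry. As an illustration (the numbers below are simple sphere geometry, not figures from the talk), the external surface area per unit volume of a particle scales inversely with its diameter:

```python
# Illustrative geometry (not from the talk): for a sphere of diameter d,
# the surface-to-volume ratio is 6/d, so shrinking zeolite particles from
# ~1000 nm (commercial aggregates) to ~10 nm (the smallest size reported
# here) increases external surface area per unit volume ~100-fold.

def external_area_per_volume(d_nm: float) -> float:
    """Surface-to-volume ratio (1/nm) for a sphere of diameter d_nm."""
    return 6.0 / d_nm

commercial = external_area_per_volume(1000.0)  # typical commercial zeolite
nano = external_area_per_volume(10.0)          # smallest controlled-synthesis size
print(f"external-area gain: {nano / commercial:.0f}x")  # prints "external-area gain: 100x"
```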
156 EPA SCIENCE FORUM 2003 PROCEEDINGS
The nanostructures of interest in this research program are thin films and coatings made from
zeolite nanoparticles. Preparation of thin films via hydrosol evaporation resulted in significantly
increased optical transmission and a smoother film surface than that achieved with commercial
zeolite particles. Interest in optically transparent films includes potential applications for partial
hydrocarbon oxidation, an important chemical industry process. A major problem encountered
in this process is selectivity because the desired end products are more easily oxidized; therefore,
conversions are kept low. This technology may enable the conversions to be increased.
Experimental results found that as the zeolite layer became thicker, a specific chemical
conversion (p-xylene to p-tolualdehyde) decreased. Oxidation reactions of cyclohexane also
indicate greater conversion with higher quality optical materials.
Results of this research to date demonstrate that controlled synthesis of zeolites yields smaller
particles and that particle size can be tuned using reaction conditions. In addition,
nanocrystalline zeolites can be used to produce high quality nanostructure materials (e.g., thin
films and coatings) that are better than commercially available materials and have different
physicochemical properties. Future research efforts will focus on obtaining smaller particle sizes
and exploring applications in environmental catalysis.
Panel Discussion/Questions and Answers
The speakers had an opportunity to participate in a brief panel discussion drawing on questions
from the audience.
A brief question and answer period addressed a range of topics. One discussion area considered
identification of the most important nanotechnology application. These included: (1) diverse
potential applications of nanotechnology with particular emphasis on use as catalysts with high
efficiency for one product and broad environmental benefits through minimal generation of
byproducts and waste; (2) potential future emphasis on both selectivity and efficiency; (3) use in
sensor technology; (4) ability to support real-time calibration for field applications; (5)
integrating processes into a single material; (6) use of biomass for catalysts; (7) aid in
understanding boundary lines; and (8) the importance of understanding the fundamental aspects
of nano-sized structures, how they are built, and how they work in order to develop applications.
Another discussion area considered research questions, approaches, and future directions. These
included: (1) better understanding of nanoprocesses with an example being how PCB sorption in
sediments occurs; (2) understanding the physics of nanoparticles and how they assemble, to
facilitate future exploitation of such processes in combined natural/manmade systems; (3)
differences in research approaches between countries, with Japan focused on a single application,
the United States taking a broader approach, and Europe a more imaginative one; (4) the need for
international collaboration and information exchange; (5) integration of existing research such as
integration of nanofiltration with nanosensors for biofouling; (6) more inter-disciplinary
emphasis on future grant solicitations; (7) determination of research emphasis such as pollution
prevention or remediation as the technologies are different; (8) importance of green
manufacturing for nanomaterials; (9) consideration of research center opportunities and
approaches (e.g., multidisciplinary and multi-investigator groups) to overcome challenges faced
when only a few research programs attempt to combine expertise; (10) having the theoretical
research occur in tandem with experimental research; and (11) the need for funding mechanisms
in a period of decreasing grant budgets.
Appendix A: Meeting Agenda
EPA 2003 Science Forum: Partnering to Protect Human
Health and the Environment
May 5-7, 2003, Washington, DC
FINAL AGENDA
Note: The poster and exhibit rooms will be open from 8:00 AM to 7:00 PM.
7:00 AM - 8:45 AM Registration (Atrium Hall Lobby)
9:00 AM - 9:20 AM Plenary (Christine Todd Whitman, EPA Administrator)
9:20 AM - 9:40 AM Plenary (Dr. Paul Gilman, Science Advisor, EPA)
9:40 AM - 10:00 AM Plenary (Mr. Jimmy Palmer, Regional Administrator, EPA Region 4)
10:00 AM - 10:30 AM Plenary (Mr. James Connaughton, Chairman, White House Council
on Environmental Quality)
10:30 AM - 11:00 AM Overview (Dr. Kevin Teichman, Director, Office of Science Policy,
ORD, EPA)
11:00 AM - 1:00 PM Lunch
1:00 PM - 1:30 PM Plenary (Homeland Security - Dr. John Vitko, Director for the
Biological and Chemical Countermeasures Portfolio, Department of
Homeland Security)
1:30 PM - 2:15 PM Plenary (Moving Science Into Action - Mr. William G. Ross, Jr.,
Secretary, North Carolina Department of Environment and Natural
Resources; Mr. James Ransom, Director, Haudenosaunee
Environmental Task Force, Mohawk Nation of Akwesasne)
2:15 PM - 2:45 PM Plenary (Year of Water - Dr. Sylvia Earle, Marine Biologist and
Explorer-in-Residence, National Geographic Society)
2:45 PM - 3:15 PM Plenary (Emerging Technologies - Mr. David Rejeski, Director,
Foresight and Governance Project, Woodrow Wilson International
Center for Scholars)
3:15 PM - 3:45 PM Break
3:45 PM - 7:00 PM Poster Presentations/Reception/Awards Ceremony (Atrium Hall)
Note: The poster and exhibit rooms will be open from 8:00 AM to 5:00 PM.
7:00 AM - 8:30 AM Registration (Atrium Hall Lobby)
8:30 AM - 9:00 AM Plenary (Recap of Day 1 and Open of Day 2)
9:00 AM - 9:30 AM Plenary (Ms. Linda Fisher, EPA Deputy Administrator)
Advancing Science Through Environmental Monitoring and Assessment
Program (EMAP) Partnerships (continued)
National Coastal Assessment: Past, Present, and Future - Dr. Kevin
Summers, NHEERL, ORD, EPA
The Interactions of EMAP and SCCWRP: Help in the Past, Necessity for the
Future - Dr. Stephen Weisberg, Southern California Coastal Water Research Project
The Role of the National Coastal Assessment in Developing a Continuing
South Carolina Estuarine Monitoring Program - Dr. Robert Van Dolah, South
Carolina DNR
The Application of EMAP and REMAP in the EPA Regions - Ms. Darvene
Adams, EPA Region 2
Questions and Answers
Computational Toxicology: Bolstering the Environmental Protection Agency's
Mission - Dr. William Farland, ORD, EPA
Toxicogenomic Predictive Modeling - Dr. Donna Mendrick, Gene Logic, Inc.
EPA's Research Program on Computational Toxicology - Dr. Lawrence Reiter,
NHEERL, ORD, EPA
Novel Informatics and Pattern Recognition Tools for Computational
Toxicology - Dr. William Welsh, Robert Wood Johnson Medical School and The
UMDNJ Informatics Institute
Computational Toxicology and Genomics: The Next Wave of Drinking
Water Research - Dr. Douglas Wolf, NHEERL, ORD, EPA
Break
Applying Computational Toxicology to Solving Environmental Problems
(continued)
The Genomic Path From Exposure to Effects in Aquatic Ecosystems - Dr.
David Lattier, NERL, ORD, EPA
Structure Activity Tools for Assessing Pesticides and Toxic Substances -
Past, Present, and Future - Mr. Joseph Merenda, Jr., OSCP, OPPTS, EPA
NIEHS Toxicogenomics Centers: Model for Partnerships - Dr. Bennett Van
Real-Time Monitoring and Communication of Air Quality - Mr. Timothy
Hanley, OAR, EPA
Panel Discussion / Questions and Answers
Introduction - Dr. Fred Hauchman, NHEERL, ORD, EPA
EPA Studies of Endemic and Epidemic Waterborne Diseases - Dr. Rebecca
Calderon, NHEERL, ORD, EPA
Using Randomized Trials to Study Waterborne Pathogens Among
Susceptible Populations - Dr. Jack Colford, University of California, Berkeley
Maintaining Microbiological Quality of Drinking Water in the Distribution
System - Dr. Mark LeChevallier, American Water
Break
Moderator: Ms. Katie Flahive, OWOW, OW, EPA
The Science of Hypoxia - Dr. David Flemer, OST, OW, EPA
Mississippi Basin Implementation - Ms. Katie Flahive, OWOW, OW, EPA
Lunch
Moderators: Mr. Michael Slimak, NCEA, ORD, EPA and Ms. Marilyn Katz, OWOW,
OW, EPA
The Office of Water Perspective - Mr. G. Tracy Mehan, III, OW, EPA
Research in Support of the Coast Guard's Program to Prevent the
Introduction of Nonindigenous Species by Ships - Dr. Richard Everett, U.S.
Coast Guard
International Efforts to Address the Transfer of Invasive Species Via
Ballast Water - Ms. Kathy Hurld, OWOW, OW, EPA
A "Shocking" Solution to Controlling the Spread of Asian Carp into the
Great Lakes - Dr. Marc Tuchman, GLNPO, EPA
Environmental Perspectives on Invasive Species Control - Ms. Jacqueline
Savitz, Pollution Program, Oceana
Invasive Species and Pesticide Control Programs - Mr. Daniel Rosenblatt,
OPP, OPPTS, EPA
Break
Moderator: Ms. Roberta Baskin, Public Affairs TV (NOW with Bill Moyers)
Fluoridation: An Undefendable Practice - Dr. Paul Connett, St. Lawrence
University
TBA
Anthrax Response and Recovery: Applied Science & Technology, and
Future Needs - Keynote: Mr. Thomas Voltaggio, EPA Region 3
EPA's Homeland Security Research Program - Mr. E. Timothy Oppelt, ORD,
EPA
Secondary Aerosolization of Viable Bacillus Anthracis Spores in an Office
Environment - Dr. Chris Weis, NEIC, EPA
Break
Environmental Sampling of Bio-Aerosols - Mr. Mark Durno, EPA Region 5 and
Mr. Ken Martinez, NIOSH
Lunch
Fumigating Anthrax-Contaminated Sites: Building on Experience - Dr.
Dorothy Canter, OSWER, EPA
Clearance Determinations: Judging Remediation Success and Readiness
for Re-occupancy - Mr. Jack Kelly, EPA Region 3 and Mr. Matt Gillen, NIOSH
Break
The Hunt for Anthrax Decontamination Chemicals - Mr. Jeff Kempter, OPP,
OPPTS, EPA and Mr. Jeff Heimerman, TIO, OSWER, EPA
Laboratory Support for Evaluating Decontamination Technologies - Ms.
Rebecca Schultheiss, OPP, OPPTS, EPA
Efficacy Testing Science Issues and Follow-Up Research - Dr. Stephen
Tomasino, OPP, OPPTS, EPA
Working with Tribes: Cultural Values and Tribal Lifeways Inform Health
Assessments
Moderator: Mr. Thomas Baugh, EPA Region 4
Tribal Partnerships in Pesticide Management to Protect Human Health - Ms.
Sarah Ryan, Big Valley Rancheria
Establishing Self-Sufficiency in Alaska Native Communities to Minimize
Exposure to Environmental Contaminants - Ms. June Gologergen-Martin,
Alaska Community Action on Toxics
Bioaccumulative Toxics in Native American Shellfish - Ms. Jamie Donatuto
and Mr. Larry Campbell, Swinomish Indian Tribal Community
Break
Moderator: Ms. Pamela Russell, OEI, EPA
Introductory Remarks - Mr. Mike Flynn, OEI, EPA
Uses of Toxics Release Inventory Data - Ms. Gail Froiman, OEI, EPA
Integration of State and County Stream Monitoring Programs: A Maryland
Case Study - Mr. Wayne Davis, OEI, EPA and Dr. Ron Klauda, Maryland DNR and
Mr. Keith Van Ness, Montgomery County Department of the Environment
Effects of Urban Growth on Fish Assemblages in a North Carolina
Metropolitan Area, 1970-2000 - Dr. Jonathan Kennen, USGS and Ms. Ming
Chang, OEI, EPA
Dynamic Choropleth (DC) Maps - Dr. William P. Smith, OEI, EPA
Lunch
Regional Ecosystem Protection: What Does It Offer Our Future? - Mr. John
Perrecone, EPA Region 5 and Mr. Doug Norton, OW, EPA
Use of Geospatial Tools to Identify High Quality Midwest Ecosystems - Dr.
Mary White, EPA Region 5 and Mr. John Perrecone, EPA Region 5
Synoptic Model to Rank Wetland Ecosystems for 404 Permitting: An
Application of Regional Critical Ecosystems Protection - Ms. Brenda
Groskinsky, EPA Region 7
Southeastern Ecological Framework's GeoBook - Software for Mapping
Partnerships and Ecosystem Protection - Dr. John Richardson, EPA Region 4
The Mid-Atlantic Highlands Action Program: Transforming the Legacy - Mr.
Tom DeMoss, Mr. Randy Pomponio, and Ms. Jennifer Newland, Canaan Valley
Institute
Break
Introduction - Dr. John Bing-Canar, EPA Region 5
Introduction to Concepts and Tools - Mr. Brian Cooper, EPA Region 5
Initial Sample Design - Dr. John Kern, Kern Statistical Services, Inc.
Spatial Estimation - Dr. John Bing-Canar, EPA Region 5
Decision Analysis - Mr. Charles Roth, EPA Region 5
Introductions and Welcome - Dr. Hugh McKinnon, NRMRL, ORD, EPA
Molecular Farming for Sustainable Chemistry - Dr. Barry Marrs, Fraunhofer
Center for Molecular Biotechnology
EPA Biotechnology Program Overview - Dr. Lawrence Reiter, NHEERL, ORD,
EPA
Regulatory Perspective on Bio-Engineered Crops - Dr. Janet Andersen, OPP,
OPPTS, EPA
Linking Strategic Environmental Monitoring to Risk Assessment of
Biotechnology Products with Plant Incorporated Protectants (PIPs) - Dr.
Robert Frederick, NCEA, ORD, EPA
Break
Applying Biotechnology to Achieve Sustainable Environmental Systems
(continued)
Remote Sensing for Bio-Engineered Crops - Dr. John Glaser, NRMRL, ORD, EPA
Environmentally Benign Polymeric Packaging from Renewable Resources -
Dr. John Dorgan, Colorado School of Mines
Science-Based Opportunities for Inter-Agency Interactions through the
USDA Biotechnology Risk Assessment Research Grants Program - Dr.
Deborah Hamernik, US Department of Agriculture
Panel Discussion / Questions and Answers
Lunch
Introductions and Welcome - Dr. Jack Puzak, NCEA, ORD, EPA
Nanotechnology and the Environment: Keeping an Emerging Technology
Green - Dr. Vicki Colvin, Center for Biological and Environmental Nanotechnology,
Rice University
The Future of the National Nanotechnology Initiative - Dr. Mike Roco,
National Science Foundation
Nanostructured Porous Silicon and Luminescent Polysiloles as Chemical
Sensors for Carcinogenic Chromium (VI) and Arsenic (V) - Dr. William
Trogler, University of California, San Diego
Break
Nanoscale Biopolymers for Decontamination and Recycling of Heavy
Metals - Dr. Wilfred Chen, University of California, Riverside
Molecular-Dynamics Simulation of Forces Between Colloidal Nanoparticles
- Dr. Kristen Fichthorn, Pennsylvania State University
Development of Nanocrystalline Zeolite Materials as Environmental
Catalysts: From Environmentally Benign Synthesis to Emission Abatement
- Dr. Vicki Grassian, University of Iowa
Panel Discussion / Questions and Answers
Moderator: Mr. Kennard Potts, OWOW, OW, EPA
Assessing the Consequences of Global Change for Coral Reef Ecosystems -
Dr. Jordan West, NCEA, ORD, EPA
Biological Indices for Assessing Coral Reefs: UV Impacts - Dr. Richard Zepp,
NERL, ORD, EPA
Mr. William Swietlik, OST, OW, EPA
Development of a Coral Reef Index of Biotic Integrity - Dr. Stephen Jameson,
Coral Seas Inc.
Break
The Impacts of Urban Drainage Design on Aquatic Ecosystems in the
United States
Moderator: Mr. Jamal Kadri, OW, EPA
The Impacts of Urban Design on Aquatic Ecosystems in the U.S. - Ms. Diane
Regas, OWOW, OW, EPA, and Ms. Hye Yeong Kwan, Center for Watershed
Protection
Lunch
Indicators - Ms. Susan Jackson, OST, OW, EPA
Probability-Based Monitoring Design - Mr. Barry Burgan, OWOW, OW, EPA
Integration of Water Quality Data and Landscape Information - Ms. Denice
Wardrup, EPA Region 4
Break
Moderator: Mr. Joe Hall, OWOW, OW, EPA
Volunteer Monitoring: 10 Years of Progress, What's in the Future? - Ms.
Alice Mayio, OWOW, OW, EPA
Volunteer Monitoring: A Coastal Perspective - Mr. Joe Hall, OWOW, OW, EPA
Volunteer Monitoring: Ten Years of Progress - Ms. Kathleen Kutschenreuter,
OWOW, OW, EPA
Moderator: Ms. Kathy Jones, CEPPO, EPA
Security: The Business of Chemistry's Action - Mr. Marty Durbin, American
Chemistry Council
Homeland Security, Emergency Management, and a Water Utility - Mr. Paul
Bennett, New York City DEP
A Public Utility Manager's View of Our World Post 9/11/20O1 - Mr. Michael
Marcotte, D.C. Water and Sewer Authority
Mr. Gordon Smith, Sandia National Laboratories
Ms. Janet Pawlukiewicz, WPTF, EPA
The EPA Safe Buildings Program - Dr. Nancy Adams, NHSRC, EPA
Break
Overview and Introduction: Moderator: Dr. Jafrul Hasan, OST, OW, EPA
Welcoming Remarks - Mr. Chris Zarba, NCER, ORD, EPA
NHSRC's Water Security Research and Technical Support Program - Mr.
Jonathan Herrmann, NHSRC, EPA
Ms. Grace Robiou, WPTF, EPA
Potential Technologies for Detection of Biological Threats in Water Supplies
- Dr. John Ezzell, US Army Medical Research Institute of Infectious Diseases
"Early Warning Monitoring" and Sensor Technology Development - Ms.
Janet Jensen, US Army Soldier and Biological Chemical Command
Dr. Alan Lindquist, NHSRC, EPA
List of Acronyms
ATSDR Agency for Toxic Substances and Disease Registry
CDC Centers for Disease Control
CEPPO Chemical Emergency Preparedness and Prevention Office
DC Dynamic Choropleth
DEP Department of Environmental Protection
DNR Department of Natural Resources
EMAP Environmental Monitoring and Assessment Program
EPA Environmental Protection Agency
GED Gulf Ecology Division
GLNPO Great Lakes National Program Office
NCEA National Center for Environmental Assessment
NEIC National Enforcement Investigation Center
NERL National Exposure Research Laboratory
NHEERL National Health and Environmental Effects Research Laboratory
NHSRC National Homeland Security Research Center
NIEHS National Institute of Environmental Health Sciences
NIOSH National Institute for Occupational Safety and Health
NOAA National Oceanic and Atmospheric Administration
NRMRL National Risk Management Research Laboratory
NSF National Science Foundation
OAQPS Office of Air Quality Planning and Standards
OAR Office of Air and Radiation
OEI Office of Environmental Information
OPEI Office of Policy, Economics, and Innovation
OPP Office of Pesticide Programs
OPPTS Office of Prevention, Pesticides, and Toxic Substances
ORD Office of Research and Development
OSCP Office of Science, Coordination and Policy
OST Office of Science and Technology
OSWER Office of Solid Waste and Emergency Response
OW Office of Water
OWOW Office of Wetlands, Oceans, and Watersheds
PIPs Plant Incorporated Protectants
R&D Research & Development
REMAP Regional Environmental Monitoring and Assessment Program
ReVA Regional Vulnerability Assessment
SCCWRP Southern California Coastal Water Research Project
SEQL Sustainable Environment for Quality of Life
TBA To Be Announced
TIO Technology Innovation Office
UC University of California
UMDNJ University of Medicine & Dentistry of New Jersey
USDA United States Department of Agriculture
USGS United States Geological Survey
WPTF Water Protection Task Force