U.S. ENVIRONMENTAL PROTECTION AGENCY
OFFICE OF INSPECTOR GENERAL
Hotline Report:
Improving EPA research programs
Operating efficiently and effectively
Management Alert:
EPA Should Promptly
Reassess Community Risk
Screening Tool
Report No. 17-P-0378	September 7, 2017

-------
Report Contributors:
Patrick Gilbride
Erin Barnes-Weaver
Monica Brym
Raul Adrian
Claire A. Brady
Abbreviations
CARE	Community Action for a Renewed Environment
C-FERST Community-Focused Exposure and Risk Screening Tool
EJSCREEN Environmental Justice Screening and Mapping Tool
EPA	U.S. Environmental Protection Agency
IT	Information Technology
NERL	National Exposure Research Laboratory
OEI	Office of Environmental Information
OIG	Office of Inspector General
ORD	Office of Research and Development
QAPP	Quality Assurance Project Plan
READ	Registry of EPA Applications and Databases
SHC	Sustainable and Healthy Communities
U.S.C.	United States Code
Cover photo: Image from the C-FERST "View Your Community" web page. (EPA photo)
Are you aware of fraud, waste or abuse in an
EPA program?
EPA Inspector General Hotline
1200 Pennsylvania Avenue, NW (2431T)
Washington, DC 20460
(888) 546-8740
(202) 566-2599 (fax)
OIG_Hotline@epa.gov
Learn more about our OIG Hotline.
EPA Office of Inspector General
1200 Pennsylvania Avenue, NW (2410T)
Washington, DC 20460
(202) 566-2391
www.epa.gov/oig
Subscribe to our Email Updates
Follow us on Twitter @EPAoig
Send us your Project Suggestions

-------
U.S. Environmental Protection Agency
Office of Inspector General
At a Glance
17-P-0378
September 7, 2017
Why We Did This Review
We received a hotline complaint
alleging concerns with the
U.S. Environmental Protection
Agency (EPA) Office of Research
and Development's (ORD's)
Community-Focused Exposure
and Risk Screening Tool
(C-FERST). To address the
hotline allegations, we evaluated
how ORD planned, developed
and implemented C-FERST.
C-FERST is an online information
and mapping tool, launched in
September 2016, that
communities and the public can
use to learn more about their
environmental issues and
exposures. According to ORD,
C-FERST is intended to serve a
broad range of users (e.g.,
general public, state/local risk
assessors, public health agencies
and environmental justice
coordinators). The purpose of this
management alert is not to raise
a health concern but rather to
timely notify the EPA so that it
can promptly act to better
manage its planned investment in
C-FERST to prevent waste.
This report addresses the
following:
•	Improving EPA research
programs.
•	Operating efficiently and
effectively.
Send all inquiries to our public
affairs office at (202) 566-2391
or visit www.epa.gov/oig.
Listing of OIG reports.
Management Alert: EPA Should Promptly
Reassess Community Risk Screening Tool
What We Found
Our review substantiates some hotline
allegations about C-FERST. We found that ORD
took 8 years to develop a tool that:
•	Is different from its intended purpose.
•	Did not have a project proposal or request
for its development.
•	Was outside the agency's information
technology requirements.
•	Overlaps with other EPA tools.
•	Was not widely used in the approximately
9 months after it was publicly released,
according to available user data.
C-FERST overlaps with other tools and is not yet widely used, putting its planned $400,000 yearly investment at risk of waste. Agency efforts to cut costs, streamline activities and avoid duplication compound the need for the EPA to promptly review C-FERST and similar tools.
ORD planned and designed C-FERST internally as a research tool—outside of
the agency's information technology monitoring and accountability
requirements—and altered the original purpose of the tool during development
without properly documenting this change. ORD also did not consider outcome
measures or possible joint governance with similar EPA tools.
Without proper accountability controls, ORD creates a risk that the estimated
$400,000 it plans to spend annually for maintenance, operation and
enhancements of C-FERST is wasteful government spending. Without metrics
to measure performance, it is unclear if C-FERST is being used for its intended
purpose or meets user needs. Further, having multiple agency mapping tools
that perform similar functions can confuse potential users.
Recommendations and Planned Agency Corrective Actions
We recommend that the Assistant Administrator for ORD review C-FERST and
develop an action plan to address issues identified, including whether to retain
the tool. If retained, we recommend that ORD develop performance metrics
and a user survey. We also recommend that ORD develop certain policies and
procedures, review new and existing ORD research tools to determine
applicability of the EPA's information technology requirements, and work with
agency offices responsible for other geospatial mapping tools to develop a
decision support matrix on when to use certain tools. ORD agreed with our
findings and recommendations and provided acceptable corrective actions and
estimated completion dates. The recommendations to ORD are resolved with
corrective actions pending.
We also recommend that the Deputy Administrator examine all of the EPA's
web-based risk screening and mapping tools to ensure the need for each tool.
This recommendation is unresolved.

-------
UNITED STATES ENVIRONMENTAL PROTECTION AGENCY
WASHINGTON, D.C. 20460
THE INSPECTOR GENERAL
September 7, 2017
MEMORANDUM
SUBJECT: Management Alert: EPA Should Promptly Reassess Community Risk Screening Tool
This is our report on the subject review conducted by the Office of Inspector General (OIG) of the
U.S. Environmental Protection Agency (EPA). The project number for this review was OPE-FY17-0006.
This report contains findings that describe the problems the OIG has identified and corrective actions the
OIG recommends. This report represents the opinion of the OIG and does not necessarily represent the
final EPA position. Final determinations on matters in this report will be made by EPA managers in
accordance with established audit resolution procedures.
Action Required
In accordance with EPA Manual 2750, the Office of Research and Development provided planned
corrective actions in response to our recommendations. All recommendations to the Office of Research
and Development are considered resolved. The Office of Research and Development is not required to
provide a written response to this final report because it provided agreed-to corrective actions and
planned completion dates for all report recommendations. The OIG may make periodic inquiries on the
progress in implementing these corrective actions. The Office of Research and Development should
update the EPA's Management Audit Tracking System as it completes planned corrective actions.
The recommendation to the Deputy Administrator is unresolved. In accordance with EPA Manual 2750,
appropriate OIG and Deputy Administrator staff will discuss resolution within 30 days of the final
report.
FROM:
Arthur A. Elkins Jr.
TO:
Mike Flynn, Acting Deputy Administrator
Dr. Robert Kavlock, Acting Assistant Administrator
Office of Research and Development
We will post this report to our website at www.epa.gov/oig.

-------
Table of Contents
Chapters
1	Introduction		1
Purpose		1
Background		1
Responsible Offices		4
Scope and Methodology		4
2	ORD Should Reassess C-FERST and
Complete Planned Actions to Mitigate Risks		6
C-FERST Does Not Calculate Cumulative Risks as Originally
Intended, and This Change Was Not Properly Documented		6
ORD Developed C-FERST Absent a Project Proposal or
Request for the Tool's Development		9
ORD Developed C-FERST as a Research Tool, Outside of
Requirements for an IT Application		10
C-FERST Overlaps With Other Agency Tools and
ORD Has Not Described C-FERST's Unique Components		11
C-FERST's Research Plan Did Not Include Outcome Measures		13
OIG Sampling Indicates Limited Use to Date		14
Planned ORD Actions Could Mitigate Some Risks		15
Conclusion		17
Recommendations		17
Agency Comments and OIG Evaluation		18
Status of Recommendations and Potential Monetary Benefits		19
Appendices
A Office of Research and Development's Written Comments	 20
B Office of the Administrator's Written Comments	 28
C Distribution	 30

-------
Chapter 1
Introduction
Purpose
The U.S. Environmental Protection Agency (EPA) Office of Inspector General
(OIG) received an anonymous hotline complaint alleging concerns with the
Office of Research and Development's (ORD's) Community-Focused Exposure
and Risk Screening Tool (C-FERST). The hotline complaint alleged, among
other concerns, that C-FERST:
•	Was poorly conceived from the beginning.
•	Was undertaken with little understanding of information technology (IT)
and design architecture protocols.
•	Is redundant with other EPA tools (specifically, the Environmental Justice
Screening and Mapping Tool, also known as EJSCREEN).
•	Overstates claims of usefulness in setting community priorities in response
to pollution exposures.
In response to the allegations, we evaluated how ORD planned, developed and
implemented C-FERST. The purpose of this management alert is not to raise a
human health or environmental concern but rather to provide timely notification
to the EPA so that it can promptly act to better manage its investment in
C-FERST to prevent waste.
Background
C-FERST is a web-based geospatial mapping tool developed by ORD to provide
users with access to resources to help identify and provide information about local
environmental health issues. The tool includes fact
sheets and information on 20-plus pollutants, and
structured guides that can help communities assess
local environmental conditions. C-FERST users are
intended to be able to explore exposure and risk
reduction options, and use the tool to plan
community-based projects that mitigate harmful
environmental exposures.
ORD started developing C-FERST around 2008 and launched the tool on the
EPA's public website in September 2016. ORD invested over $1.4 million during
fiscal years 2010-2016 on contractor support to develop both C-FERST and
Tribal-FERST.1 ORD's Sustainable and Healthy Communities national research
program approved C-FERST and its annual budget, and the National Exposure
Research Laboratory developed the tool. ORD staff time on C-FERST grew over
that time period, from
1.5 to 2.75 full-time equivalents per year.2 Ongoing annual costs are projected to
be nearly $400,000 for fiscal years 2017, 2018 and 2019, to include: maintenance,
updates and enhancements, outreach through training and technical assistance,
case studies on uses and needs, and integration/interoperability with other EPA
tools.
C-FERST Impetus
EPA staff involved in C-FERST's development described its impetus in early
journal articles, including the following:
[C-FERST] was designed to support communities' environmental
justice (EJ) efforts. This tool is being developed by the EPA's
Office of Research and Development in the National Exposure
Research Laboratory, which is conducting research to provide tools
that enhance community-based cumulative risk assessments. This
research responded to requests from the EPA's [Community
Action for a Renewed Environment] CARE program, the Office of
Environmental Justice, EPA regional offices, and communities
themselves, as well as recommendations from the National
Academy of Sciences, National Academy of Public
Administration, and other agencies.3
The EPA's former CARE program was
intended to empower communities to develop
and implement their own environmentally
focused projects. C-FERST was originally
envisioned as a tool that could walk
communities through the various stages of
the CARE Roadmap, a 10-step planning
process that communities can follow in
conjunction with C-FERST to learn about
environmental health risks and impacts, and
build local consensus, partnerships, and
community capacity sustainable in the long
run. C-FERST provides resources for each of
these steps.
CARE Roadmap Process
1) Build a partnership.
2) Identify community concerns.
3) Identify community vulnerabilities.
4) Identify community assets.
5) Identify concerns for immediate action.
6) Collect and organize information.
7) Rank risks and impacts.
8) Identify potential solutions.
9) Set priorities for action and begin work.
10) Evaluate results and become self-sustaining.
1	Tribal-FERST is a tool similar to C-FERST targeted at tribal groups; this tool has not yet launched.
2	These figures represent a portion of costs since the agency, with some exceptions, does not track EPA staff time by
project in the agency's official timekeeping system.
3	Zartarian et al. The Environmental Protection Agency's Community-Focused Exposure and Risk Screening Tool
(C-FERST) and Its Potential Use for Environmental Justice Efforts. American Journal of Public Health.
Supplement 1, 2011, Vol. 101, No. S1, page S286.

-------
C-FERST Pilot Testing and Peer Review
As a part of C-FERST's development, from 2010 to 2012, ORD conducted case
studies with two pilot communities in EPA Region 1 that wanted to identify and
prioritize environmental issues. The goals of these case studies included:
•	Providing CARE partners with useful information to identify and
prioritize issues.
•	Enhancing C-FERST usability.
•	Providing transferable, generalizable methods to identify and prioritize
issues.
•	Enhancing cumulative exposure and risk science to inform decision-
making.
The pilot communities considered such environmental issues as asthma rates,
air pollution, ambient and indoor air quality, lead, and water quality. The pilots
identified expectations of communities, current uses of C-FERST, and needed
modifications. Additionally, ORD provided 455 individuals with access to the
field-testing version of C-FERST from September 2011 to December 2014. These
individuals included EPA regional and program office representatives, state and
local governments, non-profit organizations, academic institutions, industry, and
consulting firms. From 2014 to 2016, the C-FERST team also worked closely
with EPA Region 10 to conduct trial tests with regional community users.
ORD conducted a contractor-led external letter peer review of C-FERST in
2013-2014, receiving a final peer review report in February 2014. A letter peer
review seeks individual written comments from independent experts. Each
reviewer evaluates the draft technical work product independently without
consultation from other peer reviewers. The EPA contractor selected four peer
reviewers based on a list of qualifications provided by ORD. The C-FERST peer
review charge included 19 questions under eight categories. Peer reviewers had
about 5 weeks to conduct their review. ORD summarized its responses to peer
reviewer comments in an internal report in April 2014.
EPA's Geospatial Platform and Tool Development
As noted above, C-FERST is a web-based geospatial mapping tool. The EPA's
geospatial platform helps people identify and describe environmental situations in
specific locations to understand local environmental health issues, and target areas
with high environmental risk (see Figure 1). This platform supports multiple tools
and ensures that there is a level of consistency across tools.

-------
Figure 1: C-FERST facility map for a portion of Washington, D.C., area
[Map legend: Major Water Discharger; Hazardous Waste (Active); Toxics Release Inventory (TRI); Impaired Waters with TMDLs; Total Max. Daily Loads Point, Line and Area]
Source: Map generated by the OIG using the C-FERST website.
The EPA maintains an application inventory of tools in its Registry of EPA
Applications and Databases (READ). This system, which tracks all information
resources across the agency, is maintained by the EPA's Office of Environmental
Information (OEI). ORD maintains a similar system called the ORD Application
Inventory. ORD's system provides a centralized repository of known IT assets
(systems, databases, models and decision-making tools) that have been planned,
developed/acquired or are currently under development to address specific
research questions and administrative requirements. ORD's inventory feeds into
READ and currently lists about 320 IT assets. C-FERST is one of the IT assets
listed in both READ and ORD's Application Inventory.
Responsible Offices
ORD—which includes the Office of Science Information Management, the
National Exposure Research Laboratory (NERL), and the Sustainable and Healthy
Communities (SHC) national research program—has primary responsibility for
the subjects covered in this review.
Scope and Methodology
We conducted our review from January to June 2017. We conducted this
performance audit in accordance with generally accepted government auditing
standards. Those standards require that we plan and perform our work to obtain
sufficient, appropriate evidence to provide a reasonable basis for our findings and
conclusions based on our audit objectives. We believe that the evidence obtained

-------
provides a reasonable basis for the findings and conclusions in this report based
on our audit objectives.
We analyzed numerous documents pertaining to research tool design, quality and
peer review, including policies, procedures and guidance documents. We also
reviewed journal articles on the impetus of C-FERST and contracts supporting its
development.
We interviewed key ORD staff and managers responsible for the C-FERST
concept/design, development, implementation and quality assurance. To compare
C-FERST with other EPA tools, we interviewed staff in other agency program
offices (e.g., Office of Policy and Office of Environmental Justice, on
EJSCREEN). To understand coordination of C-FERST and other tools on the
geospatial platform, we met with ORD's Office of Science Information
Management and OEI. We also interviewed all four C-FERST external peer
reviewers on the peer review process and their perspectives on the tool.
To obtain additional perspectives, we obtained information from ORD on
226 individuals—127 EPA and 99 non-EPA4—who either contacted the agency
through accessing the site, attended a C-FERST pilot training session, or
participated through a Region 10 partnership. We interviewed nearly 10 percent
of the EPA individuals and of each category of non-EPA individuals about their
experiences with C-FERST.5
4	The 99 non-EPA employees encompassed the following categories: 44 state/local government employees,
35 individuals from organizations, 14 individuals from academic institutions, five individuals with private emails or
missing contact information, and one individual from a non-EPA federal agency.
5	We interviewed 20 individuals representing the EPA and the largest categories, as follows: 12 EPA employees,
four representatives of state/local government, two representatives of organizations, and two representatives of
academic institutions. We refer to these individuals as "users" throughout our report.

-------
Chapter 2
ORD Should Reassess C-FERST and
Complete Planned Actions to Mitigate Risks
Our review substantiates some concerns raised in the hotline allegation about
C-FERST. We found that ORD took 8 years6 to develop a tool that:
•	Is different from its intended purpose.
•	Did not have a project proposal or request for its development.
•	Was outside the agency's information technology requirements.
•	Overlaps with other tools.
•	Was not widely used in the approximately 9 months after it was publicly
released, according to available user data.
ORD planned and designed C-FERST internally as a research tool—outside of the
agency's IT monitoring and accountability requirements—and altered the original
purpose of the tool during development without properly documenting this
change. ORD also did not consider outcome/performance measures or explore
ways to reduce overlap with similar tools through joint governance. Without
proper accountability controls, ORD cannot ensure that it will spend the estimated
$400,000 it plans to spend annually for maintenance, operation and enhancements
of C-FERST in a manner that prevents waste. Without metrics to measure
performance against goals, it is unclear if C-FERST is being used for its intended
purpose or meeting user needs. Further, having multiple agency tools that perform
similar functions can confuse potential users.
C-FERST Does Not Calculate Cumulative Risks as Originally Intended,
and This Change Was Not Properly Documented
When C-FERST was first conceived, it was
included as part of Long Term Goal 2 in the
ORD's Human Health Research Program
(the predecessor to the SHC national research
program) Multi-Year Plan for 2006-2013. The
plan noted a long-term objective to produce a
research framework outlining tools and
approaches to characterize and assess aggregate
exposures and cumulative risks. Our review of
documentation related to the planning and development of C-FERST shows that a
significant change occurred in the design objectives of the tool between when
6 ORD staff noted that part of this time included reviewing the tool through pilot testing and peer review, which we
described in Chapter 1, as well as awaiting updated air toxics data.
Cumulative risk assessment is
defined as an analysis,
characterization, and possible
quantification of the combined
risks to health or the
environment from multiple
agents or stressors.
- EPA's Framework for Cumulative
Risk Assessment, May 2003

-------
ORD released the prototype version (circa 2008) and the official launch of the
tool in September 2016. The change had to do with the claim that C-FERST
would characterize cumulative risks to toxic substances. We observed a range of
statements over time on the extent to which C-FERST addresses cumulative risk
assessment:
2010	[C-FERST] balances innovative, high quality science with
user-friendliness to characterize the cumulative impact of
multiple stressors for: prioritizing environmental issues
within communities; identifying communities at risk; and
ultimately, assessing impact of actions (accountability)7
2011	C-FERST will contain exposure-based cumulative risk
characterizations based on the best available information
and science.8
2013 C-FERST is designed to help identify, prioritize, and
manage community environmental and public health
issues by providing various types of information related
to potential exposures and cumulative risks to provide
communities with a scientifically credible means for
evaluating, prioritizing, and mitigating environmental
health concerns.9
Present Although C-FERST offers multimedia environmental data
and demographic data, it does not add together or
otherwise calculate cumulative risks for different
environmental exposures. ... C-FERST does not calculate
cumulative risk or impacts.10
ORD officials verbally acknowledged this change to the OIG and noted that early
communications were more aspirational whereas later communications reflected
C-FERST's actual abilities. Both the NERL Director and Deputy Director, as well
as the C-FERST Principal Investigator, said that cumulative risk assessment was
more complex than previously anticipated. The Principal Investigator also noted
that the pending issuance of the EPA's cumulative risk assessment guidelines
contributed to the change in objectives.11 One NERL scientist said C-FERST
cannot rank risks—the seventh step in the CARE roadmap that C-FERST
purportedly supports. ORD is developing another tool—not yet launched—called
7	From ORD's 2010 contract with one of the C-FERST developers.
8	Zartarian et al. The Environmental Protection Agency's Community-Focused Exposure and Risk Screening Tool
(C-FERST) and Its Potential Use for Environmental Justice Efforts. American Journal of Public Health.
Supplement 1, 2011, Vol. 101, No. S1, page S288.
9	From ORD's 2013 contract for the external peer review.
10	From the C-FERST website page, "Questions and Answers about C-FERST," as of December 13, 2016.
11	The EPA issued its Framework for Cumulative Risk Assessment in May 2003, but the agency's cumulative risk
assessment guidelines are still under development.

-------
the Community Cumulative Assessment Tool, which may be integrated as a
module within C-FERST to address the cumulative risk assessment and risk
prioritization.
This change in objectives was not reflected in any C-FERST project-specific
documentation. Absent a proposal on C-FERST's objectives, which we discuss in
the next section, we reviewed the tool's Quality
Assurance Project Plan (QAPP). ORD has issued
three QAPPs to date on
C-FERST—in 2009, 2013 and 2017. However,
none of these described the shift away from
cumulative risk assessment, which had been
articulated in strategic planning documents and
research articles, such as those cited above.
ORD officials said that the QAPP may not be an appropriate vehicle for
registering a change in the project's goals because QAPPs only address data-
related issues. However, NERL's 2012 Quality Management Plan notes that
QAPPs document, among other elements, background information and project
goals. Additionally, the EPA's quality assurance requirements note that QAPPs
integrate all technical and quality aspects of a project, including planning,
implementation and assessment. The requirements mandate that QAPPs document
the project description and any subsequent significant changes thereof:
When a substantive change is warranted, the originator of the
QA Project Plan shall modify the QA Project Plan to document the
change and submit the revision for approval by the same authorities
that performed the original review. Only after the revision has been
received and approved (at least verbally with written follow-up) by
project personnel, shall the change be implemented.12
Documenting in the QAPP the objectives of a project and any changes that may
occur, such as a shift away from the cumulative risk assessment, becomes
critically important as a means of tracking the evolution of a project throughout
its lifecycle. Moreover, EPA quality assurance requirements specify annual
reviews of the QAPP for programs or projects of long duration.13 ORD's 2017
C-FERST QAPP revision now requires that the Principal Investigator perform
annual reviews of the QAPP, which is an improvement over the prior 4-year
update interval.
"This QAPP is to be
reviewed regularly, and
changes and new additions
to the C-FERST site will be
reflected in future versions."
- C-FERST QAPP,
August 2009
12	U.S. EPA. EPA Requirements for Quality Assurance Project Plans. EPA QA/R-5. March 2001.
13	Id.

-------
ORD Developed C-FERST Absent a Project Proposal or Request for
the Tool's Development
We did not see evidence that ORD developed a project proposal that outlined
goals, objectives and/or performance outcomes for the C-FERST prototype (circa
2008). Instead, ORD indicated that early ORD presentations and publications,
developed in 2008-2009, described the vision for C-FERST. ORD's project
management controls over research tools, such as C-FERST, do not explicitly
require formal project proposals before approving funding. Rather, general areas
of research are approved as part of ORD's process to set Strategic Research
Action Plans for each national research program. C-FERST is part of SHC's
research portfolio, and SHC's predecessor—ORD's Human Health Research
Program—initially approved C-FERST.
While early presentations and publications on C-FERST
generally described ideas for the tool, we did
not see evidence that ORD performed a systematic
assessment of community needs for the tool. This
was noted in 2014 by one of the peer reviewers and
agreed to by ORD in its written response to peer
review comments.
In its written responses to peer review comments, ORD said the tool was
developed with extensive input from EPA regional offices and communities. As
noted above, a 2011 journal article said C-FERST research responded to requests
from, among others, the CARE program, the Office of Environmental Justice,
EPA regional offices, and communities. In interviews with our team, SHC
leadership said the air program requested an electronic tool for the CARE
roadmap. However, ORD has not provided clear documentation from any EPA
program office or community requesting the tool.
Instead, in its 2011 journal articles ORD cited gathered input from two CARE
program project officers as impetus for the tool's early development efforts. The
articles also generally described the need for, and anticipated benefits of,
C-FERST in helping communities understand and/or map exposures:
• In an effort to advance the science to accurately
characterize and communicate community health risk,
ORD is developing the Community-Focused Exposure
and Risk Screening Tool. ... C-FERST will automate the
laborious process of generating maps of interest for
community mapping projects.14
14 Hammond et al. Assessment and Application of National Environmental Databases and Mapping Tools at the
Local Level to Two Community Case Studies. Risk Analysis. Vol. 31, No. 3. 2011, page 485.
One peer reviewer said
C-FERST is unlikely to be
useful to communities
unless it is redesigned to
respond to specific
community concerns.

-------
• [C-FERST] provides regions and communities with a
user-friendly tool to understand local exposure information
(based on solid science) so that they can make informed,
cost-effective decisions and take action. ... C-FERST
bridges the gap between the emerging community-based
cumulative risk science and its actual use by, first, the
EPA's regional offices and then community groups at
large.15
ORD said it does not explicitly require formal project proposals and, instead,
general areas of research are approved as part of ORD's process to set strategic
research plans for each national research program. NERL leadership said that,
early on, this amounts to more of a "concept" than a formal plan. However, an
approved project proposal or similar document—containing the objectives,
justification, design approach and methodology for measuring performance—
would provide a clear path and ensure accountability as the project moves
forward.
ORD Developed C-FERST as a Research Tool, Outside of
Requirements for an IT Application
The EPA's policy governing its IT
investments defines IT to include any
equipment/websites used for data
management and display. According to
ORD's Office of Science Information
Management—the ORD office that
supports IT and application development
and maintenance—C-FERST and other
ORD-developed web tools are not
considered IT applications. Consequently,
these tools are excluded from requirements in the EPA's IT policies, including
system lifecycle management. Instead, C-FERST is categorized as a "research
tool."
"A comprehensive approach [to
system life cycle management]
ensures that EPA IT systems and
applications are properly planned
and managed, controllable, cost-
effective, and support the mission
and business goals of the Agency."
- EPA Information Procedures: System
Life Cycle Management Procedure,
CIO 2121-P-03.0, September 21, 2012
ORD designed C-FERST internally, working with an ORD contractor, and
coordinated with OEI when necessary to add the mapping capabilities of the
agency's geospatial platform. SHC approved and funded C-FERST as part of the
SHC research plan while noting that they were not familiar with IT requirements
for research tools. ORD's Office of Science Information Management said
C-FERST's development was not its lane of responsibility, and it did not track or
report C-FERST under the EPA's IT requirements. The Director of the Office of
15 Zartarian et al. The Environmental Protection Agency's Community-Focused Exposure and Risk Screening Tool
(C-FERST) and Its Potential Use for Environmental Justice Efforts. American Journal of Public Health.
Supplement 1, 2011, Vol. 101, No. S1, page S293.
17-P-0378
10

-------
Science Information Management also said that the office does not have OEI
review each research tool developed by ORD. Even if it did, OEI does not
coordinate agencywide management of research tools, which could prevent
overlap of efforts through, for example, joint governance of similar tools.
SHC said day-to-day management of the research tool fell to NERL. One NERL
scientist with experience in system dynamics and decision science said that the
tool was developed without coordination between the ecosystems and human
health sides of ORD, and that C-FERST was developed before staff with spatial
data experience heard about it.
A staff person within ORD's Office of Science Information Management said the
recent introduction of the ORD Application Inventory (mentioned in Chapter 1)
should help ORD review and monitor new applications, such as research tools.
That office now checks the inventory first before approving any new application.
C-FERST Overlaps With Other Agency Tools and
ORD Has Not Described C-FERST's Unique Components
As noted above, C-FERST is a geospatial mapping tool developed by ORD as an
environmental screening tool for users to identify potential environmental harm in
their communities. There are key similarities and differences between C-FERST
and other EPA geospatial mapping tools, including EJSCREEN, EnviroAtlas,
NEPAssist and MyEnvironment. Table 1 shows environmental data available in
each tool.
Table 1: Environmental data in C-FERST and four other EPA geospatial mapping tools

Environmental data             C-FERST   EJSCREEN   EnviroAtlas   NEPAssist   MyEnvironment
Air Toxics                     Yes       Yes        No            Yes         Yes
Particulate Matter and Ozone   Yes       Yes        Yes           Yes         Yes
Demographics                   Yes       Yes        Yes           Yes         No
EPA Regulated Facility         Yes       Yes        No            Yes         Yes
Ecosystem Services             No        No         Yes           No          No
Water Quality                  Yes       Yes        Yes           Yes         Yes

Source: OIG summary of geospatial mapping tools.
According to the U.S. Government Accountability Office, overlap occurs when
multiple programs "have similar goals, engage in similar activities or strategies to
achieve them, or target similar beneficiaries."16 Duplication occurs when two or
more programs "are engaged in the same activities or provide the same services to
the same beneficiaries."17 Overlap and duplication can affect program
implementation, outcomes and impact, and cost-effectiveness. Overlap might not
necessarily lead to actual duplication, and some degree of overlap may be
16	U.S. Government Accountability Office, 2016 Annual Report: Additional Opportunities to Reduce
Fragmentation, Overlap, and Duplication and Achieve Other Financial Benefits, GAO-16-375SP, April 13, 2016,
page 2.
17	Id.
justified. Agencies can mitigate the negative effects of overlapping programs by
leveraging resources for joint activities such as training and outreach efforts. To
validate these effects, duplicative or overlapping programs or activities should be
assessed and compared to determine their relative performance and cost-effectiveness.
EPA documentation notes that C-FERST has some similarities to other EPA web-
based tools. All four C-FERST peer reviewers commented on potential overlap
and the benefit of integrating C-FERST with similar tools.
While multiple EPA geospatial mapping tools draw from the EPA geospatial
platform, which creates some operational efficiencies, some interviewees stated
that: (1) it is challenging for users to differentiate between the various EPA
geospatial analysis tools to determine which to use and when, and (2) C-FERST is
not immediately intuitive and requires training to be used effectively. One of the
peer reviewers said there were problems with the "usability, functionality, and
navigability" of C-FERST. Some sampled users expressed confusion on when to
use certain tools (a subsequent section describes the OIG's user sampling). ORD
said a four-tool comparison chart provided to our team (and available via the
EJSCREEN website)18 is intended to help users differentiate among the tools.
However, this chart provides broad, overview information for only four tools and
does not include other similar, available EPA tools or help users determine which
to use and when.19
In our assessment, based on our interviews with EPA staff and others, C-FERST
overlaps most closely with EJSCREEN. EJSCREEN and C-FERST have similar
functions and capabilities for identifying environmental risks in a community.
Both tools provide a combination of demographic and environmental data (see
Table 1), and allow users to compare the data from their local community (census
tract) with state-level data. While each tool has some unique features, most users
and peer reviewers that we interviewed could not clearly differentiate the tools.
Following are features unique to EJSCREEN and C-FERST:
•	EJSCREEN: Users can look at environmental justice indices
(combinations of environmental and demographic information) based on
percentiles, and home in on specific demographic information not
available elsewhere (including language, place of birth, age, etc.).
•	C-FERST: Users can access more specific environmental data
(particularly from the National Air Toxics Assessment dataset), find
detailed information on potential ways to mitigate the harms of specific
18	See the chart at https://www.epa.gov/ejscreen/epa-4-tool-comparison-chart. In addition to C-FERST, the chart
includes EJSCREEN, EnviroAtlas, and the National-Scale Air Toxics Assessment.
19	Other tools mentioned by users that are not included in this chart include the Benefits Mapping and Analysis
Program (BenMAP), and the Community-LINE Source Model.
Some people interviewed said that they found C-FERST challenging to use, and it would be even more so for community members without specific training.
environmental pollutants through C-FERST's Issue Profiles, and see
examples and advice for how to plan a community-level project with
C-FERST's CARE Roadmap.
ORD and others—such as the EPA's Office of Environmental Justice—described
C-FERST and EJSCREEN as distinct and complementary tools, noting that
C-FERST provides context to information that may be available from other
sources (e.g., MyEnvironment or EJSCREEN). One EPA regional user remarked
that the nuanced level of data C-FERST provides is useful in their work. Other
users of the tool said they use EJSCREEN first as a broad screening tool and then
move to C-FERST to home in on more community-specific data.
One OEI staff person said joint governance of EJSCREEN and C-FERST would
be a good idea (i.e., merging the management and resources of the two tools to
make future decisions on functions, use and needed updates). While an Office of
Policy staff person involved in the development of EJSCREEN said he does not
believe the tools should be consolidated, he does believe all of the EPA's tools
should be harmonized to avoid duplication of effort and to use shared resources.
The developer further noted that documentation to help users determine which
tool to use would be valuable.
C-FERST's Research Plan Did Not Include Outcome Measures
The Government Performance and Results Act Modernization Act of 2010
requires agencies to use performance information in decision-making, and it holds
agencies accountable for achieving results and improving government
performance.20 The act defines program evaluation as an assessment of the
manner and extent to which federal programs achieve intended objectives.21
Additionally, the U.S. Government Accountability Office notes that program
performance measurement entails "the ongoing monitoring and reporting of
program accomplishments, particularly progress toward pre-established goals,"
and is typically conducted by program or agency management.22 Performance
measures address process (type or level of activities conducted), outputs (direct
products and services delivered), or outcomes (results) of "any activity, project,
function, or policy that has an identifiable purpose or set of objectives."23
ORD did not identify performance measures during the C-FERST development
process. The NERL Director told us that NERL is now exploring ideas for how to
measure whether C-FERST is meeting community needs, and the SHC Director
20	Government Performance and Results Act Modernization Act of 2010. Public Law 111-352, January 4, 2011,
Sections 3 and 4. See also 31 U.S.C. §§ 1115, 1116 and 1121.
21	Government Performance and Results Act Modernization Act of 2010. Public Law 111-352, January 4, 2011,
Section 4. See also 31 U.S.C. § 1115(h)(12).
22	U.S. Government Accountability Office. Performance Measurement and Evaluation: Definitions and
Relationships. GAO-11-646SP. May 2011.
23	Id.
confirmed that no measures are currently available for monitoring C-FERST
performance. As of April 2017, the C-FERST team acknowledged that the only
metric they collect is Google Analytics use data on the website, which the SHC
Director said is not a useful metric for measuring the performance of the tool. In
the absence of output or outcome metrics, the Google Analytics data on webpage
views provides the only input on the level of user activity for the C-FERST
website. Reporting of unique webpage view data24 for C-FERST from September
2016 to April 2017 shows spikes in viewers navigating to the site after ORD
conducted outreach/training on the tool.
As noted above, during the development phase, NERL collected and incorporated
user feedback from community case study pilots and field testing. This included
developing additional updates, adding a community forum, developing training,
improving user interface accessibility, and improving map and data functions.
Current SHC and NERL managers were unable to explain why performance
measures were not included in the design or development of C-FERST, although
they acknowledged that there had been changes in both ORD's structure and
C-FERST management and leadership during the development period. In
addition, as discussed above, the intended purpose of C-FERST changed from a
cumulative risk assessment tool to focus more on risk screening and community
engagement.
ORD indicated there are plans to incorporate additional monitoring measures for
C-FERST. The C-FERST lead said the next steps include an effort to try to collect
more quantitative data on system usage, but the first priority is providing training
sessions for the tool. However, the recently updated QAPP (May 2017) under
"Assessments and Oversight" only discusses planned development, training and
outreach activities for C-FERST and Tribal-FERST, as well as planned reviews of
data and web links. There is no mention of performance measurement of the tool
going forward, except the intent to review usability through user feedback.
ORD indicated that C-FERST's estimated costs cover not only maintenance,
updates and enhancements for the tool, but also public health data and models,
outreach in the form of training and technical assistance, case studies, and
potential integration/interoperability with other tools (e.g., EJSCREEN and
EnviroAtlas). The case studies will reportedly be used to gain feedback on uses
and agency/community needs.
OIG Sampling Indicates Limited Use to Date
ORD asserted that C-FERST is well used by communities and has served as a tool
in helping with decisions about community risks. However, we found that the four
C-FERST peer reviewers and a majority of our sampled users (16 of 20) are not
using the tool. Only four of 20 sampled users (three EPA staff
24 The data category "unique page views" tracks a person's usage on the entire site for a period or session of time
(generally about 30 minutes).
and one academic representative) had used the tool to either directly assist
communities in identifying environmental risks or to provide C-FERST training.
Cited reasons for not using the tool included:
•	Forgetting about the tool.
•	Not being aware that it had been made publicly available.
•	Not being familiar enough with the tool.
•	Using other tools to meet their needs.
Three individuals indicated they would like to use the tool in their future work
now that they were reminded of it through our interviews. The four individuals
who said they use C-FERST—as well as four non-users, together 40 percent of our
sample—expressed enthusiasm for the tool's potential capability and what it can
do for communities, though some cautioned that
training to use the tool was needed. Additionally,
one community non-profit user described tools
like C-FERST as "tremendously useful" and
added: "There is an interest in the communities
but people are just not aware. ... Government
agencies don't do a good enough job informing
people about issues and tools." On this, we note below ORD's plans to increase
C-FERST outreach. Nonetheless, for now, C-FERST remains a publicly available
tool without identified performance measures or a plan for monitoring the
progress against established goals. ORD has insufficient evidence to support its
assertion that communities are using C-FERST as a community engagement tool.
Without metrics to measure C-FERST's performance against its goals, it is
unclear as to whether C-FERST is being used for its intended purpose and
meeting identified user needs.
Planned ORD Actions Could Mitigate Some Risks
During our review, we learned of planned ORD actions that could address some
noted risks for C-FERST, as well as research tools generally.
For C-FERST, ORD plans to increase marketing and outreach via train-the-trainer
and "tools ambassador" efforts, which could result in increased use. ORD's SHC
launched the tools ambassador effort as an informal initiative about a year ago at
the request of regional staff volunteers through the agency's Skills Marketplace
program.25 Tools ambassadors conduct training and advertise tools to colleagues
and local groups. SHC's Director believes ambassadors are currently in place in
less than half of the EPA's regions.
ORD also plans to issue a report summarizing C-FERST usage data sometime in
2018. ORD noted this planned report in its 2014 response to peer review
25 Skills Marketplace is a part-time project application to bring on extra hands for a set time period (1 year or less).
One EPA regional staff person told us the reason for not using C-FERST was that she understood the tool to be a community tool not intended for EPA use.
comments, specifically in response to one peer reviewer who said it would be
helpful to "provide examples of how communities have used data (made more
accessible by C-FERST) to influence decision making." NERL's Director added
that they are working with state partners, such as the Environmental Council of
States, to get inputs on community uses. Also, ORD plans to conduct user
surveys—an approach suggested by one peer reviewer and twice used by the
EJSCREEN team—to obtain data and feedback. We provided ORD a copy of
EJSCREEN's 2016 "impact survey" as an example.
For ORD research tools generally, NERL prepared a draft guidance in March 2016
that considers the life-cycle of a web-based tool in five steps (Table 2):
Table 2: Sequence of events for web-based tools

Life cycle                          Clearance
Step 1: Define concept and plan     Communicate: awareness and support
Step 2: Develop                     Validate technical quality and compliance
Step 3: Clear and deploy
Step 4: Operate and maintain
Step 5: Revisit concept and plan

Source: Draft NERL guidance on development and release of web-based tools.
The draft guidance notes that technical quality, awareness and partnering would
be accomplished through use of a launch team composed of, among others, the
lead investigator; members of the research team; and representatives from ORD's
Office of Science Information Management, OEI and quality assurance. The draft
guidance further notes that:

The final organizational level and corresponding requirements for
clearance depend on the breadth of use and impact. The greater the
likely volume of use, potential financial impact and potential social
impact, the more technical review and higher organizational
involvement are required, and the greater the stringency and amount
of requirements for clearance.
Aside from generally noting impact, the draft guidance does not address expected
performance measures or outcomes. NERL notes that the draft guidance is for
"development and approved public release of a web based tool while a more
detailed guidance resource is being assembled." Finalizing this guidance could
address concerns we heard in interviews on a lack of direction in this area. For
example, one NERL scientist said: "I don't think we have a clear strategic plan
for developing, implementing and maintaining these types of things." Another
NERL scientist said ORD needs a lab-wide procedure on tool development.
Additionally, ORD acknowledges the importance of helping users distinguish
among numerous tools. ORD has discussed creating a "decision logic guide" to
assist people in determining which tool to use
in addressing their particular query or issue.
The guide is in the concept stage and shows
the relationship between ORD tools. ORD said
further development of this guide is not a high
priority item at this time given other budgetary
priorities and limitations. ORD also said the
agency is working on a "local government
portal" to help get people to the right tool for
their needs, and that SHC is also considering
ways to help users sort through the available
tools for a particular issue (e.g., green
infrastructure). At this time, ORD has not
developed a "wizard"-type mechanism for
community-based mapping tools, but SHC's
Deputy Director acknowledged it would be
good to do so.
Conclusion
We are alerting the EPA to risks associated with C-FERST so that the agency can
take steps to promptly assess and mitigate the risks. The intent of C-FERST
changed during development. Further, it overlaps with other tools and there are no
means to measure its performance. Anticipated EPA budget cuts and current
efforts by the agency to reshape priorities and programs, streamline activities and
avoid duplication heighten the need for the agency to promptly review
C-FERST and similar tools in light of the risks identified.
Recommendations
We recommend that the Assistant Administrator for Research and Development:
1.	Review the Community-Focused Exposure and Risk Screening Tool and
develop an action plan with timeframes to address issues identified,
including considerations on whether to retain the tool. If retained:
a.	Develop metrics for measuring the tool's performance and
establish a regular schedule for performance evaluations.
b.	Survey users to obtain feedback on tool utilization and any needed
improvements.
2.	Develop policies and procedures for planning, developing, implementing
and monitoring the performance of web-based research tools. Policies and
procedures could build on the draft guidance for web-based tools
The EPA's National Environmental Justice Advisory Council recommended in 2004 that the EPA provide guidance regarding minimum criteria for selection and use of a particular tool.

All four C-FERST peer reviewers and 19 of 20 OIG-sampled users believe a "tool decision matrix" of some kind would be useful. Many users (14 of 20) were familiar with other geospatial mapping tools (e.g., EJSCREEN and EnviroAtlas). One state agency user said that having too many tools overwhelms people, and leads to confusion and tools not being used.
developed by the National Exposure Research Laboratory, and should
ensure that any new Office of Research and Development research tool
stems from a clear project proposal that includes ongoing monitoring
metrics and outcome measures, and vetting to ensure there is a need and
no overlap with other tools.
3.	Review new and existing Office of Research and Development research
tools to determine the applicability of the agency's information technology
requirements.
4.	Work with agency offices responsible for other geospatial mapping tools
to develop a decision support matrix for when to use certain tools and for
what purposes.
We recommend that the Deputy Administrator:
5.	Examine all of the EPA's web-based risk screening and mapping tools to
ensure the need for each tool and to avoid potential overlap, duplication
and waste.
Agency Comments and OIG Evaluation
Based on discussions during a June 2017 meeting with EPA managers and our
review of written comments, we made changes to the report where appropriate.
ORD agreed with our findings and recommendations and provided acceptable
corrective actions and estimated completion dates, as well as subsequent
clarification to its response to recommendation 2. Recommendations 1 through 4
are resolved with corrective actions pending.
While the Deputy Administrator's office agreed with our findings, we do not
believe that the office's response fully addressed recommendation 5. In
subsequent correspondence, the Deputy Administrator's office said that the EPA's
program and regional offices determine the need for web-based risk screening and
mapping tools as they consider how best to implement their programs. In addition,
the Deputy Administrator said that he has asked OEI to reinforce with the
program offices and regions that only tools addressing clearly defined needs
should move forward to development. We believe that the Deputy
Administrator's office has full authority to review the need for web-based tools
and should not delegate these decisions to program offices. Recommendation 5 is
unresolved with resolution efforts in progress.
Appendices A and B document, respectively, written comments from ORD and
the Office of the Administrator.
Status of Recommendations and
Potential Monetary Benefits
RECOMMENDATIONS

Rec. 1 (page 17): Review the Community-Focused Exposure and Risk Screening
Tool and develop an action plan with timeframes to address issues identified,
including considerations on whether to retain the tool. If retained:
a. Develop metrics for measuring the tool's performance and establish a
regular schedule for performance evaluations.
b. Survey users to obtain feedback on tool utilization and any needed
improvements.
Status: R. Action Official: Assistant Administrator for Research and
Development. Planned Completion Date: 9/30/19.

Rec. 2 (page 17): Develop policies and procedures for planning, developing,
implementing and monitoring the performance of web-based research tools.
Policies and procedures could build on the draft guidance for web-based tools
developed by the National Exposure Research Laboratory, and should ensure that
any new Office of Research and Development research tool stems from a clear
project proposal that includes ongoing monitoring metrics and outcome measures,
and vetting to ensure there is a need and no overlap with other tools.
Status: R. Action Official: Assistant Administrator for Research and
Development. Planned Completion Date: 9/30/18.

Rec. 3 (page 18): Review new and existing Office of Research and Development
research tools to determine the applicability of the agency's information
technology requirements.
Status: R. Action Official: Assistant Administrator for Research and
Development. Planned Completion Date: 9/30/19.

Rec. 4 (page 18): Work with agency offices responsible for other geospatial
mapping tools to develop a decision support matrix for when to use certain tools
and for what purposes.
Status: R. Action Official: Assistant Administrator for Research and
Development. Planned Completion Date: 9/30/19.

Rec. 5 (page 18): Examine all of the EPA's web-based risk screening and mapping
tools to ensure the need for each tool and to avoid potential overlap, duplication
and waste.
Status: U. Action Official: Deputy Administrator. Planned Completion Date: none.

No potential monetary benefits (in $000s) were listed for any recommendation.

Status key:
C = Corrective action completed.
R = Recommendation resolved with corrective action pending.
U = Recommendation unresolved with resolution efforts in progress.
Appendix A
Office of Research and Development's
Written Comments
UNITED STATES ENVIRONMENTAL PROTECTION AGENCY
WASHINGTON, D.C. 20460
OFFICE OF RESEARCH AND DEVELOPMENT

SUBJECT: Response to Office of Inspector General (OIG) Draft Management Alert No.
OPE-FY17-0006, "EPA Should Promptly Reassess Community Risk Screening
Tool," dated June 14, 2017
FROM: Office of Research and Development
TO:	Carolyn Copper
Assistant Inspector General
Office of Program Evaluation
The EPA's Office of Research and Development (ORD) welcomes the opportunity to review and
comment on the OIG's draft Management Alert titled "EPA Should Promptly Reassess
Community Risk Screening Tool" (Project No. OPE-FY17-0006) (Draft Management Alert).
We appreciate the thorough review conducted by OIG's investigators and an opportunity to
provide this feedback. This response reflects ORD's understanding of OIG's planned changes to
their final Management Alert version.
Of foremost concern to ORD is the perceived misclassification of the C-FERST report as a
Management Alert. The U.S. EPA Audit Evaluation Management Manual 2750 defines a
Management Alert as follows: to "convey significant, time-critical issues to agency
management before completing the ongoing project" (p. 172). The OIG's findings do not
identify any issues that impact, or cause risk to, public health or the environment.
Further, the OIG did not find any gross mismanagement of EPA resources or any
time-critical issues. Accordingly, and given that a Management Alert is issued before a
project is completed, ORD notes that C-FERST as referred to in the hotline complaint was
completed and released in September 2016. There is new work planned on C-FERST, but that
is a separate and new project.
Having noted the above, ORD acknowledges there are areas for improvement, but we want to
emphasize that at its core the Community-Focused Exposure and Risk Screening Tool (C-
FERST) is a research tool and the nature of research can change during its course of conduct.
Immediately below are responses to the OIG's specific recommendations that are directed to
ORD. ORD notes that recommendation 5 is addressed to the Deputy Administrator. In the
attachment, we provide additional detailed comments with respect to statements in the Draft
Management Alert.
Recommendation 1: Review the Community-Focused Exposure and Risk Screening Tool and
develop an action plan with timeframes to address issues identified, including considerations on
whether to retain the tool. If retained:
Response: ORD agrees and since ORD does intend to retain this tool, we have provided
responses to the additional recommendations below.
Planned Completion Date: September 30, 2019
a.	Develop metrics for measuring the tool's performance and establish a regular schedule for
performance evaluations.
Response: ORD agrees. ORD has already initiated the development of performance metrics
for C-FERST and other tools. ORD intended to have this be the topic for discussion and
review by the Board of Scientific Counselors (BOSC), which is now on hold pending
appointment of new BOSC members. A
completion date is therefore pending when the BOSC is formed and is able to advise ORD on
recommendations for appropriate metrics.
Planned Completion Date: September 30, 2018
b.	Survey users to obtain feedback on tool utilization and any needed improvements.
Response: ORD agrees and, as was mentioned in previous discussions with OIG, is
partnering with ECOS (Environmental Council of States) and ASTHO (Association of State
and Territorial Health Organizations) as part of an MOA established with EPA in April 2016 to
survey state agencies. (This survey is targeted for FY 2018.)
Planned Completion Date: September 30, 2019
Recommendation 2: Develop policies and procedures for planning, developing, implementing
and monitoring the performance of web-based research tools. Policies and procedures could
build on the draft guidance for web-based tools developed by the National Exposure Research
Laboratory, and should ensure that any new Office of Research and Development research tool
stems from a clear project proposal that includes ongoing monitoring metrics and outcome
measures, and vetting to ensure there is a need and no overlap with other tools.
Response 2: ORD agrees and will work with OEI and the Chief Information Officer to develop
criteria to determine when a research tool should be subject to the agency's information
technology requirements. ORD will use the criteria to review its new and existing major public
interface research tools to determine the applicability of the agency's information technology
requirements. In addition, ORD will continue improving its investment portfolio review process
for IT investments as required under various laws, policies, and regulations including FITARA.
ORD will expand its application development roadmap and checklist to require informing the
Office of Science and Information Management (OSIM) before such projects are started and to
report progress and expenditures on such development projects on a regular basis (at least
annually or more frequently). OSIM will review and help the developers through the appropriate
Life Cycle reviews throughout the project duration and ORD will regularly monitor performance
of these web-based tools. This process is being developed and will be implemented starting FY
2018 and will be continuous.
Planned Completion Date: September 30, 2018
Recommendation 3: Review new and existing Office of Research and Development research
tools to determine the applicability of the agency's information technology requirements.
Response 3: ORD agrees and as stated in the response to recommendation #2: ORD will work
with OEI and the Chief Information Officer to develop criteria to determine when a research tool
should be subject to the agency's information technology requirements. ORD will use the criteria
to review its new and major existing public interface research tools to determine the
applicability of the agency's information technology requirements.
Planned Completion Date: September 30, 2019
Recommendation 4: Work with agency offices responsible for other geospatial analysis tools to
develop a decision support matrix for when to use certain tools and for what purposes.
Response 4: ORD agrees that such a decision matrix is valuable and will work with other offices,
predominantly OEI, on this effort. ORD has started this development for ORD-controlled tools
and will coordinate with OEI for a wider review in 2017 and 2018, with a final assessment by
3/31/2019.
Planned Completion Date: September 30, 2019
Recommendation 5: Examine all of the EPA's web-based risk screening and mapping tools to
ensure the need for each tool and to avoid potential overlap, duplication and waste.
Response 5: Regarding recommendation 5 addressed to the Deputy Administrator, it is ORD's
opinion that the Review of C-FERST does not form a basis to recommend an agency-wide
review of all risk based screening and mapping tools.
If you have any questions regarding this response, please contact Jennifer Orme-Zavaleta, PhD,
Director, National Exposure Research Laboratory (NERL), at orme-zavaleta.jennifer@epa.gov.
Attachment
cc: Tim Watkins
Michael Slimak
Andrew Geller
Jerry Blancato
David Updike
Stefan Silzer
Deborah Heckman
Beatriz Cuartas
Maureen Hingeley
Bill Ocampo
Detailed ORD Comments on Draft OIG Management Alert: Community Risk Screening
Tool (C-FERST)
ORD is providing the following comments and clarifications regarding the draft management
alert prepared by the OIG. The information provided is generally organized in the order issues
were raised in the draft OIG Alert.
Page 1: The public version of C-FERST "is different than its intended purpose."
It is correct that the scope and implementation of C-FERST evolved during implementation to
address new information, needs, and capabilities with existing resources. The impetus for
developing C-FERST (circa 2008) was 2-fold: (a) to help automate the Community Action for a Renewed Environment (CARE) step-by-step community assessment roadmap and provide easier access to information for following the roadmap, and (b) to advance the science of community-level cumulative risk assessment. C-FERST was requested by the CARE program leads and
Program/Regional Office partners, and the tool, as released, did meet the first objective and even
included other community assessment roadmaps. With respect to the second objective, C-FERST
was envisioned as a framework for developing and communicating cumulative exposure and risk
science, as described in the following two papers.
•	Zartarian et al. "The Environmental Protection Agency's Community-Focused Exposure and Risk Screening Tool (C-FERST) and Its Potential Use for Environmental Justice Efforts." American Journal of Public Health, 2011, Vol. 101, No. S1. http://ajph.aphapublications.org/doi/abs/10.2105/AJPH.2010.300087
•	Zartarian and Schultz. "The EPA's human exposure research program for assessing cumulative risk in communities." Journal of Exposure Science and Environmental Epidemiology (2010) 20, 351-358; doi:10.1038/jes.2009.20; published online 15 April 2009. http://www.nature.com/jes/journal/v20/n4/full/jes200920a.html
Page 1: Without metrics to measure performance against goals, it is unclear if C-FERST is
being used for its intended purpose or, importantly, meeting user needs.
We agree that formal metrics could be more robust, but argue that C-FERST is being used for its intended purpose and that the tool will gain users as it becomes better known. Furthermore, we identified the difficulties and costs of systematically collecting quantitative information about uses and users, including the need for OMB clearance for any data collections.
Informing potential users of C-FERST's capabilities is a major purpose for doing case studies,
including joint applications with EJSCREEN. In addition, we described and provided examples
of working with our Regional partners and their local agency, academic, and community
partners. This allowed us to test the tool with real-world users and to obtain user feedback and recommendations for the publicly released version.
We provided OIG with the current QAPP, which describes our intent to continue outreach,
identify partnerships, and work with our partners to obtain information about how the tools
(working with EJSCREEN) are used, along with user feedback and recommendations for future development.
ORD also notes that the Management Alert does indicate that some of the C-FERST users who
were interviewed "expressed enthusiasm for the tool's capability and what it can do for
communities." ORD views this as a positive indication of the potential for C-FERST.
Page 2: Related to this finding, "ORD altered the original purpose of the tool during development without properly documenting this change" and "The change had to do with
the claim that C-FERST would characterize cumulative risks to toxic substances.... While
National Exposure Research Laboratory (NERL) officials verbally acknowledged this
change to us, it was not documented in the C-FERST Quality Assurance Project Plan
(QAPP)."
While in retrospect we agree documentation could have been better, attempts to document the evolutionary changes were made. ORD does note that the Management Alert indicates that the C-FERST Quality Assurance Project Plans (QAPPs) did not document a change in purpose relating
to the extent to which C-FERST addresses cumulative risk assessment. ORD also notes that the
Management Alert states only the 2009 QAPP made a passing reference to cumulative risk and
noted that the C-FERST prototype version did not yet include the ability to calculate cumulative
risk. Therefore, the C-FERST QAPPs developed in 2009, 2013, and 2017 were actually
consistent and never implied that the tool had the capability to calculate cumulative risk. For
example, the purpose of the project as stated in the 2013 QAPP was "to continue development of
the Community-Focused Exposure and Risk Screening Tool (C-FERST) and Tribal-Focused
Environmental Risk and Sustainability Tool (Tribal-FERST), web-based GIS and information
access toolkits to enhance screening-level community environmental health decision-
making." This is consistent with the current version of C-FERST. The data and functions
covered in the 2013 QAPP are the same as those included in the public release version. The
QAPP was followed for the tool development and data updates. After the public release in 2016
and in response to new ORD guidelines for QAPPs and "laboratory notebooks" for all projects,
the QAPP has been revised and an electronic notebook (OneNote) is being used to document
decisions, research and products. ORD also notes that although how the tool is described has changed and how the tool is used may change, these changes do not necessarily require revision of the QAPP. Finally, ORD notes that the move away from conducting cumulative risk assessment was properly documented in the responses to the peer review comments, and was proposed in the SHC 2.62 Project Plan and Task 2.62.1.
Page 2: Related to this finding, "Our review of documentation related to the planning and
development of C-FERST shows that a significant change occurred in the design objectives
of the tool between when ORD released the prototype version (circa 2008) and the official
launch of the tool in September 2016."
Both the external peer review and internal agency concerns identified the need to be clear about
what was in the tool and what it could do. The earlier descriptions of C-FERST were about the
concept and plans for the tool. However, we had to describe what was actually in the tool at the
time of the public release.
Page 4: Related to this finding, "ORD did not develop a project proposal for the C-FERST
prototype (circa 2008)."
ORD's project planning process has matured significantly since the inception of C-FERST. The
proposed C-FERST project followed the ORD project/task planning review process at the time
(pre-National Programs) for NERL/HEASD Task 21163. Numerous briefings on the C-FERST
plan were given to ORD management, as well as briefings to solicit collaborative input from our
CARE Program partners across EPA's program and regional offices (including senior leaders in
OPPT and OW).
Page 4: Related to this finding, "While early presentations on C-FERST generally
described ideas for the tool, we did not see evidence that ORD performed a systematic
assessment of community needs for the tool."
The introductions in the following two publications, and references therein, document the systematic assessment.
•	Zartarian et al. "The Environmental Protection Agency's Community-Focused Exposure and Risk Screening Tool (C-FERST) and Its Potential Use for Environmental Justice Efforts." American Journal of Public Health, 2011, Vol. 101, No. S1. http://ajph.aphapublications.org/doi/abs/10.2105/AJPH.2010.300087
•	Zartarian and Schultz. "The EPA's human exposure research program for assessing cumulative risk in communities." Journal of Exposure Science and Environmental Epidemiology (2010) 20, 351-358; doi:10.1038/jes.2009.20; published online 15 April 2009. http://www.nature.com/jes/journal/v20/n4/full/jes200920a.html
Page 4: Related to this finding, "ORD said the tool was developed with extensive input
from EPA regional offices and communities but has not provided clear documentation
from any EPA program office requesting the tool."
CARE partners requested that ORD develop an automated version of the CARE roadmap, and
CARE leads in OPPT and OW, as well as ORD managers, were briefed early on about the
collaboration. We also led regular C-FERST development calls with CARE Program and Regional Office partners in 2008, and the CARE program suggested that ORD conduct several pilot studies of the beta version in their communities. CARE leads at the time assisted ORD with the
three pilot studies as well as with seeking internal review across Program Offices for input on the
prototype version of the tool. ORD can provide names of CARE collaborators on the early tool
development calls if needed.
Page 5: Related to this finding, "One NERL scientist with experience in systems dynamics
and decision science said that the tool was developed without coordination between the
ecosystems and human health sides of ORD, and that C-FERST was developed before staff
with spatial data experience heard about it."
ORD disagrees with this one individual's assessment. There were many ongoing discussions and
joint briefings as early as 2009 between the ecosystems and human health researchers,
specifically regarding coordination of C-FERST and EnviroAtlas.
Page 5: "C-FERST Overlaps with Other Agency Tools and ORD Has Not Described C-
FERST's Unique Components"
ORD disagrees with this assertion. While there may be overlap (as opposed to the duplication asserted in the Management Alert), there have been presentations, fact sheets, and Q&As describing how C-FERST is unique compared to EJSCREEN and other tools. The text in the OIG Management
Alert summarizes some of the unique components:
"In our assessment based on our interviews with EPA staff and others, C-FERST overlaps
most closely with EJSCREEN. EJSCREEN and C-FERST have similar functions and
capabilities for identifying environmental risks in a community. Both tools provide a
combination of demographic and environmental data (see Table 1), and allow users to
compare the data from their local community (census tract) with state-level data. Unique to
EJSCREEN, users can look at Environmental Justice Indices (combinations of environmental
and demographic information), based on percentiles, and narrow in on specific demographic
information not available elsewhere (including by language, country of origin, age, etc.).
Unique to C-FERST, users are able to access more specific environmental data (particularly
from the National Air Toxics Assessment dataset), find detailed information on potential
ways to mitigate the harms of specific environmental pollutants through C-FERST's Issue
Profiles, and see examples and advice for how to plan a community-level project with C-
FERST's CARE Roadmap. ORD and others, such as EPA's Office of Environmental Justice,
described C-FERST and EJSCREEN as distinct and complementary tools, explaining that C-FERST provides "context to information that may be available from other sources" (i.e.,
My Environment or EJSCREEN). One EPA regional user remarked that the nuanced level of
data that C-FERST provides is useful in their work. Other users of the tool explained that
they use EJSCREEN first as a broad screening tool, and then move to C-FERST to hone in
on more community-specific data."
One area of apparent overlap is with regard to maps. There are some map layers in common,
since these are on EPA's GeoPlatform, and some that were developed specifically for C-FERST
(e.g., the NATA service, including contribution of source categories).
ORD has worked with OEJ (EJSCREEN) and OAQPS (NATA) to communicate the similarities
and differences between our tools (see the attached 4-Tool Comparison Chart). These include a
description of the "focus for each tool" (C-FERST, EnviroAtlas, and EJSCREEN), and
complementary uses. Several agency and public presentations have included comparisons and
examples for using EJSCREEN and C-FERST together.
ORD was and remains careful to avoid duplication and minimize overlap with similar tools.
We have worked with EJSCREEN to identify opportunities for collaboration on outreach and
training.
-------
Appendix B
Office of the Administrator's Written Comments
UNITED STATES ENVIRONMENTAL PROTECTION AGENCY
WASHINGTON, D.C. 20460
OFFICE OF THE ADMINISTRATOR
SUBJECT: Response to Office of Inspector General Draft Management Alert: EPA Should
Promptly Reassess Community Risk Screening Tool
FROM: Mike Flynn, Acting Deputy Administrator
TO: Patrick Gilbride, Director, Environmental Research Programs, Office of Inspector General
Thank you for the opportunity to review and comment on the Office of Inspector General's draft
Management Alert: EPA Should Promptly Reassess Community Risk Screening Tool. I
appreciate the OIG's efforts to investigate hotline complaints and the OIG's commitment to
preventing waste, fraud and abuse.
The OIG's draft Management Alert contains the following recommendation for the
Deputy Administrator:
Examine all of the U.S. Environmental Protection Agency's web-based risk screening
and mapping tools to ensure the need for each tool and to avoid potential overlap,
duplication and waste.
I agree that EPA's web-based risk screening and mapping tools should be developed to meet the
targeted users' needs and, where possible, build on existing tools to avoid duplication and reduce
waste. The EPA's Office of Environmental Information has mechanisms in place for
coordinating mapping tools and data services across the agency. For example, OEI leads the EPA
Geospatial Advisory Committee, which is the advisory board for the EPA's geospatial program
overall and has representation from almost all programs and regions. OEI also leads the
GeoPlatform Change Control and Operational Management Board. Programmatic groups
developing applications that use EPA's GeoPlatform shared service and/or new enterprise
geodata services are invited to discuss issues like consistency and re-using shared services.
I have asked OEI and the Chief Information Officer to review their existing policies and
procedures to ensure that sufficient mechanisms are in place to identify potential overlap or
duplication during the development or modification of any web-based risk screening and
mapping tools.
With respect to existing web-based risk screening and mapping tools, the agency has developed a
4-Tool Comparison Chart to guide users to the tool or tools that will best serve their needs. The
chart includes C-FERST as well as EJSCREEN, NATA, and EnviroAtlas, and is available at
https://www.epa.gov/ejscreen/epa-4-tool-comparison-chart.

-------
Appendix C
Distribution
The Administrator
Deputy Administrator
Chief of Staff
Chief of Staff for Operations
Deputy Chief of Staff for Operations
Assistant Administrator for Research and Development
Assistant Administrator for Environmental Information and Chief Information Officer
Agency Follow-Up Official (the CFO)
Agency Follow-Up Coordinator
General Counsel
Associate Administrator for Congressional and Intergovernmental Relations
Associate Administrator for Public Affairs
Associate Director for Science, Office of Research and Development
Deputy Assistant Administrator for Management, Office of Research and Development
Deputy Assistant Administrator for Research and Development, Office of Research and
Development
Associate Assistant Administrator, Office of Research and Development
Director, Office of Science Information Management, Office of Research and Development
Director, National Exposure Research Laboratory, Office of Research and Development
Audit Follow-Up Coordinator, Office of the Administrator
Audit Follow-Up Coordinator, Office of Research and Development
Audit Follow-Up Coordinator, Office of Environmental Information