EVIDENCE ACT
LEARNING AGENDA
YEAR 1 FINAL REPORT
24 JANUARY 2022
EPA
This report was prepared
with support from
Industrial Economics,
Incorporated under U.S.
EPA Contract No.
GS-10F-0061N
Table of Contents
Overall Findings 1
Summary 1
Next Steps 3
Introduction + Purpose 4
Priority Questions - Year 1 5
Data Collection 5
Results by Sub-Question 7
a. Do the grant programs have specific targets associated with their outputs/outcomes? 7
b. What types of grant commitments data are tracked? 8
i. How do tracked grant commitments data vary across the Agency? 10
ii. To what extent does the data reported by grantees provide EPA with information on progress
towards meeting grant commitments? 13
iii. To what extent does the data reported by grantees provide information that currently allows EPA
to measure outputs, outcomes, and impacts related to equity and climate change? 17
c. How do grant programs identify relevant grant commitments to track? 18
d. What data reporting processes, tools, and systems do EPA's grant award programs use? 19
i. How do grant award reporting systems vary across the Agency? 25
e. How do grant programs use the grant commitments data they collect for program implementation? 26
f. How do grant programs present and communicate the results of the grant commitments data they
collect? 29
Appendix A: Methodology 31
Purpose of the Work 31
Data Collection Methodology 33
Data Sources and Strategies 39
Sampling Approach 41
Data Use 42
Data Analysis Plan 42
Report Audience 42
Data Limitations and Validation 42
Appendix B: Survey Instrument 44
Appendix C: NPM Information Request 45
Appendix D: Grant Commitments Survey Effort Overview 46
Appendix E: Grant Commitments NPM Information Request Effort Overview 47
Appendix F: Anticipated Survey Responses 48
Appendix G: Program Level Fact Sheets 54
Overall Findings
This section presents the overall findings from each of the evaluation sub-questions and helps answer the Year 1
question: How do EPA's existing grant award and reporting systems identify and track grant
commitments? Detail on each of the sub-questions follows the summary and next steps sections.
Summary
Overall, the evidence indicates strong data collection efforts across EPA grant programs, which allows EPA to
gather the intended information from grantees. Notably, the vast majority of respondents agreed that the output
and outcome data currently being collected enables their program to track progress on grant commitments and
recognize and address problems with grantee performance. Some of the practices that help support these data
collection efforts include frequent communication between EPA and grantees to build rapport, and use of work
plans to clearly articulate the scope of data reporting requirements, including what is reported and how frequently
data are reported - all agreed upon by the grantee and grant manager. Standard reporting processes and reporting
templates also support grant programs' efforts to track grant commitments data. Conversely, respondents
reported the opposite for instances where grantee reporting did not provide data needed for tracking progress or
addressing problems (for example, unclear workplan requirements or no standard outcome metrics).
Another reason respondents provided for why grantee reporting may not track progress or address problems is
that outcomes (and sometimes even outputs) occur outside the timeframe of the grant. This may be reasonable
given the long timeframes for improving environmental and/or health conditions. In cases like these, it may be
beneficial for EPA to collect output, outcome, or other types of information that serve as leading indicators,
and/or that EPA can use to model or project outcomes on a programmatic level. This may also be a preferred
reporting framework alternative in instances where grantees lack the technical ability and/or capacity to provide
accurate and/or relevant outcome data.
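To make this concrete, the sketch below projects a long-term outcome (greenhouse gas reductions) from a
leading indicator (grantee-reported energy savings), echoing the emissions-factor example in the findings below.
This is a minimal illustration in Python, not an EPA method: the emissions factor is a placeholder, and an actual
analysis would draw factors from a standard source such as EPA's eGRID.

    # Projecting a long-term outcome from a leading indicator.
    # The emissions factor is a placeholder, not an official EPA value.
    ASSUMED_EMISSIONS_FACTOR = 0.000433  # metric tons CO2e per kWh (hypothetical)

    def project_ghg_reduction(energy_saved_kwh: float) -> float:
        """Convert grantee-reported energy savings (kWh) to metric tons CO2e."""
        return energy_saved_kwh * ASSUMED_EMISSIONS_FACTOR

    # Roll up illustrative grantee reports to a program-level outcome.
    reported_savings_kwh = [120_000, 45_500, 310_000]
    total = sum(project_ghg_reduction(kwh) for kwh in reported_savings_kwh)
    print(f"Projected reduction: {total:.1f} metric tons CO2e")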
Although EPA is able to effectively collect the data from grantees, there may be room for EPA to improve its
data processes and consolidation efforts to better support Agency-level reporting on outcomes from grant
programs. High rates of use for collecting grantee data via Word documents and 'text in the body of an email'
suggest less-than-ideal data collection formats for the purpose of consolidating information across projects. The
large number of disparate databases that grant programs use to store data from grantees (55, when there are
approximately 100 grant programs) indicates further challenges to EPA collectively reporting on outcomes of
grant programs. Notably, several respondents indicated the use of a database for data tracking (reporting and/or
consolidation) as a best practice.
Through the document review, the Grant Commitments Met Workgroup (henceforth referred to as the
Workgroup) was able to identify several examples of grant programs that have plans with well-articulated
program logic including objectives, program activities, and associated metrics and targets for tracking outputs
and outcomes. This program logic clearly lays out the path for the grant program to succeed and to demonstrate
its success. Programs can also use these types of documents to clearly articulate what is reasonable for the
program to accomplish given any constraints on funding, timing, etc. This can better contextualize the outputs
and outcomes from a grant program and better communicate what EPA grant programs should be and are
accomplishing.
Answers to each specific sub-question for Year 1 are summarized below:
a. Based on the documents provided via the NPM information request, only 21% (15 programs) have
specific targets associated with their outputs and outcomes as evidenced in guidance documents and/or
summary reports.
b. Nearly all (99%) of programs collect data on outputs. Fifty-six percent of programs collect data from
grantees on program outcomes (early-, mid-, and long-term). Approximately one-third of programs (31%)
collect data that can be used to report on the environmental and/or health results of EPA's grant
programs. As discussed above, depending on the focus of a particular grant program, this may be
appropriate. EPA may be able to use the data currently reported by grantees to infer or calculate
environmental outcomes - for example, EPA could convert reported energy savings into greenhouse gas
reductions, using standard emissions factors. In other cases, however, additional opportunities may
exist to document health and environmental outcomes that may result from the Agency's grant
programs. Although the survey provided specific definitions and examples of outcomes, responses
received often reflected outputs. This may indicate an opportunity for improving staff 'measurement
literacy' to better ensure relevant grant commitments data are tracked (e.g., environmental and/or health
outcomes).
i. The types of tracked grant commitments data vary more for outcomes than outputs when
comparing across grant programs' media type and region.1 However, in general, data do not
vary substantially across the Agency.
ii. Respondents provided positive responses that the output and outcome data received from
grantees meet the data reporting requirements as specified in their work plan. This suggests a
high level of alignment between the actual output and outcome data provided by grantees and
the requirements of their work plan; indicating the grantee-to-EPA reporting processes are
generally functioning to provide the data needed to report on grant commitments. Overall, data
indicate it may be slightly more difficult to manage a grant managed as part of a PPG; this is
particularly relevant for grant programs where some parts are managed as a PPG, and some are
not.
iii. Available information indicates the current data reported by grantees provides limited
information on outputs and outcomes related to equity and climate change. Metrics that were
identified were most often associated with programs that directly address these types of issues,
for example, tribal-specific grant programs.
c. Year 2 of the work will explore this in greater detail via interviews, but available evidence suggests that
programs may identify relevant grant commitments to track from a centralized top-down process (e.g.,
statute, determination by the national program), past performance of the program, or a program-level
action plan.
d. EPA's grant programs most often use Word documents for collecting data from grantees and
SharePoint/Teams/OneDrive for storing data collected from grantees. Respondents also indicated
grantee data are stored in 55 different databases across the Agency. Respondents indicated the reporting
interval required for grantees, as specified in the grant guidance, was most often semi-annual, quarterly,
annual or some combination of these. As a whole, there is no clear relationship between grantee
reporting interval and respondents' agreement that the reporting frequency is appropriate to effectively
track progress on grant commitments and recognize and address problems. A major takeaway from
1 Media types: Air, Brownfields, Environmental Education, Environmental Justice, Multi-Media, Pesticides, Research, Solid Waste, Superfund, Toxic
Substances, Water
analysis of open-ended responses on reporting frequency is the importance of regular communication
with grantees. Fifty-two respondents indicated that this regular communication gives EPA grant managers an
open line of communication with grantees, helping to improve grant management and reporting.
i. In general, the reporting mechanisms by region and media type follow the pattern of the full
dataset; Word documents are the most common mechanism, followed by Adobe PDF, Excel
and 'text in the body of an email.' The data storage systems used by grant programs generally
correspond to the focus area of a given media type. For example, the air media programs
predominantly use AQS, the designated EPA air quality database.
e. Respondents provided overwhelmingly positive responses that the output and outcome data currently
being collected enables the program to track progress on grant commitments and recognize and address
problems occurring with grantees not meeting grant commitments. About 91% of respondents agree or
strongly agree that output data enables the program to track progress on grant commitments, and 85%
strongly agree or agree that it enables the program to address problems. This suggests that EPA staff are
using the data collected from grantees to understand what grantees are accomplishing and to effectively
manage the performance of grantees.
f. Based on the existing data collection and presentation/communication practices of grant programs,
EPA's individual programs have the ability to report out on the outputs and outcomes of several
individual, but not all, grant programs. However, as discussed at the outset of this report, the
management and tracking of the individual awards are dispersed amongst approximately 1,400 staff
throughout headquarters and EPA's ten regional offices, which makes tracking results at the national
level challenging. The Agency's lack of a comprehensive system for tracking and reporting grant-
related activities leads to an inability to proficiently evaluate environmental outcomes on a national
scale.
Based on the data analysis plan and the information learned from this Year 1 study, the Workgroup proposes the
following next steps for consideration by the Grant Commitments Met Team for scoping Year 2 of the study:
• Best practices. Use interviews to explore list of practices from survey to further identify best practices
that could help EPA better determine grant commitments met.
• Program targets. For programs that are interviewed in Year 2, request information on program targets
(if not already available) to better understand how grant programs use targets to determine and
communicate program success.
• Program endpoints. Related to targets, better understand the reasonable endpoint for grant programs. It
may not be reasonable for grantees to realize environmental and/or health outcomes during the
timeframe of their grant. What are the reasonable endpoints EPA can and should track for each grant
program during the course of the grant period?
• Calculating long-term outcomes. Consider opportunities for EPA to use outputs, early outcomes, mid-
term outcomes, and other types of data for purposes of calculating and/or modeling long-term
outcomes.
• Identifying relevant data to track. Related to targets and endpoints, discuss program processes for
identifying the grant commitments data tracked with interviewees. What are some of the practices for
identifying relevant data? Who identifies these data? How do they select appropriate goals, metrics, and
targets?
• PPG management. Based on survey responses, investigate why PPG programs may be more difficult to
manage than non-PPG programs.
• Reporting frequency. Overall, the analysis of open-ended responses provides differing opinions on the
role reporting frequency has on data quality and usefulness. There is nuance to reporting frequency that
is not adequately captured via the survey. Interviews will help shed further light on how reporting
frequency can best support understanding grant commitments met.
• Data management. How do grant programs decide what data reporting and storage mechanisms to use?
What other (not related to grant programs) data are stored in the databases that grant programs currently
use? How do (if at all) programs use the existing databases to support data reporting? What are the
positive attributes of existing databases?
• Current use of data. Probe into more examples of how grant programs are using current data being
collected from grantees to manage grant performance. Do grant programs use all the information they
are collecting? Are there opportunities to streamline?
• Administrative Priorities. Align future efforts with the priorities of the current administration and key
legislation impacting grant programs.
Introduction + Purpose
In September 2020, EPA developed an Interim Learning Agenda that identified three Learning Priorities. The
Learning Agenda stems from the Foundations for Evidence-Based Policymaking Act (Evidence Act), which
provides a framework to promote a culture of evaluation, continuous learning, and decision making using the
best available evidence. As part of the Learning Agenda, EPA has initiated efforts to:
1. Develop priority questions.
2. Develop capacity to undertake new evidence-building activities.
3. Take the first step in developing a Learning Agenda that will inform the FY 2022-2026 EPA Strategic
Plan.
Grant Commitments Met was one of the Learning Priorities identified in the Interim Learning Agenda. Every
year, EPA awards over $4 billion in grants and other assistance agreements. Through these grants, EPA helps to
protect human health and the environment through the work of its grantees. The management and tracking of the
individual awards are dispersed amongst approximately 1,400 staff throughout headquarters and EPA's ten
regional offices, which makes tracking results at the national level challenging. The Agency's lack of a
comprehensive system for tracking grant-related activities leads to an inability to proficiently evaluate
environmental outcomes on a national scale. Work under the Interim Learning Agenda is a first step toward
better understanding current grant reporting and tracking processes across the Agency's 100+ current grant
programs. This baseline will help EPA develop a sustainable and consistent process for negotiating and tracking
the environmental outputs and outcomes resulting from EPA's grant funding.
EPA's Office of Congressional and Intergovernmental Relations (OCIR) established the Grant Commitments
Met Workgroup to address the Priority Questions in the Interim Learning Agenda.
The initial phase of work (Year 1), which is summarized in this report, addresses the priority question: How do
EPA's existing grant award and reporting systems identify and track grant commitments? Year 2 will
address: What EPA practices and tools effectively track whether grantees are fulfilling their workplan grant
commitments, including outputs and environmental outcomes? The final year of work, Year 3, will address: Are
the commitments established in EPA's grant agreements achieving the intended environmental results?
The remainder of this report focuses on the Year 1 priority question. The next section presents the full list of
sub-questions, followed by the general data collection approach, results by sub-question, and overall findings
and next steps for the Grant Commitments Met Learning Agenda work. The appendices contain the full
methodology, final survey instrument, National Program Manager (NPM) information request, survey overview
2-pager, NPM information request overview 2-pager, survey and NPM information request program
respondents, and program-level fact sheets. The Data Analysis Plan is also available as Attachment A.
Priority Questions - Year 1
The list of priority questions and sub-questions associated with Year 1 work includes:
1. How do EPA's existing grant award and reporting systems identify and track grant
commitments?
a. Do the grant programs have specific targets associated with their outputs/outcomes?
b. What types of grant commitments data are tracked?
i. How do tracked grant commitments data vary across the Agency?
ii. To what extent does the data reported by grantees provide EPA with information on
progress towards meeting grant commitments?
iii. To what extent does the data reported by grantees provide information that currently
allows EPA to measure outputs, outcomes, and impacts related to equity and climate
change?
c. How do grant programs identify relevant grant commitments to track?
d. What data reporting processes, tools, and systems do EPA's grant award programs use?
i. How do grant award reporting systems vary across the Agency?
e. How do grant programs use the grant commitments data they collect for program
implementation?
f. How do grant programs present and communicate the results of the grant commitments data
they collect?
Data Collection
This work draws on multiple data sources to answer the priority question, including a brief online survey
administered to EPA staff managing or implementing EPA's grant programs, and a document review of reports
and grant guidance documents solicited through an NPM Information Request. Overall, we received documents
from 72 of the 93 programs we targeted (77%). As such, our findings only reflect the programs where we have
available documents and may not represent the programs that did not provide information. See Appendix F for a
complete list of programs providing responses to the NPM Information Request. The documents provide
specific information on how grant programs are organized, and how they communicate and present results. The
survey solicited information from regions/headquarters for each grant program that is active in each region.
Therefore, each response reflects a self-reported region/headquarters program response and can be interpreted as
the implementation practices of a grant program in a particular region or headquarters office. Overall, we
received 462 responses from across the Agency; based on the number of regions with active grants in FY 2021,
we were anticipating 489 responses. The face-value response rate was 94%, but we also received several
unanticipated responses from some regions.2 The response rate excluding these unanticipated responses and
reflecting the missing anticipated responses is 86%. See Appendix F for a list of programs providing
responses. The survey consisted of predominantly closed-ended questions, with several optional
questions.
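As a worked check of the response-rate arithmetic above, the short sketch below reproduces the two reported
rates; note that the exact count of unanticipated responses is not stated in this report and is backed out here
only for illustration.

    # Worked check of the survey response rates reported above.
    anticipated = 489  # responses expected based on FY 2021 active grants
    received = 462     # total responses received

    print(f"Face-value response rate: {received / anticipated:.0%}")  # 94%

    # The adjusted rate (86%) excludes unanticipated responses; the report
    # does not state that count, but it can be approximated:
    implied_unanticipated = received - round(0.86 * anticipated)
    print(f"Implied unanticipated responses: ~{implied_unanticipated}")  # ~41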
Together, these data sources provide different but complementary types of information to answer the priority
questions. This approach makes the fullest use of existing data while undertaking targeted new data collections
to provide more comprehensive and robust answers to the priority questions. Additional details on the
methodology can be found in Appendix A: Methodology, Appendix B: Survey, and Appendix C: NPM
Information Request.
2 It is unclear why the survey captured responses from regions where a response was unanticipated. However, this may have occurred in
regions that did not disburse funds to grantees during FY 2021 (and were therefore designated as not active) but typically do have active
grants.
Results by Sub-Question
Outputs and Outcomes
In program measurement and evaluation, the
terms outputs and outcomes refer to two
distinct types of program results.
Output - is what a program produces or
delivers and is a direct result of program
activities. These tend to be tangible and
measurable as simple counts. Some examples
include stakeholder meetings, trainings
conducted, and funded projects.
Outcome - is the result of program outputs
and generally falls into three sequential tiers:
1. Short-term - changes in awareness,
knowledge, attitude, skills, and
understanding of the intended audience. For
example, after a training, participants will
have increased knowledge of a topic.
2. Intermediate - changes in behaviors,
practices, or decisions. For example,
training participants implement best
practices they learned at the training.
3. Long-term - changes in conditions, such as
environmental or public health conditions.
For example, after training participants
implement best practices, water quality
improves.
This section presents the results from the survey and document
review by each sub-question. Throughout this report we refer to
program outputs and outcomes; see the sidebar to the right for
definitions of these terms in the context of program measurement
and evaluation and this work.3 Additional analysis of several
specific grant programs is provided in Appendix G: Program Fact
Sheets.
a. Do the grant programs have specific targets
associated with their outputs/outcomes?
Based on the documents provided via the NPM information request,
only 21% (15 programs) have specific targets associated with their
outputs and outcomes, as evidenced in guidance documents and/or
summary reports.4 For the purpose of this study, we define 'target'
as an articulated measurable goal associated with the program's
outputs and/or outcomes. Clear and measurable targets are an
important tool for determining program success. We identified
several examples of grant programs with well-documented program
logic and associated targets.
The Long Island Sound (LIS) Program document, LIS
Comprehensive Conservation and Management Plan 2015,
discusses four grant themes, three of which have specified
objectives. The document outlines multiple strategies and specific
action steps for each objective. As one example, the Clean Waters
and Healthy Watersheds theme defines ecosystem targets associated
with outputs and outcomes. One target relates to sediment quality
improvement and establishes a goal of 20% reduction in the area of
impaired sediment compared to the 2006 baseline by 2035. The Management Plan provides a clear path for the
grant program to succeed by identifying targets associated with the grant outputs and outcomes.
The Great Lakes Restoration Initiative (GLRI) guidance document, GLRI Action Plan III (2020-2024), provides
another robust example of a program with specific objectives tied to program outputs and outcomes. The Action
Plan Summary includes a matrix with the program objectives, associated commitments, and metrics and targets
for tracking outputs and outcomes. For example, the commitments (e.g., program activities) for Objective 4.1:
protect and restore communities of native aquatic and terrestrial species important to the Great Lakes are (1)
identifying habitats to support these species and (2) taking actions to promote the health and connectivity of
these habitats. The associated outcome metrics are 4.1.1: acres of coastal wetland, nearshore or other habitats
3 The survey also provided respondents with these definitions.
4 The Workgroup characterized report types received via the NPM Information Request into several categories to help with analysis; these
include guidance documents, approach for success documents, and summary reports. Guidance documents cover the most
recent objectives, goals, and specific programmatic requirements for the grant program. Approach for success documents include other
supporting documents that inform and define how the program achieves success. Summary reports consolidate and present information
across grantees in the form of a full report, simple data report, summary report rollup or basic data report.
restored, protected, or enhanced; and 4.1.2: miles of connectivity established for aquatic species. Each of these
metrics has a baseline measurement with targets laid out for each year of the plan. For example, the baseline for
4.1.1 is 370,488 acres of habitat, with a target of 422,000 acres by FY 2024. With this tiered framework, the
GLRI creates a pathway to track progress with clear and measurable targets associated with grant program
commitments.
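The baseline/target structure described above lends itself to straightforward progress tracking. The sketch
below uses the metric 4.1.1 figures cited in the text; the interim measurement and function name are
illustrative assumptions, not GLRI data.

    # Progress against a baseline/target pair (GLRI metric 4.1.1 figures).
    BASELINE_ACRES = 370_488  # baseline for metric 4.1.1
    TARGET_ACRES = 422_000    # FY 2024 target

    def percent_of_target(current_acres: float) -> float:
        """Share of the baseline-to-target gap achieved so far."""
        return (current_acres - BASELINE_ACRES) / (TARGET_ACRES - BASELINE_ACRES)

    # Hypothetical interim measurement:
    print(f"{percent_of_target(400_000):.0%} of the way to the FY 2024 target")  # 57%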
Overall, through the document review, the Workgroup was only able to identify a small proportion of programs
with specific targets. However, the documents NPMs provided may not be comprehensive for all types of
documents and/or data systems that contain information on grant program targets. In Year 2, for each program
that is interviewed, the Workgroup will again request this information (if not already available). Articulating
specific objectives or targets associated with program outputs and outcomes is an important part of tracking
grant commitments met. Targets are essential for a program to determine whether or not it is succeeding in
implementing the program as designed and/or changing the conditions the program seeks to affect. A program
may still be able to track the program outputs and outcomes without targets but would not be able to indicate
whether the outputs and outcomes met the program goals.
b. What types of grant commitments data are tracked?
Grant programs frequently track grant commitments data on outputs but less frequently on outcomes, especially
long-term outcomes, or changes in environmental and/or health conditions. Based on self-reported data from the
survey, 457 respondents (99%) indicated that grantees report on outputs and 392 (85%) indicated grantees report
on outcomes. However, the Workgroup's assessment and characterization of the types of outcome data
respondents reported indicates the actual number is 257 (56%). The survey also asked respondents to report
"other types of data or information from grantees that is not a direct program output or outcome but does allow
the program to otherwise calculate or determine program outcomes," hereafter 'other types of data.' Collection
of other types of data was far less frequent among respondents, with only 11% indicating this occurs.
Respondents indicated outreach, education, public engagement activities or materials; and plans, strategies,
procedures, protocols were the top two types of outputs that grantees report (see Figure 1).5 These reflect the
5 Complete definitions of output types:
• Plans, strategies, procedures, protocols (e.g., program plan, implementation strategy, assessment procedures, identification
protocols)
• Policy or regulatory adoption (e.g., organizational or governmental adoption of specific policy or regulation; standard policy
language integrated into state, local, tribal policies or contract specifications)
• Inspections, compliance monitoring, enforcement activities (e.g., field or laboratory actions, or desktop reviews that assess
conditions compared to established standards)
• Other types of testing or monitoring (e.g., types of field or laboratory testing not related to enforcement)
• Outreach, education, public engagement activities or materials (e.g., general public communication, open-houses, webinars,
conferences, websites, publications, partners engaged/communicated with)
• Formal training/certification materials or sessions (e.g., curricula development for training courses, implementation of training
courses)
• Other types of technical support or assistance activities (e.g., program implementation guidance or support)
• Agreements with governmental entities or non-governmental partners (e.g., interagency agreements, partnership agreements, MOU,
MOA)
• Funding provided (e.g., sub-awards or other funding disseminated)
• Leveraged resources (e.g., external funding leveraged with EPA's funding; volunteer recruited)
• Projects or practices implemented (e.g., on-the-ground projects implemented such as green infrastructure installed, best
management practices, purchase of new equipment, partnership activities conducted)
• Data collection and reporting other than grantee reporting requirements (e.g., reporting environmental data in an EPA database)
types of activities that EPA's grant programs are engaging in, what they anticipate from those activities, and
what EPA can expect to report and communicate about their efforts.
Figure 1. Types of Outputs Grantees Report on as Specified in Their Work Plan
[Bar chart; number of respondents per output type, most to least common: Outreach (379), Plans (357), Implemented Projects (307), Technical Assistance (280), Funding (250), Other Testing (246), Training (242), Data Collection (205), Agreements (196), Policy (192), Leveraged Resources (177), Inspections (176), Other (value not legible).]
For those that indicated grantees report on outcomes, the survey asked respondents to list up to five of the most
important outcomes grantees report on as specified in their work plan. Across all five of the responses, the
Workgroup coded the information reported collectively as one or more of the types of outcome categories: early,
mid, and long. A single respondent could provide responses reflective of more than one type of outcome; 257
respondents provided examples of outcomes. If more than one outcome from a respondent fell into the same
category, it was only coded once, so counts reported reflect programs as implemented in a region. We also
coded instances of when the program reported outputs as opposed to outcomes. Of the 257 that provided actual
examples of outcomes, most reported mid-term outcomes; these are understood as changes in behaviors that
would lead to long-term outcomes or changes in environmental or health conditions (see Figure 2). However,
respondents also frequently reported outputs when asked about outcomes (either exclusively, or in conjunction
with outcomes). Across all 462 responses, only 145 (31%) indicate that grantees report on long-term outcomes
or changed environmental or health conditions. Depending on the specific goals of the grant program, capturing
early, or mid-term outcomes (or even sometimes outputs) in lieu of long-term outcomes may be appropriate, but
the data collected to-date do not currently provide this important contextual information. Year 2 of this project
will further explore the appropriate metric endpoints for grant programs.
Figure 2. Number of Respondents Indicating Grantees Report Outcomes (by Type) as Specified in Their Work Plan and
Number of Respondents Mistakenly Reporting Outputs When Asked About Outcomes.
[Bar chart; categories: Early (Awareness), Mid-Term (Behaviors), Long-Term (Conditions), Outputs; values not legible.]
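The coding rule described above (a respondent's outcome examples may map to several categories, but each
category is counted at most once per respondent) can be sketched in a few lines; the respondent identifiers and
category assignments below are hypothetical.

    # Counting coded outcome categories, at most once per respondent.
    from collections import Counter

    # Hypothetical coded responses: respondent -> categories assigned by coders
    coded = {
        "resp_001": ["mid", "mid", "output"],
        "resp_002": ["long"],
        "resp_003": ["early", "mid", "long"],
    }

    counts = Counter()
    for categories in coded.values():
        counts.update(set(categories))  # set() enforces once-per-respondent
    print(counts)  # e.g., Counter({'mid': 2, 'long': 2, 'output': 1, 'early': 1})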
The difference between the number of self-reported outcome responses and our assessment may reflect a
disconnect in staff's understanding of what an 'outcome' is in the context of program measurement and
evidence building. Although the survey provided a specific definition and examples (see the Outputs and
Outcomes sidebar above), this may indicate an opportunity for improving staff's 'measurement literacy' to better ensure relevant grant
commitments data are tracked.
Among the 11% of responses self-reporting that grantees report on "other types of data or information from
grantees that is not a direct program output or outcome but does allow the program to otherwise calculate or
determine program outcomes," only a handful reported data that actually fit in this category. Across all
responses, ten indicate grantees report success stories, six lessons learned, six emissions inventories, four
research objectives, three sub-recipient progress, and two provided demographic information. This suggests that
collection of other types of data for the purpose of determining program outcomes by EPA staff is not a regular
occurrence. In the absence of directly collecting long-term outcomes from grantees, collecting other types of
data or information from grantees that allows the program to calculate or otherwise determine program
outcomes is a viable approach for determining grant commitments met. These data may not be relevant for all
programs but could also complement direct output and/or outcome data collected from grantees.
i. How do tracked grant commitments data vary across the Agency?
The types of tracked grant commitments data vary more for outcomes than outputs when comparing across grant
programs by media type and region.6 However, in general, data do not vary substantially across the Agency.
This section examines variation in tracked grant commitments data (output and outcome types) across the
Agency based on these two factors. We also present an in-depth comparison of program outcomes for the subset
of programs presented in the program fact sheets (Appendix G).7
The types of outputs that respondents reported did not vary substantially across regions. Generally, respondents
in each region reported plans and outreach as the most common types of outputs, following the same pattern as
the general survey results. Output types also did not vary substantially across grant program media types.
Respondents across media types indicated that plans and outreach were the most common types reported by
grantees, again mirroring the survey-wide results. The minor variation in output types across grant program
media types logically corresponds with the media focus area. For example, the Environmental Education media
type indicated trainings as one of the most common types of outputs, which aligns with expected program
activities for grants in this media area.
The types of outcomes respondents reported varied more than outputs across regions. Most regions generally
followed the same pattern as the survey-wide results; mid-term outcomes are the most commonly reported type.
Compared to outputs, outcome types were more varied across grant program media types. Although mid-term
outcomes are the most common type in the majority of media types, there are some exceptions. Respondents
reporting for Environmental Education media programs indicated early outcomes as the most common type.
Logically, this makes sense; education is inherently focused on changing the target audience's awareness of
issues and may reflect the ultimate goals of this particular program. Long-term goals may not be a reasonable
outcome for these types of programs.
6 Media types: Air, Brownfields, Environmental Education, Environmental Justice, Multi-Media, Pesticides, Research, Solid Waste,
Superfund, Toxic Substances, Water
7 Sub-set of seven programs identified by the Grant Commitments Team include: Capitalization Grants for DWSRF (66.468- FS), Diesel
Emission Reduction Act (DERA) National Grants (66.039 - DE), Diesel Emission Reduction Act (DERA) State Grants (66.040 - DS),
Brownfields Multipurpose, Assessment, Revolving Loan Fund, and Cleanup Cooperative Agreements (66.818 - BF), Pollution
Prevention Grant Program (P2) (66.708 - NP), Indian Environmental General Assistance Program (GAP) (66.926 - GA), Environmental
Justice Small Grant Program (66.604 - EQ), WPC State and Interstate Program Support (Section 106) (66.419 - I). See Appendix G for
additional analyses on these programs.
For an in-depth exploration of variation in reported outcomes, the Workgroup evaluated a select subset of
programs for variation in actual outcomes and outcome types reported. The table below summarizes
outcome reporting for each program in the subset. As anticipated, the outcomes reported reflect the topical focus
of each grant program (see 'outcome characterization' column). However, there was inconsistency across
respondents for each grant program on the reported outcomes (see 'summary' column). Additionally, many
respondents reported outputs or other types of information in the outcome prompt (see 'other information reported
in prompt' column). The Pollution Prevention Grant Program (P2) is notable for the high level of consistency in
reported outcomes across program respondents. This may be explained by the use of an Excel template for
grantee reporting, which includes a "Summary Results" tab for outcome reporting. The summary results
categories requested in the template directly correspond to the categories of outcomes reported by respondents.
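The P2 example suggests why standardized templates matter for consolidation: identically structured grantee
reports can be stacked into a single program-level table with minimal effort. The sketch below assumes
hypothetical file paths and column names; only the "Summary Results" tab name comes from the text.

    # Consolidating identically structured grantee Excel templates.
    import pandas as pd

    grantee_files = ["grantee_a.xlsx", "grantee_b.xlsx"]  # hypothetical paths
    frames = [pd.read_excel(f, sheet_name="Summary Results") for f in grantee_files]
    program_results = pd.concat(frames, ignore_index=True)

    # With shared column names, program-level rollups become one-liners, e.g.:
    # program_results.groupby("outcome_category")["value"].sum()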
Outcome Reporting for Selected Grant Programs

Capitalization Grants for DWSRF
  Summary: 4 out of 9 respondents report on outcomes. Inconsistency in outcomes reported. Focus is public health and equity outcomes.
  Outcome Types Reported: Long
  Outcome Characterization:
  • Protecting public health through safe drinking water (4/4)
  • Affordable drinking water (1/4)
  Other Information Reported in Prompt: Loans/grants provided, affordability of loans, staff training

Diesel Emission Reduction Act (DERA) National Grants
  Summary: 7 out of 10 respondents report on outcomes. Inconsistency in outcomes reported. Focus is on emissions reductions and benefits from improved air quality.
  Outcome Types Reported: Mid, Long
  Outcome Characterization:
  • Emissions reductions (5/7)
  • Vehicles/engines/machines/fleets changed to lower emitting versions (4/7)
  • Air quality improvement (2/7)
  • Benefits to community (1/7)
  Other Information Reported in Prompt: Outreach activities, implementation of schedule, cost effectiveness

Diesel Emission Reduction Act (DERA) State Grants
  Summary: 7 out of 10 respondents report on outcomes. Inconsistency in outcomes reported. Focus is on emissions reductions and benefits from improved air quality.
  Outcome Types Reported: Mid, Long
  Outcome Characterization:
  • Emissions reductions (5/7)
  • Vehicles/engines/machines/fleets changed to lower emitting versions (4/7)
  • Fuel savings (3/7)
  • Air quality improvement (2/7)
  • Benefits to community (1/7)
  Other Information Reported in Prompt: Outreach activities

Brownfields Multipurpose, Assessment, Revolving Loan Fund, and Cleanup Cooperative Agreements
  Summary: 7 out of 9 respondents report on outcomes. Inconsistency in outcomes reported. Focus is on site reuse and human health/environmental benefits.
  Outcome Types Reported: Early, Mid, Long
  Outcome Characterization:
  • Sites ready for reuse (5/7)
  • Jobs created (5/7)
  • Site redevelopment (3/7)
  • Human health benefits (1/7)
  • Environmental benefits (1/7)
  • Increased knowledge of Brownfields issues (1/7)
  Other Information Reported in Prompt: Funding leveraged, outreach activities, partnerships

Pollution Prevention Grant Program (P2)
  Summary: 8 out of 10 respondents report on outcomes. Very consistent outcomes reported. Focus on water conservation, emissions reductions, cost savings, energy savings, and reduction of hazardous materials.
  Outcome Types Reported: Mid, Long
  Outcome Characterization:
  • Water conservation (8/8)
  • Metric tons of CO2 equivalent reduced (8/8)
  • Cost savings (8/8)
  • Reduction of hazardous materials (8/8)
  • Energy savings (1/8)
  Other Information Reported in Prompt: N/A

WPC State and Interstate Program Support (Section 106)
  Summary: 8 out of 10 respondents report on outcomes. Inconsistency in outcomes reported. Focus is on water quality improvement, protection, monitoring, and human health outcomes.
  Outcome Types Reported: Early, Mid, Long
  Outcome Characterization:
  • Water quality improvement/protection (5/8)
  • Public awareness (4/8)
  • Increase community involvement (1/8)
  • Human health impacts - water quality (1/8)
  Other Information Reported in Prompt: Outreach activities, monitoring activities and results, staff training, workgroup activities, NPDES permitting, TMDL/WQS development

Environmental Justice Small Grant Program
  Summary: 3 out of 10 respondents report on outcomes. Some consistency across outcomes reported. Focus is on reducing negative public health/environmental outcomes, increasing public knowledge, and influencing a positive change in community behavior.
  Outcome Types Reported: Early, Mid, Long
  Outcome Characterization:
  • Increased public knowledge/awareness (3/3)
  • Positive change in community behavior (2/3)
  • Decrease negative public health/environmental conditions (2/3)
  • New certifications, procedures and/or policies (2/3)
  • Increase natural disaster resiliency (1/3)
  Other Information Reported in Prompt: N/A
ii. To what extent does the data reported by grantees provide EPA with information on
progress towards meeting grant commitments?
The survey asked respondents to directly indicate their level of agreement that the types of output or
outcome data received from grantees meet the reporting requirements that are specified in their work
plan. Respondents overwhelmingly agreed that the data received from grantees meets their reporting
requirements: 94% and 89% of respondents agreed or strongly agreed for outputs and outcomes,
respectively (see Figure 3). This suggests a high level of alignment between the actual output and
outcome data provided by grantees and the requirements of their work plan, indicating that the grantee-to-
EPA reporting processes are generally functioning to provide the data needed to report on grant
commitments.
Figure 3. Respondents' level of agreement that data type meets reporting requirements in grantee work plan
[Stacked bar chart; scale: Strongly disagree, Disagree, Neither, Agree, Strongly agree. Outputs: 5%, 66%, 28%; Outcomes: 10%, 62%, 27% (segment labels not fully legible).]
Respondents provided several reasons for their positive responses, which were systematically coded into
the following categories (counts reflect instances of reporting). Output data meet the requirements of the
workplan because:
• Standardized reporting process (84). Grantees report on their progress to program officers at
regular intervals using a standard process.
• Clear workplan deliverables or requirements (23). Deliverables and requirements in the
original workplan are clearly stated and defined and agreed upon by grantee and grantor.
• Consistent and ongoing communication (21). Regular communication and/or meetings between
EPA and grantees ensure quality data.
• Grantee obligation to meet reporting requirement (17). A statutory or regulatory obligation, or an
obligation within the terms and conditions of the grant, requires that the grantee provide data.
• Database use (13). Data are tracked through a database.
• Reporting includes review (6). Program officers review grantee reports to ensure data reflect
workplan requirements and to provide a quality assurance check.
• Grantee training (2). EPA provides grantees with regular training which helps ensure effective
and clear data reporting.
• Output data are tangible (1). Output data are specific and tangible making it easy for grantees
to provide this type of information.
Respondents also provided some explanations for why output data may not meet the requirements of the
workplan, including:
• Poor data quality (13). Data quality varies based on the grantee and program.
• Unclear workplan deliverables or requirements (4). If the grantee's commitments are not well
defined at the outset of the grant, reporting may suffer.
• Internal issues with grantees (3). Staff turnover, lack of capacity, and small scale of grant
recipients may cause internal issues that limit their ability to produce output data.
Outcome data reflect similar themes as those for outputs. As reported by respondents, outcome data meet
requirements of the workplan because:
• Standardized reporting process (84). Grantees report on their progress to program officers at
regular intervals using a standard process.
• Clear workplan deliverables or requirements (21). Deliverables and requirements in the
original workplan are clearly stated and defined and agreed upon by grantee and grantor.
• Consistent and ongoing communication (13). Regular communication and/or meetings between
EPA and grantees ensure quality data.
• Database use (9). Data are tracked through a database.
• Grantee obligation to meet reporting requirement (6). A statutory or regulatory obligation, or an
obligation within the terms and conditions of the grant, requires that the grantee provide data.
• Reporting includes review (2). Program officers review grantee reports to ensure data reflects
workplan requirements and provide a quality assurance check.
• Outcome data build on project outputs (2). Outcome data align with the project outputs. If the
outputs are met, the outcomes generally are too.
• Grantee training (1). EPA provides grantees with regular training which helps ensure effective
and clear data reporting.
Respondents' explanations for why outcome data may not meet the requirements of the workplan include:
• Poor data quality (10). Data quality varies based on the grantee and program.
• Reporting does not prioritize outcome data (8). Data reporting (including templates and
processes) is focused on collecting output data, not outcome data.
• Outcome data not standardized (4). Program has not established standard outcome metrics for
the program resulting in inconsistent outcome data reporting across the program and/or from a
single grantee.
• Outcome data are intangible (3). Outcome data are difficult to identify and impossible to track.
• Unsubstantiated outcome data (1). Grantees do not provide enough context to the outcome data
provided.
• Not evident that an outcome was achieved through outputs reported (1). Outputs reported do
not imply an achieved outcome.
• Unforeseen circumstances (1). Circumstances outside grantee's control affect grantee's ability
to report data.
• Unclear workplan deliverables or requirements (4). If the grantee's commitments are not well
defined at the outset of the grant, reporting may suffer.
• Grantees need additional training (1). Grantees need additional training on how to report
correctly.
The respondents' reasons why outcome data may not meet workplan requirements suggest EPA grant
managers face more difficulty, compared to outputs, in collecting outcome data from grantees.
Performance Partnership Grants (PPG)
PPG status is another variable that was examined to better understand the difficulty of tracking grantee
progress towards meeting commitments; 47 respondents (10%) indicated their grant program is managed
as a PPG; another 102 (22%) indicated some parts are managed as a PPG and some are not; and 313
(68%) indicated they are not PPG managed at all. For all grant programs, survey respondents were asked
to "rate how difficult or easy it is for your program to track grantee progress towards meeting grant
commitments." Overall, the data indicate it may be slightly more difficult to manage a grant that is
managed as part of a PPG; this is particularly relevant for grant programs where some parts are managed
as a PPG, and some are not.
The Workgroup analyzed the dataset comparing PPG-only managed grants (n=47) and 'not PPG
managed" grant responses (n=313; see Figure 4). The spread of overall difficulty for PPG managed grants
is slightly skewed towards difficult, whereas the spread for non-PPG managed grants leans towards easy.
Respondents rated 64% of grants not managed as a PPG as very easy or easy, compared to 51% of PPG
managed grants. The Very Difficult - Difficult range accounts for 8% of total responses for PPG
managed grants, and only 4% for non-PPG managed grants. This provides some evidence that it may be
more difficult to track commitments under PPG managed grants.
Figure 4. Level of difficulty if only managed as PPG or not PPG managed
[Stacked bar chart; scale: Very difficult, Difficult, Neither difficult nor easy, Easy, Very easy. Not PPG Managed: 4%, 33%, 45%, 19%; PPG Managed: 6%, 40%, 40%, 11% (segment labels not fully legible).]
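A minimal sketch of the group comparison shown in Figure 4 (the share of responses at each difficulty
rating, by management type) follows; the short response lists are hypothetical stand-ins for the survey
data, which had n=47 and n=313.

    # Share of responses at each difficulty level, by PPG status.
    from collections import Counter

    SCALE = ["Very difficult", "Difficult", "Neither", "Easy", "Very easy"]

    def rating_shares(ratings):
        counts = Counter(ratings)
        return {level: counts[level] / len(ratings) for level in SCALE}

    ppg = ["Difficult", "Neither", "Easy", "Easy", "Very easy"]      # stand-in
    non_ppg = ["Neither", "Easy", "Easy", "Very easy", "Very easy"]  # stand-in
    for name, group in [("PPG Managed", ppg), ("Not PPG Managed", non_ppg)]:
        print(name, {k: f"{v:.0%}" for k, v in rating_shares(group).items()})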
The difference in reported difficulty levels is more striking for programs where some parts are managed
as a PPG, and some are not (n=102, e.g., within one region two states have standalone grants and one
state has the grant under a PPG). For the parts managed under a PPG, very difficult - difficult responses
represent 20% of total responses (see Figure 5). In comparison, this range only accounts for 4% of total
responses for the part not managed as a PPG. The data indicates that PPG management may be most
difficult when some parts of a grant are managed under a PPG and some parts are not. This topic will be
examined further in Year 2.
Figure 5. Level of difficulty if some parts are PPG managed and others are not
[Stacked bar chart; scale: Very difficult, Difficult, Neither difficult nor easy, Easy, Very easy. Not PPG Managed: 35%, 39%, 18% (remaining values not legible); PPG Managed: 4%, 16%, 34%, 34%, 8%.]
Respondents" explanation for why it is difficult to track progress with PPG grants data include:
• Labor Intensive (13). Reporting process is labor intensive and/or time consuming for either the
program officer, grantee, or both.
• PPG grants require extra coordination (7). Difficult to track progress because of necessary
coordination between divisions and other program offices.
• Financial tracking is difficult (6). Grant funding is specifically difficult to track within the
reporting requirements.
• Communication Issues (4). Unclear or inconsistent communication between program officers
and grantees.
• Internal Issues within EPA (3). Issues within the program office make reporting difficult.
• Disorganized reporting process (3). Process for reporting is not standardized and/or lacks clear
structure.
• Multi-media PPGs are more difficult to track (2). Multi-media program grants create
challenges in tracking progress not seen in other PPGs.
• PPG progress is interconnected (2). Problems in one grant may affect other grants within the
PPG, making it difficult to identify which grant is having issues.
• Reporting process needs improvement (1). Respondent indicated that the storage database
could be improved to ease the reporting process.
• No process for tracking progress (1). Lack of standard process to track progress towards
meeting commitments.
• Hard to track PPG progress, easy to track Non-PPG progress (1). Respondent specifically
indicated that tracking progress is more difficult with PPG grants, compared to Non-PPG grants.
Although the overall survey indicates PPG managed grants may be more difficult to manage, nine
respondents specifically indicated in the open-ended comment field that tracking progress is the same for
PPG and non-PPG grants. The overall ratings for these nine respondents ranged from very easy to neither
difficult nor easy. As an example, one respondent for the State Underground Water Source Protection
program indicated that the level of work to track progress is the same for both types.
iii. To what extent does the data reported by grantees provide information that currently allows
EPA to measure outputs, outcomes, and impacts related to equity and climate change?8
Justice40 Initiative
Justice40 is a federal, government-wide effort
established by President Biden's Executive
Order (EO) 14008 "Tackling the Climate Crisis
at Home and Abroad" in January of 2021. Jhe
goal is to redistribute federal investments to
deliver 40% of benefits to disadvantaged
communities.
• This fact sheet summarizes how the
executive order directs the Biden
administration to combat the climate crisis
through foreign policy, infrastructure,
energy, conservation, and the economy.
EPA has integrated the Justice40 Initiative
into Agency-wide operations, which drives
interest in this topic as it relates to grant
programs.
• The initiative is incorporated throughout
the objectives of the Draft FY 2022-2026
EPA Strategic Plan, released October 1,
2021.
• For example, the EPA Drinking Water
State Revolving Fund was identified as
one of the 21 priority programs in the
Justice40 pilot program.
Available information from guidance documents and the survey
indicates the data currently reported by grantees provides limited
information on outputs and outcomes related to equity and climate
change. The Workgroup reviewed program guidance documents
and searched open-ended survey responses for the presence of
metrics addressing equity, climate, and/or tribal entities.
Of the 72 programs providing documents, 13 programs track
metrics associated with outputs, outcomes, or impacts related to
equity and/or climate change. Of these 13 programs, most fell into
this category because the program directly targets under-
resourced/underserved communities, e.g., tribal-specific grant
programs. More commonly, grant programs broadly discuss the
importance of equity and/or climate change in guidance
documents, but do not define specific metrics to track outputs,
outcomes, or impacts. For example, the Long Island Sound
Program's Long Island Sound 2015 Comprehensive Conservation
and Management Plan describes how environmental justice is
incorporated into program priorities. However, the plan stops short
of defining specific metrics to track progress towards meeting
environmental equity goals over the timeframe of the plan.
Based on a set of keywords, the Workgroup searched open-ended
survey responses to identify outputs, outcomes, or impacts related
to equity and/or climate change.9,10 The keyword search provided
13 examples of outcome metrics that relate to equity. Nine of these
examples involve outcome metrics for tribal entities. For example,
the Training, Investigations, and Special Purpose Activities of Indian Tribes under the CAA grant
program reported "Changed behaviors in tribal members as a result of healthy homes training" as one of
the five most important grant outcomes. This program also includes specific equity metrics within the
program Guidance Document that measure the number of tribes that have achieved various program
targets. The remaining examples generally relate to outcome metrics for 'disadvantaged communities.'
For example, the Assistance for Small and Disadvantaged Communities Drinking Water Grant Program
lists "Enhanced technical, managerial, and financial capability of public water systems in small and
disadvantaged communities" as one of the five most important grant outcomes.
The Workgroup also identified three examples of other types of data that EPA could use to determine
equity impacts. These data include demographic information on program participants and location
information for grant activities. For example, the Region 3 Leaking Underground Storage Tank Trust
8 EPA's interest in equity and climate change issues is driven by the Justice40 Initiative, described in the call-out box above.
Notably the research, including the survey and document review occurred during the spring and summer of 2021, likely prior to
any substantial effect of the Initiative on the data tracking efforts associated with grants.
9 Search terms: equity, equality, justice/environmental justice, EJ, work force development, disadvantaged community(ies),
equitable access, tribal, vulnerable.
10 Search terms: resiliency, climate/climate change, global warming, carbon dioxide, greenhouse gas, sea-level rise, emissions.
Fund Program reports they collect location data on cleanup sites to determine sites for future priority
setting, including consideration of environmental justice data/environmental sensitivity.
The Workgroup identified 25 examples of outcome metrics that track climate related impacts. The
majority of identified examples had metrics specifically focused on emissions, including carbon dioxide
emissions. Generally, these metrics are tied to specific grant activities and depend on the grant focus area.
For example, the Pollution Prevention grants program includes "reductions in metric tons of carbon
dioxide equivalent" as one of five most important grant outcomes tracked. Other examples of outcome
metrics focused on resiliency and general climate change impacts. For example, the Long Island Sound
Program reported "system resiliency and function are maintained by protecting, restoring, and enhancing
habitats" as one of their five most important outcomes. Several outcomes we identified relate to climate,
but do not include clear metrics to track progress. One example of this is the Solid Waste Management
Assistance: Training Education Studies and Demonstrations, which lists "reduce climate change" as one
of the five key outcomes.
In general, based on open-ended responses and grant program guidance documents, there are limited data
currently available to help EPA quantitatively and comprehensively measure outputs, outcomes and
impacts related to equity and climate change. Of the programs that have equity and/or climate related
components, few include specific metrics to measure outputs, outcomes, and impacts.
c. How do grant programs identify relevant grant commitments to track?
Year 2 of the work will explore this in greater detail via interviews, but available evidence suggests that
programs may identify relevant grant commitments to track from a centralized top-down process (e.g.,
statute, determination by the national program), past performance of the program, or a program-level
action plan. In Year 1, there was no data source that directly addressed this evaluation question. Instead,
we utilized information respondents provided in open-ended responses to the survey questions to identify
relevant information.11 From the survey, the Workgroup identified nine responses that described ways in
which grant programs identify relevant grant commitments to track. These include:
• Defined by statute. One respondent indicated the State Lead Grants program has well-defined,
easily tracked program outputs as dictated by statute, the Toxic Substances Control Act: Sec.
404(g).
• National strategic performance measures. A respondent for the State Public Water System
Supervision program reported that the output data and reports are tied to national strategic
performance measures.
• Past performance of grants. Past progress dictates direction of future grants that are funded and
the subsequent grant commitments to track (e.g., Long Island Sound Program).
• Program-level action plan. The Great Lakes National Program Grants (66.469GL - Region 5) indicated that the Great Lakes Restoration Initiative Action Plan includes objectives, commitments, and measures that inform the reporting plan and the types of output data collected.
11 The Workgroup reviewed open-ended responses to Questions 15 and Question 19. Question 15 asks how strongly respondents
agree that the types of output data that the program collects enable the program to effectively track progress and/or recognize and
address problems. Question 19 asks how strongly respondents agree that the types of outcome data that the program collects
enable the program to effectively track progress and/or recognize and address problems.
Given the opportunistic nature of the data presented here, these categories may not cover all pathways grant programs have for identifying relevant commitments to track. Therefore, we propose using interviews in Year 2 to collect additional information on the ways in which programs select relevant grant commitments, including appropriate goals, metrics, and targets.
d. What data reporting processes, tools, and systems do EPA's grant award programs use?
EPA's grant programs most often use Word documents for collecting data from grantees and
SharePoint/Teams/OneDrive for storing data collected from grantees.
Nearly 91% of respondents receive information from grantees in Word (see Figure 6). Over 50% of respondents also indicated Adobe PDF, Excel, and 'text in the body of an email' as other popular reporting mechanisms. Although a large proportion of respondents indicated databases and EPA data systems are used to store data collected from grantees (61% and 42%, respectively; see Figure 7), these are not used as frequently for grantee reporting (17.5%). This is likely by design, as not all databases support direct user input from disparate users, but it may also indicate an opportunity for improving data collection processes in the future.
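Because respondents could select any and all mechanisms, the percentages above are tallies over overlapping selections rather than exclusive categories, so shares sum to more than 100%. A minimal sketch of that tally, using hypothetical response records in place of the actual survey export:

    from collections import Counter

    # Hypothetical multi-select responses; each set lists every mechanism
    # one respondent's grantees use to report to EPA.
    responses = [
        {"Word document", "Adobe PDF"},
        {"Word document", "Excel workbook", "Text in the body of an email"},
        {"Word document", "EPA data system"},
    ]

    counts = Counter()
    for selected in responses:
        counts.update(selected)

    n = len(responses)
    for mechanism, count in counts.most_common():
        # Shares can sum to more than 100% because selections overlap.
        print(f"{mechanism}: {count} of {n} respondents ({100 * count / n:.0f}%)")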
Each of the data reporting mechanisms has implications for data standardization and for how these data can or must be stored at the Agency. For example, use of Word documents implies a data storage mechanism that can contain and organize Word documents for future retrieval. Word documents and 'text in the body of an email' are two formats that are not necessarily conducive to standardizing data collection and facilitating data consolidation across multiple grantee projects. For both formats, grantees can easily manipulate or work around the original data request because these formats do not require respondents to select from pre-set response options or complete required data fields (as a database approach would), which may hinder effective data collection. 'Text in the body of an email' is also concerning as a data reporting mechanism from a data quality and tracking perspective, because only the direct recipient of the email has access to the grantee-reported data.
Figure 6. Reporting mechanisms grantees use to report to EPA (number of respondents)
• Word document: 420
• Adobe PDF: 273
• Excel workbook: 262
• Text in the body of an email: 233
• EPA data system: 160
• Database: 81
• Other: 66
• Microsoft Forms: 6
• Microsoft Access: 4
• Survey: 2
• No mechanism: 1
'Other' reporting mechanisms: Meeting (28), Unspecified (19), Phone call (7), Hard copy documents (7), Video (4), Photos (4), Map (2), Presentation (2), Flash drive (2)
Survey respondents were asked to report on any and all reporting mechanisms grantees use to report progress on meeting grant commitments to EPA. The mechanisms that EPA grant programs use to store data collected from grantees are dominated by SharePoint/Teams/OneDrive, followed by databases, files in a desktop folder, EPA data systems, and Excel workbooks (see Figure 7).
Figure 7. Data storage mechanisms for data collected from grantees (number of respondents)
• SharePoint/Teams/OneDrive: 335
• Database: 282
• Files in a desktop folder: 273
• EPA data system (GRTS, etc.): 195
• Excel workbook: 130
• Other
• No data storage mechanism: 11
• Microsoft Access: 4
'Other' mechanisms of data storage: hard copy documents, electronic files, email, Adobe (2), unspecified (1)
The survey also asked respondents to provide the names of specific databases used to store grantee data. The survey and NPM information request together yielded 55 databases across all programs. These often align with the specific grant program (e.g., the P2 Grants Plus system) and/or media type (e.g., ACRES for Brownfields). A notable number of respondents also indicated that grantee data are stored in the EPA Grant E-File system, an administrative grant management program. This is a relatively new data system developed in response to work-from-home practices during the COVID-19 pandemic; it was built to digitize a previously predominantly paper-based system.12 Since this is an administrative grants management system, it is unlikely to actually contain output and outcome metrics. The database responses from the NPM Information Request corroborated what was provided via the survey and added two databases: the Budget Formulation System (BFS) and COMPLY (a tool that calculates the effective dose equivalent from radionuclides released from stacks and vents).
12 Based on email and verbal communication with the Grant Commitments Team representing the EPA Office of Grants and Debarment.
Figure 8. Databases EPA grant programs use to store grantee data
IGMS/Next Generation Grants System (NGGS)
EPA Grant E-File system (started in response to COVID-19)
Assessment Cleanup + Redevelopment Exchange System (ACRES)
Air Quality System (AQS) Data Mart
Water Quality Exchange (WQX)
Pollution Prevention (P2) Grants Plus system
National Incident Management System (NIMS)
State Grant Workplan and Progress Report Database
Leaking Underground Storage Tank
Central Data Exchange
Grants Reporting and Tracking System (GRTS)
Safe Drinking Water Information System (SDWIS)
Environmental Information Exchange Network (EN)
Database Rep. Innovative Vehicle Emission Red. (DRIVER)
Exchange Network Program Tracking System
Enforcement and Compliance History Online
Project Benefits Reporting System (PBR)
Clean Water Benefits Reporting (CBR) System
Resource Conservation + Recovery Information (RCRAInfo)
Superfund Enterprise Management System (SEMS)
Underground Injection Control (UIC) Database/Inventory
Indoor Air Quality (IAQ) Impact Database
Grants Solution
Beach Advisory and Closing Online Notification (BEACON)
Web Reports Manager
Wetlands Grants Database (WGD)
Program, Beach Advisories, Water quality, Nutrients (PRAWN)
Federal Insecticide, Fungicide, Rodenticide Act (FIFRA) STAG
Quality Assurance Tracker
National Estuary On-line Reporting Tool
Ezrecord - records management system
Compass Business Objects Reporting
Assessment TMDL Tracking + Implementation Sys. (ATTAINS)
West Coast Collaborative
Unliquidated Obligations
Plantwide Applicability Limitations
Unspecified database related to the Graduate Research Internship Program
Financial and Ecosystem Accounting Tracking System (FEATS)
Environmental Accomplishments in the Great Lakes (EAGL II)
Annual Commitments Survey
Branch Reporting and Tracking System (BRATS)
State Database with EPA Access
State Planning Electronic Collaboration System (SPeCS)
Southeastern New England Program (SNEP)
Safe Drinking Water Accession + Review System (SDWARS)
Integrated Compliance Information System (ICIS-NPDES)
Integrated Compliance Information System (ICIS-Air)
Granttrax
Grants.gov
Great Lakes Environmental Database System (GLENDA)
General Assistance Program Online System
Enterprise Content Management System (ECMS)
Contract Administrative Tracking System (CATS)
Most respondents indicated grantees report semi-annually (44%), quarterly (38%), or annually (28%). Notably, the counts in Figure 9 are not exclusive; for example, in some cases a respondent indicated grantees report both quarterly and annually.
Figure 9. Reporting frequency by type (number of respondents)
• Semi-annually: 202
• Quarterly: 175
• Annually: 128
• Rolling
• Variable: 5
• Monthly: 4
• NA: 1
Respondents most often agreed or strongly agreed that reporting frequency allows them to track progress and
address problems with the grants they manage (Figure 10).
Figure 10. Respondents' level of agreement that the reporting frequency is appropriate to effectively track progress and recognize and address problems
(Counts by level of agreement; the two largest values in each row correspond to 'agree' and 'strongly agree.')
• Track Progress: 9, 36, 273, 139
• Address Problems: 12, 44, 270, 131
The Workgroup examined the relationship between reporting frequency (semi-annually, quarterly, and annually) and the ability of a program to track progress and address problems. As a whole, there is no clear relationship between grantee reporting interval and respondents' agreement that reporting frequency is appropriate to effectively track progress on grant commitments and recognize and address problems. The percent that agreed or strongly agreed for each reporting interval, for both tracking progress and addressing problems, ranged between 87% and 91%. For semi-annual reporting, the percentages for addressing problems and tracking progress were 87% and 90%; for quarterly, 91% and 91%; and for annual reporting, 88% and 91%, respectively.
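The comparison above reduces to computing, within each reporting-interval subgroup, the share of respondents who agreed or strongly agreed. A minimal sketch of that computation, with hypothetical records standing in for the survey data:

    # Hypothetical (reporting interval, agreement level) records.
    records = [
        ("semi-annual", "Agree"), ("semi-annual", "Strongly agree"),
        ("quarterly", "Agree"), ("quarterly", "Neither"),
        ("annual", "Strongly agree"), ("annual", "Disagree"),
    ]

    POSITIVE = {"Agree", "Strongly agree"}

    # Tally total and positive (agree/strongly agree) responses per interval.
    by_interval = {}
    for interval, answer in records:
        total, positive = by_interval.get(interval, (0, 0))
        by_interval[interval] = (total + 1, positive + (answer in POSITIVE))

    for interval, (total, positive) in by_interval.items():
        print(f"{interval}: {100 * positive / total:.0f}% agree or strongly agree (n={total})")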
Respondents provided several reasons for their positive responses, which were systematically coded into
the following categories (counts reflect instances of reporting). The reporting frequency enables the
program to track progress and address problems because:
• Reporting is complemented with regular communication with grantees (52). Regular communication gives EPA grant managers an open line to grantees, helping to improve grant management and reporting.
• Reporting frequency is dependent on the grant/project (14). Reporting frequency is appropriate when it is tied to the scope and timeframe of the project.
• Grant requirements dictate notification (12). Grant requirements dictate that grantees notify EPA if they are not going to meet a commitment. This allows EPA to address problems even outside regular reporting frequencies.
• Reporting frequency tied to past performance (12). Allowing the reporting frequency to vary
with grantee experience and past performance allows grant program managers to find an interval
that works for the grantee to produce quality information. This frequency may be negotiated, or
decided by EPA, but it is agreed upon by both parties.
• Semi-annual reporting balances needed information with lower data burden (11). Semi-annual reporting allows grantees to make progress and enables recognition of problems without the added burden of more frequent reporting.
• Quarterly reporting helps recognize/address problems early on (10). Quarterly reporting is
frequent enough to recognize and address problems in a timely fashion, catching them early on in
the process.
• Quarterly reporting provides progress updates in real-time (9). Given that quarterly reporting
is so frequent, program officers are able to get updates on grantee progress in real-time, allowing
for nearly continuous tracking of grantee progress.
• Reporting is complemented by frequent receipt/review of grantee data (7). Grant program
managers receive grantee data throughout the term of the grant, which supplements information
provided through the specified reporting frequency.
• Annual reporting is complemented by interim progress reports (7). In addition to annual
project reports, grant program managers require interim progress reports.
• Reporting frequency aligns with workplan/project requirements (6). Reporting frequency
aligns with requirements laid out in the workplan or by project management.
• Consistent reporting frequency (4). Consistent reporting throughout the grant is effective.
• Variable reporting frequency (4). Variable reporting frequency for different program
requirements is effective.
• Annual reporting allows grantees to make substantial progress (5). Annual reporting allows
the grantees to make substantial progress on grant activities, resulting in complete and new
information in each report.
• Recent switch of reporting frequencies improved program management (3). Switching
reporting frequency helped with program management.
• Reporting frequency dictated by regulation or headquarters program (3).
• Reporting complemented with regular site visits (3). Regular site visits allow managers to
observe progress and identify issues.
• Quarterly reporting balances needed information with lower data burden (2). Quarterly reporting allows grantees to make progress and enables recognition of problems without the added burden of more frequent reporting.
Respondents provided several reasons why grantee reporting frequency may not help the program to track
progress on grant commitments and recognize and address problems, including:
• Annual reporting is not frequent enough to recognize/address problems on time (8). Annual
reporting allows issues with grantees to persist and delays course correction.
• Quarterly reporting is too frequent (5). Quarterly reporting creates an unnecessary burden on
grantees.
• Semi-annual reporting is not frequent enough to recognize/address problems on time (8).
Semi-annual reporting allows issues with grantees to persist and delays course correction.
• More frequent reporting is necessary (3). Although respondents acknowledge the extra burden
on grantees and grant managers, respondents feel more frequent reporting is necessary to
adequately track progress and/or address problems.
• Time lag in reporting (3). There may be a delay in receiving reports from grantees.
• Quarterly reporting does not reflect long-term change (3). When issues need to be addressed
over long periods of time, quarterly reporting does not adequately reflect the status of that issue.
• Lack of communication with grantee (2). Need for more consistent communication with
grantee.
• Reporting frequency does not consider seasonality of grant (2). Reporting frequency does not
adequately capture activities of the grant, as there is a high level of seasonality for the project.
There were also several respondents who used the open-ended text to simply state the sufficiency of the grantees' reporting frequency:
• Annual reporting is sufficient (26).
• Semi-annual reporting is sufficient (22).
• Quarterly reporting is sufficient (13).
Overall, the analysis of open-ended responses reveals differing opinions on reporting frequency, none of which necessarily conflict. Some respondents indicate reporting intervals are sufficient, while others argue for more frequent reporting; the right interval may depend on the specific grant program and the specific outputs and outcomes for that program. Year 2 will further investigate the role of reporting frequency in effective grantee data collection. A major takeaway from this work is the importance of regular communication with grantees: fifty-two respondents indicated that it gives EPA grant managers an open line of communication with grantees, helping to improve grant management and reporting.
i. How do grant award reporting systems vary across the Agency?
To evaluate how grant award reporting systems vary across the Agency, the Workgroup reviewed reporting mechanisms and storage systems across two key variables: region and media type. In general, the reporting mechanisms by region and media type follow the pattern of the full dataset; Word documents are the most common mechanism, followed by Adobe PDF, Excel, and 'text in the body of an email.' The exception is Excel, with higher reported use than Adobe PDF and 'text in the body of an email' in Regions 8 and 9. In comparing media types, the data indicate higher use of EPA data systems, relative to other reporting mechanisms, for the Brownfields and Pesticides media types.
The data storage systems that grant programs use vary somewhat across regions and media types. Most regions follow the pattern of the full dataset, although Region 2 has considerably more programs using the EPA Grant E-File system to store data. This is as expected, since Region 2 developed the system.13 The data storage systems used by grant programs generally correspond to the focus area of a given media type. For example, the Air media programs predominantly use AQS, the designated EPA air quality database. Similarly, Brownfields grant programs exclusively use ACRES, the EPA Assessment Cleanup and Redevelopment Exchange System. Aside from Multi-Media programs, Water programs have the largest variation in data storage mechanisms, with respondents indicating the use of 24 different systems.
For an in-depth view, the Workgroup also reviewed the top reported databases used by survey respondents to investigate patterns across media types. In several cases (and as expected), databases align with a specific grant program or set of grant programs organized around a media type (e.g., Air). For more general databases, there was no clear connection between database use and grant program or media type.
13 Based on email and verbal communication with the Grant Commitments Team representing the EPA Office of Grants and Debarment.
• IGMS/Next Generation Grants System (IGMS/NGGS). Used by 48 total programs across all
media types. Prominent use by Air, Multi-Media, Solid Waste, and Water media type grant
programs. No clear pattern for database use within specific programs.
• EPA Grant E-File system started in response to COVID-19 (EPA Grant E-Files). Used by 32
total programs across all media types except Brownfields and Research. Prominent use for Multi-
Media, Solid Waste, Superfund, and Water media type grant programs. No clear pattern for
database use within specific programs.
• Assessment Cleanup + Redevelopment Exchange System (ACRES). Used by 27 total
programs exclusively within the Brownfields and Multi-Media media types. This is expected
based on the focus of this database. Programs using this database perform Brownfield cleanup
and associated activities (e.g., workforce development).
• Air Quality System (AQS) Data Mart. Used by 24 total programs exclusively within the Air
and Multi-Media media types. This is expected, similar to ACRES, based on the focus of the
database. Programs using this database address air pollution and perform other actions under the
Clean Air Act (CAA).
• Water Quality Exchange (WQX). Used by 18 total programs exclusively within the Water and
Multi-Media media types. This is expected, similar to ACRES and AQS, based on the focus of
the database. Programs using this database focus on water quality and conservation efforts
surrounding water resources.
• Pollution Prevention (P2) Grants Plus system. Used by 15 total programs within the Multi-
Media media type. This database is used exclusively by the Pollution Prevention Grants Program
and Source Reduction Assistance grant programs.
e. How do grant programs use the grant commitments data they collect for
program implementation?
Respondents provided overwhelmingly positive responses that the output and outcome data currently
being collected enables the program to track progress on grant commitments and recognize and address
problems occurring with grantees not meeting grant commitments. About 91% of respondents agree or strongly agree that output data enables the program to track progress on grant commitments, and 85% agree or strongly agree that it enables the program to address problems (Figure 11). The responses were slightly lower for outcomes than for outputs, but still reflect high levels of agreement: 85% and 80% for tracking progress and addressing problems, respectively (Figure 12). This suggests that EPA staff are using
the data collected from grantees to understand what grantees are accomplishing and to manage grantee
performance.
Figure 11. Respondents' level of agreement that output data enables the program to track progress on grant commitments and recognize and address problems (N = 457)
• Track Progress: 308 agree and 108 strongly agree (91% combined), with 32 neither and the remainder disagreeing or strongly disagreeing.
• Address Problems: 285 agree and 101 strongly agree (84% combined), with 56 neither and the remainder disagreeing or strongly disagreeing.
Figure 12. Respondents' level of agreement that outcome data enables the program to track progress on grant commitments and recognize and address problems (N = 391)
• Track Progress: 238 agree and 93 strongly agree (85% combined), with 53 neither and the remainder disagreeing or strongly disagreeing.
• Address Problems: 230 agree and 81 strongly agree (80% combined), with 66 neither and the remainder disagreeing or strongly disagreeing.
Respondents provided several reasons for their positive responses, which were systematically coded into the following categories (counts reflect instances of reporting). Output data enables the program to track progress and address problems because:
• Standardized reporting process (53). Grantees report on their progress to program officers at regular intervals using a standard process.
• Regular reporting (24). A standard and agreed-upon frequency of reporting helps EPA use data for program implementation.
• Consistent and ongoing communication (24). Regular communication and/or meetings between EPA and grantees ensure quality data.
• Clear workplan deliverables or requirements (24). Deliverables and requirements in the original workplan are clearly stated, defined, and agreed upon by grantee and grantor.
• Database use (15). Data are tracked through a database.
• Standard reporting template (8). A standard template is used across grantees and matches the workplan.
• Use of data in program evaluation (7). Data are used in program evaluation reports or program reviews.
• Data align with program or national goals or metrics (6). Data directly support programmatic or national-level goals.
• Data directs future projects and initiatives (3). EPA uses data to direct future projects or work (e.g., areas of additional research, follow-up data collection, etc.).
• Narrative report format (2). The narrative portion of reports provides EPA with a deeper understanding of grant data, helping program officers manage the grant.
• Quantifiable data (1). The measurable nature of output data helps EPA use these data.
• Data reflect progress indicators (1). Output data serve as a warning system: if a grantee misses an output, it indicates there are problems to address.
Respondents' explanation for why output data may not help the program to track progress on grant
commitments and recognize and address problems occurring with grantees not meeting grant
commitments include:
• Data do not inform how to address problems (12). Output data may assist in identifying problems, but they do not help project officers resolve the problem.
• Output data anticipated outside project timeframe (7). The timeline of anticipated outputs is longer than the timeframe of the grant.
• Lack of resources hinders ability to address problems (3). Insufficient resources are the driving factor for why output data themselves may not help a program address problems.
• No standard reporting template (3). Without a reporting template, reporting does not clearly show issues or progress for a grantee.
• Variable management of grantees (2). The level of detail required in reporting depends heavily on the project officer, which means different grantees are given different treatment in terms of reporting data.
• Inadequate reporting (2). Grantees do not adequately respond to reporting requests; this also requires project officers to follow up with grantees for clarification.
• Unforeseen circumstances (1). Circumstances outside the grantee's control affect the grantee's ability to report.
• Lack of standard reporting process (1). There is no clear reporting method.
• Narrative report format (1). The narrative report format is confusing and difficult to interpret.
• No common metrics (1). EPA does not have standard, common metrics for assessing progress.
• Unclear workplan deliverables or requirements (1). If the grantee's commitments are not well defined at the outset of the grant, reporting may suffer.
Outcome data reflect several of the same themes as outputs, as well as other outcome-specific themes. As reported by respondents, outcome data enable the program to track progress and address problems because:
• Consistent and ongoing communication (31). Regular communication and/or meetings between EPA and grantees ensure quality data.
• Standardized reporting framework, frequency, and template (31). This includes outcome data reported in a standard framework, a standard and agreed-upon frequency of reporting (14), and a template that standardizes outcome data (4).
• Data align with workplan and/or program goals (13). Outcome data provided matches
requirements in workplan or another defined plan.
• Reporting includes review (8). Program officers review grantee reports to ensure data reflects
workplan requirements and provide a quality assurance check.
• Database use (8). Data are tracked through a database.
• Clear workplan deliverables or requirements (7). Deliverables and requirements in the original
workplan are clearly stated and defined and agreed upon by grantee and grantor.
• Grant requirements dictate notification (6). Grant requirements dictate that grantees notify EPA if they are not going to meet a commitment.
• Reporting outcome data ensures program implementation (6). Outcome data tracks progress
in grant program implementation, and the impact it has on the community.
• Data directs future projects and initiatives (6). EPA uses data to direct future projects or work.
• Interim data reporting (3). Reporting interim data, not just data at project completion, helps inform longer-term progress.
• Outcome data build on project outputs (1). Outcome data align with the project outputs.
• Grantee produces tangible deliverables (1). Allows deliverables to be counted and tracked.
Respondents' explanation for why outcome data may not enable the program to track progress and
address problems include:
• Outcome data anticipated outside project timeframe (15). The timeline of anticipated outcomes is longer than the timeframe of the grant.
• Data do not inform how to address problems (8). Outcome data may assist in identifying problems, but they do not help project officers resolve the problem.
• No centralized database (4). Lack of database makes it difficult to use the data.
• Reporting does not prioritize outcome data (4). Data reporting (including templates and
process) are focused towards collecting output data, not outcome data.
• Outcome data are not suited for tracking progress and/or addressing problems (4). Due to
the nature of outputs vs. outcomes, output data are better suited for tracking progress and
addressing problems.
• Outcome measures do not align with community priorities (2). EPA's expectation of results
for the grant program do not align with the priorities of the community that the grant is meant to
serve.
• Grantees lack capacity to report outcomes (2). Grantees do not have the capacity to report
outcome data either due to resource or technical constraints.
• EPA's national metrics do not align with grant program (2). EPA's national-level metrics are
not appropriate for particular grant programs.
• Unclear workplan deliverables or requirements (2). If the grantee's commitments are not well
defined at the outset of the grant, reporting may suffer.
• No common metrics (2). EPA does not have standard, common metrics for assessing progress.
• Delay in grantee reporting (1). Late reporting from grantees hinders EPA's ability to manage
grant program.
• Lack of communication with grantee (1). No established or regular line of communication
between grant manager and grantee.
• Not evident that an outcome was achieved through outputs reported (1). Outputs reported do
not imply an achieved outcome.
• Outcome data are intangible (1). Outcome data are difficult to identify and impossible to track.
Based on the list above, in Year 2 we will further probe examples of how grant data are being used to manage grant performance and the conditions that give rise to effective use.
f. How do grant programs present and communicate the results of the grant
commitments data they collect?
Grant programs present and communicate grant commitments data in a variety of ways. Based on the existing data collection and presentation/communication practices of grant programs, EPA's individual programs can report on the outputs and outcomes of several, but not all, grant programs. Through the NPM Information Request, we received example reports that helped us characterize the ways in which grant programs present information. Of 71 total respondents, 39 provided examples of summary reports/rollups. These documents were categorized into four types:
• Full report with data interpretation and graphics.
• Simple data report with some summary data roll-up and minimal graphics across projects/activities, but no interpretation.
• Project summary roll-up that discusses individual projects separately, with no cross-project data interpretation.
• Basic data report-out that includes data numbers but no graphics or interpretation.
Based on the 39 examples, 12 provided project summary roll-ups with no cross-project interpretation, 11 provided full reports with data interpretation and graphics, and 10 provided basic data reports with no graphics or interpretation. Simple data reports (7) are the least common document type we received but may reflect automated reports from existing databases.
Report Audiences
The way grant commitments data are presented and communicated is partially informed by the intended audience. The three most common report audiences are described below:
• Audience clearly identified in the title. The report title includes the intended audience. For example, the DERA Fourth Report to Congress: Highlights of the Diesel Emissions Reduction Program.
• Report written for review by program officers. These reports include only basic project activities and are often simple data reports or project summary rollups, including administrative performance reports. For example, the Lake Pontchartrain Basin Restoration Program (PRP) Administrative Semi-Annual Performance Report.
• Report written for the general public. These reports go beyond data interpretation and progress explanation, with detailed descriptions and visually appealing graphics to engage a broader audience. For example, the 2021 Lake Champlain State of the Lake and Ecosystem Indicator Report.
A few reports stand out due to their ability to clearly display grant commitment data and compare results back to program targets.
• Long Island Sound Program. Returning the Urban Sea to Abundance: A five-year review of the 2015 Comprehensive Conservation and Management Plan. The report is broken down into four themes that correspond to primary grant program goals. Each theme compares grant commitment data to ecosystem target progress and gives a status overview for priority implementation actions. The use of clear metrics, status updates, and success stories within the report clearly displays progress towards meeting grant commitments and program accomplishments.
• Diesel Emission Reduction Act (DERA) National Grants. DERA Fourth Report to Congress: Highlights of the Diesel Emissions Reduction Program. This report is broken down into funding categories (national competitive, state, school bus rebate, ports), each with graphics, emissions data, and associated interpretation. Of note is the inclusion of forward-looking information for each section and for the DERA program as a whole, indicating commitments data are used to plan for future grants.
• Lake Champlain Basin Program. 2021 Lake Champlain State of the Lake and Ecosystem Indicator Report. This report uses a matrix of ecosystem indicators by lake segment to display progress on grant priorities and commitments.14 The report makes use of color and symbols to distinguish the four program priority areas (clean water, healthy ecosystems, thriving communities, informed and involved public). Additionally, the report displays detailed and specific graphics throughout.
• Nonpoint Source Implementation Grants. This report is broken down by Nonpoint Source (NPS) type. It stands out due to its extensive use of graphics throughout the report; the graphics tell the story, with interpretation to supplement. Specifically, this report includes a map on page 6 that details the pathway of the grant program and orients the reader to key activities and accomplishments.
14 Note that this matrix does not account for outside factors and includes an acknowledgement that progress on some indicators may be caused by events outside the grant program activities.
Appendix A: Methodology
In September 2020, EPA developed a Learning Agenda that identified three Learning Priorities. The
Learning Agenda stems from the Foundations for Evidence-Based Policymaking Act (Evidence Act),
which provides a framework to promote a culture of evaluation, continuous learning, and decision making
using the best available evidence. As part of the Learning Agenda, EPA has initiated efforts to:
1. Develop priority questions.
2. Develop capacity to undertake new evidence-building activities.
3. Take the first step in developing a final Learning Agenda that will inform the FY 2022-2026 EPA
Strategic Plan.
Grant Commitments Met was one of the Learning Priorities identified in the Learning Agenda. Every
year, EPA awards over $4 billion in grants and other assistance agreements. Through these grants, EPA
helps to protect human health and the environment through the work of its grantees. The management and
tracking of the individual awards are dispersed amongst approximately 1,400 staff throughout
headquarters and EPA's ten regional offices, which makes tracking results at the national level
challenging. The Agency's lack of a comprehensive system for tracking grant-related activities leads to an
inability to proficiently evaluate environmental outcomes on a national scale. Work under the Learning
Agenda is a first step toward better understanding current grant reporting and tracking processes across
the Agency's 100+ current grant programs. This baseline will help EPA develop a sustainable and
consistent process for negotiating and tracking the environmental outputs and outcomes resulting from
EPA's grant funding.
OCIR established the Grant Commitments Met Workgroup to address the Priority Questions in the
Learning Agenda. Subsequently, the Workgroup utilized contractor support. The initial phase of work
(Phase 1) addresses the priority question: How do EPA's existing grant award and reporting systems
identify and track grant commitments? Phase 2 addresses: What EPA practices and tools effectively track
whether grantees are fulfilling their workplan grant commitments, including outputs and environmental
outcomes? The final year of work, Phase 3, addresses: Are the commitments established in EPA's grant
agreements achieving the intended environmental results?
The full list of priority research questions and sub-questions associated with this work includes:
1. Phase 1: How do EPA's existing grant award and reporting systems identify and track grant commitments?
a. Do the grant programs have specific targets associated with their outputs/outcomes?
b. What types of grant commitments data are tracked?
i. How do tracked grant commitments data vary across the Agency?
ii. To what extent does the data reported by grantees provide EPA with information
on progress towards meeting grant commitments?
iii. To what extent does the data reported by grantees provide information that
currently allows EPA to measure outputs, outcomes, and impacts related to
equity and climate change?
c. How do grant programs identify relevant grant commitments to track?
d. What data reporting processes, tools, and systems do EPA's grant award programs use?
i. How do grant award reporting systems vary across the Agency?
e. How do grant programs use the grant commitments data they collect for program
implementation?
f. How do grant programs present and communicate the results of the grant commitments
data they collect?
2. Phase 2: What EPA practices and tools effectively track whether grantees are fulfilling their workplan grant commitments, including outputs and environmental outcomes?
a. How effectively is EPA able to track grantee progress towards meeting grant
commitments?
b. What factors might affect a grant program's ability to effectively track grant
commitments? Factors may include:
• Third party management
• PPG management
• Reporting frequency
• Grantee reporting mechanisms
• Data storage mechanisms
• Types of outputs reported
• Types of outcomes reported
• Type of grant program
i. What factors might affect a grant program's ability to effectively track grant
commitments related to equity and climate impacts?
c. What are promising practices and tools demonstrated by some grant programs that could
help other grant programs effectively track grant commitments data?
3. Phase 3: Are the commitments established in EPA's grant agreements achieving the intended environmental results?
a. What are the intended environmental results of EPA's grant programs as reflected in the
grant agreements?
b. What outcomes are EPA's grant agreements achieving?
c. What outcomes are EPA's grant agreements achieving that relate to equity and climate
change impacts?
d. What potential changes could be made to grant programs' data collection
mechanisms/processes to help EPA determine if EPA's grant agreements are achieving
the intended environmental results?
e. What can grant programs do to better communicate how grant outputs advance the
Agency's mission through environmental results?
f. What potential changes could be made to grant programs' data collection efforts to help
EPA determine equity and climate change impacts?
g. What next steps could EPA take to establish an EPA system that compiles the outputs
and outcomes of Agency grant reporting for all systems?
This methodology covers the research questions across all three phases. Key elements of the methodology
for addressing these questions are outlined below.
This work draws on multiple data sources to answer the priority questions. Key sources of information
include: 1) a brief online survey administered to EPA staff managing or implementing EPA's grant
programs, 2) document review of reports and grant guidance documents, and 3) in-depth interviews with
EPA personnel across the Agency. The survey and document review are the main data sources for
addressing Phase 1 questions (although interviews will also supplement any data gaps from Phase 1
questions). Interviews and survey responses are the primary data sources for addressing Phase 2 and 3
questions. Together, the three data sources will provide different but complementary types of information
to answer the priority questions.
The Workgroup completed the document review and survey according to the methods outlined in this
plan. The document review provides detailed information on how grant programs record, communicate,
and present guidance and results from their programs. The document review also provides information
with which to interpret or expand on the findings from the survey. The survey targeted all headquarters
offices and regional offices for each grant program that is active within a region. The survey was designed to be brief and consisted of predominantly closed-ended questions, with optional open-ended questions.
Interviews will be conducted in Phase 2 and Phase 3. The interviews will be completed in a semi-
structured format and will be designed to explore themes and elicit detailed information based on findings
from the survey results and document review.
The data collection methodology makes use of existing data while undertaking targeted new data
collections to provide more comprehensive and robust answers to the priority questions. We will
triangulate across existing and new data, using a combination of quantitative and qualitative analysis, to
answer each question.
The following exhibit summarizes the research questions; the scope of each research question as it
pertains to all or a sub-set of grant programs; general data sources for informing each research question;
and the specific survey question(s) and/or document type(s) that will inform each research question (e.g.,
purpose of each survey question).
Exhibit: Research questions, scope, data sources, and informing survey questions/document types

1. How do EPA's existing grant award and reporting systems identify and track grant commitments?
Scope: all grant programs. Data sources: survey; documents; NPM Information Request; interviews.

a. Do the grant programs have specific targets associated with their outputs/outcomes?
Scope: all grant programs. Data sources: documents; interviews.
Informing material: NPM Information Request documents (guidance documents for the grant program or other documents that describe the program's approach for determining success; most recent summary annual report/report rollup); interviews (exact questions TBD but focused on better understanding whether grant programs have specific targets).

b. What types of grant commitments data are tracked?
Scope: all grant programs. Data source: survey.
Informing material: Q12 + Q13 (types of outputs tracked); Q16 + Q17 (outcomes tracked); Q20 + Q21 (other data tracked).

i. How do tracked grant commitments data vary across the Agency?
Scope: all grant programs. Data source: survey. Informing material: see 1b above.

ii. To what extent does the data reported by grantees provide EPA with information on progress towards meeting grant commitments?
Scope: all grant programs. Data source: survey.
Informing material: Q14 (output data reported meets data reporting requirements); Q18 (outcome data reported meets data reporting requirements); Q22 (other data reported meets data reporting requirements).

iii. To what extent does the data reported by grantees provide information that currently allows EPA to measure outputs, outcomes, and impacts related to equity and climate change?
Scope: all grant programs. Data sources: documents; survey; interviews.
Informing material: NPM Information Request documents (guidance documents); survey (open-ended responses); interviews (exact questions TBD but focused on identifying programs and metrics focused on equity and climate change).

c. How do grant programs identify relevant grant commitments to track?
Scope: all grant programs. Data sources: survey; documents; interviews.
Informing material: survey (Q15 and Q19 open-ended responses); NPM Information Request documents (guidance documents; annual report/report rollup); interviews (exact questions TBD, but focused on what internal processes exist within EPA for deciding what information and data are collected to track grant commitments).
Comments: In Phase 1, no data source specifically addresses this; data will be collected and presented on an as-provided basis rather than comprehensively.

d. What data reporting processes, tools, and systems do EPA's grant award programs use?
Scope: all grant programs. Data sources: survey; NPM Information Request.
Informing material: Q5 (grantee reporting mechanisms); Q6 (data storage mechanisms); Q7 (names of databases for storage). The NPM information request also collects information on the databases used to consolidate grant commitments data.

i. How do grant award reporting systems vary across the Agency?
Scope: all grant programs. Data source: survey. Informing material: see 1d above; Q1 + Q2.

e. How do grant programs use the grant commitments data they collect for program implementation?
Scope: all grant programs. Data source: survey.
Informing material: Q15 (use of output data to track progress on grant commitments and recognize and address problems); Q19 (use of outcome data to track progress on grant commitments and recognize and address problems); Q23 (use of other types of data to track progress on grant commitments and recognize and address problems).

f. How do grant programs present and communicate the results of the grant commitments data they collect?
Scope: all grant programs. Data source: documents.
Informing material: NPM Information Request documents (annual report/report rollup).

2. What EPA practices and tools effectively track whether grantees are fulfilling their workplan grant commitments, including outputs and environmental outcomes?
Scope: all grant programs, with highlights of some grant programs. Data sources: survey; documents; interviews.

a. How effectively is EPA able to track grantee progress towards meeting grant commitments?
Scope: all grant programs. Data sources: survey; documents.
Informing material: survey (Q14 + Q18 + Q22, data reported meets data reporting requirements; Q9 + Q10 + Q11, difficulty or ease of tracking progress); NPM Information Request documents (most recent summary annual report/report rollup may indicate the current accessibility of collected and vetted data); interviews (exact questions TBD but focused on programs with 'best practices').

b. What factors might affect a grant program's ability to effectively track grant commitments? Factors may include: third party management; PPG management; reporting frequency; grantee reporting mechanisms; data storage mechanisms; types of outputs reported; types of outcomes reported; type of grant program.
Scope: all grant programs. Data sources: survey; interviews.
Informing material: Q3 + Q4 (third party management); Q8 (PPG management); Q24 + Q25 (reporting frequency and effectiveness for use of data); Q5 (grantee reporting mechanisms); Q6 (data storage mechanisms); Q12 + Q13 (types of outputs tracked); Q16 + Q17 (outcomes tracked); Q1 (grant program type); data produced from 2a on effectiveness; interviews (exact questions TBD but focused on programs with 'best practices' or 'issues').

i. What factors might affect a grant program's ability to effectively track grant commitments related to equity and climate impacts?
Scope: all grant programs. Data source: interviews. Informing material: exact questions TBD but focused on programs with 'best practices' or 'issues'.

c. What are promising practices and tools demonstrated by some grant programs that could help other grant programs effectively track grant commitments data?
Scope: highlights of practices and/or grant programs. Data sources: interviews; survey.
Informing material: interviews (targeted towards respondents who provided information in the survey that indicates the respondents employ promising practices/tools); survey (some open-ended responses may provide this information and help to identify respondents).
Comments: Question comes from the Learning Agenda.

3. Are the commitments established in EPA's grant agreements achieving the intended environmental results?
Scope: subset of grant programs. Data sources: survey; documents; interviews.

a. What are the intended environmental results of EPA's grant programs as reflected in the grant agreements?
Scope: subset of grant programs. Data sources: documents; interviews.
Informing material: NPM Information Request documents (guidance documents; annual report/report rollup); interviews allow for in-depth discussions and understanding of intended environmental results.
Comments: This is different than Q1a because it focuses on the specific environmental results of a subset of grant programs.

b. What outcomes are EPA's grant agreements achieving?
Scope: subset of grant programs. Data sources: interviews; documents.
Informing material: interviews allow for in-depth discussions of outcomes; NPM Information Request documents (annual report/report rollup).

c. What outcomes are EPA's grant agreements achieving that relate to equity and climate change impacts?
Scope: subset of grant programs. Data sources: interviews; documents.
Informing material: interviews allow for in-depth discussions of outcomes; NPM Information Request documents (annual report/report rollup).

d. What potential changes could be made to grant programs' data collection mechanisms/processes to help EPA determine if EPA's grant agreements are achieving the intended environmental results?
Scope: subset of grant programs. Data sources: survey; documents; interviews.
Informing material: recommendations draw from all data sources and findings as well as team member expertise.

e. What can grant programs do to better communicate how grant outputs advance the Agency's mission through environmental results?
Scope: subset of grant programs. Data sources: survey; documents; interviews.
Informing material: recommendations draw from all data sources and findings as well as team member expertise.
Comments: Question comes directly from the Learning Agenda.

f. What potential changes could be made to grant programs' data collection efforts to help EPA determine equity and climate change impacts?
Scope: all grant programs. Data sources: survey; documents; interviews.
Informing material: recommendations draw from all data sources and findings as well as team member expertise.

g. What next steps could EPA take to establish an EPA system that compiles the outputs and outcomes of Agency grant reporting for all systems?
Scope: all grant programs. Data sources: survey; documents; interviews.
Informing material: recommendations draw from all data sources and findings as well as team member expertise.
Comments: Question comes directly from the Learning Agenda.
DATA SOURCES AND STRATEGIES
The data sources - documents, survey, and interviews - and data collection strategies for this research are
outlined below.
Documents
The Workgroup collected documents for review through a request to the National Program Managers (NPMs). The information request, administered using Microsoft Forms, asked NPMs to respond for each program they manage. The goal was to capture one response to the information request for each active grant program across the Agency. The information request specifically asked NPMs to provide (1) the most current guidance for the grant program covering the objectives, goals, and specific programmatic requirements; (2) any other program documents that describe a program's approach for determining success; (3) the names of databases that the programs use to consolidate grant commitments data; and (4) the most recent summary annual report, report rollup, or other relevant report that may consolidate and present information across grantees for a program.
The document reviews provide information on the objectives or targets associated with grant programs; how
grant programs define and determine success; what data reported by grantees provides information that currently
allows EPA to measure outputs, outcomes, and impacts related to equity and climate change; how grant
programs identify relevant grant commitments to track; and how grant programs present and communicate grant
commitments data.
To obtain this information, the Workgroup reviewed several types of documents, including:
1. Guidance document(s) for the grant program.
2. Other related program documents.
3. Recent summary annual report or report rollup.
During interviews, the Workgroup may collect additional program documents or document types provided by
interviewees if the documents were not provided during the NPM information request, or if they provide new or
additional relevant information for answering the research questions.
Survey
The survey provides a landscape view of what grantee data are being collected (and how) for reporting on the
outputs and outcomes of grant activities across EPA. The survey aimed to capture one survey response per
program per region. The goal was to conduct a census of every region and headquarters office where a grant
program is active. EPA does not have a comprehensive list of all staff persons responsible for managing grant
programs. Therefore, the Workgroup asked that the person best able to respond to inquiries regarding the
regional media grant program respond. In some cases, this could be a grant program coordinator, or other
program managers or senior project managers; instructions indicated that this person should be in a position to
coordinate with their project officers.
The survey was brief and consisted of predominantly closed-ended questions, with some optional open-ended questions. The survey was not anonymous and intentionally asked respondents to identify their program, region, and name for potential follow-up questions (e.g., ensuring one response per program per region) and in preparation for Phase 2 data collection efforts (i.e., interviews). The survey was administered through SurveyMonkey, a web-based survey host.
The Workgroup developed a survey questionnaire intended to provide data that answer the priority questions and support the intended analyses. The survey was piloted with a select group prior to a phased deployment across the Agency. The pilot phase included a kickoff meeting with pilot contacts to demonstrate the
survey and gather initial feedback on questions. Subsequently, the Workgroup refined the survey tool, approach,
and questions based on the initial feedback. The Workgroup then disseminated the survey with the pilot team to
collect a full round of feedback and the Workgroup adjusted the survey based on the feedback received.
Once the survey was finalized, the Workgroup administered the survey in three primary phases. The first phase included OAR (12 grant programs), OW (26), and OCFO (2); the second included OLEM (16), OITA (3), OCSPP (4), and OECA (5); and the third included OA (11), ORD (7), OMS (4), and the Regions (15). A final follow-up phase entailed identifying all non-respondents and contacting program managers to obtain missing responses. This phased approach enabled the Workgroup to spend substantial time obtaining buy-in from grant programs and following up with non-respondents to achieve a high response rate to the survey.
Information collected via the survey included:
• Mechanisms that grantees use to report to EPA their progress toward meeting grant commitments.
• Mechanisms that grant programs use to track input received from grantees.
• Reporting intervals required for grantees.
• Self-assessment of the adequacy of reporting frequency to effectively track progress and
recognize/address problems.
• The types of outputs that grantees are required to report on.
• The types of outcomes that grantees are required to report on.
Survey responses will be quantitatively analyzed and summarized based on the percentage of respondents
answering each of the possible responses for the individual questions. Additional analysis is discussed in the
data use section.
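As a minimal sketch of that summary, assuming responses are exported to a CSV with one row per respondent (the file and column names below are hypothetical):

    import pandas as pd

    # Hypothetical export of survey responses, one row per respondent.
    df = pd.read_csv("survey_responses.csv")

    # Percentage of respondents selecting each option for one closed-ended question.
    summary = (
        df["Q24_reporting_frequency"]
        .value_counts(normalize=True)
        .mul(100)
        .round(1)
    )
    print(summary)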
Interviews
Interviews will be conducted starting in the second year of the project. The goal of conducting interviews is to
help answer the priority questions by capturing perspectives and knowledge of grant programs from EPA
employees with experience administering, implementing, or otherwise participating in EPA's grant programs.
The information obtained from interviews will help supplement data gaps from Phase 1 data collection efforts
and will be focused on collecting data on practices and tools that help programs effectively track information on
grant programs.
The Workgroup will develop interview guides and conduct structured Microsoft Teams or phone interviews with EPA employees. Interview guides will be tailored to each program based on responses received during the survey and the document review. Interviewees will be selected as a purposive sample to ensure adequate representation across key EPA grant programs and regions. The sample will not be statistically representative, and the Workgroup will not attempt to make quantitative inferences about the entirety of EPA's grant programs based on the results of the interviews. The Workgroup proposes to conduct at least 25 interview sessions. Note that each interview session may include one or more individuals; we will conduct group interviews if there are teams of individuals that work together and can provide complementary perspectives.
The Workgroup will select interviewees. The target interviewees will encompass National Program Managers
(NPMs) and other program staff who are responsible for implementing and managing the grant programs. The
sources for selecting interviewees will be the NPM information request and survey respondent lists. The
Workgroup anticipates that in some cases EPA staff it reaches out to may refer additional interviewees to
either respond in their place or join a group interview. To make the selection process transparent,
the Grant Commitments Met Workgroup developed the following interview selection criteria to ensure
representation of:
• Programs that report highly positive experiences with their existing grants management and data
management practices.
• Programs that report notably negative experiences with their existing grants management and data
management practices.
• Different types of grant programs including:
o PPG status.
o Programs that have large budgets.
o Media types.
o Programs that may directly address topics raised in EPA's Strategic Plan (e.g., Justice40).
Criteria for conducting group (rather than individual) interviews include the following:
• Logical groupings of interviewees (by program).
• Groups of manageable size (no more than six individuals per group).
• Protocol and logistics allow for scheduling group interviews.
The Grant Commitments Met Workgroup will schedule interviews. Prior to each interview, the Workgroup will
provide the interviewee with background information about the Grant Commitments Met effort and the relevant
interview guide. During the interview, in addition to going through the interview guide, the interviewers will ask
follow-up questions as appropriate to probe further into topics raised in conversation that are relevant to the
priority questions.
The Workgroup will analyze responses to each interview question to identify themes and summarize responses.
The Workgroup will use qualitative analysis to code each open-ended response. Each response may be
applicable to more than one priority question. The Workgroup will then summarize the frequency with which
each theme was raised overall and across different types of grant programs, and will identify illustrative
quotations that capture frequently raised issues. The Workgroup will ask respondents' permission to record
interviews. The interviews will not be anonymous, so that information can be tied back to a specific program
and/or region; this is important for EPA in understanding how implementation of grant programs varies across
the Agency and what factors affect it. In communications with interviewees, the Workgroup will emphasize that
the purpose of the interviews is learning rather than compliance, to encourage candor and openness in their
responses.
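As a rough illustration of this tallying step, the sketch below counts how often each coded theme appears overall and by grant program type. The theme labels, program types, and record structure are hypothetical assumptions; the actual codes will come from the qualitative analysis.

from collections import Counter

# Hypothetical coded records: one per open-ended response, each tagged
# with the respondent's program type and the themes it was coded to.
coded_responses = [
    {"program_type": "water", "themes": ["clear workplans", "standard templates"]},
    {"program_type": "air", "themes": ["clear workplans"]},
    {"program_type": "water", "themes": ["reporting burden"]},
]

overall = Counter()
by_program_type = {}
for response in coded_responses:
    overall.update(response["themes"])
    by_program_type.setdefault(response["program_type"], Counter()).update(response["themes"])

print("Overall theme frequency:", overall.most_common())
for program_type, counts in by_program_type.items():
    print(program_type, counts.most_common())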
External partners such as state, local, or tribal governments; contractors; or research institutions will not be
interviewed.
SAMPLING APPROACH
The survey and NPM document collection efforts both sought a census of all active grant programs. As discussed in
the interview approach, the interviews are a purposive sample.
DATA USE
The Workgroup will analyze the data to answer EPA's Priority Questions in the Agency's Learning Agenda for
Grant Commitments Met. The analytical results for Phase 1 will address the priority question: How do EPA's
existing grant award and reporting systems identify and track grant commitments? The Workgroup will
summarize the results of the survey and document review in a report, providing a comprehensive and
consolidated overview of the Agency's grant programs. The Workgroup will also develop fact sheets that
provide in-depth details for six high-priority grant programs, which will be selected based on the size of their
program budget, coverage across media types, and Agency interests identified in the Strategic Plan (e.g., equity
and climate). Phase 1 results will also be used to focus the scope of the project for Phases 2 and 3.
Phases 2 and 3 will include an in-depth study of a smaller number of grant programs selected through a targeted
approach to address what the data can tell us about the effectiveness of EPA's grant programs. Through analysis
of the survey findings and qualitative data, the Workgroup will identify best practices that can be shared with
grant programs throughout the Agency. This work will identify how selected grant programs contribute to
environmental results and suggest good practices for tracking environmental outcomes across EPA's grant
programs.
Ultimately, the data will be used to report on the current state of EPA's grant programs, identify good practices
and environmental results for selected grant programs, and identify processes and practices that can be used
throughout the Agency to effectively manage EPA's grant programs and report on how these programs are
advancing the Agency's mission to protect human health and the environment.
DATA ANALYSIS PLAN
The data analysis plan includes the detailed survey or NPM information request question, answer type, and
response options. Accompanying each question, the plan answers why the question exists, the data use and
analysis approach, the tie into the 'bigger picture' or plan, the expected data, and data validation steps. The data
analysis plan is included as Attachment A: Data Analysis Plan to this report.
Additional survey analyses explore the variation in grant programs across the Agency. This includes looking at
the relationship between:
• Types of outputs and outcomes across media type and region.
• Types of data reporting and storage systems across media type and region.
• PPG status and ease of reporting data.
• Reporting frequency and the ability to track progress and address problems (a cross-tabulation sketch of
this type of analysis follows this list).
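As an illustration of one such relationship, the sketch below cross-tabulates PPG status against reported ease of reporting. It assumes the same hypothetical CSV export as in the earlier sketch; the column names are placeholders, not the actual survey fields.

import pandas as pd

responses = pd.read_csv("survey_responses.csv")  # hypothetical export file

# Row percentages: within each PPG status, the share of programs
# reporting each level of ease of reporting.
crosstab = pd.crosstab(responses["ppg_status"],
                       responses["ease_of_reporting"],
                       normalize="index").mul(100).round(1)
print(crosstab)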
The Workgroup also conducted additional data analysis of documents and open-ended survey responses to
identify grant commitments information related to equity and climate issues.
REPORT AUDIENCE
The main audiences for this report include OCIR and OGD, senior policy and career leaders, managers at the
Agency, and EPA personnel responsible for managing and implementing EPA's grant programs. It is also
expected that EPA will report key results to OMB and Congress as part of annual reporting.
DATA LIMITATIONS AND VALIDATION
Below, we detail possible data limitations and mitigation strategies to ensure the quality of the data is adequate
for its intended use.
• Potential bias associated with survey non-response. The Workgroup used a brief online survey to
develop a representative description of grant programs across the Agency. However, if the survey
unintentionally missed key grant programs, the findings could be biased if respondents are
systematically different from non-respondents. Therefore, the Workgroup attempted to maximize the
response rate by keeping the survey form brief and easy to use, by sending multiple requests to
non-respondents, and, when necessary, by working through non-respondents' managers to request that
their staff complete the survey. The Workgroup also collected respondents' names so it could follow up
directly when a single respondent provided multiple responses for the same program, or when different
respondents responded for the same program, and clarify which response to include; this improved data
quality and ensured only one response per program (a deduplication sketch follows this list).
• Incomplete document collection and potential for non-response bias. EPA does not maintain a unified
system where documents (e.g., grant guidance) are stored and tracked. This means that the Workgroup
was not able to easily or comprehensively obtain the list of relevant grant guidance, grant program
reports, or additional documentation about grant programs. Instead, the Workgroup used the NPM
Information Request to obtain relevant documents, which depended on individuals responding to the
request. Programs that did not respond to the request, or respondents who misunderstood it, may have
provided irrelevant documents or an incomplete set of information. If the programs for which we did not receive
documents are different in some way from the programs for which we did receive documents, our overall
conclusions drawn from the guidance that we reviewed could be biased. The Workgroup reviewed all the
available information provided to extract relevant data in a consistent format and will acknowledge the
potential for missing information in analyses about specific programs.
• Potential bias associated with purposive sampling for interviews of EPA project staff responsible for
implementing grant programs. The Workgroup will not be able to select a statistically valid sample of
interviewees given the relatively small number of interviews that can be conducted within the budget and
timeline and the range of interviewee types that need to be included. Therefore, the Workgroup will
select a purposive sample of interviewees to maximize learning opportunities. Although the results of the
interviews will not be statistically representative, interviewees will be chosen purposefully to provide good
examples, best practices, and insights that can be useful for other grant programs and for EPA
management when considering what data can be tracked, how best to track it, and what the data can
show.
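The deduplication check referenced in the first limitation above could look something like the sketch below, which flags programs with more than one submission for manual follow-up. The program identifier, respondent name, and timestamp columns are hypothetical assumptions about the survey export.

import pandas as pd

responses = pd.read_csv("survey_responses.csv")  # hypothetical export file

# Flag programs with more than one submission so the Workgroup can follow
# up with the named respondents and confirm which response to keep.
duplicates = responses[responses.duplicated("program_id", keep=False)]
for program_id, rows in duplicates.groupby("program_id"):
    print(program_id, rows[["respondent_name", "submitted_at"]].to_dict("records"))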