EVIDENCE ACT
LEARNING AGENDA


YEAR 2 FINAL REPORT
8 MARCH 2023

EPA

This report was prepared
with support from
Industrial Economics,
Incorporated under U.S.
EPA Contract No.
GS-10F-0052R


Table of Contents

OVERALL FINDINGS

Summary

Recommendations

Next Steps on Year 3

INTRODUCTION + PURPOSE

Priority Questions - Year 2

Data Collection Summary

RESULTS BY SUB-QUESTION

1a. Targets Associated with Grant Outcomes

1biii. Data Collected on Equity + Climate

1c. Process for Identifying Data to Track

1d. Grant Program Processes, Tools, Systems

2a. Effective EPA Practices

2ai. Quality Assurance and Quality Control (QA/QC) Practices

2b. Effective EPA Tools

2c. Tools for Outcome Analysis

2d. Factors Affecting Tracking + Reporting

2di. Factors Specific to Climate + Equity

2e. Promising Practices + Tools

APPENDIX A: METHODOLOGY

Interviews

Data Analysis

Expected Deliverables

Data + Methodology Limitations + Validation

APPENDIX B: INTERVIEW GUIDES

Interview Guide for Program Officer

Interview Guide for National Program Manager


OVERALL FINDINGS

This section presents the summary of findings for the Year 2 evaluation question: What EPA practices
and tools (1) effectively track grantee progress towards meeting workplan grant commitments
including outputs and outcomes, and/or (2) support communication of national program-level
outputs and outcomes? In addition to an overall summary, this section includes recommendations for the
EPA and next steps for Year 3.

Summary

Through the Year 2 interviews and Year 1 survey findings and document review, the Evidence Act -
Grant Commitments Met Workgroup identified several effective practices and tools that support the EPA
in tracking and communicating grant results at the grantee and national program level:

•	Well-articulated program logic including objectives, program activities, and associated
metrics and targets for tracking outputs and outcomes.1 By documenting program activities
with specific outputs and outcomes, the program clearly lays out the path for the grant program to
demonstrate its success. Programs can also use these types of documents (e.g., management
plans, strategic plans, etc.) to articulate what is reasonable for the program to accomplish given
any constraints on funding, timing, or other external factors. These types of documents set the
expectation for regular program-level reporting and easily support subsequent program evaluation
and other types of evidence building activities. Two examples of program logic documents that
we have identified are the Long Island Sound Comprehensive Conservation and Management
Plan 2015 and the Great Lakes Restoration Initiative (GLRI) guidance document, GLRI Action
Plan III (2020-2024).

•	Program-, media-specific, and/or administrative databases. Interviewees reported that
databases help with data storage and grant program implementation and can also support
communication of program-level outputs and outcomes. For program-specific databases,
interviewees were particularly in favor of the Environmental Accomplishments Great Lakes
(EAGL) database from the Great Lakes National Program Office. For media-specific databases,
interviewees reported positively on the Assessment, Cleanup and Redevelopment Exchange
System (ACRES), from the Brownfields program, and the Wetlands Grants Database (WGD).
Finally, respondents reported that overall, both the Next Generation Grants System (NGGS) and
Financial and Ecosystem Accounting Tracking System (FEATS) work well for the grant
administrative needs they have. These administrative databases are more generally applicable
across grant programs and focused on administering funding and tracking agreement documents.
Interviewees also noted several suggestions for improving the existing databases and expressed
concerns that a single Agencywide tracking system would be difficult to administer across
programs addressing different media types.

1 Program logic explains the thinking behind the program design; a program logic model is a graphic depiction that defines the resources (inputs),
activities, outputs, intended audiences, and short-, mid-, and long-term outcomes of the program and the shared relationships among these
program components.

•	Standard templates for data collection. Interviewees often reported use of, or indicated a desire
for, a standard reporting template for collecting data from grantees and subsequently
consolidating these data. Interviewees most often referred to an Excel-based reporting template.
For example, the Source Reduction Assistance Program uses a standard Excel template with set
metrics for grantee reporting. Several interviewees from programs that do not have a standard
reporting template specifically asked for NPMs to provide the reporting template, with another
interviewee indicating this is currently being developed for their grant program with input from
regions.2 Interviewees also reported the need for templates to be flexible for non-standard metrics
and inclusive of narrative or "storytelling" information. While these may be incongruous
requests, standard templates may meet these needs by creating specific spaces for narrative
information and additional non-standard program metrics (a brief consolidation sketch follows this
list). Non-standard metrics are those that programs did not previously identify and standardize at the
program level and would vary based on the specific project.

•	Administering grant programs to build and maintain grantee capacity. Although building
grantee capacity goes well beyond data reporting, it lays the foundation for the EPA to receive
consistent and high-quality data from grantees on environmental and human health results. Year 1
survey data and Year 2 interviewees stressed the importance of frequent and open communication
with grantees as a good practice for implementing the grant and for obtaining grantee data.
Establishing open communication between the EPA and grantees empowers grantees to directly
reach out to EPA staff regarding their concerns and progress. This helps establish trust between
the EPA and grantees and allows for problem solving throughout the life of the grant, adaptation
of project implementation in response to changing conditions, and furthers the goals of the grant
program. This in turn builds the grantees' capacity to respond to the EPA's requests, and
willingness to engage in added information requests as they arise. This rapport can also extend to
maintaining relationships with previous applicants that the EPA did not award to prepare
prospective grantees for the next round of funding. More specific approaches to build grantee
capacity in grant administration and reporting include cohort building activities (e.g., supporting
cross-grantee communication to share information) and providing direct technical support
activities to grantees (e.g., grant writing webinars). In some cases, this may involve leveraging
capacity building resources from other federal partners (e.g., adopting grant writing training
another agency provided for their own training session).
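
As a concrete illustration of the consolidation such a standard template enables, the sketch below sums
standard metrics program-wide while preserving non-standard metrics and narrative "storytelling"
entries grantee by grantee. All field names, metrics, and records are hypothetical illustrations; this
report does not specify the EPA's actual template fields.

from collections import defaultdict

# Each record mirrors a filled-in template row: standard metrics share fixed
# names, non-standard metrics ride along as free-form pairs, and a narrative
# field preserves "storytelling" information.
grantee_reports = [
    {"grantee": "Grantee A", "standard_metrics": {"tons_waste_reduced": 12.5},
     "nonstandard_metrics": {"volunteer_hours": 340},
     "narrative": "Success story: diverted construction debris to reuse."},
    {"grantee": "Grantee B", "standard_metrics": {"tons_waste_reduced": 4.0},
     "nonstandard_metrics": {},
     "narrative": "Lesson learned: staff turnover delayed outreach."},
]

def consolidate(reports):
    """Sum standard metrics program-wide; keep non-standard metrics and
    narratives grantee-by-grantee since they vary project to project."""
    totals = defaultdict(float)
    stories = []
    for r in reports:
        for metric, value in r["standard_metrics"].items():
            totals[metric] += value
        stories.append((r["grantee"], r["narrative"]))
    return dict(totals), stories

totals, stories = consolidate(grantee_reports)
print(totals)  # {'tons_waste_reduced': 16.5}
for grantee, story in stories:
    print(grantee, "-", story)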

The specific approach individual grant programs take to develop the practices and tools outlined above
may vary based on the needs of their programs. For example, programs with state-level grantees may
emphasize cohort building activities, while programs with smaller community-based organizations as
grantees may provide cohort building activities alongside more direct technical support to build capacity.

Through the interviews and survey, the Evidence Act - Grant Commitments Met Workgroup also
identified several potential challenges grant programs may face in using the effective practices and tools
outlined above. For example, developing well-articulated program logic and building and maintaining
grantee capacity requires sufficient EPA staff expertise (specifically program evaluation expertise) and
resources. Interviewees also pointed to a lack of guidance from HQ programs as a challenge for building
standard templates to collect information from grantees. Lack of HQ program guidance impedes
information collection on current administration priorities (e.g., climate and equity). Finally, Performance
Partnership Grants (PPGs) pose a unique challenge for the EPA because they can be more labor intensive
to administer, require increased cross-EPA collaboration, and have consolidated progress reports that are
not specific to each program in the grant.

2 Regional interview respondents often referred to HQ without articulating a specific office. This likely references a
program's respective National Program Office, but the Evidence Act - Grant Commitments Met Workgroup cannot confirm this
with existing interview data.

Answers to each specific sub-question for the Year 1 follow-up questions are below:

a.	Targets associated with grant outcomes. Based on the documents provided via the Year 1 NPM
information request, only 15 programs (21%) have specific targets associated with their outputs
and outcomes. Year 2 interviewees affirmed this finding, citing variation across grant projects,
lack of sufficient metrics, and inefficiency of a framework with rigid metrics and targets as
reasons for lack of specific targets.

b.	Measurement data on equity and climate. Year 1 findings indicate that current data reported by
grantees provides limited information on outputs and outcomes related to equity and climate
change, both of which are current administration priorities. Year 2 interviews provided some
confirmation of these findings but also indicated that many programs are actively changing their
approach to incorporate equity and climate change. For example, several interviewees indicated
they retroactively responded to data requests for equity and climate activities or are incorporating
equity and climate into future project activities (and documentation of those activities). However,
most efforts fall short of collecting data that captures the effect of these activities on equity and
climate.

c.	Processes for identifying metrics to track. Available data from Year 1 indicates programs may
identify relevant grant commitments to track from a centralized top-down process (e.g., statute,
determination by the national program), past performance of the program, or a program-level
action plan. Year 2 interviews explored this further, confirming that programs use statutory
language and NPM guidance. Notably, neither Year 1 nor Year 2 data provide evidence of a
standardized process across the Agency for identifying grant commitments.

d.	Grant program processes, tools, systems. Year 2 results affirmed our Year 1 findings regarding
grant program data reporting and storage systems. However, Year 2 interviews highlighted the
fluidity of these systems, as some interviewees reported that their programs recently adopted or
are actively considering new processes and tools.

Answers to each specific sub-question for Year 2 are below:

a. Effective EPA practices. Year 2 interviews built upon Year 1 survey results to determine
effective practices that support the EPA in tracking and communicating grant results at the
grantee and national program level; both data sources identify a variety of effective practices to
track grantee progress and support consolidation and communication of results. The most
common effective practices for tracking progress centered around organized grant administration
that prioritizes grantee capacity and regular progress updates. For communicating results,
interviewees indicated that review and summary activities across the program are particularly
helpful practices.

i. QA/QC practices. Interviewees overwhelmingly indicated that EPA staff engage in
QA/QC of grantee data (30/31). However, some noted that EPA staff may only engage in
QA/QC for certain grants or when an issue arises. Another interviewee indicated that
although the EPA requires QA/QC, there is no guarantee that staff complete it.

b.	Effective EPA tools. Program databases are the topmost effective tool reported by interviewees.
Interviewees also emphasized the importance of continually improving and updating the
databases to meet user and data needs. Other effective tools for tracking and communicating grant
results at the grantee and national program level include standard templates (most often in Excel),
SharePoint, and using spreadsheets to track program and project level data.

c.	Tools or models for outcome analysis. Most interviewees indicated that their grant program
does not use additional tools or models for grantee data to determine alternative outputs or
outcomes. However, at least three interviewees indicated interest in using them. Of the
interviewees indicating the program uses models to further analyze data (8), three indicated this
occurs at the national (but not regional) level and programs may do this on a case-by-case basis.

d.	Factors affecting tracking and reporting. The top seven challenges in tracking grantee progress
that interviewees reported include: heavy EPA staff workloads, lack of grantee capacity
(including grantee staff turnover), inability to collect data after grant close-out, the challenging
nature of tracking collective progress, failure of existing data requests and metrics to capture all
types of project results (e.g., narrative information), and difficulty in quantifying some program
impacts.

i. Factors specific to climate and equity. Interviewees noted several challenges specific to
data collection pertaining to equity and climate. Interviewees expressed worries about
grantee reporting fatigue in having to respond to additional data requests, challenges with
identifying relevant metrics to track (and relatedly lack of guidance from HQ programs
on how to do this), and lack of statutory authority to collect new types of information.

e.	Promising/upcoming practices and tools. Some promising or upcoming practices and tools that
interviewees reported pertaining to data and reporting include program evaluation requirements
for grantees, data visualization dashboards, flexible reporting templates, and a tiered metrics
approach tied to strategic objectives. Other practices and tools pertaining more directly to
program implementation include gathering input from technical experts on grant projects,
completing cohort building activities to build grantee capacity, leveraging resources from other
federal partners, and maintaining open communication with previous applicants that the EPA did
not award to build their future capacity.

Recommendations

Beyond the grant-specific effective practices outlined in the summary above, the Evidence Act - Grant
Commitments Met Workgroup offers four Agencywide recommendations based on the overall findings
from the Year 1 survey and document review, and the Year 2 interviews. The following recommendations
assume the Agency's focus is on better communicating how grants advance the Agency's mission through
environmental results.

1. Document the anticipated program-level outcomes and grant-level outputs and/or outcomes for
each grant program; categorize each grant program based on a taxonomy of anticipated
results; and provide additional guidance and/or support for the evaluative and evidence
building activities that are appropriate for each category of program. EPA programs receiving
BIL, ARP, and IRA funding must develop program logic models and articulate the program theory of
change. This should help answer the questions: 1) What does the EPA expect the grantee to
accomplish within the timeframe of the grant (grant-level) and 2) what does the EPA expect the
program to accomplish overall (program-level)? Building on these requirements, the EPA should
systematically and centrally document these anticipated outcomes at the grant- and program-level to
determine and characterize each grant program by type of anticipated results. Following the
description of each program category, the text explains the potential types of support the EPA can
provide to programs falling within each of those categories.

CATEGORY A: OUTPUT FOCUSED
•	Not focused on achieving long-term environmental or human health outcomes during the
lifetime of the grant
•	Do not have predictable, specific long-term outcomes

CATEGORY B: ANTICIPATED LONG-TERM OUTCOMES
•	Not focused on achieving long-term environmental or human health outcomes during the
lifetime of the grant
•	Have predictable, specific long-term outcomes

CATEGORY C: ACHIEVED LONG-TERM OUTCOMES
•	Programs focused on achieving environmental or human health outcomes during the
lifetime of the grant

a. Category A: Output Focused. Programs that are not focused on achieving long-term
environmental or human-health outcomes, either during the lifetime of the grant or post
close-out, and that do not have predictable, specific long-term outcomes. These types of programs
may eventually affect environmental or human health outcomes, but the pathway to change is
indirect, and the specific anticipated long-term outcomes are not predictable. An example here is
the Environmental Education grants program. Programs without predictable, specific long-term
outcomes are not conducive to outcome measurement in the short-term cycle of grant funding or
after grant close-out due to extensive confounding factors. For example, Environmental
Education grants aim to increase environmental literacy (short-term outcome/ knowledge change)
for the purposes of, among others, increasing community members' engagement with
environmental conservation, stewardship, and/or advocacy efforts (medium-term outcome/
behavioral change). It would be reasonable to expect grantees to collect data on the short-term
outcome of increased environmental literacy within the time frame of the grant. It may also be
reasonable to expect grantees to track and collect some information about behavioral changes
within the time frame of the grant. However, behavioral changes resulting from the increased
knowledge may also continue to occur outside the 2-year window. It would not be feasible to
document and attribute changes in environmental conditions to behavioral changes motivated by
the grant program. Another example of this type of grant may be the Environmental Justice Small
Grant Program.

• Support. The EPA should support regular development of grantee-specific case studies
that highlight the contribution of the EPA's grant program to additional behavioral changes
and subsequent potential long-term outcomes. The collective set of case studies can speak
to the EPA's contribution to environmental results.

b.	Category B: Anticipated Long-Term Outcomes. Programs that are not directly focused on
long-term environmental or human-health outcomes but have predictable, specific long-
term outcomes post close-out. An example here might be where a program tracks an early
outcome which results in the predictable and specific long-term outcome of improving
environmental and/or human health but the program does not currently characterize or quantify
these long-term benefits. A specific example here is the Leaking Underground Storage Tank (LUST)
Trust Fund Program. The program first identifies releases from LUST sites (an output),
resulting in an increase in the EPA's and the responsible parties' knowledge of the issue, a short-
term outcome. With this change in awareness, the program completes cleanups of petroleum
releases from federally regulated underground storage tanks, a mid-term outcome. These cleanups
result in the predictable and specific long-term outcome of improving environmental and human
health. The program captures metrics on the number of cleanups completed, but the program does
not characterize or quantify long-term benefits in their regular performance reports. Technical
information about the site characteristics may help inform the resulting environmental and human
health benefits from the site cleanup.

• Support. To determine long-term outcomes and results, the EPA should consider support
for:

•	Improved identification and documentation of the potential human and environmental
benefits associated with known outcomes and existing program metrics. Building
from the LUST example above, in lieu of reporting just the total number of sites
cleaned up, the EPA could benefit by categorizing the sites by specific characteristics
indicative of environmental and/or human health benefit. Example characteristics
may include population density in proximity to the site, media affected (soil, groundwater,
surface water, drinking water, etc.), migration off-site (yes/no), and total acres and/or
stream miles affected. This could be a standalone exercise to better communicate the
environmental and human health results or could lead into modeling or program
evaluations as described below (a brief sketch of such categorization follows this list).

•	Modeling that can take standard grantee reported program metrics (and other
associated parameters) to estimate longer-term environmental and/or human health
results.

•	Regularly occurring program-level evaluations. These evaluations would build on the
short- and medium-term outcome data collected from grantees and would also
involve additional research. This may even include environmental research and
sampling to determine the environmental conditions resulting from grant program
efforts. Or it may involve interviews with grantees to collect information on
additional results from the grant funding. The EPA should consider opportunities to
further engage ORISE fellows to conduct program evaluations across the Agency
(one interviewee indicated their program used an ORISE fellow to conduct a program
evaluation).
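
As a minimal sketch of the site-categorization idea above, the example below summarizes each cleanup
by benefit-relevant characteristics instead of a bare count of sites cleaned up. The Site fields, the
population-density threshold, and the records are hypothetical illustrations, not EPA definitions.

from dataclasses import dataclass

@dataclass
class Site:
    site_id: str
    population_density: float  # people per square mile near the site
    media_affected: set        # e.g., {"soil", "groundwater"}
    migrated_offsite: bool
    acres_affected: float

def benefit_profile(site: Site) -> dict:
    """Summarize one cleanup with benefit-relevant characteristics."""
    return {
        "site_id": site.site_id,
        # Hypothetical cutoff for flagging densely populated surroundings.
        "near_population": site.population_density > 1000,
        "drinking_water_involved": "drinking water" in site.media_affected,
        "migrated_offsite": site.migrated_offsite,
        "acres_affected": site.acres_affected,
    }

cleanups = [
    Site("LUST-001", 2400.0, {"soil", "groundwater"}, True, 1.5),
    Site("LUST-002", 80.0, {"soil"}, False, 0.3),
]
for s in cleanups:
    print(benefit_profile(s))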

c.	Category C: Achieved Long-Term Outcomes. Programs focused on achieving
environmental or human health outcomes during the lifetime of the grant. These programs
either have a grant period long enough to observe long-term environmental or human health
outcomes or have project activities directly affecting environmental or human health changes.
Two examples of this program type include the Pollution Prevention (P2) program and the DERA
State and National grant programs, both of which require grantees to report directly on air
emissions reductions in addition to other environmental benefits.

• Support. Programs in the third category are in the best position to report on the
environmental results from their programs and are often already doing so. Like the second
category, these types of programs may benefit from modeling tools that can take their
existing environmental results and calculate alternative environmental results (e.g., air shed
modeling from reduced emissions to determine location-specific benefits).
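
The taxonomy above reduces to two documented attributes per program. A minimal sketch, assuming
hypothetical boolean attribute names (the examples in the comments come from the category
descriptions above):

def categorize(outcomes_during_grant: bool,
               predictable_long_term_outcomes: bool) -> str:
    if outcomes_during_grant:
        return "Category C: Achieved Long-Term Outcomes"
    if predictable_long_term_outcomes:
        return "Category B: Anticipated Long-Term Outcomes"
    return "Category A: Output Focused"

# Environmental Education grants: no outcomes within the grant period and
# no predictable, specific long-term outcomes.
print(categorize(False, False))  # Category A
# LUST Trust Fund Program: predictable long-term outcomes post close-out.
print(categorize(False, True))   # Category B
# P2 and DERA: report environmental results during the grant.
print(categorize(True, False))   # Category C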

2. Support purposeful, flexible changes to reporting and storage systems. Overall, any data
reporting and storage system should be flexible to accommodate the variety of data types and data
reporting schedules associated with grant implementation. Across all grant reporting and storage
systems, programs should aim to better integrate characteristics of the "ideal" grant database as
informed by the Year 1 survey and Year 2 interviews. The database should do the following (a minimal
schema sketch appears after this list):

•	Allow for grant administration and management.

•	Allow for measures and metrics tracking - including documentation of the metric target (for
comparison to results).

•	Have the ability for users to input data directly. Data entry may come from the grantee (e.g.,
states) or from POs overseeing grant implementation (depending on the sophistication of the
grantee); the database should allow for both scenarios.

•	Allow for quantitative standard and other non-standard metrics. Specifically, data storage
abilities should include:

o Quantitative data

-	Select standard metrics across grantees.

-	Other non-standard quantitative metrics.

o Narrative data. This could be an open field that allows users to "tell the grant story." The
database would ask users to develop the narrative text in one field and to use a standard
set of fields for characterizing the narrative information, such as keyword tags. This
characterization allows for easier searching and understanding of the content. For example,
an associated narrative field on "type of story" may include the following options:
success story; lesson learned; explanation of confounding factors, etc.
o Centralized document storage and organization.

•	Support use of the data, including:

o QA/QC capabilities and processes.

o Searchability, query, data retrieval, export, and visualization (e.g., data dashboards) -
including searchability within the documents (i.e., all documents text readable). This
searchability (and narrative characterization above) helps ensure that if different
administration priorities emerge in the future, staff may be able to identify and obtain
relevant information more easily and retroactively.

•	Include continuous improvements to enhance database functionality and meet changing user
needs over time.
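
A minimal schema sketch of these characteristics, using SQLite for illustration. All table and column
names are hypothetical; a production Agency system would need much more (users, QA/QC workflow,
document storage).

import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
-- Standard metrics carry a documented target for comparison to results.
CREATE TABLE standard_metrics (
    grant_id TEXT, metric TEXT, target REAL, result REAL);
-- Non-standard metrics vary by project, so they are stored as pairs.
CREATE TABLE nonstandard_metrics (
    grant_id TEXT, metric TEXT, value REAL);
-- Narrative entries pair free text with a standard "type of story" field
-- and keyword tags so staff can search them retroactively.
CREATE TABLE narratives (
    grant_id TEXT, story_type TEXT, keywords TEXT, narrative TEXT);
""")
conn.execute("INSERT INTO standard_metrics VALUES ('G-1', 'acres_restored', 50, 62)")
conn.execute("INSERT INTO narratives VALUES "
             "('G-1', 'success story', 'climate,wetlands', "
             "'Restored wetland buffers against storm surge.')")

# Retroactive search: find narratives relevant to a newly emerged priority.
print(conn.execute(
    "SELECT grant_id, narrative FROM narratives "
    "WHERE keywords LIKE '%climate%'").fetchall())

# Compare results to documented targets program-wide.
print(conn.execute(
    "SELECT metric, SUM(result), SUM(target) FROM standard_metrics "
    "GROUP BY metric").fetchall())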

To meet the needs of a centralized Agencywide outcomes database across all EPA grants, the EPA
needs to engage in a purposeful, extensive, cross-Agency effort to connect with each grant program
across all media types to identify their user needs. Each media program often has its own
vocabulary for communicating success. Several program offices also already have databases for storing
and reporting both grant and non-grant data that function well for their existing needs.

3.	Provide additional guidance and templates from EPA HQ on implementing the current
administration's priorities. This would address interviewees' requests for additional guidance from
EPA HQ on implementing administration priorities (currently focused on equity and climate),
including setting appropriate metrics for these activities, and for standard templates to collect data
from grantees.

4.	Increase communication internally across the EPA. The Evidence Act - Grant Commitments Met
Workgroup identified two specific areas where increased intra-EPA communication would be
beneficial: between administrative and technical staff to ensure quality and relevancy of grantee-
reported data (especially important for PPGs); and between HQ staff and those implementing the
grants to increase understanding of how grant-level data rolls up to communicate program-level
outcomes.

Next Steps on Year 3

Based on the data analysis plan and the information learned from Year 1 and Year 2 of this study, the
Evidence Act - Grant Commitments Met Workgroup will determine specific activities for Year 3 of the
study. This may include identification of common data fields across media programs for potential
inclusion in a database. In Year 3, the Evidence Act - Grant Commitments Met Workgroup will continue
to support the Agencywide effort to develop an enterprise approach to support EPA grant programs, with
goals of improving the full life cycle of EPA grant management and grantee experience with EPA
systems. The Evidence Act - Grant Commitments Met Workgroup will provide vital input on best
practices for tracking post-award outputs and outcomes.

INTRODUCTION + PURPOSE

In September 2020, the EPA developed a Learning Agenda that identified three Learning Priorities. The
EPA added a fourth learning priority in September 2021, with the full draft Learning Agenda. The
Learning Agenda stems from the Foundations for Evidence-Based Policymaking Act (Evidence Act),
which provides a framework to promote a culture of evaluation, continuous learning, and decision making
using the best available evidence. As part of the Learning Agenda, the EPA has initiated efforts to:

(1)	Develop priority questions.

(2)	Develop capacity to undertake new evidence-building activities.

(3)	Take the first step in developing a Learning Agenda that will inform the FY 2022-2026 EPA
Strategic Plan.

Grant Commitments Met is one of the Learning Priorities in the Learning Agenda. Every year, the EPA
awards over $4 billion in grants and other assistance agreements. New Agency funding provided by the
American Rescue Plan,3 Bipartisan Infrastructure Law,4 and Inflation Reduction Act5 to fund grants and
other assistance agreements underscores the importance of this Learning Priority.6 The EPA helps to
protect human health and the environment through these grants and the work of its grantees. The
management and tracking of the individual grant awards are dispersed amongst staff throughout
headquarters (HQ) and the EPA's ten regional offices, which makes tracking results at the national level
challenging.

The EPA's Office of Congressional and Intergovernmental Relations (OCIR) established the Grant
Commitments Met Workgroup to address the Priority Questions in the Interim Learning Agenda.
Subsequently, the Workgroup engaged Industrial Economics, Inc. (IEc) to provide support to the Grant
Commitments Met Learning Agenda, including development of the workplans, data collection, and report
and presentation development.

The initial phase (Year 1) of work addressed the question: How do the EPA's existing grant award and
reporting systems identify and track grant commitments? The workgroup organized an extensive
survey that gathered 462 responses from grant programs across the Agency. The workgroup analyzed
survey responses to identify what data (e.g., outputs and outcomes) programs collect and how programs
report on grant activities across the EPA. Year 1 also included a request for National Program Managers
(NPMs) to provide background information on the EPA's grant programs. The analysis of the survey data
and documents provided by the NPMs is available in a Year 1 report, which the Agency made public in
September 2022.

3	H.R.1319 - American Rescue Plan Act of 2021

4	H.R.3684 - Infrastructure Investment and Jobs Act

5	H.R.5376 - Inflation Reduction Act of 2022

6	The American Rescue Plan, Bipartisan Infrastructure Law, and Inflation Reduction Act provide around $100 million, $60.89
billion, and $350 million in additional EPA funding, respectively, for a total of around $61.34 billion in additional funding. See
https://www.epa.gov/arp/about-epas-american-rescue-plan-arp-funding, https://www.epa.gov/infrastructure/explore-epas-
bipartisan-infrastructure-law-funding-allocations, https://www.epa.gov/inflation-reduction-act/inflation-reduction-act-
programs-fight-climate-change-reducing-embodied, accessed January 23, 2023.

This year of the project (Year 2) addressed the following question: What EPA practices and tools (1)
effectively track grantee progress towards meeting workplan grant commitments including outputs
and outcomes, and/or (2) support communication of national program level outputs and outcomes?

Year 2 data efforts include 31 in-depth interviews and additional analysis of data previously collected in
the Year 1 survey. The workgroup selected grant programs using pre-defined criteria for individual
or small-group interviews with POs or NPMs.

The three-year arc of the study is summarized below:

Year 1 - Establish Baseline: How do the EPA's existing grant award and reporting systems identify
and track grant commitments?

Year 2 - Effective Practices + Tools: What EPA practices and tools (1) effectively track grantee
progress towards meeting workplan grant commitments including outputs and outcomes, and/or (2)
support communication of national program-level outputs and outcomes?

Year 3 - Results Achieved: Are the commitments established in the EPA's grant agreements
achieving the intended environmental results?


Priority Questions - Year 2

The priority sub-questions associated with Year 2 include:

b.	What EPA practices (1) are effective in tracking grantee progress towards meeting
workplan grant commitments including outputs and outcomes, and/or (2) support
communication of national program level outputs and outcomes?

i. What quality control and quality assurance practices help ensure that the data are
accurate and complete?

c.	What EPA tools (1) are effective in tracking grantee progress towards meeting workplan
grant commitments including outputs and outcomes, and/or (2) support communication
of national program level outputs and outcomes?

d.	What tools do programs use to analyze environmental and human health outcomes with
grantee-reported data?

e.	What factors might affect a grant program's ability to (1) effectively track grantee
progress towards meeting workplan grant commitments including outputs and outcomes,
and/or (2) support communication of national program level outputs and outcomes?

i. For relevant programs (e.g., programs that have equity or climate impacts), what
factors might affect a grant program's ability to effectively track grant
commitments related to equity and climate?

f. What are promising practices and tools demonstrated by some grant programs that could
help other grant programs effectively track grant commitments data?

Year 2 work also used interviews to follow up on Year 1 questions that needed additional clarification.
The questions selected for follow up are:

1. How do the EPA's existing grant award and reporting systems identify and track grant
commitments?

a.	Do the grant programs have specific targets associated with their outputs/outcomes?

b.	iii.7 To what extent does the data reported by grantees provide information that currently
allows the EPA to measure outputs, outcomes, and impacts related to equity and climate
change?

c.	How do grant programs identify relevant grant commitments to track?

d.	What data reporting processes, tools, and systems do the EPA's grant award programs
use?

7 The question numbering corresponds to the original question numbering in the interview methodology, shown in Appendix A: Methodology.

Data Collection Summary

Year 2 data collection efforts included 31 purposefully selected semi-guided interviews with 26 POs from
all 10 EPA regions and five NPMs. At the request of targeted interviewees, several interviews included
additional participants; overall, the interviews involved 38 people across 31 interviews. The interviews help
answer the Year 2 priority evaluation question and sub-questions and fill in remaining data gaps from
Year 1 efforts. We selected interviewees using criteria detailed in Appendix A: Methodology. The
Evidence Act - Grant Commitments Met Workgroup designed the interview selection criteria to ensure
representation across the Agency with an emphasis on current administrative and funding priorities.
During the interview process the Evidence Act - Grant Commitments Met Workgroup and EPA program
staff worked together to make any necessary updates to the original list based on the availability or status
of individuals initially identified. The appendix also contains the list of programs interviewed for this
effort. When applicable, the Evidence Act - Grant Commitments Met Workgroup also reviewed
documents provided by interviewees during or after the interview process.

The Evidence Act - Grant Commitments Met Workgroup developed separate interview guides for the
POs and NPMs prior to conducting the interviews. Interviewers used the guides to conduct interviews via
Microsoft Teams. The Evidence Act - Grant Commitments Met Workgroup also used a notetaker and the
recording and transcription function in Microsoft Teams to record responses, if approved by the

7 The question numbering corresponds to the original question numbering in the interview methodology, shown in Appendix A: Methodology.

11


-------
interviewee. After interview completion, the Evidence Act - Grant Commitments Met Workgroup
systematically organized and stored interview responses in Excel, tying the interview questions and
associated evaluation questions to each interviewee. The workgroup employed a systematic, qualitative
coding process to summarize responses, in which interviewee responses dictated what the workgroup
coded. After coding, we conducted a crosswalk and review of codes for clarity and consistency across
the different Evidence Act - Grant Commitments Met Workgroup coders.
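
To make the tallying step concrete, the sketch below counts response-driven codes by evaluation
sub-question, counting each interview at most once per code. The codes, sub-question labels, and
interview IDs are hypothetical examples, not actual interview data.

from collections import Counter

# Each tuple: (interview id, evaluation sub-question, code applied).
coded_responses = [
    ("INT-01", "2a", "good rapport with grantees"),
    ("INT-02", "2a", "good rapport with grantees"),
    ("INT-02", "2a", "regular progress updates"),
    ("INT-03", "2d", "heavy EPA staff workloads"),
]

def tally_by_question(rows):
    """Count how many interviews raised each code per sub-question,
    counting an interview at most once per code."""
    seen = {(i, q, c) for i, q, c in rows}
    return Counter((q, c) for _, q, c in seen)

for (question, code), n in sorted(tally_by_question(coded_responses).items()):
    print(f"{question}: {code} ({n})")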

Overall, the coded interview responses provide comprehensive and detailed answers to the priority
questions, building on what we learned from the Year 1 survey and NPM Information Request. More
information on the methodology can be found in Appendix A: Methodology and Appendix B:
Interview Guides.

RESULTS BY SUB-QUESTION

This section presents the findings from each of the Year 2 evaluation sub-questions and helps answer
remaining sub-questions from the Year 1 work that the workgroup was unable to fully answer through the
survey and document review.

1a. Targets Associated with Grant Outcomes

Year 2 interview data indicate interviewees do not set specific targets associated with outputs and/or
outcomes at the grant program level. This confirms the results of the Year 1 NPM Information Request
document review; only 21% of the 72 respondents in Year 1, or 15 programs, had specific targets
associated with their outputs and outcomes, as evidenced in guidance documents and/or summary
reports.8 For the purpose of this study, we define "target" as an articulated, measurable goal associated
with the grant program and project's outputs and/or outcomes.

None of the NPM interviewees in Year 2 indicated the presence of specific targets for success at the grant
program level. However, some interviewees did indicate the presence of program metrics that are set at
the NPM level. For example, NPMs may set metrics for measuring site cleanup activities and outputs.
Additionally, the NPM interviewees mentioned priority activities and targets that can be set by the
grantee. Using the output metric example, state grantees may set targets for the number of sites the
project should clean up.

The NPM interviewees cited a variety of reasons for not having grant program wide targets, including:

Grant projects had too much variation to have uniform targets. Within the grant program,
projects inherently varied in topic and scope, making uniform targets nonsensical.

The NPM guidance does not include metrics, including a lack of metrics specific to
Performance Partnership Grants (PPGs).

A rigid framework with specific metrics and targets was not productive. In a previous
iteration of the program when they did have a rigid framework, states and tribes were not
interested in participating in the program. Once they modified the program so grantees played a
greater role in setting priorities, they received more interest in program participation.

Often, if specific targets exist, grant programs negotiate them at the project level in the workplan. The
lack of specific, grant program-level targets may be indicative of the lack of well-articulated program
logic. NPMs and program management should still have a clear understanding of expected program and
project achievements and manage the program as such.

8 The Evidence Act - Grant Commitments Met Workgroup characterized report types received via the NPM Information Request
in several categories to help with analysis; these include guidance documents, other approach-for-success documents, and
summary reports. Guidance documents cover the most recent objectives, targets, and specific programmatic requirements
for the grant program. Approach for success documents include other supporting documents that inform and define how the
program achieves success. Summary reports consolidate and present information across grantees in the form of a full report,
simple data report, summary report rollup or basic data report.

1biii.9 Data Collected on Equity + Climate

The Evidence Act - Grant Commitments Met Workgroup
collected data in Year 2 interviews that confirm Year 1
results: grantees generally do not report data that capture
outputs or outcomes related to equity and climate change. In
Year 1, the Workgroup reviewed program guidance
documents provided in the NPM Information Request and
searched open-ended survey responses for the presence of
metrics addressing equity and climate. This review indicated
limited presence of metrics addressing equity or climate. The
presence of metrics addressing equity or climate was
primarily limited to grant programs inherently focused on
these topics, e.g., emission reduction programs.

Despite the lack of data on these topics, most interviewees
indicated that program activities do address equity (27) and
climate (19). The emphasis on these topic areas is actively
evolving in response to the current administration's priorities.
Interviewees from programs that had activities addressing
equity and climate also:

•	Prioritize funding for projects involving overburdened and underserved communities (18)
and climate activities (9). Programs often implement this in the application process, awarding
extra points to projects involving equity or climate. This category also includes programs that
target outreach and technical support to underserved and overburdened communities to
encourage their participation.

•	Target program activities on addressing equity (7) and climate change (5). This includes
programs that incorporate equity and climate change into program metrics, as well as programs
that ensure project activities consider equity and climate change issues.

•	Inherently focused on equity (6) and climate (5). These grants centered around either
disadvantaged and underserved communities or climate change, e.g., environmental
justice (EJ) specific grant programs.

Justice40 Initiative

Justice40 is a federal, government-wide effort established by President Biden's Executive Order (EO)
14008, "Tackling the Climate Crisis at Home and Abroad," in January of 2021. The goal is to
redistribute federal investments to deliver 40% of benefits to disadvantaged communities. A fact sheet
summarizes how the executive order directs the Biden administration to combat the climate crisis
through foreign policy, infrastructure, energy, conservation, and the economy.

The EPA has integrated the Justice40 Initiative into Agencywide operations, which drives interest in
this topic as it relates to grant programs:

•	The EPA Strategic Plan incorporates Justice40 throughout the objectives of the Plan.

•	For example, EPA identifies the EPA Drinking Water State Revolving Fund as one of the 21
priority programs in the Justice40 pilot program.

As mentioned above, despite the emphasis on program activities addressing equity and climate change,
interviewees often do not collect data on the effects of these activities. However, some interviewees
reported that their programs collect data that they could use to understand the EPA's activities related to
equity and climate. These types of data include:

Location of project activities (18). The EPA could use location data to retroactively assess the
impacts on overburdened and underserved communities. However, programs fell short of collecting
data on the location(s) where grant benefits accrue.

Programs that retroactively collect climate data for existing grants (2). Some interviewees
looked through previously submitted progress reports to pull out climate related data in response
to administration priorities.

Programs that only collect data for equity (2) and climate (2) if specified in the workplan. In
these instances, collection of data related to the effects of equity and climate activities is not
consistent across grantees. Thus, the presence of these data is ad hoc and unreliable program
wide.

Programs that could conduct additional effects analysis if necessary (1). Specifically,
emissions reductions metrics could be used to model the effect of emissions reduction in an
airshed and effects on overburdened and underserved populations, though this is not currently
being completed.

For interviewees that reported collecting data on the effects of project activities on equity, the data
collected included:

Output data (5), such as number of individuals trained from a specific population. This also
includes data on the completion of activities related to equity, such as indicating progress on EJ
targets and outreach activities.

Short term outcome data (2), such as increased awareness and knowledge gained.

For interviewees that indicated their programs collect data on the effects of project activities on climate
change, the data included:

Output data (6). This also includes activities related to climate such as habitat restoration and
climate adaptation planning activities.

Short term outcome data (1), specifically, increased awareness.

Mid-term outcome data (6), e.g., emissions reduction metrics.

Several interviewees mentioned their program plans to collect these data. At least eight programs
indicated plans to collect data related to the effects of equity, and at least two indicated plans to collect
data related to the effects of climate change.10

Overall, Year 2 results build upon Year 1 findings to highlight the recent emphasis on equity and climate
within program activities in response to administration priorities. However, most interviewees do not
collect data that allows the EPA to quantify the effects of these activities on equity and climate.
Interviewees stated that the creation of HQ program guidance on specific metrics and tracking procedures
would facilitate the collection of these data at the regional level.11 In general, a more proactive approach
from HQ programs for identifying and communicating administration priorities would improve the
Agency's ability to report and track these items at the grant program level.

9 The question numbering corresponds to the original question numbering in the interview methodology, shown in Appendix A: Methodology.

10 Note that due to the guided nature of the interviews, the Evidence Act - Grant Commitments Met Workgroup did not ask all
programs this question. Refer to Appendix B: Interview Guides, questions 3 and 5.

1c. Process for Identifying Data to Track

Year 2 interviewees did not indicate a standard process across all EPA NPMs to identify relevant grant
commitments to track. Interviewees indicated that grant programs generally follow some process for
identifying what to track; however, the approach varied across the Agency. Although Year 1 did not have
a data source that directly addressed this evaluation question, some open-ended survey responses
described how grant programs identify relevant grant commitments to track.12 The survey data indicated
that statute or Agency long-term performance goals define some grant programs' outputs, while other
grant programs determine metrics to track using the past performance of grant projects or a program-level
action plan.

In Year 2, interviewees cited three sources that inform the metrics the grant program tracks:

Language in the statute, where legislation dictates program priorities and information programs
must track and report, e.g., the Brownfields program has long-standing statutory metrics that the
program updates over time to reflect changing grant work. Interviewees did not describe a
standard process for updating these metrics.

NPM guidance documents, which detail standard metrics for the grant program. Another
interviewee indicated that the NPM decides what information programs track in addition to
statutory requirements, based on program priorities.

Negotiation with grantees to determine outputs and outcomes, in which tracked information is
based on agreement between grantees and EPA grant POs.

Year 2 results indicate that external factors can also impact what data programs track. External factors
include other legislation, such as the Paperwork Reduction Act, which limits the amount of survey data a
program can collect from grantees. External factors may also be national standards set outside of the
program, e.g., the Brownfields program indicated that cleanups must meet American Society for Testing
and Materials (ASTM) standards, dictating certain program metrics.

1d. Grant Program Processes, Tools, Systems

Year 1 survey results indicate that the EPA's grant programs most commonly use Word documents for
collecting data from grantees and SharePoint/Teams/OneDrive for storing data collected from grantees.

11	Four interviewees indicated they are waiting for headquarters to develop metrics related to equity and climate change to
guide their data collection. One program indicated headquarters was working on a tracking system to facilitate the tracking of
equity data.

12	The Evidence Act - Grant Commitments Met Workgroup reviewed open-ended responses to Questions 15 and Question 19.
Question 15 asks how strongly respondents agree that the types of output data that the program collects enable the program to
effectively track progress and/or recognize and address problems. Question 19 asks how strongly respondents agree that the
types of outcome data that the program collects enable the program to effectively track progress and/or recognize and address
problems.

Other popular reporting mechanisms from Year 1 include Adobe PDF, Excel, and "text in the body of an
email." A substantial proportion of Year 1 respondents indicated the use of databases and EPA data
systems to store data collected from grantees. See Figures 1 and 2 below. Overall, Year 2 results confirmed
what we heard in Year 1 regarding data reporting and storage systems. Year 2 and Year 1 data show
extensive use of Microsoft Office Suite programs and Oracle databases to report and store grantee data.

Figure 1. Reporting mechanisms grantees use to report to the EPA

Word document	420
Adobe PDF	273
Excel workbook	262
Text in the body of an email	233
EPA data system	160
Database	81
Other	66
Microsoft Forms	6
Microsoft Access	4
Survey	2
No mechanism	1

'Other' reporting mechanisms:
Meeting	28
Unspecified	19
Phone call	7
Hard copy documents	7
Video	4
Photos	4
Map	2
Presentation	2
Flash drive	2

Figure 2. Data storage mechanisms for data collected from grantees

SharePoint/Teams/OneDrive	335
Database	282
A desktop folder	273
EPA data system (GRTS, etc.)	195
Excel workbook
Other
No data storage mechanism	11
Microsoft Access	4

'Other' mechanisms of data storage:
Hard copy documents
Electronic files
Email	8
Adobe	2
Unspecified	1

In Year 2, the Evidence Act - Grant Commitments Met Workgroup explored grantee reporting and
storage mechanisms in interviews to learn about any potential changes since Year 1. Data reporting and
storage systems used in Year 2 echoed the Year 1 findings. However, interviewees confirmed that these
systems are actively changing as programs think about ways to improve their reporting and storage
mechanisms. For example, the Leaking Underground Storage Tank Trust Fund Program indicated some
recent modifications to the LUST-4 database to include location specific data and a new tool for
demographic layers. In some cases, external factors influenced changes; for example, the COVID-19
pandemic prompted many programs to shift from hard copies to online file storage. Other factors, such as
administration priorities, have also pushed programs to think more critically about their systems for data
reporting and storage.

Additionally, Year 2 interviews highlighted some new initiatives related to reporting and storage. For
example, one interviewee implemented Power BI, a Microsoft data visualization tool for interactive data
dashboards and storage. Another interviewee described the exploration of a data warehouse to store long-
term monitoring data.

2a. Effective EPA Practices

Year 2 interviews built on Year 1 survey information to determine effective practices that support the
EPA in tracking and communicating grant results at the grantee and national program level; both data
sources identify a variety of effective practices to track grantee progress and support consolidation and
communication of results. The most common effective practices for tracking progress centered around
grant administration that prioritizes grantee capacity and regular progress updates. For communicating
results, interviewees indicated that review and summary activities across the program are particularly
helpful practices.

Practices that interviewees indicated are effective for tracking grantee progress include:

Good rapport with grantees (14). This involves frequent and open communication between
grant POs and grantees and maintaining long-term relationships with recurring grantees.
Interviewees noted that conversations with grantees supplement reporting and help address
problems in real time. Additionally, interviewees noted that this rapport allows the space for
grantees to ask questions related to the grant. Some interviewees noted that communication
involved sending emails to remind grantees of project deadlines, which was helpful to ensure
timely completion of grant reporting and administrative tasks.

Regular progress updates (12). Interviewees indicated that progress updates were useful, both in
the form of regular progress reports and regular updates from EPA grant POs to EPA HQ
programs. One interviewee stated that it was helpful to include photographs of project activities
with these grantee progress reports. For example, one grantee in this program included before and
after photographs of the community garden built during the project, to show how a barren piece
of land turned into a flourishing garden.

Organization related practices (5). Interviewees cited practices involving data organization,
such as a central location for file storage, and grant administration organization, such as deadline
tracking. One interviewee cited the benefit of separating of administrative and program level
grant tracking activities, in which the program has two separate offices for processing grant
administrative documents and managing grant project activities.

Using grant funds for technical support (3). For example, using grant funds to purchase
technology so grantees can access the appropriate EPA reporting systems.

Practices that interviewees indicated are effective for consolidating and communicating grant program-
level results include:

Synthesis of activities, outputs, and outcomes across the grant program (5). These practices
include cross-grant evaluation and synthesis activities, and tracking targets and outcomes at the
grant program level using both project-specific data and conditions of the ecosystem (i.e.,
ecosystem targets and environmental indicators). One interviewee indicated that for one of their
major awards with multiple subgrantees, this type of program wide synthesis was required for the
lead grantee, i.e., the actual grant recipient.

Creating public summaries of program information (3), which help consolidate and
communicate the story of the grant program. Interviewees cited practices such as compiling
grantee success stories and publishing general program summaries.

Collaboration among regional offices (1), which helps to consolidate grantee progress across
regions for a grant program.

Establishing a baseline for grantee data using past grant project results (1). This allows the
grant program to establish expectations for outputs based on past projects and subsequently
communicate on progress made.

Collecting data on effective practices allows the EPA to learn about what is working across the Agency,
and what may be beneficial for other grant programs. Year 2 interviews confirmed the effective practices
found in Year 1 data.

2ai. Quality Assurance and Quality Control (QA/QC) Practices

Interviewees overwhelmingly indicated that EPA staff engage in QA/QC practices on grantee data to
ensure the data are accurate and complete (30). Overall, this provides evidence that the data grantees
provide meet EPA data quality standards when reporting on grantee outputs and outcomes.
QA/QC practices include:

General review of completeness and crosscheck of information across data sources (11). For
example, staff often check the financial information grantees provide in quarterly reports against
spending in grant information tracking systems (e.g., COMPASS). Alignment between the two
sources confirms that grantee project implementation is occurring as planned (a brief sketch of
this crosscheck follows this list).

Practices and checks required under the EPA Quality Program (7). For grantees, this may
include completing a Quality Assurance Plan (QAP). POs review the QAP and ensure adherence
to it throughout implementation of the grant.

Communication with grantee to explain and clarify data request (4). POs communicate with
grantees to clarify a data request if metrics submitted as part of a report need additional metadata.

Multi-level review of project deliverables (3). For example, in addition to the POs' review of
the data grantees submit, the regional project officer and, in some cases, the NPM will also review
and check the data.

Deep-dive review of select grants (2). Some grantees and projects merit additional close review
of project data or advanced monitoring. Some interviewees reported that the EPA randomly
selects a certain number or percentage of grantees to undergo more intense data quality checks.

Data quality checks of data in national-level database (2). Two interviewees indicated HQ
programs conduct a data check on the information stored in their grantee national-level database,
including a review of missing data.

Site visits (1). One interviewee indicated site visits occur every three years.

Comparison of reported results to calculations using EPA tools (1). One program that collects
emission result data from grantees indicated that it checks the reported results against calculations
using its own EPA tool.
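
As a concrete illustration of the crosscheck and missing-data review described above, the following is a minimal sketch in Python. The two data sources mirror the quarterly report vs. tracking system (e.g., COMPASS) comparison from the first item in this list, but the column names, values, and the 5% tolerance are assumptions, not EPA specifications:

```python
import pandas as pd

# Grantee-reported spending from quarterly reports (hypothetical values).
quarterly = pd.DataFrame({
    "grant_id": ["G-01", "G-02", "G-03"],
    "reported_spend": [25000.0, 40000.0, None],  # G-03 is missing a value
})

# Spending recorded in a grant tracking system (values illustrative).
tracking = pd.DataFrame({
    "grant_id": ["G-01", "G-02", "G-03"],
    "system_spend": [25000.0, 52000.0, 18000.0],
})

merged = quarterly.merge(tracking, on="grant_id", how="outer")

# Flag missing grantee data for follow-up.
missing = merged[merged["reported_spend"].isna()]

# Flag disagreements beyond a tolerance (here 5%, an assumed threshold).
both = merged.dropna(subset=["reported_spend"])
mismatch = both[
    (both["reported_spend"] - both["system_spend"]).abs()
    > 0.05 * both["system_spend"]
]
print("Missing reports:\n", missing)
print("Spend mismatches to investigate:\n", mismatch)
```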

In addition to the QA/QC practices described above, some interviewees brought up limitations or
considerations regarding the EPA's ability to complete QA/QC practices for their data. One interviewee
suggested that implementation of practices may only apply to certain grants, grantees, or when an issue
arises. Another indicated that although the EPA requires these practices, POs may not always have the
time or resources to complete them. Additionally, implementation of QA/QC practices may be difficult
for administrative staff that do not have the technical knowledge needed to review data reported by
grantees.

2b. Effective EPA Tools

Program databases are the most frequently reported effective tool. Interviewees also emphasized
the importance of continually improving and updating databases to meet user and data needs. Other
effective tools for tracking and communicating grant results at the grantee and national program level
include standard templates (most often in Excel), SharePoint, and spreadsheets for tracking program-
and project-level data.

The Year 1 report presented findings indicating that the EPA has approximately 55 databases that contain
some sort of EPA grant data. Findings from the interviews reiterate the importance of these databases to
effectively tracking and communicating data. Of those who indicated the importance of databases, three
interviewees specifically stated the importance of continually improving databases to meet user and data
needs.

Regarding effective EPA tools, one interviewee specifically noted that using spreadsheets to consolidate
data is easier and more user friendly than databases. Interviewees also reported several other types of
tools that help track grantee progress, such as Power BI; a program-specific mapping tool; the EPA's
EnviroAtlas, which the program uses to identify EJ communities; the EPA's EJScreen; the EPA's
electronic Greenhouse Gas Reporting tool (e-GGRET); and a program-specific set of monthly budget
tracking spreadsheets.

2c. Tools for Outcome Analysis

Most interviewees indicated that their program does not use additional tools or models with grantee data
to determine alternative outputs or outcomes. Of the eight interviewees indicating the program uses
models to further analyze data, three indicated this occurs at the national (not regional) level and that
programs may utilize these tools on a case-by-case basis. However, at least three interviewees indicated
interest in using these types of tools although they do not currently do so.
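
As a hedged illustration of this kind of follow-on analysis, the sketch below converts a grantee-reported output (energy saved) into an alternative outcome (avoided CO2 emissions), echoing the example used in the interview guides (Appendix B). The emission factor is an assumed placeholder, not an official EPA value:

```python
# A minimal sketch of the follow-on analysis interviewees described:
# converting a grantee-reported output (energy saved) into an alternative
# outcome (avoided CO2 emissions). The factor below is an assumed
# placeholder for illustration only, not an official EPA value.

KG_CO2_PER_KWH = 0.4  # assumed grid emission factor

def avoided_emissions_kg(kwh_saved: float) -> float:
    """Estimate avoided CO2 emissions from grantee-reported energy savings."""
    return kwh_saved * KG_CO2_PER_KWH

# Example: a grantee reports 120,000 kWh saved over the project period.
print(f"{avoided_emissions_kg(120_000):,.0f} kg CO2 avoided")
```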

2d. Factors Affecting Tracking + Reporting

The most frequently reported challenges in tracking grantee progress include: heavy EPA staff
workloads, lack of grantee capacity (including grantee staff turnover), inability to collect data after grant
close-out, the challenging nature of tracking collective progress, existing data requests and metrics that
fail to capture all project results (e.g., narrative information), and difficulty in quantifying some program
impacts. The text below organizes factors by challenges in tracking grantee progress, challenges in
communicating national progress, and other types of challenges:

• EPA staff workloads. The most common challenge interviewees reported in tracking grantee
progress is overburdened EPA staff (11), although some noted that this was actively changing
with more recent hires. Relatedly, three interviewees reported a disconnect between administrative
and technical staff as an impediment to tracking (particularly difficult for PPG management). One
interviewee wanted more coordination between administrative and technical staff when
developing grantee workplans to ensure more relevant data collection. EPA staff turnover also
poses a challenge because it results in loss of institutional knowledge and relationships between
the EPA and grantees. For programs that rely on data collection from states, the brief period
between when state grantees report their data and when the national program wants the data for
reporting limits a PO's ability to QA/QC data. One interviewee also reported that manually
entering data from the grantee into an internal database is time consuming and error prone.

•	Grantee capacity. Similarly, lack of grantee capacity also poses a challenge to tracking grantee
progress (8). Grantees also face staff turnover, causing both a loss of institutional knowledge for
the grantee and a loss of relationships between the EPA and grantees. One interviewee reported the
lack of EPA regional coordination across state grantees to facilitate knowledge sharing among
states as another challenge. Another interviewee noted that the EPA does not require program
evaluation as part of its grants, limiting the grantee's ability to report on progress.

•	Inability to collect data after grant close-out. A driving factor limiting the ability of the EPA to
track grantee progress for projects with long-term outcomes outside the grant timeframe is the
inability to collect data after grant close-out (5). Grantees are not required to respond to EPA
requests after grant close-out. One interviewee noted that even when permitted, POs do not
always follow up with grantees after grant closeout.

•	Metrics that the EPA requests grantees provide do not capture all project results (4).
Interviewees reported a lack of qualitative data, non-standard metrics, and a need to recognize
that tracking dollars spent does not capture environmental benefits.

•	Difficult to quantify some program impacts (4). The environmental and human health long-
term outcomes that the EPA programs aim to affect are often difficult to measure.

Other less frequent factors that pose challenges to tracking grantee progress include lack of adequate tools
(3), incomplete reporting from grantees (2), too infrequent reporting from grantees (2), external factors
affecting project implementation (2; e.g., COVID-19), instances when sub-grantees are responsible for
providing reporting data (1), and a lack of a central repository for storing data and files (1).

Interviewees reported five challenges specific to communicating national grant program-level results.
Three interviewees indicated that it is challenging to track collective progress because of their program's
flexible framework and the variety of topics covered within the grant program (e.g., Regional Wetland
Program Development Grants). One other noted the difficulty in separating the impact of EPA grant
projects from other efforts, such as other agencies working in the same area. One interviewee indicated
interest in better communicating national grant program-level results but noted that their management
does not prioritize summary work and they lack program evaluation resources and expertise. On a more
procedural issue related to communication of results, one interviewee noted that in instances where a
grantee produces a case study, they face difficulty in publishing these findings on their website because it
is not a direct EPA publication.

Several other factors and challenges emerged from the interviews. Related to both tracking and
communicating, three interviewees reported that limited database functionality is an impediment. Three
interviewees (POs for Water, Brownfields, and Diesel Emission Reduction Act grant programs)
specifically noted that it is difficult to track qualitative data in their database designed to collect
quantitative data. One other interviewee noted that the EPA Wetland Program Development Grants
database does not store objectives associated with the grant (which can help with searching and filtering
results), does not allow for searching for specific documents stored in the database, and is difficult to
access when needed.

PPGs also pose unique challenges in grantee data collection and reporting. Based on internal EPA data,
around 11% of the 167 active grant programs, or 19 programs, are PPG eligible. Both PPG interviewees
indicated that PPG management is more labor intensive and time consuming for the EPA. This is as
expected since the intent of the PPG structure is to reduce burden on the grantee. PPG POs also lack
access to all the relevant media-specific databases that are covered under their PPG, hindering progress in
tracking overall PPG results. Finally, since PPG grantees do not produce individual progress reports for
each program in the grant, and instead produce a rolled-up report covering several media grant types, the
PPG PO and/or technical program contacts must read long narratives to identify the relevant information.

2di. Factors Specific to Climate + Equity

Interviewees noted several challenges specific to data collection pertaining to equity and climate.
Interviewees expressed worries about grantee reporting fatigue in having to respond to additional data
requests, challenges with identifying relevant metrics to track, limited guidance from HQ programs on
how to track priority data specifically in advance of grant solicitations, and lack of statutory authority to
collect new types of information.

Interviewees cited the following challenges in tracking grant commitments related to equity:

Challenging to identify relevant metrics (6). In general, programs struggle to find ways to
quantify information related to equity. For example, one interviewee noted that data collection
was difficult because the benefit of a grant may occur outside the scope of the project (i.e.,
benefits accrue downstream from where grant activities occur). Therefore, it is challenging to
determine what information the EPA should include in reporting. Additionally, one interviewee
mentioned a lack of HQ program guidance on how to incorporate equity related grant
commitments into grantee workplans. Interviewees expressed a need to better understand what
counts as an EJ community and the extent to which programs should consider downstream
impacts.

Grantee reporting fatigue (4), specifically related to collecting additional equity metrics outside
of previously required data.

Program does not have statutory authority to collect this info (3). Thus, programs cannot
require grantees to report on this information.

Priorities for overburdened + underserved communities conflict with grant program
purpose (1). Reported grant benefits may not actually align with community priorities. For
example, the Drinking Water State Revolving Loan Funds program reported that the EPA priority
of growing the loan does not directly align with priorities for grantees from overburdened +
underserved communities. These communities are often more focused on broader grant benefits
such as improved access to clean water.

For climate related grant commitments, interviewees cited similar challenges:

Lack of guidance from HQ programs on how to incorporate climate commitments (4).
Interviewees cited a lack of proactive guidance on what climate metrics to track and how to
incorporate them into grant commitments.

Program does not have statutory authority to collect this info (3). Thus, programs cannot
require grantees to report on this information.

Grantee reporting fatigue (2), specifically related to collecting additional climate metrics
outside of previously required data.

Retroactive data collection is more difficult (2). Requests for information on administration
priorities after programs complete grant solicitations require POs to retroactively review
previously submitted reports for climate related data. This creates extra work for POs who
already struggle with capacity issues.

Climate outcomes occur after completion of the grant (1). Thus, it is not possible to capture
the impact of climate related grant commitments within the scope of the grant.

In general, interviewees emphasized the need for a proactive approach to identifying administration
priorities to track, including clear guidance on what information HQ programs are looking for related to
administration priorities in advance of grant solicitations. Without it, grant programs must take a reactive
approach which inhibits their ability to define and track commitments related to administration priorities
effectively.

2e. Promising Practices + Tools

The promising practices and tools that interviewees reported fell into two categories: those affecting
grantee data and reporting and those affecting general grant program implementation. Practices and tools
pertaining to grantee data and reporting include program evaluation requirements for grantees, data
visualization dashboards, flexible reporting templates, and a tiered metrics approach tied to strategic
objectives. Other practices and tools relevant to grant program implementation include gathering input
from technical experts on grant projects, completing cohort building activities to build grantee capacity,
leveraging resources from other federal partners, and maintaining open communication with previous
applicants that the EPA did not award, to build their future capacity. Additional detail on practices and
tools affecting data and reporting is below:

•	Program evaluation requirements for grantees. The grant program sets commitments, updated
every 5 years, to develop an accomplishment report and program evaluation process. The
program embeds these program evaluation requirements into the grant solicitation and eventual
grant agreement.

•	Data dashboard in Power BI. The Long Island Sound Program's Power BI data dashboard
allows the program's director to interactively explore their program data, such as data across all
projects in the program. Those with access to the dashboard can filter based on an action, type of
project, contributions towards ecosystem targets, and money allocated vs. spent. The dashboard
also includes a map of project locations (latitude and longitude data). Additionally, some of the
Power BI outputs go directly into the program's public reports.

•	Flexible reporting system. A reporting system that uses a template but also allows flexibility to
collect data specific to any grant agreement.

•	External contractor assists in review of project status to assess level of completeness.
The program uses contract support to determine percent of completion based on the planned
project deliverables.

•	Long Island Sound Study metrics approach. A tiered metrics approach tied to strategic
objectives ensures that data collection occurs to determine the status of meeting objectives.

Other promising practices and tools related to overall grant program implementation are:

•	Open communication with previous applicants that the EPA did not award. The EPA's
Environmental Justice Collaborative Problem-Solving Cooperative Agreement Program also
provides extensive feedback to grant applicants that they did not award, to explain why the EPA
did not select the applicant. They keep these lines of communication open to prepare their target
grantees (small nonprofits) for the next round of funding.

•	Cohort building activities with groups of grantees. Region 10 recently started a small regional
grant program which involved several new grantees that had never received a federal award
before. The EPA played a larger role in supporting these new grantees (than they would for other
similar programs) to collectively welcome them into the grant program and establish expectations
across all recipients.

•	Increased staff numbers. This includes POs as well as other technical and support staff.

•	Creating a list of priority communities to target activities. Region 4 combined data from the
Clean Water Benefits Reporting (CBR) System, Project Benefits Reporting System (PBR),
National Incident Management System (NIMS), and permitting data to identify the communities
in highest need for state partners to reach out to. This did not require additional work for
states but better supported the EPA's EJ efforts (see the sketch after this list).

•	Leveraging resources from other federal partners. The EPA's Environmental Justice
Collaborative Problem-Solving Cooperative Agreement Program leverages resources from other
federal agencies. For example, the EPA was able to build off the grant writing training another
agency provided for their own training session.

•	Presentations to technical experts to elicit feedback on grant projects. The Long Island Sound
Program engages a broader community of technical experts to provide feedback on grantee
projects and improve the work.

•	Rebate program structure. One interviewee reported that the DERA rebate program structure is
less cumbersome than implementing a grant and may have more applicability across the Agency
for programs focused on purchasing equipment. This helps optimize the funding mechanism for
the project type.
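
A minimal sketch of the data-combining step behind the Region 4 priority-communities practice above; the system names follow those cited in the bullet, but every field, score, and the composite ranking rule are assumptions for illustration only:

```python
import pandas as pd

# Hypothetical extracts from the reporting systems cited above (values illustrative).
cbr = pd.DataFrame({"community": ["A", "B", "C"], "cbr_need_score": [3, 1, 2]})
pbr = pd.DataFrame({"community": ["A", "B", "C"], "pbr_need_score": [2, 1, 3]})
permits = pd.DataFrame({"community": ["A", "B", "C"], "open_violations": [4, 0, 5]})

# Combine the sources on a shared community identifier.
combined = cbr.merge(pbr, on="community").merge(permits, on="community")

# Assumed composite rule: sum the indicators and rank descending.
combined["priority_score"] = (
    combined["cbr_need_score"]
    + combined["pbr_need_score"]
    + combined["open_violations"]
)
priority_list = combined.sort_values("priority_score", ascending=False)
print(priority_list[["community", "priority_score"]])
```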

APPENDIX A: METHODOLOGY

This three-year project draws on multiple data sources to answer the priority questions. Key sources of
information include: 1) a brief online survey administered to EPA staff managing or implementing the
EPA's grant programs, 2) document review of reports and grant guidance documents, and 3) in-depth
interviews with EPA personnel across the Agency. The survey and document review were the main data
sources for addressing Year 1 questions (although interviews will also supplement any data gaps from
Year 1 questions). Interviews and survey responses are the primary data sources for addressing Year 2
and 3 questions. Together, the data sources will provide different but complementary types of information
to answer the priority questions.

In Year 1, the Evidence Act - Grant Commitments Met Workgroup completed the document review and
survey according to the methods outlined in the Grant Commitments Met Learning Agenda Support: Year
1 Baseline, Appendix A: Methodology. The document review provides detailed information on how grant
programs record, communicate, and present guidance and results from their programs. The document
review also provides information with which to interpret or expand on the survey findings. The survey
targeted all HQ offices and regional offices for each grant program that is active within a region. The
workgroup designed the survey to be brief, consisting predominantly of closed-ended questions with
optional open-ended questions.

Interviews are the primary focus of this current Year 2 effort. The interviews followed a semi-structured
format and explored themes and elicited detailed information based on findings from the survey results
and the document review. During interviews, the Evidence Act - Grant Commitments Met Workgroup
may collect additional program documents provided by interviewees. Additional information is
available in the "Interviews" section below.

Overall, the data methodology makes use of existing data while undertaking targeted new data collections
to provide more comprehensive and robust answers to the priority questions. We will triangulate across
existing and new data, using a combination of quantitative and qualitative analysis, to answer each
question. The following table summarizes the research questions; the scope of each research question as it
pertains to all or a sub-set of grant programs; general data sources for informing each research question;
and the specific interview question(s) and/or document type(s) that will inform each research question
(e.g., purpose of each interview question).

1. How do the EPA's existing grant award and reporting systems identify and track grant commitments?
Data source(s): Survey; Documents; Interviews

a. Do the grant programs have specific targets associated with their outputs/outcomes?
Data source(s): Documents; Interviews
Interview question(s)/document type(s): Interviews: Q1 - information on specific targets. Documents: additional documents potentially provided through interviews.

b.iii. To what extent does the data reported by grantees provide information that currently allows the EPA to measure outputs, outcomes, and impacts related to equity and climate change?
Data source(s): Survey; Documents; Interviews
Interview question(s)/document type(s): Interviews: Q2 - equity related program activities; Q3 - equity data; Q4 - climate related program activities; Q5 - climate data. Documents: additional documents potentially provided through interviews.

c. How do grant programs identify relevant grant commitments to track?
Data source(s): Survey; Documents; Interviews
Interview question(s)/document type(s): Interviews: Q6 - identifying metrics/data to track; Q7 - person who decides what to track; Q8 - mechanisms/restrictions that guide what to track. Documents: additional documents potentially provided through interviews.
Comments: In Year 1, no data source specifically addresses this; the workgroup presented results on an as-provided basis, rather than comprehensively.

d. What data reporting processes, tools, and systems do the EPA's grant award programs use?
Data source(s): Survey; Interviews
Interview question(s)/document type(s): Survey: Q5 + Q6 + Q7 - data consolidation and storage. Interviews: Q9 - update on storage mechanism(s) reported in survey; Q10 - update on data reporting mechanism(s) reported in survey.

2. What EPA practices and tools (1) effectively track grantee progress towards meeting workplan grant commitments including outputs and outcomes, and/or (2) support communication of national program level outputs and outcomes?
Data source(s): Survey; Documents; Interviews

a. What EPA practices (1) are effective in tracking grantee progress towards meeting workplan grant commitments including outputs and outcomes, and/or (2) support communication of national program level outputs and outcomes?
Data source(s): Survey; Documents; Interviews
Interview question(s)/document type(s): Survey: Q9 + Q10 + Q11 - difficulty or ease of tracking progress. Interviews: Q11 - elaboration on survey response (see above); Q12 - practices that support tracking; Q13 - practices that support consolidation; Q14 - region-specific questions. Documents: additional documents potentially provided through interviews.

i. What quality control and quality assurance practices help ensure that the data are accurate and complete?
Data source(s): Interviews
Interview question(s)/document type(s): Interviews: Q15 - QA/QC practices.

b. What EPA tools (1) are effective in tracking grantee progress towards meeting workplan grant commitments including outputs and outcomes, and/or (2) support communication of national program level outputs and outcomes?
Data source(s): Survey; Documents; Interviews
Interview question(s)/document type(s): Interviews: Q16 - tools that support tracking; Q17 - tools that support consolidation; Q18 - region-specific questions. Documents: additional documents potentially provided through interviews.

c. What tools do programs use to analyze environmental and human health outcomes with grantee-reported data?
Data source(s): Interviews
Interview question(s)/document type(s): Interviews: Q19 - tools to further analyze data.

d. What factors might affect a grant program's ability to (1) effectively track grantee progress towards meeting workplan grant commitments including outputs and outcomes, and/or (2) support communication of national program level outputs and outcomes?
Data source(s): Survey; Interviews
Interview question(s)/document type(s): Interviews: Q20 - barriers to tracking (topic level prompts as applicable); Q21 - barriers to communicating (topic level prompts as applicable); Q22 - common barriers across grant programs; Q23 - region-specific questions.
Comments: Factors may include PPG management; reporting frequency; grantee reporting mechanisms; data storage mechanisms; types of outputs reported; types of outcomes reported; type of grant program; Project Officer and grant specialist workload capacity; and grantee capacity to collect and report data (e.g., tribal and EJ communities).

i. For relevant programs (e.g., programs that have equity or climate impacts), what factors might affect a grant program's ability to effectively track grant commitments related to equity and climate?
Data source(s): Interviews
Interview question(s)/document type(s): Interviews: Q2 - other barriers to tracking equity data; Q4 - other barriers to tracking climate data.

ii. How has the GREAT Act affected grant programs' practices and/or tools for collecting grant data?
Data source(s): Interviews
Interview question(s)/document type(s): Interviews: Q24 - understanding of GREAT Act.

e. What are promising practices and tools demonstrated by some grant programs that could help other grant programs effectively track grant commitments data?
Data source(s): Survey; Interviews
Interview question(s)/document type(s): Interviews: Q25 - other best practices/tools for tracking data; Q26 - region-specific questions; Q27 - awareness of any new practices/tools; Q28 - awareness of any new tracking systems.
Interviews

The goal of conducting interviews is to help answer the priority questions by capturing the perspectives
and knowledge of EPA employees with experience administering, implementing, or otherwise
participating in the EPA's grant programs. The information obtained from interviews will help
supplement data gaps from Year 1 data collection efforts and focuses on practices and tools that help
programs effectively track information on grant programs.

The Evidence Act - Grant Commitments Met Workgroup developed the interview guide questions (see
Appendix B) and will conduct semi-structured interviews with EPA employees via Microsoft Teams. The
workgroup tailored interview guides to each program based on responses received during the survey and
on the document review. The workgroup selected interviewees as a purposive sample to ensure
adequate representation across key EPA grant programs and regions. The sample of interviews will not be
statistically representative, and the Evidence Act - Grant Commitments Met Workgroup will not attempt
to make quantitative inferences about the entirety of the EPA's grant programs based on the results of the
interviews. Based on the available budget, timeline, and project goals, the Evidence Act - Grant
Commitments Met Workgroup proposes to conduct approximately 30 interview sessions.

The workgroup conducted interviews with single individuals unless project officers requested the
involvement of other colleagues. In those cases, groups of no more than three persons may be appropriate.
This process is not meant to encourage group interviews as, in some cases, group settings may preclude candor.

The Evidence Act - Grant Commitments Met Workgroup will ask respondents' permission to record
interviews via Microsoft Teams to aid in documentation and analysis. If the respondent does not consent
to a recorded interview, the Evidence Act - Grant Commitments Met Workgroup will provide a
notetaker. The EPA will receive an output of the coded interview notes. However, the interviews will not
be strictly anonymous to ensure that the workgroup can connect information back to a specific program
and/or region. This is important for the EPA in understanding how implementation of grant programs
may vary across the Agency and what factors may affect implementation. In communications with the
interviewees, the Evidence Act - Grant Commitments Met Workgroup will emphasize that the purpose of
the interviews is for learning, rather than a compliance exercise, to encourage candor and openness in
their responses.

The EPA has developed two introductory documents that the workgroup will disseminate as part of the
broader communication effort around the Grant Commitments Met project. The Workgroup will use the
Evidence Act - Grant Commitments Met Background and Request for Agency Coordination document to
promote awareness and gain support amongst grant program leadership. The Evidence Act - Grant
Commitments Met Project Overview and Interview Instructions document will provide background
information and instructions for interviewees on the Grant Commitments Met effort.

The Grant Commitments Met Workgroup will assist in scheduling interviews. Prior to each interview, the
Workgroup will provide the interviewee with the document containing background information and the
relevant interview guide. During the interview, in addition to going through the interview guide, the
Evidence Act - Grant Commitments Met Workgroup will ask follow-up questions as appropriate to probe
further into topics of conversation raised during the interview and relevant to the priority questions.

The target interviewees are program staff that are responsible for implementing and managing the grant
programs, and NPMs. The source for selecting interviewees was the survey respondent lists and NPM
information request. Reflecting the Workgroup's priorities and the scope of this project, the Evidence Act
- Grant Commitments Met Workgroup will not interview outside collaborators such as state, local, or
tribal governments; contractors; or research institutions. To ensure the transparency of the selection
process, the Grant Commitments Met Workgroup developed the following interview selection criteria to
ensure representation of:

1.	Focus on grant programs that:

a.	Received funding from the American Rescue Plan (ARP) and/or the Bipartisan Infrastructure
Law (BIL).

b.	Demonstrated best practices in collecting data, based on Year 1 results.

c.	Demonstrated best practices in communicating reported data, based on Year 1 results.

2.	Representation of grant programs:

a.	That addressed Administration priorities, including environmental justice and underserved
communities.

b.	That consisted of both competitive and non-competitive grants.

c.	In regions that faced challenges in the grant process.13

d.	Both included and not included in PPGs.

e.	Varied in reporting frequencies, including annually, semi-annually, and/or quarterly.

3.	Representation across media program types, program sizes, and EPA regions.

Criteria for conducting group (rather than individual) interviews include the following:

1.	Logical groupings of interviewees (by program).

2.	Groups of manageable size (no more than three individuals per group).

Protocol and planning allow for scheduling group interviews.

The table below displays the programs interviewed in Year 2. This represents the final list of interviewees
after the Grant Commitments Met Workgroup made necessary edits. Note the table distinguishes between
interviewees by program officer ("PO") and national program manager ("NPM").

Interview | Program Office | Program
PO | OW | Surveys, Studies, Research, Investigations, Demonstrations, and Special Purpose Activities Relating to the Clean Air Act
PO | OA | National Estuary Program
PO | OCSPP | Brownfields Multipurpose, Assessment, Revolving Loan Fund, and Cleanup Cooperative Agreements
PO | OAR | Environmental Education Grants
PO | OW | Brownfields - Categorical Grants (State and Tribal Response Program Grants)
PO | OLEM | Superfund Remedial - Hazardous Substances Response Trust Fund
PO | OA | Long Island Sound Program
PO | OLEM | Leaking Underground Storage Tank Trust Fund Program
PO | OLEM | Clean Water State Revolving Loan Funds(a)
PO | OW | State Environmental Justice Cooperative Agreement Program
PO | OLEM | Source Reduction Assistance
PO | OW | Regional Wetland Program Development Grants
PO | OA | Great Lakes National Program Grants
PO | OCSPP | Environmental Justice Small Grant Program
PO | OW | Targeted Air Sheds Grant Program
PO | OW | Environmental Justice Collaborative Problem-Solving Cooperative Agreement Program
PO | OA | Lake Champlain Basin Program
PO | OAR | Diesel Emissions Reduction Act (DERA) State Grants
PO | OA | Brownfields Training, Research, and Technical Assistance Grants and Cooperative Agreements
PO | OW | State Lead Grants
PO | OAR | PM2.5 Monitoring Network
PO | OLEM | Diesel Emissions Reduction Act (DERA) National Grants
PO | OCSPP | Drinking Water State Revolving Loan Funds(a)
NPM | OA/OCIR | Performance Partnership Grants
NPM | OW | Regional Wetland Program Development Grants
NPM | OAR | Diesel Emissions Reduction Act (DERA) National Grants
NPM | OLEM | Brownfields - Categorical Grants
NPM | OCSPP | State Lead Grants

13 For example, lack of a standardized reporting format creates challenges in tracking and communicating data.

Data Analysis

The Evidence Act - Grant Commitments Met Workgroup will analyze the data collected through the
interviews to answer the EPA's Priority Questions in the Agency's Learning Agenda for Grant
Commitments Met. The Evidence Act - Grant Commitments Met Workgroup will analyze responses to
each interview question to identify themes and summarize responses using qualitative analysis to code
each open-ended response through a systematic coding approach. This process involves a crosswalk of
the coding approach of three team members to ensure consistent coding across the responses. Note that
each response may be applicable to more than one priority question. The workgroup coded and organized
responses in Excel; the Evidence Act - Grant Commitments Met Workgroup summarized the frequency
with which interviewees raised each theme. We will also identify illustrative quotations that capture
issues that interviewees frequently raised. Through analysis of the qualitative data, the Evidence Act -
Grant Commitments Met Workgroup will identify best practices to share with grant programs throughout
the Agency.
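
To make the frequency-tallying step concrete, the following is a minimal sketch assuming hypothetical theme codes; the codebook and responses below are illustrative, not the workgroup's, and each response can carry multiple codes, mirroring the note above that a response may apply to more than one priority question:

```python
from collections import Counter

# Hypothetical coded interview responses: each response carries one or more
# theme codes. The codes are illustrative, not the workgroup's codebook.
coded_responses = [
    {"themes": ["staff_workload", "database_limits"]},
    {"themes": ["staff_workload"]},
    {"themes": ["grantee_capacity", "staff_workload"]},
]

# Tally how often interviewees raised each theme across all responses.
counts = Counter(theme for r in coded_responses for theme in r["themes"])
for theme, n in counts.most_common():
    print(f"{theme}: {n}")
```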

The data analysis plan includes the detailed interview request question, answer type, and response
options. Accompanying each question, the plan addresses the purpose of the question, the data use and
analysis approach, the expected data, and data validation steps. The data analysis plan is Attachment A:
Data Analysis Plan to this report.

The Evidence Act - Grant Commitments Met Workgroup will also conduct data analysis of additional
documents provided by interviewees to complement Year 1 document review efforts. These documents
supplement interview responses to validate interview data and provide information on specific targets,
and equity and climate data.

Expected Deliverables

The main audiences for the final Year 2 report include OCIR and OGD, senior policy and career leaders,
management at the Agency, the Evidence Act Workgroup, and EPA personnel responsible for managing
and implementing the EPA's grant programs. The workgroup expects that the EPA will report key results
to OMB and Congress as part of annual reporting. To develop these final deliverables, the Evidence Act -
Grant Commitments Met Workgroup will first develop an interim findings presentation to obtain
feedback and buy-in from the Grant Commitments Workgroup and any other key individuals. The
presentation will include the initial data results in graphics (as applicable) for the purpose of promoting
discussion on the interpretation of the results. The interim findings presentation is not intended to
summarize conclusions regarding the overall data results or recommendations.

After the interim findings presentation, the Evidence Act - Grant Commitments Met Workgroup will
develop the draft report (including recommendations), collect comments from the Grant Commitments
Workgroup, and finalize the report. The Evidence Act - Grant Commitments Met Workgroup will also
develop a final presentation for communicating and disseminating the results of the Year 2 report. As
needed, the Evidence Act - Grant Commitments Met Workgroup will also adapt the presentation up to
two additional times to target the presentation and findings to other specific audiences.

Data + Methodology Limitations + Validation

Below, we detail data limitations and mitigation strategies to ensure the quality of the data is adequate
for its intended use.

•	Incomplete document collection and potential for non-response bias. The EPA does not
maintain a unified system where documents (e.g., grant guidance) are stored and tracked. This
means that the Evidence Act - Grant Commitments Met Workgroup was not able to easily or
comprehensively obtain relevant grant guidance, grant program reports, or additional
documentation about grant programs. Instead, the Evidence Act - Grant Commitments Met
Workgroup used the NPM Information Request to obtain relevant documents, which relied on
individuals to respond to the request. Some individuals did not respond. Respondents that did not
understand the request may have provided irrelevant documents or an incomplete set of
information. The Evidence Act - Grant Commitments Met Workgroup also used the interviews as
another opportunity to collect document information missing from Year 1. Still, if the
programs for which we did not receive documents differ in some way from the programs
for which we did receive documents, our overall conclusions drawn from the guidance that we
reviewed could be biased. The Evidence Act - Grant Commitments Met Workgroup reviewed all
the available information provided to extract relevant data in a consistent format.

•	Potential bias associated with purposive sampling for interviews of EPA project staff
responsible for implementing grant programs. The Evidence Act - Grant Commitments Met
Workgroup will not be able to select a statistically valid sample of interviewees given the small
number of interviews that the workgroup can conduct, and the several types of interviewees.
Therefore, the Evidence Act - Grant Commitments Met Workgroup collaborated closely with the
Grant Commitments Met Workgroup to select a purposive sample of interviewees to maximize
learning opportunities. Although the results of the interviews will not be statistically
representative, they provide good examples, best practices, and insights that can be useful for
other grant programs and for EPA management when considering what data programs track, how
best to track it, and what the data can show.

APPENDIX B: INTERVIEW GUIDES

Interview Guide for Program Officer

Background

1.	Can you tell us briefly about your current role within the grant program?

a. How long have you been in this position?

Grant Award and Reporting Systems

2.	Your survey response from Year 1 efforts indicated your program currently uses [X] to store
grantee data. Since the survey, has your program implemented or explored new storage
mechanisms?

3.	Your survey response from Year 1 also indicated that grantees report progress on meeting grant
commitments using [X]. Since the survey, has the program changed or is it investigating new data
reporting mechanisms?

Grant Award and Reporting Systems: Equity/Climate Goals

4.	One of the priorities of the current administration is equity. Does the program currently target
programmatic activities on benefiting overburdened and underserved communities?

5.	Do you currently collect data from grantees that reflect the effect of program activities on
overburdened and underserved communities?

6.	Another priority of the current administration is climate change. Does the program currently
target program activities on addressing climate change?

7.	Do you currently collect data from grantees that reflect the effect of program activities on climate
change?

Practice and Tools to Track Progress and Communicate Results

Best Practices/Tools

8.	Your survey response from Year 1 provided insight on how easy or difficult it was for your
program to track progress in meeting grant commitments. You indicated [XX] ease of reporting.
You also indicated [Open-ended response] - could you explain your response in a bit more
detail?

9.	Are there any [other] best practices that help you effectively track grantee progress in meeting
workplan grant commitments?

10.	Are there any [other] tools that help you effectively track grantee progress in meeting workplan
grant commitments?

11.	Does the program use any tools or models to further analyze the data provided by grantees to
determine alternative outputs or outcomes? An example here would be using energy saved
reported by grantees to determine GHG reductions, or reductions in criteria pollutants to
determine improved health outcomes.

12.	An important piece of tracking grantee progress and communicating results is having accurate
data. Do EPA staff review the data grantees provide and engage in quality assurance and quality
control (QA/QC) practices to check the accuracy and completeness?

13.	Other than what we have previously discussed, are there other practices or tools that you would
recommend to other programs in tracking their grant commitments data?

Barriers

14.	In addition to practices and tools that help your grant program succeed, there may be challenges
you face in tracking grantee progress and communicating results. Can you identify barriers you
face in effectively tracking grantee progress towards meeting grant commitments?

15.	Other than what we have previously discussed, are there any other challenges to tracking grant
commitments data that we should know about?

Wrap Up

16.	Are you aware of other programs or regions that are investigating or implementing new grant data
practices or tracking systems?

Interview Guide for National Program Manager

Background

1.	Can you tell us briefly about your current role within the grant program?

a. How long have you been in this position?

Grant Award and Reporting Systems

The Evidence Act - Grant Commitments Met Workgroup will reference information from the documents
NPMs provided as part of the NPM information request; these include [XXX].

2.	Does your grant program set specific targets for indicating success at the program-level?

3.	We are interested in understanding how the grant program identifies what commitments to track.
What is the process for deciding what specific metrics and type of data are collected from
grantees?

a.	Can you identify who decides what information is tracked?

b.	Are there other mechanisms or program restrictions that guide what data are collected
from grantees? An example here would be legislative requirements that dictate what data
are collected.

4.	Your response to the NPM Information Request indicated your program currently uses [X] to
store grantee data. Since the request, has your program implemented or explored new storage
mechanisms?

5.	Your response to the NPM Information Request also indicated that grantees report progress on
meeting grant commitments using [X]. Since the request, has the program changed or is it
investigating new data reporting mechanisms?

Grant Award and Reporting Systems: Equity/Climate Goals

The Evidence Act - Grant Commitments Met Workgroup will reference information from the documents
NPMs provided as part of the NPM information request; these include [XXX].

6.	One of the priorities of the current administration is equity. Does the program currently target
programmatic activities on benefiting overburdened and underserved communities?

7.	Do you currently collect data from grantees that reflect the effect of program activities on
overburdened and underserved communities?

8.	Another priority of the current administration is climate change. Does the program currently
target program activities on addressing climate change?

9.	Do you currently collect data from grantees that reflect the effect of program activities on climate
change?

Practice and Tools to Track Progress and Communicate Results

Best Practices/Tools

10.	Are there any practices that support consolidation of tracking grantee progress across regions and
communication at the national level?

11.	Aside from best practices, there may be different tools your grant program uses to track progress
and communicate results. Are there any tools that support consolidation of grantee progress
across regions and communication at the national level?

12.	Does the program use any tools or models to further analyze the data provided by grantees to
determine alternative outputs or outcomes? An example here would be using energy saved
reported by grantees to determine GHG reductions, or reductions in criteria pollutants to
determine improved health outcomes.

13.	An important piece of tracking grantee progress and communicating results is having accurate
data. Do EPA staff review the data grantees provide and engage in quality assurance and quality
control (QA/QC) practices to check the accuracy and completeness?

14.	Other than examples we have previously discussed, are there particular practices or tools that
your program uses that you think would benefit other programs in tracking their grant
commitments data?

Barriers

15.	In addition to practices and tools that help your grant program succeed, there may be challenges
you face in tracking and communicating results. Can you identify barriers you face in
communicating national program level outputs and outcomes?

16.	Other than any examples we have previously discussed, are there particular practices or tools (or
lack thereof) that your program uses that you think may impede the ability of other programs in
tracking their grant commitments data?

Wrap Up

17.	Are you aware of other programs or regions that are investigating or implementing new grant data
practices or tracking systems?
