EPA
December 2007
Wetland Program
Development Grants:
Assessing Their Role in
State/Tribal Wetland
Programs
Promoting Environmental Results
-------
This evaluation was performed by Indtai Inc. under contract to
Industrial Economics, Incorporated (IEc) for EPA's Office of
Environmental Policy Innovation under Contract EP-W-04-023
between EPA and IEc. The Indtai evaluation team was led by Robert
Brotzman. Judy Lieberman of the Office of the Chief Financial Officer
served as the technical advisor, and Romell Nandi of the Office of Water's
Office of Wetlands served as the primary liaison and subject matter
expert on behalf of the Wetland Program.
This report was developed under the Program Evaluation Competition,
co-sponsored by EPA's Office of Policy, Economics and Innovation
and the Office of the Chief Financial Officer. To access copies of this
or other EPA program evaluations, please go to EPA's Evaluation
Support Division's website at http://www.epa.gov/evaluate.
-------
Table of Contents
Executive Summary E-1
I. Introduction 2
II. Methodology 3
III. Evaluation Analysis and Results 6
A. Question 1 10
Since 2001, what progress have grantees made toward building core
elements of state/tribal wetland programs?
B. Question 2 14
> What key variables contribute to achieving core elements of
state/tribal wetland programs?
> How important is the WPDG program in developing core elements of
state/tribal wetland programs?
C. Question 3 20
What barriers do grantees face in building core elements of state/tribal
wetland programs?
D. Question 4 22
What can EPA do in the future to help states/tribes build core
elements of wetland programs?
E. Other Findings 23
IV. Potential Next Steps and Recommendations 24
Appendix 1: WPDG Grantee Cohort Wetland Program Characteristics
Appendix 2: WPDG Grantee Case Studies
Appendix 3: National and Regional WPDG Priorities
-------
Executive Summary
This report provides findings of an independent evaluation of the U.S. EPA Wetland Program
Development Grant (WPDG) program. The purpose of this evaluation project was to assess the
effectiveness of the WPDG program in helping states and tribes build core elements of their wetland
programs. The findings in this report aim to inform EPA of ways to better assist grantees in building
core elements of state/tribal wetland programs. For more information on the background of the
WPDG program and the purpose of this evaluation, see Chapter 1 "Introduction" of this report.
The evaluation focused on WPDGs awarded from FY2001 through FY2006. We examined the
wetland program development efforts of 15 states and 3 tribes. This "grantee cohort" is representative
of the state/tribal grantee universe and accounted for 20 percent of all awards and nearly 25 percent of
all WPDG funding for FY2001-6. We analyzed grant documents, EPA databases, grantee websites,
and other evaluation reports to characterize wetland program development and map WPDG projects to
the core elements. Interviews of the grantee cohort yielded additional insights including factors
contributing to and barriers impeding wetland program development. The grantee cohort also
provided suggestions to EPA to improve the effectiveness, efficiency, and fairness of the WPDG
program. For more information on the evaluation methodology used and data availability/quality
challenges faced, see Chapter 2 "Methodology" of this report. Appendix 1 profiles the grantee cohort
in great detail while Appendix 2 features case studies of four of the 18 grantees.
Throughout the evaluation we found many differences between large and small wetland
programs in terms of WPDG usage, focus on core elements, and challenges faced. Therefore, this
report often portrays results and analyses by program size. We found that large programs tended to
win more WPDG awards and the average size of grants was 50 percent greater than those awarded to
small programs. Since FY2001, EPA awarded WPDGs that primarily focus on building the
"monitoring and assessment" core element of state/tribal programs. Large programs focused on
building the "regulation" core element much more than did small programs. EPA awarded
comparable amounts of WPDG funding to large and small programs to build the "restoration" core
element. The grantee cohort said that having access to sufficient and consistent funding was most
critical to program development. Leveraging other organizations for tools, methodologies, databases,
and expertise was also found to be important to program development. Small programs are much
more dependent on WPDG funding than large programs. Small programs tend to lack the staffing
depth and grant-writing experience of large programs. Overall, the grantee cohort did not find the
WPDG grant application and reporting burdens to be unreasonable, but offered a few suggestions for
EPA's consideration. Chapter 3 "Evaluation Analysis and Results" provides a summary of grantee
suggestions and more statistics, details, and analysis of the grantee cohort.
In Chapter 4 "Potential Next Steps and Recommendations," we offer recommendations
to improve EPA's management of the WPDG program. By tailoring requirements and award criteria
to acknowledge the differences between small and large programs, EPA can reduce burdens and more
equitably promote the development of wetland programs of all states and tribes. The core element
"endpoints" need to be refined, prioritized, and perhaps categorized by program size. It is critically
important for EPA to develop and communicate clear definitions for each endpoint so that grantees
can appropriately direct efforts and developmental progress can be measured. Program management
databases should be enhanced to accommodate endpoint tracking so that EPA can better report on the
progress made by states and tribes in the development of wetland programs.
FINAL
1
28 December 2007
-------
I. Introduction
Since 1990 EPA has funded the development and enhancement of wetland protection programs
for states and tribes through the Wetland Program Development Grant (WPDG) program. The WPDG
program is the primary federal funding source that states and tribes use to build their wetland
programs. The WPDG program is a competitive grant program implemented pursuant to the Clean
Water Act Section 104(b)(3) statutory authority. EPA operates the WPDG program at approximately
$16 million a year. WPDG-funded projects support both the initial development of a wetland
protection, restoration, or management program and the enhancement/refinement of an existing
program.
EPA developed six "core elements" of a comprehensive wetland program. This evaluation
focused on the four main core elements that align directly to the four "National Priority Areas" of the
WPDG program, and involve the majority of grant funding (i.e., monitoring and assessment,
regulation, restoration, and water quality standards or "WQS").1 The purpose of this evaluation
project was to assess the effectiveness of the WPDG program in helping states and tribes build core
elements of their wetland programs. The findings in this report may inform EPA of ways to better
assist grantees in building core elements, and what grantees themselves can do. Specifically, this
evaluation answers the following questions:
1. Since 2001, what progress have grantees made toward building core elements of
state/tribal wetland programs?
2. What key variables contribute to achieving core elements of state/tribal wetland
programs and how important is the WPDG program in developing core elements
of state/tribal wetland programs?
3. What barriers do grantees face in building core elements of state/tribal wetland
programs? and
4. What can EPA do in the future to help states/tribes build core elements of wetland
programs?
The remaining sections of this report present the evaluation methodology, evaluation analysis
and results, and potential next steps and recommendations. The analysis and results section presents
our findings, organized according to evaluation question. The potential next steps and
recommendations section presents alternatives that EPA may want to consider in addressing the
findings of this evaluation. The central aim of the recommendations is to enhance EPA's efforts to
facilitate the building of core elements in state and tribal wetland programs.
1 EPA describes all core elements at http://www.epa.gov/owow/wetlands/initiative/fy02elements.html. For this project,
EPA's primary interest was on the four core elements that match the National Priority Areas of the WPDG program.
While the remaining two core elements (i.e., Public/Private Partnerships and Coordination) and the two 'Basic Elements'
(Outreach and Education, and Watershed [Ecosystem] Approach) are not discretely addressed in this evaluation, their
importance was nonetheless cited by the grantee cohort as contributing to wetland program development.
-------
II. Methodology
This section briefly describes the approach and methods used to answer the four evaluation
questions and various issues and constraints encountered during the project.2
Background and Context
In developing the methodology and throughout the evaluation, we had to account for
limitations in the quality and completeness of data. For example, EPA databases do not presently
include milestones achieved or progress made by states and tribes in the development of wetland
programs. Furthermore, WPDG project descriptions contained in EPA databases do not always fully
and accurately characterize activities and results. State/tribal websites and ELI reports provide a
wealth of information on wetland program development, but the timing of accomplishments and
sources of funding were often not disclosed. These limitations created a challenge regarding how we
would address a key area of inquiry of the project: progress made by grantees since FY2001 in
developing and enhancing state/tribal wetland programs.
One of EPA's main areas of interest for this project was to evaluate the degree to which
FY2001-6 WPDG-funded projects supported state/tribal efforts to achieve core element "endpoints."3
Mapping WPDG project descriptions to core element endpoints was challenging in two respects.
First, EPA has yet to issue definitions for endpoints that clearly specify the criteria for having
achieved an endpoint. Guidance regarding how to measure progress towards achieving endpoints has
also yet to be developed. While many of the endpoints are self-explanatory, others are not and may
interpreted differently. Second, EPA introduced endpoints in FY2005 after 92 percent of the grants
examined for this project had been awarded. WPDG grantees were not required to focus on making
progress towards core element endpoints per se. States and tribes had a higher likelihood of award if
proposed projects closely aligned with annual regional and national priorities4. See Appendix 3 for a
list of National WPDG priorities for FY2001-2007 and Regional priorities for FY2007.
We crafted a methodology that involved an ex post facto mapping of FY2001-6 WPDG project
activities to EPA-defined core elements and endpoints. We examined grants for FY2001-5 and
included a few data points for FY2006. We sought to identify all areas of wetland program
development using WPDG funding regardless of whether or not an activity or project readily mapped
to EPA's core element endpoints.
2 For a detailed explanation of the evaluation methodology, see "Methodology for Assessing the Effectiveness of the
Wetland Program Development Grant Program (WPDG) in Helping States/Tribes Build Core Elements of a Wetland
Program," Deliverable 2-4b, 13 June 2007.
3 In the context of wetland program development, EPA uses the term "endpoint" to refer to a capability to perform a
specific programmatic function. An endpoint can be a methodology, performance metric, analytical tool, database, plan,
regulatory authority, or programmatic definition. EPA defined a set of endpoints for each core element that collectively
represent the characteristics of a robust state/tribal wetland protection program. EPA began using core element
endpoints in FY2005 as criteria for awarding Pilot Demonstration Grants.
4 Grantees were only bound to the objectives and deliverables outlined in EPA-approved project plans / grant proposals
using WPDG funding. Annual national and regional priorities outlined in WPDG RFPs did not always correspond
with core element endpoints, so it is not surprising that some WPDGs did not focus on endpoints.
-------
Scope of the Evaluation
Given the project budget and other limitations, it was not practicable to 1) examine all WPDG
grantees over the past five years, or 2) conduct comprehensive audits of milestones, products, and
capabilities developed using WPDG funding. Therefore, we coordinated with EPA to identify a
representative subset of grantees that would be the focus of the evaluation. We used the following
criteria to select 15 states and 3 tribes that would constitute the "grantee cohort" to be evaluated:
> Geographical diversity. Inclusion of at least one grantee per Region;
> Program size diversity. A balance of "small" and "large" programs;
> Multiple WPDGs. Grantee received multiple FY2001-6 awards; and
> Coverage of four primary core elements.
The grantee cohort consisted of nine grantees with "large" wetland programs (i.e., 10 or more FTEs)
and nine "small" wetland programs (i.e., 5 or fewer FTEs). The 18 grantees examined in this
evaluation accounted for 20 percent of all awards and nearly 25 percent of all WPDG funding for
FY2001-6. While there is some selection bias towards grantees that actively use WPDGs, EPA saw
more opportunity for insights and recommendations from these grantees than from those less familiar
with WPDGs.
Data Sources Used
We obtained information through the following data sources to meet information objectives:
> Interviews of selected state/tribal program personnel;
> Documents including WPDG applications, deliverables, and final reports;
> EPA databases. Wetlands Grant Database and the Grants Information Control System;
> Grantee websites. Review of applicable state/tribal program websites; and
> Wetland program studies. In particular, Environmental Law Institute (ELI) reports on
wetland programs issued in 2005, 2006, and 2007.
We culled these sources for information on: WPDG project scopes; state/tribal progress in
developing wetland programs; key variables impacting the ability to fully develop wetland programs
(e.g., critical success factors and barriers); and areas where grantees offered suggestions regarding how
EPA could better assist efforts to build core elements of wetland programs. Given the aforementioned
lack of endpoint definitions and information in EPA databases regarding state/tribal progress in
developing wetland programs, we relied heavily on interviews of grantees to identify efforts focused
on building core elements. We mapped interviewee responses to core element endpoints.
Understanding that interviewees may fail to mention or recall all possible areas of progress and
accomplishments for FY2001-6, we supplemented the interview responses with a review of grant
applications and project plans. Despite requesting grant documentation from the grantee cohort (e.g.,
progress and closeout reports), we did not receive comprehensive information on what was actually
accomplished using WPDG funding.
Our assumption is that grantees probably met the objectives and deliverables identified in
WPDG project descriptions, so we mapped WPDG project scopes to endpoints. In some instances,
grantee websites and ELI reports pointed to other state/tribal efforts aimed at accomplishing core
element endpoints, so we also relied on these sources to demonstrate grantee areas of program
-------
development. We did not visit EPA Regions or grantee agencies to shore up information gaps, so the
findings in this report for evaluation Question 1 represent a "floor" of what the grantee cohort
addressed for FY2001-6.
We developed a flexible list of client-approved questions for grantee interviews that directly or
indirectly aligned with the key evaluation questions. We conducted follow-up calls when appropriate to
clarify and have grantees elaborate on key topics. The interview approach intentionally did not
involve an exhaustive list of questions related to each core element endpoint to avoid the appearance
of an audit and to minimize burdens on grantees. Instead, we posed open-ended questions meant to
elicit dialogue on the four main areas of investigation.
Data Analysis
We analyzed information to formulate answers responsive to the evaluation questions. The
contrast in responses between large and small programs was often noteworthy, so we tended to
supplement answers for the entire grantee cohort with statistics and findings by program size.
In a spreadsheet, we coded and aggregated qualitative information from each grantee on activities
aligned with endpoints and on opinions regarding critical success factors, barriers, and
recommendations. Where appropriate, we captured quantitative information to better characterize
programs (e.g., annual budget, number of FTEs). In Appendix 1, we coded a "Y" (Yes) when a
WPDG project description reasonably mapped to an endpoint for one of the four primary core
elements. We also coded a "Y" for important areas of program development that did not readily map
to a core element endpoint; see "Other Progress" items in Appendix 1. A "Y" coded in the "Admin"
category in Appendix 1 signified that a grantee possessed a specific program characteristic (e.g., had
developed a wetland strategic plan). Finally, a "Y" coded for any of the critical success factors,
barriers, or suggestions indicated that a grantee mentioned those items during the interview.
Because most data sources did not explicitly identify the funding source(s) relied upon
for each state/tribal program, we cannot draw firm conclusions regarding the role that
WPDG funding played in developing specific core element endpoints. In general, most grantees cited
the importance of WPDG funding in developing wetland programs. However, it became clear during
the interviews that small state/tribal programs have a relatively higher reliance on WPDG funding, not
only for program development but also for program survival.
To complement the statistical analysis of the grantee cohort, we developed case studies of four
(two large and two small programs) of the eighteen grantees; see Appendix 2. The four case studies
supplement the answers to the evaluation questions, and illustrate best practices, problems
encountered, and accomplishments of selected WPDG recipients.
Recommendations
The report contains two sources of recommendations: the grantee cohort and the evaluator. The
answer to Question 4 contains suggestions from the grantee cohort, while the report section entitled
"Potential Next Steps and Recommendations" exclusively portrays the opinions of the evaluator.
-------
III. Evaluation Analysis and Results
This section presents statistics and analyses that support our attempts to answer the evaluation
questions posed by EPA. We supplement the findings presented here with appendices to the report
that characterize the grantee cohort (Appendix 1), provide additional details on wetland program
development by WPDG grantees in the form of four case studies (Appendix 2), and convey recent
national and regional priorities regarding wetland program development (Appendix 3).
For the period of FY2001 through FY2006 (partial)5, EPA awarded more for monitoring and
assessment than any other core element, comprising more than 50 percent of WPDG funding (see
Exhibit 1). Restoration and Regulation each accounted for approximately 20 to 25 percent of WPDG
funding. The allocation of funding by core element to the grantee cohort was consistent with national
patterns (see Exhibit 2).
After analyzing grant-level details and interviewing the grantee cohort, some fundamental
differences in grant patterns between small and large programs came to light:
> More Grants to Large Programs. Although the grantee cohort was comprised of an equal
number of large and small programs, we found that large programs received 55 percent of all
grants (see Exhibit 3) and almost twice as many dollars (see Exhibit 4);
> Different Emphasis on Core Elements. Large programs received over three times as many
grants addressing the regulation core element (see Exhibit 3), accounting for 78 percent of all
EPA spending on this core element (see Exhibit 4). Small programs received 50 percent more
grants for the restoration core element, but the level of funding was over $200,000 less;
> Larger Grants to Large Programs. The average size of grants awarded to large programs
was over 50 percent more than those awarded to small programs— $163,934 versus $107,401
(see Exhibit 5); and
> Majority of "Enhancement" Grants to Large Programs. While small and large programs
received approximately an equal share of WPDGs focused on "development," large programs
received five times as many grants enhancing capabilities already developed (see Exhibit 6).
Because EPA developed core element endpoints after the award of most WPDGs, the results of
our ex post facto mapping of WPDGs to specific endpoints are not intended to reflect any shortcoming
on the part of grantees in meeting WPDG objectives. The ex post facto application of endpoints to the
grantee cohort serves only to illustrate areas of emphasis in program development. A more comprehensive
audit-level evaluation of grantee progress would undoubtedly result in higher frequency counts than
those portrayed in Appendix 1 and in various report exhibits.6
5 The analysis included all WPDGs awarded in FY2001-5, but also reflects some FY2006 grants for which data were
readily available.
6 An audit-level evaluation of all grantee documentation and deliverables would require site visits that were not part of the
scope of this project. Such visits would systematically identify all WPDG deliverables and capabilities. By primarily
relying on discussions with grantee personnel, grant documentation, and EPA database information for areas of recent
program development, the frequency counts are subject to the limits of memory and institutional knowledge (e.g., in
some instances the grantee point of contact did not have historical involvement with a WPDG), availability of grant
descriptions and outcomes, and completeness of grant data. Therefore, the frequency counts represent a "floor" of
progress made by the grantee cohort.
-------
Exhibit 1 illustrates the national WPDG funding patterns by core element for the period of
FY2001 through part of FY2006. EPA's primary interest during this timeframe was state/tribal
development of the monitoring and assessment core element. Restoration and regulation each
accounted for comparable levels of WPDG funding, but significantly less than monitoring and
assessment. Grants involving the development of Water Quality Standards accounted for
approximately 2 percent of all EPA spending on the four main core elements addressed in this
evaluation.
Exhibit 1: National Summary of WPDG Funding for FY2001-67
Core Element                 Grants (#)   % Total   Total EPA Funding   % Total   Avg. Grant Size
Monitoring and Assessment           263       54%         $29,850,518       52%          $113,500
Restoration                         120       25%         $13,727,585       24%          $114,397
Regulation                           88       18%         $12,629,976       22%          $143,522
WQS                                  13        3%          $1,414,482        2%          $108,806
Total                               484      100%         $57,622,561      100%          $119,055
Exhibit 2 mirrors the types of information portrayed in Exhibit 1, but focuses on the grantee
cohort. Note how the grantee cohort closely aligns with the characteristics of the grantee universe.
Exhibit 2: Grantee Cohort Summary of WPDG Funding for FY2001-6
Core Element                 Grants (#)   % Total   Total EPA Funding   % Total   Avg. Grant Size
Monitoring and Assessment            52       53%          $7,472,186       55%          $143,696
Restoration                          20       20%          $2,946,344       21%          $147,317
Regulation                           27       27%          $3,266,982       24%          $120,999
WQS                                   0        0%                  $0        0%               N/A
Total                                99      100%         $13,685,512      100%          $138,238
7 This national summary does not include another $13+ million in WPDGs for the other two core elements. Moreover,
data for FY2006 were not available for all grants, so Exhibit 1 should not be cited as a comprehensive summary of
WPDG grant activity for FY2001-6.
-------
Exhibits 3-6 provide a detailed breakdown of how FY2001-6 WPDG awards to large and small
grantees have been distributed with respect to overall funding, average grant size, number of grants
within core elements, and emphasis on development versus enhancement, respectively. Exhibit 3
shows that large programs received the majority of all grants despite the fact that the grantee cohort
was comprised of an equal number of large and small programs.
Exhibit 3: Number of WPDG Awards to Grantee Cohort for FY2001-6
                             # Grants by Program Size    % Grants by Program Size
Core Element                 Large   Small   Total       Large   Small   Total
Monitoring and Assessment       25      27      52         48%     52%     53%
Restoration                      8      12      20         40%     60%     20%
Regulation                      21       6      27         78%     22%     27%
WQS                              0       0       0          0%      0%      0%
Total                           54      45      99         55%     45%    100%
Exhibit 4 illustrates the degree to which funding is weighted towards large programs, by nearly a
two-to-one margin over small programs, with the exception of the restoration core element.
Exhibit 4: Amounts Awarded to Grantee Cohort for FY2001-6
                             Grant $ by Program Size      % Dollars by Program Size
Core Element                 Large        Small           Large   Small
Monitoring and Assessment    $4,716,928   $2,755,258        63%     37%
Restoration                  $1,591,232   $1,355,112        54%     46%
Regulation                   $2,544,302     $722,680        78%     22%
WQS                                  $0           $0         0%      0%
Total                        $8,852,462   $4,833,050        65%     35%
-------
Exhibit 5 further illustrates the disparity between large and small programs in terms of average
grant size; only regulation grants were comparable in size for large and small programs.
Exhibit 5: Average Grant Size to Grantee Cohort for FY2001-6
                             Average Grant Amount by Program Size
Core Element                 Large       Small
Monitoring and Assessment    $188,677    $102,047
Restoration                  $198,904    $112,926
Regulation                   $121,157    $120,447
WQS                          N/A         N/A
Overall Average              $163,934    $107,401
Given that one of the stated objectives of the WPDG program is to both develop and enhance
state and tribal wetland programs, this evaluation examined EPA's emphasis on funding
developmental versus enhancement activities. We define "development" projects as those using
WPDG funding to establish a program component or capability whereas "enhancement" projects
involve modifying, testing, and/or improving an established program component or capability.
Exhibit 6 illustrates some key findings regarding award patterns to small and large programs with
respect to projects focused on development versus enhancement:
> Equal emphasis on "Development" Grants between Small and Large Programs.
> More "Enhancement" Grants to Large Programs. It is not surprising that large programs
tend to be more developed, thereby enabling them to focus more on enhancing tools, databases,
etc. The disparity between large and small programs regarding enhancement projects raises an
important question: given limited funds and significant remaining program development
needs, should there be a lesser emphasis on WPDGs focused on enhancement?
Exhibit 6: Lifecycle Scope of WPDGs Awarded to Grantee Cohort for FY2001-6
                              # Grants by Program Size    % Grants by Program Size
Lifecycle Scope               Large   Small   Total       Large   Small   Total
Development                      22      23      45         49%     51%     46%
Enhancement                      15       3      18         83%     17%     18%
Development and Enhancement       9       3      12         75%     25%     12%
Unknown                           8      16      24         33%     67%     24%
Total                            54      45      99         55%     45%    100%
-------
A. Question 1 - Since 2001, what progress have grantees made toward building core elements of
state/tribal wetland programs?
Given the aforementioned issues regarding a lack of EPA tracking data on program
development and definitions regarding core element endpoints, we were not able to quantify
"progress" made by the grantee cohort in developing and enhancing their wetland protection
programs. Therefore, our approach to answering this question focused more on identifying those core
element endpoints for which the grantee cohort undertook projects or applied some level of effort
during FY2001-6. While we cannot discretely measure progress made by each grantee in the cohort,
we know that most WPDG projects have been completed for the period examined, so we generally
assume progress was made by virtue of meeting the objectives outlined in each grant.
In analyzing the frequency with which large and small programs cited progress towards the
achievement of specific core element endpoints since FY2001,8 a few highlights emerged:
> Large Programs Made More Progress in Monitoring and Assessment. States and tribes in
the grantee cohort made progress on eight endpoints for the monitoring and assessment core
element. Large programs tended to make more progress towards achieving these endpoints
(see Exhibit 7);
> Comparable Progress Made by All Programs on the Development of Restoration
Capabilities. There was not much difference between large and small programs regarding
progress on restoration endpoints. Small programs we interviewed tended to hold restoration
as a high priority (see Exhibit 8);
> Large Programs Made More Progress on the Regulation Core Element. The majority of
large programs made progress on four regulation endpoints. Large programs tended to be
more focused on permitting activities and analyzing mitigation approaches (see Exhibit 9); and
> Insufficient Data on Water Quality Standards. Initial research indicated that a few
programs in the grantee cohort used WPDGs to develop WQS, but subsequent examination of
the grants indicated more of a focus on monitoring and assessment. Therefore, we did not have
much data on grantee cohort efforts to develop or refine WQS for FY2001-6.
In some instances, grantees identified program development activities that did not readily map
to the four core element endpoints. In Exhibit 10, we summarize these areas of program development
and the frequency with which grantees cited each area. The majority of programs in the grantee cohort
(78 percent) cited education and outreach initiatives that, while not readily aligned with the four core
elements addressed in this project, are considered a 'basic element' within broader efforts to
implement a wetland program. Many grantees also cited progress made in issuing wetlands-related
guidance and developing online tools.
8 We primarily relied on unprompted disclosures from grantees regarding progress made on endpoints. This minimized
potential bias in grantee responses. We reviewed grant descriptions in EPA databases and grant material for progress
that grantees may have failed to mention.
-------
We undertook a two-pronged approach in this evaluation to gauge grantee efforts to develop
wetland programs in general, and towards achieving core element endpoints in particular. One method
involved aggregating interview responses on areas where grantees felt their program made
developmental progress since FY2001. In addition, we analyzed WPDG project descriptions from
grant materials and EPA databases, and reviewed grantee websites and grantee documents for wetland
program development activities and accomplishments. Exhibits 7, 8 and 9 portray findings regarding
the specific endpoints addressed by the grantee cohort for the core elements of monitoring and
assessment, restoration, and regulation, respectively. The results portrayed in Exhibits 7-9 reflect
grantee cohort efforts using all available funding sources, not just WPDGs. Exhibit 10 highlights
areas of program development that do not readily map to the four core elements of interest to this
evaluation, but nevertheless tend to align with EPA priorities since FY2001.
i. Monitoring and Assessment
Exhibit 7 provides a detailed listing of the eight monitoring and assessment endpoints
addressed by large and small programs in the grantee cohort since FY2001. Many grantees
interviewed suggested that monitoring and assessment tools, methodologies, and databases tend
to require larger budgets and more extensive technical skills, which may partially explain why small
programs received around 40 percent less funding. Some small programs simply did not see the utility of
extensive investments in monitoring and assessment (e.g., "We do not have many wetlands. We
already know where they are and what we need to do with them").
Exhibit 7: Grantee Cohort Progress Towards Achieving Monitoring and Assessment Endpoints

                                                            % Grantees by Program Size
  Monitoring and Assessment Endpoint                         Large    Small    Total
  a. Has data analysis/assessment methods.                    100%     89%      94%
  b. Has data management systems.                             100%     44%      72%
  c. Has and actively uses core and supplemental
     indicators.                                               67%     67%      67%
     1. Has level 1 indicators.                                67%     22%      44%
     2. Has level 2 indicators.                                67%     22%      44%
     3. Has level 3 indicators.                                44%     56%      50%
  d. Has wetland inventory maps.                               44%     44%      44%
  e. Has a classification system.                              56%     22%      39%
  f. Has specific monitoring objectives/design.                56%     22%      39%
  g. Conducts status reports that include wetlands.            56%     22%      39%
  h. Has a wetland reference network.                          33%     22%      28%
ii. Restoration
Exhibit 8 illustrates comparable progress made by small and large programs regarding the
restoration core element. Funding for building wetland programs' restoration components generally
focused on developing guidance and training to foster local restoration efforts.
Exhibit 8: Grantee Cohort Progress Towards Achieving Restoration Endpoints

                                                            % Grantees by Program Size
  Restoration Endpoint                                       Large    Small    Total
  a. Has targeted restoration with some criteria.              78%     67%      72%
  b. Conducts active research regarding restoration
     techniques and methods to measure success.                44%     67%      56%
  c. Funding or assistance is offered by the state/tribe
     to assist entities carrying out wetland restoration.      44%     44%      44%
  d. Has the ability to identify restoration needs and
     prioritize wetlands.                                      33%     44%      39%
  e. Actively uses tracking, reporting, and evaluation
     components.                                               33%     22%      28%
iii. Regulation
As Exhibit 9 illustrates, large programs tended to focus more on developing the regulatory
component of their wetland protection programs.
Exhibit 9: Grantee Cohort Progress Towards Achieving Regulation Endpoints

                                                            % Grantees by Program Size
  Regulation Endpoint                                        Large    Small    Total
  a. Has a basis of authority for a compliance and
     enforcement program.                                      89%     56%      72%
  b. 401 Certification provides comprehensive
     protection for wetlands.9                                 67%     56%      61%
  c. Tracks permit activities.                                 89%     11%      50%
  d. Analyzed mitigation approaches.10                         67%     11%      39%
9 Much of the data supporting frequency percentages for this endpoint are derived from ELI reports that do not indicate
when a program worked on or achieved an endpoint. Therefore, these findings may overstate FY2001-6 progress.
10 While this category is not yet an EPA endpoint, the EPA WAM approved its inclusion here due to the importance of
mitigation to the regulation core element. EPA has made approaches to mitigation a national and regional priority (See
Appendix 3).
iv. Water Quality Standards
Our discussions with grantees and reviews of WPDG project descriptions indicated minimal
attention to the development or enhancement of water quality standards since 2001. In selecting the
grantee cohort, we intentionally sought to include grantees with WPDGs primarily focused on WQS.
However, upon further review of grantee details, we determined that the two grants coded in the EPA
database as WQS were more aligned with the Monitoring and Assessment core element. Therefore,
we found little to evaluate in terms of WQS progress made by state/tribal programs overall or within
the grantee cohort. Given the lack of data points, we were not able to identify causal factors behind
the dearth of WQS funding by the WPDG program. However, in reviewing EPA's annual national
priority areas for the WPDG program, there was no emphasis on WQS for the FY2001-6 period
addressed in this evaluation project.
v. Other Progress
In some instances, grantees identified progress made that alternatively did not readily map to
core element endpoints, mapped to multiple core elements but not specific endpoints, or aligned more
to EPA's 'basic elements' of a wetland program. In Exhibit 10, we summarize these areas of progress
and the frequency with which states and tribes in the grantee cohort cited each item. The majority of
all programs in the grantee cohort (78 percent) cited progress in education and outreach initiatives that
aligned with EPA's "Education and Outreach" basic element. Grantees saw education and outreach as
an important tactic for facilitating progress on monitoring and assessment, regulation, and restoration.
For example, Massachusetts' Wetlands Protection Program provides guidance to regional and local
authorities so that they can more effectively regulate, preserve, and restore wetlands.11 WPDG
funding was particularly aimed at guidance for multiple aspects of wetland efforts; of the 99 grants
awarded to the grantee cohort, 25 involved either the development or distribution of guidance material.
Exhibit 10: Other Areas of Program Development Achieved by the Grantee Cohort

                                                            % Grantees by Program Size
  Area of Program Development                                Large    Small    Total
  a. Conducts wetland related education and outreach          100%     56%      78%
  b. Uses GIS                                                  78%     67%      72%
  c. Issued guidance material                                  89%     33%      61%
  d. Participation in task force/study group/work group        67%     56%      61%
  e. Cooperation with other agencies for tools,
     knowledge or advice                                       56%     67%      61%
  f. Development of a wetlands inventory                       33%     33%      33%
  g. Developed a wetland website tool                          44%     22%      33%
11 For more details on Massachusetts' wetland program development efforts, see the case study presented in Appendix 2.
B. Question 2 - What key variables contribute to achieving core elements of state/tribal wetland
programs? How important is the WPDG program in developing core elements of
state/tribal wetland programs?
We asked states and tribes in the grantee cohort for their opinions regarding what factors are
critical to making progress in developing their wetland programs. Grantees cited factors and
conditions that positively contribute to program development, regardless of whether a given factor or
condition currently applies to their own program. We intentionally framed the inquiry
broadly to minimize the bias inherent in asking leading questions about specific critical success factors.
Exhibit 11 provides a statistical summary of grantee responses citing factors critical to progress made
since FY2001. Note that the frequency percentages for critical success factors cited
in Exhibit 11 represent a "floor" level of grantee opinion: had we explicitly polled the
grantee cohort on each factor, levels of agreement would likely have been higher.
Exhibits 12 and 13 display the degree to which WPDG projects aligned with or otherwise
addressed EPA core element endpoints. Exhibit 14 lists other sources of funding that the grantee
cohort utilized and the higher degree of success large programs had in securing non-WPDG funding.
1. Critical success factors to wetland program development
We grouped the factors into two categories: external and internal. External factors include
funding sources outside the grantee's organization and interactions with third parties. Internal factors point
to organizational dynamics and characteristics that promote the development of wetland programs. As
with most governmental programs, factors critical to success cited by the grantee cohort addressed
such common themes as funding, planning, coordination, management support, and personnel.
Greater detail on the data provided in Exhibit 11 can be found in Appendix 1. Specifically,
Appendix 1 identifies the names of states and tribes that comprise the statistics in Exhibit 11 and the
supplemental findings discussed below in our response to Question 2.
Exhibit 11: Critical Success Factors to Wetland Program Development Cited by the Grantee Cohort

                                                            % Grantees Responding by Program Size
  Critical Success Factors                                   Large    Small    Total
  External Factors                                             78%    100%      89%
  a. Sufficient and consistent funding                         56%     89%      72%
  b. Support of and coordination with outside
     organizations                                             67%     56%      61%
  c. Having local programs active in wetlands
     protection                                                11%     33%      22%
Exhibit 11 (continued): Critical Success Factors to Wetland Program Development Cited by the Grantee Cohort

                                                            % Grantees Responding by Program Size
  Critical Success Factors                                   Large    Small    Total
  Internal Factors                                             89%     33%      61%
  d. Having a strategic plan or doing planning,
     prioritizing, or preparation in advance                   56%     22%      39%
  e. Internal management or community support for
     wetland program                                           33%     11%      22%
  f. Having capable and qualified staff                        22%     11%      17%
As expected, grantees most often pointed to the issue of funding as critical to their success in
developing their wetland programs. The grantee cohort was specific in citing the importance of
having both sufficient and consistent funding.
Most grantees, regardless of size, recognized that they could not "do it all on our own."
Interaction with other agencies, academic institutions, and consultants for technology transfer and
information sharing was seen as critical to cost-effectively developing and enhancing aspects of their
wetland programs. Sixty-seven percent of large grantees and 56 percent of small grantees indicated
that participation in a study group, work group, or task force allowed them to make advances more
quickly and without "re-inventing the wheel."
The majority of large and small programs said they cooperated with, borrowed tools from,
tapped the knowledge of, or otherwise leveraged other organizations in developing their wetland programs.
For example, many grantees said they contacted other programs for assistance in developing Level 1-3
indicators and assessment tools. Arkansas' Multi Agency Wetland Planning Team (MAWPT)
provides an interesting model for interagency coordination within a state.12 Through the MAWPT,
state agencies coordinate interagency wetland activities to reduce overlap and potential conflicts. By
jointly applying for WPDGs, Arkansas crafts stronger grant proposals and secures more consistent
funding.
Another interesting finding is the importance placed on strategic planning by the grantee
cohort. As conveyed in Appendix 1, most large programs reported having a strategic plan (56 percent)
for wetlands whereas only one small program (11 percent) indicated that it had a fully developed
strategic plan.13 As conveyed in Exhibit 11, most large programs (56 percent) view strategic planning
as critical to their success whereas only two small programs (22 percent) expressed the same
sentiment.
12 See the Arkansas case study in Appendix 2 for more information on the advantages of the MAWPT.
13 Many other programs without a strategic plan indicated that they had "elements" of one (e.g., list of goals).
Ohio EPA (OEPA) is one of the large programs that cited the importance of strategic planning
for wetlands. In one of the four case studies presented in Appendix 2, we discuss OEPA's wetland
program development efforts and provide a link to Ohio's strategic plan, which OEPA updated in 2005.
EPA indicated that it will attempt to identify and make available other state/tribal strategic plans to
grantees to aid in program development.
Grantees with strategic plans tended to take state/tribal wetland priorities into consideration
when developing WPDG project proposals. We found that the portfolios of WPDG projects for
grantees with strategic plans were more cohesive and focused on major developmental themes.
WPDG projects for grantees without strategic plans did not always align with core elements.
2. The importance of WPDG in building core elements of wetland programs
Because there are no official definitions to use in assessing the degree of progress made
towards achieving endpoints, and thus, no means of determining whether or not a given endpoint had
been achieved, we were not able to conclusively attribute or quantify progress made using WPDG
funding. Despite these limitations on attributing developmental progress to WPDGs, many
grantee interviews revealed the high level of importance grantees place on WPDGs. We found that
small programs rely more heavily on WPDG funding in making progress toward
achieving core element endpoints. Small programs tend to view grants as critical not only to program
building, but to program survival.14 Thirty-three percent of small grantees (17 percent of the grantee cohort)
stated that without WPDG funding, their program would not be able to survive. While no large
programs viewed a loss of WPDGs as critical to program survival, 33 percent of large programs and
11 percent of small programs (collectively 22 percent of the grantee cohort) acknowledged that they
would have made significantly less progress without WPDG funding from EPA. Interview responses
citing the importance of WPDG funding in program development were unprompted and therefore
represent a "floor" level of grantee opinion; had we polled grantees directly on the importance they
place on WPDGs, the affirmative response would undoubtedly have been higher.
We developed Exhibits 12 and 13 to quantify the extent to which the grantee cohort's WPDG
projects map to core element endpoints. Exhibit 12 summarizes how many grants addressed endpoints
for one or more core elements. Given that 91 of the 99 WPDGs reviewed (92 percent) align with one
or more endpoints, it is reasonable to conclude that WPDGs are vital to building state/tribal wetland
programs.
Exhibit 12: Degree of Alignment of Grantee Cohort WPDGs with Endpoints

  Addressing endpoints for only one core element                66
  Addressing endpoints for more than one core element           25
  No link to an endpoint                                         4
  Unknown links to endpoints                                     4
14 See the St. Regis case study in Appendix 2 for more information on small program reliance on WPDGs for survival.
  Total WPDGs                                                   99
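The alignment share quoted above can be checked directly from the counts in Exhibit 12; the following arithmetic sketch uses only figures stated in the exhibit.

```python
# Counts taken from Exhibit 12 (grants by degree of endpoint alignment).
single_element = 66  # endpoints for only one core element
multi_element = 25   # endpoints for more than one core element
no_link = 4
unknown = 4

total = single_element + multi_element + no_link + unknown
aligned = single_element + multi_element
share = round(100 * aligned / total)  # share of WPDGs aligned with >= 1 endpoint
```

Here `total` is 99, `aligned` is 91, and `share` is 92, matching the "91 of the 99 WPDGs reviewed (92 percent)" figure cited in the text.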
Exhibit 13 provides a frequency count of specific endpoints addressed by the grantee cohort
using WPDG funding from FY2001 through part of FY2006.
Exhibit 13: Frequency with which Grantee Cohort WPDGs Align with Endpoints

                                                            Frequency Cited by Program Size
  Endpoint Types                                             Large    Small    Total
  Monitoring and Assessment                                     64       50      114
  1. The state has data analysis/assessment [methods].          21       21       42
  2. Level 1-3 indicators                                       14        9       23
     > Level 1 - Landscape (1/3 criterion)                       4        2        6
     > Level 2 - Rapid Assessment (1/3 criterion)                2        0        2
     > Level 3 - Bio/Eco Indices (1/3 criterion)                 8        7       15
  3. The state has data management systems.                      8        6       14
  4. The state has wetland inventory maps.                       6        3        9
  5. The state has a wetlands reference network.                 4        4        8
  6. The state conducts status reports, including state
     305(b) or Integrated Reports, which include
     wetlands status.                                            3        5        8
  7. The state has specific monitoring objectives.               4        1        5
  8. The state has a monitoring design for each
     objective.                                                  3        0        3
  9. The state has a classification system.                      1        1        2
  Restoration                                                   38       35       73
  1. Targeted restoration by sector, type, geographic
     area or some other criteria.                               11       11       22
  2. Active research regarding restoration techniques
     and methods to measure success.                             3        9       12
  3. Ability to ID restoration needs and prioritize
     wetlands.                                                   5        7       12
  4. Funding or assistance by S/T to assist entities
     carrying out wetland restoration.                           6        5       11
  5. Actively uses tracking, reporting, and evaluation
     components.                                                 7        3       10
  6. Actively implementing a S/T-wide tracking/
     reporting system for restoration.                           5        0        5
Exhibit 13 (continued): Frequency with which Grantee Cohort WPDGs Align with Endpoints

                                                            Frequency Cited by Program Size
  Endpoint Types                                             Large    Small    Total
  7. Has timeline and performance measures for the
     strategy.                                                   1        0        1
  Regulation                                                    15        2       17
  1. The state has a basis of authority for a compliance
     and enforcement program.                                    6        0        6
  2. The state has targets for individual permit, general
     permit, and mitigation site inspections (e.g., a set
     percentage of individual permit site inspections
     annually).                                                  4        1        5
  3. Whether all wetlands are subject to 401 Water
     Quality Certification.                                      3        0        3
  4. The state is meeting those goals over the last two
     years (in addition: what are actual results? Any
     proposed or actual remedial measures if not
     meeting goals?).                                            0        1        1
  5. 401 Certification provides comprehensive
     protection for wetlands.                                    1        0        1
  6. Whether the state program tracks 401 Certification
     decisions.                                                  1        0        1
3. Other sources of funding that play a role in building core elements
Large grantees tended to have a more diverse portfolio of funding sources than did small
programs. The majority of large programs said that they received funds for wetland program
development and implementation from two sources: 1) state tax (or bond) revenues; and 2)
State/Tribal Environmental Outcome Wetland Demonstration Program Pilot Grants exclusively
focused on program implementation activities (hereafter referred to as "Pilot Grants").
Small programs tended to receive little or no funding from the governing state or tribe, and only
one small grantee received a Pilot Grant.15 Both large and small grantees utilized other sources of
funding,16 but none appears to be as consistent as, or of the magnitude of, WPDGs.
Exhibit 14: Other Funding Sources Utilized by the Grantee Cohort

                                                            % Grantees Responding by Program Size
  Funding Item                                               Large    Small    Total
  a. Receives state funding from general tax or bond
     revenues                                                  89%     22%      56%
  b. Used other sources of funding                             44%     44%      44%
  c. Received a State/Tribal Environmental Outcome
     Wetland Demonstration Program Pilot Grant                 56%     11%      33%
  Average                                                      63%     26%      44%
15 An examination of Pilot Grant award patterns was not part of the scope of this evaluation. However, multiple small
programs mentioned that they were ineligible for Pilot Grant awards due to insufficient progress in developing
core elements.
16 Refers to funds acquired outside of WPDGs and any state sources. For example, some small programs within the
grantee cohort tapped Federal grant programs directed by the U.S. Fish and Wildlife Service and the Bureau of Indian
Affairs, while some large programs received nonpoint source funding and Performance Partnership Grants.
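The "Average" row in Exhibit 14 is the column-wise mean of the three funding-source percentages, rounded to the nearest whole percent; a minimal check using the exhibit's own values:

```python
# Column values from Exhibit 14 (rows a-c), by program size.
large = [89, 44, 56]
small = [22, 44, 11]
total = [56, 44, 33]

# Mean of each column, rounded to whole percent, reproduces the
# "Average" row of the exhibit: 63%, 26%, 44%.
averages = [round(sum(col) / len(col)) for col in (large, small, total)]
```

The spread between the large-program average (63 percent) and the small-program average (26 percent) underscores the funding-diversity gap discussed above.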
C. Question 3 - What barriers do grantees face in building core elements of state/tribal wetland
programs?
We asked states and tribes in the grantee cohort for their opinions regarding barriers
encountered in developing or enhancing wetland programs, and provide a statistical summary of
responses in Exhibit 15. Grantee responses regarding types of barriers faced in developing wetland
programs tend to fall into two categories:
> Funding and Organizational Difficulties; and
> Grant Program Management Issues.
Again, we intentionally framed the inquiry broadly to minimize the bias inherent in asking leading
questions about specific barriers. The frequency with which grantees cited barriers represents a "floor"
level of response: had we polled the grantee cohort on each barrier, it is likely that more grantees would
have agreed with some or all of the barriers cited. Appendix 1 supplements Exhibit 15 with the names
of the specific tribes and states offering opinions on barriers to wetland program development. Our
analysis of these barriers follows Exhibit 15.
Exhibit 15: Barriers to Building Core Elements Cited by Grantee Cohort

                                                            % Grantees Responding by Program Size
  Barriers                                                   Large    Small    Total
  Funding and Organizational Difficulties                      78%     89%      83%
  a. Inadequate funding                                        44%     67%      56%
  b. Insufficient staffing to develop program                  22%     78%      50%
  c. Inexperience with writing grants                           0%     56%      28%
  d. Competition for WPDG funding                              33%     22%      28%
  e. Lack of internal or interagency support                   11%     33%      22%
  f. Difficulty in securing matching funds                     11%     33%      22%
  Grant Program Management Issues                              56%     56%      56%
  g. One-year grant duration is too short                      33%     33%      33%
  h. Timing of funding does not coincide with seasons
     to do field work; schedules slip                          33%     22%      28%
  i. Internal grant approval process                           11%     22%      17%
> Inadequate Funding Impedes Both Large and Small Programs. More than half of all states
and tribes in the grantee cohort cite insufficient funding as a barrier to program development.
Smaller programs face more disruptive impacts from low levels of or gaps in funding. For
example, the case of Saint Regis Mohawk Tribe illustrates how a small program can make
progress with consistent WPDG funding, yet suffer significant setbacks when it does not
receive an annual award (see Appendix 2 for a case study on Saint Regis Mohawk Tribe).
A third of large grantees voiced concern about how Pilot Grants have cannibalized the already
low levels of funding available for WPDGs. One large grantee noted that the decrease in
available WPDG funding due to Pilot Grants led it to extend its 5-year plan to ten years. This
highlights a disparity between large and small programs, whereby small programs tend to cease
operations when they fail to receive an annual WPDG award while large programs scale back
aspects of programs, reassign staff, and extend timelines for completion.
> Small Grantees Face Staffing Issues. Three quarters of small programs consider a shortage
of qualified staff to be an important factor impeding wetland program development. The
relative lack of depth of small program staffs leads to acute operational difficulties when an
employee leaves. The Saint Regis Mohawk Tribe case points to the difficulty that small
programs face in attracting appropriately trained and qualified personnel in the face of chronic
funding issues (e.g., potential failure to get a WPDG award, one-year duration of WPDGs).
> Small Grantees Lack Grant-writing Experience. More than half of small grantees say they
are hindered in their efforts to effectively compete for WPDG funding due to inexperience with
writing grant proposals, whereas no large programs cited this issue as an obstacle. Grant-
writing is a critical skill for small program survival, yet many small programs acknowledged a
lack of formal training for this skill and a general lack of experience in writing proposals. This
issue is compounded by the fact that small programs have limited time to perform grant writing
due to heavy workloads. Arkansas mitigated this issue by forming a workgroup (MAWPT) to
jointly write WPDG applications on behalf of five state agencies. The use of a permanent
workgroup enables agencies to reduce workload burdens in writing and administering grants.
> Both Large and Small Programs Cite Grant Program Management Issues. It is important
to note that grantees generally believe that EPA requirements for writing, reporting, and
administering WPDGs are reasonable. However, grantees cite some program management
issues that can impede core element development. A third of the grantee cohort said that the
one-year duration of WPDGs was too short a time frame both in terms of expected project
duration and from a grant administrative perspective (i.e., it would reduce workload if an
application could be written for two or more years rather than having to apply each year).
Roughly a quarter of the grantee cohort said that gradually slipping deadlines for applying for,
reviewing, and awarding grants create funding gaps, delay scheduled fieldwork, and increase
administrative burdens (e.g., an award in winter may delay projects until spring).
> Small Programs Struggle with Internal Issues. One-third of small programs indicated that
wetland program development either lacked internal agency support or faced resistance from
other state agencies with conflicting agendas (e.g., agencies promoting residential and business
construction).
D. Question 4 - What can EPA do in the future to help states/tribes build core elements of wetland
programs?
Similar to the approach used to answer questions 2 and 3, we rely on grantee cohort suggestions
regarding what EPA can do to better facilitate the development of state/tribal wetland programs.17
Exhibit 16 compiles the recommendations made by the grantee cohort. In soliciting suggestions from
grantees, we did not ask for feedback on specific ideas to minimize bias inherent in asking leading
questions, but we did ask them to consider the critical success factors and the barriers they cited when
making suggestions. The frequency with which grantees made similar suggestions represents a "floor"
level of response: had we polled the grantee cohort on each suggestion, it is likely that more grantees
would have agreed with some or all of the suggestions made. Therefore, EPA may want to consider
grantee cohort suggestions on their merits rather than interpret the frequency of responses as an
indicator of the priority given to any particular suggestion. Appendix 1 supplements Exhibit 16 with
the names of the specific tribes and states offering suggestions to EPA to facilitate state/tribal
wetland program development.
Exhibit 16: Suggestions to EPA Made by the Grantee Cohort

                                                            % Grantees Responding by Program Size
  Grantee Suggestions                                        Large    Small    Total
  Grant Administration/Program Management                     100%     89%      94%
  a. Increase funding for WPDG program                         89%     78%      83%
  b. Extend grant duration beyond one year                     44%     56%      50%
  c. Provide assistance with grant material submissions        11%     56%      33%
  d. Make QAPP requirements less onerous                       11%     11%      11%
  General Suggestions                                          33%     22%      28%
  e. Facilitate the sharing/adapting of tools between
     states and tribes                                         33%     11%      22%
  f. Restart annual regional conferences                       11%     11%      11%
17 The answer to Question 4 exclusively relies on the opinions offered by the grantee cohort and does not include
contractor ideas to improve the effectiveness of EPA's WPDG program. We capture our suggestions in the "Potential
Next Steps and Recommendations" section.
> Most Grantees Suggest an Increase in WPDG Funding. Four-fifths of all grantees suggest
that more funding be made available under the WPDG program.
> Lengthening Grant Period of Performance Cited by Small and Large Programs. Half of
the grantee cohort suggests that EPA allow grants to be performed beyond their current one-
year duration, when appropriate. Many grantees, especially small programs, say a single grant
- covering more than one FY budget cycle - reduces grant-writing burdens and reduces the
chance of having to push back fieldwork required in some developmental projects. The
underlying sentiment in small programs is that grants with longer durations will tend to reduce
the frequency and severity of gaps in funding.
> Small Programs Need Increased Assistance on Grant Writing. A number of grantees
spoke of the steep learning curve involved with the grant writing and application process.
Some small grantees expressed that they felt at a disadvantage when competing with larger,
more technically savvy programs. More than half of small grantees said that they would stand
to benefit from more training or assistance in grant writing. Large programs, with more in-
house expertise, expressed little need for assistance on writing winning grant proposals.
> Better Coordination and Communication Across Grantees. One-third of large grantees
recommend that EPA step up efforts to foster the sharing of transferable tools, methodologies,
and databases among states and tribes. One grantee suggested that EPA examine those states
where there are multiple agencies involved in wetland program development and promote
more joint WPDG proposals (e.g., Arkansas, Virginia). The grantee thought that having a
"state coordinator" to minimize conflicting or redundant activities and promote coordination
makes sense given limitations on available WPDG funding. Another grantee (who also
suggested that EPA restart annual regional wetlands conferences) noted how helpful it would
be to bring together programs with similar issues to discuss strategy and share findings because
"talking to people is better than just reading their studies."
E. Other Findings
> Emphasis on Environmental Outcomes over Program Development. Multiple grantees -
regardless of program size - cited mounting pressure to achieve environmental outcomes;
when forced to choose between a developmental and an implementation activity, they are more
inclined to pursue the latter.
> Adequacy of National Databases. Two grantees (one large, one small) said that inaccuracies
within the National Wetlands Inventory impaired the development of their wetland programs.
EPA may want to examine data quality in this and other databases that states and tribes rely on
to develop their own databases and integrate data in support of identifying needs and setting
priorities.
> Utility of Grant Reporting. One grantee commented on the lack of follow-up with regard to
semi-annual reports after they are transmitted to the EPA Regional office. The grantee stated
that it had yet to receive feedback on the reports and wondered if the reports could serve as
a useful foundation for a dialogue about progress made and areas for improvement.
IV. Potential Next Steps And Recommendations
1. Tailor EPA Program Management to Account for Differences Between Large and Small
Wetland Programs. A key finding of this evaluation project is that there are many
fundamental differences between large and small wetland programs regarding capabilities,
resources, and approaches to program development. These differences have implications that
go beyond the WPDG program to encompass broader aspects of EPA's wetland program
management strategy and tactics. EPA may want to consider changes to the following areas to
better facilitate the development of large and small wetland programs:
a. Refine Core Element Endpoints. EPA recently developed a list of core element
endpoints that collectively comprise the set of characteristics and capabilities of a fully
functional state/tribal wetland program. EPA used the endpoints in the evaluation of grant
applications under the State/Tribal Environmental Outcome Wetland Demonstration
Program, and plans to increasingly apply the endpoints in other aspects of program
development, implementation, and performance measurement.
Before EPA can effectively measure state/tribal achievement of endpoints (or incremental
progress towards achieving endpoints), it will be necessary to develop clear programmatic
definitions for each endpoint. We also recommend developing guidance materials and
training for the regions, states, and tribes. Core element endpoints
might be more equitably applied to large and small programs by:
1. Prioritizing endpoints. Based on feedback from EPA, states, and tribes, we generally
find that some endpoints are more important than others. EPA may want to designate
priority levels to endpoints to further guide states/tribes on what program areas are
more important or urgent to develop. It is possible that, upon further review, EPA may
determine that some endpoints should be dropped while new ones are warranted
(e.g., the frequency with which "guidance" was cited by grantees and the amount of
WPDG resources dedicated to developing guidance suggest it merits consideration as an
endpoint; developing a strategic plan also appears to be a viable endpoint candidate).
2. Categorizing endpoints. The current list of endpoints appears to be more reflective of
what a large program can become, and includes some items that a small program would
be at a relative disadvantage to achieve (e.g., data management systems). A fresh look
at the endpoints might yield "common" endpoints that all programs - regardless of size
- should aim to achieve. Some endpoints might be appropriately categorized as "large
program only" and "small program only". These three designations could enable EPA
to more equitably evaluate and compare the progress of large and small programs.
b. Process for Screening and Selecting WPDG Awards. Exhibits 1-5 illustrate a pattern of
awarding more WPDGs, and higher average awards, to large programs. This tendency,
combined with the heavier relative reliance of small programs on WPDG funding,
puts small programs at a disadvantage.
To achieve a more equitable distribution of WPDG funding to large and small programs,
EPA may want to consider adding a few "equity checks" to adjust award decisions and
amounts, if necessary (e.g., by comparing the average size of awards). To reduce the
administrative burden on tribes and small state programs, EPA may also want to introduce
a multi-year "blanket" application for grants from an interagency funding pool reserved
exclusively for small programs. This would let applicants who are hard-pressed to spare
limited FTE hours for grant writing reduce administrative activities and expose their
proposals to multiple funding sources.
c. State/Tribe Collaboration and Technology Transfer. Small programs tend to have
fewer resources and capabilities to develop the more technical and scientific endpoints.
EPA may want to place greater emphasis on technology transfer and multi-program
collaboration to achieve core element endpoints more cost-effectively, especially for small
programs or programs that are relatively early in the program development lifecycle. For
example, EPA could develop a list of "best of class" tools, methodologies, guidance
documents, training modules, and databases that an undeveloped program can adapt to
meet its needs rather than create "from scratch." Also, given that at least 50 percent of the
states and tribes in the grantee cohort have more than one state agency "conducting
significant wetland-related activities,"18 EPA and grantees may want to focus on
coordinating wetland-related efforts among multiple organizations within a state (aligning
to a core element not included in this evaluation, "Coordination"). While the objectives of
the evaluation did not include identifying redundant WPDG projects, we anecdotally
observed some instances in which grantees were performing comparably scoped WPDG
projects. In the award evaluation process, EPA may want to actively compare proposed
projects with those already underway to reduce overlap and, if practicable and appropriate,
re-scope awards (e.g., instead of a "bottom-up" database development project, award a
smaller amount to tailor an existing database to the needs of the grantee).
2. Allow WPDG Awards to Span Multiple Years. The current single-year duration of
WPDG awards is problematic for both EPA and state/tribal awardees. The one-year grant
application cycle is administratively burdensome to all concerned and does not reflect the
multi-year nature of many of the underlying projects undertaken with WPDG funds. EPA
could still make single-year awards where appropriate, but the flexibility to award multi-year
projects would spare states and tribes from having to submit essentially the same proposal in
subsequent years. Not only would this reduce the number of grant applications to be
reviewed (and grants to be subsequently administered and closed out), it would afford EPA
the flexibility to tap multiple budget years for a single project. EPA may still choose to
release new RFPs each year, but the application burden for any given year is likely to be less
than under the status quo.
3. Consider Fixed or Base WPDG Award Amounts. As the St. Regis Mohawk case study
illustrates, the "feast or famine" effect that small programs experience when they do not
receive a WPDG can significantly impair program momentum. For small or otherwise
disadvantaged programs, EPA may want to consider setting aside a base or fixed annual
allocation of WPDG funding.
Depending on available funds and the number and magnitude of other potential awards in a
18 From Environmental Law Institute's State Wetland Program Evaluation: Phase III (page 16); See Appendix 1 for a count
of agencies conducting wetland activities within the grantee cohort. See the Arkansas case study in Appendix 2 for a
closer look at successful coordination among multiple organizations within a state.
given year, EPA could supplement the base funding with a variable amount. This approach
could serve to increase the ability of a small program to retain and attract key personnel.
4. Enhance Database Tracking. While the scope of this evaluation project did not include an
analysis of the adequacy of programmatic databases, we identified a number of data quality
and completeness issues while mining the grant data. For example, some grant records
lacked basic background information, some were miscoded as WPDGs when they were
actually Pilot Demonstration Grants, and some did not include key planned and actual
outputs/deliverables. The latter issue has implications for effective program management
and performance measurement. In addition, current database structures do not appear to
accommodate coding of progress toward specific endpoints; existing databases may need to
be redesigned or enhanced, or new databases created, to accommodate endpoint tracking.