United States
Environmental Protection
Agency

231-K-160-01
September 2016
www.epa.gov/smartgrowth

Flexible Framework for
Measurement of EPA's
Community-Based Initiatives

Office of Sustainable Communities
Smart Growth Program


Mention of trade names, products, or services does not convey official EPA approval,
endorsement, or recommendation.


Table of Contents

Preface and Acknowledgements

Introduction
	Document Purpose
	Intended Audience
	How to Use this Document
	Framework Roadmap

Characteristics of Good Measurement

Basic Steps in the Measurement Process
	Developing a Logic Model

Planning For Measurement
	What Data Will be Collected to Support the Measures
	Who Will Collect the Data
	How the Data Will be Collected
	How the Data Will be Analyzed
	How the Data Will be Stored/Managed
	Frequency of Reporting/Collection
	How the Data Will be Communicated to Facilitate Use
	Information Collection Requests
	Establishing a Baseline or Comparison
	Field Testing
	Resource Requirements
	Storytelling

Partnerships
	Overview
	Measuring Partnership Health

Leveraging Resources
	Overview
	Measuring Leveraging Resources

Education and Training
	Overview
	Measuring Education and Training

Capacity Building
	Overview
	Measuring Capacity Building

Customer Satisfaction
	Overview
	Measuring Customer Satisfaction

Environmental Outcomes
	Overview
	Measuring Environmental Outcomes

Economic and Quality of Life Outcomes
	Overview
	Measuring Economic and Quality of Life Outcomes

Additional Resources
	Program Evaluation and Measurement
	Data Visualization
	Social Network Analysis

Preface and Acknowledgements

In 2012, EPA's Evaluation Support Division (ESD), located within the Office of
Policy, developed a draft Flexible Framework for Measurement of EPA's Community-
Based Initiatives with contractual support from Industrial Economics, Incorporated
(IEc) and in consultation with a cross-agency team (OW, OCFO, Region 2, Region 7,
Region 8) and staff involved in EPA's community-based work.

To create the 2012 framework, IEc reviewed publicly-available information on
performance measurement for EPA's community-based work and conducted
interviews with representatives from nine community-based initiatives. The
interviews focused on identifying approaches that the Agency was taking towards
measurement and evaluation; specific problems that staff commonly encounter while
developing measurement approaches; and lessons learned from previous activities
that may be applicable to initiatives still developing their long-term measurement
and evaluation approach. Following this research stage, IEc developed an outline of
the measurement framework and draft measurement tables. EPA stakeholders
provided feedback on these drafts and IEc incorporated this feedback to develop the
2012 framework document.

In 2015, EPA's Making a Visible Difference in Communities cross-agency strategy, which provided focused support to approximately 50 over-burdened and under-served communities, prompted renewed interest in the framework within EPA's
Office of Policy (OP). EPA contracted with IEc to revise and update the framework to
add guidance and include recent examples of measurement activities
related to EPA's community-based work. IEc's and EPA's efforts resulted in this
2016 Flexible Framework for Measurement of EPA's Community-Based Initiatives
guidance document.

This document adopts or adapts language from previously developed ESD measurement resources including the training Logic Modeling, Performance Measurement, and Program Evaluation: A Primer for Managers and Guidelines for Measuring the Performance of Partnership Programs. ESD and OP wish to thank all who assisted in this effort.

Introduction

EPA administers many programs through which the Agency works in partnership
with community organizations to improve the health of local communities and
minimize communities' environmental impacts. EPA's portfolio of community-based
work is characterized by a significant level of involvement at the community level,
with many activities directly implemented by community groups or local
governments. Community-based initiatives are often defined by a particular place
(e.g., specific geography or political jurisdiction) and sometimes by a specific
demography as well (e.g., a specific Tribe). Some examples of EPA's past and present community-based work include the Brownfields program, Community
Action for a Renewed Environment (CARE), Community-Based Environmental
Protection (CBEP), Climate Showcase Communities, Indian Environmental General
Assistance Program (GAP), Environmental Justice Showcase Communities, Making
a Visible Difference in Communities (MVD), Superfund Jobs Training, and Office of
Sustainable Communities technical assistance programs.

When an organization or multiple organizations have programs with similar goals, it
is useful to establish a common vernacular for measurement to take advantage of
learning that can occur across programs. The measures tables that comprise the
heart of this framework establish a common set of measures for EPA community-
based programs and initiatives. The tables also provide examples of how EPA
community-based initiatives have customized or added detail to these basic
measures to meet their needs.

Document Purpose

This technical document was developed for use by EPA managers and staff
supporting Agency community-based programs and initiatives, as well as their
measurement and evaluation contractors. Overall this document is intended to
support EPA efforts to measure the performance of community-based work.

Intended Audience

The intended audience for this framework includes:

• Decision-makers - executives tasked with setting strategic direction and
outlining accountability systems

•	Middle managers - managers who need to ensure resources for
practitioners are available and are held accountable for achieving targets by
decision-makers

•	Practitioners - staff delivering services to communities
How to Use this Document

For those new to measurement or those seeking to revisit or refine their measures,
this document highlights the key characteristics of good measurement, outlines the
basic planning required to set up a measurement system, and provides an inventory
of measures as options. This document is not meant to be a complete "how to"
resource for establishing a robust measurement system for your initiative. Be sure
to consult other materials and experts.1

Framework Roadmap

The first two sections of this document provide background information on the
characteristics of good measurement and the measurement process, respectively.
The heart of this framework is made up of short chapters covering seven topic areas
of measurement common to many EPA community-based programs:

Partnerships: EPA often supports partnerships as a means of tackling
complex environmental issues, including those in areas with disproportionate
burdens. These measures gauge the health and long-term viability of
partnerships in which EPA is investing.

Leveraging Resources: These measures gauge a program or initiative's use
of available resources to leverage additional resources, increasing the benefit
of EPA's investment.

Education and Training: These measures address training community
members to become effective environmental advocates, employees, and
leaders.

Capacity Building: These measures address developing the knowledge,
skills, and confidence of community organizations funded by EPA, and the
members served by those organizations.

Customer Satisfaction: These measures address community satisfaction
with EPA's assistance.

Environmental Outcomes: These are measurable environmental benefits
associated with initiative activities.

1 Additional resources are available at the end of this document and on EPA's evaluation website:
https://www.epa.gov/evaluate

Economic and Quality of Life Outcomes: These are measurable economic
and quality of life benefits associated with initiative activities.

Each measures table is organized in the same way, in the following columns:

•	Category: All seven measurement areas are further organized into
categories of similar measures. For example, the capacity building measures
categories include: empowering partners, increasing organizational capacity
among partners, and improving community group physical and
communication infrastructure. Each category contains between one and eight
measures.

•	Potential Measures: These are the measures suggested for EPA staff to
consider. EPA suggests that staff select a small set of measures, including
one or more outcome measure (see below). The measures vary in format; for
example, some measures are yes/no while others are framed as a number or
proportion. Staff may customize measures to meet the needs of their
individual initiatives.

•	Activity, Output, or Outcome: The framework contains a variety of activity, output, and outcome measures. Activity measures refer to what EPA or the community partner does to implement the initiative; output measures refer to the things or products that EPA or the community partners produce or deliver; and outcome measures refer to changes in knowledge, attitude, behavior, or condition.

•	Primary Data Collector: Notes whether EPA or the community partner would likely be in the best position to collect data on the measure.

•	Examples: Provides examples of where the measure has been used by an EPA community-based initiative. Some of the examples in the tables may be project-specific as opposed to program-wide.

On the environmental outcomes table, each measure category is linked to the
goals of EPA's Strategic Plan.

Note that throughout the framework document and tables, the term "partners"
refers to the community group or organization that EPA is working with, as
distinct from "community members," which refers to individuals residing in the
community.

Characteristics of Good Measurement

Performance measurement is the ongoing measurement and reporting of progress
and accomplishments using pre-selected measures. All EPA initiatives should
develop a performance measurement approach to assess how well EPA activities
address the initiative's stated goals. Measurement is critical for understanding progress, for learning purposes, and for demonstrating value internally and to the public.

The following pages highlight the basic steps for establishing a measurement system
for an initiative. This includes key characteristics of good measurement to consider
as you select the measures you will use and as you establish your system for
measuring performance.

Have you involved all relevant stakeholders in a collaborative and transparent process?

The measurement process should engage all relevant stakeholders at appropriate
points throughout the process. You should seek to engage, in particular, your
community partners to ensure that their views and practitioner experience help
shape your approach. All stakeholders should see their relevance to the process and
should see some relevance of the measurement process to their own work. Gaining
the input and buy-in of community partners and stakeholders in the measurement
effort will help ensure the quality of their data collection.

Are the measures valid assessments of the program elements you are most interested in tracking?

The measure chosen to assess an element of community-based programs should be a
good representation of what it is conceptualized to measure. Validity in the context
of measurement describes the degree to which a conceptualization of a community-
based initiative element is mapped directly to the operational definition that is
captured in the specific measurement. The menu of options in the flexible
framework includes those measures that are most likely to be valid representations
of key community indicators across situations. It is also important, when selecting valid measures, that an initiative's set of performance measures directly gauge the activities, outputs, and outcomes of greatest relevance. This particular aspect of validity is referred to as content validity and is captured in a measurement's relevance. Staff do not need to gauge the success of every outcome as
long as the primary objectives are assessed. For example, land-revitalization work

should have a measure of land revitalized, and a renewable-energy initiative should
have a measure of renewable energy produced.

Are the measures reliable enough to render the same results if they
were independently collected by someone else?

An initiative's measurement approach should be implemented similarly within every
applicable community touched by EPA's work. Each project should collect the same
types of data using the same methods. That way, project-level data can be compiled
easily at the initiative level. Furthermore, replicable measurement methods can be
used as a model for other EPA work seeking to measure similar activities, outputs,
or outcomes.

Do the measures provide information on the most critical junctures
to achieving end goals or on the end goals themselves?

An initiative should use a logic model or program theory as a reference point for
selecting performance measures. While the logic model identifies the activities,
outputs, and expected outcomes, measures quantify the extent to which these
activities, outputs, and outcomes are being successfully realized. See information
below on logic modeling.

Are the measures feasible to implement?

Some metrics may seem highly relevant to the initiative element they attempt to quantify but might be overly burdensome for either community partners or EPA to
collect and compile. Before finalizing any measures, EPA should formulate a
measurement plan that outlines the intended methods for gathering, analyzing, and
compiling measurement data (see the next section, "Planning for Measurement," for
more detail). This plan should also include what information EPA will provide to
external partners, as well as what information external partners will report to EPA.
Additionally, EPA should check with potential community partners to verify the
data collection process is feasible.

To what will performance be compared?

When sharing performance data, it is important to include a comparison, such as
current performance compared to baseline performance (previous performance, or
performance at the start of the project or upon joining the initiative), performance
over time, or performance compared to a target or an established standard.

Is the initiative's role clear in achieving what has been measured?

The initiative may not be the sole causal agent to which outcomes may be attributed.
In many cases, EPA's work is one of multiple factors contributing to trends in
natural resources conserved and pollution reduced or avoided. Where there is a

question of what specifically causes a desired outcome, EPA should carefully
communicate environmental results. It is appropriate to use language such as "the
initiative contributed to (X environmental outcome)" and/or to caveat the other
factors that may be contributing to reported results, including other public
policies/programs.

In addition to the above information, having a clear plan for how you will measure
performance and use performance information is critical to ensuring that the data
you need will be accessible when you need it and will be of sufficient quality to act
upon. The next section discusses what to consider in planning for measurement.

Basic Steps in the Measurement Process

This section highlights key steps in the process of developing and implementing a
measurement system. Be sure to consult more extensive "how to" documents as
needed to help you conduct the process yourself.2

Identify the Team, Engage in a Logic Modeling Discussion, and Begin a Performance Measurement Plan

Identify individuals who will help you develop measures for your program. Engage
the group in a discussion of the theory of program change and consider documenting
a visual model that describes the elements of your community-based initiative and
the relationship between your program's work and its desired results. This visual
model will help you "see" your program activities, outputs, and outcomes and
prepare you to create your performance measures. Then, think about the audience,
purpose, context, roles, and resources for your performance measurement system.

Develop Performance Measures and Plan for Measurement

Adopt or create measures that show how well your program is meeting customer
needs and achieving environmental results. You will focus on measures that show
your short-term, intermediate, and long-term outcomes and meet the criteria for
good measurement. The goal in creating this flexible framework document is for
stakeholders in community-based initiatives to work collaboratively to identify a
core set of measures that may be broadly applied to programs of diverse
characteristics. Plan for measurement by thinking through the data you will need
to collect, who will collect it, how it will be collected, where it will be stored, how it
will be analyzed, to whom it will be communicated, how it will be communicated,
and how often. (See the next section on Planning for Measurement.)

2 Additional resources are available at the end of this document and on EPA's evaluation website:
https://www.epa.gov/evaluate

Collect Data on Measures

Gather data for the measures that you developed. Be sure to field test your
measures, data collection processes, and data storage tools before launching your
full data collection. A field test can save time, money, and relationships by allowing
you to test and correct the process before burdening your entire population with your
first information request.

Analyze and Interpret Data, Communicate Results, Facilitate Use

Analyze the data you have collected to see whether you are achieving your initiative
goals. Then communicate key results to others inside and outside of the agency. In
communicating results, help draw attention to results that may indicate a need for
follow-up action, and work with stakeholders to ensure that they have all available
data and that the data are clear enough to support decision-making and action.

Revisit, Revise, Repeat

The steps above describe a sequential process; however, the process of performance
measurement is an iterative, ongoing process. It is likely that as you continue the
process, you may have to revisit and refine some of the information and processes
identified in the early steps.

Developing a Logic Model

An important step to developing a successful measurement approach is to create an
initiative logic model. A logic model is an illustration of how a program, initiative, or
project is supposed to work; it shows the relationship between an initiative's work
and its desired results. A logic model exercise is helpful to develop measures because
it outlines all of an initiative's intended activities, outputs, and outcomes, which
need to be well-defined to inform measurement. Although developing a full logic
model is preferable, in many cases a logic table format is sufficient for helping to
identify measures.

A logic model is made up of seven basic elements:

Resources/Inputs: What the initiative has to complete the work (e.g. people
and funding)

Activities: What the initiative will do
Outputs: The products the initiative delivers

Target Audience: The recipients of the initiative's activities and outputs

Short-term Outcomes: Changes in the target audience's knowledge,
attitude, or skills

Intermediate Outcomes: Changes in the target audience's behavior,
practices, or decisions

Long-term Outcomes: Changes in the environment as a result of the
initiative

The logic model describes the causal relationships among these elements to
communicate how the initiative is designed to realize its goals. Logic models also
document any external influences beyond the control of EPA that could have
bearing on the implementation or outcomes of the initiative. For example, external
factors for community-based work may include:

•	Changes in funding or personnel at EPA and/or at the community level

•	The effect of state or federal policies that address the same environmental
issue

•	Shrinking or growing local population

The process of developing an initiative's logic model often uncovers subtle but
important differences in how different staff members think about how the initiative
is supposed to work. These conversations are important to have, and to come to agreement on, early in the process: the ability to measure accomplishments is a natural outgrowth of staff consensus on specific goals and on how the initiative is supposed to achieve them.

A generic logic table for community-based initiatives is displayed below.
The logic table incorporates generic examples of the items that could be included
within each of the logic model elements for an EPA community-based initiative.
Actual logic models should have a far greater degree of specificity than this generic
table, and should tie specific elements together in a logic chain using individual
boxes and arrows. Also, actual logic models may not include all of the activities or
produce all of the outputs and outcomes shown in the generic table.

Community-Based Initiative Generic Logic Model

Resources/Inputs: EPA staff; potential community partners; funding; institutional knowledge within EPA; other potential governmental, business, or NGO partners.

Activities: Provide technical assistance to community organizations; select grantees and administer grants; undertake research initiatives in partnership with community organizations; conduct environmental assessment, cleanup, and planning support; develop a measurement approach.

Outputs: Program logic model; communication with community partners such as meetings, conference calls, emails, and letters; fact sheets, guides, or other initiative materials; social media presence; trainings; action plans; environmental reports; case studies of successful projects; compilations of initiative accomplishments.

Target Audience: Community partner organizations; individual community members.

Short-term Outcomes: Growth in the number of community organizations partnering with EPA; community organizations increase use of materials developed through the initiative; increased community organization/member knowledge of local environmental issues; increased technical knowledge among community organizations/members (from job training).

Intermediate Outcomes: Community organizations/members undertake actions to address local environmental issues; community organizations/members increase capacity to implement environmental programming.

Long-term Outcomes: Environmental benefits; human health benefits; quality of life benefits; local economic benefits; sustainability of community-led environmental programming.

Planning For Measurement

While selecting your initiative performance measures and before launching full-scale
data collection to support the measures, it is important to carefully assess and plan
for data collection, analysis, and use by considering the following.

What Data Will be Collected to Support the Measures

Staff should identify the specific data to be collected to support each measure
selected. Individual measures may require similar data to be collected from different
sources. For example, if the initiative selects the partnership communication
measure "the number of discrete messages developed and used by others," then the
initiative may need to collect information on messages incorporated into different
media, such as online outlets, print media, and radio. Also, more than one type of
data may be needed to support a single measure. For example, the economic and
quality of life measure "economic output per unit of energy consumption" requires
both economic output data and energy use data.
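As an illustration, the sketch below computes that measure from the two data series it requires. This is a minimal sketch in Python; all figures, years, and units are hypothetical.

```python
# Minimal sketch: computing "economic output per unit of energy consumption"
# from two separately collected data series. All figures are hypothetical.

economic_output_usd = {2014: 1_200_000, 2015: 1_350_000}  # e.g., from local economic data
energy_use_mmbtu = {2014: 48_000, 2015: 45_500}           # e.g., from utility or fuel bills

for year in sorted(economic_output_usd):
    ratio = economic_output_usd[year] / energy_use_mmbtu[year]
    print(f"{year}: ${ratio:,.2f} of economic output per MMBtu consumed")
```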

As discussed earlier in this framework, EPA staff should consider data issues during
measure selection, including feasibility of collecting data needed for all measures.
This is a particular concern for environmental and quality of life outcome measures,
for which staff should consider the availability of potential data sources and
technical know-how to successfully implement the measure.

For example, if the measure is related to energy use, staff should identify existing sources of energy use data that would support the measure, such as utility or fuel bills, building energy models, and/or GHG inventories. If these existing sources are not available at the time that a community partner joins an EPA initiative, then EPA staff should consider the feasibility of their use for reporting purposes.
Typically, existing sources of data are available for energy use (and through
conversion, GHGs), water use, waste minimization, and land restoration and
preservation measures. Collection of new data may be necessary for tracking
progress on air quality, water quality, and toxics use reduction measures (unless the toxics use of concern is covered by the Toxics Release Inventory).
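As a hedged illustration of "through conversion, GHGs": the sketch below converts reported electricity use into an emissions estimate. The emission factor is a placeholder, not an authoritative value; actual factors vary by region and year.

```python
# Minimal sketch: estimating GHG emissions from reported electricity use.
# The emission factor below is illustrative only; real factors vary by
# region and year and should come from an authoritative source.

KG_CO2E_PER_KWH = 0.4  # placeholder grid emission factor, kg CO2e per kWh

def ghg_from_electricity(kwh: float) -> float:
    """Estimated metric tons of CO2e for a given electricity use in kWh."""
    return kwh * KG_CO2E_PER_KWH / 1000  # convert kg to metric tons

print(f"{ghg_from_electricity(120_000):.1f} metric tons CO2e")  # prints 48.0
```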

If new data collection is required for potential measures, then EPA staff should
consider the feasibility of collecting these data, including cost and technical
expertise required. If collection is not feasible, then EPA could consider a proxy for
directly collecting the data for some measures. For example, some EPA community
initiatives train individuals on potential impacts of climate change in their

community. EPA is often unable to assess individuals' actual knowledge gained at
the end of the training. Instead, a proxy measure to assess knowledge gained is
"number of individuals that attended the training."

Who Will Collect the Data

The measures tables included in this guide contain a column indicating if EPA
or the community partner should collect data over time. This assessment is
based on likely access to the data source. For example, community organizations
will generally be in the best position to collect data from community members.

You will need to determine what data collection capacity currently exists among
your community partners and what capacity may need to be built. This may
include putting systematic protocols in place or shaping and developing those
already in place.

EPA staff will also need to determine who, specifically, will be in charge of
overseeing data collection. It is recommended that EPA staff designate one
individual with oversight of implementing the performance measurement plan,
including overseeing data reported by partners as well as data to be collected by
EPA. This coordinator may need to work with other EPA staff, including
regional staff, to collect data within EPA's purview.

How the Data Will be Collected

Regardless of who collects the data, or the form in which the data are collected,
EPA staff should develop a standard template or form for measurement data
collection, and should provide clear written instructions to reporters, with
examples of how to use the template. Developing FAQs is also a good idea,
especially if the initiative is using complex measures or is relying mostly on
partner organizations to provide data.

Common options for collecting measurement data include:

• Online reporting. Online reporting has many advantages over other
forms of reporting because it is often the most user-friendly option for
both community organizations and EPA staff to report information.
Online reporting systems are also preferable for the performance
measurement coordinator because, among other benefits, the systems can
require users to supply needed data before submitting the form and
remind users of reporting responsibilities via email. Most importantly,
online systems also eliminate the cost, time, and potential error
associated with data entry. The EPA Brownfields program has used
online reporting for many years to simplify reporting and manage
information.

Several off-the-shelf, free or low-cost "form builder" options are available to facilitate online reporting; a sketch for compiling exported responses appears after these examples. Common examples include:

o Google Forms is a free basic form builder that includes nine different
question types (e.g., multiple choice, open text). Reported data are
automatically exported to Google Sheets, which can be easily exported
to MS Excel if desired. As a free service, Google Forms lacks features
of other form builders, including the ability to: apply multi-column or
tabular layouts; apply sophisticated "skip" or "conditional" logic to
change what question a respondent sees next based on multiple
responses; allow users to save partial entries and resume work later;
or track who participated in a data collection.

o Survey Monkey is a commonly used online survey software that
allows users to customize 15 different question types, track who
participated in a data collection, apply sophisticated skip logic, and
export results into a variety of software including MS Excel and
PowerPoint, SPSS, and PDF. Survey Monkey offers a free version,
but it has more limited features than paid versions that currently
range from $26-$85/month. Some EPA offices already have paid
Survey Monkey accounts that may be accessible to community-
based initiatives.

o Formstack is similar to Survey Monkey, but it offers a larger
number of question types, extensive layout design tools, data
encryption, automated calculations on the form (such as unit
conversions), and it offers a variety of plugins and extensions,
including for Facebook and Google Analytics. Pricing for Formstack
currently ranges from $39-$250/month.
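Whichever form builder is used, responses can typically be exported as a spreadsheet or CSV file for compilation. The Python sketch below reads an exported CSV and tallies two measures; the file name and column names are hypothetical.

```python
# Minimal sketch: compiling form-builder responses exported to CSV
# (e.g., from Google Sheets or Survey Monkey). File and column names
# are hypothetical.
import csv
from collections import Counter

with open("partner_reports_2016.csv", newline="", encoding="utf-8") as f:
    rows = list(csv.DictReader(f))

# Tally a yes/no measure and average a numeric measure across reporters.
agreement = Counter(r["clear_agreement_on_activities"].strip().lower() for r in rows)
trainings = [int(r["trainings_held"]) for r in rows if r["trainings_held"]]

print(f"Reporters: {len(rows)}")
print(f"Clear agreement on planned activities: {dict(agreement)}")
if trainings:
    print(f"Average trainings held per partner: {sum(trainings) / len(trainings):.1f}")
```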

Customized online reporting solutions are another option. Advantages
to a customized approach include the ability to pre-populate participant
forms with previously reported data (such as baseline reporting or
previous year reporting); automatic performance of complex calculations;
and the ability to provide instructions in a customized manner. In
addition, customized solutions have virtually no restrictions on the types of questions that can be asked or on form format. Customized online reporting
systems can also include administrative functions such as tracking
performance measurement work flow. These systems typically populate a
back-end database, as opposed to off-the-shelf solutions, which typically
provide raw data in spreadsheet form. The downside to customized online
reporting systems is that they require up-front investment of program
funds, and typically require hiring a contractor to build and maintain the
system. Moreover, systems housed on the EPA website require compliance

with Agency technology restrictions, and regular coordination with
managers of EPA's website and information management infrastructure.

•	Emailed MS Word or Excel forms. If an initiative cannot implement
online reporting due to resource or other limitations, an alternative
solution is to develop a reporting form in MS Word, distribute it over
email or through an EPA SharePoint page, and have individuals return
the completed form to the performance measurement coordinator.
Microsoft "form" templates and Developer tools can be used to guide users
to provide data in the correct boxes. The downside to this approach is
that it is more burdensome than online reporting for the reporter and for
initiative staff, and requires data entry of reported data into a data
management spreadsheet or database. A variant is to develop a reporting
form in MS Excel or Access instead of MS Word, which allows initiative
staff to embed any calculations and minimizes data entry. However, the
developer of an MS Excel or Access form needs advanced skills to develop
a reporting form with the same user-friendly look and feel as an MS Word
form.
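For the MS Word or Excel route, some of the data-entry burden can be scripted away. The sketch below combines completed Excel forms into one dataset; the folder, sheet, and column names are hypothetical, and it assumes the pandas and openpyxl packages are installed.

```python
# Minimal sketch: combining completed Excel reporting forms returned by
# email into one dataset. Folder and sheet names are hypothetical;
# requires the pandas and openpyxl packages.
from pathlib import Path
import pandas as pd

frames = []
for path in sorted(Path("returned_forms").glob("*.xlsx")):
    df = pd.read_excel(path, sheet_name="ReportingForm")
    df["source_file"] = path.name  # track which partner's form each row came from
    frames.append(df)

combined = pd.concat(frames, ignore_index=True)
combined.to_csv("combined_measure_data.csv", index=False)
print(f"Combined {len(frames)} forms into {len(combined)} rows")
```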

•	Grant reporting using the universal federal grant reporting form.

Federal form SF-PPR is a universal performance progress report form
that must be completed for grants of $100,000 or more per grant
period. Although agencies provide customized instructions, the form fields
are universal.3 The form has a narrative section where grantees are
required to enter information per agency instruction; agencies can require
performance information to be reported here, eliminating the need for an
additional reporting form. This is the approach that HUD used to collect
measurement data from recipients of Sustainable Communities Regional
Planning grants.4 However, the format of the narrative area is completely
unstructured, and thus not ideal for reporting performance data.
Alternatively, agencies can require grantees to attach forms, such as a
performance reporting form. Given the $100,000 grant threshold, this
reporting option is not applicable to most EPA community-based
initiatives. This form may be emailed or may be incorporated into an
online system. Using this form may address ICR issues; see discussion
below.

3	An example of the SF-PPR form is available at:
http://www.na.fs.fed.us/fap/SF-PPR Cover%20Sheet.pdf

4	HUD, Program Policy Guidance OSHC-2012-05, Semiannual Progress Reporting Requirements for
FY2011 OSHC Regional Planning and Community Challenge Grantees. Available at:
http://portal.hud.gov/hudportal/documents/huddoc?id=OSHC2012-05 RepReqFY2011.pdf

• Using social media in measurement activities. This document contains
some social media-related performance metrics, including the number of
people following an initiative's Twitter feed or Facebook page, and the
number of posts on social media pages. These metrics, as well as many
additional metrics of social media reach, are easy to track by applying Google
Analytics. Google Analytics is a free tool that tracks information on who is
visiting an initiative's website and social media sites, how long visitors are
staying, what visitors are doing on the site (e.g. download activity), where
the traffic is being referred from, and where visitors go after leaving the
site or social media page. As noted above, Formstack has plugins and
extensions for Google Analytics and for Facebook that may make data
collection more seamless.

EPA staff should test new reporting forms and systems with their target
audience, including community organizations and other EPA staff as applicable.
EPA staff should also check with potential community partners to verify the
data collection process is feasible. Regardless of the measures selected and
method employed, performance measurement coordinators should expect
inquiries from community organizations or other staff regarding reporting data,
and should be prepared to spend time fielding questions.

How the Data Will be Analyzed

Common performance analyses include comparison to baseline data and trend
assessment over time. The community-based initiative should consult with a
data analyst who will determine the analyses that are best suited to answer the
measurement questions of greatest importance to the initiative. The data
analysis should be conducted with clear caveats about what types of claims can
be made with the measurement strategy employed. It is very important that the
conclusions drawn from the measurement do not include claims that overreach
the type of data analysis conducted.
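As a simple illustration of the two analyses named above, the sketch below computes percent change from a baseline year and a least-squares trend over time. The annual values are hypothetical.

```python
# Minimal sketch: percent change from baseline and a simple linear trend.
# Annual values are hypothetical; 2012 is treated as the baseline year.

years = [2012, 2013, 2014, 2015, 2016]
acres_restored = [10, 14, 13, 18, 22]

baseline = acres_restored[0]
pct_change = 100 * (acres_restored[-1] - baseline) / baseline
print(f"Change from baseline: {pct_change:+.0f}%")  # +120%

# Ordinary least-squares slope as a rough trend indicator.
n = len(years)
mean_x, mean_y = sum(years) / n, sum(acres_restored) / n
slope = sum((x - mean_x) * (y - mean_y)
            for x, y in zip(years, acres_restored)) / sum((x - mean_x) ** 2 for x in years)
print(f"Trend: {slope:+.1f} acres per year")  # +2.8 acres per year
```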

How the Data Will be Stored/Managed

Community-based initiatives need a place to store and manage data. For small
initiatives with few measures, an Excel workbook may suffice. But for larger
ones with many partners and/or many measures, a database is a better tool for
data storage because it is more flexible and allows for easier querying of
available data, and thus easier data analysis. If the initiative uses online
reporting, data will automatically populate a back-end database. In cases where
EPA anticipates collecting and organizing large (often multi-year) data sets,

information may be stored through the Central Data Exchange (CDX), the
Agency's electronic reporting site.5
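For initiatives that outgrow a workbook but do not need CDX, a lightweight local database is one option. The sketch below uses SQLite from the Python standard library; the table and field names are hypothetical.

```python
# Minimal sketch: storing measurement data in a small SQLite database
# instead of an Excel workbook. Table and field names are hypothetical.
import sqlite3

con = sqlite3.connect("measures.db")
con.execute("""
    CREATE TABLE IF NOT EXISTS measure_data (
        partner     TEXT    NOT NULL,
        measure     TEXT    NOT NULL,
        report_year INTEGER NOT NULL,
        value       REAL,
        UNIQUE (partner, measure, report_year)
    )
""")
con.execute("INSERT OR REPLACE INTO measure_data VALUES (?, ?, ?, ?)",
            ("Community Org A", "trainings_held", 2016, 4))
con.commit()

# Querying scales better than filtering spreadsheets as the data set grows.
for row in con.execute("SELECT partner, value FROM measure_data "
                       "WHERE measure = 'trainings_held' AND report_year = 2016"):
    print(row)
con.close()
```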

Frequency of Reporting/Collection

EPA staff should develop a consistent measurement data collection schedule,
and communicate it to data reporters. Annual reporting is often the easiest
schedule to implement and communicate. However, EPA staff should carefully
calendar the schedule so that it comports with any internal or external reporting
deadlines. Also, if the initiative is using existing data to track progress over
time, staff will need to make sure that the initiative's data collection schedule
comports with the update schedule for those data. When there is flexibility,
initiative staff should consider pursuing the minimum amount of collection
needed to meet program management requirements.

How the Data Will be Communicated to Facilitate Use

Performance measurement data can be used by community-based programs in a
variety of ways. At a minimum, results are used to inform initiative staff about
progress towards goals, identify any areas where progress is not being made,
and start an internal conversation about barriers to progress and potential
solutions. Whether and how an initiative reports results externally depends on
the situation. Some programs may want to package and share some or all results
with a wider audience, such as other Regions within the Agency or its
community partners through an annual report or fact sheets. For example, EPA
MVD staff utilized the internal SharePoint Community Resource Network site to
share information about their successes and best practices. Some initiatives may
also share results with the public on EPA's website. In addition, if any of the
initiative's measures are included in the official EPA strategic planning and
annual reporting process, or feed into measures that are included in that
process, initiative staff will need to provide applicable results to others in the
Agency.

When determining how best to convey performance measurement data, always
consider the particular data needs of the audience, any format and data
visualization preferences, and best practices on data reporting as shown in the
literature or through consultations with someone trained in the subject matter.6
Effective data visualization helps communicate findings to maximize reader
engagement and comprehension of key information. Some select data visualization best practices include (a plotting sketch applying several of them follows the list):

5	Information on EPA's CDX is available on their website: https://cdx.epa.gov/

6	Additional resources on effectively presenting data are available in the Additional Resources section of
this report.

•	Alignment of the most important information in the top half (particularly
left-side) of a page and/or emphasized using color or size;

•	Use of graphics in combination with written text to convey information;

•	Simple graphics that eliminate gradation and textures as background;

•	Avoiding the use of pie charts to present more than two categories of data;

•	Visual theme and/or repetition of some graphic elements throughout a
document to build unity and memorability;

•	Black or dark gray color for narrative text to increase comprehension levels;

•	Selected use of color to emphasize important information; and

•	Avoidance of red-green and yellow-blue combinations to accommodate
difficulty that people with colorblindness have with these colors.7
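The matplotlib sketch below applies several of these practices: a bar chart rather than a multi-category pie chart, dark gray text, a single emphasis color, and minimal decoration. The data are hypothetical.

```python
# Minimal sketch: a chart applying several of the practices above --
# a bar chart instead of a multi-category pie chart, dark gray text,
# one emphasis color, and minimal chart decoration. Data are hypothetical.
import matplotlib.pyplot as plt

partners = ["Partner A", "Partner B", "Partner C", "Partner D"]
acres = [12, 30, 8, 17]

fig, ax = plt.subplots(figsize=(6, 3.5))
colors = ["#bbbbbb"] * len(partners)
colors[acres.index(max(acres))] = "#1f77b4"  # emphasize the key finding in blue
ax.bar(partners, acres, color=colors)

ax.set_title("Acres restored by partner, 2016", loc="left", color="#333333")
ax.set_ylabel("Acres", color="#333333")
ax.tick_params(colors="#333333")
for spine in ("top", "right"):  # simple graphics: drop unneeded borders
    ax.spines[spine].set_visible(False)

fig.tight_layout()
fig.savefig("acres_by_partner.png", dpi=150)
```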

Take care in conveying data about your particular initiative or set of initiatives. In
many cases, there may be multiple potential factors contributing to environmental
outcomes reported in association with a community-based initiative. In other words,
the program or initiative may not be the sole causal agent to which outcomes may be
attributed. In some cases, other public policies/programs may have more of an
influence than community-based efforts. In other cases, economic factors may come
into play (e.g., a reduction in air pollution from industrial sources due to a reduction
in economic activity). Where there is a question of what specifically causes a desired
outcome, EPA should carefully communicate environmental results. It is appropriate
to use language such as "the initiative contributed to (X environmental outcome)"
and/or to caveat the other factors that may be contributing to reported results, including other public policies/programs.

Information Collection Requests

An issue worthy of particular note is the Information Collection Request (ICR).
Federal agencies are restricted under the Paperwork Reduction Act from collecting
similar information from 10 or more non-federal persons or entities unless they
receive Office of Management and Budget approval to do so. The Paperwork
Reduction Act is intended to reduce the burden on the public from unnecessary,
poorly designed and duplicative requests for information from the federal
government. To seek approval, initiative staff must prepare an Information
Collection Request (ICR) and submit it to EPA's desk officer in the Office of
Environmental Information. Seeking ICR approval can be a lengthy process; it may

7 Evergreen, Stephanie D.H. (2014). Presenting Data Effectively: Communicating Your Findings for
Maximum Impact. Thousand Oaks, California: SAGE Publications, Inc.

take up to nine months for OMB to approve requests. Once granted, ICR clearances
are typically good for three years. EPA's Generic Customer Service ICR (described in
more detail in the Customer Satisfaction section of this document) may be applicable
to community-based measurement activities. More information on the ICR process
can be found on EPA's Intranet site.8

Establishing a Baseline or Comparison

As mentioned earlier in this framework, collecting baseline data is important
because it provides a frame of reference, and facilitates comparison of conditions
prior to the initiative to conditions after the initiative is implemented. EPA staff
should establish a baseline for all measures selected, including output and
outcome measures. In some cases, the baseline for measures may be equal to
zero, but this should not be assumed. For example, if an initiative that includes
job training is using the indicator "average starting wage of training
participants," the baseline is not zero, it is the average wage that participants
earned prior to receiving job training, which are data that must be collected from
participants.

Baselines can be constructed using a single year of data, or by using multiple
years of data. Initiatives should build baseline data collection into the process of
joining the initiative, by collecting data on measures as part of a grant or other
application, or immediately upon joining the initiative. In many situations, it is
preferable to use a multi-year baseline, as analyzing trends in reference to past
data as well as current data generally can provide for more robust measurement.
A multi-year baseline is particularly important in cases where single-year data
are spotty, or if external factors vary from year to year. A single-year baseline
should be used when data from previous years are no longer applicable to
current conditions. It should be noted that multi-year baselines require
additional data collection and analysis; hence, EPA staff will need to assess
feasibility of collecting these data.
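The sketch below contrasts a single-year baseline with a three-year average baseline, using hypothetical annual energy-use values; note how the choice of baseline changes the reported result.

```python
# Minimal sketch: single-year vs. multi-year baseline. Values are
# hypothetical annual energy use (MMBtu) before and after joining.

pre_initiative = {2013: 980, 2014: 1040, 2015: 930}
current_value = 870  # 2016, first year in the initiative

single_year_baseline = pre_initiative[2015]
multi_year_baseline = sum(pre_initiative.values()) / len(pre_initiative)

for label, base in [("single-year", single_year_baseline),
                    ("multi-year", multi_year_baseline)]:
    print(f"vs. {label} baseline: {100 * (current_value - base) / base:+.1f}%")
# vs. single-year baseline: -6.5%
# vs. multi-year baseline: -11.5%
```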

In many cases, EPA staff will collect some baseline data, and measurement data,
from partner organizations. EPA should clearly document data collection
expectations and communicate expectations to community organizations, and
include these expectations within formal agreements such as grant agreements
and MOUs. As an alternative to collecting baseline data from partner
organizations, in some cases, EPA may also be able to collect baseline data from
existing data sources, such as publicly available databases and reports.

If the community-based initiative is already established, it may not be feasible
to pursue a baseline. Consider identifying a "control" group or establishing other

8 The EPA Information Collection Request (ICR) Center website is: http://www.epa.gov/icr/

means of comparison to put program performance into context. For site-specific
community-based initiatives, a control group may be a set of similarly sized
communities where the initiative is not active.

Field Testing

Do not expect to get your measurement process right the first time out of the
box. A field test can save time, money, and relationships by allowing you to test and
correct the process before burdening your entire population with your information
request. By running through the entire process with a small group of volunteer
respondents or colleagues who provide a fresh pair of eyes, you will get critical feedback on what may not be working in the system you have designed. It is
important to test the communication aspects in addition to the data collection
pieces. By sharing a straw report with decision-makers and discussing the
implementation decisions they might make based on the data, you will ensure
that the information your system provides is right for your key audience(s) or
that necessary improvements can be made before the system is fully ramped up.

Resource Requirements

Resource requirements for implementing a performance measurement plan vary
depending on the sophistication of the approach and the size of the initiative (and
thus number of people reporting information). For small initiatives with a few
relatively straightforward performance measures, and fewer than 20-30 reporters, the
role of the performance measurement coordinator may require a small fraction of
one FTE per year. Initial staffing demands may be higher for selecting new
measures, developing new data collection forms, developing a data management
system (which may or may not include an online reporting component), developing
communications on performance measurement, and seeking initial ICR clearance. In
addition, contractual help may be required to develop more sophisticated
measurement approaches, or to develop a database or online reporting system.

Storytelling

Case studies and success stories are common examples of stories (or narratives) used
to communicate a program's processes, outcomes and impacts. Stories provide an
opportunity for EPA to engage with program participants and allow participants to
describe program processes and outcomes in their own words. Stories can be used to
develop information not readily available through more traditional, quantitative-
based performance measurement, or can serve to provide greater context and/or
interpretation of quantitative data. Unlike quantitative-based measurement, stories
are not confined to measuring certain parameters. Stories can therefore provide a
means by which to identify unintended consequences or unexpected outcomes of
program activities.

Stories can take a variety of different forms, from in-depth case studies to short
'vignettes' that focus on a specific program element, event or topic. Stories can be
collected through surveys, individual interviews or focus groups. Regardless of the
approach, stories should be collected as consistently and systematically as possible.
When collecting stories, consider the types of information that you need to build
meaningful stories; for example, stories should be recorded with a clear
understanding of who is telling the story and the timeframe under which the story
evolved. In some cases, it may be helpful to identify specific topics or issues for
which stories would generate valuable information. Are you interested in program
outcomes, strengths and/or weaknesses, or are you more interested in understanding
how a program (or specific program element) changed a participant's knowledge,
awareness, attitudes or behaviors? While open-ended questions are the most
effective approach for collecting stories, to the extent that you can create consistency
in the type of information that each story contains, it will be easier to identify
patterns and trends across a collection of stories.

When collecting stories, more meaningful information can be developed by collecting
multiple stories on the same topic. By their nature stories represent the perspective
of one individual at a single point in time. As such, while one or two stories may
provide interesting insights, a collection of stories on the same topic can be used to
identify patterns and trends that help initiative staff to evaluate program activities,
outcomes, successes and/or weaknesses. In most cases, when you listen to enough
stories on the same topic, a defined set of common themes will emerge. When you
reach this point, you will notice that stories from 'new' interviewees fall within a
known range of responses and/or experiences. At this point, staff may have greater
confidence that the stories they have collected represent a broad range of
experiences, rather than a small subset of perspectives or opinions.

EPA staff should also consider the timing of story collection. Asking program
participants to remember details is harder the further back in time you go. Similarly, depending on the longevity of the project, information gathered through stories may be more helpful if stories are gathered at multiple points over time.

Take care when communicating stories collected from a particular program or
initiative. Qualitative stories should be used to complement (not replace)
quantitative data. When paired with quantitative measures, qualitative stories can
communicate a more complete and rich 'story' of EPA's work in community-based
programs.

The next section of this document covers the seven topic areas of measurement
common to many EPA community-based programs.

1. Partnerships

Overview

Nearly all of EPA's community-based initiatives establish partnerships with
community organizations. EPA may also partner with businesses, universities,
government agencies, and/or NGOs. The complexion of these partnerships
varies widely, although EPA typically serves as a grantor, advisor, or provider of
technical assistance to some extent. The level of EPA partners' expertise can vary
from startup groups organized solely to partner with EPA on the initiative to large,
well-established community organizations.

Community-based initiatives conduct activities to identify the appropriate partners
to help reach identified goals, build relationships with them, and maintain and
improve the partnership over time. The community-based initiatives that establish
and maintain mutually-beneficial partnerships most effectively will see the greatest
contribution to outcomes from their partners, and will be able to collect data more
effectively to inform other measures.

Measuring Partnership Health

EPA needs measures to gauge the health and long-term
viability of partnerships in which EPA is investing. EPA
and partners should document planned activities and
divisions of responsibilities. Partners that engage
community members through formal meetings should
track the frequency of and attendance at these events.

Throughout the partnership process, EPA should assess
the sustainability and endurance of partnerships. EPA
may also want to assess the extent to which community
members are represented by partnering organizations in
the context of work that the organization is conducting
with EPA. Finally, EPA may want to measure the extent to which partners take
actions indicating a long-term commitment to the goals of the EPA community-based
initiative. (Partnership commitment example: the Community-Based Childhood Asthma Program tracks the number of schools using organized indoor air quality management practices consistent with the EPA Tools for Schools approach.) See the Partnership menu below for ideas and examples of applicable measures.

EPA may also consider analyzing and visualizing partnership data using social network analysis (SNA) tools. SNA can show network structure and information flows, and can indicate key actors or organizations in a community network. Information can often be collected through surveys of EPA staff and/or community members. Once
the information is gathered, data can be used to produce visual maps that illustrate
the presence and strength of relationships in the network, often at different points
in time (see the example from EPA's MVD initiative in Exhibit 1). Using specialized
software, higher-level analyses can also produce more detailed, complicated network
maps and quantitative measures to describe the network structure and strength.
SNA is useful when a program needs to assess network relationships or flows of
information in a manner that is comprehensive, quantitative, and/or relatively
consistent over time. More information on SNA is located in the Additional
Resources section of this report.
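As a minimal sketch of the quantitative side of SNA, the example below builds a small partnership network with the networkx Python package and ranks organizations by degree centrality. The organizations and interaction frequencies are hypothetical.

```python
# Minimal sketch: a small partnership network analyzed with networkx.
# Organizations and interaction frequencies (edge weights, interactions
# per month) are hypothetical.
import networkx as nx

G = nx.Graph()
G.add_edge("EPA Region 5", "City Health Dept", weight=4)
G.add_edge("EPA Region 5", "Local NGO", weight=2)
G.add_edge("City Health Dept", "Local NGO", weight=1)
G.add_edge("Local NGO", "Neighborhood Assoc.", weight=3)

# Degree centrality highlights the most-connected organizations.
for org, score in sorted(nx.degree_centrality(G).items(),
                         key=lambda item: item[1], reverse=True):
    print(f"{org}: {score:.2f}")
```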

Exhibit 1. EPA's external partners before the MVD initiative and after one year of implementing the initiative: number of partners and average frequency of interaction.

[Network diagram omitted. The legend distinguishes the number of connections to external partners before the start of the initiative, the average frequency of EPA's interaction with partners before the start of the initiative, the number of new connections to external partners after one year of implementing the initiative, and the average frequency of EPA's interaction with partners after one year of implementing the initiative. Nodes are labeled by partner type, such as NGO (local). Thicker lines indicate higher average frequency of interaction.]

Partnerships

Often EPA supports partnerships as a means of tackling complex environmental issues, including those in areas with disproportionate burdens. These measures gauge the health and long-term viability of partnerships in which EPA is investing. Brackets after each measure show the measure type (activity, output, or outcome) and the primary data collector (EPA or partners); "Examples" list measures previously or currently used by EPA community-based programs.

Identifying Partners
- Number of potential partners who express interest in the initiative [Output; EPA]
- Have contacts been made with all potential partners? (Yes/No) [Activity; EPA]

Building Relationships with Partners
- Is there clear agreement among partners about planned activities? (Yes/No) [Activity or Output; EPA]
- Is there clear agreement among partners about who is responsible for implementing planned activities? (Yes/No) [Activity or Output; EPA]
- Do all stakeholders have access to formal agreements (they have copies, know what the agreements are, or where to find them)? (Yes/No) [Activity or Output; EPA]
Examples: CARE: Number of CARE cooperative agreement projects managed in order to obtain toxic reductions at the local level

Maintaining and Improving Partnerships
- Frequency with which mission and goals are revisited (e.g., once per year) [Activity; EPA]
- Proportion of partners retained per year [Outcome; EPA]
- Increase in number of partner communications (number of emails, phone calls, etc.) [Outcome; EPA]
Examples: MVD: Frequency of communication with partners

Depth and Breadth of Community Partnerships
- Are the needs of engaged community stakeholders being addressed by the initiative? (Yes/No) [Outcome; EPA]
- Extent of representation among diverse partners (does this initiative have input from core types of organizations, e.g., funders, church-based organizations (CBOs), governmental and nongovernmental organizations, universities, etc.) [Output; Either]
- Is the initiative addressing all of the needs it was intended to? (Yes/No) [Outcome; EPA]
Examples: EJ Small Grants: Percent of groups (nonprofit, government, etc.) represented in partnerships; MVD: Number of partners by partner type

Partner Engagement
- Are there indicators that partners are engaged (are meetings with partners occurring regularly)? (Yes/No) [Output; EPA]
- Proportion of partners in attendance at formal and informal interactions [Output; EPA]
- Proportion of partners at a community forum [Output; EPA]
- Proportion of partners who voice their opinions and needs, and descriptions of those opinions and needs [Output; EPA]
- Proportion of and types of partners participating in meetings (e.g., ethnic, cultural, and geographic diversity) [Output; EPA]
- Proportion of partners asking for information/attending trainings [Output; EPA]
- Proportion of partners reporting increased awareness and understanding of initiative opportunities [Outcome; EPA]
- Proportion of partners reporting adoption of initiative goals [Outcome; EPA]
Examples: Superfund JTI: Number of individuals who attend training; Superfund JTI: Number of individuals who attend orientation; Superfund JTI: Number of individuals who attend tryouts to be accepted into the training program; Urban Waters: Number of people who attended a summit meeting that focused on flooding, industrial contaminants, bacteria and storm water, and reconnecting people to a river; CBEP: Partnerships developed with organizations outside of EPA to leverage resources and/or expertise

Partner Commitment
- Proportion of partners participating in the research effort and grant application process [Output; EPA]
- Proportion of partners at meetings over time (i.e., do partners continue to participate?) [Output; EPA]
- Dollar amount of funding contributions from partners [Output (or Resource); EPA]
Examples: MVD: Total and type of resources provided by EPA partners for initiative activities; CARE: Number of communities that, through CARE, implement local solutions to address an agreed-upon list of priority toxic and environmental concerns using the CARE partnership; CBEP: Number of joint projects among municipal, county, and state governments

Partnership Sustainability
- Number of years the initiative has been in operation [Outcome; EPA]
- Proportion of geographic regions the initiative has reached (e.g., states, counties) [Outcome; EPA]
- Diversity of individuals the initiative has reached [Outcome; Partners]
- Proportion of the target audience being reached [Outcome; EPA]
- Percent increase in the number of individuals or organizations involved from baseline [Output; EPA]
- Number and descriptions of new connections with other initiatives (e.g., relationships or resource sharing) [Output; EPA]
- Number and descriptions of additional projects and partners [Outcome; EPA]
Examples: MVD: Number of new communities EPA became involved in as part of the initiative; MVD: Increase in the number of organizations involved from baseline; MVD: Percent increase in EPA's relationships from baseline; CARE: Number and groups of residents reached with environmental health information

Leveraging Social Networks to Enhance the Partnership
- Number of people following initiative Twitter feed [Outcome; Depends upon Twitter account owner]
- Number of people included in the initiative's Facebook or other social media group [Outcome; Depends upon group owner]
- Number of posts from community members on initiative Facebook or other social media page [Outcome; Depends upon group owner]

Behavior Change
- Percent increase in the number of people or partners taking action to change workplace, school, or community processes or policies [Outcome; Either]
Examples: MVD: Number of new projects initiated from baseline; CARE: Percentage of partners who reported changing their behavior; CBEP: Number of participants in environmental volunteer activities

Reporting of Findings
- Number of partner findings reported in materials, websites, and messages (e.g., number of case studies, number of partners reporting findings in an online database) [Outcome; EPA]

Expanded Research Collaborations
- Proportion of partners who apply for additional funding over time [Outcome; EPA]
- Number of new partners who join the research project and/or partnership [Outcome; EPA]

Setting and Meeting Partnership Goals
- Did the initiative set goals for developing and maintaining partnerships? (Y/N) [Output; EPA]
- Proportion of partnership goals met [Outcome; EPA]

Resources Devoted to Partnerships
- Dollars spent developing and maintaining partnerships [Output; EPA]

Community Involvement in Research
- Proportion of partners who participate in collecting data [Output; EPA]
- Number of new organizations who become involved in research and outreach [Outcome; EPA]

Communication of Partnership Messages and Materials
- Number of tools utilized to enhance awareness and knowledge of research and environmental health risks (e.g., radio, television, live performances, websites, and paper materials for dissemination) [Activity; Either]
- Number of documents distributed (e.g., handout, presentation, fact sheet, case study, pamphlet, manual, video tape, slide show, CD-ROM, Web page, or computer program) [Activity; Depends upon document owner]
- Measures of tool use (e.g., number of website hits, number of pamphlets printed) [Outcome; Depends upon tool owner/developer]
- Number of downloads of initiative outputs [Outcome; Depends upon host site]
- Number of website hits [Outcome; Depends on who runs website]
- Number of other groups or initiatives that adopt initiative materials [Outcome; EPA]
- Number of discrete messages developed and used by others for radio, newspaper, pamphlets, and television [Outcome; EPA]
- Number of citations to initiative outputs [Outcome; EPA]
Examples: Community-Based Childhood Asthma Program: Number of schools newly using organized indoor air quality management practices consistent with EPA Tools for Schools; CARE: Number of green maps distributed; CARE: Number of workshops hosted for residents on specific issues


-------
2. Leveraging Resources

Overview

Leveraging resources is the process of using existing resources - including funding,
staff time, existing relationships, and communications - to grow and strengthen an
initiative. Common approaches used by EPA's community-based work to leverage
resources include: raising awareness of the work via social or conventional media,
identifying new funding sources, and increasing an initiative's network of partners.

The ability to successfully leverage resources is particularly important to EPA
community-based initiatives, which often have limited in-house resources and rely
on community groups and members to implement many programmatic aspects. With
these limited base resources, community-based initiatives are often expected to
achieve goals that would not be possible without expanding to involve community
partners and others with additional resources.

Measuring Leveraging Resources

The ultimate goal in measuring leveraged resources is to show how effectively an
initiative used its initial resources to maximize total additional resources or total
benefits. The first step is to determine the size and scope of in-house resources.

Next, staff might measure the leveraged activities conducted or the leveraged outputs generated. Typically, an initiative measures leverage by comparing a measure of effort or resources expended to a measure of outcome. The desired outcome is to show an overall increase in the initiative's resources or environmental impact as a result of its efforts. To take things a step further, staff may want to measure how efficiently EPA resources have been spent.

Leveraging resources is an important aspect of most community-based initiatives, but few initiatives take the same leveraging approach. There is no single approach to measuring how effectively initiatives leverage resources; each initiative should determine how best to measure its own leveraging approach.

See the Leveraging Resources menu on the next page for ideas and examples of applicable measures.

Leverage ratio example: Based on data from EPA regional staff, for every EPA dollar spent, external resources contributed an additional $2.80 to implement activities in MVD communities since the start of the initiative.
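As a concrete illustration of the comparison described above, a leverage ratio can be computed as leveraged (external) resources divided by EPA resources. The Python sketch below uses made-up dollar figures, not data from any initiative.

    # Minimal sketch: computing a leverage ratio (external resources
    # per EPA dollar spent). All figures are hypothetical.

    epa_spending = 250_000.00        # EPA funds spent on initiative activities
    external_resources = 700_000.00  # partner funding, in-kind support, etc.

    leverage_ratio = external_resources / epa_spending
    print(f"For every EPA dollar spent, external resources contributed "
          f"an additional ${leverage_ratio:.2f}.")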



-------
Leveraging Resources

Leveraging resources is using available resources to leverage additional resources, increasing the benefit of an investment. Brackets after each measure show the measure type (activity, output, outcome, or resource) and the primary data collector (EPA or partners); "Examples" list measures previously or currently used by EPA community-based programs.

Cost-Effectiveness
- Payback period [Outcome; EPA]
- Return on investment [Outcome; EPA]
- Ratio of initiative funding to key outcomes (e.g., jobs created or energy conserved) [Outcome; EPA]

Raising Awareness and Interest Among Community Partners
- Number of individuals and organizations who collaborate for the first time to accomplish a common goal [Output; EPA]
- Number of repeat collaborations between partners [Output or Outcome; EPA]
- Number of new volunteers after efforts to increase awareness and interest [Output; EPA]
- Number of news stories referencing initiative [Outcome; Either]

Broader Reach
- Number of community members that are positively impacted by the results of the initiative [Outcome; Partners]
- Number and types of policies or regulations that can be or have been influenced by the initiative [Outcome; EPA]
Examples: Urban Waters: Have city ordinances been passed to further initiative goals? (Yes/No)

Financial Viability
- Growth in initiative funding [Outcome; EPA]
- Does the initiative have adequate resources to implement future planned activities? (Yes/No) [Outcome; EPA]
Examples: Superfund JTI: Percentage of trainees from the initiative who have been placed into jobs and maintained employment for at least one year

Setting and Reaching Goals for Leveraging Resources
- Did the initiative set goals for leveraging resources? (Y/N) [Output; EPA]
- Proportion of leveraging resources goals met [Outcome; EPA]

Leveraging Infrastructure and Money
- Funds obtained as investment in the initiative (dollars) [Output; EPA]
- Number of grants awarded [Output; EPA]
- Total amount awarded in grants [Output; EPA]
- Leverage ratio: ratio of EPA initiative funding to leveraged funding (e.g., ratio of grant money to match money) [Outcome; EPA]
- Number of Full Time Employees (FTEs) [Resource; EPA]
Examples: Brownfields: Billions of dollars of cleanup and redevelopment funds leveraged at Brownfields sites; MVD: Ratio of EPA resources to external resources; CARE: Amount of funding provided through grants; CARE: Number of communities who received funding; EJ Small Grants: Number of dollars awarded in grants; EJ Small Grants: Number of grant recipients; CBEP: Financial resources (e.g., grants, contracts, travel) directed toward CBEP activities

Leveraging People (Human Capital)
- Number of new people contacted in leveraging efforts [Activity; EPA]
- Number of new people brought into the initiative [Activity; EPA]
- Number and types of formal advisory board activities conducted to leverage relationships, ideas, and knowledge [Activity; EPA]
Examples: CBEP: Resources and expertise leveraged through established partnerships with organizations outside EPA

Leveraging Social Networks to Seek Additional Funding
- Number of posts to initiative Facebook page [Output; Depends upon page owner]
- Number of posts to initiative Twitter page [Output; Depends upon page owner]


-------
3. Education and Training

Overview

One focus of some of EPA's community-based initiatives is to enable community
members to have a larger role in the implementation of environmental initiatives in
their community. One such role is for community members to obtain
gainful employment in association with environmental initiatives. It is important for
a training initiative to demonstrate that it benefits trainees and the local economy.
Measuring the results of education and training efforts is typically more
straightforward than measuring other types of impacts, as the most common way of
measuring impact is to look at measures of gainful employment of trainees over
time. However, EPA must rely heavily on its community-based partners to collect
education and training outcome information, which can pose coordination
challenges.

Measuring Education and Training

The goal in measuring education and training is to show how well the training initiative directly enables local community members to become effective environmental advocates, employees, and community leaders. One area of measurement for education and training is to assess the appropriateness of training materials for target audiences. Typically, initiatives track how the reach of their training grows over time. Growth metrics demonstrate that the training is well received and sustainable. Finally, training initiatives should measure how effectively they have directly improved employment outcomes among trainees.

See the Education and Training menu on the next page for ideas and examples of applicable measures.

Training Effectiveness Example Measures: The Superfund Job Training Initiative tracks the number of trainees who have completed training, the percentage of trainees that have been placed into jobs, and the percentage of trainees retained in jobs for over one year.
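When partners supply trainee-level records, placement and retention percentages like those above reduce to simple proportions. The Python sketch below is a hypothetical illustration; the records and field names are invented, not an EPA data format.

    # Minimal sketch: placement and one-year retention rates from
    # hypothetical trainee records supplied by a training partner.

    trainees = [
        {"id": 1, "completed": True,  "placed": True,  "retained_1yr": True},
        {"id": 2, "completed": True,  "placed": True,  "retained_1yr": False},
        {"id": 3, "completed": True,  "placed": False, "retained_1yr": False},
        {"id": 4, "completed": False, "placed": False, "retained_1yr": False},
    ]

    completed = [t for t in trainees if t["completed"]]
    placed = [t for t in completed if t["placed"]]
    retained = [t for t in placed if t["retained_1yr"]]

    print(f"Completed training: {len(completed)}")
    print(f"Placed in jobs: {100 * len(placed) / len(completed):.0f}% of completers")
    print(f"Retained one year: {100 * len(retained) / len(placed):.0f}% of placed")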



-------
Education and Training

These metrics address training community members to become effective environmental advocates, employees, and community leaders. Brackets after each measure show the measure type (activity, output, or outcome) and the primary data collector (EPA or partners); "Examples" list measures previously or currently used by EPA community-based programs.

Measuring Effectiveness of Education and Training
- Percent of trainees with passing scores on quizzes or tests pre and post training [Outcome; Partners]
- Percent of trainees that provide satisfactory ratings of education or training initiative [Outcome; Partners]
- Number of job training initiative participants hired [Outcome; Partners]
- Number of job training initiative participants retained for at least 1 year [Outcome; Partners]
- Average starting wage of training graduates [Outcome; Partners]
- Percent increase in post-training test scores over time [Outcome; Partners]
- Percent increase in job training initiative participants hired [Outcome; Partners]
- Percent increase in job training initiative participants retained for at least 1 year [Outcome; Partners]
Examples: Superfund JTI: Number of trainees who have completed training; Superfund JTI: Percentage of trainees who have been placed in jobs; Superfund JTI: Percentage of trainees retained in jobs for over 1 year; Brownfields Workforce Development and Job Training: Number of participants that have completed EPA-funded training; Brownfields Workforce Development and Job Training: Number of training graduates that have obtained employment after training; Brownfields Workforce Development and Job Training: Average salary of training graduates; CBEP: Number of users of selected CBEP training tools

Reach of Training Impacts
- Number of follow-up training events [Activity; Partners]
- Number of follow-up materials to participants [Activity; Partners]
- Number of community employers that support initiative through reimbursement or credit [Outcome; Partners]
- Percent increase in attendance at trainings over time [Output or Outcome; Partners]
- Percent increase in training/education material downloads over time [Output or Outcome; Partners]

Training Materials Appropriate for Particular Audience in Community
- Did the initiative gather data on the target audience (language, literacy, education levels, etc.) that might influence beliefs and values? (Y/N) [Activity; Partners]
- Percent of community members in the target audience involved in the development process [Activity; Partners]
- Did the initiative identify the preferred training methods of target audience? (Y/N) [Activity; Partners]
- Number of website hits or training material downloads originating from the target geographic area [Output; Depends on who runs website]
- Percent of training participants that are residents of the target geographic area [Output; Partners]
- Percent of training participants from the target age group [Output; Partners]
- Percent of trainees that fall into the target income group [Output; Partners]
- Number of languages in which initiative materials are translated [Output; EPA]
Examples: Superfund JTI: Number of individuals that attend orientation; Superfund JTI: Number of individuals that attend tryouts to be accepted into the training initiative; Superfund JTI: Number of individuals that attend training; EJ Small Grants: Percent of students graduating from training initiative of Native Hawaiian descent; EJ Showcase Communities: Number of inner-city youths trained in stormwater management

Setting and Reaching Goals for Education and Training
- Did the initiative set goals for education and training? (Y/N) [Output; EPA]
- Proportion of education and training goals met [Outcome; EPA]

Resources Devoted to Education and Training
- Dollars spent on developing education and training initiative [Output; EPA]


-------
4. Capacity Building

Overview

EPA community-based initiatives are designed to improve the ability of communities
to achieve initiative goals. Building capacity in communities encompasses
empowering community members to become effective advocates for community
needs, increasing organizational capacity of partner organizations, and improving
the community's physical and communication infrastructure. By building capacity,
EPA helps to ensure that progress made during community-based projects can be
sustained even after EPA's role in projects is reduced or ends. Moreover,
communities with sufficient capacity to implement more advanced initiative
activities are better partners and add value to initiative outcomes. Most
importantly, capacity building is designed to impart skills and knowledge that
community organizations and members can apply to many contexts, not just the
context of a specific EPA-community project or initiative.

Measuring Capacity Building

The goal in measuring capacity building is to assess the extent to which community
organizations funded by EPA, and the members served by these organizations,
develop knowledge, skills, and confidence to take a larger role in environmental
initiatives.

Staff working with community-based initiatives should decide whether their capacity
building efforts will focus on increasing organizational capacity, physical
infrastructure, individuals' knowledge, or some combination of all three. After staff
identify the areas in which they expect to build capacity, they should adopt a
strategy to collect data about communities' baseline levels of capacity so that
changes in capacity can be gauged over time.
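Once baseline data are in hand, change in a capacity measure is typically reported as a difference or percent change from the baseline value. The Python sketch below is a minimal illustration with hypothetical counts:

    # Minimal sketch: percent change from baseline for a capacity
    # measure (hypothetical count of community members in project
    # leadership roles).

    baseline_count = 8   # measured before the initiative began
    current_count = 14   # measured after two years of implementation

    change = current_count - baseline_count
    pct_change = 100 * change / baseline_count
    print(f"Change from baseline: +{change} members ({pct_change:.0f}% increase)")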

See the Capacity Building menu on the next page for ideas and examples of
applicable measures.



-------
Capacity Building

These metrics address developing the knowledge, skills, and confidence of community organizations funded by EPA, and the members served by these organizations. Brackets after each measure show the measure type (activity, output, or outcome) and the primary data collector (EPA or partners); "Examples" list measures previously or currently used by EPA community-based programs.

Empowering Partners
- Increase in percent of community members within affected communities who speak to government leaders about environmental issues from baseline [Outcome; Partners]
- Increase in percent of community members within affected communities who have sought community leadership positions or have run for local office [Outcome; Partners]
- Increase in percent of community members within affected communities who speak at conferences or other public venues about environmental issues [Outcome; Partners]
- Increase in the number of community members occupying project leadership roles from the baseline [Outcome; Partners]
- Number of community members who received leadership training [Output; Partners]

Increasing Organizational Capacity Among Community Partners
- New research project, support group, or enforcement committee established by initiative partners [Outcome; Partners]
- Proportion of partners involved that have developed bylaws [Output; EPA]
- Proportion of partners involved that have developed a voting process [Output; EPA]
- Proportion of partners involved that have developed conflict management procedures [Output; EPA]
- Proportion of partners involved that have developed capacity-building goals [Output; EPA]
Examples: CBEP: Membership in environmental/conservation/wildlife organizations; CBEP: Number of public/private partnership efforts to protect the environment

Improving Community Group Physical and Communication Infrastructure
- Proportion of partner organizations with access to adequate space and any other necessary physical structures [Output; EPA]
- Proportion of partner organizations with a website [Output; EPA]
- Proportion of partner organizations with a listserv [Output; EPA]
- Proportion of partner organizations with a social media presence (e.g., Facebook page, Twitter feed) [Output; EPA]

Setting and Reaching Goals for Capacity Building
- Did the initiative set goals for capacity building? (Y/N) [Output; EPA]
- Proportion of capacity building goals met [Outcome; EPA]

Resources Devoted to Capacity Building
- Dollars spent on capacity building [Output; EPA]


-------
5. Customer Satisfaction

Overview

A key goal of EPA's community-based initiatives is to ensure that community
partners and members have a trusting relationship with EPA and are satisfied with
the services that EPA is providing through its programming. Community partners
that are satisfied with EPA's role in the partnership are more likely to maintain a
high level of engagement in initiative activities. Additionally, communities with a
good relationship with EPA are more likely to broadly work in collaboration with
EPA regarding environmental issues in their community.

Measuring satisfaction provides EPA with an important indication of the
community's current attitude toward EPA. Furthermore, it provides a feedback loop
through which EPA can identify areas of community concern and adjust initiative
practices to strengthen its relationship with its community partners.

Measuring Customer Satisfaction

The goal in measuring customer satisfaction is to
assess how effectively EPA or its contractors carried
out their initiative responsibilities in the eyes of
community partners. While surveys are the most
direct way to measure customer satisfaction,
customer satisfaction can also be measured by
collecting information on the level of engagement and
participation by community members. See the
Customer Satisfaction menu on the next page for
ideas and examples of applicable measures.

EPA maintains a Generic Customer Service ICR that
covers data collection of customer service information.

OMB agrees that it is impractical to go through the
entire ICR process for every such collection. Under
EPA's generic ICR, this class of survey is pre-
approved and has a simplified, expedited OMB review
process for individual requests. To use the Customer
Service ICR, see the overview posted on EPA's Intranet.9

Customer Satisfaction Survey Question Example:

How satisfied were you with the contractor's ability to explain EPA's remedy selection at the site?
(1) Very Dissatisfied
(2) Dissatisfied
(3) Somewhat Satisfied
(4) Satisfied
(5) Very Satisfied

9 Overview of the Generic Customer Satisfaction Survey ICR and How-To Guide, available at: http://intranet.epa.gov/icrintra/guidance.html



-------
When developing surveys to measure customer satisfaction, initiative staff should consider the feasibility of collecting honest feedback on a community's satisfaction with EPA's activities. Anonymous surveys can help produce a higher response rate and more honest feedback. Anonymity can be strengthened further by conducting national surveys rather than community-level surveys.
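Results from a scale like the one shown above are usually summarized as the share of respondents at or above a satisfaction threshold. The Python sketch below uses invented responses solely to illustrate the calculation:

    # Minimal sketch: summarizing hypothetical responses on the 1-5
    # satisfaction scale shown above (4 = Satisfied, 5 = Very Satisfied).

    responses = [5, 4, 4, 3, 5, 2, 4, 5, 3, 4]  # hypothetical survey data

    satisfied = sum(1 for r in responses if r >= 4)
    share = 100 * satisfied / len(responses)
    print(f"{share:.0f}% of {len(responses)} respondents were "
          f"Satisfied or Very Satisfied")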



-------
Customer Satisfaction

These metrics assess communities' satisfaction with EPA's assistance. Brackets after each measure show the measure type (activity, output, or outcome) and the primary data collector (EPA or partners); "Examples" list measures previously or currently used by EPA community-based programs.

Satisfaction with EPA Services and Decision-Making
- Percent of partner organizations and/or community members surveyed about customer satisfaction in the last 2 years [Output; EPA]
- Percent of partner organizations responding to survey that state they are "highly satisfied" with assistance received, training, or project outcomes [Outcome; EPA]
- Percent increase of community members responding to a survey that state they are "highly satisfied" with assistance received, training, or project outcomes from a baseline survey [Outcome; EPA]
- Percent of community members responding to survey that state they are "highly satisfied" with the frequency and consistency of information dissemination [Outcome; EPA]
- Percent increase of community members responding to survey that state they are "highly satisfied" with the frequency and consistency of information dissemination compared to a baseline survey [Outcome; EPA]
Examples: Superfund TAGs collects satisfaction data but does not report these data as part of their measurement process; other initiatives may do the same. CBEP: Customer satisfaction with EPA tools and information systems

Setting and Reaching Goals for Customer Satisfaction
- Did the initiative set goals for customer satisfaction? (Y/N) [Output; EPA]
- Proportion of customer satisfaction goals met [Outcome; EPA]

Resources Devoted to Customer Satisfaction
- Dollars spent on measuring customer satisfaction [Outcome; EPA]

Community Engagement and Participation
- Increased percent of community members or target audience that attend public meetings from baseline [Output; Partners]
- Increased percent of community members or target audience that sign up for a listserv [Output; Partners]
- Increased number of community organizations that are initiative partners from year 1 [Output; EPA]


-------
6. Environmental Outcomes

Overview

All EPA initiatives are designed with the core goal of protecting human health and
the environment. An initiative's ability to accurately measure and report human
health and environmental outcomes can be very helpful in bolstering support for the
work from both EPA and outside entities. Initiatives that can demonstrate their
environmental impact are more likely to receive sustained funding and encourage
confidence in EPA from its community partners. As previously mentioned,
demonstrating definitive causal impact of an initiative on long-term environmental
outcomes can rarely be achieved outside of controlled experimental demonstrations.
With a strong program theory and measurement system in place, an initiative often
can plausibly establish that it contributes to the long-term environmental outcomes
that it purports to change. To be able to demonstrate claims of contribution, it is
advisable to track long-term environmental indicators that you can correlate with
your program's activities over time.

Measuring Environmental Outcomes



The goal in measuring environmental outcomes is to characterize the environmental benefits related to initiative efforts. See the Environmental Outcomes menu on the next page for ideas and examples of potential measures.

Environmental Outcome Example: In fiscal year 2015, the Superfund Redevelopment Initiative declared 45 sites as "Sitewide Ready for Anticipated Use."
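Several of the energy and climate measures in the menu are unit conversions from activity data. As one hedged illustration, avoided CO2 emissions are often estimated by multiplying electricity conserved by a grid emissions factor; the factor in the Python sketch below is a placeholder, not an EPA-published value.

    # Minimal sketch: estimating avoided CO2 emissions from electricity
    # savings. The emissions factor is a placeholder; an actual analysis
    # should use a published factor for the relevant grid region and year.

    kwh_saved = 120_000      # hypothetical kilowatt-hours conserved
    kg_co2_per_kwh = 0.45    # placeholder grid emissions factor (kg CO2/kWh)

    tons_co2_avoided = kwh_saved * kg_co2_per_kwh / 1000  # metric tons
    print(f"Estimated avoided emissions: {tons_co2_avoided:.1f} metric tons CO2")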



-------
Environmental Outcomes

Measurable environmental benefits associated with initiative activities. For all of the categories below, partners will typically collect and report environmental outcome data, using instructions and/or a standard template provided by EPA. Each category notes the corresponding goal in the EPA Strategic Plan; "Examples" list measures being used by EPA community-based initiatives.

Energy Conservation (Goal 1: Addressing Climate Change and Improving Air Quality)
- Transportation energy conservation (kWh/MWh, gallons, cubic feet)
- Number of kilowatt-hours of electricity conserved
- Reduction in number of gallons of oil used
- Reduction in number of therms of natural gas used
- Total energy conservation (kWh/MWh or Btu/MMBtu)
Examples: Sustainable Communities: City fleet, gas mileage; Sustainable Communities: Fuel consumption/purchase; Sustainable Communities: Residential energy use; CARE: Cost of fuel saved; CARE: Kilowatt-hours of electricity saved per year

Renewable Energy Development (Goal 1)
- Kilowatt-hours, therms, or MMBtu of renewable energy generated (solar, wind, geothermal, low-impact hydro, biomass)
Examples: RE-Powering America's Land: Renewable energy generated on each site; CARE: Number of households that made a renewable energy purchase commitment; CARE: Percentage of renewable energy purchased by a city

GHG Reduction (Goal 1)
- Percentage change in vehicle miles traveled per capita
- Tons of CO2 emissions reduced or avoided
Examples: Climate Showcase Communities: Expected GHG reductions (metric tons CO2e annually or total); Climate Showcase Communities: Actual GHG reductions (metric tons CO2e annually or total); U.S.-Mexico Border 2020 Program: Actual and potential greenhouse gas emissions reductions from global methane initiative projects in the border region

Air Quality (Goal 1)
- Days in the past year with Air Quality Index (AQI) in the good range
- Hospitalization for asthma per 10,000 residents
- Reduction in pounds of toxic air emissions (VOCs, nitrogen oxides, sulfur oxides, carbon dioxide, PM, etc.)
Examples: School Monitoring Initiative: Air quality monitoring data (VOCs and carbonyls in ppbv, etc.); U.S.-Mexico Border 2020 Program: Number of days exceeding air quality standards in border monitoring areas; CARE: Tons of nitrogen oxide, particulate matter, and/or carbon dioxide reduced through anti-idling zones; CARE: Percent reduction in fugitive air emissions from a local coal distributor; CARE: Percent reduction in particulate emissions inside buses

Water Conservation (Goal 2: Protecting America's Waters)
- Acres of impervious surface reduced in targeted geographic area
- Square meters of impervious surface replaced with pervious surface
- Change in residential water consumption efficiency (gallons per person per year)
- Reduction in number of gallons of water used (per household)
Examples: Specific MVD community: Gallons of storm water retention capacity of green infrastructure

Water Quality (Goal 2)
- Percent decrease in fecal coliform found in the watershed
- Soil erosion: suspended solids (TSS in mg/L), turbidity (FTU, NTU, etc.)
- Change in percentage of assessed rivers and streams that do not meet state and federal water quality standards
- Decrease in the concentration of a particular toxin in the water supply
- Decrease in pounds of pollutant discharged in targeted geographic area (for example, BOD, COD, toxics, nutrients, TSS, contaminants in storm water, and pathogens; includes discharges to sewer systems, septic systems, injection wells, ground water, etc.)
Examples: Number and percent of schools and childcare centers that meet all health-based drinking water standards (ACS measure; may or may not be affiliated with the Lead in Drinking Water in Schools and Child Care Facilities program); U.S.-Mexico Border 2020 Program: Percent of Mexico border beach sampling events above enterococcus standard; Urban Waters: Number of gallons of sewage stopped per day from being discharged to the watershed

Land Restoration (Goal 3: Cleaning Up Communities and Advancing Sustainable Development)
- Number of acres of unusable land converted to usable land
- Number of acres of unusable land converted to renewable energy development
- Decrease in the concentration of a particular toxin in soil
- Number of acres of developed land converted to open space
Examples: Brownfields: Number of properties cleaned up using Brownfields funding; Brownfields: Number of acres of brownfields property made ready for reuse; Superfund Redevelopment Initiative: Number of Superfund sites ready for anticipated use sitewide; Five Star Restoration Grants Program: Number of acres restored and improved under the 5-Star, NEP, 319, and great water body programs (cumulative)

Land Preservation (Goal 3)
- Percentage of land preserved as open space
- Number of acres of farmland
Examples: Partnership for Sustainable Communities: Acres of parks and protected space per capita; Sustainable Communities: Growth in previously developed areas

Waste Minimization (Goal 4: Ensuring the Safety of Chemicals and Preventing Pollution)
- Pounds/tons/cubic ft. of wastes reduced
- Change in pounds/tons/cubic ft. of waste recycled
Examples: U.S.-Mexico Border 2020 Program: Percent adequate solid waste disposal in Mexico's 30 km border zone; U.S.-Mexico Border 2020 Program: Number of scrap tires removed during cleanup at two of the largest selected tire piles in the border region; CARE: Tons of e-waste collected and disposed of properly; CARE: Gallons of food waste diverted from landfill

Toxics Use Reduction (Goal 4)
- Pounds/tons of a target toxin reduced
Examples: U.S.-Mexico Border 2020 Program: Amount of pesticides used in U.S. border counties: California and Arizona; U.S.-Mexico Border 2020 Program: Total toxic releases from reporting facilities in the border region; CARE: Pounds of hazardous chemicals removed from local schools


-------
7. Economic and Quality of Life Outcomes

Overview

All EPA initiatives are designed with the core goal of protecting human health and
the environment. In addition, some EPA initiatives are also explicitly designed to
improve local economies and enhance quality of life, especially in communities that
have been disproportionately affected by environmental burden. Community-based
initiatives that are designed to clean up and reuse contaminated sites, or draw
development into existing areas, are often associated with job creation and economic
impacts during construction, as well as long-term economic benefits. In addition,
some community-based initiatives aim to enhance community quality of life by
improving access to transportation options, essential services (e.g., grocery stores,
healthcare), or recreational space and opportunities.

Some community-based initiatives or projects may confer multiple economic and
quality of life benefits. For example, a project to redevelop an abandoned facility into
a park would create local jobs during the redevelopment process. Once developed,
the new park has the potential to improve the quality of life of residents within
walking distance of the park, and may improve the economic competitiveness of local
businesses around the park.

Measuring Economic and Quality of Life
Outcomes

The goal in measuring economic and quality of
life outcomes is to capture benefits that are
additional to those related to human health and
the environment. See the Economic and Quality
of Life Outcomes menu on the next page for
ideas and examples of applicable measures.

Example economic outcome measure: As of July 2016, nearly 109,000 jobs have been leveraged through the Brownfields Program since its inception.
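One recurring measure in the menu that follows, combined housing and transportation (H+T) costs as a proportion of area median income, is a straightforward ratio. The Python sketch below uses hypothetical dollar figures:

    # Minimal sketch: combined housing + transportation (H+T) costs as
    # a share of area median income. All figures are hypothetical.

    annual_housing_cost = 15_600    # hypothetical median housing cost
    annual_transport_cost = 9_400   # hypothetical median transportation cost
    area_median_income = 52_000     # hypothetical area median income

    ht_share = 100 * (annual_housing_cost + annual_transport_cost) / area_median_income
    print(f"H+T costs consume {ht_share:.0f}% of area median income")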



-------
Economic and Quality of Life Outcomes

Measurable economic and quality of life benefits associated with initiative activities. Partners will collect and report this information, but may need instructions and/or a standard template from EPA. "Examples" list measures previously or currently used by EPA community-based programs.

Economic Benefits
- Number of jobs leveraged, created, or retained at a target site
- Number of jobs leveraged, created, or retained by the initiative
- Economic output per unit of energy consumption
- Housing and transportation costs as a proportion of area median income
- Revenues created by local businesses that inhabited a redeveloped site or an area targeted by the initiative
- Percent employment in locally owned and operated businesses
Examples: Brownfields: Jobs leveraged from Brownfields activities; Partnership for Sustainable Communities: Combined housing + transportation costs as a proportion of area median income (derived from the H+T Affordability Index)

Quality of Life Benefits
- Number of new residences constructed in targeted geographic area
- Total percentage of people commuting via walking, biking, or transit
- Increase in miles of road with bike lanes
- Change in average wait time for bus (minutes)
- Change in average wait time for train (minutes)
- Increase in housing (number of units) for low income, medium income, and high income residents
- Net acres of agricultural and natural resource land lost annually to development per new resident
- Percent of population that is low income and does not live close to a supermarket or large grocery store
- Amount spent on infrastructure repair relative to the amount of infrastructure in need of repair or replacement
- Percent of new housing units built in previously developed space
- Percent of population within walking distance of a park or open space
- Density of environmental hazards
- Change in park and recreation space (acres) per capita (1,000 people) within a 1/2 mile radius
- Change in percent of population within walking distance of public transportation
Examples: Partnership for Sustainable Communities: Percent of household income spent on housing and transportation costs; Partnership for Sustainable Communities: Percent of total regional population that reside in a low income census tract AND reside more than one mile from a supermarket/large grocery store (for rural census tracts, the distance is more than 10 miles); Partnership for Sustainable Communities: Percent of population that reside within 1/2 mile of a park or open space; CBEP: Percent of commuters living within 30 minutes of work; CBEP: Ratio of energy extracted to renewable resource amount generated; CBEP: Percent of population within 1/2 mile of green/open space


-------
Additional Resources

Program Evaluation and Measurement

American Evaluation Association www.eval.org

Bamberger, Michael, Jim Rugh, and Linda Mabry. (2012). RealWorld Evaluation: Working Under Budget, Time, Data, and Political Constraints. Thousand Oaks, California: SAGE Publications, Inc.

Mayne, John. (1999). Addressing Attribution Through Contribution Analysis: Using Performance Measures Sensibly. Office of the Auditor General of Canada.

Metzenbaum, Shelley and Watkins, Allison. (2007). A Memo on Measurement for Environmental Managers: Recommendation and Reference Manual. Environmental Compliance Consortium. http://www.environmentalevaluators.net/wp-content/uploads/2011/01/Elements-of-EE-Logic-Model-Table-Format-v5.pdf

U.S. EPA. (2016). Evaluating EPA's Programs. https://www.epa.gov/evaluate.

U.S. Government Accountability Office. (2012). Designing Evaluations: 2012
Revision. GAO-12-208G.

Data Visualization

Evergreen, Stephanie D.H. (2014). Presenting Data Effectively: Communicating Your
Findings for Maximum Impact. Thousand Oaks, California: SAGE Publications, Inc.

Tableau Software. (undated). Visual Analysis Best Practices: Simple Techniques for Making Every Data Visualization Useful and Beautiful. Available at: http://www.tableau.com/

Tufte, Edward R. (2015). The Visual Display of Quantitative Information. Second Edition. Cheshire, Connecticut: Graphics Press, LLC.

ZingChart Blog: Your Guide To Visualizing Data. (2016). https://blog.zingchart.com/



-------
Social Network Analysis

Breiger, Ronald L. (2004). "The Analysis of Social Networks." Pp. 505-526 in Handbook of Data Analysis, edited by Melissa Hardy and Alan Bryman. London: SAGE Publications.

Gephi. (2016). The Open Graph Viz Platform. https://gephi.org/

Hanneman, Robert A. and Mark Riddle. (2005). Introduction to social network
methods. Riverside, California: University of California, Riverside.

NodeXL. (2016). Network Graphs: Network Overview, Discovery and Exploration for Excel. http://nodexl.codeplex.com/

Scott, John. (2013). Social Network Analysis. London: SAGE Publications, Inc.
