United States Environmental Protection Agency
Office of Enforcement and Compliance Assurance (MC 2222 A)
EPA 300-B-99-002
March 1999
www.epa.gov/oeca

Guide for Measuring Compliance Assistance Outcomes
Printed on paper that contains at least 30 percent postconsumer fiber.

Table of Contents                                                                Page i

                              TABLE OF CONTENTS
SECTION I: INTRODUCTION ......................................................... 1
      A. Background ............................................................. 1
      B. Purpose of Guide ....................................................... 2
      C. Overview of Guide ...................................................... 2
SECTION II: COMPLIANCE ASSISTANCE ACTIVITIES AND OUTCOMES ....................... 4
      A. Compliance Assistance Activities ....................................... 4
      B. Measuring Results of Compliance Assistance ............................. 4
             Output Measures .................................................... 5
             Outcome Measures ................................................... 5
             Environmental and Human Health Impact Indicators ................... 6
      C. Continuum of Output and Outcome Measures ............................... 7
SECTION III: PLANNING AN EVALUATION ............................................. 9
      A. What Is the Goal of Your Compliance Assistance Project? ................ 9
      B. What Is the Purpose and Scope of Your Evaluation? ..................... 10
             Anecdotal Assessments ............................................. 11
             Statistical Studies ............................................... 11
             Specifying the Target Population .................................. 12
      C. What Measures Are Appropriate? ........................................ 13
      D. Which Data Collection Method Best Meets Your Needs? ................... 14
             The Paperwork Reduction Act and OMB Approval ...................... 15
             Data Collection Tools ............................................. 16
             Selecting the Proper Evaluation Method ............................ 20
SECTION IV: CONDUCTING AN EVALUATION ........................................... 22
      A. Identifying a Starting Point .......................................... 22
      B. Developing the Data Collection Instrument ............................. 23
             Linking the Questions to Outcome Measures ......................... 25
             Measuring Changes in Levels of Compliance ......................... 26
      C. Pretesting the Data Collection Instrument ............................. 27
      D. Selecting the Sample .................................................. 28
      E. Improving the Response Rate ........................................... 29

SECTION V: ANALYZING AND PRESENTING DATA ....................................... 31
      A. Where Do We Go from Here? ............................................. 32
             Using Descriptive Statistics ...................................... 32
             Removing Data Subsets ............................................. 32
             Collapsing Like Responses ......................................... 33
             Analyzing Data by Subgroup ........................................ 34
      B. Presenting the Data ................................................... 35
             Using Graphics .................................................... 35
      C. Understanding the Context of Results .................................. 37
             Considering Data Anomalies ........................................ 38
SECTION VI: COMPLIANCE ASSISTANCE EVALUATION CHECKLIST ......................... 39
LIST OF FIGURES
      Figure 1. Continuum of Compliance Assistance Measures ..................... 8
      Figure 2. Obtaining OMB Approval for Compliance Assistance Evaluations ... 17
      Figure 3. Pie Chart Showing Customer Satisfaction with ABC Booklet ....... 36
LIST OF TABLES
      Table 1. Examples of Target Populations .................................. 12
      Table 2. Measures for Compliance Assistance Activities ................... 14
      Table 3. Sample Measures for Sector Strategies ........................... 15
      Table 4. Matching the Intensity of the Evaluation Method to the
             Compliance Assistance Activity .................................... 20
      Table 5. Compliance Assistance Data Collection Tools ..................... 21
      Table 6. Outcome Measures and Associated Codes ........................... 27
      Table 7. Raw Data on Customer Satisfaction with ABC Booklet .............. 31
      Table 8. Customer Satisfaction with ABC Booklet .......................... 33
      Table 9. Collapsed Categories for Customer Satisfaction with the
             ABC Booklet ....................................................... 34
      Table 10. Alternate Collapsed Categories for Customer Satisfaction ....... 34
      Table 11. Selected Customers' Satisfaction with the ABC Booklet .......... 35
      Table 12. Potential Graphics for Presenting Data ......................... 37


LIST OF APPENDICES

APPENDIX A

Performance Profile for EPA's Enforcement and Compliance Assurance Program

APPENDIX B

NPMS Set 3: Environmental and Human Health Improvements by Regulated Entities Work Group
Members

APPENDIX C

EPA Fact Sheet II: Sampling—The Basics

APPENDIX D

Office of Compliance Guidance on the Need for Information Collection Requests (ICRs) for the
Evaluation of Compliance Assistance Activities

APPENDIX E

Fact Sheet: How to Obtain Clearance for Regional Compliance Assistance Evaluation Surveys Under the
Generic ICR 1860.01 OMB Control #2020.0015

APPENDIX F

Menu of Sample Survey Questions by Outcome Measure

APPENDIX G

Examples of Surveys Developed to Measure Compliance Assistance Outcomes

APPENDIX H

Consolidated Screening Checklist for Automotive Repair Facilities

APPENDIX I

EPA Fact Sheet VII: Examples of Graphs for Presenting Customer Feedback Results

Introduction                                                                            Page 1

                             SECTION I:  INTRODUCTION

       A.      Background
       In fiscal year 1994, EPA created the Office of Enforcement and Compliance Assurance (OECA),
which is responsible for ensuring the compliance of the regulated community with federal environmental
statutes. To achieve that goal, OECA employs an array of approaches including regulatory enforcement,
compliance assistance, and compliance incentives. While EPA has experience tracking and measuring its
enforcement presence, the Agency now faces the challenge of developing an infrastructure to measure the
impact of its compliance assistance activities.  OECA is currently in the process of developing the tools to
collect data for evaluating these activities.
       Independent of OECA's efforts, Congress enacted the Government Performance and Results Act
(GPRA) in  1993 to promote more results-based management.  This act requires federal agencies to
develop 5-year strategic plans with goals, objectives, and performance measures.  In response, EPA
developed a strategic plan that delineates specific goals. Goal 9, which falls under OECA's responsibility,
requires EPA to ensure full compliance with laws intended to protect human health and the environment.
EPA plans to meet this goal through compliance incentives and assistance programs and by identifying
and reducing significant noncompliance in high-priority program areas, while maintaining a strong
enforcement presence in all regulatory program areas.
        In  January 1997, OECA began developing the National Performance Measures Strategy (NPMS)
to establish new measures for evaluating the effectiveness of compliance assurance projects in terms of
impacts on the regulated community and the environment. Specifically, the purpose of the NPMS is "to
develop and implement an enhanced set of performance measures for EPA's enforcement and compliance
assurance program"1 that incorporate outcome measures (changes in behavior due, at least in part, to
compliance assistance activities), and  environmental indicators (measures of progress toward achieving
environmental or human health objectives) in addition to output measures. As part of the NPMS effort,
OECA established a performance profile that consists of 11 sets of measures for evaluating the impacts of
EPA's enforcement and compliance assistance activities (See Appendix A).  Set 3 of the NPMS is focused
       1 U.S. EPA. 1997. National Performance Measures Strategy. December, p. 3.

on quantifying environmental or human health improvements resulting from compliance assistance tools
and initiatives.

       Measures developed through NPMS will apply only to EPA's enforcement and compliance
assurance program. They will not serve as a framework for measuring performance of state enforcement
and compliance assurance programs. OECA and the states developed a separate set of accountability
measures for state enforcement and compliance assurance programs, and those measures are being used
in Performance Partnership Agreements (PPAs). Thus, while NPMS measures are not directly applicable
to states, states can still find information in this guide to help measure goals articulated in their PPAs.

       B.      Purpose of Guide
       The NPMS Set 3 Task Group (whose members are listed in Appendix B) prepared this document
to help you measure the impact of your compliance assistance projects and activities (e.g., telephone
helplines, compliance assistance tools, workshops) on the regulated community.2 The methodology
presented in this guide also serves as a pilot evaluation methodology under the NPMS. Results from your
evaluations will help OECA implement the methodology nationwide in fiscal year 2000. OECA will
incorporate the results into its strategic plans and annual performance plans required by GPRA. In
addition, using this guide to measure compliance assistance outcomes will support internal Regional
management and policy decisions.

       C.      Overview of Guide
       This guide consists of the following sections designed to help you measure the outcomes of
compliance assistance:
       •      Section II: Compliance Assistance Activities and Outcomes discusses the types and
               purposes of compliance assistance activities and the outcomes associated with these
               activities.
       2 Under Set 10 of the NPMS, OECA is developing a PC-based system, "CATs," for tracking outputs
related to compliance assistance activities.

        •      Section III: Planning an Evaluation identifies key questions for planning an evaluation
               and discusses possible considerations. This section also contains a description of data
               collection tools and explains why you might want to use a combination of tools.

        •      Section IV: Conducting an Evaluation provides general tips for conducting an
               evaluation, as well as specific tips by evaluation method, including establishing a starting
               point, developing and testing the data collection instrument, linking questions to outcome
               measures, and improving response rates.
        •      Section V: Analyzing and Presenting Data provides tips on analyzing and presenting
               data to facilitate management decisions. This section also discusses the importance of
               identifying and delineating the limitations and purpose of the study when publicizing or
               interpreting the results.

        •      Section VI: Compliance Assistance Evaluation Checklist summarizes key steps
               discussed in the guide.

Page 4                                           Compliance Assistance Activities and Outcomes

  SECTION II:  COMPLIANCE ASSISTANCE ACTIVITIES AND OUTCOMES

       Compliance assistance consists of information and technical assistance provided to the regulated
community to help it meet the requirements of environmental law. First and foremost, compliance
assistance ensures that the regulated community understands its obligations by providing clear and
consistent descriptions of regulatory requirements.  Compliance assistance also can help regulated
industries find cost-effective ways to comply and to go "beyond compliance" in improving their
environmental performance through the use of pollution prevention and other innovative technologies.
       A.      Compliance Assistance Activities
       EPA groups compliance assistance activities into four major categories: telephone assistance,
workshops and training seminars, compliance assistance  tools, and onsite visits.

       •      Telephone Assistance includes assistance provided via special toll-free "hotlines" and
               other forms of telephone assistance, where phone assistance is a primary outreach vehicle
               for compliance assistance.

       •      Workshops include all types of meetings that are sponsored by the program and that
               involve a group of regulated entities.  Activities that fall under this category include
               seminars, conferences, training, and forums.

       •      Compliance Assistance Tools include the development and distribution of all forms of
               printed materials (e.g., newsletters, fact sheets, information packets, brochures), as well
               as videos, slide shows, and Web sites, whether produced by the Regions, Headquarters,
               or others for distribution. Compliance assistance tools also include plain language
               guides, self-audit checklists, expert systems, and CD-ROM-based applicability
               determinations.

       •      Onsite Visits include facility visits to provide technical assistance, compliance assistance,
               environmental management reviews, and pollution prevention assistance.

       B.      Measuring Results of Compliance Assistance
       EPA has identified three types of measures to gauge the success of compliance assistance
activities. These measures include output measures, outcome measures (both regulatory and

nonregulatory), and environmental and public health indicators.  While this document focuses on
outcome measures, understanding output measures and environmental and public health indicators helps
put the outcome measures into perspective.
        Output Measures

        Output measures are defined as "quantitative or qualitative measures of important activities, work
products, or actions taken by EPA or by states under delegated federal programs."3  They include the
number of facilities reached through compliance assistance tools and initiatives or through the distribution
of compliance information.  Output measures for compliance assistance activities include the number of
site visits conducted, the number of helpline calls answered, and the number of fact sheets developed.

        Outcome Measures
        Outcome measures are "quantitative or qualitative measures of changes in behavior of the public
or regulated entities  caused, at least in part, by actions of government."4 Outcome measures include
changes in awareness and understanding, changes in behavior, and environmental and human health
improvements:

        •      Changes in awareness or understanding reflect an increased knowledge of regulatory
               or nonregulatory environmental issues, including reporting and monitoring requirements,
               regulatory schedules, and pollution prevention opportunities.  Examples of changes in
               awareness or understanding include the percentage of facilities receiving assistance that
               indicate an improved understanding of environmental regulations or the number of
               facilities attending a workshop that gained knowledge about pollution prevention or
               control technologies.

        •      Behavioral changes represent actual changes that a regulated entity has undertaken as a
               result of compliance assistance. Examples of behavioral change include the number of
               facilities that submitted required permit application or notification  forms because of a
               training program, or the percentage of facilities assisted that adopted recommendations
       3 NPMS, p. 4.
       4 NPMS, p. 4.

               discussed during an onsite visit. Behavioral changes can be voluntary (e.g., voluntary
               reductions of industrial emissions as a result of the publication of pollution prevention
               guidance documents or fact sheets), or regulatory (e.g., facilities reporting overlooked
               chemicals as a result of the publication of Toxic Release Inventory guidance
               documents). Improvements in compliance are also included under behavioral change.

PRINTERS TAKE ACTION AFTER VIEWING WEB SITE
An online survey of users of the Printers National Environmental Assistance Center (PNEAC) Web site
found that 65 percent of printers responding to the survey (14 individuals) made a behavioral change as a
result of the Web site. Examples of actions taken include:
•      Contacted a regulatory agency
•      Changed handling of waste or emissions
•      Requested technical assistance
•      Changed a process or practice
•      Contacted a vendor

        •      Environmental and human health improvements are measures of specific
               environmental and human health improvements at specific facilities resulting from
               compliance assistance activities. Environmental and human health improvement
               measures provide an indication of the scope and types of improvements resulting from
               compliance assistance tools and from the delivery of compliance assistance through
               targeted initiatives.  An example of environmental and human health improvements
               would be the number of pounds of pollutant emission reductions at a facility resulting
               from the adoption of a control technology explained in a training video.5

        Environmental and Human Health Impact Indicators

        Environmental and human health impact indicators are defined as quantitative or qualitative
measures of progress over time toward achieving national environmental or human health objectives.
These indicators help EPA measure what impacts its environmental programs are having on national
environmental problems. Environmental indicators might, for example, show a reduced level of nutrients
       5 Note that it might take some time for facilities to implement certain behavioral changes and realize
environmental results. For this reason, you might want to periodically follow up with recipients of compliance
assistance.

in a water body over a specified amount of time.  EPA is planning annual studies of ambient
environmental conditions to measure overall progress in achieving its goals.
       C.      Continuum of Output and Outcome Measures

       Figure 1 shows the continuum of output and outcome measures, from reach into the targeted
community to environmental and human health improvements. Each measure builds on the previous measure on
the continuum. Changes in behavior will not occur until the target audience understands the regulatory
requirements. Similarly, it is difficult to assess the environmental and human health improvements
without knowing the degree to which behavior has changed.
       Although this document focuses on measuring the outcomes (i.e., changes in awareness and
understanding, changes in behavior, and environmental and human health improvements) associated
with compliance assistance projects, understanding how effectively you have reached the target audience
will help you measure these outcomes. If the hotline, assistance tool, or workshop is reaching only a
small portion of the intended audience, there will be limited corresponding changes in awareness and
understanding. For example, if only a few printers in a targeted community are aware of a compliance
assistance workbook and hotline, only a small number of facilities can possibly make changes as a result
of the assistance.


Figure 1: Continuum of Compliance Assistance Measures

[Figure: a continuum running from output measures (reach) to outcome measures (changes in awareness
and understanding, changes in behavior, and environmental and human health improvements), with the
level of effort increasing from low to high.]

Planning an Evaluation                                                                  Page 9
                    SECTION III:  PLANNING AN EVALUATION
       Planning is an essential step to conducting a successful evaluation. Without effective planning,
the subsequent steps in an evaluation are likely to provide results that are difficult to understand.
Identifying the goals of the compliance assistance project and where on the continuum you are starting
from, along with defining the purpose and scope of the evaluation, will help you determine the best
approach for your evaluation. To begin planning your evaluation, take some time to answer the following
key questions:

       •      What is the goal of your compliance assistance project?
       •      What is the purpose and scope of your evaluation?
       •      What measures are appropriate?

       •      Which data collection method best meets your needs?
       A.     What Is the Goal of Your Compliance Assistance Project?
       The first step in the evaluation process is to identify the initial goals of the compliance
assistance project. Is the goal of your project to document specific environmental results? Or, is the goal
of the project simply to increase awareness? Region 4 hopes to document environmental results in its
Charleston community-based environmental protection project. On the other hand, Ohio's "Dry Cleaner
Initiative" seeks to educate, via mass mailings and onsite visits, dry cleaners on air permitting
requirements under the 1993 MACT standard. Understanding where your project falls on the compliance
assistance continuum will help ensure that you select appropriate measures to evaluate your success.

EPCRA 100 PERCENT COMPLIANCE GOAL
The Headquarters team working on the EPCRA §312 Initiative set an ambitious goal for its compliance
assistance project: achieve 100 percent compliance among facilities in the industrial organic chemical
sector subject to the requirements of EPCRA §312, which requires facilities to prepare an inventory of
chemicals produced, used, or stored on site. EPA selected this industry sector (SIC 2869) because it
reported the largest number of accidental chemical releases of all industrial sectors surveyed.

       Examples of specific project goals include:
       •      To educate, via a series of seminars, companies in the Maquiladora industry that transport
               wastes from Mexico to the United States on applicable regulations.

       •      To identify multimedia environmental problems and pursue corrective action through
               community-based environmental protection (CBEP) and onsite assistance to small
               businesses to prevent the problems from reoccurring.

       •      To improve compliance  at approximately 1,000 federal facilities via an interagency
               working group.

       •      To promote environmental justice by improving the quality of the land, air, water, and
               living resources in a targeted urban neighborhood via onsite assistance to small
               businesses.

       B.      What Is the Purpose and Scope of Your Evaluation?
       Carefully assessing the evaluation's purpose—why the evaluation is needed, how the information
will be used, and who will use the evaluation results—will help you determine the scope of the evaluation
and next steps. Is the purpose of the evaluation to assist in management decisions? For example, are you
trying to collect information such as lessons learned, innovative techniques used by facilities as a result
of compliance assistance, or how the compliance assistance activity helped the audience? If so, an
anecdotal assessment will probably meet your evaluation needs. On the other hand, if you are trying to
make major policy decisions or are preparing a report for distribution to the public, then you might want
to consider conducting a statistical study instead.

POSSIBLE EVALUATION USES
EPA's Hearing the Voice of the Customer (OP-235-B-98-003) identifies several management uses for
program evaluations:
•      Find ways to improve program services.
•      Answer inquiries about program accomplishments or needs.
•      Make a stronger case for more budget resources.

SNAPSHOT OF THE TEXAS/MEXICO BORDER INDUSTRY PROJECT
The Texas/Mexico Border Industry NPMS Set 3 Pilot Project seeks to educate, via a series of seminars,
companies in the Maquiladora industry that transport wastes from Mexico to the United States.  Seminar
topics provide information on applicable state and federal requirements, promote voluntary compliance
incentives, and provide information on innovative pollution prevention techniques. Compliance guides
and checklists were developed for the seminars, including a guide for generators, a checklist for manifest
requirements, and a guide for transporters of hazardous waste. The goal of the evaluation is to
determine the effectiveness of the workshops on behavioral change and to determine changes in
compliance data from HAZTRAKS before and after the seminars.
       Anecdotal Assessments

       Anecdotal assessments—evaluations that describe accomplishments, yet make no broad
generalizations or claims6—will be suitable for most compliance assistance evaluations conducted by
OECA.  Anecdotal assessments tell a story about how compliance assistance has impacted the group of
people that responded to your survey.  These evaluations can include quantitative assessments such as the
number of facilities implementing a process change as a result of onsite visits, but also tell a broader story
of how and why the facilities responded to the compliance assistance. If the purpose of your evaluation,
for example, is to determine whether or not onsite compliance assistance visits are worth continuing, then
you need to know, in general, how well the visits were perceived by industry as well as what specific
behavioral changes were made as a result of the visit. In this case, information from 6 to 10 recipients of
site visits will probably provide enough information to make a decision about the  value of site visits.
While it is not a statistical study (i.e., you cannot generalize your results to all facilities that received an
onsite visit), a well-thought-out anecdotal assessment will provide the qualitative information needed to
make an internal management decision as to whether this activity is worth continuing.

       Statistical Studies
       If the purpose of your evaluation is to make broad policy decisions, such as moving from a
compliance assistance phase to an enforcement stage for a particular industry sector, you might want to
consider conducting a statistical study. Statistical studies enable you to generalize your evaluation results
       6 See Sparrow, Malcolm K. 1997. "Regulatory agencies searching for performance measures that count."
Journal of Public Administration Research and Theory.

to a larger audience (e.g., all facilities in an industrial sector, geographic region, or all users of a
compliance assistance tool). This type of study requires that you collect more detailed data from a larger
number of respondents in order to make reliable generalizations. The NPMS Set 1 Task Group is
developing a separate guide to assist Regions in developing statistical methodologies. Appendix C
includes additional information on conducting statistical studies, including how to choose the right sample
size.  If you are planning to conduct a statistical  study, you might want to consider contacting statistical
experts in EPA's Office of Policy for assistance.
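To give a rough sense of how sample size is chosen, the widely used formula for estimating a proportion, with a finite-population correction, can be sketched in a few lines of Python. This is an illustrative calculation only; the function name and default values below are assumptions for the sketch, not taken from Appendix C, and a statistician should confirm the design for any real study.

```python
import math

def sample_size(population, margin=0.05, z=1.96, p=0.5):
    """Estimate the number of completed responses needed to measure a proportion.

    population -- size of the target population (e.g., facilities assisted)
    margin     -- desired margin of error (0.05 = plus or minus 5 points)
    z          -- z-score for the confidence level (1.96 for 95 percent)
    p          -- assumed proportion; 0.5 gives the most conservative estimate
    """
    # Sample size for an effectively infinite population
    n0 = (z ** 2) * p * (1 - p) / margin ** 2
    # Finite-population correction shrinks the requirement for small populations
    n = n0 / (1 + (n0 - 1) / population)
    return math.ceil(n)

# Hypothetical example: a target population of 500 dry cleaners,
# 95 percent confidence, plus or minus 5 percentage points.
print(sample_size(500))  # -> 218
```

Note how strongly the correction matters for small target populations: the same precision for a population of 10,000 facilities requires only 370 responses, not 20 times as many.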
        Specifying the Target Population

        As you develop the scope of your evaluation, you  should also identify your target population,
taking into account the purpose of your evaluation.  Target populations might include specific industry
sectors or a geographic area.  You might also want to target your evaluation broadly to encompass a
service, regardless of the sector or geographic location.  Table 1 illustrates how identifying the purpose of
your evaluation will help you define your target population.

Table 1. Examples of Target Populations

 Purpose of Evaluation                                                  Target Population
 Evaluating the impact of a compliance assistance helpline, fact        Industry sector
 sheets, and site visits on the dry cleaning sector.
 Assessing the reduction in chromium air emissions in a specific        Geographic region
 region resulting from process changes implemented based on
 outreach materials related to the chromium electroplating MACT
 standards.
 Estimating the changes in awareness and understanding of               Users of the Web site
 environmental regulations of visitors to a Web site.

 MEASURES FOR THE TEXAS CITY SMALL BUSINESS PILOT

 The goal of the Texas City, Texas, Small Business NPMS Set 3 Pilot Project is to identify multimedia
 environmental problems and pursue corrective action through CBEP to prevent the problem from
 reoccurring. Teaching small business owners to assess their facilities and correct problems is the key
 to achieving this goal.  The project team plans to achieve this goal through onsite visits to small
 businesses in Texas City. Compliance assistance will be available through the state environmental
 agency (the Texas Natural Resource Conservation Commission), a county-level assistance provider,
 and large-business mentors. Each visit will consist of an environmental assessment, determination of
 environmental compliance, and identification of changes required to achieve compliance. The
 evaluation team identified the following measures for this project:

         •      Rates of noncompliance.
         •      Improvements made as a result of the onsite visits.
         •      Number of facilities that self-police based on state/federal self-policing policies.
         •      Number of days for a significant violator to return to compliance.
         •      Number of small businesses participating in the project (and the percentage of small
                businesses in the community that are participating).
         •      Number of businesses that request onsite assistance, requests for written information,
                and numbers and types of compliance assistance information tools developed or
                identified by the program.
         •      Rate of compliance and noncompliance.
         •      Qualitative lessons learned.
       C.      What Measures Are Appropriate?
       Defining success up front, based on the project goal and the purpose of the evaluation, will help
you select the appropriate evaluation measures, as well as interpret the results. Try to identify what results
you would expect based on your experience with similar projects, goals established at the outset of the
activity, or management expectations for the project.
       Tables 2 and 3 list specific measures developed by the NPMS Set 3 Task Group for assessing the
outcome of compliance assistance activities and sector strategies. OECA's goal is to develop a nationally
consistent methodology to track these measures, similar to the Case Conclusion Data Sheets, which
OECA developed for the enforcement program. These sheets, now docket fields, track activities resulting
from enforcement actions such as improvements in labeling, recordkeeping, and reporting.

Table 2. Measures for Compliance Assistance Activities

 Outcome Measure               Specific Measures
 Awareness and                 Number who improve their understanding of regulatory requirements
 Understanding
 Behavioral Change             Regulatory:
                               Number of regulatory requirements adopted (includes notifications,
                               permits adopted, labeling, reporting, etc.)
                               Changes in level of compliance (multimedia, subset, compliance
                               indicators)
                               Nonregulatory:
                               Number of process changes adopted
                               Number of environmental management changes or reviews adopted
                               Number of best management practices adopted
                               Number of self-audits conducted
                               Number of recommendations adopted
                               Number of facilities changing regulatory status
 Environmental and Human       Number of facilities that reduce emissions or other pollutants
 Health Improvements           Number of human health worker protection improvements
                               Amount of emissions reduced, pollutants reduced, and/or risk reduced
       D.      Which Data Collection Method Best Meets Your Needs?
       Gathering the necessary data to evaluate compliance assistance activities is usually the most difficult, time-consuming, and resource-intensive step in the evaluation process. It is essential that you select the most appropriate data collection tool to meet the goals of your evaluation. An additional consideration is determining whether the data gathering falls under the purview of the Paperwork Reduction Act (PRA) and is, therefore, subject to Office of Management and Budget (OMB) approval. In addition, OECA developed an Information Collection Request (ICR) entitled Regional Compliance Assistance Program Evaluation (EPA ICR No. 1860.01) that Regions can use to expedite OMB approval of survey questions. This generic ICR reduces the OMB review period from 6 months to 6 weeks.

Table 3. Sample Measures for Sector Strategies

 Petroleum Refining Sector
        •      Increased compliance rates for these facilities 1 year after the program.
        •      Number of previously unknown facilities acting to address any compliance issues as a result of receiving compliance assistance (including contacting EPA or states for assistance).
        •      Number/percentage of facilities that self-report ground-water contamination.

 Iron and Steel Sector
        •      Number of environmental management plans adopted by steel mills.
        •      Number of steel mills that evaluate plant water and waste operations as a result of compliance assistance activities.

 Agricultural Industry Sector
        •      Number of concentrated animal feeding operations (CAFOs) with Comprehensive Nutrient Management Plans (CNMPs).
        •      Number of CAFOs that submit a permit application.

 Chemical Sector
        •      Number of facilities changing regulatory reporting.
        •      Number of facilities reducing discharges as a result of compliance assistance.
        The Paperwork Reduction Act (PRA) and OMB Approval

        The PRA requires federal agencies to receive OMB approval prior to collecting substantially
similar information from 10 or more nonfederal respondents. As defined by the act, a "collection of
information" means obtaining or soliciting information through identical questions, or reporting or
recordkeeping requirements. The PRA applies to both mandatory and voluntary data collection efforts.
Thus, most compliance assistance evaluations are subject to the PRA. Note, however, that the following
actions do not require OMB approval:
        •      Surveys handed out onsite immediately after a workshop, seminar, or meeting, that ask participants about the quality of the seminar (e.g., knowledge of speakers, usefulness of handouts).7

        •      Attendance sign-in sheets at a meeting, workshop, or Web site.

        7 Surveys to collect baseline data to assess awareness and understanding prior to the workshop may require OMB approval. Consult Appendix D for specific scenarios.

Appendix D provides additional information to help you determine whether or not your evaluation falls
under the PRA.

        Figure 2 should help you determine what next steps are necessary to proceed with your
evaluation. In general, when you ask the same set of questions (whether voluntary or regulatory) to 10 or
more people, the survey will fall under the PRA.8 OECA has, however, obtained an expedited approval
process for many compliance assistance surveys through its generic ICR. You can use the generic ICR
when your goal is to determine the effectiveness of compliance assistance activities on the audience that
receives the compliance assistance (e.g., participants at a workshop or users of a compliance assistance
tool). Note, however, that the generic ICR cannot be used when you plan to use a statistical approach to
generalize the effectiveness of a compliance activity to an overall population. In this case, you will have
to develop a separate ICR for your evaluation.9

        If your survey is covered under the generic ICR, you will still need to obtain clearance from OMB
before you distribute your survey. To obtain clearance, you need to submit a copy of the survey
instrument, cover memo,  and burden estimate to the Regulatory Information Division (RID) in the Office
of Policy. For more information on the level of information needed, the timing, and sample memos, see
How to Obtain Clearance for Regional Compliance Assistance Evaluation Surveys Under the Generic
ICR 1860.01  OMB Control # 2020.0015 in Appendix E or on the Web at
.
        8 Note that this requirement does not apply when contacting officials at federal facilities.

        9 OMB put this constraint on the ICR to give the public an opportunity to comment on survey methodology prior to implementation. For guidance on developing an ICR, see EPA's ICR Handbook, available from the Office of Policy, Regulatory Information Division (RID). RID provides policy direction and oversight of Agency management of regulatory information and manages the Agency's administration of the burden reduction provisions of the Paperwork Reduction Act. OECA's liaison on RID's Paperwork Analysis Team, Lynn Johnson, will answer any questions you might have about whether an ICR is needed, issues involved in preparing an ICR, or the ICR clearance process. Lynn can be reached by calling 202 260-2964 or via mail at Mail Code 2137. The fax number is 202 260-9322.

Figure 2. Obtaining OMB Approval for Compliance Assistance Evaluations

[Figure 2 is a decision flowchart. In summary: if you are not asking the same questions of 10 or more nonfederal respondents, or if your survey meets one of the PRA exemptions described in Appendix D, OMB approval is not required. If the survey is covered under the generic ICR, submit the survey instrument, cover memo, and burden estimate to Lynn Johnson in the Office of Policy and allow 6 weeks for OMB approval. Otherwise, develop a separate ICR; the ICR Handbook is EPA's guide to developing an ICR.]

        Data Collection Tools

        There are a number of tools you can use to collect data for your study. These tools include: mail, online, and phone surveys; mail-back comment cards; focus groups; onsite revisits; and reviews of self-reported data. In general, telephone surveys, focus groups, and onsite revisits allow for the collection of more detailed information than mail or online surveys and mail-back comment cards, since they allow you to ask followup or clarifying questions. Additional considerations for each of the methods are as follows:

•      Mail-Back Comment Card. Comment cards are tear sheets that you can distribute to potential respondents with the compliance assistance tool itself (e.g., guidebooks, videos, and information packets). Since the card travels with the tool, you do not need to incur the costs of a followup mailing. Note, however, that you will incur the prepaid mailing costs for the tear sheet. The mail-back card also ensures that all users can comment on the product. If, for instance, a state compliance assistance program distributes your compliance assistance tool along with its own tools, users can still reach you with their comments.
•      Online Survey. An online survey is a set of questions posted on an Internet Web site or list server. These surveys have the potential to reach a large number of respondents. For surveys on Web sites, you have the ability to reach users that might otherwise be unknown to you. Many respondents like online surveys as they can respond to them when it is convenient, and they do not need to worry about losing a survey or mailing it back. As with mail surveys, the survey might result in a limited amount of detail as respondents might not want to spend a lot of time typing in a longer response. In addition, without followup, there is potential for ambiguity or conflicting results, as with the mail survey.

       OECA USES ONLINE SURVEY TO EVALUATE ASSISTANCE CENTERS
       OECA evaluated customer satisfaction and behavioral change associated with 5 of the Internet-based compliance assistance centers via an online survey in 1998. Several hundred people responded to each of the online surveys. To increase the response rate, OECA offered respondents the opportunity to download an environmental screen saver from the Web and distributed a mousepad to respondents who volunteered to be part of a followup focus group.

•      Mail Survey. A mail survey is a set of questions sent to potential respondents via standard mail. A mail survey enables you to reach a large number of potential respondents, if you have their addresses. Compared with a comment card, a mail survey enables you to ask more questions. There is, however, potential for ambiguity in the results without following up on a mail survey. If, for example, you receive unexpected results or conflicting answers to questions, it is hard to interpret the data without following up. Similarly, a limited level of detail is available in most responses, as respondents will not spend the time to write long answers to open-ended questions.

•      Telephone Survey. A telephone survey is a standard set of questions asked of potential respondents over the telephone. These surveys, used alone or in combination with mail or online surveys, enable the evaluator to ask followup or clarifying questions, potentially resulting in better data.

•      Focus Group. A focus group is a facilitated group discussion of questions you raise to selected members of your target audience. Focus groups often stimulate creative ideas and innovative approaches. They also enable participants to debate differences of opinion and the evaluator to assess outcomes. This activity does, however, require a limited focus or the results will be difficult to interpret. Note also that there might be a limited ability to ask followup questions if you are using a "hidden evaluator" technique (i.e., you are using a professional facilitator to ask the questions while you monitor the proceedings from behind a one-way mirror). In addition, note that dominant participants might skew the discussion.
•      Onsite Revisit. During an onsite revisit, evaluators return to facilities that previously received a compliance assistance visit. Revisiting facilities can provide excellent data since facilities are likely to spend the necessary time to answer questions while you are onsite. In addition, the revisit itself might spur additional compliance or pollution prevention activities. Onsite revisits also provide you the added benefit of observing environmental changes in person.

       COLLECTING DATA TO EVALUATE THE CHARLESTON CBEP PROJECT
       The overall goal of the Charleston/North Charleston CBEP NPMS Set 3 Pilot Project is to improve the quality of the land, air, water, and living resources in the Charleston/North Charleston, South Carolina area. The project team is assisting paint/body shops and small automotive repair shops primarily through onsite visits. The team plans to evaluate results through onsite revisits.
•       Review Self-Reported Data. Another option for assessing results is to track compliance data
        that your audience is required to report, such as Toxic Release Inventory submissions, EPCRA
        §312 reports, and permit applications. If you conduct a workshop for dry cleaners about air
        permitting requirements, for example, check to see how many of the attendees applied for permits
        6 months after the workshop.
        You can also consider a combination approach—mail and/or online survey with a phone survey
to selected participants, for example. Such an approach enables you to potentially reach a large number
of respondents, yet also collect detailed information from selected participants. In addition, it enables you
to ask followup questions and clarify any unexpected results from the mail survey.  The  combination
approach yields detailed responses, plus a large number of general responses. In general, as resources
expended increase, both the detail  and the number of responses increase.

       Selecting the Proper Evaluation Method

       In general, the evaluation method used should match the intensity of the compliance assistance activity. It makes more sense, for example, to use onsite visits to follow up on previous visits than it does to use them to follow up on those who simply received a guidebook or called for telephone assistance. Table 4 offers some suggestions for matching the intensity of the compliance assistance activity with the data collection method. Table 5 highlights the uses of the different tools, the resource implications, likely response rates, and tips for lowering costs and improving response rates.

       IOWA DEVELOPS MAIL SURVEY TO DEVELOP AND ASSESS GUIDEBOOK
       The Iowa Waste Reduction Center (IWRC) developed a mail survey to assess the knowledge base and information needs of agribusinesses on state and federal environmental regulations to effectively develop a guidebook on this topic. To encourage businesses to respond to the survey, IWRC offered respondents a free copy of the guidebook and a postage-paid return envelope. In addition, the personal cover memo stressed that the survey should take fewer than 10 minutes to complete. IWRC achieved a response rate of approximately 23 percent on the initial survey and plans to mail a followup survey along with the guidebook. The followup survey will enable IWRC to assess the change in awareness and understanding as a result of the guidebook.
Table 4. Matching the Intensity of the Evaluation Method to the Compliance Assistance Activity

 DATA COLLECTION TOOL          COMPLIANCE ASSISTANCE ACTIVITY
                               Tools (e.g.,    Helpline    Workshops    Onsite Visits
                               Web site)
 Mail-Back Comment Card            X
 Online Survey                     X
 Mail Survey                       X              X            X             X
 Telephone Survey                                 X            X             X
 Focus Group                                                   X             X
 Onsite Revisit                                                              X

Table 5. Compliance Assistance Data Collection Tools

 MAIL-BACK COMMENT CARD
   When to use: When evaluating a publication or other mailed tools, such as videos.
   Resource considerations: The only cost is the upfront cost of printing the tear sheet and the prepaid postage. Perforated tear sheets on cardstock cost more than nonperforated sheets, but also tend to achieve better response.
   Response rate considerations: To improve the response rate, make sure the comment card is noticeable and easy to fill out. To make the card more noticeable, consider a brightly colored tear sheet and placing the card in the center of the document. Affixing postage will also help improve response.

 ONLINE SURVEY
   When to use: When evaluating electronic services such as Web sites. Online surveys can also be used as an alternative to a mail survey.
   Resource considerations: Upfront costs include posting the survey on the Web site and programming electronic data entry capabilities, if desired.
   Response rate considerations: Might have a higher response rate than a mail survey, as online surveys are easier to respond to. Prompt Web users to respond to the survey by making it noticeable on the site. Announce the availability of the online survey to the target audience through electronic means such as list servers and nonelectronic channels such as conferences or newsletters. Keep in mind that online surveys exclude compliance assistance users without access to the Internet.

 MAIL SURVEY
   When to use: When trying to reach a large number of potential respondents. When the data needed are amenable to yes/no answers or scales.
   Resource considerations: Requires postage costs for mailing the initial survey as well as postage for the respondents to return the survey, if using postage-paid envelopes. Sending out reminder postcards will require additional postage costs.
   Response rate considerations: Likely to have a low response rate if used alone. Follow up with a reminder postcard or phone call to improve the response rate. Send subsequent mailings on colored paper. Consider using a third party such as a trade association or nonprofit to administer the survey.

 TELEPHONE SURVEY
   When to use: When more detailed data are required.
   Resource considerations: If interviewing respondents in a large geographical area, long-distance charges will apply. Staff time to administer the survey over the telephone is higher than for a mail or online survey.
   Response rate considerations: Consider using non-EPA employees to conduct the survey to improve the perception of anonymity. Phone surveys can also be used to improve the rate of response for a mail survey. To improve the response rate for a telephone survey, attempt calling the respondent on several occasions. In addition, calling respondents either early or late in the day might help the interviewer bypass secretarial screening.

 FOCUS GROUPS
   When to use: When the evaluation goal is to improve services, compare a variety of services, or brainstorm new approaches. Before launching a new compliance assistance program or expanding assistance to new sectors.
   Resource considerations: Professional facilitators tend to be costly. As an alternative, consider using an impartial EPA staff person with experience in facilitation.
   Response rate considerations: Not usually an issue for focus groups since there are a small number of participants. Working with a trade association might increase response. Make the timing and location of the meeting as convenient as possible. Consider conducting the focus group during the evening or on a weekend.

 ONSITE REVISITS
   When to use: When highly detailed data or difficult-to-collect data (such as environmental changes) are required. When observed data are preferred or required.
   Resource considerations: Often resource intensive, as staff time is required to set up and administer the survey to participants. Evaluators will also incur travel costs to the facilities. Resource constraints often limit the number of facilities that can participate.
   Response rate considerations: Some evaluation teams try to get a commitment for a revisit from the facility prior to the initial assistance visit.

                  SECTION IV:  CONDUCTING AN EVALUATION
       Based on the answers to the planning steps above, you should now have an understanding of: the
purpose and scope of the evaluation effort, the measures to be used to evaluate if the goals have been
achieved, and the method to obtain the data needed to quantify the measures. This section includes tips
and suggestions for implementing the evaluation, from identifying the starting point to developing the
data collection instrument and selecting the sample population.

       A.      Identifying a Starting Point
       When possible, before conducting a
compliance assistance activity, identify a
starting point (e.g., the level of
understanding of environmental regulations
of the target audience). This will allow you to
compare results before and after compliance
assistance.  To develop a starting point,
consider the following options:

•      Develop a preassistance survey or "test." Before conducting the compliance assistance activity, consider developing a survey that asks respondents to describe their level of awareness, understanding, and practices related to the regulations on which you plan to supply compliance assistance. Comparing the results of the preassistance survey and a postassistance survey can help you determine the perceived impact of your assistance. Similarly, you can "test" how well your audience understands applicable regulations by asking them questions about what the regulations require. These tools can also help you target the compliance assistance program toward specific industry needs identified in the survey or "test."

       PNEAC ASSESSES UNDERSTANDING BEFORE AND AFTER VIDEOCONFERENCE
       To assess changes in awareness and understanding as a result of the May 1996 "Green and Profitable Printing" videoconference, PNEAC conducted a brief survey at the conclusion of the videoconference. The survey asked participants to rate their understanding of compliance requirements and pollution prevention opportunities before and after the conference. On a scale of 1 to 10, participants estimated that their understanding of compliance and pollution prevention issues increased by 2 to 3 points. The survey also asked those printers interested in participating in followup surveys to include their name and contact information. In a subsequent survey of these participants, 92 percent of the respondents indicated that they had improved compliance since participating in the conference. An impressive 97 percent said they adopted at least one of the pollution prevention recommendations discussed. Ninety-five percent of the respondents said the videoconference influenced their decision to make the above changes. The followup survey is available on the Internet at .
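The pre/post comparison described above can be sketched in a few lines of Python. All scores below are hypothetical, for illustration only; an actual evaluation would use the responses collected from its own preassistance and postassistance surveys.

```python
# Sketch: compare self-rated understanding (1-to-10 scale) before and
# after a compliance assistance activity. All data are hypothetical.

def mean(scores):
    return sum(scores) / len(scores)

# Hypothetical paired responses from the same eight participants.
pre_scores = [3, 4, 2, 5, 4, 3, 6, 4]
post_scores = [6, 7, 5, 8, 6, 6, 8, 7]

avg_change = mean(post_scores) - mean(pre_scores)
improved = sum(1 for pre, post in zip(pre_scores, post_scores) if post > pre)

print(f"Average change in understanding: {avg_change:+.2f} points")
print(f"Participants reporting improvement: {improved} of {len(pre_scores)}")
```

Pairing each respondent's pre and post answers, as in the PNEAC followup, gives a stronger reading than comparing two unrelated groups.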

•      Examine historical data on ambient environmental conditions. Several compliance
       assistance initiatives have been conducted with the goal of reducing emissions of specific
       pollutants from industrial sectors. Historical pollutant release data from facilities in these sectors
       can serve to identify the starting point. For example, Region 9 has had problems with the
       discharge of copper into the San Francisco Bay.  As a result, Region 9 has implemented several
       compliance assistance activities to reduce the discharge of copper in waste water from metal
       finishing facilities.  In this case, Region 9 can use historical discharge monitoring reports (DMRs)
       to establish current levels of discharge. After completion of compliance assistance activities, the
       Region can once again review the DMRs to evaluate the reduction of copper discharge.
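The before/after DMR comparison in this example could be tallied along the following lines. The facility names and discharge figures are invented for illustration; actual figures would come from the DMRs themselves.

```python
# Sketch: estimate the reduction in copper discharge from baseline and
# followup DMR data. All figures are hypothetical, not Region 9 data.

baseline_kg = {"Facility A": 120.0, "Facility B": 85.0, "Facility C": 210.0}
followup_kg = {"Facility A": 90.0, "Facility B": 60.0, "Facility C": 150.0}

total_before = sum(baseline_kg.values())
total_after = sum(followup_kg.values())
pct_reduction = 100.0 * (total_before - total_after) / total_before

print(f"Copper discharged before: {total_before:.0f} kg")
print(f"Copper discharged after:  {total_after:.0f} kg")
print(f"Reduction: {pct_reduction:.1f} percent")
```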
•      Examine historical compliance rates. EPA has undertaken several compliance assistance initiatives to improve compliance rates in industrial sectors or geographic regions. For these types of initiatives, the historical compliance rates with specific regulations in the sectors or Region could serve as the starting point. Note that using only the year prior to or the year of the compliance assistance activity as the starting point might bias the data if that year was an anomaly. The NPMS Set 3 Task Group recommends that several years of previous data be evaluated in quantifying the starting point. Since many compliance assistance activities are geared towards small businesses, however, you might not have historical compliance information on the targeted businesses.

       DRIVE-BY EVALUATIONS IN REGION 4
       Most of the small businesses participating in the Charleston CBEP project lack a compliance history. To assess these paint/body and automotive repair shops, the project team has conducted drive-by evaluations to assess the number of facilities in each category and visible environmental problems.
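The Task Group's multiyear recommendation can be illustrated with a short sketch. The compliance rates below are hypothetical; note how a baseline drawn from a single anomalous year differs from the multiyear average.

```python
# Sketch: establish a baseline compliance rate from several prior years,
# per the NPMS Set 3 Task Group recommendation. Rates are hypothetical;
# 1996 is deliberately an anomalous year.

historical_rates = {1994: 0.72, 1995: 0.70, 1996: 0.55, 1997: 0.71}

single_year_baseline = historical_rates[1996]  # biased if 1996 was an anomaly
multiyear_baseline = sum(historical_rates.values()) / len(historical_rates)

print(f"Single-year baseline (1996 only): {single_year_baseline:.0%}")
print(f"Four-year average baseline:       {multiyear_baseline:.0%}")
```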

       B.     Developing the Data Collection Instrument
       Regardless of the data collection method (mail, telephone, or online survey, mail-back comment
cards, site revisits, or focus groups) used, you will need to prepare a data collection instrument.  For mail
surveys, online surveys, and mail-back comment cards, most evaluators develop a standard questionnaire
for all recipients. This section presents some tips on developing an effective data collection instrument
and sample questions tailored for specific compliance assistance activities.

        For compliance assistance activities, most mail and online surveys and mail-back comment cards are voluntary requests of the regulated community. Clarity and ease of completion are therefore critical to obtaining a sufficient response rate. Below are some formatting tips to make surveys easier for the recipient to complete.10

        •      Keep it simple. When developing a survey, endeavor to keep it simple.  Because the
               survey is voluntary, it should be designed to help the respondent complete the survey as
               easily as possible.
        •      Keep it short.  Limit the survey to one to two pages. This format will likely result in a
               higher response rate, as longer surveys can overwhelm the recipient.

        •      Be clear. Give clear instructions to the respondent, so that he/she understands what to
               do.  Spell out acronyms, use terminology known by your audience, and carefully word
               questions to help ensure that the audience understands the questions.

        •      Ask one question at a time. Compound questions can confuse respondents and the
               evaluation team.

        •      Define what is being asked for. Avoid the use of relative terms like frequently, seldom,
               and occasionally, as these terms mean different things to different users. Instead, define
               or quantify the number of times. Asking the respondent to "check all that apply" can
               help save space, but might result in incomplete answers compared with the more explicit
               "Yes/No" question. Open-ended questions can assist you in obtaining additional detail,
               but use them only as a supplement to scaled questions, since open-ended questions are
               subject to interpretation and often impose additional burden on the respondent.

        •      Use a balanced scale. Give respondents an opportunity to select from an equal number
               of positive and negative responses. In addition, the scale should run from negative on
               the left to positive on the right (e.g., strongly disagree on the left, strongly agree on
               the right). Scales should also be comprehensive and not overlap.
          These and other tips can be found in: Conway, Mai. 1998. Getting results from paper and electronic
surveys and questionnaires. IBM Global Services. Hamilton, New Jersey.  September.

        •      Use a consistent scale. If you begin by asking respondents to identify whether or not
               they "agree" or "disagree" with a particular statement, keep this structure throughout the
               survey. For example, do not switch from "agree/disagree" to "like/dislike" midway
               through the survey.

        •      Be logical. There are several ways to make a survey more logical to the recipient:

                —     Leave sufficient space with lines for open-ended responses, and tell the respondent
                       what to do if more room is needed.
               —     Group similar question types together.
               —     Provide section headings.
               —     Use a vertical format, sufficient font size, and plenty of white space for ease of
                       reading and comprehension.
        For telephone surveys and site visits, evaluators often use a discussion guide, which provides a
consistent set of questions and enables the evaluator to change the order and phrasing of questions and
add clarifying questions as needed during the interview. For focus groups, evaluators typically develop
specific questions for the group to discuss.  Many of the tips listed above can be applied to a discussion
guide and focus group questions.

        Linking the Questions to Outcome Measures

        When designing the survey instrument, you will need to include questions that will provide
information on  selected outcome measures.  Appendix F provides sample questions, organized by
outcome measure, which were developed from EPA and state surveys used to measure compliance
assistance outcomes. These surveys are available on the Internet at
. Appendix F also includes questions to gauge customer
satisfaction11 and supplemental questions for specific compliance assistance activities.

        The NPMS Set 3 Task Group encourages you to tailor the questions in Appendix F to your
individual evaluation efforts. You can add, subtract, or alter the questions as appropriate. At a minimum,
          For more information on conducting customer satisfaction surveys, see U.S. EPA. 1998. Hearing the
voice of the customer: customer feedback and customer satisfaction measurement guidelines (OP-235-B-98-003).
November. Information about EPA's Customer Service Program (CSP), CSP's generic ICR for customer
satisfaction surveys, and the guidelines are available on the Internet at  or by
contacting Patricia Bonner at 202 260-0599.

however, you should include the bolded questions to measure compliance assistance outcomes. As discussed in Section III.C, OECA's goal is to have a nationally consistent set of outcome measures for compliance assistance for national implementation beginning in FY 2000. Table 6 reviews the outcome measures and the associated codes. Appendix G includes examples of actual surveys developed to measure compliance assistance outcomes.
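Once responses are in hand, tallying them under the national outcome-measure codes could look like the following sketch. Only the codes (A1, BR1, BNR3) come from Table 6; the response records are invented for illustration.

```python
# Sketch: tally survey responses by OECA outcome-measure code.
# Codes follow Table 6; the individual response records are hypothetical.

responses = [
    {"A1": True,  "BR1": 2, "BNR3": 1},  # one respondent's coded answers
    {"A1": True,  "BR1": 0, "BNR3": 0},
    {"A1": False, "BR1": 1, "BNR3": 2},
]

a1_count = sum(1 for r in responses if r["A1"])   # improved understanding
br1_total = sum(r["BR1"] for r in responses)      # regulatory requirements adopted
bnr3_total = sum(r["BNR3"] for r in responses)    # best management practices adopted

print(f"A1 (improved understanding): {a1_count}")
print(f"BR1 (requirements adopted):  {br1_total}")
print(f"BNR3 (BMPs adopted):         {bnr3_total}")
```

Tallying by a consistent set of codes is what allows results from different Regions to be rolled up nationally.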

       Measuring Changes in Levels of Compliance

        To assess changes in the level of compliance (code: BR2), you will need to first assess a baseline
compliance rate.  If you are assessing changes at a facility that is a major source, consult EPA media-
specific databases to determine the compliance baseline. When assessing changes in level of compliance
for small businesses, for which compliance information might not exist in EPA media-specific databases,
you will need to develop a surrogate measure of compliance. Sector-specific checklists, like the one in
Appendix H, can help illustrate key indicators of compliance. Note, however, that when you are trying to
determine compliance rates for an industry sector, it makes sense to conduct a statistically valid
evaluation so that you can generalize your results to the entire sector. Please remember that OECA's
"generic" ICR (1860.01, OMB Control #2020-0015) cannot be used for statistically valid surveys.

Table 6.  Outcome Measures and Associated Codes

 Outcome Measure     Specific Measures                                            Code

 Awareness and       Number who improve their understanding of regulatory         A1
 Understanding       requirements.

 Behavioral Change   Regulatory:
                     Number of regulatory requirements adopted (includes          BR1
                     notifications, permits adopted, labeling, reporting, etc.).
                     Changes in level of compliance (multimedia, subset,          BR2
                     compliance indicators).

                     Nonregulatory:
                     Number of process changes adopted.                           BNR1
                     Number of environmental management changes or reviews        BNR2
                     adopted.
                     Number of best management practices adopted.                 BNR3
                     Number of self-audits conducted.                             BNR4
                     Number of recommendations adopted.                           BNR5
                     Number of facilities changing regulatory status.             BNR6

 Environmental and   Number of facilities that reduce emissions or other          E1
 Human Health        pollutants.
 Improvements        Number of human health worker protection improvements.       E2
                     Amount of emissions reduced, pollutants reduced, and/or      E3
                     risk reduced.
       C.      Pretesting the Data Collection Instrument

       Regardless of the instrument selected, you should also "pretest" or "pilot-test" the data
collection instrument. To do so, select a small group of participants willing to answer the questions and
give feedback on their responses. This process will enable you to identify confusing questions, add
missing questions, or delete irrelevant questions.  The pretest also helps reduce bias in the evaluation by
identifying leading or inappropriate questions.

   NEWMOA PRETESTS STATE SURVEY

   Prior to conducting its 1998 survey on compliance assistance providers, the Northeast Waste
   Management Officials Association (NEWMOA) pretested its survey instrument with a group of 15
   people representing different subsets and organizations included in the 700-person sample.
   Pretesters provided valuable suggestions to clarify questions, improve survey flow, and eliminate
   unnecessary questions. In fact, NEWMOA reduced the survey from 12 to 9 pages as a result of the
   suggestions from pretesters.

       D.     Selecting the Sample

       For the bulk of compliance assistance activities, statistically valid probability sampling will not
be required. In fact, you do not need to sample at all in some cases.  If the target audience of your
compliance assistance activity is small (e.g., 30 people attending a workshop), you can survey all of the
people in your target audience. In this case, sampling is not necessary since you are talking with
everyone.  Using online surveys and comment cards also eliminates the need for sampling, as anyone
with Internet access can respond to an online survey and comment cards are sent automatically with
publications.
       In contrast, if your audience is large, the principles of probability sampling can assist you in
determining with whom to speak. First, consider developing a sample that is representative of your target
audience.  For example, if your goal is to evaluate the impact of your compliance assistance activity on
both large and small businesses, make sure that representatives of each business size are in your sample.
Second, exclude those outside your target population. For example, if you are evaluating a workshop and
are interested in documenting behavioral change, remove the consultants and state government officials
from the sample pool. For more information on sampling, see Appendix C.
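The two principles above (keep each subgroup represented; exclude those outside the target population) can be sketched with Python's standard library. The attendee mix and one-third sampling fraction are hypothetical:

```python
import random

# Illustrative sketch: a stratified random sample that keeps both
# business sizes represented and excludes non-target attendees.
attendees = ([{"size": "small"}] * 60 + [{"size": "large"}] * 30
             + [{"size": "consultant"}] * 10)

# Exclude those outside the target population (e.g., consultants).
pool = [a for a in attendees if a["size"] != "consultant"]

random.seed(0)  # fixed seed for a reproducible illustration only
fraction = 1 / 3
sample = []
for size in ("small", "large"):
    stratum = [a for a in pool if a["size"] == size]
    # random.sample draws without replacement within each stratum
    sample += random.sample(stratum, k=round(len(stratum) * fraction))

print(len(pool), len(sample))  # 90 in the pool; 20 small + 10 large sampled
```

Sampling the same fraction from each stratum keeps the sample's mix of business sizes proportional to the target population's.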

       Remember that if you are conducting an anecdotal assessment,  and not a statistical study that
uses probability sampling, you cannot generalize the results of your evaluation to all users of the
compliance assistance tool or all members of an industry sector or geographic region.  If 60 percent
of workshop participants understand particular best management practices for the dry cleaning
industry, for example, this does not mean that 60 percent of all dry cleaners understand those practices.
In order to make this kind of statement, you would need to take a large random sample of dry cleaners
and ask them awareness questions, using an appropriate statistically valid methodology.

        E.      Improving the Response Rate12
        An improved response rate will yield better survey results and add more confidence to your
findings. In general, you can expect a response rate of about 25 percent; typical response rates (20 to 40
percent) vary by the type of survey, its length, customer interest, and incentives. If you are
surveying a number of small businesses, your response rate might be low given the limited resources
available to small businesses.  In addition to the suggestions listed in Table 5 (page 21), consider the
following to improve your response rate:

        •      Use a third party to conduct your evaluation.  An overall approach to improving
               response rates is to conduct the survey in conjunction with a trade association.  Since
               trade associations have a great deal of credibility with their membership, they can often
               help improve response rates by cosponsoring a survey or mailing out the survey on
               association letterhead. In addition, if a third party is conducting a survey of its
               membership, it can do so without the constraints of the PRA.

        •      Identify survey intentions at the outset of compliance assistance. When you initiate
               compliance assistance activities, let recipients know that you will be following up later on
               to determine the impact of the assistance. If possible, let the recipients know when and
               how you will be contacting them.

        •      Send a reminder postcard or replacement questionnaire to your audience or call
               recipients to encourage them to submit the survey or to complete the survey over the
               phone.

        •      Use a personal cover letter to generate a higher response rate than a formal form letter.
               In the letter, try to convince the respondent of the survey's significance and explain that it
               is an EPA-sponsored survey. Be sure to note whether there are any compliance or
               enforcement ramifications for submitting or not submitting a response.  Also
               consider enabling the respondent to submit the survey anonymously.

        •      Provide a stamped, self-addressed envelope for returns, and use first-class postage.
          12 Tips for increasing response rate can be found in: Nachmias, David. 1992. Research methods in
the social sciences. New York, New York: University of Wisconsin.

       •      Send any followup survey on different color paper.

       •      Consider the option of an online survey (many recipients find these easier to use, since
               they don't have to worry about misplacing the survey instrument).

       •      Consider offering incentives for survey completion.  Respondents to an online survey
               to evaluate the effectiveness of Internet compliance assistance centers, for example, were
               offered a free mouse pad. Also, through a cooperative agreement, EPA and a nonprofit
               organization offered respondents to a compliance assistance evaluation survey a free
               coupon at a national ice cream vendor.

       •      Use a logical  format (see "Developing the Data Collection Instrument" page 23),
               provide clear directions, and tailor the questionnaire to the target audience as  much as
               possible.

               SECTION V: ANALYZING AND PRESENTING DATA
        Now that you have collected the data, how do you best present the results to management? Most
analysts begin the data analysis process by compiling the raw responses to each of the survey questions.
The remainder of this section uses an example developed by EPA's Office of Policy to illustrate how to
compile, analyze, and present data.13 The hypothetical question used by the Office of Policy is as
follows:

        On a scale of 1 to 6, where 1 represents "highly dissatisfied" and 6 represents "highly satisfied,"
        how would you rate your satisfaction with the ABC booklet you received from EPA?

        The first step in analyzing the results of this question would be to tabulate the responses, or raw
data, to that question, as done in Table 7. The table shows that 450 people responded to the survey and the
frequency of each response received. Another interesting piece of data that could be noted is the number
of people who did not respond to the survey, which enables you to track your response rate.
               Table 7. Raw Data on Customer Satisfaction with ABC Booklet (n=450)

               Score                                          Number
               1 Highly Dissatisfied                              42
               2                                                  27
               3                                                 122
               4                                                 132
               5                                                  38
               6 Highly Satisfied                                 32
               Don't Remember Receiving the ABC Booklet           22
               Don't Know or No Opinion                           35
               TOTAL                                             450
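Compiling raw responses into frequency counts like those in Table 7 is a simple tally. A short Python sketch (the response list is reconstructed from the table's counts, not real data):

```python
from collections import Counter

# Illustrative only: rebuild a response list matching Table 7's counts,
# then tally it the way raw survey responses would be tallied.
responses = ([1] * 42 + [2] * 27 + [3] * 122 + [4] * 132 + [5] * 38
             + [6] * 32 + ["don't remember"] * 22 + ["no opinion"] * 35)

counts = Counter(responses)
print(counts[4], sum(counts.values()))  # 132 responses of "4"; n = 450
```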
           13 This example from U.S. EPA. 1998. Hearing the voice of the customer: customer feedback and
customer satisfaction measurement guidelines (OP-235-B-98-003), pp. 36-41, was slightly modified for this
document.

        A.     Where Do We Go from Here?
        Now that you have your raw data compiled, think about ways to make the data more meaningful
to decision-makers.  Consider using descriptive statistics, removing data subsets, collapsing the data, and
analyzing alternative subsets of the data.
        Using Descriptive Statistics

        Policy analysts commonly use descriptive statistics—the mean, median, and mode—to describe
the central tendency of the data:

        •      Mean: represents an average response. The mean is the most common measure of
               central tendency.

        •      Median: represents the mid-point response, or the 50th percentile (i.e., the value below
               which 50 percent of the responses fall). Some analysts call this the middle value.

        •      Mode: represents the most frequent response. Generally, the mode is not a good measure
               of central tendency, since often it depends on how the analyst groups the data.

        In our example, the mean, or average, response to the satisfaction question was 3.5. The median
response falls at 4, and the most frequent response (the mode) was also 4. These descriptive statistics
tell us that the data are skewed slightly toward the satisfied end of the scale, meaning that customers are
more satisfied than dissatisfied.  At this point, most analysts will consider other ways of grouping and
describing the data to make it more meaningful, as the statistics of 3.5 and 4 are not all that meaningful
as presented here.
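The three statistics can be checked directly from the Table 7 counts for the 393 respondents who expressed an opinion. A short Python sketch:

```python
import statistics

# Opinion counts from Table 7 (scores 1-6 only; "don't remember" and
# "no opinion" responses are excluded).
counts = {1: 42, 2: 27, 3: 122, 4: 132, 5: 38, 6: 32}
scores = [score for score, n in counts.items() for _ in range(n)]

mean = statistics.mean(scores)      # 1372 / 393, about 3.5
median = statistics.median(scores)  # middle of the 393 ordered responses
mode = statistics.mode(scores)      # most frequent response
print(round(mean, 1), median, mode)
```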
        Removing Data Subsets

        To focus attention on those who did have opinions to express, you can separate out respondents
who did not have an opinion about the document or who could not remember receiving it. Of the 450
customers asked the question in our example, 22 did not remember receiving the booklet and 35 said they
had no opinion or did not know how they would rate their satisfaction with the booklet. Table 8 presents
the responses, both the raw number and percentage, of those who expressed an opinion. Thus, the
percentages for those with opinions are based on the 393 respondents who expressed opinions. If it is important to
determine the percentage of customers who don't remember or who have no opinion about the booklet,
you would calculate those figures using 450—the total number who were asked the question—as the
denominator. By including the sample size in the table (the information that "n = 450"), readers can do
these calculations, should they be interested.

       Table 8. Customer Satisfaction with ABC Booklet (n=450)

       Score                                       Number    Percent Expressing An Opinion
       1 Highly Dissatisfied                           42               11
       2                                               27                7
       3                                              122               31
       4                                              132               34
       5                                               38                9
       6 Highly Satisfied                              32                8
       SUBTOTAL                                       393              100
       Don't Remember Receiving the ABC Booklet        22              n/a
       Don't Know or No Opinion                        35              n/a
       TOTAL                                          450
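The percentage columns follow directly from the choice of denominator. A Python sketch using the Table 8 counts (plain rounding can differ from the printed table by a point):

```python
# Opinion scores use the 393 respondents who expressed an opinion as
# the denominator; "don't remember" and "no opinion" shares use all
# 450 customers who were asked.
opinions = {1: 42, 2: 27, 3: 122, 4: 132, 5: 38, 6: 32}
n_opinion, n_asked = sum(opinions.values()), 450

pct = {score: round(100 * n / n_opinion) for score, n in opinions.items()}
pct_dont_remember = round(100 * 22 / n_asked)  # share of all 450 asked

print(n_opinion, pct, pct_dont_remember)
```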

       Collapsing Like Responses

       The information presented in Table 8 might be at too great a level of detail for those examining
the example data.  The difference between a "2" and a "3" rating, for example, might not be meaningful
for them. Thus, you may find it useful to collapse the information into a smaller number of categories.
One possibility is to create three categories: dissatisfied (scores 1 and 2), neutral (scores 3 and 4), and
satisfied (scores 5 and 6).  Table 9 presents the collapsed categories:

       Table 9. Collapsed Categories for Customer Satisfaction with ABC Booklet

       Rating            Number    Percent Expressing An Opinion
       Dissatisfied          69               18
       Neutral              254               65
       Satisfied             70               18
       TOTAL                393              101*

       * Total is greater than 100 due to rounding
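Collapsing is just a regrouped sum. A Python sketch reproducing Table 9, including the rounding footnote:

```python
# Collapse the six-point scale into the three categories of Table 9.
counts = {1: 42, 2: 27, 3: 122, 4: 132, 5: 38, 6: 32}
groups = {"Dissatisfied": (1, 2), "Neutral": (3, 4), "Satisfied": (5, 6)}

collapsed = {label: sum(counts[s] for s in scores)
             for label, scores in groups.items()}
total = sum(collapsed.values())
percents = {label: round(100 * n / total) for label, n in collapsed.items()}

print(collapsed)   # {'Dissatisfied': 69, 'Neutral': 254, 'Satisfied': 70}
print(percents, sum(percents.values()))  # percents sum to 101, as noted
```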
       Most readers will be able to grasp the information more easily once the categories are collapsed. It
is reasonable to ask: If you will eventually collapse responses, why does the question posed to customers
have six possible answers? Research has shown that people answering survey questions prefer to have a
fairly wide range of responses because they don't like to feel "forced" into a limited set of options. In
addition, analysts might have different approaches to collapsing categories.
       When collapsing data, consider what makes sense for your evaluation. For instance, in the above
example, you could group the data differently. You could, instead, make two groups: satisfied (responses
3-6) and dissatisfied (responses 1-2). Table 10 presents the results grouped this way. Note the difference
in the analysis.

       Table 10. Alternate Collapsed Categories for Customer Satisfaction with ABC Booklet

       Rating            Number    Percent Expressing An Opinion
       Dissatisfied          69               18
       Satisfied            324               82
       TOTAL                393              100
       Analyzing Data by Subgroup

       Subgroup analyses examine whether different kinds of customers have different kinds of
responses.  Suppose you want to examine whether printers and technical assistance providers have
the same or different opinions about the ABC booklet.  You could collapse categories and sort
respondents by their status as printers or technical assistance providers. Table 11 presents hypothetical
findings:

        Table 11. Selected Customers' Satisfaction with the ABC Booklet

                              Printers            Technical Assistance Providers
        Rating           Number    Percent            Number    Percent
        Dissatisfied         27       17                  17       16
        Neutral              94       60                  78       73
        Satisfied            35       22                  12       11
        TOTAL               156       99*                107      100

        * Total is less than 100 due to rounding
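The subgroup percentages in Table 11 use each subgroup's own respondent count as the denominator. A Python sketch:

```python
# Subgroup analysis behind Table 11: percents are computed within each
# subgroup, so each column sums to roughly 100 on its own.
subgroups = {
    "Printers": {"Dissatisfied": 27, "Neutral": 94, "Satisfied": 35},
    "Technical Assistance Providers":
        {"Dissatisfied": 17, "Neutral": 78, "Satisfied": 12},
}

pcts = {
    group: {rating: round(100 * n / sum(counts.values()))
            for rating, n in counts.items()}
    for group, counts in subgroups.items()
}
print(pcts)
```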

        B.     Presenting the Data
        Presenting the data in a clear, concise manner that provides the proper context and does not
distort the results is an essential element in a compliance assistance evaluation.14 You have many options
for displaying results. In addition to raw data, descriptive statistics, and percentages, you can use graphics
such as pie charts and bar graphs. How you present the data will determine how others interpret your
results.  In general, the display of data should:

        •      Encourage the viewer to think about the substance of the data rather than the
               methodology, graphic design, or technology of graphic production.

        •      Encourage the viewer to compare different pieces of data.

        •      Present the data at several levels of detail, from a broad overview to the  fine details.
        •      Disaggregate the data as much as possible. For example, give the mean  value for small
               versus large facilities.

        Using Graphics
        Graphics, including pie charts, bar graphs, histograms, and scatter diagrams, will often enable
you to present your analysis more effectively than tables with numbers alone.  For example, rather than
presenting decision-makers with the data from Table 9, "Collapsed Categories for Customer Satisfaction
with the ABC Booklet," consider using a pie chart, such as the one in Figure 3.

        14 Many of the tips can be found in: Tufte, Edward R. 1983. The visual display of quantitative
information.  Graphics Press.
                     Figure 3. Pie Chart Showing Customer
                     Satisfaction with ABC Booklet
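Any charting package can draw the pie chart from the Table 9 percentages; where none is at hand, even a plain-text bar rendering conveys the same shares. A minimal, purely illustrative sketch:

```python
# Plain-text stand-in for the Figure 3 pie chart, built from the
# Table 9 percentages; charting software would normally draw the pie.
percents = {"Dissatisfied": 18, "Neutral": 65, "Satisfied": 18}

lines = [f"{label:<13}{'#' * (pct // 2)} {pct}%"   # one '#' per 2 points
         for label, pct in percents.items()]
print("\n".join(lines))
```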
Appendix I provides additional information on using graphics. Table 12 summarizes four types of
comparisons that you might want to illustrate through graphics:15

        •      Components or proportions of the topic being examined

        •      Number of items or differences
        •      Frequency distribution of characteristics

        •      Co-relationships between variables
        15 See Alman, Stanley M. 1976. Teaching data analysis to individuals entering the public service.
Journal of Urban Analysis. Vol. 3, No. 2. pp. 211-37. October.

Table 12. Potential Graphics for Presenting Data16

COMPARISON TYPE    KEY WORDS                GRAPHICAL FORM         EXAMPLES

Components         Contribution             Pie chart —            Proportion of workshop attendees
                   Share                    illustrates            from different sectors.
                   Proportion               percentages of         Percentage of Web site users who
                   Percentage of total      the whole.             say they better understand
                                                                   regulations as a result of
                                                                   visiting the site.

Items              Item A more/less         Bar chart —            Pounds of pollutants discharged
                   than B                   compares the           into a waterbody over different
                   Differences              magnitude and          years.
                   Rank A is greater/       difference among       Types of control technologies or
                   less than B              mutually exclusive     pollution prevention techniques
                                            categories.            implemented as a result of a
                                                                   training.

Frequency          Variation                Histogram —            Variation in understanding of
Distribution       Distribution             shows both             environmental regulations by
                   Concentration            magnitudes and         different workshop participants.
                   Relative Frequency       differences among
                                            categories.

Co-relationships   A is related to B        Scatter diagram —      Analysis of changes in awareness,
                   A increases/             plots ungrouped        behavioral change, and
                   decreases with B         data.                  environmental impacts as a result
                   A does not increase/                            of compliance assistance.
                   decrease with B
        C.     Understanding the Context of the Results
        One of the pitfalls of data evaluation is not placing the results in the proper context.  With
compliance assistance, other factors come into play that might influence the behavior of the target
audience.  These factors might be completely unrelated to the compliance assistance activity being
evaluated. Not considering such factors could make the results misleading.  For example, as mentioned previously,
the goal of the Chesapeake Bay program was to reduce the percent of noncompliers with NPDES permits.
        16 Ibid, page 217.  Also see: Patton, Carl and David Sawicki. 1986. Basic methods of policy analysis
and planning. Prentice-Hall. Englewood Cliffs, New Jersey.

At the beginning of the program, the rate of noncompliance was 16 percent, which was consistent with the
national average. Through various compliance assistance activities, the noncompliance rate for facilities
discharging to the Bay was reduced to 1 percent. While this number implies that the compliance
assistance was very effective, if the national average was also reduced to 1 percent, it would indicate that
other factors were at play. In the case of the Bay program, the national average remained at 16 percent,
indicating that the Bay program compliance assistance was indeed effective.
       Considering Data Anomalies

       When presenting your data, consider if there are any data anomalies of which the reader should be
aware.  This is primarily of concern when evaluating compliance rates and pollutant discharges.  For
example, if a workshop were conducted in January 1997 with the intent of reducing perchloroethylene
emissions from dry cleaners, a likely starting point would be perchloroethylene emissions in 1996. If for
some reason 1996 was an anomaly in terms of these emissions, however, the comparison to 1996
emissions might not be valid. A more likely scenario would be that over the years prior to 1996,
perchloroethylene discharges were being reduced because of regulatory drivers. Comparing 1997 data
only to 1996 data would suggest a decrease caused by compliance assistance, but comparing it to the
trend prior to 1996 would simply show a continuation of that trend. It might therefore be more
appropriate to present emissions data from several years prior to 1996 to determine the true impacts
of compliance assistance.
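One way to make the trend comparison concrete is to fit a least-squares line to the pre-1996 data and compare 1997 against the extrapolation. A Python sketch with hypothetical emissions figures:

```python
# Illustrative only: compare 1997 emissions to the pre-1996 trend
# rather than to 1996 alone. Emissions values are hypothetical.
years = [1992, 1993, 1994, 1995, 1996]
emissions = [100, 92, 84, 76, 68]   # steady decline from regulatory drivers

# Least-squares slope of the prior trend.
n = len(years)
mean_x, mean_y = sum(years) / n, sum(emissions) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(years, emissions))
         / sum((x - mean_x) ** 2 for x in years))
expected_1997 = mean_y + slope * (1997 - mean_x)

actual_1997 = 60  # hypothetical observed value
print(f"Expected from trend: {expected_1997:.0f}, actual: {actual_1997}")
# A 1997 value near the trend line suggests a continuation of the prior
# trend, not a new effect from the January 1997 workshop.
```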


   SECTION VI:  COMPLIANCE ASSISTANCE EVALUATION CHECKLIST

       This checklist summarizes key steps discussed in the guide. You can use this checklist to monitor
your progress throughout the evaluation.

Planning an Evaluation

       ❑      Reviewed original goals of compliance assistance project.
       ❑      Defined the purpose and scope of the evaluation.
       ❑      Selected the approach (i.e., anecdotal or statistical study).
       ❑      Identified appropriate outcome measures for evaluation (see Table 2, p. 14).
       ❑      Selected the appropriate data collection tools (see Table 5, p. 21).
       ❑      Determined the necessary steps to obtain OMB approval.

Conducting an Evaluation

       ❑      Identified a way, if feasible, to assess the "starting point" or baseline for the evaluation.
       ❑      Reviewed the guidelines for developing survey instruments.
       ❑      Tailored sample questions from Appendix F to meet evaluation needs.
       ❑      Pretested data collection instrument.
       ❑      Obtained OMB approval.
       ❑      Selected an appropriate sample.
       ❑      Made efforts to improve response rate.

Analyzing and Presenting Data

       ❑      Consolidated raw data.
       ❑      Analyzed data using descriptive statistics.
       ❑      Collapsed data, as appropriate.
       ❑      Determined appropriate graphics to present key data findings.
       ❑      Assessed context of results.

APPENDICES

APPENDIX A

                               PERFORMANCE PROFILE FOR
            EPA's ENFORCEMENT AND COMPLIANCE ASSURANCE PROGRAM

Impact on Environmental or Human Health Problems

       Measured through annual evaluation studies of selected EPA objectives.

Effects on Behavior of Regulated Populations

       Levels of Compliance in Regulated Populations

               Set 1.   Rates of noncompliance for -
                      a) fully-inspected populations
                      b) self-reported compliance information
                      c) populations targeted for special initiatives
                      d) priority industry sectors

       Environmental or Human Health Improvements by Regulated Entities

               Set 2.   Improvements resulting from EPA enforcement action
               Set 3.   Improvements resulting from compliance assistance tools and initiatives
               Set 4.   Improvements resulting from integrated initiatives
               Set 5.   Self-policing efforts by using compliance incentive policies

       Responses of Significant Violators

               Set 6.   Average number of days for significant violators to return to compliance or enter
                      enforceable plans or agreements
               Set 7.   Percentage of significant violators with new or recurrent significant violations
                      within two years of receiving previous enforcement action

Enforcement and Compliance Assurance Activities

       Monitoring Compliance

               Set 8.   Number of inspections, record reviews, responses to citizen complaints, and
                      investigations conducted

       Enforcing the Law

               Set 9.   Number of notices issued, civil and criminal actions initiated and concluded, and
                      self-policing settlements concluded

       Providing Assistance and Information

                Set 10.  Facilities/entities reached through -
                      a) compliance assistance tools and initiatives
                      b) distribution of compliance information

       Building Capacity

               Set 11.  Capacity building efforts provided to state, local, or tribal programs

APPENDIX B

                                        NPMS SET 3

    ENVIRONMENTAL AND HUMAN HEALTH IMPROVEMENTS BY REGULATED ENTITIES
                                 WORK GROUP MEMBERS
Region 1
       John Moskal            617 918-1826

Region 2
       Patrick Durack         212 637-3767
       Ken Stoller            732 321-6765

Region 3
       Janet Viniski          215 814-2999

Region 4
       Dave Abbott            404 562-9631

Region 5
       Linda Beasley          312 353-2071

Region 6
       Barry Feldman          214 665-7439
       Bonnie Romo            214 665-8323

Region 7
       Pam Johnson            913 551-7480

Region 8
       Judy Heckman-Prouty    303 312-6358

Region 9
       Greg Czajkowski        415 744-2107
       Angela Baranco         415 744-1196
       Marvin Young           415 744-1847

Region 10
       Kathy Veit             206 553-1983

APPENDIX C

Fact Sheet II:  Sampling - the basics

If you have decided to use a survey approach for obtaining customer feedback, you need to determine
what sample size to use.  This Fact Sheet first discusses sample sizes, sampling error, and confidence
intervals—all of which factor into decisions about the sample size. It then presents a table for you to
use in determining what sample size to use - and tells you step by step how to make use of that table.
This Fact Sheet then describes how to go about randomly selecting that number of customers from the
total list of customers you have served during the time period to be covered by the survey.

What Kinds of Sample Sizes Are We Talking About?

Before we give specific guidelines on how to choose the sample size, it will be useful to set some
general expectations.  National public opinion polls like the Gallup Poll and the Roper Poll typically use
sample sizes in the range of 1,350 to 1,800.  These polls use fairly large sample sizes to obtain a result
that represents the entire adult U.S. population with a sampling error on the order of plus or minus
2.5% to 3%.  Such small levels of sampling error are needed because the polls often address matters of
national importance.  The decisions made, based in part on the results of these national polls,  may be
far-reaching, long-lasting, and affect millions of people.

The surveys you will be conducting to obtain customer feedback will be of a very different nature. The
target group whose opinions you need will be much smaller: it will probably be the people who have
come to you and your colleagues in one specific program area within EPA, within a limited time (e.g.,
during one year) to request certain products or  services. We are therefore talking about a target group
of maybe as many as 500 to 1,000 people (few EPA programs  directly serve more customers  than that)
and in some cases 50 people or fewer. Furthermore, although the decisions that will be affected by
customer feedback are important, they will probably not be far-reaching and long-lasting. The scope of
decisions to be made in most cases will be, for example:

•      Should we change a process to reflect customer comments?

•      Should we revise some of our written products?

•      Should we provide a half day of customer feedback training to each staff member?

Even in the worst case—we make the wrong decision about whether our products need to be  revised
and whether the staff members need further training—we will  (if we continue to obtain feedback from
our customers at least once each year) discover our error soon enough and be able to correct it, without
incurring excessive or irreparable damage in the meantime.

Based on these considerations, it is reasonable  to have higher sampling errors than those associated
with national surveys like the Gallup poll. We can feel comfortable with sampling errors of 5 percent
or even 10 percent.

Additionally, for getting feedback from EPA customers, we have relatively small target groups who
were served by a specific program during the time period of immediate interest.  For this reason, it is
reasonable for you to use a much smaller sample size than is used in the Gallup and Roper polls, which
seek to accurately capture the opinions of millions of people.

Sampling Error

"Sampling error" is normally presented as a percentage with a plus or minus sign in front of it. For
example, the sampling error in one particular situation may be ± 3.5 percent. That means that the true
value of a given measure for the entire population—that is, the whole target group you are getting
feedback from—is the value obtained from your sample of customers, plus or minus 3.5 percent.  If for
example, 62.4 percent of your sampled customers are satisfied, the actual percentage of satisfied
customers lies within the range between 58.9 percent (62.4 percent - 3.5 percent) and 65.9 percent (62.4
percent + 3.5 percent).

But that is not quite true.  In fact, there is no range of reasonable size that we can identify for which we
can be certain that the true value for the full list of customers lies in that range.

Why is that so?

Because there's always the possibility of very flukey, very unlikely circumstances occurring - with the
result that the characteristics of the customers in the sample are very different from the characteristics
of the customers not in the sample.  In such circumstances, the true value for all customers will be very
different from the value obtained from the customers in the sample surveyed.  The only way to get
around this statistical fact is to specify "how certain we want to be" that the true value does, in fact, fall
within a specific range around the value we obtain from the sample.  This degree of certainty we are
looking for is known as the "confidence level."

Confidence Level

The "confidence level"  indicates how confident we want to be that the true value lies within a specific
range.

There is no one confidence level that is the "right" one to use. There are many different possible
confidence levels, and only you can decide which confidence level is appropriate for your survey.

Much of the work in the area of public opinion surveys uses the 95 percent confidence level.  That
means  that if you determine the sampling error using the 95 percent confidence level, you can be 95
percent certain that the true value for all your customers will lie within a specific percentage band (one
equal to the size of the sampling error) around the result you obtain from the sample of customers you
contact.

Another commonly used confidence level is the 90 percent confidence level.  With a 90 percent
confidence level, you can be confident that 9 times out of 10, the true value falls within the value
obtained from your sample of customers, plus or minus the sampling error.  Some analysts use 80
percent confidence intervals.

To decide what confidence level to use, you might want to think of a scale running from 80 to 95,
where  95 represents a high level of confidence and 80 represents a lower level of confidence.  Decide
which confidence level to use based on the way in which your results will be used, how products
and services may be affected by the results, and the frequency with which you will collect additional
information to confirm or revise your findings.
Determining the Sample Size

Now that we have established appropriate expectations with regard to sampling error and sample size,
we will provide you with some guidance on selecting your sample size. Please recognize that there are
several factors to consider in determining the sample size. The information provided here is intended
to help get you started.  Please refer as well to the additional information provided in Fact Sheets III, IV
and V. If you wish, you may also consult a statistician within your Office at EPA. A list of EPA
statisticians, showing the Office in which each is located, can be obtained from the Office of the
Chief Statistician within EPA's Center for Environmental Information and Statistics by calling
202-260-5244.

Number in Target Group   Sampling Error   Confidence Level   Sample Size
        1000                  ±5                 80               141
        1000                  ±5                 90               214
        1000                  ±5                 95               278
         500                  ±5                 80               124
         500                  ±5                 90               176
         500                  ±5                 95               218
         200                  ±5                 80                90
         200                  ±5                 90               116
         200                  ±5                 95               132
         100                  ±5                 80                62
         100                  ±5                 90                74
         100                  ±5                 95                80
          50                  ±5                 80                39
          50                  ±5                 90                43
          50                  ±5                 95                45
        1000                 ±10                 80                39
        1000                 ±10                 90                64
        1000                 ±10                 95                88
         500                 ±10                 80                38
         500                 ±10                 90                60
         500                 ±10                 95                81
         200                 ±10                 80                34
         200                 ±10                 90                51
         200                 ±10                 95                66
         100                 ±10                 80                29
         100                 ±10                 90                41
         100                 ±10                 95                50
          50                 ±10                 80                23
          50                 ±10                 90                29
          50                 ±10                 95                34

The above table is appropriate for simple random sampling (SRS), a sampling procedure based on
sampling without replacement and the most commonly used sampling procedure. The table is based
on the approximate formula given in Fact Sheet IV, which includes an adjustment comparable to the
finite population correction factor for each combination of target population and sample size.

The precise formula that can be used instead of this approximate formula is also given in Fact Sheet
IV. For a discussion of the finite population correction factor, see Fact Sheet IV.  For a discussion of
the meaning and significance of sampling without replacement (as contrasted with sampling with
replacement), see the discussion  of this matter in the last section of Fact Sheet V.
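As an illustration of the kind of calculation Fact Sheet IV describes, the approximate sample-size formula can be sketched in Python. This is our reconstruction, not the Fact Sheet's exact text: it assumes the conventional z-values of about 1.28, 1.65 and 1.96 for the 80, 90 and 95 percent confidence levels, the worst-case proportion p = 0.5, and a finite population correction. Because rounding conventions vary, a few results may differ by one from the table above.

```python
import math

# Conventional z-values for the three confidence levels used in the table
# (an assumption on our part; see Fact Sheet IV for the exact formula).
Z = {80: 1.28, 90: 1.65, 95: 1.96}

def sample_size(target_group, sampling_error, confidence_level):
    """Approximate sample size for simple random sampling.

    target_group     -- number of customers served (N)
    sampling_error   -- desired error as a fraction, e.g. 0.05 for +/- 5%
    confidence_level -- 80, 90 or 95
    """
    z = Z[confidence_level]
    # Sample size for an effectively infinite population, using the
    # conservative worst-case proportion p = 0.5.
    n0 = (z * z * 0.25) / (sampling_error ** 2)
    # Finite population correction: shrink n0 for a small target group.
    return math.ceil(n0 / (1 + n0 / target_group))
```

For example, sample_size(1000, 0.05, 95) gives 278, matching the table row for a target group of 1000 at ±5 percent and the 95 percent confidence level.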

The procedure described below in this Fact Sheet for randomly selecting a sample from the full list of
customers served in a specific period of time is simple random sampling and is therefore consistent
with the above table.

Here's How to Use the Above Table

The instructions that follow assume that the unit of analysis for the survey will be the "person served."
(See Fact Sheet  VII for a discussion of "Unit of Analysis.")

 (1) Identify  the number of persons you have served in the time period of interest.  Find that number in
the column labeled "Number in Target Group."

(2) Select the confidence level that you consider to be the most appropriate given the magnitude of the
decisions that will be made based (in part) on the results obtained from the survey:

       o If the decisions to be made using the survey results will be far-reaching, long-lasting and/or
       costly, use the 95% confidence level.

       o If the decisions to be made using the survey results will be less far-reaching, less
       long-lasting or less costly, use the 90% confidence level.

-------
       o If the decisions to be made using the survey results will have more limited consequences,
       mostly in the short-term (e.g., in the next 6-12 months) and the cost implications of the
       decisions will be moderate, you may use the 80% confidence level.

(3) Select the level of sampling error you consider to be acceptable given the magnitude of the
decisions that will be made using the results obtained from the sample.

       o For most EPA customer satisfaction surveys, a sampling error of ±10% should be acceptable.

       o In cases where the decisions to be made based (in part) on the survey results are of such a
       nature that a smaller level of sampling error is needed, a sampling error of ±5% can be used
       instead.

(4) Read off the corresponding sample size.

       (a) If the total number of customers served falls between two of the values shown above in the
       column "Number in Target Group," you can use interpolation to obtain an initial estimate of the
       appropriate sample size.

       (b) You can then use the approximate formula for determining sample size presented in Fact
       Sheet IV to obtain a much better estimate of the sample size needed.

       (c) You can stop here and make use of the approximate value for the sample size obtained in
       step (4)(b) immediately above. Alternatively, you can, if you wish, now make use of the
       trial-and-error approach presented in Fact Sheet IV or, even better, the combined approach,
       also presented in Fact Sheet IV, to calculate the precise value for the sample size needed.
Here's How to Randomly Select a Sample of Customers Once You Have Determined
What Sample Size to Use

Once you have determined the appropriate sample size to use, the next step is to randomly select that
number of customers from the total number served in the time period of interest.  Here is a procedure
you can use to make that random selection:

(1) Make a complete list of all the persons served in the period of interest for whom you already have
(or can obtain, with a reasonable expenditure of effort) the needed contact information (i.e., name, plus
address or phone number). Put the customers in alphabetical order so that duplicate names are easy to
spot, and eliminate any duplicates.

(2) Once all duplicate names have been eliminated (so that each name appears only once), starting at
the top of the list, number each name.  The result is the "master list" of customers served. The number
next to each name is that person's "customer number."
(3) Here is a computer-based approach for selecting a sample of customers from the master list:

-------
       (a) You will use spreadsheet software (like Lotus 1-2-3 or Excel) to carry out the remaining
       steps of this procedure. Before you begin to make use of any particular spreadsheet software,
       first make sure that it has a "randomize" function. Not all spreadsheets do.

       (b) Enter the customer numbers in numerical order into the spreadsheet, one number per row.
       Place each of these numbers in the  second column of the spreadsheet, leaving the first column
       in each row blank. The result will be a spreadsheet with the number of rows equal to the number
       of customers and with the rows having the numbers "1", "2", "3", and so on (up to the total
       number of customers served), with  these numbers in the second column of each row.

       (c) Use the "randomize" function on the second column of the spreadsheet. The numbers in the
       second column are now in random order.

       (d) Enter numbers into the first column of each row. Enter the number "1" into this column in
       the first row, enter "2" into this column in the second row, and so on. These new numbers are
       the row labels.

       (e) Mark off the number of rows corresponding to the sample size chosen above. For example,
       if the sample size is 65, mark off the first 65 rows.

       (f) The numbers appearing in the second column of the rows marked off in step (e) above are
       the customer numbers corresponding to the customers to be included in the sample. For each of
       these customer numbers, read off the name of the customer appearing next to this number on the
       master list prepared in step (2) above and place it in a new list. This new list is the list of
       customers selected for inclusion in the sample - the people you will contact during the survey
       and ask to respond to the survey questions.

       (g) If, due to a lower-than-expected response rate, the number of customers from whom
       responses are received is less than the desired sample size, and all reasonable followup efforts
       have already been made to increase the response rate, go back to the spreadsheet and mark off
       the additional number of rows needed to reach the desired sample size.  The numbers appearing
       in the second column of these additional rows are the customer numbers for the additional
       customers to be added to the sample.

For an equivalent procedure that does not make use of a computer or a computer spreadsheet, see the
last section of Fact Sheet V.
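For readers who would rather script the selection than use a spreadsheet, the procedure above amounts to simple random sampling without replacement. Here is a minimal sketch in Python (the function and variable names are ours, not part of this guidance):

```python
import random

def select_sample(customers, sample_size, seed=None):
    """Mirror of the spreadsheet procedure: eliminate duplicate names,
    put the list in random order, and mark off the first sample_size
    names (simple random sampling without replacement)."""
    rng = random.Random(seed)        # seed only to make runs repeatable
    # Steps (1)-(2): eliminate duplicates, preserving list order.
    master_list = list(dict.fromkeys(customers))
    # Steps (3)(a)-(c): put the names in random order.
    shuffled = list(master_list)
    rng.shuffle(shuffled)
    # Steps (3)(e)-(f): mark off the first sample_size rows.
    return shuffled[:sample_size]

# Step (3)(g): if the response rate falls short, take the next names
# from the same shuffled order rather than re-randomizing.
```

The result is a list of unique customer names of the requested length, drawn at random from the master list.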

-------
APPENDIX D

-------
            OFFICE OF COMPLIANCE

                   GUIDANCE

               ON THE NEED FOR

    INFORMATION COLLECTION REQUESTS
                      (ICRS)

           FOR THE EVALUATION OF

     COMPLIANCE ASSISTANCE ACTIVITIES
                SEPTEMBER 1997
Produced by the Office of Compliance Regional Compliance Assistance Work Group
              Lynn Vendinello, Work Group Chair

-------
       Federal agencies are generally required, by the Paperwork Reduction Act (PRA), to
obtain Office of Management and Budget (OMB) approval prior to collecting substantially similar
information from ten or more non-Federal respondents.  A "collection of information" means the
obtaining or soliciting of information by an agency by means of identical questions, or identical
reporting or record keeping requirements, whether such collection of information is mandatory,
voluntary, or required to obtain a benefit. This includes any requirement or request to obtain,
maintain, retain, report or publicly disclose information (5 CFR § 1320.3(c)).

       There are exceptions to this rule and depending on your particular situation, your
compliance assistance evaluation task may or may not fall within an exception. This guidance
will help determine whether or not an Information Collection Request (ICR) is necessary for your
task. You may also contact the PRA experts in the Cross-Cutting Issue Division of the Office of
General Counsel to assist you with individual questions. They are Hale Hawbecker at 202-260-4555
and Tanya Hill at 202-260-1486.
What is the Paperwork Reduction Act?

       The PRA is a law (PL 104-13) originally enacted by Congress in 1980, reauthorized in
1986  and revised in 1995, that essentially attempts to minimize the Federal paperwork burden
on the public.  Section 3501 of the law clearly states the eleven purposes of this Act.
       " §3501. Purpose

       The purposes of this chapter are to-

1.      Minimize the Federal paperwork burden for individuals, small businesses, State and local
       governments, and other persons resulting from the collection of information by or for the
       Federal Government;

2.      Ensure the greatest possible public benefit from and maximize the utility of information
       created, maintained, used, shared and disseminated by or for the Federal Government;

3.      Coordinate, integrate, and to the extent practicable and appropriate, make uniform Federal
       information resources management policies and practices as a means to improve the
       productivity, efficiency and effectiveness of Government programs, including the
       reduction of information collection burdens on the public and the improvement of service
       delivery to the public;

4.      Improve the quality and use of Federal information to strengthen decision making,
       accountability, and openness in Government and society;

5.      Minimize the cost to the Federal Government of the creation, collection, maintenance,
       use, dissemination, and disposition of information;

6.      Strengthen the partnership between the Federal Government and State, local, and tribal
       governments by minimizing the burden and maximizing the utility of information created,
       collected, maintained, used, disseminated, and retained by or for the Federal Government;

7.      Provide for the dissemination of public information on a timely basis on equitable terms,
       and in a manner that promotes the utility of the information to the public and makes
       effective use of information technology;

8.      Ensure that the creation, collection, maintenance, use, dissemination, and disposition of
       information by or for the Federal government is consistent with applicable laws,
       including laws relating to--(a) privacy and confidentiality,

9.      Ensure the integrity, quality and utility of the Federal statistical system;

10.    Ensure that information technology is acquired, used, and managed to improve
       performance of agency  missions, including the reduction of information collection burden
       on the public; and

11.    Improve the responsibility and accountability of OMB and all other Federal agencies to
       Congress and to the public for implementing the information collection review process,
       information resources management, and related policies  and guidelines established under
       this chapter."

What is an ICR?

       An Information Collection Request (ICR) is a document submitted by federal agencies to
OMB in order to obtain approval of an information collection and/or a reporting and record
keeping requirement that falls under the purview of the PRA. The ICR must receive OMB
approval prior to the initiation of the information collection.

       The term "collection of information" according to the Paperwork Reduction Act of 1995
(PL 104-13) means: "(A) the obtaining, causing to be obtained, soliciting, or requiring the
disclosure to third parties or the public, of facts or opinions by or for an agency, regardless of
form or format, calling for either-
       "(i) answers to identical questions posed to, or identical reporting or record keeping
requirements imposed on, ten or more persons, other than agencies, instrumentalities, or
employees of the United States; or
       "(ii) answers to questions posed to agencies, instrumentalities, or employees of the United
States which are to be used for general statistical purposes."

-------
       For guidance on how to complete an ICR, you can visit the "EPA's Information
Collection Request (ICR) Homepage" at www.epa.gov/icr on the Internet or consult Rick
Westlund in OPPE at 202-260-2745 and/or your OPPE Desk Officer.
When is an ICR Needed?

       An ICR is generally required for any activity involving the collection of identical
information from ten (10) or more non-Federal respondents in any twelve-month period. ICRs
may be approved for up to a three-year period and can be extended through subsequent approval
requests. An approved ICR is required as long as the activity continues.

       Examples of information collection activities that commonly require an ICR:

•      Information requirements in a rule (e.g. reporting, record keeping, waiver provisions).

•      Other information collection activities (e.g. studies, surveys,  application forms, audits,
       standardized data collection activities).

Certain activities are not subject to the Act.  For example:

•      An ICR is not required when the information is collected during the conduct of a criminal
       or civil enforcement action.

•      An ICR is not needed when the collection falls under one of the categories of items that
       OMB has concluded do not generally meet the definition of "information" contained in 5
       CFR § 1320.3(h).

Many of the compliance assistance activities that the Office of Compliance is currently
undertaking, as well as those of the compliance assistance programs in the regions, fall under one
of these categories. To assist in the determination of the need for an ICR, the following guidance is
provided:

•      Examples of scenarios that do and do not require ICRs.

•      A copy of OMB's implementing regulations (5 CFR § 1320).

•      A copy of a Fact Sheet, "How to Obtain  Clearance for Customer Satisfaction Surveys."

-------
ICR Applicability Scenarios

•     Category A: Web Sites:

Scenario One: I am establishing a web site for my regional compliance assistance
program (or for a compliance assistance center) and would like to establish a
"comments" button or "feedback" feature.

Response: Generally, no ICR would be required for this activity. According to
OMB, "an undifferentiated 'suggestion box' format—such as one requesting 'ideas,
comments, suggestions, or anything else you would like to tell us,' or one asking,
'if you experience any technical problems with our site, or have any suggestions
for improving it, please let us know'—are not considered to be 'identical
questions.'"1 Such general solicitations of comments from the public do not require
OMB clearance. However, should the agency request specific information from
web site users, OMB approval would be required as explained in Scenario Two.

Scenario Two: I would like to put an on-line survey on my web site to determine
what features of the web site are most useful and to ask for suggestions for
improving the web site.

Response: The fact that your survey is on-line does not affect the decision as to
whether or not the survey requires OMB clearance. What will affect whether or
not the survey requires an ICR is the nature of your questions. According to OMB
guidance,  if your questions are non-identical then you will not need OMB
clearance.  Identical questions ask each respondent to supply the same level of
information on the same subject. For example, they often supply a specific set of
answers for the user to select from.  Non-identical questions are non-specific and
allow the responder to apply "facts or opinions" of their own choosing without any
direction from the government. In addition, if your survey is primarily for the
purposes of assessing customer satisfaction with your web site, you may want to
consider using an existing ICR Clearance (see  attached pamphlet). If your survey
attempts to get at behavioral changes and/or compliance improvements, then you
may need a separate clearance, again depending on the nature of your questions. In
       1 "The Paperwork Reduction Act of 1995: Implementing Guidance", February 3, 1997, pg. 17

-------
general, if you feel that your questions are non-identical, you may want to keep your
survey brief so that it does not appear to follow a plan of inquiry.

Scenario Three: I would like to ask the users of my web site to identify
themselves by name or by category (e.g. auto service repair shop, car dealer,
consultant).

Response:  No ICR would be required for this activity. According to 5 CFR
1320.3(h)(1), this category of inquiry is not deemed to constitute an
information collection and therefore would not require clearance. The Paperwork
Reduction Act states that "Affidavits, oaths, affirmations, certifications, receipts,
changes of address, consents, or acknowledgment," do not constitute information.
Merely asking users to identify themselves by name or category is a request for an
"acknowledgment" not generally subject to the PRA.

•     Category B: Workshops/Seminars/Training Sessions

Scenario Four: I am planning to hold a compliance assistance workshop for air
permits. This workshop is open to anyone who would like to attend (with limits
on total numbers able to physically attend). After the workshop is over, I would
like to hand out a voluntary questionnaire that asks the attendees questions such as:
a) Has this  workshop provided information that will help you improve your ability
to comply with environmental regulations?

Response: No ICR would be required for this activity. Exemption #8 of the
Paperwork Reduction Act, which covers "facts or opinions submitted in
connection with public hearings or meetings,"2 would apply to this scenario. To
provide more certainty that this exemption applies, it would be best to provide a
Federal Register notice making it clear that the workshop is open to all
interested members of the public.  A second-best option would be to adopt an
open-door policy with respect to the workshop so that no one would be excluded
(except for obvious space limitations) from attending. In addition, you could also
conduct an on-the-spot evaluation of the workshop, since category #8 would most
likely apply to that activity, as well. You could also send a follow-up
questionnaire within a short time period following the  seminar (e.g. one week).
       2 5 CFR 1320.3(h)(8)

-------
Scenario Four A: My compliance assistance program has funding for four
seminars this year. We would like to determine the topics that would be of the
greatest interest to our clients, so we would like to mail out a voluntary
questionnaire that lists potential seminar topics.

Response: An ICR would probably not be required for this activity.   Category #8
of the PRA would apply to this scenario as well. OMB guidance explains that,
"included in this category are questions which ask the proposed participants to
identify themselves and the topic(s) about which they desire to speak."3 Your
request for topics to be discussed is similar to a request to speak on a particular
topic.  Further, the requested items are "in connection with" the public
workshop and category #8 appears to apply to such inquiries.

Scenario Four B: After the completion of a workshop, we would like to send a
follow-up survey out which asks questions about behavioral changes that resulted
from attendance  at the workshop.

Response: In this scenario you would probably need OMB clearance, especially if
there was  a significant time delay before the survey was mailed  out because the
information collected would no longer pertain directly to the public meeting that
was held.  If you were looking for behavioral changes that facilities later adopted
after the workshop had educated them  about environmental requirements, for
example, this information request is not directly related to evaluating the
immediate impact of the workshop (e.g., their satisfaction with the workshop and
their improved awareness/understanding of requirements).
Scenario Five: I will be holding a printing compliance training workshop that will
be made generally available to printers. I would like to administer a "test" before
and after the training to determine if understanding of environmental requirements
changes as a result of the training.

Response: No ICR would be required for this activity.  Category #7,
"Examinations designed to test the aptitude, abilities or knowledge of the persons
       3 "The Paperwork Reduction Act of 1995: Implementing Guidance", February 3, 1997, pg. 26

-------
tested and the collection of information for identification or classification in
connection with such examinations,"4 would apply to this scenario. Your test should
be limited to the respondents' knowledge of the subject matter at hand.
If you wish to use the test to collect socioeconomic information about the
respondents, an ICR will probably be required.

Scenario Six: My office has given a grant to a state to develop a compliance guide
that integrates federal and state rules for metal finishers in my state.  One of the
criteria for awarding the grant was that the grantee have a component for program
evaluation. The grantee plans to include a comment card in the compliance guide
that would get mailed back to the state office.

Response: According to OMB guidance, "In general, collections of information
conducted by recipients of Federal grants do not require OMB approval. On the
other hand, an agency is the sponsor of a collection of information...if the grant
recipient is: 1) collecting information at our specific request; and/or 2) the terms
and conditions of the grant require that we specifically approve the collection
of information or the collection procedures."5 One can ask for a program
evaluation component of a grant proposal and/or measures of success; however,
we can not ask that a particular survey method be used without getting an ICR
approved.
      If, however, the award is a cooperative agreement, then the agency is
considered a sponsor of the information and all of the PRA restrictions on
information collection would apply.

•     Category C: Mailed or Phoned Surveys
Scenario Seven: An EPA employee or contractor would like to know how many
states have a small business policy and plans to call them to ask for a copy of their
policy, if they have one.
Response: No ICR would be necessary to conduct this activity. Category #2 of
       4 5 CFR 1320.3(h)(7)

       5 Ibid., pg. 14

-------
the Paperwork Reduction Act states that the request for "samples of products or
any other physical objects"6 does not constitute a collection of information.
According to OMB, this category "includes requests for information that is already
available in a form suitable for distribution and is provided in that form to all
requesters. (This request is a collection of information if the information has to be
compiled or if it is not provided to any person who requests it)."7
Scenario Eight: An EPA employee or contractor would like to follow up with
those states that have sent us a copy of their small business policy to ask specific
questions about their individual state policies.

Response: No ICR would be required for this activity. Since you will be asking
each of the states questions that pertain only to their specific policy and not
identical questions of each state, Category #6 of the PRA would apply.  Category
#6 of the PRA states that "a request for facts or opinions addressed to a single
person"8 does not constitute a request for information.  However, if EPA asked
the same questions following a plan to more than nine states, the PRA would
apply.
Scenario Nine: My EPA program would like to ask states to voluntarily answer a
survey that asks them to quantify the benefits of their compliance assistance
program.

Response: It is important to understand that the PRA applies not only to industry
and individuals but also to requests for information from states and local
governments.  Further, the fact that the survey is voluntary does not mean that the
PRA does not apply. In this case, OMB clearance would be required because you
are asking identical questions and are directing them to specific entities.
       6 5 CFR 1320.3(h)(2)

       7 "The Paperwork Reduction Act of 1995: Implementing Guidance," draft, February 3,
1997, pg. 24

       8 5 CFR 1320.3(h)(6)

-------
       APPENDIX  E



      For Appendix E, please see



www.epa.gov/oeca/perfmeas/icrfacts.html

-------
                    Fact Sheet

            How to Obtain Clearance
                       for
Regional Compliance Assistance Evaluation Surveys
          Under the Generic ICR 1860.01
             OMB Control # 2020-0015
                      EPA
          U.S. Environmental Protection Agency
                 Office of Compliance
      Office of Enforcement and Compliance Assurance

-------
QUESTION 1: WHEN CAN I USE THE GENERIC COMPLIANCE ASSISTANCE ICR?

The generic clearance shall be used for strictly voluntary collections of program evaluation type
information from customers of regional compliance assistance programs.1 This generic ICR can
be used for the bulk of the planned program evaluation/outcome measure work planned in the
regions, where your goal is to determine the effectiveness of your compliance assistance
activities on the audience that receives the compliance assistance (e.g. participants at a
workshop). The generic ICR shall not be used for measurement work that would require a
statistically-valid methodology (e.g. when generalizing to an overall population).  In those cases
a separate ICR must be undertaken to allow the public the opportunity to comment on the survey
methodology. This restriction, however, is not meant to discourage the use of sound survey
methodology. Programs are encouraged to use good survey design, methodology and
implementation in all of their program evaluation work.
QUESTION 2: HOW DO I RECEIVE APPROVAL FOR MY SURVEY, IF IT MEETS THE
CONDITIONS OUTLINED ABOVE?

Below are the instructions for submitting your survey for clearance:

Prior to initiating the survey, sponsoring regional programs must seek final approval from OMB.
To obtain approval, sponsoring regional programs must submit a clearance package consisting of
a memorandum and a copy of the survey instrument through the Regulatory Information
Division of the Office of Policy. The memorandum will be addressed from the program to Lynn
Johnson, the OECA Desk Officer in RID, Office of Policy (2137). Lynn can be reached at
202-260-2964. A brief one-page memorandum must address the following:

o      Survey title, identification of survey originator (office, point of contact, phone number);

o      Description and intended purpose of the survey as it relates to compliance assistance
       program evaluation; and

o      Costs and burden to the Agency and respondents, and the number of respondents.

RID staff will review each submission to  ensure that it meets the requirements of the Paperwork
       1 In cases where a state or other party is conducting the survey through a cooperative
 agreement, clearance would still be necessary because in these cases OMB considers the Agency
 as sponsoring the information collection, and it is therefore covered by the Paperwork Reduction
 Act.

-------
Reduction Act and any conditions of the generic approval, and may reject any proposed survey
that does not meet the criteria above.
QUESTION 3: HOW LONG WILL THE PROCESS TAKE?

       Following its review, RID will submit surveys and attached materials to OMB
for a 30 working-day review.  During that time you should first check with OP to make certain
that the package was sent over, and then check with OMB to remind them of their deadline. The OMB
examiner who will be reviewing all the surveys under this ICR is Eric Godwin. Eric can be
reached at 202-395-3087 or eric_c.godwin@omb.eop.gov. It would be wise to call or e-mail
Eric to remind him of the deadline around 15 days after it has been sent to OMB. In addition,
you can also include in your cover note to OMB that if you do not hear back from OMB within
the  thirty-day period, you will assume that they have no comments on the survey and that it is
approved.

QUESTION 4: CAN I SEND A LIST OF SURVEY QUESTIONS THAT I PLAN TO USE
OVER AND OVER AGAIN AT WORKSHOPS, TRAINING SEMINARS, AND ON-SITE VISITS
TO OMB FOR APPROVAL?

      Yes. If you are going to be repeating your compliance assistance activities over the
course of the year(s), you can send a comprehensive list of survey questions for pre-approval.
The list of questions would be broader than the list for any one survey. For example, you may
know that you will be conducting MACT training seminars for wood finishers, drycleaners, and
paint coaters next year. You also know that you want to follow up on the seminars by asking
those who attended what changes they made at their facilities as a result of the training. You
would submit for OMB approval a list of questions that apply to all three sectors, plus the
sector-specific ones. When you delivered the surveys, you would use only those questions that
were appropriate to the audience being assisted. In calculating burden, however, you will need
to base it on the overall number of respondents to all of the surveys (see the burden discussion
under Question 6 below).

QUESTION 5: WHAT  DO I HAVE TO DO TO MY SURVEY TO SHOW THAT IT HAS
BEEN APPROVED BY OMB?

The OMB Control Number and expiration date must appear on the front page of an OMB-
approved form or survey, or on the first screen viewed by the respondent for an on-line
application. The rest of the burden statement must be included somewhere on the form,
questionnaire or other collection of information, or in the instructions or cover letter for such
collection.

The following information must appear on the first page of the survey:

       OMB Control No. 2020-0015

       Approval expires 9/30/2001

      Public reporting burden for this collection of information is estimated to average eleven
      (11) minutes per response, including the time for reviewing instructions, gathering
      information, and completing and reviewing the collection of information. Send
      comments on the Agency's need for this information, the accuracy of the provided burden
      estimates, and any suggestions for reducing the burden, including the use of automated
      collection techniques, to the Director, OP Regulatory Information Division, United States
      Environmental Protection Agency (Mail Code 2137), 401 M Street, SW, Washington,
      DC 20460; and to the Office of Information & Regulatory Affairs, Office of
      Management & Budget, 725 17th Street, NW, Washington, DC 20503, Attention: Desk
      Officer for EPA. Include the EPA ICR number 1860.01 and the OMB control number
      2020-0015 in any correspondence. Do not send your completed surveys to this address.

QUESTION 6: IS THERE A STANDARD COVER LETTER FOR THE SUBMITTAL OF
SURVEYS TO EPA OFFICE OF POLICY? AND HOW DO I CALCULATE THE
NECESSARY BURDEN ESTIMATES?

      For each survey, you will need to send a cover memorandum to EPA's Office of Policy.
A sample memorandum is attached as Attachment 1. The memo will be addressed to Lynn
Johnson of the Office of Policy. In the memo, you need to address five areas:

       1) Background: Briefly describe the compliance assistance activity being undertaken.

       2) Survey Purpose and Description: Briefly describe the goals that you are attempting to
       measure through the survey, and attach the actual survey (or list of survey questions).

       3) Survey Methodology and Use of Results: Explain how you plan to conduct the survey
       (comment card, mail-back form, phone, etc.).

       4) Respondents' Burden: You will need to calculate total respondents' burden hours and
       total respondents' cost. Follow the approach below and see the example in Attachment 1.

       a) Number of Respondents: How many respondents you anticipate will respond to the
       survey.

       b) Minutes per Response: How long it will take to complete the survey. In the ICR, our
       estimates were as follows: phone surveys, 10 minutes; mailed/faxed-back surveys, 10
       minutes; on-line surveys, 5 minutes; comment cards, 5 minutes.

       c) Cost per Hour: In the ICR, we estimated the cost per respondent at $33/hr. This figure
       is based on the Bureau of Labor Statistics' "Employer Costs for Employee Compensation"
       for the typical types of small businesses that we serve through our compliance assistance
       program, and it is the figure you should use in your calculation of respondents' cost.

      Total Burden Hours = (# of Respondents X Minutes per Response) / 60
      Total Respondents' Cost = $33 X Total Burden Hours

      5) Agency Burden: This refers to the time that it will take you to review the responses
      and conduct your analysis. You will need to supply the Total Agency Burden, which is
      calculated by multiplying EPA staff time in hours by the cost per hour. In the ICR, we
      estimated the cost per hour at $34.98/hr. This rate is based on a 1998 GS-13/01 salary of
      $55,969, or $26.91/hr; with 30% overhead, the hourly rate is $34.98. This figure should
      consistently be used in your calculation of Agency Burden.

      Total Agency Burden = Hours of EPA staff time X $34.98

If you choose to supply OMB with a list of survey questions that you will use throughout your
regional compliance assistance program, you will still need to perform the above calculations;
however, you provide OMB with an aggregate burden that totals the individual burden for each
use of a survey that pulls questions from the overall list.
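Expressed as a short calculation, the burden arithmetic works out as follows (a sketch only; the
$33/hr and $34.98/hr rates are the ICR figures quoted above, and the function names are
illustrative, not part of any EPA tool):

```python
def respondent_burden(num_respondents, minutes_per_response, rate=33.00):
    """Respondents' burden per the generic ICR formula:
    hours = (respondents x minutes per response) / 60; cost = $33/hr x hours."""
    hours = num_respondents * minutes_per_response / 60
    return hours, round(hours * rate, 2)

def agency_burden(staff_hours, rate=34.98):
    """Total Agency Burden: EPA staff hours x $34.98/hr."""
    return round(staff_hours * rate, 2)

# Attachment 1 example: 3,000 respondents at 5 minutes each, 100 hours of EPA staff time.
hours, cost = respondent_burden(3000, 5)
print(hours, cost)          # 250.0 hours, $8,250.00 respondents' burden
print(agency_burden(100))   # $3,498.00 Agency burden
```

These figures match the worked example in Attachment 1.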

QUESTION 7: HOW CAN I LEARN WHAT OTHER REGIONS ARE DOING IN THIS
AREA?

       This generic ICR is a pilot ICR to enable us to learn how to conduct outcome
measurement of compliance assistance activities. To facilitate information sharing on survey
development, regions are asked to submit copies of their actual surveys to Lynn Vendinello in
the Office of Compliance at vendinello.lynn@epa.gov. OC, in conjunction with EPA's Office of
the Small Business Ombudsman, will post the surveys on an EPA web site at
http://www.smallbiz-enviroweb.org/perfmeas.html/
The web site has been developed in part to educate state programs about compliance assistance
efforts, including measurement activities. The Office of Compliance will be working with EPA's
Office of the Small Business Ombudsman to have the surveys categorized according to type of
compliance assistance activity and intended outcome.

QUESTION 8: DOES THE APPROVED ICR CHANGE THE OFFICE OF COMPLIANCE
EXPECTATIONS WITH RESPECT TO RECAP MEASUREMENT FOR FY'99?

       The Annual Program Goals for OECA for FY'2000 anticipate that we will be
implementing a nationally consistent methodology for outcome measurement of compliance
assistance. In FY'99, OC will ask the regions to collect outcome measures for their compliance
assistance efforts in a manner that is consistent with forthcoming FY'99 RECAP guidance. As
part of the National Performance Measures Strategy Implementation, OC, in concert with
regional Task Group members, is exploring ways to facilitate RECAP compliance assistance
reporting, including the entry of output and outcome data into a PC-based database system.
Until that system is available to all regions, regions are encouraged to save data on:

       1) response rates, follow-up strategies, and important lessons related to survey design
       and implementation; and

       2) actual survey results;

and to submit these data as part of the FY'99 RECAP reporting process.

QUESTION 9: IS THE OFFICE OF COMPLIANCE GOING TO PROVIDE ME WITH ANY
ADDITIONAL ASSISTANCE ON SURVEY METHODOLOGY AND DESIGN?

In support of the National Performance Measures Strategy Implementation, OC has enlisted the
services of a contractor to assist the regions in their survey work. The contractor will be
producing additional guidance on survey design and will assist in the development of survey
questions based on our current categorization of outcome measures for compliance assistance.
See Attachment 3 for the NPMS Set 3/10 Task Group's Current Categorization of Compliance
Assistance Outcome Measures. In addition, to see examples of surveys that others have used,
please visit the EPA Small Business Office's Web Site at:
http://www.smallbiz-enviroweb.org/perfmeas.html/

-------
ATTACHMENT 1: SAMPLE COVER MEMO FROM REGIONAL PROGRAM TO
OFFICE OF POLICY
                U.S. ENVIRONMENTAL PROTECTION AGENCY
                                  REGION I
        OFFICE OF ENVIRONMENTAL MEASUREMENT & EVALUATION
              60 WESTVIEW STREET, LEXINGTON, MA 02173-3185

MEMORANDUM

DATE:      September 9, 1999

SUBJ:      Request for OMB Approval of Compliance Assistance Program Evaluation
           Survey: Pilot Generic ICR #1860.01

FROM:     John Moskal
            New England Environmental Assessment Team

TO:         Lynn Johnson, OECA Desk Officer
            Regulatory Information Division
            Office of Policy

EPA's Region 1, New England Office is preparing to distribute copies of the New England Auto
Repair & Refinishing Compliance Manual. In order to learn whether the document improves
readers' understanding of their federal and state regulatory requirements, we are including a
mail-back survey with the document.

      Approximately 12,000 copies of the manual will be distributed upon request to auto
service and repair shops in the EPA Region 1 states, with the survey form as an insert. We
expect to receive approximately 3,000 responses.

      If you have any questions or concerns about this request, please contact John Moskal at
617-565-2735.

Attachments

-------
                Request for Approval of Information Collection Activity

I. Background

The Auto Service and Refinishing Compliance Manual is an outreach tool, designed to inform
auto service and repair shops of their federal and state multi-media compliance obligations in
user-friendly, plain English. It was produced through a cooperative agreement with the New
England Waste Management Officials' Association (NEWMOA) under Section 215 of the Small
Business Regulatory Enforcement Fairness Act. The purpose of this manual is to provide clear
and concise information to auto service and repair shops and local technical assistance
providers that meets their informational needs and allows them to better understand the
integration of their federal and state multi-media rules.

II. Survey Purpose and Description

The New England Waste Management Officials' Association (NEWMOA), in cooperation with
EPA Region 1, is planning to conduct a program evaluation survey in the form of a mail-back
card, to evaluate whether the manual provided auto service and repair shops with the
information they need to understand their federal and state compliance obligations in a way that
is easy to read and use. The results will be used to improve the manual's content, readability,
and use.

III. Survey Methodology and Use of Results

The potential target audience for the evaluation forms consists of approximately 12,000 auto
service and repair shops in the three New England states as well as government personnel (local,
state, and national). EPA Region 1 plans to distribute the forms as inserts in copies of the New
England Auto Service and Repair Compliance Manual. We anticipate that approximately 3,000
readers will respond. We estimate that it will take a respondent approximately five minutes to
complete an evaluation form.

EPA Region 1 will create a database to track evaluation form responses. The information will be
used to prepare a report that will summarize the findings and make recommendations to
NEWMOA and Regional managers on how to improve the readability, use, and content of this
report and other similar compliance assistance activities.

IV. Respondents' Burden

Number of Respondents:           3,000
Minutes per Response:            5 minutes x 3,000 = 15,000 minutes = 250 hours
Cost per Hour:                   $33.00*
Total Burden:                    250 hours; $8,250

* Based on the Bureau of Labor Statistics' "Employer Costs for Employee Compensation". This
is the estimate used in the generic ICR.


V. Agency Burden

EPA Staff Time                  100 hours
Cost per Hour                    $34.98**
Total Burden                    100 hours; $3,498


**This rate is based on a 1998 GS-13/01 salary of $55,969, or $26.91/hr; with 30% overhead,
the hourly rate is $34.98. This is the estimate used in the ICR.

-------
ATTACHMENT 2: Sample Cover Note from Office of Policy to OMB

                  U.S. ENVIRONMENTAL PROTECTION AGENCY
                                  REGION VII
                  OFFICE OF WATER WETLANDS & PESTICIDES
                  726 MINNESOTA AVENUE, KANSAS CITY, KS.
MEMORANDUM

DATE:      September 14, 1998

SUBJECT:  Request for OMB Approval of Program Evaluation Survey under Generic ICR
            #1860.01

FROM:     DAVID WILCOX
            IOWA SBREFA PROJECT OFFICER

TO:        Steve Vineski, IPB Desk Officer
            Information Policy Branch
            Office of Policy

       EPA's Region 7 Water, Wetlands & Pesticides Office, in cooperation with the Iowa
Waste Reduction Center (IWRC), is preparing to develop a "Handbook of Environmental
Regulations for Agribusiness". In order to help the IWRC and EPA assess the knowledge base
and needs of the agribusiness community and help identify the types of information that will be
most useful, they are asking agribusiness facilities to fill out the attached mail-back survey.

       Approximately 1,900 survey forms will be distributed to Iowa agribusiness facilities.
We expect to receive approximately 190 responses.

       If you have any questions or concerns about this request, please contact David Wilcox at
515-284-4606.

Attachments

-------
                Request for Approval of Information Collection Activity

I. Background

       The Handbook of Environmental Regulations is an outreach tool, designed to inform
agribusiness facilities of their federal and state multi-media compliance obligations in
user-friendly, plain English. It will be produced through a cooperative agreement with the Iowa
Waste Reduction Center (IWRC) under Section 215 of the Small Business Regulatory
Enforcement Fairness Act. The purpose of this manual is to provide clear and concise
information to the agribusiness community that meets their informational needs and allows them
to better understand the integration of their federal and state multi-media rules and regulations.

II. Survey Purpose and Description

       The Iowa Waste Reduction Center (IWRC) in cooperation with EPA Region 7 is
planning to conduct a needs evaluation survey in the form of a mail-back form, to evaluate how
to make the handbook more useful, assess the knowledge base and needs of the agribusiness
community, and help identify the information that will be most useful.

III. Survey Methodology and Use of Results

       The potential target audience for the evaluation forms consists of approximately 1,900
agribusiness facilities. IWRC plans to distribute the forms with a postage-paid envelope for
easy return. We anticipate that approximately 190 readers will respond. We estimate that it will
take a respondent approximately 10 minutes to complete an evaluation form.

       IWRC will use the survey results to help tailor the contents and format of the
"Handbook of Environmental Regulations for Agribusiness". The survey answers will help the
Iowa Waste Reduction Center and the Iowa Department of Economic Development Small
Business Liaison, who has contracted with IWRC to develop and distribute the survey, to assess
the knowledge and needs of the Iowa agribusiness community.

IV. Respondents' Burden

       Number of Respondents:     190
       Minutes per Response:      10 minutes x 190 = 1,900 minutes = 32 hours
       Cost per Hour:             $33.00*
       Total Burden:              32 hours; $1,056

* Based on the Bureau of Labor Statistics' "Employer Costs for Employee Compensation". This
is the estimate used in the generic ICR.
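A quick check of the Attachment 2 arithmetic (a sketch only; note that the 1,900 minutes of
total respondent time works out to about 31.7 hours, which the memo rounds up to 32 whole
hours before applying the $33/hr ICR rate):

```python
import math

minutes = 10 * 190               # 190 respondents x 10 minutes each = 1,900 minutes
hours = math.ceil(minutes / 60)  # rounded up to whole hours -> 32
cost = hours * 33.00             # $33/hr ICR rate -> $1,056.00
print(minutes, hours, cost)
```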

V. Agency Burden

       EPA Staff Time:            10 hours
       Cost per Hour:             $34.98
       Total Burden:              10 hours; $349.80


-------
ATTACHMENT 3: NPMS WORKING DRAFT CATEGORIZATION OF OUTCOME
MEASURES FOR COMPLIANCE ASSISTANCE
                               Outcome Categories

I. Changes in Awareness/Understanding of Compliance Assistance Opportunities and
Regulatory Requirements:

a) # who seek assistance
b) # who improve their understanding of regulatory requirements

II. Changes in Behavior:

a) # of recommendations adopted

Regulatory:
a) # of regulatory requirements adopted (includes notifications, permits adopted, etc.)

Non-regulatory:
a) # of EMRs adopted
b) # of BMPs adopted
c) # of industrial process changes adopted

III. Improvements in Compliance:

a) change in compliance indicator level (e.g., using a simplified compliance checklist)
b) change in compliance rate

IV. Environmental and Human Health Improvements:

a) # of facilities that reduce emissions or other pollutants
b) # of human health/worker protection improvements
c) amount of emissions reduced, pollutants reduced, and/or risk reduced

-------
APPENDIX F

-------
MENU OF SAMPLE SURVEY QUESTIONS BY OUTCOME MEASURE
This appendix provides a menu of sample survey questions you can tailor for your specific evaluation
efforts. Use them as a guide to get you started.  Add questions, as appropriate, and delete questions that
are unrelated to your evaluation effort.  Modify the questions to fit the sector you targeted as well as for
the type of compliance assistance activity.  If your workshop did not discuss the importance of self-
auditing, for example, you do not need to include that as a potential response to Question 7, under
Behavioral Changes. Similarly, if your assistance focused on dry  cleaners and their obligations under the
Clean Air Act Amendments, you can delete the questions related to water and waste discharges.

Section I of this document focuses on outcome measurement, which is central to OECA's commitments
under the National Performance Measures Strategy (NPMS) and the Government Performance and
Results Act.  Section II lists supplemental questions for onsite visits, workshops, and Web sites. Some of
these questions are also related to the outcome measures—specifically, the number of audit
recommendations implemented. Section III of this document includes sample questions to assess the
background of the respondent and customer satisfaction.
SECTION I:
OUTCOME MEASUREMENT
Measuring the outcomes of compliance assistance—changes in awareness and understanding, behavioral
change, and environmental and human health impacts—is a central component of OECA's performance
measurement strategy.  Questions related to the key measures that OECA plans to track are listed in bold
text and coded accordingly. Table 1 lists the outcome measures and the associated codes.

Table 1.  Outcome Measures and Associated Codes

 Outcome Measure        Specific Measures                                                Code

 Awareness and          1. Number who improve their understanding of regulatory          A1
 Understanding             requirements.

 Behavioral Change      Regulatory:
                        1. Number of regulatory requirements adopted (includes           BR1
                           notifications, permits adopted, labeling, reporting, etc.).
                        2. Changes in level of compliance (multimedia, subset,           BR2
                           compliance indicators).
                        Nonregulatory:
                        1. Number of process changes adopted.                            BNR1
                        2. Number of environmental management changes or reviews         BNR2
                           adopted.
                        3. Number of best management practices adopted.                  BNR3
                        4. Number of self-audits conducted.                              BNR4
                        5. Number of recommendations adopted.                            BNR5
                        6. Number of facilities changing regulatory status.              BNR6

 Environmental and      1. Number of facilities that reduce emissions or other           E1
 Human Health              pollutants.
 Improvements           2. Number of human health/worker protection improvements.        E2
                        3. Amount of emissions reduced, pollutants reduced, and/or       E3
                           risk reduced.
A.   Outcome Measurement Category: Awareness and Understanding

1.    Would you say that you are more aware of and knowledgeable about environmental
      requirements and opportunities as a result of this compliance assistance?¹

             □     Yes (Code: A1)
             □     No
             □     N/A

2.    As a result of the assistance you received, how has your understanding of the environmental
      regulations that apply to your business improved?

             □     A great deal. I feel that I understand what is required.
             □     Somewhat. I am still a bit confused about the regulations.
             □     Not at all.
             □     N/A

             Comments:

3.    What would have helped you to understand the environmental regulations more fully?

             □     More clearly written regulations.
             □     Better-written guidance materials.
             □     A more knowledgeable staff person.
             □     A training class or workshop.
             □     More time to read the materials.
             □     Other:

      ¹ Questions addressing compliance assistance outcome measures are listed in bold text.

-------
4.    What did you learn that will be most useful to you?

             □     How to apply for a permit.
             □     Information on new equipment or techniques to use to lower emissions.
             □     How to implement an environmental management system.
             □     The name of a contact in another regulatory department.
             □     Information on how similar companies have reduced emissions or improved
                   compliance.
             □     Other: ______
B.   Outcome Measurement Category: Behavioral Change

1.    Did you share the awareness/knowledge gained through this compliance assistance activity with
      others in your organization or company?

             □     Yes
             □     No

2.    Did you make changes in environmental practices or otherwise take action to comply with
      specific federal, state, or local environmental regulations as a result of the compliance
      assistance?

             □     Yes (Code: BR1)
             □     No
             □     N/A
             □     Not yet, but we are planning on taking action.

3.    To comply with specific federal, state, or local environmental regulations, has your facility
      filed a new notification form or letter?

             □     Yes (Code: BR1)
             □     No
             □     N/A
             □     Not yet, but we are planning on doing so soon.

             If yes, please identify what type of form or letter. Check all that apply:

             Hazardous Waste (RCRA):  □ Generator (Code: BR1)
                                      □ Treatment/Storage/Disposal Facility (Code: BR1)
             Air Emissions:           □ NESHAP/MACT (Code: BR1)
                                      □ New Source Performance Standards (Code: BR1)
                                      □ Refrigerant Recovery/Recycling Device Certification
                                        (Code: BR1)
             Toxic Chemicals:         □ TRI Report Form "R" (Code: BR1)
                                      □ TRI Alternate Threshold Certification Report
                                        (Code: BR1)
             Hazardous Chemicals:     □ Tier 1 or Tier 2 Chemical Inventory Report (Code: BR1)
             Oil Spills (OPA-90):     □ Facility Response Plan (Code: BR1)
                                      □ SPCC Plan (Code: BR1)
4.    To comply with specific federal, state, or local environmental regulations, has your facility
      applied for a new permit or permit modification?

             □     Yes (Code: BR1)
             □     No
             □     N/A
             □     Not yet, but we are planning on doing so soon.

             If yes, please specify the permit type. Check all that apply:

             Hazardous Wastes (RCRA): □ Treatment (Code: BR1)
                                      □ Storage (Code: BR1)
                                      □ Disposal (Code: BR1)
             Air Emissions:           □ Title V Operating Permit (Code: BR1)
             Water Discharges:        □ General Permit for Discharge of Storm Water
                                        (Code: BR1)
                                      □ Individual Permit for Discharge to Surface Water
                                        (Code: BR1)

5.    Has your facility improved labeling/manifesting, recordkeeping, monitoring, or reporting as a
      result of EPA's compliance assistance?

             □     Yes (Code: BR1)
             □     No
             □     N/A
             □     Not yet, but we are investigating this option.

             If yes, please indicate the area where the change was made. Check all that apply:

             □     Labeling/manifesting change. (Code: BR1)
             □     Recordkeeping change. (Code: BR1)
             □     Monitoring/sampling change. (Code: BR1)
             □     Reporting change. (Code: BR1)
             □     Testing change. (Code: BR1)
6.    Has your facility installed pollution control equipment or changed industrial processes as a
      result of EPA's compliance assistance?

             □     Yes (Code: BR1 or BNR1)
             □     No
             □     N/A
             □     Not yet, but we are investigating this option.

             If yes, please indicate the type of change made. Check all that apply:

             □     Installed emissions control equipment. (Code: BR1)
             □     Installed a waste treatment system. (Code: BR1)
             □     Purchased new industrial process equipment. (Code: BNR1)
             □     Made a process change without purchasing new equipment. (Code: BNR1)
7.    Has your facility implemented procedures to improve environmental practices as a result of
      EPA's compliance assistance?

             □     Yes (Code: BNR2, 3, or 4)
             □     No
             □     N/A
             □     Not yet, but we are investigating this option.

             If yes, please indicate the type of change made. Check all that apply:

             □     Implemented an environmental management system. (Code: BNR2)
             □     Provided training to improve awareness and/or practices in your organization.
                   (Code: BNR2)
             □     Changed the type of raw materials (including chemicals) used at your facility.
                   (Code: BNR2)
             □     Negotiated with suppliers or waste handlers to improve information or practices.
                   (Code: BNR2)
             □     Instituted new best management practices. (Code: BNR3)
             □     Conducted a self-audit. (Code: BNR4)

8.    As a result of the changes indicated in Questions 6 and 7 above, has your facility changed
      regulatory status (e.g., found that the regulations do not apply to you or reduced emissions to
      avoid permitting altogether)?

             □     Yes (Code: BNR6)
             □     No
             □     N/A
             □     Not yet, but my facility is investigating this option.

             If yes, please indicate how your regulatory status has changed. Check all that apply:

             □     Changed from small quantity hazardous waste generator status to
                   nonregulated status. (Code: BNR6)
             □     Changed from a "major" source to a "minor" source of air emissions.
                   (Code: BNR6)
             □     Changed from a discharger to a zero discharger of industrial waste water.
                   (Code: BNR6)
             □     Other:

9.    Are there other ways you have changed environmental practices or procedures, as a result
      of EPA's compliance assistance, that are not captured in the above questions?

-------
C.   Outcome Measurement Category: Environmental and Human Health Improvements

1.    Have you been able to reduce your emissions and/or wastes as a result of the behavioral
      changes identified in Section I.B above? This reduction could be the result of implementing
      compliance requirements and/or pollution prevention projects.

             □     Yes (Code: E1)
             □     No
             □     N/A
             □     No changes yet, but we plan to make some.

             If yes, identify areas of success and provide the amount of emissions, pollutants, and/or
             risk reduced, and associated cost savings, if possible: (Code: E3)

             Activity                     Reduction              Cost Savings

2.    If your facility made changes in environmental practices in Section I.B above, do you
      believe these changes have resulted in a safer environment for your employees?

             □     Yes (Code: E2)
             □     No
             □     N/A
             □     No changes yet, but we plan to make some.

             If you answered yes, please describe how the workers' environment is safer (e.g.,
             workers are now exposed to fewer chemicals):

-------
SECTION II:     SUPPLEMENTAL QUESTIONS FOR ONSITE VISITS, WORKSHOPS, AND WEB SITES

A.       Supplemental Questions for Onsite Visits

1.    Have you implemented any of the opportunities discussed during the site visit regarding waste
      reduction or pollution control? If so, please describe which ones and the results. (Code: BNR5)

             Activity                     Pollutant/Reduction    Cost Savings

2.    As a result of the assistance you received, have you come up with any additional ideas
      regarding waste reduction? If so, please describe the ideas, and any results, if available.

             Activity                     Pollutant/Reduction    Cost Savings

3.    Have you contacted other state or federal agencies (e.g., Air Quality, Water Quality) as
      recommended on your site visit information sheet?

             □     Yes
             □     No
             □     Not yet, but we plan to.
             □     N/A

-------
B.        Supplemental Questions for Workshops
          In addition to the questions in this section on technical content and logistically issues, you can
also modify the questions on behavioral change (see Section IB, above) to assess intent of facilities,
based on the information presented.  For example, one question might be:
          Based on what you learned in today's workshop, do you plan to make improvements in
          labeling/manifesting, recordkeeping, monitoring or reporting?
                 a     Yes  (Code:BRl)
                 a     NO
                 a     N/A
                 U     Not yet, but we will investigate further.
          If yes, please indicate the area where you plan to make a change. Check all that apply:
                 O     Labeling/manifesting change. (Code: B R 1)
                 U     Recordkeeping change. (Code: B R 1)
                 U     Monitoring/sampling change. (Code: B R 1)
                 O     Reporting change. (Code: B R 1)
                 O     Testing change. (Code: B R 1)
          You can then follow up with facilities at a later date to determine the changes actually made.
Keep in mind the amount of time facilities need to make these  types of changes when conducting your
followup surveys.
Technical Content and Logistical Questions for Workshops
1. Was the material presented clearly and in a logical sequence?
          □      Yes
          □      No
2. How would you rate the handouts and materials?
          □      Excellent
          □      Very Good
          □      Good
          □      Fair
          □      Poor

-------
3. Would you like more seminars like this one made available to you?
         □     Yes
         □     No
         If yes, please list topics:
4. What was the most useful part(s) of the workshop?
5. What was the least useful part(s) of the workshop?
6. What topic(s) would you have liked to have spent more time on?
7. Would you be willing to spend more than 2 hours at a workshop in order to cover more topics?
         □     Yes
         □     No
8. Do you have any suggestions to help us reach more people like yourself to have them attend these
  seminars?
9. Was the location of this meeting convenient for you?
         □      Yes
         □      No
         □      Somewhat

-------
         Comments:
10.       What other locations would you recommend for future seminars?
  11.     Was the time of the seminar convenient for you?
          □      Yes
          □      No
          □      Somewhat
  12.     What other times might be more convenient?
          □      Morning (8:00 a.m. - 11:00 a.m.)
          □      Lunch (12 noon - 2:00 p.m.)
          □      Afternoon (2:00 p.m. - 5:00 p.m.)
          □      Evening (7:00 p.m. - 9:00 p.m.)
  13.     Did the speakers show knowledge of the subject?
          □      Yes
          □      No
  14.     Was the technical level right for you?
          □      Yes
          □      No
  15.     Were the questions handled appropriately?
          □      Yes
          □      No
  16.     What did you learn that will be the most helpful to you?

-------
  17.     On a scale of 1-10 (10 being the highest), how would you rate:
         	   The workshop
         	   The presenters

  18.     Would you be interested in having a followup onsite compliance assessment?
         □     Yes
         □     No

C.       Supplemental Question for Web Sites

1. How did you learn about this Web site?
         □     Search engine (please specify): _______________
         □     Link from another Web site (please specify): _______________
         □     From an EPA document
         □     Referral from a colleague
         □     Other (please specify): _______________

-------
SECTION III:     SUPPLEMENTAL BACKGROUND AND CUSTOMER SATISFACTION QUESTIONS

A.       Supplemental Background Questions

  1.      What type of organization do you work for?

         □      Regulated facility or business
                Industry sector: _______________
         □      Consulting company or law firm
         □      Government
         □      Trade association
         □      Nonprofit organization
         □      School or university
  2.      How did you become aware of this [insert name of compliance assistance activity]?
         □      Referral from another government agency, official, or hotline
         □      Referral from another business
         □      Trade association
         □      EPA letter or mailing
         □      EPA workshop, seminar, or conference
         □      Web site
         □      EPA publication or newsletter
  3.      In the past, have you used any other compliance assistance tools provided by EPA, such as
         (check all that apply):
         □      Hotline
         □      Fact sheets
         □      Guidance documents
         □      Web site
         □      Onsite visits
         □      Workshops, seminars, or conferences

-------
  4.      In what areas did you request assistance? (Check all that apply.)
          □      Air permitting or regulations
          □      Water permitting or regulations
          □      RCRA/Hazardous waste permitting or safe handling of waste
          □      Community right-to-know regulations
          □      Toxic substances
          □      Pesticides
          □      Underground storage tanks
5. What prompted you to seek assistance? (Check all that apply.)
          □      To find out if a specific environmental regulation applies to my facility.
          □      To obtain general information about regulations.
          □      To get help filling out a permit application form.
          □      To identify ways to change status from regulated to unregulated business.
          □      To obtain information about equipment or processes that will help us save money
                 complying with regulations.
          □      To learn about pollution prevention opportunities.
          □      Other:
6. Did the assistance provided adequately address the need you identified in Question 5 above?
         □      Yes
         □      No
B.        Supplemental Customer Satisfaction Questions
          The first four questions are "core" customer satisfaction questions from EPA's customer
feedback and customer satisfaction measurement guidelines.  When using these questions alone, EPA
recommends using a scale of 1 to 6 for customer satisfaction surveys for consistency.  When combining
customer satisfaction questions with questions to measure outcomes, you might want to adjust the scale
for consistency.  While a range of answers is good for customer satisfaction surveys, this approach makes
less sense for outcome measurement, as EPA needs precise answers to track outcomes.
1. Overall, how satisfied are you with the services and products you have received from EPA?
          1      2      3      4      5      6
          not at all                                                         very much

-------
2. How courteously did EPA staff treat you?

          1      2      3      4      5      6
          not at all                                                         very much

3. How satisfied are you with the communications you have received from EPA?

          1      2      3      4      5      6
          not at all                                                         very much

4. How fully did EPA respond to your needs?

          1      2      3      4      5      6
          not at all                                                         very much

5. Would you recommend this [insert name of compliance assistance activity] to other businesses?
         □     Yes
         □     No
6. How would you rate the technical understanding of the person assisting you (for helplines, workshops,
  onsite assistance) or the technical quality of materials (for publications, Web sites, and other
  materials)?
         □     Excellent
         □     Good
         □     Fair
         □     Poor
         □     No way to tell
7. How can we improve delivery of this service?

-------
APPENDIX G

-------
EXAMPLES OF SURVEYS DEVELOPED TO MEASURE COMPLIANCE ASSISTANCE OUTCOMES

This appendix provides examples of five surveys developed to measure compliance assistance outcomes.

Sample 1:

EVALUATION OF DRY CLEANERS WORKSHOP
(AIR QUALITY REQUIREMENTS)
Date	
Your comments are important. Please fill out and return this anonymous evaluation form.

1.  Rate the usefulness and effectiveness of the workshop on a scale of 1 to 5:

    5 = very useful/effective, 3 = somewhat useful/effective, 1 = not useful or effective

      Low    Medium    High
      1    2    3    4    5        Overall workshop
      1    2    3    4    5        Handouts, materials
      1    2    3    4    5        Workshop location
      1    2    3    4    5        Advertisement/Notice of workshop schedule

2.  Has the workshop had any of the following effects?

a.     I am more aware of, and knowledgeable about, the
       environmental requirements as a result of this workshop.         Yes ___   No ___
b.     I plan to make changes in my company's recordkeeping practices.  Yes ___   No ___
c.     My company needs to install equipment to comply with the federal
       regulations (see question 5).                                    Yes ___   No ___

3.  Did you know before this workshop that both (1) the requirements for using certain emissions control
equipment and (2) the frequency of inspecting for leaks vary according to the type of dry cleaning
equipment used, the date the machines were installed, and the amount of perc purchased per year?

                                                                 Yes ___   No ___
4.  What recordkeeping changes do you anticipate making at your facility?

                                                   Plan        Already     No
                                                   to do       do          plans
•     keep receipts for the last 5 years of perc
      purchases                                    ____        ____        ____
•     keep a monthly log showing the amounts of perc
      purchased and the calculation of yearly perc
      consumption                                  ____        ____        ____
•     keep an operation and maintenance manual
      on site                                      ____        ____        ____
•     keep a log of the results of leak detections
      and repairs                                  ____        ____        ____
5. At your facility, where are you in the process of complying with the following requirements?
                                                      Plan to       Already      No
                                                      install       installed    plans
•      install required emissions control equipment     ____          ____        ____

For refrigerated condensers:
•      install temperature sensors at the outlet to measure
       outlet temperature (from dryer airstream)        ____          ____        ____
•      install temperature sensors at the inlet and outlet in
       order to measure temperature difference
       (from washer airstream of transfer machines)     ____          ____        ____

                                                      Plan          Already
                                                      to do         do
•      measure the above temperatures weekly            ____          ____

For carbon adsorbers:
•      measure the concentration of perc in the exhaust
       of the carbon adsorber using a colorimetric
       detector tube weekly                             ____          ____

6. In what areas, if any, would you like further training?

Suggestions or comments regarding this workshop or other EPA assistance activities?

-------
Sample 2:

Compliance Assistance Survey for Missouri Animal Feeding Operation Rule Pamphlet


Please assist us by completing the following questions and returning this card to the Missouri Department of Natural
Resources (postage paid).


1. What was your level of knowledge about rules for Animal Feeding Operations in Missouri before reading the pamphlet?
(Circle one)

              High          Medium               Low

2. How well did the information in the pamphlet answer your questions?

              Very well      Satisfactorily           Did not help

3. Did the pamphlet improve your understanding of the rules for Animal Feeding Operations?

        Very effectively      Effectively      Poorly      Not at all

4. What other information should have been included?


5. Do you have any suggestions on how the pamphlet could be improved?

-------
Sample 3:

May ??, 1998


Dear ,

The Iowa Waste Reduction Center (IWRC), in cooperation with the Iowa Small Business Liaison, will be creating a
"Handbook of Environmental Regulations for Agribusiness." This handbook will fully integrate state and federal
regulations into one easy-to-use reference book. Each regulation will be summarized with examples of how it could
affect an agribusiness facility. Where appropriate, guidance in determining which regulations apply to your facility
will also be included. The handbook will also give pollution prevention tips, provide state and regional contacts, and
give descriptions of applicable environmental assistance programs.

In order to make this handbook as useful as possible, we are asking agribusiness facilities to complete the attached
survey. The survey will help the IWRC assess the knowledge base and needs of the agribusiness community. It will
also help the IWRC identify the information types that will be the most useful. All information reported in this survey
will remain confidential.

The survey will take you less than 10 minutes to complete and comes with a postage paid envelope for easy return. If
you complete and return this survey you will receive a free copy of the handbook. Through the survey you may also
request to be added to the IWRC's free newsletter mailing list and/or receive a free, confidential, on-site visit from the
IWRC. The IWRC professional will review your operation and provide practical guidance on environmental
regulations and waste reduction in a confidential report sent directly to you.

Please complete the survey and return it prior to July 10, 1998. Thank you for your assistance. We will strive to make
this document as user-friendly and as useful as possible.

Iowa Waste Reduction Center                    Iowa Small Business Liaison
University of Northern Iowa                    Iowa Department of Economic Development
800/422-3109                                   800/351-4668

-------
  Thank you for taking the time to complete this survey. This survey is to help us tailor the contents and format of the
   "Handbook of Environmental Regulations for Agribusiness". Your answers will help the Iowa Waste Reduction
                    Center assess the knowledge base and needs of you and your colleagues.

Contact Information
Name 	
Title 	
Address 	
City, state  zip code	
Phone number	
FAX 	
E-mail 	
SIC (if known)	
Environmental Survey
Please indicate your knowledge level concerning the environmental regulations that apply to each portion of your business.
If a particular sector does not apply, please check "not applicable". If you feel you are in compliance with this set of
regulations please check "In Compliance." Also let us know if you would be interested in environmental information
relevant to each business sector.
1)  Fertilizer storage and handling
    KNOWLEDGE:  □ No Knowledge   □ Some knowledge   □ Very knowledgeable   □ Not applicable
    COMPLIANCE: □ In Compliance  □ Not Sure         □ Would like information

2)  Pesticide and Chemical Storage and handling
    KNOWLEDGE:  □ No Knowledge   □ Some knowledge   □ Very knowledgeable   □ Not applicable
    COMPLIANCE: □ In Compliance  □ Not Sure         □ Would like information

3)  Fuel storage/fueling station
    KNOWLEDGE:  □ No Knowledge   □ Some knowledge   □ Very knowledgeable   □ Not applicable
    COMPLIANCE: □ In Compliance  □ Not Sure         □ Would like information

4)  Vehicle Maintenance
    KNOWLEDGE:  □ No Knowledge   □ Some knowledge   □ Very knowledgeable   □ Not applicable
    COMPLIANCE: □ In Compliance  □ Not Sure         □ Would like information

5)  Feed, Seed and Grain Handling
    KNOWLEDGE:  □ No Knowledge   □ Some knowledge   □ Very knowledgeable   □ Not applicable
    COMPLIANCE: □ In Compliance  □ Not Sure         □ Would like information

6)  Grain Drying and Storage
    KNOWLEDGE:  □ No Knowledge   □ Some knowledge   □ Very knowledgeable   □ Not applicable
    COMPLIANCE: □ In Compliance  □ Not Sure         □ Would like information

7)  Storm Water/Waste Water Issues
    KNOWLEDGE:  □ No Knowledge   □ Some knowledge   □ Very knowledgeable   □ Not applicable
    COMPLIANCE: □ In Compliance  □ Not Sure         □ Would like information

-------
8)  Pollution Prevention
    KNOWLEDGE:  □ No Knowledge   □ Some knowledge   □ Very knowledgeable   □ Not applicable
    COMPLIANCE: □ In Compliance  □ Not Sure         □ Would like information

 Information Sources

 Please indicate the sources currently used to gain environmental information, rate their helpfulness, and indicate if you
 would like additional information regarding these sources.
                                1 = helpful ... 5 = not helpful
 Web pages                      1  2  3  4  5    □ do not use    □ need additional information
 List serves                    1  2  3  4  5    □ do not use    □ need additional information
 ISU extension services         1  2  3  4  5    □ do not use    □ need additional information
 University assistance          1  2  3  4  5    □ do not use    □ need additional information
 Chemical/seed representative   1  2  3  4  5    □ do not use    □ need additional information
 Trade associations             1  2  3  4  5    □ do not use    □ need additional information
 State agencies                 1  2  3  4  5    □ do not use    □ need additional information
 Federal agencies               1  2  3  4  5    □ do not use    □ need additional information
 Trade publications             1  2  3  4  5    □ do not use    □ need additional information

 Handbook Format and Content
 The handbook format and content are in the development stages.  To make this document most useful to you, please check
 which types of information you would like included.

 □      Federal regulations and references
 □      Iowa Code regulations and references
 □      Sample permits (water and air)
 □      Case studies
 □      Practical tips
 □      Web page addresses
 □      Computer programs (emissions calculations, hazardous waste generation logs, etc.)
 □      Worksheets (for emissions calculations, hazardous waste generation logs, etc.)

Electronic Capabilities

Do you have computer capabilities?                                        □ Yes    □ No
Do you access the Internet?                                               □ Yes    □ No

Additional Resources

I would like to receive the IWRC's free "Closed Loop" newsletter
       covering current environmental regulations.                        □ Yes    □ No

I would like the IWRC to contact me to set up a
       free, non-regulatory confidential site visit to my facility.       □ Yes    □ No

Additional Comments

-------
Thank you again for completing this survey.  All information contained in this survey will remain completely confidential.
Please return this survey in the postage paid envelope provided by July 10, 1998.  You will be mailed a copy of the
handbook upon its completion.

-------
Sample 4:

Please give us your comments on the video Making Pollution Prevention Work for You: Opportunities for Wood
Coaters. Your feedback will help us produce better outreach products.

1.  Overall, did you find the video worthwhile?
       	Yes     	Somewhat      	No

2.  How helpful are the following portions of the video?

       Materials:                           	Very  	Somewhat  	Not
       Equipment:                          	Very  	Somewhat  	Not
       Process Efficiency:                   	Very  	Somewhat  	Not
       Operator Efficiency:                  	Very  	Somewhat  	Not

3.  How convincing are the different types of speakers in the video?

       Host:                               	Very  	Somewhat  	Not
       Assistance provider:                  	Very  	Somewhat  	Not
       Wood facility owners / operators:       	Very  	Mixed  	Somewhat  	Not
       Vendors of coatings and equipment:     	Very  	Mixed  	Somewhat  	Not

4.  The video offers several reasons for trying pollution prevention techniques and technologies. Which are most convincing
for you? (Check any.)

       	Reducing your emissions to the environment.
       	Improving worker health and morale.
       	Preserving or improving product quality.
       	Reducing inefficiencies and waste in your process.
       	Saving money.
       	Avoiding accidents and liability.
       	Avoiding regulations.
5.  Do you intend to test or adopt a pollution prevention practice or technology as a result of viewing this video?
       	Yes     	No
       If yes, what do you intend to do?	
6.  Do you intend to refer to the video in the future or pass it on to a colleague?
       	Yes      	No

7.  Which of the following ways of receiving information on environmental issues work best for you? (Check any.)
       ___ Manuals (long, comprehensive)
       ___ Fact sheets (short, single-topic)
       ___ Newsletter or journal articles
       ___ CD-ROMs
       ___ Internet Web pages
       ___ Workshops
       ___ Videos

-------
8.  What type of job do you have?
       ___ Operator at a wood coating or wood products manufacturing facility.
       ___ Manager (able to make purchasing decisions) at a wood facility.
       ___ Vendor of wood coatings or equipment.
       ___ Consultant.
       ___ Government assistance provider.
       ___ Other (please specify): _______________
9.  Would you like information on topics related to the environment that were not covered in the video or not covered in
enough detail? If yes, please list topics:	


10. If you plan to test or adopt a new practice or technology at your facility as a result of this video, and may be willing to
talk to us about your experience, please write your name and telephone number below. We ask because we want to know if
our non-regulatory assistance efforts result in real  improvements.  Thank you!


11. Other comments or suggestions:	
Thank you for your input!

Abby Swaine and Janet Bowen
The New England Environmental Assistance Team
EPA Region I

Public reporting burden for this collection of information is estimated to average ten minutes per response, including time
for reviewing instructions, gathering information, and completing and reviewing the information.  Send comments on the
Agency's need for this information, the accuracy of the provided burden estimate, and any suggestions for reducing the
burden, including the use of automated collection techniques, to the Director, OP Regulatory Information Division, United
States Environmental Protection Agency (Mail Code 2137), 401 M Street, SW, Washington, D.C. 20460; and to the Office
of Information & Regulatory Affairs, Office of Management & Budget, 725 17th Street, NW, Washington D.C. 20503,
Attention: Desk Officer for EPA. Include the EPA ICR 1860.01 and the OMB control number 2020-0015 in any
correspondence. Do not send your completed survey to this address.  Approval expires 9/30/2001.

-------
Sample 5:

Form Approved OMB Control No.	                                                            Pg 1 of 2
Approval expires	

                              CEIT PROGRAM OUTCOME MEASURES SURVEY

Name:         	        Date:  	
Company:      	
Technology:    	
If you have comments or additional information on any question, please list in the comment section.  Include the number
of the question you are commenting on.

1.      In which of the CEIT programs did your company participate?        (Check all that apply)
                      a. Trade Show                                    □
                      b. Innovative Technology Inventory               □
                      c. Innovative Technology Award                   □
                      d. Technovation Bulletin                         □

2.      Do you feel that potential users of new technologies (i.e., local/state
        officials, consultants, engineers) are more educated about your
        product as a result of your participation?                         □ Yes   □ No

3.      Has your involvement in any CEIT program resulted in speeding up
        or easing the permitting/approval for your technology?             □ Yes   □ No

4.      Did your involvement produce any new inquiries about your
        technology?                                                        □ Yes   □ No

        If yes, please estimate the number of additional inquiries produced.  ________

5.      Did your involvement result in increased sales?                    □ Yes   □ No

        If yes, please estimate the dollar amount.                         $________
        Also, estimate the percent increase in sales.                      ________%

6.      Of these sales, how many systems or technologies have already been
        installed?                                                         ________

If there are no new installations as a result of your participation, please skip to question 10 and answer part a. only.
Otherwise answer questions 7-10.

7.      Where are these new installations located?
        (Please give city and state)                                       ________

8.      Of these installations, how many are:
               New applications                  ________
               Replacing a failing system        ________
               Improving an existing system      ________

-------
For questions 9b and 10b, comparison should be made with comparable existing non-innovative technologies.

9.      Calculate the environmental improvements obtained from these
        installations (for example, by method a, b, and/or another method):

        a. List the quantity of product treated by your technology.
        (For example, 1,500,000 gallons of stormwater)

        Quantity       Unit          Media
        ________       ________      of ________ has been treated.

        b. Compare the performance of your technology with existing
        technologies. (For example, 99% TSS removal vs. 40% TSS removal)

          Yours                      Existing
        ________     vs.           ________
        ________     vs.           ________
        ________     vs.           ________

        c. Other improvements

10.     Calculate other economic benefits from your technology as follows
        (for example, by method a, b, and/or another method):

        a. Provide the number of jobs created within your company by your
        technology.

        b. Compare the cost of your technology to existing technologies.

          Yours                      Existing
        ________     vs.           ________
        ________     vs.           ________

        c. Other benefits
Comments
(Please write the number of the question on which you are commenting.  You may attach additional pages.)

-------
APPENDIX H

-------
Consolidated Screening Checklist For Automotive Repair Facilities
Page 1 of 6

        CCAR® GreenLink®
        Helping Automotive Professionals Save Time and Protect the Environment
 Disclaimer


      Consolidated Screening Checklist For Automotive Repair

                                      Facilities

 This project has been funded, at least in part, with Federal funds from the U.S. Environmental
 Protection Agency (EPA) Office of Enforcement and Compliance Assurance under contract to SAIC,
 Contract Nos. 68-C4-0072, WA Nos. EC-2-2 (OC) and EC-2-3 (OC). The mention of trade names,
 commercial products, or organizations does not imply endorsement by the U.S. Government [or The
 Coordinating Committee For Automotive Repair—CCAR®].

                                   1. INTRODUCTION

 The United States Environmental Protection Agency (EPA) is providing this guidebook and
 screening checklist as a public service to the automotive service and repair industry. The Office of
 Compliance, through various meetings with industry representatives, shop owners and technicians,
 found that the main issue for automotive service and repair shops not in compliance with applicable
 environmental regulations was the lack of information available to the shop owner. This guidebook
 highlights for the user the various federal environmental programs a shop owner may be subject to,
 depending upon the activities that occur at the shop.

 The information is correct and accurate as of the date published. The guidebook and associated
 screening checklist highlight important or key environmental program requirements as they apply to
 the various federal environmental programs. The checklist does not provide the complete
 environmental requirements necessary for a shop to be in total compliance. State and local laws,
 requirements, and rules may apply in lieu of, or in addition to, the federal rules. Remember, this
 guidebook and screening checklist is a beginning; it is NOT the final word on environmental
 compliance. By understanding the basics of each environmental program, the user can then seek
 appropriate assistance from various state and local agencies.

 Company names, products, and processes identified in this guidebook are only used as  sources of
 reference. These names are familiar and/or prototypical within the automotive service and auto body
 repair industry. They are intended to help the user better understand the checklist, its implications,
http://www.ccar-greenlink.org/checklist.html
   3/26/99

-------


 and their environmental responsibilities.

 The use of corporate and/or organizational names by the EPA should NOT be construed as an official
 endorsement of the named companies, services, products, or processes. It is up to you, the shop
 owner or facility manager, to make the best business decision in order to achieve and maintain
 compliance with environmental requirements based upon your business environment and shop
 operation.

 If you are not sure about your state and/or local environmental requirements, please contact your state
 and local environmental office. These offices can be found in the Blue Pages of your local telephone
 directory. If you don't know who to contact, you might consider the CCAR-GreenLink®
 Compliance Assistance Center.

 CCAR-GreenLink® is a partnership between the EPA and the Coordinating Committee for
 Automotive Repair (CCAR®). CCAR® is an automotive industry organization whose mission is to
 increase the professionalism of automotive technicians.

 There are five ways to reach CCAR-GreenLink®:

 Toll-free: 1-888-GRN-LINK (476-5465)
 Internet: http://www.ccar-greenlink.org
 E-Mail: ccarinfo@unicom.net
 Phone: 1-913-498-2227
 Fax: 1-913-498-1770
 List of Page Contents

    1.  Waste Management
          o Waste Management
          o Used Oil
          o Used Anti-Freeze
          o Used Solvents
          o Batteries
          o Rags
          o Tires
          o Absorbents
    2.  Wastewater Management
          o Floor Drains & Wastewater Management
          o Storm Water
    3.  Air Pollution Control
          o Parts Cleaners
          o Motor Vehicle Air Conditioning (CFC's)
          o Catalytic Converters (CC's)
          o Fuels
          o Paints and Thinners
    4.  Underground Storage Tanks/Emergency Spill Procedures
          o Underground Storage Tanks
          o Spill & Emergency Response

 Proceed to Consolidated Screening Checklist For Automotive Repair Facilities Guidebook

-------
1. WASTE MANAGEMENT

Waste Management
   Has the facility determined which wastes are hazardous
   wastes? (see guidebook)                                          Yes / No
   Does facility generate more than 100 kg (220 lbs.) of
   hazardous waste per month?                                       Yes / No
   If yes, does facility have a U.S. EPA hazardous waste
   generator I.D. number? (see guidebook)                           Yes / No

Used Oil
   Are used oil containers and piping leak free, segregated,
   and labeled "used oil"? (see guidebook)                          Yes / No
   Are hazardous waste fluids mixed with used oil?
   (see guidebook)                                                  Yes / No
   Is used oil collected and sent off site for recycling, or
   burned in an on site heater? (see guidebook)                     recycle / onsite heater /
                                                                    burned off-site / other
   Does the facility accept household used oil?                     Yes / No
   If yes, is it tested for hazardous waste (solvent/gasoline)
   contamination? (see guidebook)                                   Yes / No

Used Oil Filters
   Are used oil filters completely drained before
   disposal? (see guidebook)                                        Yes / No
   How are used oil filters disposed? (see guidebook)               scrap metal / service / trash /
                                                                    other

Used Anti-Freeze
   Is used anti-freeze properly contained, segregated,
   and labeled? (see guidebook)                                     Yes / No
   Does the facility generate any antifreeze that is a
   hazardous waste (>5 ppm lead)?                                   Yes / No / do not know
   If yes, is it recycled onsite in a closed loop system?           Yes / No
   If no, is it counted toward facility generator status?
   (see guidebook)                                                  Yes / No
   If used anti-freeze is not recycled on site, how is it
   disposed? (see guidebook)                                        recycled off site / mixed
                                                                    w/other fluids / landfill /
                                                                    other

Used Solvents
   Are used solvents stored in proper containers and
   properly labeled? (see guidebook)                                Yes / No / N/A
   How are used solvents disposed? (see guidebook)                  service / mixed w/other fluids
                                                                    / other
   Does facility have hazardous waste manifests or
   shipping papers on file? (see guidebook)                         Yes / No / N/A

Batteries
   Does facility return used batteries to new battery
   supplier?                                                        Yes / No / N/A
   If not, how are used automotive batteries disposed?
   (see guidebook)                                                  recycle / hazardous waste
                                                                    landfill / other
   Are used batteries contained and covered prior to
   disposal? (see guidebook)                                        Yes / No

Rags
   How are used rags and towels disposed? (see guidebook)           laundry service / burned for
                                                                    heat / trash
   How are used rags stored while on-site? (see guidebook)          separate container / shop
                                                                    trash can / floor

Tires
   How are used tires disposed? (see guidebook)                     resale / retreading / landfill /
                                                                    customer / N/A / other
Absorbents
   Does facility use sawdust or other absorbents for spills or leaks? (see guidebook)  [Yes / No]
   Does facility determine whether used absorbents are considered hazardous before disposal? (see guidebook)  [Yes / No]
   How are absorbents used for oil spills disposed? (see guidebook)  [N/A / burned for energy / disposed as hazardous waste / characterized as nonhazardous and landfilled]

2. WASTEWATER MANAGEMENT

Floor Drains & Wastewater Management
   How does facility clean shop floor and surrounding area? (see guidebook)  [uses dry cleanup / uses water]
   Are fluids (oil, antifreeze, solvent) allowed to enter floor drains for disposal? (see guidebook)  [Yes / No / no floor drains on site]
   How are fluids disposed? (see guidebook)  [municipal sanitary sewer / storm sewer / street / other]
   If floor drains discharge to municipal sanitary sewer, to storm sewer system, or the street, has facility notified POTW about potential contamination in wash water? (see guidebook)  [Yes / No]
   If drains discharge directly to surface waters or to an underground injection well, does facility have an NPDES (surface) or UIC (underground) permit? (see guidebook)  [Yes / No]

Storm Water
   Does facility store parts, fluids and/or other materials outside?  [Yes / No]
   Are materials protected from rain/snow in sealed containers or under tarp or roof? (see guidebook)  [Yes / No / N/A]
3. AIR POLLUTION CONTROL

Parts Cleaners
   If facility uses parts-cleaning sinks with halogenated solvents, has facility submitted a notification report to EPA?  [Yes / No / N/A]
   Are sinks kept closed and sealed except when actually cleaning parts?  [Yes / No / N/A]
   Does facility follow required work and operational practices? (see guidebook)  [Yes / No / N/A]

Motor Vehicle Air Conditioning (CFC's)
   Are MVAC technicians trained and certified by an accredited program?  [Yes / No / N/A]
   If yes, are certificates on file? (see guidebook)  [Yes / No]
   Is CFC recovery and/or recycling equipment EPA approved? (see guidebook)  [Yes / No / N/A]
   Is equipment recovery/recycling or recovery only? (Circle one)  [recovery/recycling / recovery only]
   If recovery only, is refrigerant reclaimed by an EPA-approved reclaimer? (see guidebook)  [Yes / No]

Catalytic Converters (CC's)
   Does facility replace CC's that are the correct type based on vehicle requirements? (see guidebook)  [Yes / No / N/A]
   Does facility replace CC's on vehicles covered under original manufacturer's warranty?  [Yes / No]
   If yes, was original CC missing or due to state/local inspection program requirement? (see guidebook)  [Yes / No]
   Does facility properly mark and keep replaced CC's on-site for at least 15 days? (see guidebook)  [Yes / No]
   Does facility completely fill out customer paperwork and maintain on-site for at least 6 months? (see guidebook)  [Yes / No]
Fuels
   Is Stage I vapor recovery equipment operated properly during unloading of gasoline? (see guidebook)  [Yes / No / N/A]
   Is Stage II vapor recovery equipment installed and working at pumps? (see guidebook)  [Yes / No / N/A]
   Do fuel delivery records indicate compliance with appropriate fuel requirements? (see guidebook)  [Yes / No / records not available]
   Are pumps clearly labeled with the product they contain? (see guidebook)  [Yes / No]
   Do gasoline pump nozzles comply with 10 gallon per minute flow rate? (see guidebook)  [Yes / No / don't know]
   Is dyed, high-sulfur diesel/kerosene available for sale to motor vehicles? (see guidebook)  [Yes / No]

Paints and Thinners
   Are paints and thinners properly contained and marked when not in use? (see guidebook)  [Yes / No / N/A]
   Does facility use low VOC paints? (see guidebook)  [Yes / No]
   Does facility determine whether paints are considered hazardous before disposal? (see guidebook)  [Yes / No]
   How are used paints, thinners and solvents disposed? (see guidebook)  [reuse / recycle / mix w/other fluids / landfill]
   Does facility mix paint amounts according to need? (see guidebook)  [Yes / No]
   Does facility use new, high transfer efficiency spray applications? (see guidebook)  [Yes / No]
   If hazardous paints are used, are spray paint booth air filters disposed of properly as hazardous waste?  [Yes / No]
   If filters are not hazardous, how are they disposed? (see guidebook)  [recycled / landfill]
4. UST/SPCC/EMERGENCY SPILL PROCEDURES

Underground Storage Tanks
   Has the State UST program been notified of any USTs located on-site? (see guidebook)  [Yes / No / N/A]
   Does facility conduct leak detection for tank and piping of all on-site UST systems? (see guidebook)  [Yes / No / N/A]
   Do USTs at the facility meet requirements for spill, overfill and corrosion protection? (see guidebook)  [Yes / No / N/A]
   Are records and documentation readily available (as applicable) for installation, leak detection, corrosion protection, spill/overfill protection, corrective action, financial responsibility, and closure? (see guidebook)  [Yes / No / N/A]

Spill & Emergency Response
   Does facility have a gasoline, fuel oil, or lubricating oil storage capacity total greater than 1,320 gallons (or greater than 660 gallons in any one tank) in above ground tanks, or total underground tank storage capacity greater than 42,000 gallons?  [Yes / No]
   If yes, could spilled gasoline, fuel oil, or lubricating oil conceivably reach navigable waters?  [Yes / No]
   If yes, does the facility have an SPCC plan signed by a Professional Engineer? (see guidebook)  [Yes / No]
   Are phone numbers of the national, state and local emergency contacts available on site for immediate reporting of oil or chemical spills? (see guidebook)  [Yes / No]
 Source:  US EPA Office of Enforcement and Compliance Assurance, EPA 305-B-97-005, October
 1997.

                                         Disclaimer:

 The Coordinating Committee For Automotive Repair (CCAR®) is providing the information
 contained in this Consolidated Screening Checklist For Automotive Repair Facilities Guidebook as a
 public service. CCAR® believes that the information is correct and accurate, as of the date posted,
 and has no reason to believe otherwise.

 However, CCAR® does not guarantee the correctness or accuracy of the information, and will not be
 responsible for incorrect or inaccurate information, or any damage or loss suffered by any person as a
 result of reliance on such information.

 The information presented relates to environmental programs of the Federal government. State and
 local laws, regulations, and rulings, may apply in lieu of, or in addition to, the Federal rules, and
 should be reviewed before taking any action in reliance on this information.

 Back to the top of the page


           [Home | Consolidated Screening Checklist Guidebook | Comment | E-Mail]

 CCAR-Green Link®
 Web Page produced by CCAR®.
 Last Update - 14-Jan-98
 Hosted By Unicom
-------
APPENDIX I

-------
Fact Sheet VIII:     Examples of Graphs for Presenting
                           Customer Feedback Results

Once data have been collected, think hard about how you want to present them.  Often, we focus lots
of attention on collecting feedback and performing complex analysis, and forget that we have to market
the findings if we are going to help bring about change. The form you select for presentation can make
or break all the previous work.  Results need to be communicated clearly to the appropriate people
before an organization can begin learning from its customers.
                           Overall Satisfaction

[Figure: bar graph showing the percentage of respondents in each rating group:
1.2% at 1-3 (Very Dissatisfied), 15.5% at 4-5 (Dissatisfied), 46.1% at 6-8
(Acceptable but Room for Improvement), and 37.2% at 9-10 (High Quality).]
There are many variables to consider when presenting data, such as the nature and level of the audience,
the reasons the feedback was collected and how it will be used, and the nature of the data itself. Some
of the more common forms are listed below, with a brief explanation of the unique use of each
presentation.

A very basic bar graph can be used to convey the percentage of the population that responds within
a given range. For example, the graph above indicates that 1.2% of the respondents rated their Overall
Satisfaction as one, two or three on a scale of one to ten, 15.5% rated Overall Satisfaction as four or
five, 46.1% as six, seven or eight, and 37.2% as nine or ten.  Note that these groupings of 1-3, 4-5, 6-8
and 9-10 are somewhat arbitrary, and can be changed to suit the needs of your project.  Additionally,
the labels 'Very Dissatisfied', 'Dissatisfied', 'Acceptable but Room for Improvement' and 'High
Quality' are also subject to change according to individual needs.
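The binning step behind such a graph is easy to automate. Below is a minimal sketch; the ratings are hypothetical illustrations, and the bin edges and labels are the ones used above, which can be changed to suit a project:

```python
# Group 1-10 satisfaction ratings into labeled bins and report the
# percentage of respondents falling in each bin.
def bin_percentages(ratings, bins):
    """bins is a list of (label, low, high) tuples covering the 1-10 scale."""
    total = len(ratings)
    result = {}
    for label, low, high in bins:
        count = sum(1 for r in ratings if low <= r <= high)
        result[label] = round(100.0 * count / total, 1)
    return result

# The (somewhat arbitrary) groupings used in the graph above.
BINS = [("Very Dissatisfied", 1, 3),
        ("Dissatisfied", 4, 5),
        ("Acceptable but Room for Improvement", 6, 8),
        ("High Quality", 9, 10)]

# Hypothetical ratings; real data would come from the survey instrument.
ratings = [2, 4, 6, 7, 7, 8, 9, 9, 10, 10]
print(bin_percentages(ratings, BINS))
```

The same tally, divided by the total but left as counts, also feeds a pie chart of the kind shown later in this fact sheet.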
                                       VIII-22

-------
                        Overall Satisfaction by Region

[Figure: bar graph of mean Overall Satisfaction ratings for the Northeast,
Southeast, South, Southwest, and West regions, with the number of respondents
(base) noted to the right of each bar.]
It is often useful to organize responses by customer segments that are meaningful to the survey
audience.  In the case above, the mean, or average, Overall  Satisfaction ratings are organized by
geographic regions.  Note also that the base, or number of respondents in that region, is noted to the
right of each bar.  This can be important to identify the relative validity of the information.
                             Overall Satisfaction

[Figure: bar graph of mean Overall Satisfaction ratings by customer segment:
grants totaling less than $50,000 a year (base 105), more than $500,000 a year
(146), and between $50,000 and $500,000 a year (125); interacting with XYZ for
less than one year (39), between one and three years (127), and more than three
years (244).]
Responses can also be organized by other types of segments. In the case above, respondents answered
questions about their length of use and the amount of grant money they had received.  Note that the
number of respondents in each category is to the right of each bar.
                                             VIII-23

-------
[Figure: bar graph of mean satisfaction ratings for Contract staff, Technical
staff, Administrative staff, and Billing/Accounting staff, on a scale of one
to ten.]
A bar graph can also be used to identify the mean, or average, response to various services received.
This is useful to compare the levels of satisfaction between services offered.
                       Quality of Health Care Received

[Figure: grouped bar graph of ratings (grouped 4-6, 7-8, and 9-10 best) for
Adults Surveyed (n=72) and Children Surveyed (n=79).]
A slightly more complex graph can allow the comparison of responses between two segments of a
population.  In the example above, 62% of the children surveyed considered the Quality of the Health
Care they received to be nine or ten, on a scale of one to ten.  In general, children appear to have rated
the Quality of Health Care higher, while adults provided lower ratings.
                                            VIII-24

-------
                  Getting the Care You Need When You Need It

[Figure: segmented bar chart for Adults Surveyed and Children Surveyed, with
each bar divided into "Sometimes or Never," "Usually," and "Always"
percentages.]
Another way to compare two populations is to use segmented bar charts, as shown above. The graph
above indicates that children surveyed were more likely to feel they received the care they needed,
when they needed it.
              Priority Issues for Building Customer Loyalty: Action Matrix

[Figure: quadrant chart plotting importance against satisfaction. Upper left
(high importance, low satisfaction): Contracts Management, Grants Management.
Upper right (high importance, high satisfaction): Information Clearinghouse,
Interactions with Agency Staff. Lower right (low importance, high
satisfaction): Advocacy, Distribution of Reports.]
If driver analysis is being performed, a useful way to present the results is in a quadrant chart, as in
the example above. By comparing the levels of satisfaction with the levels of importance, we can
prioritize results. In the example above, the services listed in the upper right quadrant are those that
were very important to the customer and were rated as providing high levels of satisfaction.  These
services, Information Clearinghouse and Interactions with Agency Staff, are identified as areas where
the organization is meeting or exceeding the customer's needs. In contrast, the upper left quadrant
identifies services that are very important to the customer, but are rated as providing low satisfaction.
It is these services, Contracts Management and Grants Management, that require immediate attention.
The lower right quadrant identifies services that provide high levels of satisfaction, but are not
important to the customer.  In the example above, no services were found to be in the lower left
                                           VIII-25

-------
quadrant.  This quadrant identifies services that are not important to the customer and provide low
levels of satisfaction.
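The quadrant assignment itself is a mechanical comparison of each service's mean importance and satisfaction ratings against chosen cutoffs. A minimal sketch follows; the ratings and the 5.5 midpoint cutoff are hypothetical, not data from the example above:

```python
# Assign a service to an action-matrix quadrant by comparing its mean
# importance and satisfaction ratings to a cutoff (here, the midpoint
# of a 1-10 scale; any cutoff suited to the project could be used).
def quadrant(importance, satisfaction, cutoff=5.5):
    hi_imp = importance >= cutoff
    hi_sat = satisfaction >= cutoff
    if hi_imp and not hi_sat:
        return "upper left (important, low satisfaction: act now)"
    if hi_imp and hi_sat:
        return "upper right (important, high satisfaction: maintain)"
    if not hi_imp and hi_sat:
        return "lower right (less important, high satisfaction)"
    return "lower left (less important, low satisfaction)"

# Hypothetical mean ratings on a 1-10 scale.
services = {"Contracts Management": (8.2, 3.1),
            "Information Clearinghouse": (7.9, 8.4),
            "Distribution of Reports": (3.5, 7.0)}
for name, (imp, sat) in services.items():
    print(name, "->", quadrant(imp, sat))
```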
                            Drivers of Satisfaction

[Figure: quadrant chart with importance and satisfaction axes; the quadrants
carry the subjective labels "Promote," "Maintain," "Improve," and "Commit
Minimal Resources."]
Another example of a chart that relates the results of driver analysis is shown above.  In this case,
subjective labels have been applied to the areas of the chart, according to the needs of the project.
              Compared to other government or government-like grant or funding
                     processes would you say that your experience was...

[Figure: pie chart of responses: The Best? 10.0%; Better? 31.7%; About the
Same? 31.7%; The Worst? 1.6%; Don't Know/No Answer 15.0%.]
                                          VIII-26

-------
A pie chart is another method useful for relating the proportions of a population that responded in a
particular way to a question. In the example on the previous page, the majority of the respondents
clearly felt that the funding process was About the Same as with others.
                            Examples of Customer Remarks
                                Concerning Billing Needs

                         • "Billing is sketchy and difficult to understand."
                         • "We are running approximately 6 months behind
                           on billing."
                         • "I have had problems with billing, and would like
                           XYZ to reassess the way they are billing;
                           timeliness and accuracy."
                         • "Poorly itemized billing."
                         • "Billing report really hard to understand, very
                           inconsistent."
                         • "More prompt billing, so that I can delete them off the
                           records."
                         • "Billing is a twilight zone."
Open End responses are usually organized according to subject matter.  In the example above,
comments that refer to problems with a Billing Service are grouped together.  This is a very effective
way to communicate comments from customers to the audience. Following is an example of Open End
responses organized and grouped together. The actual statement by the customer is not listed,
but the number of customers who felt a certain way is clearly communicated.
                What Products or Services Do XYZ
                Customers Want to See Offered?
                        (# of responses)

                 Information Distribution                       34
                 - Internet                     9
                 - Mailings                    15
                 - Other                       10

                 Improved/Clarified Policies                    20
                 (supplies, photographic printing, microfilm, signs, library, etc.)

                 Improved Customer Service                       7

                 Informing Customers of Existing Services        8
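One way to produce such a tally is to tag each comment with one or more subject codes and count the codes. The sketch below uses simple keyword matching as a first pass; the keywords, categories, and comments are hypothetical, and in practice a reader usually assigns the codes by hand:

```python
from collections import Counter

# Tally open-end comments by subject code using keyword matching.
# Hypothetical keyword-to-code map; a human coder would refine this.
CODES = {"billing": "Billing",
         "invoice": "Billing",
         "internet": "Information Distribution",
         "mailing": "Information Distribution"}

def code_comments(comments):
    tally = Counter()
    for comment in comments:
        text = comment.lower()
        # A comment may match several codes; each code counts once per comment.
        hits = {code for keyword, code in CODES.items() if keyword in text}
        tally.update(hits if hits else {"Uncoded"})
    return dict(tally)

comments = ["Billing is sketchy and difficult to understand.",
            "Please put more reports on the internet.",
            "Great staff!"]
print(code_comments(comments))
```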
                                          VIII-27

-------
                            Customer Satisfaction

[Figure: trend (run) chart of Customer Satisfaction by quarter (1st Qtr
through 4th Qtr) for Courtesy, Timeliness, Accuracy, and Accessibility.]
One additional type of chart that can be useful for presentations is the trend or run chart, which is used
to identify meaningful changes from year to year, or between feedback activities. Such charts are used
to monitor progress and portray improvement. A time series chart not only can show trends, it can
portray relationships. If the net change from one point to another is expressed as a percentage, two or
more items that would otherwise appear on different scales (apples and oranges) can be compared.
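Expressing each series as percent change from its first observation is one way to put differently scaled measures on a common axis. A minimal sketch, with hypothetical quarterly figures on two different scales:

```python
# Convert a series of raw scores to percent change from the first
# observation, so series on different scales can share one axis.
def percent_change(series):
    base = series[0]
    return [round(100.0 * (x - base) / base, 1) for x in series]

# Hypothetical quarterly mean ratings on two different scales.
courtesy = [6.0, 6.3, 6.6, 7.2]   # 1-10 scale
timeliness = [72, 70, 78, 81]     # 0-100 scale
print(percent_change(courtesy))   # -> [0.0, 5.0, 10.0, 20.0]
print(percent_change(timeliness)) # -> [0.0, -2.8, 8.3, 12.5]
```

Plotted together, both series now start at zero and show relative movement rather than raw scores.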
                                    VIII-28

-------