                                December 1990
EPA
Communicating
Environmental Risks
A Guide To
Practical Evaluations

Risk Communication Series
                       Printed on Recycled Paper

                         December 1990
  Communicating Environmental Risks
      A Guide to Practical Evaluations
                            Prepared for

                          Dr. Ann Fisher
                 Office of Policy, Planning and Evaluation
                  U.S. Environmental Protection Agency
                         401 M Street S.W.
                       Washington, DC 20460
                            Prepared by

                          Michael J. Regan
                       William H. Desvousges
                     Center for Economics Research
                      Research Triangle Institute
  The information in this document has been funded wholly or in part by the United States
Environmental Protection Agency under Cooperative Agreement No. CR 814676. It has been subjected
to the Agency's peer and administrative review and approved for publication as an EPA document.
Mention of trade names or commercial products does not constitute endorsement or recommendation for
use.

CONTENTS

Chapter

   1   Introduction
          Communicating About Environmental Risks
          Why Evaluate Risk Communication Programs?
          How To Use This Guidebook

   2   Evaluating Effectiveness: Issues and Considerations
          Determining an Appropriate Evaluation
          Coping with Problems in Evaluation
          Determining the Scope of Your Evaluation
          Summary

   3   The Planning Phase: Integrating Communication and Evaluation
          Planning the Risk Communication Effort
          Preparing for Evaluation
          Summary

   4   The Design Phase: Developing and Pretesting Materials
          Designing the Risk Communication Effort
          Formative Evaluation: Pretesting Materials
          Excuses for Avoiding Pretesting
          Pretesting Methods
          Determining What and How Much To Test
          Planning and Conducting Pretests
          Summary

   5   The Implementation Phase: Executing the Strategy and Tracking Details
          Process Evaluation
          Establishing Process Evaluation Measures
          Summary

   6   Program Assessment: Evaluating Effectiveness
          Outcome Evaluation
          Measuring Effectiveness
          Choosing a Design
          Choosing a Sample
          Collecting Outcome Data
          Analyzing Data

   7   Program Feedback: Using Evaluation Results
          Apply What You Have Learned
          Share What You Learned
          Write an Evaluation Report

       Bibliography
       Glossary
       Appendix A: Questionnaires
       Appendix B: Focus Group Materials
       Appendix C: Pretesting Materials

PREFACE
   Information programs play an important role in EPA's strategy to manage environmental
risks. Whether the hazard is naturally occurring (e.g., radon) or manufactured (e.g., asbestos
insulation), individuals often can take steps that reduce their own exposure.  Experience
demonstrates, however, that expanding public awareness, increasing knowledge, changing
attitudes, and motivating behavioral changes are difficult objectives to reach.

   In some cases, communication activities have achieved significant reductions in health risks.
Communicators have learned a lot about how to develop  and disseminate more effective
information materials, but serious health risks remain. Close attention to each phase of the risk
communication program (planning, design, implementation, and evaluation) will be critical to
determining future successes.

   This guidebook was developed to help EPA program staff evaluate the effectiveness of their
risk communication activities. Several important points are emphasized. First, risk communi-
cation budgets are never ideal, but some type of evaluation can be incorporated into almost any
size budget. Second, no one evaluation strategy is appropriate for every situation; you must tailor
an evaluation to meet your particular needs. Third, more attention should be paid to outcome
evaluation—determining the effects the activities had on the target audience(s).

   This project was sponsored by EPA's Risk Communication Program, Office of Policy,
Planning, and Evaluation (OPPE) under Cooperative Agreement Number CR814676-02.  It was
written by Michael J. Regan and William H. Desvousges at the Research Triangle Institute under
the supervision of Dr. Ann Fisher, OPPE. Some sections have been excerpted from Making Health
Communications Work, written by Elaine Arkin for the U.S. Department of Health and Human
Services.

1
INTRODUCTION
Communicating About
Environmental  Risks

   Each year, citizens face growing amounts
of information about environmental hazards
such as radon, lead, incidental tobacco smoke,
and others.  It is increasingly important for
citizens to become informed about such poten-
tial health risks.  Therefore, as part of its
program to manage environmental risks, the
U.S. Environmental Protection Agency (EPA)
develops and distributes information to differ-
ent groups about the nature of a particular
hazard and what can be and is being done to
manage the risk and its consequences.

   Risk communication activities are impor-
tant for several major reasons:

 • To explain regulatory actions being taken
   and put residual risk in context—for citi-
   zens, private interest groups, the regulated
   community,  and  legislators and  govern-
   ment officials;
 • To help citizens  provide informed input
   into risk management decisions at the local
   level (e.g., siting waste disposal facilities);
 • To provide information when EPA does not
   have regulatory authority for dealing with some
   risks, or when the risks are experienced by
   people within their homes, which limits the
   types of regulatory intervention that would
   be effective.

   The term "risk communication" means dif-
ferent things to different people. For the pur-
poses of this guidebook, risk communication is
the purposeful  exchange of information  be-
                              tween interested parties about environmental
                              risks. Careful attention to risk communication
                              practices and process will help you to maxi-
                              mize the potential for success.
                                   Figure 1.  Dimensions of Risk Communication
                                   [Figure shows two dimensions: Practices (key steps in the
                                   communication program) and Process (rules of the game that
                                   determine the purpose, scope, and order of key activities).]

                                 Risk communication practices are steps
                              taken by EPA's program staff to design and
                              disseminate messages about risk to  a target
                              audience. These steps include identifying the
                              target audience(s), developing and pretesting
                              different risk messages, producing informa-
                              tion materials (e.g., brochures, handbooks and
                              posters, public service announcements, and
                              videotapes), identifying appropriate commu-
                              nication channels (e.g., media, civic groups,
                              schools), and distributing the materials.

                                 Risk communication is complex and is
                              subject to many limitations. Here are some
                              examples:

 • The emotion-laden attitudes surrounding
   environmental risks, coupled with the de-
   tailed technical knowledge needed to un-
   derstand these phenomena, often act as bar-
   riers to  the comprehension of important
   information.

 • Print materials and videotapes require that
   the user be motivated to seek out risk infor-
   mation about a particular topic.

 • Conflicting perceptions of risk among indi-
   viduals make it difficult to develop effec-
   tive risk messages.

 • The news media have difficulty reporting
   scientific risk estimates.

 • Certain goals, such as changing behavior,
   are  more difficult to achieve  than simply
   reaching the audience.

   Risk communication is more than simply
designing and delivering risk messages to the
public  (or other target audience); it is a two-
way process that provides government, indus-
try, and individual decision makers with the
information they need to make decisions aimed
at controlling or managing risks. For example,
a community workshop might be held in which
public officials and residents exchange infor-
mation about the proposed cleanup of a Super-
fund site, working toward a remedy that is both
technically sound and socially acceptable.

   The process of exchanging  information
can be undermined by many issues,
such as scientific uncertainty, interest group
pressure, disrespect, or just plain  stubborn-
ness.  These and other problems pose potent
threats to effective risk communication but
often can be anticipated  and mitigated.

   More information on risk communication
issues can be found in the selected readings at
the end of the chapter.


Why Evaluate Risk
Communication Programs?

   Evaluation  is a purposeful effort to deter-
mine effectiveness.   It is essential  to risk
communication because it provides feedback
about whether risk messages are received, un-
derstood, and internalized by those for whom
they are intended.  Without evaluation, it is
impossible for communicators to choose those
messages and  channels that use limited re-
sources most effectively. Instead, communi-
cators are left  to their own subjective inter-
pretations about what works and what doesn't.
A lack of evaluation, therefore, affects both the
quality of the individual risk communication
effort and the primary goal: improving public
health.
         Evaluating Risk
         Communication
     Evaluation can be used for any of
  the following purposes:

   • To conduct a formative evaluation
     to help program planners, manag-
     ers, and/or staff improve develop-
     ing or ongoing communication ac-
     tivities;

   • To conduct a process evaluation to
     identify how well the administra-
     tive and organizational aspects of
     the activities are functioning;

   • To conduct an outcome evaluation
     to  help the sponsor or others in
     authority decide the extent to which
     risk communication activities are
     successful and what should be their
     ultimate fate.

  All three types of evaluation mentioned
  here will greatly enhance the ability to
  ensure that resources allocated for risk
  communication are, in fact, used for
  activities that continue to meet the tar-
  get audience's needs.

    The ideal way to apply evaluation findings
is to  improve ongoing risk communication
activities. In addition, evaluation results are
valuable  for other uses:

  • To justify your effort;

  • To provide evidence of need for additional
   funds or other resources;

  • To increase institutional understanding of
   and support for risk communication activi-
   ties;

  • To encourage ongoing cooperative ven-
   tures with other organizations;

  • To avoid making the same mistakes in fu-
   ture risk communication efforts.

       The EPA Office of Air and Radia-
   tion (OAR), in cooperation with the U.S.
   Consumer Product Safety Commission,
   developed a booklet entitled The Inside
   Story: A Guide to Indoor Air Quality. An
   outcome evaluation was conducted by
   EPA of the booklet's effectiveness in
   providing information on indoor air
   pollution to the general public.  This
   evaluation demonstrates an important
   lesson: Useful information can be gath-
   ered quickly and at low cost.

How To Use  This Guidebook
    The guidebook explains how to plan a
practical, cost-effective evaluation strategy that
can be integrated with your risk communica-
tion effort.  It identifies risk communication
objectives, indicates which evaluation tech-
niques are most suitable for different goals, and
explains how to go about the evaluation itself.
While it has been developed specifically for
EPA, the guidebook's principles are relevant
for evaluating risk communication activities in
other government agencies.
        Figure 2.  Guidebook Outline
        [Figure shows the integrated framework "Evaluation and Risk Communication":
        Introduction; Evaluating Effectiveness: Issues and Considerations; The Planning
        Phase: Integrating Communication and Evaluation; The Design Phase: Developing
        and Pretesting Materials; The Implementation Phase: Executing the Strategy and
        Tracking Details; Program Assessment: Evaluating Effectiveness; Program
        Feedback: Using Evaluation Results.]

    The guidebook has seven chapters. Chap-
ters 1 and 2 introduce the most important issues
and considerations in evaluating risk commu-
nication efforts. Chapters 3-7 present a frame-
work that integrates evaluation with different
phases  of the  risk communication  effort:
planning (Chapter 3), design (Chapter 4),
implementation (Chapter 5), program  as-
sessment (Chapter 6), and program feedback
(Chapter 7). This five-phase framework has
been adopted here to facilitate thinking about
where and when various evaluation techniques
and activities are most effective.

    Throughout the guidebook, checklists and
questions are provided to make planning easier.
Additional readings are provided at the end of
each section to direct you to more complete
information about specific subjects. The Ap-
pendices include a glossary and other sources
of information.
             Selected Readings
    Covello, Vincent T., David B. McCallum,
    and Maria T. Pavlova, eds., Effective Risk
    Communication, Plenum Press, (1989).

    Krimsky, Sheldon, and Alonzo Plough, En-
    vironmental Hazards, Dover, MA: Auburn
    House Publishing Co., (1988).

    National Research Council, Improving Risk
    Communication, Washington, DC: National
    Academy Press, (1989).

    Interagency Task Force on Environmental
    Cancer  and  Heart and  Lung  Disease.
    "Evaluation and Effective Risk Communi-
    cation Workshop Proceedings."  Washing-
    ton, DC, (June 1988).

    U.S. Department of Health and Human Ser-
    vices, Making Health Communication Pro-
    grams Work, Bethesda, MD:  National Can-
    cer Institute, NIH Publication No. 89-1493,
    (1989).

    U.S. Environmental Protection Agency. The
    Inside Story:  A Guide to Indoor  Air Qual-
    ity—How Well Is It Working?, Washington,
    DC: Office of Policy, Planning, and Evalu-
    ation, EPA 230-01-073, (1990).

2
EVALUATING EFFECTIVENESS:  ISSUES
AND CONSIDERATIONS
Thinking About Evaluation
   Evaluations usually are initiated by some-
one in management who wants to know what
effect the communication effort is having on
the target audience. The evaluator's job is to
think through exactly what type of evaluation
is appropriate.

   Timing is an important aspect of evalua-
tion—good evaluations cannot simply be tacked
onto the end of a risk communication effort.
Planning for evaluation early can be  a cost-
effective strategy and can increase the effec-
tiveness of risk communication activities. Thus,
communicators can gather better information
and have it available when it is most useful—
before full implementation.

   This chapter will help you think about the
purpose of the evaluation, what resources are
available, and what constraints will influence
your activities.


Determining an Appropriate
Evaluation
   You should consider several questions be-
fore deciding what kind of evaluation will be
best for your program:

   • How long will the program last? Will the
     implementation phase be long enough to
     permit measurement of significant ef-
     fects and periodic adjustment?

   • Do you want to repeat or continue your
     program?
                                • Can you evaluate your objectives in the
                                 foreseeable future?

                                • Which components of the program are
                                 most important to you?

                                 • Are there management or public demands
                                 for program accountability?

                                • Will an evaluation report help communi-
                                 cation efforts compete with other agency
                                 priorities for future funding?

                                The table below describes sev-
                            eral types of evaluation and the types of infor-
                            mation that each would try to collect. Chapters
                            4-6 describe how to use each of these types:
                            formative  (Chapter 4), process (Chapter 5),
                            and outcome (Chapter 6).
                            Coping with Problems in
                            Evaluation
                               Many considerations will influence what
                            type of evaluation you can do and how well you
                            can do it. Some limitations can be overcome,
                            while others cannot.

                               Working With Stakeholders—Keep  in
                            mind that the interests of various stakeholders
                            might be affected by an evaluation's findings.
                            Stakeholders might include agency planners,
                            managers, and program staff, oversight man-
                            agement (e.g., Congress), or the target audience.
                            For example, an outcome evaluation  might
                            show that a communication activity did not
                             increase the target audience's knowledge. This
                             finding might determine the future allocation
                             of resources to risk communication efforts.

                         Types of Evaluation
The following types of evaluation have been adapted to serve the goals of evaluating
risk communication programs.

Formative—Evaluation during the formative stages of a risk communication effort
assesses the strengths and weaknesses of materials or campaign strategies before
implementation.  It permits necessary revisions before the full effort goes forward.
Among other things, materials can be tested for the following:
   • clarity
   • tone
   • comprehensiveness

Process—Process evaluation examines the procedures and tasks involved in imple-
menting an activity.  This type of evaluation also can collect information about the
administrative and organizational aspects of the overall effort, such as:
   • number of staff working  on the project
   • schedule of activities
   • number of materials distributed
   • attendance at meetings
   • number of calls to a hotline
   • number of public inquiries received as a result of a public service announcement
   • articles printed

Outcome—Outcome evaluation is used to collect and present information needed for
judgments about the effort and its effectiveness in achieving its objectives. Not all risk
communication efforts are suitable for outcome evaluation. Herman, et al. note that
outcome evaluation  is most suitable when "the program has clear and measurable
goals and consistent replicable materials,  organization, and activities."  Outcome
evaluation can obtain descriptive data on a project and document the immediate effects
of the  project on the target audience (e.g., percent of the target audience showing
increased awareness of the subject). It is possible to get long-term results, but most
agencies cannot afford long-term evaluation.

An outcome evaluation can collect the following information about the program:
    • changes in knowledge and attitudes
    • expressed intentions of the target audience
    • changes in behavior
   Adapted from U.S. Department of Health and Human Services, 1989.


   Herman, et al. observe that "a good and
useful evaluation depends upon sharing infor-
mation and upon cultivating a constituency of
potential users who believe that the evaluation
addresses  prime issues of concern  and has
produced valid, reliable, and credible results—
in other words, a constituency who will trust
the findings."  The evaluator should identify
potential users of the findings and involve
them in the planning and/or execution of the
evaluation.   Emphasize that  an effective
evaluation can improve  the performance of
ongoing or future communication efforts.

   Facing Resource Constraints—Limited
resources may force you to choose  between
formative, process, or outcome evaluation. No
technique, independently,  will  provide  you
with a complete picture of what happened.
Some experts  will  tell you that if you must
choose,  you should choose outcome evalua-
tion—the  only way to certify that you ac-
complished your objectives. Others will advise
that process measures can  improve program
management by helping you understand why
you did or did not accomplish your objectives.

   Every program planner faces constraints to
undertaking evaluation tasks, just as there are
constraints to  designing other  aspects of a
communication effort. These constraints may
include the following:

   • limited funds

   • limited staff time and  capabilities

   • length of time allotted to the effort

   • limited access to computer facilities

   • agency restrictions on hiring consultants
     or contractors
    • policies limiting the ability to gather in-
     formation from the public

    • management perceptions regarding the
     value of evaluation

    • ambiguous goals and multiple objectives
     of the risk communication effort

    • difficulties in  designing  appropriate
     measures for communication programs

    • difficulties in separating the effects of
     your activities from other influences on
     the target audience in "real world" situ-
     ations

    These  constraints make it necessary to
weigh existing limitations against the require-
ments for a credible  evaluation.  It is not true
that "something is better than nothing." If an
evaluation design, data collection, or analysis
must be compromised to fit limitations, you
must make two important decisions:

   1. Do the required compromises make the
     evaluation results invalid?

   2. Is an evaluation strategy essential com-
     pared with other  compelling uses for
     existing resources? For example,  if the
     risk  communication activity costs
     $10,000 and it would cost $15,000 for a
     credible evaluation of its effectiveness,
      there may be better uses for the $15,000.
Determining the Scope of Your
Evaluation
   Ideally, you would  want more than one
type of evaluation.  Rarely does anyone have
access to resources for ideal risk communica-
tion efforts, much  less an  ideal evaluation
component. Scarce resources, therefore, should
be matched with those evaluation activities
that are most important.

   Set Evaluation Objectives and Priorities—
After you've determined which types of evalu-
ation are relevant for your needs, think about
these questions:

   1. What aspects of the risk communication
     activities are most important to evaluate?

   2. Which evaluation  activities  will con-
     tribute the most to improving the current
     risk communication effort?

   The previous discussion of formative, pro-
cess, and outcome evaluation can help guide
you in setting evaluation priorities.

   Match Priorities with Resources—People
often underestimate the amount and types of
resources  available  to them for evaluation.
Think carefully about what resources are avail-
able:

   • staff and other people resources, such as
     committee  members,  associates from
     other programs, and volunteers

   • budget funds and  "in kind" resources
     such as computer  time, mailing costs,
     and printing services available from an-
     other source

   With a little creative thinking, you will find
that you can include some form of evaluation
for almost any size of budget.  The chart below
(Table 1) gives examples of evaluation tasks you
might consider if you don't really have an
evaluation budget ("minimal resources"), and
if you have a modest budget for evaluation.
It also gives  you examples  of the kinds of
evaluations you might ideally consider ("sub-
stantial resources").

   The table is intended to  present general
guidelines for thinking about what can be done.
Once you begin to  look at the costs of the
specific evaluation activities presented in the
following chapters, you can revise the scope of
your evaluation.
Summary
   This chapter has introduced the different
types of evaluation and when they are most
useful. Throughout the guidebook, examples
from previous evaluations are provided to help
you think about how you might use evaluation.
After reading the next  several  chapters, you
can return to this section to clarify your priorities
and determine an appropriate scope for your
evaluation.  Keep in mind that evaluation of
risk communication activities is doable, af-
fordable, and can help  you achieve your ob-
jectives.
          Suggested Readings
   Green, Lawrence W., and Frances Marcus
   Lewis, Measurement and Evaluation  in
   Health Education and Health Promotion,
   Palo Alto, CA: Mayfield Publishing Co.,
   (1986).

   Herman, Joan L., Lynn Lyons Morris, and
   Carol  Taylor  Fitz-Gibbons, Evaluator's
   Handbook, Newbury Park, CA: Sage Pub-
   lications, (1989).
   Stecher, Brian M., and W. Alan Davis, How
   to Focus an Evaluation, Newbury Park, CA:
   Sage Publications, (1987).

   U.S. Department of Health and Human Ser-
   vices, Making Health Communication Pro-
   grams Work, Bethesda, MD: National Can-
   cer Institute, NIH Publication No. 89-1493,
   (1989).

Table 1. Evaluation Options Based on Available Resources

Formative evaluation
   Minimal resources:     Readability test
   Modest resources:      Central-location intercept interview
   Substantial resources: Focus groups; individual in-depth interviews

Process evaluation
   Minimal resources:     Record-keeping (e.g., monitoring activity timetables; number of
                          callers to a hotline or attendees at a community event)
   Modest resources:      Program checklist (e.g., check adherence to program plans)
   Substantial resources: Management audit (e.g., thorough management review of activities)

Outcome evaluation
   Minimal resources:     Activity assessments (e.g., demographics of callers to a hotline);
                          print media review (e.g., monitoring of content of articles
                          appearing in the media)
   Modest resources:      Progress in attaining objectives (e.g., periodic calculation of
                          percentage of target audience aware or participating); public
                          surveys (e.g., telephone surveys of self-reported knowledge or
                          behavior)
   Substantial resources: Assessment of target audience for knowledge gain (e.g., pretest
                          and posttest of change in audience knowledge); studies of public
                          behavior/health risk change (e.g., data on mitigating activities
                          or changes in public's risk status)

Adapted from U.S. Department of Health and Human Services, 1989.

3
THE PLANNING  PHASE:  INTEGRATING
COMMUNICATION AND  EVALUATION
Planning the Risk
Communication Effort

   Planning for evaluation and risk communi-
cation together will improve the timing and
coordination of important activities, reduce
cost, and increase the quality of feedback.

   During the planning phase, you must de-
cide whether a risk posed by an environmental
hazard can be addressed through communica-
tion. Risk communication activities during the
planning phase might consist of the following:

   • Identify target audiences.
   • Determine goals and objectives of the
     effort.
   • Write program plan and timetable.

   This is not a comprehensive list, but it
demonstrates the nature of activities taking
place.
Preparing for Evaluation

   In the planning phase, you should build on
an understanding of your evaluation objectives
and priorities (see Chapter 2) and begin creat-
ing an evaluation design. Regardless of the
type of evaluation you want to do, the five steps
described below will help you piece together
the key  steps for an effective evaluation.

These steps should serve as general guidelines
to get you started.
                                 Evaluation: Five Basic
                                           Steps
                              Step 1:  Clarify Risk Communication
                                      Goals and Objectives.

                              Step 2:  Determine Information Needs
                                      for Evaluation.

                              Step 3:  Collect the Information.

                              Step 4:  Analyze the Data.

                              Step 5:  Draw Conclusions.
                            Step 1: Clarify Risk Communication Goals
                            and Objectives
                               The terms goals and objectives often are
                            used interchangeably, but the slight difference
                            is significant. The goals of a program highlight
                            what the program is expected to accomplish
                            overall; the objectives are the intermediate
                             outcomes that are necessary to get there. A risk
                            communication strategy cannot be evaluated
                            without a clear set of goals and objectives.

                               The primary goal of risk communication
                             programs is to achieve reductions in
                            environmental risks. But expectations should
                            be reasonable.  In practice, it is difficult to set
                            specific targets and time frames for improve-
                            ments (e.g., a 5 percent reduction in environ-
                            mental health risks within five years). Also,
                            the relationship between cost  and effective-

ness remains unclear. Many factors other than
risk communication activities will influence
the exposure of the targeted audience to the
hazard and to information  about that hazard.
Nevertheless, every attempt should be made to
                             define the goals clearly and explicitly so that
they are measurable.
       In response to several scientific
   studies on the health effects of various
   indoor air pollutants, the EPA developed
   a risk communication strategy with the
   goal of reducing the potential health risks
   of individuals from exposure to indoor
   air pollutants [EPA (1990)].
    The objectives describe  the desired risk
communication outcomes, but not the specific
steps  for getting there.  These steps will be
determined later in developing the risk com-
munication strategy. Examples of risk com-
munication  program objectives might be to
increase awareness, to increase factual knowl-
edge, to change commonly held attitudes, or to
motivate behavioral change.
       The stated objectives of the EPA
   risk communication program for indoor
   air were to inform, to raise consciousness,
   and to provide realistic pollution pre-
   vention solutions that could be easily
   implemented in respondents' homes.
   (Note: this effort did not state actual
   mitigation activities as an objective.)
    If you want  to evaluate your success in
achieving the stated objectives, you must clarify
exactly what you expect to take place. Arkin
(1988) recommends making objectives

    •  specific,

    •  realistic or attainable,
   • prioritized to direct the allocation of re-
     sources,

   • measurable to assess progress towards
     the goal, and

   • time specific.

   Once written, these objectives serve as a
kind of "contract" that should allow
management  to assess the adequacy  of the
activities planned. In addition, they help plan-
ners and staff articulate their intentions. With
a clear description of what you hope to accom-
plish, you will be able to take several important
steps to plan your evaluation and data  collec-
tion strategies, including targeting exactly what
is to be observed or measured.
Step 2:  Determine Information Needs for
Evaluation
   Measuring Effectiveness—One of the most
important things to keep in mind as you are
setting objectives is to ask yourself: is it pos-
sible to evaluate the communication objec-
tives?  You should be creative and thoughtful
in choosing indicators that represent the objec-
tives being measured. These indicators will be
different for formative, process, and outcome
evaluations. For example, a formative evalu-
ation will be interested in the effectiveness of
various components of the communication ef-
fort  while an outcome evaluation would be
more interested in investigating overall ef-
fects. Determining what information you need
to collect for the evaluation need not be an
additional step;  it should be an integral part of
planning the risk communication strategy.

   The table below presents the types of infor-
mation that can be collected to answer different
evaluation questions.

                    Examples of Evaluation Questions
  1.  How many people were reached? (process evaluation)
     • Amount of time on radio and television and estimated audience at those times
     • Print coverage and estimated readership
     • Numbers of education materials distributed
     • Numbers of speeches/presentations and size of audiences
     • Number of other organizational and personal contacts

  2.  Did they respond? (process evaluation)
      • Number of in-person, telephone, and mail inquiries (location of inquirers, where they
       heard of the program, and what they asked for)
     • Number of new organizations, businesses, media outlets, etc. participating  in the
       program
     • Response (e.g., filled-out evaluation forms) from presentations

  3.  Who responded?  (outcome evaluation)
     • Demographics of responders (e.g., gender, education, income)
     • Geographic residence of responders

  4.  Was there change?  (outcome evaluation)
     • Changes in knowledge and/or attitudes
     • Changes in intentions (e.g., individuals say they will try not to smoke indoors)
     • Actions taken (e.g., increase in  enrollment in radon testing)
     • Policies initiated or other institutional changes made
     Adapted from Arkin, 1988.
   Developing an Audience Profile—Before
designing risk messages and materials, a needs
assessment should be conducted to develop a
profile of the targeted audience: their charac-
teristics, habits, needs, resources, and inter-
ests. This baseline data can be used later for
both improving  materials (formative) and
measuring progress  in achieving goals and
objectives (outcome).

   After you have developed a profile of your
targeted audience, it may be useful to build a
system to track their characteristics so you can
   • periodically assess progress and the need
     for modification or new activities, and

   • identify the change in status among the
     target  audience  when your effort is
     completed.

   Often, audience  surveys are inappropri-
ately timed, are sporadic, or are incompatible,
so results cannot be compared.  To avoid
these problems, plan early for appropriate au-
dience tracking.

       In 1988, the EPA's Office of Toxic
  Substances planned a public informa-
  tion program to help the public under-
  stand information related to toxic sub-
  stances released in the environment. A
  needs assessment was commissioned to
  identify current awareness, knowledge,
  perceptions, concerns, needs, and wants
  of various public groups (e.g., affected
  citizens, environmentalists, community
  leaders, local government staff, health
  and media professionals, educators, and
  students) about toxic substances.

Step 3: Collect the Information
   Choosing Data Collection Techniques—
Once you have determined the  information
requirements for the evaluation,  you need to
choose data collection techniques.  Question-
naires, focus groups, key informant interviews,
and telephone  surveys are only  some of  the
collection techniques available to evaluators.
No one set of techniques is  appropriate  for
every evaluation—be sure to choose those that
fit your particular needs and resources. Chap-
ter 4 describes some of the most useful tech-
niques.

   In many cases, scarce resources will limit
the extensive use  of sophisticated survey  in-
struments.  It is possible, however, to gain
valuable feedback from less formal evaluation
tools. Kline, et al. (1989) have developed an
excellent catalogue of "quick and easy" evalu-
ation tools that are practical and easy to use.

   Determining  When to Measure—Your
data collection strategy can and should piggy-
back on other risk communication activities.
Try passing out evaluation forms at civic group
meetings to get feedback on the presentation of
materials or to identify weaknesses in the com-
munication strategy.  Or distribute public
newsletters that contain a tear-off coupon for
audience feedback. When and how often you
collect information will depend in part on
resource constraints. Chapter 6 discusses how
timing of measurement affects the formal
evaluation design.

Step 4: Analyze the Data
   After collecting the data, look at how well
the information relates to the risk communica-
tion objectives to evaluate whether the activities are
effective.  The analysis can only be as good as
the information collected during the evalua-
tion.  In the case of qualitative information,
there  will necessarily be a high degree of
subjectivity to the analysis.  In the case of
quantitative assessment, such as that for out-
come evaluation, the analysis will require us-
ing statistical techniques.  Don't be intimi-
dated by the prospect of using statistics; ex-
perts are available within the Office of Policy,
Planning and Evaluation or nearby research
centers and universities. Additional resources
are listed in the selected readings at the end of
chapter 6.
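
   To make the idea of a quantitative outcome analysis more concrete, the sketch below
compares the share of a target audience answering a knowledge question correctly before
and after a communication effort, using a two-proportion z-test. This is only an
illustration: the survey counts and names are hypothetical, and the test shown is just one
of several that an evaluator or a statistical consultant might choose.

    import math

    def two_proportion_z(correct_before, n_before, correct_after, n_after):
        """Compare the share answering correctly before and after the effort."""
        p1 = correct_before / n_before      # pre-campaign proportion
        p2 = correct_after / n_after        # post-campaign proportion
        pooled = (correct_before + correct_after) / (n_before + n_after)
        se = math.sqrt(pooled * (1 - pooled) * (1 / n_before + 1 / n_after))
        z = (p2 - p1) / se                  # positive z means knowledge increased
        return p1, p2, z

    # Hypothetical survey results: 120 of 400 correct before, 190 of 400 after.
    p1, p2, z = two_proportion_z(120, 400, 190, 400)
    print(f"Before: {p1:.0%}  After: {p2:.0%}  z = {z:.2f}")
    # A z value above roughly 2 suggests the change is unlikely to be chance alone.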

Step 5: Draw Conclusions
   Once you have collected and analyzed the
data, you must be able to draw conclusions
about the  effectiveness of various program
components or of the overall program. In most
cases, the  results of the evaluation probably
will highlight some successes as well as some
failures.  For example, you might find that
although most groups understood the message,
particular subgroups of the target population
remained confused  about the magnitude of
risks.  Or, you might find that certain segments
of the audience received the communication,
but that behavioral change was much lower
than intended.

    Remember, risk communication is a diffi-
cult and complex process, and even experi-
enced practitioners face unpredictable obstacles
requiring new skills and approaches. Keep in
mind that learning can take place from both
successes and failures. If a particular activity
is not effective, evaluation can help identify
the cause and thereby improve future efforts.

    If you are going to make recommendations
that are controversial, make sure that you can
support your findings with solid evidence.


Summary
    The five steps  in this chapter provide a
rough guide for developing an evaluation
strategy or design. The following chapters will
help you fill in the  blanks by describing the
evaluation activity most appropriate to a par-
ticular project  phase: Chapter 4, "The Design
Phase: Developing and Pretesting Materials,"
emphasizes  formative evaluation; Chapter 5,
"The  Implementation Phase:  Executing the
Strategy and  Tracking Details,"  highlights
process evaluation; and Chapter 6, "Program
Assessment: Evaluating Effectiveness," out-
lines outcome evaluation. Remember, each of
these evaluation types requires preparation
during the planning phase of the project.  In
addition, evaluation activities might overlap in
different phases of the program.
         Selected Readings
Arkin, Elaine, "Evaluation for Risk Com-
municators." Presented at the Workshop on
Evaluation and Effective Risk Communi-
cation, Washington, DC, June 2-3, 1988.

Dillman, Don A., Mail and Telephone Sur-
veys: The Total Design Method, New York:
John Wiley and Sons, (1978).

Kline, Mark, Caron Chess, and Peter M.
Sandman, Evaluating Risk Communication
Programs: A Catalogue of "Quick and Easy
Methods" Rutgers University, NJ:  Envi-
ronmental Communication Research Pro-
gram, (1989).

U.S. Environmental Protection Agency, The
Inside Story: A Guide to Indoor Air Qual-
ity—How Well is it Working?, Washington,
DC:  Office  of Policy, Planning, and
Evaluation, EPA 230-01-073, (1990).

4
THE DESIGN PHASE:  DEVELOPING AND
PRETESTING MATERIALS
Designing the Risk
Communication Effort
    Once the planning phase is over, it is time
to get the ball rolling. Communication activities
will include the following:

    • Identifying messages and materials;
    • Deciding whether to produce new mate-
     rials;
    • Developing message concepts;
    • Developing draft materials;
    • Choosing communication channels.

    In one case, EPA's Office of Toxic Sub-
stances designed a public information program
to help the  public understand information re-
lated to toxic substances released in the envi-
ronment. In the design phase, communicators
tried to

    • identify and evaluate existing educational
     materials to prevent duplication of effort
     and assure optimal use of EPA resources.

    • identify credible sources of information
     and potential delivery  channels (e.g.,
     League of Women  Voters  chapters,
     homeowners associations) to guide the
     design of communications activities.

    • test messages explaining  the meaning
     and implications of toxic emissions (e.g.,
     public understanding of terms such as
     emission, risk, toxicity,  dose, exposure,
     and health effects).

    Many sources exist for help with the activi-
ties above.  Risk communication materials
exist from previous EPA programs.  In addi-
tion, the risk communication literature  has
many guidelines for designing an activity.


Formative Evaluation:
Pretesting Materials
   Pretesting draft materials is a type of for-
mative evaluation used to help ensure  that
communications materials will work. Pretest-
ing is used to answer questions about whether
the materials meet the following criteria:

   • understandable
   • relevant
   • attention-getting and memorable
   • attractive
   • credible
   • acceptable to the target audience

   These are factors that can make the differ-
ence in whether materials work or don't work
with a particular group; they also involve value
judgments by the respondents in the pretest and
your interpretation of what they mean. Most
pretesting involves a few persons chosen to
represent the intended target audiences, rather
than a statistically valid sample (see Chapter 6
for more information on choosing a sample).
Pretesting is generally "qualitative research",
research  that can be interpreted somewhat
loosely to provide clues about audience ac-
ceptance and direction regarding materials
production and use. It can screen out materials
and approaches that clearly won't work, but

such qualitative pretesting cannot guarantee
success.

    Pretesting Methodology: Going About the
Evaluation—The best methods for a particu-
lar risk communication effort depend upon the
nature of the materials, the target audience, and
the amount of time and resources available for
pretesting.  No formula exists for selecting a
pretest methodology, nor is there a "perfect"
method for pretesting.  Methods  should be
selected and shaped to fit each pretesting re-
quirement, considering the objectives  of and
resources available for each project.

    This chapter describes some methods for
pretesting environmental health risk concepts,
messages, and materials. In addition, sample
questionnaires are included in Appendix A and
other pretesting materials are included in Ap-
pendix C, for you to adapt. Each method has
both benefits and limitations.   Sometimes
combining methods will overcome the  limita-
tions of individual procedures.  For example,
focus group interviews may be used to identify
issues and concerns relative to a particular
audience, followed by individual interviews to
discuss particular concerns in greater depth.

     EPA pretested an early draft of a
  booklet for citizens about lead in drink-
  ing water. The pretests revealed that
  the draft was more appropriate for
  managers of the water supply system,
  did not convey the important message
  that testing was the only way to deter-
  mine whether there were high levels of
  lead at the household's water tap, and
  did not tell citizens how to get their
  water tested.  These problems were
  remedied in the final version of Lead and
  Your Drinking Water, in keeping with a
  basic risk communication rule: Don't
  alert people to what they perceive as a
  new risk without telling them  how to
  reduce it.

   Readability testing should be used as a first
step in pretesting draft manuscripts. This might
be followed by contacting target audience re-
spondents through individual questionnaires
or interviews regarding the materials. Central
location interviews or theater testing of mes-
sages for television or radio permits contact
with larger numbers of respondents  and is
especially useful prior to final production of
materials.   Guidance on how to  choose  the
most suitable method for a particular situation
follows the descriptions of pretesting methods.
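
   As a concrete illustration of what a readability test involves, the sketch below computes
the widely used Flesch Reading Ease score for a short draft passage. The syllable counter is
a rough heuristic and the sample text is invented, so the result should be treated as
illustrative rather than as the scoring method used by any particular EPA office; commercial
readability software offers more refined versions of the same idea.

    import re

    def count_syllables(word):
        """Rough syllable estimate: count groups of adjacent vowels (a crude heuristic)."""
        groups = re.findall(r"[aeiouy]+", word.lower())
        return max(1, len(groups))

    def flesch_reading_ease(text):
        """Flesch Reading Ease: higher scores (toward 100) indicate easier reading."""
        sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
        words = re.findall(r"[A-Za-z']+", text)
        syllables = sum(count_syllables(w) for w in words)
        return (206.835
                - 1.015 * (len(words) / len(sentences))
                - 84.6 * (syllables / len(words)))

    # Hypothetical draft copy for a citizen brochure.
    draft = ("Radon is a gas you cannot see or smell. "
             "Testing your home is the only way to know your radon level.")
    print(f"Flesch Reading Ease: {flesch_reading_ease(draft):.1f}")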

   Pretesting offers both the opportunity and
the temptation to structure the test and interpret
the results to support or justify a preconceived
point of view. It is natural to want your favorite
concepts or messages to test well, but there is
no need to test unless you are willing to con-
sider the results objectively.

   One final  point:  pretesting does not
guarantee success. Good planning and sound
pretesting  can be negated by mistakes in final
production. The message in a radio PSA on
radon testing, for instance, may pretest well,
but then be flawed by an execution that uses an
actress who seems too happy to be concerned
about possible exposure.  Similarly,  leaflet
copy that pretests well may be rendered inef-
fective by a poor layout, hard-to-read type, and
inappropriate illustrations.


Excuses for Avoiding Pretesting
   "I don't have the time or money."
   Pretesting needs to be included as one step
in your risk communication development pro-
cess from the beginning. Your project plans
should include time and resources for the pre-
test and for any changes you might need to
make as a result of the pretest. Otherwise, you
may not have the funds, and your boss may see
the time for pretesting and alterations in mate-
rials as a delay in production rather than evi-
dence of careful program development.

    "My boss won't support pretesting."
   Use the information in this guide and in the
Suggested Readings to convince him or  her
that you need to pretest. Beautiful materials
and an elegant program design can't guarantee
that  the target audience will pay attention,
understand and relate to your messages.  It's
cheaper to find out whether the materials have
a chance to work before they are produced than
to have to start over later, or worse—have an
unsuccessful program.   Once you have pre-
tested, be sure to explain to your superiors (in
person or in a report) how it worked and how
you modified your approach in response to the
pretesting. Build a case for their acceptance of
future pretesting. Using quotes from the target
audience or anecdotes to illustrate your findings
can make your report more  interesting and
memorable.

    "I can tell the difference between good and
bad materials—I don't need to pretest."
   Many people have said this over the years,
only to find out they can be wrong.  Your
training and experience are essential creden-
tials, but are you sure you can react objectively
to materials you have created or are responsible
for?  Can you really assume the role of people
who are different from  you  (if you are  not
representative of the target audience) and  see
your materials through their eyes? For example,
the "don't drink and drive" program learned
through pretesting that teenagers were more
threatened by  the possibility of losing their
license than by the threat of injury, death, or
parental disapproval.

    "Our artist/producer says that pretesting
can't be used to judge creativity."
    Graphics staff, artists, and creative writers
may be sensitive to criticism from "nonprofes-
sionals," including the target  audience.  Ex-
plaining the purpose of pretesting or involving
them  in the pretest process may help them
understand and appreciate the process.  You
should explain that you are testing all elements
of the communication—your original com-
munication strategies, the message, the pre-
sentation—and not just their work. By testing
alternative concepts you can provide the cre-
ative staff with direction without telling them
their work "failed."


Pretesting  Methods
    The most frequently used pretesting meth-
ods are as follows:

    • focus group interviews
    • readability testing
    • self-administered questionnaires
    • central location intercept interviews
    • theater testing

    These methods are described below. There
is a summary chart later in this chapter to help you
compare the advantages and disadvantages of
each method.

1.  Focus Groups
    Focus groups are a form of qualitative
research adapted by market researchers from
group therapy. They are used to obtain insights
into target audience perceptions, beliefs, and
language.  A focus  group interview is con-
ducted with a group of about 8 to 10 people.

       Using a discussion outline, a moderator keeps
       the session on track while allowing respon-
       dents to talk freely and spontaneously. As new
       topics related to the outline emerge, the mod-
       erator probes further to gain useful insights.

           Focus groups  are especially useful in the
       concept development stage of the communica-
       tion process. They provide insights into target
       audience  perceptions, misconceptions, atti-
       tudes,  and beliefs on an environmental risk
       issue, allow planners to explore perceptions of
        message concepts, and help trigger the creative
       thinking of communication professionals. The
       group discussion stimulates respondents to talk
       freely, providing valuable clues for developing
       materials in the audience's own language and
       suggestions for changes or new directions.

           Focus groups  also can be used to supple-
       ment quantitative  research. Market research-
       ers originally developed this technique to ex-
       plore in greater depth the data from large scale
       consumer surveys.  Obtaining in-depth infor-
       mation from individuals typical of the target
       audience can provide insights into what the
       statistical data mean, or why  individuals re-
       spond in  certain ways.

           Respondents  selected for focus  groups
       should be typical  of the intended target audi-
       ence.  Various subgroups within the target
       audience may be represented in separate group
       discussions, especially when discussing sensi-
       tive or emotional subjects, to segregate respon-
       dents  by  age, sex, race, or whatever other
       variable is likely to hinder freedom of expres-
       sion. Respondents  are recruited one to three
       weeks in advance of the interview sessions,
       usually by telephone. They may be recruited
       using the telephone directory and interviewed
       by phone to determine if they qualify for the
       group. Or they may be recruited from among
                                           members of a relevant organization, place of
                                           employment, or other source.  Lastly, private
                                           firms can be hired to identify participants and
                                           appropriate facilities. Recruiting respondents
                                           "at random" is not required because the results
                                           from focus group research are not intended to
                                           be statistically representative.

                                              There  are several  important  criteria for
                                           conducting effective group  interviews.  Ide-
                                           ally, respondents should not know the specific
                                           subject of the sessions in advance, and they
                                           should not know  each other.   Knowing the
                                           subject may result in respondents formulating
                                           ideas in advance and not talking spontaneously
                                           about the topic during the session. Knowing
                                           other respondents may inhibit individuals from
                                           talking freely. Finally, all respondents should
                                           be relative "newcomers" to focus group inter-
                                           views. This permits more spontaneity in reac-
                                           tions and eliminates the  problem of "profes-
                                           sional" respondents who may lead or monopo-
                                           lize the discussion. For the same reasons, you
                                           may want to exclude health professionals and
                                           market researchers from  focus groups.

                                              Desvousges and Smith (1988) present the
                                           following lessons for implementing  focus
                                           groups:

                                              • Work with civic groups, church organi-
                                                zations, and social organizations to reach
                                                target segments.
                                              • Make sure the organizational structure of
                                                the group knows about the session and its
                                                objective.
                                              • Send people a confirmation letter and a
                                                brochure about your organization to re-
                                                duce anxiety about intentions.
                                              • Don't try to hold focus groups with re-
                                                spondents who might have difficulty with
                                                a topic. One-on-one in-depth interviews
                                                may be a better alternative for targeting
                                                these individuals.

-------
    • Have clear objectives and a written agenda
     to keep the sessions on track and to en-
     sure that all important topics are covered.

    • Select a relaxed setting with an informal
     format. Community halls, church halls,
     or local meeting places all work well.
     Refreshments help to break the ice.

    • Keep the session to no more than two
     hours. While a break is generally unnec-
     essary, a short one can sometimes help
     reorient the discussion if people are tend-
     ing to pursue extraneous  matters and
     offers a natural opportunity to shift gears
     and review issues in a  different way.

    • Remain at the location for a while after
      the session officially ends. Remember,
      discussion of important or controversial
      topics can influence people after they
      leave the session, so attention to infor-
      mal opportunities for discussion can
      moderate impacts and ease anxieties.

    There is no firm rule about the number of
focus groups that should be conducted.  The
number of groups depends  upon your needs
and resources.  If target audience perceptions
appear to be comparable after a few focus
groups (you'll need at least two groups to make
this decision), you may not find out any more
by convening additional sessions.  If percep-
tions vary, and the direction for message devel-
opment is unclear, additional groups may be
beneficial. In this case, revisions in the discus-
sion outline after a few groups can help clarify
unresolved issues in the additional groups.

    Use an experienced, capable moderator,
with skills for handling the group process. The
moderator should not be designated as an ex-
pert in  the subject matter  being discussed;
rather, a good moderator builds rapport and
trust and probes respondents without reacting
to, or influencing, their opinions. The moderator
must be able to lead the discussion, and not be
led by the group. The moderator must emphasize
that there are no right or wrong answers to the
questions posed. A good moderator understands
the process of eliciting comments, keeps the
discussion on track, and makes it clear that he
or she is not an expert on the subject. You will
need to rehearse with the moderator to point
out any topics or concerns you want emphasized
or discussed in more depth.

      In 1990, EPA sponsored a series of
  focus groups to pretest draft materials
  explaining the health risks from radon
  in drinking water. Specific suggestions
  for improving the materials were made:

      • Change the title to "Radon and Well
        Water."
      • Eliminate information that is not
        specific to private well users.
      • Include information about water
        testing and treatment.
      • Design a simpler layout.
      • Display the EPA logo more
        prominently.
      • Replace "mitigation" with a more
        familiar phrase.
      • Include sources for more general
        radon information at the end and
        in the fact sheet.

   The results of focus group interviews should
be interpreted carefully.  It is useful for an
unseen observer (e.g., behind a one-way mir-
ror) to take notes as well as to tape record or
videotape  the session for later review.  In
interpreting the findings from group interviews,
you  should look for trends and  patterns in
target audience perceptions rather than just a
"he said ... she said" kind of analysis.

-------
          Group discussion should not be used when
       individual responses or quantitative informa-
        tion is needed. For example, when assessing
       the final copy for a booklet, it is more important
       to gather individual rather than group reactions
       to indicate the individual's actual comprehen-
       sion, perceptions and potential use. However,
       self-administered questionnaires can be com-
       pleted by each participant prior to beginning a
       group discussion to combine individual  and
       group reactions.

          Focus group  aids  are  included  in
       Appendix B.

       2. Readability Testing
          "Readability testing" simply predicts the
       approximate educational level a person must
       have in order to understand written materials.
       Risk communication materials such as pam-
       phlets, flyers, posters, and magazine articles
       are designed for distinct target groups; a read-
       ability test will indicate if they are written at a
       level  most of the audience  can understand.
       Assessing the readability  of a pamphlet or
       another printed message will not guarantee its
       effectiveness and is by no means an absolute
       indicator of success.

          Readability formulas use counts of  lan-
       guage variables  such as word and sentence
       length. The formulas have been devised statis-
       tically to predict readability. Generally speak-
       ing, the reading level required to understand a
       given pamphlet will be higher when its sen-
       tences are long or when it has many polysyl-
       labic words.

          Readability  formulas measure only the
       structural difficulty (i.e., vocabulary, sentence
       structure, and word density) of written text.
       They do not measure other factors related to
                                how "readable" a certain text is, such as sen-
                                tence "flow," conceptual difficulty, organiza-
                                tion of material,  the influence of format or
                                design of materials on comprehension, accu-
                                racy, or credibility. Readability tests are con-
                                ducted by program staff and do not include
                                participation by the audience for whom the
                                materials are being produced. Consequently,
                                readability testing supplements but does not
                                supplant the need to pretest with the target
                                audience.

                                    Despite  its limitations, readability testing
                                is useful because it

                                    • can be performed quickly,
                                    • is virtually without cost,
                                    • provides a tangible measure, and
                                    • reminds the writer to choose words and
                                      terms carefully.

                                    Based on a review of the advantages, dis-
                                advantages, and predictive validity of 12 se-
                                lected readability formulas, the NCI Office of
                                Cancer Communications chose the SMOG
                                grading formula for testing the readability lev-
                                els of its public and patient education materi-
                                als.  SMOG was  chosen because  it is both
                                simple to use and accurate. Complete instruc-
                                tions for using the SMOG readability test for
                                print materials are included in Appendix C.

                                    Environmental health risks often involve
                                many polysyllabic words and complex terms;
                                readability formulas have not been designed to
                                take into account such special terminology. In
                                some cases, extensive  use  of multisyllable
                                words known to be understandable to a particu-
                                lar audience (e.g., "radioactive") may lead to a
                                high readability score.  Therefore, as with all
                                pretesting, readability test results-should be
                                used as indicative and not predictive of prob-
                                lems or success.
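
   As a rough illustration only (Appendix C, not this sketch, contains the actual
instructions for applying SMOG to print materials), the commonly published SMOG
formula can be approximated in a few lines of code. The syllable counter below is a crude
heuristic, and the sample text and constants are illustrative, not taken from this guide.

    import math
    import re

    def count_syllables(word):
        # Rough heuristic: count groups of consecutive vowels.
        return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

    def smog_grade(text):
        # Count sentences and words of three or more syllables.
        sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
        words = re.findall(r"[A-Za-z']+", text)
        polysyllables = sum(1 for w in words if count_syllables(w) >= 3)
        # Commonly published SMOG formula, normalized to a 30-sentence sample.
        return 1.043 * math.sqrt(polysyllables * (30 / len(sentences))) + 3.1291

    sample = ("Radon is a naturally occurring radioactive gas. "
              "Testing your well water is inexpensive. "
              "Mitigation systems can reduce your exposure.")
    print(round(smog_grade(sample), 1))   # approximate grade level required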

-------
3. Self-Administered Questionnaires

    Self-administered  questionnaires offer
several advantages. They:

    • Enable  program planners to  elicit de-
     tailed information from respondents who
     may not be accessible for personal inter-
     views (e.g., doctors, teachers, or resi-
     dents of rural areas);

    • Allow respondents to maintain their ano-
     nymity  and reconsider their responses;

    • Do not require interviewer time and can
     be done relatively inexpensively;

    • Can be answered by many respondents at
     once;

    • Can be mailed to respondents along with
     the pretest materials;

    • Can be  distributed to respondents gath-
     ered at a central location;

    • Can be  used where personal interviews
     are not feasible;

    • Offer an inexpensive pretesting technique
     for agencies with minimal resources.

    A self-administered questionnaire should
be designed and then pilot tested with five to
ten respondents. Usually, questionnaires and
pretest materials are distributed to respondents
after they have  been contacted, but they also
may be mailed to potential respondents with-
out advance  notification.   Respondents are
asked to review the materials on their own, to
complete the questionnaire, and then to return
it within a specified time.

   The questionnaire should be relatively short
and clear or respondents may not complete it.
Clear,  concise instructions to the respondent
are important  because there is no interviewer to
offer clarification. Open-ended questions can
be used to assess comprehension and overall
reactions to materials and close-ended ques-
tions to assess  such factors as personal  rel-
evance and believability of the material. Mea-
sures of attention or recall may not be reliable
when used with this technique because respon-
dents can refer back to the material.

    Resources are invested primarily in ques-
tionnaire development and analysis of results.
The analysis costs can be kept lower by mini-
mizing the number of open-ended questions.
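
   As a sketch of how little analysis machinery close-ended questions require, the few
lines below tally hypothetical answers from returned questionnaires; the question names
and answer categories are invented for illustration.

    from collections import Counter

    # Hypothetical close-ended responses, one dictionary per returned questionnaire.
    responses = [
        {"main_message_clear": "yes", "personally_relevant": "somewhat", "believable": "yes"},
        {"main_message_clear": "yes", "personally_relevant": "yes", "believable": "yes"},
        {"main_message_clear": "no", "personally_relevant": "no", "believable": "not sure"},
    ]

    for question in ("main_message_clear", "personally_relevant", "believable"):
        counts = Counter(r[question] for r in responses)
        total = sum(counts.values())
        for answer, n in counts.most_common():
            print(f"{question}: {answer} - {n} of {total} ({100 * n / total:.0f}%)")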

    Self-administered questionnaires have
certain disadvantages:

    • The primary problem is the possibility of
     a low response rate.

    • It  is important to over-recruit respon-
     dents and recontact respondents to en-
     courage  them to return  their question-
     naires to ensure a sufficient number of
     returns.

    • The data collection may take longer than
     with other methods (e.g., central location
     intercept interviews) because of delays
     in responses, especially if the question-
     naires are mailed.

    • The type of respondents who return the
     questionnaires may be  different from
     those who do not respond, and this ap-
     proach cannot be used with respondents
     who have reading and writing limita-
     tions.  Hence, a certain degree of bias
     may be introduced, so results should be
     interpreted with this in mind. (Phone
     calls to those who did not respond will
     permit a comparison of  respondent/
     nonrespondent answers.)


4. Central Location Intercept Interviews
    Central location intercept interviews in-
volve stationing  interviewers  at a point fre-
quented by individuals from the  target audi-
ence and asking them to participate in  the
pretest. There are two advantages to this:

    • A high traffic area (e.g., a shopping mall,
     hospital waiting area, or school yard) can

-------
            yield a number of interviews in a reason-
            ably short time.

           • A central location for hard-to-reach tar-
            get audiences can be  a cost-effective
            means of gathering data.

           A typical central location interview begins
       with the intercept. Potential respondents are
       stopped and asked whether they will partici-
       pate.  Then specific screening questions are
       asked to see whether the potential respondents
       fit the criteria of the target audience. If so, they
       are taken  to the interviewing station (a quiet
       spot at a shopping mall or other site), are shown
       the pretest materials, and asked questions. The
       questions can help assess the following:

           • comprehension
           • intentions
           • individual reaction
           • personal relevance
           • credibility
           • recall (if test situation includes exposure
            to the materials prior to the interview)

           These interviews cannot tell  you about
       behavioral responses over time unless you
       sample before and  after the communication
       effort.

           Although  the respondents intercepted
       through central location interviews may not be
       statistically representative of the target popu-
       lation, the sample is usually larger than those
       used in focus groups or individual in-depth
       interviews.  You may be able to  get a more
       representative sample if your  audience has
       easily identifiable characteristics (e.g., pregnant
       women).

           Unlike focus groups or in-depth interviews,
       the questionnaire used in central location in-
       tercept pretesting is highly structured and
       contains  primarily multiple choice or close-
                                           ended questions  to permit  quick  response.
                                           Open-ended questions, which allow  "free
                                           flowing" answers, should be kept to a  mini-
                                           mum because they take too much time for the
                                           respondent to answer and for the interviewer to
                                           record responses.  The questionnaire, as in any
                                           type of research, should be pilot-tested before
                                           it is used in the field. Several sample question-
                                           naires are included in Appendix A.

                                               A number of  market research companies
                                           throughout the country conduct central loca-
                                           tion intercept interviews in shopping malls. In
                                           some cases, interactive computer  programs
                                           have been used effectively to stimulate interest
                                           of potential interviewees. Clinic waiting rooms,
                                           churches, Social  Security  offices, schools,
                                           worksites, or other locations frequented by
                                           individuals representative of the target audience
                                           also can be  used for this purpose. Be sure to
                                           obtain clearances  or permission to set up inter-
                                           viewing stations  in these locations well in
                                           advance.

                                               Posters can be tested in the kind of setting
                                           (e.g., a clinic waiting  room or schoolroom)
                                           where they  will be used.  Posters should be
                                           mounted on  a wall along with other materials—
                                           just as they are expected to be used—where the
                                           target audience passes, gathers, or waits.  Se-
                                           lecting respondents from among those who
                                           have been "exposed" to the poster in its "natural
                                           setting" prior to the interview, and then mov-
                                           ing to a nearby but separate location to ask
                                           questions, will permit an assessment of factors
                                           such as comprehension and personal relevance,
                                           and also whether

                                               •  the material attracts attention, and
                                               •  the respondent can recall the material
                                                 when exposed to it in a "natural" setting.

                                               The major advantage of the central loca-
                                           tion intercept approach is its cost-effectiveness

-------
for interviewing large numbers of respondents
in a short amount of time. For example, in one
recent mall-intercept survey, researchers got
400 interviews in one day at a modest cost of $5
each. Because these interviews are intended to
provide guidance ("qualitative" information),
the size of the sample should only be large
enough  to give you answers to your pretest
questions. If you have interviewed 30 respon-
dents and most of them feel similarly about
your materials, you are probably ready to stop.
If, however, there are substantial disagree-
ments or differences among respondents, or
their  responses have  raised new questions,
additional interviews should be conducted un-
til you are satisfied that you have clear direc-
tion from the respondents. You may decide to
revise (and  perhaps test again)  after fewer
interviews if it is clear that changes are needed.

    Designing a central location intercept pre-
test can be  relatively easy.   A  few simple
questions ("Do you own a home?" "How old
are you?" "Do you have teenage children?")
can identify respondents typical of the target
audience quickly at the point of intercept.
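
    The screening step itself can be reduced to a simple yes/no rule. The criteria below
are hypothetical, in the spirit of the questions above, and would be replaced by whatever
defines your own target audience.

    # Hypothetical screener for a pretest aimed at homeowners with teenage children.
    def qualifies(owns_home: bool, age: int, has_teenage_children: bool) -> bool:
        """Return True if the intercepted person matches the target audience."""
        return owns_home and 25 <= age <= 65 and has_teenage_children

    # Invite qualifying respondents to the interviewing station; thank the rest.
    print(qualifies(owns_home=True, age=44, has_teenage_children=True))    # True
    print(qualifies(owns_home=True, age=30, has_teenage_children=False))   # False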

    Questions to assess comprehension and
target audience perceptions of the pretest ma-
terials form  the core of the questionnaire.  A
few additional questions, tailored to the spe-
cific item or items being tested ("Do you prefer
this picture—or this one?"), also may be con-
structed to meet program planners' particular
needs. The interview should be no longer than
10 minutes.  If it must be longer, you may need
to design special incentives to convince the
respondent to continue the interview (e.g., a
small payment or gift, or a plea regarding the
importance of the subject and their opinions).

    Central location intercept interviews should
not be used if respondents must be interviewed
in depth or on emotional or very sensitive
subjects. The intercept approach also may not
be suitable if respondents  are  likely to  be
skeptical or resistant to being interviewed on
the spot (e.g., commuters anxious to return
home).  Although it is time-consuming to  set
up prearranged appointments,  they actually
may save time if respondents are unwilling to
cooperate in a central location.

5. Theater Testing
   "Theater" tests are so-called because they
gather a large group of respondents in a room
(or "theater"-style setting)  at once to react,
usually to audio or audiovisual materials.
Commercial services conduct theater-style tests
for advertising agencies; this technique can be
adapted for environmental risk messages.  In
commercial theater testing, up to 300 respon-
dents are recruited by telephone to a central
location, such as  a hotel.  Respondents  are
asked to watch a "pilot" television program to
judge whether it should be aired.

    Commercials are included in the program;
some are control (constant) spots, while others
are being tested.   At the conclusion of  the
program, respondents are asked whether they
recalled any commercials (or PSAs), and then
asked questions regarding  content and per-
sonal relevance.  A similar sequence can be
used to test radio PSAs.

   Theater testing  quickly gathers a large
number of responses. Unlike some other pre-
test methods, the materials being tested  are
embedded within a program, with commer-
cials, to simulate a natural viewing situation.
This permits the assessment of how likely  the
audience is to pay attention to and remember
the message.

    Because commercial  testing services  are
costly, you should consider conducting your
own. A guide to conducting your own theater-

-------
  •"/car/re environmental
              TABLE 2.  PRETEST METHODS:  SUMMARY
I.   Individual
a.   Self-administered Questionnaires (mailed or personally delivered)
    Purpose:               To obtain individual reactions to draft materials
    Application:             Print or audiovisual materials
    Number of Respondents: Enough to see a pattern of responses (Minimum 20; 100-200
                           ideal)
    Resources Required:     Lists of respondents; Draft materials; Questionnaire; Post-
                           age (if mailed); Tape recorder or VCR (for audiovisual mate-
                           rials)
    Pros:                   Inexpensive;  Does not require staff time to  interact with
                           respondents (if mailed); Can be anonymous for respondents;
                           Can reach homebound, rural, or other difficult-to-reach groups;
                           Easy and (usually) quick for respondents
    Cons:                  Response rate may be low (if mailed); May require follow-up;
                           May take long time to receive sufficient responses; Respon-
                           dents self-select (potential bias); Exposure to materials isn't
                           controlled; May not be appropriate if audience has limited
                           writing skills

b.   Individual Interviews (phone or in  person)
    Purpose:                Probe for individual's responses, beliefs, discuss range of
                            issues
    Application:            Develop hypotheses, messages, potentially motivating strat-
                            egies; Discuss sensitive issues or complex draft materials
    Number of Respondents:  Minimum of 10 per type of respondent
    Resources Required:     Lists of respondents; Discussion guide/questionnaire; Trained
                            interviewer; Telephone or quiet room; Tape recorder
    Pros:                   In-depth responses may differ from first response; Can test
                            sensitive or emotional materials; Can test more complex/
                            longer materials; Can learn more about "hard-to-reach" audi-
                            ences; Can be used with individuals who have limited reading
                            and writing skills
    Cons:                   Time consuming to conduct/analyze; Expensive, and may
                            yield no firmer conclusion or consensus
c.   Central Location Intercept Interviews
    Purpose:               To obtain more quantitative information about materials/
                           messages
    Application:             Broad range, including concepts, print, audiovisual materials
    Number of Respondents: 30-100 per type (enough to establish pattern of response)
    Resources Required:     Structured questionnaire; Trained interviewers; Access to
                           mall, school, other location; Room or other place to interview;
                           Tape  recorder or VCR (for audiovisual materials)
    Pros:                   Can quickly conduct large number of interviews; Can provide
                           "reliable" information for decision-making; Can test many
                           kinds  of  materials; Can use to get  respondents for self-
                           administered  questionnaire; Quick to analyze  close-ended
                           questions

-------
               TABLE 2.  PRETEST METHODS: SUMMARY
                                 (continued)
    Cons:                   Short (10 min.) interviews; Incentive/persuasion needed for
                            more time; Cannot probe; Cannot deal with sensitive issues;
                            Sample is restricted to individuals at the location; Respon-
                            dents choose to cooperate and may not be representative

II.  Group
a.   Focus Group Interviews
    Purpose:                To obtain in-depth information about beliefs, perceptions,
                            language, interests, concerns
    Application:            Broad; concepts, issues, audiovisual or print materials, logos/
                            other artwork
    Number of Respondents:  8-12 per group; Minimum 2 groups per type of respondent
    Resources Required:     Discussion outline; Trained moderator; Lists of respondents;
                           Meeting room; Tape recorder; VCR (for audiovisual materials)
    Pros:                   Group interaction and length of discussion can stimulate more
                           in-depth responses; Can discuss concepts prior to materials
                           development; Can gather more opinions at once; Can com-
                           plete groups and analyses quickly; Can cover multiple topics
    Cons:                  Too few respondents for making generalizations; No  indi-
                           vidual responses (group influence) unless combined  with
                           other methods; Respondents choose to attend,  and may not
                           be typical of the target population
b.   Theater Testing
    Purpose:                To test audiovisual materials with many respondents at once
    Application:            Pretest audio or audiovisual materials
    Number of Respondents:  60-100 per type (enough to establish a pattern of response)
    Resources Required:     Lists of respondents; Questionnaire; Large meeting room; AV
                            equipment
    Pros:                   Can test with many respondents at once; Large sample may
                            be more productive; Can be inexpensive; Can analyze quickly
    Cons:                   Few open-ended questions possible; Can require more elabo-
                            rate preparation; Can be expensive if incentives required

III. Nonparticipatory
a.   Readability Tests
    Purpose:                To assess reading comprehension skills required to under-
                            stand print materials
    Application:            Print materials
    Number of Respondents:  None
    Resources Required:     Readability formula; 15 minutes
    Pros:                   Inexpensive; Quick
    Cons:                   "Rule of thumb" only—not predictive; Does not account for
                            environmental or health terminology; No target audience
                            reaction
   Source: U.S. Department of Health and Human Services 1989.

-------
              TABLE 3. APPLICABILITY OF PRETESTING METHODS

   [Only the table's headings survived reproduction; the applicability marks are not shown.]

   Pretest methods (columns): Readability Tests; Focus Groups; Self Tests; Individual
   Interviews; Central Location Interviews; Mail Questionnaires; Theater Tests (grouped
   as nonparticipatory, qualitative, or qualitative/quantitative).

   Materials (rows): 1. Concept Development; 2. Poster; 3. Flyer; 4. Booklet; 5. Notification
   Letter; 6. Storyboard; 7. Radio PSA; 8. TV PSA; 9. Videotape.

   Adapted from U.S. Department of Health and Human Services 1989.
style tests  is included in the HHS  (1989)
planner's guide in the selected readings. You
can choose a setting where the target audience
gathers and where they can assemble in a large
group (e.g., a senior citizens center, a school
auditorium) to conduct your own theater-style
test at lower cost.


Determining What  and How
Much To Test
    Qualitative research should be conducted
in the early stages of program development
before full funds have been committed to ma-
terials production and while messages can be
changed if necessary. As noted earlier, testing
can be useful at the concept development stage,
once audiences and communication strategies
have been determined, and prior to message
development. Exploration with the target au-
dience at this stage, most frequently through
focus group discussions,  can help determine
appropriate message appeals (e.g., fear arous-
ing vs. factual), effective spokespersons (e.g.,
              a scientist, public official, or member of the
              target  audience), and appropriate language
              (determined by listening to the group discus-
              sion).

                  Testing of drafted materials prior to final
              production permits identification of flaws prior
              to the expenditure of funds for final produc-
              tion, and especially prior to the use of materials
              with target audiences.

                  A combination of methods can be used to
              assess an  audience's  comprehension, the
              message's  believability, personal relevance,
              acceptability, and other strong and weak points.
              Methods should be selected to suit the purpose
              of the testing, the sensitivity of the subject, and
              the resources available for testing.  Adequate
              investigation is especially important when de-
              veloping sensitive or potentially frightening
               messages, presenting complex, new informa-
              tion, or designing a new  approach.  In these
              cases,  pretesting can reveal  potential prob-
              lems, but  must be carefully  structured, con-
              ducted, and analyzed.

-------
    Qualitative research responses cannot be considered representative of the public, nor
can they be projected to the population as a whole. If representativeness is required, more
formal methodologies should be used. However, for most pretesting purposes, qualitative
methods may be more valuable because they provide insights into thinking and reasons for
attitudes or misunderstandings that are vital to refining messages and materials.

    When deciding when, whether, and how much you should use pretest methods in
developing your program, consider:

    • How much do you know about the target audience?

    • How much do you know about them in relation to your environmental risk problem
      or issue?

    • Is your issue or problem new, controversial, sensitive, or complex?

    • Have you conducted related research that can be applied to this topic?

    • Can you afford to make a mistake with a particular message or audience?

Planning and Conducting Pretests

    The level of effort and staff resources required will vary considerably from one pretest
to the next. Most pretesting is conducted with small samples of respondents who are typical
of the target audience and who are easily accessible. The results, combined with your
professional judgment, provide important direction for improving messages and materials.

    This section provides practical suggestions. These suggestions should help you reduce
the time and costs involved, whether or not commercial research firms are hired to supply
field work and tabulation. The cost estimates in the chart on page 32 are for direct costs
only—not included are staff time to provide direction or other support you would provide
to the firm conducting the test. In some cases, you may reduce these costs by conducting
pretests on your own, with the help of an expert. Some market researchers will tell you that
bad research is worse than no research, and you must use professionals; others say that with
proper instruction, you can do some testing on your own. As long as you know the
limitations, some information is better than none. Both points of view are valid; venture
on your own with care.

    As in the planning stage of program development, a first step in planning a pretest is
to formulate the objectives. These objectives should be stated specifically to provide a clear
understanding of what you want to learn. Measures of attention, comprehension,
believability, and personal relevance are key.

    Designing the Questionnaire—When a questionnaire is used, specific questions to
identify strengths and weaknesses in rough messages and materials should be developed
based on the pretest objectives. Questions should not be asked just to satisfy someone's
curiosity.

    There are several ways to keep down costs for pretesting questionnaires:

    • Keep the questionnaire short and to the point.

    • Use close-ended questions whenever possible to permit easy tabulation and analysis.

    • Whenever possible, borrow questions from other pretesting studies.

    • Try to develop codes for quantifying responses in advance when open-ended
      questions are necessary. However, the responses to open-ended questions often
      provide valuable insights about how to develop effective risk communication
      materials and strategies.

-------
          Sample  questionnaires are included in
       Appendix A as  one resource.  In addition,
       Chapter 6 contains a description of the major
       components of a questionnaire used for out-
       come evaluation.

          Recruiting Respondents—If your budget
       does not allow you to hire a market research
       firm to recruit for various types of pretesting
       activities, you can recruit respondents your-
       self. Providing information or a speaker to a
       local church, school, civic or social organiza-
       tion may encourage members to participate in
       a pretest.

          Another way to ensure sufficient participa-
       tion is to recruit more people than actually are
       needed.  Often respondents who agree to par-
       ticipate do not show up. If all participants do
       show up, they should be included in the pretest,
       or the "extra" respondents should be informed
       that too many respondents are present, given
       the agreed-upon incentive, thanked, and al-
       lowed to leave.

          Here are some other ways to increase par-
       ticipation:

           • Schedule the pretest at a time that is most
            convenient for respondents (e.g., at lunch
            or after work).

           • Choose a safe and convenient site.

           • Provide transportation.

           • Arrange for child care during the time of
            the pretest, if necessary.

          Trained interviewers should be used when-
       ever possible. For focus group and in-depth
       interviews, this is essential. If your office has
       no experience in focus group studies, you might
       consider hiring a good, experienced modera-
       tor, observing and taping the sessions, and
       using them as training to develop in-house
                                            skills.  Local advertising agencies may be of
                                            assistance in identifying a  good moderator.
                                            Continuing education courses in interpersonal
                                            communication or group interaction may be
                                            useful for staff training or identifying potential
                                            interviewers.

                                               For conducting central location interviews,
                                            university and college departments of market-
                                            ing, communications, or health education might
                                            be able to provide  interviewer  training  and
                                            student interviewers. Pretesting a poster  or a
                                            PSA is an excellent "real world" project for a
                                            faculty member to adopt as a class project.
                                            Students in these departments are being trained
                                            in research methods, and pretesting can give
                                            them a chance to develop their skills.

                                               Facilities—Pretesting facilities should be
                                            quiet and comfortable.   Meeting  rooms at
                                            churches, office buildings, or other institutions
                                            can be used for conducting focus group or
                                            individual in-depth interviews. If an observa-
                                            tion room with a one-way mirror is not avail-
                                            able, you may allow staff to listen by hooking
                                            up speakers in a room nearby, or by audiotaping
                                            or videotaping the session. If necessary, one or
                                             two observers can sit at the back of the room,
                                            but they need to keep quiet so the focus group
                                            respondents  will not be influenced by their
                                            comments.

                                               Getting Help—Many resources exist for
                                            obtaining professional assistance in pretesting.
                                            Faculty at university departments of market-
                                            ing,  communications, health education, psy-
                                            chology or sociology can be helpful in design-
                                             ing and conducting pretests. Marketing research
                                             firms specializing in respondent recruitment,
                                             interviewing, tabulation, and other services
                                            sometimes have facilities for conducting group
                                            sessions and other techniques. The American
                                            Marketing Association's Marketing Services
                                            Guide lists suppliers and services geographi-

-------
cally throughout the United States.  Also, ad-
vertising  clubs  (many affiliated  with the
American Advertising Federation) and chap-
ters of the Public Relations Society of America
sometimes undertake public service projects at
no charge to nonprofit organizations.  Other
sources include the Marketing Research Asso-
ciation and the Association of Public Opinion
Researchers.

    One caution:  individuals trained in com-
mercial pretesting may not be completely aware
of all the  nuances and subtleties involved in
risk communication. They will be able to draw
on their commercial experience for selecting
the appropriate pretest methodology.  How-
ever,  other factors such as the  wording and
interpretation of questions and results are in-
fluenced by the complexities of risk informa-
tion. You should be prepared to supervise and
guide your consultants.


Summary

    To yield useful results, a pretest should be
planned carefully. Ample time  should be al-
lowed for

    • contracting with support firms (if neces-
     sary),

    • arranging for the required facilities (1-2
     weeks),

    • developing and testing the questionnaire
     (2-3 weeks),

    • recruiting interviewers and respondents
     (2-4 weeks),

    • gathering the data (1-2 weeks),

    • analyzing the results (1 week),

    • making the appropriate alterations in
     messages or materials, and

    • pretesting again, if needed.
   And adequate pretesting should include
the following:

   • carefully defining the target audience,

   • recruiting from that audience,

   • considering tests with "gatekeepers" or
     intermediaries,

   • defining the purpose of materials prior to
      designing the questionnaire,

   • locating a trained interviewer and inter-
     preter for some tests,

   • carefully assessing results, and

   • considering using a "mix" of methods to
     tailor your pretesting to your needs.

   Without adequate planning, pretesting may
not serve  its intended purpose—to improve
your messages and materials. Instead, it could
become expensive research that is of little or no
use.
         Selected Readings
  American Marketing Association, Marketing
  Services Guide, Chicago: published yearly.

  Basch, Charles E., "Focus Group Interview:
  An Underutilized Research Technique for
  Improving Theory and Practice in Health
  Education," Health Education Quarterly
  14(4):411-448, (1987).

  Desvousges, William H., and V. Kerry Smith.
  "Focus Groups and Risk Communication:
  The Science of Listening to  Data."  Risk
  Analysis 8(4), (1988).

  Sudman, Seymour, and Norman M. Bradburn,
  Asking  Questions: A Practical Guide to
  Questionnaire Design, San Francisco, CA:
  Jossey-Bass Publishers, (1986).

  U.S. Department of Health and Human Ser-
  vices, Making Health Communication Pro-
  grams Work, Bethesda, MD: National Can-
  cer Institute, NIH Publication  No. 89-1493,
  (1989).

-------
                              ESTIMATED COSTS OF PRETESTING, 1988

          These estimated costs are included to suggest how you should budget for pretesting using commercial firms.
          Actual costs will vary depending upon geographic location, audience to be recruited, amount of effort
          contributed by staff, companies and respondents. The potential for such contributions may be significant for
          some issues. However, be careful not to jeopardize the quality of results with a too-skimpy budget.

                                               Qualitative Studies
                          (Estimated costs for 10 general population respondents for 1.5 hours)

                                                   Focus Group        Individual In-depth
                                                      (One)            Interviews (Ten)

          a.  Questionnaire development          $   100-300            $   200-500
          b.  Recruitment                            350-600                400-600
          c.  Respondent fees                          0-400                  0-300
          d.  Facilities, travel                     250-500                150-500
          e.  Moderator/interviewer                  300-500                400-600
          f.  Analysis and report                    300-1,800              450-2,500

              Total                              $ 1,300-4,000          $ 1,600-5,000

                                              Quantitative Surveys
                        (Estimated costs for 100 general population respondents for 15-20 minutes)

                                                                 Central Location
                                              Door-to-Door    (Intercept/Single Site)   Telephone (Local)        Mail

          a.  Questionnaire development       $   400-3,000       $   200-3,000        $   400-3,000      $   500-3,000
          b.  Questionnaire production
                + travel/facility,
                phones/mail                       400-1,000           200-500               300-500            100-300
          c.  Screen/conduct interviews         2,500-4,000         1,500-2,000           1,000-1,500                 0
          d.  Code/keypunch/tabulation            500-1,000           500-1,000             500-1,000           500-1,000
          e.  Analysis & report                 1,000-3,000         1,000-3,000           1,000-3,000         1,000-3,000

              Total                           $ 4,800-12,000       $ 3,000-9,500         $ 3,000-9,000       $ 2,100-7,500

          Note:   Although many costs increase consistently with increases in sample size, "Questionnaire Development" and
                  "Analysis/Report" increase more slowly, reducing the cost-per-interview with larger samples.
          Source: U.S. Department of Health and Human Services 1989.
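
          To see the note's point in numbers, the sketch below spreads hypothetical fixed costs
          (questionnaire development plus analysis/report) over increasing sample sizes; the dollar
          figures are illustrative midpoints, not values prescribed by the table.

              # Illustrative figures only, roughly in the range of the central location column.
              fixed_costs = 1600 + 2000       # questionnaire development + analysis/report
              cost_per_interview = 28.50      # interviewing, production, and coding per respondent

              for n in (100, 200, 400):
                  total = fixed_costs + cost_per_interview * n
                  print(f"{n} interviews: ${total:,.0f} total, ${total / n:.2f} per interview")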

-------
5
THE IMPLEMENTATION  PHASE:
EXECUTING THE STRATEGY AND
TRACKING DETAILS
Process Evaluation
   Once the program is under way, potential
problems can be identified before they become
serious. You can build a monitoring system
into your  program to help you identify any
problems, flaws, or oversights regarding mate-
rials, implementation strategies, or channel
selection before they become major impedi-
ments to success.

   Often, problems can be quickly corrected
if you can identify them. For example, if you
ask the public to call for more information, you
should provide a mechanism (e.g., a simple
response form) for telephone operators to record
questions asked and answers given. A frequent
review of responses will identify whether in-
correct  or inadequate  information is being
given, any new information required to  re-
spond, and inquiry patterns.
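
   A sketch of such a review, using a hypothetical log of operator response forms (the
topics and accuracy flags are invented for illustration), might look like this:

    from collections import Counter

    # Hypothetical operator response forms: (question topic, answer judged accurate?).
    call_log = [
        ("cost of testing", True),
        ("health effects", False),
        ("cost of testing", True),
        ("where to get a test kit", True),
        ("health effects", False),
    ]

    inquiries_by_topic = Counter(topic for topic, _ in call_log)
    inaccurate_by_topic = Counter(topic for topic, accurate in call_log if not accurate)

    print("Inquiry patterns:", dict(inquiries_by_topic))
    print("Topics needing better answers:", dict(inaccurate_by_topic))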

   Frequently, program implementation takes
longer than you might expect—materials may
be delayed at the printer, a major news story
may preempt your publicity, or a new priority
may delay community participation. A peri-
odic review of planned tasks and time schedule
will help  you alter any plans that might be
affected by unexpected events or delays. There
is nothing  wrong with altering your plans to fit
the situation—keeping in mind what you are
trying to achieve. In fact, you may risk damag-
ing your program if you are not willing to be
flexible and alter specific activities when
needed.
                               Process evaluation, tracking how and how
                            well your program is working, can provide
                            tangible evidence of program progress, often
                            useful to provide encouragement and reward to
                            participants and evidence of success to your
                            own office. It can also assure that the program
                            is working the way in which you planned—a
                            vital assurance prior to undertaking any more
                            formal outcome evaluation.
                                   A program to increase the num-
                               ber of households checked for radon
                               was designed to educate children in
                               the classroom about the hazards of
                               radon and have them take home mate-
                               rials to encourage their parents to have
                               their house tested. The program pro-
                               vided teacher training and classroom
                               materials, but after allowing sufficient
                               time for the teachers to complete their
                               instruction, there  was no significant
                               increase in requests for home tests for
                               radon. The program managers con-
                               cluded that using children to influence
                               their parents was not an effective
                               strategy.  However, a more careful
                               review of what happened showed that
                               teachers did not send materials home
                               with the children; they had been given
                                a master copy suitable for photocopying
                               but not suitable for mimeographing.
                               Because .they only had access to a
                               mimeograph machine, the materials
                               were not used.

-------
Establishing Process Evaluation
Measures

   To help avoid major operational problems
because specific tasks aren't working, you
should make sure that program checks are in
place. Mechanisms in place should track the
following:

   • work performed, time schedules, and
     expenditures (internal resources)

   • publicity, promotion, and other outreach

   • participation, inquiries, or other responses

   • functioning and quality of response sys-
     tems (distribution, inquiries, response)

   Some ways of tracking include the following:

   • Reviewing materials inventory weekly;

   • Getting clipping services of print media
     coverage;

   • Supplying "bounce-back" cards or mak-
     ing follow-up phone calls with television
     and radio stations;

   • Monitoring logs of television/radio sta-
     tions for frequency and time of PSA
     airings;

   • Monitoring volume of inquiries and
     length of time to reply;

   • Reviewing telephone responses for ac-
     curacy and appropriateness;

   • Checking distribution points to assess
     materials use (and make sure that materi-
     als are still available);

   • Making phone calls or arranging meet-
     ings with participating organizations to
     review progress and problems;

   • Conducting focus groups or telephone
     interviews with program participants/
     target audience members;

   • Following up with key individuals in the
     community to check their preparedness
     and interest and to identify problems.

   These process measures will tell you how
the program is operating, and may tell you
whether the target audience is responding;
these measures will not tell you about the pro-
gram effects: whether the audience learned,
acted, or made a change as a result. Therefore,
it is important to evaluate the results of your
program—its effect or outcome (see Chap-
ter 6).


Summary

   Periodically you should assess whether

   • activities are on track and on time,

   • the target audience is being reached,

   • some strategies appear to be more suc-
     cessful than others,

   • some aspects of the program need more
     attention, alteration, or elimination,

   • time schedules are being met, and

   • resource expenditures are acceptable.

   The process evaluation and other tracking
measures you established should permit this
assessment.  You  should establish specific
intervals to review progress. Preparing progress
reports—with successes, modified plans, and
schedules—can help you keep all your agency
and program players  informed and synchro-
nized.

-------
         Selected Readings
King, Jean A., Lynn Lyons Morris, and Carol
Taylor Fitz-Gibbon, How to Assess Program
Implementation, Newbury Park, CA: Sage
Publications, (1987).

U.S. Department of Health and Human Ser-
vices, Making Health Communication Programs Work,
Bethesda, MD:  National Cancer Institute,
NIH Publication No. 89-1493, (1989).

U.S. Environmental Protection Agency,
Communicating Radon Risk Effectively:  A
Mid-Course Evaluation, Washington,  DC:
Office of Policy, Planning and Evaluation,
EPA 230-07-87-029, (1987).

-------
6
PROGRAM ASSESSMENT:
EVALUATING EFFECTIVENESS
 Outcome Evaluation
    Often people assume the impact of risk
  communication programs cannot be evaluated,
 or that it costs too much money or takes too
 much expertise. These concerns are based on
 real constraints, but they should not prevent
 you from conducting  an effective outcome
 evaluation.

    Outcome evaluation methodologies try to
 measure changes in the target audience's
 awareness, knowledge, attitudes, and/or be-
 havior.  In some cases, outcome evaluation
 uses qualitative measures to get an indication
 of the audience impacts. Unlike the pretesting
 methods, however, quantitative measures of-
 ten  are used to draw definitive conclusions
 about the overall impact.
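
     As a bare-bones illustration of a quantitative outcome measure, the sketch below
  compares hypothetical baseline and follow-up survey results on a single awareness
  question; every figure is invented.

      # Hypothetical survey counts: respondents aware of the risk / total surveyed.
      baseline_aware, baseline_n = 120, 400     # before the communication program
      followup_aware, followup_n = 210, 400     # after the communication program

      before = baseline_aware / baseline_n
      after = followup_aware / followup_n
      print(f"Awareness before: {before:.0%}, after: {after:.0%}, "
            f"change: {(after - before) * 100:.0f} percentage points")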


 Measuring Effectiveness
    Measuring the effectiveness of a risk com-
 munication program involves subtle consider-
 ations.   For  example, Viscusi, Magat, and
 Huber [1986] described effectiveness in terms
 of exercising a "sound judgment." Deciding
 on what constitutes sound judgment, however,
 remains somewhat subjective, even in light of
 the best scientific evidence available.

   What is clear, however, is that attitudinal/
 behavioral measures of effectiveness are nec-
essary because simply asking people about
effectiveness can be very misleading.  For
example, Smith et al. [U.S. EPA, 1987] found
that almost 90 percent of homeowners receiving
                            a radon fact sheet considered it very effective.
                            Attitudinal/behavioral measures of effective-
                            ness showed these same homeowners to have
                            less understanding of key radon concepts and
                            a greater divergence between their perceived
                            and technical risks compared to similar home-
                            owners who received experimental brochures.
                                 The State of Maryland sponsored an
                              information program to explain  the
                              health risks from radon. This Mary-
                              land radon study  considered  three
                              questions related to effectiveness. First,
                              what do the various indicators show
                              about the overall effectiveness of  the
                              risk communication program? Second,
                              how do these  findings  compare with
                              other public information efforts to im-
                              prove public health?  Third, can  the
                              effects of the EPA's experimental risk
                              communication program be isolated
                              from the effects of other sources of ra-
                              don information?
                               This section discusses four measures that
                            can be used to assess effectiveness:
                               • awareness of the  risk and its potential
                                consequences
                               • knowledge about risks and mitigation
                               • attitudes toward the risk
                               • behavior toward the risk

                               The choice of evaluation  measures can
                            influence the outcome of the final evaluation.
                            The following discussion presents the pros and
                            cons of each measure and develops guidelines

-------
for situations in which each may be appropri-
ate.

   Awareness—Did the target audience see
the risk message? How many times? Where?
Increased awareness is a basic indicator for any
risk communication program because it is a
necessary condition for any subsequent behav-
ioral actions to reduce  the risk.   Increased
awareness, however, does not  guarantee that
the  desired behaviors will occur.  Neverthe-
less, it is a starting point or building block that
underlies almost every  model of behavioral
changes (see McGuire [1985]).

    Awareness can  be appraised from several
perspectives:

    • the absolute levels in a follow-up survey
     of each target group
    • the change in awareness in each target
     group between baseline  and  follow-up
     surveys
    • the change in awareness in an experi-
     mental group compared to a control group

    Each of these perspectives provides some-
what different insights into the effectiveness of
the risk communication program. More infor-
mation on choosing a perspective for the evalu-
ation is presented later in this chapter.
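
   As a purely hypothetical illustration of these three perspectives, the short
Python sketch below computes an absolute follow-up level, the change within each
group, and the net change relative to a control group. None of the figures come
from the studies discussed in this guidebook.

    # Invented awareness figures (percent of respondents aware of the risk).
    experimental = {"baseline": 40.0, "follow_up": 65.0}
    control      = {"baseline": 42.0, "follow_up": 48.0}

    # Perspective 1: absolute level in the follow-up survey of the target group.
    absolute_level = experimental["follow_up"]                          # 65.0

    # Perspective 2: change between the baseline and follow-up surveys.
    exp_change = experimental["follow_up"] - experimental["baseline"]   # +25.0
    ctl_change = control["follow_up"] - control["baseline"]             # +6.0

    # Perspective 3: change in the experimental group compared to the control
    # group, which nets out background trends such as general media coverage.
    net_program_effect = exp_change - ctl_change                        # +19.0

    print(absolute_level, exp_change, ctl_change, net_program_effect)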
     The Safe Drinking Water Act of 1974
  requires that the public be notified when
  maximum contaminant levels are ex-
  ceeded. Bruvold et al. (1985) interviewed
  60 respondents in 15 California com-
  munities that had recently received  a
  notification letter. The study found that
  respondents who recalled seeing the let-
  ter (68 percent) were much more likely to
  have specific knowledge about the con-
  taminant and its effects.
   Attitudes—What did they think about the
risk and its potential consequences? Did the
risk message affect their views? Did they use
the information to form more correct attitudes
toward the risks?  Attitudes are an important
measure of risk communication effectiveness.
Ajzen and Fishbein [1977] argue that attitudi-
nal change is an important condition for behav-
ioral change.  They  also  argue that attitudes
that are closely linked to the behavioral patterns
under investigation can also help to predict
changes in that behavior. Most experts tend to
agree that attitudinal measures are an important
part of evaluating communication  effective-
ness. There is far less agreement, however,
over the ability of attitudinal measures to predict
behavior (McGuire [1985]).
     In the  Maryland  radon  survey,
  evaluators developed a survey question-
  naire that included three attitudinal
  measures for which respondents were
  asked to strongly agree, agree, disagree,
  or strongly disagree.  The  three state-
  ments were as follows:
     • "It is important to test my home to
        find  out if I have a radon prob-
        lem."
     • "If I had a radon problem it would
        be costly  to fix."
     • "Even if a radon problem was fixed,
        my home would still be worth a lot
        less."

  These three statements  corresponded
  closely to the risk communication mes-
  sages that emphasized that testing is
  important, that remediation need not be
  expensive, and that remediation can be
  effective. If the messages were received
  and processed, the proportion agreeing
  to the first would  increase, and the pro-
  portions disagreeing to the second and
  third would increase.
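
   A hedged illustration of how such responses might be tallied is sketched below
in Python; the counts are invented and are not results from the Maryland survey.

    # Invented tallies for one attitudinal statement ("It is important to test my
    # home...") in the baseline and follow-up surveys.
    from collections import Counter

    baseline  = Counter({"strongly agree": 60, "agree": 140,
                         "disagree": 80, "strongly disagree": 20})
    follow_up = Counter({"strongly agree": 110, "agree": 150,
                         "disagree": 35, "strongly disagree": 5})

    def percent_agreeing(tally):
        agree = tally["strongly agree"] + tally["agree"]
        return 100.0 * agree / sum(tally.values())

    print(f"Agreeing at baseline:  {percent_agreeing(baseline):.1f}%")    # 66.7%
    print(f"Agreeing at follow-up: {percent_agreeing(follow_up):.1f}%")   # 86.7%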

-------
   One way to evaluate effectiveness com-
pares  personal risk assessments made  after
receiving the information with the technical
risk assessments for the same individuals. Ide-
ally, we would like people to make decisions
that reflect the proper amount of precautionary
behavior.   Unfortunately,  the definition  of
"proper" for policy purposes is not necessarily
clear cut Even if attitudes change in a "ratio-
nal" way, the adjustments might be far  from
perfect.

   Knowledge—Did the target audience learn
anything more about the source or processes
responsible for the risk? Many risk communi-
cation programs have as a primary objective
increasing knowledge—whether people
learned factual information presented in the
information materials.  Like attitude changes,
knowledge can be  viewed as both an endpoint
and a precondition for some desired behavioral
action, such as testing  for radon. As an end-
point, we are interested in measuring whether
our risk communication program transferred
information to citizens about the risk.  As a
precondition for behavior, we are interested in
evaluating whether the transfer of certain types
of information has an effect on the level or type
of behavioral change.

       In  the Maryland radon study,
  evaluators administered a seven ques-
  tion "radon quiz'9 in both the baseline
  and follow-up  surveys (see Appendix
  A). The quiz was multiple choice with
  three answer choices. The same ques-
  tions were used in  both surveys. The
  advantage of this strategy is that each
  question can be examined for improved
  performance. The only potential disad-
  vantage is that the strategy could alien-
  ate some members of the panel sample
  who had answered  the same questions
  three months earlier.  This was found
  not to be a problem.
   Different materials can be compared to
determine which type is  more effective in
conveying information about both the nature
of the risk and what can be done to mitigate the
potential effects.

   Knowledge can be affected by many vari-
ables, such as education, income, and gender.
Simply measuring knowledge at the end of the
program will not tell you what accounts for
changes in learning. You can control for these
"confounding" variables with an appropriate
research design.
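
   The sketch below is an invented illustration of why such controls matter:
comparing mean quiz scores within education groups shows whether an apparent
program effect simply reflects who happened to receive the materials. All of the
records are hypothetical.

    # Each record is (received brochure?, college graduate?, quiz score);
    # the data are invented for illustration only.
    records = [
        (True,  True,  6), (True,  True,  7), (True,  False, 4), (True,  False, 5),
        (False, True,  5), (False, True,  6), (False, False, 3), (False, False, 3),
    ]

    def mean_score(rows):
        scores = [score for _, _, score in rows]
        return sum(scores) / len(scores)

    # Compare brochure recipients with non-recipients within each education group.
    for brochure in (True, False):
        for college in (True, False):
            group = [r for r in records if r[0] == brochure and r[1] == college]
            print(f"brochure={brochure!s:5}  college={college!s:5}  "
                  f"mean quiz score={mean_score(group):.1f}")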

   Behavior—Did they change their behav-
ior in response to the information?  In some
cases, behavioral change is an explicit objec-
tive of the risk communication program; in
other cases it is not.  This indicator attempts to
evaluate effectiveness in getting people to take
preventative measures to reduce their own
personal exposure to an environmental hazard,
in getting them to attend a community meeting,
or in getting them to address other kinds of
risk-related behavior. In the case of mitigation,
activities might include the following:

   1. Purchasing specific equipment—
      homeowners have been followed up to
      see whether they purchased radon test-
      ing kits and, if their homes tested posi-
      tive for radon, whether they installed
      basement fans or air filters, among
      other mitigation techniques.

  2. Changing  consumption  patterns—
     changes in the consumption of certain
     foods, such as organic vegetables, have
     taken place in response to information
     about the potential  health impacts of
     agrochemicals.

  3. Changing personal habits or routines—
     researchers have begun looking at the
     smoking habits of adults in response to
     information about health impacts of in-
     door air pollution, especially on young
     children.

-------
       Choosing a Design

   Green and Lewis [1986] recommend the true
experimental design as the best evaluation
design.
       This design consists of five elements:

         1.  Representative sample  of the  target
            population or program

         2.  One or more pretests (measures preced-
            ing the communication activity)

         3.  Unexposed groups for comparison

         4.  Random assignment  of the sample to
            experimental  and control groups

         5.  One or more posttests to measure effects
            after the communication activity

   You can simplify the evaluation without a
total loss of valid results. However, the last
element—posttesting—is essential for outcome
evaluation. Although resources may force you
to compromise on any of the first four elements,
remember that the additional cost of addressing
all five is modest because of the high initial
investment in planning the evaluation. You can
get a better understanding of behavior by
knowing something about other groups. If you
measured changes in behavior without a control
group, you would have a hard time explaining
why behavior did or did not change.

          It is possible to choose more than  one
       design for the same evaluation.  For example,
       you might use an experimental/control group
       design for comparing attitudinal changes and
       focus only on the experimental group for mea-
       suring changes in knowledge. You may want
       to keep it  simple, especially if it is your first
       evaluation, by selecting only one design.  In
       addition, you are encouraged to find a qualified
       expert within your  agency or at a nearby re-
       search center or university.
                                          Experimental and Control Groups
                                             An important factor in planning an evalu-
                                          ation is to think about who is to be measured
                                          and when. An experimental group is the sample
                                          of the target audience to be tested for levels of
                                          or changes in awareness, knowledge, attitudes,
                                          or  behavior.  A control  group—one that is
                                          similar in all respects to the experimental group
                                          except for the specific risk communication
                                          activity—is sometimes chosen to isolate the
                                          effects of uncontrollable  variables (e.g., in-
                                           come and gender). There are three possible
                                          designs to choose from:
                                             •  experimental group only
                                             •  experimental group  and a non-equiva-
                                               lent (not randomly assigned) control
                                               group (often called a comparison group)
                                             •  experimental group and a true (randomly
                                               assigned) control group

                                             Note that in all cases, you will be measur-
                                          ing the experimental group—those people who
                                          are intended to receive the risk messages.  A
                                          control group is chosen by the same methods as
                                          the experimental group.   These people are
                                          measured at the same time as the experimental
                                          group but are not exposed to the risk commu-
                                          nication materials.
                                                In the Maryland radon study, three
                                           communities were chosen for the study.
                                           Each community  had  high reported
                                           levels of radon and was similar in socio-
                                           economic terms. Hagerstown received
                                           an  integrated but modest media cam-
                                           paign—radio and  print public service
                                            announcements and a utility bill insert.
                                           Frederick received the same media
                                           campaign plus a community outreach
                                           program that included presentations,
                                           posters, and  related  activities.
                                           Randallstown served as the compari-
                                           son community and received no special
                                           radon information.

-------
   Without a control group, it is hard to know
how  good the results of your evaluation are,
whether the results would have been as  good
with  some other risk communication activity,
and even whether the effort had any effect on
the results at all. It is recommended, therefore,
that you use a control group.

   It is difficult to control for all variables, but
some of the major variables, such as income,
race, and education, can be observed easily.
More importantly, the sampling procedure can
determine whether the control group is true
(randomly assigned) or non-equivalent (non-
randomly assigned). Random assignment is
the best way to avoid complex explanations of
differences between groups because it increases
the likelihood  that factors affecting the out-
come are spread evenly over the two groups.
Random assignment is also important for gen-
erating statistically reliable results. More in-
formation on sampling—how to choose ex-
perimental and control  groups—is included
later  in this section as well as in the selected
readings at the end of the chapter.
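
   As a simple illustration (not drawn from any study described here), random
assignment might look like the sketch below, which splits a hypothetical sample
of households into two equal groups.

    # Randomly assign a sampled list of households to experimental and control
    # groups so that confounding factors tend to be spread evenly between them.
    import random

    random.seed(42)                    # fixed seed so the split can be reproduced
    households = [f"household_{i:03d}" for i in range(1, 201)]

    shuffled = households[:]           # copy the list, then shuffle it in place
    random.shuffle(shuffled)

    midpoint = len(shuffled) // 2
    experimental_group = shuffled[:midpoint]   # will receive the risk messages
    control_group      = shuffled[midpoint:]   # measured but not exposed

    print(len(experimental_group), len(control_group))   # 100 100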

Timing and Testing
   Deciding when to test requires some care-
ful thinking.  In some cases, the decision will
be based on constraints of the program such as
a deadline for finishing the final report.  You
may  also have to decide on allowing time for
program effects to take place but not so long
that the effects might wear off.

   You have three choices for when to admin-
ister  a test:
   • posttest only
   • pretest and posttest
   • time series (a series of tests before  the
     program  is implemented and after it is
     finished)

   Each of these options uses a posttest to
determine the  outcome  measures for chosen
indicators, although the posttest-only option
does not tell you about changes over time.

   Pretesting and posttesting, whether for ex-
perimental and/or control groups, allow you to
observe changes in key indicators over time.
The effect of pretesting, however, might alter
the outcome measures being observed.  For
example, a baseline interview might sensitize
an individual to be more receptive to the ensu-
ing risk message.  You might be able to get
around this problem by pretesting  a random
half of both the control and experimental groups.
You  could  then statistically compare differ-
ences within each group to determine whether
the differences are significant. If you find no
reason  to think the sensitization bias exists,
then  you can compare the entire experimental
and control groups to evaluate differences.
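
   One hypothetical way to carry out that comparison is sketched below: the
follow-up scores of the randomly pretested half are compared with those of the
half that was not pretested, using a two-sample t statistic computed with the
standard library. The scores are invented.

    from math import sqrt
    from statistics import mean, stdev

    pretested     = [5, 6, 7, 6, 5, 7, 6, 6, 5, 7]   # also took the baseline survey
    not_pretested = [5, 6, 6, 5, 6, 7, 5, 6, 6, 5]   # follow-up survey only

    def welch_t(a, b):
        """Two-sample t statistic allowing unequal variances (Welch's method)."""
        se = sqrt(stdev(a) ** 2 / len(a) + stdev(b) ** 2 / len(b))
        return (mean(a) - mean(b)) / se

    t = welch_t(pretested, not_pretested)
    print(f"t statistic = {t:.2f}")
    # As a rough rule, a |t| well below about 2 gives little reason to suspect that
    # the baseline interview itself changed the follow-up answers.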
       In the Maryland radon study, re-
  searchers were concerned about the
  problem of sensitization bias resulting
  from re-interviewing the same  people.
  They developed a design that used two
  independent samples from each com-
  munity.  A baseline survey was con-
  ducted with one sample from each com-
  munity during December 1987. Evalu-
  ators then conducted follow-up surveys
  with both samples from each commu-
  nity. This design allowed researchers to
  conduct before and  after surveys,
  thereby avoiding interpersonal differ-
  ences between measurements. In addi-
  tion, the study used independent samples
  to test for sensitization bias, which was
  found to be insignificant.

   Time series testing is useful if you have the
money and the interest in measuring changes in
key indicators over time. These tests may take
place during the communication program to

-------
       track progress or can be used after the program
       to see if the changes are temporary or lasting.

       Summary
          Many different combinations of groups
       and tests can be used in developing an evalua-
       tion design.  Different factors may influence
       your decision:

          • How much money do you have?
          • What information do you need to make a
           sound judgment?
          • When do you need  the information the
           most?

          In  general, the best design is one that in-
       cludes multiple tests with an independent group
       to test for sensitization. More information on
       when  to test can be  found in  the selected
       readings or from qualified experts.

       Choosing a Sample
          Sampling is a method for selecting a group
       of individuals from the entire population.  Al-
       though we try to collect samples that are repre-
       sentative of the entire population, some degree
        of uncertainty exists. The goal for the statisti-
       cian is to draw a sample  in a way that mini-
       mizes uncertainty and allows us to make gen-
       eralizations about characteristics of the popu-
       lation  as a whole.

          In some  statistical analyses, such as the
       evaluation on indoor air pollution  (see box
       below), the sample chooses itself.  When the
       individuals requested the indoor air booklet,
       they distinguished themselves from the rest of
       the population.  It would  be dangerous to
       generalize about characteristics, such as atti-
        tudes or awareness, beyond the limited popula-
        tion of those who requested the booklet.

          In other cases, you will identify a popula-
       tion and then choose a representative sample at
random from  the population. However, the
manner in which you select people at random
influences the reliability of the final results.
For example, an interviewer standing on a
street corner who chooses attractive candi-
dates for interviews is said to be subject to
personal selection bias—he cannot generalize
about the entire city's population from his
sample because it is not representative.  This
problem  may be overcome  with  more sys-
tematic procedures, such as selecting every
fifth person, regardless of his appearance or
other factors.  Even so, it is unlikely that the
population on a particular street on any given
day is representative of the whole population.

      To evaluate the effectiveness of a
  booklet on indoor air  pollution, EPA
  evaluators  drew a simple  random
  sample from requests for the booklet
  received by the Agency's Public Infor-
  mation Center  (PIC).  These requests
  were drawn from a large box that had
  been used to store information requests.
  Rather than polling all 9,000 requests,
  evaluators consulted with OPPE's Sta-
  tistical Policy Staff and, considering
  time,  resource  constraints, and likely
   response rates, decided to draw a
  sample of 450 households.

      The sample was selected randomly
  by  drawing every  twentieth  request
   from the box. The advantage of a true
  random sample is that evaluators can
  generalize about the population at large.
  In the case of the indoor air booklet,
  however, the population consisted of
  those households who had requested
  the booklet, not the general population.
  Evaluators, therefore, had to limit their
  generalizations to those people who re-
  quested the booklet.

-------
    The preferred technique to avoid the bias of
personal selection is to use mechanical meth-
ods of selecting a random sample. One option
is to assign a number to every individual in the
population (e.g., city, county) and then use a
table  of random numbers to make the selec-
tions  for you. These tables usually contain
instructions on how to use them to appropri-
ately select the sample for you. Often, comput-
ers are used to pick a random sample, espe-
cially if the sample is going to be large. Instead
of drawing the sample yourself, you may be
able to purchase one from a sampling firm.
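
   The sketch below illustrates, with invented request identifiers, both a simple
random sample and the every-twentieth (systematic) selection described in the
indoor air example; it is not the procedure any particular evaluation used.

    # Two mechanical ways to select 450 households from a list of 9,000 requests.
    import random

    random.seed(7)
    requests = [f"request_{i:04d}" for i in range(1, 9001)]

    # Simple random sample: every request has an equal chance of selection.
    simple_random_sample = random.sample(requests, k=450)

    # Systematic sample: every twentieth request after a random starting point.
    start = random.randrange(20)
    systematic_sample = requests[start::20]

    print(len(simple_random_sample), len(systematic_sample))   # 450 450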

    Even randomized techniques  can intro-
duce  some types of bias that will cause  the
sample to be unrepresentative of the overall
population.   You must decide whether  the
characteristic, such as  income, would likely
influence what you are trying to analyze. If you
are unsure, consult an expert.

   Several sources at the end of this chapter
can help you to determine an appropriate sample
size. Sample size is important because it is one
determinant of how far you can generalize your
results to  the population.  When  trying to
determine the size of the sample, one rule of
thumb might apply: choose as large a sample as
time and money permit [Fitz-Gibbon, et al.,
(1987)]. A large sample has a better chance of
representing a large group; a smaller sample
reduces the likelihood of representativeness.
Remember, however, that other statistical con-
siderations may influence your confidence level
more than sample size.
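
   As a rough illustration of that trade-off, the sketch below uses the standard
approximation for the margin of error of an estimated proportion at a 95 percent
confidence level. It assumes a simple random sample and is not a substitute for
advice from a statistician.

    from math import sqrt

    def margin_of_error(p, n, z=1.96):
        """Approximate half-width of a 95 percent confidence interval for a proportion."""
        return z * sqrt(p * (1 - p) / n)

    for n in (100, 250, 450, 1000):
        moe = margin_of_error(p=0.5, n=n)    # p = 0.5 is the most conservative case
        print(f"n = {n:4d}   margin of error = +/- {moe * 100:.1f} percentage points")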

Collecting Outcome Data
   Evaluation instruments, such as achieve-
ment tests, questionnaires, personal interviews,
records, reports, or checklists, are used to col-
lect data.  Some combination of instruments
may be necessary to collect the best informa-
 tion. For example, you may use focus groups
 to find out what is known about the health risks
 from air pollution before designing a question-
 naire to test a larger group for knowledge.

    Outcome evaluation is difficult to execute
 because of the type of information needed to
 measure knowledge and attitudes. Neverthe-
 less, questionnaires do exist that can guide
 your own work.  Both mail and  telephone
 survey methods can collect reliable data. These
 instruments are particularly useful with large
 samples.
       In evaluating The Inside Story: A
  Guide  to Indoor Air Quality, the EPA
  used a telephone survey to collect infor-
  mation on knowledge, attitudes, and
  behavior (see Appendix A).  The ques-
  tions looked at the following respon-
  dent characteristics:

        •  reading the booklet
       •  judgments about pollution
       •  learning from the booklet
       •  feelings  about the booklet
       •  mitigating actions

  A telephone survey was used to collect the
  necessary data.
   Page 44 takes you through a questionnaire
to show how each of the questions gathers
information related to the  key indicators:
awareness, knowledge, attitudes, and behav-
ior. With a better understanding about the type
of information to gather, you can adapt the
questions to your own risk issue.


Analyzing Data
   Statistics will help you put your data into a
more manageable and comprehensible form,
but they cannot make up for a poor design.

-------
             Communicating Radon Risk Effectively: Maryland Baseline Survey
             EPA sponsored an evaluation of its radon risk communication program in Maryland jointly
          with the State. The study used a questionnaire to collect information on awareness, attitudes,
          knowledge, and behavior as indicators of effectiveness. The questionnaire consisted of 26 easy-
           to-answer questions. The numbers and letters beside each question are used to compile the data,
          which makes analysis easier. To understand how each of the questions was used to collect
          relevant information, turn to Appendix A and refer to the following guidelines:

           Questions 1 and 2: General Attitudinal Profile
             These questions develop a profile of the respondent's attitude toward environmental issues
          in general. A ten point scale is used in question 2 to get a relative measure of concern for various
          types of pollution.  The analysis then can explore how these attitudes might influence key
          indicators, such as knowledge and awareness.

          Questions 3-6: Awareness
             These questions explore the respondent's awareness of radon as a potential health problem.
          Questions 4  A-F attempt to  identify the sources used for information about radon, such as
          magazines, newspapers, radio, TV, PSA's, utility bill inserts, personal relationships, or a state
          hotline number.  Questions 5 and 6 explore the respondent's understanding of the government
          agencies that might be responsible for disseminating information to the public.  The difference
          between the results in the baseline and follow-up surveys can be used to assess the effectiveness
          of the communication program in reaching the intended audience(s).

          Questions  7-11:  Behavior and Attitudes
             These questions explore the behavior and attitudes of respondents who have and have not
          tested their homes for radon.  Questions 8A-D simply examine preventative measures, such as
          testing and mitigation, taken to reduce potential health impacts.  Questions 7, 9, 10 and 11
          highlight the sources of attitudes that influence the respondent's willingness to test.

          Questions  12-18:  Knowledge
             These questions test for specific knowledge about the characteristics of radon, its potential
          health effects, testing, and mitigation.  This baseline knowledge was used to help develop
          appropriate materials that address information gaps or misinformation.

          Questions  19-26:  Key Characteristics of Sample
              These questions look at variables that might determine whether the sample is representative
          of the overall population and that allow comparison of the experimental and control groups. In addition,
         this information can be used in the planning phase to identify and target priority groups for
         information materials.
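
   As an illustration only, the sketch below shows how coded answers might be
grouped into the indicator categories listed above and rolled up into a knowledge
subscore. The respondent record and the 0/1 coding are invented; they are not the
Maryland codebook.

    # Question groupings follow the description above (questions 1-26).
    INDICATOR_QUESTIONS = {
        "general attitudes":   range(1, 3),    # questions 1-2
        "awareness":           range(3, 7),    # questions 3-6
        "behavior/attitudes":  range(7, 12),   # questions 7-11
        "knowledge":           range(12, 19),  # questions 12-18
        "characteristics":     range(19, 27),  # questions 19-26
    }

    # Hypothetical coded answers keyed by question number; here 1 marks a correct
    # answer on the knowledge items and 0 an incorrect one.
    respondent = {q: 1 for q in range(1, 27)}
    respondent.update({13: 0, 16: 0})      # two knowledge questions missed

    knowledge_items = INDICATOR_QUESTIONS["knowledge"]
    knowledge_score = sum(respondent[q] for q in knowledge_items)
    print(f"knowledge score: {knowledge_score} of {len(knowledge_items)}")   # 5 of 7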

-------
With a good design, data analysis can be used
to form opinions, develop theories, or make
decisions. Fitz-Gibbon, et al. [1987] suggest
three ways in which statistical techniques can
be applied:  to describe data, to generate hy-
potheses, and to test hypotheses.

    Describe Data—if you have tested public
knowledge about risks from hazardous wastes
and someone asks you to describe the scores,
you will need some  way to summarize  the
scores in an accurate way. Graphs, charts, and
other visual aids are common ways to present
descriptive statistics.
   [Figure 3. Sources of Awareness for Three Groups Hearing About Radon—
   a bar chart comparing sources of awareness, including newspaper and radio.]
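
   For instance, a few summary numbers are often all that is needed to describe a
set of quiz scores. The sketch below computes them for an invented set of scores;
it is an illustration, not output from any study cited here.

    from collections import Counter
    from statistics import mean, median

    scores = [3, 5, 4, 6, 5, 2, 7, 5, 4, 6, 5, 3, 6, 4, 5]   # invented 0-7 quiz scores

    print(f"respondents: {len(scores)}")
    print(f"mean score:  {mean(scores):.2f}")
    print(f"median:      {median(scores)}")
    print("distribution:", sorted(Counter(scores).items()))   # (score, count) pairs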

    Generate Hypotheses—if you have col-
lected a large amount of information in a ques-
tionnaire, you can use exploratory data analy-
sis to see if there are any patterns in  the data or
to generate hypotheses about the relationships
between key variables. For example, baseline
information gathered in the New York radon
study indicated that respondents'  individual
characteristics and attitudes affected the num-
ber of correct responses on the radon quiz.

    Test Hypotheses—the same procedures
used to search a set of data for relationships can
also be used to test hypotheses, to see if there
is strong evidence that a relationship is more
than just a chance pattern in the particular data.
Because data are drawn from samples rather
than the whole population, we use inferential
statistics, such as regression analysis, to judge
how confidently sample results can be gener-
alized to the population as a whole. For ex-
ample, in the New York
radon study, a regression technique estimated
the  effect of attitudinal and other variables on
the radon quiz score, showing that prior aware-
ness and higher education levels improved
performance. Sources in the selected readings
explain inferential statistics in greater detail.
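
   The sketch below shows, with invented data, the general form of such a
regression: an ordinary least squares fit of quiz score on prior awareness and
years of education. It is only an illustration and is not the model estimated in
the New York study.

    import numpy as np

    # Invented data: prior awareness (0/1), years of education, and quiz score.
    aware     = np.array([1, 0, 1, 1, 0, 0, 1, 0, 1, 1])
    education = np.array([16, 12, 14, 18, 12, 10, 16, 12, 14, 16])
    quiz      = np.array([6, 3, 5, 7, 4, 2, 6, 3, 5, 6])

    # Design matrix with an intercept column, then the two predictors.
    X = np.column_stack([np.ones_like(aware), aware, education])
    coef, *_ = np.linalg.lstsq(X, quiz, rcond=None)

    intercept, b_aware, b_education = coef
    print(f"intercept       = {intercept:.2f}")
    print(f"prior awareness = {b_aware:.2f}   (expected to be positive)")
    print(f"years of school = {b_education:.2f}   (expected to be positive)")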
                                                       Selected Readings
  Ajzen, I., and M. Fishbein, "Attitude-
  Behavior Relations:  A Theoretical Analysis
  and Review of Empirical Research," Psy-
  chological Bulletin 84:888-918, (1977).
  Bruvold, W.H., L.A. Wardlaw, and J.M.
  Gaston, "An Evaluation of Public Notifica-
  tion Requirements in California," Journal of
  American Water Works Association 77(3) :40-
  43, (1985).
  Dillman, Don A., Mail and Telephone Sur-
  veys: The Total Design Method, New York:
  John Wiley and Sons, (1978).
  Fitz-Gibbon, Carol Taylor, and Lynn Lyons
  Morris, How to Design a Program Evalua-
  tion, Newbury Park,  CA:  Sage Publications,
  (1987).
  Freedman, David, Robert Pisani, and Roger
  Purves, Statistics, New York:  W.W. Norton
  and Company, (1978).
  Kline, Mark, Caron Chess, and Peter  M.
  Sandman, Evaluating Risk Communication
  Programs: A Catalogue of "Quick and Easy"
  Feedback Methods, Rutgers University, NJ:
  Environmental  Communication Program,
  (1989).
  Lipsey, Mark W., Design Sensitivity: Statistical
  Power for Experimental Research, Newbury
  Park, CA: Sage Publications, (1990).

-------
             Selected Readings (continued)
          McGuire, William J., "Attitudes and Atti-
          tude Change," in Gardner L indzey and Elliot
          Aronson, eds., Handbook of Social Psychol-
          ogy, volume 2, third edition, New York:
          Random House, pp. 233-304, (1985).

          Rowntree, Derek, Statistics Without Tears:
          A Primer for Non-Mathematicians, New
          York: Charles Scribner's Sons, (1981).

          U.S. Environmental  Protection  Agency,
          Communicating Radon Risk Effectively:  A
          Mid-Course Evaluation, Washington,  DC:
          Office of Policy, Planning and Evaluation,
          EPA 230-07-87-029, (1987).

          U.S. Environmental  Protection  Agency,
          Communicating Radon Risk Effectively: Ra-
          don Testing in Maryland, Washington, DC:
          Office of Policy, Planning and Evaluation,
          EPA 230-03-89-048, (1989).

          U.S. Environmental Protection Agency, The
          Inside Story: A Guide to Indoor Air Qual-
          ity—How Well Is It Working?, Washington,
          DC: Office of Policy, Planning and Evalu-
          ation, EPA 230-01-073, (1990).

          Viscusi, W. Kip, W.A. Magat, and Joel Huber,
          "Informational Regulation of Consumer
          Health Risks:  An Empirical Evaluation of
          Hazard Warnings," Rand Journal of Eco-
          nomics, 17(Autumn):351-65, (1986).

-------
7
PROGRAM FEEDBACK:
USING  EVALUATION RESULTS
Apply What You Have Learned

   Take the time  to apply what  you  have
learned to modify your program or to advise
others who are planning similar programs. For
example

   • Reassess goals and objectives.
     -Has anything changed (e.g., with the
       target audience, the community, or your
       agency's mission) to require revisions
       in the original goals and objectives?

     -Is there new information  about the
       environmental risk that should be in-
       corporated into the program messages
       or design?

   • Determine areas where additional effort
     is needed.
     - Are there objectives that are not being
       met? Why?

     - Are there strategies or activities that
       did not succeed?  Are more resources
       required? Do you need to review why
       they didn't work and what can be done
       to correct any problems?

   • Identify effective activities or strategies.
     - Have some objectives been met as a
       result of successful activities?

     -Should  these be  continued and
       strengthened because they appear to
       work well?

     - Or should they be considered success-
       ful and completed?

     - Can they be expanded to apply to other
       audiences or situations?
                                • Compare costs and results of different
                                 activities.

                                 - What were the relative costs (includ-
                                   ing staff time) and results of different
                                   aspects of your program?

                                 - Are there some activities that appear to
                                   work as well but cost less than others?

                                • Reaffirm support for the program.
                                 - Have you shared the  results of your
                                   activities with the leadership of your
                                   office  and agency?

                                 - Did you share this information with
                                   the individuals and organizations out-
                                   side your agency who contributed?

                                  - Do you have evidence of program ef-
                                    fectiveness and continued need, to
                                    convince your agency to continue your
                                    program?

                                 - Do you have new or continuing activi-
                                   ties that suggest the involvement of
                                   additional organizations?

                                • Decide to end a program  that did not
                                 work.
                             Share What You Learned

                                The ideal way to apply evaluation findings
                             is to improve your ongoing program. You also
                             can use what you learn from process or out-
                             come evaluation measures to

                                • justify your program with management

                                • provide evidence of need for additional
                                 funds or other resources

-------
          • increase institutional understanding of
            and support for risk communication ac-
            tivities

          • encourage ongoing cooperative ventures
            with other organizations.

          It is often difficult to find the  time to
       analyze and report on what you have learned
       and share it with others. Nevertheless, what
       you learn from implementing a communication
       program might be invaluable to someone who
       is faced with a similar responsibility. Even if
       you cannot prepare a formal report or article to
       let others know what you have learned, consider
       alternatives such as:

          • letters about your findings to appropriate
            environmental, public health, or health
            education journals
          • a poster presentation at a relevant profes-
            sional meeting
          • a program description and sample mate-
            rials sent to a related clearinghouse, fed-
            eral or state agency
          • local professional  newsletters
          • letters, phone calls, brief reports or meet-
            ings with your peers in similar organiza-
            tions.

          Letting  others know about your program
       may prompt them  to tell you about similar
       experiences, lessons, new ideas or  potential
       resources.
       Write an Evaluation Report
          Taking the time to write a report about an
       evaluation task that you  have conducted is
       useful for several reasons. The report can
       provide
          • the discipline to help you critically ana-
            lyze  the  results of the evaluation and
            think about any changes you should make
            as a result,
                                    • a tangible product for your agency,
                                    • evidence that your program or materials
                                     have been carefully developed—to be
                                     used as a "sales" tool with gatekeepers
                                     (e.g.,  television station public service
                                     directors),
                                    • a record of your activities for  use  in
                                     planning future programs,
                                    • assistance to others who may be inter-
                                     ested in developing similar programs or
                                     materials, and
                                    • a foundation for evaluation activities in
                                     the future (e.g., it is easier to design a new
                                     questionnaire based on one you have
                                      previously used than to start anew).

                                    Careful Analysis—Often evaluation tasks
                                are added to other responsibilities that already
                                represent full time commitments. This means
                                there is seldom sufficient time to think about
                                the meaning of evaluation findings. If you are
                                conducting or observing a pretest or another
                                 evaluation task, it is tempting to draw conclu-
                                 sions about the effectiveness of your materials
                                 or program while the tasks are still under way.
                                 Avoid this temptation
                                and take the time to review enough findings to
                                have a good  basis for concluding how well
                                your materials or program work, or what
                                changes should be made.

                                    Writing a report  can provide the opportu-
                                nity to consider everything that happened in
                                the course of the evaluation, how these events
                                relate to the purpose of the evaluation, and any
                                recommendations for modification to improve
                                your materials or program.

                                    A Tangible Product—Outcome and other
                                evaluation tasks require a considerable invest-
                                ment of scarce program time and funds.  Pre-
                                senting your  agency with a product may be
                                particularly useful if there is a lack of support
                                for evaluation. It can help others not only to see

-------
that something was received for their invest-
ment, but also to understand why the evalua-
tion was valuable.

   Evidence of Effectiveness—If you want
intermediaries (e.g., a television station, clinic,
school, organization, or employer) to use your
materials or program,  you may have to con-
vince them of its value. An evaluation report
offers proof that the materials  and program
were carefully developed. This evidence can
help explain why your materials or program
may be better than others.

   A Formal Record—What you learned in
conducting an evaluation, both the process and
the results, may be applicable to future pro-
grams to be planned by you or others.  Don't
forget to highlight unanticipated events outside
your control that helped or hindered the risk
communication activity.  Staff may change
and your memory may fade;  an evaluation
report is assurance that lessons learned  are
available for future application.

   Help for Others—Sharing your evaluation
report with peers who may be considering the
development of similar programs  may help
them  to design their  programs more  effec-
tively, convince them to use (or modify) your
program instead, and establish your reputation
for good program design.

   A Foundation for Future Evaluation Ef-
forts—It is much easier to design an evaluation
based on former experience than to start "from
scratch."  A  report outlining what you did,
why, as well as what worked and what  should
be altered in the future provides a solid base
from which to plan a new pretest or outcome
evaluation. Be sure to include any question-
naire or other instruments  you used in your
report so that you can find and review them
later.
   Report Outline—Consider including these
sections in your report:
   • Background: purpose and objectives of
     the program
   • Description:  what was evaluated
   • Purpose:  why the evaluation was con-
     ducted
   • Methodology:  how it  was conducted
     (with whom, when, how many, instru-
     ments used)
   • Obstacles:  problems  in  designing or
     conducting the evaluation
   • Results:  what you found out, how in-
     terim results lead to mid-course correc-
     tions of the risk communication effort,
     and what application it has to the pro-
     gram (program recommendations)
   • Resources:  money and staff time used
     for conducting the evaluation

   Although the report should provide a clear
record of what you did, it should not be any
longer or more formal than needed. Keep it
short and easy  to read.  Attach any question-
naires, tally  sheets  or other  instruments  you
used as appendices instead of describing them
in narrative form. Don't make it any harder a
task than necessary!

   Finally, make sure to share it with whoever
might find  it  useful, as well as program
implementers who  provided feedback.   The
best report is of no value if it is filed unread.

   Remember, risk communication activities
play a key role in reducing the threats posed by
environmental hazards.  The effectiveness of
risk communication has been improved by
applying the principles of evaluation.  This
guidebook was developed to help you design
an evaluation that is appropriate for your situ-
ation, but making it work well is up to you.

-------
                   Selected Readings
          Green, Lawrence W., and Frances Marcus
          Lewis, Measurement and Evaluation  in
          Health Education and Health Promotion,
          Palo Alto, CA:  Mayfield Publishing Co.,
          (1986).

          Hawkins, J. David, and Britt Nederhood.
          Handbook for Evaluating Drug and Alcohol
          Prevention Programs. U.S. Department of
          Health and Human Services, DHHS Publi-
          cation No. (ADM) 87-1512, (1987).

          Morris, Lynn Lyons, and Carol Taylor Fitz-
          Gibbon, How to Present an Evaluation Re-
          port, Beverly Hills, CA: Sage Publications,
          (1978).

          U.S. Department of Health and Human
          Services,  Making Health Communication
          Programs Work, Bethesda, MD: National
          Cancer Institute, NIH Publication No. 89-
          1493, (1989).

-------
            BIBLIOGRAPHY
Ajzen, I., and M. Fishbein, "Attitude-Behavior
   Relations: A Theoretical Analysis and Review
   of Empirical Research," Psychological Bulletin
   84:888-918, (1977).

American Marketing Association, Marketing Ser-
   vices Guide, Chicago: published yearly.

Arkin, Elaine, "Evaluation for Risk Communica-
   tors." Presented at the Workshop on Evaluation
   and Effective Risk Communication, Washing-
   ton, DC, June 2-3, 1988.

Basch, Charles E., "Focus Group Interview: An
   Underutilized Research Technique for Improv-
   ing Theory and Practice in Health Education,"
   Health Education Quarterly  14(4):411-448,
   (1987).

Bruvold, W.H., L.A. Wardlaw, and J.M. Gaston,
   "An  Evaluation of  PUblic Notification Re-
   quirements in California." Journal of Ameri-
   can  Water Works Association  77(3):40-43,
   (1985).

Covello, Vincent T., David B. McCallum, and
   Maria T. Pavlova, eds., Effective Risk Commu-
   nication, Plenum Press, (1988).

Desvousges,  William H., and V. Kerry Smith,
   "Focus Groups and Risk Communication: The
   Science of Listening to Data." Risk Analysis
   8(4), (1988).

Dillman, Don A., Mail and Telephone  Surveys:
   The Total Design Method, New York: John
   Wiley and Sons, (1978).

Fisher, Ann, Maria Pavlova, and Vincent Covello,
   (eds), Evaluation and Effective Risk Communi-
   cation: Workshop Proceedings, Cincinnati, OH:
   Center for Environmental Research  Informa-
   tion, EPA-600-9-90-054, (1990).

Fitz-Gibbon, Carol Taylor, and Lynn Lyons Morris,
    How to Design a Program Evaluation, Newbury
    Park, CA: Sage Publications, (1987).

Freedman, David, Robert Pisani, and Roger Purves,
   Statistics,  New York: W.W. Norton and Com-
   pany, (1978).
Green, Lawrence W., and Frances Marcus Lewis.
   Measurement and Evaluation in Health Educa-
   tion and Health Promotion, Palo Alto, CA:
   Mayfield Publishing Co., (1986).

Hawkins, J. David, and Britt Nederhood, Handbook
   for Evaluating Drug and Alcohol Prevention
   Programs.   U.S.  Department of Health and
   Human Services, DHHS Publication No. (ADM)
   87-1512, (1987).

Herman, Joan L., Lynn Lyons Morris, and Carol
   Taylor Fitz-Gibbons, Evaluator's Handbook,
   Newbury Park, CA: Sage Publications, (1989).

Interagency Task Force on Environmental Cancer
   and Heart and Lung Disease, "Evaluation and
   Effective Risk Communication Workshop Pro-
   ceedings," Washington, DC, June 1988.

King, Jean A., Lynn Lyons Morris, and Carol
   Taylor Fitz-Gibbon, How to Assess Program
   Implementation, Newbury Park, CA:   Sage
   Publications, (1987).

Kline, Mark, Caron Chess, and Peter M. Sandman,
   Evaluating Risk Communication Programs: A
   Catalogue of "Quick  and Easy" Feedback
   Methods, Rutgers University, NJ:  Environ-
   mental Communication Program, (1989).

Krimsky, Sheldon and Alonzo Plough, Environ-
   mental Hazards, Dover, MA: Auburn House
   Publishing Co., (1988).

Lipsey, Mark W., Design Sensitivity: Statistical
   Power for Experimental Research, Newbury
   Park, CA: Sage Publications, (1990).

McGuire, William  J.,  "Attitudes and  Attitude
   Change," in Gardner Lindzey  and Elliot
   Aronson, eds., Handbook of Social Psychology,
   volume 2, third edition, New York:  Random
   House, pp. 233-304, (1985).

Morris, Lynn Lyons, and Carol Taylor Fitz-Gib-
   bon, How to Present an Evaluation Report,
   Beverly Hills, CA: Sage Publications, (1978).

National Research Council, Improving Risk Com-
   munication,  Washington, DC: National Acad-
   emy Press, (1989).

-------
       Rowntree, Derek, Statistics  Without  Tears: A
          Primer for Non-Mathematicians, New York:
          Charles Scribner's Sons, (1981).

       Stecher, Brian M, and W. Alan Davis, How to Focus
          an Evaluation, Newbury Park, CA: Sage Pub-
          lications, (1987).

Sudman, Seymour, and Norman M. Bradburn,
          Asking Questions: A Practical Guide to Ques-
          tionnaire Design, San Francisco, CA: Jossey-
          Bass Publishers, (1986).

       U.S. Department of Health and Human Services,
          Making Health Communication Programs
          Work, Bethesda, MD:   National Cancer Insti-
          tute, NIH Publication No. 89-1493, (1989).

       U.S. Environmental Protection Agency, Commu-
          nicating Radon Risk Effectively: A Mid-Course
          Evaluation, Washington, DC: Office of Policy,
          Planning and Evaluation, EPA 230-07-87-029,
          (1987).

       U.S. Environmental Protection Agency, Commu-
          nicating Radon Risk Effectively:  Radon Testing
    in Maryland, Washington, DC: Office of Policy,
          Planning and Evaluation, EPA 230-03-89-048,
          (1989).

       U.S. Environmental Protection Agency, The Inside
          Story:  A  Guide to Indoor Air  Quality—How
          Well Is It Working?, Washington, DC: Office
          of Policy, Planning and Evaluation, EPA 230-
          01-073, (1990).

       Viscusi, W. Kip, W.A. Magat, and  Joel Huber,
          "Informational Regulation of Consumer Health
          Risks:  An Empirical  Evaluation of Hazard
          Warnings,"  Rand Journal of  Economics
          17(Autumn):351-65, (1986).

-------
             GLOSSARY
Audience profile. A technique used to collect
information about the characteristics, habits,
needs, resources, and interests of a particular
group of individuals (see baseline study).

Baseline study. The collection and analysis of
data regarding a  target audience or situation
prior to intervention.

Central location intercept interviews.  In-
terviews conducted with respondents who are
stopped at a highly trafficked location that is
frequented by individuals typical of the desired
target audience.

Channel. The route of message delivery (e.g.,
mass media, community, interpersonal).

Closed-ended questions. Questions that pro-
vide respondents with a list of possible answers
from which to choose;  also called multiple
choice questions.

Control (comparison) group. A sample ran-
domly  selected and matched to  the  target
population according to characteristics identi-
fied in the study to permit a comparison of
changes between those who receive the inter-
vention and those who do not. A comparison
group serves the same function but it is not
randomly selected (or otherwise lacks the match
desired for statistical analysis).

Convenience samples.  Samples that consist
of respondents who are typical of the target
        audience and who are easily accessible; not
statistically projectable to the entire popula-
tion being studied.

Design. A comprehensive statement of evalu-
ation objectives, methods, and techniques.
Diagnostic information.  Results from pre-
testing research that indicate the strengths and
weaknesses in messages and materials.
Experimental group. A sample of the target
audience who are chosen to receive a commu-
nication treatment.
Focus group interviews. A type of qualitative
research in which an experienced moderator
leads  about 8 to  10 respondents through a
discussion of a selected topic, allowing them to
talk freely and spontaneously.
Formative evaluation.  Evaluative research
conducted during program development. May
include  state-of-the-art reviews, pretesting
messages  and materials, and pilot testing a
program on a small scale before full implemen-
tation.

Goal. The overall improvement the program
will strive to create.

Impact  evaluation.  Research designed to
identify whether and to what extent a program
contributed to accomplishing  its stated goals
(here, more global than outcome evaluation).

In-depth interviews. A form of qualitative
research consisting of intensive interviews to
find out how people think and what they feel
about a given topic.

Intermediaries.  Organizations, such as pro-
fessional, industrial, civic, social or fraternal
groups, that act as channels for  distributing
program messages and materials to members
of the desired target audience.
Objective. A quantifiable statement of a de-
sired program achievement necessary to reach
a program goal.

-------
        Open-ended questions. Questions that allow
       an individual to respond freely in his or her
       own words.

       Outcome evaluation.  Research designed to
       account for a program's accomplishments and
       effectiveness; also called "impact" evaluation.

       Polysyllabic words. Words that contain three
       or more syllables.

       Pretesting. A type of formative research that
       involves systematically gathering target audi-
       ence reactions to messages and materials before
       they are produced in final form.

       Process evaluation.  Evaluation to study the
       functioning of components of program imple-
       mentation; includes assessments of whether
       materials are being distributed to the right
       people and in what quantities, whether and to
       what extent program activities are  occurring,
       and other measures of how and how well the
       program is working.

       PSA.  Public  service  announcement; used
       without charge  by the media.

       Qualitative research.  Research that is  sub-
       jective in that it involves obtaining information
       about feelings  and impressions from small
       numbers of respondents.  The information
       gathered usually should not be described in
       numerical terms, and generalizations about the
       target populations should not be made.

       Quantitative research. Research designed to
       gather objective information from representa-
       tive, random samples of respondents; results
       are expressed in numerical terms (e.g., 35
       percent are aware of X and 65 percent are not).
       Quantitative data are used to draw conclusions
       about the target audience.

Random sample. A sample of respondents in
which every individual of a particular population
has had an equal chance of being included in
the sample.
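
For illustration, drawing a simple random sample simply means giving every unit in the
sampling frame the same chance of selection. The short Python sketch below shows one way
to do this; the telephone-number frame and the sample size of 400 are hypothetical values,
not figures from any EPA study.

    import random

    # Hypothetical sampling frame of household telephone numbers.
    frame = ["301-555-{0:04d}".format(n) for n in range(10000)]

    # Equal-probability draw: every number in the frame has the same
    # chance of ending up in the sample.
    sample = random.sample(frame, k=400)

    print(len(sample), sample[:3])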

Readability testing. Using a formula to predict
the approximate reading grade level a person
must have achieved in order to understand
written material.
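
For example, one widely used readability formula, the SMOG grade, estimates the reading
grade level from the number of polysyllabic words in a sample of sentences. The Python
sketch below shows the arithmetic; the syllable counter is a crude stand-in for the
dictionary-based counts that readability software normally uses, and the sample passage
is invented.

    import math
    import re

    def count_syllables(word):
        # Rough estimate: count groups of vowels. Real readability tools
        # use dictionaries or more careful rules.
        return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

    def smog_grade(text):
        # SMOG formula (McLaughlin, 1969), normalized to a 30-sentence sample:
        # grade = 1.0430 * sqrt(polysyllables * 30 / sentences) + 3.1291
        sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
        words = re.findall(r"[A-Za-z']+", text)
        polysyllabic = sum(1 for w in words if count_syllables(w) >= 3)
        return 1.0430 * math.sqrt(polysyllabic * (30.0 / len(sentences))) + 3.1291

    passage = ("Radon is a naturally occurring radioactive gas. "
               "Prolonged exposure increases the risk of lung cancer. "
               "Homeowners can test for it inexpensively.")
    print(round(smog_grade(passage), 1))

A high estimated grade level is usually a signal to shorten sentences and replace
polysyllabic words before pretesting the material with the target audience.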

Recall. In pretesting, used to describe the
extent to which respondents remember seeing
or hearing a message that was shown in a
competitive media environment; usually cen-
ters on the main idea.

Risk management. The selection of risk
control options.

Self-administered questionnaire. A question-
naire that is filled out by the respondents
themselves, either mailed directly to the
respondent or completed at a central location.

Stakeholder. Someone with an interest or
"stake" in the outcome of the evaluation.

Target audience. The desired or intended
audience for program messages and materials.
The primary target audience consists of those
individuals the program is designed to affect.
The secondary target audience is that group
(or groups) that can help reach or influence the
primary audience.

-------
              APPENDIX A
          QUESTIONNAIRES
1.  Communicating Radon Risk Effectively
   (Maryland Baseline Survey)
2.  Indoor Air Quality Booklet Survey
3.  Managing Environmental Risks at Public
   Schools:  A Survey of Local School Districts

-------

-------
              Communicating Radon Risk Effectively:
                       Maryland Baseline Survey

           Telephone #                                         RTI ID
1.  Compared to other issues the State of Maryland faces, do you think environmental issues are:
   (READ LIST; CIRCLE ONE NUMBER.)

   a. MORE IMPORTANT	 01

   b. JUST AS IMPORTANT	 02

   c. LESS IMPORTANT	03

   d. DONT KNOW (DONT READ)	 94
2.  We're interested in finding out how serious you think the risks from some types of pollution are to
   your community and to your household. On a scale from 1 to 10, with 1  meaning NOT AT ALL
   SERIOUS and 10 meaning VERY SERIOUS, please tell me how serious you think the risks from
   each type of pollution are to your community and to your household. (READ LIST, SCALE, AND
   CATEGORIES; PROBE FOR NUMBER.)
                 Not at all                                        Very
                  serious                                         serious
                     1    2    3    4    5    6    7    8    9    10

                                                         Your           Your
                                                       community      household

    a. LEAD IN DRINKING WATER ...................      ________       ________

    b. HAZARDOUS WASTES IN LANDFILLS ............      ________       ________

    c. RADON IN HOMES ...........................      ________       ________
3.   For the rest of the interview I'm going to ask questions mainly about one of the sources I
     mentioned—radon in homes. During the past few months, have you seen or heard anything
     about radon? (CIRCLE ONE NUMBER.)
     a. YES ........................ 01  --> Continue
     b. NO ......................... 02  --> Skip to Question 5
                                             on page 3

-------
4A.  In the past few months have you seen anything in a newspaper or magazine or heard anything
     on the radio or TV about radon?
     a. YES ........................ 01  --> Continue
     b. NO ......................... 02  --> Skip to Question 4B

     Was that in the newspaper or magazine, or was it on the radio or TV?
     (CIRCLE ALL THAT APPLY.)
     a. NEWSPAPER .................. 01     c. RADIO ......... 03
     b. MAGAZINE ................... 02     d. TV ............ 04


4B.  Have you seen or heard any public service ads about radon in a newspaper or magazine, or on
     the radio or TV in the past few months?
     a. YES ........................ 01  --> Continue
     b. NO ......................... 02  --> Skip to Question 4C

     Was that in the newspaper or magazine, or was it on the radio or TV?
     (CIRCLE ALL THAT APPLY.)
     a. NEWSPAPER .................. 01     c. RADIO ......... 03
     b. MAGAZINE ................... 02     d. TV ............ 04


4C.  Have you seen a poster, read a utility bill insert, or heard a presentation about radon in the past
     few months?
     a. YES ........................ 01  --> Continue
     b. NO ......................... 02  --> Skip to Question 4D

     Was that a poster or utility bill insert? (CIRCLE ALL THAT APPLY.)
     a. POSTER ..................... 01
     b. UTILITY BILL INSERT ........ 02
     c. PRESENTATION ............... 03


4D.  Have you talked about radon with a friend, relative, or coworker in the past few months?
     a. YES ........................ 01  --> Continue
     b. NO ......................... 02  --> Skip to Question 4E

     Was that a friend, relative, or coworker? (CIRCLE ALL THAT APPLY.)
     a. FRIEND ..................... 01     c. COWORKER ...... 03
     b. RELATIVE ................... 02


4E.  Have you called the State of Maryland toll-free number for radon information?
     a. YES ........................ 01
     b. NO ......................... 02


4F.  In the past few months have you learned anything about radon in some other way?
     a. YES ........................ 01  --> How was that? _______________________
     b. NO ......................... 02

-------
5.  Information about radon comes from many sources. If you wanted to know more about radon,
    which government agency would you contact? (DO NOT READ LIST; CIRCLE THE AGENCY.)

    a. MARYLAND DEPARTMENT OF THE ENVIRONMENT ........................ 01
    b. MARYLAND DEPARTMENT OF HEALTH AND MENTAL HYGIENE .............. 02
    c. LOCAL HEALTH DEPARTMENT ....................................... 03
    d. U.S. ENVIRONMENTAL PROTECTION AGENCY .......................... 04
    e. OTHER (SPECIFY) ______________________________________________  05
    f. DONT KNOW (DONT READ) ......................................... 94

6.  If you wanted to learn about radon-related health problems, which of the following sources would
    you trust the most to give you that information? (READ LIST; CIRCLE ONE NUMBER.)

    a. MARYLAND DEPARTMENT OF THE ENVIRONMENT ........................ 01
    b. MARYLAND DEPARTMENT OF HEALTH AND MENTAL HYGIENE .............. 02
    c. LOCAL HEALTH DEPARTMENT ....................................... 03
    d. U.S. ENVIRONMENTAL PROTECTION AGENCY .......................... 04
    e. FAMILY DOCTOR ................................................. 05
    f. SOME OTHER SOURCE (SPECIFY) ................................... 06
    g. NO ONE (DONT READ) ............................................ 07
    h. DONT KNOW (DONT READ) ......................................... 94
7.   People have different opinions about radon. Please tell me how much you agree or disagree
     with each of the following statements. (READ ANSWER CHOICES AFTER FIRST STATEMENT.)

                                      Strongly                      Strongly   Don't
                                       agree     Agree   Disagree   disagree   know

     a. IT IS IMPORTANT TO TEST
        MY HOME TO FIND OUT IF
        I HAVE A RADON PROBLEM ......    01        02       03         04       94

     b. IF I HAD A RADON PROBLEM,
        IT WOULD BE COSTLY TO FIX ...    01        02       03         04       94

     c. EVEN IF A RADON PROBLEM
        WAS FIXED, MY HOME WOULD
        STILL BE WORTH A LOT LESS ...    01        02       03         04       94

-------
8A.  Have you had your home tested for radon? (CIRCLE ONE NUMBER.)
     a. YES ....................... 01  --> Continue

     b. NO ........................ 02 }
                                       }--> Skip to Question 9
     c. DONT KNOW (DONT READ) ..... 94 }

8B.  When did you get your results?    ______/______
     (If "don't know", enter "94/94")   MONTH/YEAR

8C.  Were the results over 4 picocuries per liter?
     a. YES ....................... 01  --> Continue

     b. NO ........................ 02 }
                                       }--> Skip to Question 11
     c. DONT KNOW (DONT READ) ..... 94 }

8D.  Did you do followup testing, anything to fix the problem, both, or nothing?
     a. FOLLOWUP TESTING .......... 01
     b. FIX PROBLEM ............... 02
     c. BOTH ...................... 03
     d. NOTHING ................... 04
     e. DONT KNOW (DONT READ) ..... 94

     Skip to Question 11
9.   People may have various reasons for deciding not to have their home tested for radon. What is
     the main reason you have not had yours tested? (DONT READ LIST; CIRCLE ALL THAT APPLY.)

     a. NEVER THOUGHT ABOUT IT ..................... 01
     b. DIDNT KNOW IT WAS POSSIBLE ................. 02
     c. DONT THINK I HAVE A PROBLEM ................ 03
     d. DONT KNOW HOW TO TEST ...................... 04
     e. THOUGHT TESTING WAS TOO EXPENSIVE .......... 05
     f. DONT THINK TESTS ARE RELIABLE .............. 06
     g. NOT INTERESTED ............................. 07
     h. DIDNT KNOW IT WAS A PROBLEM IN THIS AREA ... 08
     i. WOULD RATHER NOT KNOW IF THERE IS
        A PROBLEM .................................. 09
     k. FIXING A PROBLEM IS TOO EXPENSIVE .......... 11
     l. CONCERNED ABOUT CONFIDENTIALITY ............ 12
     m. JUST HAVENT GOTTEN AROUND TO IT ............ 13
     n. OTHER (SPECIFY) ............................ 14
     o. DONT KNOW (DONT READ) ...................... 94
10.   Suppose your local health department was offering a radon test for a one-time cost of $10, $25,
     $50, $100. The cost would cover two radon detectors, the results, and a booklet about radon.
     Would you take part in such a radon testing program? (CIRCLE ONE NUMBER.)
     a. YES 	  01
     b. NO	  02
     c. DONT KNOW (DONT READ)	  94

11.   Suppose you are just  moving to this area and you want a home like the one you're in now. You
     have narrowed the choice to two houses that are almost identical. The only difference is that
     House 1  has radon levels 2, 5 times higher than the government's guidelines for action, while
     House 2 has no radon but costs an additional $5,000, $10,000, $15,000, $20,000. Which house
     would you buy? (CIRCLE ONE NUMBER.)
     a. HOUSE 1  	  01
      b. HOUSE 2 ...................... 02
      c. DONT KNOW (DONT READ) ........ 94

-------
Some people have heard a great deal about radon while others have heard very little. We're interested
in learning how much people know about radon. For the next group of questions, I am going to read
you three choices. Please tell me which answer you think is best. If "don't know" is your best answer,
then say that.
                                                                               Record
                                                                              Responses

12.   Where does most radon in homes come from?
      a. INDUSTRIAL POLLUTION ............................................ 01
      b. NATURAL URANIUM IN SOIL ......................................... 02
      c. OR HOME APPLIANCES .............................................. 03
      d. DONT KNOW (DONT READ) ........................................... 94

13.   Which of the following best describes radon? Radon has:
      a. A SLIGHT ODOR ................................................... 01
      b. A STRONG ODOR ................................................... 02
      c. OR NO ODOR AT ALL ............................................... 03
      d. DONT KNOW (DONT READ) ........................................... 94

14.   When radon is measured in a home, which of the following will affect the level?
      a. THE TIME OF YEAR ITS MEASURED ................................... 01
      b. THE AMOUNT OF INDUSTRIAL POLLUTION AROUND THE HOME .............. 02
      c. OR THE NUMBER OF APPLIANCES IN THE HOME ......................... 03
      d. DONT KNOW (DONT READ) ........................................... 94

15.   How can you test your home for radon?
      a. YOU CAN DO IT WITH A HOME TEST .................................. 01
      b. ONLY TRAINED PERSONNEL CAN TEST ................................. 02
      c. OR YOU CANNOT TEST FOR RADON .................................... 03
      d. DONT KNOW (DONT READ) ........................................... 94

16.   When do health problems from being exposed to radon usually occur?
      a. WITHIN A FEW WEEKS .............................................. 01
      b. IN A FEW YEARS .................................................. 02
      c. OR NOT FOR 10 TO 30 YEARS ....................................... 03
      d. DONT KNOW (DONT READ) ........................................... 94

17.   What kind of health problems are high levels of radon exposure likely to cause?
      a. MINOR SKIN PROBLEMS ............................................. 01
      b. EYE IRRITATIONS ................................................. 02
      c. OR LUNG CANCER .................................................. 03
      d. DONT KNOW (DONT READ) ........................................... 94

18.   What can homeowners do to reduce high radon levels in their homes?
      a. REMOVE THE APPLIANCES CAUSING THE PROBLEM ....................... 01
      b. HIRE A CONTRACTOR TO FIX THE PROBLEM ............................ 02
      c. OR THERE IS NO WAY TO FIX THE PROBLEM ........................... 03
      d. DONT KNOW (DONT READ) ........................................... 94

-------
19.  People sometimes describe themselves in various ways. For each statement I read please tell me
     if these things are true about you always, often, sometimes, or never. (READ LIST AND SCALE;
     CIRCLE ONE NUMBER FOR EACH STATEMENT.)

                                                           Some-              Don't
                                        Always    Often    times     Never    know
     a. I TRY TO FIX THINGS
        AROUND THE HOUSE .............    01        02       03        04       94

     b. I EXERCISE AND/OR WATCH
        WHAT I EAT TO PROTECT MY
        HEALTH .......................    01        02       03        04       94

     c. I ASK MY PHYSICIAN A LOT
        OF QUESTIONS ABOUT MY
        HEALTH .......................    01        02       03        04       94

     d. I WAIT UNTIL I HAVE A LOT OF
        INFORMATION BEFORE I DECIDE
        TO BUY SOMETHING LIKE A NEW
        APPLIANCE ....................    01        02       03        04       94

     e. I QUESTION INFORMATION FROM
        EXPERTS OR OTHER AUTHORITIES .    01        02       03        04       94
20.   Please tell me how active you are in each of the following types of organizations or activities.
      (READ LIST AND SCALE; CIRCLE ONE NUMBER FOR EACH STATEMENT.)

                                                          Very   Somewhat   Not at all   Don't
                                                         active   active      active     know
      a. CIVIC CLUB (KIWANIS, LEAGUE OF WOMEN VOTERS) ..   01       02          03         94

      b. CHURCH OR RELIGIOUS ORGANIZATION ..............   01       02          03         94

      c. VOLUNTEER ACTIVITIES (RED CROSS, UNITED WAY) ..   01       02          03         94

-------
Now, we have just a few more general background questions.

21.  About how many years have you lived at this address?	YEARS


22.  Is your home a: (READ LIST; CIRCLE ONE NUMBER.)
     a. SINGLE-FAMILY HOME ......... 01       d. TOWNHOUSE ................... 04
     b. MOBILE HOME ................ 02       e. CONDOMINIUM ................. 05
     c. DUPLEX ..................... 03       f. DONT KNOW (DONT READ) ....... 94


23.  To the best of your knowledge was your home built: (READ LIST; CIRCLE ONE NUMBER.)
     a. BEFORE 1940 ................ 01      c. AFTER 1976 .................. 03
     b. BETWEEN 1940 AND 1976 ...... 02      d. DONT KNOW (DONT READ) ....... 94


24.  Are you planning to move during the next year? (CIRCLE ONE NUMBER.)
     a. YES	  01
     b. MAYBE 	  02
     c. NO	  03
     d. DONT KNOW (DONT READ)	  94


25.  Does your home have a basement? (CIRCLE ONE NUMBER.)

     a. YES ........................ 01
     b. NO ......................... 02 }  --> Skip to Question 27
     c. DONT KNOW (DONT READ) ...... 94 }      on page 8

26.  Is any part of your basement used as living space by you or your family?
     (CIRCLE ONE NUMBER.)
     a. YES ........................ 01
     b. NO ......................... 02
     c. DONT KNOW (DONT READ) ...... 94

-------
27.  How many people are there in your household? _______

28.  How many children  under 12? 	

29.  Do you smoke cigarettes or other tobacco products? (CIRCLE ONE NUMBER.)
    a.  YES  	 01
    b.  NO	 02

30.  Does anyone else in your household smoke? (CIRCLE ONE NUMBER.)
    a.  YES  	 01
    b.  NO	 02


31.  What was the highest grade of school that you completed? (CIRCLE ONE NUMBER.)
    a.  NO SCHOOL	01       e. SOME COLLEGE (13-15)	 05
  '  b.  GRADE SCHOOL (1-8)	  02       f.  COLLEGE GRADUATE (16)	 06
    c.  SOME HIGH SCHOOL (9-11)	  03       g. POSTGRADUATE (17+)	 07
    d.  HIGH SCHOOL GRADUATE (12)  ....  04


32.  What is your age?	YEARS


33.  Is  your racial or ethnic background (CIRCLE ONE NUMBER.)
    a.  WHITE OR CAUCASIAN	  01       d. ASIAN OR PACIFIC ISLANDER 	 04
    b.  BLACK OR NEGRO  	  02       e. NATIVE AMERICAN INDIAN	 05
    c.  HISPANIC	  03


34.  (ASK ONLY IF UNCLEAR.) What is your sex? (CIRCLE ONE NUMBER.)
    a.  MALE 	  01
    b.  FEMALE	  02
35.  I'm going to read a list of income categories for FAMILY income from all sources BEFORE taxes
     during 1986. Please tell me to stop when I get to yours. (CIRCLE ONE NUMBER.)
      a. $5,000 OR UNDER ............ 01       e. $35,001 - $50,000 ........... 05
      b. $5,001 - $15,000 ........... 02       f. $50,001 - $65,000 ........... 06
      c. $15,001 - $25,000 .......... 03       g. $65,001 - $80,000 ........... 07
      d. $25,001 - $35,000 .......... 04       h. $80,001 AND OVER ............ 08
36.  If you had to sell your home today, what do you think your home and property would sell for?
     $	(PROBE FOR APPROXIMATE)
                       Thank you very much for your cooperation.
                     Your answers will be most helpful in this study.

-------
          "eiecrore Me.
                                          RT1 1.0.
                                                                             CMS 20'C-OC'4
                                                                                E.xsires 5/38
                    Radon Information Effectiveness Survey:
                           Maryland Baseline Screener
                                    Final Interview Code
01 Ineligible, Not Residential Number
02 Ineligible, Not Homeowner
03 Ring, No Answer
04 Nonworking Number
05 Double Wrong Connection
06 Answering Machine/Service
07 No Result From Dial
08 Fast Busy/Computer Tone
09 Unable to Contact
10 Physically/Mentally Incompetent
11 Language Barrier
12 Interview Completed
13 Partial Data
14 Final Interview Refusal
15 Other
Hello, my name is ______________________. I'm calling from the Research Triangle Institute (RTI), in North
Carolina. We are conducting a study on what people know and think about environmental issues. It
won't take much of your time and your answers will be kept strictly confidential. (Additional information,
if necessary: Your cooperation is very important because we want to find out what the general public
knows about environmental issues. This is not a sales call. The study is sponsored by the State of
Maryland.)
1.  Is this ____________________?

    Yes ................. 01   - CONTINUE
    No .................. 02   - "Thank You," HANG UP
2.  Does this number serve a: (READ ALL CHOICES, MARK ONE.)
   Residence	 01  - CONTINUE
   Business/institution 	 02
   Or something else	 03
                                    - "Thank You," HANG UP
3.  Do you own your residence?
   Yes  	  01   - CONTINUE
   No  	  02   - TERMINATE
4.  As part of our study, I need to randomly choose an adult who makes or shares in important
   household decisions. Please tell me the first names of the adult decisionmakers in your household.
   (IF RESPONDENT CAN'T ANSWER, ASK FOR ADULT, REPEAT INTRODUCTION.)
   1. Woman's Name:
                                               2. Man's Name:
   Third Decisionmaker:
                                             Fourth Decisionmaker:
    (TO CHOOSE RESPONDENT, LOOK AT LABEL AND CHOOSE THE FIRST NAME IF THE NUMBER IS A "1"
    OR THE SECOND NAME IF IT'S A "2." IF YOU HAVE TWO MEN OR TWO WOMEN DECISIONMAKERS,
    JUST WRITE THE SECOND NAME IN THE MAN'S SPACE AND FOLLOW THE SAME CHOICE SELECTION
    RULE. IF MORE THAN TWO DECISIONMAKERS, THEN CONSULT YOUR RANDOM SELECTION TABLE.
    CIRCLE NUMBER OF PERSON SELECTED.)

   May I please speak to NAME OF SELECTED DECISIONMAKER. (IF NOT AVAILABLE, SCHEDULE A
   CALLBACK.)

   READ INTRODUCTION IF PERSON ANSWERING  IS NOT THE RESPONDENT.
                                     TERMINATION
      Thank you very much for your cooperation. Our study involves only homeowners, so I won't
     need any more of your time. Thank you again for your help.

-------

-------
          INDOOR AIR QUALITY BOOKLET SURVEY        OBS # _____
Name
Address
Phone
                         REGION
Date
Time
Result
Recall Code
Abbreviations:
     NA = no answer
     NH = respondent not home
     WR = will return
     DISC = disconnect
     AM = answering machine
                              WN = wrong number
                              1C = interview completed
                              PIC = partially completed
                               RC = return call
                              ET = eastern time
     I = IDENTICAL TO A PRIOR QUESTION
     VS = VERY SIMILAR TO A PRIOR QUESTION
     S = SIMILAR OR BASED ON A PRIOR QUESTION

********************* ALL CAPS ARE NOT READ *********************
Hello.  Is this the _______________________ residence?
                        (last name)

 (IF NO, The number I was calling is ______________ and it was for
 the _______________________ residence.)
          (full name)

 (IF WRONG NUMBER, I am sorry to have bothered you.)
My name is _______________ and I'm conducting a study to determine
the effectiveness of the recent publication, The Inside Story:  A
Guide to Indoor Air Quality.  This is not a sales call.

-------

-------
Your household was  chosen  randomly from the group  of  people who
requested  this  publication  from  the  Environmental  Protection
Agency.

I'd like to ask  you  some questions about  the booklet.   It's very
important to us to know what you think,  so we can tell whether our
efforts to inform you  are working.  All answers you give will be
kept strictly confidential.   This will only take a few minutes.


USE IF RELUCTANT:  Again, this is not  a  sales call.  It is a study
sponsored by the Environmental Protection Agency.

1.  First of all, did  your  household  receive the Guide to Indoor
Air Quality from EPA?   (It has a blue and grey cover.)

     A. NO	01
     	I'm sorry.  One was sent to your household but apparently
failed to reach you.  Would you like me to arrange for another copy
to be sent to you?   (REAFFIRM ADDRESS)'  May I ask you a few general
questions about  the  environment? CONTINUE WITH QUESTIONS 2-4 AND
8-37, SKIPPING QUESTIONS 19 AND 20.

     B. YES	 .02

     ** Are  you the person  in your household most familiar
     with the booklet?

     i)  NO—May I speak with him/her?
          —Is  there  a  convenient  time  when  he/she  will  be
available to talk with me?  SCHEDULE CALLBACK
     ii) YES

     ** About how much time did you spend reading this booklet?

     a. LESS THAN 10 MINUTES	01
     b. 10 TO 30 MINUTES	02
     C. 30 TO 60 MINUTES	03
     d. OVER AN  HOUR	04

     e. DID NOT  READ	00
     *  IF 0 MINUTES,  CONTINUE WITH QUESTIONS  2-4 AND 8-37,
     SKIPPING QUESTIONS 19  AND 20


2.  Compared to  other  environmental issues  that might affect your
health, do you  think indoor air pollution is:

     a. more important	01
     b. just as  important	02
     c. or  less  important	03
     d. DON'T KNOW	04

-------
3.  On a scale from 1 to 10, with one meaning not at all serious,
and 10 meaning very serious, tell me how serious you think the
risks from each of  the following types of pollution  are  to your
household.

     a.  first, lead in drinking water                   	
     b.  hazardous  wastes in landfills                   	
     c.  indoor air pollution                            	
4.  I'm going to read several statements.  Please tell me whether
you strongly agree, agree, disagree or strongly disagree with each
statement.  If you don't know, just say "don't know."
                                                        1  2  3  4  99
     a. Most indoor air pollution comes from nearby
 industries...........................................SA  A  D  SD  DK
     b. Ordinary household products can cause indoor
 air pollution........................................SA  A  D  SD  DK
     c. The best way to reduce indoor air pollution
 usually is to remove the source of the pollution.....SA  A  D  SD  DK
     d. The only health effects coming from indoor air
 pollution are short-term.............................SA  A  D  SD  DK
     e. Most people need to test their homes for a wide
 variety of indoor air pollutants.....................SA  A  D  SD  DK
     f. Radon is the only major indoor air pollutant..SA  A  D  SD  DK
     g. Reducing indoor air pollution is always very
 expensive............................................SA  A  D  SD  DK

Now some statements about the booklet; again, strongly agree,
agree, disagree, strongly disagree.
                                                        1  2  3  4  99

5.   a. The booklet was written in everyday English...SA  A  D  SD  DK
     b. The organization of the booklet was hard to
        follow........................................SA  A  D  SD  DK
     c. The booklet covered what you needed to know...SA  A  D  SD  DK
     d. The booklet helped you identify possible
        sources of indoor air pollution in your home..SA  A  D  SD  DK
     e. The booklet described practical ways to
        reduce indoor air pollution in your home......SA  A  D  SD  DK


6.   On a scale of  1 to 10,  with  1  meaning not  informed and  10
meaning very  informed,  how informed  did you feel  you were  about
indoor air pollution:

     a. before you received the Guide to Indoor Air Quality?  	
     b. after you received the booklet?                      	
7.  Can you think of any particular information in the booklet that
you found most informative or helpful?

     CIRCLE ALL THOSE ANSWERS WHICH APPLY.   ANSWERS ARE NOT READ
     a. GENERAL DESCRIPTION OF CAUSES OF INDOOR AIR POLLUTION..01

-------
     b. HOW INDOOR AIR POLLUTION AFFECTS YOUR  HEALTH	02
     c. DESCRIPTION OF STEPS TO REDUCE INDOOR  AIR POLLUTANTS... 03
     d. REFERENCE GUIDE  (MIDDLE OF BOOKLET)	04
     e. MEASURING POLLUTANTS IN THE HOME	05
     f. ADDITIONAL SOURCES OF INFORMATION	06
     g. BUILDING A NEW HOME	07
     h. INFORMATION ON WEATHERIZING HOMES	08
     i. SICK BUILDING SYNDROME.................................09
     j. APARTMENT LIVING.......................................10
     k. RADON..................................................11
     l. ENVIRONMENTAL TOBACCO SMOKE............................12
     m. BIOLOGICAL CONTAMINANTS, SUCH AS BACTERIA AND MOLD.....13
     n. CARBON MONOXIDE........................................14
     o. NITROGEN DIOXIDE.......................................15
     p. RESPIRABLE PARTICLES THAT ARE RELEASED WHEN FUELS ARE
     INCOMPLETELY BURNED.......................................16
     q. ORGANIC CHEMICALS AND GASES, SUCH AS PAINTS, VARNISHES
     AND FUELS.................................................17
     r. FORMALDEHYDE...........................................18
     s. PESTICIDES.............................................19
     t. ASBESTOS...............................................20
     u. LEAD...................................................21
     v. OTHER (SPECIFY) .......................................22
     w. DON'T KNOW/NO OPINION	99

INDICATE ALL THAT APPLY.      ANSWERS ARE NOT  READ.

 **  Within the last year,  have you taken,  or do you have plans to
take, any measures to reduce  	 in your
home?

 8.  Radon

     NO	01
     YES	02

        What have you done or are you doing?
          a. TEST HOME RADON LEVELS      	 PCI/L
          b.  MORE INFORMATION  OR  PROFESSIONAL  ADVICE
          (E.G. EPA GUIDELINES)
          C. SEAL CRACKS AND OTHER OPENINGS  IN BASEMENT  FLOOR
          d. INCREASE VENTILATION
          e. TREAT RADON CONTAMINATED WELL WATER
          f. DECREASE SMOKING IN HOME
          g. PLANS TO 	
          h. OTHER (SPECIFY)  	
 9.  Environmental tobacco smoke

     NO	01
     YES	02

        What have you done or are you doing?
          a. STOP SMOKING

-------
          b. DISCOURAGE OTHERS FROM  SMOKING
          c. ASK SMOKERS TO SMOKE OUTSIDE
          d. PLANS TO
          e. OTHER (SPECIFY)
 10.  Biological contaminants, such  as  bacteria or mold

     NO	01
     YES	02

        What have you done or are you doing?
          a. INSTALL  FANS  VENTED TO THE  OUTDOORS IN  THE  KITCHEN
          AND/OR BATHROOM(S)
          b. INCREASE USE  OF THE FANS  VENTED TO THE  OUTDOORS IN
          THE KITCHEN AND/OR BATHROOM(S)
          c. VENT CLOTHES DRYER OUTSIDE
          d. CLEAN HUMIDIFIER MORE FREQUENTLY
          e. USE ONLY DISTILLED WATER IN  THE  HUMIDIFIER
          f. EMPTY WATER TRAYS IN APPLIANCES  MORE FREQUENTLY
          g. CLEAN AND DRY, OR REMOVE',  WATER-DAMAGED CARPET (S)
          h. DECREASE USE OF BASEMENT AS  A  LIVING AREA
          i. CONSCIOUSLY ATTEMPT TO  MAINTAIN  HUMIDITY AT 30-50%.
          j. VENTILATE THE ATTIC AND CRAWL SPACE TO PREVENT
          MOISTURE BUILD-UP
          k. PLANS TO _______________________
          l. OTHER (SPECIFY) _______________________
 11.  Carbon monoxide and nitrogen dioxide

     NO	01
     YES	02

        What have you done or are you doing?
          a. PROPERLY ADJUST GAS APPLIANCES
          b. VENT GAS SPACE HEATERS AND FURNACES
          c. PROPER FUEL IN KEROSENE SPACE HEATERS
          d. INSTALL EXHAUST FAN,  VENTED TO THE OUTDOORS,  OVER GAS
          STOVE
          e.  INCREASE  USAGE  OF  EXHAUST  FANS,   VENTED  TO  THE
          OUTDOORS, OVER GAS STOVE
          f. CHOOSE PROPERLY SIZED WOOD STOVES CERTIFIED  TO MEET
          EPA EMISSIONS STANDARDS
          g. CHECK SEAL ON WOOD STOVE DOOR
          h. TRAINED PROFESSIONAL VISIT—INSPECT, CLEAN AND TUNE-
          UP CENTRAL HEATING SYSTEM
          i. DECREASE IDLING OF CAR IN GARAGE
          j. PLANS TO 	
          k. OTHER (SPECIFY) 	
 12.  Respirable particles, which are released when fuels are not
completely burned.

     NO	01
     YES	02

-------
       What have you done or are you doing?
         a. VENT FURNACES TO THE OUTDOORS
         b. CHOOSE PROPERLY  SIZED  WOOD STOVES CERTIFIED  TO  MEET
         EPA EMISSIONS STANDARDS
         c. CHECK SEAL ON DOOR OF WOOD STOVE
         d. CHANGE FILTERS ON CENTRAL HEATING  AND COOLING  SYSTEMS
         AND AIR CLEANERS
         e. TRAINED PROFESSIONAL VISIT—INSPECT,  CLEAN  AND TUNE-
         UP CENTRAL HEATING SYSTEM
         f. PLANS TO 	___
         g. OTHER (SPECIFY)	
13.   Organic chemicals and gases, such as from paints  and  fuels?

    NO	01
    YES	02

       What have you done or are you doing?
         a. MORE AWARE OF MANUFACTURER'S DIRECTIONS
         b. USE PRODUCTS  OUTDOORS OR IN WELL-VENTILATED
         AREAS
         c. DISCARD  UNUSED OR  LITTLE-USED CONTAINERS
         SAFELY
         d. BUY QUANTITIES TO BE USED SOON
         e. PLANS TO 	
         f. OTHER (SPECIFY)	

14.   Formaldehyde

    NO	01
    YES	02

       What have you done or are you doing?
         a. USE EXTERIOR  GRADE, LOWER  EMITTING,  PRESSED WOOD
         PRODUCTS
         b. MAINTAIN  MODERATE  TEMPERATURES AND  REDUCE HUMIDITY
         LEVELS TO 30-50%
         C. INCREASE VENTILATION, PARTICULARLY AFTER NEW SOURCES
         OF EMISSION HAVE BEEN INTRODUCED.
         d. PLANS TO 	
         e. OTHER (SPECIFY) 	

15.   Exposure to pesticides

    NO	01
    YES	02

       What have you done or are you doing?
         a. MORE AWARE OF MANUFACTURER'S DIRECTIONS
         b. MIX OR DILUTE OUTDOORS
         c. APPLY ONLY IN RECOMMENDED QUANTITIES
         d. TAKE PETS OR PLANTS OUTDOORS TO APPLY
         e. GREATER USE OF NON-CHEMICAL METHODS OF PEST CONTROL

-------
          f. SELECT PEST CONTROL  COMPANY  CAREFULLY
          g. DECREASE  STORAGE OF  UNNEEDED PESTICIDES  INSIDE  THE
          HOME
          h. DISPOSAL OF UNWANTED  CONTAINERS  MORE  SAFELY
          i. STORAGE OF CLOTHES WITH MOTH REPELLENTS IN SEPARATELY
          VENTILATED AREAS
          j. INDOOR SPACES  CLEAN AND WELL-VENTILATED  IN ORDER TO
          ELIMINATE OR MINIMIZE USE  OF  AIR FRESHENERS
          k. PLANS TO 	___
           l. OTHER (SPECIFY) _______________________

 16.  Asbestos

     NO	01
     YES	02

        What have you done or  are  you doing?
          a. PROFESSIONAL  ADVICE  TO IDENTIFY POTENTIAL ASBESTOS
          PROBLEMS
          b. TRAINED AND QUALIFIED CONTRACTORS
          c. REPLACE  WOODSTOVE  DOOR  GASKETS WHICH  MAY  CONTAIN
          ASBESTOS, FOLLOWING  PROPER PROCEDURE
          d. PLANS TO 	
          e. OTHER (SPECIFY) 	

 17.  Lead

     NO	01
     YES	02

        What have you done or  are  you doing?
          a. PAINT TESTED FOR  LEAD
          b. MORE CARE IN NOT  DISTURBING  LEAD-BASED PAINT
          C.  COVER  LEAD-BASED  PAINT  WITH  WALLPAPER  OR  OTHER
          BUILDING MATERIAL
          d. USE WELL VENTILATED AREAS FOR HOBBIES AND HOUSEHOLD
          MAINTENANCE ACTIVITIES  INVOLVING LEAD
          e. CONSULT HEALTH  DEPARTMENT ABOUT REMOVAL  AND CLEANUP
          IF LEAD EXPOSURE IS  SUSPECTED
          f. TEST BLOOD LEVELS
          g. TEST DRINKING WATER  FOR LEAD
          h. PLANS TO 	
          i. OTHER (SPECIFY) 	
(VS)
18.   In the past  year,  about  how much money  have you spent  on
testing for or reducing indoor  air pollution  in your home?

     a. NONE	01
     b. < $100	02
     C. $100 - 199	03
     d. $200 - 499	04
     e. $500 - 999	05
     f. $1000 OR OVER	06

-------
19.   Have you contacted any of the sources listed in the booklet?

     a.  NO	01
     b.  YES	02

          Which one(s)?  	
20.   Have  you shared  the  booklet or recommended  the booklet to
others not in your household?

     a. NO	01
     b. YES
        Who would that be?
          FAMILY/RELATIVES—NOT LIVING WITH THEM	02
          FRIENDS/NEIGHBORS	03
          OTHER (SPECIFY)..	..04
Now just a few general background questions and we'll be finished.

(I)
21. About how many years have you lived at this address?  	•
(I)
22.  Do you own your own home?

     a. NO	01
     b. YES	02-
     c. DON'T KNOW.,	99

(VS)
23.  What type of home  is  it?

     a. SINGLE-FAMILY HOME	01
     b. MOBILE HOME	02
     c. DUPLEX		03
     d. TOWN-HOUSE	04
     e. CONDOMINIUM	05
     f . APARTMENT.	06
     g. OTHER (SPECIFY)		07
     h. DON'T KNOW	99

(I)
24.  To the best of your knowledge was your home built:

     a. before 1940	01
     b. between 1940 and 1976	02
     c. or after 1976	03
     d. DON'T KNOW	99

-------
(VS)
25.   Are you planning to move during the next year or two?

     a .  NO	01
     b.  YES	02
     c.  MAYBE	03

(I)
26.   Does your home have a basement?

     a.  NO (GO TO 28)	01
     b.  YES	02

     (I)
     27.  Is any part of your basement used as  living space by  you
      or your family?
          a. NO	01
          b. YES	02

(I)
28.   How many people are in your household?              	
(I)
29.  How many under the age of 12?                       	

30.  How many over the age of 60?                        	

(VS)
31.   Does  anyone in  your household  smoke  cigarettes  or  other
tobacco products?

     a. NO	01
     b. YES	02

(I)
32.  What was the highest grade of  school  that  you  completed?

     a. NO  SCHOOL	01
     b. GRADE SCHOOL  (1-8)	02
     c. SOME HIGH SCHOOL  (9-11)	03
     d . HIGH SCHOOL GRADUATE  (12 )	04
     e. SOME COLLEGE  (13-15)	05
     f. COLLEGE GRADUATE  (16)	06
     g. POSTGRADUATE  (17+)	07

(VS)
33.  Please tell me which age category  you are  in.

     a. 18  - 24	01
     b. 25  - 34	02
     c. 35  - 44	03
     d. 45  - 54	04
     e. 55  - 64	05
     f. 65  and  over	06

-------
(VS)
34.   What is your racial or ethnic background?

     a.  WHITE OR CAUCASIAN...........................01
     b.  BLACK OR NEGRO	02
     c.  HISPANIC	03
     d.  ASIAN OR PACIFIC ISLANDER	04
     e.  NATIVE AMERICAN INDIAN 	05
     f .  REFUSAL	99

(I)
35.   What is your sex?  (ASK ONLY IF UNCLEAR)
     a.  MALE	01
     b.  FEMALE	02

(VS)
36.   I'm going to read a list of broad income categories for family
income from all sources before taxes during 1988.  (1986 USED IN
MD STUDY)  Please tell me to stop when I get to yours.

     a.  $5,000 or under..............................01
     b.  $5,001 - 15,000	02
     c.  $15,001 - 25,000	03
     d.  $25,001 - 35,000		04
     e.  $35,001 - 50,000	05
     f.  $50,001 - 65,000	06
     g.  $65,001 - 80,000	07
     h.  $80,001 and over	08
     i.  REFUSAL	99
Thank you very much for your cooperation.

37.    Is there anything you could suggest to improve this booklet
or future information on indoor air quality?  (FOR THOSE WHO HAVE
NOT  READ THE BOOKLET:   Is there any  specific information about
indoor air quality you would find useful?)
Again, thank you.  Your responses will be combined with others and
analyzed  to help us  improve  our communications about  indoor air
quality.

-------
                MANAGING ENVIRONMENTAL RISKS AT PUBLIC SCHOOLS:
                         A SURVEY OF LOCAL SCHOOL DISTRICTS
   By completing this questionnaire you will help us to evaluate and improve federal and state
   programs to provide information and assistance to local school districts on reducing student
   and staff exposure to environmental health risks.  Your response will be strictly confidential.

   This form should be completed by the Superintendent of Schools or by the individual who
   is responsible for determining or supervising the actions your district takes to address
   potential environmental problems.  The questions that follow are for your entire district.
Q1   What is your position with this school district? (Circle the number of the best answer)

     1   SUPERINTENDENT
     2   OTHER (please specify)	_	
                                                   name and position title
Q2  How many years have you been employed in this district?

     	 YEARS
Q3   Who is responsible for deciding what actions will be taken by this school district about
     environmental health issues?  (Circle the numbers of all that apply)

     1   LOCAL BOARD OF EDUCATION
     2   SUPERINTENDENT OF SCHOOLS
     3   OTHER (specify)	
Q4   Who is  responsible  for  directly supervising  any actions  this  district  takes  about
     environmental health issues?
                                 Position or Department

-------
Q5   About how often does your district use the following sources to obtain information on
     potential environmental problems in the schools?  (Circle the number of the best answer
     for each information source listed)

                                                           SELDOM   SOMETIMES   OFTEN

     Print media ........................................     1         2         3

     Radio or television news ...........................     1         2         3

     State education department .........................     1         2         3

     State health department ............................     1         2         3

     Regional or national Environmental
     Protection Agency Office ...........................     1         2         3

     Other (Such as the State School Board Association,
     environmental groups, other state agencies, etc.
     please specify) ____________________________________
Q6   In the past year, what has been the combined level of concern expressed by parents,
     students, faculty and staff about each of the following? (Circle number of best response
     for each item)

                                                                            DON'T
                                                  NONE  LITTLE  SOME  GREAT  KNOW
     Student use of drugs and alcohol ..........    1      2      3     4      9
     Student use of tobacco ....................    1      2      3     4      9
     Asbestos in school buildings ..............    1      2      3     4      9
     Radon in school buildings .................    1      2      3     4      9
     Other indoor air pollution ................    1      2      3     4      9
     Outdoor air near schools ..................    1      2      3     4      9
     Lead in drinking water ....................    1      2      3     4      9
     Other drinking water concerns .............    1      2      3     4      9
     Other (specify) _________________________      1      2      3     4      9

-------
Q7   What do you think the relative health risk is for students and employees in your district's
     facilities for each of the following?  We recognize it is difficult to know exactly how
     significant different risks are, but please circle the number of the response that best reflects
     your opinion about each issue.

                                                   NO          SOME          GREAT   DON'T
                                                  RISK         RISK          RISK    KNOW
     Student use of alcohol and drugs .........     1     2      3      4      5       9
     Student use of tobacco ...................     1     2      3      4      5       9
     Asbestos in school buildings .............     1     2      3      4      5       9
     Radon in school buildings ................     1     2      3      4      5       9
     Other indoor air pollution ...............     1     2      3      4      5       9
     Outdoor air near schools .................     1     2      3      4      5       9
     Lead in drinking water ...................     1     2      3      4      5       9
     Other drinking water concerns ............     1     2      3      4      5       9
     Other (specify) ________________________       1     2      3      4      5       9
                      ABOUT LEAD IN  DRINKING WATER
Q8   From where does your school district obtain its supply of drinking water? (Circle numbers
     of all that apply)


     1   SCHOOL OWNED WATER SUPPLIES
     2   PURCHASE FROM LOCAL COMMUNITY

     3   PURCHASE FROM PRIVATE SUPPLIER

     4   OTHER (please specify)	
Q9   Does your district have a program for testing drinking water for contaminants, metals or
     other problems? (Circle number of best response)
     1   NO

     2   YES
How often do  you test drinking water

supplies?	

-------
Q10  How familiar are you with state and federal regulations and guidelines for testing for and
     correcting lead in school drinking water? (Circle number of best answer for each)

                                              NOT AT ALL        SOMEWHAT          VERY
                                               FAMILIAR         FAMILIAR        FAMILIAR
     State regulations and guidelines ......       1        2       3        4       5
     Federal regulations and guidelines ....       1        2       3        4       5
Q11  Which of the following has your district used to help determine your district's actions on
     testing for and correcting lead in drinking water problems? (Circle the numbers of all that
     apply)
         NO INFORMATION HAS BEEN OBTAINED

         PRINTED MATERIALS FROM THE ENVIRONMENTAL PROTECTION AGENCY

         PRINTED MATERIALS FROM STATE HEALTH DEPARTMENT

         NEWSPAPER AND OTHER PRINTED MEDIA

         WORKSHOPS OR SEMINARS SPONSORED BY (specify)	
        OTHER (Such  as state  school  board association, contractors, national  education

        organization, etc. Please specify)	
Q12  Has your district specifically tested for lead in drinking water in the district's buildings?
     (Circle number of best answer)

     1   NO  --->  Is your district currently planning to test for lead levels in drinking water
                   in the next 12 months?
                        1  NO  ---> Why not? ______________________________
                        2  YES
                   Skip to Question 15

     2   YES

-------
Q13 When did your district first test for lead in drinking water, and when did your district most
     recently test for lead in drinking water? (List date, or approximate number of months or
     years ago)
     First test
     Most recent test
Q14  What did these tests find?  (Circle numbers of all that apply)

     1   NO RETESTING OR CORRECTIVE ACTIONS WERE NECESSARY
     2   RETESTING NECESSARY AT SOME SITES
     3   CORRECTIVE ACTIONS WERE NECESSARY
Has this been completed?
     1  YES    2  NO
            a.   What types of problems, or potential problems, were found? (Circle numbers
                 of all that apply)
                    WATER SUPPLY PROBLEMS
                    PLUMBING PROBLEMS
                    WATER COOLER PROBLEMS
                    OTHER (specify)	
           b.   Please describe the problem and any difficulties in taking corrective action.
            c.  What is the status of corrective actions? (Circle numbers of all that apply)
              1     SOME CORRECTIVE ACTIONS HAVE BEEN COMPLETED
                    When?       	
              2     SOME CORRECTIVE ACTIONS ARE PLANNED WITHIN	MONTHS
              3     SOME OR ALL CORRECTIVE ACTIONS HAVE NOT BEEN SCHEDULED
                    Why not?	

-------
Q15  If your district has, or will, test for and correct any lead in drinking water problems: (Circle
     numbers of all that apply)
                                                 DISTRICT   PRIVATE     STATE   DON'T
                                                  STAFF CONTRACTORS   STAFF   KNOW
      Who did, or would do, the testing?  	1        2          3      9
      Who did, or would do, any corrective actions?	1        2          3      9
 Q16  From what sources were funds obtained, or where will funds be obtained, to implement
      testing for and correcting lead in drinking water problems?
 Q17 In the spring of 1989, the Environmental Protection Agency (EPA) sent a flyer to all local
     school districts and state health and education agencies announcing a manual entitled
     "Lead in School Drinking Water."  From where, if at all, have you obtained or will you obtain
     this manual?  (Circle numbers of all that apply)
     1    NOT AWARE OF THIS MANUAL
     2   NO CURRENT PLANS TO OBTAIN THIS MANUAL
     3   GOVERNMENT PRINTING OFFICE USING ORDER FORM IN THE FLYER
     4   STATE DEPARTMENT OF EDUCATION OR DEPARTMENT OF HEALTH
     5   FROM THE REGIONAL EPA OR FEDERAL EPA OFFICES
     8   OTHER (Specify)	
     9   DONT KNOW
Q18  Have you received the manual? (Circle number of best response)

     1   NO ............ }
                          }---> Please skip to Q22
     2   DONT KNOW ..... }

     3   YES

-------
Q19  Using the five point rating scales beside each item, please indicate if you think the manual
     "Lead in School Drinking Water" is:  (Circle 9 if you don't recall or have not used the
     manual)

                                                                        DON'T
                                          NOT                  VERY    RECALL
     Clear and Understandable ........     1    2    3    4      5        9
     Instructive and Informative .....     1    2    3    4      5        9
     Complete ........................     1    2    3    4      5        9

     If you did not find the manual to be complete, what else did you require?
Q20 Did the manual affect your district's actions or plans regarding testing the drinking water
     for lead?  (Circle number of best response)

     1   NO
     2   YES, SOMEWHAT
     3   YES, DEFINITELY
Q21  If the manual "Lead in School Drinking Water" had not been available, where would your
     district have sought guidance on testing and correcting for lead in drinking water supplies?
     (Circle numbers of all that apply)

     1   INFORMATION MAY NOT HAVE BEEN SOUGHT
     2  STATE HEALTH DEPARTMENT
     3  REGIONAL EPA OFFICE
     4  ENVIRONMENTAL CONSULTING FIRMS
     8  OTHER (specify)	
Q22  Have you seen the list published in the spring of 1989 of lead lined water coolers that the
     Environmental Protection Agency  recommends should be tested, repaired or replaced?
     (Circle number of best answer)

     1   NO
     2   YES

-------
Q23  How important do you think each of the following has been in motivating and helping your
     district to take action on potential health risks due to lead in school drinking water?  (Circle
     number of best response for each item)

                                                       NOT                    VERY    DON'T
                                                    IMPORTANT              IMPORTANT  KNOW
     State requirements and recommendations .....      1     2     3     4     5        9
     Federal requirements and recommendations ...      1     2     3     4     5        9
     State technical assistance .................      1     2     3     4     5        9
     State financial assistance .................      1     2     3     4     5        9
     EPA materials and technical assistance .....      1     2     3     4     5        9
     Concerns expressed by the public, media,
     parents and staff ..........................      1     2     3     4     5        9
     Other (specify) ___________________________       1     2     3     4     5        9
Q24  Please indicate how serious each of the following has been in impeding any action your
     school district might take about lead in drinking water?  (Circle number of best answer for
     each item)

                                                       NOT                    VERY    DON'T
                                                     SERIOUS                SERIOUS   KNOW
     Inadequate district funds ..................       1     2     3     4     5       9
     Inadequate state funds .....................       1     2     3     4     5       9
     Inadequate information from the
     Environmental Protection Agency ............       1     2     3     4     5       9
     Inadequate information from state agencies .       1     2     3     4     5       9
     Inadequate expertise in district ...........       1     2     3     4     5       9
     Inadequate staff to handle extra work ......       1     2     3     4     5       9
Q25 Please add any other comments you have about the federal Environmental Protection
     Agency's requirements or about the materials and technical assistance they provided about
     lead in drinking water.

-------
                   ABOUT RADON GAS IN YOUR SCHOOLS
Q26  How familiar are you with state and federal regulations and guidelines for testing for and
     correcting the presence of radon gas? (Circle number of best answer for each)

                                     NOT AT ALL        SOMEWHAT          VERY
                                      FAMILIAR         FAMILIAR        FAMILIAR
     Federal guidelines .........        1        2       3        4       5
     State guidelines ...........        1        2       3        4       5
Q27 Which of the following has your district used to help determine your district's actions on
     testing for and correcting radon gas problems?  (Circle the numbers of all that apply)
        NO INFORMATION HAS BEEN OBTAINED

        PRINTED MATERIALS FROM THE ENVIRONMENTAL PROTECTION AGENCY

        PRINTED MATERIALS FROM STATE HEALTH DEPARTMENT

        NEWSPAPER AND OTHER PRINTED MEDIA

        WORKSHOPS OR SEMINARS SPONSORED BY (specify)	
        OTHER (Such as state  school  board association,  contractors, national education
        organization, etc. Please specify)	
Q28  Has your district specifically tested for radon gas in the district's buildings? (Circle number
     of best response)

     1   NO  --->  Is your district currently planning to test for radon gas problems in the
                   next 12 months?
                        1  NO  ---> Why not? ______________________________
                        2  YES
                   Skip to Question 31

     2   YES
Q29  When did your district first test for radon gas, and when did your district most recently test
     for radon gas? (List date, or approximate number of months or years ago)
     First test
     Most recent test

-------
Q30  What did these tests find? (Circle numbers of all that apply)

     1   NO RETESTING OR CORRECTIVE ACTIONS WERE NECESSARY

     2   RETESTING NECESSARY AT SOME SITES

     3   CORRECTIVE ACTIONS WERE NECESSARY
            Has this been completed?
            1  YES    2  NO

            a.  What types of problems were found?

            b.  What difficulties have you had addressing these problems?

            c.  What is the status of corrective actions? (Circle numbers of all that apply)
                1    SOME CORRECTIVE ACTIONS HAVE BEEN COMPLETED
                     When? ______________________
                2    SOME CORRECTIVE ACTIONS ARE PLANNED WITHIN ______ MONTHS
                3    SOME OR ALL CORRECTIVE ACTIONS HAVE NOT BEEN SCHEDULED
                     Why not? (specify) ______________________



Q31  If your district has, or will, test for and correct any radon gas problems:  (Circle numbers
     of all that apply)
                                                 DISTRICT    PRIVATE      STATE   DON'T
                                                  STAFF    CONTRACTORS    STAFF   KNOW
     Who did, or would do, the testing? ...........  1          2           3       9

     Who did, or would do, any corrective actions?.  1          2           3       9

-------
Q32 From what sources were funds obtained, or where will funds be obtained, to implement
     testing  for and correcting radon gas problems?
Q33 From  where,  if at all,  have you obtained,  or  will  you obtain,  the  report "Radon
     Measurements in Schools"? (Circle numbers of all that apply)
     1    NOT AWARE OF THIS REPORT

     2    NO CURRENT PLANS TO OBTAIN THIS REPORT

     3    GOVERNMENT PRINTING OFFICE.

     4    STATE DEPARTMENT OF EDUCATION OR DEPARTMENT OF HEALTH

     5    FROM THE REGIONAL EPA OR FEDERAL EPA OFFICES

     8    DONT KNOW

     9    OTHER (Specify)	
Q34  Have you received the report "Radon Measurements in Schools"?  (Circle number of best
     response)

     1   NO ............ }
                          }---> Please skip to Q38
     2   DONT KNOW ..... }

     3   YES
Q35  Using the five point rating scales beside each item, please indicate if you think the report,
     "Radon Measurements in Schools" is: (Circle 9 if you don't recall or have not used the
     report)

                                                                        DON'T
                                          NOT                  VERY    RECALL
     Clear and Understandable ........     1    2    3    4      5        9
     Instructive and Informative .....     1    2    3    4      5        9
     Complete ........................     1    2    3    4      5        9

     If you did not find the report to be complete, what else did you require?

-------
Q36  Did the report affect your district's actions or plans regarding testing for radon?  (Circle
     number of best response)
     1   NO

     2   YES, SOMEWHAT

     3   YES, DEFINITELY
Q37 If the report "Radon Measurements in Schools" had not been available, where would your
     district have sought guidance on testing for and correcting radon gas problems?  (Circle
     numbers of all that apply)
     1    INFORMATION MAY NOT HAVE BEEN SOUGHT
     2    STATE HEALTH DEPARTMENT
     3    REGIONAL EPA OFFICE
     4    ENVIRONMENTAL CONSULTING FIRMS

     5    OTHER (specify)	—
Q38  How important do you think each of the following has been in motivating and helping your
     district to take action on potential health risks due to radon gas? (Circle number of best
     response for each item)

                                                       NOT                    VERY    DON'T
                                                    IMPORTANT              IMPORTANT  KNOW
     State requirements and recommendations .....      1     2     3     4     5        9
     Federal recommendations ....................      1     2     3     4     5        9
     State technical assistance .................      1     2     3     4     5        9
     State financial assistance .................      1     2     3     4     5        9
     EPA materials and technical assistance .....      1     2     3     4     5        9
     Concerns expressed by the public, media,
     parents and staff ..........................      1     2     3     4     5        9
     Other (specify) ___________________________       1     2     3     4     5        9

-------
Q39  Please indicate how serious each of the following has been in impeding any action your
     school district might take about radon gas?  (Circle number of best answer for each item)

                                                       NOT                    VERY    DON'T
                                                     SERIOUS                SERIOUS   KNOW
     Inadequate district funds ..................       1     2     3     4     5       9
     Inadequate state funds .....................       1     2     3     4     5       9
     Inadequate information from the
     Environmental Protection Agency ............       1     2     3     4     5       9
     Inadequate information from state agencies .       1     2     3     4     5       9
     Inadequate expertise in district ...........       1     2     3     4     5       9
     Inadequate staff to handle extra work ......       1     2     3     4     5       9
Q40 Please add  any other comments you have about the federal Environmental Protection
     Agency's guidance or about the materials and technical assistance provided about radon
     gas.
                    ABOUT YOUR DISTRICT'S FACILITIES


Q41  Approximately what proportion of your facilities were built, or totally remodeled, in each of
     the following time periods? (Circle number of best answer for each time period)

                              NONE OR                  MOST OR
                             VERY FEW                    ALL
     Since 1980 ........         1            2           3
     1960 - 1979 .......         1            2           3
     1940 - 1959 .......         1            2           3
     Before 1940 .......         1            2           3

-------
                     IF YOU NEED MORE INFORMATION

    Check this box if you would like information on how to obtain the EPA manual "Lead in School
    Drinking Water" and the name and number of the contact person in your state government.

    Check this box if you would like information on how to obtain the EPA report "Radon
    Measurements in Schools" and the name and number of the contact person in your state
    government.

    Results of this survey will be aggregated so no school district can be identified.  If you
    would like a summary of the results of this survey, check this box.
                IS THERE SOMETHING WE OVERLOOKED?

Please use this space for anything you would like to add about the U.S. EPA, the materials it
provides, the assistance it offers, mandates, recommendations, etc.
Thank you for your assistance!

-------
             APPENDIX B
     FOCUS GROUP MATERIALS
1.  Screening Questionnaire


2.  Focus Group Format Guide
3.  Background Information on Health Concerns
   and Home Repairs

-------

-------
                        SCREENING QUESTIONNAIRE
 1. First of all, are you the:
        a. male head of household
        b. female head of household
        c. neither                             TERMINATE

 2. Do you own or rent your home?

       a. own
       b. rent                                TERMINATE

 3. From what source do you get your water for household use?

        a. city provides water                  (RECRUIT 5)
        b. community-owned well                 (RECRUIT 1)
        c. privately owned well                 (RECRUIT 4)

 4. Into which of the following age categories do you fall?

        a. 21-34                                (RECRUIT 3)
        b. 35-49                                (RECRUIT 3)
        c. 50-64                                (RECRUIT 3)
        d. 65+                                  (RECRUIT 1)

 5. Have you ever considered lead in your drinking water to be a problem?

        a. yes
        b. no                                  SKIP TO QUESTION 7

 6. How concerned are you regarding lead in your drinking water? Are you

      a. very concerned                       TERMINATE
      b. somewhat concerned                   TERMINATE
      c. not very concerned
       d. not at all concerned

7. When was the last time you were in a group discussion lasting longer than half an hour?

      a. less than a year ago
      b. more than a year ago
       c. never                               SKIP TO INVITATION

8. What was the subject of that discussion group?
IF SUBJECT SAYS _________________________________, THEN TERMINATE.

-------

-------
Focus Group  Format

Design session to last two hours. An hour and 40 minutes of that will actually be used for
discussion. The remainder is reserved for refreshments and mingling after the discussion.

Evaluate both pieces of literature in each group.  Present the shorter brochure first and
discuss it for about 40 minutes.  Introduce the longer brochure second and discuss it for
about an hour. Some of the discussion about the second brochure will include
comparisons between the two.

Introduction
Good morning.  My name is __________ and I work at the ________________________
______________________.  We are doing a study to learn how homeowners make decisions
about their homes and the health issues related to their homes. Each of you was selected to
participate in  today's discussions because you, or you and your spouse, are a homeowner.
We've invited you here today to talk about some environmental issues that relate to your
homes.

Icebreaker
First we are going to work our way around the table and introduce ourselves. Tell us your
first name, how long you have lived in your present home, and then describe your favorite
room in the home.

Ranking Cards  (First Focus Group Only)
Now I'm going to give each of you 2 cards. The first card has a list of five common
household concerns, the second has a list of ten common health concerns. I would like for
you to put your first name on each card, then rank the items on this card from one to five,
and on this card from one to ten, to indicate how seriously you consider each of these
problems or concerns. A ranking of one should indicate your most serious concern.

Pamphlet I
I am going to hand each of you a fact sheet that contains information relating to your
homes.  Take  a few minutes to read the brochure. When everyone is finished, I will ask
you a few questions.
(Hand out pamphlets. Allow 3-5 minutes to read)

What general information does the pamphlet convey about radon?
       •What do you think radon is?
       •What are the dangers of radon?
       •What are the chances of having radon in your home?

How much information does the pamphlet provide?
       •What, if anything, would you do after reading this fact sheet?
       •What other information, if any, would you need to determine if radon is a problem
             in your home?
       •Where do you think you might obtain  that information?

Does the pamphlet encourage the homeowner to take action?
       •How likely would you be to measure the radon level in your home after you
             finished reading this fact sheet?
       •What would you do if you discovered that your home had a high level of radon?

-------
How concerned is the homeowner about radon relative to other household problems?
       Let's look at the cards you filled out earlier.
       •Other than radon, what problems do you worry about in your home?
       •Where would you rank radon exposure among these problems?

How concerned is the homeowner about radon relative to other health concerns?
       •Now think for a minute about health concerns. What are some other health
             concerns that you worry about?
       •Where would you rank the risk of lung cancer from radon exposure?

How is information distributed?
       •Where do you think you might have found this fact sheet?
       •If you were in charge of telling homeowners about radon, would you want to use
             this fact sheet?
       •If so, how would you make it available?
       •What changes would you make to this fact sheet?
       •What other methods would you use to inform homeowners about radon?

Ranking Cards  (First Focus Group Only)
I'm going to pass out 2 more cards.  These are just like the ones you already have. The
first card has a list of five common household concerns, the second has a list of ten
common health concerns. I would like for you to put your first name on each card, then
rank the items again, based on how you now feel about each of these concerns. Again, a
ranking of one should indicate your most serious concern. When you are finished, you can pass the
cards back to me.
Brochure II

Now I am going to hand you another pamphlet. It may contain some of the same
information as the first one, but please read it carefully.  It is a little longer, so you'll have
more time to read. When everyone has finished, I will ask you some more questions.


How much information does the pamphlet provide?
       •What new information did you learn from this pamphlet?
       •Would you need to learn more before you decided to find out if radon is a problem
             in your home?
       •What additional kinds of information would you need?
       •Where do you think you could obtain this information?

How does the homeowner perceive the risks associated with radon exposure?
       •Does this pamphlet change your ideas about the dangers of radon?
       •Are you more or less concerned about radon?
       •If you tested the air in your home and the results showed a concentration of
              20 pCi/L, what would you do next?
       •If the test showed 1 pCi/L, what would you do?
       •At what level of radon concentration would you become concerned enough to take
             corrective measures?
       •What about water? How many of you obtain your water from a city or county
             water utility? Where do the rest of you obtain water?
       •At what level of radon concentration in your water would you become concerned
             enough to take corrective measures?

-------
Does the pamphlet instruct the homeowner to take corrective measures?
       •If you decided that you needed to reduce the level of radon in your home, how
             would you go about doing that?

What are the homeowner's expected costs of radon reduction?
       •If the source of radon in your home is soil gas, how much do you think it would
             cost to reduce the radon level?
       •What if the source is water?

How concerned is the homeowner about radon relative to other problems?
       •If you were planning some home improvement next month, say for example,
             converting your electric water heater to gas, and you discovered what you
             considered to be an unsafe level of radon in your home, what would you
             do?
       •After reading the second pamphlet, how do you rank radon among other
              household problems? (Hand out two more sets of blank cards)
       •What about health concerns? Where would you rank radon among the health
             concerns you listed earlier?

How is information distributed?
       •Where do you think you might find this pamphlet?
       •If you were in charge of telling homeowners about radon, would you  use this
             pamphlet?
       •What changes, if any, would you make?
       •How would you make this pamphlet available to homeowners?
       •If you were in charge, which pamphlet, the first or the second, would you prefer
             to use?
       •What other methods would you think would be effective in telling homeowners
              about radon?
       •What do you know now that you didn't know before reading these pamphlets?


Brochure III

We're going to look at one last pamphlet. This one is fairly short, but contains different
information, so please read it carefully.  When everyone has finished, I will ask you a few
more questions.

What type of risk information does the fact sheet convey?
       •What new information did you learn from this pamphlet?
       •What additional information would you like?
       •Does this pamphlet change your ideas about the dangers of radon?
       •Are you more or less concerned about radon?

How is information distributed?
       •If you were in charge of telling homeowners about radon, would you use this
             pamphlet?
       •What changes, if any, would you make?

-------
             Priorities for Home Change


____ replacing roof

____ replacing furnace/heating system

____ major landscaping changes

____ major exterior changes, e.g., painting house or
     new entrance way

____ reducing radon level in house

____ adding a new room or conversion

____ replacing plumbing to reduce lead in your
     drinking water

-------
             Your Health Concerns

____ cancer of the colon

____ heart disease

____ diabetes

____ lung cancer from exposure to tobacco smoke

____ AIDS

____ lung cancer from exposure to radon gas

____ breast cancer/testicular cancer

____ cancer from exposure to pesticides or other
     chemicals

____ Alzheimer's disease

____ stroke

____ neurological disorders from exposure to lead in
     your drinking water

-------
            APPENDIX C
      PRETESTING MATERIALS
1.  Field Review Form





2.  Pre-Post Booklet Testing Form





3.  How to Test for Readability

-------
    FIELD REVIEW FORM

Acc. No.  ________
Date      ________
Reviewer  ________

Target Audience (if different from Screening Form):

Topic (if different from Screening Form):

Major Messages (list):

Persuasive Technique (describe):

Distinguishing Qualities (describe):

                                          Excellent            Poor
Production Quality:                         5    4    3    2    1
(Comments)

Content:                                    5    4    3    2    1
(Comments)

Credibility:                                5    4    3    2    1
(Comments)

Ability to Attract Attention:               5    4    3    2    1
(Comments)

Ability to Convey Information:              5    4    3    2    1
(Comments)

Ability to Change Attitudes:                5    4    3    2    1
(Comments)

Ability to Elicit Appropriate Action:       5    4    3    2    1
(Comments)

Appropriate for National Distribution:      Yes    No    Limited use (describe)
(Comments)

Overall Rating:                             5    4    3    2    1
     (specify any particular strengths/weaknesses)

Recommend for further consideration (e.g., promotion, replication, purchase, adaptation, testing or evaluation)?
	Yes    	No

Please explain recommendation:
Return to:

-------
Considerations for Field Review

1. Target audience—What audience is the material best suited for? For whom should it not be used? Consider the
   language style, use of terminology,  length, appropriateness of examples and format in determining the target
   audience.

2. Persuasive technique—Are the messages positive and upbeat? Are positive role models used? Fear appeals?
   Authority figures  (who)? Peer pressure?

3. Distinguishing qualities—Innovative or unique presentation, format or style? Fills a need for specific audience or
   message?

4. Production qualities—Is the material professional in appearance, attractive, well-written? Is the production format
   appropriate for the intended use (e.g., setting, equipment required)? Should production changes be considered
   (e.g., use of less or more color)?

5. Content—Clear and accurate? Up to date? Appropriate message, tone and appeal? Stimulating? New knowledge?
   Perpetuate myths or stereotypes? Balanced and credible? Biased or judgmental?

6. Elicit action—Describes desired behavior? Illustrates skills required? Demonstrates appropriate behavior?

7. Credibility—Is production or distribution source credible for target audience? For intermediaries (e.g., teachers or
   parents)? Is message,  theme, presentation credible?

8. Appropriate for national distribution—Will materials stand alone, or require training for use? Inappropriate for some
   audiences (e.g., culturally inappropriate) or geographic areas?

9. Recommendation for evaluation—Are there questions or uncertainties that need to be resolved prior to determin-
   ing disposition? Should materials be tested?

-------
 Pre-Post Booklet Testing Form
 Source: U.S. Environmental Protection Agency
I.  Pretest Questions
As you probably are aware, Toms River is the site of a pilot project designed to inform residents about
potential environmental hazards associated with the Superfund site, and to encourage their
involvement in EPA's decision-making process for cleanup of the site.
  We would appreciate your willingness to share your reactions to the attached fact sheet by reading
it and answering a few questions. We do not ask your name, and all information you provide will
remain confidential.
  Because only a few Toms River citizens are being asked to help judge this material, your response
is particularly valuable.


Before you begin, please check the appropriate answers to these four questions.

1. How much would you say you know about the Toms River Superfund study?
  A  little	  Some	  A lot	
2. Is  there anything in particular you want to know about the study?
  Yes	   No	
  If yes, please specify.
  (Note: more knowledge questions can be added here.)

3. Are you or any member of your family an employee/former employee of (Superfund site company)?
  Yes	   No	

4. Are you a member of any group particularly concerned about the environment?
  Yes	   No	
Now, please turn the page and read the fact sheet.

-------
 II.  Posttest Questions
 Now that you have finished reading the fact sheet, please answer the questions below. You may refer
 back to the fact sheet as you consider your response if you wish.
 1.  In your own words, what would you say is the purpose of the Superfund study?
    (Note: additional knowledge questions can be added here.)
 2.  How much of the information in the fact sheet was new to you?
    Most of it	  Some of it	None	
 3.  Do you have questions about the Superfund study which were not answered in the fact sheet?
    Yes	   No	
    If yes, please list:
 4.  Was there anything you particularly liked about the fact sheet?
    Yes	   No	
    If yes, what?
 5.  Was there anything you particularly disliked about the fact sheet, or found confusing?
    Yes	   No	
    If yes, what?
 6.  This fact sheet is most appropriate for (check all that apply):
    General public	  College graduates	  Professionals	
 7.  Would you recommend the fact sheet to a friend or family member?
    Yes	  No	

 8.  The following is a series of phrases describing the fact sheet. Please circle the one choice on
    each line that most closely reflects your opinion.
    a. very interesting              somewhat interesting          not at all interesting
    b. very informative             somewhat informative          not informative
    c. accurate                   partially accurate              inaccurate
   d. very clear                  somewhat clear               confusing
   e. very useful                 somewhat useful              not useful
   f.  unbiased                   biased towards government     biased towards industry
   g. easy to read                understandable                hard to understand
   h. complete                   somewhat complete           incomplete
9. Would you like to say anything else about the fact sheet? Please comment:	
Thank you very much for your help in reviewing this fact sheet.

-------
 How to Test  for Readability
The SMOG Readability Formula
To calculate the SMOG reading grade level,
begin with the entire written work that is being
assessed, and follow these four steps:
1.  Count off 10 consecutive sentences near the
   beginning, in the middle, and near the end of
   the text.
2.  From this sample of 30 sentences, circle all
   of the words containing three or more
   syllables (polysyllabic), including repetitions of
   the same word, and total the number of words
   circled.
3.  Estimate the square root of the total number
   of polysyllabic words counted. This is done by
   finding the nearest perfect square, and taking
   its square root.
4.  Finally, add a constant of three to the square
   root. This number gives the SMOG grade, or
   the reading grade level that a person must
   have  reached if he or she is to fully under-
   stand the text being assessed.

   A few additional guidelines will  help to clarify
these directions:
•  A sentence is defined as a string of words
   punctuated with a period (.), an exclamation
   point  (!) or a question mark (?).
•  Hyphenated words are considered as one
   word.
•  Numbers which are written out should also be
   considered, and if in  numeric form in the text,
   they should be pronounced to determine if
   they are polysyllabic.
• Proper nouns, if polysyllabic, should be
  counted, too.
• Abbreviations should be read as
  unabbreviated to determine if they are
  polysyllabic.
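
   For readers who want to automate the count, the sketch below is one minimal Python
version of the steps and guidelines above. The syllable counter is a rough vowel-group
heuristic that only approximates counting syllables by hand, the sentence and word
patterns simplify the guidelines just listed, and all function names are illustrative.

    import math
    import re

    def _is_polysyllabic(word):
        # Rough heuristic: three or more vowel groups ~ three or more syllables.
        # Hand counting, as the guidelines describe, is more accurate.
        return len(re.findall(r"[aeiouy]+", word.lower())) >= 3

    def smog_grade(sample_sentences):
        """SMOG grade for a 30-sentence sample (10 consecutive sentences taken
        near the beginning, middle, and end of the text)."""
        words = [w for s in sample_sentences for w in re.findall(r"[A-Za-z'-]+", s)]
        polysyllabic_count = sum(1 for w in words if _is_polysyllabic(w))
        nearest_root = round(math.sqrt(polysyllabic_count))  # root of nearest perfect square
        return nearest_root + 3                              # add the constant of three

   For the 38 polysyllabic words counted in the worked example later in this appendix,
the nearest perfect square is 36, its root is 6, and the sketch returns a grade of 9,
matching the hand calculation.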

   Not all pamphlets, fact sheets, or other
printed materials contain 30 sentences. To test a
text that has fewer than 30 sentences:
1. Count all of the polysyllabic words in the text.
2. Count the number of sentences.
3. Find the average number of polysyllabic
  words per sentence as follows:
                 Total # of polysyllabic words
     average = ---------------------------------
                    Total # of sentences

4. Multiply that average by the number of
  sentences short of 30.
5. Add that figure on to the total number of
  polysyllabic words.
6. Find the square root and add the constant of
  3.
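
   The six steps above prorate the count up to a 30-sentence equivalent. A hypothetical
sketch of that adjustment follows (same caveats as the earlier sketch); the counts are
assumed to be tallied by hand or by any word counter.

    import math

    def smog_grade_short_text(polysyllabic_count, sentence_count):
        """SMOG grade for a text with fewer than 30 sentences, following the
        six steps above."""
        average = polysyllabic_count / sentence_count                          # step 3
        adjusted_total = polysyllabic_count + average * (30 - sentence_count)  # steps 4-5
        return round(math.sqrt(adjusted_total)) + 3                            # step 6

    # For example, 25 polysyllabic words in 20 sentences prorates to 37.5, whose
    # nearest perfect square is 36, giving a SMOG grade of 6 + 3 = 9.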

   Perhaps the quickest way to administer the
SMOG grading test is by using the SMOG
conversion table. Simply count the number of
polysyllabic words in your chain of 30 sentences
and look up the approximate grade level on the
chart.
  An example of how to use the SMOG
Readability Formula and the SMOG Conversion
Table is provided on the following page.
   Source: U.S. Department of Health and Human Services, Making Health Communication Programs Work,
   Bethesda, MD: National Cancer Institute, NIH Publication No. 89-1493, 1989.

-------
Example Using the
SMOG Readability Formula:

   [Sample pamphlet reproduced here: "In Controlling Cancer - You Make a Difference,"
   from the American Cancer Society, with each polysyllabic word circled.
   Sample only: Information may not be current.]
-------
   We have calculated the reading grade level
 for this example. Compare your results to ours,
 then check both with the SMOG conversion
 table:
   Readability Test Calculations
   Total Number of Polysyllabic Words          = 38
   Nearest Perfect Square                     =36
   Square Root                              =  6
   Constant                                 =  3
   SMOG Reading Grade Level                 =  9
SMOG Conversion Table*

Total Polysyllabic          Approximate Grade
Word Counts                 Level (±1.5 Grades)
     0-2                            4
     3-6                            5
    7-12                            6
   13-20                            7
   21-30                            8
   31-42                            9
   43-56                           10
   57-72                           11
   73-90                           12
  91-110                           13
 111-132                           14
 133-156                           15
 157-182                           16
 183-210                           17
 211-240                           18

*Developed by: Harold C. McGraw, Office of Educational Research,
Baltimore County Schools, Towson, Maryland.
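
   For those administering the test by lookup rather than by formula, the conversion table
can also be encoded directly as a range lookup. The sketch below simply stores each
word-count band's upper bound with its grade; the names are illustrative.

    import bisect

    # (upper bound of polysyllabic word-count band, approximate grade level)
    SMOG_CONVERSION = [(2, 4), (6, 5), (12, 6), (20, 7), (30, 8), (42, 9),
                       (56, 10), (72, 11), (90, 12), (110, 13), (132, 14),
                       (156, 15), (182, 16), (210, 17), (240, 18)]

    def grade_from_table(polysyllabic_count):
        """Approximate grade level (within about 1.5 grades) for a polysyllabic
        word count taken from a 30-sentence sample."""
        uppers = [upper for upper, _ in SMOG_CONVERSION]
        i = bisect.bisect_left(uppers, polysyllabic_count)
        if i == len(SMOG_CONVERSION):
            raise ValueError("count is above the table's range (0-240)")
        return SMOG_CONVERSION[i][1]

   For the worked example's count of 38 polysyllabic words, the 31-42 band applies and the
lookup returns grade 9, agreeing with the formula.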

-------