OFFICE OF INSPECTOR GENERAL
Catalyst for Improving the Environment
Pilot Study
Science to Support Rulemaking
Report 2003-P-00003
November 15, 2002

Abbreviations

CAA	Clean Air Act
CWA	Clean Water Act
EPA	Environmental Protection Agency
EPCRA	Emergency Planning and Community Right to Know Act
FIFRA	Federal Insecticide, Fungicide, and Rodenticide Act
IQ Guidelines	Information Quality Guidelines
NAAQS	National Ambient Air Quality Standards
NESHAP	National Emission Standard for Hazardous Air Pollutant
NOx	Nitrogen Oxides
NPDWR	National Primary Drinking Water Regulations
OAR	Office of Air and Radiation
OMB	Office of Management and Budget
OPEI	Office of Policy, Economics, and Innovation
OPPTS	Office of Prevention, Pesticides, and Toxic Substances
ORD	Office of Research and Development
OSWER	Office of Solid Waste and Emergency Response
OW	Office of Water
RAPIDS	Rule and Policy Information Development System
RCRA	Resource Conservation and Recovery Act
SDWA	Safe Drinking Water Act
TSCA	Toxic Substances Control Act

UNITED STATES ENVIRONMENTAL PROTECTION AGENCY
WASHINGTON, D.C. 20460
OFFICE OF INSPECTOR GENERAL
November 15, 2002
MEMORANDUM
SUBJECT: Report 2003-P-00003
Pilot Study: Science to Support Rulemaking
FROM: Jeffrey Harris, Director, Cross-Media Issues
Office of Inspector General
TO:	Thomas Gibson, Associate Administrator for
Policy, Economics, and Innovation
Paul Gilman, Science Advisor to the Agency
This report concludes the pilot study for an evaluation of science to support rulemaking
conducted by the Office of Inspector General (OIG) of the U.S. Environmental Protection
Agency (EPA). It addresses the role played by science, the genesis of this science, and the extent
to which the science was peer reviewed. Although this was only a pilot study, we believe the
results are sufficient to offer suggestions to improve the transparency and consistency with which
science is applied to rulemaking in the EPA. Some of the suggestions elaborate on
recommendations made in June 2001 to the Administrator by EPA's internal task force on
improving rulemaking. Our report also discusses the strengths and weaknesses of the
methodology we used in conducting the pilot study.
We appreciate the cooperation we received from the primary contacts for the rules covered by the
pilot study, and those others involved in these rulemakings who took the time to answer our
questions.
Since this report does not include recommendations to which you must respond, no action is
required. In accordance with EPA Order 2750, we are closing the report in our tracking system
upon issuance. We have no objections to the further release of this report to the public. For your
convenience, this report will be available at http://www.epa.gov/oigearth/eroom.htm.
If you have any questions about this report, please contact Christine Baughman at 202-566-2902.

Executive Summary
Rules, also known as regulations, are a critical cornerstone of the Environmental
Protection Agency's (EPA's) mission. By statute and executive order, they are to
be based on the best reasonably obtainable scientific, technical, economic, and
other information. EPA Administrator Whitman noted that the Agency's ability
to "accomplish our mission and continue to have a meaningful impact on the
quality of life for all Americans to a large extent is based on our ability to more
fully integrate science into our programs, policies and decisions."
By identifying the science that was critical to rules promulgated by EPA in the
past, we hoped to determine whether better research planning, application of
science to rules, and explanation of the role of science in rules could achieve
improvements in the science behind future environmental regulations. By critical,
we do not mean that the rules were promulgated because of the science, but that
without the science the rules would have been different, or even impossible to
promulgate. We completed a pilot study consisting of 15 case studies involving
16 of EPA's significant rulemakings to determine whether a full study would be
useful and feasible.
Results In Brief
The rules included in the pilot study were not a representative statistical sample of
EPA rules, and we did not identify all of the critical science inputs for every rule.
However, we made observations that we believe transcend these limitations and
will be useful to EPA rulemakers:
Role of Science: Science played an important role in the rules, but that
role was not always clear. Although the rules included in this pilot
study depended on hundreds of scientific documents, the role of
science often was not presented in a manner consistent with the
conventions of communicating scientific information, so it may be
unclear which science was critical and why.
Sources of Science: Although critical science originated from a variety of
sources, research performed under contract to EPA and the regulated
community by private sector firms was the most common source. Grants
and cooperative agreements accounted for about 8 percent of the work.
Data: Some of the rules would be based on fewer assumptions if EPA
had more data and fewer scientific "blind spots."
Peer Review: The critical science supporting the rules often was not
independently peer reviewed. Consequently, the quality of some science
remains unknown.
The pilot study identified significant challenges in defining target populations
of EPA's non-significant rules and in identifying critical science consistently,
and we do not intend to pursue a full study at this time.
Suggestions
Based on our observations in the pilot study, we offered several suggestions:
•	EPA should ensure that science in rulemaking is presented in a way that is
apparent and consistent with the conventions of science.
•	Information technology could be better used to ensure that the
Administrator, Congress, and the public could determine that the science
behind rulemaking is adequate.
•	The critical science behind EPA's rules should consistently be
independently peer reviewed.
Agency Comments and OIG Evaluation
We received both formal and informal responses from EPA management.
Generally, the comments from Agency officials were supportive of the
suggestions, although they identified some concerns about the details of their
implementation. EPA's Science Advisor has committed to review the Agency's
progress in implementing its Peer Review policy during the coming year, and
ensure that Agency decisions are based on sound science. We have incorporated
many of the specific Agency comments directly into the report and its Addendum
to improve clarity and factual accuracy. Because of the commitments made by
Agency management in their comments, we believe the report's observations may
serve as a baseline against which the Agency can chart progress.
Contents

Executive Summary ......................................................................... i
What We Did and Why .................................................................... 1
	We gained an understanding of rulemaking at EPA ........................ 2
	We identified significant rules promulgated since 1990 and
	selected a sample for 15 case studies ......................................... 3
	We sought to identify the critical science behind each rule .............. 5
	We identified the sources of the critical science ............................ 8
	We asked the respondents about science gaps and science quality ..... 8
	We identified the type of peer review undergone
	by the critical science ............................................................. 9
What We Learned About the Science Behind the Rules ...................... 10
	Science played an important role in the rules, but that role
	was not always clear ............................................................ 10
	Although critical science originated from a variety of sources,
	the private sector was the most common source ......................... 15
	More data and fewer "blind spots" could reduce assumptions ......... 17
	Critical science supporting the rules often was not
	independently peer reviewed .................................................. 18
Suggestions ................................................................................ 20
Agency Comments ....................................................................... 21
Response to Agency Comments ...................................................... 29
Pilot Lessons Learned ................................................................... 33
Exhibits
	1: Significant Rules Finalized, 1990-2001 .................................. 35
	2: Numbers and Completeness of Critical Documents ................... 39
	3: Who Performed the Science Work ........................................ 40
	4: Who Funded the Science Work ............................................ 41
	5: Funding Mechanisms Used ................................................. 42
	6: Peer Review Actions Taken ................................................ 43
	7: EPA Science Advisor Comments ......................................... 44
	8: Office of Water Comments ................................................. 48
	9: Office of Prevention, Pesticides and Toxic Substances Comments ... 58
	10: Office of Policy, Economics, and Innovation Comments ........... 69
	11: Report Distribution ......................................................... 78

What We Did and Why
Rules (also called regulations)1 are a cornerstone of the Environmental Protection
Agency's (EPA's) mission. Rules are, first and foremost, legal documents written
to meet legal goals. Nonetheless, they are to be based on the best reasonably
obtainable scientific, technical, economic, and other information. In June 2001, a
Regulatory Development Task Force established by the EPA Administrator to
improve EPA regulations offered several recommendations to ensure that science
has a more prominent role in EPA decision-making, and that there is timely and
thorough analysis of issues. The recommendations included improving existing
processes to more effectively ensure broader Agency involvement and executive
input on cross-cutting scientific, economic, or policy issues, as well as involving
EPA scientists in determining needed analyses and research, identifying
alternatives, and selecting options. As noted in a May 2002 memorandum from
the EPA Administrator, the ability of the Agency to "accomplish our mission and
continue to have a meaningful impact on the quality of life for all Americans to a
large extent is based on our ability to more fully integrate science into our
programs, policies and decisions."
By understanding what science was critical to the rules promulgated by EPA in
the past, we hoped to determine whether better research planning, application of
science to rules, and explanation of the role of science in rules could achieve
improvements in the science behind future environmental regulations. By critical,
we do not mean that the rules were promulgated because of these documents;
rather, without the documents, the rules would have been different, or even
impossible to promulgate.
We believed that such a study should answer the following questions:
•	What role does science play in supporting rules?
•	What science provided the most critical support for the rules and what was its
genesis?
•	What were the most significant gaps in the science underpinning the rules, and
could they have been filled with better research planning and communication?
•	How was the quality of the science evaluated?
•	Do rules with better scientific underpinnings exhibit measurably better
outcomes?
1	"Regulations," or "rules," are agency statements of general applicability and future effect, which the
agency intends to have the force and effect of law, and that are designed (1) to implement, interpret, or prescribe
law or policy, or (2) to describe the procedure or practice requirements of an agency. "Rulemaking" is synonymous
with "regulatory action."
In August 2001, we started a pilot study to determine whether we could answer
the above questions and, if so, the level of resources required. Early in the pilot
we consulted two Agency groups. The Research Strategies Advisory Committee
of the EPA Science Advisory Board comprises representatives of the various
advisory boards that advise on science in rulemaking in EPA. EPA's Regulatory
Steering Committee is the cross-agency group most closely involved with
rulemaking in EPA. With the advice of both groups, we designed this pilot study,
which we completed in June 2002.
We began by identifying EPA's significant rules finalized after 1989, and then
selected a small sample to pursue as case studies. For each rule in the sample, we
identified primary contacts involved in the rulemaking and contacted each
individual via e-mail for assistance in identifying the critical science documents
underlying the rule. For each document, we established: who conducted the
study, how the study was funded, and whether and how the document was peer
reviewed. Each step is discussed in more detail below, and the findings for each
selected rule are summarized in the case studies located in an Addendum to this
report available at http://www.epa.gov/oigearth/eroom.htm.
Except for the limitations identified in this paragraph, the pilot study was
conducted in accordance with the Government Auditing Standards for
performance audits. Our research into the management controls over developing
regulations was limited to a general understanding of the process, both for rules in
general and the rules in the sample. We did not test any controls (e.g., comparing
the planning requirements to the actual plans for the rules in the pilot study).
Also, because this was only a pilot study, not all of the attributes of a finding were
pursued, such as those related to cause (e.g., why there was a lack of peer review).
We gained an understanding of rulemaking at EPA
EPA develops many types of regulations based on a variety of mandates, but
certain processes are common to most rulemakings. Typically, once a rulemaking
is triggered by a statute, court order, petition, or executive initiative, the
appropriate program office(s) and the Office of Policy, Economics, and
Innovation (OPEI) prioritize staff time and resources, determine the regulatory
strategy to be pursued, identify the science and data needed and available, identify
a set of regulatory options, select an option, and propose the rule in the Federal
Register. The notice of proposed rulemaking in the Federal Register includes a
preamble that describes the basis and purpose of the rule, the alternatives
considered, and the underlying supporting information. To send a clear message,
the preamble must be written in plain language. The underlying supporting
information is found in the "dockets" for the rulemaking (drawers of paper files
available for public perusal and, for more recent rules, the equivalent electronic
files available on EPA's website). The dockets also contain all public comments
received, EPA's responses to public comments, and other information EPA
considers relevant to its decisions. After allowing for public comment (typically
60 days), EPA finalizes the rule by publishing it in the Federal Register, with a
new preamble responding to important comments and identifying any changes in
the rule since its proposal.
EPA's OPEI estimates that EPA publishes Federal Register notices for between
1,000 and 1,300 rules each year. Approximately 20 of these rules each year are
"significant" under Executive Order 12866 (see box). These significant rules
must be reviewed by the Office of Management and Budget, unless they were
specifically exempted. EPA proposes or finalizes approximately 200 rules each
year that have a lower level of impact than the significant rules but are still
generally national in scope. The remaining notices are for rules that primarily
impact individual States, Tribes, sites, or manufacturers, or involve minor
modifications and corrections to existing rules. Many proposed rules never
become final, and proposed significant rules may become non-significant on
finalization if their estimated costs decrease, they are determined to modify
existing significant rules, or the Office of Management and Budget determines the
rule is not significant.
We identified significant rules promulgated after 1989 and
selected a sample for 15 case studies
To develop a sample of rules to study, we first needed to identify a target
population of rules of interest. We decided to focus on significant rules (see box)
finalized after 1989. OPEI maintains a database of planned and ongoing
rulemakings, the Rule and Policy Information Development System (RAPIDS),
but at the time of the study, did not maintain a list of rules finalized over the
previous 10 years and was not able to construct a complete list of the significant
rules before 1998 using RAPIDS.
We used the Federal Register and EPA's website to identify the 89 significant
rules finalized in 1990 through 2001, which are listed in Exhibit 1. The list of
rules promulgated before 1994 may be incomplete since EPA's web-based
materials tended to be dated 1994 and later.

Under Executive Order 12866, "significant" rules:
•	Have an annual effect on the economy of $100 million or more or adversely
affect in a material way the economy; a sector of the economy; productivity;
competition; jobs; the environment; public health or safety; or State, local, or
tribal governments.
•	Create a serious inconsistency or otherwise interfere with an action taken or
planned by another agency.
•	Materially alter the budgetary impact of entitlements, grants, user fees, or loan
programs or the rights and obligations of recipients thereof.
•	Raise novel legal or policy issues arising out of legal mandates, the President's
priorities, or the principles set forth in the Executive Order.

Focusing on the 75 rules finalized from 1994 on, we show in Figure 1 that more
than half (39) of these rules were issued under the Clean Air
Act. Most of the rest involved either the Clean Water Act (16) or Safe Drinking
Water Act (6). Of these 75 rules, 39 met the significant criteria because of their
expected economic impact. (Note: because one of the rules involved two laws,
Figures 1 and 2 total 76 and 40, respectively.) As shown in Figure 2, a slightly
higher proportion of the economically significant rules involve the Clean Air Act,
but the economically significant rules are otherwise similar to the larger
population of significant rules.
[Figure 1: 1994-2001 Significant Rules, by Law]
[Figure 2: 1994-2001 Economically Significant Rules, by Law]
Legend: CAA: Clean Air Act; CWA: Clean Water Act; EPCRA: Emergency
Planning and Community Right to Know Act; FIFRA: Federal Insecticide,
Fungicide, and Rodenticide Act; RCRA: Resource Conservation and Recovery
Act; SDWA: Safe Drinking Water Act; TSCA: Toxic Substances Control Act
For the pilot study, we selected 16 significant rules finalized during 1991 through
2001. Three of the rules concerned related land disposal restrictions for solid
wastes, so we combined them into one case study. The Integrated Pulp and Paper
rule involved two different laws, so we divided it into two case studies. Thus,
there was a total of 15 case studies in the pilot. Each case study is identified in
Table 1 and detailed in the Addendum report, which is available at
http://www.epa.gov/oigearth/eroom.htm.
Table 1: Case Studies

Law	Case Study
CAA	Municipal Waste Combustors (1991)
SDWA	Synthetic Chemicals Monitoring
CAA	Acid Rain Permits (1993)
RCRA	Land Disposal Restrictions (1994, 1996, 1998)
CWA	Great Lakes Water Quality
TSCA	Biotechnology
CAA	Pulp and Paper (Air) (1998)
CWA	Pulp and Paper (Water) (1998)
SDWA	Disinfectants and Byproducts (1998)
TSCA	Polychlorinated Biphenyls
CAA	Regional Ozone
FIFRA	Plant-Incorporated Protectants (2001)
This judgment sample was selected to represent as wide a range of statutes as
possible, and to span the decade for rules under the Clean Air, Clean Water, and
Safe Drinking Water Acts, but it is not statistically representative of EPA's
significant rules during the period. For example, there was only one rule in the
sample establishing a National Emission Standard for Hazardous Air Pollutant
(NESHAP), whereas six NESHAPs were finalized between 1994 and 2001.
Notably, there are no National Ambient Air Quality Standards (NAAQS) in the
sample, even though three were finalized by EPA during this period. NAAQS are
especially science-rich rules, but we excluded them because the particulate matter
and ozone NAAQS were under remand to the District of Columbia Circuit Court
during the study.
We sought to identify the critical science behind each rule
For purposes of the pilot project, "science" included scientific and technical work
products addressing, for example, pollutant emissions, environmental transport,
cost impacts, exposure to humans and ecosystems, the effects of such exposure,
risk assessments, monitoring methods, and databases, i.e., the kinds of documents
that would be produced by scientists or engineers. We included economic
analyses only when they had a critical impact on the rule. As noted earlier, by
"critical," we do not mean that the rules were promulgated because of these
documents; rather, without the documents, the rules would have been different, or
perhaps would have even been impossible to promulgate.
We had planned to identify the critical science documents supporting each rule by
relying on a variety of people involved in developing the regulation to identify the
documents. We asked the primary contact to identify one or more additional
contacts from the following groups: scientists from the program office who
worked on the rulemaking; EPA Office of Research and Development (ORD)
scientists who provided expertise to the rulemaking team; the Senior Executive
from the program office immediately responsible for the rule; independent peer
reviewers; environmental stakeholders; and representatives of the regulated
community. With this range of perspectives on the critical science, if there was
wide consensus, we could be reasonably sure we had identified all the critical
science. It also would require less reliance on an evaluation team to make
independent judgments about what science was critical.
We started by identifying primary contacts for each rule (the people from the EPA
program office who led the rulemaking). After 6 weeks, we had identified 83
contacts (including the primary contact for each case study). We sent e-mails to
the contacts, asking each of them to identify 5 to 10 science references (e.g.,
papers, reports) that they believed most critically influenced the rulemaking and,
without which, the rules would have been substantively different. We asked that
they consider several categories of science that may be relevant to the rules. We
also asked - if they had the discretion, funds, and "20/20" hindsight - what
science gaps would they have tried to fill that might have made the rule
substantially different. Finally, we asked how they would rate the scientific
quality of the rule, on a scale of 1 (low) to 5 (high).
We received at least one detailed response for only 7 of the 15 case studies by late
November 2001, but we received no helpful responses from 58 of the contacts.
We received no useful e-mail responses from any of the EPA executives, peer
reviewers, environmental stakeholders, or representatives of the regulated
community. We then conducted interviews with primary contacts and other EPA
staff. The interviews yielded specific references and more general explanations of
how the rulemaking proceeded, and advice to look in the docket for reports on
particular studies. We were thus able to identify additional critical science
documents. Almost half of the critical documents we identified came from reading
the materials in the dockets. The pilot team identified all of the critical documents for
two of the rules (Cases 2 and 8). Because we were not able to interview peer
reviewers or stakeholders, and because we relied on only one or two Agency
contacts and the preambles and technical support documents developed by EPA,
our results may reflect an EPA bias on what science was critical.
Our process did not identify all of the critical science documents for all of the
rules. We believe, however, that for all of the case studies we identified the major
technical support documents that embodied the final process of gathering the
science and other information to support the rules (see Exhibit 2).
These documents had titles such as background information documents,
technical support documents, regulatory impact analyses, and economic impact
analyses. They were often, but not always, cited in the preamble. If the preamble
was the only place where the science was brought together, we identified the
preamble as a critical document.
We encountered more difficulty as we identified the critical documents cited in
the major support documents. These documents included other documents
developed specifically to support the rule (e.g., databases, regulatory methods,
health criteria documents), as well as documents from the primary science
literature (e.g., toxicology data supporting a criterion, models used to do an
analysis, or even the data or mechanisms necessary to the model itself). For
example, two of the rules involved the development of National Primary Drinking
Water Regulations (Cases 2 and 11). The standards under the regulations
(maximum concentrations of chemicals in the finished water) depend in part on
the risk to human health posed by the chemical. Health Criteria Documents
summarize and interpret the toxicology and epidemiology available in
the literature to arrive at various "criteria" values that put the heaviest weight on
(reliable) studies that show effects at the lowest chemical concentrations. These
underlying studies then become critical documents (there is usually more than
one per health criteria document because of the different types of effects, e.g.,
cancer and reproductive effects). Thirty-four of these underlying criteria
documents were identified for Cases 2 and 11 alone. Because the number of
critical documents increases exponentially as one goes backward through the
citation chain, locating all of them becomes a very time-consuming process.
We believe we identified all of the critical documents cited in the major technical
support documents for six of the rules ("level 2" documents in Exhibit 2). In
three of the case studies, we went into the references for this second group of
documents and identified the most critical of those references as "level 3"
documents in Exhibit 2 (e.g., the chemical mechanism upon which a component
of a model used to support a rule was based, or research that led to the
recognition of the problem that the rule was promulgated to address).
Exhibit 2 shows a wide range in the numbers of critical documents per case. Two
of the cases for which we believe we have a complete list of level 2 documents
include only 10 and 12 critical documents, while two of the cases for which we
believe the lists are incomplete include as many as 25. Adding level 3
documents also increases the number. We believe this wide range arises from a
combination of differences in the scientific complexity of the rules, which affect
the types of critical documents needed to support each rule, and our failure to
identify all of the critical documents for each rule.
These factors complicate the interpretation of the statistics for this or any full
study in the future. Because we did not identify all of the critical documents for
all the rules, comparative statistics on critical science documents (e.g., proportions
of critical science documents funded under grants, or performed by other federal
agencies) could be subject to bias, and interpretations based on such calculations
could be misleading. Overcoming this problem would present a challenge for
expanding the pilot study.
We identified the sources of the critical science
Once a critical science document was identified, we had to find a hard copy or an
electronic copy. We then examined it to determine who performed the science
work, who funded it, the funding mechanism used, and whether and how it was
peer reviewed. We categorized the resulting information for statistical analysis.
Who performed the research often was identified on the title page of reports, the
by-lines in journal articles, or in the acknowledgments (e.g., a report might state
that valuable contributions had been made by a company or organization).
Funding information (both who and what mechanism) often could be found on the
title page of reports or the acknowledgments section of journal articles. We found
that some documents had an EPA cover, but inside indicated that the document
had been prepared for EPA by a contractor. In those cases, we classified the
report as funded by EPA, but prepared by a private sector firm under contract. If
instead, the report only acknowledged substantial support from a contractor, we
classified the report as having been produced by EPA, with a contract as the
funding mechanism. The criteria for the classifications are more fully explained
in the Addendum.
Documents did not always address one or more of these characteristics. In
some cases, we were able to conclude that a program office funded a technical
support document (no other organization would have reason to fund it), or that the
only allowable or likely mechanism to support a scientific study was a contract.
When all else failed, we asked those who had responded to our inquiries, either
during the development of the case study or upon final review.
We asked the respondents about science gaps and science quality
At the outset of the pilot study, we had hoped to identify indicators of the
scientific quality of final rules that we could relate to the characteristics of the
critical science inputs. For example, did more rigorous peer review or the
extensive use of science from academic labs lead to "better" rules? We were
unable to identify any objective indicators, so at the suggestion of the Research
Strategies Advisory Committee, we asked respondents (some of whom were not
on our initial list of contacts) to: (1) identify any additional science that would
have been useful if it had been available ("gaps"), and (2) rate the science quality
of the rules on a scale of one to five, with five being the highest quality.
We identified the type of peer review undergone
by the critical science
Science used by EPA to support rules should be credible. Peer review is EPA's
preferred method of ensuring credibility. Peer review is the documented, critical
review of a work product, performed by experts who are independent of those
who developed the product. In a 1994 policy, EPA specifically required peer
review for scientifically and technically based work products intended to support
EPA decisions. EPA's December 1998 Peer Review Handbook expanded this
policy. Regulatory Management Guidance based on the 1998 Handbook was
issued by EPA's Office of Policy in June 1999, requiring that all rules undergoing
Final Agency Review must include either a statement that "no major scientific or
technical documents were utilized to support the rulemaking," or a statement of
compliance with peer review requirements. In a fact sheet related to this
guidance, the criteria identified for such documents are very similar to those for
significant rules in Executive Order 12866. The December 2000 Peer Review
Handbook for the first time required that the final work product itself must
describe the peer review to which it was subjected, or note that it was not peer
reviewed. A regulation itself is not subject to EPA's peer review policy, even
though the major scientific work products that support it are.
Peer review will become even more important as EPA implements recent
guidance from the Office of Management and Budget (OMB) developed to
comply with the Treasury and General Government Appropriations Act of Fiscal
Year 2001. The OMB "Guidelines for Ensuring and Maximizing the Quality,
Objectivity, Utility, and Integrity of Information Disseminated by Federal
Agencies" state:
As a general matter, in the scientific and research context, we regard
technical information that has been subjected to formal, independent,
external peer review as presumptively objective.
EPA's related guidelines, which became effective on October 1, 2002, are
consistent with many existing practices and policies, including the above
mentioned peer review policy, according to the Associate Administrator for
Policy, Economics, and Innovation.
Some documents indicated they had been peer reviewed, and by whom. Critical
documents in scientific journals were subject to the peer review policies of the
journals. For the many documents that did not indicate their peer review method
or status, we consulted EPA's Peer Review Database and asked respondents.
What We Learned About
the Science Behind the Rules
The rules included in the pilot study were not a representative statistical sample of
EPA rules, and we did not identify all of the critical science inputs for every rule.
However, we made observations that we believe transcend these limitations and
will be useful to EPA rulemakers:
~	Science played an important role in the rules, but that role was not always
clear.
~	Although critical science originated from a variety of sources, the private
sector was the most common source.
~	More data and fewer scientific "blind-spots" could reduce assumptions.
~	The critical science supporting the rules often was not independently peer
reviewed.
At the end of this section, we provide some suggestions that, based on our
observations, should help EPA improve its use of science in making rules.
Science played an important role in the rules,
but that role was not always clear
We found that science played an important role in the rules, but that role was not
always clear. Even though the rules included in this pilot study depended on
hundreds of individual scientific documents, the role of science was generally not
presented in a manner consistent with the norms of science, so it may be unclear
to the public which science was critical or why.
The role of science
We identified 452 critical science documents for the 16 rules in the pilot study.
Had the results in any of these documents been different, the rule might have
changed in terms of who was subject to the regulation, their cost of compliance,
or the risk to the public and the environment.
The number of documents used per case study ranged from 8 to 85 [2]. Even though
the 16 rules in the pilot are not a representative sample numerically, they are based
on the same statutes as those for EPA's other significant rulemakings in the 1990s
(Exhibit 1), so we believe the number of critical documents for these rules reflects
the importance of science in the formulation of EPA rules.
Some examples of critical documents follow.
~	Four of the rules in this pilot study set water quality standards for drinking
water under the Safe Drinking Water Act (Cases 2 and 11) or for discharges to
lakes and rivers under the Clean Water Act (Cases 6 and 10). Criteria
documents summarized and interpreted the available toxicology and
epidemiology to arrive at "criteria" values - concentrations deemed
"safe" for exposure to people and ecosystems. Other documents synthesized
the likelihood of exposure of people and ecosystems to pollutants, as well as
treatment costs. These data were used to set the enforceable standards. Had
any of the 186 critical science documents been different, it is reasonable to
expect that one of the enforceable standards could have been set at a different
number.
~	One rule involved setting emissions caps for nitrogen oxides (NOx) for 22
eastern States by demonstrating that the caps would significantly reduce
contributions to nonattainment of the NAAQS for ozone in downwind states
(Case 13). There was no requirement to assess the risk of nonattainment of
the NAAQS on health or welfare [3]. Instead, most of the 42 critical science
documents focused on establishing NOx emissions inventories and modeling
the chemistry and downwind transport of ozone and its precursors. This was to
show which of the States would be required to reduce emissions, and that the
proposed caps would significantly reduce nonattainment in the downwind
states. Without the inventories and modeling, there would be no scientific
basis for showing which states were significant contributors, and some of the
estimates would have been different. That could have resulted in some States
not being subject to the rule. We also identified critical science documents
without which it is reasonable to believe the section authorizing the rule might
not have been included in the Clean Air Act.
~	Several rules (Cases 1, 7, and 9) required that a particular technology be
used, or that an emissions limit equivalent to using that technology be achieved. These
rules, involving a total of 50 critical science documents, required that EPA
develop databases to characterize the universe of sources to be regulated
[2] Case 4, which had 85 critical documents, represents a cluster of 3 related significant rules on land disposal
restrictions that were finalized over a span of 5 years.
[3] The ozone NAAQS was the subject of a significant 1997 rulemaking.
(municipal waste combustors, municipal landfills, and pulp and paper mills,
respectively) and models to estimate emissions from these sources. Without
the data and models, decisions about which sources would not be regulated,
and the technology required for those that were regulated, would almost
certainly have been different. In the first two cases, EPA could have made an
administrative decision about whom to regulate, but the decision would likely have
been different. In the case of the mills, the law requires all sources to be regulated,
but emissions limits for new mills must be no higher than those at the best
existing mills, and standards for existing mills not at the "best" level also
needed upgrades. Without an emissions model and database, EPA would have
had no legal basis to set the standards.
Role of science not always clear
Even though science played a role in all of the rules in this pilot study, that role
may not always be clear because of the way it is presented in the
preambles. Science is communicated according to widely accepted professional
norms. For example:
~	The question to be answered is introduced, along with any previous scientific
results that the current study builds on. Such results are explicitly referenced.
~	The methods used in the study are described in sufficient detail so that the
study can be reproduced by others.
~	The results are presented in the form of graphs and tables (not just the data that
support the authors' conclusions), usually with estimates of statistical
uncertainty. Authors discuss how the data in each figure and table support and
do not support alternative answers, and reach carefully bounded conclusions.
However, we saw little evidence of any of these conventions in communicating the
scientific underpinnings of the rules in the preambles, which form the legal basis
for the rule. We found that the preambles did not consistently present the scientific
and technical questions to be answered in terms of exactly how the answers would
be used to support the rule. The methods were not presented in sufficient detail to
reproduce the studies, or to understand how the studies were done. Preambles for
only 5 of the 16 rules presented any data tables, and only one presented any
indication of uncertainty estimates.
Two of the preambles provided examples of good practices in the presentation of
data:
~	The preamble of the regional ozone rule (Case 13) describes the scientific
approach taken to modeling, and tabulates the results of the model runs in
terms of the quantitative contributions of upwind States to non-attainment of
the ozone NAAQS in each downwind State. The cost effectiveness of various
ozone control options is compared in two other tables.
~	The preamble to the disinfectant and byproduct rule (Case 11) also presents
data tables comparing compliance forecasts between the 1994 proposal and the
final rule, different systems costs, populations potentially exposed, and cancer
cases expected under different options. One figure shows the ranges of
estimated benefits, the only example of uncertainty seen in any of the
preambles, and others show how many households will incur different costs of
treatment.
The preamble of a rule regulating emissions of air pollutants from landfills
(Case 7) instead illustrates an opportunity lost.
~	The preamble explained that EPA decided to exclude 90 percent of the
smallest landfills from the regulation, and to require that best demonstrated
technology be applied to any of the larger landfills with estimated
emissions of NMOC (non-methane organic compounds, which
contribute to the formation of ozone) of more than 50 Mg/yr. The intent
was to reduce these emissions from all landfills by 53 percent at a particular
cost per ton removed. However, no data were presented showing the
estimated emissions from landfills of different size classes, the estimated
emissions reductions from each class at different control levels, the
corresponding costs, or any indication of the uncertainty in any of the
estimates (emissions estimates were based on a mathematical model rather
than actual measurements). Such a table would have made it clearer why
EPA had chosen the particular mix of landfill size and emissions caps in
the final rule, based on a combination of science and economics.
Many of the technical support documents contained data supporting decisions, but
for many of the rules in the pilot study, we observed no referencing or inconsistent
referencing of even the major critical science documents developed to support the
rule. Although most of the preambles were meticulously referenced to previous
Federal Register notices and case law, only six preambles referenced the science
underpinnings.
~	The municipal waste combustor rule (Case 1) cited seven documents at the
beginning of the preamble, but did not tie the technical arguments in the
preamble to the documents.
~	The nonroad diesel rule (Case 14) cited specific science reports using footnotes
and made some of the technical support documents available on the EPA web
site.
~	The disinfectant and byproduct rule (Case 11) was referenced in the manner of
a scientific paper, to a bibliography in a section in the preamble.
~	The pulp and paper rule (Cases 9 and 10) referenced documents by the docket
index number, but not by report title. Many of these documents were available
on-line, but the docket index was not, so one could not go directly from the
citation to the on-line document.
~	The biotechnology rule (Case 8) and the plant-incorporated protectant rule
(Case 15) both cited references by number, which corresponded to a reference
section in the preamble.
Finally, we found it difficult to determine the relative importance of science and
administrative discretion when preambles contained statements that began, "EPA
believes ..." or "EPA concludes... " In these cases, it was not always clear when
a decision was based on science or on administrative discretion and, if it was based
on administrative discretion, whether the science really mattered. Two quotes
from the regional ozone rule (Case 13), a rule with substantial scientific
underpinnings, are illustrative:
...these four jurisdictions rank among the six highest jurisdictions in the
OTAG (Ozone Transport Assessment Group) region in terms of NOx
emissions density. EPA believes that this high density provides an
appropriate basis for concluding that each of these four jurisdictions
should be included as a significant contributor.
Because no highly cost-effective controls are available to eliminate the
remaining amounts of NOx emissions, EPA concludes that those emissions
do not contribute significantly to downwind nonattainment or maintenance
problems.
The scientific basis for EPA's decision about which States to include as significant
contributors was air quality model runs that identified the degree to which upwind
States contributed to ozone formation episodes in downwind states. In these
quotations, it appears that EPA is arguing that emissions density, or even the
cost-effectiveness of controls, is an equally suitable criterion.
In summary, even though the rules included in this pilot study depended on
hundreds of individual scientific documents, because the role of science was not
presented consistently in the preambles in accordance with the norms of science, it
may be unclear exactly what science was critical and why. We could find no
explicit guidance on the presentation of scientific findings necessary to support
EPA's rules on the "Process Guidance" section of EPA's on-line Regulatory
Reference Library, but OPEI in its comments indicated that EPA's Risk
Characterization Guidelines may in part meet this need.
Although critical science originated from a variety of sources,
the private sector was the most common source
We determined (where possible) who performed the science, who funded it, and
the funding mechanism used for the critical science documents identified for the
15 case studies. For reasons explained earlier, we cannot generalize from the rules
in the pilot to EPA rulemaking overall. Nonetheless, we can still make some
useful observations about the roles played by the various organizations conducting
and funding critical science, and the funding mechanisms used for the rules in the
pilot. Minimum numbers are cited in recognition that we did not identify all
critical science documents in support of some of the rules.
Who performed the critical science?
As summarized in Table 2 and detailed in Exhibit 3, critical research was
performed by the private sector, EPA program offices, ORD, other Federal
agencies, and other (non-Federal) government organizations (such as States). We
counted work as performed by an organization if the report was under the
organization's cover and did not indicate it was prepared by a contractor for the
organization. If more than one organization performed the work, we counted each,
so the total exceeds the number of critical documents (i.e., 452).
Program offices or their contractors developed virtually all of the technical
support documents that made the scientific case for the rule
(Exhibit 3). Even for the two rules where State and Federal government agencies
worked on research teams to develop technical support documents (the Great Lakes
water quality guidance [Case 6] and regional ozone rule [Case 13]), the EPA
program offices developed the final technical support documents. EPA's ORD
contributed 28 critical documents, including health criteria documents, monitoring
methods, assessment protocols, engineering studies, and air quality models and
field studies.
Other Federal agencies contributed 24 critical studies to the rules in the pilot.
Almost half of these supported the Great Lakes water quality guidance, and
involved the National Oceanic and Atmospheric Administration and the U.S. Fish
Table 2: Who Performed the Critical Science

  Organization                        Documents
  Private sector                      238
  EPA in-house - program office       95
  Academia                            (illegible)
  EPA in-house - ORD                  28
  Other (non-Federal) government      25
  Other Federal agency                24
  Unknown                             5
and Wildlife Service, which have research laboratories in the Great Lakes region.
Other (non-Federal) government organizations contributed 25 critical studies, over
half of which involved the two rules in which interstate pollution issues were the
main focus (Great Lakes and regional ozone).
The large number (238) of critical science documents developed by the private
sector includes both reports contracted by EPA program offices and ORD, and
reports contracted by the regulated community (state and local governments, and
private industry). It also includes reports completed in-house by private industry.
Thus, the private sector was the most common source of the critical science behind
the rules.
Who funded the critical work?
As summarized in Table 3 and detailed in Exhibit 4, EPA program offices funded
the vast majority of the critical science documents. Other organizations (primarily
State governments and industry) funded 100 documents. Many industry
contributions involved data gathering to
support the various emissions rules (e.g.,
pulp and paper, reformulated gasoline,
and regional ozone). In some cases, the
regulated industry agreed with EPA
beforehand to a research or data
gathering strategy. ORD funded at least
85 critical science documents, including
criteria documents, early development of
the model used to support the regional
ozone rule, development or evaluation
of monitoring methods, and research
grants and cooperative agreements that
produced findings that proved critical.
Other Federal organizations, including
the National Institutes of Health,
National Science Foundation, Department of Energy, Department of Agriculture,
and Department of Interior, funded the smallest number.
Table 3: Who Funded the Critical Science

  Organization            Documents
  EPA program office      260
  Other                   100
  EPA ORD                 85
  Other Federal agency    31
  Unknown                 10

If more than one organization funded the work, we counted each, so the total
exceeds the number of critical documents (i.e., 452).

What mechanisms were used to fund the critical work?

As summarized in Table 4 and detailed in Exhibit 5, the most common funding
mechanism was a contract. Contracts were used to support critical science by all
funding organizations. Some reports or technical appendices were developed by
contractors and delivered as finished products (Table 2). In other instances,
contractors gathered data, ran models, conducted analyses, provided specific
expertise, or otherwise contributed substantial input to work products that were
ultimately authored by EPA program office or ORD staff. Of the 230 critical
documents provided through contracts, 212 were funded by an EPA program
office or ORD.

Internal EPA funding was used by program offices to develop at least 85 of the
technical support documents, including rule preambles. Since rulemaking is an
inherently governmental function, program personnel should exercise control over
the final application of science to the rule. The "Other" category primarily
included internal funding by other Federal and other governmental organizations,
particularly those who do much of their own research.

Table 4: Funding Mechanisms for Critical Science

  Mechanism               Documents
  Contract                230
  EPA in-house            85
  Other                   63
  Unknown                 41
  Grant                   25
  Cooperative agreement   15
  Interagency agreement   2

If more than one funding mechanism was used, we counted each, so the total
exceeds the number of critical documents (i.e., 452).
Grants and cooperative agreements by law cannot be used for the primary purpose
of securing goods or services for the government, so it was not surprising that we
identified them in only nine of the rules. More than half of these documents
funded under assistance agreements were published eight years or more before the
rule was finalized. Some of the documents were quite old. For example, an
epidemiology study done in 1950, long before passage of the Safe Drinking Water Act,
served as the basis for a Maximum Contaminant Level in the 1991 drinking water
standards on synthetic chemicals. This suggests that much of the science funded
by assistance agreements pays off many years in the future, and that it is not funded
specifically to support a rule, but to address the larger environmental problem that
was the target of the rulemaking.
More data and fewer "blind spots" could reduce assumptions
For 12 of the rules, respondents indicated additional science would have made their
rules better. On a scale of 1 to 5, with 5 being the highest, the respondents gave
the rules included in this pilot study quality scores ranging evenly between 3 and 5.
Therefore, we concluded that, in the view of the EPA respondents who worked on
the rules, these were good rules that could be even better if more science had been
available [4]. The most frequently expressed desires were for:
~ Data on emissions rates, characterization of regulated sources, and toxicity that
could lead to less uncertainty (9 rules).
~	Science to fill "blind-spots" (6 rules).
Based on the responses, we concluded that having more data would have resulted
in more efficient rules, because they would have required fewer conservative
assumptions. Scientific "blind spots" are areas where no body of scientific
research was available at the time of the rulemaking to adequately assess some of
the potential risks, or the particular risk was not anticipated at the time of
rulemaking. In most cases, there was a sense that while the rulemakers believed
EPA was doing a good job under the circumstances, the science and data were
being generated under undue pressure. The desire for additional scientific
information in so many of the rules included in this pilot study suggests this desire
may be common to many Agency rulemakings.
Critical science supporting the rules often was not
independently peer reviewed
A regulation itself is not subject to EPA's peer review policy, even though the
major scientific work products that support it are subject to peer review. Public
comments are taken on almost all regulatory actions, but according to the Peer
Review Handbook, public comment does not substitute for peer review. This is
because public comment does not necessarily draw the kind of independent, expert
information and in-depth analysis expected from a peer review. Nonetheless, we
were told by EPA staff members involved in 6 of the 15 case studies that their
documents did not require peer review because they had been subjected to public
comment. As noted in the Response to Comments, we acknowledge that the
guidance on this issue has been evolving over the decade in which the rules in the
case studies were finalized.
[4] These scores do not reflect input from peer reviewers, environmental organizations, or the regulated
community, and they should not be assumed to hold true for EPA's rules in general.
A large number (276) of the critical documents supporting the rules either were not
peer reviewed (144) or their peer-review status was indeterminate (132). Details
on peer review actions are summarized in Table 5 and Exhibit 6. Lack of peer
review, or of information about peer review, may cast doubt on the quality of
this science.

Table 5: Peer Review Actions

  Action                                                    Documents
  No peer review was done                                   144
  Unknown whether a peer review was done                    132
  Peer review was done by an external group in a
    non-public manner, such as a refereed journal or
    external experts hired by EPA                           119
  Peer review was publicly done by a Federal advisory
    committee, such as the Science Advisory Board or
    National Research Council                               29
  Peer review was done by some other public review
    process by external experts                             17
  Independent internal EPA review done, such as through
    the risk assessment forum or by having ORD review a
    program office's document                               11

EPA has a database - the Peer Review Product Tracking System - to track peer
review of its scientific and technical work products. It is a single repository for
product-specific peer review reporting and tracking, and uses a common reporting
form for all entries. Work products that were completed since 1991 should be
reported in one of four categories:
List A: Work products for which peer review was completed.
List B: Candidate work products for future peer review.
List C: Work products not subject to peer review.
List D: Scientific articles or reports by EPA staff that were peer reviewed
outside EPA.
This database should be the primary means of tracking the present, past, and future
peer review status of the critical science documents identified in the pilot.
However, we could find very few of these documents in this database. We
searched the database using combinations of titles and keywords for documents
supporting the 10 cases included in this pilot study with rules finalized since 1994.
We were able to find only 4 of the 272 critical documents for these rules:
~	Two technical support documents for the pulp and paper National Emission
Standard for Hazardous Air Pollutant (NESHAP) marked "peer review not
needed."
~	The primary modeling technical support document for the regional ozone rule,
listed as Category "C" (a non-major scientific/technical product), marked "peer
review not needed."
~	The BEIS2 model, a critical document in the regional ozone rule listed as
category "C," marked "peer review not needed."
The peer review status for these documents was signed off on as "complete" by the
requirements reviewer from ORD's Office of Science Policy. We also
searched on "DBP," "disinfection," and "chlorinated," and found several entries
related to the disinfectant and byproducts rule, including studies of by-products
resulting from ozonation; a 1994 regulatory impact assessment; engineering and
cost studies; and a study on cancer and chlorinated drinking water. However, these
entries did not correspond by title or date to any of the 59 critical documents
identified by the primary contacts and ORD scientists, so we were not sure whether
these documents ultimately were superseded by publication in different form or
later versions.
We concluded that: there was little correspondence between the entries in the Peer
Review Product Tracking System and the items in the docket; keyword searches
were not effective in identifying the important science behind the rules included in
this pilot study; and there was no consistency in the classification of items in the
database, either with respect to their importance, intended use, or need for peer
review. Therefore, we determined that oversight of peer review of the critical
science documents to support the rules included in this pilot study was limited and
ineffective.
Suggestions
We offer the following suggestions to the Associate Administrator for Policy,
Economics, and Innovation, and to EPA's Science Advisor, regarding science
behind EPA's rulemaking:
1. Consider presenting the scientific findings that support a rule in specific
sections of the preambles. These findings should be organized according to the
norms of science, in summary form, and indicate:
~	Why the science is required to support the rule, and how the results will
be used.
~	The methods used.
~	The important results (showing key data and their uncertainty).
~	Interpretation of the findings, and comparison with other studies that
appear to support or contradict the results.
~	Scientific referencing of underlying scientific and technical documents.
~	Provide a separate section of the preamble for issues of law,
policy, economics, and administrative discretion that do not depend on
the scientific findings.
2.	Focus more attention in the development phase of regulations on collecting
data and doing research to address "blind spots" to support rulemakings.
3.	Take advantage of EPA's information technology capabilities to:
~	Hotlink references in preambles to documents in the docket.
~	Link scientific and technical documents in the docket to the Science
Peer Review Database.
~	Link RAPIDS to the Science Peer Review Database.
~	Maintain through RAPIDS an inventory of all rules proposed and
finalized each year.
4.	Reinforce EPA's current peer review policy, ensuring that all EPA-generated
documents critical to significant and substantive rulemakings are independently
peer reviewed, and that the responses to the significant comments appear in the
documents.
Agency Comments
OPEI, the EPA organization responsible for oversight of the Agency's regulatory
activity, provided comments on the draft report, as did the Office of Water (OW);
the Office of Prevention, Pesticides and Toxic Substances (OPPTS); and the EPA
Science Advisor. The Office of Air and Radiation (OAR) and the Office of Solid
Waste and Emergency Response (OSWER) provided informal comments. Many
of the comments led to improvements in the clarity and factual accuracy of the
report. In general, the comments supported the suggestions in the draft report, but
identified both opportunities and concerns regarding details of their
implementation. The comments are included in their entirety in Exhibits 7-10, but
we summarize the main points in this section.
On behalf of EPA, OPEI commented:
The report does an excellent job of recognizing Agency institutional
mechanisms which ensure that regulations are based on sound science. The
role of peer review and the peer review process in the development of
credible science is discussed in depth, and the Office of Policy, Economics
and Innovation (OPEI) agrees with the heavy emphasis the report places on
the utility and importance of independent peer review. Not emphasized are
two other key "good science" processes: Analytic Blueprints and the Risk
Characterization Policy. The former was designed, in part, to ensure that
critical science needs are identified early in the process and developed in
time to inform regulatory decisions, and the latter requires that both the risk
assessment process and the risk analyses are transparent, clear, reasonable
and consistent. Taken together, these three existing mechanisms can assure
that:
•	Critical science is identified early, and developed in time to inform
decisions (Analytic Blueprint),
•	Critical science is of sufficient quality for regulatory decision making
(Peer Review Process),
•	The quality of the science and the associated uncertainty is clearly
described (Risk Characterization Policy).
Further, these three mechanisms appear to directly address three of the four
findings of your report, i.e., that critical science supporting the rules often
was not independently peer reviewed, that more data and fewer "blind
spots" could reduce assumptions, and that the role of science (was*} not
made clear. Your report "determined that the oversight of peer review of
the critical science documents to support the pilot rules was limited and
ineffective." Applying the same logic suggests that shortfalls in identifying
critical data needs, and the lack of transparency and clarity in science is due
to inefficiencies or limitations in the two Agency processes intended to
identify, develop and make critical science transparent.
The OPEI comments go on to address each of these issues. We have also
incorporated the significant comments from the other EPA Programs and the EPA
Science Advisor.
With regard to the presentation of science in the preambles, OPEI recommended
that we consider using EPA's risk characterization policy as a framework for
presenting the results and suggestions in the report. They said:
Some of the science supporting rulemaking deals with health and
environmental risks. EPA adopted its policy on "Risk Characterization" in
February 1992, via a memorandum from Henry Habicht, Deputy
Administrator, and an accompanying document, prepared by a cross-office
work group. The policy was reiterated and elaborated in the mid 1990s. At
its core, the policy states that significant risk assessments should:
•	Describe how the estimated risk is expected to vary across population
groups, geographic areas, or other relevant break-outs,
•	Describe the sources of uncertainties in the risk estimates, and quantify
them, to the extent possible, and
•	Explicitly identify the impact of science and data, as opposed to
policy choices, as the source of various elements of the risk assessment.
We have found that this standard has been followed in an incomplete
fashion in documents supporting regulations, as well as other EPA risk
assessments. The draft Office of the Inspector General (OIG) report refers
repeatedly to the second and third elements of EPA's Risk Characterization
Policy, both in describing its findings and in its recommendations. We
recommend that OIG examine this policy (in effect during most of the time
period covered by the pilot study), and use it as a framework for presenting
its results and suggestions.
The EPA Science Advisor strongly agreed:
The key issue is that the preamble should present a clear summary of the
science supporting the regulatory decision, including properly
characterizing risks and the supporting science for risk management. The
preamble should list the documents from which its science-based
statements are made and the docket should contain the complete record.
This would allow readers to refer to the source material, including the
original primary science documents referenced in the critical documents
(using "primary document" as traditionally used in the science community).
OPEI also brought up EPA's new Information Quality Guidelines, which were
recently implemented:
This suggestion is consistent with the Agency's efforts related to the use of
and dissemination of information covered by the new Information Quality
Guidelines (IQ Guidelines).... [The Act] direct[s] Federal agencies to:
•	adopt a basic standard of quality as a performance goal and take
appropriate steps to incorporate information quality criteria into agency
information dissemination practices; [and]
•	issue guidance for ensuring and maximizing the quality, objectivity,
utility, and integrity of information disseminated by the agency;
establish administrative mechanisms allowing affected persons to
obtain correction of information that does not comply with the
guidelines....
OPEI believes that a full implementation of the IQ Guidelines will improve
the Agency's performance related to its discussion regarding the use of
science in rulemakings. This is also an area where OPEI and ORD together
can develop more complete recommendations regarding the presentation of
scientific findings in preamble discussions. OPEI and ORD are both
increasing their presence in Agency rulemakings as a result of last year's
Task Force on Improving Regulation Development. OPEI believes that this
increased participation by ORD and OPEI analysts will improve the
attention to and discussion of the results of the underlying analysis,
including but not limited to science, used to support EPA regulations.
This discussion would be consistent with the IQ Guidelines,
existing policies such as the risk characterization policy, and some of the
key findings of your report.
Finally, OPEI suggested developing "Principles of Analytic Integrity":
Recently, the Administrator reaffirmed the "Princip[le]s of Scientific
Integrity" establishing clear and ethical standards that should govern the
conduct of scientific studies within the Agency. To date, there is no
parallel document establishing standards for the use of research in a policy
analytic setting. OIG may wish to recommend that such a document be
developed expanding on its recommendations for clarity of presentation,
etc. and drawing on other Agency guidelines such as The Guidelines for
Preparing Economic Analyses.
There was considerable comment from the program offices on exactly how the
science in the preambles should be presented.
OW commented:
We believe there is merit in the proposal to improve the consistency of the
presentation of scientific data and conclusions in regulatory preambles.
However, more work is needed to determine how to implement such a
proposal given the wide variety of types of regulations the Agency
develops. We also need to consider the impact on the cost of developing
rules and on the length of preambles.
OW also noted:
Depending on statutory requirements for a given rule, the optimal preamble
structure for communicating the role of science may be quite different.
Any recommendations to revise the format for rule preambles across the
Agency should be flexible and take this consideration into account. To
achieve the same objectives of the report, we recommend modifying the
recommendation to suggest that norms of science be applied consistently
throughout current preamble formats where science is discussed, in order to
improve the understanding of the scientific basis for rules.
OPPTS commented:
The preamble to the rulemaking is not, nor has it ever been, considered the
proper vehicle for communicating the science in the manner prescribed on
page 11 [page 12]. The proper vehicle for communicating the science in
that detail is in separate documents that are made available to the public as
part of the rulemaking docket, with a general description provided in the
preamble. The preamble must provide a layman's explanation of the basis
for the rulemaking, including the science, economic and technical analyses
and other considerations that informed the decisions represented in the
rulemaking.
The suggested addition of these science discussions in the preamble is cost
prohibitive and impractical.... most stakeholders consulted in 1994, when
we evaluated the level of detail, format, and function of the preamble as
part of the government wide streamlining initiative, indicated that they
prefer for the preamble to contain a succinct summary of the science,
economic and technical analyses and other considerations that went into the
rulemaking. This allowed those responsible for or interested in the
different disciplines to obtain a general understanding of all of these
considerations, as well as the details of the one of most interest to them.
Since the primary audience for the rulemaking is not the scientists,
including the detailed scientific information in the preamble would not
serve as an effective way to communicate the scientific information to the
primary audience.
The EPA Science Advisor agreed:
.... while Agency preambles should effectively communicate the scientific
underpinnings of the rules, the description of the professional norms for
such communication is not accurate. The norms as described accurately
reflect how scientists communicate in their primary documents, but not
how science is communicated in what is described as critical primary
documents.
OAR, while in broad agreement with the other comments, provided the following
caveat, "However, noting that the preamble to a rule may be the only source of
background data that our stakeholders read, it would seem appropriate to ensure
that a complete (albeit brief) discussion of the critical science is included in future
rulemakings as suggested..."
With respect to focusing more attention in the development phase of
regulations on collecting data and doing research to close "blind spots" to
support rulemakings, OPEI commented:
The purpose of an analytic blueprint is to identify research needs and guide
data collection and research studies during the development phase of
regulations. While a requirement for developing, updating, and following
an "analytic blueprint" has been a formal part of EPA's rule-making
process for more than a decade, it has been OPEI's experience that most
analytic blueprints are treated as little more than formalities. As a result of
last year's review and reassessment of EPA's rule-making process, OPEI
and the program offices are taking steps to make the blueprints more central
and relevant to actual rule-making decisions. We suggest that the OIG
report consider referring to the analytic blueprints as one means to achieve
the results desired in (this) suggestion.
None of the other comments spoke of the analytic blueprint process. However,
OW commented:
Many EPA regulations are based on years of research and data gathering by
EPA, other Federal agencies, academia, and industry. For example, we
have been working on the arsenic drinking water standard steadily since the
1970s. OW and other programs have extensive processes of joint planning
with ORD and outside stakeholders to anticipate information needs as
much as possible. Yet, there are always data gaps and uncertainties which
we must grapple with. This is in the nature of the rulemaking enterprise.
While the Pilot Study cited respondents who said they would have liked to
have had more data, it did not identify any particular ways of obtaining it
without increasing costs or slowing down action. Allocating resources to
closing "blind spots" means something else will not be done, and delaying
action means the status quo will continue.
OW also noted that:
The 1996 SDWA Amendments require EPA to use "the best available,
peer-reviewed science and supporting studies conducted in accordance with
sound and objective scientific practices" when setting drinking water
standards (sec. 1412 (b)(3)(A)). The US Court of Appeals for the District
of Columbia Circuit determined that Congress's intent was best available
evidence at the time of rulemaking.
They noted that otherwise, "it could also negatively impact the ability to meet
statutory deadlines."
The EPA Science Advisor commented that ORD is increasing its involvement in
the Agency's decision-making process.
With respect to the suggestion to take advantage of EPA's information
technology capabilities, OPEI commented about the characterization of the
RAPIDS data base and its capabilities:
RAPIDS tracks all substantive rulemakings appearing in the Semi-Annual
Regulatory Agenda as well as a number of actions not in the Agenda, such
as Reports to Congress, Policy Plans, etc. RAPIDS does not track every
non-substantive rulemaking (SIPs, SNURs, FIPs, State Approvals, etc.), but
a sister database to RAPIDS (Federal Register Tracking Database - FR
Dailies), also maintained by OPEI's Regulatory Management Staff (RMS),
tracks every EPA action sent to and published in the Federal Register.
These rules are not economically significant or normally reviewed by OMB
and therefore are classified as "not significant."
RAPIDS records go back a number of years (1996 forward) with some
rulemaking records from earlier years available. RAPIDS also tracks
NPRMs published in many of those same years. The Regulatory
Management Staff (RMS) has built numerous views in RAPIDS and has a
view (list) of rules finalized each year.
The report seems to confuse or not clearly differentiate between
"significant" rulemakings (those OMB reviews) and "economically
significant rulemakings" (economic impact of greater than $100 million per
year). RAPIDS separates out those rules identified as "economically
significant." This designation has only been in effect for rules in the Semi-
annual Regulatory Agenda as Priority "A" (Economically Significant) since
1995. Although for years before 1995, it is more difficult to clearly identify
economically significant rules, RAPIDS identifies 50 final rules as
economically significant for the years 1994-2001 and can produce lists of
economically significant rules published final for the years 1990 to the
present.
OPEI went on to say:
OPEI is currently evaluating and enhancing RAPIDS in order to improve
the management information that is available or potentially obtainable. To
date, RAPIDS has focused on tracking regulation development progress
and facilitating EPA's submission of its portion of the Semi-Annual
Regulatory Agenda to OMB. OPEI is interested in adding features that
enhance management accountability and improved performance metrics.
RAPIDS currently links to relevant guidance and policy documents. OPEI
will continue to improve RAPIDS and seek to take advantage of other
information technology capabilities over the next year. Much of this work
will be coordinated through the Regulatory Steering Committee or
Regulatory Policy Council. We will follow up with you over the next
several months to more fully understand these recommendations and
identify what specific changes or opportunities we can adopt.
OW commented, "We support the report's third recommendation to make better
use of the Agency's information technology capabilities. Consistent use of these
tools throughout the rulemaking process will improve communication and access
to the critical scientific support documents," and that, "We would support an effort
to identify and implement ways to improve the information the Agency makes
available on rulemaking."
OAR commented that:
The value of the new Science Inventory Database is obvious. An up-to-
date, searchable system (as is in place currently) is a valuable tool when
researching the science behind rulemaking.... [but that] the Science
Inventory database was designed to be a "data-lean" system which provides
enough information to direct the reader to the correct source for more
details; it was not designed to be the repository of all information related to
critical documents, especially those not issued by EPA. Whether or not this
system should be linked to other databases should be the subject for the
Science Inventory Work Group to consider.
OPEI commented on reinforcing EPA's current peer review policy ensuring that
all EPA-generated documents critical to significant and substantive rulemakings
are independently peer reviewed, and that the responses to the significant
comments appear in the documents:
OPEI fully supports this recommendation on peer-review of critical
documents and in fact has recently extended this peer-review policy to
include economic analyses. OPEI is working closely with the Agency's
Program Offices to ensure that a full review of supporting economic
analyses for all economically significant rules occurs prior to the rule's
submission to OMB. In this way, the application of sound and consistent
economic practices is ensured and the Agency's position on the use of
sound science strengthened.
With respect to the Agency's peer review policy, OW commented:
We support the report's fourth suggestion to reinforce the Agency's peer
review policy. Because the current policy does not explicitly require peer
review, it may be appropriate to recommend updating the policy to require
peer review in certain situations to ensure it is applied more consistently
across the Agency.
OW further commented:
EPA's Peer Review Policy was first issued in 1992, after some of the rules
considered in the Pilot Study. Full implementation has taken time and
continuous effort. Thus, it would not be surprising that compliance was
limited in the earlier period, but we would hope that it had been improving
as we approach the present time. Unfortunately, the report does not present
information on peer review performance over time, so we cannot tell
whether this has happened.
OSWER also commented on the changing peer review guidance over the course of
the study, and urged that we appropriately caveat that fact in the summary of the
report. In fact, they questioned whether the observations on peer review were
meaningful, given that we did not compare the peer review status of the documents
with the policy then in effect.
The EPA Science Advisor commented, "I am concerned about the OIG's finding....
that critical science supporting the rules often was not peer reviewed. I plan to
review the Agency's progress in implementing its Peer Review Policy during the
coming year."
On a more general note, the EPA Science Advisor also stated:
.... the Administrator named me to serve as the Agency's Science Advisor.
I take this role very seriously and plan to make important strides in ensuring
that Agency decisions are based on sound science, and that science is
presented and characterized properly in our rules and other important
documents.
Response to Agency Comments
The OPEI, the program offices, and the EPA Science Advisor were in general
agreement with our suggestions, but expressed some concerns about details
regarding their implementation. Even better, it appears that the mechanisms are in
place, and some steps have been taken, to make substantial progress in
implementing them. We believe that the observations in the report may serve as a
baseline against which progress can be charted.
We have incorporated many of the specific Agency comments directly into the
report and its Addendum to improve their clarity and factual accuracy. We made
corrections in several of the case studies that led to small changes in the overall
statistics on the critical documents, but they did not significantly alter the
qualitative observations or the suggestions. To simplify the report, we dropped the
distinction between primary and secondary documents that appeared in the draft
report. We also added an explanation of the different levels (one, two, and three)
of documents we reviewed.
Several questions were raised about the treatment of economics in the pilot study.
We had considered dealing with economics as thoroughly as the biological and
physical sciences in the pilot study. Initial perusal of the primary economic studies
(e.g., cost-benefit analyses and regulatory impact analyses) tended not to reveal
many citations from the primary literature. Consequently, we only included critical
economics documents when they obviously had an impact on the rule (i.e., using
the same criteria used for critical science documents) and did not cite any of the
references contained therein. We agree with the OPEI comments that economic
science is as critical as the physical, biological, and engineering sciences, and refer
the reader to a recent report, Economic Analyses at EPA: Assessing Regulatory
Impact5, that includes analyses of two of the rules (case studies 5 and 6) in the pilot
study.
Rather than making changes in the suggestions based on the comments, given the
generally positive responses to them, we have chosen instead to view the Agency
responses as a road map toward acting on them. In that spirit, we are responding
more specifically to the comments on the four suggestions.
We understand that preambles (as the embodiment of what is essentially a legal
process) cannot take on the appearance of a science journal, or be extended by
many pages to provide extensive graphs and data tabulations. Well-organized,
well-referenced, and peer reviewed technical support documents that carry the
weight of the scientific underpinnings of the rule are suitable for this task. We do
believe, however, that the critical scientific underpinnings of EPA's rules should
be explained, in plain English, in terms of the methods used to gather data, the
results obtained, and the applicability and uncertainty associated with their
application to the rule. There are examples of good practices in communicating the
scientific basis for regulations in many of the preambles of the rules in the pilot
study, and we encourage OPEI to work with the programs and the Science Advisor
to bring all preambles up to the highest standard possible. Referencing of this
science (including economics) should be as careful as the legal referencing in the
preambles.
Also related to presentation of the role of science, we agree with the Agency
comments that effectively implementing the Risk Characterization Guidelines
should improve the explanation of the application of science to regulatory
decisions. We would add that even though the Guidelines focus on risk
assessments, the principles apply as well to science and technology applications
that do not involve risk (e.g., establishment of the maximum achievable control
technology for the Pulp and Paper NESHAP, or even the adoption of monitoring
technology in the Acid Rain regulation). We agree that the new Information
Quality Guidelines should have a positive influence on the application of science to
regulations. We urge OPEI and the Science Advisor to pursue the concept of
developing a "Principles of Analytic Integrity" document, as suggested by OPEI.
5 Morgenstern, R. [Ed.]. 1997. Resources for the Future: Washington, DC.
We agree with OPEI that the regulatory blueprint represents an opportunity to
identify and close science and data gaps during the relatively short period between
the time a rulemaking is initiated and the final rule is proposed. Many of the rules
in the pilot study demonstrate how much can be accomplished during this period.
We urge OPEI and the Science Advisor to ensure that regulatory blueprints are
"more than formalities" in the future, and that they become central to identifying
the scientific data and analyses needed to support the regulation, and to plan and
ensure their independent peer review. The statistics on the critical science
documents funded under assistance agreements (grants and cooperative
agreements) suggest that Requests for Assistance must be planned 5-8 years in
advance of proposed rulemakings, which may be too late for regulatory blueprints.
ORD's multi-year planning process must take this time lag into account in
planning research that may be supportive of rulemaking in the future. We also note
that it was not just science gaps, but data gaps, that were highlighted by many of
the contacts in the pilot study. Monitoring data were often as important as research
in supporting the rulemakings in the pilot study (e.g., Cases 1, 3, 7, 9, and 13).
We are encouraged that OPEI has made improvements to RAPIDS since we began
the pilot study, and that several of the programs agree that integrating and linking
EPA's databases on regulation, science, and peer review would be helpful. We
have one caveat, however. OPEI commented that the FR Dailies database now
allows identification of all EPA rules, significant and otherwise, and that RAPIDS
now identifies 50 economically significant rules (greater than $100 million/yr)
finalized since 1994. We had identified only 37 in the draft report. We checked
the list in the pilot against RAPIDS, and based on the information in the preambles
relating to Executive Order 12866, we determined that we failed to identify one of
the rules in Exhibit 1 as economically significant, and one more was questionable
(the expected impact was $99 million the first year to the economy and $1 million
to EPA, and $50 million/yr thereafter). We changed the listings in Exhibit 1 to
reflect these errors. However, we determined that 11 of the rules in RAPIDS were
not economically significant, and that two economically significant rules were
missing for this period. This reflects the human factor in information management
- information management systems are only as good as the quality of the data that
are input by the people who use them. We encourage the Agency teams
developing and integrating RAPIDS, the Science Inventory, and the Peer Review
Databases to not lose sight of this critical fact.
Finally, we are encouraged that OPEI, the programs, and the EPA Science Advisor
are in agreement about the need for more consistent independent peer review.
Three of the programs raised the point that EPA's peer review guidelines were in
flux during the ten years of rulemaking covered by the pilot study, and that we
should have taken that fact into account in interpreting the peer review statistics.
One program office even questioned whether the observations about peer review
had any validity, under those circumstances. It was beyond the scope of the pilot
study to compare Agency practice with then-current guidance, and we made no
such observation. Rather, the statistics should be seen as a baseline against which
progress may be measured. The EPA Science Advisor commented, "I am
concerned about the OIG's finding.... that critical science supporting the rules often
was not peer reviewed. I plan to review the Agency's progress in implementing its
Peer Review Policy during the coming year." We encourage the Science Advisor
in his efforts regarding peer review and his commitment to ensure that Agency
decisions are based on sound science.
Lessons Learned
We conducted the pilot study to determine whether a full study could provide
answers to the questions in the introduction to this report and, if so, the level of
resources required. Proceeding with a full study would be resource-intensive. We
had intended for the pilot study to be completed in four months with less than half
a staff year of effort. Because we were not able to get timely responses to our
e-mail queries, and because it proved harder than we expected to determine funding
sources and peer review, the field phase of the pilot took 10 months and 1.5 staff
years, and we were still unable to identify all of the critical science documents and
their corresponding data for the 14 rules. For those reasons, we do not intend to
pursue a full study at this time.
However, if such a study were to be pursued at some future date, we believe:
•	Developing a list frame for a target population of substantive or minor rules is
not straightforward given the current capabilities of RAPIDS. We were not able
to confirm that it would be possible to easily identify, using RAPIDS, a target
population of rules, either current or past, on which to conduct any future
studies. An alternative would be to draw sample rules out of RAPIDS (or the
Federal Register for older rules). This approach would have to be pilot tested.
•	There should be strict decision criteria for defining critical documents, and
there should be periodic group review and agreement about which documents
meet the criteria.
•	A decision should be made about how far back in the decision process for a
rule one can go before one can no longer determine with confidence that a
science document was critical, and the review should go back at least that far.
•	Interviews should be conducted with all parties involved with the rulemaking.
Special effort should be made to interview peer reviewers and stakeholders.
E-mail is not an effective mechanism to elicit detailed responses.
•	There should be follow-up interviews with all respondents, asking them to
confirm preliminary information about each document.
•	There should be at least one research scientist on the team to facilitate
identifying critical documents.
•	There should be an advisory committee of scientists who understand both
research and rulemaking, to assist the review team.
•	All the pertinent documents applicable to rulemaking during the period covered
should be reviewed, including the preamble to the proposed rule. Critical
science identified in the proposed rule was not always cited in the preamble to
the final rule, or in the major technical support documents.
•	A different method would need to be devised to address the original questions
regarding research planning and rulemaking outcomes.
Exhibit 1
Significant Rules Finalized - 1990-2001
= Rule in pilot study	$ = Rule is significant because of its economic impact
1990
RCRA
{Land Disposal Restrictions for "Third Third" Schedule Wastes ]55 FR 22520 }$
1990
CAA
[Volatility Regulations for Gasoline and Alcohol Blends Sold in Calendar [55 FR 23658 [$
[Years 1992 and Beyond [ [
1991
SDWA jNPDWR (National Primary Drinking Water Regulations): Synthetic ;56 FR 3526 |$
[Organic Chemicals and Inorganic Chemicals; Monitoring for i [
[Unregulated Contaminants; NPDWR Implementation; NPDWR ! I
[Regulations i . . . I
1991
RCRA
jSoiid Waste Disposal Facility Criteria ]56 FR 50978 ]$ ]
1991
CAA
[Standards of Performance for New Stationary Sources; Municipal [56 FR 5488 i$
{Waste Combustors l l
1991
[CAA
{Tier 1 Light-Duty Tailpipe Standards and Useful Life Requirements ]56 FR 25724 ]$
1992
[CAA
{Operating Permits Regulations Title V of the Clean Air Act ]57 FR 32250 {$
1993
CAA
[Accelerated Phaseout of Class I & II Ozone Depleting Substances and [58 FR 65018 [$
{Methyl Bromides j ]
1993
| CAA
[Acid Rain Permits, Allowance System, Emissions Monitoring, Excess [58 FR 3590 [$
[Emissions and Appeals Regulations Under Title IV of the Clean Air j j
{Amendments of 1990 j {
1993 ICAA
[Conformity of General Federal Actions to State Implementations Plans [58 FR 63214 [$
1993 [CAA
	1		
« I 4
[Control of Air Pollution from New Motor Vehicles and New Motor [58 FR 9468 [$
[Vehicle Engines, Regulations Requiring on-Board Diagnostic Systems j j
{on 1994 and Later Model Year Light-Duty Vehicles ] {
1993
[CAA
[Evaporative Emission Regulations for Gasoline-Fueled and Methanol- [58 FR 16002 [$
[Fueled Light Duty Vehicles, Light-Duty Trucks, and Heavy Duty [ [
[Vehicles [ [
1993
[CWA
[Oil and Gas Extraction Point Source Category, Offshore Subcategory, [58 FR 12454 [$
[Effluent Limitations Guidelines and New Source Performance j [
[Standards [ i
1993
[RCRA
{Corrective Management Units (CAMU) {58 FR 8658 i
1994
CAA
ICAA
[Control of Air Pollution: Determination of Significance for Nonroad [59 FR 31306 !
[Sources and Emission Standards for New Nonroad Compression- j I
jlgnition Engines at or above 37 Kilowatts ] {
1994
[Fuel and Fuel Additives: Standards for Reformulated Gasoline [59 FR 7716 {$
1994
Ircra
i	
[Land Disposal Restrictions - Phase II - Universal Treatment Standards, [59 FR 47982 [$
[and Treatment Standards for Organic Toxicity Characteristics [ i
1994
jCAA
• » *
[List of Substances and Threshold Quantities for Accidental Release [59 FR 4478 [$
{Prevention j ]
1994 jCAA
	A	
[NESHAP; Source Categories: Organic Hazardous Air Pollutants from [59 FR 19402 l$
[the Synthetic Organic Chemical Manufacturing Industry (SOCMI) and I \
[Other Processes Subject to the Negotiated Regulation for Equipment \ \
jLeaks ] {
35
Report 2003-P-00003

-------
Year | Act | Title | Citation | $
1994 | CAA | On-Board Control of Refueling Emissions from Light Duty Vehicles and Light Duty Trucks | 59 FR 16262 |
1994 | CAA | Organic Air Emission Standards for Tanks, Surface Impoundments, and Containers at Hazardous Waste Treatment, Storage, and Disposal Facilities and Hazardous Waste Generators | 59 FR 62896 | $
1994 | CAA | Renewable Oxygenates for Reformulated Gasoline | 59 FR 39258 |
1994 | EPCRA | Toxic Chemical Release Reporting, Community Right-to-Know | 59 FR 61432 |
1995 | CAA | National Emission Standards for Chromium Emissions From Hard and Decorative Chromium Electroplating and Chromium Anodizing Tanks | 60 FR 4948 |
1995 | CAA | Emission Standards for Marine Tank Vessel Loading Operations | 60 FR 48388 | $
1995 | CAA | New Source Performance Standard (NSPS): Municipal Waste Combustion - Phases II and III (Large Units) | 60 FR 65381 | $
1995 | CAA | Ozone Transport Commission: Emission Vehicle Program for Northeast Ozone Transport | 60 FR 4712 | $
1995 | CWA | Water Quality Guidance for Great Lakes System | 60 FR 15366 | $
1995 | CAA | NESHAP: Petroleum Refineries | 60 FR 43244 |
1995 | CWA | Water Quality Standards for San Francisco Bay and Delta | 60 FR 4664 |
1996 | CAA | Acid Rain Program: Phase II Nitrogen Oxides Reduction Program | 61 FR 67112 | $
1996 | CAA | Control of Emissions of Air Pollution: Emission Standards for Gasoline Spark-Ignition and Diesel Compression-Ignition Marine Engines | 61 FR 52087 | $
1996 | CAA | Federal Test Procedure for Emissions From Motor Vehicles and Motor Vehicle Engines: Review | 61 FR 54851 | $
1996 | RCRA | Land Disposal Restrictions - Phase III: Decharacterized Wastewaters, Carbamate Wastes, and Spent Aluminum Potliners | 61 FR 15566 | $
1996 | CAA | NSPS: Municipal Solid Waste Landfills Amendments | 61 FR 9905 |
1996 | CAA | Regulation of Fuel and Fuel Additives: Certification Requirements for Deposit Control Additives | 61 FR 35310 | $
1996 | TSCA | Lead: Requirements for Disclosure of Known Lead-Based Paint and/or Lead-Based Paint Hazards in Housing | 61 FR 9064 |
1996 | TSCA | Lead: Requirements for Lead-Based Paint Activities in Target Housing and Child-Occupied Facilities | 61 FR 45788 |
1996 | CAA | Risk Management Program for Chemical Accidental Release Prevention | 61 FR 31668 | $
1996 | CWA | Oil and Gas Extraction Point Source Category: Final Effluent Limitations Guidelines and Standards for the Coastal Subcategory | 61 FR 66085 |
1996 | CAA | SO2 NAAQS Review and Implementation Plan | 61 FR 25566 |
1997 | CAA | National Ambient Air Quality Standards for Particulate Matter | 62 FR 38651 | $
1997 | CAA | Compliance Assurance Monitoring Rule (Previously Enhanced Monitoring Rule) | 62 FR 54900 |
1997 | CAA | Hospital/Medical/Infectious Waste Incinerators | 62 FR 48348 | $
1997 | CAA | NAAQS: Ozone | 62 FR 38856 | $
1997 | CAA | Protection of Stratospheric Ozone (Motor Vehicle Air Conditioners) | 62 FR 68026 |
1997 | TSCA | Microbial Products of Biotechnology | 62 FR 17910 |
1997 | CAA | Transportation Conformity Rule Amendments: Flexibility & Streamlining | 62 FR 43780 |
1997 | CAA | Control of Air Pollution From New Motor Vehicles and New Motor Vehicle Engines: Voluntary Standards for Light-Duty Vehicles | 62 FR 31191 | $
1998 | CAA | Control of Emissions of Air Pollution From Nonroad Diesel Engines | 63 FR 56967 | $

-------
Year | Act | Title | Citation | $
1998 | CAA | Finding of Significant Contribution and Rulemaking for Certain States in the Ozone Transport Assessment Group (OTAG) Region for Purposes of Reducing Regional Transport of Ozone | 63 FR 57355 | $
1998 | CAA, CWA | Integrated NESHAP and Effluent Guidelines: Pulp and Paper | 63 FR 18504 | $
1998 | CAA | Locomotive Emission Standards | 63 FR 18977 |
1998 | RCRA | Land Disposal Restrictions Phase IV: Final Rule Promulgating Treatment Standards for Metal Wastes and Mineral Processing Wastes; Mineral Processing Secondary Materials and Bevill Exclusion Issues; Treatment Standards for Hazardous Soils, and Exclusion of Recycled Wood Preserving Wastewaters | 63 FR 28555 |
1998 | SDWA | NPDWR: Stage 1 Disinfectant/Disinfection By-Products Rule | 63 FR 69389 |
1998 | TSCA | Lead: Requirements for Hazard Education Before Renovation of Target Housing | 63 FR 29908 |
1998 | TSCA | Disposal of Polychlorinated Biphenyls (PCBs) | 63 FR 35384 |
1998 | SDWA | NPDWR: Interim Enhanced Surface Water Treatment | 63 FR 69477 | $
1998 | CWA | Pharmaceutical Manufacturing Category Effluent Limitations Guidelines, Pretreatment Standards, and New Source Performance Standards | 63 FR 50387 |
1998 | SDWA | NPDWR: Consumer Confidence Reports | 63 FR 44511 |
1999 | CAA | Findings of Significant Contribution and Rulemaking on Section 126 | 64 FR 28250 |
1999 | CWA | National Pollution Discharge Elimination System (NPDES) Phase II Regulations | 64 FR 68722 |
1999 | CAA | [title illegible] | 64 FR 28564 |
1999 | EPCRA | TRI: Reporting Threshold Amendment for Certain Persistent and Bioaccumulative Toxic Chemicals | 64 FR 58666 |
1999 | CWA | Underground Injection Control Regulations for Class V Injection Wells, Revision | 64 FR 68545 |
1999 | CWA | NPDES: Regulations for Revision of the Water Pollution Control | 64 FR 68721 |
1999 | CWA | Water Quality Standards: Establishment of Numeric Criteria for Priority | 64 FR 61181 |
1999 | CWA | NPDES Permit Application Requirements for Publicly Owned Treatment Works and Other Treatment Works Treating Domestic Sewage | 64 FR 42433 |
2000 | CAA | Control of Emissions of Air Pollution from 2004 and Later Model Year Heavy-Duty Highway Engines and Vehicles: Revision of Light-Duty Truck Definition | 65 FR 59895 |
2000 | CAA | Nonroad Spark-Ignition Engines At or Below 19 Kilowatts (25 Horsepower) (Phase 2) | 65 FR 24267 |
2000 | CAA | Protection of Stratospheric Ozone: Incorporation of CAA for Reduction in Controlled Substances | 65 FR 70795 |
2000 | RCRA | Hazardous Waste Management System - Identification & Listing - Chlorinated Aliphatics Production Wastes; Land Disposal Restrictions; CERCLA | 65 FR 67067 |

-------
2000 | CWA | Effluent Limitations Guidelines, Pretreatment Standards, and New Source Performance Standards for the Centralized Waste Treatment Point Source Category | 65 FR 81241 |
2000 | CAA | Tier II Light-Duty Vehicle and Light-Duty Truck Emission Standards and Gasoline Sulfur Standards | 65 FR 6698 |
2000 | SDWA | NPDWR: Radionuclides | 65 FR 76707 |
2000 | CWA | Effluent Limitations Guidelines, Pretreatment Standards, and New Source Performance Standards for the Transportation Equipment Cleaning Point Source Category | 65 FR 49665 |
2000 | CWA | Revisions to the Water Quality Planning and Management Regulation and Revisions to the NPDES Program in Support of Revisions to the Water Quality Planning and Management Regulation | 65 FR 43585 |
2001 | CAA | Heavy-Duty Engine Emission Standards & Diesel Fuel Sulfur Control Requirements | 66 FR 5002 |
2001 | TSCA | Identification of Dangerous Levels of Lead Pursuant to TSCA 403 | 66 FR 1205 |
2001 | CAA | NESHAP: Chemical Recovery Combustion Sources at Kraft, Soda, Sulfite and Stand-Alone Semichemical Pulp Mills | 66 FR 3179 | $
2001 | CWA | Further Revisions to the CWA Regulatory Definition of Discharge of Dredged Material | 66 FR 4549 |
2001 | SDWA | NPDWR: Arsenic and Clarifications to Compliance and New Source Contaminants Monitoring | 66 FR 6975 |
2001 | CWA | Effluent Limitations Guidelines and New Source Performance Standards for the Oil and Gas Extraction Point Source Category; OMB | 66 FR 6849 |
2001 | SDWA | NPDWR: Filter Backwash Recycling Rule | 66 FR 31085 |
2001 | CWA | NPDES: Regulations Addressing Cooling Water Intake Structures for New Facilities | 66 FR 65256 |
2001 | RCRA | Hazardous Waste Management System: Identification and Listing of Hazardous Waste: Inorganic Chemical Manufacturing Wastes; Land Disposal Restrictions for Newly Identified Wastes; and CERCLA Hazardous Substance Designation and Reportable Quantities | 66 FR 58257 |
2001 | RCRA | Amendments to the Corrective Action Management Unit Rule | [citation illegible] |

-------
Exhibit 2
Numbers and Completeness of
Critical Documents
Rule | Number of Critical Documents | How Complete Were the Critical Documents?
1. Municipal Waste Combusters | 15 | Some level 2
2. Synthetic Chemicals Monitoring | 49 | All level 2 and some level 3
3. Acid Rain Permits | 12 | All level 2
4. Land Disposal Restrictions | 85 | All level 2
5. Reformulated Gasoline | 19 | Some level 2
6. Great Lakes Water Quality | 64 | All level 2
7. Municipal Solid Waste Landfills | 25 | Some level 2
8. Biotechnology | 25 | Some level 2
9. Pulp and Paper (Air) | 10 | All level 2
10. Pulp and Paper (Water) | 14 | Some level 2
11. Disinfectants and Byproducts | 59 | Some level 2
12. Polychlorinated Biphenyls | 10 | Some level 2
13. Regional Ozone | 42 | Some level 2
14. Nonroad Diesel Engines | 8 | [illegible]
15. Plant-Incorporated Protectants | 15 | [illegible]
TOTALS | 452 |

Level 1 = Major support document (e.g., regulatory impact statement)
Level 2 = Document referenced in the major support document
Level 3 = Document referenced in a level 2 document

-------
Exhibit 3
Who Performed the Science Work
Year | Act | Rulemaking Title | AC | IP | IO | OF | OG | PS | U | Total
1991 | CAA | Municipal Waste Combusters | 0 | 6 | 1 | 0 | 1 | 9 | 0 | 17
1991 | SDWA | Synthetic Chemicals Monitoring | 4 | 7 | 5 | 3 | 2 | 27 | 3 | 51
1993 | CAA | Acid Rain Permits | 0 | 2 | 1 | 0 | 0 | 10 | 0 | 13
1994 | RCRA | Land Disposal Restrictions | 0 | 4 | 0 | 0 | 0 | 81 | 0 | 85
1994 | CAA | Reformulated Gasoline | 3 | 5 | 0 | 0 | 1 | 10 | 0 | 19
1995 | CWA | Great Lakes Water Quality | 21 | 14 | 5 | 12 | 5 | 11 | 0 | 68
1996 | CAA | Municipal Solid Waste Landfills | 1 | 3 | 0 | 0 | 1 | 21 | 0 | 26
1997 | TSCA | Biotechnology | 12 | 7 | 0 | 2 | 0 | 5 | 0 | 26
1998 | CAA | Pulp and Paper (Air) | 0 | 3 | 0 | 0 | 0 | 7 | 0 | 10
1998 | CWA | Pulp and Paper (Water) | 0 | 9 | 0 | 0 | 0 | 5 | 0 | 14
1998 | SDWA | Disinfectants and Byproducts | 16 | 7 | 11 | 4 | 6 | 19 | 2 | 65
1998 | TSCA | Polychlorinated Biphenyls | 0 | 4 | 1 | 0 | 0 | 5 | 0 | 10
1998 | CAA | Regional Ozone | 6 | 10 | 4 | 3 | 7 | 20 | 0 | 50
1998 | CAA | Nonroad Diesel Engines | 1 | 4 | 0 | 0 | 1 | 5 | 0 | 11
2001 | FIFRA | Plant-Incorporated Protectants | 4 | 10 | 0 | 0 | 1 | 3 | 0 | 18
Total Number | | | 68 | 95 | 28 | 24 | 25 | 238 | 5 | 483

AC: Academia
IP: EPA in-house - Program Office
IO: EPA in-house - ORD
OF: Other Federal agency
OG: Other (non-Federal) government entity
PS: Private sector
U: Unknown

-------
Exhibit 4
Who Funded the Science Work

Year | Act | Rulemaking Title | PO | ORD | OF | O | U | Total
1991 | CAA | Municipal Waste Combusters | 12 | 5 | 0 | 2 | 0 | 19
1991 | SDWA | Synthetic Chemicals Monitoring | 21 | 21 | 3 | 4 | 1 | 50
1993 | CAA | Acid Rain Permits | 12 | 8 | 0 | 0 | 0 | 20
1994 | RCRA | Land Disposal Restrictions | 84 | 1 | 0 | 0 | 0 | 85
1994 | CAA | Reformulated Gasoline | 11 | 0 | 0 | 7 | 1 | 19
1995 | CWA | Great Lakes Water Quality | 14 | 9 | 17 | 26 | 2 | 68
1996 | CAA | Municipal Solid Waste Landfills | 12 | 5 | 0 | 7 | 1 | 25
1997 | TSCA | Biotechnology | 11 | 5 | 2 | 5 | 4 | 27
1998 | CAA | Pulp and Paper (Air) | 5 | 0 | 0 | 5 | 0 | 10
1998 | CWA | Pulp and Paper (Water) | 10 | 1 | 0 | 4 | 0 | 15
1998 | SDWA | Disinfectants and Byproducts | 22 | 16 | 4 | 21 | 1 | 64
1998 | TSCA | Polychlorinated Biphenyls | 9 | 1 | 0 | 0 | 0 | 10
1998 | CAA | Regional Ozone | 16 | 13 | 4 | 17 | 0 | 50
1998 | CAA | Nonroad Diesel Engines | 7 | 0 | 0 | 1 | 0 | 8
2001 | FIFRA | Plant-Incorporated Protectants | 14 | 0 | 1 | 1 | 0 | 16
Total Number | | | 260 | 85 | 31 | 100 | 10 | 486

PO: EPA Program Office
ORD: EPA Office of Research and Development
OF: Other Federal Agency
O: Other
U: Unknown

-------
Exhibit 5
Funding Mechanisms Used
[Table values illegible in the source. The exhibit reported, for the same 15 case-study rules listed in Exhibits 3, 4, and 6, the number of critical documents produced under each funding mechanism. The legible portion of the totals row reads: 25, 15, 230, 85, 63, 41.]

G: Grant
CA: Cooperative Agreement
IAG: Interagency Agreement
C: Contract
I: EPA in-house
O: Other
U: Unknown

-------
Exhibit 6
Peer Review Actions Taken
Year | Act | Rulemaking Title | N | U | FACA | OEP | ENP | I | Total
1991 | CAA | Municipal Waste Combusters | 1 | 12 | 1 | 0 | 1 | 0 | 15
1991 | SDWA | Synthetic Chemicals Monitoring | 6 | 16 | 3 | 0 | 19 | 5 | 49
1993 | CAA | Acid Rain Permits | 0 | 0 | 12 | 0 | 0 | 0 | 12
1994 | RCRA | Land Disposal Restrictions | 85 | 0 | 0 | 0 | 0 | 0 | 85
1994 | CAA | Reformulated Gasoline | 0 | 9 | 0 | 0 | 10 | 0 | 19
1995 | CWA | Great Lakes Water Quality | 5 | 16 | 6 | 0 | 37 | 0 | 64
1996 | CAA | Municipal Solid Waste Landfills | 0 | 24 | 0 | 0 | 1 | 0 | 25
1997 | TSCA | Biotechnology | 9 | 8 | 3 | 0 | 5 | 0 | 25
1998 | CAA | Pulp and Paper (Air) | 10 | 0 | 0 | 0 | 0 | 0 | 10
1998 | CWA | Pulp and Paper (Water) | 13 | 0 | 0 | 0 | 0 | 1 | 14
1998 | SDWA | Disinfectants and Byproducts | 0 | 18 | 0 | 4 | 33 | 4 | 59
1998 | TSCA | Polychlorinated Biphenyls | 8 | 1 | 0 | 1 | 0 | 0 | 10
1998 | CAA | Regional Ozone | 7 | 12 | 1 | 11 | 10 | 1 | 42
1998 | CAA | Nonroad Diesel Engines | 0 | 6 | 0 | 1 | 1 | 0 | 8
2001 | FIFRA | Plant-Incorporated Protectants | 0 | 10 | 3 | 0 | 2 | 0 | 15
Total Number | | | 144 | 132 | 29 | 17 | 119 | 11 | 452

N: No peer review done
U: Unknown whether peer review done
FACA: Peer review done by Federal advisory committee
OEP: Peer review done by some other public review process by external experts
ENP: Peer review done by external group in a non-public manner
I: Independent internal EPA review

-------
Exhibit 7
EPA Science Advisor Comments
The full text of the comments follows.

-------
UNITED STATES ENVIRONMENTAL PROTECTION AGENCY
WASHINGTON, D.C. 20460
August 19, 2002
OFFICE OF
RESEARCH AND DEVELOPMENT
MEMORANDUM
SUBJECT: Response to OIG Draft Audit Report - Science to Support Rulemaking
FROM: Paul Gilman	/s/ Paul Gilman
Science Advisor to the Agency (8101R)
TO:
Nikki Tinsley
Inspector General (2410T)
This memorandum transmits comments on the Office of Inspector General's (OIG) Draft
Report, Science to Support Rulemaking, dated July 19, 2002. Briefly, I concur with the
suggestions made to improve the transparency and consistency with which science is applied to
Agency rulemaking.
In early spring of 2001, the Administrator recognized the need to improve the scientific
and economic basis of Agency decisions and commissioned a task force to develop
recommendations for improving the rulemaking process. The task force made many
recommendations, and the Agency is well along the way towards implementing them.
One key outcome of the Administrator's Task Force is to increase the Office of Research
and Development (ORD) involvement in the Agency's decision-making process. Another
outcome is that the Administrator named me to serve as the Agency's Science Advisor. I take this
role very seriously and plan to make important strides in ensuring that Agency decisions are based
on sound science, and that science is presented and characterized properly in our rules and other
important documents.
I am concerned about the OIG's finding (discussed on pages 17-18 of the draft report) that
critical science supporting the rules often was not peer reviewed. I plan to review the Agency's
progress in implementing its Peer Review Policy during the coming year.
In addition to implementing the recommendations to improve the scientific basis of our
decisions, the Agency is working to finalize Information Quality Guidelines that will apply to all
information that it disseminates. These guidelines, which will be effective in October 2002,
present the Agency's procedures for ensuring the quality of information that we disseminate, and

-------
provide an opportunity for the public to request correction of information that does not comply
with the guidelines. These guidelines will help to improve the quality and transparency of our
decision-making.
ORD would like to offer three related comments to sharpen the accuracy of the report.
First, the draft report uses the term "primary document" to refer to the documents considered to
have most critically influenced a regulatory decision. In the draft report, primary documents are
described as those that "embodied the final process of gathering together the science and other
information to support the rule," with examples being background support documents, regulatory
impact analyses, and economic impact analyses (page 6). A different term should be used to refer
to these documents (perhaps "critical document"), because in the scientific community a primary
document generally refers to original scientific research, rather than gathering, reviewing and
analyzing data collected by others.
Second, the draft report indicates an effort was made to determine which organizations
performed and funded the science work embodied in the critical document (page 7). While it
should be possible to determine who prepared and funded the critical document, the value of
doing so is not clear, because the scientific research embodied in what OIG refers to as the
primary documents was likely performed by many individuals and organizations whose work was
being summarized. For example, all of the critical scientific research could have been performed
by EPA scientists, but the critical document summarizing it was prepared by a contractor. This
might present an inaccurate picture about the contribution of EPA scientists. The ambiguity
inherent in this situation should be acknowledged.
Finally, while Agency preambles should effectively communicate the scientific
underpinnings of the rules, the description of the professional norms for such communication
(page 11) is not accurate. The norms as described accurately reflect how scientists communicate
in their primary documents, but not how science is communicated in what is described as critical
primary documents. The key issue is that the preamble should present a clear summary of the
science supporting the regulatory decision, including properly characterizing risks and the
supporting science for risk management. The preamble should list the documents from which its
science-based statements are made and the docket should contain the complete record. This
would allow readers to refer to the source material, including the original primary science
documents referenced in the critical documents (using "primary document" as traditionally used in
the science community).
I appreciate the opportunity to review and respond to the draft report. Science must play a
more prominent role in Agency decision-making. As Science Advisor to the Agency, one of my
objectives is to ensure that the critical scientific information used in our decisions meets the
highest standards of quality and transparency.

-------
cc: ORD Executive Council
ORD Management Council
ORD Science Council
R. Dyer (8104R)
C. Bosma (8104R)
C. Varkalis (8102R)

-------
Exhibit 8
Office of Water Comments
The full text of the comments follows.

-------

UNITED STATES ENVIRONMENTAL PROTECTION AGENCY
WASHINGTON, D.C. 20460
OFFICE OF
WATER
September 10, 2002
MEMORANDUM
SUBJECT: OIG Report: Science to Support Rulemaking
FROM: G. Tracy Mehan /s/
Assistant Administrator for Water
TO:	Jeffrey Harris, Director for Program Evaluations
Cross-Media Issues
Office of Inspector General
Thank you for the opportunity to review the suggestions in the subject report for improving the use of
science in EPA rulemaking. This is an issue of critical importance to the Agency's credibility.
We believe there is merit in the proposal to improve the consistency of the presentation of scientific data
and conclusions in regulatory preambles. However, more work is needed to determine how to implement such a
proposal given the wide variety of types of regulations the Agency develops. We also need to consider the impact on
the cost of developing rules and on the length of preambles. OW will be happy to work with your staff, OPEI, and
other offices to develop these proposals further.
The proposal to use information technology to hot-link various elements of the rulemaking package and
Agency databases is intriguing and should be developed further. We also support reinforcement of EPA's peer
review policy.
We are less positive about the proposal to "focus more attention in the development phase of regulations on
collecting data and doing research to close 'blind spots' to support rulemakings." For major rulemakings, there may
be many years of data gathering and research available, but gaps in knowledge always remain. A decision to defer
action while research is done is as much a risk-management decision as any other and implies the continuation of the
status quo for an additional period of time.
Comments by our staff are attached for your consideration.
Attachment
cc: Jay Messer
Thomas Gibson
Al McGartland

-------
Comments
OIG Draft Pilot Study: Science to Support Rulemaking
The Pilot Study was undertaken to assess the use of science in EPA rulemaking.
Predictably, many methodological issues were encountered, exacerbated by the difficulty of
assessing efforts that occurred as much as a decade ago. The OIG is proposing not to undertake a
more comprehensive study at this time. We support this conclusion.
The Pilot Study did lead, however, to some interesting suggestions. These comments
center on the suggestions, which were:
1. Consider presenting the scientific findings that support a rule in specific sections of
the preambles. These findings should be organized according to the norms of
science, in summary form, and indicate:
	- Why the science is required to support the rule, and how the results will be used.
	- The methods used.
	- The important results (showing key data and their uncertainty).
	- Interpretation of the findings, and comparison with other studies that appear to support or contradict the results.
	- Scientific referencing of underlying scientific and technical documents.
	- A separate section of the preamble that would bring in issues of law, policy, economics, and administrative discretion that do not depend on the scientific findings.
2.	Focus more attention in the development phase of regulations on collecting
data and doing research to close "blind spots" to support rulemakings.
3.	Take advantage of EPA's information technology capabilities to:
	- Hotlink references in preambles to documents in the docket.
	- Link scientific and technical documents in the docket to the Science Peer Review Database.
	- Link RAPIDS to the Science Peer Review Database.
	- Maintain through RAPIDS an inventory of all rules proposed and finalized each year.
4. Reinforce EPA's current peer review policy, ensuring that all EPA-generated
documents critical to significant and substantive rulemakings are independently
peer reviewed, and that the responses to the significant comments appear in the
documents.

-------
1. Consider presenting the scientific findings that support a rule in specific sections of the
preambles.
The study team found an absence of consistency in reporting and using scientific findings
in the rules, which made them difficult to compare. The goal of fostering consistency in this area
is a desirable one and will make things easier for our stakeholders. At the same time, it must be
recognized that EPA develops a wide variety of types of regulations based on a variety of statutory
mandates: for example, some are technology based, some based on individual risk targets, some
on balancing costs and risks. Some are directed at human health risks, others at ecological risks,
others at both.
It is not clear how one would formulate a structure that would accommodate this variety.
The suggestions in the report, the so-called "norms of science", are really a model for reporting
research results from a particular investigation. What we have in rulemaking preambles and
supporting documents is generally a summary of knowledge in a particular area leading to a
conclusion that contributes to the decision. In terms of scientific literature, this is more like a
review article than a research report.
Further, requirements in this area have the potential of increasing the costs of rulemaking
and increasing the length of preambles. These undesirable effects should be minimized as we
develop ways of fostering greater consistency in the presentation of scientific findings.
In summary, this proposal has merit but needs further development before it can be
adopted. OW will be happy to work with OIG, OPEI, and other offices on these issues.
2. Focus more attention in the development phase of regulations on collecting data and
doing research to close "blind spots" to support rulemakings.
Many EPA regulations are based on years of research and data gathering by EPA, other
Federal agencies, academia, and industry. For example, we have been working on the arsenic
drinking water standard steadily since the 1970s. OW and other programs have extensive
processes of joint planning with ORD and outside stakeholders to anticipate information needs as
much as possible.
Yet, there are always data gaps and uncertainties which we must grapple with. This is in
the nature of the rulemaking enterprise. While the Pilot Study cited respondents who said they
would have liked to have had more data, it did not identify any particular ways of obtaining it
without increasing costs or slowing down action. Allocating resources to closing "blind spots"
means something else will not be done, and delaying action means the status quo will continue.

-------
3. Take advantage of EPA's information technology capabilities.
There are some intriguing possibilities here. We would support an effort to identify and
implement ways to improve the information the Agency makes available on rulemaking.
4. Reinforce EPA's current peer review policy.
EPA's Peer Review Policy was first issued in 1992, after some of the rules considered in
the Pilot Study. Full implementation has taken time and continuous effort. Thus, it would not be
surprising that compliance was limited in the earlier period, but we would hope that it had been
improving as we approach the present time. Unfortunately, the report does not present
information on peer review performance over time, so we cannot tell whether this has happened.
We also note that, in many cases, the investigators could not determine whether documents were
peer reviewed.
It is not clear whether the investigators are proposing a change in the Agency's Peer
Review Policy. The recommendation (quoted above) does not appear to be any different from the
current policy. If a change is being suggested, this should be made clear.

-------
OGWDW Comments
OIG Draft Pilot Study: Science to Support Rulemaking
Factual Accuracy of Report
1)	Exhibit 1 incorrectly lists the following rules as significant (by EO 12866): Filter Backwash
Recycling, Consumer Confidence Report. The Exhibit incorrectly lists the Radon Rule as finalized
in 1999; the Radon rule has not yet been finalized.

Exhibit 1 also incorrectly records the Arsenic rule as being withdrawn, which is not accurate. The
original Arsenic rule was promulgated on January 22, 2001. EPA temporarily delayed the effective
date for this rule for 60 days, from March 23, 2001 until May 22, 2001 (66 FR 16134), in
accordance with the memo from Andrew Card entitled "Regulatory Review Plan". The effective
date was again delayed to February 22, 2002 (66 FR 28342) to conduct reviews of the science
and cost analysis.
2)	Different numbers for total critical documents are given in Exhibits 3 (471), 4 (469), 5 (443),
and 6 (436) and on page 18 of the main text (2940). Also, the numbers of critical documents listed
for individual rules are inconsistent across exhibits. It appears that these numbers should be the
same - check these numbers and correct them or otherwise provide an explanation for the differences.
3)	Page 18 of the main report states that EPA's Peer Review Product Tracking System database
should be the primary means for tracking present, past, and future peer review status of critical
science documents identified in the pilot. It further states that, since the tracking system was
developed, only 4 of 364 critical documents identified in the case studies were found listed in this
database. This appears to be an inappropriate suggestion, as EPA's tracking system is designed for
EPA-generated documents, whereas only a fraction of the critical documents (116/471 - Exhibit 3)
are generated by EPA.
Report Content
1)	The discussion of critical science source and funding did not discuss the significance of the
information or how it relates to the report's recommendations. The report does not discuss the
linkage between source of funding and peer review status, and these data are important. The
report should clarify how the two relate.
2)	It is important to recognize, at least with regard to drinking water rules (largely due to the 1996
SDWA amendments), that the science discussion in preambles has evolved significantly in the last
10 years. Thus, the analysis of case study 2 is very outdated and does not reflect practices since
1996.

-------
3) Please define the RAPIDS system and explain its purpose for the benefit of readers unfamiliar
with the database.
Report Recommendations
1)	The report's first recommendation to separate the discussion of science versus non-science
influences into different preamble sections does not appear efficient as there would be significant
redundancy in the discussion of rule criteria in each section. If this recommendation is
implemented, it could result in doubling the preamble discussion, while not necessarily facilitating
a better understanding of the basis for the rule.
Depending on statutory requirements for a given rule, the optimal preamble structure for
communicating the role of science may be quite different. Any recommendations to revise the
format for rule preambles across the Agency should be flexible and take this consideration into
account. To achieve the same objectives of the report, we recommend modifying the
recommendation to suggest that norms of science be applied consistently throughout current
preamble formats where science is discussed, in order to improve the understanding of the
scientific basis for rules.
2)	We support the report's second recommendation to reduce "blind spots." However, if
greater data collection is mandated, this recommendation has major resource implications if
additional funding is not provided. It could also negatively impact the ability to meet statutory
deadlines.
The 1996 SDWA Amendments require EPA to use "the best available, peer-reviewed science and
supporting studies conducted in accordance with sound and objective scientific practices" when
setting drinking water standards (sec. 1412 (b)(3)(A)). The US Court of Appeals for the District
of Columbia Circuit determined that Congress's intent was best available evidence at the time of
rulemaking. EPA agrees with this assessment.
3)	We support the report's third recommendation to make better use of the Agency's information
technology capabilities. Consistent use of these tools throughout the rulemaking process will
improve communication and access to the critical scientific support documents.
4)	We support the report's fourth suggestion to reinforce the Agency's peer review policy.
Because the current policy does not explicitly require peer review
(http://www.epa.gov/osp/spc/perevmem.htm), it may be appropriate to recommend updating the
policy to require peer review in certain situations to ensure it is applied more consistently across
the Agency. A more consistent means of tracking peer-reviewed documents will be very
beneficial, and should help clarify the fact that many of the studies listed as having unknown peer
review status in the draft report were actually peer reviewed.

-------
Factual Accuracy of Case Study Discussions
Synthetic Chemicals Monitoring - Case Study 2
(1)	Page A-6: the last paragraph beginning the discussion of Category I, II, and III pollutants.
This whole discussion needs some work - some statements are not quite accurate. We suggest the
following replacement paragraph:
"Category I contaminants are those for which EPA has determined there is strong evidence of
carcinogenicity from drinking water ingestion, and the MCLG is set at zero. Category II
contaminants are those for which EPA has determined that there is limited evidence of
carcinogenicity from drinking water ingestion. The MCLG for Category II contaminants is
calculated using the RfD/DWEL with an added margin of safety to account for cancer effects, or
is based on a risk range of 10⁻⁴ to 10⁻⁶ when data are inadequate to derive an RfD. Category III
contaminants are those for which there is inadequate evidence of carcinogenicity by drinking water
ingestion. For Category III contaminants, the MCLG is established using the RfD. The science
issues with respect to the MCLGs thus involve health risk assessments that deal with all the above
aspects for each of the pollutants."
(2)	Page A-8: third paragraph.
We suggest striking the first sentence that reads "Compliance with the MCL is determined by
analysis with approved analytical techniques." While this is a true statement, this is not an
appropriate lead into the discussion on PQLs and analytical feasibility limitations in setting of an
MCL. We suggest replacing it with the following sentence: "The feasibility of setting an MCL at
a precise level is also influenced by laboratory ability to reliably measure the contaminant in
drinking water." Also, there is a typo toward the end of this paragraph ... instead of PCLs - this
should be PQLs.
(3)	Page A-9: first full paragraph that begins with "EPA proposed monitoring requirements ...."
Unfortunately, this paragraph does not provide complete information regarding the final decision
for unregulated contaminants and may be a little misleading. The report cites the January 30,
1991 Final NPDWR for SOCs, IOCs, and unregulated contaminants. This paragraph discusses
that EPA proposed monitoring requirements for ~110 unregulated contaminants and notes that
EPA adopted a scheme requiring all systems to monitor for the highest priority organics, unless a
vulnerability assessment determined that a system was not vulnerable to contamination, but it fails
to specifically state that the final rule settled on a one-time monitoring requirement for 30
unregulated organic and inorganic contaminants. The report does note this on the first page but
we think this should be restated here as well.
(4)	Page A-9: second full paragraph on SMCLs.
55
Report 2003-P-00003

-------
The next-to-last sentence is missing a parenthesis to close out "discoloration of water." Also,
we suggest breaking the second to the last sentence into two so that the following phrase for the
aluminum SMCL can be included:
"EPA dropped the proposed organics SMCLs but retained the existing odor SMCL of 3 Total
Odor Number (TON). The Agency finalized an SMCL range for aluminum (due to discoloration
of water) with the precise level for each system being determined by the State. Furthermore, the
Agency deleted an MCL for silver and finalized an SMCL to protect against skin discoloration or
argyria from a lifetime exposure."
(5)	Page A-9: third full paragraph, five lines down - this should be 1,2-dichloropropane not 1,2-
dichloropropanol.
(6)	A couple places - chromium is capitalized (and it does not begin a sentence) - change to
small case.
(7)	Page A-16 - Last paragraph .... we suggest rewording the last two sentences as follows:
" Just as we were finishing the study, OGWDW announced its preliminary decision not to revise
NPDWRs for 68 chemical contaminants. The Agency stated that the 68 chemical NPDWRs
should not be revised at this time for one of the following reasons:
•	36 NPDWRs were undergoing Agency health risk assessments. These assessments are not
expected to be complete in time for EPA to make its final revise/not revise decisions.
•	17 NPDWRs remained appropriate, and any new information available to the Agency
supports retaining the current regulatory requirements.
•	12 NPDWRs had new health, technological, or other information that indicated a potential
revision to the MCLG and/or MCL; however, the Agency believed any potential revision
would result in a minimal gain in the level of public health protection and/or provide
negligible opportunity for significant cost savings.
•	3 NPDWRs had data gaps or research needs that needed to be addressed before EPA could
make definitive regulatory decisions. When the data gaps have been resolved, EPA plans
to consider the results in the next review cycle."
Stage 1 DBPR - Case Study 11
(1) Page A-80: "Brief description of science input to the rule"
The last two sentences are incomplete in their intended coverage and should be revised. We
suggest replacing the last two sentences with the following: "In addition, EPA needed to assess
risks associated with DBP occurrence levels and to evaluate best available technologies for
reducing such risks to feasible levels (while not compromising microbial protection). Using
scientific and technological information gathered, EPA defined best available technologies,
criteria by which total organic carbon (naturally occurring organic precursors to DBP formation)
should be removed, and how various DBPs and disinfectants should be measured and
monitored."
(2)	Many of the critical documents cited in the "Table of Critical Documents" as having unknown
peer review status are actually published in journals that require peer review (e.g., JAWWA,
Epidemiology). This may be true for other case studies. It will be worthwhile to reassess and
re-tally these classifications with consideration of studies published in peer-reviewed journals;
the report's conclusions may be influenced by such an exercise.
(3)	The following documents should be listed as primary (Ref #46, 47, 48) to be consistent with
the text describing "primary" documents in the main report.
Exhibit 9
Office of Prevention, Pesticides and
Toxic Substances Comments
The full text of the comments follows.
UNITED STATES ENVIRONMENTAL PROTECTION AGENCY
WASHINGTON, D.C. 20460
OFFICE OF
PREVENTION, PESTICIDES AND
TOXIC SUBSTANCES
September 13, 2002
MEMORANDUM
SUBJECT: Review of the Office of Inspector General's Pilot Study on Science to
Support Rulemaking - OPPTS Comments
FROM: Angela F. Hofmann //s/ Angela F. Hofmann
Director of Regulatory Coordination
Office of the Assistant Administrator (7101M)
TO: Jeffrey Harris, Director for Program Evaluations on Cross-Media Issues
Office of Inspector General (OIG) (2460T)
Thank you for the opportunity to review the Office of Inspector General's (OIG's)
draft report entitled: "Science to Support Rulemaking", a pilot study which evaluated the
Agency's use of science to support EPA rulemaking. We coordinated the review for
OPPTS, and respectfully submit the attached comments and suggestions for your
consideration.
The pilot team sought to identify the role that science played in supporting 14
EPA rules promulgated between 1994 and 2001, represented by 15 case studies. Of
the 15 case studies, three were related to rulemakings promulgated by OPPTS, i.e.,
case studies 8, 12, and 15. We have specific comments on two of the case studies, i.e.,
case studies 8 and 15, and do not have any comments on case study 12, which
involved the PCB Disposal Amendments.
We discussed our specific comments on Case Study 15, the Plant Incorporated
Protectants Rule, with Chris Baughman, and she has addressed our comments with the
changes that we identify in the attachment. We discussed our comments on Case
Study 8, the TSCA Biotechnology Rule, with Jay Messer, and he has indicated that he
will consider our specific suggestions contained in the attachment. In addition, we have
provided some general comments and suggestions that we believe will help improve
the report.
If you have any questions about our comments, please contact Sandy Evalenko
on my staff at 564-0264. Thank you.
Attachment
cc: IO: Steve Johnson; Susan Hazen; Sandy Evalenko
OSCP: Joseph Merenda; Tom McClintock; Elizabeth Milewski
OIG: Chris Baughman; Jay Messer
Attachment
OPPTS Comments on the OIG Pilot Study
"Science to Support Rulemaking"
A. Comments Specific to the OPPTS Rulemakings
1.	Text Changes Requested on Case Study 15 - Plant Incorporated Protectants
a.	On page 13 of the draft report, the statement identified is contradicted by the
finding on page A-108. The statement on page 13 should be revised as follows:
"The biotechnology rule (Case 8) and the plant-incorporated protectant rule
(Case 15) both cited references by number, which corresponded to a reference
section in the preamble, but many of these references were to dictionaries or
general textbooks."
b.	On page A-107 under brief description of the rule, the following statement
should be corrected as follows:
"In addition, the rule establishes a new part in the Code of Federal Regulations
(CFR) specifically for plant-incorporated protectants, i.e., 40 CFR 174.
Procedures are also set forth for Confidential Business Information (CBI); any
claim of confidentiality must be substantiated when the claim is made. The rule
also requires, for exempted plant-incorporated protectants not registered, that
any person who produces, for..."
c.	On page A-108, under the brief description of science input to the rule, the
following statement should be corrected as follows:
"The rule was a legal mechanism to confirm that plant-incorporated protectants
were covered by FIFRA. The science aspects concerned the exemption for
protectants derived through conventional breeding from sexually compatible
plants. To comply with FIFRA, such protectants may not generally cause
unreasonable adverse effects on the environment."
2.	Text Changes Requested on Case Study 8 - TSCA Biotechnology
a.	On page A-60, under the brief description of science input to the rule, the
following statements should be corrected as indicated:
"The intent of the rule was to establish EPA's regulatory program for
microorganisms, with the goal of providing regulatory relief to those wishing to
use certain products of microbial biotechnology, while ensuring that EPA could
adequately identify and regulate '
b.	On page A-61, the characterization in the first bullet appears inconsistent with
the discussion below it, which acknowledges that EPA's decision under the rule
centered on whether the product is "new," i.e., not whether the 2 classes present
different levels or types of risks but whether one class is more likely to be "new."
The first bullet needs to be corrected as indicated below:
"The science inputs to the rule should involve three key issues:
* Requirement for an MCAN - Is the intergeneric microorganism more likely
to be "new"? Is there a significant risk associated with
c.	On page A-61, the following paragraph does not explain why these issues
"clouded" or adversely affected the science considered in this rulemaking. An
explanation is needed to support the statements in the paragraph below, or it needs
to be revised and clarified. This rulemaking was more of a procedural rule, with
the science used to determine the process and informational requirements that
would be applied when these microorganisms were reviewed by EPA as part of
the new chemical premanufacture notification requirements under TSCA.
"Identifying the exact role of science in this rulemaking is clouded by
several issues. First, the Biotechnology Science Coordinating Committee
(BSCC) of the Domestic Policy Council Working Group on Biotechnology in the
1980s was developing a coordinated policy for dealing with biotechnology across
the various agencies with a regulatory role (e.g., FDA, USDA, and EPA). Each
of the Agencies and Departments then had to adapt the BSCC guidance to the
particular statutory requirements under which the organization had regulatory
authority. Under TSCA, EPA had to regulate microorganisms as "chemicals,"
because Congress had not specifically anticipated that genetically engineered
microorganisms would themselves act as "products." This combination of
restrictions brought about by the desire of the Federal government to have an
integrated approach, and the need to stay within statutory boundaries that were
somewhat artificial, greatly constrained the way new science could be applied to
the rule."
d.	On page A-62, the purpose of this paragraph needs to be better explained.
What is the basis for the conclusion (highlighted below) that the articles didn't
appear to be critical to support the final rule? This information was indeed
important in supporting the specific requirements and reviews established in the
rulemaking for these new microorganisms under the premanufacture notification
provisions of TSCA.
"ORD also had a substantial research program in biotechnology in the 1980s. In
a presentation to the BSAC in April 1987, the AA for ORD indicated that ORD
had a budget of $7 million for R&D in biotech, approximately 80% of which was
in external research grants, primarily directed at developing "widely accepted
methods in ... microbial ecology." ORD projects aimed at evaluating monitoring
strategies for planned field releases were presented to the BSAC at the July
1987 and January 1988 meetings. ORD reported on several biotechnology
workshops at the January 1989 meeting, and at the December 1989 meeting,
ORD presented a progress report on 53 projects that had been conducted under
the program, the "primary foci of these studies [were] on detection and
enumeration, survival and colonization, and genetic exchange." The following
excerpt from one of the BSAC members is telling, however. After noting that,
although he did not totally agree that the program was a success, it was "one
of the most important efforts in the area of environmental science," he noted that
"much progress had been made in considering genetic, ecologic, and
evolutionary issues, .... the information was still insufficient to give a definitive
answer on what merited review." Although several journal articles funded by
ORD are included in the docket, none appears critical to support of the
final rule [nor do papers funded by other organizations]."
e. On page A-67, the following conclusion (the last sentence in the paragraph
under methodology - see highlighted text) is not supported by this paragraph.
This statement requires additional explanation or it needs to be revised.
"OIG had no response from any of the respondents. The information was
developed by reading the rule and preamble, the primary technical support
documents, the ESA report (Tiedje 1989), the RIA, the response to comments
report, and the reports and minutes from the BSAC meetings in the docket. The
reference lists for the primary documents, as well as research papers cited in the
docket table of contents were identified, and scanned for content, funding
sources, etc. Research funded by ORD and identified by acquisition number
were tracked back to the original decision memos in the GAD files (most turned
out to be competitively awarded). It became obvious during this exercise that
the research cited, while broadly relevant to the survival of artificially
introduced microorganisms in the field and mechanisms of gene transfer,
did not specifically support (nor specifically not support) the positions to
which they were referenced, and thus they are not included in the list of
critical documents."
B. General Comments
The following are general comments, observations and suggestions for your
consideration.
1.	Scope of the Pilot Study.
Please clarify early in the report whether the pilot team considered economic
analyses when they evaluated "science" for the purposes of the pilot study. At times it
appears that the team's consideration was limited to what is traditionally thought of as
"science," i.e., scientific research and analyses of risks and effects (for example, in the
first paragraph of the Executive Summary and in the detailed discussion of
methodology). Since many consider economic analysis to be a scientific discipline, it
would be helpful to describe what the team included as "science" in the context of their
evaluation of science in support of rulemaking.
2.	Understanding Rulemaking at EPA.
We would like to make a few comments and suggest several improvements to
the discussion on this topic that appears on page 2.
a.	Rulemakings are not just triggered by a statute, court order or executive
initiative as stated in the first sentence of the first paragraph. Rulemakings may
also be triggered by a citizen who petitions the Agency to take a specific
regulatory action or to issue a rulemaking to address a particular concern, i.e., in
addition to the Administrative Procedure Act, several statutes contain specific
provisions that require the Agency to consider these petitions (e.g., TSCA
section 21, EPCRA section 313, etc.). Licensing actions may also use
rulemakings as the mechanism for implementing the licensing action. To avoid
the potential for the reader to conclude that rulemakings are only triggered as
described, we suggest that you preface the statement by inserting "Typically,"
or "In general," at the beginning of the first sentence.
b.	As you know, rulemaking dockets for the major program offices are
maintained in specific facilities, which were recently consolidated to create the
new EPA Docket Center located in the basement of EPA West. The
parenthetical description should be revised to explain that these "drawers of
paper files" can easily be accessed by the public, and are not files that are only
maintained by the individual rule leads. It should also be noted that the Agency
now makes these files publicly available in its new online electronic docket and
comment system, EPA Dockets, which opened to the public this past April.
For future rulemakings, the public will have easier and online access to non-
copyrighted and non-confidential references that are used to support a
rulemaking.
c.	The process summary that is provided does not include one of the most
significant steps required for any significant or economically significant
rulemaking, i.e., review by the Office of Management and Budget (OMB) and
other interested federal agencies and offices pursuant to E.O. 12866. This
review may often play a critical role in shaping the final rule that is published in the
Federal Register, and can affect how the science is presented in the
preamble. We suggest that you add a new sentence to recognize this critical
step in the rulemaking process for both the proposed and final rule stages.
d.	Although the public comment period may typically be 60 days, the Agency
often provides for 90 days or longer for economically significant rulemakings,
and, on occasion, may also provide just 30 days for public comment. We
suggest that you reflect this by revising the following sentence as indicated:
"After allowing for public comment (typically 60 days), EPA
finalizes the rule by publishing it in the Federal Register, with a new preamble..."
e.	In the second paragraph, please clarify whether the 20 rules were categorized
as "significant" or "economically significant" under E.O. 12866. Although the
criteria for "significant" rulemakings that appear in section 3(f) of E.O. 12866 are
identified, there is no explanation here for "economically significant," although
that phrase is not used until the next page. Although the EO itself does not
define this term, OMB's implementing guidance for EO 12866 defines this term
as rulemakings that meet the criteria in section 3(f)(1) of the EO. Please note
that the economic trigger here is not the only one. It can also be cost savings or
a non-cost related reason as indicated by the second part of the criteria in
section 3(f)(1).
In addition to clarifying these terms, we suggest that the report clarify which
criteria were used in selecting the rules evaluated. It is also important to clarify
what is meant by "substantive rules," because that was a specific term of art that
was used under the previous E.O. (EO 12291), which was replaced by E.O.
12866. Today rulemakers use this term to distinguish a non-substantive rule
(e.g., something more technical in nature that does not impact the scope or
requirements of the rule - like a rule that changes how a form should be sent to
EPA) from a substantive rule that changes requirements or behavior, takes an
action, implements a decision, etc.
f.	For OPPTS, the remaining rules do not "primarily impact individual States,
Tribes or sites, or involve minor modification and corrections to significant or
substantive rules." The remaining rulemakings in OPPTS are substantive rules
that OMB specifically exempted from E.O. 12866 that are categorized as exempt
and not as "non-significant" (i.e., rulemakings that establish pesticide tolerances),
or they are otherwise substantive rules that are categorized as non-significant
under E.O. 12866. Only a few of the remaining rules involve corrections or minor
modifications, or are otherwise limited to individuals.
g.	The determination of whether a rule that was categorized as significant at
proposal can be categorized as non-significant at final is one that is based on the
criteria in section 3(f), and OMB's implementing guidance for EO 12866. If, for
example, the agency does not receive adverse comments and the final rule is
substantively similar to the proposal, OMB may determine that the final rule is
not significant. The last sentence of the second paragraph implies that the only
time the categorization for the rule might change at the final rule is if the
estimated costs decrease or the rule is determined to modify existing significant
rules. This statement should be corrected.
3. Significant Rules Identified
We would like to make a few comments and suggest several improvements to
the discussion on this topic that appears on page 3.
a. RAPIDS is an internal agency tracking database that was first used by
programs around 1995 primarily to help facilitate the development and review of
the Regulatory Agenda. Active use of the system by the program offices was
phased in across the Agency, which meant that some offices were entering their
information directly, while others had their information entered by OPEI staff. In
addition to standard reports that any user can access, a special report may be
generated, as long as the information sought is maintained in the system. We
do not believe that the criticism of OPEI and RAPIDS in this discussion is
accurate. We suggest that you discuss these details with OPEI and revise this
discussion accordingly.
b.	With regard to searches for rules promulgated before 1994, it is important to
note that the pilot team did not have easy access to the information on these rules
because the electronic reference sources that were used by the team, i.e., the
website, and the electronic Federal Register access systems, were under
development and contained only limited information for rules issued in 1994 or
earlier. For this reason, the team was uncertain that the information on these
earlier rules was complete. Searches using commercial electronic referencing or
indexing sources might have identified rules for these earlier periods, as a
manual search of the Federal Register indexes would have. To avoid a reader
interpreting this discussion incorrectly as an indication that there isn't a way to
generate such a list for this earlier time period, we believe that this discussion
should be revised.
c.	Correct the reference to why a rule might be "economically significant," as
discussed above.
d.	Clarify whether the 14 rules were taken from the economically significant
group or both. For example:
On page 2 of the draft Report, it indicates that OPEI estimated that the Agency
publishes 1,000 to 3,000 rules each year, and that "approximately 20 of these
rules are "significant" according to E.O. 12866." Is that 20 a year, or 20 total for
the period of the evaluation?
On page 3, it continues with the team having "identified 88 "significant rules" that
were finalized in 1990 through 2001," then later that the team settled on "74 rules
from 1994 on," and eventually explains that the pilot study focused on 14 of
these rules.
The criteria the team used to select only 14 rules out of the potentially over
12,000 rules that EPA promulgated in that time are not clear. No criteria for
selection are described in the study other than the attempt to ensure that the
rules evaluated would provide a wide range of statutes, and that the rules were
not intended to be a representative sample.
The report needs to further explain the selection criteria to provide credibility for
the pilot study using only these 14 rules to serve as the basis for supporting the
general conclusions regarding the Agency's use of science in rulemaking and the
related recommendations.
4. Rulemaking Expertise
Rules written by the EPA serve a number of purposes, not all of them strictly
scientific. Factors affecting the form and content of a rule include statutory, scientific,
economic, political and enforcement/compliance considerations. Rules are first and
foremost legal documents written to meet several legal goals. They must also
communicate information clearly to the lay public, particularly information on how
individuals may comply with the rules. Any team undertaking a study along the lines of
this pilot should include the perspective of these other disciplines.
For example, although the report notes that the rulemaking process is governed
by specific requirements contained in various statutes and Executive Orders (these are
in addition to any in the environmental laws referenced), there isn't a discussion about
what those requirements are, or how they may impact the development of, the analysis
performed, or the information considered as part of the rulemaking. In addition, during
the period covered by the study, many of these requirements were either newly
imposed, or recently revised. For example, the only executive order related to these
requirements that is mentioned, EO 12866, was issued in October 1993 as a revision to
a previous EO. Since then, over 10 more executive orders or statutes were issued that
directly impact not only when an Agency must consider certain factors in rulemaking,
but how the Agency must perform specific analyses. To be complete, any study about
the use of science in rulemaking must also consider the rulemaking context, and all of
the factors that must be considered by the Agency in making a decision.
Along these lines, we believe that any future studies should also consider how
new requirements, whether procedural or policy related, that are intended to make
improvements, end up impacting the use of science in support of rulemakings.
For example, on October 1 the Information Quality Guidelines are supposed to take
effect. It would be interesting to see when and how those requirements might impact
our current activities with regard to science and rulemaking. The new electronic docket,
which will substantially increase access to critical documents that are used to support
rulemaking, will also impact this issue.
5. Method of Selecting "Critical" Documents
The methodology of selecting "contacts" is not clear. Of most concern is that, of the 83
identified contacts, no helpful responses were received from 58 of these contacts. A
response from so few identified contacts, with a response rate of 33%, can introduce a
strong bias into the study. To their credit, the authors recognize this potential for bias in
their report. They nonetheless offer strong recommendations on how preambles of
rules should be written. The team also ignored, for the purposes of their study,
documents that they could not find in the rule dockets. Dockets do not necessarily
contain a hardcopy of all documents associated with a rulemaking. Documents need
not actually be physically in the docket for the Agency to have relied on them.
6. Rulemaking Preambles
The preamble to the rulemaking is not, nor has it ever been, considered the
proper vehicle for communicating the science in the manner prescribed on page 11.
The proper vehicle for communicating the science in that detail is in separate
documents that are made available to the public as part of the rulemaking docket, with
a general description provided in the preamble. The preamble must provide
a layman's explanation of the basis for the rulemaking, including the science, economic
and technical analyses and other considerations that informed the decisions
represented in the rulemaking.
The suggested addition of these science discussions in the preamble is cost
prohibitive and impractical. For example, the suggested inclusion of the scientific
charts, graphs, and tables in the preamble would not only significantly increase
publication costs, it would also require additional resources and overly complicate the
Agency's ability to ensure that the Federal Register document complies with the
accessibility provisions of section 508 of the Rehabilitation Act, because tables,
charts, and graphs require special programming to be electronically accessible for 508
readers.
In addition, most stakeholders consulted in 1994, when we evaluated the level of
detail, format, and function of the preamble as part of the government wide streamlining
initiative, indicated that they preferred the preamble to contain a succinct summary of
the science, economic and technical analyses and other considerations that went into
the rulemaking. This allowed those responsible for or interested in the different
disciplines to obtain a general understanding of all of these considerations, as well as
the details of the one of most interest to them. Since the primary audience for the
rulemaking is not the scientists, including the detailed scientific information in the
preamble would not serve as an effective way to communicate the scientific information
to the primary audience.
Exhibit 10
Office of Policy, Economics, and
Innovation Comments
The full text of the comments follows.
UNITED STATES ENVIRONMENTAL PROTECTION AGENCY
WASHINGTON, D.C. 20460
OFFICE OF POLICY,
ECONOMICS, AND INNOVATION
September 22, 2002
MEMORANDUM
SUBJECT: Review of Draft Report "Science to Support Rulemaking"
FROM: Thomas J. Gibson
Associate Administrator
TO: Nikki Tinsley, Inspector General
Office of the Inspector General
Thank you for the opportunity to review the draft report "Science to Support Rulemaking." This
report is important since it explicitly describes the role of science in achieving the Agency's
mission. It is especially relevant in light of the Administrator's efforts to improve the Agency's
decision making by promoting the full integration of science, including economics, into the
regulatory process. In addition, the report's suggestions support many areas for improvement
that are targeted in the Agency's new Information Quality Guidelines.
Having carefully reviewed the report, we offer the attached comments and suggestions on ways
in which we believe the report could be improved, as well as our reactions to each enumerated
suggestion made in the report. We hope that our suggestions will help strengthen the study's
findings and the conclusions you draw from them.
Our general comments on the report can be found in Attachment 1. Attachment 2 contains our
reactions to the report's suggestions, and Attachment 3 is detailed comments organized by page
number.
Again, thank you for the opportunity to review the Draft Pilot Study. My congratulations on a
well-written report that clearly articulates the need to explicitly define the role of science in the
rulemaking process. I look forward to seeing your revised report.
Attachments
Attachment 1
GENERAL COMMENTS
Recognizing Institutional Mechanisms
The report does an excellent job of recognizing Agency institutional mechanisms which ensure
that regulations are based on sound science. The role of peer review and the peer review process
in the development of credible science is discussed in depth, and the Office of Policy, Economics
and Innovation (OPEI) agrees with the heavy emphasis the report places on the utility and
importance of independent peer review. Not emphasized are two other key "good science"
processes: Analytic Blueprints and the Risk Characterization Policy. The former was designed,
in part, to ensure that critical science needs are identified early in the process and developed in
time to inform regulatory decisions, and the latter requires that both the risk assessment process
and the risk analyses are transparent, clear, reasonable, and consistent. Taken together, these
three existing mechanisms can assure that:
1.	Critical science is identified early, and developed in time to inform decisions (Analytic
Blueprint),
2.	Critical science is of sufficient quality for regulatory decision making (Peer Review
Process),
3.	The quality of the science and the associated uncertainty is clearly described (Risk
Characterization Policy).
Further, these three mechanisms appear to directly address three of the four findings of your
report, i.e., that critical science supporting the rules often was not independently peer reviewed,
that more data and fewer "blind spots" could reduce assumptions, and that the role of science
(was) not made clear. Your report "determined that the oversight of peer review of the critical
science documents to support the pilot rules was limited and ineffective." Applying the same
logic suggests that shortfalls in identifying critical data needs, and the lack of transparency and
clarity in science, are due to inefficiencies or limitations in the two Agency processes intended to
identify, develop and make critical science transparent.
Defining the Scope of the Science
The report refers to "science" quite broadly without ever offering a clear definition of what
specifically was considered. The report would benefit from a clearer discussion of what
categories of science were addressed and on which categories primary contacts were asked to
comment. Page 5 of the report simply states: "We asked that [the contacts] consider several
categories of science that may be relevant to the rules..." but does not disclose what these
categories were. Presumably, such categories included risk assessment, exposure modeling, and
epidemiology. It is not clear from the text, however, whether engineering or any social sciences
were included in this discussion.
The extent to which economics and other social sciences were addressed in this report is an
important point to make especially in light of the renewed importance the Administrator and the
General Accounting Office have placed on the conduct of quality economic analyses in the
Agency. While the review of economic analyses may be beyond the scope of this particular
report, the importance of this type of analysis as an input into the decision-making process should
not be overlooked.
Characterizing the Role of Economics in the Decision Making Process
The report mis-characterizes the role of economics in the decision-making process in some
places. The final point under suggestion number one lumps economics, a social science, together
with non-scientific aspects of rule making such as law and administrative discretion. This should
not be the case. Economics should be separated from other non-scientific considerations. Under
some statutes, such as the Safe Drinking Water Act (as amended in 1996), findings from economic
science may be the basis for the standard, and therefore merit separate, detailed treatment
analogous to that given to other science. Also, under the Federal Insecticide, Fungicide and
Rodenticide Act, the Agency must balance the risk posed by the use of pesticides against the
economic impacts on crop production of restricting pesticide use. Even in cases where a
standard may not be driven by economic findings, data on benefits, costs and impacts should be
presented with the same clarity and detail as "scientific findings."
Use of Risk Characterization Policy as a Framework for Presenting Results and
Suggestions
Some of the science supporting rulemaking deals with health and environmental risks. EPA
adopted its policy on "Risk Characterization" in February 1992, via a memorandum from Henry
Habicht, Deputy Administrator, and an accompanying document, prepared by a cross-office work
group. The policy was reiterated and elaborated in the mid-1990s. At its core, the policy states
that significant risk assessments should:
•	Describe how the estimated risk is expected to vary across population groups, geographic
areas, or other relevant break-outs,
•	Describe the sources of uncertainties in the risk estimates, and quantify them, to the
extent possible, and
•	Explicitly identify the impact of science and data, as opposed to policy choices, as the
source of various elements of the risk assessment.
We have found that this standard has been followed in an incomplete fashion in documents
supporting regulations, as well as other EPA risk assessments. The draft Office of the Inspector
General (OIG) report refers repeatedly to the second and third elements of EPA's Risk
Characterization Policy, both in describing its findings and in its recommendations. We
recommend that OIG examine this policy (in effect during most of the time period covered by
the pilot study), and use it as a framework for presenting its results and suggestions.
A Call for the Development of "Principles of Analytic Integrity"
Recently, the Administrator reaffirmed the "Principles of Scientific Integrity," establishing clear
and ethical standards that should govern the conduct of scientific studies within the Agency. To
date, there is no parallel document establishing standards for the use of research in a policy
analytic setting. OIG may wish to recommend that such a document be developed expanding on
its recommendations for clarity of presentation, etc. and drawing on other Agency guidelines
such as The Guidelines for Preparing Economic Analyses.
Characterization of the RAPIDS Data Base and its Capabilities
RAPIDS tracks all substantive rulemakings appearing in the Semi-Annual Regulatory Agenda as
well as a number of actions not in the Agenda, such as Reports to Congress, Policy Plans, etc.
RAPIDS does not track every non-substantive rulemaking (SIPs, SNURs, FIPs, State Approvals,
etc.), but a sister database to RAPIDS (Federal Register Tracking Database - FR Dailies), also
maintained by OPEI's Regulatory Management Staff (RMS), tracks every EPA action sent to and
published in the Federal Register. These actions generally are not economically significant, are
not normally reviewed by OMB, and therefore are classified as "not significant."
RAPIDS records go back a number of years (1996 forward) with some rulemaking records from
earlier years available. RAPIDS also tracks NPRMs published in many of those same years. The
Regulatory Management Staff (RMS) has built numerous views in RAPIDS and has a view (list)
of rules finalized each year.
The report seems to confuse or not clearly differentiate between "significant" rulemakings (those
OMB reviews) and "economically significant rulemakings" (economic impact of greater than
$100 million per year). RAPIDS separates out those rules identified as "economically
significant." This designation has only been in effect for rules in the Semi-annual Regulatory
Agenda as Priority "A" (Economically Significant) since 1995. Although it is more difficult to
clearly identify economically significant rules for years before 1995, RAPIDS identifies 50 final
rules as economically significant for the years 1994-2001 and can produce lists of economically
significant rules published final for the years 1990 to the present.
For additional information regarding the capabilities and content of the RAPIDS database or its
sister database, OIG staff may wish to contact RMS staff directly (Darryl Adams is the contact
for RAPIDS).
-------
Attachment 2
REACTIONS TO SUGGESTIONS
1) Consider presenting the scientific findings that support a rule in specific sections of
the preambles. These findings should be organized according to the norms of
science...
This suggestion is consistent with the Agency's efforts related to the use of and dissemination of
information covered by the new Information Quality Guidelines (IQ Guidelines). These
Guidelines have been developed in response to Section 515 of the Treasury and General
Government Appropriations Act for FY 2001 (often referred to as the Data Quality Act) that
directs the Office of Management and Budget (OMB) to issue guidelines that provide policy and
procedural guidance to Federal agencies and direct Federal agencies to:
•	adopt a basic standard of quality as a performance goal and take appropriate steps to
incorporate information quality criteria into agency information dissemination practices;
•	issue guidance for ensuring and maximizing the quality, objectivity, utility, and integrity
of information disseminated by the agency;
•	establish administrative mechanisms allowing affected persons to obtain correction of
information that does not comply with the guidelines; and
•	submit an annual report, beginning January 1, 2004, to OMB on the number and
disposition of complaints received.
OMB published its guidelines to Agencies on October 1, 2001, and required agencies, including
EPA, to develop and publish their own information quality guidelines by October 1, 2002. The
Office of Environmental Information has led development of the IQ Guidelines within EPA.
These Guidelines were developed as a Tier 1 rulemaking, with broad participation across the
Agency and included briefings with the Deputy Administrator. EPA submitted its draft final IQ
Guidelines to OMB on August 1 and the new Guidelines become effective October 1, 2002. Per
OMB requirements, EPA published draft guidance and received public comment on the draft
document.
The guidelines are consistent with many existing Agency practices and policies (e.g., EPA
considered its peer review and risk characterization policies while developing its guidelines).
However, the statutory basis and underlying OMB guidelines provide additional emphasis on
EPA's implementation of these information quality practices and policies. The complaint
resolution process will further intensify accountability for Agency staff, line managers, and
senior managers to ensure quality information products. Pre-dissemination review processes
need to be reviewed, and there is a plan to develop minimum standards for pre-dissemination
review (product review) consistent with the Guidelines. OEI is developing a communication
strategy as well as the details and procedures necessary to implement the complaint resolution
process, including appeals and the appeals panel. OEI is responsible for ensuring that sufficient
information is available to report complaint resolution in conformance with OMB deadlines.
Each program office and region is responsible for implementing the guidelines when they are
finalized and implemented on October 1, 2002.
OPEI believes that a full implementation of the IQ Guidelines will improve the Agency's
performance related to its discussion regarding the use of science in rulemakings.
This is also an area where OPEI and ORD together can develop more complete recommendations
regarding the presentation of scientific findings in preamble discussions. OPEI and ORD are
both increasing their presence in Agency rulemakings as a result of last year's Task Force on
Improving Regulation Development. OPEI believes that this increased participation by ORD and
OPEI analysts will improve the attention to and discussion of the results of the underlying
analysis, including but not limited to science, used to support EPA regulations. This discussion
would be consistent with the IQ Guidelines, existing policies such as
the risk characterization policy, and some of the key findings of your report.
2)	Focus more attention in the development phase of regulations on collecting data and
doing research to close "blind spots" to support rulemakings.
The purpose of an analytic blueprint is to identify research needs and guide data collection and
research studies during the development phase of regulations. While a requirement for
developing, updating, and following an "analytic blueprint" has been a formal part of EPA's rule-
making process for more than a decade, it has been OPEI's experience that most analytic
blueprints are treated as little more than formalities. As a result of last year's review and
reassessment of EPA's rule-making process, OPEI and the program offices are taking steps to
make the blueprints more central and relevant to actual rule-making decisions. We suggest that
the OIG report consider referring to the analytic blueprints as one means to achieve the results
desired in Suggestion 2.
3)	Take advantage of EPA's information technology...
OPEI is currently evaluating and enhancing RAPIDS in order to improve the management
information that is available or potentially obtainable. To date, RAPIDS has focused on tracking
regulation development progress and facilitating EPA's submission of its portion of the Semi-
Annual Regulatory Agenda to OMB. OPEI is interested in adding features that enhance
management accountability and improved performance metrics. RAPIDS currently links to
relevant guidance and policy documents. OPEI will continue to improve RAPIDS and seek to
take advantage of other information technology capabilities over the next year. Much of this
work will be coordinated through the Regulatory Steering Committee or Regulatory Policy
Council. We will follow up with you over the next several months to more fully understand these
recommendations and identify what specific changes or opportunities we can adopt.
4) Reinforce EPA's current peer review policy, ensuring that all EPA-generated
documents critical to significant and substantive rulemakings are independently
peer reviewed, and that the responses to the significant comments appear in the
documents.
OPEI fully supports this recommendation on peer-review of critical documents and in fact has
recently extended this peer-review policy to include economic analyses. OPEI is working
closely with the Agency's Program Offices to ensure that a full review of supporting economic
analyses for all economically significant rules occurs prior to the rule's submission to OMB. In
this way, the application of sound and consistent economic practices is ensured and the Agency's
position on the use of sound science strengthened.
-------
Attachment 3
DETAILED COMMENTS
Page 1 ("What We Did and Why")
It is unfortunate that economic analysis was not included in the pilot study, especially given the
importance the Administrator has recently placed on the role of economics in rule making. It
may be worth mentioning that although economics is important to the rule making process, you
chose to focus on chemistry, biology, health sciences, etc. for the purposes of the pilot study.
In the last paragraph, consider elaborating a bit more regarding what was presented to the
Research Strategies Advisory Committee and the Regulatory Steering Committee as well as the
comments you received from them.
Page 2
In the first full paragraph, more exposition would be useful for those not familiar with the
Government Auditing Standards. Do the standards recommend the testing of controls?
A short summary of the methodology employed would also be helpful for the reader if provided
early in the report. We recommend the following:
"The pilot study was conducted in steps. Once we learned more about the rule-making
process, we began our pilot study by identifying all significant rules that were eligible for
the study and then selected a small sample to pursue in case studies. For each selected
rule, we identified primary contacts involved in the rule making and contacted each
individual via email for assistance in identifying the critical science documents. We then
attempted to locate each primary and secondary science document underlying the rule-
making process for each selected rule. For each located document, we established who
conducted the study, how the study was funded, and the level of peer-review the study
received. Each step is discussed in more detail below and the findings for each selected
rule are summarized in the case studies located in the appendix."
In the second paragraph consider changing the wording so the sentence reads:
"The remaining notices are for rules that primarily impact individual States, Tribes, or
sites...."
Page 3 ("We Identified Significant Rules Since 1990 and Selected 15 Case Studies")
For the title, consider inserting "as" so that the title reads "We identified significant rules since
1990 and selected 15 as case studies."
The second paragraph could be made clearer by making the following changes (shown in italics):
"Therefore, we used the Federal Register and EPA website to identify 88 significant rules
finalized in 1990 through 2001. They are listed in Exhibit 1. The list of rules promulgated
before 1994 may be incomplete since EPA's web-based materials tended to be dated 1994
and later. Focusing on the 74 rules finalized from 1994 on, we show in Figure 1 that
more than half (38) of these rules were issued under the Clean Air Act...."
Page 4 ("We Sought to Identify the Critical Science Behind Each Rule")
Footnote 2: The composition of the pilot team is an important factor in how the study was
conducted and by whom. This footnote should perhaps be moved to the text.
Page 5
Top of page: Is it the case that all primary contacts for the 15 rules responded? Were all primary
contacts still at EPA? Presumably the 83 contacts mentioned at the end of the paragraph were
those identified by the primary contacts. Are the primary contacts for each rule included in the
83? Did the pilot study team identify the contacts or did the primary contacts identify the
contacts, or were the 83 contacts all primary contacts identified by the pilot study team? What
proportion of the 83 contacts were at EPA?
Second paragraph: In the email sent to contacts, was a description of the project included in the
email? Was any official endorsement of the study by a manager included to help gain
cooperation?
Third paragraph (first full paragraph): In this paragraph you mention that follow-up contacts were
made with those individuals who had not responded. What kinds of follow-ups were made?
Were the follow-ups by email? Telephone? Did the follow-ups yield any additional
cooperation? What was the breakdown of the 83 contacts by role (EPA execs, peer reviewers,
etc.)? When stating, "We then turned to a combination of interviews and reading materials in the
dockets," who was interviewed? Presumably, the report refers to those individuals who
responded to the initial inquiries. Was a standard set of questions asked in each interview? How
many attempts were made to contact stakeholders and peer-reviewers? Was accurate contact
information available for each individual or was it the case that the contact information had
changed and the person could not be found?
Page 6
First paragraph: The example of "think of laying a brick wall" was not as helpful as it could be
and does not match the situation entirely. Perhaps if the example were reworded to read "think
of a brick wall comprised of many individual pieces...."
Second paragraph: The second paragraph mentions an advisory from the Research Strategies
Advisory Committee. Does this refer to the recommendations received from the committee as a
result of the briefing they received about the study? The suggestions they made should be
summarized earlier in the report, as noted above.
The first sentence is not as clear as it could be. Consider rewording it so that it reads:
"Identifying the critical science inputs to the various rules proved to be a much more difficult
task than expected given that the pilot study team members carrying out the identification process
were not involved in the original rulemaking. As a result, for some rules, the data are likely to be
incomplete. We encountered a particular problem as we traced...."
Readers of this paragraph may get the health criteria documents and the critical documents
confused. Are the underlying studies primary or secondary critical documents? According to the
text box, it appears they should be secondary. A few wording changes might clarify this.
Consider making the following changes:
"These underlying studies then become critical secondary documents (there are
usually more than one per health criteria document)".
Rewording the next few sentences may also help clarify the discussion here (changes noted in
italics):
"Thirty-three of these underlying criteria documents were identified for Cases 2, 9 and
11 alone. Because the number of critical secondary documents increases exponentially as
one goes backward through the citation chain, locating these documents becomes a very
time-consuming process."
Page 7 ("We Identified the Sources of the Critical Science")
First full paragraph: Add comma and remove "or" so that the sentence reads, "Who performed
the research was often identified on the title page of reports, the by-lines in journal articles, or in
the acknowledgements...."
Second paragraph: Again, the report should establish who the respondents are. Are these the 7
people who responded to the initial inquiry?
Third paragraph: ("We asked the respondents about science gaps and science quality")
Last sentence: the rating scale should be defined (what does one mean? What does five mean?)
Last paragraph: ("We identified the type of peer review undergone by the critical science")
This section may read better with some reordering: cut the first paragraph in this section and
move the OMB quote and the sentence preceding it so that they follow the now second to last
paragraph in this section (before the paragraph starting "Some documents indicated...").
Page 9 ("Science Played a Critical Role in the Rules, but That Role May Not Be Clear")
Title: Consider changing the title wording so that it reads "...but that role was not always clear."
The wording change should be carried over to the last sentence in the first paragraph.
Page 10
Second bullet: "We also identified critical science documents without which it is reasonable to
believe the section authorizing the rule might not have been included in the Clean Air Act."
This seems to mean that you reviewed Congress' use of science, which is interesting, but seems
beyond the scope of the report. Did you undertake similar efforts to review the science
underlying other relevant statutes passed during the study period?
Page 11 ("Role of Science Not Made Clear")
Second full paragraph: Consider changing "science" to "scientific" so that the sentence reads
"However, we saw little evidence of any of these conventions in communicating the scientific
underpinnings of the rules in the preambles...."
Last sentence of the second full paragraph: consider adding "described below" so that the
sentence reads "Two of the preambles described below provided examples of good practices in
the presentation of the data."
Page 12
The last paragraph on page 12 says that "only six [preambles] were well-referenced to the science
underpinnings." Five bullets follow, and most of them illustrate inadequate referencing, although
the second bullet seems to offer no criticism of the preamble of Case 14. This list is confusing,
and contrasts with the two illustrative bullets on pages 11 and 12, which unambiguously detail
examples of "good practices in the presentation of data."
Page 15 ("Who Funded the Critical Work")
"Intrastate" should be "interstate."
Table 3. It is not clear from the text how the count for ORD critical science documents came to
75. The text states that ORD funded 74 of the secondary documents and that EPA program
offices funded all but one of the primary documents. Did ORD fund a primary document? If so,
this should be stated clearly in the text. Also, it would be useful to break the "Other" category
into its constituent parts (primarily State governments and industry, according to the text).
The text states on page 9 that 436 critical scientific studies were identified and classified in
various ways, in Tables 2, 3, 4, and 5. Although footnote 5 on page 14 notes that some
categories can be counted more than once (so the totals for Tables 2, 3, and 4 are greater than
436), it would be clearer if each of those three tables contained a note to that effect.
Page 17 ("More Data and Fewer "Blind Spots" Could Reduce Assumptions")
The finding that more data and fewer 'blind spots' could reduce assumptions seems reasonable,
but the text does not say that a great number of conservative assumptions were made where there
were "blind spots." Was this, in fact, the case? It seems the text should support the finding more
clearly.
Consider refraining from referring to the rules as "pilot rules." The rules themselves have been
finalized and are therefore not "pilot." Perhaps you could refer to them as "the rules included in
this pilot study" or "the selected rules" or simply "the rules."
Second full paragraph, first sentence: Consider changing the wording order so that the sentence
reads "Based on the responses, we concluded that having more data would have resulted in even
more efficient rules, because...."
Second full paragraph, second sentence: Remove the extra period at the end of the sentence.
Last paragraph ("Critical science supporting the rules was not always independently peer
reviewed"): Be consistent on the relative merits of peer review and public comment. This
paragraph states that public comment does not substitute for peer review, which seems sound and
reasonable. However, page A-75 of the appendix provides an extended editorial on the relative
merits of peer review and public comment that suggests otherwise. The text in the appendix
even seems to question the validity of Agency peer review policy through the use of scare quotes:
"...This rulemaking process must be substantially more rigorous than the Agency's "peer
review" process...."
Last paragraph, last sentence: Consider giving an indication of how many total staff members
were in the sample. Add wording so that the sentence reads "Nonetheless, we were told by six of
the X EPA staff members...," where X = the total number of EPA staff members with whom you
were in contact.
Page 18
First sentence: Consider changing the wording so that sentence reads "A large number (290) of
the critical documents supporting the rules either were not peer reviewed (138) or their peer-
review status was indeterminate (152)." The "large number of the critical documents supporting
the pilot rules — 2,940" should apparently be the number "290."
Table 5: Consider changing last "Action" entry so that it reads, depending on the intended
meaning, either "Independent internal EPA review done through the risk assessment forum, ORD
or a program office" or "Independent internal EPA review done of a program office document,
such as through the risk assessment forum or ORD."
Page 21 ("Pilot Lessons Learned")
Third bullet: The meaning in this bullet is not entirely clear. Consider rewording as follows:
"A clear determination of a rule's relevant history should be made prior to the
commencement of future studies, where "relevant history" is defined as the length of time
preceding a rule's finalization during which review team members can be confident that
identified science documents meeting the requirements of "critical documents" can, in
fact, be defined as such. Any future studies of this sort should plan to conduct reviews
over the rule's relevant history."
Fifth bullet: Consider rewording as follows:
"Interviews should be conducted with as many people connected to the rulemaking as
possible. Special effort should be made to interview peer-reviewers and stakeholders.
Email is not an effective mechanism to elicit this kind of information."
Seventh bullet: The bullet should lead with the recommendation. Consider rewording as follows:
"There should be at least one research scientist on the team in spite of the fact that a
science background is not necessary to identify critical science documents. A science
background increases the efficiency of the identification process."
Exhibits
The Exhibits would be more informative if they were reorganized. Specifically, Exhibit 1 should
highlight the rules that are part of the study sample. For the other exhibits, so long as the name
and case number accompany each study, there is no need to present each list in chronological
order. Exhibits 2-6 can be reordered to highlight completeness, aggregate numbers or other
interesting findings along the dimensions displayed.
Appendices
Confirm coding on "who performed" the work. Are EPA-contracted reports 'private sector' or
'program office'? The body of the report suggests that EPA-contracted reports are performed by
the 'private sector,' but at least one entry in the appendix indicates otherwise. See specifically
A-45, reference #2. This may be an isolated error; we did not review each reference in detail.
Confirm coding for peer review: on pages A-78 and A-79, the peer review status of several
journal articles is coded as "unknown" although they appear to be from peer-reviewed journals
(including Fundamental and Applied Toxicology, now published as Toxicological Sciences;
Epidemiology, which is peer-reviewed; Toxicologic Pathology, which is peer-reviewed; and
others). These are simply examples; we did not review each reference in detail.
There are a number of typographical errors and minor editorial errors that need to be addressed,
particularly in the appendices.
-------
Report Distribution
Exhibit 11
Headquarters Officials
Associate Administrator for Policy, Economics, and Innovation (1804A) (paper copy)
Agency Followup Official (2710A) (paper copy)
Agency Audit Followup Coordinator (2724A) (paper copy)
Associate Administrator for Congressional and Intergovernmental Relations (1301A)
Associate Administrator for Communications, Education, and Media Relations (1701A)
Inspector General (2410)
Audit Liaisons
Pat Gilchriest, OA (1104A)
Pam Stirling, OPEI (1805T)
Peter Cosier, OAR (6102A)
Tom Coda, OAQPS (C40402)
Pat Keitt, OPPTS (7101M)
Cheryl Varkalis, ORD (8102R)
Johnsie Webster, OSWER (5103T)
Judy Hecht, OW (4101M)
Howard Levin, Region 5 (MF-10J)
Primary Contacts
Synthetic Chemicals Monitoring: Al Havinga, OGWDW
Acid Rain Permits: Larry Kertcher, OAP/OAR
Municipal Waste Combustors: Walter Stevenson, ESD
Land Disposal Restrictions: Rhonda Minnick, OSWER
Reformulated Gasoline: Paul Machiele, OTAQ
Great Lakes Water Quality: Mark Morris, OW
Polychlorinated Biphenyl Disposal: Tony Baney, OPPTS
Nonroad Diesel Engines: Karl Simon, OTAQ
Biotechnology: Elizabeth Milewski, OPPTS
Plant-Incorporated Protectants: Elizabeth Milewski, OPPTS
Disinfectants and Byproducts: Tom Grubbs, OGWDW
Regional Ozone: Tom Helms, OAQPS/OAR
Pulp and Paper (Air): Stephen Shedd, ESD/OAR
Pulp and Paper (Water): Donald Anderson, OW
Municipal Solid Waste Landfills: Martha Smith, ESD/OAR
Note: Electronic distribution unless otherwise indicated.
-------