United States Environmental Protection Agency
Science Policy Council
EPA 100/B-03/001
June 2003
U.S. Environmental Protection Agency
A Summary of General Assessment Factors for
Evaluating the Quality of Scientific and Technical
Information
Prepared for the U.S. Environmental Protection Agency
by members of the Assessment Factors Workgroup, a group of the
EPA's Science Policy Council
Science Policy Council
U.S. Environmental Protection Agency
Washington, DC 20460
Page ii Assessment Factors
DISCLAIMER
This document has been reviewed in accordance with United States Environmental
Protection Agency policy and approved for publication and distribution. Mention of trade names
or commercial products does not constitute endorsement or recommendation for use.
TABLE OF CONTENTS
FOREWORD
ACKNOWLEDGMENTS
A Summary of General Assessment Factors for Evaluating the Quality of Scientific and Technical Information
1. Introduction
1.1 Overview
1.2 Purpose
1.3 Background
2. Assessment Factors
2.1 General Assessment Factors
2.2 Examples of Questions Raised by Consideration of the Assessment Factors
2.3 Relationship Between the General Assessment Factors and the Elements of Quality in EPA's Information Quality Guidelines
3. Summary
References
FOREWORD
This document was prepared under the auspices of the Science Policy Council (SPC) to
describe the assessment factors and considerations generally used by the Agency to evaluate the
quality and relevance of scientific and technical information. These general assessment factors
are founded in the Agency guidelines, practices and procedures that make up the EPA
information and quality systems, including existing program-specific quality assurance policies.
As such, the general assessment factors do not constitute new quality-related considerations, nor
does this document describe a new process for evaluating information. This document is
intended to raise the awareness of the information-generating public about EPA's ongoing
interest in ensuring and enhancing the quality of information available for Agency use. Further,
it complements the Guidelines for Ensuring and Maximizing the Quality, Objectivity, Utility, and
Integrity of Information Disseminated by the Environmental Protection Agency (EPA
Information Quality Guidelines). This summary of Agency practice is also an additional resource
for Agency staff as they evaluate the quality and relevance of information, regardless of source.
Consistent with the Agency's approach to the development of the EPA Information
Quality Guidelines, this document is the product of an open, collaborative process between EPA
and the public. During the development of this document, EPA obtained public comments on a
draft version of the document released in September 2002 and commissioned the National
Academy of Sciences to host a workshop in January 2003 to discuss key aspects of this document
from a scientific and technical perspective.
We want to acknowledge and thank the Assessment Factors workgroup for its steady and
insightful work in assembling this document under stringent time constraints and scrutiny. We
particularly appreciate the efforts of the co-chairs, Haluk Ozkaynak (ORD) and Greg Schweer
(OPPTS), who successfully led and shepherded the workgroup.
It is with great pleasure that we present the Summary of General Assessment Factors for
Evaluating the Quality of Scientific and Technical Information.
Paul Gilman, Ph.D.
Science Advisor to the Agency
Chair, Science Policy Council
Elaine Stanley
Director, Office of Information Analysis
and Access
Office of Environmental Information
Science Policy Council
Paul Gilman, Science Advisor to the Agency, Chair
Devereaux Barnes, OSWER
Robert Brenner, OAR
Jerry Clifford, OIA
William Farland, ORD
Geoffrey Grubbs, OW
Diana Love, OECA
Bharat Mathur, Region 5
Albert McGartland, OPEI
Stan Meiburg, Region 4
Joseph Merenda, OPPTS
William Muszynski, Region 2
Joanne Rodman, OCHP
Michael Ryan, OCFO
Michael Shapiro, OW
Elaine Stanley, OEI
Ramona Trovato, OEI
Vanessa Vu, SAB
Anna Wolgast, OGC
Science Policy Council Steering Committee
Randolph Perfetti, Chair
Thomas Baugh, Region 4
Michael Brody, OCFO
Reginald Cheatham, OEI
Patricia Cirone, Region 10
John Diamante, OIA
Michael Firestone, OCHP
A. Robert Flaak, SAB
Jerri-Anne Garl, Region 5
Roland Hemmett, Region 2
Lee Hofmann, OSWER
Daniel Malloy, OCFO
Kate Mahaffey, OPPTS
Carl Mazza, OAR
James Nelson, OGC
Jennifer Orme-Zavaleta, ORD
Rosemarie Russo, ORD
Rita Schoeny, OW
Margaret Stasikowski, OPPTS
Kevin Teichman, ORD
Mary Ellen Weber, OPPTS
William Wood, ORD
Tracey Woodruff, OPEI
Science Policy Council Staff
Edward Bender Kerry Dearfield Kathryn Gallagher
ACKNOWLEDGMENTS
Many people worked on and contributed to the assessment factors effort that ultimately
resulted in this document. Many EPA employees from the Agency's Offices and Regions
provided input and we would like to specifically acknowledge the efforts and contributions made
by the following individuals:
Haluk Ozkaynak, ORD, Co-Chair
Greg Schweer, OPPTS, Co-Chair
Mary Belefski, OPPTS
Katherine Biggs, OECA
Connie Bosma, ORD
Ming Chang, OEI
Weihsueh Chiu, ORD
Evangeline Cummings, OEI
Kerry Dearfield, SPC Staff
Thomas Eagles, OAR
Kathryn Gallagher, SPC Staff
Staci Gatica-Hebert, ORD
Roland Hemmett, Region 2
Annie Jarabek, ORD
Stephen Kroner, OSWER
Kevin Kubik, Region 2
Karen Martin, OAR
Amy Mills, ORD
Barbara Pace, OGC
Nicole Paquette, OEI
Devon Payne-Sturges, OPEI
Ward Penberthy, OPPTS
James Quackenboss, ORD
Mary Reiley, OW
Bruce Rodan, ORD
Terry Simpson, ORD
Elaine Stanley, OEI
Lanelle Wiggins, OPEI
U.S. Environmental Protection Agency
A Summary of General Assessment Factors for Evaluating the
Quality of Scientific and Technical Information
1. Introduction
1.1 Overview
As part of the ongoing commitment of the United States Environmental Protection
Agency (USEPA) to ensure the quality of the information it uses, the Agency is publishing this
summary of general assessment factors to enhance transparency about EPA's
quality expectations for information that is voluntarily submitted to or gathered or generated by
the Agency for various purposes. This Assessment Factors document is intended to inform
information-generating scientists about quality issues that should appropriately be taken into
consideration at the time information is generated. It is also an additional resource for Agency
staff as they evaluate the quality and relevance of information, regardless of source. The general
assessment factors are drawn from the Agency's existing information quality systems, practices
and guidelines that describe the types of considerations EPA takes into account when evaluating
the quality and relevance of scientific and technical information used in support of Agency
actions. As such, the general assessment factors do not constitute new quality-related
considerations, nor does this document describe a new process for evaluating information. This
document is intended to raise the awareness of the information-generating public about EPA's
ongoing interest in ensuring and enhancing the quality of information available for Agency use.
1.2 Purpose
The Agency believes that the summary of general assessment factors provided in this
document will serve to increase the extent to which the information-generating public builds
quality considerations into the generation and documentation of their information products. The
Agency expects that the resulting improvements in the quality of such information will enable the
Agency to more fully utilize and disseminate such information. Thus, this document is intended
to complement the Guidelines for Ensuring and Maximizing the Quality, Objectivity, Utility, and
Integrity of Information Disseminated by the Environmental Protection Agency (EPA
Information Quality Guidelines) (EPA, 2002) and other Agency efforts to ensure and enhance
information quality, as discussed below in Section 1.3. This document is not a regulation and is
not intended to create any legal rights or impose legally binding requirements or obligations on
EPA or the information-generating public.
Although the assessment factors as presented are intended primarily to apply to
individual pieces of information, they can also be used as part of a broader evaluation of a body
of evidence that is collectively evaluated through a process typically referred to as a "weight-of-
evidence" approach. The weight-of-evidence approach considers all relevant information in an
integrative assessment that takes into account the kinds of evidence available, the quality and
quantity of the evidence, the strengths and limitations associated with each type of evidence and
explains how the various types of evidence fit together. Details as to the Agency's approach to
integrating a body of evidence depend on the type of decision or action being undertaken, and are
not the subject of this document. For instance, the Guidelines for Carcinogen Risk Assessment,
Review Draft (EPA, 1999) provides guidance on characterizing the weight-of-evidence for
carcinogenicity. Similarly, the Guidelines for Ecological Risk Assessment (EPA, 1998) describes
the development of "lines of evidence" to reach a conclusion regarding an ecological risk
estimate.
The general assessment factors are presented and discussed more fully in Section 2.1.
Section 2.2 presents illustrative examples of the types of questions that consideration of these
factors raises in the process of evaluating the quality and relevance of different types of
information for different uses. The relationship between these general assessment factors and the
elements of quality contained in the EPA Information Quality Guidelines is discussed in Section
2.3.
1.3 Background
In October 2002, EPA made available the EPA Information Quality Guidelines. The
EPA Information Quality Guidelines were developed in response to guidelines issued by the
Office of Management and Budget (OMB, 2002) under Section 515 of the Treasury and General
Government Appropriations Act for Fiscal Year 2001 (Public Law 106-554; H.R. 5658). The
EPA Information Quality Guidelines set forth the Agency's policy and procedural guidance for
ensuring and maximizing the quality of information disseminated by EPA, regardless of the
source of the information, and articulate the Agency's ongoing commitment to ensuring and
maximizing information quality through existing policies, systems and programs. Thus, the EPA
Information Quality Guidelines build upon the Agency's numerous existing systems, practices
and guidelines that address information quality, and provide new policies and administrative
mechanisms that respond to OMB's guidelines.
The EPA Information Quality Guidelines also recognize that, as part of its efforts to
ensure information quality, the Agency does not wait until the point at which information is
disseminated to consider important quality principles. Rather, the Agency recognizes that it is
important to assure the quality of information through processes that incorporate quality
principles starting at the point at which information is generated.
The Agency uses and disseminates information that is generated by a variety of sources,
including EPA itself as well as other parties that produce information through EPA contracts,
grants and cooperative and interagency agreements or in response to a requirement under a
statute, regulation, permit, order or other mandate. EPA generally has considerable control or
influence over the quality of this information at the time the information is generated. Existing
quality controls that EPA applies to the generation of information from these sources are based
on EPA's Quality System (EPA, 2000a; EPA, 2000b), Peer Review Policy (EPA, 1994), Risk
Characterization Policy (EPA, 1995) and other agency-wide and program-specific policies, as
well as specific provisions in contracts, grants, agreements, regulations and statutes. A few
additional useful web sites for obtaining further information on EPA's Quality System and
various regulatory policies and decisions are provided under the References section at the end of
this document.
The Agency also receives information that is voluntarily submitted by or collected from
external sources, the generation of which does not come under the direct control of the Agency's
internal information quality systems. This information may include scientific studies published
in journal articles, testing or survey data, such as environmental monitoring or laboratory test
results, and analytic studies, such as those that model environmental conditions or that assess
risks to public health. Since EPA has placed great emphasis on the management of
environmental issues on a cooperative basis with its many stakeholders, the amount of
information submitted to EPA by external sources is increasing. Such sources include other
federal, state, tribal, local and international agencies; national laboratories; academic and
research institutions; business and industry; and public interest organizations. Although EPA's
existing quality systems are not applied at the time this information is generated, EPA does apply
appropriate quality controls when evaluating this information for use in Agency actions and for
its dissemination consistent with the EPA Information Quality Guidelines. The Agency hopes
this document will inform the public of EPA's objectives and enlist the public's support in
its effort to disseminate quality information and make quality decisions.
During the development of this document, EPA requested public input in a variety of
ways. EPA distributed a draft document for public comment in September 2002 and hosted a
public meeting in Washington, DC. In January 2003, EPA commissioned the National Academy
of Sciences to host a workshop to discuss key aspects of this document from a scientific and
technical perspective. EPA revised this document based on the input received through these
public outreach opportunities.
2. Assessment Factors
2.1 General Assessment Factors
When evaluating the quality and relevance of scientific and technical information, the
considerations that the Agency typically takes into account can be characterized by five general
assessment factors:
• Soundness - The extent to which the scientific and technical procedures,
measures, methods or models employed to generate the information are
reasonable for, and consistent with, the intended application.
• Applicability and Utility - The extent to which the information is relevant for the
Agency's intended use.
• Clarity and Completeness - The degree of clarity and completeness with which
the data, assumptions, methods, quality assurance, sponsoring organizations and
analyses employed to generate the information are documented.
• Uncertainty and Variability - The extent to which the variability and uncertainty
(quantitative and qualitative) in the information or in the procedures, measures,
methods or models are evaluated and characterized.
• Evaluation and Review - The extent of independent verification, validation and
peer review of the information or of the procedures, measures, methods or
models.
These assessment factors reflect the most salient features of EPA's existing information
quality policies and guidelines. Whether the information consists of scientific theories, computer
codes for modeling environmental systems, environmental monitoring data, economic analyses,
social survey or demographic data, chemical toxicity testing, environmental fate and transport
predictions or a human health risk assessment, EPA generally evaluates information by weighing
considerations that fit within these five assessment factors. Thus, these factors encompass
considerations that are weighed in the process of evaluating the quality and relevance of
information. The appropriate level of quality for any particular information product is
necessarily related to how and in what context the information is to be used. If EPA later
chooses to "disseminate" the information, that dissemination would be covered by the Information
Quality Guidelines, which describe EPA policy and procedures for reviewing and substantiating
the quality of information before EPA disseminates it.
When EPA considers using information for a particular purpose, careful judgment is
applied to evaluate the information for quality and relevance in the context of the specific
Agency action being developed. For instance, in the context of a given action, EPA may need to
weigh the appropriateness of using information with significant but known uncertainties to fill
"data gaps," relative to using default assumptions or committing additional resources to generate
new information.
2.2 Examples of Questions Raised by Consideration of the Assessment Factors
Example questions that could be raised by the consideration of each of the assessment
factors for various types of information are provided below. Given the very general nature of
these assessment factors, the Agency felt that a compilation of such illustrative questions would
most clearly convey the intended nature and breadth of the assessment factors, and how they
would be reflected in an evaluation of various types of information. However, the applicability
of these factors depends on the individual situation, and EPA retains discretion to consider and
use factors and approaches on a case-by-case basis that may differ from the illustrative
considerations presented below.
2.2.1 Soundness
The extent to which the scientific and technical procedures, measures, methods or
models employed to generate the information are reasonable for, and consistent
with, the intended application.
a) Is the purpose of the study reasonable and consistent with its design?
b) To what extent are the procedures, measures, methods, or models
employed to develop the information reasonable and consistent with sound
scientific theory or accepted approaches?
c) How do the study's design and results compare with existing scientific or
economic theory and practice? Are the assumptions, governing equations
and mathematical descriptions employed scientifically and technically
justified? Is the study based on sound scientific or econometric
principles?
d) In the case of a survey, have the questionnaires and other survey
instruments been validated (e.g., compared with direct measurement data)?
Were checks for potential errors made during the interview process?
e) How internally consistent are the study's conclusions with the data and
results presented?
2.2.2 Applicability and Utility
The extent to which the information is relevant for the Agency's intended use.
a) How useful or applicable is the scientific or economic theory applied in
the study to the Agency's intended use of the analysis?
b) How relevant are the study's purpose, design, outcome measures and
results to the Agency's intended use of the analysis (e.g., for a chemical
hazard characterization)?
c) Are the domains (e.g., duration, species, exposure) over which the model or
results are valid relevant to the Agency's application?
d) How relevant is the study to current conditions of interest? For example,
in the case of a survey, are conditions likely to have changed since the
survey was completed (i.e., is the information still relevant)? Is the
sampled population relevant to the Agency's current application? How
well does the sample take into account sensitive subpopulations?
2.2.3 Clarity and Completeness
The degree of clarity and completeness with which the data, assumptions,
methods, quality assurance, sponsoring organizations and analyses employed to
generate the information are documented.
a) To what extent does the documentation clearly and completely describe
the underlying scientific or economic theory and the analytic methods
used?
b) To what extent have key assumptions, parameter values, measures,
domains and limitations been described and characterized?
c) To what extent are the results clearly and completely documented as a
basis for comparing them to results from other similar tests?
d) If novel or alternative theories or approaches are used, how clearly are they
explained, and how clearly are the differences from accepted theories or
approaches highlighted?
e) Is the complete data set accessible, including metadata, data-dictionaries
and embedded definitions (e.g., codes for missing values, data quality
flags and questionnaire responses)? Are there confidentiality issues that
may limit accessibility to the complete data set?
f) In the case of a modeling exercise, have the definitions and units of model
parameters been provided? To what extent have the procedures for
applying the model been clearly and completely documented? How
available and adequate is the information necessary to run the model
computer code?
g) To what extent are the descriptions of the study or survey design clear,
complete and sufficient to enable the study or survey to be reproduced?
h) Have the sponsoring organization(s) for the study/information product and
the author(s)' affiliation(s) been documented?
i) To what extent are the procedures for quality assurance and quality control
of the data documented and accessible?
2.2.4 Uncertainty and Variability
The extent to which the variability and uncertainty (quantitative and qualitative)
in the information or in the procedures, measures, methods or models are
evaluated and characterized.
a) To what extent have appropriate statistical techniques been employed to
evaluate variability and uncertainty? To what extent have the sensitive
parameters of models been identified and characterized?
b) To what extent do the uncertainty and variability impact the conclusions
that can be inferred from the data and the utility of the study? What are
the potential sources and effects of error and bias in the study design?
c) Did the study identify potential uncertainties such as those due to inherent
variability in environmental and exposure-related parameters or possible
measurement errors?
2.2.5 Evaluation and Review
The extent of independent verification, validation and peer review of the
information or of the procedures, measures, methods or models.
a) To what extent has there been independent verification or validation of the
study method and results? What were the conclusions of these
independent efforts, and are they consistent?
b) To what extent has independent peer review been conducted of the study
method and results, and how were the conclusions of this review taken
into account?
c) Has the procedure, method or model been used in similar, peer reviewed
studies? Are the results consistent with other relevant studies?
d) In the case of model-based information, to what extent has independent
evaluation and testing of the model code been performed and documented?
2.3 Relationship Between the General Assessment Factors and the Elements of
Quality in EPA's Information Quality Guidelines
The definition of quality in the EPA Information Quality Guidelines consists of three
components, consistent with the definition of quality in OMB's Guidelines: objectivity, utility
and integrity of disseminated information. "Objectivity" focuses on the extent to which
information is presented in an accurate, clear, complete and unbiased manner; and, as a matter of
substance, the extent to which the information is accurate, reliable and unbiased. "Utility" refers
to the usefulness of the information to the intended users. "Integrity" refers to security, such as
the protection of information from unauthorized access or revision, to ensure the information is
not compromised through corruption or falsification.
The five general assessment factors presented in this document are consistent with the
quality elements of objectivity and utility, but do not extend to the distinct element of integrity
(which refers to the separate matter of security issues). The assessment factor applicability and
utility is most directly related to the element of utility in the OMB and EPA Information Quality
Guidelines. The other four assessment factors relate to the element of objectivity, which itself
encompasses a number of issues related to both presentation and substance. In particular, the
factor clarity and completeness is most directly related to some aspects of the presentation of
information (including whether the information is "presented in an accurate, clear, complete and
unbiased manner"). The factors soundness, uncertainty and variability and evaluation and
review most directly relate to the substantive aspects of the element of objectivity (related to
whether the information itself is "accurate, reliable and unbiased"), although they also play a role
in enhancing aspects of the presentation of the information. Thus, the general assessment factors
are fully consistent with the related information quality elements described in the OMB and EPA
Information Quality Guidelines, and do not constitute a conceptually different or unrelated basis
for evaluating information quality.
It is important to note that the EPA Information Quality Guidelines apply to
"information" that EPA disseminates to the public. The EPA Information Quality Guidelines
apply to information generated by third parties if EPA distributes information prepared or
submitted by an outside party in a manner that reasonably suggests that EPA endorses or agrees
with it; if EPA indicates in its distribution that the information supports or represents EPA's
viewpoint; or if EPA in its distribution proposes to use or uses the information to formulate or
support a regulation, guidance, policy or other Agency decision or position (EPA 2002). Please
refer to the EPA Information Quality Guidelines for additional information regarding their
applicability to information EPA disseminates.
3. Summary
This document describes the assessment factors and considerations generally used by the
Agency to evaluate the quality and relevance of the broad range of scientific and technical
information used by the EPA. These factors are founded in the Agency guidelines, practices and
procedures that make up the EPA information and quality systems, including existing
program-specific quality assurance policies. Consistent with the Agency's approach to the
development of the EPA Information Quality Guidelines, this document is the product of an
open, collaborative process between EPA and the public.
The Agency believes that the summary of general assessment factors provided in this
document will serve to increase the extent to which the information-generating public builds
quality considerations into the generation and documentation of their information products. The
Agency expects that the resulting improvements in the quality of such information will enable the
Agency to more fully utilize and disseminate such information. Thus, this document is intended
to complement the EPA Information Quality Guidelines and other Agency efforts to ensure and
enhance information quality.
References
US Environmental Protection Agency, 1994, Peer Review and Peer Involvement at the U.S.
Environmental Protection Agency, Science Policy Council,
http://www.epa.gov/osp/spc/memo0607.htm
US Environmental Protection Agency, 1995, Policy for Risk Characterization, Science Policy
Council, http://www.epa.gov/osp/spc/rcpolicy.htm
US Environmental Protection Agency, 1998, Guidelines for Ecological Risk Assessment,
EPA/630/R-95/002F, Risk Assessment Forum,
http://cfpub.epa.gov/ncea/cfm/recordisplay.cfm?deid=12460
US Environmental Protection Agency, 1999, Guidelines for Carcinogen Risk Assessment,
Review Draft, NCEA-F-0644, Office of Research and Development,
http://cfpub.epa.gov/ncea/raf/cancer.cfm
US Environmental Protection Agency, 2000a, "Policy and Program Requirements for the
Mandatory Agency-wide Quality System," (EPA Order 5360.1 A2),
http://www.epa.gov/quality/qs-docs/5360-1.pdf
US Environmental Protection Agency, 2000b, "EPA Quality Manual for Environmental
Programs," (EPA Order 5360 A1), http://www.epa.gov/quality/qs-docs/5360.pdf
US Environmental Protection Agency, 2002, Guidelines for Ensuring and Maximizing the
Quality, Objectivity, Utility, and Integrity of Information Disseminated by the
Environmental Protection Agency, EPA 260R-02-008, Office of Environmental
Information, http://www.epa.gov/oei/qualityguidelines
US Office of Management and Budget, 2002, Guidelines for Ensuring and Maximizing the
Quality, Objectivity, Utility, and Integrity of Information Disseminated by Federal
Agencies, (67 FR 8452), http://www.whitehouse.gov/omb/fedreg/reproducible2.pdf
Additional Useful Web Sites
EPA Quality System web site: http://www.epa.gov/quality
EPA Science Policy Council web site: http://www.epa.gov/osp/spc
EPA Information Quality Guidelines web site: http://www.epa.gov/oei/qualityguidelines