UNITED STATES ENVIRONMENTAL PROTECTION AGENCY
                                  WASHINGTON D.C. 20460
                                                              OFFICE OF THE ADMINISTRATOR
                                                                SCIENCE ADVISORY BOARD
                                   August 7, 2008
EPA-SAB-08-010

The Honorable Stephen L. Johnson
Administrator
U.S. Environmental Protection Agency
1200 Pennsylvania Avenue, N.W.
Washington, D.C. 20460

       Subject:  Review of Draft "Multi-Agency Radiation Survey and Assessment of
               Materials and Equipment (MARSAME) Manual"

Dear Administrator Johnson:

       At the request of EPA's Office of Radiation and Indoor Air, the Science Advisory Board
(SAB) Radiation Advisory Committee (RAC), augmented with additional experts (referred to
herein as the Panel or the SAB MARSAME Review Panel), has completed its review of the
"Multi-Agency Radiation Survey and Assessment of Materials and Equipment (MARSAME)
Manual" (December 2006 Draft).  This draft manual was prepared by a multi-agency work
group with participation from the U.S. Department of Energy, the U.S. Nuclear Regulatory
Commission, the U.S. Department of Defense, and the U.S. Environmental Protection Agency
(U.S. EPA).  The multi-agency work group has been active since 1995, for some periods with
representation from additional federal agencies, to prepare a series of radiological guidance
documents, of which this is the third. The preceding documents are entitled "Multi-Agency
Radiation Survey and Site Investigation Manual (MARSSIM)" and "Multi-Agency
Radiological Laboratory Analytical Protocols (MARLAP) Manual." These manuals were
previously reviewed by the RAC, augmented with additional experts.

        The MARSAME manual complements MARSSIM (a surface soils radiation survey
manual) by providing a process for surveying potentially radioactive material and equipment
(M&E). It is a detailed document that provides guidance to determine whether M&E are
sufficiently free of radionuclide contamination to be admitted to or removed from a site. Its
chapters address the components of a survey plan: initial assessment, input needed for decision
making, survey design, survey implementation, and reaching a disposition decision.  The manual
begins with a road map to help the user navigate the manual, includes a chapter with illustrative
examples, and collects pertinent information in seven appendices. Much of its presentation is
based on the contents of MARSSIM and MARLAP because M&E surveys often are related to
site investigations and utilize laboratory analyses; however, an M&E survey may stand alone.
site investigations and utilize laboratory analyses; however, an M&E survey may stand alone.

       The SAB found the MARSAME manual to be an admirable cooperative effort, competently
written by staff from the several agencies, that provides guidance for an important endeavor.
The Panel expects the manual to be as widely applied as the two earlier radiological guidance
manuals and to contribute significantly to radiation protection for the U.S. population.  To assist
in this endeavor, the Panel presents 37 recommendations and a Statistical Analysis Appendix in
the enclosed review.

       The main recommendations are:

    •   To facilitate application of MARSAME by important users such as project managers and
       site cleanup specialists who are not radiation protection professionals, support the Manual
       by holding training courses.

    •   To improve ease in reading and applying MARSAME, combine in a separate chapter the
       detailed guidance for hypothesis testing, experimental design, and statistical analysis that
       is now dispersed throughout the Manual.
    •   To clarify application of MARSAME guidance,  improve the illustrative examples (now
       mislabeled  'case studies') by replacing them with real case studies, or enhance the
       illustrative examples with additional information based on real situations.
    •   To permit realistic design of M&E surveys in response to required action levels, include
       in MARSAME the pertinent regulations and guidance, and provide an updating
       mechanism.
    •   To avoid serious errors in delineating the radioactive contaminants of M&E, provide in
       MARSAME the same detailed discussion of contamination that is (1) removable from the
       surface and (2) distributed throughout the material volume as is currently devoted to the
       category of surface contamination that may be either fixed or removable.
    •   To expand the range of alternative M&E surveys available in MARSAME (from
       application of the full content of the Manual to 'no further action needed'), include
       descriptions of iterative release efforts, such as decontamination and storage for
       radioactive decay.
Other recommendations concern additional refinements and improvements in content and
presentation, including the concept that MARSAME be updated periodically.

       The SAB appreciates the opportunity to review this draft manual and hopes that the
recommendations provided will enable EPA and cooperating agencies to issue effective guidance
for radiological surveys of material and equipment. The revised MARSAME Manual will be a
useful document for EPA and other Federal and State
agencies in providing guidance to control transfer of material and equipment that may be
contaminated with radionuclides. We look forward to the Agency's response.

                                 Sincerely,
             /Signed/                               /Signed/

      Dr. M. Granger Morgan                   Dr. Bernd Kahn
      Chair, Science Advisory Board            Chair, Radiation Advisory Committee

                                       NOTICE

        This report has been written as part of the activities of the EPA Science Advisory Board
(SAB), a public advisory group providing extramural scientific information and advice to the
Administrator and other officials of the Environmental Protection Agency.  The SAB is
structured to provide balanced, expert assessment of scientific matters related to problems facing
the Agency. This report has not been reviewed for approval by the Agency and, hence, the
contents of this advisory do not necessarily represent the views and policies of the
Environmental Protection Agency, nor of other agencies in the Executive Branch of the Federal
government, nor does mention of trade names or commercial products constitute a
recommendation for use.  Reports and advisories of the SAB are posted on the EPA website at
http://www.epa.gov/sab.

                  U.S. Environmental Protection Agency
                          Science Advisory Board
  Radiation Advisory Committee (RAC) Augmented for Review of the
   Multi-Agency Radiation Survey and Assessment of Materials and
                      Equipment (MARSAME) Manual

CHAIR:
Dr. Bernd Kahn, Professor Emeritus, Nuclear and Radiological Engineering Program, and
Director, Environmental Radiation Center, GTRI, Georgia Institute of Technology, Atlanta, GA

PAST CHAIR:
Dr. Jill Lipoti, Director, Division of Environmental Safety and Health, New Jersey Department
of Environmental Protection, Trenton, NJ

RAC MEMBERS:

Dr. Thomas B. Borak, Professor, Department of Environmental and Radiological Health
Sciences,  Colorado State University, Fort Collins, CO

Dr. Antone L. Brooks, Professor, Radiation Toxicology, Washington State University Tri-
Cities, Richland, WA

Dr. Faith G. Davis, Senior Associate Dean, Professor of Epidemiology, Division of
Epidemiology and Biostatistics, School of Public Health, University of Illinois at Chicago,
Chicago, IL

Dr. Brian Dodd, Consultant, Las Vegas, NV

Dr. Shirley A. Fry, Consultant, Indianapolis, IN

Dr. William C. Griffith, Associate Director, Institute for Risk Analysis and Risk
Communication, Department of Environmental and Occupational Health Sciences, University of
Washington, Seattle, WA

Dr. Jonathan M. Links, Professor, Department of Environmental Health Sciences, Bloomberg
School of Public Health, Johns Hopkins University, Baltimore, MD

Mr. Bruce A. Napier, Staff Scientist, Radiological Science & Engineering Group, Pacific
Northwest National Laboratory, Richland, WA1
1      Mr. Napier was unable to attend the face-to-face meeting of October 29-31, 2007 and the conference call of
      March 10, 2008.

Dr. Daniel O. Stram, Professor, Department of Preventive Medicine, Division of Biostatistics
and Genetic Epidemiology, Keck School of Medicine, University of Southern California, Los
Angeles, CA

Dr. Richard J. Vetter, Head, Radiation Safety Program, Mayo Clinic, Rochester, MN

CONSULTANTS:

Mr. Bruce W. Church, President, BWC Enterprises, Inc., Hurricane, UT

Mr. Kenneth Duvall, Environmental Scientist/Consultant, Washington, D.C.

Dr. Janet A. Johnson, Consultant, Carbondale, CO

Dr. Paul J. Merges, President, Environment & Radiation Specialists, Inc., Loudonville, N.Y.


SCIENCE ADVISORY BOARD STAFF

Dr. K. Jack Kooyoomjian, Designated Federal Officer, 1200 Pennsylvania Avenue, NW,
Washington, DC, 20460-0001, Phone: 202-343-9984, Fax: 202-233-0643 or 0645
(kooyoomjian.jack@epa.gov), Messenger/Physical Delivery Address: 1025 F Street, NW, Room
3606, Mail Code 1400F

                     U.S. Environmental Protection Agency
                             Science Advisory Board

CHAIR
Dr. M. Granger Morgan, Lord Chair Professor in Engineering, Department of Engineering and
Public Policy, Carnegie Mellon University, Pittsburgh, PA

MEMBERS
Dr. Gregory Biddinger, Coordinator, Natural Land Management Programs, Toxicology and
Environmental Sciences, ExxonMobil Biomedical Sciences, Inc, Houston, TX

Dr. Thomas Burke, Professor, Department of Health Policy and Management, Johns Hopkins
Bloomberg School of Public Health, Johns Hopkins University, Baltimore, MD

Dr. James Bus, Director of External  Technology, Toxicology and Environmental Research and
Consulting, The Dow Chemical Company, Midland, MI

Dr. Deborah Cory-Slechta, Professor, Department of Environmental Medicine, School of
Medicine and Dentistry, University of Rochester, Rochester, NY

Dr. Maureen L. Cropper, Professor, Department of Economics, University of Maryland,
College Park, MD

Dr. Virginia Dale, Corporate Fellow, Environmental Sciences Division, Oak Ridge National
Laboratory, Oak Ridge, TN

Dr. Kenneth Dickson, Regents Professor, Department of Biological Sciences, University of
North Texas, Aubrey, TX

Dr. David A. Dzombak, Walter J. Blenko Sr. Professor of Environmental Engineering,
Department of Civil and Environmental Engineering, College of Engineering, Carnegie Mellon
University, Pittsburgh, PA

Dr. Baruch Fischhoff, Howard Heinz University Professor, Department of Social and Decision
Sciences, Department of Engineering and Public Policy, Carnegie Mellon University, Pittsburgh,
PA

Dr. James Galloway, Professor, Department of Environmental Sciences, University of Virginia,
Charlottesville, VA

Dr. James K. Hammitt, Professor, Center for Risk Analysis, Harvard University, Boston, MA

Dr. Rogene Henderson, Senior Scientist Emeritus, Lovelace Respiratory Research Institute,
Albuquerque, NM

Dr. James H. Johnson, Professor and Dean, College of Engineering, Architecture & Computer
Sciences, Howard University, Washington, DC

Dr. Bernd Kahn, Professor Emeritus, Nuclear and Radiological Engineering Program, and
Director, Environmental Radiation Center, GTRI, Georgia Institute of Technology, Atlanta, GA

Dr. Agnes Kane, Professor and Chair, Department of Pathology and Laboratory Medicine,
Brown University, Providence, RI

Dr. Meryl Karol, Professor Emerita, Graduate School of Public Health, University of
Pittsburgh, Pittsburgh, PA

Dr. Catherine Kling, Professor, Department of Economics, Iowa State University, Ames, IA

Dr. George Lambert, Associate Professor of Pediatrics, Director, Center for Childhood
Neurotoxicology, Robert Wood Johnson Medical School-UMDNJ, Belle Mead, NJ

Dr. Jill Lipoti, Director, Division of Environmental Safety and Health, New Jersey Department
of Environmental Protection, Trenton, NJ

Dr. Michael J. McFarland, Associate Professor, Department of Civil and Environmental
Engineering, Utah State University, Logan, UT

Dr. Judith L. Meyer, Distinguished Research Professor Emeritus, Institute of Ecology,
University of Georgia, Athens, GA

Dr. Jana Milford, Associate Professor, Department of Mechanical Engineering, University of
Colorado, Boulder, CO

Dr. Rebecca Parkin, Professor and Associate Dean, Environmental and Occupational Health,
School of Public Health and Health Services, The George Washington University Medical
Center, Washington, DC

Mr. David Rejeski,  Director, Foresight and Governance Project, Woodrow Wilson International
Center for Scholars,  Washington, DC

Dr. Stephen M. Roberts, Professor, Department of Physiological Sciences, Director, Center for
Environmental and Human Toxicology, University of Florida, Gainesville, FL

Dr. Joan B. Rose, Professor and Homer Nowlin Chair for Water Research, Department of
Fisheries and Wildlife, Michigan State University, East Lansing, MI

Dr. James Sanders, Director and Professor, Skidaway Institute of Oceanography, Savannah,
GA

Dr. Jerald Schnoor, Allen S. Henry Chair Professor, Department of Civil and Environmental
Engineering, Co-Director, Center for Global and Regional Environmental Research, University
of Iowa, Iowa City, IA

Dr. Kathleen Segerson, Professor, Department of Economics, University of Connecticut, Storrs,
CT

Dr. Kristin Shrader-Frechette, O'Neil Professor of Philosophy, Department  of Biological
Sciences and Philosophy Department, University of Notre Dame, Notre Dame, IN

Dr. V. Kerry Smith, W.P. Carey Professor of Economics, Department of Economics, W.P.
Carey School of Business, Arizona State University, Tempe, AZ

Dr. Deborah Swackhamer, Interim Director and Professor, Institute on the Environment,
University of Minnesota,  St.  Paul, MN

Dr. Thomas L. Theis, Director, Institute for Environmental Science and Policy, University of
Illinois at Chicago, Chicago, IL

Dr. Valerie Thomas, Anderson Interface Associate Professor, School of Industrial and Systems
Engineering, Georgia Institute of Technology, Atlanta, GA

Dr. Barton H. (Buzz) Thompson, Jr., Robert E. Paradise Professor of Natural Resources Law
at the Stanford Law School and Director, Woods Institute for the Environment, Stanford
University, Stanford, CA

Dr. Robert Twiss, Professor Emeritus, University of California-Berkeley, Ross, CA

Dr. Lauren Zeise, Chief, Reproductive and Cancer Hazard Assessment Branch, Office of
Environmental Health Hazard Assessment, California Environmental Protection Agency,
Oakland, CA

SCIENCE ADVISORY  BOARD STAFF
Mr. Thomas Miller, Designated Federal Officer, 1200 Pennsylvania Avenue, NW
1400F, Washington, DC,  Phone: 202-343-9982, Fax: 202-233-0643

                            TABLE OF CONTENTS
1.  EXECUTIVE SUMMARY	1

2.  INTRODUCTION	4
   2.1  Background	4
   2.2  Review Process and Acknowledgement	5
   2.3  EPA Charge to the Panel	6


3.  RESPONSE TO THE STATISTICS ELEMENTS OF THE CHARGE QUESTIONS	8

4.  RESPONSE TO CHARGE QUESTION 1: PROVIDING AN APPROACH FOR PLANNING,
            CONDUCTING, EVALUATING AND DOCUMENTING ENVIRONMENTAL
            RADIOLOGICAL SURVEYS TO DETERMINE THE APPROPRIATE DISPOSITION
            FOR MATERIALS AND EQUIPMENT	9

   4.1  Charge Question # 1	9
   4.2  Charge Question # 1a	10
   4.3  Charge Question # 1b	11
   4.4  Charge Question # 1c	11
   4.5  Charge Question # 1d	12


5.  RESPONSE TO CHARGE QUESTION 2: COMMENTS ON THE STATISTICAL METHODOLOGY
            CONSIDERED IN MARSAME	14

   5.1  Charge Question # 2	14
   5.2  Charge Question # 2a	14
   5.3  Charge Question # 2b	  15
   5.4  Charge Question #2c	18


6.  RESPONSE TO CHARGE QUESTION 3: RECOMMENDATIONS PERTAINING TO THE
            MARSAME ROADMAP AND APPENDICES	19

7.  RECOMMENDATIONS BEYOND THE CHARGE	22

REFERENCES 	23

APPENDIX A - STATISTICAL ANALYSIS - AN INTRODUCTION TO EXPERIMENTAL DESIGN AND
            HYPOTHESIS TESTING WITH SPECIFIC COMMENTS ON STATISTICS	26
   A-1 An Introduction to Experimental Design and Hypothesis Testing	26
   A-2 Specific Comments 	29

APPENDIX B -ACRONYMS AND ABBREVIATIONS	38

APPENDIX C - TYPOS AND CORRECTIONS	40

                                    TEXT FIGURES


FIGURE 1 - Suggested Changes for Figure 6.3 from MARSAME Manual for Interpretation of Survey
              Results for Scan-Only and In Situ Surveys 	16

FIGURE 2 - Suggested Changes for Figure 6.4 from MARSAME Manual for Interpretation of Results for
              MARSSIM-Type Surveys 	17

FIGURE 3 - The MARSAME Process 	20


                                APPENDIX A FIGURES


FIGURE A-1 SCENARIO A 	30

FIGURE A-2 SCENARIO B 	30

FIGURE A-3 Comparison of Sc  	34

FIGURE A-4 Comparison of Sc for longer background counting period	34

FIGURE A-5 Sc for a briefer background counting period 	35

FIGURE A-6 1 - β as a function of % excess count above background 	35

                            1. EXECUTIVE SUMMARY

       The Radiation Advisory Committee (RAC) of the Science Advisory Board (SAB),
augmented for this review (also hereinafter referred to as the Panel, or the SAB MARSAME
Review Panel), has completed its review of the Agency's draft document entitled "Multi-Agency
Radiation Survey and Assessment of Materials and Equipment (MARSAME) Manual," Draft
Report for Comment, December 2006 (U.S. EPA, 2006; see also the MARSAME Hotlink at
http://www.marsame.org). The MARSAME manual presents a framework for planning,
implementing, and assessing radiological surveys of material and equipment (M&E).
MARSAME supplements the Multi-Agency Radiation Survey and Site Investigation Manual
(MARSSIM; see also the MARSSIM Hotlink at http://www.epa.gov/radiation/marssim)
and refers to information provided in the Multi-Agency Radiological Laboratory Analytical
Protocols (MARLAP) Manual. The MARLAP Hotlink is
http://epa.gov/radiation/marlap/index.html.

       All manuals were prepared collaboratively by a multi-agency work group composed of
professionals from several pertinent Federal agencies.  The three documents, taken together,
describe radiological survey programs in great detail and address recommendations to "technical
audiences having knowledge of health physics and statistics" for performing such surveys.  The
manuals are designed to enable effective comparisons of survey measurements of radionuclide
concentrations to regulations or guides for accepting or rejecting approval of a program or
process. Vocabulary and techniques in MARSAME are carried forward from MARSSIM and
MARLAP.

       The MARSAME manual complements MARSSIM (a surface-soil radiation survey
manual, the Multi-Agency Radiation Survey and Site Investigation Manual) by providing a
process for surveying potentially radioactive M&E that may be in nature, commerce, or use
when considered for receipt or disposition. It presents an overview of the various aspects of
initial assessment, decision inputs, survey design, survey implementation, and assessment of
results. Important activities such as hypothesis testing  and statistical analysis of measurement
reliability are described in considerable detail. A number of illustrative examples, incorrectly
termed "case studies," are presented. A road map assists the reader in moving among chapters.
Useful information is collected in appendices.

       This review of the MARSAME Manual by the SAB was requested by the EPA Office of
Radiation and Indoor Air (ORIA).  It is based on reading the MARSAME Draft Report for
Comment (December 2006), presentations by MARSAME multi-agency work group members at
the meeting on October 29-31, 2007, and discussions in a series of teleconference meetings held
on October 9, 2007, December 21, 2007, and March 10, 2008.  A quality review of the Panel's
April 24, 2008 draft report was conducted by the chartered SAB on May 29, 2008 in a public
teleconference.

       The Panel recognizes the magnitude of the effort by the multi-agency work group and the
value of its product.  The Panel recommends modifications to only a small fraction of this
product. Panel recommendations can be summarized in the following broad categories:

   •   MARSAME guidance is suitable for experienced radiation protection and surveillance
       staff, but supporting courses should be presented for involved managers and
       professionals with limited knowledge of health physics or statistics to give them the
       needed special training and information that they would otherwise have to assimilate by
        searching through the lengthy MARSSIM and MARLAP documents. (1-3, 3-2, 3-4, C-4)2

   •   Specialized guidance for applying statistical tools for data analysis, experimental design,
       and hypothesis testing should be separated from the otherwise pervasively non-
       quantitative guidance for the convenience of the general audience and for acceptance by
       specialists. This guidance should be in a separate chapter, enhanced in accord with
        comments in the Appendix to this review. (1b-3, 1c-1, 2a-1, 2a-2, 2c-1, 3-6)2

    •   The incorrectly entitled 'case studies' should be labeled as 'illustrative examples' and
        their contents should be enhanced to assure realism. (1d-1, 1d-2, 1d-3, 1d-4, 2c-2, 2c-3)2

    •   Known regulations and guidance for meeting M&E action levels in MARSAME should
        be tabulated or cited by reference, with a mechanism for updating the references. (1b-1,
        3-5)2

    •   As much consideration should be given to surveys for radioactive contamination that is
        removable from the surface and that is dispersed throughout the material volume as is
        given currently to surface contamination (fixed plus removable) in order to distinguish
        among the three categories for radiation protection. (1b-2, 2b-3, 2b-4)2

    •   The various alternatives for M&E surveys should be described in sufficient detail to
        provide a wide choice of options, from no further action needed through minor survey
        efforts to a major survey that applies the full contents of the MARSAME manual.  The
        options should include iterative M&E release efforts such as decontamination or storage
        for decay. (1-1, 1-2, 1c-2, C-3)2

    •   Other recommendations are intended to improve the usefulness of various portions of the
        MARSAME manual, including updating it periodically. (1a-1, 1a-2, 1a-3, 2a-3, 2b-1,
        2b-2, 3-1, 3-3, 3-7, 3-8, C-1, C-2)2

       The multi-agency work group clearly has devoted considerable effort to describing the
statistical tools.  This is important because acceptance of survey measurements depends on their
reliability near the action level (AL).  Meeting this requirement can only be demonstrated in a
statistical framework; for example, the discrimination level (DL) must be below the AL in
Scenario A, where the DL is defined to the satisfaction of the surveyor and the regulator in terms
of the values for the allowable type I error α and the allowable type II error β.

2      The parenthetical numbers identify recommendations in response to the charge questions.
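
       For illustration only, the short Python sketch below (not part of the MARSAME manual)
shows how the choices of AL, DL, α, and β constrain the measurement uncertainty that a survey
method may have near the action level. It uses the normal-approximation relation
u ≤ (AL − DL)/(z(1−α) + z(1−β)), which follows the MARLAP treatment of required method
uncertainty; the numerical inputs are hypothetical.

    # Illustrative sketch only (not taken from MARSAME): under a normal approximation,
    # the width of the gray region (AL - DL) and the chosen error rates alpha and beta
    # bound the measurement standard uncertainty a survey method may have near the AL.
    from statistics import NormalDist

    AL = 1.0      # action level (hypothetical units of activity concentration)
    DL = 0.5      # discrimination level, set below the AL for Scenario A
    alpha = 0.05  # allowable type I error
    beta = 0.10   # allowable type II error

    z = NormalDist().inv_cdf
    u_required = (AL - DL) / (z(1 - alpha) + z(1 - beta))
    print(f"Required measurement standard uncertainty: {u_required:.3f}")
    # With these inputs, a method whose standard uncertainty near the AL exceeds
    # about 0.17 (in the same units) cannot meet both error-rate targets.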

       Because of the importance of clarity in the mathematical support structure, a sub-group
of the Panel has prepared a guide to those topics in MARSAME; it is collected in Appendix A
to this review. This guide is devoted to matters such as survey design, the gray region, the DL,
the test significance levels α and β, and hypothesis testing for Scenario A and Scenario B.  The
guide presents to the multi-agency work group the Panel's view of how to (1) make this
approach readily accessible to persons only generally familiar with statistical analysis, and (2)
gain acceptance by those who are knowledgeable on this topic.

                             2.     INTRODUCTION
       2.1   Background

       The MARSAME Manual (U.S. EPA, 2006b) was designed to guide a radiation protection
professional through all aspects of radiological surveys of M&E prior to intended receipt or
appropriate disposition. It is written sufficiently broadly to pertain to all types of M&E. Cited
as examples are metals, concrete, tools, trash, equipment, furniture, containers of material, and
piping, among others.  Release and interdiction, i.e., acceptance and rejection of the M&E
transfer, are the alternative outcomes of the survey.

       The draft document for review was prepared collaboratively by staff
from the following Federal agencies: the U.S. Environmental Protection Agency (US EPA), the
U.S. Nuclear Regulatory Commission (US NRC), the U.S. Department of Energy (US DOE),
and the U.S. Department of Defense (US DoD). It is part of a continuing  and technically
significant effort that began with writing MARSSIM (U.S. EPA/SAB, 1997; U.S. EPA, 2000
and 2001), continued with MARLAP (U.S. EPA/SAB, 2003a; U.S. EPA, 2004), and anticipates
preparation of at least one other manual after MARSAME for sub-surface radiation surveys and
characterization. The methodology and associated vocabulary in MARSAME follow those of
the preceding manuals, although a few aspects of MARSAME are distinct. Notably,
MARSAME may be connected to MARSSIM and MARLAP as part of a  site survey, or stand by
itself in considering the transfer of M&E to or from a site.

       Survey guidance in the MARSAME manual and its predecessors is based on the Data
Quality Objectives (DQO) process to design the best survey with regard to disposition option,
action level (AL),  and M&E type. The Data Life Cycle (DLC) supports DQO by carrying
suitable information through the planning, implementation, assessment, and decision stages of
the program.  The data are collected, evaluated,  and applied in terms of Measurement Quality
Objectives (MQO) established with statistical concepts of data uncertainty and Minimum
Quantifiable Concentrations (MQC).  The sensitivity of measurements is defined in terms of the
discrimination limit (DL), which is attained by selecting suitable radionuclide detectors and
conditions of sampling and measurement. The measurement results must be acceptable relative
to action levels and significance levels specified in regulations or guidance.
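
       For illustration only, the following Python sketch (not part of the MARSAME manual)
shows a Currie-type calculation of the critical net count, the minimum detectable net count, and a
resulting minimum detectable concentration for a single measurement with equal source and
background counting times. All numerical inputs are hypothetical; MARSAME, MARSSIM, and
MARLAP provide the governing guidance.

    # Illustrative sketch only: Currie-type detection limits for one measurement with
    # equal source and background counting times. Inputs are hypothetical.
    from statistics import NormalDist

    alpha = 0.05
    k = NormalDist().inv_cdf(1 - alpha)   # 1.645 for a 5% one-sided error rate

    background_counts = 400               # counts observed in the background measurement
    count_time_s = 600                    # counting time, seconds
    efficiency = 0.20                     # counts per emission (hypothetical)
    probe_area_cm2 = 100                  # detector probe area, cm2 (hypothetical)

    S_C = k * (2 * background_counts) ** 0.5   # critical net count (decision threshold)
    S_D = k ** 2 + 2 * S_C                     # minimum detectable net count

    # Convert the detectable net count to a surface emission rate per 100 cm2
    mdc = S_D / (efficiency * count_time_s * (probe_area_cm2 / 100.0))
    print(f"S_C = {S_C:.1f} counts, S_D = {S_D:.1f} counts, MDC ~ {mdc:.2f} per second per 100 cm2")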

       The MARSAME document is structured as follows, shown with the relevant charge
question (CQ) number (See Section 2.3 for the charge questions):
       Acronyms and  Abbreviations
       Symbols, Nomenclature, and Notations
       Conversion factors
       Road Map (CQ 3)
       Chapter 1, Introduction and overview (CQ 1)
       Chapter 2, Initial assessment of M&E (CQ 1a)
       Chapter 3, Identify inputs for the decision (CQ 1b)
       Chapter 4, Survey design (CQ 1c)
       Chapter 5, Implementation of disposition surveys (CQ 2a)
       Chapter 6, Assess the results of the disposition survey (CQ 2b)
       Chapter 7, Case studies (CQ 1d and 2c)
       7 Appendices (CQ 3)
       References
       Glossary

       Response to the charge questions was the primary purpose of the Panel and is addressed
first. The Panel also considered a few related topics, commented in detail on the MARSAME
discussion of statistical and operational aspects, and suggested minor corrections.

       2.2   Review Process and Acknowledgement

       The U.S. EPA's Office of Radiation and Indoor Air (ORIA), on behalf of the Federal
Agencies participating in the development of the draft MARSAME Manual, requested the SAB
to provide advice on the draft document entitled "Multi-Agency Radiation Survey and
Assessment of Materials and Equipment (MARSAME) Manual," Draft Report for Comment,
December 2006 (U.S. EPA, 2006b; also numbered as NUREG-1575, Supp. 1; EPA 402-R-06-
002; and DOE/EH-707).  MARSAME is a supplement to the "Multi-Agency Radiation Survey
and Site Investigation Manual" (MARSSIM; U.S. EPA, 2000 and 2001; also numbered as
NUREG-1575, rev. 1; EPA 402-R-97-016, Rev. 1; and DOE/EH-0624, Rev. 1). The SAB Staff
Office announced this advisory activity and requested nominations for technical experts to
augment the SAB's Radiation Advisory Committee (RAC) in the Federal Register (72 FR
11356; March 13, 2007).

       MARSAME was developed collaboratively by the Multi-Agency Work Group (60 FR
12555; March 7,  1995) and provides technical information on approaches for planning,
conducting, evaluating, and documenting radiological surveys to determine proper disposition of
materials and equipment (M&E). The techniques, methodologies, and principles that form the
basis of this manual were developed to be consistent with current Federal limits, guidelines, and
procedures.

       The Panel met in an initial public teleconference meeting on Tuesday, October 9, 2007.
The meeting was intended to introduce the subject and discuss the charge to the Panel, determine
if the review and background materials provided were adequate to respond to the charge
questions directed to the Panel, and agree on charge assignments for the Panelists. A public
meeting was scheduled on Monday, October 29 through Wednesday, October 31, 2007, to
receive presentations by the multi-agency work group, consider the charge questions, and draft a
report in response to the charge questions pertaining to the draft MARSAME manual. The Panel
discussed the first public draft report (December 17, 2007), in a December 21, 2007 public
conference call. The Panel discussed its second public draft report (February 27, 2008) in the
March  10, 2008 public conference call.  The April 24, 2008 draft report was provided for a
public quality review teleconference meeting of the chartered Board on May 29, 2008. This
report incorporates suggestions made by the chartered SAB.

       2.3   EPA Charge to the Panel

       The EPA's Science Advisory Board (SAB) previously conducted the scientific peer
reviews of the companion multi-agency documents MARSSIM (U.S. EPA/SAB, 1997; EPA-
SAB-RAC-97-008, dated September 30, 1997) and MARLAP (U.S. EPA/SAB, 2003b; EPA-
SAB-RAC-03-009, dated June 10, 2003). The Federal agencies participating in those peer
reviews considered the process used by the SAB to be beneficial in assuring the accuracy and
usability of the final manuals. Subsequently, two consultations took place for MARSAME (U.S.
EPA/SAB, 2003a; EPA-SAB-RAC-CON-03-002, dated February 27, 2003, and U.S. EPA/SAB,
2004; EPA-SAB-RAC-CON-04-001, dated February 9, 2004). These are now being followed by
a request from EPA ORIA on behalf of the four participating Federal agencies that the SAB
conduct this formal technical peer review of the draft MARSAME manual.

       The following charge questions were posed to the SAB MARSAME Review Panel (U.S.
EPA, 2007b):

1) The objective of the draft MARSAME is to provide an approach for planning, conducting,
evaluating,  and documenting environmental radiological surveys to determine the appropriate
disposition for materials and equipment with a reasonable potential to contain radionuclide
concentration^) or radioactivity above background. Please comment on the technical
acceptability of this approach and discuss how well the document accomplishes this objective.
In particular, please
       a) Discuss the adequacy of the initial assessment process as provided in MARSAME
       Chapter 2, including the new concept of sentinel measurement (a biased measurement
      performed at a key location  to provide information specific to the objectives of the Initial
      Assessment).

       b) Discuss the clarity of the guidance on developing decision rules, as provided in
      MARSAME Chapter 3.

       c) Discuss the adequacy of the survey design process, especially the clarity of new
      guidance on using Scenario  B, and the acceptability of new scan-only and in-situ survey
       designs, as detailed in MARSAME Chapter 4.

       d) Discuss the usefulness of the case studies in illustrating new concepts and guidance,
       as provided in MARSAME Chapter 7.

2) The draft MARSAME, as a supplement to MARSSIM, adapts and adds to the statistical
approaches of both MARSSIM and MARLAP for application to radiological surveys of materials
and equipment.  Please comment on the technical acceptability of the statistical methodology
considered in MARSAME and note whether there are terminology or application assumptions
that may cause confusion among the three documents. In particular, please

       a) Discuss the adequacy of the procedures outlined for determining measurement
       uncertainty, detectability, and quantifiability, as described in MARSAME Chapter 5.

       b) Discuss the adequacy of the data assessment process, especially new assessment
       procedures associated with scan-only and in-situ survey designs, and the clarity of the
       information provided in Figures 6.3 and 6.4, as detailed in MARSAME Chapter 6.

       c) Discuss the usefulness of the case studies in illustrating the calculation of
       measurement uncertainty, detectability, and quantifiability, as provided in MARSAME
       Chapter 7.

3) The draft MARSAME includes a preliminary section entitled Roadmap as well as seven
appendices. The goal of the Roadmap is to assist the MARSAME user in assimilating the
information in MARSAME and determining where important decisions need to be made on a
project-specific basis. MARSAME also contains appendices providing additional information
on the specific topics. Does the SAB have recommendations regarding the usefulness of these
materials?

    3.    RESPONSE TO THE STATISTICS ELEMENTS OF THE CHARGE
                                       QUESTIONS

       Detailed discussions of statistical analysis related to experimental design and hypothesis
testing permeate the otherwise non-mathematical guidance for M&E surveys.  The Panel
responses and comments specifically addressed to statistical analysis are compiled in Appendix A
rather than scattered throughout this review. Appendix A consists of an introduction that
describes the view of the Panel, followed by specific comments from individual reviewers.
All related responses to individual charge questions, notably for charge questions 1b, 1c, and 2a,
are referred to Appendix A.

       The Panel recommends that topics presented in Appendix A be included as a separate
chapter that appears early in the MARSAME manual. This will serve to consolidate many of the
important statistical concepts that are now scattered throughout several chapters.  The first-
time user will then become familiar with the statistical considerations that are the backbone of
the MARSAME process.

       MARSAME contains many suggested equations for designing and interpreting survey
procedures (e.g., Tables 5.1, 5.2). The equations are derived from sound statistical principles,
but can lead to incorrect conclusions if the underlying assumptions in the derivations are not
satisfied. Not every equation needs to be derived in detail, but the assumptions and sampling
requirements to implement specific equations should be thoroughly identified and explained.
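
       As a simplified illustration of this point, the sketch below compares nominal and realized
type I error rates when a detection threshold derived from a normal approximation is applied to
low-count Poisson data. The Currie-type threshold and the numerical inputs are illustrative
assumptions, not equations or values quoted from the manual.

    # Illustrative sketch only: Monte Carlo check of one underlying assumption. A
    # normal-approximation threshold S_C = z * sqrt(2B) is exercised against Poisson
    # counting data; at low mean background counts the realized false-positive rate
    # can differ noticeably from the nominal alpha.
    import numpy as np

    rng = np.random.default_rng(2008)
    alpha = 0.05
    z = 1.645                                    # one-sided 95th percentile, standard normal

    for mean_background in (3, 30, 300):         # expected background counts per measurement
        S_C = z * np.sqrt(2 * mean_background)   # threshold from the normal approximation
        gross = rng.poisson(mean_background, 200_000)   # clean M&E: gross counts are background only
        blank = rng.poisson(mean_background, 200_000)   # paired background measurement
        realized_alpha = np.mean((gross - blank) > S_C)
        print(f"mean background {mean_background:>4}: realized alpha = {realized_alpha:.3f} "
              f"(nominal {alpha})")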

       Classical hypothesis testing procedures require specification of a null hypothesis and
values of α and β that quantify boundaries for type I and type II errors. The selection of these
values provides a measure of tolerance for uncertainty and assurance that the ultimate goals
relating to risk are satisfied. The existing discussion in Chapter 4 is too vague to provide
guidance on how these values should be selected; it should either be more specific or, if this is
considered to be beyond the scope of MARSAME, refer to sources of detailed guidance on the
selection of α and β.
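
       As a simple illustration of how these choices propagate into survey effort (a generic
normal-approximation sketch, not the sample-size prescription given in MARSAME or
MARSSIM), the number of measurements needed to resolve a shift the width of the gray region
grows as the allowable errors α and β are tightened.

    # Illustrative sketch only: generic normal-approximation sample size for a
    # one-sample test of a mean, n = ((z_{1-alpha} + z_{1-beta}) * sigma / delta)^2,
    # where delta is the width of the gray region and sigma the measurement
    # standard deviation. Not the specific MARSAME/MARSSIM prescription.
    from math import ceil
    from statistics import NormalDist

    def required_n(alpha, beta, sigma, delta):
        # Measurements needed at the stated error rates for a shift of size delta
        z = NormalDist().inv_cdf
        return ceil(((z(1 - alpha) + z(1 - beta)) * sigma / delta) ** 2)

    print(required_n(alpha=0.05, beta=0.10, sigma=1.0, delta=0.5))  # 35 measurements
    print(required_n(alpha=0.05, beta=0.01, sigma=1.0, delta=0.5))  # 64 measurements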

       In the design of disposition surveys, the manual discusses determination of measurement
uncertainty, detectability, and quantifiability in terms of MQO requirements. These MQO values
should be organized and presented for the individual survey types, such as in situ, scan-only,
and MARSSIM-type surveys.

        4.  RESPONSE TO CHARGE QUESTION 1: PROVIDING AN
    APPROACH FOR PLANNING, CONDUCTING, EVALUATING AND
  DOCUMENTING ENVIRONMENTAL RADIOLOGICAL SURVEYS TO
   DETERMINE THE APPROPRIATE DISPOSITION FOR MATERIALS
                               AND EQUIPMENT

4.1  Charge Question 1: The objective of the draft MARSAME is to provide an approach for
planning, conducting, evaluating, and documenting environmental radiological surveys to
determine the appropriate disposition for materials and equipment with a reasonable potential
to contain radionuclide concentration(s) or radioactivity above background. Please comment
on the technical acceptability of this approach and discuss how well the document
accomplishes this objective.


       The MARSAME manual impresses the Panel as an excellent technical document for
guiding an M&E survey.  Regarding CQ 1, the Panel recommends greater detail in describing
the "alternate approaches or modification" for applying MARSAME, as discussed in Chapter 1,
lines 50 - 56. For example, the option of decontaminating the M&E as part of the process when
considering alternate actions appears to be missing. The Panel also recommends making the
manual more accessible to interested non-specialists, notably project managers and other
decision makers. Such non-specialists generally are not included in the intended "technical
audience having knowledge of health physics and an understanding of statistics," with further
capabilities also described in Chapter 1, lines 187 - 194. The following itemized
recommendations elaborate on these points.

RECOMMENDATION 1-1: Create a sub-section for the discussion that begins in Chapter 1,
line 49, to present clearly the concept of simple alternatives to what may appear to the reader to
be a major undertaking.  Also, in lines 103-111  further define 'release' vs. 'interdiction' to
clarify the distinction between the terms.  Follow these paragraphs with  sufficient detail and
references to later chapters to assure the reader that when M&E is reasonably expected to have
little or no radioactive contamination, it can be processed without excessive effort under the
MARSAME system. One approach identified subsequently is applying standard operating
procedures (SOPs). Categorization as non-impacted or as Class 3 M&E based on historical data
also can lead to an appropriately simple process.

RECOMMENDATION 1-2: Insert a sub-section in Chapter 1 and in appropriate subsequent
chapters to consider various degrees of M&E decontamination as part of the available options
associated with a MARSAME survey. Storage  for radioactive decay can be an option for
decontamination.

RECOMMENDATION 1-3: Insert a paragraph after Chapter 1, line 196, to address use by
persons less skilled professionally than defined in a preceding paragraph. Reference to
Appendices B, C, and D would be helpful for such persons. Adding an appendix that includes
portions of the MARSSIM Roadmap and Chapters 1 and 2 could provide suitable background
information without requiring that all of MARSSIM be read.  Presentation of training courses for
managers and other generalists with responsibility for MARSAME radiation surveys would be
most helpful.

4.2 Charge Question la: Discuss the adequacy of the initial assessment process as provided
in MARSAME Chapter 2, including the new concept of sentinel measurement (a biased
measurement performed at a key location to provide information specific to the objectives of
the Initial Assessment).

       The initial assessment (IA) process is useful as described. That many measurements
made throughout the MARSAME process could be biased should be obvious to the radiation
protection and  survey professional.  Additional information sources cited below could be helpful.

       Sentinel measurements, as described for the IA process of MARSAME, have been widely
applied, although not necessarily designated by that name. They are rational and useful for
obtaining an IA of the type and magnitude of radioactive contaminants although they may not
have been randomly selected and, hence, are biased by definition. These measurements and their
applicability and limitations are well described in the document, and their use is clear. In fact,
wider application appears practical.

RECOMMENDATION la-1: Add to the information  sources in Chapter 2, lines 104 - 115,
the files (inspection reports, incident analyses, and compliance history) maintained by currently
and formerly involved regulatory agencies.  Discussion with agency staffs, especially their
inspectors, also could be fruitful.

RECOMMENDATION la-2: The listing of complexity attributes in Table 2.1 could include
Toxic Substances Control Act (TSCA) materials and hazardous waste.

RECOMMENDATION la-3: In Chapter 1, lines 253 - 259, MARSAME should recognize
that sentinel measurements are important because they may represent the entire historical record
available for IA.  Moreover, the measurements may have been so well planned that considering
them "limited data" is misleading without a clear definition of terms.  Sentinel measurements are
particularly useful to evaluate assumptions based on  process knowledge. In Chapter 2, lines 277
- 280, design of a preliminary survey for radioactive contaminants to fill knowledge gaps often
depends on the availability of data from sentinel measurements. In some instances, the physical
shape of the M&E may limit  further survey to sentinel measurements. On the other hand, the
MARSAME Manual draft, line 258, is correct in stating that sentinel measurements should not
be used alone to justify categorization of M&E as non-impacted, especially when geometric or
non-homogeneity limitations in radiation detection are suspected.

4.3  Charge Question 1b:  Discuss the clarity of the guidance on developing decision rules, as
provided in MARSAME Chapter 3.

       This chapter, devoted to developing decision rules, is very useful.  The decision rules are
clear. The Panel has the following recommendations concerning (1) distinction among surface
removable, surface fixed,  and volumetric radioactive contamination; (2) presentation of
regulations and guidance that address these contaminant forms; and (3) the mathematically
complex aspects of measurement method uncertainty, detection capability, and quantification
capability. With regard to the latter, Chapter 3, lines 567 - 622 takes the MARSAME
presentation from broad guidance to specific statistical tutorial, which raises difficulties for some
general readers and questions for some professionals.

RECOMMENDATION 1b-1:  The regulations or guidance for radionuclide clearance that
define the action levels (AL) discussed in Chapter 3, lines 118 - 120, and listed in Appendix E
should be sufficiently inclusive to apply to the usual M&E handled by users with regard to both
non-fixed (removable) surface contamination and volumetric (distributed throughout the
material) contamination.  Tabulate or cite all other known pertinent regulations and guides for
this purpose. To the non-fixed surface contamination regulations included in Table E.2 by DOE
and Table E.3 by NRC, add the Department of Transportation regulation (U.S. DOT,
49CFR173.443), and guides by states such as New Jersey (State of New Jersey, 2007) and
Nevada (State of Nevada, 2001).  Include guidance for volumetric contamination clearance,
summarized in Table 5.1 of NCRP (2002) from reports of national and international standard-
setting groups.

RECOMMENDATION 1b-2:  Information that guides decisions for radioactively
contaminated M&E, listed in Chapter 3, lines 141 -  147,  should include measurements of
removable vs. fixed surface contamination to match the distinctions specified in Tables E.2 and
E.3.  Insert sub-sections that discuss the implications of planning for and responding to
measurement of removable vs. fixed and surface vs. volumetric radioactive contamination and
the subsequent disposition of M&E according to this categorization (see also
RECOMMENDATIONS 2b-3 and 1d-3 for discussion of removable radioactive contaminants).

RECOMMENDATION 1b-3:  Maintain the more general tone of MARSAME throughout
Chapter 3 while moving detailed discussions of statistical aspects to a separate chapter (see also
RECOMMENDATIONS 1c-1 and 2a-1). This approach could remove concerns such as why the
Minimum Detectable Concentration (MDC) is recommended for the Measurement Quality
Objective (MQO) in Chapter 3, lines 593 - 597, instead of the Minimum Quantifiable
Concentration (MQC), and how item #1 differs from item #3 on lines 609 - 617.

4.4  Charge Question 1c:  Discuss the adequacy of the survey design process, especially the
clarity of new guidance on using Scenario B, and the acceptability of new scan-only and in-
situ survey designs, as detailed in MARSAME Chapter 4.

       With the exception of Section 4.2, Statistical Decision Making, Chapter 4 is easily
understood by the general reader. Classification of M&E is an effective and helpful process.
The Disposition Survey Design and Documentation sections are well prepared. Further
discussion would help in addressing problems associated with complex geometric or non-
homogeneous distributions of the radioactive contamination relative to the detector.  These are of
particular interest when using scanning or in situ detection methods, and could be demonstrated
effectively in the illustrative example concerning rubble disposal of Section 7.3.

       Regarding statistical decision making, the concepts of hypothesis testing and uncertainty
per se are readily understood. However, the aspects of uncertainty with default significance
levels and the resulting gray area and discrimination limits (DL) leading to minimum
quantifiable concentrations (MQC) are not so readily assimilated. Extensive consideration of the
statistical approach is attached to this review as Appendix A.

RECOMMENDATION 1c-1:  In the organization of MARSAME, instead of the current
mixture of general guidance about surveillance with detailed presentations of statistical matters,
retain in each chapter only a brief and less detailed discussion of statistics. Collect the
mathematical discussion in a separate chapter, as proposed above.  Chapter 19, Measurement
Statistics, in MARLAP could serve as a model. The separation will serve both the specialist in
statistics, who will appreciate the exposition in the newly added chapter, and readers with less
training in statistics who can follow the general import of the MARSAME approach in the
existing chapters.

RECOMMENDATION 1c-2: The MARSAME manual has emphasized disposition options
that, after identification and segregation, lead directly to the disposition survey.  Conditioning of
the M&E, such as vacuuming, wiping down, chemical etching, and other forms of
decontamination, should be encouraged for meeting disposition options (see also
RECOMMENDATION 1-2). Preliminary measurements are useful for this purpose.  The
MARSAME manual  should provide more detail on these approaches and encourage them as an
As Low As Reasonably Achievable (ALARA) policy.

4.5  Charge Question 1d:  Discuss the usefulness of the case studies in illustrating new
concepts and guidance, as provided in MARSAME Chapter 7.

       Case studies can be immensely beneficial for  clarifying the MARSAME process and
guiding the user, but  members of the multi-agency work group informed the Panel that Chapter 7
does not contain case studies but rather invented illustrative examples. The latter usually are not
as instructive as case studies because they lack the  element of reality, but can be helpful if
created carefully to represent actual situations.

RECOMMENDATION 1d-1: Delete or replace the example for Standard Operating Procedure
(SOP) use in Section 7.2.  Given the good discussion in Section 3.10 for improving an SOP
within the MARSAME framework, the example of applying SOPs at a nuclear power station
appears to contribute little.

RECOMMENDATION 1d-2: The example in Section 7.3 of mineral processing of concrete
rubble is instructive,  but the reader should be informed that many more measurement results than
those listed in Table 7.3 are obtained under actual conditions and must be evaluated before
making decisions. The radionuclide concentrations reported in Chapter 7, lines 213 - 214,
should be confirmed as typical values or replaced by such values, because readers may apply
them as default values. For the same reason, the AL taken from a U.S. Nuclear Regulatory
Commission document (NUREG-1640; U.S. NRC, 2003) should be identified as a specific
selection, not a general limit. Inserting boxes with interpretive comments would help the reader
to understand the process used for illustration and the logic leading to the decisions.

RECOMMENDATION 1d-3: Insert an introductory statement to place in context the length of
the 21-page example devoted in Section 7.4 to a simple baseline survey of a rented front loader,
to avoid discouraging the reader from applying it.  This statement should explain that these
details are needed to describe the survey process, but that the actual work is brief. This survey
provides  an opportunity to present the benefit of sentinel measurements and the comparison of
removable with fixed surface contamination. An actual  case history undoubtedly would show
these and also contain a table of survey measurements.

RECOMMENDATION 1d-4: Include in each of the illustrative example headings a statement
that they  are demonstrating the MARSAME process.

    5.  RESPONSE TO CHARGE QUESTION 2: COMMENTS ON THE
      STATISTICAL METHODOLOGY CONSIDERED IN MARSAME

5.1 Charge Question # 2:  The draft MARSAME, as a supplement to MARSSIM, adapts and
adds to the statistical approaches of both MARSSIM and MARLAP for application to
radiological surveys of materials and equipment.  Please comment on the technical
acceptability of the statistical methodology considered in MARSAME and note whether there
are terminology or application assumptions that may cause confusion among the three
documents.

      MARSAME contains tables and text that carefully compare the three documents and
identify consistencies and differences.  To Panel members familiar with the three documents,
application of the statistical methodology in MARSAME appears to match that used in
MARSSIM and MARLAP.

      The statistical methodology applied in MARSAME is acceptable and not confusing when
all three documents are read. Application of comments in Appendix A to this report and
consolidation of the mathematical aspects of MARSAME in  a single chapter as recommended
below should enhance use of MARSAME.

      A shift appears to have occurred from use of the Data Quality Objective (DQO)
terminology of MARSSIM to the Measurement Quality Objective (MQO) of MARSAME, but
the principle is understandable.  Clearly, MARSAME has close connections to MARSSIM in
surveys of M&E at MARSSIM sites. The manual also addresses M&E that is to be  moved onto
or from a site for various reasons, including, but not necessarily, processing and surveying
the site subject to MARSSIM.

5.2 Charge Question # 2a: Discuss the adequacy of the procedures outlined for determining
measurement uncertainty, detectability, and quantifiability, as described in MARSAME,
Chapter 5.

      The presentation for determining uncertainty, detectability, and quantifiability in Chapter
5, as well as aspects of this discussion in Chapters 4 and 6, follows the well-developed path in
MARSSIM and MARLAP and is essential to the disposition  survey planner. The Panel believes
that the outlined procedures are adequate, but that correct application by the user requires (1)
previous reading of MARSSIM and MARLAP, and (2) the expertise  and knowledge specified in
Chapter 1, lines 189- 194.

RECOMMENDATION 2a-1:  Enable the reader to understand the topics in Chapter 5 more
clearly by separating the entire mathematically detailed statistical exposition into a chapter that
could be entitled "Review of Experimental Design and Hypothesis Testing." Appendix G can be
included in this chapter. The chapter can be placed before Chapter 4.  All sections currently in
Chapters 4-6 that discuss generalized aspects of these topics, including measurement
uncertainty, detectability, and quantifiability, can be kept in place; reference should  be made to
the technical discussions, equations, and tables in the new chapter.

RECOMMENDATION 2a-2: Consider the comments made in Appendix A concerning the
topics of experimental design, hypothesis testing, and the statistical aspects of uncertainty in
preparing the separate chapter suggested above.

RECOMMENDATION 2a-3: Move the discussion on setting MQOs, in Sections 5.5 through 5.9,
to Chapter 4 on Survey Design. Organize a summary or guide that focuses on the procedures for
setting MQOs and for determining uncertainty, MDC, and MQC. The ability to set
Measurement Quality Objectives (MQOs) is an important element of the MARSAME process,
but the discussion involving the implementation of MQOs in the design of the three survey types
may confuse the reader.  Aspects of implementation are immersed in details defining,
explaining, and deriving theoretical concepts.

5.3 Charge Question # 2b: Discuss the adequacy of the data assessment process,  especially
new assessment procedures associated with scan-only and in-situ survey designs, and the
clarity of the information provided in Figures 6.3 and 6.4.

       The data assessment process is appropriate, carefully presented, and thoroughly explored.
The advice is pertinent and the examples are helpful.

       The Panel discusses statistical considerations in Appendix A.  The information presented
in Figures 6.3 and 6.4 is clear (see Figures 1 and 2, below), but minor changes, shown in the
following two revised figures, are proposed.

       The Panel noted above the importance of distinguishing in all MARSAME chapters
among contamination that is (1) removable from the surface, (2) fixed to the surface,  or (3)
volumetric. Smear surveys (wipe tests) are an integral part of an M&E survey because of the
potential radiation dose from removable radionuclides that can spread from M&E surfaces and
be inhaled and ingested.  Removable surface contamination is included in DOE regulations in
Table E.2 and NRC regulations in Table E.3, as well as DOT regulations and International
Atomic Energy Agency (IAEA) guidance.  Multi-agency working group members expressed
reluctance about including  in MARSAME a survey technique that they consider to be poorly
reproducible for defining the removable radionuclide amount per area. The Panel's response is
that giving wipe tests insufficient discussion is unrealistic and misleading. Each type of measurement
has its own uncertainty.  A reasonable approach  is to begin with the instruction in U.S. DOT
49CFR173.443 "wiping  an area of 300 cm2  ... with an absorbent material ... using moderate
pressure" that "sufficient measurements shall be taken in the most appropriate locations to yield
a representative assessment"  and then provide guidance on defining and controlling variability.
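
       For illustration only, the sketch below converts a wipe-test (smear) count into removable
surface activity per unit area. The 300 cm2 wiped area follows the DOT practice quoted above;
the counting efficiency and the 10 percent wipe collection efficiency are hypothetical inputs that
a survey plan would have to justify or measure.

    # Illustrative sketch only: converting a smear count to removable surface activity.
    wiped_area_cm2 = 300.0
    gross_cpm = 220.0             # counts per minute measured on the smear
    background_cpm = 40.0         # counter background, counts per minute
    counter_efficiency = 0.25     # counts per disintegration (hypothetical)
    collection_efficiency = 0.10  # fraction of removable activity picked up by the wipe (assumed)

    net_cpm = gross_cpm - background_cpm
    removable_dpm = net_cpm / (counter_efficiency * collection_efficiency)
    removable_dpm_per_100cm2 = removable_dpm * (100.0 / wiped_area_cm2)
    print(f"Removable activity: {removable_dpm_per_100cm2:.0f} dpm per 100 cm2")
    # Compare this value, with its uncertainty, to the applicable removable-surface
    # action level (e.g., the DOE, NRC, or DOT limits cited in Appendix E).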

RECOMMENDATION 2b-1: In Fig. 6.3 (see Figure 1 below, which reworks Fig. 6.3), clarify
the distinction of a MARSSIM-type survey by moving "Start" to immediately above the decision
point "Is the Survey Design Scan-Only or In Situ?" and then connecting this to an inserted
decision diamond "Is the AL Equal to Zero or Background?". A "yes" leads to "Requires Scenario
B ..." and a "no" leads to "Disposition Decision Based on Mean of a Sampled Population?".

RECOMMENDATION 2b-2: In Fig. 6.4 (See Figure 2 below, which reworks Fig. 6.4), for a
more consistent presentation, insert a decision diamond after both "Perform the Sign Test" and
"Perform the WRS Test" that says "Scenario A," followed by a "yes" or "no" leading to the two
"Scenario A" and "Scenario B" branches at both locations.
                 "  Ss the Survey
                 Design Scan-Only
                   or In Situ?
                 AL Equal to Zero
                 ..or Background"?
                  -  "  AM
                   Results -i
                    From Th
                     MOC? .
                  MAE Meet the
                 isposition Cnienon
 :
 \ Type Suivey
 " Figure (J.4
  Deoision Based
on Mean of a Sampled
*• - _t PoputoHon?  .-- '
                                         Individual Results Must;
                                            fee Recorded    :
  UCL ,1 U8GR?
                                               Meelthe
                            , M&.E Do Not ftrteet the *••
                            \ Deposition CRferion
                                                          Assess the Results of the Disposition Survey
                       Record individual Scan
                        Resoitei if Rfc^iwred
                           From The '
                            UB<3R?
                                                                       *
                                                                     ; Meet the
Figure 1 - Suggested Changes for Figure 6.3 from MARSAME Manual for Interpretation
            of Survey Results for Scan-Only and In-Situ Surveys
[Flowchart: entering from Figure 6.3, the results of the disposition survey are assessed; after
the Sign or WRS test, a "Scenario A?" decision diamond routes the "reject H0" and "do not reject
H0" outcomes to "M&E Meet the Disposition Criterion" or "M&E Do Not Meet the Disposition
Criterion."]
Figure 2 - Suggested Changes for Figure 6.4 from MARSAME Manual for Interpretation
of Results for MARSSIM-Type Surveys
RECOMMENDATION 2b-3:  To counteract the discomfort of Multi-agency working group
members with the qualitative aspect of wipe tests, the MARSAME manual could recommend
evaluations of the removable radionuclide fraction measured by wipe test for the surveyed M&E.
These evaluations can include, for example, sequential smears at a given location on the M&E, or
smears at adjoining locations performed with different materials and pressures, by different
persons, and for different radionuclides. Refer to State of Nevada (2001) and State of New
Jersey (2007) for a description of the process, to Rademacher and  Hubbell (2008) pp. 10, 16 for
an application to radiological monitoring, and to U.S. EPA (2007a) for more general applications
of the wipe test.
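
A sketch of how such an evaluation might be summarized (the numbers are hypothetical, and
MARSAME does not prescribe this particular calculation) is to compare the cumulative activity
recovered by sequential smears at one location with the total surface activity measured there:

    # Sequential smears at one location, expressed as activity recovered (Bq),
    # compared with the total surface activity measured before wiping (Bq).
    sequential_smears_bq = [4.2, 1.1, 0.4]   # hypothetical recoveries, smears 1-3
    total_surface_bq = 20.0                  # hypothetical total (fixed plus removable)

    removable_fraction = sum(sequential_smears_bq) / total_surface_bq
    print(f"estimated removable fraction ~ {removable_fraction:.0%}")

Repeating this with different wipe materials, pressures, and operators, as suggested above,
would indicate how reproducible the removable fraction is for the surveyed M&E.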

RECOMMENDATION 2b-4:  Insert sub-sections in all chapters to address implementation
and assessment of survey processes to distinguish between surface and volumetric contamination
(i.e., measurement after surface cleaning or observing the effect of counting geometry) and
between removable and fixed surface contamination (i.e., wipe test results compared to total
surface activity).  These types of contamination are described in Chapter 1, lines 127 - 152, but
their implications  should be considered throughout the MARSAME manual. Concerns in
measuring volumetric contamination include characterizing non-uniformly distributed
radionuclides and quantifying radionuclides that emit no gamma rays.

5.4  Charge Question # 2c: Discuss the usefulness of the case studies in illustrating the
calculation of measurement uncertainty, detectability, and quantifiability as provided in
MARSAME chapter 7.

       As stated in the response to CQ Id, case studies are invaluable in guiding the user
through complex operations. A useful case study discussion with  references is available in ITRC
(2008). The illustrative examples given instead of case studies in  MARSAME lack the realistic
data accumulation that permits estimation of uncertainty. Excessively detailed derivations of
equations for calculation are shown in Chapter 7, lines 579 - 628,  658 - 665, 682 - 689, and
1133 - 1150. For discussions related to uncertainty, refer to Appendix A.

RECOMMENDATION 2c-l: Move the detailed derivations, including partial derivatives,
identified above to the newly added separate chapter recommended for discussion of
experimental design and hypothesis testing.

RECOMMENDATION 2c-2: Use illustrative examples to demonstrate any MARSAME
guidance that the multi-agency work group considers difficult to follow. These may include
approximating uncertainty (see Chapter 5), distinctions such as interdiction vs. release, and
applying scenarios A vs. B.

RECOMMENDATION 2c-3: Use Sections 7.4 and 7.5 to illustrate the benefit of wipe tests for
determining removable radioactive surface contaminants.  Experience suggests that the
contaminant usually is in this form on M&E such as earth-moving equipment.
    6.  RESPONSE TO CHARGE QUESTION 3: RECOMMENDATIONS
    PERTAINING TO THE MARSAME ROADMAP AND APPENDICES

Charge Question 3: The draft MARSAME includes a preliminary section entitled Roadmap
as well as seven appendices.  The goal of the Roadmap is to assist the MARSAME user in
assimilating the information in MARSAME and determining where important decisions need
to be made on a project-specific basis. MARSAME also contains appendices providing
additional information on the specific topics. Does the SAB have recommendations regarding
the usefulness of these materials?

       The Roadmap is crucial in guiding the reader through a document as complex as
MARSAME. The appendices are useful in various ways, such as providing information
compilations and statistical tables, and avoiding the need to seek this information in MARSSIM
and MARLAP. Also necessary to the reader are the acronyms and abbreviations; symbols,
nomenclature, and notations;  and glossary. The following Recommendations are intended to
enhance their use.

RECOMMENDATION 3-1: Roadmap Figure 1 connects the MARSAME chapters in terms of
the Data Life Cycle. Consider establishing an analogous connection with Roadmap Figures 2, 3,
5, 6, 7, and 8.  At present, the only Roadmap figures connected to each other are Figs. 2, 3,
and 4, and Figs. 7 and 8.

RECOMMENDATION 3-2: Consider assisting project managers by highlighting major
operational decision points in the roadmaps.

RECOMMENDATION 3-3: The roadmap should ensure that the primary components of the
process are identified, their relationship to one another is depicted, and the boundaries of
application are well-defined, in accord with the DQO process. Figure 3 provided below could be
used in the MARSAME roadmap to illustrate application of the DQO process in the MARSAME
manual. Realize also that the DQO process is iterative, so that, as in the case of MARSSIM, the
MARSAME program  should have the potential to improve and update the manual.
[Diagram: the MARSAME phases PLAN, CONDUCT, ASSESS, and DECIDE, with Survey & Validation
Results.]
            Figure 3 - The MARSAME Process
RECOMMENDATION 3-4: Indicate in the body of the text that Appendices B, C, and D are
useful overviews of the environmental radiation background, sources of radionuclides, and
radiation detection instruments, respectively, for managers and generalists; they may be too
general for the experienced health physicist to whom the manual is addressed.

RECOMMENDATION 3-5: Insert a table with action level (AL) guidance for volumetric
radionuclide contamination in Appendix E (see RECOMMENDATION 1b-1).

RECOMMENDATION 3-6: Either move Appendix G into the new chapter on experimental
design and hypothesis testing or indicate its relation to that new chapter.

RECOMMENDATION 3-7: Move the Glossary to the front to join the tables of acronyms and
of symbols.

RECOMMENDATION 3-8: Expand the definition of 'Interdiction' in the glossary to clarify
its application to receiving or disposing of M&E.
            7.   RECOMMENDATIONS BEYOND THE CHARGE
RECOMMENDATION C-l: In Chapter 3, discuss in the recommended separate chapter on
statistics any decisions leading to selecting the degree of confidence, embedded in the choice of
significance level α and β values. Selection may be a matter of the acceptable uncertainty
specified by the agency that sets  the action level.

RECOMMENDATION C-2: In Chapter 2, discuss the impact of survey cost and needed skills,
instruments, and time on the MARSAME effort. Brief projects obviously need different designs
than lengthy ones. Discuss requirements and programs for data retention, especially in long
projects and when contractors are replaced.

RECOMMENDATION C-3: In Chapter 6, discuss the options to be considered and pursued
when the plan proposed initially  for M&E transfer is rejected because of the observed
contaminant levels.

RECOMMENDATION C-4: Provide an additional Appendix that summarizes topics in
MARSSIM and MARLAP that are important to the MARSAME manual but are insufficiently
described in it, or at least give page references to the earlier documents. Such topics may
include aspects of quality assurance (e.g., validation and verification of results), data reliability
affected by sample dimensions, measurement frequency, and detector characteristics.  Consider
also the effect of non-random variability in measurement (e.g., fluctuating geometry or monitor
movement rate).
                                 REFERENCES

Federal Register Notice Citations:
FR, Vol. 60, March 7, 1995, p. 12555
FR, Vol. 72, No. 48, March 13, 2007, p. 11356
FR, Vol. 72, No. 184, September 24, 2007, pp. 54255 - 54257.
FR, Vol. 72, No. 89, p. 25695, May 7, 2008

ITRC, 2008 "Decontamination and Decommissioning of Radiologically Contaminated
Facilities," Interstate Technology & Regulatory Council, Radionuclides Team, Washington, DC
20001 (http://www.itrcweb.org/Documents/RAD5.pdf)

NCRP, 2002  "Managing Potentially Radioactive Scrap Metal," Report #141, National
Council on Radiation Protection and Measurements, Bethesda, MD 20814.

Rademacher, Steven E. and Joshua L. Hubbell, 2008  "Boeing Michigan Aeronautical
Research Center (BOMARC) Missile Shelters and Bunkers Scoping Survey Report," Vols. 1
&2, United States Air Force Institute for Operational Health, Surveillance Directorate,
Radiation Surveillance Division, Brooks City-Base, TX, IOH-SD-BR-SR-2008-000_, February
2008

State of Nevada, 2001  "Division of Environmental Protection Dry Wipe Sampling," Oct. 16,
2001 (7 pages) http://ndep.nv.gov/fallon/dry.pdf

State of New Jersey, 2007  "Radiological Emergency Response Plan," SOP 361 Wipe
Sampling, Rev. 9, May 2007

U.S. DOT, 49CFR173.443 "Package and Vehicle Contamination Limits.  Minimum Required
Packaging for Class  7 (Radioactive) Materials"

U.S. EPA, 2000 and 2001 "Multi-Agency Radiation Survey and Site Investigation Manual"
(MARSSIM), NUREG-1575, Rev. 1; EPA 402-R-97-016, Rev. 1; DOE/EH-0624, Rev. 1,
August 2000 and June 2001 update

U.S. EPA, 2006  "Guidance on Systematic Planning Using the Data Quality Objectives
Process (EPA QA/G-4)," EPA/240/B-06/001, February 2006

U.S. EPA, 2006b  "Multi-Agency Radiation Survey and Assessment of Materials and
Equipment Manual (MARSAME), Draft Report for Comment," NUREG-1575, Supp. 1; EPA
402-R-06-002; DOE/EH-707, December 2006

U. S. EPA, 2007a  "A Literature Review of Wipe Sampling Methods for Chemical Warfare
Agents and Toxic Industrial Chemicals," EPA/600/R-07/004, January 2007
(http://www.epa.gov/NHSRC/pubs/reportWipe042407.pdf)
U.S. EPA, 2007b  Memo from Elizabeth A. Cotsworth, Director, Office of Radiation and Indoor
Air (ORIA) to Vanessa Vu, Director, Science Advisory Board Staff Office, and entitled
"Review of the Draft Multi-Agency Radiation Survey and Assessment of Materials and
Equipment Manual," October 23, 2007

U. S. EPA/SAB, 1997  "An SAB Report: Review of the Multi-Agency Radiation Survey and
Site Investigation Manual (MARSSIM)," Prepared by the Radiation Advisory Committee
(RAC) of the Science Advisory Board, EPA-SAB-RAC-97-008, Sept. 30, 1997

U.S. EPA, 2004  "Multi-Agency Radiological Laboratory Analytical Protocols (MARLAP)
Manual,"NUREG-1576; EPA 402-B-04-001A; NTIS PB2004-10521, July 2004

U.S. EPA/SAB, 2003a  "Consultation  on Multi-Agency Radiation Site Survey Investigation
Manual (MARSSIM) Supplements for Materials and Equipment (MARSAME): A Science
Advisory Board Notification of a Consultation," EPA-SAB-RAC-CON-03-002, February 27,
2003

U. S. EPA/SAB, 2003b "Multi-Agency Radiological Laboratory Analytical Protocols
(MARLAP) Manual: An SAB Review of the MARLAP Manual and Appendices by the
MARLAP Review Panel of the Radiation Advisory Committee (RAC) of the U.S. EPA Science
Advisory Board (SAB)," EPA-SAB-RAC-03-009, June 10, 2003

U. S. EPA/SAB, 2004 "Second Consultation on Multi-Agency Radiation Site Survey
Investigation Manual (MARSSIM) Supplements for Materials & Equipment (MARSAME): A
Science Advisory Board Notification of a Consultation," EPA-SAB-RAC-CON-04-001,
February 9, 2004

U. S. NRC, 1997   "A Nonparametric Statistical Methodology for the Design and Analysis of
Final Status Decommissioning Survey," Draft Report for Comment, Washington, DC, Nuclear
Regulatory Commission (NRC) NUREG-1505, August, 1995.

U.S. NRC, 2003  "Radiological Assessment for Clearance of Materials from Nuclear
Facilities," Nuclear Regulatory Commission (NRC), Office of Nuclear Regulatory Research,
Washington DC, NUREG-1640, Vols. 1-4, June 2003.
                      Web-based Citations and Hotlinks

MARSSIM: http://epa.gov/radiation/marssim/index.html

MARLAP:
MARSAME: http://www.marsame.org
DRY WIPE SAMPLING PROTOCOL FOR THE STATE OF NEVADA:

State of Nevada, 2001  Division of Environmental Protection Dry Wipe Sampling, Oct. 16,
2001 (7 pages)  http://ndep.nv.gov/fallon/dry.pdf
  APPENDIX A - STATISTICAL ANALYSIS - AN INTRODUCTION TO
     EXPERIMENTAL DESIGN AND HYPOTHESIS TESTING WITH
                  SPECIFIC COMMENTS ON STATISTICS

       A-1  An Introduction to Experimental Design and Hypothesis Testing:

The general problem of designing a survey of the sort described in the MARSAME document
involves the following issues:

   (1) Understanding the error properties of the measurement instrument and how they can be
       manipulated (by changing counting times or performing repeated measurements of the
       same radionuclide quantity, for example). Generally the measurement error can be well
       characterized by its standard deviation σ_M.
probability of rejecting the null hypothesis at a given α.  (Note that terminology used here
follows the MARSAME Glossary and list of Symbols, Nomenclature and Notations.)

       When a single measurement is taken, the variance of that measurement will equal
σ_M² + σ_S².  In some cases the sampling distribution, and thus σ_S, may be irrelevant to a
MARSAME survey; for example, there may be no spatial variability (when there is only one level
of radiation relevant to a small item). An important issue is how the error properties of the
instrument behave when repeated measurements of the same equipment item or same portion of
material are taken. For some measuring instruments, it may be reasonable to assume that the
average of N measurements of the same unit will have standard deviation equal to σ_M/√N. This
will be the case in an idealized radiation counter, since performing additional measurements on
the same sampling unit (item) is equivalent to increasing the count time for that unit. In other
cases, inherent biases in measurement instruments may result in a measurement error shared by
all measurements.

       When sampling variability occurs (so that σ_S is not zero), the mean of a random sample
of N measurements will have variance somewhere in the range (σ_M² + σ_S²)/N to σ_M² + σ_S²/N.
The first of these corresponds to measurement errors that are completely unshared and the second
to measurement errors that are completely shared due to imperfect calibration (for example, in
the "measured efficiency" of a monitor discussed in several places in the manual). Generally, as
more measurements are taken, the contribution of the sampling variance to the variance of the
mean tends to disappear, whereas some or all of the contribution of the measurement error may
remain.  The special case when 100% of a potentially contaminated material is measured may be
regarded as the limit when N → ∞.  Again, some or all of the measurement error variance may
still remain.
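
       The two limiting cases can be illustrated with a short simulation (a sketch only; the
values of σ_M, σ_S, and N below are arbitrary):

    import numpy as np

    rng = np.random.default_rng(1)
    sigma_M, sigma_S, N = 2.0, 3.0, 25       # measurement SD, sampling SD, sample size
    trials = 100_000

    # Unshared measurement errors: each of the N readings gets its own error.
    unshared = (rng.normal(0, sigma_S, (trials, N))
                + rng.normal(0, sigma_M, (trials, N))).mean(axis=1)
    # Fully shared measurement error: one error (e.g., a calibration bias) per survey.
    shared = rng.normal(0, sigma_S, (trials, N)).mean(axis=1) + rng.normal(0, sigma_M, trials)

    print("unshared:", unshared.var(), "theory:", (sigma_M**2 + sigma_S**2) / N)
    print("shared:  ", shared.var(),   "theory:", sigma_M**2 + sigma_S**2 / N)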

       For most situations in MARSAME, the null  hypothesis concerns the difference between
background levels and the level of contamination of the M&E.  Table 5.1 (in the current
document) gives some special formulae used when counts in time follow a Poisson distribution
(so that the variability  of the counts of both background and the item of interest depends on
counting time and radiation level). In general, the variance of the difference between sampled
radioactivity and the estimate of background will require special investigation as a part of the
survey design.

       For simplicity, it is useful to denote the standard deviation of measurement minus
background as σ, which refers to the standard deviation of the estimate (often termed the
standard error) obtained from the entire measurement method (involving single readings,
multiple readings, scans of some or all of the material, etc.). This σ can be a relatively
complicated function of the underlying measurement and sampling variability (which must
include the uncertainties in the estimate of background) and may require careful study to
quantify properly.
       Once σ is determined, the power, 1-β, of a study will depend upon two other parameters:
(1) the type I error rate α and (2) the size of the assumed true difference Δ. If the standard
error of the estimate, σ, is the same for all radiation levels being measured, then the ratio
Δ/σ determines power for a given value of α (otherwise a more complicated expression is used, as
in Table 5.1 of MARSAME). For known σ, we may specify the "detectable difference Δ" by fixing
both the type I error α and the power 1-β and solving for Δ. In the MARSAME manual, this
detectable difference Δ is called the width of the "gray region."  (Differences less than this Δ
are detectable only with power less than the required 1-β and hence are "gray.")  If the action
level, AL, is defined to be the upper bound of the "gray region," then the lower bound (AL minus
the detectable difference Δ) is called the "discrimination limit" (DL). Note that implicitly the
detectable difference Δ and the DL depend upon the power, the type I error rate, and the standard
error of the estimate, σ. One confusing aspect of the MARSAME manual is that the DL is introduced
long before the concepts of power and type I error.

       The two scenarios (A and B) considered in the report both assume that the null
hypothesis is at the action level, but differ in the direction of the alternative hypothesis and
generally in the value of AL. Under scenario A, the alternative hypothesis is that the radiation
level is less than the action level (which is the upper limit above background to be allowed)
whereas under scenario B the alternative hypothesis is that the radiation level is greater than the
action level (which is typically set to background).  Under scenario A the M&E is only deemed
to be safe for release if the null hypothesis is rejected,  whereas under scenario B the M&E is
safe for release if the null hypothesis is not rejected.

       If, under scenario A, for example, the true value of the radionuclide level (or level
above background) is less than or equal to the DL, then the survey will have power 1-β to reject
the null hypothesis that the true value is equal to the AL with type I error α.  Under scenario
B, if the true value of contamination minus background is greater than the detectable difference
Δ, then the study will again have power 1-β to reject this null hypothesis at type I error rate
α.  Assuming that the standard error of the estimate, σ, does not depend upon the radiation
levels being measured, the formula for the detectable difference Δ, given α, σ, and power 1-β, is

              Detectable difference Δ = (Z_{1-β} + Z_{1-α}) σ                              (1)
The survey should be designed so that the detectable Δ meets requirements (for example, so that
the DL is not set to be too small in Scenario A, or that the upper range of the gray region is
not set too high above background in Scenario B).
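
       A numerical illustration of equation (1), with arbitrary values of α, β, and σ, is:

    from scipy.stats import norm

    alpha, beta, sigma = 0.05, 0.10, 4.0    # type I error, type II error, standard error (illustrative)
    delta = (norm.ppf(1 - beta) + norm.ppf(1 - alpha)) * sigma
    print(f"detectable difference (gray-region width) = {delta:.2f}")
    # With the AL as the upper bound of the gray region, DL = AL - delta (Scenario A).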

       In some situations (non-normal distributions, short count times), the detectable Δ will be
larger than described in equation (1), and more specialized statistical analysis may be needed.
Such techniques as segregation according to likely level of contamination may improve the
accuracy of equation (1), as will longer count times.

       Hypothesis testing (accepting or rejecting the null hypothesis) involves comparing an
estimate of the contamination level to a "critical value" (termed Sc in the manual), which allows
us to decide whether the observed estimate is consistent with the null value (at a certain type I
error level) after taking account of the variability (i.e., σ) of the measurement. For Scenario
A, this value is Sc = AL - Z_{1-α} σ, and for Scenario B it is Sc = AL + Z_{1-α} σ. By
definition, power is the probability, computed under the alternative hypothesis, of rejecting the
null hypothesis; that is, the probability that the observed estimate is less than (for Scenario
A) or greater than (for Scenario B) the critical value Sc.
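
       Under the normal, known-σ simplification used here, the critical values and the
corresponding power can be computed directly; the action levels and true net levels below are
arbitrary illustrations, not values from MARSAME:

    from scipy.stats import norm

    alpha, sigma = 0.05, 4.0                    # illustrative values only
    z = norm.ppf(1 - alpha)

    # Scenario A: H0 at the AL (here 30); release only if the estimate falls below Sc.
    AL_A = 30.0
    Sc_A = AL_A - z * sigma
    power_A = norm.cdf((Sc_A - 18.0) / sigma)   # power when the true net level is 18

    # Scenario B: H0 at the AL, typically background (here 0); reject if the estimate exceeds Sc.
    AL_B = 0.0
    Sc_B = AL_B + z * sigma
    power_B = norm.sf((Sc_B - 12.0) / sigma)    # power when the true net level is 12

    print(f"Sc_A = {Sc_A:.2f}, power_A = {power_A:.2f}; Sc_B = {Sc_B:.2f}, power_B = {power_B:.2f}")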

       If the normality of the estimate is in doubt, then other approaches to hypothesis testing
may be needed. For example, while for long count times  the Poisson distribution can be
approximated as normal for the purpose of hypothesis testing, for short count times,  specialized
formulae (see Section 5.7.1) may be needed to give a better approximation to the distribution of
(measured-baseline) for an idealized radiation counter.
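
       One way to see the size of the effect (a sketch only, not the specialized formulae of
Section 5.7.1) is to compare the exact distribution of the difference of two independent Poisson
counts, the Skellam distribution, with its normal approximation for a short and a long count:

    from scipy.stats import skellam, norm

    # Net counts = sample counts minus paired background counts, both Poisson with mean mu
    # under the null (no excess activity).
    for mu in (5, 500):
        exact = skellam.ppf(0.95, mu, mu)                       # exact 95th percentile
        approx = norm.ppf(0.95, loc=0, scale=(2 * mu) ** 0.5)   # normal approximation
        print(f"mean counts {mu}: exact 95th pct = {exact:.1f}, normal approx = {approx:.1f}")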
A-2    Specific Comments:

       Section 3.8.1 describes "Measurement Method Uncertainty," but in somewhat vaguer terms
than those used above. The intent of this section would be better understood with reference to
the suggested introduction to experimental design and hypothesis testing.

       All of Chapter 4 would be more comprehensible if it consistently  referred back to the
suggested introduction to experimental design and hypothesis testing.

       Section 4.4.1.2 gives a recommendation for how much of an impacted material should be
scanned; it is not clear to what the σ value now refers (Eq. 4-1).  This appears to be the
measurement error standard deviation σ_M.
Expanding these figures to include several additional parameters, with some supplemental text,
would be helpful.

       Recommendations for scenarios A and B are presented in Figs. A-1 and A-2. These
embellished figures, with some additional text, should also eliminate the need to repeat this
information in Chapter 5 (Figs. 5.2, 5.3, and 5.4).
[Figure A-1. Scenario A (H0: Net Activity ≥ Action Level): diagram marking zero, the
Discrimination Limit, and the Critical Value relative to the Action Level.]

                                   Figure A-1. Scenario A

[Figure A-2. Scenario B (H0: Net Activity ≤ Action Level): diagram marking zero, the
Discrimination Limit, and the Critical Value relative to the Action Level.]

                                   Figure A-2. Scenario B
       As mentioned above, the Action Level (AL) for net excess radioactivity is used in
defining the null hypothesis. However, the decision on accepting the null hypothesis is not
based on the numerical value of net radioactivity at the AL.  Rather, each sample is compared
with the Critical Value shown in the Figures. This ensures that the probability for rejecting the
null hypothesis, when it is true, will not exceed α. The Discrimination Limit (DL) is the net
radioactivity in the sample at which the probability of accepting the null hypothesis, when it is
false, is β (i.e., the power for rejecting the null hypothesis is 1-β).  The gray region is the
range of net radioactivity in the sample over which the statistical power to reject the null
hypothesis, when it is false, is less than 1-β.

       Application of Measurement Quality Objectives (MQOs), discussed in Section 5.5, is an
operational aspect of the MARSAME process. MQOs are part of the Data Quality Objectives (DQO)
process used as a platform for both the MARSSIM and MARSAME processes. MQOs were not
incorporated into the MARSSIM process, so their use is unique to MARSAME. The application of
MQOs is fairly new to decommissioning planning.  It was employed in MARLAP in 2004 for
laboratory-based measurements and has now been extended to field measurements in MARSAME.  The
Guide to the Expression of Uncertainty in Measurement (GUM), which forms the basis for much of
the conceptual and statistical framework of MQOs, was published by the International Standards
Organization (ISO) and the National Institute of Standards and Technology (NIST) in 1995 and
1994, respectively.  The topic of MQOs may be unfamiliar to many users of the MARSAME manual.
For this reason, it is important to provide a sound basis for the operational and statistical
aspects of their use.

       Some SAB MARSAME Review Panel comments, in the text and in this Appendix, specifically
address the theoretical foundations of the underlying statistical assumptions used in the
mathematical relationships and equations.  Other Panel comments address the application of MQOs
from an operational standpoint.  The identification of MQOs for certain types of measurement
cases and survey designs may be confusing to readers unfamiliar with MQO applications.
Considerable detail is provided in the manual on defining, explaining, and deriving the relevant
theoretical concepts. The writers of the MARSAME manual should ensure that operational
information on the implementation of MQOs is not buried too deeply within the theoretical
discussion. More emphasis should be placed on information applicable to identifying performance
characteristics, setting MQOs, and selecting appropriate measurement methods. Effective use of
the manual relies on readers being able to apply MQOs to their specific measurement problems.

       A summary or guide that organizes the measurement uncertainty, detectability, and
quantifiability requirements for each of the three types of MARSAME surveys (in-situ, scan-only,
and MARSSIM-type) would be beneficial to the user. The guide would collect information on the
selection of MQOs, which may be scattered throughout the chapter, into one coherent presentation
for ready reference. The guide would be useful for designing MARSAME disposition surveys, for
training activities, and for reference when regulators evaluate the measurement requirements of
disposition survey plans.

      The presentation of statistical formulations and derivations can be quite detailed and
extensive and, if not properly balanced with the operational aspects of the guidance, may detract
from the clear presentation of the guidance to the target audience. It is important to recognize
that the manual is written for those directing and implementing the process, interpreting
results, and making decisions.  The operational aspects of the guidance address this broad
audience directly; however, there is also an audience concerned with the scientific and technical
soundness of
the procedures and the rigor on which the process is founded. An appropriate balance between
the presentation of the operational aspects and the statistical foundations of the guidance is
paramount.

        The intent of Section 5.5 would be clearer if it were presented as dealing with the
factors that affect the measurement error uncertainty σ, as described in more general terms in
the suggested review of experimental design and hypothesis testing. Apparently, however, σ_M
(the standard deviation of a single measurement, not taking into account the spatial distribution
of materials or the variability of the background) is being confused with the overall σ (the
total measurement method uncertainty taking these factors into account). It is Δ/σ, not Δ/σ_M,
that determines the overall power of the experiment.  The document should clearly differentiate
these two σ's.

        Section 5.5.1, lines 289-293, seems to confuse σ_M with σ_S. It is σ_S that, generally
speaking, can be decreased by improving scan coverage (not σ_M, if this includes "shared" error
terms such as the "variance of measured efficiency").  The new terminology u_MR apparently refers
either to an estimate of the measurement error uncertainty σ_M or to the overall σ, but this is
not made clear in this section (and the requirement that u_MR < σ/3 makes no sense if σ_S can be
reduced to 0 by improving scan coverage).

       The comments on lines 302-303 seem to require that u_MR estimate the overall σ.
Example 2 is confusing because the requirement that u_MR be a factor of 10 smaller than Δ seems
to assume that u_MR is an estimate of σ_M rather than of the overall uncertainty σ (this would be
a very stringent requirement indeed). Here one needs to focus not just on σ_M but rather on the
total variability, including σ_S. If σ_S can be reduced to zero by scanning all of a material,
why is such a stringent requirement made on σ_M?

       Line 360 introduces new and not clearly defined uncertainties (u_c and
Table 5.1 gives equations for the critical value Sc under Scenarios A and B described in Chapter
4. This is expanded in Table 5.2 to the minimum detectable value, S_D. It is the smallest value
of net radioactivity, the MDC, that will yield an observed measurement greater than Sc with a
statistical power of 1-β. That is, the probability that a sample containing exactly the MDC will
be measured as less than Sc is β (i.e., a false negative).
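
       Under the normal, constant-σ simplification used earlier in this Appendix (not the
Poisson-based formulae of Tables 5.1 and 5.2), the relation among Sc, the MDC, and the power can
be sketched as follows, with illustrative values:

    from scipy.stats import norm

    alpha, beta, sigma = 0.05, 0.05, 4.0    # illustrative values only
    Sc  = norm.ppf(1 - alpha) * sigma       # Scenario B critical value with AL set at background
    MDC = Sc + norm.ppf(1 - beta) * sigma   # smallest true net value detected with power 1 - beta
    print(f"Sc = {Sc:.2f}, MDC = {MDC:.2f}")
    # The probability that a sample containing exactly the MDC gives a result below Sc is beta.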

       The equations in Tables 5.1 and 5.2 are used throughout MARSAME as examples for estimating
the critical value Sc and the MDC. These equations are based on the Poisson assumption for
counting statistics and on the distribution of the difference between two random variables that
are Poisson distributed. In effect, this implies that an independent measurement of a control is
paired with each measurement of a sample.  Sc is based on the distribution of the difference of
two random numbers selected from the same background distribution.

       Although the equations are correct, it is not common to measure a control for every
sample of unknown contamination.  This process of comparing paired samples is rare.
Generally, an estimate of background radioactivity is established, and subtracted from every
sample to estimate the "net" count.

       Tables 5.1 and 5.2 are used throughout MARSAME without reference to the assumptions used
to derive the equations. There could be serious implications for decisions on the presence of
radioactivity that use Sc, and for hypothesis testing that uses the MDC as the DL. On the other
hand, for most cases these equations might be satisfactory. It will be important for the
MARSAME manual to clarify this, and to provide more detail on how to measure and characterize
"background" in the controls that are used to determine "net" activity.

       Some examples are shown below. For this case, equations 5.1.1 (Currie) and 5.1.3
(Stapleton) were used to compute Sc. A Monte Carlo model was used to estimate Sc for paired
samples from the true background distribution (MC) and also for a constant background, equal to
the true mean, that was subtracted from a random sample of background (MCB). For these cases,
α = β = 0.05. Fig. A-3 is for the case where the sample time t_s and the background time t_b are
equal and yield a mean count of 200. The abscissa is normalized to the value of Sc obtained from
the Currie equation.

       This illustrates that Sc obtained from equation 5.1.1 does indeed correspond to the
distribution of paired samples simulated in MC. However, the value of Sc obtained by subtracting
a constant value equivalent to the mean background, MCB, is actually about 30% lower than Sc from
the equations.
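
       A sketch of the kind of simulation described (the exact model and seed used by the Panel
are not reproduced here; the values are illustrative) compares the 95th percentile of the net
count under the null for paired background subtraction and for subtraction of a constant equal to
the true mean background:

    import numpy as np

    rng = np.random.default_rng(2)
    Rb, ts, tb, n = 10.0, 20.0, 20.0, 1_000_000   # background rate, count times, trials

    sample_counts = rng.poisson(Rb * ts, n)       # sample counts under the null (no excess)
    paired_bkg    = rng.poisson(Rb * tb, n)       # an independent background count per sample

    net_paired   = sample_counts - paired_bkg * (ts / tb)   # MC: paired background subtraction
    net_constant = sample_counts - Rb * ts                  # MCB: constant (true mean) background

    Sc_paired   = np.quantile(net_paired, 0.95)
    Sc_constant = np.quantile(net_constant, 0.95)
    print(f"Sc (paired, MC) = {Sc_paired:.1f}; Sc (constant background, MCB) = {Sc_constant:.1f}")

With t_s = t_b the constant-background critical value is roughly 30 percent lower than the
paired-sample value (a factor of about 1/√2), consistent with the result stated above.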

       Fig. A-4 is for the case where the sample time t_s is 5 and the background time t_b is 50.
For this case, the background is estimated with greater precision because t_b is large. With a
constant value used to estimate background, the value of Sc is similar to that obtained from the
equations in Table 5.1; however, both MCB and the Currie equation yield a value of Sc that is
somewhat lower than that obtained from paired samples (MC) by Monte Carlo simulation.
[Figure A-3: Sc (relative counts) for Rb = 10, ts = 20, tb = 20, comparing the Currie equation,
the Stapleton equation, Monte Carlo with paired samples (MC), and Monte Carlo with a constant
background (MCB).]
                    Figure A-3. Comparison of Sc
[Figure A-4: Sc (relative counts) for ts = 5, tb = 50, comparing the Currie equation, the
Stapleton equation, MC, and MCB.]
Figure A-4. Comparison of Sc for a longer background counting period
[Figure A-5: Sc (relative counts) for Rb = 10, ts = 20, tb = 10, comparing the Currie equation,
the Stapleton equation, MC, and MCB.]
                 Figure A-5. Sc for a briefer background counting period
[Figure A-6: statistical power 1-β as a function of the percent excess count above background,
for Rb = 20 and ts = tb = 20.]
              Figure A-6. 1-β as a function of % excess count above background
       Fig. A-5 is for the case where t_s is twice the value of t_b. Values obtained for Sc using
the Currie equation are close to the value from the Monte Carlo simulation for paired samples,
but the estimate of Sc using a constant value of background is lower by about 40%.

       Fig. A-6 shows an example of the statistical power, 1-β, as a function of increasing
amounts of radioactivity above background.  The curve starting on the ordinate at a statistical
power of 0.05 represents the simulation for paired samples, and the curve starting at the origin
represents the simulation in which a constant value of background is subtracted from the sample
to form the net value. Without excess radioactivity, the probability of exceeding Sc is 0.05 for
the paired samples and 0.01 when background is a constant.  The two curves are identical when the
excess radioactivity corresponds to Sc, and there the power is 0.5.  The vertical line
corresponds to the value of MDC obtained from equation 5.2.1.  Note that the MDC (the excess at
which 1-β = 0.95) obtained from the
simulation with a constant value for background is smaller than that obtained under the
assumption of paired samples.
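
       Extending the same sketch to nonzero excess activity, and applying the single critical
value derived under the paired-sample assumption to both data-generating models, reproduces the
qualitative behavior of Fig. A-6 (parameter values are illustrative):

    import numpy as np

    rng = np.random.default_rng(3)
    Rb, ts, tb, n = 20.0, 20.0, 20.0, 200_000     # matches the Rb = 20, ts = tb = 20 case

    # Critical value based on the paired-sample assumption, as in the previous sketch.
    null_paired = rng.poisson(Rb * ts, n) - rng.poisson(Rb * tb, n) * (ts / tb)
    Sc = np.quantile(null_paired, 0.95)

    for excess_pct in (0, 10, 20, 30, 40, 50):
        Rs = Rb * (1 + excess_pct / 100)
        net_paired   = rng.poisson(Rs * ts, n) - rng.poisson(Rb * tb, n) * (ts / tb)
        net_constant = rng.poisson(Rs * ts, n) - Rb * ts
        print(excess_pct, round((net_paired > Sc).mean(), 3), round((net_constant > Sc).mean(), 3))

At zero excess, the paired-sample rejection probability is near 0.05 and the constant-background
probability near 0.01, as described above.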

       MARLAP provides additional modifications for estimating Sc when the Poisson
approximation may not be satisfied. However, it is not clear that the concerns relating to the
process of measuring controls or reference materials have been eliminated.

       Many equations have been suggested for designing and interpreting survey procedures in
MARSAME. The equations are derived from sound statistical principles, but they can lead to
incorrect conclusions if the underlying assumptions in the derivations are not satisfied. The
Panel does not recommend that each equation be derived in detail, but suggests that the
assumptions and sampling requirements needed to properly implement equations be documented
in MARSAME.

       Section 5.8, Determining Measurement Quantifiability, is a complicated way of saying that
σ must be small enough (and hence Δ/σ large enough) for the measurement method to have good power
to reject the null hypothesis that the level of radioactivity is at the AL for a reasonable Δ
(width of the gray region). The method also must give a reasonably narrow confidence interval for
the estimated value, i.e., one whose width is small compared to the value of the AL.

       One complication that is explicitly dealt with in the definition of the MQC is that the
measurement method uncertainty, i.e., σ, generally will depend upon the (unknown) true level of
radioactivity itself; for example, a perfect counter has Poisson variance equal to its mean.
Thus the MQC is just the value, y_Q, of the radioactivity level for which the ratio k = y_Q/σ is
large (the manual recommends k = 10).  If y_Q is small relative to the action limit (between 10
and 50 percent of the AL is recommended), then it is clear that (1) the detectable Δ will be
small with respect to the action limit (i.e., the DL will be close to the AL) and (2) confidence
limits around an estimated value of radioactivity will be narrow relative to the value of the AL.
Saying this clearly would improve the intelligibility of this section.
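
       For the idealized Poisson counter mentioned above, the quantifiability criterion (the
level at which the value-to-σ ratio equals k) can be solved numerically; the efficiency, count
time, and background rate below are hypothetical, not values from MARSAME:

    from scipy.optimize import brentq

    k, eff, t, Rb = 10.0, 0.3, 60.0, 2.0    # coverage factor, counting efficiency, time (s), background rate (cps)

    def sigma(y):
        # Standard deviation of the net-activity estimate (Bq) for an idealized counter,
        # assuming a paired background count of equal length.
        return ((y * eff * t + 2 * Rb * t) ** 0.5) / (eff * t)

    # Solve y = k * sigma(y); the manual's recommended k is 10.
    y_q = brentq(lambda y: y - k * sigma(y), 1e-6, 1e6)
    print(f"quantifiable level ~ {y_q:.1f} Bq (relative SD there = {sigma(y_q) / y_q:.2f})")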

       Section 5.8.1 would be more intelligible if it first noted that it is computing the MQC,
y_Q, for a fixed k from a formula for σ that takes account of several factors, which are combined
into this one σ. These factors are the length of the reading time for the source, the length of
the reading time for the background, the true value of the background reading, and an estimate of
the variance of a "shared" measurement error term, i.e., the measured efficiency of the monitor.

       Section 6.2.1 has some confusing aspects: as described earlier, the gray region is defined
in terms of the power and type I error of a test with a measurement method of total standard
deviation σ. Sentences like "Clearly MDCs must be capable of detecting radionuclide
concentrations or levels of radioactivity at or below the upper bound of the gray region" seem
tautological if the gray region is defined in terms of detection ability, specifically in terms
of power, type I error, and σ.
       Lines 215-224 of Section 6.2.3 are confusing in their statements about how individual
measurement results can be utilized for scan-only measurements.  The statement that "if
disposition decisions will be made based on the mean of the logged data, an upper confidence
level for the mean is calculated and compared to the UBGR," must be interpreted carefully.  If
one did a standard test (such as the Wilcoxon or t-test), one would ignore any uncertainty
component resulting from variability in the measurement process (i.e., measurement error shared
by all
measurements that constitute the scan).  Only if that shared error component is negligible
relative to the remaining variability would such a comparison be appropriate.
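
       A sketch of the difference this makes (the data and the assumed shared-error standard
deviation are hypothetical) is to compute the upper confidence level for the mean of logged scan
results with and without a shared error component:

    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(4)
    scans = rng.normal(5.0, 1.5, 40)     # logged scan results, illustrative units
    u_shared = 0.8                       # assumed SD of a shared (e.g., calibration) error term

    n, mean, s = scans.size, scans.mean(), scans.std(ddof=1)
    z95 = norm.ppf(0.95)
    ucl_naive    = mean + z95 * s / n ** 0.5                         # ignores the shared component
    ucl_adjusted = mean + z95 * (s ** 2 / n + u_shared ** 2) ** 0.5  # includes it

    print(f"UCL ignoring shared error:  {ucl_naive:.2f}")
    print(f"UCL including shared error: {ucl_adjusted:.2f}")

The adjusted UCL can exceed the UBGR when the naive UCL does not, which is the point of the
caution above.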
            APPENDIX  B -ACRONYMS AND ABBREVIATIONS

A           Scenario A for hypothesis testing
AL          Action Limit (or Level)
ALARA      As Low As Reasonably Achievable
α            Maximum acceptable probability for a Type I error (alpha)
AM           Arithmetic Mean
β            Maximum acceptable probability for a Type II error (beta)
B            Scenario B for hypothesis testing
1-β          Statistical power to reject the null hypothesis when it is false
CFR         Code of Federal Regulations
CON         Consultation
CQ          Charge Question (CQ1, CQ 2, CQ3)
Δ            Difference (Alternative - Null hypothesis), also the Detectable Difference
DFO         Designated Federal Officer
DL          Discrimination Limit (also Discrimination Level)
DLC         Data Life Cycle
DoD         Department of Defense (U.S. DoD)
DOE         Department of Energy (U. S. DOE)
DOT         Department of Transportation (U. S. DOT)
DQO          Data Quality Objective
EH          Environmental Safety and Health (U. S. DOE/EH)
EPA         Environmental Protection Agency (U.S. EPA)
FR          Federal Register
GUM        Guide to the Expression of Uncertainty in Measurement
H0           Null Hypothesis
IA           Initial Assessment
IAEA        International Atomic Energy Agency
ISO          International Standards Organization
ITRC        Interstate Technology & Regulatory Council
k            Coverage Factor for  Uncertainty
LBGR       Lower Bound of the  Gray Region
MARLAP       Multi-Agency Radiological Laboratory Analytical Protocols (Manual)
MARSAME      Multi-Agency Radiation Survey and Assessment of Materials and Equipment
             (Manual)
MARSSIM      Multi-Agency Radiation Survey and Site Investigation Manual
M&E        Materials and Equipment
MC           Monte Carlo simulation using paired samples from the true background distribution
MCB          Monte Carlo simulation using a constant (mean) background subtracted from a
             random sample of background
MDC          Minimum Detectable Concentration
MQC          Minimum Quantifiable Concentration
MQO        Measurement Quality Objective(s)
N           The Sample Size (N  measurements, for instance)
NCRP       National  Council on  Radiation Protection and Measurements
NHSRC      National  Homeland Security Research Center
NIST        National Institute of Standards and Technology
NRC         Nuclear Regulatory Commission (U.S. NRC)
NUREG        NUclear REGulatory Commission technical report series (U.S. NRC)
OAR        Office of Air and Radiation (U. S. EPA/OAR)
ORIA        Office of Radiation and Indoor Air (U. S. EPA/OAR/ORIA)
PAG         Protective Action Guide
pdf          Portable Document Format
q            critical value for statistical tests
QA          Quality Assurance
QC          Quality Control
QA/QC      Quality Assurance/Quality Control
Rb           Mean Background Count Rate
RAC         Radiation Advisory Committee (U.S. EPA/SAB/RAC)
rev          Revision
SAB         Science Advisory Board (U.S. EPA/SAB)
σ            Standard deviation

                    APPENDIX C - TYPOS AND CORRECTIONS

xxix line 504 power?
         522 delete one (
xxxi     561 delete one)
         567 delete one (
xxxiv    671 Technetium (sp.)
xxxv     676 delete (duplicates 675)
1-3       80 change "activity concentrations" to "area activity" or leave as is but change
              "Bq/m2" to "Bq/m3" and add "and area activity (Bq/m2)"
3-9     194 non-radionuclide-specific  (insert dash)
4-5     Figure 4.1a replace second "Large" by "Much Larger"
              Figure 4.1b replace second "Small" by "Equally Small or Smaller"
5-21     523 value in denominator should be 0.4176 (see line 527)
         527 plus should be behind square root of 87
5-53    1148 delete 2nd period
6-6      142 insert "to" behind "likely"
6-11     280 insert "that" behind "determine"
6-13     329 insert "that" behind "demonstrate"
6-23     474 and 482 critical value in symbols table is not in italics (italicized k is coverage
             factor)
7-10     210 Tl-208 should be beta/gamma, not just beta, with gamma-ray energy in next
         column
B-6       151 maximize, not minimize
D-9      219 what does "varies" mean?
D-36     849 for LS spectrometer, insert (alpha) on first line of column 2 and (gamma) for the
                     HPGe and NaI detectors
F-l       26 delete (FRER)