SAB Draft Report dated December 17, 2007 - Draft for Panel Review - Do Not Cite or Quote. This review draft is a work
in progress, does not reflect consensus advice or recommendations, has not been reviewed or approved by the Science
Advisory Board's Charter Board, and does not represent EPA policy.

UNITED STATES ENVIRONMENTAL PROTECTION AGENCY
WASHINGTON D.C. 20460

OFFICE OF THE ADMINISTRATOR
SCIENCE ADVISORY BOARD

	Date to be Inserted —

EPA-SAB-08-XXX

The Honorable Stephen L. Johnson
Administrator

U.S. Environmental Protection Agency
1200 Pennsylvania Avenue, N.W.

Washington, DC 20460

Subject: Review of a Multi-Agency Work Group Draft Document entitled "Multi-Agency Radiation Survey and Assessment of Materials and Equipment Manual (MARSAME)," Draft Report for Comment, December 2006

Dear Administrator Johnson:

The Radiation Advisory Committee (RAC) Multi-Agency Radiation Survey and Assessment of Materials and Equipment (MARSAME) Review Panel of the Science Advisory Board has completed its review of the Multi-Agency Work Group draft document entitled "Multi-Agency Radiation Survey and Assessment of Materials and Equipment Manual (MARSAME)," Draft Report for Comment, December 2006.

RESERVED FOR FUTURE DEVELOPMENT



NOTICE

This report has been written as part of the activities of the EPA Science Advisory Board
(SAB), a public advisory group providing extramural scientific information and advice to the
Administrator and other officials of the Environmental Protection Agency. The SAB is
structured to provide balanced, expert assessment of scientific matters related to problems facing
the Agency. This report has not been reviewed for approval by the Agency and, hence, the
contents of this advisory do not necessarily represent the views and policies of the
Environmental Protection Agency, nor of other agencies in the Executive Branch of the Federal
government, nor does mention of trade names of commercial products constitute a
recommendation for use. Reports and advisories of the SAB are posted on the EPA website at http://www.epa.gov/sab.


U.S. Environmental Protection Agency

Science Advisory Board
Radiation Advisory Committee (RAC)

Multi-Agency Radiation Survey and Assessment of Materials and
Equipment (MARSAME) Manual Review Panel

CHAIR:

Dr. Bernd Kahn, Professor Emeritus, School of Nuclear Engineering and Health Physics, and
Director, Environmental Radiation Center, Georgia Institute of Technology, Atlanta, GA

PAST CHAIR:

Dr. Jill Lipoti, Director, Division of Environmental Safety and Health, New Jersey Department
of Environmental Protection, Trenton, NJ

RAC MEMBERS:

Dr. Thomas B. Borak, Professor, Department of Environmental and Radiological Health
Sciences, Colorado State University, Fort Collins, CO

Dr. Antone L. Brooks, Professor, Radiation Toxicology, Washington State University Tri-
Cities, Richland, WA

Dr. Faith G. Davis, Senior Associate Dean and Director of Graduate Studies, Professor of
Epidemiology, Division of Epidemiology and Biostatistics, School of Public Health, University
of Illinois at Chicago, Chicago, IL

Dr. Brian Dodd, Consultant, Las Vegas, NV

Dr. Shirley A. Fry, M.B., B. Ch., MPH, Consultant, Indianapolis, IN

Dr. William C. Griffith, Associate Director, Institute for Risk Analysis and Risk
Communication, Department of Environmental and Occupational Health Sciences, University of
Washington, Seattle, WA

Dr. Jonathan M. Links, Professor, Department of Environmental Health Sciences, Bloomberg
School of Public Health, Johns Hopkins University, Baltimore, MD

Mr. Bruce A. Napier, Staff Scientist, Radiological Science & Engineering Group, Pacific
Northwest National Laboratory, Richland, WA


Dr. Daniel O. Stram, Professor, Department of Preventive Medicine, Division of Biostatistics
and Genetic Epidemiology, Keck School of Medicine, University of Southern California, Los
Angeles, CA

Dr. Richard J. Vetter, Head, Radiation Safety Program, Mayo Clinic, Rochester, MN

CONSULTANTS:

Mr. Bruce Church, President, BWC Enterprises, Inc., Hurricane, UT
Mr. Kenneth Duvall, Environmental Scientist/Consultant, Washington, D.C.

Dr. Janet A. Johnson, Consultant, Carbondale, CO

Dr. Paul J. Merges, President, Environment & Radiation Specialists, Inc., Loudonville, N.Y.

SCIENCE ADVISORY BOARD STAFF

Dr. K. Jack Kooyoomjian, Designated Federal Officer, 1200 Pennsylvania Avenue, NW, Washington, DC, 20460-0001, Phone: 202-343-9984, Fax: 202-233-0643 or 0645 (kooyoomjian.jack@epa.gov). Messenger/Physical Delivery Address: 1025 F Street, NW, Room 3606, Mail Code 1400F


U.S. Environmental Protection Agency
Science Advisory Board

BOARD

(Provide Roster here. DFO will get update of Charter Science Advisory Board roster and insert here.)

SCIENCE ADVISORY BOARD STAFF

Mr. Thomas Miller, Washington, DC



TABLE OF CONTENTS

1.	EXECUTIVE SUMMARY	1

2.	INTRODUCTION	3

2.1	Background	3

2.2	Review Process and Acknowledgement	4

2.3	EPA Charge to the Panel	4

3.	PRINCIPLES OF APPROACH FOR RESPONSE TO THE STATISTICS ELEMENTS OF THE CHARGE QUESTIONS	6

4.	RESPONSE TO CHARGE QUESTION 1: PROVIDING AN APPROACH FOR PLANNING, CONDUCTING, EVALUATING AND DOCUMENTING ENVIRONMENTAL RADIOLOGICAL SURVEYS TO DETERMINE THE APPROPRIATE DISPOSITION FOR MATERIALS AND EQUIPMENT	7

4.1	Charge Question #1	7

4.2	Charge Question #1a	8

4.3	Charge Question #1b	8

4.4	Charge Question #1c	9

4.5	Charge Question #1d	9

5.	RESPONSE TO CHARGE QUESTION 2: COMMENTS ON THE STATISTICAL METHODOLOGY CONSIDERED IN MARSAME	11

5.1	Charge Question #2	11

5.2	Charge Question #2a	11

5.3	Charge Question #2b	12

5.4	Charge Question #2c	12

6.	RESPONSE TO CHARGE QUESTION 3: RECOMMENDATIONS PERTAINING TO THE MARSAME ROADMAP AND APPENDICES	14

7.	ADDITIONAL SUGGESTIONS	15

APPENDIX A - STATISTICAL ANALYSIS - AN INTRODUCTION TO EXPERIMENTAL DESIGN AND HYPOTHESIS TESTING AND SPECIFIC COMMENTS ON STATISTICS	19

A-1	An Introduction to Experimental Design and Hypothesis Testing	19

A-2	Specific Comments	22

FIGURES IN APPENDIX A-2 SCENARIO A	23

FIGURES IN APPENDIX A-2 SCENARIO B	24

REFERENCES CITED	26

APPENDIX B - ACRONYMS AND ABBREVIATIONS	28

APPENDIX C - MARSAME TYPOS AND CORRECTIONS	30


1. EXECUTIVE SUMMARY

The Radiation Advisory Committee (RAC) of the Science Advisory Board (SAB) has
completed its review of the Multi-Agency Work Group draft document entitled "Multi-Agency
Radiation Survey and Assessment of Materials and Equipment Manual (MARSAME)," Draft
Report for Comment, December 2006 (U.S. EPA. 2006; See also the MARSAME Hotlink at
http://www.marsame.org). The Multi-Agency Radiation Survey and Assessment of Materials
and Equipment (MARSAME) document presents a framework for planning, implementing, and
assessing radiological surveys of material and equipment (M&E). MARSAME supplements the
Multi-Agency Radiation Survey and Site Investigation Manual (MARSSIM; See also the MARSSIM Hotlink at http://epa.gov/radiation/marssim/index.html), and refers to information provided in the Multi-Agency Radiological Laboratory Analytical Protocols manual (MARLAP; See also the MARLAP Hotlink at http://epa.gov/radiation/marlap/index.html). All of these were prepared by a work group staffed jointly by members of multiple pertinent Federal agencies.
detail and address recommendations to competent professionals and managers for performing
such surveys. The surveys are designed to compare measurements to radionuclide
concentrations specified in regulations or guides for accepting or rejecting a program or process.
Vocabulary and techniques in MARSAME are carried forward from MARSSIM and MARLAP,
with a few items added that are particularly applicable to M&E surveys.

The MARSAME document also pertains to surveying possibly radioactive M&E that
may be in nature or in commerce when considered for acceptance or release. It presents a
thorough overview of the various aspects of initial assessment, decision inputs, survey design, survey implementation, and assessment of results. In addition, some aspects, such as hypothesis testing and the statistical aspects of measurement reliability, are described in considerable
detail. A number of illustrative examples are presented, and useful information is collected in
appendices.

This review of the MARSAME document by an EPA-SAB Radiation Advisory
Committee (RAC) Panel was requested by the EPA Office of Radiation and Indoor Air (ORIA).
The review is based on reading the MARSAME Draft Report for Comment (December 2006),
presentations by MARSAME work group members on October 29-31, 2007, and
teleconferences among Panel members. The review responds to the set of charge questions posed by ORIA, but also addresses certain other technical items.

The Panel recognizes the magnitude of the effort by the work group and the value of its
product; it notes that the Panel suggestions for changes address only a small fraction of this
product. Most Panel recommendations can be summarized in the following broad categories:

• MARSAME guidance is suitable for experienced radiation protection and
surveillance staff, but use by other interested readers, such as managers, will
require that they receive special training;


•	Appropriate advice and information should be added for use of (a) available
regulations and technical guidance for the action limit (AL), (b) decontamination
applied as part of the disposition plan, and (c) measurements to distinguish
removable surface contamination and volumetric contamination from fixed
surface contamination; and

•	The specialized guidance for applying statistical tools should be separated from
the otherwise pervasive non-quantitative guidance, both for the convenience of
the general audience and for acceptance by specialists.

The above items are discussed within the context of the charge questions.

Because of the importance given by the work group to the mathematical support structure, a sub-group of the Panel has prepared a guide that places the portions of MARSAME devoted to matters such as survey design, the gray region, and hypothesis testing in a context that is easily accessible to persons generally familiar with statistical analysis. This guide appears in Appendix A to this review.


2. INTRODUCTION

2.1 Background

The MARSAME document was designed to guide a professional through all aspects of
radiological surveys of M&E prior to their intended receipt or discharge. It is written sufficiently
broadly to pertain to all types of M&E. Cited as examples are metals, concrete, tools, trash,
equipment, furniture, containers of material, and piping, among others. The presented alternative
outcomes are release or interdiction, i.e., acceptance or rejection of M&E transfer.

The document was prepared by staff working together from the following Federal
agencies: US EPA, US NRC, US DOE, and US DoD. It is part of a continuing effort that began with the writing of MARSSIM and continued with MARLAP. As a result, the methodology and associated vocabulary in MARSAME follow those of the preceding manuals, although a few aspects of MARSAME are distinct. Notably, MARSAME may be connected to MARSSIM as part of a site survey, or stand by itself in considering the transfer of M&E.

Surveys described in the MARSAME manual and its predecessors are based on the Data
Quality Objectives (DQO) process to design the best survey with regard to disposition option,
action level, and M&E description. The Data Life Cycle (DLC) supports DQO by carrying
suitable information through the planning, implementation, assessment, and decision stages of
the program. The data are collected, evaluated, and applied in terms of Measurement Quality
Objectives (MQO) established with statistical concepts of data uncertainty and minimum
quantifiable concentrations.

The MARSAME document is structured as follows, shown with the related charge
question (CQ):

Acronyms and Abbreviations
Symbols, Nomenclature, and Notations
Conversion factors
Road Map (CQ 3)

Chapter 1, Introduction and overview (CQ 1)

Chapter 2, Initial assessment of M&E (CQ la)

Chapter 3, Identify inputs for the decision (CQ lb)

Chapter 4, Survey design (CQ lc)

Chapter 5, Implementation of disposition surveys (CQ 2a)

Chapter 6, Assess the results of the disposition survey (CQ 2b)

Chapter 7, Case studies (CQ Id and 2c)

7 Appendices (CQ 3)

References

Glossary


Responding to the charge questions was the primary purpose of the RAC Panel and is
addressed first. The Panel also addressed some other topics, commented in detail on the
MARSAME discussion of statistical aspects, and suggested corrections where needed.

2.2 Review Process and Acknowledgement

The U.S. EPA's Office of Radiation and Indoor Air (ORIA), on behalf of the Federal
Agencies participating in the development of the MARSAME Manual, requested the U.S. SAB
to provide advice on a draft Multi-Agency Work Group document entitled "Multi-Agency
Radiation Survey and Assessment of Materials and Equipment (MARSAME) Manual," December 2006. MARSAME is a supplement to the "Multi-Agency Radiation Survey and Site Investigation Manual" (MARSSIM, EPA 402-R-97-016, Rev. 1, August 2000 and June 2001
update). The SAB Staff Office announced this advisory activity and requested nominations for
technical experts to augment the SAB's Radiation Advisory Committee (RAC) in the Federal
Register (72 FR 11356; March 13, 2007). MARSAME was developed collaboratively by the
Multi-Agency Work Group (60 FR 12555; March 7, 1995) and provides technical information on
approaches for planning, conducting, evaluating, and documenting radiological disposition
surveys to determine proper disposition of materials and equipment (M&E). The techniques,
methodologies, and philosophies that form the basis of this manual have been developed to be
consistent with current Federal limitations, guidelines, and procedures.

The SAB RAC MARSAME Review Panel met in an initial public teleconference meeting
on Tuesday, October 9, 2007 to introduce the subject and discuss the charge to the Panel,
determine if the review and background materials provided are adequate to respond to the charge
questions directed to the SAB's RAC MARSAME Review Panel, and agree on charge
assignments for the Panelists. The purpose of the meeting of Monday, October 29 through
Wednesday, October 31, 2007 was to receive presentations by the Multi-Agency Work Group
staff, deliberate on the charge questions, and draft a report in response to the charge questions

pertaining to the draft MARSAME Manual, dated December 2006. (Continue with the Dec. 21, 2007 and March 10, 2008 conference calls, etc. — KJK)

2.3 EPA Charge to the Panel

The EPA's Science Advisory Board (SAB) conducted the scientific peer reviews of the
companion multi-agency documents MARSSIM (EPA-SAB-RAC-97-008, dated September 30,
1997) and MARLAP (EPA-SAB-RAC-03-009, dated June 6, 2003), and the Federal agencies
participating in those peer reviews found the process used by the SAB to be extremely beneficial
in assuring the accuracy and usability of the final manuals. Consequently, two consultations
have taken place for MARSAME (EPA-SAB-RAC-CON-03-002, dated February 27, 2003, and
EPA-SAB-RAC-CON-04-001, dated February 9, 2004). On behalf of the four participating
Federal agencies, the EPA's Office of Radiation and Indoor Air (ORIA) is requesting that the
SAB conduct the formal technical peer review of the draft MARSAME.


The following charge questions were posed to the SAB RAC's MARSAME Review Panel
(U.S. EPA. 2007):

1)	The objective of the draft MARSAME is to provide an approach for planning, conducting,
evaluating, and documenting environmental radiological surveys to determine the appropriate
disposition for materials and equipment with a reasonable potential to contain radionuclide
concentration(s) or radioactivity above background. Please comment on the technical
acceptability of this approach and discuss how well the document accomplishes this objective.
In particular, please

a)	Discuss the adequacy of the initial assessment process as provided in MARSAME
Chapter 2, including the new concept of sentinel measurement (a biased measurement
performed at a key location to provide information specific to the objectives of the Initial
Assessment).

b)	Discuss the clarity of the guidance on developing decision rules, as provided in
MARSAME Chapter 3.

c)	Discuss the adequacy of the survey design process, especially the clarity of new
guidance on using Scenario B, and the acceptability of new scan-only and in-situ survey
designs, as detailed in MARSAME Chapter 4.

d)	Discuss the usefulness of the case studies in illustrating new concepts and guidance, as
provided in MARSAME Chapter 7.

2)	The draft MARSAME, as a supplement to MARSSIM, adapts and adds to the statistical
approaches of both MARSSIM and MARLAP for application to radiological surveys of materials
and equipment. Please comment on the technical acceptability of the statistical methodology
considered in MARSAME and note whether there are terminology or application assumptions
that may cause confusion among the three documents. In particular, please

a)	Discuss the adequacy of the procedures outlined for determining measurement
uncertainty, detectability, and quantifiability, as described in MARSAME Chapter 5.

b)	Discuss the adequacy of the data assessment process, especially new assessment
procedures associated with scan-only and in-situ survey designs, and the clarity of the
information provided in Figures 6.3 and 6.4, as detailed in MARSAME Chapter 6.

c)	Discuss the usefulness of the case studies in illustrating the calculation of
measurement uncertainty, detectability, and quantifiability, as provided in MARSAME
Chapter 7.

3)	The draft MARSAME includes a preliminary section entitled Roadmap as well as seven
appendices. The goal of the Roadmap is to assist the MARSAME user in assimilating the
information in MARSAME and determining where important decisions need to be made on a
project-specific basis. MARSAME also contains appendices providing additional information on
the specific topics. Does the SAB have recommendations regarding the usefulness of these
materials?


3. PRINCIPLES OF APPROACH FOR RESPONSE TO THE
STATISTICS ELEMENTS OF THE CHARGE QUESTIONS

(This section is reserved for the present time for discussion on statistics that might need to be moved into the body of the text. — KJK)

Detailed discussions of statistical analysis related to experimental design and hypothesis testing permeate the otherwise general guidance for M&E surveys. The Panel responses and comments are compiled in Appendix A on Statistical Analysis rather than scattered throughout this review. Appendix A consists of an introduction describing the view of the Panel, followed by specific comments from the reviewers. Responses to individual charge questions refer to Appendix A where appropriate.


4. RESPONSE TO CHARGE QUESTION 1: PROVIDING AN APPROACH
FOR PLANNING, CONDUCTING, EVALUATING AND DOCUMENTING
ENVIRONMENTAL RADIOLOGICAL SURVEYS TO DETERMINE THE
APPROPRIATE DISPOSITION FOR MATERIALS AND EQUIPMENT

4.1 Charge Question 1: The objective of the draft MARSAME is to provide an approach for planning, conducting, evaluating, and documenting environmental radiological surveys to determine the appropriate disposition for materials and equipment with a reasonable potential to contain radionuclide concentration(s) or radioactivity above background. Please comment on the technical acceptability of this approach and discuss how well the document accomplishes this objective.

The MARSAME manual is an excellent technical document that adequately describes a robust assessment process. The Panel suggests some improvements to (1) describe "alternate approaches or modification" for applying MARSAME, as discussed in Chapter 1, lines 50 - 56; and (2) design the manual for use by others, notably project managers, beyond "the technical audience having knowledge of radiation health physics and an understanding of statistics" plus the further capabilities described in Chapter 1, lines 187 - 194. One aspect that appears to be missing is the option of decontaminating the M&E as part of the process when considering alternate actions.

SUGGESTION 1-1: Separate the discussion that begins on line 49 by creating a sub-section to present clearly the concept of simple alternatives to what may appear to the reader to be a major undertaking. Follow this paragraph with sufficient detail and references to later chapters to assure the reader that M&E that can reasonably be expected to have little or no radioactive contamination can be processed without excessive effort under the MARSAME system. One approach identified subsequently is applying standard operating procedures (SOPs). Categorization as non-impacted or as Class 3 M&E based on historical data can lead to an appropriately simple process.

SUGGESTION 1-2: Insert a paragraph after line 196 to address use by persons less skilled professionally than defined in the preceding paragraph. For such users, reference to Appendices B, C, and D would be helpful. Adding another appendix that includes portions of the MARSSIM Roadmap and Chapters 1 and 2 could provide suitable background information without requiring that all of MARSSIM be read. Presentation of training courses for managers and other generalists with responsibility for radiation surveys would be most helpful.

SUGGESTION 1-3: Insert a sub-section in Chapter 1 and also in appropriate subsequent
chapters to consider various degrees of M&E decontamination as part of the available options
associated with a MARSAME survey. Note that storage for radioactive decay is one option for
decontamination.


4.2	Charge Question 1a: Discuss the adequacy of the initial assessment process as provided in MARSAME Chapter 2, including the new concept of sentinel measurement (a biased measurement performed at a key location to provide information specific to the objectives of the Initial Assessment).

The initial assessment process is adequate as described. The Panel recognizes that many measurements made throughout the MARSAME process could be biased. Some additional information sources and M&E categories may be helpful.

SUGGESTION 1a-1: The discussion in Chapter 2, lines 104 - 115 could include reviewing the files (inspection reports, incident analyses, and compliance history) of currently and formerly involved regulatory agencies. Discussions with these agencies and their inspectors could also be fruitful.

SUGGESTION 1a-2: The listing of complexity attributes in Table 2.1 could include TSCA materials and hazardous waste.

Sentinel measurements, as described for the initial assessment process of MARSAME, have been commonly applied. They are rational and useful for obtaining an initial idea of the type and magnitude of radioactive contaminants. Because they are not randomly selected, by definition they are biased. These measurements and their applicability and limitations are well described in the document, and their use is clear. In fact, wider application appears practical.

SUGGESTION 1a-3: In Chapter 1, lines 253 - 259, MARSAME should recognize that Sentinel measurement is important because often it is all that is available historically for the initial assessment (IA). Hence, considering it to be "limited data" can be misleading. Moreover, for Chapter 2, lines 277 - 280, design of a preliminary survey for radioactive contaminants to fill in knowledge gaps often depends on the availability of data from Sentinel measurements, and in some instances only further Sentinel measurements are possible.

4.3	Charge Question 1b: Discuss the clarity of the guidance on developing decision rules, as provided in MARSAME Chapter 3.

This chapter devoted to developing decision rules is most useful. The decision rules are
admirably clear. Some additions will surely benefit the reader.

SUGGESTION 1b-1: The regulations or guidance for radionuclide clearance that define the action level discussed in Chapter 3, lines 118 - 120 are sufficiently important to be presented here, rather than in the obscurity of Appendix E. This information includes Table E.2 for DOE regulations and Table E.3 for NRC regulations. Additional information, for example, the guidance reported in Table 5.1 of NCRP (2002) on volumetric clearance standards, should also be given here to present the thinking of national and international standards and guidance groups.

SUGGESTION 1b-2: Information that describes the radioactive contaminant listed in lines 141 - 147 should include removable vs. fixed surface contamination. Further, insertion of a sub-section
that discusses the planning implications of removable vs. fixed and surface vs. volumetric contamination would be helpful to the user.

SUGGESTION 1b-3: The discussion concerning measurement method uncertainty, detection capability, and quantification capability on lines 567 - 622 takes the MARSAME presentation from broad guidance to specific statistical tutorial. The tutorial raises certain questions for some general readers and other questions for some professionals. One approach is to maintain the less specific tone of MARSAME in these three sub-sections and refer to a detailed discussion of statistical aspects as proposed in SUGGESTIONS 1c-1 and 2a-1.

SUGGESTION 1b-4: Please clarify the following: Why is the MDC recommended for the MQO on lines 593 - 597 instead of the MQC? How does item #1 differ from item #3 on lines 609 - 617?

4.4	Charge Question 1c: Discuss the adequacy of the survey design process, especially the clarity of new guidance on using Scenario B, and the acceptability of new scan-only and in-situ survey designs, as detailed in MARSAME Chapter 4.

With the exception of Section 4.2, Statistical Decision Making, Chapter 4 is easily
understood by the general reader. Classification of M&E is an effective approach and helpful.
The Disposition Survey Design and Documentation sections are well prepared. Regarding
statistical decision making, the concepts of hypothesis testing and uncertainty per se are readily
understood. However, the concept of uncertainty with default significance levels and the
resulting gray area and discrimination limits leading to minimum quantifiable concentrations are
not so readily assimilated. An extended consideration of the statistical approach has been
prepared and is attached to this review as Appendix A.

SUGGESTION 1c-1: Consider maintaining the same level of generalized guidance that pervades most of MARSAME in brief sub-sections that address statistical matters. Collect the mathematical discussion in a separate chapter, as proposed in SUGGESTION 2a-1. The discussion in Chapter 19, Measurement Statistics, of MARLAP can serve as an example. Separation will serve both the specialist in statistics, who will appreciate the exposition in the new chapter, and those with less training in statistics, who will follow the general import of the MARSAME approach in the existing chapters.

4.5	Charge Question 1d: Discuss the usefulness of the case studies in illustrating new concepts and guidance, as provided in MARSAME Chapter 7.

Case studies can be useful for clarifying the MARSAME process and guiding the user. Although the Panel was informed by the MARSSIM Multi-Agency Work Group that Chapter 7 contains not case studies but made-up illustrative examples, these also can be helpful if created to represent actual situations. When an illustrative example fails to match a real situation, some changes in the presented example can improve it.


SUGGESTION 1d-1: Delete or replace the example for SOP use in Section 7.2. Given the good discussion in Section 3.10 for improving an SOP within the MARSAME framework, the example of applying SOPs at a nuclear power station appears to contribute little.

SUGGESTION 1d-2: The example in Section 7.3 of mineral processing of concrete rubble is instructive. The reader should be informed that many more measurement results than those listed in Table 7.3 are ordinarily obtained, but were not included here to conserve space. The radionuclide concentrations reported on lines 213 - 214 either should be confirmed as typical values or the reader should be cautioned that they are not. For the same reason, the AL taken from NUREG-1640 should be identified as a specific selection, not a general limit. Inserting boxes with interpretive comments would help the reader to understand the process and the decisions made.

SUGGESTION 1d-3: The sheer length of the 21-page example in Section 7.4 of the baseline survey of a rented front loader discourages its application. An introductory statement should explain that details are needed to describe the mechanism of the survey, but that the actual work is brief. This survey provides a good opportunity to present Sentinel measurements and the comparison of removable and fixed surface contamination. An actual case history undoubtedly would show these and also contain a table of survey measurements.

SUGGESTION 1d-4: The illustrative example headings would benefit from inclusion of a statement that they demonstrate the MARSAME process.


5. RESPONSE TO CHARGE QUESTION 2: COMMENTS ON THE
STATISTICAL METHODOLOGY CONSIDERED IN MARSAME

5.1	Charge Question # 2: The draft MARSAME, as a supplement to MARSSIM, adapts and
adds to the statistical approaches of both MARSSIM and MARLAP for application to
radiological surveys of materials and equipment. Please comment on the technical
acceptability of the statistical methodology considered in MARSAME and note whether there
are terminology or application assumptions that may cause confusion among the three
documents.

MARSAME contains tables and text that carefully compare the three documents and
identify consistencies and differences. To those familiar with the three documents, application of
the statistical methodology in MARSAME appears to match that used in MARSSIM and
MARLAP to the extent observable over the existing wide range of applications.

A shift appears to have occurred from use of the Data Quality Objective (DQO)
terminology of MARSSIM to the Measurement Quality Objective (MQO) of MARSAME, but
the principle is comprehensible. It is clear that MARSAME has close connections to MARSSIM
in surveys of M&E that were located at MARSSIM sites. The manual recognizes that M&E
moved onto the site or used to process and survey the site subject to MARSSIM also may be
considered under MARSAME. In addition, M&E unconnected with MARSSIM sites are subject
to MARSAME.

5.2	Charge Question # 2a: Discuss the adequacy of the procedures outlined for determining
measurement uncertainty, detectability, and quantifiability, as described in MARSAME,
Chapter 5.

The presentation for determining uncertainty, detectability, and quantifiability in Chapter 5, as well as aspects of this discussion in Chapters 4 and 6, follows the well-developed path in MARSSIM and MARLAP. Problems to be considered are whether comprehension and correct application by the user require (1) previous reading of MARSSIM and MARLAP, and (2) the expertise and knowledge specified in Chapter 1, lines 189 - 194.

SUGGESTION 2a-1: Improve understanding of the mathematically detailed statistical exposition in MARSAME by separating it in its entirety into a chapter that could be entitled "Review of Experimental Design and Hypothesis Testing". Appendix G can be included in this chapter. The chapter can be placed before Chapter 4 or after Chapter 6. All sections currently in Chapters 4 - 6 that discuss aspects of these items, including measurement uncertainty, detectability, and quantifiability, should be kept in place but revised to present generalized discussions of these matters, with reference to the technical discussions, equations, and tables in the new chapter.

SUGGESTION 2a-2: Refer to Appendix A for a detailed set of comments concerning the topics
of experimental design, hypothesis testing, and the statistical aspects of uncertainty.


5.3	Charge Question # 2b: Discuss the adequacy of the data assessment process, especially
new assessment procedures associated with scan-only and in-situ survey designs, and the
clarity of the information provided in Figures 6.3 and 6.4.

The data assessment process is carefully presented and thoroughly explored. Much good
advice is given and the examples are helpful.

Suggestions for statistical considerations are presented in Appendix A. The information
presented in Figures 6.3 and 6.4 is clear, but minor changes are proposed. The need to address
removable and fixed surface contamination and volumetric contamination in all chapters is
emphasized.

SUGGESTION 2b-1: In Fig. 6.3, clarify the distinction of a MARSSIM-type survey by moving "Start" to immediately above the decision point "Is the survey design scan-only or in-situ?" and then connecting this to the decision point "Is the AL equal to zero or background?". A "yes" leads to "Requires Scenario B" and a "no" leads to "Disposition decision based on mean".

SUGGESTION 2b-2: In Fig. 6.4, for a more consistent presentation, insert a decision diamond
after "Perform the sign test" and "Perform the WRS test" that says "Use scenario A", at both
locations, followed by a "yes" or "no" leading to the two branches at both locations.

SUGGESTION 2b-3: Insert sub-sections in all chapters to address the implementation and
assessment of survey processes that distinguish between surface and volumetric contamination
(i.e., repeated measurement after surface removal) and between removable and fixed surface
contamination (i.e., wipe test results compared to total surface activity). These types of
contamination are described in Chapter 1, lines 127 - 152, but their implications are
insufficiently considered throughout MARSAME. Concerns include difficulties in characterizing
the depth of volumetrically distributed radionuclides, quantifying radionuclides that emit no
gamma rays, and subsequent contamination of persons and surfaces by removable radionuclides.

5.4	Charge Question # 2c: Discuss the usefulness of the case studies in illustrating the calculation of measurement uncertainty, detectability, and quantifiability, as provided in MARSAME Chapter 7.

As stated in the response to Charge Question 1d, case studies are invaluable in guiding the user through complex operations. The illustrative examples with which the case studies were replaced lack the realistic data accumulation that permits estimation of uncertainty. Excessively detailed calculations are provided on lines 579 - 628, 658 - 565, and 682 - 689. For discussions related to uncertainty, refer to Appendix A.

SUGGESTION 2c-1: Move the detailed calculations identified above to the separate chapter recommended for discussion of experimental design and hypothesis testing.

SUGGESTION 2c-2: Use the illustrative examples to demonstrate distinctions such as interdiction vs. release and Scenarios A vs. B.


SUGGESTION 2c-3: Use the illustrative example in Sections 7.4 and 7.5 to demonstrate the benefit of smears (wipe tests) to determine removable surface contaminants. Experience suggests that the contaminant usually is in this form on M&E such as earth-moving equipment.



6. RESPONSE TO CHARGE QUESTION 3: RECOMMENDATIONS
PERTAINING TO THE MARSAME ROADMAP AND APPENDICES

Charge Question 3: The draft MARSAME includes a preliminary section entitled Roadmap
as well as seven appendices. The goal of the Roadmap is to assist the MARSAME user in
assimilating the information in MARSAME and determining where important decisions need
to be made on a project-specific basis. MARSAME also contains appendices providing
additional information on the specific topics. Does the SAB have recommendations regarding
the usefulness of these materials?

The Roadmap is crucial in guiding the reader through a document as complex as
MARSAME. The appendices are useful in various ways, such as providing information
compilations and statistical tables, and avoiding the need to seek this information in MARSSIM
and MARLAP. Also necessary to the reader are the acronyms and abbreviations; symbols,
nomenclature, and notations; and glossary. The following suggestions are intended to enhance
their use.

SUGGESTION 3-1: Roadmap Figure 1 connects the MARSAME chapters in terms of the Data
Life Cycle. Is it possible to draw an analogous connection with Roadmap Figures 2, 3, 5, 6, 7,
and 8? At present, the only Roadmap Figures connected to each other are 2, 3, and 4, and 7 with
8.

SUGGESTION 3-2: Assist project managers by highlighting major operational decision points
in the roadmaps.

SUGGESTION 3-3: Indicate in the body of the text that Appendices B, C, and D are useful
overviews of the environmental radiation background, sources of radionuclides, and radiation
detection instruments, respectively, for managers and generalists, although they are too general
for the experienced health physicist to whom the manual is addressed.

SUGGESTION 3-4: Move Tables E.2 and E.3 and associated comments from Appendix E to
Section 3.3, of which the tables should be an integral part.

SUGGESTION 3-5: Either move Appendix G into the new chapter on experimental design and
hypothesis testing or indicate its relation to that new chapter.

SUGGESTION 3-6: Move the Glossary to the front to join the tables of acronyms and of
symbols.


7. ADDITIONAL SUGGESTIONS

SUGGESTION C-1: Discuss decisions leading to selecting the degree of confidence, embedded in the choice of alpha and beta values, in a section of Chapter 3. Ultimately, the selection may be a matter of the acceptable uncertainty specified by the agency that sets the action level.

SUGGESTION C-2: Discuss the impact of survey cost and time frame on the MARSAME
effort in a section of Chapter 2. Very brief or lengthy projects obviously need different designs.
Data retention becomes important in long projects, especially if contractors replace each other.

SUGGESTION C-3: Discuss in a section in Chapter 6 the options to be considered and pursued
when the plan proposed initially for M&E transfer must be rejected.

SUGGESTION C-4: Provide references (possibly to MARSSIM) for aspects of the
MARSAME process that are discussed in much less detail than statistics. Among such topics are
quality assurance (including validation and verification of results), the relation of radionuclide
concentrations to radiation exposure (dose) for various radionuclide distributions in M&E,
importance of sample dimensions or measurement frequency, and the effect of non-random
variability in measurement (e.g., fluctuating geometry or monitor movement rate).


REFERENCES CITED

(Alphabetical and date sequenced listing of Author Last name, First name, Middle Initial, Title,
Date, etc. To be refined in later versions — KJK).

Federal Register Notice Citations:

FR, Vol. 60, No. , March 7, 1995, p. 12555

FR, Vol. 72, No. , March 13, 2007, p. 11356

FR, Vol. 72, No. 184, September 24, 2007, pp. 54255 - 54257.

FR, Vol. 72, No. , Date, pp. - . (Charter Board Mtg. announcement to be added)

NCRP. 2002. Managing Potentially Radioactive Scrap Metal, Report #141, National Council on
Radiation Protection and Measurements, Bethesda, MD 20814.

U.S. EPA. 2000 and 2001. "Multi-Agency Radiation Survey and Site Investigation Manual" (MARSSIM). NUREG-1575, Rev. 1; EPA 402-R-97-016, Rev. 1; DOE/EH-0624, Rev. 1, August 2000 and June 2001 update

U.S. EPA. 2006. "Multi-Agency Radiation Survey and Assessment of Materials and
Equipment Manual (MARSAME), Draft Report for Comment," NUREG-1575, Supp. 1; EPA
402-R-06-002; DOE/EH-707, December 2006

U.S. EPA. 2007. Memo from Elizabeth A. Cotsworth, Director, Office of Radiation and Indoor Air (ORIA), to Vanessa Vu, Director, Science Advisory Board Staff Office, entitled "Review of the Draft Multi-Agency Radiation Survey and Assessment of Materials and Equipment Manual," October 23, 2007

U.S. EPA/SAB. 1997. "An SAB Report: Review of the Multi-Agency Radiation Survey and Site Investigation Manual (MARSSIM)," Prepared by the Radiation Advisory Committee (RAC) of the Science Advisory Board, EPA-SAB-RAC-97-008, Sept. 30, 1997

U.S. EPA/SAB. 2002. "Panel Formation Process: Immediate Steps to Improve Policies and Procedures: An SAB Commentary," EPA-SAB-EC-COM-02-003, May 17, 2002

U.S. EPA. 2004. "Multi-Agency Radiological Laboratory Analytical Protocols (MARLAP) Manual," NUREG-1576; EPA 402-B-04-001A; NTIS PB2004-10521, July 2004


U.S. EPA/SAB. 2003a. "Consultation on Multi-Agency Radiation Site Survey Investigation Manual (MARSSIM) Supplements for Materials and Equipment (MARSAME): A Science Advisory Board Notification of a Consultation," EPA-SAB-RAC-CON-03-002, February 27, 2003

U.S. EPA/SAB. 2003b. "Multi-Agency Radiological Laboratory Analytical Protocols
(MARLAP) Manual: An SAB Review of the MARLAP Manual and Appendices by the
MARLAP Review Panel of the Radiation Advisory Committee (RAC) of the U.S. EPA Science
Advisory Board (SAB)," EPA-SAB-RAC-03-009, June 10, 2003

U.S. EPA/SAB. 2004. "Second Consultation on Multi-Agency Radiation Site Survey Investigation Manual (MARSSIM) Supplements for Materials & Equipment (MARSAME): A Science Advisory Board Notification of a Consultation," EPA-SAB-RAC-CON-04-001, February 9, 2004

U.S. NRC. 1997. "A Nonparametric Statistical Methodology for the Design and Analysis of Final Status Decommissioning Surveys," Draft Report for Comment, Washington, DC, Nuclear Regulatory Commission (NRC), NUREG-1505, August 1995


Web-based Citations and Hotlinks

(e.g., Provided below as an illustrative example. Needs more work —KJK)

MARSSIM: http://epa.gov/radiation/marssim/index.html

MARSAME: http://www.marsame.org

MARLAP: http://epa.gov/radiation/marlap/index.html


APPENDIX A - STATISTICAL ANALYSIS - AN INTRODUCTION TO
EXPERIMENTAL DESIGN AND HYPOTHESIS TESTING AND
SPECIFIC COMMENTS ON STATISTICS

A-1 An Introduction to Experimental Design and Hypothesis Testing:

The general problem of design of a survey of the sort described in the MARSAME document
involves the following issues:

(1)	Understanding the error properties of the measurement instrument and how these can be manipulated (by changing counting times or performing repeated measurements of the same dose quantity, for example). Generally the measurement error can be well characterized by its standard deviation σ_M. This value may be a constant (all measurements having the same standard deviation) or it may vary with radiation level (as in the behavior of an idealized radiation counter);

(2)	Understanding the distribution of dose in the population of equipment or materials that are to be measured. This distribution can often be well characterized by a standard deviation σ_S, which we may call the sampling standard deviation;

(3)	Deciding upon the number of samples, N, from the distribution of activity that will be used in the detection problem;

(4)	Specifying the null and alternative hypotheses to be examined;

(5)	Specifying the type I error (α) allowed;

(6)	Finding a specific alternative hypothesis (usually parameterized by a difference Δ = alternative value − null value) for which the power to reject the null hypothesis takes a specified value 1−β.

From a statistical standpoint, designing an experiment means finding values of the sample size N and the detectable difference Δ that will control the type I error and power, given the instrument's measurement error properties and the sampling dose distribution.
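As a purely illustrative sketch (the numeric values of σ_M, σ_S, α, and β below are assumed for illustration and are not taken from MARSAME), this design calculation can be written out for the simplest case of a one-sided test on a normally distributed estimate with unshared measurement errors:

# Illustrative design calculation, assuming a one-sided z-test and unshared
# measurement errors, so the standard error of the mean of N measurements is
# sqrt((sigma_M**2 + sigma_S**2)/N). All numeric inputs are assumed values.
from scipy.stats import norm

def detectable_difference(sigma, alpha=0.05, beta=0.10):
    # Smallest true difference rejected with power 1 - beta when the estimate
    # has standard error sigma; this is the width of the gray region.
    return (norm.ppf(1 - alpha) + norm.ppf(1 - beta)) * sigma

def required_n(delta, sigma_m, sigma_s, alpha=0.05, beta=0.10):
    # Sample size N that makes the detectable difference equal to delta.
    z = norm.ppf(1 - alpha) + norm.ppf(1 - beta)
    return (z / delta) ** 2 * (sigma_m ** 2 + sigma_s ** 2)

print(detectable_difference(sigma=0.2))                  # Delta of about 0.585
print(required_n(delta=0.25, sigma_m=0.5, sigma_s=0.3))  # about 46.6; round up to 47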

In MARSAME, the null and alternative hypotheses generally concern the true difference in levels between a potentially contaminated material or piece of equipment and the appropriate background reference. In Scenario A, the null hypothesis is that the M&E is at least as radioactive (over background) as some number called the action level (AL), and the alternative is that the true concentration is less than the AL. In Scenario B, the null hypothesis is that the M&E is at the action level (which usually equals the background in Scenario B) and the alternative hypothesis is that the M&E is over the AL.


When a single measurement is taken, the variance of that measurement will be equal to σ_M² + σ_S². In some cases, the sampling distribution, and thus σ_S, may be irrelevant to a MARSAME survey; for example, there may be no spatial variability (when there is only one level of radiation relevant to a small item, for example). An important issue is how the error properties of the instrument behave when repeated measurements of the same equipment item or same portion of material are taken. For some measuring instruments, it may be reasonable to assume that the average of N measurements of the same unit will have standard deviation equal to σ_M/√N. This will be the case for an idealized radiation counter, since performing additional measurements on the same sampling unit (item) is equivalent to increasing the count time for that unit. In other cases, there may be inherent biases in measurement instruments so that some or all of the measurement error is shared for all measurements. When sampling variability is present (so that σ_S is not zero), the mean of a random sample of N measurements will have variance somewhere in the range (σ_M² + σ_S²)/N to σ_M² + σ_S²/N. The first of these corresponds to measurement errors that are completely unshared and the second corresponds to measurement errors that are completely shared due, for example, to imperfect calibration (as in the "measured efficiency" of a monitor discussed in several places in the document). Generally, as more and more measurements are taken, the contribution of the sampling variance to the variance of the mean disappears, whereas some or all of the contribution of the measurement error may remain. The special case when 100 percent of a potentially contaminated material is measured may be regarded as the limit when N → ∞. Again, some or all of the measurement error variance may still remain.
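A brief simulation (illustrative only; the parameter values below are assumed) makes the two bounds concrete:

# Simulation of the shared- versus unshared-error bounds described above. With
# fully unshared errors, Var(mean) is about (sigma_M**2 + sigma_S**2)/N; with a
# fully shared (calibration-type) error, only the sampling part averages away,
# leaving about sigma_M**2 + sigma_S**2/N. All parameter values are assumed.
import numpy as np

rng = np.random.default_rng(0)
sigma_m, sigma_s, n, surveys = 0.5, 0.3, 10, 200_000

levels = rng.normal(0.0, sigma_s, size=(surveys, n))            # sampling variability
unshared = levels + rng.normal(0.0, sigma_m, size=(surveys, n)) # fresh error each reading
shared = levels + rng.normal(0.0, sigma_m, size=(surveys, 1))   # one shared bias per survey

print(unshared.mean(axis=1).var())  # close to (0.25 + 0.09)/10 = 0.034
print(shared.mean(axis=1).var())    # close to 0.25 + 0.09/10   = 0.259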

For most situations covered by MARSAME, the null hypothesis concerns the difference
between background levels and the level of contamination of the M&E. Table 5.1 (in the current
document) gives some special formulae used when counts in time follow a Poisson distribution
(so that the variability of the counts of both background and the item of interest depends on
counting time and radiation level). In general, the variance of the difference between sampled
radioactivity and the estimate of background will require special investigation as a part of the
survey design.

For simplicity, it is useful to denote the standard deviation of measurement minus background as σ, which refers to the standard deviation of the estimate (often termed the standard error) obtained from the entire measurement method (involving either single readings, multiple readings, scans of some or all of the material, etc.). This σ can be a relatively complicated function of the underlying measurement and sampling variability (which must include the uncertainties in the estimate of background) that may require careful study to quantify properly.

Once σ is determined, the power, 1−β, of a study will depend upon two other parameters: (1) the type I error rate α and (2) the size of the assumed true difference Δ. If the standard error of the estimate, σ, is the same for all radiation levels being measured, then the ratio Δ/σ determines power (otherwise a more complicated expression is used, as in Table 5.1 of MARSAME).

For known σ, we may specify the detectable difference Δ by fixing both the type I error α and the power 1−β and solving for Δ. In the MARSAME document, this detectable difference Δ is called the width of the "gray region". (Differences less than this Δ are only detectable with power less than the required 1−β and hence are "gray".) If the action level, AL, is defined to be the upper bound of the gray region, then the lower bound (AL − Δ) is called the "discrimination limit" (DL). Note that implicitly the detectable difference Δ and the discrimination limit DL depend upon the power, the type I error rate, and the standard error of the estimate σ. One of the confusing aspects of the MARSAME document is that the DL is introduced long before the concepts of power or type I error.

The two scenarios (A and B) considered in the report both assume that the null
hypothesis is at the action level, but differ in the direction of the alternative hypothesis and
generally in the value of AL. Under scenario A, the alternative hypothesis is that the radiation
level is less than the action level (which is the upper limit above background to be allowed)
whereas under scenario B the alternative hypothesis is that the radiation level is greater than the
action level (which is typically set to background). Under scenario A the M&E is only deemed to
be safe for release if the null hypothesis is rejected, whereas under scenario B the M&E is safe
for release if the null hypothesis is not rejected.

If under Scenario A, for example, the true value of the radionuclide level (or level above background) is less than or equal to the DL, then the survey will have power 1−β to reject the null hypothesis that the true value is equal to the AL with type I error α. Under Scenario B, if the true value of contamination minus background is greater than the detectable difference Δ, then the study will again have power 1−β to reject the null hypothesis at type I error rate α. Assuming that the standard error of the estimate, σ, does not depend upon the radiation levels being measured, the formula for the detectable Δ, given σ, α, and power 1−β, is

Detectable difference Δ = (Z_(1−β) + Z_(1−α)) σ	(1)

where Z_(1−β) and Z_(1−α) are the corresponding critical values of the standard normal random variable. A somewhat more complicated formula for Δ is needed when σ is not independent of the radiation level, as in Table 5.1; however, formula (1) gives a useful (conservative) approximation to the detectable difference if we choose σ to be at its maximum likely value under either the null or alternative hypothesis.
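
As a quick numerical check on equation (1), the following sketch reproduces the detectable difference under the stated assumption that σ does not depend on the radiation level (the α and β values are illustrative only):

    from scipy.stats import norm

    def detectable_difference(sigma, alpha=0.05, beta=0.10):
        # Equation (1): width of the gray region for known sigma
        return (norm.ppf(1 - beta) + norm.ppf(1 - alpha)) * sigma

    print(detectable_difference(sigma=1.0))  # about 2.93 sigma for these rates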

In general, the use of equation (1) for the detectable difference Δ requires that the estimate of contamination (measurement − background) be approximately normally distributed. For radiation counters with long count times and large values of N (when there is sampling variability as well as measurement variability), this assumption is usually quite appropriate. Because the width Δ is (for fixed power and type I error) proportional to σ, it is important that an instrument or measurement technique (and sampling fraction for spatially distributed contamination) be selected which is sensitive enough (provides a small enough σ) that the detectable Δ meets requirements (for example, so that the DL is not set too far below the AL in Scenario A, or the upper bound of the gray region is not set too high above background in Scenario B).

In some situations (non-normal distributions, short count times), the detectable Δ will be larger than described in equation (1) and more specialized statistical analysis may be needed. Such techniques as segregation according to likely level of contamination may improve the accuracy of equation (1), as will longer count times.

Hypothesis testing (accepting or rejecting the null hypothesis) involves comparing an estimate of contamination levels to a "critical value" (termed Sc in the report), which allows us to decide whether the observed estimate is consistent with the null value (at a certain type I error level) after taking account of the variability (i.e., σ) of the measurement. For Scenario A this value is equal to Sc = AL − Z_(1−α) σ, and for Scenario B it is Sc = AL + Z_(1−α) σ. By definition, power is the probability, as computed under the alternative hypothesis, of rejecting the null hypothesis; that is, the probability that the observed estimate is less than (for Scenario A) or greater than (for Scenario B) the critical value Sc.
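
These definitions can be verified numerically. The sketch below (assuming a normally distributed estimate with known σ; all numbers are illustrative) confirms that the power when the true level sits at the discrimination limit equals 1−β:

    from scipy.stats import norm

    def critical_value(al, sigma, alpha, scenario):
        z = norm.ppf(1 - alpha)
        return al - z * sigma if scenario == "A" else al + z * sigma

    def power(true_level, al, sigma, alpha, scenario):
        sc = critical_value(al, sigma, alpha, scenario)
        if scenario == "A":            # reject H0 when the estimate falls below Sc
            return norm.cdf((sc - true_level) / sigma)
        return 1 - norm.cdf((sc - true_level) / sigma)

    # Scenario A with AL = 10 and sigma = 2: put the true level at the DL
    delta = (norm.ppf(0.95) + norm.ppf(0.90)) * 2.0
    print(power(10 - delta, al=10, sigma=2.0, alpha=0.05, scenario="A"))  # -> 0.90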

If normality of the estimate is in doubt, then other approaches to hypothesis testing may
be needed. For example, while for long count times the Poisson distribution can be approximated
as normal for the purpose of hypothesis testing, for short count times specialized formulae (see
section 5.7.1) may be needed to give a better approximation to the distribution of (measured-
baseline) for an idealized radiation counter.
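
For an idealized counter with a known mean background, one standard alternative (the textbook exact Poisson approach, not the Section 5.7.1 formulae) is to compute an exact critical count rather than rely on the normal approximation:

    from scipy.stats import poisson

    def critical_count(mean_background_counts, alpha=0.05):
        # Smallest gross count k with P(X >= k | background alone) <= alpha
        return int(poisson.ppf(1 - alpha, mean_background_counts)) + 1

    print(critical_count(4.0))  # -> 9 counts for a mean background of 4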

A-2 Specific Comments:

Section 3.8.1 describes "Measurement Method Uncertainty," but in somewhat vaguer terms than above. The intent of this section would be better understood if it referred to the suggested introduction to experimental design and hypothesis testing.

All of section 4 would be more comprehensible if it consistently referred back to the
suggested introduction to experimental design and hypothesis testing.

Section 4.1.1.2 gives a suggestion for how much of an impacted material should be scanned, but it is not clear to what the σ value now refers (eq. 4-1). This appears to be the measurement error standard deviation σ_M rather than the total standard deviation of the measurement method (measurement method uncertainty). Presumably, this is giving a recommendation that will keep the total measurement method uncertainty bounded for a given level of measurement error (σ_M).

The statistical concepts described earlier in this report are illustrated for the first time in Figures 4.2 and 4.3 of MARSAME. It is unfortunate that even though the concepts shown in the figures all relate to net radioactivity, they are variously termed a "level", "value", or "limit". This could cause confusion and possibly be misinterpreted by someone who is preparing to establish a survey design. An expansion of these figures to include several additional parameters, with some supplemental text, would be helpful.

Suggestions for Scenarios A and B are presented below. These embellished figures, with some additional text, should also eliminate the need to repeat this information in Chapter 5, as in Figs. 5.2, 5.3, and 5.4.

[Suggested figure, Scenario A (H0: Net Activity ≥ Action Level): net-activity axis running from the Discrimination Limit (LBGR) through the Critical Value to the Action Level (UBGR).]

[Suggested figure, Scenario B (H0: Net Activity ≤ Action Level): net-activity axis running from the Action Level through the Critical Value to the Discrimination Limit.]

As mentioned above, the Action Level for net excess radioactivity is used in defining the null hypothesis. However, the decision on accepting the null hypothesis is not based on the numerical value of net radioactivity at the Action Level. Rather, each sample is compared with the Critical Value shown in the figures. This ensures that the probability of rejecting the null hypothesis, when it is true, will not exceed α. The Discrimination Limit is the net radioactivity in the sample at which the probability of accepting the null hypothesis, when it is false, is β (i.e., the power to reject the null hypothesis is 1−β). The gray region is the range of net radioactivity in the sample where the statistical power to reject the null hypothesis, when it is false, is less than 1−β.

The intent of Section 5.5 would be clearer if it were framed as dealing with the factors that affect the measurement uncertainty σ, as described in more general terms in the suggested review of experimental design and hypothesis testing. It appears, however, that σ_M (the standard deviation of a single measurement, not taking into account the spatial distribution of materials or the variability of the background) is being confused with the overall σ (the total measurement method uncertainty taking these factors into account). It is Δ/σ, not Δ/σ_M, that determines the overall power of the experiment. The document should clearly differentiate these two σ's.

Section 5.5.1, lines 289-293, seems to be confusing σ_M with σ_S. It is σ_S that, generally speaking, can be decreased by improving scan coverage (not σ_M, if σ_M includes "shared" error terms such as the "variance of measured efficiency"). The new terminology u_MR apparently refers either to an estimate of the measurement error uncertainty σ_M or to the overall σ, but this is not made clear in this section (and the requirement that u_MR < σ_S/3 makes no sense if σ_S can be reduced to 0 by improving scan coverage).

The comments on lines 302-303 seem to require that u_MR be estimating the overall σ. Example 2 is confusing because the requirement that u_MR be a factor of 10 smaller than Δ seems to assume that u_MR is an estimate of σ_M rather than of the overall uncertainty σ (this would be a very stringent requirement indeed). Here one needs to focus not just on σ_M but rather on the total variability, including σ_S. If σ_S can be reduced to zero by scanning all of a material, why is such a stringent requirement placed on σ_M?

Line 360 introduces new and not clearly defined uncertainties (u_c and φ_MR). Example 5 is unclear and needs to be tied to some general design or hypothesis testing principles; it just comes out of thin air as it stands.

Section 5.6 is a good description of addressing measurement uncertainty σ_M in certain special cases. One thing that could be clarified is that σ_M now refers to the error in measurement minus background rather than just the error in the measurement itself. At other points in the document σ_M seems to refer to the variance of just the measurement.

Table 5.1 shows details of the calculation of a critical value specialized to radiation counters with Poisson errors in estimating both the background radioactivity level and the level of radioactivity in the measured M&E. Use of the Stapleton formulae appears to provide an improvement by correcting for the non-normality of the Poisson distribution at short count times. It would be helpful here to note clearly that the MDC is the value of Sc for rejecting the null hypothesis (Scenario B) of no excess radiation above background, i.e., by referring back to the suggested introduction to experimental design and hypothesis testing.

Section 5.8, Determining Measurement Quantifiability, is a complicated way of saying that σ must be small enough (and hence Δ/σ large enough) for the measurement method to have good power not only to reject the null hypothesis that the level of radioactivity is at the AL for a reasonable Δ (width of the gray region), but also to give a reasonably narrow confidence interval for the estimated value, i.e., where the width of the confidence interval is small compared to the value of the AL.

One complication that is explicitly dealt with in the definition of the MQC is that the measurement method uncertainty, i.e., σ, generally will depend upon the (unknown) true level of radioactivity itself; for example, a perfect counter has Poisson variance equal to its mean. Thus the MQC is just the value, y_0, of the radioactivity level for which the ratio k = y_0/σ is large (the document recommends k = 10). If y_0 is small relative to the action limit (between 10 and 50 percent of the AL is recommended), then it is clear both that (1) the detectable Δ will be small with respect to the action limit (i.e., the DL will be close to the AL) and (2) confidence limits around an estimated value of radioactivity will be narrow relative to the value of the AL. Saying this clearly would improve the intelligibility of this section.

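The fixed-point character of the MQC definition can be sketched as follows (the uncertainty model σ(y) here is an assumed stand-in, not the MARSAME formula, and all numbers are illustrative):

    from math import sqrt

    def sigma(y, sigma0=0.5, t=100.0):
        # assumed model: background/instrument term plus a Poisson-like
        # counting term that grows with the level y
        return sqrt(sigma0**2 + y / t)

    def mqc(k=10.0, y=1.0, iters=50):
        for _ in range(iters):      # fixed-point iteration on y = k * sigma(y)
            y = k * sigma(y)
        return y

    y0 = mqc()
    print(round(y0, 3), round(y0 / sigma(y0), 1))  # ratio converges to 10.0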

Section 5.8.1 would be more intelligible if it first noted that it gives a computation of the MQC, y_0, for a fixed k via a formula for σ that takes account of several factors which are combined into this one σ. These factors are the length of the reading time for the source, the length of the reading time for the background, the true value of the background reading, and an estimate of the variance of a "shared" measurement error term, i.e., the measured efficiency of the monitor.

Section 6.2.1 has some confusing aspects: as described earlier, the gray region is defined in terms of the power and type I error of the test with a measurement method of total standard deviation σ. Sentences like "Clearly MDCs must be capable of detecting radionuclide concentrations or levels of radioactivity at or below the upper bound of the gray region" seem tautological if the gray region is defined in terms of detection ability, specifically in terms of power, type I error, and σ.

Section 6.2.3, lines 215-224, confuses with its statements about how individual measurement results can be utilized for scan-only measurements. The statement that "if disposition decisions will be made based on the mean of the logged data, an upper confidence level for the mean is calculated and compared to the UBGR", if not interpreted carefully (i.e., if one did a standard test such as the Wilcoxon or t-test), would ignore any uncertainty component resulting from variability in the measurement process (i.e., measurement error shared by all measurements that constitute the scan). Only if σ_M has no shared components (or if they are very small) would it make sense to do a standard statistical test using the observed data alone. Specifically, the sample standard deviation would underestimate the true measurement standard deviation σ if there is a shared uncertainty (such as errors in the estimate of counting efficiency) incorporated in σ_M.
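
A small simulation (entirely illustrative; the error sizes are ours) shows how badly the sample standard deviation can understate the uncertainty of the mean when one calibration error is shared by every reading:

    import numpy as np

    rng = np.random.default_rng(1)
    n_surveys, n_meas = 10_000, 25
    shared = rng.normal(0, 1.0, size=(n_surveys, 1))        # one shared calibration error
    unshared = rng.normal(0, 1.0, size=(n_surveys, n_meas)) # independent reading noise
    readings = 5.0 + shared + unshared

    claimed_se = readings.std(axis=1, ddof=1).mean() / np.sqrt(n_meas)
    true_se = readings.mean(axis=1).std(ddof=1)
    print(round(claimed_se, 2), round(true_se, 2))  # about 0.20 vs 1.02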

The suggestion (line 60) that for MARSSIM-type surveys the sample standard deviation can be used to generate a power curve also implicitly assumes that no shared measurement error components exist. But this contradicts the conclusion of lines 223-224 that "Measuring 100% of the M&E accounts for spatial variability but there is still an uncertainty component resulting from variability in the measurement process." In fact, all the discussion of selecting and performing a statistical test, and drawing conclusions, in the rest of Section 6 seems to be implicitly assuming that there are no shared errors from measurement to measurement: is this the intention? Is this what was meant by the (confusing) discussion in Section 5.5.1, lines 289-293? For example, even if all measurements are less than the action level, this might not really be enough information to conclude that the M&E meet the disposition criterion.

Suppose all measurements are only somewhat less than the action level but it is also known that the counting efficiency was not very well estimated. Ignoring the uncertainty in the counting efficiency could lead to the wrong conclusion in this case, if the uncertainty in the counting efficiency is indeed "shared error" over all the measurements. In many places in this document, errors in counting efficiency or other apparently shared measurement errors are mentioned (as on lines 223-224), but this issue seems to be ignored in most of Section 6. If the document is assuming that such shared errors are small enough to be ignorable, then this should be stated explicitly (see also footnote 4 on page 6-17).


One possible resolution is to assume that the measurement of background has exactly the same "shared" uncertainties (counter efficiencies, etc.) as does the measurement of the radioactivity level in the M&E. In this case, the shared uncertainties will be subtracted out when the background is subtracted from the level measured in the M&E. If this is what is meant, then it should be stated clearly (and it should be highlighted in any initial "review of experimental design and hypothesis testing" when discussing the various components included in σ).
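
A tiny numerical illustration of that resolution (for an additive shared bias only; a shared multiplicative efficiency error would scale the net result rather than cancel exactly; all numbers are ours):

    import numpy as np

    rng = np.random.default_rng(0)
    b = rng.normal(0, 5.0, size=100_000)  # one shared bias, identical in both readings
    gross = 12.0 + b + rng.normal(0, 1.0, size=100_000)
    background = 10.0 + b + rng.normal(0, 1.0, size=100_000)
    net = gross - background
    print(round(net.mean(), 2), round(net.std(), 2))  # ~2.0 and ~1.41: b is gone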

APPENDIX B - ACRONYMS AND ABBREVIATIONS

(Use only those terms that are applicable to the subject content being discussed and apply as follows. This template needs revision, with new terms to be added and others to be dropped — KJK)

A	Scenario A
AL	Action Limit
α	Type I error
AM	Arithmetic Mean
AR	Absolute Risk
β	Beta (type II error)
B	Scenario B
Bq	Becquerel
Bq/m2	Becquerels per square meter
Bq/m3	Becquerels per cubic meter
1−β	Statistical power (1 minus beta)
CDC	Centers for Disease Control and Prevention
CFR	Code of Federal Regulations
Co	Chemical symbol for cobalt (60Co isotope)
CQ	Charge Question (CQ1, CQ2, CQ3)
Δ	Difference (alternative − null value); also the detectable difference
DFO	Designated Federal Officer
DL	Discrimination Limit
DLC	Data Life Cycle
DoD	Department of Defense (U.S. DoD)
DOE	Department of Energy (U.S. DOE)
DQO	Data Quality Objective
EAR	Excess Absolute Risk
EPA	Environmental Protection Agency (U.S. EPA)
FR	Federal Register
FGR-13	Federal Guidance Report 13
GM	Geometric Mean
GMC	Geometric Mean Coefficient
GSD	Geometric Standard Deviation
Gy	gray, SI unit of radiation absorbed dose (1 Gy is equivalent to 100 rad in traditional units)
H	Chemical symbol for hydrogen (3H isotope)
H0	Null hypothesis
HPGe	High-purity germanium (detector)
IA	Initial Assessment
∞	Infinity
I	Chemical symbol for iodine (131I isotope)
ICRP	International Commission on Radiological Protection
ICRU	International Commission on Radiation Units and Measurements, Inc.
keV	kiloelectron volts

MARLAP	Multi-Agency Laboratory Analytical Protocols
MARSAME	Multi-Agency Radiation Survey and Assessment of Materials and Equipment Manual
MARSSIM	Multi-Agency Radiation Survey and Site Investigation Manual
M&E	Materials and Equipment
MDC	Minimum Detectable Concentration
MQC	Minimum Quantifiable Concentration
MQO	Measurement Quality Objective
mSv	millisievert
N	The sample size (N measurements, for instance)
NaI	Sodium iodide detector
NAS	National Academy of Sciences (U.S. NAS)
NCRP	National Council on Radiation Protection and Measurements
NRC	Nuclear Regulatory Commission (U.S. NRC)
OAR	Office of Air and Radiation (U.S. EPA/OAR)
ORIA	Office of Radiation and Indoor Air (U.S. EPA/OAR/ORIA)
PAG	Protective Action Guide
Pu	Chemical symbol for plutonium (239Pu isotope)
QA	Quality Assurance
QC	Quality Control
QA/QC	Quality Assurance/Quality Control
R	roentgen
RAC	Radiation Advisory Committee (U.S. EPA/SAB/RAC)
rad	Traditional unit of radiation absorbed dose in tissue (a dose of 100 rad is equivalent to 1 gray (Gy) in SI units)
rem	Roentgen equivalent man; traditional unit of dose equivalent (equals rad × tissue weighting factor; 100 rem is equivalent to 1 sievert (Sv))
RERF	Radiation Effects Research Foundation
R/h	Roentgen per hour; traditional measure of exposure rate
RR	Relative Risk
SAB	Science Advisory Board (U.S. EPA/SAB)
σ	Standard deviation
σ_M	Standard deviation of measurement error
σ_S	Standard deviation of sampling distribution
Sc	Critical value
SI	International System of Units (from NIST, as defined by the General Conference on Weights and Measures in 1960)
φ_MR	The relative upper bound of the estimated measurement method uncertainty u_MR
Type I	Type I error (rejecting a true null hypothesis)
Type II	Type II error (accepting a false null hypothesis)
Tl-208	Thallium-208 (radioactive isotope)
u	Uncertainty (e.g., u_c)
u_MR	Estimated measurement method uncertainty
φ	Uncertainty (e.g., φ_MR)

US	United States
WLM	Working Level Months
WRS	Wilcoxon Rank Sum (a statistical test)
y_0	The radioactivity level at which k = y_0/σ (see the MQC discussion)
Z	Critical values of the standard normal distribution (e.g., Z_(1−α), Z_(1−β))

APPENDIX C - MARSAME TYPOS AND CORRECTIONS

(To be moved from the report to a memo from the RAC MARSAME Review Panel DFO to the Multi-Agency Work Group via the ORIA Staff Office	KJK)

Page	Line	Correction
xxix	504	power?
xxix	522	delete one "("
xxxi	561	delete one ")"
xxxi	567	delete one "("
xxxiv	671	Technetium (sp.)
xxxv	676	delete (duplicates line 675)
1-3	80	change "activity concentrations" to "area activity", or leave as is but change "Bq/m2" to "Bq/m3" and add "and area activity (Bq/m2)"
3-9	194	non-radionuclide-specific (insert dash)
4-5	Figure 4.1a	replace second "Large" by "Much Larger"
4-5	Figure 4.1b	replace second "Small" by "Equally Small or Smaller"
5-21	523	value in denominator should be 0.4176 (see line 527)
5-21	527	plus sign should follow the square root of 87
5-53	1148	delete 2nd period
6-6	142	insert "to" behind "likely"
6-11	280	insert "that" behind "determine"
6-13	329	insert "that" behind "demonstrate"
6-23	474, 482	critical value in symbols table is not in italics (italicized k is the coverage factor)
7-10	210	Tl-208 should be beta/gamma, not just beta, with the gamma-ray energy in the next column
B-6	151	maximize, not minimize
D-9	219	what does "varies" mean?
D-36	849	for the LS spectrometer, insert "(alpha)" on the first line of column 2 and "(gamma)" for the HPGe and NaI detectors
F-1	26	delete "(FRER)"

End of Document
