EPA 402-D-01-001
                              NUREG-1576

                            NTIS PB2001-106745
      Multi-Agency Radiological
Laboratory Analytical Protocols Manual
[Cover graphic: data life-cycle phases (Implementation, Assessment) and participating agency seals, including USGS and NIST]
  Draft for Public Comment
        August 2001

-------
                                 ABSTRACT

The Multi-Agency Radiological Laboratory Analytical Protocols (MARLAP) manual provides
guidance for the planning, implementation, and assessment of projects that require the laboratory
analysis of radionuclides. MARLAP's basic goal is to provide guidance and a framework for
project planners, managers, and laboratory personnel to ensure that radioanalytical laboratory
data will meet a project's or program's data requirements. To attain this goal, the manual is
intended to provide the guidance necessary for national consistency in the form of a performance-
based approach for meeting a project's data requirements. The guidance in MARLAP is designed
to help ensure the generation of radioanalytical data of known quality, appropriate for its
intended use.

MARLAP was developed by a workgroup that included representatives from the U.S. Environ-
mental Protection Agency (EPA), Department of Energy (DOE), Department of Defense (DOD),
Nuclear Regulatory Commission (NRC), National Institute of Standards and Technology (NIST),
U.S. Geological Survey (USGS), and U.S. Food and Drug Administration (FDA). State participa-
tion in the development of the manual involved contributions from representatives from the
Commonwealth of Kentucky and the State of California. Contractors to EPA, DOE, and NRC,
and members of the public, have been present during the open meetings of the MARLAP
workgroup.

Examples of data collection activities that MARLAP supports include site characterization, site
cleanup and compliance demonstration, decommissioning of nuclear facilities, remedial and
removal actions, effluent monitoring of licensed facilities, environmental  site monitoring,
background studies, and waste management activities.

-------
                                    NOTICE

This draft manual is being released for simultaneous public and peer review, and technical
comments are solicited as described below. MARLAP has not been approved for use in part or in
whole and should not be used, cited, or quoted except for the purposes of providing comments as
requested by the agencies participating in its development.

MARLAP was developed by a workgroup that included representatives from the U.S. Environ-
mental Protection Agency (EPA), Department of Energy (DOE), Department of Defense (DOD),
Nuclear Regulatory Commission (NRC), National Institute of Standards and Technology (NIST),
U.S. Geological  Survey (USGS), and U.S. Food and Drug Administration (FDA). State participa-
tion in the development of the manual involved contributions from representatives from the
Commonwealth  of Kentucky and the State of California. Contractors to EPA, DOE, and NRC,
and members of the public, have been present during the open meetings of the MARLAP
workgroup.

Although Federal Government personnel are involved in the preparation of this document, the
draft manual does not yet represent the official position of any participating agency. This review
is a necessary step in the development of a multi-agency consensus manual. References within
this manual to any specific commercial product, process, or service by trade name, trademark,
manufacturer, or otherwise do not necessarily constitute or imply endorsement or
recommendation by the United States Government.

Members of the public are invited and encouraged to submit comments through the following
website: http://www.eml.doe.gov/marlap/. Comments may also be submitted to either:

                         U.S. Environmental Protection Agency
                     ATTN: Air and Radiation Docket, Mail Stop 6102
                        Docket Number A-2001-16, Room M1500
                                  401 M Street, SW
                                Washington, DC 20460

                                         or

                           Chief, Rules and Directives Branch
                           Division of Administrative Services
                                  Mail Stop T6D59
                          U.S. Nuclear Regulatory Commission
                              Washington, DC 20555-0001

All comments received will be reviewed by the entire MARLAP workgroup. Comments received
by the date published in the Federal Register Notice announcing the availability of the document
for public review will be considered. Comments received after that date will be considered if it is
practical to do so, but no assurance can be given for consideration of late comments.

Copies of the draft MARLAP manual and all comments received may be examined or copied for
a fee at the EPA Docket Room M1500, Docket Number A-2001-16, First Floor Waterside Mall,
401 M Street, SW, Washington, DC 20460; and the NRC Public Document Room, at U.S.
Nuclear Regulatory Commission, Public Document Room, Washington, DC 20555. The
document is also available through the National Technical Information Service (NTIS). The
NTIS document number is PB2001-106745, and the NTIS Sales Desk can be reached between
8:30 a.m. and 6:00 p.m. Eastern Time, Monday through Friday at 1-800-553-6847; TDD (hearing
impaired only) at (703) 487-4639.

In addition to providing comments on individual chapters and appendices, reviewers are also
requested to address the following questions while reviewing the draft manual:

   (1) Is the performance-based approach used in MARLAP for the planning, implementation,
   and assessment phases of projects technically sound, and is the approach reasonable in terms
   of ease of implementation by project managers and laboratories? Does the approach
   effectively link the three phases of a project, and is the guidance on quality control
   appropriate and supportive of a performance-based approach?

   (2) Is the guidance on laboratory operations in Part II (Chapters 10-20) technically accurate
   and useful?

   (3) Are the concepts covered under measurement statistics—specifically measurement
   uncertainty, detection and quantification capability—presented accurately and appropriately?

   (4) Is the information understandable and presented in logical sequence? How can the
   presentation of material be modified to improve the manual?

   (5) Does MARLAP provide benefits that are not currently available through  other
   approaches? What are the costs associated with implementing the guidance in MARLAP in
   comparison with currently available alternatives?

Commentors are encouraged to use the website, http://www.eml.doe.gov/marlap, for their
review. The website has detailed instructions on how to submit comments and has several
features that should aid the review process. Commentors also may submit written comments to
either of the addresses listed on page iv of this Notice using the same general approach described
in the MARLAP website. Comments should be accompanied by supporting details, rationale, or
data. To ensure efficient and complete comment resolution, commentors are requested to
reference the page number and the line number to which the comment refers. Comments
corresponding to an entire chapter, section, or table should be referenced to the line number for
the title of the chapter (always line number 1), section, or table. Comments on footnotes should
be referenced to the line in the text where the footnote appears (footnotes do not have line
numbers). Comments on figures should be referenced to the page on which the figure appears
(figures do not have line numbers) and figure number. Comments on the entire manual should be
referenced to the title page.

-------
                       ACKNOWLEDGMENTS

The origin of the Multi-Agency Radiological Laboratory Analytical Protocols (MARLAP)
manual can be traced to the recognition by a number of agencies of the need for a nationally
consistent approach to producing radioanalytical data that meet a program's or project's needs. A
multi-agency workgroup was formed with representatives from the U.S. Environmental
Protection Agency (EPA), Department of Energy (DOE), Nuclear Regulatory Commission
(NRC), Department of Defense (DOD), U.S. Geological Survey (USGS), National Institute of
Standards and Technology (NIST), and Food and Drug Administration (FDA) to develop
guidance for the planning, implementation, and assessment of projects that require the laboratory
analysis of radionuclides. Representatives from the Commonwealth of Kentucky and the State of
California also contributed to the development of the manual.

Of particular importance to the workgroup was that the guidance be both scientifically
rigorous and flexible enough to be applied to a diversity of projects and programs. The draft
MARLAP manual is the result of a cooperative effort with these goals in mind.

MARLAP would not have been possible without the workgroup members who contributed their
time, talent, and efforts to develop this guidance document:

                              John Griggs*, EPA, Chair

EPA:  H. Benjamin Hull                       DOE: Emile Boulos*
      Marianne Lynch*                             Carl Gogolak
      Keith McCroan*                               Pam Greenlaw*
      Eric Reynolds                                Catherine Klusek*
      Jon Richards                                 Stan Morton*
                                                  Colin Sanderson*
                                                  Stephanie Woolf*

DOD: CPT Andrew Scott (Army)               NRC: Rateb (Boby) Abu Eid
       Ronald Swatski* (Army)                      Tin Mo
      Jan Dunker (Army Corps of Engineers)           George Powers
      Troy Blanton (Navy)
      CAPT David Fanrand (Navy)              USGS: Ann Mullin*
      Dale Thomas (Air Force)

NIST: Kenneth G.W. Inn*                     FDA: Edmond Baratta

* These workgroup members also served as chapter chairs.

Special recognition is given to John Volpe, Commonwealth of Kentucky, and Penny Leinwander,
State of California, for their contributions to the development of the MARLAP manual. The
following Federal Agency contractors provided assistance in developing the MARLAP manual:

EPA:  N. Jay Bassin (Environmental Management Support, Inc.)
      Diane Dopkin (Environmental Management Support, Inc.)
      U. Hans Behling (S. Cohen & Associates, Inc.)
      Richard Blanchard (S. Cohen & Associates, Inc.)
      Harry Chmelynski (S.  Cohen & Associates, Inc.)
      Scott Hay (S. Cohen & Associates, Inc.)
      Patrick Kelly (S. Cohen & Associates, Inc.)
      Robert Litman (S. Cohen & Associates, Inc.)
      Charles (Chick) Phillips (S. Cohen & Associates, Inc.)
      William Richardson III (S. Cohen & Associates, Inc.)
      Steven Schaffer (S. Cohen  & Associates, Inc.)

 DOE: David McCurdy (Duke Engineering & Services)
      John Maney (Environmental Measurements Assessments)
      Stan Blacker (MACTEC, Inc.)
      Pat Harrington (MACTEC, Inc.)
      Mike Miller (MACTEC, Inc.)
      Lisa Smith (Argonne National Laboratory)

NRC: Eric W. Abelquist (ORISE)
      Dale Condra (ORISE)

The MARLAP Workgroup was greatly aided in the development of the manual by the
contributions and support provided by the individuals listed below.

David Bottrell (DOE)          David Friedman (EPA)         Kevin Miller (DOE)
Lloyd Currie (NIST)          LCDR Lino Fragoso (Navy)    Jim Mitchell (EPA)
Mike Carter (EPA)            Richard Graham (EPA)        Colleen Petullo (EPA)
Mary Clark (EPA)             Patricia Gowland (EPA)        Steve Pia (EPA)
Ron Colle (NIST)             Larry Jensen (EPA)            Phil Reed (NRC)
Mark Doehnert (EPA)         Jim Kotton (NRC)             Cheryl Trottier (NRC)
Steve Domotor (DOE)          Ed Messer (EPA)             John Warren (EPA)
Joan Fisk (EPA)

-------
                                  CONTENTS

                                                                                 Page
Abstract  	III

Notice	IV

Acknowledgments	  VII

Acronyms and Abbreviations	XLVII

1 Introduction to MARLAP 	1-1
    1.1 Overview	1-1
    1.2 Purpose of the Manual	1-2
    1.3 Use and Scope of the Manual  	1-3
    1.4 Key MARLAP Concepts and Terminology	1-4
       1.4.1  Data Life Cycle 	1-4
       1.4.2  Directed Planning Process	1-5
       1.4.3  Performance-Based Approach	1-6
       1.4.4  Analytical Process	1-7
       1.4.5  Analytical Protocol  	1-8
       1.4.6  Analytical Method	1-8
       1.4.7  Uncertainty and Error	1-8
       1.4.8  Precision, Bias, and Accuracy	1-10
       1.4.9  Performance Objectives: Data Quality Objectives and Measurement Quality
             Objectives  	1-11
       1.4.10 Analytical Protocol Specifications	1-12
       1.4.11 The Assessment Phase	1-13
    1.5 The MARLAP Process	1-14
    1.6 Structure of the Manual	1-15
       1.6.1  Overview of Part I	1-17
        1.6.2  Overview of Part II	1-17
       1.6.3  Overview of the Appendices	1-19
    1.7 References	1-20

2 Project Planning  Process 	2-1
    2.1 Introduction	2-1
    2.2 The Importance of Directed Project Planning	2-2
    2.3 Directed Project Planning Processes	2-4
       2.3.1  A Graded Approach to  Project Planning	2-4

      2.3.2  Guidance on Directed Planning Processes	2-4
      2.3.3  Elements of Directed Planning Processes  	2-6
   2.4 The Project Planning Team	2-7
      2.4.1  Team Representation	2-7
      2.4.2  The Radioanalytical Specialists	2-8
    2.5 Directed Planning Process and Role of the Radioanalytical Specialists	2-9
      2.5.1  Define the Problem  	2-12
      2.5.2  Identify the Decision  	2-13
          2.5.2.1 Action Level	2-13
          2.5.2.2 Scale of the Decision	2-14
          2.5.2.3 Inputs and Boundaries to the Decision	2-15
          2.5.2.4 Data Needs	2-15
      2.5.3  Specify the Decision Rule and the Tolerable Decision Error Rates 	2-15
      2.5.4  Optimize the Strategy for Obtaining Data	2-17
          2.5.4.1 Analytical Protocol Specifications	2-18
          2.5.4.2 Measurement Quality Objectives	2-18
   2.6 Results of the Directed Planning Process  	2-19
      2.6.1  Output  Required by the Radioanalytical Laboratory: The Analytical Protocol
             Specifications	2-20
      2.6.2  Chain of Custody	2-21
   2.7 Project Planning and Project Implementation and Assessment  	2-21
      2.7.1  Documenting the Planning Process	2-21
      2.7.2  Obtaining Analytical Services	2-22
      2.7.3  Selecting Analytical Protocols	2-23
      2.7.4  Assessment Plans	2-23
          2.7.4.1 Data Verification	2-24
          2.7.4.2 Data Validation  	2-24
          2.7.4.3 Data Quality Assessment	2-24
   2.8 References	2-25

3 Key Analytical Planning Issues and Developing Analytical Protocol Specifications	3-1
   3.1 Introduction 	3-1
   3.2 Overview of the Analytical Process	3-2
   3.3 General Analytical Planning Issues	3-2
      3.3.1  Develop Analyte List	3-4
      3.3.2  Identify Concentration Ranges	3-6


      3.3.3  Identify and Characterize Matrices of Concern	3-6
      3.3.4  Determine Relationships Between the Radionuclides of Concern  	3-7
      3.3.5  Determine Available Project Resources and Deadlines	3-8
      3.3.6  Refine Analyte List and Matrix List  	3-8
      3.3.7  Method Performance Characteristics and Measurement Quality Objectives ... 3-9
          3.3.7.1 Develop MQOs for Select Method Performance Characteristics  	3-11
          3.3.7.2  The Role of MQOs in the Protocol Selection and Evaluation Process ... 3-16
          3.3.7.3  The Role of MQOs in the Project's Data Evaluation Process  	3-16
      3.3.8  Determine Any Limitations on Analysis Options 	3-17
          3.3.8.1  Gamma Spectrometry	3-18
          3.3.8.2  Gross Alpha and Beta Analysis	3-19
          3.3.8.3  Radiochemical Nuclide-Specific Analysis	3-19
      3.3.9  Determine Method Availability   	3-19
      3.3.10 Determine the Type and Frequency of, and Evaluation Criteria for, Quality
          Control Samples  	3-20
      3.3.11 Determine Sample Tracking and Custody Requirements  	3-21
      3.3.12 Determine Data Reporting Requirements	3-21
   3.4 Matrix-Specific Analytical Planning Issues	3-22
      3.4.1  Solids	3-23
          3.4.1.1  Homogenization and Subsampling	3-24
          3.4.1.2  Removal of Unwanted Materials   	3-24
      3.4.2  Liquids	3-25
      3.4.3  Filters and Wipes	3-26
   3.5 Assembling the Analytical Protocol Specifications	3-26
   3.6 Level of Protocol Performance Demonstration	3-27
   3.7 Project Plan Documents	3-30
   3.8 References 	3-31

4 Project Plan Documents	 4-1
   4.1 Introduction	4-1
   4.2 The Importance of Project Plan Documents  	4-2
   4.3 A Graded Approach to Project Plan Documents   	4-3
   4.4 Project Plan Documents	4-4
      4.4.1  Guidance on Project Plan Documents	4-4
      4.4.2  Approaches to Project Plan Documents  	4-5
   4.5 Elements of Project Plan Documents	4-6



      4.5.1  Content of Project Plan Documents	4-7
      4.5.2  Plan Documents Integration	4-9
      4.5.3  Plan Content for Small Projects  	4-10
   4.6 Linking the Project Plan Documents and the Project Planning Process  	4-10
      4.6.1  Planning Process Report  	4-15
      4.6.2  Data Assessment 	4-16
          4.6.2.1 Data Verification	4-16
          4.6.2.2 Data Validation 	4-16
          4.6.2.3 Data Quality Assessment	4-17
   4.7 References	4-18

5 Obtaining Laboratory Services	5-1
   5.1 Introduction	5-1
   5.2 Importance of Writing a Technical and Contractual Specification Document 	5-2
   5.3 Statement of Work—Technical Requirements  	5-2
      5.3.1  Analytes	5-3
      5.3.2  Matrix	5-3
      5.3.3  Measurement Quality Objectives	5-3
      5.3.4  Unique Analytical Process Requirements	5-4
      5.3.5  Quality Control Samples and Participation in External Performance Evaluation
             Programs  	5-4
      5.3.6  Laboratory Radiological Holding and Turnaround Times	5-5
      5.3.7  Number of Samples and Schedule	5-5
      5.3.8  Quality System	5-6
      5.3.9  Laboratory's Proposed Methods 	5-6
   5.4    Request for Proposal—Generic Contractual Requirements	5-7
      5.4.1 Sample Management	5-7
      5.4.2  Licenses, Permits and Environmental Regulations	5-8
          5.4.2.1 Licenses	5-8
          5.4.2.2 Environmental and Transportation Regulations  	5-9
      5.4.3  Data Reporting and Communications  	5-9
          5.4.3.1 Data Deliverables	5-9
          5.4.3.2 Software Verification and Control	5-10
          5.4.3.3 Problem Notification and Communication 	5-10
          5.4.3.4 Status Reports 	5-11
      5.4.4  Sample Re-Analysis Requirements	5-11



       5.4.5   Subcontracted Analyses	5-11
   5.5 Laboratory Selection and Qualification Criteria	5-12
       5.5.1   Technical Proposal Evaluation  	5-12
          5.5.1.1 Scoring and Evaluation Scheme 	5-12
          5.5.1.2 Scoring Elements	5-13
       5.5.2   Pre-Award Proficiency Evaluation  	5-15
       5.5.3 Pre-Award Assessments and Audits	5-15
   5.6    References	5-16
       5.6.1   Cited References  	5-16
       5.6.2   Other Sources	5-17

6 Selection and Application of an Analytical Method	6-1
   6.1 Introduction	6-1
   6.2 Method Definition	6-2
   6.3 Life Cycle of Method Application	6-6
   6.4 Generic Considerations for Method Development and Selection	6-10
   6.5 Project-Specific Consideration for Method Selection	6-13
       6.5.1   Matrix and Analyte Identification 	6-13
          6.5.1.1 Matrices	6-13
           6.5.1.2 Analytes and Potential Interferences	6-15
       6.5.2 Process Knowledge	6-16
       6.5.3   Radiological Holding and Turnaround Times	6-17
       6.5.4   Unique Process Specifications	6-18
       6.5.5   Measurement Quality Objectives	6-19
          6.5.5.1 Method Uncertainty 	6-19
          6.5.5.2 Quantification Capability	6-20
          6.5.5.3 Detection Capability 	6-21
          6.5.5.4 Applicable Analyte Concentration Range	6-23
          6.5.5.5 Method Specificity	6-23
          6.5.5.6 Method Ruggedness  	6-24
          6.5.5.7 Bias Considerations	6-24
   6.6 Method Validation	6-25
       6.6.1   Laboratory's Method Validation Protocol	6-26
       6.6.2   Tiered Approach to Validation  	6-27
          6.6.2.1 Existing Methods Requiring No Additional Validation  	6-29
          6.6.2.2 Use of a Validated Method for Similar Matrices   	6-30

          6.6.2.3 New Application of a Validated Method	6-30
          6.6.2.4 Newly Developed or Adapted Methods	6-32
       6.6.4  Method Validation Documentation	6-32
   6.7 Analyst Qualifications and Demonstrated Proficiency  	6-33
   6.8    Method Control  	6-33
   6.9    Continued Performance Assessment	6-34
   6.10   Documentation To Be Sent to the Project Manager  	6-36
   6.11   References	6-37

7 Evaluating Methods and Laboratories	7-1
   7.1 Introduction	7-1
   7.2 Evaluation of Proposed Analytical Methods	7-2
       7.2.1  Documentation of Required Method Performance 	7-2
          7.2.1.1 Method Validation Documentation	7-3
          7.2.1.4 Method Experience, Previous Projects, and Clients  	7-5
          7.2.1.5 Internal and External Quality Assurance Assessments  	7-5
       7.2.2  Performance Requirements of the SOW—Analytical Protocol Specifications .  7-6
          7.2.2.1 Matrix and Analyte Identification  	7-6
          7.2.2.2 Process Knowledge  	7-7
          7.2.2.3 Radiological Holding and Turnaround Times	7-7
          7.2.2.4 Unique Processing Specifications  	7-9
          7.2.2.5 Measurement Quality Objectives	7-9
          7.2.2.6 Bias Considerations	7-15
   7.3 Initial Evaluation of a Laboratory  	7-17
       7.3.1  Review of Quality System  Documents  	7-17
       7.3.2  Adequacy of Facilities, Instrumentation, and Staff Levels  	7-19
       7.3.3  Review of Applicable Prior Work	7-19
       7.3.4  Review of Performance Indicators	7-20
          7.3.4.1 Review of Internal QC  Results  	7-20
          7.3.4.2 External PE Program Results	7-21
          7.3.4.3 Internal and External Quality Assessment Reports	7-21
       7.3.5  Initial Audit	7-22
   7.4 Ongoing Evaluation of the Laboratory's Performance	7-22
       7.4.1  Quantitative Measures of Quality 	7-23
          7.4.1.1 MQO Compliance 	7-24
          7.4.1.2 Other Parameters 	7-30
       7.4.2  Operational Aspects	7-31
          7.4.2.1 Desk Audits	7-31
          7.4.2.2 Onsite Audits	7-33
   7.5 References	7-36

8 Radiochemical Data Verification And Validation	8-1
   8.1 Introduction 	8-1
   8.2 Data Assessment Process	8-2
       8.2.1  Planning Phase of the Data Life Cycle	8-2
       8.2.2  Implementation Phase of the Data Life Cycle	8-3
          8.2.2.1 Project Objectives	8-4
          8.2.2.2 Documenting Project Activities 	8-4
          8.2.2.3 QA/QC  	8-4
       8.2.3  Assessment Phase of the Data Life Cycle  	8-5
   8.3 Validation Plan	8-7
       8.3.1  Technical and Quality Objectives of the Project	8-8
       8.3.2  Validation  Tests	8-9
       8.3.3  Data Qualifiers	8-9
       8.3.4  Reporting and Documentation	8-11
   8.4 Other Essential Elements	8-11
       8.4.1  Statement of Work	8-12
       8.4.2  Verified Data Deliverables	8-12
   8.5 Data Verification and Validation Process  	8-13
       8.5.1  The Sample Handling and Analysis System 	8-14
          8.5.1.1 Sample Descriptors  	8-15
          8.5.1.2 Aliquant Size	8-16
          8.5.1.3 Dates of Sample Collection, Preparation, and Analysis 	8-16
          8.5.1.4 Preservation	8-17
          8.5.1.5 Tracking	8-18
           8.5.1.6 Traceability 	8-18
          8.5.1.7 QC Types and Linkages	8-18
          8.5.1.8 Chemical Separation (Yield)	8-19
          8.5.1.9 Self-Absorption (Residue)	8-20
          8.5.1.10 Efficiency, Calibration Curves, and Instrument Background  	8-20
          8.5.1.11 Spectrometry Resolution	8-20
          8.5.1.12 Dilution and Correction Factors	8-21

          8.5.1.13 Counts and Count Time (Duration)	8-22
          8.5.1.14 Result of Measurement, Uncertainty, Minimum Detectable Concentration,
              and Units  	8-22
       8.5.2  Quality Control Samples 	8-22
          8.5.2.1 Method Blank	8-24
          8.5.2.2 Laboratory Control Samples	8-25
          8.5.2.3 Laboratory Replicates	8-25
          8.5.2.4 Matrix Spikes and Matrix Spike Duplicates 	8-26
        8.5.3  Tests of Detection and Unusual Uncertainty	8-26
           8.5.3.1 Detection  	8-26
           8.5.3.2 Detection Capability  	8-27
          8.5.3.3 Large or Unusual Uncertainty  	8-28
       8.5.4  Final Qualification and Reporting	8-29
   8.6 Validation Report	8-30
   8.7 Other Sources of Information  	8-32

9 Data Quality Assessment	9-1
   9.1 Introduction	9-1
   9.2 Assessment Phase  	9-2
   9.3 Graded Approach to Assessment	9-3
   9.4 The Data Quality Assessment Team 	9-4
   9.5 Data Quality Assessment Plan	9-4
   9.6 Data Quality Assessment Process  	9-6
       9.6.1  Review of Project Documents	9-8
          9.6.1.1 The Project DQOs and MQOs	9-8
          9.6.1.2 The DQA Plan	9-9
          9.6.1.3 Summary of the DQA Review	9-9
       9.6.2  Sample Representativeness 	9-10
          9.6.2.1 Review of the Sampling Plan	9-10
          9.6.2.2 Sampling Plan Implementation 	9-13
          9.6.2.3 Data Considerations	9-14
       9.6.3  Data Accuracy	9-16
          9.6.3.1 Review of the Analytical Plan	9-19
          9.6.3.2 Analytical Plan Implementation	9-21
       9.6.4  Decisions and Tolerable Error Rates	9-22
          9.6.4.1 Statistical Evaluation of Data  	9-23
          9.6.4.2 Evaluation of Decision Error Rates  	9-26
   9.7 Data Quality Assessment Report	9-27
   9.8 References  	9-29
       9.8.1  Cited Sources	9-29
       9.8.2  Other Sources	9-29

10 Field and Sampling Issues That Affect Laboratory Measurements  	10-1
   Part I:  Generic Issues 	10-1
   10.1   Introduction	10-1
       10.1.1 The Need for Establishing Channels of Communication	10-2
       10.1.2 Developing Field Documentation	10-2
    10.2   Field Sampling Plan: Non-Matrix-Specific Issues	10-3
        10.2.1 Determination of Analytical Sample Size	10-3
       10.2.2 Field Equipment and Supply Needs	10-3
       10.2.3 Selection of Sample Containers	10-4
           10.2.3.1   Container Material	10-4
           10.2.3.2   Container Opening and Closure	10-5
           10.2.3.3   Sealing Containers	10-5
           10.2.3.4   Precleaned and Extra Containers	10-5
       10.2.4 Container Label and Sample Identification Code  	10-5
       10.2.5 Field Data Documentation	10-6
       10.2.6 Field Tracking, Custody, and Shipment Forms	10-8
       10.2.7 Chain of Custody	10-9
       10.2.8 Field Quality Control	10-9
       10.2.9 Decontamination of Field Equipment  	10-10
       10.2.10 Packing and  Shipping 	10-11
       10.2.11 Worker Health and Safety Plan   	10-12
           10.2.11.1  Physical Hazards  	10-13
           10.2.11.2 Biohazards	10-15
   Part II:  Matrix-Specific  Issues That Impact Field Sample Collection, Processing, and
       Preservation	10-16
   10.3    Liquid Samples  	10-17
       10.3.1 Liquid Sampling Methods	10-18
       10.3.2 Liquid Sample Preparation:  Filtration	10-18
           10.3.2.1   EPA Guidance for Samples/Filtration	10-19
           10.3.2.2   Filters	10-21
       10.3.3 Field Preservation of Liquid Samples	 10-22
          10.3.3.1   Sample Acidification	10-22
          10.3.3.2   Non-Acid Preservation Techniques	10-23
       10.3.4 Liquid Samples:  Special Cases  	10-26
          10.3.4.1   Radon-222 in Water	10-26
           10.3.4.2   Milk	10-27
       10.3.5 Non-aqueous Liquids and Mixtures 	10-27
    10.4   Solids   	10-29
       10.4.1 Soils	10-29
          10.4.1.1   Soil Sample Preparation	10-30
          10.4.1.2   Sample Ashing	10-31
       10.4.2 Sediments  	10-31
          10.4.2.1   Initial Mixing and Transport Dispersion of Radionuclides Discharged to
                    Water	10-31
          10.4.2.2   Sediment Effect	10-32
          10.4.2.3   Sample Preparation/Preservation  	10-32
       10.4.3 Other Solids  	10-32
          10.4.3.1   Structural Materials	10-32
          10.4.3.2   Biota: Samples of Plant and Animal Products	10-33
    10.5   Air Sampling	10-37
       10.5.1 Sampler Components  	10-37
       10.5.2 Filter Selection Based on Destructive Versus Non-destructive Analysis ... 10-39
       10.5.3 Sample Preservation and Storage   	10-39
       10.5.4 Special Cases: Collection of Gaseous and Volatile Air Contaminants	10-40
          10.5.4.1   Radioiodines  	10-40
          10.5.4.2   Gases 	10-41
          10.5.4.3   Tritium Air Sampling	10-41
       10.5.5 Radon	10-42
          10.5.5.1   Radon Sampling Methods  	10-43
          10.5.5.2   Selecting a Radon Sampling Method Based on Data
             Quality Objectives	10-46
    10.6   Wipe Sampling for Assessing Surface Contamination  	10-47
       10.6.1 Sample Collection Methods  	10-48
          10.6.1.1   Dry Wipes	10-48
           10.6.1.2   Wet Wipes	10-48
        10.6.2 Sample Handling	10-50



   10.7   References 	10-50

11 Sample Receipt, Inspection, and Tracking	11-1
   11.1   Introduction 	11-1
   11.2   General Considerations	11-3
       11.2.1 Communication Before Sample Receipt	11-3
       11.2.2 Standard Operating Procedures	11-3
       11.2.3 Laboratory License	11-4
        11.2.4 Sample Chain-of-Custody	11-5
    11.3   Sample Receipt  	11-6
        11.3.1 Package Receipt	11-6
       11.3.2 Radiological Screening  	11-7
       11.3.3 Corrective Action	11-9
   11.4   Sample Inspection  	11-9
       11.4.1 Physical Integrity of Package and Sample Containers	11-9
       11.4.2 Sample Identity Confirmation	11-10
       11.4.3 Confirmation of Field Preservation	11-11
       11.4.4 Presence of Hazardous Materials	11-11
       11.4.5 Corrective Action	11-12
   11.5   Laboratory Sample Tracking	11-12
       11.5.1 Sample Log-In	11-13
       11.5.2 Sample Tracking During Analyses	11-13
       11.5.3 Storage of Samples	11-14
   11.6   References 	11-14

12 Laboratory Sample Preparation	12-1
   12.1   Introduction 	12-1
   12.2   General Guidance for Sample Preparation	12-2
       12.2.1 Potential Sample Losses During Preparation	12-2
          12.2.1.1   Losses as Dust or Particulates  	12-2
          12.2.1.2   Losses Through Volatilization	12-3
          12.2.1.3   Losses Owing to Reactions Between Sample and Container	12-4
       12.2.2 Contamination from Sources in the Laboratory	12-6
          12.2.2.1   Airborne Contamination 	12-6
          12.2.2.2   Contamination of Reagents 	12-7
          12.2.2.3   Contamination of Glassware/Equipment	12-7



          12.2.2.4   Contamination of Facilities	12-8
       12.2.3 Cleaning of Labware, Glassware, and Equipment	12-8
          12.2.3.1   Labware and Glassware	12-8
          12.2.3.2   Equipment 	12-10
       12.3.1 General Procedures 	12-14
          12.3.1.1   Exclusion of Material	12-14
          12.3.1.2   Principles of Drying Techniques	12-14
          12.3.1.3   Obtaining a Constant Weight	12-24
          12.3.1.4   Subsampling	12-26
       12.3.2 Soil/Sediment Samples  	12-30
           12.3.2.1   Soils	12-31
           12.3.2.2   Sediments	12-31
           12.3.3.1   Biological Samples  	12-32
           12.3.3.2   Food	12-32
          12.3.3.3   Vegetation 	12-32
          12.3.3.4   Bone and Tissue	12-33
       12.3.4 Other Samples 	12-33
    12.4   Filters	12-33
    12.5   Wipe Samples 	12-34
    12.6   Liquid Samples	12-35
       12.6.1 Conductivity	12-35
       12.6.2 Turbidity	12-36
       12.6.3 Filtration	12-36
       12.6.4 Aqueous Liquids  	12-36
       12.6.5 Nonaqueous Liquids  	12-37
       12.6.6 Mixtures	12-38
          12.6.6.1   Liquid-Liquid Mixtures	12-38
          12.6.6.2   Liquid-Solid Mixtures	12-39
    12.7   Gases 	12-39
    12.8   Bioassay	12-40
    12.9   References	12-41
       12.9.1 Cited Sources	12-41
        12.9.2 Other Sources 	12-47

13  Sample Dissolution 	13-1
    13.1   Introduction	13-1
   13.2   The Chemistry of Dissolution 	13-2
       13.2.1 Solubility and the Solubility Product Constant, Ksp	13-2
       13.2.2 Chemical Exchange, Decomposition, and Simple Rearrangement Reactions  . 13-3
       13.2.3 Oxidation-Reduction Processes	13-4
       13.2.4 Complexation	13-5
       13.2.5 Equilibrium: Carriers and Tracers	13-6
   13.3   Fusion Techniques	13-6
       13.3.1 Alkali-Metal Hydroxide Fusions	13-10
       13.3.2 Boron Fusions 	13-11
       13.3.3 Fluoride Fusions	13-12
   13.4   Wet Ashing and Acid Dissolution Techniques	13-13
       13.4.1 Acids and Oxidants  	13-13
       13.4.2 Acid Digestion Bombs	13-23
       13.4.3 Is it Dissolved?   	13-23
   13.5   Microwave Digestion	13-24
       13.5.1 Focused Open-Vessel Systems  	13-25
       13.5.2 Low-Pressure, Closed-Vessel Systems  	13-25
       13.5.3 High-Pressure, Closed-Vessel Systems	13-26
   13.6   Special Matrix Considerations	13-26
       13.6.1 Liquid Samples  	13-26
          13.6.1.1 Aqueous Samples	13-26
          13.6.1.2 Nonaqueous Samples	13-27
       13.6.2 Solid Samples 	13-27
       13.6.3 Filters	13-27
       13.6.4 Wipe Samples 	13-28
       13.6.5 Liquid Scintillation Samples	13-28
          13.6.5.1   Wet Oxidation	13-28
          13.6.5.2   Dry Oxidation  	13-29
   13.7   Total Dissolution and Leaching	13-29
       13.7.1 Acid Leaching	13-30
       13.7.2 Total Dissolution through Fusion 	13-31
       13.7.3 Acid Digestion — Fusion Combined Approach  	13-32
   13.8   Examples of Decomposition Procedures	13-32
   13.9   References	13-33
       13.9.1 Cited References 	13-33
       13.9.2 Other Sources	13-36
14 Separation Techniques	14-1
   14.1   Introduction	14-1
   14.2   Oxidation/Reduction Processes	14-2
       14.2.1 Introduction  	14-2
       14.2.2 Oxidation-Reduction Reactions	14-3
       14.2.3 Common Oxidation States	14-6
       14.2.4 Oxidation State in Solution	14-11
        14.2.5 Common Oxidizing and Reducing Agents	14-12
        14.2.6 Oxidation State and Radiochemical Analysis  	14-14
   14.3   Complexation	14-19
       14.3.1 Introduction  	14-19
       14.3.2 Chelates	14-21
       14.3.3 The Formation (Stability) Constant	14-24
       14.3.4 Complexation and Radiochemical Analysis 	14-25
          14.3.4.1   Extraction of Laboratory Samples and Ores 	14-25
           14.3.4.2   Separation by Solvent Extraction and Ion-Exchange Chromatography	14-25
          14.3.4.3    Formation and Dissolution of Precipitates	14-26
           14.3.4.4    Stabilization of Ions in Solution	14-27
          14.3.4.5   Detection and Determination	14-27
   14.4   Solvent Extraction	14-27
       14.4.1 Extraction Principles  	14-27
       14.4.2 Distribution Coefficient	14-28
       14.4.3 Extraction Technique	14-30
       14.4.4 Solvent Extraction and Radiochemical Analysis	14-33
        14.4.5 Solid-Phase Extraction	14-35
          14.4.5.1   Extraction Chromatography Columns	14-36
          14.4.5.2   Extraction Membranes	14-37
       14.4.6 Advantages and Disadvantages of Solvent Extraction 	14-38
          14.4.6.1   Advantages	14-38
          14.4.6.2   Disadvantages  	14-38
   14.5   Volatilization and Distillation	14-39
       14.5.1 Introduction	14-39
        14.5.2 Volatilization Principles  	14-40
       14.5.3 Distillation Principles  	14-42



       14.5.4 Separations in Radiochemical Analysis	14-43
       14.5.5 Advantages and Disadvantages of Volatilization  	14-44
          14.5.5.1   Advantages	14-44
          14.5.5.2   Disadvantages  	14-44
    14.6   Electrodeposition	14-45
       14.6.1 Electrodeposition Principles  	14-45
       14.6.2 Separation of Radionuclides  	14-46
       14.6.3 Preparation of Counting Sources	14-47
       14.6.4 Advantages and Disadvantages of Electrodeposition 	14-47
          14.6.4.1   Advantages	14-47
          14.6.4.2 Disadvantages	14-47
    14.7   Chromatography	14-48
       14.7.1 Chromatographic Principles	14-48
       14.7.2 Gas-Liquid and Liquid-Liquid Phase Chromatography	14-49
       14.7.3 Adsorption Chromatography	14-50
       14.7.4 Ion-Exchange Chromatography	14-50
          14.7.4.1   Principles of Ion Exchange  	14-50
          14.7.4.2   Resins	14-53
       14.7.5 Affinity Chromatography	14-59
       14.7.6 Gel-Filtration Chromatography	14-59
       14.7.7 Chromatographic Laboratory Methods  	14-60
       14.7.8 Advantages and Disadvantages of Chromatographic Systems  	14-61
    14.8   Precipitation and Coprecipitation 	14-61
       14.8.1 Introduction 	14-61
       14.8.2 Solutions  	14-62
       14.8.3 Precipitation	14-64
          14.8.3.1   Solubility and the Solubility Product Constant, Ksp	14-65
          14.8.3.2   Factors Affecting Precipitation  	14-70
          14.8.3.3   Optimum Precipitation Conditions 	14-75
       14.8.4 Coprecipitation	14-76
          14.8.4.1   Coprecipitation Processes 	14-77
          14.8.4.2   Water as an Impurity 	14-81
          14.8.4.3   Postprecipitation 	14-81
          14.8.4.4   Coprecipitation Methods	14-82
       14.8.5 Colloidal Precipitates	14-86
       14.8.6 Filterability of Precipitates	14-88



       14.8.7 Advantages and Disadvantages of Precipitation and Coprecipitation	14-90
    14.9   Carriers and Tracers	14-91
        14.9.1 Introduction	14-91
        14.9.2 Carriers	14-91
           14.9.2.1   Isotopic Carriers	14-92
          14.9.2.2   Nonisotopic Carriers	14-93
          14.9.2.3   Common Carriers	14-94
          14.9.2.4   Holdback Carriers  	14-98
          14.9.2.5   Yield (Recovery) of Isotopic Carriers	14-98
       14.9.3 Tracers	14-99
          14.9.3.1   Characteristics of Tracers	14-101
          14.9.3.2   Coprecipitation	14-103
          14.9.3.3   Deposition on Nonmetallic Solids	14-103
          14.9.3.4   Radiocolloid Formation  	14-103
          14.9.3.5   Distribution (Partition) Behavior	14-105
          14.9.3.6   Vaporization	14-105
          14.9.3.7   Oxidation and Reduction	14-106
    14.10  Radiochemical Equilibrium  	14-107
       14.10.1   Basic Principles of Equilibrium	14-107
       14.10.2   Oxidation State  	14-110
       14.10.3   Hydrolysis	14-111
       14.10.4   Polymerization	14-113
       14.10.5   Complexation	14-114
       14.10.6   Radiocolloid Interference	14-114
       14.10.7   Isotope Dilution Analysis	14-115
       14.10.8   Masking and Demasking	14-116
       14.10.9   Review of Specific Radionuclides		14-120
          14.10.9.1   Americium	14-120
          14.10.9.2   Cesium	14-125
          14.10.9.3   Cobalt	14-130
           14.10.9.4   Iodine	14-136
          14.10.9.5   Plutonium	14-144
          14.10.9.6   Radium	14-153
           14.10.9.7   Strontium	14-161
          14.10.9.8   Technetium  	14-167
          14.10.9.9   Thorium	14-173

          14.10.9.10 Tritium	14-180
          14.10.9.11 Uranium	14-185
          14.10.9.12 Zirconium	14-196
   14.11  References 	14-204
   14.12  Selected Bibliography 	14-222
       14.12.1    Inorganic and Analytical Chemistry	14-222
       14.12.2    General Radiochemistry	14-223
       14.12.3    Radiochemical Methods of Separation  	14-223
       14.12.4    Radionuclides	14-223
       14.12.5    Separation Methods	14-225

15 Nuclear Counting Instrumentation  	15-1
   15.1   Introduction 	15-1
   15.2   Alpha Counting	15-2
       15.2.1 Introduction	15-2
       15.2.2 Detectors for Alpha Counting  	15-3
           15.2.2.1 Ionization Chambers  	15-3
          15.2.2.2 Proportional Counters  	15-3
          15.2.2.3 Scintillation Counters  	15-4
          15.2.2.4 Liquid Scintillation Counters	15-5
          15.2.2.5 Semiconductor Detectors	15-6
   15.3   Beta Counting 	15-7
       15.3.1 Introduction  	15-7
       15.3.2 Proportional Counter	15-7
       15.3.3 Liquid Scintillation  	15-8
       15.3.4 Solid Organic Scintillators	15-9
       15.3.5 Beta Particle  Counter	15-10
       15.3.6 Associated Electronic Equipment  	15-11
   15.4   Gamma Counting	15-12
       15.4.1 Introduction  	15-12
       15.4.2 Energy Efficiency Relationship	15-16
       15.4.3 Sodium Iodide Detector Assembly  	15-18
       15.4.4 High Resolution Germanium Detectors	15-19
       15.4.5 Low Background High Resolution Germanium Detectors  	15-19
       15.4.6 High Resolution Detectors for Low Energy Spectrometry  	15-20
       15.4.7 CsI(Tl) Detectors	15-20
       15.4.8 CdZnTe Detectors 	15-20
       15.4.9 BGO Detectors	15-21
   15.5   Spectrometry Systems	15-21
       15.5.1 Alpha/Gamma Coincidence Systems	15-21
        15.5.2 Beta/Gamma Coincidence Systems	15-21
        15.5.3 Gamma/Gamma Coincidence Systems  	15-21
        15.5.4 Photon-Electron Rejecting Alpha Liquid Scintillation Systems	15-22
    15.6   Special Instruments	15-22
        15.6.1 4π Counter  	15-22
        15.6.2 Low-Geometry Counters	15-23
        15.6.3 Internal Gas Counters	15-23
    15.7   Spectrometers and Energy-Dependent Detectors	15-24
        15.7.1 Anti-Coincidence Counters	15-29
        15.7.2 Coincidence Counters  	15-30
    15.8   Shielding 	15-31
    15.9   Instrument Calibration	15-31
    15.10  Other Considerations	15-32
        15.10.1 Alpha	15-32
           15.10.1.1 Troubleshooting	15-32
           15.10.1.2 Calibration	15-35
           15.10.1.3 Costs	15-35
           15.10.1.4 Quality Control  	15-37
       15.10.2 Beta	15-40
          15.10.2.1 Introduction 	15-40
          15.10.2.2 Alpha Particle Interference and Beta Energy Resolution	15-41
          15.10.2.3 Liquid Scintillation Quenching 	15-42
           15.10.2.4 Beta Particle Attenuation	15-42
           15.10.2.5 Calibration	15-44
           15.10.2.6 Costs  	15-44
           15.10.2.7 Quality Control  	15-46
        15.10.3 Gamma  	15-46
           15.10.3.1 Troubleshooting	15-46
           15.10.3.2 Calibration	15-48
           15.10.3.3 Software	15-49
           15.10.3.4 Costs	15-50
          15.10.3.5  Quality Control	15-50

       15.10.4   Non-Nuclear Instrumentation  	15-51
          15.10.4.1 ICP-Mass Spectrometry	15-51
          15.10.4.2 Laser  	15-53
          15.10.4.3 Radionuclides Analyzed By Neutron Activation	15-54
   15.11  References	15-55
       15.11.1   Cited References 	15-55
       15.11.2   Other Sources	15-61

Attachment 15A Field Measurements	15-63
   15A.1  Introduction	15-63
   15A.2  Analytical Level of Measurements	15-63
   15A.3  Documentation of Methodology  	15-65
   15A.4  Instrument Operating Conditions	15-66
   15A.5  Site Conditions/Limitations	15-66
   15A.6  Interferences	15-67
   15A.7  Calibration	15-67
   15A.8  Minimum Detectable Concentrations  	15-68
    15A.9  Precision	15-69
    15A.10 Accuracy 	15-69
    15A.11 Representativeness	15-69
    15A.12 Completeness	15-70
    15A.13 Comparability	15-71
    15A.14 Reference Measurements	15-71
    15A.15 Record Keeping	15-72
    15A.16 Quality Improvement	15-72
    15A.17 Management Assessment	15-73
    15A.18 Combined Laboratory and Field Measurements  	15-73
    15A.19 References  	15-73

16 Instrument Calibration and Test Source Preparation	16-1
   16.1   Introduction 	16-1
   16.2   Instrument Calibration	16-2
       16.2.1 Standards  	16-3
       16.2.2 Correspondence	16-3
       16.2.3 Homogeneity 	16-4
        16.2.4 Uncertainty	16-4


    16.3   General Test Source Characteristics  	16-4
        16.3.1 Geometrical Arrangement 	16-5
        16.3.2 Uniformity of Test Source Material	16-5
        16.3.3 Self-Absorption and Scattering	16-6
        16.3.4 Counting Planchets	16-8
    16.4   Test Source Preparation and Calibration for Alpha Measurements	16-8
        16.4.1 Proportional Counters  	16-9
           16.4.1.1 Alpha Test Source Preparation  	16-9
           16.4.1.2 Proportional Counter Calibration — Alpha	16-10
        16.4.2 ZnS(Ag) Scintillation Counter	16-11
        16.4.3 Alpha Spectrometry With Semiconductor Detectors	16-12
        16.4.4 Liquid-Scintillation Spectrometer	16-13
    16.5   Characteristics of Sources for Beta Measurements	16-13
        16.5.1 Proportional Counters  	16-13
           16.5.1.1 Beta Test Source Preparation	16-14
           16.5.1.2 Proportional Counter Calibration — Beta	16-14
        16.5.2 Liquid-Scintillation Spectrometers  	16-15
           16.5.2.1 Liquid Scintillation Test Source Preparation	16-16
           16.5.2.2 Liquid-Scintillation Spectrometer Calibration	16-17
    16.6   Characteristics of Sources for Gamma-Ray Measurements	16-18
        16.6.1 Gamma Test Source Preparation	16-18
        16.6.2 Gamma Spectrometer Calibration	16-20
    16.7   Methods of Test Source Preparation	16-20
        16.7.1 Electrodeposition	16-20
        16.7.2 Coprecipitation	16-23
        16.7.3 Evaporation  	16-25
        16.7.4 Thermal Volatilization/Sublimation 	16-26
        16.7.5 Preparing Sources to Measure Radioactive Gases	16-27
        16.7.6 Preparing Air Filters for Counting	16-29
        16.7.7 Preparing Swipes/Smears for Counting	16-29
    16.8   References	16-30
17  Data Acquisition, Reduction, and Reporting	17-1
    17.1   Introduction  	17-1
    17.2   Data Acquisition	17-2
       17.2.1  Generic Counting Parameter Selection  	17-3



           17.2.1.1 Counting Duration	17-4
           17.2.1.2 Counting Geometry	17-5
           17.2.1.3 Software	17-5
       17.2.2  Basic Data Reduction Calculations	17-6
    17.3   Data Reduction on Spectrometry Systems  	17-8
       17.3.1  Gamma Spectrometry  	17-8
           17.3.1.1 Peak Search or Identification	17-10
              Regions of Interest (ROI) Method	17-11
              Gaussian Function Derivative Method  	17-12
              Channel Differential Method	17-12
              Correlation Method	17-12
           17.3.1.2 Singlet/Multiplet Peaks  	17-13
           17.3.1.3 Definition of Peak Centroid and Energy 	17-14
           17.3.1.4 Peak Width Determination	17-14
           17.3.1.5 Peak Area Determination	17-16
           17.3.1.6 Calibration Reference File	17-19
           17.3.1.7 Activity and Concentration	17-19
           17.3.1.8    Summing Considerations	17-20
           17.3.1.9 Uncertainty Calculation	17-22
       17.3.2  Alpha Spectrometry	17-23
           17.3.2.1 Radiochemical Yield	17-27
           17.3.2.2 Uncertainty Calculation	17-27
       17.3.3  Liquid Scintillation Spectrometry	17-28
           17.3.3.1 Overview of Liquid Scintillation Counting	17-28
           17.3.3.2 Liquid Scintillation Spectra	17-29
           17.3.3.3 Pulse Characteristics 	17-29
           17.3.3.4 Coincidence Circuitry  	17-30
           17.3.3.5 Quenching  	17-30
           17.3.3.6 Luminescence	17-30
           17.3.3.7 Test Source Vials	17-31
           17.3.3.8 Data Reduction for Liquid Scintillation Counting	17-31
    17.4   Data Reduction on Non-Spectrometry Systems	17-32
    17.5   Reporting Data	17-37
       17.5.1  Sample and Analysis Method Identification 	17-37
       17.5.2  Units and Radionuclide Identification	17-38
       17.5.3  Values, Uncertainty, and Significant Figures	17-38



       17.5.4 Other Information to be Provided on Request	17-38
    17.6   Data Packages	17-39
    17.7   Electronic Data Deliverables	17-39
    17.8   References 	17-40
       17.8.1 Cited References  	17-40
       17.8.2 Other Sources	17-42

18  Laboratory Quality Control 	18-1
    18.1   Introduction 	18-1
        18.1.1 Organization of Chapter	18-2
        18.1.2 Format 	18-2
    18.2   Quality Control  	18-3
    18.3   Evaluation of Performance Indicators	18-4
        18.3.1 Importance of Evaluating Performance Indicators	18-4
        18.3.2 Statistical Means of Evaluating Performance Indicators — Control Charts  .. 18-5
        18.3.3 Measurement Uncertainty	18-7
    18.4   Radiochemistry Performance Indicators	18-9
        18.4.1 Method and Reagent Blank	18-9
       18.4.2 Laboratory Replicates	18-13
       18.4.3 Laboratory Control Samples, Matrix Spikes, and Matrix Spike Duplicates  . 18-16
       18.4.4 Certified Reference Materials  	18-18
        18.4.5 Chemical/Tracer Yield	18-21
    18.5   Instrumentation Performance Indicators	18-25
        18.5.1 Instrument Background Measurements	18-25
       18.5.2 Efficiency Calibrations	18-27
       18.5.3 Spectrometry Systems	18-31
          18.5.3.1   Energy Calibrations	18-31
          18.5.3.2   Peak Resolution and Tailing 	18-34
       18.5.4 Gas Proportional Systems 	18-38
          18.5.4.1   Voltage Plateaus	18-38
          18.5.4.2   Self-Absorption, Backscatter, and Crosstalk   	18-39
       18.5.5 Liquid Scintillation 	18-41
        18.5.6 Summary	18-41
    18.6   Related Concerns	18-43
        18.6.1 Detection Capability	18-43
        18.6.2 Secular Equilibrium	18-44
       18.6.3 Half-Life	18-47
       18.6.4 Interferences	18-48
       18.6.5 Negative Results	18-50
       18.6.6 Blind Samples	18-51
       18.6.7 Calibration of Apparatus Used for Weight and Volume Measurements .... 18-54
   18.7   References	18-55
       18.7.1 Cited Sources	18-55
       18.7.2 Other Sources	18-57
   Attachment 18A: Control Charts  	18-59
       18A.1 Introduction  	18-59
       18A.2 X Charts	 18-59
        18A.3 X̄ Charts  	18-63
       18A.4 R Charts	18-64
       18A.5 Control Charts for Instrument Response	18-65
        18A.6 References 	18-70
   Attachment 18B: Statistical Tests for QC Results 	18-71
       18B.1 Introduction  	18-71
       18B.2 Tests for Excess Variance in the Instrument Response	18-71
        18B.3 Instrument Background Measurements	18-78
          18B.3.1   Detection of Background Variability	18-78
          18B.3.2   Comparing a Single Observation to Preset Limits	18-80
          18B.3.3   Comparing the Results of Consecutive Measurements	18-84
       18B.4 Negative Activities	18-86
       18B.5 References 	18-86

19 Measurement Statistics  	19-1
   19.1   Overview	19-1
    19.2   Statistical Concepts and Terms ........................... 19-2
       19.2.1 Basic Concepts ........................................ 19-2
       19.2.2 Summary of Terms ...................................... 19-5
    19.3   Measurement Uncertainty .................................. 19-7
       19.3.1 Measurement, Error, and Uncertainty ................... 19-7
       19.3.2 The Measurement Process ............................... 19-8
       19.3.3 Analysis of Measurement Uncertainty ................... 19-10
       19.3.4 Corrections for Systematic Effects .................... 19-11
       19.3.5 Counting Uncertainty .................................. 19-11
           19B.5 The Covariance Matrix for a Least-Squares Solution .. 19-102
           19B.6 Critical Values ..................................... 19-103
           19B.7 Detection and Quantification Limits ................. 19-104
           19B.8 References .......................................... 19-104
        Attachment 19C  Estimation of Coverage Factors ............... 19-105
           19C.1 Introduction ........................................ 19-105
           19C.2 Procedure ........................................... 19-105
           19C.3 Poisson Counting Uncertainty ........................ 19-106
           19C.4 References .......................................... 19-107
       Attachment 19D  Low-Background Detection Limits	19-109
           19D.1 Overview ............................................ 19-109
          19D.2 Calculation of the Critical Value	19-109
             19D.2.1 Normally Distributed Signals	19-109
              19D.2.2 Poisson Counting ............................... 19-110
          19D.3 Calculation of the Minimum Detectable Concentration 	19-123
             19D.3.1 The Minimum Detectable Net Instrument Signal	19-123
             19D.3.2 Normally Distributed Signals	19-123
             19D.3.3 Poisson Counting  	19-126
          19D.4 References 	19-132
       Attachment 19E  Example Calculations	19-135
          19E.1 Overview	19-135
           19E.2 Sample Collection and Analysis ...................... 19-135
          19E.3 The Measurement Model	19-136
          19E.4 The Combined Standard Uncertainty	19-138
          19E.5 The Critical Net Count	19-140
          19E.6 The Minimum Detectable Concentration	19-143
          19E.7 The Minimum Quantifiable Concentration  	19-148
        Attachment 19F  Tests for Normality .......................... 19-149
           19F.1 Purpose ............................................. 19-149
           19F.2 Normal Probability Plots ............................ 19-149
           19F.3 Filliben's Statistic ................................ 19-151
          19F.4 References	19-155
       Attachment 19G  Balance Measurement Uncertainty  	19-157
           19G.1 Purpose ............................................. 19-157
           19G.2 Considerations ...................................... 19-157
          19G.3 Repeatability 	19-157
          19G.4 Environmental Factors	19-158
          19G.5 Calibration	19-159
          19G.6 Linearity	19-160
          19G.7 Air Buoyancy Corrections	19-160
          19G.8 Combining the Components	19-164
          19G.9 References  	19-165

20 Waste Management in a Radioanalytical Laboratory  	20-1
   20.1   Introduction 	20-1
   20.2   Types of Laboratory Wastes  	20-1
   20.3   Waste Management Program	20-2
       20.3.1 Program Integration	20-3
       20.3.2 Staff Involvement	20-3
   20.4   Waste Minimization	20-4
   20.5   Waste Determinations and Characterization	 20-6
   20.6   Specific Waste Management Requirements  	20-7
       20.6.1 Sample/Waste Exemptions 	20-9
       20.6.2 Storage	20-10
          20.6.2.1 Container Requirements	20-11
          20.6.2.2 Labeling Requirements 	20-11
          20.6.2.3 Time Constraints 	20-11
          20.6.2.4 Monitoring Requirements 	20-12
       20.6.3 Treatment	20-12
       20.6.4 Disposal	20-13
   20.7   Contents of a Laboratory Waste Management Plan/Certification Plan	20-14
       20.7.1 Laboratory Waste Management Plan	20-14
       20.7.2 Waste Certification Plan/Program	20-14
   20.8   Useful Web Sites	20-16
   20.9   References 	20-17
       20.9.1 Cited References  	20-17
       20.9.2 Other Sources	20-18

Glossary	to be added following public review
                                     Appendices

Appendix A: Directed Planning Approaches ............................. A-1
    A.1   Directed Planning Approaches ............................... A-1
    A.2   Elements Common to Directed Planning Approaches ............ A-1
    A.3   Data Quality Objectives Process ............................ A-2
    A.4   Observational Approach ..................................... A-3
    A.5   Streamlined Approach for Environmental Restoration ......... A-4
    A.6   Technical Project Planning ................................. A-4
    A.7   Expedited Site Characterization ............................ A-5
    A.8   Value Engineering .......................................... A-5
    A.9   Systems Engineering ........................................ A-6
    A.10  Total Quality Management ................................... A-7
    A.11  Partnering ................................................. A-7
   A.12  References	  A-8
       A.12.1 Data Quality Objectives	  A-8
        A.12.2 Observational Approach ................................ A-10
        A.12.3 Streamlined Approach for Environmental Restoration (SAFER) .. A-10
        A.12.4 Technical Project Planning ............................ A-11
        A.12.5 Expedited Site Characterization ....................... A-11
       A.12.6 Value Engineering	  A-13
       A.12.7 Systems Engineering	  A-14
       A.12.8 Total Quality Management  	  A-16
       A.12.9 Partnering	  A-17

Appendix B: The Data Quality Objectives Process ...................... B-1
    B1.0  Introduction ............................................... B-1
    B2.0  Overview of the DQO Process ................................ B-2
    B3.0  The Seven Steps of the DQO Process ......................... B-3
       B3.1   DQO Process Step 1: State the Problem .................. B-3
       B3.2   DQO Process Step 2: Identify the Decision .............. B-4
       B3.3   DQO Process Step 3: Identify Inputs to the Decision .... B-5
       B3.4   DQO Process Step 4: Define the Study Boundaries ........ B-7
       B3.5   Outputs of DQO Process Steps 1 to 4 Lead Into Steps 5 to 7 .. B-8
       B3.6   DQO Process Step 5: Develop a Decision Rule ............ B-8
       B3.7   DQO Process Step 6: Specify the Limits on Decision Errors .. B-9

       B3.8   DQO Process Step 7: Optimize the Design for Obtaining Data .. B-12
       B3.9   References ............................................. B-14
    Attachment B-1 Decision Error Rates and the Gray Region .......... B-16
       B-1.1  Introduction ........................................... B-16
       B-1.2  The Region of Interest ................................. B-16
       B-1.3  Measurement Uncertainty at the Action Level ............ B-17
       B-1.4  The Null Hypothesis .................................... B-18
           Case 1: Assume the True Concentration is Over 1.0 ......... B-18
           Case 2: Assume the True Concentration is 0.9 .............. B-20
       B-1.5  The Critical Region .................................... B-20
       B-1.6  The Gray Region ........................................ B-21

Appendix C: Measurement Quality Objectives for Method Uncertainty and Detection and
    Quantification Capability ........................................ C-1
    C.1    Introduction .............................................. C-1
    C.2    Hypothesis Testing ........................................ C-2
    C.3    Development of MQOs for Analytical Protocol Selection ..... C-4
    C.4    The Role of the MQO for Method Uncertainty in Data Evaluation .. C-9
       C.4.1 Uncertainty Requirements at Various Concentrations ...... C-9
       C.4.2 Acceptance Criteria for Quality Control Samples ......... C-12
    C.5    References ................................................ C-19

Appendix D: Content of Project Plan Documents ........................ D-1
    D1.0   Introduction .............................................. D-1
    D2.0   Group A: Project Management ............................... D-3
       D2.1   Project Management (A1): Title and Approval Sheet ...... D-6
       D2.2   Project Management (A2): Table of Contents ............. D-7
       D2.3   Project Management (A3): Distribution List ............. D-7
       D2.4   Project Management (A4): Project/Task Organization ..... D-8
       D2.5   Project Management (A5): Problem Definition/Background .. D-8
       D2.6   Project Management (A6): Project/Task Description ...... D-10
       D2.7   Project Management (A7): Quality Objectives and Criteria for
           Measurement Data .......................................... D-12
           D2.7.1 Project's Quality Objectives ....................... D-12
           D2.7.2 Specifying Measurement Quality Objectives .......... D-13
           D2.7.3 Relation between the Project DQOs, MQOs, and QC Requirements ... D-14
       D2.8   Project Management (A8): Special Training Requirements/Certification ... D-14
       D2.9   Project Management (A9): Documentation and Records ..... D-14
   D3.0   Group B: Measurement/Data Acquisition	  D-16
      D3.1  Measurement/Data Acquisition (Bl): Sampling Process Design 	  D-16
      D3.2  Measurement/Data Acquisition (B2): Sampling Methods Requirements ...  D-18
      D3.3  Measurement/Data Acquisition (B3): Sample Handling and Custody
          Requirements	  D-20
      D3.4  Measurement/Data Acquisition (B4): Analytical Methods Requirements  ..  D-21
      D3.5  Measurement/Data Acquisition (B5): Quality Control Requirements  	  D-23
      D3.6  Measurement/Data Acquisition (B6): Instrument/Equipment Testing, Inspection,
             and Maintenance Requirements	  D-24
      D3.7  Measurement/Data Acquisition (B7): Instrument Calibration and Frequency  D-25
      D3.8  Measurement/Data Acquisition (B8): Inspection/Acceptance Requirements for
             Supplies and Consumables  	  D-26
      D3.9  Measurement/Data Acquisition (B9): Data Acquisition Requirements for Non-
             Direct Measurement Data  	  D-27
       D3.10 Measurement/Data Acquisition (B10): Data Management ..... D-28
   D4.0   Group C: Assessment/Oversight	  D-29
      D4.1  Assessment/Oversight (Cl): Assessment and Response Actions 	  D-29
      D4.2  Assessment/Oversight (C2): Reports To Management	  D-30
   D5.0   Group D: Data Validation and Usability	,	  D-31
       D5.1   Data Validation and Usability (D1): Verification and Validation
          Requirements  	  D-31
      D5.2  Data Validation and Usability (D2):  Verification and Validation Methods .  D-32
          D5.2.1 Data Verification 	  D-32
          D5.2.2 Data Validation 	  D-33
      D5.3  Data Validation and Usability (D3):  Reconciliation with Data Quality
          Objectives  	  D-34
   D6.0   References  	  D-35

Appendix E: Contracting Laboratory Services .......................... E-1
    E.1 Introduction ................................................. E-1
    E.2 Procurement of Services ...................................... E-5
       E.2.1  Request for Approval of Proposed Procurement Action .... E-6
       E.2.2  Types of Procurement Mechanisms ........................ E-6
    E.3 Request for Proposals—The Solicitation ....................... E-8
      E.3.1  Market Research  	E-9
      E.3.2  Length of Contract	E-10
      E.3.3  Subcontracts	E-10
    E.4 Proposal Requirements ........................................ E-11
       E.4.1  RFP and Contract Information ........................... E-11
      E.4.2  Personnel	E-14
      E.4.3  Instrumentation  	E-17
          E.4.3.1 Type, Number, and Age of Laboratory Instruments  	E-18
          E.4.3.2    Service Contract	E-18
      E.4.4  Narrative to Approach	E-18
          E.4.4.1 Analytical Methods or Protocols	E-18
           E.4.4.2    Meeting Contract Measurement Quality Objectives .. E-19
          E.4.4.3    Data Package	E-19
          E.4.4.4    Schedule	E-19
          E.4.4.5    Sample Storage and Disposal	E-20
      E.4.5  Quality Manual  	E-21
      E.4.6  Licenses and Accreditations	E-22
      E.4.7  Experience	E-22
          E.4.7.1    Previous or Current Contracts	E-23
          E.4.7.2    Quality of Performance 	E-23
   E.5 Proposal Evaluation and Scoring Procedures  	E-23
      E.5.1  Evaluation Committee	E-24
      E.5.2  Ground Rules — Questions	E-24
      E.5.3  Scoring/Evaluation Scheme	E-24
          E.5.3.1 Review of Technical Proposal and Quality Manual  	E-26
          E.5.3.2    Review of Laboratory Accreditation  	E-28
          E.5.3.3    Review of Experience  	E-28
      E.5.4  Pre-Award Proficiency Samples  	E-28
      E.5.5  Pre-Award Audit  	E-29
      E.5.6  Comparison of Prices	E-33
      E.5.7  Debriefing of Unsuccessful Vendors	E-34
    E.6 The Award .................................................... E-34
    E.7 For the Duration of the Contract ............................. E-35
      E.7.1  Managing a Contract	E-35
      E.7.2  Responsibility of the Contractor  	E-36
      E.7.3  Responsibility of the Agency	E-36
       E.7.4  Anomalies and Nonconformance ........................... E-36
       E.7.5  Laboratory Assessment .................................. E-37
          E.7.5.1   Performance and Quality Control Samples 	E-37
          E.7.5.2   Laboratory Performance Evaluation Programs  	E-38
          E.7.5.3   Laboratory Evaluations Performed During the Contract Period	E-39
    E.8 Contract Completion .......................................... E-40
    E.9 References ................................................... E-41

Appendix F: Laboratory Subsampling ................................... F-1
    F.1 Introduction ................................................. F-1
    F.2 Basic Concepts ............................................... F-2
   F.3 Sources of Measurement Error  	F-4
      F.3.1  Sampling Bias	F-4
      F.3.2  Fundamental Error	F-5
      F.3.3  Grouping and Segregation Error  	F-7
    F.4 Implementation of the Particulate Sampling Theory ............ F-10
      F.4.1  The Fundamental Variance  	F-10
       F.4.2  Scenario 1 - Natural Radioactive Minerals .............. F-11
      F.4.3  Scenario 2 - Hot Particles 	F-12
      F.4.4  Scenario 3 - Particle Surface Contamination  	F-14
   F.5 Summary 	F-16
   F.6 References 	F-17

Appendix G: Statistical Tables ....................................... G-1
                                    List of Figures

Figure 1.1 The Data Life Cycle	1-5
Figure 1.2 Typical Components of an Analytical Process  	1-7
Figure 1.3 The MARLAP Process ........................................ 1-16

Figure 3.1 Typical components of an analytical process  	3-3
Figure 3.2 Analytical protocol specifications	3-28
Figure 3.3 Example analytical protocol specifications	3-29

Figure 6.1 Analytical process	6-3
Figure 6.2 Method application life cycle  	6-6
Figure 6.3 Expanded Figure 6.2 addressing the laboratory's method evaluation process  ....  6-7

Figure 7.1 Considerations for the initial evaluation of a laboratory	7-18

Figure 8.1 The Assessment Process  	8-6

Figure 9.1 Using physical samples to measure a characteristic of the population
   representatively .................................................. 9-17
Figure 9.2 Types of sampling and analytical errors	9-18

Figure 10.1 Example of chain-of-custody record	10-10

Figure 11.1 Overview of sample receipt, inspection, and tracking	11-2

Figure 12.1 Degree of error in laboratory sample preparation (Schwedt, 1997) ......... 12-1
Figure 12.2 Laboratory Sample Preparation Flowchart (for Solid Samples)	12-13

Figure 14.1  Ethylenediaminetetraacetic Acid (EDTA) .................. 14-22
Figure 14.2  Crown ethers	14-23
Figure 14.3  The behavior of elements in  concentrated hydrochloric acid on cation-exchange
   resins 	14-57
Figure 14.4  The behavior of elements in concentrated hydrochloric acid on anion-exchange
   resins ............................................................ 14-58
Figure 14.5  The electrical double layer	14-86

Figure 15.1 Gamma-ray Interactions with Germanium	15-12
Figure 15.2 Gamma-ray Spectra of 60Co ................................ 15-13
Figure 15.3 Energy Spectrum of 22Na .................................. 15-15
Figure 15.4 Efficiency vs. Gamma-ray Energy	15-16
Figure 15.5 Standard Cryostat HPGe Background Spectrum  	15-20
Figure 15.6 Low Background Cryostat HPGe Background Spectrum	15-20
Figure 15.7 NaI(Tl) Energy Spectrum of 137Cs ......................... 15-26
Figure 15.8 HPGe Energy Spectrum of 137Cs ............................ 15-27
Figure 15.9 Spectrum of 210Pb, 210Bi, and 210Po ...................... 15-29
Figure 15.10 Range vs. Energy for Alpha Particles in Air  	15-35
Figure 15.11 Range vs. Energy for Beta Particles in Air and Water	15-43
Figure 15.12 Beta Detector Efficiency Curve for 131I vs. Weight ...... 15-43
Figure 15.13 Beta-gamma coincidence efficiency curve for 129I ........ 15-55

Figure 17.1 Gamma-ray spectrum	17-9
Figure 17.2 Gamma-ray analysis sequence	17-11
Figure 17.3 Low-energy tailing	17-16
Figure 17.4 Photopeak baseline continuum 	17-17
Figure 17.5 Photopeak baseline continuum-step function  	17-18
Figure 17.6 Alpha spectrum	17-23

Figure 18.1 Control chart for daily counting of a standard reference source, with limits corrected
   for decay	18-7
Figure 18.2 Three general categories of blank changes 	18-12
Figure 18.3 Failed performance indicator: replicates	18-15
Figure 18.4 Failed performance indicator: chemical yield	18-23

Figure 19.1 A symmetric distribution	19-4
Figure 19.2 An asymmetric distribution	19-4
Figure 19.3 The critical value xc and minimum detectable value XD
   of the net state variable 	19-20
Figure 19.4 Expected fraction of atoms remaining at time t ........... 19-44
Figure 19.5 A normal distribution 	19-85
Figure 19.6 A log-normal  distribution  	19-86
Figure 19.7 Chi-square distributions	19-88
Figure 19.8 The t-distribution with 3 degrees of freedom ............. 19-89
Figure 19.9 A rectangular distribution 	19-91
Figure 19.10 A trapezoidal distribution  	19-91
Figure 19.11 An exponential distribution	19-92
Figure 19.12 Type I error rate for the Poisson-normal approximation (tB = tS) ........ 19-113
Figure 19.13 Type I error rates for Formula A	19-115
Figure 19.14 Type I error rates for Formula B 	19-116
Figure 19.15 Type I error rates for Formula C 	19-118
Figure 19.16 Type I error rates for the Stapleton approximation	19-119
Figure 19.17 Type I error rates for the nonrandomized exact test  	19-121
Figure 19.18 Example: Normal probability plot	19-154

Figure B1 Seven steps of the DQO process	B-2
Figure B2(a) Decision performance goal diagram, null hypothesis: the parameter exceeds
   the action level .................................................. B-11
Figure B2(b) Decision performance goal diagram, null hypothesis: the parameter is less
   than the action level ............................................. B-11
Figure B3 How proximity to the action level determines what is an acceptable level of
   uncertainty ....................................................... B-13

Figure C.1 Required Analytical Standard Deviation (σReq) ............. C-10

Figure E-1 General Sequence Initiating and Later Conducting Work with a Contract
   Laboratory ........................................................ E-4
                                    List of Tables

Table 2.1 Summary of the directed planning process and radioanalytical specialists'
   participation ..................................................... 2-10

Table 3.1  Matrix-specific analytical planning issues	3-23

Table 4.1 Elements of project plan documents	  4-7
Table 4.2 Crosswalk between project plan document elements and directed planning process .. 4-11

Table 6.1 Tiered method validation approach	6-28

Table 7.1 Cross reference of information available for method evaluation	7-4

Table 9.1 Summary of the DQA process	9-6

Table 10.1  Summary of sample preservation techniques	10-25

Table 11.1  Typical topics addressed in standard operating procedures related to sample receipt,
   inspection, and tracking	11-4

Table 12.1  Examples of volatile radionuclides	12-4
Table 12.2  Properties of sample container materials  	12-5
Table 12.3  Examples of dry-ashing temperatures (platinum container)	12-23

Table 13.1  Common fusion fluxes	13-7
Table 13.2  Examples of acids used for wet ashing	13-14
Table 13.3  Standard reduction potentials of selected half-reactions at 25 °C	13-14

Table 14.1  Oxidation states of elements	14-9
Table 14.2  Stable oxidation states of selected elements	14-10
Table 14.3  Redox reagents for radionuclides	14-14
Table 14.5  Radioanalytical methods employing solvent extraction	14-35
Table 14.6  Radioanalytical methods employing extraction chromatography	14-36
Table 14.7  Elements separable by volatilization as certain species	14-41
Table 14.8  Typical functional groups of ion-exchange resins 	14-54
Table 14.9  Common ion-exchange resins	14-55
Table 14.10 General solubility behavior of some cations of interest	 14-63
Table 14.11 Summary of methods for utilizing precipitation from
   homogeneous solution	14-74
Table 14.12 Influence of precipitation conditions on the purity of precipitates	14-76
Table 14.13 Common coprecipitating agents for radionuclides	14-83
Table 14.14 Coprecipitation behavior of plutonium and neptunium 	14-85
Table 14.15 General properties of common filter papers	14-89
Table 14.16 Atoms and mass of select radionuclides equivalent to  500 dpm  	14-91
Table 14.17 Masking agents for ions of various metals	14-117
Table 14.18 Masking agents for anions and neutral molecules 	14-119

Table 15.1 Typical percent gamma-ray efficiencies for a 55 percent high-purity germanium
   detector with various counting geometries  	15-17
Table 15.2 Nuclides for gamma-ray spectrometer calibration	15-48

Table 16.1  Nuclides for alpha calibration	16-10
Table 16.2  Nuclides for beta calibration	16-15

Table 17.1  Units for data reporting	17-38

Table 18.1  Problems leading to loss of analytical control  	18-3
Table 18.2a Certified Massic activities for natural radionuclides
   with a normal distribution of measurement results 	18-20
Table 18.2b Certified Massic activities for anthropogenic radionuclides with a Weibull
   distribution of measurement results	18-20
Table 18.2c Uncertified Massic activities	18-20
Table 18.3  Instrument background evaluation	18-27
Table 18.4  Root cause analysis of performance check results 	18-37
Table 18.5  Instrument calibration: example frequency and performance criteria  	18-41
Table 18A.1 Bias-correction factor for the experimental standard deviation .......... 18-60

Table 19.1  Applications of the uncertainty propagation  formula	19-34
Table 19.2  Density of air-free water	19-60
Table 19.3  95% confidence interval for a Poisson mean  	19-94
Table 19.4  Critical gross count (well-known blank)	19-111
Table 19.5  Bias factor for the experimental standard deviation	19-125
Table 19.6  Estimated and true values of SD (tB = tS)	19-131
Table 19.7  Input estimates and standard uncertainties	19-139

Table 20.1  Examples of laboratory-generated wastes	20-2

Table Dl QAPP groups and elements	  D-2
Table D2 Comparison of project plan contents  	  D-3
Table D3 Content of the three elements that constitute the project description	  D-9

Table E.1 Examples of procurement options to obtain materials or services ........... E-7
Table E.2 SOW checklists for the agency and proposer ................. E-13
Table E.3 Laboratory technical supervisory personnel listed by position title and examples for
   suggested minimum qualifications .................................. E-15
Table E.4 Laboratory technical personnel listed by position title and examples for suggested
   minimum qualifications and examples of optional staff members 	E-16
Table E.5 Laboratory technical staff listed by position title and examples for suggested
   minimum qualifications ............................................ E-16
Table E.6 Example of a proposal evaluation plan  	E-26

Table G.1 Quantiles of the standard normal distribution .............. G-1
Table G.2 Quantiles of Student's t distribution	  G-3
Table G.3 Quantiles of chi-square	  G-5
Table G.4 Critical values for the nonrandomized exact test	  G-7
Table G.5 Critical values of Filliben's statistic .................... G-11
Table G.6 Summary of probability distributions	 G-12

              ACRONYMS AND ABBREVIATIONS

   Note: Bracketed numbers following each definition represent the first chapter in which the acronym appears.
ADC	analog to digital converter [18]
AEA	Atomic Energy Act [20]
AL 	action level [C]
ANSI 	American National Standards Institute [1]
AOAC 	Association of Official Analytical Chemists [3]
APHA	American Public Health Association [6]
APS 	analytical protocol specification [1]
ARARs	applicable or relevant and appropriate requirements (CERCLA/Superfund) [D]
ASL 	analytical support laboratory [15]
ASQC ....... American Society for Quality Control [2]
ASTM 	American Society for Testing and Materials [1]
ATD	alpha track detector [10]

BOA	basic ordering agreement [4]

CAA	Clean Air Act [20]
CBD	Commerce Business Daily [E]
CC 	charcoal canisters [10]
CEDE	committed effective dose equivalent [2]
CERCLA .... Comprehensive Environmental Response, Compensation, and Liability Act of
                  1980 (Superfund) [2]
CFM	cubic feet per minute [16]
CFR 	Code of Federal Regulations [20]
CL 	central line (of a control chart) [15]
CMPO	[octyl(phenyl)]-N,N-diisobutylcarbonylmethylphosphine oxide [14]
CMST	Characterization, Monitoring, and Sensor Technology Program (DOE) [A]
COC	chain of custody [2]
COR	contracting officer's representative [5]
cpm	counts per minute [12]
cps 	counts per second [15]
CRM	continuous radon monitor [10]
CRM	certified reference material [18]
CWA 	Clean Water Act [20]
CWLM	continuous working level monitor [10]

DAAP	diamylamylphosphonate [14]
DCGL	derived concentration guideline level [2]
DIN	di-isopropylnaphthalene [16]
DL  	discrimination limit [C]
DoD	U.S. Department of Defense  [1]
DOE	U.S. Department of Energy [1]
DOELAP	DOE Lab Accreditation Program  [18]
DOT	U.S. Department of Transportation [5]
DPM	disintegrations per minute [12]
DPPP  	dipentylpentylphosphonate [14]
DQA	data quality assessment [1]
DQI	data quality indicators [3]
DQO	data quality objective [1]
DTPA	diethylene triamine penta-acetic acid [10]
DVB	divinylbenzene [14]

EDD	electronic data deliverables [17]
EDTA	ethylene diamine tetra acetic acid  [10]
EGTA	ethyleneglycol bis(2-aminoethylether)-tetraacetate [14]
EPA 	U.S. Environmental Protection Agency [1]
ERPRIMS  ... Environmental Resources Program Management System (U.S. Air Force) [17]
ESC 	expedited site characterization [A]
eV	electron volts [15]

FAR	Federal Acquisition Regulations [E]
FDA	U.S. Food and Drug Administration [1]
FWHM	full width of a peak at half maximum [8]
FWTM	full width of a peak at tenth maximum [18]

GC  	gas chromatography [14]
GLPC	gas-liquid phase chromatography  [14]
GM	Geiger-Mueller detector [11]
GUM  	Guide to the Expression of Uncertainty in Measurement [1]

HDBP ....... dibutylphosphoric acid [14]
HDEHP ...... bis(2-ethylhexyl) phosphoric acid [16]
HDEHP ...... diethylhexylphosphoric acid [14]
HDPE ....... high density polyethylene [10]
HPGe	high-purity germanium [semiconductor] [15]
HPLC	high-pressure liquid chromatography; high-performance liquid chromatography
                  [14]
HTRW	hazardous, toxic and radioactive waste [10]

ICP-MS  	inductively coupled plasma-mass spectroscopy [14]
IPPD	integrated product and process development [A]
ISO	International Organization for Standardization [1]
IUPAC	International Union of Pure and Applied Chemistry [1]

LAN	local area network [17]
LBGR	lower boundary of the gray region [B]
LCL 	lower control limit [18]
LCS 	laboratory control samples [3]
LDPE	low density polyethylene [10]
LEGe  	low energy germanium [15]
LIMS ....... Laboratory Information Management System [17]
LLD 	lower limit of detection [19]
LLRW  	low-level radioactive waste [20]
LLRWPA	Low Level Radioactive Waste Policy Act [20]
LOMI	low oxidation-state transition-metal ion [10]
LPC 	liquid partition chromatography; liquid-phase chromatography [14]
LS ......... liquid scintillation [15]
LSC 	liquid scintillation counting [15]
LWL	lower warning limit [18]

MAPEP  	Mixed Analyte Performance Evaluation Program [DOE] [5]
MARSSIM ... Multi-Agency Radiation Survey and Site Investigation Manual [1]
MCA  	multichannel analyzer [15]
MDA ........ minimum detectable activity [15]; minimum detectable amount [7]
MDC ........ minimum detectable concentration [2]
MDL ........ method detection limit [19]
MIBK ....... methyl isobutyl ketone [14]
MQC  	minimum quantifiable concentration [3]
MQO  	measurement quality objective [1]
MS	matrix spike [8]
MSD	matrix spike duplicate [8]
MVRM	method validation reference material [5]

NELAC  	National Environmental Laboratory Accreditation Conference [5]
NESHAP	National Emission Standards for Hazardous Air Pollutants [12]
NIST	National Institute of Standards and Technology [1]
NRC	U.S. Nuclear Regulatory Commission [1]
NRIP  	NIST Radiochemistry Intercomparison Program [18]
NTA or NTTA  nitrilotriacetate [14]
NTU	nephelometric turbidity units [10]
NVLAP  	National Voluntary Laboratory Accreditation Program (NIST) [5]

OA	observational approach [A]
OFHC	oxygen-free high-conductivity [15]
OFPP	Office of Federal Procurement Policy [E]

PARCC  	precision, accuracy, representativeness, completeness, and comparability [3]
PCB  	polychlorinated biphenyl [20]
PDF  	probability density function [19]
PE	  performance evaluation  [5]
PFA  	perfluoroalcoholoxil™ [13]
PIC	pressurized ionization chamber [15]
PT	performance testing [5]
PTFE  	polytetrafluoroethylene  [12]
PUREX  	plutonium uranium reduction extraction [14]
PVC	polyvinyl chloride [10]

QA	quality assurance [2]
QAP	  Quality Assessment Program (DOE) [5]
QAPP	quality assurance project plan [1]
QC  	quality control [1]

RCRA	Resource Conservation and Recovery Act [15]
REE  	rare earth elements [13]
REGe	reverse-electrode germanium [semiconductor] [15]
RFP  	request for proposals [5]
RFQ	request for quotations [E]
RMDC	required minimum detectable concentration [8]
ROI	regions of interest [17]
RPD	relative percent difference [7]
RPM	Remedial Project Manager [2]
RSD	relative standard deviation [19]
RSO	Radiation Safety Officer [11]

SA 	spike activity [7]
SAFER	streamlined approach for environmental restoration (DOE) [2]
SAM	Site Assessment Manager [2]
SAP 	sampling and analysis plan [1]
SI  	international system of units [3]
SMO	sample management office [2]
SOP 	standard operating procedure [4]
SOW	Statement of Work [1]
SQC	statistical quality control [15]
SR	unspiked sample result [7]
SRM	standard reference material [18]
SSR 	spiked sample result [7]

TAT	turnaround time [7]
TBP 	tributyl phosphate [14]
TC 	to contain [glassware] [18]
TCLP	toxicity characteristic leaching procedure [13]
TD 	to deliver [glassware] [18]
TEC 	technical evaluation committee [5]
TEDE	total effective dose equivalent [2]
TES 	technical evaluation sheet (USGS) [5]
TFM ........ tetrafluorometoxil [13]
TIOA 	tri-iso-octylamine [14]
TLD	thermoluminescent dosimeter [10]
TOPO ....... trioctylphosphine oxide [14]
TPO 	Technical Project Officer [2]
TPP	technical project planning [2]
TPU 	total propagated uncertainty  [19]
TQM	Total Quality Management [A]
TRUEX ...... transuranium extraction [14]
TSCA	Toxic Substances Control Act [20]
TSDF	treatment, storage, or disposal facility [20]
TTA	thenoyltrifluoroacetone [14]

UBGR  	upper bound of the gray region [7]
UCL	upper control limit [18]
USGS	United States Geological Survey [1]
UWL  	upper warning limit [18]

V	volts [15]

WCP	waste certification plan [20]

XtGe	extended-range germanium [semiconductor] [15]

                         1  INTRODUCTION TO MARLAP
 2     1.1    Overview

 3     Each year, hundreds of millions of dollars are spent on projects and programs that rely, to varying
 4     degrees, on radioanalytical data for decision-making. These decisions often have a significant
 5     impact on human health and the environment. Of critical importance to informed decision-
 6     making are data of known quality appropriate for its intended use. Making incorrect decisions
 7     due to data inadequacies, such as failing to remediate a radioactively contaminated site,
 8     necessitates the expenditure of additional resources, causes delays in project completions and,
 9     depending on the nature of the project, can result in the loss of public trust and confidence. The
10     Multi-Agency Radiological Laboratory Analytical Protocols (MARLAP) Manual addresses the
11     need for a nationally consistent approach to producing radioanalytical laboratory data that meet a
12     project's or program's data requirements. MARLAP provides guidance for the planning,
13     implementation,  and assessment phases of those projects that require the laboratory analysis of
14     radionuclides. The guidance provided by MARLAP is both scientifically rigorous and flexible
15     enough to be applied to a diversity of projects and programs. This guidance is intended for
16     project planners, managers, and laboratory personnel.

17     MARLAP is divided into two main parts. Part I is primarily for project planners and managers
18     and provides guidance on project planning with emphasis on analytical planning issues and
19     analytical data requirements. Part I also provides guidance on preparing project plan documents
20     and radioanalytical statements of work (SOWs), obtaining and evaluating radioanalytical
21     laboratory services, data validation, and data quality assessment. Part I of MARLAP covers the
22     entire life of a project that requires the laboratory analysis of radionuclides from the initial
23     project planning  phase to the assessment phase.

24     Part II of MARLAP is primarily for laboratory personnel and provides guidance in the relevant
25     areas of radioanalytical laboratory work. Part II offers information on the laboratory analysis of
26     radionuclides. It  provides guidance on a variety of activities performed at radioanalytical
27     laboratories including sample preparation, sample dissolution, chemical separations, instrument
28     measurements, data reduction, etc. Note that Part II of the manual is not a compilation of
29     analytical procedures. While the chapters in Part II do not contain detailed step-by-step
30     instructions of how to perform certain laboratory tasks, they do provide information on many of
31     the options available for these tasks, and discuss  advantages  and disadvantages of each.

32     MARLAP was developed collaboratively by the  following Federal agencies: the Environmental
33     Protection Agency (EPA), the Department of Energy (DOE), the Nuclear Regulatory
34     Commission (NRC), the Department of Defense (DOD), the National Institute of Standards and
35     Technology (NIST), the United States Geological Survey (USGS), and the Food and Drug .
36     Administration (FDA). State participation in the development of MARLAP involved
37     contributions from representatives from the Commonwealth of Kentucky and the State of
38     California.

39     1.2    Purpose of the Manual

40     MARLAP's basic goal is to provide guidance and a framework for project planners, managers,
41     and laboratory personnel to ensure that radioanalytical laboratory data will meet a project's or
42     program's data requirements and needs. To attain this goal, MARLAP provides the necessary
43     guidance for national consistency in radioanalytical work in the form of a performance-based
44     approach for meeting a project's data requirements. In general terms, a performance-based
45     approach to laboratory analytical work involves clearly defining the analytical data needs and
46     requirements of a project in terms of measurable goals during the planning phase of a project.
47     These project-specific analytical data needs and requirements then serve as measurement
48     performance criteria for decisions as to exactly how the laboratory analysis will be conducted
49     during the implementation phase of a project. They are used subsequently as criteria for
50     evaluating analytical data during the assessment phase. Therefore, through a performance-based
51     approach, MARLAP provides guidance in the planning, implementation and assessment phases
52     for those projects that require the laboratory analysis of radionuclides. The manual focuses on
53     activities performed at radioanalytical laboratories, as well as activities and issues that direct,
54     affect, or can be used to evaluate activities performed at radioanalytical laboratories. The
55     guidance in MARLAP is intended to help ensure the generation of radioanalytical data of known
56     quality appropriate for its intended use.

57     Specific objectives of MARLAP include:

58     •  Promoting a directed planning process for projects involving individuals from relevant
59        disciplines including radiochemistry;

60     •  Highlighting common radioanalytical planning issues;

61     •  Providing a framework and information resource for using a performance-based approach for
62        planning and conducting radioanalytical work;

63     •  Providing guidance on linking project planning, implementation, and assessment;

64"    •  Providing guidance on obtaining and evaluating radioanalytical laboratory services;

65     •  Providing guidance for evaluating radioanalytical laboratory data, i.e., data verification, data
66        validation, and data quality assessment;

67     •  Promoting high quality radioanalytical laboratory work; and

68     •  Making collective knowledge and experience in radioanalytical work widely available.

69     As indicated by the list of objectives, MARLAP provides guidance to project planners, managers,
70     and laboratory personnel for a range of activities for those projects and programs that require the
71     laboratory analysis of radionuclides.

72     1.3    Use and Scope of the Manual

73     The guidance contained in MARLAP is for both governmental and private sectors. Users of
74     MARLAP include project planners, project managers, laboratory personnel, regulators, auditors,
75     inspectors, data evaluators, decision makers, and other end users of radioanalytical laboratory
76     data.

77     Since MARLAP uses a performance-based approach to laboratory measurements, the guidance
78     contained in the manual is applicable to a wide range of projects and activities that require
79     radioanalytical laboratory measurements. Examples of data collection activities that MARLAP
80     supports include:

81     •  Site characterization activities;
82     •  Site cleanup and compliance demonstration activities;
83     •  License termination activities;
84     •  Decommissioning of nuclear facilities;
85     •  Remedial and removal actions;
86     •  Effluent monitoring of licensed facilities;
87     •  Environmental site monitoring;
88     •  Background studies;
89     •  Routine ambient monitoring; and
90     •  Waste management activities.

91     MARLAP and the Multi-Agency Radiation Survey and Site Investigation Manual (MARSSIM,
92     2000) are complementary guidance documents in support of cleanup and decommissioning
93     activities. MARSSIM provides guidance on how to plan and carry out a study to demonstrate that
 94     a site meets appropriate release criteria. It describes a methodology for planning, conducting,
 95     evaluating, and documenting environmental radiation surveys conducted to demonstrate
 96     compliance with cleanup criteria. MARLAP provides guidance and a framework for both project
 97     planners and laboratory personnel to ensure that radioanalytical data will meet the needs and
 98     requirements of cleanup and decommissioning activities.

 99     While MARLAP is designed to support a wide range of projects, some topics are not specifically
100     discussed in the manual. These include high-level waste, mixed waste, and medical applications
101     involving radionuclides. While they are not specifically addressed, much of MARLAP's
102     guidance may be applicable in these areas. Although the focus of the manual is to provide
103     guidance for the planning, implementation, and assessment phases of those projects that require
104     the laboratory analysis of radionuclides, much of the guidance on the planning and assessment
105     phases can be applied wherever the measurement process is conducted, for example, in the field.
106     In addition, MARLAP does not provide specific guidance on sampling design issues, sample
107     collection, field measurements, laboratory quality assurance issues, or laboratory health and
108     safety practices. However, a brief discussion of some aspects of these activities has been included
109     in the manual because of the effect these activities often have on the laboratory analytical
110     process.

111     1.4   Key MARLAP Concepts and Terminology

112     Some of the terms used in MARLAP were developed for the purpose of this manual, while
113     others are commonly used terms that have been adopted by MARLAP. Where possible, every
114     effort has been made to use terms and definitions from consensus-based organizations (e.g.,
115     International Organization for Standardization [ISO], American National Standards Institute
116     [ANSI], American Society for Testing and Materials [ASTM], International Union of Pure and
117     Applied Chemistry [IUPAC]).

118     The following sections are intended to familiarize the reader with the key terms and concepts
119     used in MARLAP. In general, each term or concept is discussed individually in each section
120     without emphasizing how these terms and concepts are linked. Section 1.5 ties these terms and
121     concepts together to provide an overview of the MARLAP process.

122     1.4.1  Data Life Cycle

123     The data life cycle (EPA, 2000) approach provides a structured means of considering the major
124     phases of projects that involve data collection activities (Figure 1.1). The three phases of the data
125     life cycle are planning, implementation, and assessment. MARLAP provides information on all
126     three phases for two major types of
127     activities: those performed at radioanaly-
128     tical laboratories and those that direct,
129     affect, or evaluate activities performed at
130     radioanalytical laboratories (such as
131     project planning, development of plan
132     documents, data verification and data
133     validation). Consequently, MARLAP
134     provides guidance for project planners,
135     managers, and laboratory personnel.

136     One of the specific objectives of the
137     MARLAP Manual is to provide
138     guidance on, and to emphasize the
139     importance of, establishing the proper
140     linkages among the three phases of the
141     data life cycle—planning, implemen-
142     tation and assessment—thereby resulting
143     in an integrated and iterative process that
144     accurately translates the expectations
145     and requirements of data users into
146     measurement performance criteria for data suppliers. From an analytical perspective, the
147     integration of the three phases of the data life cycle is critical to ensure that the analytical data
148     requirements defined during the planning phase serve as measurement performance criteria
149     during the implementation phase and subsequently as criteria for data evaluation during the
150     assessment phase. The proper linkages and integration of the three phases of the data life cycle
151     should be established during the planning phase. Without the proper linkages and integration of
152     the three phases, there is a significant likelihood that the analytical data will not meet a project's
153     data requirements, and the data may be evaluated using criteria that have little relation to their
154     intended use. Therefore, failure to integrate and adequately link the three phases of the data life
155     cycle increases the likelihood of project cost escalation or project failure.

156     1.4.2  Directed Planning Process

157     MARLAP recommends the use of a directed or systematic planning process. A directed planning
158     process is an approach for setting well-defined, achievable objectives and developing a cost-
159     effective, technically sound sampling and analysis design that balances the data user's tolerance
160     for uncertainty in the decision process with the resources available for obtaining data to support a
        [FIGURE 1.1 — The Data Life Cycle. The figure pairs each process of the data life
        cycle with its outputs: the directed planning process yields data quality objectives
        and measurement quality objectives (including the optimized sampling and analytical
        design); plan documents include the quality assurance project plan (QAPP), work plan
        or sampling and analysis plan (SAP), data validation plan, and data quality assessment
        plan; contracting services yield the statement of work (SOW) and other contractual
        documents; sampling yields laboratory samples; laboratory analysis (including QC
        samples) yields the complete data package; verification, validation, and data quality
        assessment yield verified data and the data verification report, validated data and
        the data validation report, and the assessment report, ending in data of known quality
        appropriate for the intended use.]
161      decision. While MARLAP recommends and promotes the use of a directed planning process, it
162      does not recommend or endorse any particular directed planning process. However, MARLAP
163      employs many of the terms and concepts associated with the data quality objective (DQO)
164     process (ASTM D5792; EPA, 2000). This was done to ensure consistent terminology throughout
165      the manual, and also because many of the terms and concepts of this process are familiar to those
166      engaged in environmental data collection activities.

167      1.4.3   Performance-Based Approach

168      MARLAP provides the necessary guidance for using a performance-based approach to meet a
169      project's analytical data requirements. In a performance-based approach, the project-specific
170      analytical data requirements that are determined during directed planning serve as measurement
171      performance criteria for analytical selections and decisions. The project-specific analytical data
172      requirements also are used for the initial, ongoing, and final evaluation of the laboratory's
173      performance and the laboratory's data. MARLAP provides guidance for using a performance-
174      based approach for all three phases—planning, implementation and assessment—of the data life
175      cycle for those projects that require radioanalytical laboratory data. This involves not only using a
176      performance-based approach for selecting an analytical protocol, but also using a performance-
177      based approach for other project activities, such as developing acceptance criteria for laboratory
178      quality control samples, laboratory evaluations, data verification, data validation, and data quality
179      assessment.

180      There are three major steps or processes associated with a performance-based approach. The first
181      is clearly and accurately defining the analytical data requirements for the project. This process is
182      discussed in more detail in Section 1.4.9 of this chapter. The second involves using an organized,
183      interactive process for selecting or developing analytical protocols to meet the specified
184      analytical data requirements and for demonstrating the protocol's ability to meet the analytical
185      data requirements. The last major activity involves using the analytical data requirements as
186      measurement performance criteria for the ongoing and final evaluation of the laboratory data,
187      which would include data verification, data validation, and data quality assessment. MARLAP
188      provides guidance in all three of these areas. Within the constraints of other factors, such as cost,
189      a performance-based approach allows for the use of any analytical protocol that meets the
190      project's analytical data requirements. For all relevant project activities, the common theme of a
191      performance-based approach is the use of project-specific analytical data requirements that are
192      developed during project planning and serve as measurement performance criteria for selections,
193      evaluations, and decision-making.
194     1.4.4  Analytical Process

195     Most environmental data collection
196     efforts center around two major
197     processes: the sampling process and
198     the analytical process. MARLAP
199     does not provide general guidance
200     on the sampling process, except for
201     brief discussions of certain activities
202     that often affect the analytical
203     process (field processing,
204     preservation, etc.). The analytical (or
205     measurement) process is a general
206     term used by MARLAP to refer to a
207     compilation of activities starting
208     from the time a sample is collected
209     and ending with the reporting of
210     data. These activities typically
211     include field sample preparation and
212     preservation, sample receipt and
213     inspection, laboratory sample
214     preparation, sample dissolution,
215     chemical separations, preparation of
216     samples for instrument measure-
217     ments, instrument measurements,
218     data reduction, data reporting, and
219     quality control of the process. Figure
220     1.2 illustrates the major components
221     of an analytical process. It should be noted that a particular analytical process for a project may
222     not include all of the activities listed. For example, if a project involves the analysis of tritium in
223     drinking water, then the analytical process for the project will not include sample dissolution and
224     the chemical separation of the radionuclide of concern. It is important to identify the relevant
225     activities of the analytical process for a particular project early in the planning phase. Once the
226     activities have been identified, the analytical requirements of the activities can be established,
227     which will ultimately lead to defining how the activities will be accomplished through the
228     selection or development of written procedures for the various activities.

        [FIGURE 1.2 — Typical Components of an Analytical Process: field sample preparation
        and preservation; sample receipt and inspection; laboratory sample preparation; sample
        dissolution; chemical separation of radionuclides of concern; preparation of samples
        for instrument measurements; instrument measurements; data reduction and reporting.]
229     The analytical process should not be confused with the written procedures necessary to perform
230     the associated activities of the analytical process. The analytical process (i.e., the compilation of
231     activities starting from the time a sample is collected and ending with the reporting of the data)
232     should be performed according to written procedures.

233     1.4.5  Analytical Protocol

234     MARLAP uses the term "analytical protocol" to refer to a compilation of specific procedures and
235     methods that are performed in succession for a particular analytical process. For example, a
236     protocol for the analysis of drinking water samples for tritium would be comprised of the set of
237     procedures that describe the relevant activities, such as sample tracking, quality control, field
238     sample preparation and preservation, sample receipt and inspection, laboratory sample prepara-
239     tion (if necessary), preparing the samples for counting, counting the samples, and data reduction
240     and reporting. A written procedure may cover one or more of the activities, but it is unlikely that
241     a single procedure will cover all of the activities of a given analytical process. It should be noted
242     that with a performance-based approach, there may be a number of alternative protocols that
243     might be appropriate analytical protocols for a particular analytical process. Selecting or develop-
244     ing an analytical protocol requires knowledge of the particular analytical process, as well as an
245     understanding of the analytical data requirements developed during the project planning phase.
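        The way a protocol bundles individual written procedures can be pictured with a small
        data-structure sketch. The Python below is illustrative only; the class name, activity
        names, and procedure identifiers are hypothetical, and the example follows the tritium-
        in-drinking-water case described above.

            from dataclasses import dataclass, field

            @dataclass
            class AnalyticalProtocol:
                """An ordered set of written procedures covering one analytical process."""
                analyte: str
                matrix: str
                procedures: list = field(default_factory=list)  # (activity, procedure ID)

            tritium_in_water = AnalyticalProtocol(
                analyte="tritium",
                matrix="drinking water",
                procedures=[
                    ("sample tracking and quality control", "SOP-QA-01"),  # IDs are hypothetical
                    ("field sample preparation and preservation", "SOP-FP-03"),
                    ("sample receipt and inspection", "SOP-SR-02"),
                    ("preparation of samples for counting", "SOP-LS-10"),
                    ("liquid scintillation counting", "SOP-LS-11"),
                    ("data reduction and reporting", "SOP-DR-05"),
                ],
            )
            # No sample-dissolution or chemical-separation entries appear because, as noted
            # above, the tritium-in-drinking-water process does not include those activities.

        Under a performance-based approach, more than one such compilation of procedures may
        satisfy the same analytical data requirements; the structure above simply makes the
        composition of a protocol explicit.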

246     1.4.6  Analytical Method

247     A major component of an analytical protocol is the analytical method, which normally includes
248     written procedures for sample digestion, chemical separation (if required) and counting. It is
249     recognized that in many instances the  analytical method may cover many of the activities of a
250     particular analytical process. Therefore, attention is naturally focused on the selection or
251     development of an analytical method.  However, many analytical methods do not address
252     activities such as field preparation and preservation, certain aspects of laboratory preparation,
253     laboratory subsampling, etc., which are often important activities within an analytical process.
254     The analytical protocol is generally  more inclusive of the activities that make up the analytical
255     process than the analytical method. For this reason, MARLAP focuses on the selection,
256     implementation, and assessment of analytical protocols that cover the entire analytical process
257     for a particular project or program.

258     1.4.7  Uncertainty and Error

259     An important aspect of sampling and measurement is uncertainty. The term "uncertainty" has
260     different shades of meaning in different contexts, but the word generally refers to a lack


261      of complete knowledge about something of interest. In the context of metrology (the science of
262      measurement), the more specific term "measurement uncertainty" often will be used. "Uncertain-
263      ty (of measurement)" is defined in the Guide to the Expression of Uncertainty in Measurement
264      (ISO 1995—"GUM") as a "parameter, associated with the result of a measurement, that charac-
265      terizes the dispersion of values that could reasonably be attributed to the measurand." The
266      "measurand" is the quantity being measured. MARLAP recommends the terminology and
267      methods of GUM for describing, evaluating, and reporting measurement uncertainty. The
268      uncertainty of a measured value is typically expressed as an estimated standard deviation, called
269      a "standard uncertainty" (or "one-sigma uncertainty"). The standard uncertainty of a calculated
270      result usually is obtained by propagating the standard uncertainties of a number of other
271      measured values, and in this case, the standard uncertainty is called a "combined standard
272      uncertainty." The combined standard uncertainty may be multiplied by a specified factor called a
273      "coverage factor" (e.g., 2 or 3) to obtain an "expanded uncertainty" (a "two-sigma" or "three-
274      sigma" uncertainty), which describes an interval about the result that can be expected to contain
275      the true value with a specified high probability. MARLAP recommends that either the combined
276     standard uncertainty or an expanded uncertainty be reported with every result. Chapter 19
277      discusses the terminology, notation, and methods of GUM in more detail and provides guidance
278      for applying the concepts to radioanalytical measurements.
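
For a result y = f(x_1, ..., x_N) calculated from uncorrelated input estimates, the GUM
relationships summarized above can be written compactly as follows (a standard textbook form,
shown here only as an illustration; the notation follows the GUM):

        u_c(y) = \sqrt{ \sum_{i=1}^{N} \left( \frac{\partial f}{\partial x_i} \right)^{2} u^{2}(x_i) },
        \qquad U = k \, u_c(y)

where u(x_i) is the standard uncertainty of input estimate x_i, u_c(y) is the combined standard
uncertainty of the result, k is the coverage factor (e.g., 2 or 3), and U is the expanded
uncertainty.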

279      While measurement uncertainty is a parameter associated with an individual result and is
280      calculated after a measurement is performed, MARLAP uses the term "method uncertainty" to
281      refer to the predicted uncertainty of a measured value that likely would result from the analysis of
282      a sample at a specified analyte concentration. Method uncertainty is a method performance
283      characteristic much like the detection capability of a method.  Reasonable values for both
284     characteristics can be predicted for a particular method based on typical values for certain
285      parameters and on information and assumptions about the samples to be analyzed. These
286      predicted values can be used in the method selection process to  identify the most appropriate
287      method based on a project's data requirements. Chapter 3 provides MARLAP's recommenda-
288     tions for deriving analytical protocol selection criteria based on the required method uncertainty
289      and other analytical requirements.
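
To make the idea of a predicted method uncertainty concrete, the sketch below propagates
Poisson counting statistics and calibration uncertainties through a simple one-measurement
counting model. The model and every parameter value (efficiency, chemical yield, sample mass,
count time, background) are hypothetical illustrations only, not MARLAP requirements; the
underlying statistics are treated in Chapter 19.

        import math

        def predicted_method_uncertainty(conc, eff=0.25, chem_yield=0.85, mass_kg=0.5,
                                         count_time_s=60000, bkg_rate_cps=0.002,
                                         u_rel_eff=0.03, u_rel_yield=0.05):
            """Predict the combined standard uncertainty (Bq/kg) of a measured
            activity concentration at true concentration `conc` (Bq/kg)."""
            sens = eff * chem_yield * mass_kg        # net cps per (Bq/kg) of sample
            gross_rate = conc * sens + bkg_rate_cps  # expected gross count rate (cps)
            # Poisson variance of the net count rate (sample and background
            # counted for the same live time):
            var_net_rate = (gross_rate + bkg_rate_cps) / count_time_s
            u_counting = math.sqrt(var_net_rate) / sens
            # Combine with relative calibration uncertainties in quadrature:
            return math.sqrt(u_counting**2 + (conc * u_rel_eff)**2
                             + (conc * u_rel_yield)**2)

        print(predicted_method_uncertainty(37.0))  # predicted u at 37 Bq/kg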

290      When a decision maker bases a decision on the results of measurements, the measurement
291      uncertainties affect the probability of making a wrong decision.  When sampling is involved,
292      sampling statistics also contribute to the probability of a wrong decision. Since decision errors
293      are possible, there is uncertainty in the decision-making process. MARLAP uses the terms
294      "decision uncertainty" or "uncertainty of the decision" to refer to this type of uncertainty.
295      Decision uncertainty is usually measured by the estimated probability of a decision error under
361     objectives as "measurement quality objectives" (MQOs). The MARLAP Manual provides
362     guidance on developing the MQOs from the overall project DQOs (Chapter 3). MQOs can be
363     viewed as the analytical portion of the DQOs and are therefore project-specific. MARLAP
364     provides guidance on developing MQOs during project planning for select method performance
365     characteristics, such as method uncertainty at a specified concentration; detection capability;
366     quantification capability; specificity, or the capability of the method to measure the analyte of
367     concern in the presence of interferences; range; ruggedness, etc. An MQO is a statement of a
368     performance objective or requirement for a particular method performance characteristic. Like
369     DQOs, MQOs can be quantitative and qualitative statements. An example of a quantitative  MQO
370     would be a statement of a required method uncertainty at a specified radionuclide concentration,
371     such as the action level—i.e., "a method uncertainty of 3.7 Bq/kg (0.10 pCi/g) or less is required
372     at the action level of 37 Bq/kg (1.0 pCi/g)." An example of a qualitative MQO would be a
373     statement of the required specificity of the analytical protocol—the ability to analyze for the
374     radionuclide of concern given the presence of interferences—i.e., "the protocol must be able to
375     quantify the amount of 226Ra present given high levels of 235U in the samples."
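
As an arithmetic check on the quantitative example above, using the exact relation
1 pCi = 0.037 Bq:

        1.0~\mathrm{pCi/g} \times 0.037~\mathrm{Bq/pCi} \times 1000~\mathrm{g/kg} = 37~\mathrm{Bq/kg},
        \qquad \frac{3.7~\mathrm{Bq/kg}}{37~\mathrm{Bq/kg}} = 0.10

so the stated values are consistent, and the example MQO is equivalent to a 10 percent relative
method uncertainty at the action level.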

376     The MQOs serve as measurement performance criteria for the selection or development of
377     analytical protocols and for the initial evaluation of the analytical protocols. Once the analytical
378     protocols have been selected and evaluated, the MQOs serve as criteria for the ongoing and final
379     evaluation of the laboratory data, including data verification, data validation, and data quality
380     assessment. In a performance-based approach, analytical protocols are either selected or rejected
381     for a particular project, to a large measure, based on their ability or inability to achieve the stated
382     MQOs. Once selected, the performance of the analytical protocols is evaluated using the project-
383     specific MQOs.

384     1.4.10 Analytical Protocol Specifications

385     MARLAP uses the term "analytical protocol specifications" (APSs) to refer to the output of a
386     directed planning process that contains the project's analytical data requirements in an organized
387     concise form. In general, there will be an APS developed for each analysis type, and since most
388     projects require that a number of different analyses be performed, several APSs will normally be
389     developed for a particular project. These specifications serve as the basis for the evaluation and
390     selection of the analytical protocols that will be used for a particular project. In accordance  with a
391     performance-based approach, the APSs contain only the minimum level of specificity required
392     to meet the project's analytical data requirements without dictating exactly how the requirements
393     are to be met. At a minimum, the APSs should indicate the analyte of interest, the matrix of
394     concern, the type and frequency of quality control (QC)  samples, and provide the required MQOs
395     and any specific analytical process requirements, such as chain-of-custody for sample tracking.
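
The sketch below shows how the minimum APS content listed above might be captured as a
simple structured record. The field names and example values are purely illustrative and are not
a MARLAP format; MARLAP's own one-page APS form is shown in Chapter 3, Figure 3.2.

        from dataclasses import dataclass, field

        @dataclass
        class AnalyticalProtocolSpecification:
            """Illustrative record of minimum APS content (hypothetical fields)."""
            analyte: str                          # radionuclide of interest, e.g., "226Ra"
            matrix: str                           # matrix of concern, e.g., "soil"
            action_level_bq_per_kg: float         # project action level
            required_method_u_bq_per_kg: float    # MQO: method uncertainty at the action level
            qc_samples: dict = field(default_factory=dict)        # QC type -> frequency
            process_requirements: list = field(default_factory=list)

        aps = AnalyticalProtocolSpecification(
            analyte="226Ra",
            matrix="soil",
            action_level_bq_per_kg=37.0,
            required_method_u_bq_per_kg=3.7,
            qc_samples={"duplicate": "1 per 20 samples", "blank": "1 per batch"},
            process_requirements=["chain-of-custody sample tracking"],
        )
        print(aps.analyte, aps.required_method_u_bq_per_kg)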

396     Depending on the particular project, a number of specific analytical process requirements may be
397     included. For example, if project or process knowledge indicates that the radionuclide of interest
398     exists in a refractory form, then the APSs may require a fusion step for sample digestion.
399     However, the level of specificity in the APSs should be limited to those requirements that are
400     considered essential to meeting the project's analytical data requirements. In most instances, a
401     particular APS document would be a one-page form (see Chapter 3, Figure 3.2). For a particular
402     project, APSs would be developed for each analysis required.

403     Within the constraints of other factors, such as cost, MARLAP's performance-based approach
404     allows the use of any analytical protocol that meets the requirements contained in the APSs. The
405     requirements in the APSs, in particular the MQOs, are used for the selection and evaluation of
406     the analytical protocols. Once the analytical protocols have been selected and evaluated, the
407     APSs then serve as criteria for the ongoing and final evaluation of the laboratory data, including
408     data verification, data validation, and data quality assessment.

409     1.4.11 The Assessment Phase

410     As noted, the MARLAP Manual provides guidance for the assessment phase of those projects
411     that require the laboratory analysis of radionuclides. The guidance on the assessment phase of
412     projects focuses on three major activities: data verification, data validation, and data quality
413     assessment.

414     Data verification assures that laboratory conditions and operations were compliant with the
415     statement of work and any appropriate project plan documents (e.g., Quality Assurance Project
416     Plan), which may reference laboratory documents such as laboratory standard operating
417     procedures. Verification compares the material delivered by the laboratory to these requirements
418     (compliance) and checks for consistency and comparability of the data throughout the data
419     package, correctness of calculations, and completeness of the results to ensure that all necessary
420     documentation is available. The verification process produces a report identifying which
421     requirements are not met. The verification report is used to determine payment for laboratory
422     services and to identify problems that should be investigated during data validation. Verification
423     works iteratively and interactively with the generator (i.e., laboratory) to assure receipt of all
424     available, necessary data. Although the verification process identifies specific problems, its
425     primary function should be to provide timely feedback that results in corrective action,
426     improving the analytical services before the work is completed.
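
One small piece of verification, completeness and consistency checking, is sketched
schematically below. The data-package structure and the required items are hypothetical; an
actual verification checklist would be derived from the SOW and the project plan documents.

        # Hypothetical data package represented as a dictionary.
        REQUIRED_ITEMS = ("chain_of_custody", "results", "uncertainties",
                          "qc_results", "calibration_records")

        def verify_package(package: dict) -> list:
            """Return deficiency notes for the verification report (empty if compliant)."""
            notes = [f"missing item: {item}"
                     for item in REQUIRED_ITEMS if item not in package]
            # Consistency check: every reported result must carry an uncertainty.
            results = package.get("results", {})
            uncertainties = package.get("uncertainties", {})
            notes += [f"no uncertainty reported for sample {sample_id}"
                      for sample_id in results if sample_id not in uncertainties]
            return notes

        print(verify_package({"results": {"S-01": 35.2}, "uncertainties": {}}))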

427     Validation addresses the reliability of the data. The validation process begins with a review of the
428     verification report and laboratory data package to screen the areas of strength and weakness of


429     the data set. The validator evaluates the data to determine the presence or absence of an analyte
430     and the uncertainty of the measurement process for contaminants of concern. During validation,
431     the technical reliability and the degree of confidence in reported analytical data are considered.
432     Validation "flags" (i.e., qualifiers) are applied to data that do not meet the acceptance criteria
433     established to assure data meet the needs of the project. The product of the validation process is a
434     validation report noting all data sufficiently inconsistent with the validation acceptance criteria in
435     the expert opinion of the validator. The appropriate data validation tests should be established
436     during the project planning phase.
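
The mechanical form of applying validation flags against a planning-phase acceptance criterion
is sketched below. The single criterion and the flag letters are hypothetical illustrations only;
MARLAP's validation tests and qualifiers are discussed in Chapter 8.

        def validate_result(u_bq_per_kg: float, required_u_bq_per_kg: float) -> str:
            """Return a qualifier ("flag") for one result; flag letters are illustrative."""
            if u_bq_per_kg > 2 * required_u_bq_per_kg:
                return "R"   # rejected: uncertainty grossly exceeds the requirement
            if u_bq_per_kg > required_u_bq_per_kg:
                return "Q"   # qualified: usable with caution; fails the criterion
            return ""        # no flag: meets the acceptance criterion

        print(validate_result(3.1, required_u_bq_per_kg=3.7))  # "" — acceptable
        print(validate_result(5.9, required_u_bq_per_kg=3.7))  # "Q"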

437     Data quality assessment (DQA), the third and final step of the assessment phase, is defined as the
438     "scientific and statistical evaluation of data to determine if data are of the right type, quality, and
439     quantity to support their intended use." DQA is more global in its purview than the previous
440     verification and validation steps. DQA, in addition to reviewing the issues raised during verifica-
441     tion and validation, may be the first opportunity to review other issues, such as field activities
442     and their impact on data quality and usability. DQA should consider the combined impact of all
443     project activities in making a data usability determination, which is documented in a DQA report.

444     1.5   The MARLAP Process

445     An overarching objective of the MARLAP Manual is to provide a framework and information
446     for the selection, development, and evaluation of analytical protocols and the resulting laboratory
447     data. The MARLAP process is a performance-based approach that develops APSs and uses these
448     requirements as criteria for the analytical protocol selection, development and evaluation
449     processes, and for the evaluation of the resulting laboratory data. This process, which spans the
450     three phases of the data life cycle for a project—planning, implementation and assessment—is
451     the basis for achieving MARLAP's basic goal of ensuring that radioanalytical data will meet a
452     project's data requirements. A brief overview of this process, which is referred to as the
453     MARLAP process and is the focus of Part I of the manual, is provided below.

454     The MARLAP process starts with a directed planning process. Within a directed planning
455     process, key analytical issues based on the project's particular analytical processes are discussed
456     and resolved. The resolution of these key analytical issues produces the APSs, which include the
457     MQOs. The APSs are documented in project plan documents (e.g., Quality Assurance Project
458     Plans, Sampling and Analysis Plans). A SOW is  then developed that contains the APSs. The
459     laboratories receiving the SOW respond with proposed analytical protocols based on the require-
460     ments of the APSs and provide evidence that the proposed protocols meet the performance
461     criteria in the APSs. The proposed analytical protocols  are initially evaluated by the project
462     manager or designee to determine if they will meet the requirements in the APSs. If the proposed

463     analytical protocols are accepted, the project plan documents are updated by the inclusion or
464     referencing of the actual analytical protocols to be used. During analyses, resulting sample and
465     QC data will be evaluated primarily using MQOs from the respective APSs. Once the analyses
466     are completed, an evaluation of the data will be conducted, including data verification, data
467     validation, and data quality assessment with the respective MQOs serving as criteria for
468     evaluation. The role of the APSs (particularly the MQOs, which make up an essential part of the
469     APSs) in the selection, development, and evaluation of the analytical protocols and the laboratory
470     data is to provide a critical link between the three phases of the data life cycle of a project. This
471     linkage helps to ensure that radioanalytical laboratory data will meet a project's data require-
472     ments, and that the data are of known quality appropriate for their intended use. The MARLAP
473     process is illustrated in Figure 1.3. Although the diagram used to represent the MARLAP Process
474     is presented in a linear fashion, it is important to note that the process is an iterative one, and
475     there can be many  variations on this stylized diagram.

476     1.6    Structure of the Manual

477     MARLAP is divided into two main parts. Part I provides guidance on implementing the
478     MARLAP process as described in Section 1.5. This part of the manual focuses on the sequence
479     of steps involved when using a performance-based approach for projects requiring radioanalytical
480     laboratory work, starting with a directed planning process and ending with DQA. Part I provides
481     the overall guidance for using a performance-based approach for all three phases of a project. A
482     more detailed overview of Part I is provided in Section 1.6.1.
484     Part II of the manual provides information on the laboratory analysis of radionuclides to support
485     a performance-based approach. Part II provides guidance and information on the various
486     activities performed at radioanalytical laboratories, such as sample preparation, sample
487     dissolution, chemical separations, preparing sources for counting, nuclear counting, etc. Using
488     the overall framework provided in Part I, the material in Part II can be used to assist project
489     planners, managers, and laboratory personnel in the selection, development, evaluation, and
490     implementation of analytical protocols for a particular project or program. A more detailed
491     overview of Part II is provided in Section 1.6.2. In addition to Part I and Part II, MARLAP has
492     several appendices that support both Part I and Part II of the manual. An overview of the
493     appendices is provided in Section 1.6.3 of this chapter.

494     Because of the structure and size of the manual, most individuals will naturally focus on those
495     chapters that provide guidance in areas directly related to their work. Therefore, to help ensure
496     that key concepts are conveyed to the readers, some material is repeated, often in very
497     similar or even the same language, throughout the manual.

        [Figure 1.3 is a flowchart of the MARLAP process, organized by the three phases of the
        data life cycle. Planning Phase: Directed Planning Process (Chapter 2); key analytical
        issues and development of Analytical Protocol Specifications, including MQOs (Chapter 3);
        Develop Plan Documents That Incorporate Analytical Protocol Specifications, e.g., QAPP,
        SAP, Data Validation Plan (Chapter 4); Development of SOW, which includes the
        Analytical Protocol Specifications and MQOs (Chapter 5). Implementation Phase:
        Laboratory Responds with Analytical Protocols, selected to meet the requirements of the
        Analytical Protocol Specifications, with performance data provided (Chapter 6); Initial
        Evaluation of Analytical Protocols and Laboratory, including review of performance data,
        analysis of performance evaluation (PE) samples/certified reference materials (CRMs), and
        a quality systems audit (Chapter 7), with rejected protocols returned to the project manager
        and accepted protocols leading to updated plan documents that include or reference the
        accepted analytical protocols (Chapter 4); analysis of samples; Ongoing Evaluation of
        Laboratory Performance, including evaluation of QC and PE sample results, laboratory
        audits, and evaluation of sample-specific parameters such as yield, with corrective actions
        directed by the project planning team (Chapter 7). Assessment Phase: Data Evaluation and
        Assessment, comprising data verification (Chapter 8), data validation (Chapter 8), and data
        quality assessment (Chapter 9), yielding data of known quality for decision making.]

                                FIGURE 1.3 — The MARLAP Process
498     1.6.1  Overview of Part I

499     Part I begins with Chapter 2, Project Planning Process, which provides an overview of the
500     directed planning process and discusses important analytical outputs of the planning process.
501     Chapter 3, Key Analytical Planning Issues and Developing APSs, provides an overview of key
502     analytical planning issues that need to be addressed during a directed planning process and
503     provides guidance on developing APSs, which are outputs of the planning process. These outputs
504     are incorporated into plan documents (e.g., work plans, quality assurance project plans, sampling
505     and analysis plans), which are covered in Chapter 4, Project Plan Documents. Chapter 4 provides
506     an overview of different types of project plan documents and provides guidance on the linkage
507     between project planning and project plan  documents. Information from the plan documents is
508     then incorporated into a SOW, which is covered in Chapter 5, Obtaining Laboratory Services.
509     Chapter 5 provides guidance on developing a SOW that incorporates the APSs. Chapter 6,
510     Selection and Application of an Analytical Method,  provides guidance on selecting or developing
511     analytical protocols that will meet the MQOs and other requirements as outlined in the APSs.
512     Chapter 7, Evaluating Protocols and Laboratories,  provides guidance on the initial and ongoing
513     evaluation of analytical protocols and  also provides guidance on the overall evaluation of
514     radioanalytical laboratories. Chapter 8, Radiochemical Data Verification and Validation,
515     provides an overview of the data evaluation process, provides general guidelines for data
516     verification and validation, and provides "tools" for data validation. The last chapter of Part I,
517     Chapter 9, Data Quality Assessment, provides an overview of data quality assessment and
518     provides guidance on linking data quality assessment and the planning process.

519     Figure 1.3, the MARLAP Process, illustrates the sequence of steps that make up the framework
520     of a performance-based approach for the planning, implementation, and assessment phases of
521     projects that require the laboratory analysis of radionuclides. The primary audience for Part I is
522     project planners and managers. However, Chapter 6, Selection and Application of an Analytical
523     Method, is intended primarily for laboratory personnel. This is because, under a performance-
524     based approach, a laboratory would be able to use any analytical protocol that meets the
525     analytical requirements as defined by the APSs. Other factors, such as cost, also will play a role
526     in the selection of analytical protocols. While the primary audience for Part I is project planners
527     and managers, other groups, such as laboratory personnel, can benefit from the guidance  in Part I.

528     1.6.2  Overview of Part II

529     The chapters in Part II are intended to provide information on the laboratory analysis of
530     radionuclides. The chapters provide information on many of the options available for analytical
531     protocols, and discuss common advantages and disadvantages of each. The chapters highlight


        JULY 2001                                                                        MARLAP
        DRAFT FOR PUBLIC COMMENT               1-17                    DO NOT CITE OR QUOTE

-------
        Introduction to MARLAP
532     common analytical problems and ways to identify and correct them. The chapters also serve to
533     educate the reader by providing a detailed explanation of the typical activities performed at a
534     radioanalytical laboratory. Consistent with a performance-based approach, the chapters in Part II
535     do not contain detailed step-by-step instructions on how to perform certain laboratory tasks, such
536     as the digestion of a soil sample. The chapters do contain information and guidance intended to
537     assist primarily laboratory personnel in deciding on the best approach for a particular laboratory
538     task. For example, while the chapter on sample dissolution does not contain step-by-step
539     instructions on how to dissolve a soil sample, it does provide information on acid digestion,
540     fusion techniques, and microwave digestion, which is intended to help the reader select the most
541     appropriate technique or approach for a particular project.

542     The primary audience for Part II is laboratory personnel, and the chapters generally contain a
543     significant amount of technical information. While the primary target audience is laboratory
544     personnel, other groups, such as project planners and managers, can benefit from the guidance in
545     Part II. Listed below are the chapters that make up Part II of the manual. It should be noted that
546     Part II of the manual does not provide specific guidance for some laboratory activities that are
547     common to all laboratories, such as laboratory quality assurance and laboratory health and safety
548     practices. This is primarily because these activities are not unique to radioanalytical
549     laboratories and considerable guidance in these areas already exists.

550         Chapter 10   Requirements When Collecting, Preserving, and Shipping Samples That
551                      Require Analytical Measurement
552         Chapter 11    Sample Receipt, Inspection and Tracking
553         Chapter 12   Laboratory Sample  Preparation
554         Chapter 13    Sample Dissolution
555         Chapter 14   Separation Techniques
556         Chapter 15    Nuclear Counting Instrumentation
557         Chapter 16   Instrument Calibration and Source Preparation
558         Chapter 17   Nuclear Counting and Data Reduction and Reporting
559         Chapter 18   Laboratory Quality  Control
560         Chapter 19   Measurement Statistics
561         Chapter 20   Waste Disposal

562     Chapters 10 through 17 provide information on the typical components of an analytical process
563     in the order in which activities that make up an analytical process are normally performed. While
564     not providing step-by-step procedures for activities such as sample preservation, sample
565     digestion, nuclear counting, etc., the chapters do provide an overview of options available for the
566     various activities and importantly, provide information on the appropriateness of the assorted

567     options under a variety of conditions. Chapter 18, Laboratory Quality Control, provides
568     guidance on monitoring key laboratory performance indicators as a means of determining if a
569     laboratory's measurement processes are in control. The chapter also provides information on
570     likely causes of excursions for selected laboratory performance indicators, such as chemical
571     yield, instrument background, quality control samples, etc. Chapter 19, Measurement Statistics,
572     provides information on statistical principles and methods applicable to radioanalytical
573     measurements, calibrations, data interpretation, and quality control. Topics covered in the chapter
574     include detection and quantification, measurement uncertainty, and procedures for estimating
575     uncertainty. Chapter 20, Waste Disposal, provides an overview of many of the regulations for
576     waste disposal and provides guidance for managing wastes in a radioanalytical laboratory.

577     1.6.3  Overview of the Appendices

578     MARLAP includes several appendices to both Part I and Part II of the manual to provide
579     additional guidance on specific topics. Brief descriptions of the appendices are provided below.

580      •  Appendix A, Directed Planning Approaches, provides an overview of a number of directed
581         planning processes and discusses some common elements of the different approaches.

582      •  Appendix B, The Data Quality Objectives Process, provides an expanded discussion of the
583         Data Quality Objectives Process including detailed guidance on setting up a "gray region"
584         and establishing tolerable decision error rates.

585      •  Appendix C, Measurement Quality Objectives for Method Uncertainty and Detection and
586         Quantification Capability, provides the rationale and guidance for developing MQOs for
587         select method performance characteristics.

588      •  Appendix D, Content of Project Plan Documents, provides guidance on the appropriate
589         content of plan documents.

590      •  Appendix E, Contracting Laboratory Services, contains detailed guidance on contracting
591         laboratory services.

592      •  Appendix F, Laboratory Subsampling, provides information on improving and evaluating
593         laboratory subsampling  techniques.

594      •  Appendix G, Statistical Tables, provides a compilation of statistical tables.
595     1.7   References

596     American Society for Testing and Materials (ASTM). 1995. D5792, Standard Practice for
597        Generation of Environmental Data Related to Waste Management Activities: Development
598        of Data Quality Objectives.

599     International Organization for Standardization (ISO). 1993. International Vocabulary of Basic
600        and General Terms in Metrology. ISO, Geneva, Switzerland.

601     International Organization for Standardization (ISO). 1995. Guide to the Expression of
602        Uncertainty in Measurement. ISO, Geneva, Switzerland.

603     MARSSIM. 2000. Multi-Agency Radiation Survey and Site Investigation Manual, Revision 1.
604        NUREG-1575, Rev. 1; EPA 402-R-97-016, Rev. 1; DOE/EH-0624, Rev. 1. August. Available
605        from http://www.epa.gov/radiation/marssim/filesfin.htm.

606     U.S. Environmental Protection Agency (EPA). 2000. Guidance for the Data Quality Objective
607        Process (EPA QA/G-4). EPA/600/R-96/055, Washington, DC. Available from
608        www.epa.gov/quality1/qa_docs.html.
                       2  PROJECT PLANNING PROCESS
 2     2.1    Introduction

 3     Efficient environmental data collection activities depend on successfully identifying the type,
 4     quantity, and quality of data needed, as well as how the data will be used to support the decision
 5     making process. MARLAP recommends the use of a directed or systematic planning process.
 6     These planning processes provide a logic and framework for setting well-defined, achievable
 7     objectives and developing a cost-effective, technically sound and defensible sampling and
 8     analysis design that balances the data user's tolerance for uncertainty in the decision process and
 9     the available resources for obtaining data to support a decision. MARLAP has chosen to use the
10     term "directed planning" to emphasize that the planning process, in addition to having a
11     framework or structure (i.e., it is systematic), is focused on defining the data needed to achieve an
12     informed decision for a specific project.

13     The objective of this MARLAP chapter is to promote:

14     1.  Directed project planning as a tool for project management to identify and document the data
15        quality objectives (DQOs)—that is, qualitative and quantitative statements that define the
16        project objectives and the tolerable rate of making decision errors that will be used as the
17        basis for establishing the quality and quantity of data needed to support the decision—and the
18        measurement quality objectives (MQOs) that define the  analytical data requirements
19        appropriate for decision making;

20     2.  The involvement of technical experts, in particular radioanalytical specialists, in the planning
21        process; and

22     3.  Integration of the outputs from the directed planning process into the implementation and
23        assessment phases of the project through documentation in project plan documents, the
24        analytical SOW, and the data assessment plans (e.g., for data verification, data validation,
25        and data quality assessment—DQA).

26     MARLAP will use the terms "DQOs" and "MQOs," as defined above and in Chapter 1,
27     throughout this document because of their widespread use in environmental data collection
28     activities. These concepts may be expressed by other terms, such as "decision performance
29     criteria" or "project quality objectives" for DQOs and "measurement performance criteria" or
30     "data quality requirements" for MQOs.
31       This chapter provides an overview of the directed planning process. Additional discussion on the
32      planning process in Chapter 3, Key Analytical Planning Issues and Developing Analytical
33      Protocol Specifications, will focus on project planning from the perspective of the analytical
34      process and the development of Analytical Protocol Specifications (APSs). Section 2.2 will
35      discuss the importance of directed project planning. The approach, guidance and common
36      elements of directed planning are discussed in Section 2.3. The project planning team is
37      discussed in Section 2.4, and the role of the radioanalytical specialists is highlighted in Section
38      2.5. The results of the planning process are discussed in Section 2.6. Section 2.7 presents the next
39      steps of the planning phase of the project, which will document the results of the planning
40      process and will link the results of the planning process to the implementation and assessment
41      phases of data collection activities.

42      The environmental data collection process consists of a series of elements: planning, developing,
43      and updating project plan documents; contracting for services; sampling; analysis; data
44      verification; data validation; and data quality assessment (see Section 1.4.8, "Data Life Cycle," of
45      Chapter 1, Introduction to MARLAP). These elements are interrelated  (sampling and analysis
46      cannot be performed efficiently or resources allocated effectively without first identifying data
47      needs during planning). Linkage and integration of the data collection process elements are
48      essential to the success of the environmental data collection activity.

49      2.2   The Importance of Directed Project Planning

50      A directed planning process has several notable strengths. It brings together the stakeholders (see
51       box), decision makers, and technical experts at the beginning of the project to gain commitment
52      to the project and a consensus on the nature of the problem and the desired decision. MARLAP
53      recognizes the need for a directed planning process that involves radioanalytical and other
54      technical experts as principals to ensure the decision makers' data requirements and the results
55      from the field and radioanalytical laboratory are linked effectively. Directed planning enables
56      each participant to play a constructive role in clearly defining:

57       •  The problem that requires resolution;
58       •  What type, quantity, and quality of data the decision maker needs to resolve that problem;
59       •  Why the  decision maker needs that type and quality of data;
60       •  What decision error rates are tolerable; and
61        •  How the decision maker will use the data to make a defensible decision.
                      Example of Stakeholders for a Cleanup Project

      A stakeholder is anyone with an interest in the outcome of an activity. For a cleanup
      project, some of the stakeholders could be:

       • Federal, regional, State,  and tribal environmental agencies with regulatory
         interests (e.g., NRC and EPA).

       • States with direct interest in transportation, storage and disposition of wastes,
         and a range of other issues.

       • City and County Governments with interest in the operations and safety at sites
         as well as economic development and site transition.

        • Site Advisory Boards, citizens groups, licensees, special interest groups, and
          other members of the public with interest in cleanup activities at the site.
A directed planning process encourages efficient planning by providing a framework for
organizing complex issues. The process promotes timely, open, and effective communication
among the stakeholders, resulting in well-conceived and documented plans. Because of the
emphasis on documentation, directed planning also provides project management with a more
efficient and consistent transfer of knowledge to new project members.

A directed planning process focuses on collection of only those data needed to address the
appropriate questions and support defensible decisions. Directed planning helps to eliminate poor
or inadequate sampling and analysis designs that require analysis of (1) too few or too many
samples, (2) samples that will not meet the needs of the project, or (3) inappropriate QC samples.
During directed planning, which is an iterative process, the sufficiency of existing data is
evaluated, and the need for additional data to fill the gaps, as well as the desired quality of the
additional data, is determined. By defining the MQOs, directed planning provides input for
obtaining appropriate radioanalytical services, which balance constraints and the required data
quality.

The time invested in preliminary planning can greatly reduce resource expenditure in the more
resource-intensive execution phase of the project. Less overall time (and money) is expended
when early efforts are focused on defining (and documenting) the project's objectives (DQOs),
technically based, project-specific analytical data needs (MQOs and any specific analytical
 91      process requirements), and measures of performance for the assessment phase of the data
 92      collection activity.

 93      2.3   Directed Project Planning Processes

 94      The recognition of the importance of project planning has resulted in the development of a
 95      variety of directed planning approaches. MARLAP does not endorse any one planning approach.
 96      Users of this manual are encouraged to consider the available approaches and choose a directed
 97      planning process that is appropriate to their project and agency. Appendix A, Directed Planning
 98      Approaches, provides brief descriptions of several directed planning processes.

 99      A graded approach to project planning will be discussed in Section 2.3.1. Standards and guidance
100      on project planning are presented in Section 2.3.2. An overview of common elements of project
101      planning is discussed in Section 2.3.3. The elements of project planning will be discussed in
102      detail in Section 2.5.

103      2.3.1  A Graded Approach to Project Planning

104      The sophistication, the level of QC and oversight, and the resources applied should be approp-
105      riate to the project (i.e., a "graded approach"). Directed planning for small or less complex
106      projects follows the logic of the process but will proceed faster and involve fewer people. The
107      goal still will be to (1) plan properly to collect only the data needed to meet the objectives of the
108     project and (2) establish the measures of performance for the implementation and assessment
109      phases of the data life cycle of the project.

110      2.3.2  Guidance on Directed Planning Processes

111      The following national standards related to directed project planning for environmental data
112      collection are available:

113       • Standard Practice (D5792) for Generation of Environmental Data Related to Waste
114         Management Activities: Development of Data Quality Objectives (American Society for
115         Testing and Materials [ASTM], 1995a), which addresses the process of development of data
116         quality objectives for the acquisition of environmental data. This standard describes the DQO
117         process in detail.

118       • Standard Provisional Guide (PS85) for Expedited Site Characterization of Hazardous Waste
119         Contaminated Sites (ASTM, 1996a), which describes the Expedited Site Characterization

120        (ESC) process used to identify all relevant contaminant migration pathways and determine
121        the distribution, concentration and fate of the contaminants for the purpose of evaluating risk,
122        determining regulatory compliance, and designing remediation systems.

123      • Standard Guide (D5730) Site Characteristics for Environmental Purposes with Emphasis on
124        Soil, Rock, the Vadose Zone and Ground Water (ASTM, 1996b), which covers a general
125        approach to planning field investigations based on defining one or more conceptual site
126        models, which is useful for any type of environmental reconnaissance or investigation plan
127        with a primary focus on the surface and subsurface environment.

128      • Standard Guide (D5612) Quality Planning and Field Implementation of a Water Quality
129        Measurements Program (ASTM, 1994), which defines criteria and identifies activities that
130        may be required based on the DQOs.

131      • Standard Guide (D5851) Planning and Implementing a Water Monitoring Program (ASTM,
132        1995b), which provides a procedural flowchart for planning the monitoring of point and non-
133        point sources of pollution of water resources (surface or ground water, rivers, lakes or
134        estuaries).

135     Several directed planning approaches have been implemented by the federal sector for
136     environmental data collection activities. MARLAP does not endorse a single planning approach,
137     and project planners should be cognizant of their agency's requirements for planning. The
138     following guidance is available:

139      • EPA developed the DQO Process (EPA, 2000) and has tailored DQO Process guidance for
140        specific programmatic needs of project planning under the Comprehensive Environmental
141        Response, Compensation, and Liability Act of 1980 (CERCLA/Superfund) (EPA, 1993) and
142        for site-specific remedial investigation feasibility study activities (EPA, 2000).

143      • The U.S. Army Corps of Engineers (ACE) Technical Project Planning (TPP) Process (ACE,
144        1998) was developed for technical project planning at hazardous, toxic, and radioactive
145        waste sites.

146      • DOE has developed the Streamlined Approach for Environmental Restoration (SAFER)
147        (DOE, 1993) for its environmental restoration activities.

148      • Planning guidance, including decision frameworks, for projects demonstrating compliance
149        with a dose- or risk-based regulation is available for final status radiological surveys


150        (MARSSIM, 2000) and radiological criteria for license termination (NRC, 1998a; NRC,
151        1998b).

152     Additional information on the DQO Process (ASTM, 1995a; EPA, 2000) is presented in
153     Appendix B, The Data Quality Objectives Process.

154     2.3.3  Elements of Directed Planning Processes

155     Environmental data collection activities require planning for the use of data in decision making.
156     The various directed planning approaches, when applied to environmental data collection
157     activities, address common planning considerations. Some common elements of the planning
158     processes are:

159     1.  Define the Problem: Identify the problem(s) facing the stakeholder/customer that requires
160        attention, or the concern that requires resolution.

161     2.  Identify the Decision: Define the decision(s) or the alternative actions that will address the
162        problem(s) or concern and satisfy the stakeholder/customer, and determine if new data are
163        required to make the decision.

164     3.  Specify the Decision Rule and the Tolerable Decision Error Rates: Develop a decision rule to
165        get from the problem or concern to the desired decision, and define the limits on the decision
166        error rates that will be acceptable to the stakeholder/customer. The decision rule can take the
167        form of "if...then..." statements for choosing among decisions or alternative actions (see the
           sketch following this list).

168     4.  Optimize the Strategy for Obtaining Data: Determine the optimum, cost-effective way to
169        reach the decision while satisfying the desired quality of the decision. Define the quality of
no        the data that will be required for the decision by establishing specific, quantitative and
171        qualitative analytical performance measures (e.g., MQOs). Define the process and criteria to
172        evaluate the suitability of the data to support their intended use (DQA).
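
A schematic decision rule of the "if...then" form described in item 3, with tolerable decision
error rates stated alongside it, might look like the following. The radionuclide, action level, and
error rates are hypothetical values chosen only for illustration.

        # Hypothetical decision rule and tolerable decision error rates.
        ACTION_LEVEL_BQ_PER_KG = 37.0  # example action level for 226Ra in soil
        ALPHA = 0.05  # tolerable rate of releasing a unit that exceeds the action level
        BETA = 0.10   # tolerable rate of remediating a unit that is actually acceptable

        def decision_rule(estimated_mean_bq_per_kg: float) -> str:
            """If the estimated mean concentration exceeds the action level, then
            remediate the survey unit; otherwise, release it. ALPHA and BETA above
            constrain the statistical test and sample sizes used to produce the
            estimate (see Appendix B)."""
            if estimated_mean_bq_per_kg > ACTION_LEVEL_BQ_PER_KG:
                return "remediate"
            return "release"

        print(decision_rule(45.0))  # remediate
        print(decision_rule(12.0))  # release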

173     The objective of the directed project planning process for environmental data collection activities
174     is to reach consensus among the stakeholders on defining the problem, the full range of possible
175     solutions, the desired decision, the optimal data collection strategy, and performance measures
176     for implementation and assessment phases of the project. If a cursory job is done defining the
177     problem or the desired results, the consequence will be the development of a design that may be
178     technically sound but answers the wrong question, may answer the question only after the
179     collection of significant quantities of unnecessary data, or may collect insufficient data to answer
180     the question.

181     The key outputs of the directed planning process are DQOs: qualitative and quantitative
182     statements that define the project objectives and the tolerable decision error rates that will be
183     used as the basis for establishing the quality and quantity of data needed to support the decision.
184     The MQOs and the decisions on key analytical planning issues will provide the framework for
185     Analytical Protocol Specifications. The MQOs and the tolerable decision error rates will provide
186     the basis for the data assessment phase (data validation and DQA). The elements of project
187     planning will be discussed in detail  in Section 2.5 from the perspective of the radioanalytical
188     specialists after introducing the concepts of the project planning team and radioanalytical
189     specialists in Section 2.4. Key analytical planning issues and Analytical Protocol Specifications
190     are discussed in Chapter 3, Key Analytical Planning Issues and Developing Analytical Protocol
191     Specifications.

192     2.4   The Project Planning Team

193     Participants in the project planning process will vary depending on the nature of the project, but
194     in most cases a multi-disciplinary team will be required. The project planning team should
195     consist of all the parties who have a vested interest or can influence the outcome (stakeholders).
196     A key to successful directed planning of environmental projects is getting the data users and data
197     suppliers to work together early in the process to understand each other's needs and require-
198     ments, to agree on the desired end product, and to establish lines of communication. Equally
199     important is having integrated teams of operational and technical experts. These experts will
200     determine whether the  problem has been sufficiently defined and if the desired outcomes are
201     achievable. With technical expert input early in the planning process, efforts are focused on
202     feasible solutions, and  resources are not wasted pursuing unworkable solutions.

203     2.4.1  Team  Representation

204     Members of the project planning team may include program and project managers,
205     regulators, public representatives, project engineers, health and safety advisors, and specialists in
206     statistics, health physics, chemical analysis, radiochemical analysis, field sampling, quality
207     assurance/quality control (QA/QC), data assessment, contract and data management, field
208     operation, and other technical specialists. The program or project manager(s) may be a Remedial
209     Project Manager (RPM), a Site Assessment Manager (SAM), or a Technical Project Officer
210     (TPO). Some systematic planning processes, such as Expedited Site Characterization, utilize a
211     core technical  team supported as needed by members of larger technical and operational teams.

212      Throughout this document, the combined group of decision makers and technical experts is
213      referred to as the "project planning team."

214      The duration of service for the project planning team members can vary, as can the level of
215      participation required of each member during the various planning phases. While the project
216      planning team may not meet as frequently once the project objectives and the sampling and
217      analysis design have been established, a key point to recognize is that the project planning team
218      should not disband. Rather, the team or a "core group" of the team (including the project
219      manager and other key members) should continue to meet at agreed upon intervals to review the
220      project's progress and to deal with actual project conditions that require changes to the original
221      plan. The availability of a core team also provides the mechanism for the radioanalytical
222      laboratory to receive needed information to clarify questions as they arise.

223      A key concept built into directed planning approaches is the ability to revisit previous decisions
224      after the initial planning is completed (i.e., during the implementation phases of the
225      environmental data collection process). Even when objectives are clearly established by the
226      project planning team and contingency planning was included in the plan development, the next
227      phases of the project may uncover new information or situations, which require alterations to the
228      data collection strategy. For example, finding significantly different levels of analytes or different
229      analytes than were anticipated based on existing information may require changes in the process.
230      To respond to unexpected events, the project planning team (or the core group) should remain
231      accessible during other phases of the data collection process to respond to questions raised,
232      revisit and revise project requirements as necessary, and communicate the basis for previous
233      assumptions.

234      2.4.2  The Radioanalytical Specialists

235      MARLAP recognizes that, depending on the size and complexity of the project, a number of key
236      technical experts should participate on the project planning team and be involved throughout the
237      project as needed. When the problem or concern involves radioactive analytes, it is important
238      that the radioanalytical specialist(s) be part of the project planning team, in addition to radiation
239      health and safety specialists. MARLAP recommends that the radioanalytical specialists be a part
240      of the integrated effort of the project planning team. Throughout this manual, the term
241      "radioanalytical specialists" will be used to refer to the radioanalytical expertise needed.

242      Radioanalytical specialists may provide expertise in (1) radiochemistry and radiation/nuclide
243      measurement systems and (2) the chemical characteristics of the analytes of
244      concern. In particular, the radioanalytical specialist plays a key role in the development of

245     MQOs. The radioanalytical specialists may also provide knowledge about sample transportation
246     issues, preparation, preservation, sample size, subsampling, available analytical protocols, and
247     achievable analytical data quality. If more than one person is needed, the specialists
248     need not be from the same organization, nor need they be from the
249     contractual radioanalytical laboratory. The participation of the radioanalytical specialists is
250     critical to the success of the planning process and the effective use of resources available to the
251     project.

252     2.5    Directed Planning Process and Role of the Radioanalytical Specialists

253     The importance of technical input in a directed planning process becomes apparent when one
254     examines the common difficulties facing the radioanalytical laboratory. Without sufficient input,
255     there is often a disconnect in translating the project planning team's analytical data requirements
256     into laboratory requirements and products. Radioanalytical advice and input during planning,
257     however, help to assure that the analytical protocol(s) selected will satisfy the data requirements,
258     including consideration of time, cost, and relevance to the data requirements and budget. The role
259     of the radioanalytical specialists during the early stage of the directed planning process is to focus
260     on whether the desired radionuclides can be measured and the practicality of obtaining the
261     desired analytical data. During the latter part of the process, the radioanalytical specialists can
262     provide specific direction and fine-tuning for defining the analytical performance requirements
263     (MQOs) and other items of the Analytical Protocol Specifications.

264     Planning with input from radioanalytical specialists can help ensure that the data received by  the
265     data users will meet the project's DQOs. Common areas that are improved with radioanalytical
266     specialists' participation in project planning include:

267      • The correct radionuclide is measured;

268      •  MQOs are adequately established and achievable;

269      • Consideration is given to the impact of half-life and parent/progeny factors;

270      • The data analysis is not compromised by interferences;

271      • Unnecessary or overly sophisticated analytical techniques are avoided in favor of analytical
272        techniques appropriate to the required level of measurement uncertainty;

273      • Optimum radioanalytical variables, such as count time and sample volume, are considered;

274      •  Environmental background levels are considered;

275      •  Chemical speciation is addressed; and

276      •  Consideration is given to laboratory operations (e.g., turnaround time, resources).

277      These improvements result in an appropriate data collection design, with the specified MQOs and
278      any specific analytical process requirements documented in the project plan documents and
279      SOWs.
280      The following sections, using the common planning elements outlined in Section 2.3.3, will
281      discuss the process and results of directed planning in more detail and emphasize the input of
282      radioanalytical specialists. Table 2.1 provides a summary of (1) the information needed by the
283      project planning team, (2) how the radioanalytical specialists participate, and (3) the output or
284      product for each element of the directed planning process. It must be emphasized that a directed
285      planning process is an iterative, rather than step-wise, process. Although the process is presented
286      in discrete sections, the project planning may not progress in such an orderly fashion. The
287      planning team will more precisely define decisions and data needs as the planning progresses and
288      use new information to modify or change earlier decisions until the planning team has
289      determined the most resource-effective approach to the problem. The common planning elements
290      are used for ease of presentation and to delineate what should be covered in planning, not the
291      order of discussion.
    TABLE 2.1  Summary of the Directed Planning Process and Radioanalytical Specialists Participation

 Element 1. State the problem
   Information needed by the project planning team:
    •  Key stakeholders and their concerns.
    •  Facts relevant to the current situation (e.g., site history, ongoing studies).
    •  Analytes of concern or analytes driving risk.
    •  Matrix of concern.
    •  Regulatory requirements and related issues.
    •  Existing data and the reliability of the information.
    •  Known sampling constraints.
    •  Resources and relevant deadlines.
   Radioanalytical specialists participation/input:
    •  Evaluate existing radiological data for use in defining the issues (e.g., analytes of concern).
    •  Assure that the perceived problem is really a concern by reviewing the underlying data that
       are the basis for the problem definition.
    •  Consider how resource limitations and deadlines will impact measurement choices.
    •  Use existing data to begin to define the analyte of concern and the potential range of
       concentrations.
   Output/product:
    •  Define the problem with specificity.
    •  Identify the primary decision maker, the available resources, and constraints.

 Element 2a. Identify the decision(s)
   Information needed by the project planning team:
    •  Analytical aspects related to the decision.
    •  Possible alternative actions.
    •  Sequence and priority for addressing the problem.
   Radioanalytical specialists participation/input:
    •  Provide focus on what analytes need to be measured, considering analyte relationships and
       background.
    •  Begin to address the feasibility of different analytical protocols.
    •  Begin to identify the items of the Analytical Protocol Specifications.
    •  Begin to determine how sample collection and handling will affect MQOs.
   Output/product:
    •  Statements that link the defined problem to the associated decision(s) and alternative
       actions.

 Element 2b. Identify inputs to the decision(s)
   Information needed by the project planning team:
    •  All useful existing data.
    •  The general basis for establishing an action level.
    •  Acquisition strategy options (if new data are needed).
   Radioanalytical specialists participation/input:
    •  Review the quality and sufficiency of the existing radiological data.
    •  Identify alternate analytes.
   Output/product:
    •  Defined list of needed new data.
    •  Define the characteristic or parameter of interest (analyte/matrix).
    •  Define the action level.
    •  Identify the estimated concentration range for the analyte(s) of interest.

 Element 2c. Define the decision boundaries
   Information needed by the project planning team:
    •  Sampling or measurement timeframe.
    •  Sampling areas and boundaries.
    •  Subpopulations.
    •  Practical constraints on data collection (season, equipment, turnaround time, etc.).
    •  Available protocols.
   Radioanalytical specialists participation/input:
    •  Identify temporal trends and spatial heterogeneity using existing data.
    •  With the sampling specialists, identify practical constraints that impact sampling and
       analysis.
    •  Determine the feasibility of obtaining new data with current methodology.
    •  Identify limitations of available protocols.
   Output/product:
    •  Temporal and spatial boundaries.
    •  The scale of the decision.

 Element 3a. Develop a decision rule
   Information needed by the project planning team:
    •  Statistical parameter to be used to describe the parameter of interest and to be compared
       to the action level.
    •  The action level (quantitative).
    •  The scale of decision making.
   Radioanalytical specialists participation/input:
    •  Potentially useful methods.
    •  Estimates of measurement uncertainty and detection limits of available analytical protocols.
   Output/product:
    •  A logical, sequential series of steps ("if...then") to resolve the problem.

 Element 3b. Specify limits on decision error rates
   Information needed by the project planning team:
    •  Potential consequences of making wrong decisions.
    •  Possible range of the parameter of interest.
    •  Allowable differences between the action level and the actual value.
    •  Acceptable level of decision errors or confidence.
   Radioanalytical specialists participation/input:
    •  Assess variability in existing data for decisions on hypothesis testing or statistical decision
       theory.
    •  Evaluate whether the tolerable decision error rates can be met with available laboratory
       protocols, or whether the error tolerance needs to be relaxed or new methods developed.
   Output/product:
    •  Definition of the baseline condition (null hypothesis) and quantitative estimates of
       acceptable decision error rates.
    •  Define the range of possible parameter values where the consequence of a Type II decision
       error is relatively minor (gray region).

 Element 4. Optimize the strategy for obtaining data
   Information needed by the project planning team:
    •  All outputs from all previous elements, including parameters (analytes and matrix) of
       concern, action levels, anticipated range of concentration, tolerable decision error rates,
       boundaries, resources, and practical constraints.
    •  Available protocols for sampling and analysis.
   Radioanalytical specialists participation/input:
    With the sampling specialists, consider the potential combinations of sampling and analytical
    methods in relation to:
    •  Sample preparation, compositing, and subsampling.
    •  Available protocols.
    •  Methods required by regulations (if any).
    •  Detection and quantitation capability.
    •  MQOs achievable by method, matrix, and analyte.
    •  Quality control sample types, frequencies, and evaluation criteria.
    •  Sample volume, field processing, preservatives, and container requirements.
    •  Assurance that the MQOs for sample analysis are realistic.
    •  Assurance that the parameters for the Analytical Protocol Specifications are complete.
    •  Resources and time frame to develop and validate new method(s), if required.
   Output/product:
    •  The most resource-effective sampling and analysis design that meets the established
       constraints (i.e., number of samples needed to satisfy the DQOs and the tolerable decision
       error rates).
    •  A method for testing the hypothesis.
    •  The MQOs and the statement(s) of the Analytical Protocol Specifications.
    •  The process and criteria for data assessment.
342     2.5.1  Define the Problem
343      The first and most important step of the project planning process is a clear statement of the
344      fundamental issue to be addressed by the project. Correctly implemented, directed planning
345      ensures that a clear definition of the problem is developed before any additional resources are
346      committed. The project planning team should understand clearly the conditions or circumstances
347      that are causing the problem and the reason for making a decision (e.g., threat to human health or
348      environment).

349      Many projects present a complex interaction of technical, economic and political factors. The
350      problem definition should include a summary of the study objectives, regulatory context, funding
351      and other resources available, relevant deadlines, previous study results, and any obvious data
352     collection design constraints. By participating in the initial stages of the project planning, the
353     radioanalytical specialists will understand the context of the facts and logic used to define the
354     problem and begin to formulate information on applicable protocols based on the project's
355     resources (time and budget).

356     Existing data (e.g., monitoring data, radioactive materials license, emergency actions, site permit
357     files, operating records) may provide specific details about the identity, concentrations, and
358     geographic, spatial, or temporal distribution of analytes. However, these data should be examined
359     carefully. Conditions may have changed since the data were collected. For example, additional
360     waste disposal may have occurred, the contaminant may have been released or migrated, or
361     decontamination may have been performed. In some cases, a careful review of the historical data
362     by the project planning team will show that a concern is not a problem or the problem can be
363     adequately addressed using the available data.

364     2.5.2   Identify the Decision

365     The project planning team will define the decision(s) to be made (or the question the project will
366     attempt to resolve) and the inputs and boundaries to the decision. There may also be multiple
367     decision criteria that have to be met and each should be clearly defined. For example, the
368     decision may be for an individual survey area rather than the site as a whole, or a phase of the site
369     closure project (scoping,  characterization, operation or final status survey) rather than the project
370     as a whole because of the different objectives and data requirements.

371     The decision should be clear and unambiguous.  It may be useful to state specifically what
372     conclusions may and may not be drawn from the data. If the study is to be designed, for example,
373     to investigate whether or not a site may be released for use by the general public, then the project
374     planning team may  want to specifically exclude other possible uses for the data.

375     2.5.2.1 Action Level

376     The term "action level" is used in this document to denote the numerical value that will cause the
377     decision maker to choose one of the alternative actions. The action level may be a derived
378     concentration guideline level, a background level, a release criterion, a regulatory decision limit, etc.
379     The action level is often associated with the type of medium, analyte, and concentration limit.

380     Some action levels, such as the release criteria for license termination, are expressed in terms of
381     dose or risk. The release criterion is typically based on the total effective dose equivalent
382     (TEDE), the committed effective dose equivalent (CEDE), risk of cancer incidence  (morbidity)

383     or risk of cancer death (mortality), and generally cannot be measured directly. For example, in site
384     cleanup, a radionuclide-specific predicted concentration or surface area concentration of specific
385     nuclides that can result in a dose (TEDE or CEDE) or specific risk equal to the release criterion
386     is called the "derived concentration guideline level" (DCGL). A direct comparison can be made
387     between the project's analytical measurements and the DCGL (MARSSIM, 2000). For drinking
388     water analysis, an example of an action level would be a radionuclide-specific
389     concentration based on the Maximum Contaminant Level under the Safe Drinking Water Act.

390     The project planning team should also determine possible alternative actions that may be taken.
391     Consideration should also be given to the option of taking no action, as this option is frequently
392     overlooked (e.g., no technology available, too costly, relocation will create problems).

393     During these discussions of the directed planning process, the role of the radioanalytical
394     specialists is to ensure that the analytical aspects of the project have been clearly defined and
395     incorporated into the decision(s). The radioanalytical specialists focus on defining: (1) the
396     parameter (analyte/matrix) of interest; (2) what analytical information could resolve the problem;
397     and (3) the practicality of obtaining the desired field and laboratory data. Sections 3.3.1 through
398     3.3.7 of Chapter 3 discuss in more detail the analytical aspects of the decision (or question) and
399     determining the characteristic or parameter of concern. This information is incorporated into the
400     Analytical Protocol Specifications.

401     2.5.2.2 Scale of the Decision

402     The project planning team should clearly define the geographical area(s) to which the decision
403     will apply. The scale of the decision selected should be the smallest, most appropriate subset of
404     the population for which decisions will be made based on the spatial or temporal boundaries. For
405     example, at a remediation site, a survey unit is generally formed by grouping contiguous site
406     areas with a similar use history and the same classification of potential concentration of the
407     analyte of interest. The survey unit will be defined with a specified size and shape for which a
408     separate decision will be made as to whether the unit attains the site-specific reference-based
409     cleanup standard for the designated analyte of interest (MARSSIM, 2000; NRC, 1998c).

410     The survey unit is established to delineate areas or volumes of similar composition and history
411     for which a single decision can be made based on the statistical analysis of the data. The
412     variability in the measurement data for a survey unit is a combination of the imprecision of the
413     measurement process and the real spatial and temporal variability of the analyte concentration. If
414     the measurement data include a background contribution, the spatial variability of the
415     background adds to the overall measurement variability.
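
        The combination of variability components described above can be illustrated with a short
        calculation. The following sketch assumes the components are independent, so their variances
        add in quadrature; all numerical values are invented for illustration and are not MARLAP
        guidance.

            from math import sqrt

            # Illustrative standard deviations (Bq/g); values are assumed.
            s_measurement = 0.05   # imprecision of the measurement process
            s_spatial     = 0.12   # real spatial/temporal variability of the analyte
            s_background  = 0.03   # spatial variability of the background contribution

            # Independent components combine in quadrature into the total variability.
            s_total = sqrt(s_measurement**2 + s_spatial**2 + s_background**2)
            print(f"total survey unit standard deviation = {s_total:.3f} Bq/g")  # ~0.133

        Because the components add in quadrature, the largest component dominates; reducing a
        minor component (here, the background term) buys little overall improvement.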

416      2.5.2.3 Inputs and Boundaries to the Decision

417      The project planning team determines the specific information and data required for decision
418      making. The statistical parameter (e.g., mean) that will be used in the comparison to the action
419      level should be established. Typically, the study boundaries are discussed when the project
420      planning team defines the problem. Changing conditions (e.g., weather, temperature, humidity)
421      that could impact the success of sampling or analysis or data interpretation should be considered
422      as well. The radioanalytical specialists can provide input during the determination of the
423      appropriate action level and the appropriate parameter of interest (e.g., mean concentration).

424      2.5.2.4 Data Needs

425      The project planning team should develop a list of the specific data (number and type) and data
426      requirements (quality). An estimate of the expected variability of the data will be needed.
427      Existing data, experience, and scientific judgment can be used to establish the estimate.
428      Information on environmental background levels and variability may be needed (see Chapter 3
429      for a discussion of background). The project planning team establishes whether the existing data
430      are sufficient or whether new data are needed to resolve the problem.

431      2.5.3  Specify the Decision Rule and the Tolerable Decision Error Rates

432      A decision statement or rule is developed by combining the decisions and the alternative actions.
433      The decision rule presents the strategy or logical basis for choosing among the alternative
434      decisions, generally by use of a series of "if...then" statements. For a complex problem, it may be
435      helpful to develop a logic flow diagram (called a decision tree or decision framework), arraying
436      each element of the issue in its proper sequence along with the possible actions. The decision
437      rule identifies (1) the action level that will be the basis for the decision and (2) the statistical parameter
438      that is to be compared to the action level.
439                               Example of a Decision Rule:
440      If the mean concentration in the survey unit is less than the action level, then the
441      survey unit is in compliance with the release criterion.
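
        Expressed in code, such a rule is a simple branch. The sketch below is illustrative only: in
        practice the comparison is made through a statistical test of the survey unit data (see
        Appendix B), and the function name and message wording are hypothetical.

            def apply_decision_rule(mean_concentration, action_level):
                """Illustrative "if...then" decision rule; both inputs in the same units."""
                if mean_concentration < action_level:
                    return "survey unit is in compliance with the release criterion"
                return "survey unit is not shown to comply; evaluate alternative actions"

            print(apply_decision_rule(mean_concentration=0.8, action_level=1.0))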
442      The radioanalytical specialists play a key role in the development of alternative technical actions
443      that are realistic and quantifiable and that satisfy the programmatic and regulatory needs. The
444      results of the technical actions must be measurable: the protocols suggested must be able to detect
445      the radionuclide of interest (see Chapter 3, Key Analytical Planning Issues and Developing
446      Analytical Protocol Specifications, for additional discussion on background).

447      For each proposed alternative technical action, the radioanalytical specialists can:

448      •  Focus the project planning team on what radionuclides will need to be measured and what
449         types of analytical techniques are available;

450      •  Address whether it is feasible to obtain the necessary analytical results;

451      •  Present the technical limitations (i.e., the minimum detectable concentrations, or MDCs) of
452         available measurement systems; and

453      •  Address how sample collection and handling will affect what measurement techniques can be
454         used.

455      The project planning team also assesses the potential consequences of making a wrong decision.
456      While the possibility of a decision error can never be totally eliminated, it can be controlled. The
457      potential consequences of a decision error are used to establish tolerable limits on the
458      probability that the data will mislead the decision maker into making an incorrect decision (see
459      Appendix B for a discussion of hypothesis testing, action levels, and Type I and Type II decision
460      errors). The decision rule and the decision makers' limits on the decision error rates are used to
461      establish performance criteria for the data collection design.
462     In developing the tolerable decision error rates, the team needs to consider alternative measurement
463     approaches, the sources of error in field and laboratory sample handling and analysis, factors
464     that would influence the likelihood of a Type I or Type II error, estimates of the cost of analysis,
465     and judicious use of resources. Determining realistic tolerable decision error rates for
466     the decision rule at the outset reduces the likelihood that the sampling and analysis design the
467     project planning team develops and optimizes will later have to be re-designed to attain
468     more realistic decision error rates.
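
        To illustrate how the tolerable decision error rates and the gray region drive the data
        collection design, consider a common normal-approximation sample-size calculation. This is a
        sketch under simplifying assumptions (a one-sample comparison of a mean to an action level
        with a known standard deviation), not a MARLAP formula; see Appendix B and EPA QA/G-4
        for the formal treatment. All numbers are illustrative.

            from math import ceil
            from statistics import NormalDist

            def approx_sample_count(sigma, gray_region_width, alpha=0.05, beta=0.05):
                """Approximate number of samples to compare a mean to an action level.

                sigma: estimated total standard deviation (measurement + spatial)
                gray_region_width: distance between the action level and the other
                    bound of the gray region
                alpha, beta: tolerable Type I and Type II decision error rates
                """
                z_alpha = NormalDist().inv_cdf(1 - alpha)
                z_beta = NormalDist().inv_cdf(1 - beta)
                return ceil(((z_alpha + z_beta) * sigma / gray_region_width) ** 2)

            # Example: sigma = 0.4 Bq/g, gray region width = 0.5 Bq/g,
            # alpha = beta = 0.05 -> about 7 samples.
            print(approx_sample_count(sigma=0.4, gray_region_width=0.5))

        Tightening either error rate, or narrowing the gray region, increases the required number of
        samples, which is why these values are negotiated during planning rather than fixed by default.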

469     2.5.4  Optimize the Strategy for Obtaining Data

470     During the process of developing and optimizing options for sampling and analysis, the
471     technical team members should determine the most resource-effective analytical
472     protocols and associated quality control that will meet all the requirements (desired outputs)


473      established by the project planning team. Optimizing the data collection design generally requires
474      extensive coordination between the radioanalytical specialists and the sampling specialists.

475      Typical issues that require consideration in developing the analysis design include the
476      number of samples required and the Analytical Protocol Specifications, which include the MQOs
477      (e.g., a statement of the required method uncertainty) for the analytical procedures. The
478      Analytical Protocol Specifications and the MQOs are discussed in Sections 2.5.4.1
479      and 2.5.4.2 below. In general, the more certainty required in the DQOs, the greater the number of
480      samples or the more precise and unbiased the measurements need to be. During planning, the
481      costs and time for field and analytical procedures must be balanced against the level of certainty
482      that is needed to arrive at an acceptable decision.

483      The radioanalytical specialists are involved in evaluating the technical options and their effect on
484      the sources of decision error, their resource requirements and the ability to meet the project's
485      objectives. The radioanalytical specialists can identify an array of potential analytical methods,
486      which can be combined in analytical protocols to meet the defined  data needs and MQOs.
487      Working with the sampling specialists, potential sampling methods are identified based on the
488      sample requirements of the potential analytical protocols and other sampling constraints. The
489      planning team specialists need to consider sources of bias and imprecision that will impact the
490      representativeness of the samples and the accuracy of the data collected. Appropriate
491      combinations of sampling methods, analytical protocols and sampling constraints can then be
492      assessed with regard to resource effectiveness.

493      It may be useful at this point for the project planning team to perform a sensitivity analysis on the
494      input parameters that contribute to the final analytical result. The final analytical result directly
495      impacts the decision, so this sensitivity analysis will allow the project planning team to identify
496      the portions of the analytical protocols that potentially have the most impact on the decision.
497      Once identified, these portions of the analytical protocols can be targeted to receive a propor-
498      tionally larger share of the resources available for developing the protocols.
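
        As a concrete illustration of such a sensitivity analysis, the sketch below perturbs, one at a
        time, the inputs of a deliberately simplified activity calculation. The equation and all
        parameter values are assumptions made for this example, not a MARLAP-specified model.

            def activity_bq_per_g(net_counts, count_time_s, efficiency, chem_yield, mass_g):
                # Simplified activity model for illustration only.
                return net_counts / (count_time_s * efficiency * chem_yield * mass_g)

            baseline = dict(net_counts=500.0, count_time_s=60000.0,
                            efficiency=0.25, chem_yield=0.85, mass_g=1.0)
            base = activity_bq_per_g(**baseline)

            # One-at-a-time sensitivity: raise each input 10% and observe the result.
            for name in baseline:
                perturbed = dict(baseline, **{name: baseline[name] * 1.10})
                delta = (activity_bq_per_g(**perturbed) - base) / base
                print(f"{name:12s} +10% -> result changes by {delta:+.1%}")

        In this toy model every denominator term has the same (inverse) influence; in a real protocol
        the analysis would single out the steps whose uncertainties dominate the result, and those
        steps would then receive a proportionally larger share of development resources.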

499      2.5.4.1  Analytical Protocol Specifications

500      Requirements of the desired analytical protocol(s) should be based  on the intended use of the
501      data. That is, project-specific critical parameters should be considered, including the type of
502      radioactivity and the nuclides of concern, the anticipated range of concentrations, the media type
503      and complexity, regulatory required methods and customer method preferences, the measurement
504      uncertainty required at some activity concentration, detection limits required, necessary chemical
505      separation, qualification or quantification requirements, QC requirements and turnaround time


506     needed. MQOs are a key component of the Analytical Protocol Specifications and are discussed
507     in Section 2.5.4.2. Chapter 3, Key Analytical Planning Issues and Developing Analytical
508     Protocol Specifications, contains more detailed discussion on some of the key decisions and
509     needed input to successfully optimize the sampling and analysis design and develop Analytical
510     Protocol Specifications. Chapter 6 discusses the selection of an analytical protocol from the
511     laboratory's perspective.

512     The project planning team should ensure that there are analytical methods available to provide
513     acceptable measurements. If analytical methods do not exist, the project planning team will need
514     to consider the resources needed to develop a new method, reconsider the approach for providing
515     input data, or perhaps reformulate the decision statement.

516     2.5.4.2 Measurement Quality Objectives

517     When additional data are to be obtained, the project planning process should establish measures
518     of performance for the analysis (MQOs) and evaluation of the data. Without these measures of
519     performance, data assessment is difficult and arbitrary.

520     An MQO is a statement of a performance objective or requirement for a particular method
521     performance characteristic, such as the required method uncertainty at some concentration. MQOs
522     can be both quantitative and qualitative performance objectives. Quantitative and qualitative
523     MQOs are used for real-time compliance monitoring by field and laboratory staff and during subsequent
524     assessments and data usability determinations. Quantitative MQOs provide numerical criteria for
525     field and laboratory QC samples or procedure performance (e.g., specifications for MDC, yield,
526     efficiency, laboratory control sample precision and recovery, blank levels, laboratory duplicate
527     precision, collocated sample precision). Precision, bias, completeness, and sensitivity are
528     common data quality indicators for which quantitative MQOs could be developed during the
529     planning process (ANSI/ASQC, 1994). Thus, quantitative MQOs are statements that contain
530     specific units of measure, such as x percent recovery, x percent relative standard uncertainty, a
531     standard deviation of x Bq/L, or an MDC of x Bq/g. The specificity of the MQOs allows direct
532     comparisons of the data to an MQO. Chapter 3 provides detailed guidance on developing MQOs
533     for selected method performance characteristics.
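
        Because quantitative MQOs carry explicit units and limits, compliance checking against them
        can be automated. The following sketch is minimal and hypothetical: the MQO values, field
        names, and acceptance windows are invented for illustration and are not MARLAP defaults.

            # Hypothetical quantitative MQOs for one analyte/matrix combination.
            MQOS = {
                "mdc_bq_per_g": 0.1,            # required detection capability (upper limit)
                "lcs_recovery_pct": (80, 120),  # laboratory control sample recovery window
                "duplicate_rpd_pct": 20,        # max relative percent difference, duplicates
            }

            def check_mqos(batch):
                """Return a list of MQO exceptions for a reported batch (sketch only)."""
                problems = []
                if batch["mdc_bq_per_g"] > MQOS["mdc_bq_per_g"]:
                    problems.append("achieved MDC exceeds the required MDC")
                low, high = MQOS["lcs_recovery_pct"]
                if not low <= batch["lcs_recovery_pct"] <= high:
                    problems.append("LCS recovery outside the acceptance window")
                if batch["duplicate_rpd_pct"] > MQOS["duplicate_rpd_pct"]:
                    problems.append("duplicate precision exceeds the limit")
                return problems

            print(check_mqos({"mdc_bq_per_g": 0.08, "lcs_recovery_pct": 95.0,
                              "duplicate_rpd_pct": 12.0}))  # -> []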

534     A graded approach should be taken to the selection of the MQOs. For example, from a project
535     viewpoint, it is highly practical and economical to establish MQOs on a graded basis, in
536     concert with the anticipated range of the analyte's concentration compared to the action level.
537     Thus, the required method uncertainty, when the analyte concentration is much greater than
538     the action level, can be less restrictive than when the analyte concentration approaches the action
539     level. These decisions are extremely important in the protocol selection process.
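
        A graded scheme of this kind can be written as a simple rule. In the sketch below, the
        threshold and the uncertainty fractions are purely illustrative assumptions; project-specific
        values come out of the planning process (see Chapter 3).

            def required_method_uncertainty(expected_conc, action_level):
                """Illustrative graded MQO for required method uncertainty.

                Inputs and result share the same units. Near the action level the
                requirement is strict; well above it, a looser relative uncertainty
                suffices. The 0.10/0.30 fractions and the 3x threshold are invented.
                """
                if expected_conc <= 3 * action_level:
                    return 0.10 * action_level    # strict: the decision hinges on the result
                return 0.30 * expected_conc       # relaxed: result is far above the action level

            print(required_method_uncertainty(expected_conc=0.5, action_level=1.0))   # 0.1
            print(required_method_uncertainty(expected_conc=10.0, action_level=1.0))  # 3.0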

540     The MQOs for the analytical data should be documented in the project plan documents (e.g., the
541     QA Project Plan). MQOs are also the basis for the data verification and validation criteria (see
542     Appendix D, Section 2.7, for discussion of MQOs and QA Project Plans).

543     2.6   Results of the Directed Planning Process

544     By the end of the directed planning process, the project planning team has established the
545     priority of concerns, the definition of the problem, the decision(s) or outcome to address the
546     posed problem, the inputs and boundaries to the decision(s), and the tolerable decision error
547     rates. The team has also agreed on decision rules that incorporate all this information into a logic
548     statement about what must be done to obtain the desired answer. The key output of the planning
549     process is the DQOs: qualitative and quantitative statements that clarify the study objectives, define
550     the appropriate type of data, and specify the tolerable decision error rates that will be
551     used as the basis for establishing the quantity and quality of data needed to support the decisions
552     and the criteria for data assessment.

553     If new data are required, then the project planning team has defined the desired analytical quality
554     of the data (MQOs). That is, the project planning team has determined the type, quantity, and
555     quality of data needed to support a decision. The directed planning process has clearly linked
556     sampling and analysis efforts to an action and a decision. This linkage allows the project
557     planning team to determine when enough data have been collected.

558     If new data are to be obtained, the project planning team has developed the most resource-
559     effective sampling and analysis design that will provide adequate data for decision making.
560     Based on the DQOs, the project planning team specifies the sampling collection design and
561     Analytical Protocol Specifications, including:

562      •  The type and quantity of samples to be collected;
563      •  Where, when, and under what conditions they should be collected;
564      •  What radionuclides are to be measured; and
565      •  The MQOs to ensure that the analytical errors are controlled sufficiently to meet the tolerable
566         decision error rates specified in the DQOs.
567     2.6.1  Output Required by the Radioanalytical Laboratory: The Analytical Protocol
568            Specifications

569     As a result of directed planning, the description of the DQOs for the project and the Analytical
570     Protocol Specifications, which contain the MQOs and any specific analytical process require-
571     ments for additional data, will provide the radioanalytical laboratory with a clear and definitive
572     description of the desired data, as well as the purpose and use of the data. This information will
573     be provided to the project implementation team through the SOW and the project plan
574     documents. Precise statements of analytical needs may prevent the radioanalytical laboratory
575     from:

576      • Having to make a "best guess" as to what data are really required;
577      • Using the least costly or most routine protocol, which may not meet the needed data quality;
578      • Independently developing  solutions for unresolved issues without direction from the project
579        planning team; and
580      • Having "moving targets" and "scope creep" that stem from ambiguous statements of work.

581     The  output of the planning process, from the perspective of the radioanalytical laboratory, is the
582     Analytical Protocol Specifications. The Analytical Protocol Specifications should contain the
583     minimum level of specificity required to meet the project data requirements. In accordance with a
584     performance-based measurement approach, the laboratory will use this information to select or
585     develop (specific) analytical protocols that will meet the MQOs. The Analytical Protocol
586     Specifications should present the resolution of the project planning team on both general issues
587     and matrix-specific issues. Chapter 3, Key Analytical Planning Issues and Developing Analytical
588     Protocol Specifications, addresses some of the common radioanalytical planning issues.

589     The  Analytical Protocol Specifications should include, but not be limited to:

590      • The radionuclide(s) of concern;
591      • The media of concern with information on chemical, explosive and other hazardous
592        components;
593      • The anticipated concentration range (estimate, maximum or detection capability);
594      • The MQOs desired for the radionuclides of concern;
595      • The sample preparation and preservation requirements (laboratory and field);
596      • The type and frequency of QC samples required for each radionuclide of concern;
597      • The sample transport, tracking and custody requirements;
598      • The required analytical turnaround time for the project and the anticipated budget for the
599        analysis; and


600      • The data reporting requirements.

601     2.6.2   Chain of Custody

602     Any requirements for formal chain of custody (COC) should be specified in the Analytical
603     Protocol Specifications. COC procedures provide the means to trace possession and handling
604     of a sample from collection to data reporting, and they affect how the field and laboratory
605     components handle the sample. The data report requires a number of items, not all of which
606     can be listed here. COC is discussed in Chapters 10 and 11.

607     2.7    Project Planning and Project Implementation and Assessment

608     A directed planning process generally is considered complete with the approval of an optimal
609     data collection design approach or when historical data are deemed sufficient to support the
610     desired decision. However, to complete the process, the project planning team should clearly
611     document the results of the planning process and link DQOs and MQOs to the implementation
612     and assessment processes. The directed planning process is the first activity in the project's
613     planning phase (see Figure  1.1, "The Data Life Cycle"). The planning process outputs are key
614     inputs to the implementation and assessment processes of the data collection activities. That is,
615     the outputs of the directed planning process  are the starting point for developing plan documents,
616     obtaining analytical services,  selecting specific analytical protocols and assessing the data
617     collected. This section will  provide an overview of the next steps of the planning phase and the
618     linkage to the implementation and assessment phases and to other chapters in MARLAP, Part I.

619     2.7.1   Documenting the Planning Process

620     A concept inherent in directed planning approaches is the establishment of a formal process to
621     document both the decisions and supporting logic established by the team during the project
622     but also prevents situations in which new team members must recreate the logic behind ongoing
623     activities after their predecessors depart. As actual field conditions or other
624     being performed upon the departure of their predecessors. As actual field conditions or other
625     situations force changes to the original plans, the documentation can then be updated through a
626     change control process to continue to maintain the technically defensible basis for the actions
627     being taken.

628     When properly documented, the directed planning process:

629      • Provides a background narrative of the project;

        JULY 2001                                                                      MARLAP
        DRAFT FOR PUBLIC COMMENT               2-21                    DO NOT CITE OR QUOTE

-------
        Project Planning Process
630       •  Defines the necessary inputs (nuclides, matrices, estimate of concentration range, etc.);
631       •  Defines the constraints and boundaries within which the project would have to operate;
632       •  Defines the decision rule, which states the action level that will be the basis for the decision
633         and the parameter that is to be compared to the action level;
634       •  Identifies the tolerable decision error rates;
635       •  Identifies MQOs for new analytical data; and
636       •  Identifies processes and criteria for usability of the data.

637     The results of the project planning process are also needed for the development of project plan
638     documents required for implementing the sampling and analysis activities. These project plan
639     documents may include a Quality Assurance Project Plan (QAPP), Work Plan, or Sampling and
640     Analysis Plan (SAP). The format and naming of plan documents are usually a function of the
64i     authoring organization's experience, the controlling federal or state regulations, or the controlling
642     agency. Project plan documents are discussed in Chapter 4, Project Plan Documents, and in
643     Appendix D, Content of Project Plan Documents. The project plan documents will rely on the
644     planning process outputs, including the MQOs, to describe in comprehensive detail the necessary
645     QA, QC, and other technical activities that must be  implemented to ensure that the results of the
646     work performed will satisfy the stated DQOs. The project plan documents should also document
647     the processes and criteria developed for data assessment. MARLAP recommends that the
648     planning process rationale be documented and the documentation integrated with the project plan
649     documents. Documentation of the planning process can be incorporated directly in the project
650     plan documents or through citation to a separate report on the planning process.

651     2.7.2  Obtaining Analytical Services

652     If contractual laboratory services are required, the contracting office or Sample Management
653     Office (SMO) should rely on the planning process statements of required data and data quality,
654     the Analytical Protocol Specifications, to develop the Statement of Work (SOW) for the
655     laboratory. The SOW is the contractual agreement that describes the project scope and
656     requirements (i.e., what work is to be accomplished). Contracting for laboratory services is discussed
657     in Chapter 5, Obtaining Laboratory Services, and Chapter 7, Evaluating Methods and
658     Laboratories. MARLAP recommends that a SOW be developed even if a contract is not
659     involved, for example, when an agency employs one of its own laboratories.

660     2.7.3  Selecting Analytical Protocols

661     From an analytical perspective, one of the most important functions of a directed planning
662     process is the identification and resolution of key analytical planning issues for a project. A key


663     analytical planning issue may be defined as one that has the potential to be a significant contribu-
664     tor of uncertainty to the analytical process and, ultimately, the resulting data. Identifying key
665     analytical issues for a particular project requires a clear understanding of the analytical process.
666     It is the role of the radioanalytical specialist on the project planning team to ensure that key
667     analytical planning issues have been clearly defined and articulated and incorporated into the
668     principal decision or principal study question. Chapter 3 discusses the key analytical planning
669     issues.

670     The selection of radioanalytical protocols by the laboratory is made in response to the Analytical
671     Protocol Specifications (for each analyte/matrix) developed by the project planning team as
672     documented in the SOW. Unless required by regulatory policy, rarely will a radioanalytical
673     method be specifically stated. A number of radioanalytical methods are available but no one
674     method provides a general solution; all have advantages and disadvantages. The selection of a
675     method is related to a broad range of considerations, including analyte and matrix characteristics,
676     the technical complexity and practicality of the method, quality requirements, availability of
677     equipment, facility and staff resources, regulatory and economic considerations, and
678     previous use of the method. Chapter 6 discusses the selection of a protocol, as well as the
679     modification of an existing protocol to account for changes in sample substrate.

680     2.7.4  Assessment Plans

681     Concurrent with the development of MQOs and other specifications of the optimized analytical
682     design is the development of the data assessment plans. Data assessment is difficult and
683     arbitrary when attempted at the end of the project without planning and well-defined, project-
684     specific criteria. The development of these plans during the project planning process should
685     ensure that the appropriate documentation will be available for assessment and that those
686     implementing and assessing data will be aware of how the data will be assessed. Assessment of
687     environmental data consists of three separate and identifiable phases: data verification, data
688     validation, and data quality assessment (DQA). Verification and validation pertain to evaluation
689     of analytical data generated by the laboratory. DQA considers all sampling, analytical, and data
690     handling details, and other historical project data, when determining the usability of data in the
691     context of the decisions to be made. The focus of verification and validation is on the analytical
692     process and a point-by-point review of the data, while DQA considers the entire data collection
693     process and the entire data set as it assesses data quality. Verification, validation, and DQA
694     ensure that the technical strengths and weaknesses of the overall project data are known and,
695     therefore, establish the technical defensibility of the data. Assessment plan documents are
696     discussed in detail in Chapters 8 and 9.
697     2.7.4.1 Data Verification

698     The data verification process should be defined during the project planning process and
699     documented in a data verification plan or the project plan documents (e.g., the QAPP). The
700     verification plan should specify the types of documentation needed for verification. Analytical
701     data verification assures that laboratory conditions and operations were compliant with the
702     contractual SOW and project plan (i.e., SAP or QAPP). The contract for analytical services and
703     the project plan determine the procedures the laboratory must use to produce data of acceptable
704     quality (MQOs) and the content of the analytical data package. Verification compares the
705     material delivered by the laboratory to these requirements and checks for consistency of the data
706     throughout the data package, correctness of calculations, and completeness of the results to
707     ensure all documentation is available. Compliance, exceptions, missing documentation and the
708     resulting inability to verify compliance must be recorded in the data verification report. Data
709     verification is discussed in more detail in Chapter 8, Radiological Data Verification and
710     Validation.
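
        The completeness and internal-consistency checks described above lend themselves to simple
        automation. The sketch below is illustrative only: the required items, field names, and the
        detect-versus-MDC consistency check are assumptions, not a MARLAP-defined data package
        format.

            # Items a (hypothetical) SOW requires in each reported analytical record.
            REQUIRED_ITEMS = ["sample_id", "analyte", "result_bq_per_g",
                              "combined_standard_uncertainty", "mdc_bq_per_g",
                              "analysis_date"]

            def verify_record(record):
                """Return a list of verification exceptions for one reported result."""
                exceptions = []
                missing = [item for item in REQUIRED_ITEMS if item not in record]
                if missing:
                    exceptions.append("missing documentation: " + ", ".join(missing))
                # Consistency check: a result flagged as detected should exceed its MDC.
                if not missing and record.get("detected") and \
                        record["result_bq_per_g"] < record["mdc_bq_per_g"]:
                    exceptions.append("result flagged as detected but below the reported MDC")
                return exceptions

            print(verify_record({"sample_id": "S-01", "analyte": "Sr-90",
                                 "result_bq_per_g": 0.25,
                                 "combined_standard_uncertainty": 0.03,
                                 "mdc_bq_per_g": 0.05, "analysis_date": "2001-07-15",
                                 "detected": True}))  # -> []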

711     2.7.4.2 Data Validation

712     Performance objectives and criteria for data validation should be developed during the project
713     planning process and documented in a separate plan or included in the project plan documents
714     (e.g., QAPP). Guidance on Data Validation Plans is provided in Chapter 8, Radiological Data
715     Verification and Validation. After the data are collected, data validation activities will rely on the
716     planning process statements of the MQOs to confirm whether the obtained data meet the
717     requirements of the project.

718     2.7.4.3 Data Quality Assessment

719     The DQA process evaluates whether the quality and quantity of data will support their intended
720     use. The DQA process determines whether the data meet the assumptions under which the DQOs
721     and the data collection design were developed and whether the analytical uncertainty in the data
722     will allow the decision maker to use the data to support the decision within the tolerable decision
723     error rates established during the directed planning process. Guidance on the DQA Process and
724     plan development is provided in Chapter 9, Data Quality Assessment. The process and criteria to
725     be used for DQA should be developed by the project planning team and documented in
726     the project plan documents or in a stand-alone plan that is cited in or appended to the project plan
727     documents.
728                             Summary of Recommendations

729      •  MARLAP recommends the use of a directed project planning process.

730      •  MARLAP recommends that the radioanalytical specialists be a part of the integrated effort
731         of the project planning team.

732      •  MARLAP recommends that the planning process rationale be documented and the
733         documentation integrated with the project plan documents.
734     2.8    References

735     American National Standards Institute and the American Society for Quality Control
736        (ANSI/ASQC). 1994. Specifications and Guidelines for Quality Systems for Environmental
737        Data Collection and Environmental Technology Programs, National Standard E-4.

738     American Society for Testing and Materials (ASTM). 1994. Standard Guide for Quality
739        Planning and Field Implementation of a Water Quality Measurements Program, D5612.

740     American Society for Testing and Materials (ASTM). 1995a. Standard Practice for Generation
741        of Environmental Data Related to Waste Management Activities: Development of Data
742        Quality Objectives, D5792-95.

743     American Society for Testing and Materials (ASTM). 1995b. Standard Guide for Planning and
744        Implementing a Water Monitoring Program, D5851.

745     American Society for Testing and Materials (ASTM). 1996a. Standard Provisional Guidance for
746        Expedited Site Characterization of Hazardous Waste Contaminated Sites, PS85-96.

747     American Society for Testing and Materials (ASTM). 1996b. Standard Guide for Site
748        Characterization for Environmental Purposes with Emphasis on Soil, Rock, the Vadose Zone
749        and Ground Water, D5730-96.

750     MARSSIM. 2000. Multi-Agency Radiation Survey and Site Investigation Manual, Revision 1.
751        NUREG-1575 Rev. 1, EPA 402-R-97-016 Rev. 1, DOE/EH-0624 Rev. 1. August. Available
752        from http://www.epa.gov/radiation/marssim/filesfin.htm.

753     U.S. Army Corps of Engineers (ACE). 1998. Technical Project Planning (TPP) Process.
754        Engineer Manual EM-200-1-2.
755     U.S. Department of Energy (DOE). 1993. Remedial Investigation/Feasibility Study
756        (RI/FS) Process, Elements and Techniques Guidance, Module 7: Streamlined Approach for
757        Environmental Restoration. Office of Environmental Guidance, RCRA/CERCLA Division,
758        and Office of Program Support, Regulatory Compliance Division. Report DOE/EH-
759        94007658. December.

760     U.S. Environmental Protection Agency (EPA). September 1993. Data Quality Objectives Process
761        for Superfund. EPA/540/G-93/071, Washington, DC.

762     U.S. Environmental Protection Agency (EPA). 2000. Guidance for the Data Quality Objectives
763        Process (EPA QA/G-4). EPA/600/R-96/055, Washington, DC. Available from
764        www.epa.gov/quality1/qa_docs.html.

765     U.S. Environmental Protection Agency (EPA). 1998. Guidance for Quality Assurance Project
766        Plans (EPA QA/G-5). EPA/600/R-98/018, Washington, DC.

767     U.S. Environmental Protection Agency (EPA). 2000. Data Quality Objectives Process for
768        Hazardous Waste Site Investigations (EPA QA/G-4HW). EPA/600/R-00/007,
769        Washington, DC.

770     U.S. Nuclear Regulatory Commission (NRC). 1998a. Decision Methods for Dose Assessment to
771        Comply with Radiological Criteria for License Termination. NUREG-1549 (Draft).

772     U.S. Nuclear Regulatory Commission (NRC). 1998b. Demonstrating Compliance with the
773        Radiological Criteria for License Termination. Regulatory Guide DG-4006.

774     U.S. Nuclear Regulatory Commission (NRC). 1998c. A Nonparametric Statistical Methodology
775        for the Design and Analysis of Final Status Decommissioning Surveys. NUREG-1505, Rev. 1.

 1               3 KEY ANALYTICAL PLANNING ISSUES
 2           AND DEVELOPING ANALYTICAL PROTOCOL
 3                                 SPECIFICATIONS


 4     3.1   Introduction

 5     This chapter provides an overview of key analytical planning issues that should be addressed and
 6     resolved during a directed planning process (see Chapter 2). The resolution of these issues results
 7     in the development of Analytical Protocol Specifications (APSs). A key analytical planning issue
 8     may be defined as one that has a significant effect on the selection and development of analytical
 9     protocols, or one that has the potential to be a significant contributor of uncertainty to the
10     analytical process and, ultimately, the resulting data. It should be noted that a key analytical
11     planning issue for one project may not be a key issue for another project. From an analytical
12     perspective, one of the most important functions of a directed planning process is the
13     identification and resolution of these key issues for a project.

14     In accordance with a performance-based approach, APSs should contain only the minimum level
15     of specificity required to meet the project or program data requirements and resolve the key
16     analytical planning issues. Identification and resolution of these issues should be an integral part
17     of a directed planning process, and the APSs should be an output or product of that process. This
18     chapter provides a focused examination of analytical planning issues and the development of
19     APSs.

20     In order to assist the project planning team in identifying key issues, this chapter provides a list
21     of potential key analytical planning issues. Neither the list nor discussion of these potential issues
22     is an exhaustive examination of all possible issues for a project. However, this chapter does
23     provide a framework and a broad base of information that can assist in the identification of key
24     analytical planning issues for a particular project during a directed planning process.

25     Analytical planning issues can be divided into two broad categories—those that tend to be
26     matrix-specific and those that are more general in nature. While there is certainly some overlap
27     between these two broad categories, MARLAP divides analytical planning issues along these
28     lines because of the structure and logic it provides in developing APSs. This approach involves
29     identifying key analytical planning issues from the general (non-matrix-specific) issues first and
30     then proceeding on to the matrix-specific issues. Examples of non-matrix-specific analytical
31     planning issues include sample tracking  and custody issues. These general issues are discussed in
32     detail in Section 3.3. Examples of matrix-specific issues include filtration and preservation issues
33     of water samples. Matrix-specific analytical planning issues will be discussed in detail in Section
34     3.4. Section 3.5 provides guidance on assembling the APSs from the resolution of these issues.
35     Section 3.6 discusses defining the level of protocol performance demonstration required for a
36     particular project, and Section 3.7 discusses incorporating the APSs into the project plan
37     documents.

38     3.2    Overview of the Analytical Process

39     Identifying key analytical issues for a particular project requires a clear understanding of the
40     analytical process. The analytical process as described in Chapter 1 includes all activities, starting
41     with field sample preparation and preservation, followed by sample receipt and inspection,
42     laboratory sample preparation, sample dissolution, chemical separations, instrument measure-
43     ments, data reduction and reporting, and sample tracking and quality control of the process.
44     Figure 3.1 illustrates the analytical process. It should be noted that a particular project's ana-
45     lytical process may not include all of the activities listed above. For example, if the project's
46     analytical process involves performing gamma spectrometry on soil samples, sample dissolution
47     and chemical separation activities normally are not required. Each step of a particular analytical
48     process contains potential planning issues that may be key analytical planning issues depending
49     on the nature and data requirements of the project. Therefore, it is important to identify the
50     relevant activities of the analytical process for a particular project early in the directed planning
51     process. Once the analytical process for a particular project has been established, key analytical
52     planning issues, including both general and matrix-specific ones, can be identified.

53     3.3    General Analytical Planning Issues

54     This section discusses a number of general analytical planning issues that are common to many
55     types of projects and are often key planning issues, depending on the nature and data
56     requirements of the project. (Section 6.5 of Chapter 6 also discusses a number of these planning
57     issues to provide context on the method selection process.) This section presents each planning
58     issue as an activity to be accomplished during a directed planning process and also identifies the
59     expected outcome of the activity in general terms. The resolution of these general analytical
60     planning issues, particularly those that are key planning issues for a project, provides the basic
61     framework of the APSs and, therefore, should be identified and resolved before proceeding to
62     matrix-specific planning issues. Normally the resolution of these issues results, at a minimum, in
63     an analyte list, identified matrices of concern, measurement quality objectives (MQOs), and
64     established frequencies and acceptance criteria for quality control (QC) samples. The resolution
65     of matrix-specific issues, particularly those that are key issues for a project, normally provides
66     the necessary additions and modifications to the basic framework of the APSs needed to
67     complete and finalize the specifications. MARLAP recommends that any assumptions made

100     and when establishing the MQOs. Every effort should be made to obtain as much existing infor-
101     mation as possible prior to initiating a directed planning process.

102     Sometimes there is little or no historical data that can help identify radionuclides or the
103     concentration range of potential concern, or the existing data may be of inadequate quality. In
104     these cases, it may be necessary to perform preliminary analyses to identify the radionuclides of
105     concern or their concentration range. A fourth source of information is generated by conducting a
106     preliminary survey or characterization study. The design of preliminary surveys or characteriza-
107     tion studies should be part of the project planning process. The need for fast turnaround and
108     lower costs at this stage of the project may lead to different data quality objectives (DQOs) and
109     MQOs that are less restrictive than those used for the primary phase of the project.  However, it is
110     important that analytical requirements for the survey or study be established during the project
111     planning process. Gross alpha, gross beta, and gamma spectrometry analyses often are used for
112     preliminary survey or characterization studies.

113     The benefits of performing these types of measurements include:

114     •   Rapid analysis and short turnaround time;
115     •   Relatively low analytical costs; and
116     •   Detecting the presence of a wide range of radionuclides in a variety of media.

117     There are also limitations on the use of these analyses. These limitations include:

118     •   Pure alpha- and pure beta-emitting radionuclides are not specifically identified, and low-
119         energy gamma-emitting radionuclides generally are not identified; and

120     •   Failing to identify the presence of several radionuclides (e.g., 3H and other volatile
121         radionuclides; 55Fe and other radionuclides that decay by electron capture).

122     OUTPUT: An initial list of radionuclides of potential concern including a brief narrative explain-
123     ing why each radionuclide is on the list as well as an explanation of why certain radionuclides
124     were considered but not listed. This list may be modified as more project-specific information
125     becomes available. It is better to include radionuclides on the initial list even if the probability
126     that they significantly contribute to the addressed concerns is small. The consequence of
127     discovering an additional radionuclide of concern late in a project generally outweighs the effort
128     of evaluating its potential during planning.
129     3.3.2  Identify Concentration Ranges

130     Once the radionuclides of concern have been identified, the expected concentration range for
131     each radionuclide should be determined. Historical data, process knowledge, and previous
132     studies, if available, can be used to determine the expected concentration range for each analyte.
133     While most analytical protocols are applicable over a fairly large concentration range for the
134     radionuclide of concern, performance over a required concentration range can serve as an MQO
135     for the protocol selection process, and some analytical protocols may be eliminated if they cannot
136     accommodate the expected concentration range. In addition, the expected concentration ranges of
137     all of the radionuclides of concern can provide useful information about possible chemical and
138     spectral interferences. For example, while an analytical protocol for a particular radionuclide may
139     be able to accommodate the expected concentration range for that radionuclide, the concentra-
140     tions of other radionuclides may present interference problems, thus eliminating the use of that
141     analytical protocol.

142     OUTPUT: The expected concentration range for each radionuclide of concern as well as the
143     expected concentration range of any potential chemical or radiological interference.

144     3.3.3  Identify and Characterize Matrices of Concern

145     During a directed project planning process, the matrices of concern should be clearly identified.
146     For many projects, typical matrices may include surface soil, subsurface soil, sediment, surface
147     water, groundwater, drinking water, air particulates, biota, structural materials, metals, etc.
148     Historical data, process knowledge, previous studies, conceptual site models, transport models,
149     and other such sources generally are used to identify matrices of concern. It is critical to be as
150     specific as possible when identifying a matrix.

151     From an analytical perspective, information on the chemical and physical characteristics of a
152     matrix is extremely useful.  Therefore, in addition to identifying the matrices of concern, every
153     effort should be made to obtain any information available on the chemical and physical charac-
154     teristics of the matrices. This information is particularly important when determining the required
155     specificity of the analytical protocol, i.e., the ability to accommodate possible interferences. It is
156     also important to identify any possible hazards associated with the matrix, such as the presence
157     of explosive or other highly reactive chemicals. Issues related to specific matrices, such as filtra-
158     tion of water samples and removal of foreign material, are discussed in more detail in Section 3.5
159     and in Section 6.5.1.1 of Chapter  6.
160     OUTPUT: A list of the matrices of concern along with any information on the chemical and
161     physical characteristics of the matrices and any information on possible hazards associated with
162     them. As previously noted, the list of matrices of concern and the analyte list often are developed
163     concurrently. In some cases, one analyte list is applicable to all the matrices of concern, and in
164     other cases there are variations in the analyte lists for each matrix.

165     3.3.4  Determine Relationships Between the Radionuclides of Concern

166     Known or expected relationships among radionuclides can be used to establish "alternative"
167     radionuclides that may be easier and less costly to measure. In most cases, an "easy-to-measure"
168     radionuclide is analyzed, and the result of this analysis is used to estimate the concentration of
169     one or more radionuclides that may be difficult to measure or costly to analyze.

170     One of the best known and easiest relationships to establish is between a parent radionuclide and
171     its associated progeny. Once equilibrium conditions have been established, the concentration of
172     any member of the decay series can be used to estimate the concentration of any other member of
173     the series. For example, the thorium decay series contains 12 radionuclides. If each radionuclide
174     in this series is analyzed separately, the analytical costs can be very high. However, if equilib-
175     rium conditions for the decay series have been established, a single analysis using gamma spec-
176     trometry may be adequate for quantifying all of the radionuclides in the series simultaneously.

177     Similarly, process knowledge can be used to predict relationships between radionuclides. For
178     example, in a nuclear power reactor, steel may become irradiated, producing radioactive isotopes
179     of the elements present in the steel. These isotopes often include 60Co, 63Ni, and 55Fe. 60Co
180     decays by emission of a beta particle and two high-energy gamma rays, which are easily
181     measured using gamma spectrometry. 63Ni also decays by emission of a beta particle but has no
182     associated gamma rays. 55Fe decays by electron capture and has several associated X-rays with
183     very low energies. Laboratory analysis of 63Ni and 55Fe typically is time-consuming and
184     expensive. However, since all three radionuclides are produced by the same mechanism from the
185     same source material, there is an expected relationship at a given time in their production cycle.
186     Once the relationship between these radionuclides has been established, the 60Co concentration
187     can be used to estimate the concentration of 63Ni and 55Fe.

188     The uncertainty in the ratio between the radionuclide concentrations used in the alternate
189     analyte approach should be included as part of the combined standard uncertainty of the
190     analytical protocol in the measurement process. Propagation of uncertainties is discussed in
191     Chapter 19.
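
The arithmetic behind the alternate-analyte approach can be sketched in a few lines of code. The
following Python fragment is illustrative only: the nuclides, concentrations, and ratio are
hypothetical, and the simple quadrature propagation is a minimal stand-in for the Chapter 19
treatment.

    import math

    def alternate_analyte_estimate(c_measured, u_measured, ratio, u_ratio):
        """Estimate a hard-to-measure analyte from an easy-to-measure one.

        c_measured, u_measured: measured concentration and its standard
        uncertainty (e.g., 60Co by gamma spectrometry); ratio, u_ratio: the
        established 63Ni/60Co concentration ratio and its standard uncertainty.
        """
        estimate = c_measured * ratio
        # Combine relative uncertainties in quadrature (first-order propagation).
        rel = math.hypot(u_measured / c_measured, u_ratio / ratio)
        return estimate, estimate * rel

    # Hypothetical values: 0.50 +/- 0.05 Bq/g of 60Co and a 63Ni/60Co ratio
    # of 2.0 +/- 0.2 established for the project.
    est, u_est = alternate_analyte_estimate(0.50, 0.05, 2.0, 0.2)
    print(round(est, 3), round(u_est, 3))   # 1.0 +/- 0.141 Bq/g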
192     OUTPUT: A list of known radionuclide relationships (e.g., those based on parent-progeny rela-
193     tionships or previous study results) and a list of potential radionuclide relationships (i.e., based
194     on process knowledge). A preliminary study to determine the project-specific radionuclide
195     relationships may be necessary, and additional measurements may be required to confirm the
196     relationship used during the project. This information may be used to develop a revised analyte
197     list.

198     3.3.5  Determine Available Project Resources and Deadlines

199     The available project resources can have a significant impact on the selection or development of
200     analytical protocols, as well as the number and type of samples to be analyzed.  In addition,
201     project deadlines, and, in particular, required analytical turnaround times (see Section 6.5.3), can
202     be important factors in the selection and development of analytical protocols for a particular
203     project. During a directed planning process, radioanalytical specialists can provide valuable
204     information on typical costs and turnaround times for various types of laboratory analyses.

205     OUTPUT: A statement of the required analytical turnaround times for the radionuclides of concern
206     and the anticipated budget for the laboratory analysis of the samples.

207     3.3.6  Refine Analyte List and Matrix List

208     As additional information about a project is collected, radionuclides may be added to or removed
209     from the analyte list. There may be one analyte list for all matrices or separate lists for each
210     matrix. Developing an analyte list is an iterative process; the list should become more specific
211     during the project planning process.

212     Radionuclides might be added to the analyte list when subsequent investigations indicate that
213     additional radionuclides were involved in a specific project. In some cases, radionuclides may be
214     removed from the analyte list. When the initial analyte list is compiled, there may be significant
215     uncertainty associated with the presence of specific radionuclides. These radionuclides may be
216     included on the analyte list to be conservative, even when there is only a small  probability they
217     may be present. Subsequent investigations may determine if specific radionuclides are actually
218     present and need to be considered as part of the project. For example, a research laboratory was
219     licensed for a specific level of activity from all radionuclides with atomic numbers between 2 and
220     87. Even limiting the analyte list to radionuclides with a half-life greater than six months
221     provides several dozen radionuclides. A study may be designed to identify the actual
222     radionuclides of concern  through the use of historical records and limited analyses to justify
223     removing radionuclides from the analyte list.
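
As a simple illustration of the screening described in the licensed-laboratory example above, the
following Python sketch filters a candidate list by half-life. The nuclides and half-lives shown are
assumed sample values, not project data.

    # Half-lives in years for a hypothetical initial analyte list.
    half_life_years = {"Co-60": 5.27, "Ni-63": 101.2, "Fe-55": 2.74,
                       "Cs-137": 30.1, "S-35": 0.24, "P-32": 0.039}

    six_months = 0.5
    # Keep only radionuclides with half-lives greater than six months.
    long_lived = sorted(n for n, t in half_life_years.items() if t > six_months)
    print(long_lived)   # ['Co-60', 'Cs-137', 'Fe-55', 'Ni-63']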

224     OUTPUT: A revised analyte list. Radionuclides can always be added to or removed from the
225     analyte list, but justification for adding or removing radionuclides should be included in the
226     project documentation.

227     3.3.7  Method Performance Characteristics and Measurement Quality Objectives

228     The output of a directed planning process includes DQOs for a project. DQOs apply to all data
229     collection activities associated with a project, including sampling and analysis. In particular,
230     DQOs for data collection activities describe the overall level of uncertainty that a decisionmaker
231     is willing to accept for project results. This overall level of uncertainty is made up of
232     uncertainties from sampling and analysis activities.

233     Since DQOs apply to both sampling and analysis activities, what are needed from an analytical
234     perspective are performance objectives specifically for the analytical process of a particular
235     project. MARLAP refers to these performance objectives as measurement quality objectives. The
236     MQOs can be viewed as the analytical portion of the overall project DQOs. In a performance-
237     based approach, the MQOs are used initially for the selection and evaluation of analytical
238     protocols and are subsequently used for the ongoing and final evaluation of the analytical data.

239     In MARLAP, the development of MQOs for a project depends on the selection of an action level
240     and gray region for each analyte during the directed planning process. The term "action level" is
241     used to denote the numerical value that will cause the decisionmaker to choose one of the
242     alternative actions. The "gray region" is a set of concentrations close to the action level, where
243     the project planning team is willing to tolerate a high decision error rate (see Chapter 2 and
244     Appendices B and C for a more detailed discussion  of action levels and gray region). MARLAP
245     recommends that an action level and gray region be established for each analyte during the
246     directed planning process.

247     MARLAP provides guidance on developing MQOs for select method performance characteristics
248     such as:

249     •   The method uncertainty at a specified concentration (expressed as an estimated standard
250         deviation);

251     •   The method's detection capability (expressed as the minimum detectable concentration, or
252         MDC);
253     •   The method's quantification capability (expressed as the minimum quantifiable
254         concentration, or MQC);

255     •   The method's range, which defines the method's ability to measure the analyte of concern
256         over some specified range of concentration;

257     •   The method's specificity, which refers to the ability of the method to measure the analyte of
258         concern in the presence of interferences; and

259     •   The method's ruggedness, which refers to the relative stability of method performance for
260         small variations in method parameter values.

261     An MQO is a statement of a performance objective or requirement for a particular method per-
262     formance characteristic. An example MQO for the method uncertainty at a specified concentra-
263     tion, such as the action level, would be: "A method uncertainty of 0.01 Bq/g or less is required at
264     the action level of 0.1 Bq/g." A qualitative example of an MQO for method specificity would be
265     "The method must be able to quantify the amount of 226Ra present, given elevated levels of 235U
266     in the samples." MQOs may be quantitative or qualitative in nature.
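
As an illustration only, the two example MQOs above could be recorded as structured data for
later use in protocol selection and data evaluation. The field names and layout below are
assumptions for the sketch, not a MARLAP format.

    # Hypothetical record of the example MQOs above (field names are assumed).
    mqos = {
        ("Ra-226", "soil"): {
            "action_level_Bq_per_g": 0.1,
            "required_method_uncertainty_Bq_per_g": 0.01,  # at the action level
            "specificity": "quantify Ra-226 given elevated U-235 levels",
        },
    }
    print(mqos[("Ra-226", "soil")]["required_method_uncertainty_Bq_per_g"])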

267     The list provided in this section is not intended to be an exhaustive list of method performance
268     characteristics, and for a particular project, other method performance characteristics may be
269     important and should be addressed during the project planning process. In addition, one or more
270     of the method performance characteristics listed may not be important for a particular project.
271     From an analytical perspective, a key activity during project planning is the identification of
272     important method performance characteristics and the development of MQOs for the method
273     performance characteristics.

274     In addition to developing MQOs for method performance characteristics, MQOs may be estab-
275     lished for other parameters, such as data quality indicators (DQIs). DQIs are qualitative and
276     quantitative descriptors used in interpreting the degree of acceptability or utility of data. The
277     principal DQIs are precision, bias, representativeness, comparability, and completeness. These
278     five DQIs are also referred to by the acronym PARCC; the "A" stands for accuracy instead of
279     bias, although both indicators are included in discussions of the PARCC parameters (EPA,
280     1998). Since the distinction between imprecision and bias depends on context, and since a
281     reliable estimate of bias requires a data set that includes many measurements, MARLAP focuses
282     on developing an MQO for method uncertainty. Method uncertainty effectively combines
283     imprecision and bias into a single parameter whose interpretation does not depend on context.
284     This approach assumes that all potential sources of bias present in the analytical process have
285     been considered in the estimation of the measurement uncertainty and, if not, that any appre-
286     ciable bias would only be detected after a number of measurements of QC and performance
287     evaluation samples have been performed. MARLAP provides guidance on the detection of bias,
288     for example, during analytical protocol validation and evaluation (Chapters 6 and 7). However,
289     the most likely time to detect, and possibly correct, an unanticipated bias is during data quality
290     assessment (see Chapter 9).

291     While MARLAP does not provide specific guidance on developing MQOs for the DQIs, estab-
292     lishing MQOs for the DQIs may be important for some projects. EPA Guidance for Quality
293     Assurance Project Plans (EPA, 1998) contains more information on DQIs. MARLAP provides
294     guidance on developing MQOs for method performance characteristics in the next section.

295     3.3.7.1 Develop MQOs for Select Method Performance Characteristics

296     Once the important method performance characteristics for an analytical process have been iden-
297     tified, the next step is to develop MQOs for them. This section provides guidance on developing
298     MQOs for the method performance characteristics listed in the previous section. As noted, other
299     method performance characteristics may be important for a particular analytical process, and
300     MQOs should be developed for them during project planning.

301     METHOD UNCERTAINTY

302     While measurement uncertainty is a parameter associated with an individual result and is calcu-
303     lated after a measurement is performed, MARLAP uses the term "method uncertainty" to refer to
304     the predicted uncertainty of a measured value that would likely result from the analysis of a
305     sample at a specified analyte concentration. Method uncertainty is a method performance charac-
306     teristic much like the detection capability of a method. Reasonable values for both characteristics
307     can be predicted for a particular method based on typical values for certain parameters and on
308     information and assumptions about the samples to be analyzed. These predicted values can be
309     used in the method selection process to identify the most appropriate method based on a project's
310     data requirements. Because of its importance in the selection and evaluation of analytical proto-
311     cols and its importance in the evaluation of analytical data, MARLAP recommends that the
312     method uncertainty at a specified concentration (typically the action level) always be identified
313     as an important method performance characteristic, and that an MQO be established for it for
314     each analyte.

315     The MQO for the method uncertainty at a specified concentration plays a key role in MARLAP's
316     performance-based approach. It effectively links the three phases of the data life cycle: planning,
317     implementation, and assessment. This MQO, developed during the planning phase, is used
318     initially in the selection and validation of an analytical method for a project (Chapter 6). This
319     MQO provides criteria for the evaluation of QC samples during the implementation phase
320     (Appendix C and Chapter 7). It also provides criteria for verification and validation during the
321     assessment phase (Chapter 8). The use of the project-specific MQOs for the method uncertainty
322     of each analyte in the three phases of the life of a project, as opposed to arbitrary non-project-
323     specific criteria, helps to ensure the generation of radioanalytical data of known quality
324     appropriate for its intended use.

325     The MQO for method uncertainty for an analyte at a specified concentration, normally the action
326     level, is related to the width of the gray region. The gray region has an upper bound and a lower
327     bound. The upper bound typically is the action level. The width of the gray region is represented
328     by the symbol Δ. See Appendix B for information on setting up a gray region.

329     Appendix C provides the rationale and detailed guidance on the development of MQOs for
330     method uncertainty. Outlined below is MARLAP's recommended guideline for developing
331     MQOs for method uncertainty when a decision is to be made about the mean of a population
332     represented by multiple samples. Appendix C provides additional guidelines for developing
333     MQOs for method uncertainty when decisions are to be made about individual items or samples.

334     If decisions are to be made about the mean of a sampled population, MARLAP recommends that
335     the method uncertainty (u_MR) be less than or equal to the width of the gray region divided by 10
336     for sample concentrations at the upper bound of the gray region (typically the action level). If this
337     requirement cannot be met, the project planners should require at least that the method
338     uncertainty be less than or equal to the width of the gray region divided by 3 (Appendix C).
339                                        EXAMPLE

340     Suppose the action level is 0.1 Bq/g and the lower bound of the gray region is 0.02 Bq/g. If
341     decisions are to be made about survey units based on samples, then the required method
342     uncertainty (u_MR) at 0.1 Bq/g is

343                          u_MR = Δ/10 = (0.1 − 0.02)/10 = 0.008 Bq/g

344     If this uncertainty cannot be achieved, then a method uncertainty (u_MR) as large as Δ/3 ≈
345     0.027 Bq/g may be allowed if more samples are taken.
346     In the example above, the required method uncertainty (u_MR) is 0.008 Bq/g. In terms of method
347     selection, this particular MQO calls for a method that can ordinarily produce measured results
348     with expected combined standard uncertainties (1σ) of 0.008 Bq/g or less at sample concentra-
349     tions at the action level (0.1 Bq/g in this example). Although individual measurement uncertain-
350     ties will vary from one measured result to another, the required method uncertainty is effectively
351     a target value for the individual measurement uncertainties.
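
The guideline and example above reduce to simple arithmetic on the gray region. The short
Python sketch below, provided for illustration only, computes the required method uncertainty
(Δ/10) and the fallback limit (Δ/3) from an assumed action level and lower bound.

    def required_method_uncertainty(action_level, lower_bound):
        """Return the preferred (delta/10) and fallback (delta/3) method
        uncertainty requirements for decisions about a population mean."""
        delta = action_level - lower_bound    # width of the gray region
        return delta / 10, delta / 3

    # Values from the example: action level 0.1 Bq/g, lower bound 0.02 Bq/g.
    u_req, u_max = required_method_uncertainty(0.1, 0.02)
    print(round(u_req, 4), round(u_max, 4))   # 0.008 and 0.0267 Bq/g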

352     OUTPUT: MQOs expressed as the required method uncertainty at a specified concentration for
353     each analyte.

354     DETECTION AND QUANTIFICATION CAPABILITY

355     For a particular project, the detection capability or the quantification capability may be identified
356     as an important method performance characteristic during project planning. If the issue is
357     whether an analyte is present in an individual sample and it is  therefore important that the
358     method be able to reliably distinguish small amounts of the analyte from zero, then an MQO for
359     the detection capability should be established during project planning. If the emphasis is on being
360     able to make precise measurements of the analyte concentration for comparing the mean of a
361     sampled population to the action level, then an MQO for the quantification capability should be
362     established during project planning.

363     Detection Capability

364     When decisions are to be made about individual items or samples (e.g., drinking water samples),
365     and the lower bound of the gray region is at or near zero for the analyte of concern, the detection
366     capability of the method is an important method performance  characteristic, and an MQO should
367     be developed for it. MARLAP recommends that the MQO for the detection capability be
368     expressed as a required MDC (Chapter 19).

369     Outlined below is MARLAP's recommended guideline for developing MQOs for detection
370     capability. Appendix C provides the rationale along with detailed guidance on the development
371     of MQOs for detection capability.

372     If the lower bound of the gray region is at or near zero and decisions are to be made about
373     individual items or specimens, choose an analytical method whose minimum detectable
374     concentration is no greater than the upper bound of the  gray  region.1
          1 The MDC is defined as the analyte concentration at which the probability of detection is 1 − β.
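
As an illustration of this guideline (not a MARLAP procedure), a project team might screen
candidate methods by comparing each method's MDC to the upper bound of the gray region. The
method names and MDC values below are hypothetical.

    # Hypothetical candidate methods and their MDCs for drinking water samples.
    candidate_mdc_Bq_per_L = {"method A": 0.02, "method B": 0.15}
    upper_bound_Bq_per_L = 0.1   # upper bound of the gray region

    # Keep the methods whose MDC does not exceed the upper bound.
    acceptable = [name for name, mdc in candidate_mdc_Bq_per_L.items()
                  if mdc <= upper_bound_Bq_per_L]
    print(acceptable)   # ['method A']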

375     Quantification Capability

376     When decisions are to be made about a sampled population and the lower bound of the gray
377     region is at or near zero for the analyte of concern, the quantification capability of the method is
378     an important method performance characteristic and an MQO should be developed for it.
379     MARLAP recommends that the MQO for the quantification capability be expressed as a required
380     MQC (see Chapter 19).

381     Outlined below is MARLAP's recommended guideline for developing MQOs for quantification
382     capability. The MQC, as used in the guideline, is defined as the analyte concentration at which
383     the relative standard uncertainty is 10 percent (see Chapter 19). Appendix C provides the ration-
384     ale along with detailed guidance on the development of MQOs for quantification capability.

385     If the lower bound of the gray region is at or near zero and decisions are to be made about a
386     sampled population, choose an analytical method whose minimum quantifiable concentration is
387     no greater than the upper bound of the gray region, which is typically the action level.
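
Because the MQC is defined through a 10 percent relative standard uncertainty, a method's MQC
can be estimated from a model of its measurement uncertainty. The sketch below is illustrative
only: it assumes the common two-parameter model u(c) = sqrt(a^2 + (b*c)^2) as a stand-in for
the Chapter 19 treatment, with hypothetical parameter values.

    import math

    def mqc(a, b, rel_target=0.10):
        """Solve u(c)/c = rel_target for c under the assumed model
        u(c) = sqrt(a**2 + (b*c)**2): a is the absolute standard uncertainty
        near zero concentration, b the asymptotic relative uncertainty."""
        if b >= rel_target:
            raise ValueError("relative uncertainty never falls to the target")
        return a / math.sqrt(rel_target**2 - b**2)

    # Hypothetical model parameters: a = 0.004 Bq/g, b = 0.05 (5 percent).
    print(round(mqc(0.004, 0.05), 4))   # about 0.0462 Bq/g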

388     If an MQO for method uncertainty has been established, then establishing an MQO for the
389     quantification capability in terms of a required MQC is somewhat redundant since an MQC is
390     defined in terms of a specified relative standard uncertainty. However, this method performance
391     characteristic is included in MARLAP for several reasons. First, it has been included to empha-
392     size the importance of the quantification capability of a method for those instances where the
393     issue is not whether an analyte is present or not—for example, measuring 238U in soil where the
394     presence of the analyte is given—but rather how precisely the analyte can be measured. Second,
395     this method performance characteristic has been included so as to promote the MQC as an
396     important method parameter. And last, it has been included as an alternative to the overemphasis
397     on establishing required detection limits in those instances where detection (reliably distinguish-
398     ing an analyte concentration from zero) is not the key analytical question.

399     OUTPUT: If the lower bound of the gray region is at or near zero and decisions are to be made
400     about a sampled population, MQOs expressed as required MQCs should be developed for each
401     analyte. If instead decisions are to be made about individual items or specimens, MQOs
402     expressed as required MDCs should be developed for each analyte.

403     RANGE

404     Depending on the expected concentration range for an analyte (Section 3.3.2), the method's
405     range may be an important method performance characteristic. Most radioanalytical methods are
406     capable of performing over a fairly large range of activity concentrations. However, if the
407     expected concentration range is large for an analyte, the method's range should be identified as
408     an important method performance characteristic and an MQO should be developed for it. The
409     radioanalytical specialist on the project planning team will determine when the expected concen-
410     tration range of an analyte warrants the development of an MQO for the method's range. Since
411     the expected concentration range for an analyte is based on past data, which may or may not be
412     accurate, the MQO for the method's range should require that the method perform over a larger
413     concentration range than the expected range. This will help prevent the selection of methods
414     that cannot accommodate the actual concentration range of the analyte.

415     OUTPUT: MQOs for the method's concentration range for each analyte.

416     SPECIFICITY

417     Depending on the chemical and physical characteristics of the matrices, as well as the concen-
418     trations of analytes and the concentrations of other chemical constituents, the method's speci-
419     ficity may be an important method performance characteristic for an analytical process. Method
420     specificity refers to the ability of the method to measure the analyte of concern in the presence of
421     interferences. In order to determine if method  specificity is an important method performance
422     characteristic, the radioanalytical specialist on the project planning team will need information on
423     expected concentration ranges of the analytes  of concern and other chemical constituents in the
424     samples (Section 3.3.2), along with information on the chemical and physical characteristics of
425     the matrices (Section 3.3.3). If it is determined that method specificity is an important method
426     performance characteristic, then an MQO should be developed for it. The MQO can be qualita-
427     tive or quantitative in nature.

428     OUTPUT: MQOs for method specificity for those analytes likely to be affected by interferences.

429     RUGGEDNESS

430     For a project which involves analyzing samples which are complex in terms of their chemical
431     and physical characteristics, the method's ruggedness may be an important method performance
432     characteristic. Method ruggedness refers to the relative stability of the method's performance
433     when small variations in method parameter values are made, such as  a change in pH, a change in
434     amount of reagents used, etc. In order to determine if method ruggedness is an important method
435     performance characteristic, the radioanalytical specialist on the planning team needs detailed
436     information on the chemical and physical characteristics of the samples. If it is determined that
437     method ruggedness is an important method performance characteristic, then an MQO should be


        JULY 2001                   ....                                                   MARLAP
        DRAFT FOR PUBLIC COMMENT               3-15               "    DO NOT CITE OR QUOTE

-------
        Key Analytical Planning Issues...
438     developed for it. The MQO may require performance data that demonstrate the method's
439     ruggedness for specified changes in select method parameters. The statistical manual of the
440     Association of Official Analytical Chemists (AOAC) and the Standard Guide for Conducting
441     Ruggedness Tests (ASTM E1169) provide guidance on ruggedness testing.

442     OUTPUT: MQOs for method ruggedness for specified changes in select method parameters.

443     3.3.7.2 The Role of MQOs in the Protocol Selection and Evaluation Process

444     Once developed, the MQOs become an important part of the project's APSs and are subsequently
445     incorporated into project plan documents (Chapter 4) and into the analytical Statement of Work
446     (Chapter 5). In MARLAP, MQOs are used initially in the selection, validation, and evaluation of
447     analytical protocols (Chapters 6 and 7). In a performance-based approach, analytical protocols
448     are either accepted or rejected largely on their ability or inability to meet the project MQOs.

449     3.3.7.3 The Role of MQOs in the Project's Data Evaluation Process

450     Once the analytical protocols have been selected and implemented, the MQOs, in particular
451     the MQOs for method uncertainty, are used in the evaluation of the resulting laboratory data
452     relative to the project's analytical requirements. The most important MQO for
453     data evaluation is the one for method uncertainty at a specified concentration. It is expressed as
454     the required method uncertainty (u_MR) at some concentration, normally the action level (for this
455     discussion, it is assumed that the action level is the upper bound of the gray region). When the
456     analyte concentration of a laboratory sample is less than the action level, the combined standard
457     uncertainty of the  measured result should not exceed the required method uncertainty.

458     For example, if the required method uncertainty is 0.01 Bq/g or less at an action level of 0.1
459     Bq/g, then for any measured result less than 0.1 Bq/g, the laboratory's reported combined
460     standard uncertainty should be less than or equal to 0.01 Bq/g. When the concentration is greater
461     than the action level, the combined standard uncertainty of the measured result should not exceed
462     the relative value of the required method uncertainty. If the required method standard uncertainty
463     is 0.01 Bq/g or less at an action level of 0.1 Bq/g (10 percent of the action level), then for any
464     measured result greater than 0.1 Bq/g, the laboratory's reported combined standard uncertainty
465     should be no greater than 10 percent of the measured result. If an expanded uncertainty is
466     reported with each measured value, and the coverage factor is also specified, the combined
467     standard uncertainty may be calculated and checked against the required value. The check
468     described relies on the laboratory's estimate of its measurement uncertainty. Additional checks
469     are needed to ensure that the uncertainties are not seriously underestimated.
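
The evaluation rule described in this section can be expressed compactly. The following Python
sketch, illustrative only, checks a reported combined standard uncertainty against the MQO,
applying the absolute requirement below the action level and the relative requirement above it.

    def uncertainty_ok(result, u_reported, action_level, u_mr):
        """True if the reported combined standard uncertainty meets the MQO:
        u_mr (absolute) below the action level; (u_mr / action_level) times
        the result (relative) above it."""
        limit = u_mr if result < action_level else (u_mr / action_level) * result
        return u_reported <= limit

    # From the text: u_MR = 0.01 Bq/g at an action level of 0.1 Bq/g.
    print(uncertainty_ok(0.05, 0.009, 0.1, 0.01))   # True:  0.009 <= 0.01
    print(uncertainty_ok(0.25, 0.030, 0.1, 0.01))   # False: 0.030 > 0.025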


470     Appendix C provides guidance on developing criteria for QC samples based on the MQO for
471     method uncertainty. Specifically, Appendix C contains equations for determining warning and
472     control limits for QC sample results based on the project's MQO for method uncertainty.

473     The following example illustrates the use of the MQO for method uncertainty in evaluating QC
474     sample results. Chapter 8, Data Verification and Validation, provides guidance on developing
475     validation criteria based on the MQO for the required method uncertainty.

476                                        EXAMPLE

477      Suppose the upper bound of the gray region (the action level) is 0.1 Bq/g, and the required
478      method uncertainty (u_MR) at this concentration is 0.01 Bq/g, or 10 percent. A routine
479      laboratory control sample (LCS) is prepared with an analyte concentration of 0.150 Bq/g. (For
480      the purpose of this example, the uncertainty in the spike concentration is assumed to be
481      negligible.) The lab analyzes the LCS with a batch of samples and obtains the measured result
482      0.140 ± 0.008 Bq/g, where 0.008 Bq/g is the combined standard uncertainty (1σ).

483      Question: Is this LCS result acceptable?

484      Answer: The LCS result may be acceptable if it differs from the accepted true value by no
485      more than three times the required method uncertainty at that concentration. In this example,
486      the required method uncertainty is 10 percent at 0.150 Bq/g. So, the LCS result is required to
487      be within 30 percent of 0.150 Bq/g, or in the range 0.105-0.195 Bq/g. Since 0.140 Bq/g is
488      clearly in the acceptance range, the data user considers the result acceptable. Note also that the
489      laboratory's reported combined standard uncertainty is less than the required method
490      uncertainty, as expected.
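
The acceptance test applied in the example can be written out as a short check. This sketch is
illustrative only and assumes, as in the example, that the spike concentration is above the action
level, so the relative form of the requirement applies.

    def lcs_acceptable(measured, true_value, action_level, u_mr):
        """True if the LCS result is within three times the required method
        uncertainty of the accepted true value (relative form, since the
        spike exceeds the action level)."""
        u_at_spike = (u_mr / action_level) * true_value   # 10% of 0.150 Bq/g
        return abs(measured - true_value) <= 3 * u_at_spike

    # Values from the example: spike 0.150 Bq/g, measured 0.140 Bq/g.
    print(lcs_acceptable(0.140, 0.150, 0.1, 0.01))   # True (range 0.105-0.195)
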
491     3.3.8  Determine Any Limitations on Analysis Options

492     With the outputs of the resolution of a number of key analytical planning issues, such as a refined
493     analyte list, MQOs for the analyte list, known relationships between radionuclides of concern, a
494     list of possible alternate analytes, required analytical turnaround times, the analytical budget, etc.,
495     the project planning team may choose to determine the analyses to be performed for the project
496     and thereby limit the analysis options available to the laboratory. It should be emphasized that
497     determining which analyses need to be performed is not the same as indicating that a particular
498     analytical protocol or analytical method has to be used. With the exception of gross alpha and
499     beta measurements and gamma spectrometry, MARLAP uses the term "analysis" to refer to a
500     radionuclide/matrix combination. Examples of analyses to be performed include 3H in water,
501     90Sr in milk, 238Pu in soil, etc. Although determining the analyses to be performed during the planning
502     process may seem inconsistent with a performance-based approach, the project planning team
503     may determine the analyses to be performed or may decide to eliminate some analyses from
504     consideration. This decision may be based on information obtained during project planning, such
505     as the absence of equilibrium between the analyte and other radionuclides in its decay chain or
506     the presence of other radionuclides known to cause spectral interferences. However, in the
507     absence of such considerations, the project planning team should allow the laboratory the flexi-
508     bility of selecting the analyses which meet the analytical requirements as contained in the Ana-
509     lytical Protocol Specifications.

510     The role of the radioanalytical specialist is critical in determining if any limitations on analytical
511     options are necessary because of the many laboratory-related issues and factors involved. For
512     example, if several of the radionuclides of concern on the target analyte list are gamma-emitters,
513     the radioanalytical specialist can determine if gamma spectrometry is an appropriate analysis
514     given the required MQOs, matrices of concern, possible spectral interferences, etc. The radio-
515     analytical specialist may determine that not only is gamma spectrometry an appropriate analysis
516     for the gamma-emitting radionuclides of concern, but since there is evidence that equilibrium
517     conditions are present, the results for gamma spectrometry can be used for other radionuclides of
518     concern in the same decay chain as the gamma-emitting radionuclides. In other instances, such as
519     the use of gamma spectrometry to quantify 226Ra in the presence of elevated levels of 235U, the
520     radioanalytical specialist may determine that gamma spectrometry is not an appropriate analysis
521     due to possible spectral interferences. The following sections provide a brief overview of some
522     analysis procedures.

523     3.3.8.1 Gamma Spectrometry

524     In general, gamma spectrometry has many advantages over other choices. It is capable of
525     identifying and quantifying a large number of radionuclides. In comparison with other analyses, it
526     offers a fairly quick turnaround time and, since limited sample manipulation is involved, it is
527     relatively inexpensive, particularly compared to analyses which require sample dissolution and
528     chemical separations. It also allows for the use of relatively large sample sizes, thereby reducing
529     the measurement uncertainty associated with subsampling at the laboratory. However, despite its
530     many advantages, gamma spectrometry cannot be used to analyze for all radionuclides. For
531     example, gamma spectrometry may not be able to achieve the project's MQOs, since some or all
532     of the radionuclides of concern may not be gamma-emitters, interfering radionuclides may
533     present problems, etc. The radioanalytical specialist on the planning team can evaluate the
534     appropriateness of the use of gamma spectrometry for some or all of the radionuclides on the
535     analyte list or for alternate analytes.

536     3.3.8.2 Gross Alpha and Beta Analysis

537     Gross alpha and beta analysis provides information on the overall level of alpha- and beta-
53S     emitting radionuclides present in a sample. The analysis has the advantage of a relatively quick
539     turnaround time and generally is inexpensive compared to other analyses. The analysis also has
540     significant limitations. It does not identify specific alpha- and beta-emitting radionuclides, so the
541     source of the overall alpha and beta radiation is not determined by the analysis. It does not detect
542     contribution from low-energy beta-emitting radionuclides such as 3H. The measurement uncer-
543     tainty of the analysis, particularly for matrices other than water, tends to be larger than the meas-
544     urement uncertainty of other analyses. However, even with these limitations, gross alpha and beta
545     analysis can be an important and appropriate analysis for a project.

546     3.3.8.3 Radiochemical Nuclide-Specific Analysis
547     In many instances, due to the project's MQOs, the lack of an appropriate alternate analyte, the
548     lack of equilibrium conditions, etc., radiochemical nuclide-specific analyses are required. This is
549     often true when radionuclides such as 3H, 14C, 90Sr, isotopes of Pu, 99Tc, etc., are on the analyte
550     list. These analyses generally involve more manipulation of the samples than do gamma spec-
551     trometry and gross alpha and beta analysis. These analyses often require sample dissolution and
552     chemical separation of the radionuclides of concern. For liquid scintillation counting, distillation
553     is usually required for water samples, and some oxidative/combustion procedure is usually
554     required for solid samples. Because of this, these analyses generally have longer turnaround
555     times and are more expensive than other analyses.

556     Given the many analytical factors and considerations involved, the role of the radioanalytical
557     specialist is critical to determining if any limitations on analysis options are necessary.

558     OUTPUT: Any limitations on analysis options, if appropriate.

559     3.3.9  Determine Method Availability

560     After the required analyses have been determined along with the sample matrices, the required
561     MQOs, the analytical turnaround times, etc., the radioanalytical specialist should be able to
562     determine if there are analytical methods currently available to meet the project's requirements.
563     There are a number of sources of radioanalytical methods, including those published by the
564     American Society for Testing and Materials (ASTM), Standard Methods for the Examination of
565     Water and Wastewater (APHA/AWWA, 1992), methods published in scientific journals,
566     methods published in laboratory procedure manuals, and those published by Federal and State
567     agencies.

568     If there are no known analytical methods that would meet the project's analytical requirements,
569     the project planning team must evaluate options. They may decide to reevaluate the analytical
570     data requirements, such as the MQOs, to see if they can be changed to allow the use of existing
571     methods, or to increase the analytical budget and project timeline to allow for method development.

572     OUTPUT: A statement of method availability.

573     3.3.10  Determine the Type and Frequency of, and Evaluation Criteria for, Quality Control
574            Samples

575     There are three main types of laboratory QC samples—blanks, replicates, and spikes. In addition,
576     there are different types of blanks, replicates, and spikes. For example, spikes can be matrix
577     spikes, laboratory control samples, external performance evaluation samples, etc. Chapter 18
578     contains a detailed discussion of the different types of QC samples and the information they pro-
579     vide. Since the results of the three main types of QC samples often are used to evaluate different
580     aspects of the analytical process, most projects should employ all three types as part of the QC
581     process.

582     The frequency of laboratory QC sampling for a project essentially represents a compromise
583     between the need to evaluate and control the analytical process and the resources available. In
584     addition, the nature of the project and the intended use of the data will play a role in determining
585     the frequency of QC samples required. For example, the frequency of QC samples for a project
586     involving newly developed methods for analytes in a complex matrix normally should be greater
587     than the frequency of QC samples for a project using more established methods on a simpler
588     matrix, assuming the intended use of the data is the same for both projects. The radioanalytical
589     specialists on the project planning team play a key role in determining the type and frequency of
590     QC samples for a project.
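
For a rough sense of the resource side of this compromise, the sketch below (illustrative only,
not MARLAP guidance) counts the QC analyses implied by a batch-based design; the batch size of
20 and the one-QC-sample-of-each-type-per-batch convention are assumed values for the example.

```python
# Illustrative sketch only: estimate the QC-analysis workload implied by a
# batch-based QC frequency. The batch size of 20 and one blank/duplicate/
# matrix spike per batch are assumed conventions, not MARLAP requirements.
import math

def qc_sample_count(n_field_samples: int, batch_size: int = 20,
                    qc_types_per_batch: int = 3) -> int:
    """Number of QC analyses for a project, assuming each batch carries one
    QC sample of each type (e.g., blank, duplicate, matrix spike)."""
    n_batches = math.ceil(n_field_samples / batch_size)
    return n_batches * qc_types_per_batch

# Example: 150 field samples -> 8 batches -> 24 QC analyses, about 16 percent
# added analytical load; this is the direct cost of the chosen QC frequency.
print(qc_sample_count(150))
```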

591     In order to adequately evaluate laboratory data, it is important that the QC samples be clearly
592     linked to a group of project samples. Typically, this is done by analyzing QC samples along with
593     a batch of samples and reporting the results together.

594     In addition to determining the type and frequency of QC samples, evaluation criteria for the QC
595     sample results should be developed during the directed planning process and incorporated into
596     the project's APSs. Appendix C provides guidance on developing criteria for QC samples and
597     contains equations that calculate warning and control limits for QC sample results based on the
598     project's MQO for method uncertainty.
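
Appendix C's equations are not reproduced here; as a hedged illustration of the idea, the sketch
below assumes the common convention of warning limits at plus or minus two times, and control
limits at plus or minus three times, the method uncertainty; the acceptance multipliers are an
assumption for the example, not MARLAP's actual equations.

```python
# Illustrative sketch only. Appendix C gives MARLAP's actual equations; here
# we assume warning limits at +/-2u and control limits at +/-3u, where u is
# the MQO for method uncertainty at the QC sample's expected concentration.

def qc_limits(expected: float, u_mr: float) -> dict:
    """Warning/control limits for a QC result with expected value `expected`
    and method uncertainty `u_mr` (one standard deviation, same units)."""
    return {
        "warning": (expected - 2 * u_mr, expected + 2 * u_mr),
        "control": (expected - 3 * u_mr, expected + 3 * u_mr),
    }

# Example: a matrix spike expected at 0.5 Bq/g with u_mr = 0.04 Bq/g (the MQO
# from the example APS in Figure 3.3) gives warning limits of 0.42-0.58 Bq/g
# and control limits of 0.38-0.62 Bq/g.
print(qc_limits(0.5, 0.04))
```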

599     OUTPUT: List of type and frequency of QC samples required and the criteria for evaluating QC
600     sample results.

601     3.3.11  Determine Sample Tracking and Custody Requirements

602     A procedural method for sample tracking should be in place for all projects so that the proper
603     location and identification of samples is maintained throughout the life of the project. Sample
604     tracking should cover the entire process from sample collection to sample disposal. For some
605     projects, a chain-of-custody (COC) process is needed. COC procedures are particularly
606     important in demonstrating sample control when litigation is involved. In many cases, Federal,
607     State, or local agencies may require that COC be maintained for specific samples. Chapter 10,
608     Field and Sampling Issues That Affect Laboratory Measurements, provides guidance on sample
609     tracking and COC. It is important that the requirements for sample tracking be clearly established
610     during project planning.

611     OUTPUT: Project sample tracking requirements.

612     3.3.12  Determine Data Reporting Requirements

613     The data reporting requirements should be established during project planning. This involves
614     determining not only what is to be reported but also how it is to be reported. Items that are
615     routinely reported are listed below. It should be noted that this is not a comprehensive list, and
616     some projects may require the reporting of more items while other projects may require the
617     reporting of fewer items:

618     •  Field sample identification number
619     •  Laboratory sample identification number
620     •  Sample receipt date
621     •  Analysis date
622     •  Radionuclide
623     •  Radionuclide concentration units
624     •  Sample size (volume, mass)
625     •   Aliquant size (volume, mass)
626     •   Radionuclide concentration at specified date
627     •   Combined standard uncertainty or expanded uncertainty (coverage factor should be indicated)
628     •   Sample-specific minimum detectable concentration
629     •   Analysis batch identification
630     •   Quality control sample results
631     •   Laboratory instrument identification
632     •   Specific analytical parameters (e.g., chemical yields, counting times, etc.)
633     •   Analytical method/procedure reference

634     It is important that the required units for reporting specific items be determined during project
635     planning. MARLAP recommends that units of the International System of Units (SI) be used
636     whenever possible. However, since regulatory compliance levels are usually quoted in traditional
637     radiation units, it may be appropriate to report in both SI and traditional units, with one being
638     placed in parentheses. MARLAP also recommends that all measurement results be reported
639     directly as obtained, including negative values, along with the measurement uncertainty—for
640     example 2σ, 3σ, etc. Additional guidance on data reporting, including a discussion of electronic
641     data deliverables, is provided in Chapter  17, Data Acquisition, Reduction, and Reporting, and in
642     Chapter 5, Obtaining Laboratory Services.
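
As a simple illustration of these reporting recommendations (the format shown is not prescribed
by MARLAP), the sketch below reports a result as obtained, including a negative value, with its
expanded uncertainty in SI units and traditional units in parentheses; the conversion follows
from 1 pCi = 0.037 Bq.

```python
# Illustrative sketch only; MARLAP does not prescribe this exact format.
# Reports a result as obtained (negative values included), with expanded
# uncertainty U = k*u_c, in SI units with traditional units in parentheses.
PCI_PER_BQ = 1.0 / 0.037   # 1 pCi = 0.037 Bq exactly, so 1 Bq ~ 27.027 pCi

def report(value_bq_per_l: float, u_c: float, k: int = 2) -> str:
    """Format a concentration and its expanded uncertainty (coverage factor k)."""
    U = k * u_c
    si = f"{value_bq_per_l:.3f} +/- {U:.3f} Bq/L (k={k})"
    trad = f"{value_bq_per_l * PCI_PER_BQ:.2f} +/- {U * PCI_PER_BQ:.2f} pCi/L"
    return f"{si} ({trad})"

# A slightly negative net result is reported as obtained, not set to zero:
print(report(-0.012, 0.010))  # -0.012 +/- 0.020 Bq/L (k=2) (-0.32 +/- 0.54 pCi/L)
```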

643     OUTPUT: Data reporting requirements for a project.

644     3.4   Matrix-Specific Analytical Planning Issues

645     This section discusses a number of matrix-specific analytical planning issues common to many
646     types of projects. For each matrix there is a discussion of several potential key analytical plan-
647     ning issues specific to that matrix. It should be noted that what may be a key analytical planning
648     issue for one project, may not be a key issue for another project. The list of potential matrix-
649     specific key analytical planning issues discussed in this section is summarized in Table 3.1.
650     Table 3.1 is not a comprehensive list, but rather is an overview of some common matrix-specific
651     planning issues.

652     This section is divided into solids, liquids, and filters and wipes. While filters and wipes are solids,
653     they are discussed separately because of the unique concerns associated with them.

                  TABLE 3.1 — Matrix-specific analytical planning issues

  MATRIX                        RECOMMENDED KEY ISSUES
  Solids (soil, sediment,       Homogenization; subsampling; removal of unwanted
  structural material,          material; container type; container material; sample
  biota, metal, etc.)           preservation; screening samples for health and safety;
                                volatile compounds; sample identification; cross-
                                contamination; sample size; compliance with radioactive
                                materials license; compliance with shipping regulations;
                                chemical and physical form of the substrate

  Liquids (drinking water,      Is filtering required?; sample preservation; should
  groundwater,                  sample be filtered or preserved first?; sample
  precipitation, solvents,      identification; volume of sample; immiscible layers;
  oils, etc.)                   precipitation; total dissolved solids; reagent
                                background; compliance with radioactive materials
                                license; compliance with shipping regulations

  Filters and Wipes             Filter material; pore size; sample volume or area
                                wiped; sample identification; compliance with
                                radioactive materials license; compliance with shipping
                                regulations; subsampling; background from filter
                                material

664     3.4.1  Solids

665      Solid samples consist of a wide variety of materials that include soil and sediment; plant and
666      animal tissue; concrete; asphalt; trash, etc. In general, most solid samples do not require preser-
667      vation (Chapter 10) but do require specific processing both in the field and in the laboratory. In
668      certain instances, some biota samples may require preservation, primarily in the form of lowered
669      temperatures, to prevent sample degradation and loss of water. Some common analytical
670     planning issues for solid samples include homogenization and subsampling (Section 3.4.1.1) and
671      the removal of unwanted materials (Section 3.4.1.2). For certain types of biological samples,
672      removal and analysis of edible portions may be a key analytical planning issue.

673      Other issues that may represent key analytical issues for solids include container type and mate-
674      rial (Chapter 10); sample preservation (Chapter 10); sample drying—wet, dry, ashed weights and
675     ratios—(Chapter 10); screening samples for health and safety (Chapter 11); volatile compounds
676     (Chapter 10); sample identification (Chapters 10, 11, and 12); cross-contamination (Chapter 10);
677     sample size (Chapters 10, 11, and 12); compliance with the radioactive materials license and
678     shipping regulations (Chapter 11); and the chemical and physical form of the sample substrate
679     (Chapters 13 and 14).

680     3.4.1.1 Homogenization and Subsampling

681     For many types of analyses, a portion of the sample sent to the laboratory must be removed for
682     analysis. As with sampling in the field, this portion of the sample should be representative of the
683     entire sample. Adequate homogenization and proper subsampling techniques are critical to
684     obtaining a representative portion of the sample for analysis. Developing requirements for—and
685     measuring the adequacy of—homogenization processes and subsampling techniques can be
686     complicated for various types of solid matrices. General guidance on homogenization and sub-
687     sampling is provided in Chapter 12 and Appendix F. The input of the radioanalytical specialist as
688     a member of the project planning team is critical to developing requirements for homogenization
689     processes and subsampling techniques.
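
As one hedged illustration of "measuring the adequacy" of homogenization, the sketch below
computes the relative standard deviation of replicate subsample results; the 10 percent
acceptance value is a hypothetical project criterion, not MARLAP guidance (Chapter 12 and
Appendix F give the actual guidance).

```python
# Illustrative sketch only. Chapter 12 and Appendix F give MARLAP's actual
# guidance; this simply checks replicate subsample results against a
# hypothetical project criterion for homogenization adequacy.
from statistics import mean, stdev

def subsampling_rsd(results: list[float]) -> float:
    """Percent relative standard deviation of replicate subsample results."""
    return 100.0 * stdev(results) / mean(results)

# Five subsamples drawn from one homogenized soil sample (Bq/g):
replicates = [0.48, 0.52, 0.50, 0.47, 0.53]
rsd = subsampling_rsd(replicates)
print(f"RSD = {rsd:.1f}%")                              # ~5.1%
print("adequate" if rsd <= 10.0 else "re-homogenize")   # assumed 10% criterion
```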

690     3.4.1.2 Removal of Unwanted Materials

691     When a solid sample is collected in the field, extraneous material may be collected along with the
692     "intended" sample. For example, when collecting a soil sample, rocks, plant matter, debris, etc.,
693     may also be collected. Unless instructed otherwise, samples received by the laboratory typically
694     are analyzed exactly as they are received. Therefore, it is important to develop requirements
695     regarding the treatment of extraneous materials. Ultimately, these guidelines should be based on
696     the project's DQOs. The requirements should clearly state what, if anything, is to be removed
697     from the sample and should indicate what is to be done with the removed materials. The
698     guidelines should indicate where the removal process should occur (in the field, in the laboratory
699     or at both locations) and the material to be removed should be clearly identified.

700     For soil samples, this may involve identifying rocks of a certain sieve size, plant matter, debris,
701     etc., as extraneous material to be removed, weighed, and stored at the laboratory. For sediment
702     samples, requirements for occluded water should be developed. In the case of biological samples,
703     if the entire sample is not to be analyzed, the analytical portion should be identified clearly.

704     3.4.2  Liquids

705     Liquids include aqueous liquids (e.g., surface water, groundwater, drinking water, aqueous
706     process wastes, and effluents), nonaqueous liquids (e.g., oil, solvents, organic liquid process
707     wastes), and mixtures of aqueous and nonaqueous liquids.

708     A key analytical planning issue for most liquids is whether or not filtering is required or neces-
709     sary; this is discussed in Chapter 10. The question of whether or not to filter a liquid is generally
710     defined by the fundamental analytical question (Section 3.3.3). If the question is related to total
711     exposure from ingestion, the liquids are generally not filtered or the filters are analyzed
712     separately and the results summed. If the question is concerned with mobility of the analyte, the
713     concentration in the liquid fraction becomes more important than the concentration in the sus-
714     pended solids (although some suspended solids may still be important to questions concerning
715     mobility of contamination). In many projects, all of the liquids are filtered and the question
716     becomes which filters need to be analyzed. Issues related to this decision include where and
717     when to filter (Chapter  10); homogenization and subsampling (Chapter 10); volatile compounds
718     (Chapter 10); screening for health and safety (Chapter 11); and cross-contamination (Chapter
719     10).

720     Another key analytical planning issue involves preservation of liquid samples, which is also dis-
721     cussed in Chapter 10. Sample preservation involves decisions about the method of preservation
722     (temperature or chemical, Chapter 10), container type and material (Chapter 10), and chemical
723     composition of the sample (Chapters 13 and 14). Preservation of radionuclides in liquids is
724     generally accomplished in the same manner as preservation of metals for chemical analysis.
725     There are, of course, exceptions, such as for 3H and 129I.

726     A third key analytical issue results from the first two issues and involves the decision of which
727     issue should be resolved first. Should the sample be filtered and then preserved, or preserved first
728     and filtered later? This  issue is also discussed in Chapter 10. In general, acid is used to preserve
729     liquid samples. Since acid brings many radionuclides into solution from suspended or undis-
730     solved material, filtering is generally performed in the field prior to preserving the sample with
731     acid.

732     Other analytical planning issues that may be important for a specific project include: sample
733     identification (Chapters 10, 11, and 12); volume of sample (Chapter 10); compliance with radio-
734     active materials license and shipping regulations (Chapter 11); immiscible layers (for mixtures of
735     aqueous and nonaqueous liquids, Chapter 12); precipitation between filtration and analysis
736     (Chapter 12); total dissolved solids (Chapter 12); and reagent background (Chapter 12).


737     3.4.3  Filters and Wipes

738     Filters include a wide variety of samples, including liquid filters, air filters for suspended
739     particulates, and air filters for specific compounds. Once the decision to filter has been made,
740     there are at least three key analytical planning issues: filter material, pore size, and volume of
741     material to be filtered.

742     The selection of filter or wipe material can be very important. The wrong filter or wipe can
743     dissolve, break, or tear, thus invalidating the sample. Chapter 10 includes a discussion of the
744     various types of filter and wipe materials. Issues influencing this decision include the volume of
745     material to be filtered, the loading expected on the filter, and the chemical composition of the
746     material to be filtered.

747     The pore size is also important when preparing to filter. Too large a pore size will fail to collect
748     all of the material that is needed, while too small a pore size may lead to clogged filters and
749     reduced sample sizes. If an evaluation is being performed of respirable-size particles being
750     released by a process, the pore size of the filter should reflect this requirement.

751     The volume of material to be filtered, or area to be wiped, is generally determined by the detec-
752     tion requirements for the project. Lower detection limits require larger samples. Larger samples
753     may, in turn, result in problems with shipping samples or analytical problems where multiple
754     filters are required to meet the requested detection limits.
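
A back-of-the-envelope version of this sizing argument (not a MARLAP formula) is sketched
below: if the counting system can detect a given amount of activity on one filter, the required
MDC fixes a minimum volume to be drawn through the filter. The numerical values are
hypothetical.

```python
# Illustrative sketch only, not a MARLAP-prescribed calculation. If the
# counting system can detect `mda_bq` of activity on one filter (its minimum
# detectable activity), meeting a required minimum detectable concentration
# (MDC) in air implies a minimum sampled volume of V >= MDA / MDC.

def minimum_air_volume(mda_bq: float, required_mdc_bq_per_m3: float) -> float:
    """Minimum air volume (m^3) to filter so the required MDC is achievable."""
    return mda_bq / required_mdc_bq_per_m3

# Hypothetical values: MDA of 0.05 Bq per filter, required MDC of 1e-4 Bq/m^3
# -> at least 500 m^3 of air must be drawn through the filter.
print(minimum_air_volume(0.05, 1e-4))
```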

755     Other analytical planning issues that may be important for a specific project include sample
756     identification (Chapters 10, 11, and 12), compliance with radioactive materials license and ship-
757     ping regulations (Chapter 11), and background contributions from filter materials (Chapter 12).

758     3.5   Assembling the Analytical Protocol Specifications

759     After key general and matrix-specific analytical planning issues have been identified and
760     resolved, the next task of the project planning team is to organize and consolidate the results of
761     this process into APSs for the project. In general, there will be an APS for each type of analysis
762     (analyte-matrix combination). At a minimum, the APS should include the analyte list, the sample
763     matrix, possible interferences, the MQOs, any limitations on analysis options, the type and
764     frequency of QC samples along with acceptance criteria, and any analytical process requirements
765     (e.g., sample tracking requirements). The analytical process requirements should be limited to
766     only those requirements which are considered essential to meeting the project's  analytical data
767     requirements. For example, if the analyte of concern is known to exist in a refractory form in the
768     samples, then fusion for sample digestion may be included as an analytical process requirement.
769     However, in a performance-based approach, it is important that the level of specificity in the
770     Analytical Protocol Specifications be limited to those requirements which are considered
771     essential to meeting the project's analytical data requirements. The APS should be a one- or
772     two-page form that summarizes the resolution of key analytical planning issues.

773     Figure 3.2 provides an example form for Analytical Protocol Specifications with references to
774     sections in this chapter as major headers on the form. Figure 3.3 provides, as an example, an
775     APS for 226Ra in soil for an information gathering project.

776     3.6   Level of Protocol Performance Demonstration

777     As discussed in Section 3.3.7.3, during project planning, the project planning team should deter-
778     mine what level of analytical performance demonstration or method validation is appropriate for
779     the project. The question to be answered is how the analytical protocols will be evaluated. There
780     are three parts of this overall evaluation process: (1) the initial evaluation, (2) the ongoing evalu-
781     ation, and (3) the final evaluation. This section briefly discusses the initial evaluation of protocol
782     performance. Chapters 7 and 8 provide guidance on the ongoing and final evaluation of protocol
783     performance, respectively.

784     The project planning team should determine what level of initial performance demonstration is
785     required from the laboratory to demonstrate that the analytical protocols the laboratory proposes
786     to use will meet the MQOs and other requirements in the APSs. The project planning team
787     should decide the type and amount of performance data required. For example, for the analysis of
788     3H in drinking water, the project planning team may decide that past performance data from the
789     laboratory, such as the results of internal QC samples for the analysis of 3H in drinking water, are
790     sufficient for the initial demonstration of performance for the laboratory's analytical protocols if
791     they demonstrate the protocol's ability to meet the MQOs. If the analysis is for 238Pu in a sludge,
792     the project planning team may decide that past performance data (if they exist) would not be
793     sufficient for the initial demonstration of performance. The planning team may decide that
794     satisfactory results on performance evaluation samples would be required for the initial
795     demonstration of analytical protocol performance. Section 6.6 provides detailed guidance on
796     protocol performance demonstration/method validation, including a tiered approach based on the
797     project analytical needs and available resources.
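
As a hedged illustration of such an initial performance check (Section 6.6 gives MARLAP's
actual tiered criteria), the sketch below screens performance evaluation sample results against
the MQO for method uncertainty; the acceptance rule used here is an assumption for the example,
not MARLAP's validation criterion.

```python
# Illustrative sketch only; Section 6.6 gives MARLAP's actual validation
# criteria. Here we simply check performance evaluation (PE) sample results
# against the MQO for method uncertainty at the action level: both the mean
# bias and the result-to-result scatter should be consistent with that MQO.
from statistics import mean, stdev

def meets_mqo(results: list[float], known: float, u_mr: float) -> bool:
    """Crude screen: |mean bias| and sample standard deviation both within
    u_mr (a hypothetical acceptance rule, not MARLAP's)."""
    bias = mean(results) - known
    return abs(bias) <= u_mr and stdev(results) <= u_mr

# Seven PE samples spiked at the action level of 0.5 Bq/g, screened against
# u_mr = 0.04 Bq/g (the MQO from the example APS in Figure 3.3):
pe_results = [0.52, 0.47, 0.50, 0.49, 0.54, 0.48, 0.51]
print(meets_mqo(pe_results, known=0.5, u_mr=0.04))   # True
```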

                            Analytical Protocol Specifications

        Analyte List: (Sections 3.3.1, 3.3.7)        Analysis Limitations: (Section 3.3.9)
        Matrix: (Section 3.3.3)                      Possible Interferences: (Sections 3.3.3, 3.3.5)
        Concentration Range: (Section 3.3.2)         Action Level: (Section 3.3.8)

                                            MQOs:
        (Section 3.3.8)
        (Section 3.3.8)
        (Section 3.3.8)
        (Section 3.3.8)

                                          QC Samples
        Type                      Frequency                 Evaluation Criteria
        (Section 3.3.11)          (Section 3.3.11)          (Section 3.3.8.2)
        (Section 3.3.11)          (Section 3.3.11)          (Section 3.3.8.2)
        (Section 3.3.11)          (Section 3.3.11)          (Section 3.3.8.2)
        (Section 3.3.11)          (Section 3.3.11)          (Section 3.3.8.2)

                              Analytical Process Requirements*
        Activity                                          Special Requirements
        Field Sample Preparation and Preservation         (Section 3.4)
        Sample Receipt and Inspection                     (Section 3.4)
        Laboratory Sample Preparation                     (Section 3.4)
        Sample Dissolution                                (Section 3.4)
        Chemical Separations                              (Section 3.4)
        Preparing Sources for Counting                    (Section 3.4)
        Nuclear Counting                                  (Section 3.4)
        Data Reduction and Reporting                      (Section 3.3.12)
        Sample Tracking Requirements                      (Section 3.3.11)
        Other

        *Consistent with a performance-based approach, analytical process requirements should be
        kept to a minimum; therefore, none or N/A may be appropriate for many of the activities.

                       FIGURE 3.2 — Analytical protocol specifications

                        Analytical Protocol Specifications (Example)

        Analyte List: 226Ra
        Matrix: Soil
        Analysis Limitations: Must perform direct measurement of analyte, or analysis of
        progeny allowed if equilibrium established at laboratory
        Possible Interferences: Elevated levels of 235U
        Concentration Range: 0.01 to 1.50 Bq/g       Action Level: 0.5 Bq/g

                                            MQOs:
        A method uncertainty (uMR) of 0.04 Bq/g or less at 0.5 Bq/g

                                          QC Samples
        Type                      Frequency                 Evaluation Criteria
        Method blank              1 per batch               See attachment B*
        Duplicate                 1 per batch               See attachment B*
        Matrix spike              1 per batch               See attachment B*

                              Analytical Process Requirements
        Activity                                          Special Requirements
        Field Sample Preparation and Preservation         None
        Sample Receipt and Inspection                     None
        Laboratory Sample Preparation                     None
        Sample Dissolution                                None
        Chemical Separations                              None
        Preparing Sources for Counting                    None
        Nuclear Counting                                  None
        Data Reduction and Reporting                      See attachment A*
        Sample Tracking Requirements                      Chain-of-Custody
        Other                                             -

        *Attachments A and B are not provided in this example.

                    FIGURE 3.3 — Example analytical protocol specifications

853     3.7   Project Plan Documents
854     Once the APSs have been completed, they should be incorporated into the appropriate project
855     plan documents and, ultimately, into the analytical Statement of Work. Chapters 4 and 5 provide
856     guidance on the development of project plan documents and analytical Statements of Work,
857     respectively. While the APSs are concise compilations of the analytical data requirements, the
858     appropriate plan documents should detail the rationale behind the decisions made in the develop-
859     ment of the APSs.

860                                 Summary of Recommendations

861         MARLAP recommends that any assumptions made during the resolution of key analytical
862         planning issues be documented, and that these assumptions be incorporated into the
863         appropriate narrative sections of project plan documents.

864         MARLAP recommends that an action level and gray region be established for each analyte
865         during the directed planning process.

866         MARLAP recommends that the method uncertainty at a specified concentration (typically
867         the action level) always be identified as an important method performance characteristic,
868         and that an MQO be established for it for each analyte.

869         MARLAP recommends that the MQO for the detection capability be expressed as a
870         required minimum detectable concentration.

871         MARLAP recommends that the MQO for the quantification capability be expressed as a
872         required minimum quantifiable concentration.

873         MARLAP recommends that units of the International System of Units (SI) be used
874         whenever possible.

875         MARLAP recommends that all measurement results be reported directly as obtained,
876         including negative values, along with the measurement uncertainty.

877      3.8   References

878     American Public Health Association, American Water Works Association, Water Environment
879        Federation (APHA/AWWA). 1992. Standard Methods for the Examination of Water and
880        Wastewater, 18th ed. American Public Health Association, Washington, DC.

881     American Society for Testing and Materials (ASTM) E1169. Standard Guide for Conducting
882        Ruggedness Tests. 1989.

883     U.S. Environmental Protection Agency (EPA). 1998. Guidance for Quality Assurance
884        Project Plans (EPA QA/G-5). EPA/600/R-98/018, Washington, DC.

885     MARSSIM. 2000. Multi-Agency Radiation Survey and Site Investigation Manual, Revision 1.
886        NUREG-1575 Rev 1, EPA 402-R-97-016 Rev 1, DOE/EH-0624 Rev 1. August. Available
887        from http://www.epa.gov/radiation/marssim/filesiin.htm.

888     Youden, W.J. and E.H. Steiner. 1975. Statistical Manual of the Association of Official Analytical
889        Chemists. Association of Official Analytical Chemists International, Gaithersburg, MD.

                        4 PROJECT PLAN DOCUMENTS
 2     4.1    Introduction

 3     The project plan documents are a blueprint for how a particular project will achieve data of the
 4     type and quality needed and expected by the project planning team. In the planning documents,
 5     the data user's expectations and requirements, which are developed during the planning
 6     process—including the Analytical Protocol Specifications and measurement quality objectives
 7     (MQOs)—are documented along with the standard operating procedures (SOPs), health and
 8     safety protocols, and quality assurance/quality control (QA/QC) procedures for the field and
 9     laboratory analytical teams. The objectives of this chapter are to discuss:

10     •  The importance of project plan documents;
11     •  The elements of project plan documents; and
12     •  The link between project planning and project plan documents, in particular the incorporation
13        of the analytical protocols.

14     The importance of project plan documents is discussed in Section 4.2. Section 4.3 discusses a
15     graded approach to project plan documents. The different types of planning documents and the
16     elements of the project plan documents are discussed in Sections 4.4  and 4.5, respectively. The
17     link between project planning and project plan documents is discussed in Section 4.6.

18     The project plan documents should be dynamic documents, used and updated over the life of the
19     project. Under a performance-based approach, the analytical protocols requirements in the project
20     plan documents initially may reflect the Analytical Protocol Specifications established by the
21     project planning team and issued in the statement of work (SOW) (or Basic Ordering Agreement
22     Task Order). When the analytical laboratory has been selected, the project plan documents should
23     be updated to reflect the actual protocols to be used. The protocols should be cited, or the SOPs
24     for the  protocols should be included as appendices. (Analytical Protocol Specifications and the
25     relation to project measurement quality objectives (MQOs) have been discussed in Chapter 3 and
26     represented in Figure 3.2  and 3.3).

27     While this chapter will address the documentation of QA/QC used in project activities,
28     MARLAP is cognizant of, and fully endorses, the need for an organizational quality system and a
29     quality system, management plan, or quality manual. The development of the project plan
30     documents should be addressed in the quality system requirements documentation. The project
31     plan documents should reflect, and be consistent with, the organization's QA policies and
32     procedures. Guidance on  elements of a quality system for environmental data collection activities
33     is available from several sources including ANSI/ASQC (1994) and ISO Standard 9001 (1994).

34     The QA requirements have been developed by several Federal Agencies and consensus standard
35     organizations including the following:

36     •  10 CFR 830.120
37     •  10 CFR 50, Appendix B
38     •  ANSI N42.23-1996
39     •  ASME NQA-1-1989
40     •  DOE Order 414.1 on QA
41     •  EPA Order 5360.1 on Quality Systems (1998c)
42     •  DOD QA requirement MIL-Q-9858A

43     4.2    The Importance of Project Plan Documents

44     Project plan documents are important in environmental data collection activities to ensure that
45     the type and quantity of data are sufficient for the decision to be made. Project plans document
46     the decisions made during the planning process and integrate the technical operations with the
47     management and quality system practices. Project plans also:

48     •  Support data defensibility for environmental compliance;
49     •  Can be used to defend project objectives and budget; and
50     •  Are a tool for communication with stakeholders.

51     The development of project plan documents and the implementation of the project plan provide
52     the following benefits:

53     •  Full documentation for legal, regulatory, and historical use of the information;

54     •  Specification of data collection and quality control;

55     •  Documentation of analytical requirements through the incorporation of an Analytical
56        Protocol Specifications;

57     •  Implementation  of planned data collection activities (through internal and external
58        assessment and oversight activities); and

59     •  Meeting project-specific criteria (i.e., MQOs, DQOs) through data validation and usability
60        assessment.

61     4.3    A Graded Approach to Project Plan Documents

62     A graded approach is the process of basing the level of management controls applied to an item
63     or work on the intended use of the results and the degree of confidence needed in the quality of
64     the results (ANSI/ASQC, 1994). MARLAP recommends a graded approach to project plan
65     development because of the diversity of environmental data collection activities. This diversity in
66     the type of project and the data to be collected impacts the content and extent of the detail to be
67     presented in the plan document. The plan document development team should be flexible in their
68     application of guidance according to the nature of the work being performed and the intended use
69     of the data.

70     Under a graded approach, a mix of project-specific and site-based quality system documentation
71     may be relied upon to ensure quality. For example, the project specific plan may:

72     •   Address design, work processes, and inspection; and

73     •   Incorporate by citation site-wide plans that address records management, quality
74         improvement, procurement, and assessment.

75     A comprehensive and detailed project plan is required for some data collection activities because
76     of the need for legal and scientific defensibility of the data. A comprehensive and detailed plan
77     may also be desirable when Office of Management and Budget (OMB) clearance and approval is
78     needed to carry out the project (e.g., NRC/EPA proposed Publicly Owned Treatment Works
79     Survey).

80     Other environmental data collection activities, such as basic studies or small projects, may only
81     require a discussion of the experimental process and its objectives, which is often called a project
82     narrative statement. (Other titles used for project narrative statements are "QA narrative
83     statement" and "proposal QA plan" (EPA, 1998a). Basic studies and small projects generally are
84     of short duration or limited scope and could include proof of concept studies, exploratory
85     projects, small data collection tasks, feasibility studies, qualitative screens, or initial work to
86     explore assumptions or correlations. Although basic studies and small projects may be used to
87     acquire a better understanding of a phenomenon, they will not by themselves be used to make
88     significant decisions or establish policy. Further discussion on the content of plan documents for
89     basic studies and small projects is provided in Section 4.5.3.

 90     4.4   Project Plan Documents

 91     The ANSI/ASQC (1994) definition for a QA Project Plan (QAPP), which is also applicable to
 92     other integrated project plan documents, is "a formal document describing in comprehensive
 93     detail the necessary QA, QC and other technical activities that must be implemented to ensure
 94     that the results of the work performed will satisfy the stated performance criteria." The project
 95     plan documents should contain this information in a clear and integrated manner so that all
 96     implementation teams can understand their role and the project objectives.

 97     Project plan documents vary in size and format and are referred to by a variety of names. The
 98     size of the project plan documents tends to reflect the issuing agency's requirements and the
 99     complexity and scope of the project activities. Some projects with multiple phases may have more than one
100     plan document. For example, separate plan documents may be developed for scoping surveys,
101     characterization, and the final status survey for the same site because of the different objectives
102     and data requirements. Available guidance on project plans will be discussed in Section 4.4.1,
103     and various approaches are discussed in Section 4.4.2.

104     4.4.1  Guidance on Project Plan Documents

105     National standards guidance on project plan documents is available in:

106     •  ASTM Standard Practice (D5283) for Generation of Environmental Data Related to Waste
107        Management Activities: Quality Assurance and Quality Control Planning and
108        Implementation (ASTM,  1992);

109     •  Standard Guide (D5612), Quality Planning and Field Implementation of a Water Quality
110        Measurements Program (ASTM, 1994); and

111     •  Standard Provisional Guide (PS85) for Expedited Site Characterization of Hazardous Waste
112        Contaminated Sites (ASTM, 1996).

113     Guidance on project plans for environmental data collection activities in the federal sector is also
114     available (EPA, 1998a; 40 CFR 300.430; NRC, 1989; and USAGE, 1994 and 1997). Other
115     Federal Agency guidance may follow EPA guidance for QAPPs (EPA, 1998a).

116     4.4.2   Approaches to Project Plan Documents

117     The approach and naming of project plan documents is usually a function of the authoring
118     organization's experience, any controlling Federal or state regulations, or the controlling Agency.
119     Project plan, work plan, QAPP, field sampling plan, sampling and analysis plan, and dynamic
120     work plan are some of the names commonly used for project plan documents. The names can,
121     however, often represent different documents to different agencies, states, companies, and even to
122     different people within the same organization.

123     A work plan is often the primary and integrating plan document when the data collection activity
124     is a smaller supportive component of a more comprehensive project (for example, data collection
125     activity in support of an aspect of an environmental impact statement for a large multi-year
126     project). The QAPP is often the primary document when the data collection activity is a major
127     portion of the project (for example, data collection activity in support of an initial site
128     investigation). A National Contingency Plan (NCP) format (specified in 40 CFR 300.430) is
129     appropriate when data collection activities are in support of National Priorities List (NPL)
130     Superfund site projects. The NCP format has a sampling and analysis plan as the primary plan
131     document. The project documentation consists of two integrated documents: a field sampling
132     plan and a QAPP. Stand-alone health and safety plans are also developed.

133     Traditional site investigations are generally based on a phased engineering approach, which
134     collects samples based on a pre-specified grid pattern and does not provide the framework for
135     making changes in the plan in the field. The work plan (the project plan document) for the site
136     investigation typically will specify the number of samples to be collected, the location of each
137     sample and the analyses to be performed. A newer concept is to develop a dynamic work plan
138     (the project plan document), which, rather than specifying the number of samples to be collected
139     and the location of each sample, would specify the decision making logic that will be used in the
140     field to determine where the samples will be collected, when the sampling will stop, and what
141     analyses will be performed. Guidance on dynamic work plans is available in the Standard
142     Provisional Guide (PS85) for Expedited Site Characterization of Hazardous Waste Contaminated
143     Sites (ASTM, 1996).

144     MARLAP does not recommend a particular project plan document approach, title or arrange-
145     ment. Federal and state agencies have different requirements for the various environmental data
146     collection activities. In certain cases there are regulatory requirements. If an organization has
147     successful experience addressing the essential content of plan documents (Section 4.5) in a well
148     integrated, document format, it is usually unnecessary and wasteful of time and monies to change
149     a proven approach. The project plan document should reflect, and be consistent with, the
150     organization's QA policies and procedures.

151     MARLAP recommends a primary project plan document that includes other documents by
152     citation or as appendices. The primary project plan document serves to integrate the multi-
153     disciplinary sections, other management plans, and stand alone documents into a coherent plan.
154     Appropriate management plans may include the Health and Safety Plan, Waste Management
155     Plan, Risk Analysis Plan, Community Relations Plan, or Records Management Plan. If a detailed
156     discussion of the project already exists in another document, which is available to project
157     participants, then a brief description of site history and incorporation of the document into the
158     project plan document by reference may be appropriate. Incorporation by citation may also be
159     appropriate when the complexity of the project requires an extensive discussion of background
160     issues. Other documents that should be integrated, if available, are the report on the planning
161     process, the Data Validation Plan (Chapter 8), and the DQA Plan (Chapter 9). If stand alone
162     documents are not immediately available to project participants, they should be appended to the
163     (primary) project plan document.

164     4.5   Elements of Project Plan Documents

165     A project plan document must address a range of issues. The extent of the detail is dependent on
166     the type of project and the intended use of the results as previously discussed in applying a
167     graded approach to plan documents (Section 4.3). For all projects, the project plan document
168     must provide the project information and decisions developed during the project planning
169     process. Project plan documents should address:

170     •  The project's DQOs and MQOs;

171     •  The sampling and analytical protocols that will be used to achieve the project objectives; and

172     •  The assessment procedures and documentation that are sufficient to confirm that the data are
173        of the type and quality needed.

174     Content of plan documents is discussed in Section 4.5.1. The integration of project plan
175     documents is discussed in Section 4.5.2. Special consideration of project documentation for
176     small projects is discussed in Section 4.5.3.
177     4.5.1   Content of Project Plan Documents

178     The plan document development team should remain flexible with regard to format and should
179     focus on the appropriate content of plan documents needed to address the elements listed above.
180     The content of plan documents, regardless of the title or format, will include similar information,
181     including:

182     •  The project description and objectives;
183     •  Identification of those involved in the data collection and their responsibilities and
184        authorities;
185     •  Enumeration of the QC procedures to be followed;
186     •  Reference to specific SOPs that will be followed for all aspects of the projects; and
187     •  Health and Safety protocols.

188     The project plan document(s) should present the document elements as integrated chapters,
189     appendices, and stand-alone documents, and plans should be included by citation. Table 4.1
190      provides summary information on project plan elements for three different plan documents:
191      project plans, dynamic work plans, and QAPPs as provided in ASTM and EPA guidance. The
192      table also illustrates the similarity of project plan content.
193
194
195
196
197
198
199
200
201
202
203
204
205
206
207
                         TABLE 4.1—Elements of Project Plan Documents
             Project Plan
 (ASTM D5283,1992 and ASTM D5612,
   	1994)	
    Dynamic Work Plan
     (ASTM PS 83,1996)
           QAPP
         (EPA, 1998a)
 Project Management
 Identify individuals with designated res-
 ponsibility and authority to: (1) develop
 project documents; (2) select organizations
 lo perform the work; (3) coordinate com-
 munications; and (4) review and assess
 final data.
 Background Information
 Reasons for data collection.
 Identify regulatory programs governing
 data collection.
1.  Regulatory Framework
2.  Site Descriptions and
   History of Analyte Use and
   Discovery
3.  Analysis of Prior Data and
   Preliminary Conceptual
   Site Model	
                           A. Project Management
                           Al Approval Sheel
                           A2 Table of Contents
                           A3 Distribution List
                           A4 Project Organization
A5 Problem Definition and
   Background
         JULY 2001
         DRAFT FOR PUBLIC COMMENT
                                           4-7
                                               MARLAP
                                DO NOT CITE OR QUOTE

-------
         Project Plan Documents
                         1992 and ASTMD5612,
208
209
210
211
212
213
214
215
216
217
218
219


220
221
222
223
224
225
226
227
228

229
230
231
232
233
234
235
236
237
238
239
240
Project Objectives
  Clearly define objectives of field and
  laboratory work.
  Define specific objectives for the
  sampling location.
  Describe intended use of data.
Dynamic Technical Program
  Essential questions to be
  answered or specific  .
  objectives.
  Identify the investigation
  methods and the areas in which
  they may be applied.
  Provide clear criteria for
  determining when the project
  objectives have been met
A6 Project Description.
A7 Quality Objectives and Criteria
    for Measurement Data.
AS Special Training Require-
    ments/Certifications.
A9 Documentation and Records.
Sampling Requirements
Sample requirements are specified,
including:
  Sampling locations.
  Equipment and Procedures (SOPs).
  Sample preservation and handling.
Field Protocols and Standard
Operating Procedures (this
section may be attached as a
separate document)

[* see footnote]
Analytical Requirements
The analytical requirements are specified,
including:
  Analytical procedures (SOPs).
  Analytelist.
  Required method uncertainty.
  Required detection limits.
  Regulatory requirements and DQO
  specifications are considered.	
B. Measurement/Data
Acquisition
Bl Sampling Process Designs.
B2 Sampling Method
    Requirements.
B3 Sample Handling and Custody
    Requirements.
B4 Analytical Methods
    Requirements.
Quality Assurance and Quality Control
Requirements
  QA/QC requirements are addressed for
  both field and laboratory activities.
  Type and frequency of QC samples will
  be specified.
  Control parameters for field activities
  will be described.
  Performance criteria for laboratory
  analysts will be specified.
  Data validation criteria (for laboratory
  analysis) will be specified.
Quality Assurance and Quality
Control Plan
B5 Quality Control Requirements.
B6 Instrument/Equipment Testing
    Inspection and Maintenance
    Requirements.
B7 Instrument Calibration and
    frequency.
B8 Inspection/Acceptance
    Requirements for Supplies and
    Consumables.
B9 Data Acquisition
    Requirements for Non-direct
    Measurements.
BIO Data Management.	
         MARLAP  -
         DO NOT CITE OR QUOTE
                                               4-8
                                                    JULY 2001
                              DRAFT FOR PUBLIC COMMEN1

-------
                                                                            Project Plan Documents
                     Project Plan
         (ASTM D5283,1992 and ASTM D5612,
         	1994)	
                                       Dynamic Work Plan
                                       (ASTM PS 85,1996)
                                       QAPP
                                    (EPA, I998a)
241
242
243
244
245
246
247
248
249
250
251
252


253
254
255
256
257
258
259


260


261
262
263
264
265
266
267
268


269
270
271
272
273
 Project Documentation
 All documents required for planning,
 implementating, and evaluating the data
 collection efforts are specified, may
 include:
   SOW, Work Plan, SAP, QAPP, H&S
   Plan, Community Relations Plan.
   Technical reports assessing data.
   Requirements for field and analytical
   records.	.
1.  Data Management Plan
2.  Health and Safety Plan
3.  Community Relations Plan
C. Assessment/Oversight
Cl  Assessments and response
    Actions.
C2  Reports to Management.
D. Data Validation and Usability
D1  Data Review, Verifications
    and Validation Requirements.
D2  Verification and Validation
    Methods.
D3  Reconciliation with POO.
[* The combined Dynamic Technical Program section and Field Protocols and SOPs section is the functional
equivalent of a Field Sampling and Analysis Plan.]

253     Appendix D provides more detailed guidance on the content of project plan documents following
254     the outline developed by EPA requirements (EPA, 1998b) and guidance (EPA, 1998a) for
255     Quality Assurance Project Plans for environmental data operations. The EPA element identifiers
256     (A1, A2, etc.) and element titles are used in the tables and text of this chapter for ease of cross-
257     reference to the appropriate section in Appendix D. The EPA elements for a QAPP are used to
258     facilitate the presentation and do not represent a recommendation by MARLAP on the use of a
259     QAPP as the project plan document format.

260     4.5.2  Plan Documents Integration

261     MARLAP strongly discourages the use of a number of stand-alone plan components of
262     equivalent status without integrating information and without a document being identified as a
263     primary document. For large project plan compilations, it is appropriate to issue stand-alone
264     portions of the plan that focus on certain activities such as sampling, analysis, or data validation,
265     since it can be cumbersome for sampling and laboratory personnel to keep the entire volume(s)
266     of the project plan document readily available. However, each stand-alone component should
267     contain consistent project information, in addition to the component-specific plan information,
268     such as the following:

269     •   A brief description of the project including pertinent history;
270     •   A brief discussion of the problem to be solved or the question to be answered (DQO);
271     •   An organizational chart or list of key contact persons and means of contact;
272     •   The analyte(s) of interest; and
273     •   The appropriate health and safety protocols and documentation requirements.

274     In addition, a cross-referenced table is helpful in the primary document, which identifies where
275     project plan elements are located in the integrated plan document.

276     4.5.3  Plan Content for Small Projects

277     The project plan documents for small projects and basic studies (Section 4.3) generally consist of
278     three elements: the Title and Approval Sheet, the Distribution List, and a Project Narrative. The
279     Project Narrative should discuss in a concise manner the majority of issues that are normally
280     addressed in a project plan document,  such as a QAPP. A typical Project Narrative (EPA, 1998b)
281     may be a concise and brief description of:

282     •  Problem and site history (A5)
283     •  Project/task organization (A4)
284     •  Project tasks, including a schedule and key deliverables (A6)
285     •  Anticipated use of the data (A5, A6)
286     •  MQOs(A7)
287     •  Sampling process design requirements and description (Bl)
288     •  Sample type and sampling location requirements (B2)
289     •  Sample handling and custody requirements (B3)
290     •  Analytical protocols (B4)
291     •  QC and calibration requirements for sampling and analysis (B5, B7)
292     •  Inspection and maintenance of analytical instrumentation (B6)
293     •  Plans for peer or readiness reviews prior to data collection (C1)
294     •  Assessments to be conducted during actual operation (Cl)
295     •  Procedure for data review (D2)
296     •  Identification of any special reports on QA/QC activities, as appropriate (C2)
297     •  Reconciliation with DQOs or other objectives (D3)

298     Table 4.2 or Appendix D gives information on what is addressed in each bullet above, using the
299     element identifier shown in parentheses.

300     4.6   Linking the Project Plan Documents and the Project Planning Process

301     Directed planning processes (see Chapter 2 and Appendix B) yield many outputs, such as the
302     Analytical Protocol Specifications (Chapter 3), which must be captured in project plan
303     documents to ensure that data collection activities are implemented properly. MARLAP
304     recommends that the project plan documents integrate all technical and quality aspects for the life
305     cycle of the project, including planning, implementation, and assessment.

               TABLE 4.2 - Crosswalk Between Project Plan Document Elements
                              and Directed Planning Process

Each entry gives the QAPP element ID and title (EPA QA/R-5, 1998b)*, the element's required
Content, and the corresponding Directed Planning Process Input (where the planning process
supplies one).

PROJECT MANAGEMENT

A1  Title and Approval Sheet
    Content: Title and approval sheet.

A2  Table of Contents
    Content: Document control format.

A3  Distribution List
    Content: Distribution list for the plan document revisions and final guidance.
    Input: Include the members of the project planning team and stakeholders.

A4  Project/Task Organization
    Content: (1) Identify individuals or organizations participating in the project and discuss
    their roles and responsibilities. (2) Provide an organizational chart showing relationships
    and communication lines.
    Input: The directed planning process:
    • Identified the stakeholders, data users, and decision makers.
    • Identified the core planning team and the technical planning team members who will often
      be responsible for technical oversight.
    • Will often identify the specific persons/organizations that will be responsible for
      project implementation (sampling and analysis).

A5  Problem Definition/Background
    Content: (1) State the specific problem to be solved and the decision to be made.
    (2) Include enough background to provide a historical perspective.
    Input: Project planning team:
    • Documented the problem, site history, existing data, regulatory concerns, and background
      levels and thresholds.
    • Developed a decision statement.

A6  Project/Task Description
    Content: Identify measurements, special requirements, sampling and analytical methods,
    action levels, regulatory standards, required data and reports, quality assessment
    techniques, and schedules.
    Input: Project planning team identified:
    • Deadlines and other constraints that can impact scheduling.
    • Existing and needed data inputs.
    Project planning team established:
    • Action levels and tolerable decision error rates that will be the basis for the decision
      rule.
    • The optimized sampling and analytical design, as well as quality criteria.

A7  Quality Objectives and Criteria for Measurement Data
    Content: (1) Identify DQOs, data use, type of data needed, domain, matrices, constraints,
    action levels, statistical parameters, and acceptable decision errors. (2) Establish MQOs
    that link analysis to the user's quality objectives.
    Input: Project planning team:
    • Identified the regulatory standards and the action level(s).
    • Established the decision rule.
    • Described the existing and needed data inputs.
    • Described practical constraints and the domain.
    • Established the statistical parameter that is compared to the action level.
    • Established tolerable decision error rates used to choose quality criteria.
    • Established quality criteria linked to the optimized design.
    • Established data verification, validation, and assessment criteria and procedures.
    • Established the APSs and MQOs.

A8  Special Training Requirements/Certification
    Content: Identify and discuss special training/certificates required to perform the work.
    Input: Project planning team:
    • Identified training, certification, and accreditation requirements for the field and the
      laboratory.
    • Identified Federal and state requirements for certification of laboratories.
    • Identified Federal and state requirements for activities, such as disposal of
      field-generated residuals.

A9  Documentation and Record Content
    Content: Itemize the information and records that must be included in a data report
    package, including the report format and requirements for storage, etc.
    Input: Project planning team:
    • Indicated whether documents will be controlled and whether the distribution list is
      complete.
    • Identified documents that must be archived.
    • Specified the period of time that documents must be archived.
    • Specified procedures for error corrections (for hard copy and electronic files).

MEASUREMENT/DATA ACQUISITION

B1  Sampling Process Designs (Experimental Designs)
    Content: (1) Outline the experimental design, including the sampling design and rationale,
    sampling frequencies, matrices, and measurement parameter of interest. (2) Identify
    non-standard methods and the validation process.
    Input: Project planning team established the rationale for and details of the sampling
    design.

B2  Sampling Methods Requirements
    Content: Describe sampling procedures, needed materials and facilities, decontamination
    procedures, and waste handling and disposal procedures, and include a tabular description
    of sample containers, sample volumes, and preservation and holding time requirements.
    Input: Project planning team specified the preliminary details of the optimized sampling
    method.

B3  Sample Handling and Custody Requirements
    Content: Describe the provisions for sample labeling, shipment, sample tracking forms, and
    procedures for transferring and maintaining custody of samples.
    Input: Project planning team described the regulatory situation and site history, which can
    be used to identify the appropriate sample tracking level.

B4  Analytical Methods Requirements
    Content: Identify analytical methods and procedures, including needed materials, waste
    disposal, and corrective action procedures.
    Input: Project planning team:
    • Identified inputs to the decision (nuclide of interest, matrix, etc.).
    • Established the allowable measurement uncertainty that will drive the choice of the
      analytical protocols.
    • Specified the optimized sampling and analytical design.

B5  Quality Control Requirements
    Content: (1) Describe QC procedures and associated acceptance criteria and corrective
    actions for each sampling and analytical technique. (2) Define the types and frequency of
    QC samples, along with the equations for calculating QC statistics.
    Input: Project planning team:
    • Established the allowable measurement uncertainty, which will drive QC acceptance
      criteria.
    • Established the optimized analytical protocols and desired MQOs.

B6  Instrument/Equipment Testing, Inspection and Maintenance Requirements
    Content: (1) Discuss determination of acceptable instrumentation performance. (2) Discuss
    the procedures for periodic, preventive, and corrective maintenance.
    Input: Project planning team established the desired MQOs, which will drive acceptance
    criteria for instrumentation performance.

B7  Instrument Calibration and Frequency
    Content: (1) Identify tools, gauges, instruments, and other sampling or measurement devices
    that need calibration. (2) Describe how the calibration should be done.

B8  Inspection/Acceptance Requirements for Supplies and Consumables
    Content: Define how and by whom the sampling supplies and other consumables will be
    accepted for use in the project.

B9  Data Acquisition Requirements (Non-direct Measurements)
    Content: Define criteria for the use of non-direct measurement data, such as data that come
    from databases or literature.
    Input: Project planning team:
    • Identified the types of existing data that are needed or would be useful.
    • Established the desired MQOs, which would also be applicable to archived data.

B10 Data Management
    Content: (1) Outline the data management scheme, including the path of the data and the use
    of storage and record-keeping systems. (2) Identify all data handling equipment and
    procedures that will be used to process, compile, and analyze the data and to correct
    errors.

ASSESSMENT/OVERSIGHT

C1  Assessments and Response Actions
    Content: (1) Describe the number, frequency, and type of assessments needed for the
    project. (2) For each assessment, list the participants and their authority, the schedule,
    the expected information, the criteria for success and unsatisfactory conditions, those who
    will receive reports, and the procedures for corrective actions.
    Input: The project planning team established the MQOs and developed statements of the
    Analytical Protocol Specifications, which are used in the selection of the analytical
    protocols and in the ongoing evaluation of the protocols.

C2  Reports to Management
    Content: Identify the frequency, content, and distribution of reports issued to keep
    management informed.

DATA VALIDATION AND USABILITY

D1  Data Review, Verification and Validation Requirements
    Content: State the criteria, including specific statistics and equations, that will be used
    to accept or reject data based on quality.
    Input: Project planning team:
    • Established the MQOs for the sample analysis, and may also have discussed completeness
      and representativeness requirements that will be the basis of validation.
    • Established the action level(s) relevant to the project DQOs.
    • Established the data validation criteria.

D2  Verification and Validation Methods
    Content: Describe the process to be used for validating and verifying data, including COC
    for data throughout the lifetime of the project.
    Input: Project planning team:
    • Determines the appropriate level of custody.
    • May develop a Validation Plan.

D3  Reconciliation With Data Quality Objectives
    Content: Describe how results will be evaluated to determine if DQOs are satisfied.
    Input: Project planning team:
    • Defined the necessary data input needs.
    • Defined the constraints and boundaries with which the project would have had to comply.
    • Defined the decision rule.
    • Identified the hypothesis and tolerable decision error rates.
    • Defined MQOs for achieving the project DQOs.

[Adapted from: EPA, 1998a]
[* EPA QAPP elements are discussed in Appendix D]

The project plan should be a dynamic document, used and updated over the life of the project.
For example, the analytical methods requirements in the project plan documents (B4) will
initially reflect the Analytical Protocol Specifications established by the project planning
team (Chapter 3) and issued in the SOW or BOA Task Order (Chapter 5). When the analytical
laboratory has been selected (Chapter 7), the project plan document should be updated to
reflect the actual analytical protocols to be used, which should be incorporated by citation or
by including the SOPs as appendices.

4.6.1   Planning Process Report

MARLAP recommends the inclusion, by citation or as an appendix, of the directed planning
process report in the project plan documents. If the planning process was not documented in a
report, MARLAP recommends that the project plan document section on Problem Definition/
Background (A5) include a summary of the planning process that addresses, for example, the
assumptions and decisions made, the established action levels, the DQO statement, and the
Analytical Protocol Specifications, including the established MQOs and any specific analytical
process requirements. Additional detailed information on the Analytical Protocol
Specifications, including the MQOs, will be presented in the project plan document sections on
Project/Task Description (A6), Quality Objectives and Criteria for Measurement Data (A7), and
Analytical Methods Requirements (B4).

MARLAP views the project plan documents as the principal product of the planning process. To
illustrate how to capture and integrate the outputs of the planning process into the plan
document(s), Table 4.2 presents a crosswalk of the elements of the EPA QAPP document and the
outputs of a directed planning process.

4.6.2   Data Assessment

Assessment (verification, validation, and data quality assessment) is the last step in the
project's data life cycle and precedes the use of the data. Assessment, and in particular DQA,
is designed to evaluate the suitability of project data to answer the underlying project
question or to support the project decision. The project planners should define the assessment
process in enough detail that achievement of, or failure to meet, goals can be established upon
project completion. An important output of the directed planning process to be captured in the
project plan document is the data verification, validation, and assessment criteria and
procedures.

4.6.2.1 Data Verification

Analytical data verification assures that laboratory conditions and operations were compliant
with the contractual SOW and the project plan. Verification compares the data package to these
requirements (contract compliance) and checks for consistency and comparability of the data
throughout the data package, as well as for completeness of the results, to ensure that all
necessary documentation is available. Performance criteria for verification should be
documented in the contract and in the project plan document in the sections that address Data
Review, Verification, and Validation Requirements (D1) and Verification and Validation Methods
(D2).
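
By way of illustration only, the completeness portion of verification lends itself to a simple
mechanical check. The following Python sketch assumes a hypothetical list of required
deliverables; it is not a MARLAP-prescribed procedure:

    # Hypothetical completeness check used during data verification: compare
    # the deliverables the SOW requires against the items present in a data
    # package and report anything missing. The item names are illustrative.

    REQUIRED_DELIVERABLES = {
        "case narrative",
        "sample results",
        "QC sample results",
        "calibration verification data",
        "chain-of-custody form",
    }

    def missing_deliverables(package_contents):
        """Return the required deliverables absent from the data package."""
        return REQUIRED_DELIVERABLES - set(package_contents)

    received = ["case narrative", "sample results", "QC sample results"]
    for item in sorted(missing_deliverables(received)):
        print("MISSING:", item)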

4.6.2.2 Data Validation

Validation addresses the reliability of the data. During validation, the technical reliability
and the degree of confidence in the reported analytical data are considered. Data validation
criteria and procedures should be established during the planning process and captured in the
project plan document (and in the SOW for the validation contractor). Performance criteria for
data validation can be documented directly in the project plan document in Data Review,
Verification, and Validation Requirements (D1) and Verification and Validation Methods (D2), or
in a separate plan that is included by citation or as an appendix in the project plan document.

Guidance on Data Validation Plans is provided in Chapter 8, Section 8.3. The data validation
plan should contain the following information (an illustrative sketch follows the list):

 • A summary of the project, which provides sufficient detail about the project's Analytical
   Protocol Specifications, including the MQOs;

 • The set of data to be validated, whether all the raw data will be reviewed, and in what
   detail;

 • The necessary validation criteria and the MQOs deemed appropriate for achieving project
   DQOs;

 • Specifications on what qualifiers are to be used and how final qualifiers are to be
   assigned; and

 • Information on the content of the validation report.
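
As an illustration of the qualifier bullet above, a minimal sketch of qualifier assignment
follows. The flags (U, J, R) echo common EPA data validation usage, but the numeric thresholds
are invented for illustration and are not MARLAP values:

    # Hypothetical qualifier assignment for one analytical result. A real
    # validation plan would derive the thresholds from the project's MQOs
    # and validation criteria.

    def assign_qualifier(result, combined_uncertainty, detection_limit, chemical_yield):
        """Return a validation qualifier for a single result."""
        if chemical_yield < 0.2:                       # unacceptably low yield
            return "R"                                 # result rejected
        if result < detection_limit:                   # not detected
            return "U"
        if combined_uncertainty > 0.3 * abs(result):   # large relative uncertainty
            return "J"                                 # estimated value
        return ""                                      # accepted as reported

    print(assign_qualifier(result=1.0, combined_uncertainty=0.4,
                           detection_limit=0.5, chemical_yield=0.85))   # "J"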

4.6.2.3 Data Quality Assessment

Data Quality Assessment consists of a scientific and statistical evaluation of project-wide
knowledge to determine if the data set is of the right type, quality, and quantity to support
its intended use. The data quality assessor integrates the data validation report, field
information, assessment reports, and historical project data, and compares the findings to the
original project objectives and criteria (DQOs).
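
For instance, if the DQA plan names a nonparametric test, the comparison to the action level
might look like the following sketch (generic Sign test arithmetic in the spirit of EPA QA/G-9;
the data are invented):

    # Sign test sketch: test H0 "median >= action level" against
    # H1 "median < action level" using a one-sided binomial p-value.
    from math import comb

    def sign_test_p_value(measurements, action_level):
        """P(X >= k) for X ~ Binomial(n, 1/2), where k counts results
        below the action level; exact ties are discarded."""
        diffs = [x - action_level for x in measurements if x != action_level]
        n = len(diffs)
        k = sum(1 for d in diffs if d < 0)
        return sum(comb(n, i) for i in range(k, n + 1)) / 2 ** n

    data = [3.1, 2.8, 4.0, 3.5, 2.9, 3.3, 3.0, 2.7]   # Bq/g, hypothetical
    p = sign_test_p_value(data, action_level=5.0)
    print("p-value:", round(p, 4))   # small p supports "median below action level"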

Performance criteria for data usability for the project should be documented in the project
plan documents in a section on DQA or reconciliation of the data results with DQOs (D3), or in
a separate plan that is included by citation or as an appendix in the project plan document.
Guidance on DQA Plans is provided in Section 9.5. The DQA plan should contain the following
information:

 • A summary of the project, which provides sufficient detail about the project's DQOs and
   tolerable decision error rates;

 • Identification of the issues that will be addressed by the DQA;

 • Identification of any statistical tests that will be used to evaluate the data;

 • Description of how the representativeness of the data will be evaluated (for example, review
   of the sampling strategy, the suitability of sampling devices, subsampling procedures, and
   assessment findings);

 • Description of how the accuracy of the data, including the potential impact of
   non-measurable factors (for example, subsampling bias), will be considered (for example,
   review of the Analytical Protocol Specifications and the analytical plan, the suitability of
   analytical protocols, subsampling procedures, and assessment findings);

 • Description of how the MQOs will be used to determine the usability of measurement data
   (that is, did the uncertainty in the data significantly affect confidence in the decision);

 • Identification of what will be included in the DQA report; and

 • Identification of who will receive the report and the mechanism for its archival.

 Summary of Recommendations

 MARLAP recommends:

 • A graded approach to project plan writing, because of the diversity of environmental data
   collection activities;

 • A primary integrating project plan that includes other documents by citation or as
   appendices;

 • Project plan documents that integrate all technical and quality aspects for the life cycle
   of the project, including planning, implementation, and assessment; and

 • Inclusion, by citation or as an appendix, of the report on the directed planning process in
   the project plan documents. If the planning process was not documented in a report, MARLAP
   recommends that a summary of the planning process addressing assumptions and decisions,
   established action levels, the DQO statement, established MQOs, and the Analytical Protocol
   Specifications be included in the project plan documents.

4.7    References

American National Standards Institute and the American Society for Quality Control
  (ANSI/ASQC). 1994. Specifications and Guidelines for Quality Systems for Environmental
  Data Collection and Environmental Technology Programs, National Standard E-4.

American National Standards Institute (ANSI). 1996. Measurement and Associated Instruments
  Quality Assurance for Radioassay Laboratories, National Standard N42.23.

American Society of Mechanical Engineers (ASME). 1989. Quality Assurance Program
  Requirements for Nuclear Facilities. NQA-1, ASME, New York, New York.

American Society for Testing and Materials (ASTM). 1992. Standard Practice for Generation of
  Environmental Data Related to Waste Management Activities: Quality Assurance and Quality
  Control Planning and Implementation, D5283.

American Society for Testing and Materials (ASTM). 1994. Standard Guide for Quality Planning
  and Field Implementation of a Water Quality Measurements Program, D5612.

American Society for Testing and Materials (ASTM). 1996. Standard Provisional Guidance for
  Expedited Site Characterization of Hazardous Waste Contaminated Sites, PS85.

Code of Federal Regulations (CFR). 1999. 10 CFR 50, Appendix B, "Quality Assurance Criteria
  for Nuclear Power Plants and Fuel Reprocessing Plants."

Code of Federal Regulations (CFR). 1994. 10 CFR 830.120, "Nuclear Safety Management - Quality
  Assurance Requirements."

Code of Federal Regulations (CFR). 1997. 40 CFR 300.430, "National Oil and Hazardous Substance
  Pollution Contingency Plan - Remedial Investigation/Feasibility Study and Selection of
  Remedy."

International Organization for Standardization (ISO). 1994. Quality Systems - Model for Quality
  Assurance in Design, Development, Production, Installation and Servicing, ISO Standard 9001.

U.S. Army Corps of Engineers (USACE). 1994. Requirements for the Preparation of Sampling and
  Analysis Plans. Engineer Manual EM 200-1-3.

U.S. Army Corps of Engineers (USACE). 1997. Chemical Quality Assurance for Hazardous, Toxic and
  Radioactive Waste Projects. Engineer Manual EM 200-1-6.

U.S. Department of Defense (DOD). 1963. Quality Program Requirements. Military Specification
  MIL-Q-9858A. Washington, DC.

U.S. Department of Energy (DOE). 1991. Quality Assurance. DOE Order 414.1 (replaced DOE Order
  5700.6C), Washington, DC.

U.S. Environmental Protection Agency (EPA). 1998a. EPA Guidance for Quality Assurance Project
  Plans (EPA QA/G-5). EPA/600/R-98/018, Washington, DC.

U.S. Environmental Protection Agency (EPA). 1998b. EPA Requirements for Quality Assurance
  Project Plans for Environmental Data Operations. EPA QA/R-5, External Review Draft Final,
  Washington, DC.

U.S. Environmental Protection Agency (EPA). 1998c. EPA Policy and Program Requirements for the
  Mandatory Agency-Wide Quality System. EPA Order 5360.1, Washington, DC.

U.S. Nuclear Regulatory Commission (NRC). 1989. Standard Format and Content of Decommissioning
  Plans for Licensees Under 10 CFR Parts 30, 40, and 70. Regulatory Guide 3.65.

                  5  OBTAINING LABORATORY SERVICES

5.1   Introduction

This chapter provides guidance on obtaining radioanalytical laboratory services. In particular,
this chapter discusses the broad items that should be considered in the development of a
procurement vehicle to obtain laboratory services. Throughout this chapter, MARLAP uses the
request for proposal (RFP) as an example of a procurement vehicle. Agencies and other
organizations may use a variety of procurement vehicles, depending upon circumstances and
policies. The RFP typically includes a statement of work (SOW), generic contractual
requirements, and a description of the laboratory qualification and selection process. It
should be noted that for some agencies or organizations, not all technical, quality, and
administrative aspects of a contract are specified in a SOW. Many technical, administrative,
legal, and regulatory items are specified in an RFP and eventually in a contract. More detailed
guidance and discussion of contracting issues (such as scoring proposals) can be found in
Appendix E. This chapter is written for contracting outside laboratory services, but the
principal items and information provided would be applicable to obtaining services not
requiring a formal contract, such as a service agreement within an Agency or organization. It
should be noted that the information and specifications of a SOW may appear in many contract
vehicles other than a formal contract resulting from an RFP. These include purchase and work
orders, as well as task orders under a Basic Ordering Agreement. MARLAP recommends that
technical specifications be prepared in writing in a single document designated as a SOW for
all radioanalytical laboratory services, regardless of whether the services are to be
contracted out or performed by an Agency's laboratory.

Analytical Protocol Specifications (APSs) should be compiled in the SOW in order for the
laboratory to propose the analytical protocols that it wishes to use for the project
(Chapter 6). The development of APSs, which include the measurement quality objectives (MQOs),
was described in detail in Chapter 3, and the incorporation of these protocols into the
relevant project plan documents was covered in Chapter 4. These specifications should include
such items as the MQOs, the type and frequency of quality control (QC) samples, the level of
performance demonstration needed, the number and type of samples, turnaround times, and the
type of data package.

Section 5.3 of this chapter discusses the technical requirements of a SOW, Section 5.4 provides
guidance on generic contractual requirements, and Section 5.5 discusses various elements of the
laboratory selection and qualification criteria.

5.2   Importance of Writing a Technical and Contractual Specification Document

One objective of the SOW and contractual documents is to provide the analytical requirements in
a concise format that will facilitate the laboratory's selection of the appropriate analytical
protocols. The authors of the SOW may be able to extract most, if not all, of the necessary
technical information from the project plan documents (Chapter 4) if those documents have been
prepared properly. If specific information is not available, the author should contact the
planning team. The preparation of a SOW can be viewed as a check that the project planning
documents contain all the information required for the selection and implementation of the
appropriate analytical protocols. One important aspect of writing the SOW is that it should
clearly identify the project laboratory's responsibility for documentation to be provided for
subsequent data verification, validation, and quality assessment—these project laboratory
requirements should be addressed in the assessment plans developed during directed planning
(Chapter 2).

5.3   Statement of Work — Technical Requirements

A review of the project plan documents (Chapter 4) should result in a summary list of the
technical requirements needed to develop a SOW. Much of this information, including the project
MQOs and any unique analytical process requirements, will be contained in the APSs. When
possible, a project summary of sufficient detail (i.e., process knowledge) to be useful to the
laboratory should be included in the SOW. The Project Planning Team is responsible for
identifying and resolving key analytical planning issues and for ensuring that the resolutions
of these issues are captured in the APSs. Consistent with a performance-based approach, the
level of specificity in the APSs is limited to those requirements that are essential to meeting
the project's analytical data requirements. In response, the laboratory may propose for
consideration several alternative validated methods that meet the MQOs under the
performance-based approach (such as measurement of a decay progeny as an alternate
radionuclide). Chapter 7 provides guidance on the evaluation of a laboratory and analytical
methods.

The SOW should specify what the laboratory needs to provide in order to demonstrate its ability
to meet the technical specifications in the RFP. This should include documentation relative to
the method validation process to demonstrate compliance with the MQOs, information on previous
contracts for similar analytical work, and performance in performance evaluation (PE) programs
using the proposed method. Any specific requirements on sample delivery
(Section 5.3.7) should also be made clear to the laboratory. In addition, the requirements for
the laboratory's quality system should be discussed.

5.3.1  Analytes

Each APS should state the analyte of concern. The SOW should specify all analytes of concern
and, when possible, an analyte's expected chemical form and anticipated concentration range
(useful information for separating high-activity samples from low-activity samples) and
potential chemical or radiometric interferences (Chapter 3, Sections 3.3.1 and 3.3.2). In some
instances, because of process knowledge and information on the absence of equilibrium between
analytes and their parents and progeny, the SOW may require the direct measurement of an
analyte rather than allowing for the measurement of other radionuclides in the analyte's decay
chain. In these cases, the SOW should indicate the analyses to be performed. Examples of
analyses include gross alpha and beta, gamma spectrometry, and radionuclide/matrix-specific
combinations such as 3H in water and 238Pu in soil.

5.3.2  Matrix

Each APS should state the sample matrix to be analyzed. The sample matrix for each radionuclide
or analysis type (e.g., gamma-ray spectrometry) should be listed and described in detail where
necessary. The matrix categories may include surface soil, sub-surface soil, sediment, sludge,
concrete, surface water, ground water, salt water, aquatic and terrestrial biota, air, air
sample filters, building materials, etc. Additional information should be provided for certain
matrices (e.g., the chemical form of the matrix for solid matrices) in order for the laboratory
to select the appropriate sample preparation or dissolution method (Chapter 3, Section 3.3.3).

5.3.3  Measurement Quality Objectives

The APSs should provide the MQOs for each analyte-matrix combination. The MQOs can be viewed as
the analytical portion of the overall project data quality objectives (DQOs). An MQO is a
statement of a performance objective or requirement for a particular method performance
characteristic. Examples of method performance characteristics include the method's uncertainty
at some concentration, detection capability, quantification capability, specificity, analyte
concentration range, and ruggedness. An example MQO for the method uncertainty at some analyte
concentration, such as the action level, would be: "A method uncertainty of 0.5 Bq/g is
required at the action level of 5.0 Bq/g" (Chapters 1, 3, and 19). The MQOs are a key part of a
project's APSs. Chapter 3 provides guidance on developing MQOs for select method performance
characteristics.
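
The example MQO above can be captured in machine-checkable form. The sketch below is
illustrative only; the record layout is an assumption, not a MARLAP-defined structure:

    # An APS-style record for one analyte-matrix combination, mirroring the
    # worked example in the text: a required method uncertainty of 0.5 Bq/g
    # at an action level of 5.0 Bq/g.
    from dataclasses import dataclass

    @dataclass
    class AnalyticalProtocolSpec:
        analyte: str
        matrix: str
        action_level: float                 # Bq/g
        required_method_uncertainty: float  # at the action level, Bq/g

    def meets_mqo(spec, demonstrated_uncertainty):
        """True if the demonstrated method uncertainty at the action level
        is no larger than the MQO requires."""
        return demonstrated_uncertainty <= spec.required_method_uncertainty

    spec = AnalyticalProtocolSpec("Sr-90", "soil", 5.0, 0.5)
    print(meets_mqo(spec, demonstrated_uncertainty=0.35))   # True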

5.3.4  Unique Analytical Process Requirements

The APS should state any unique analytical processing requirement. The SOW should give any
matrix-specific details necessary for the laboratory to process the sample, such as the type of
soil, the type of debris to be removed, whether or not filtering a sample at the laboratory is
required, processing whole fish versus edible parts, drying of soils, information on any known
or suspected interferences, hazards associated with the sample, etc. (Chapter 3, Section 3.4).
In some cases, unique analytical process requirements or instructions should be specified that
further delineate actions to be taken in case problems occur during sample processing. For
example, the SOW may require that the laboratory reprocess another aliquant of the sample by a
more robust technique when a chemical yield drops below a stated value.

If necessary, special instructions should be provided as to how or when the analytical results
are to be corrected for radioactive decay or ingrowth. In some cases, the sample collection
date may not be the appropriate date to use in the decay or ingrowth equations.
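
A decay correction of this kind is ordinary first-order decay arithmetic. The following sketch
shows the calculation, with the nuclide, the times, and the reference-date choice all
illustrative assumptions:

    # Back-correct a measured activity concentration to a project-specified
    # reference date: A_ref = A_meas * exp(lambda * t), lambda = ln(2) / T_half.
    from math import exp, log

    def decay_corrected(a_measured, half_life_days, days_reference_to_analysis):
        """Activity at the reference date, given the later measured activity."""
        decay_constant = log(2) / half_life_days
        return a_measured * exp(decay_constant * days_reference_to_analysis)

    # I-131 (half-life ~8.02 d) analyzed 10 days after the chosen reference date:
    print(decay_corrected(a_measured=1.2, half_life_days=8.02,
                          days_reference_to_analysis=10.0))   # ~2.85, same units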

5.3.5  Quality Control Samples and Participation in External Performance Evaluation Programs

The SOW should state the type and frequency of internal QC samples needed, as well as whether
they are to be included on a batch or some other basis. The quality acceptance limits for all
types of QC samples should be stated (see Appendix E for guidance on developing acceptance
limits for QC samples based on the MQO for method uncertainty; a sketch follows this
paragraph). The SOW should also state when and how the project manager or the contracting
officer's representative (COR) should be notified about any nonconformity, and should spell out
the conditions under which the laboratory will have to re-analyze samples due to a
nonconformance.
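
As a sketch of the idea referenced above, assuming (for illustration only) the convention that
a spiked QC sample's acceptance limits are set a fixed multiple of the required method
uncertainty about the known value:

    # Acceptance limits for a spiked QC sample, assuming limits of +/- k times
    # the required method uncertainty around the known spike concentration.
    # The factor k = 3 is an assumption for illustration; the project's
    # Appendix E-style derivation governs in practice.

    def qc_acceptance_limits(known_value, required_method_uncertainty, k=3.0):
        """Return (lower, upper) acceptance limits for the QC result."""
        half_width = k * required_method_uncertainty
        return known_value - half_width, known_value + half_width

    low, high = qc_acceptance_limits(known_value=5.0, required_method_uncertainty=0.5)
    result = 6.7
    print("in control" if low <= result <= high else "nonconformance: notify the COR")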

The evaluation of the laboratory's ability to perform the required radiochemical analyses
should be based on the acceptability of the method validation documentation submitted by the
laboratory. The evaluation should also include the laboratory's performance in various external
PE programs administered by government agencies or commercial radioactive source suppliers that
are traceable to the National Institute of Standards and Technology (NIST; additional
information on evaluating a laboratory's performance is provided in Chapter 7). As such, the
RFP should request the laboratory's participation in a NIST-traceable PE program appropriate
for the analytes and matrices under consideration. In addition, the weighting factor
(Appendix E) given to scoring the laboratory's performance in such a program should be provided
to the laboratory. Examples of government programs include DOE's Quality Assessment Program
(QAP) and the Mixed Analyte Performance Evaluation Program (MAPEP), as well as Performance
Testing (PT) providers accredited under the NIST-administered National Voluntary Laboratory
Accreditation Program (NVLAP).

5.3.6  Laboratory Radiological Holding and Turnaround Times

The SOW should include specifications on the required laboratory radiological holding time
(i.e., the time between the date of sample collection and the date of analysis) and the sample
processing turnaround time (i.e., the time between the receipt of the sample at the laboratory
and the reporting of the analytical results). Such radiological holding and turnaround times,
which are usually determined by specific project requirements, are typically specified in terms
of calendar or working days. The SOW should state whether the laboratory may be requested to
handle expedited or rush samples. In some cases, time constraints become an important aspect of
sample processing (e.g., in the case of radionuclides that have short half-lives). Some
analyses call for specific steps that take a prescribed amount of time; requesting an
analytical protocol that requires several days to complete is obviously not compatible with a
24-hour turnaround time. This highlights the need for input from radioanalytical specialists
during the planning process.
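
The arithmetic behind that incompatibility is simple decay loss. For example (with an
illustrative nuclide and holding times):

    # Fraction of a short-lived analyte's activity remaining after a candidate
    # holding time, illustrating why holding and turnaround times must match
    # the analyte. Rn-222 (half-life ~3.82 d) is used as an example.
    from math import exp, log

    def fraction_remaining(half_life_days, holding_days):
        return exp(-log(2) * holding_days / half_life_days)

    for days in (1, 3, 7):
        print(days, "d:", round(100 * fraction_remaining(3.82, days)), "% remains")
    # 1 d: ~83%, 3 d: ~58%, 7 d: ~28% of the original Rn-222 activity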

In some cases, the required sample-processing turnaround times are categorized according to
generic headings such as routine, expedited or rush, and emergency sample processing. Under
these circumstances, the SOW should specify the appropriate category for the samples and
analyses.

5.3.7  Number of Samples and Schedule

Estimating the volume of work for a laboratory is commonly considered part of the planning
process that precedes the initiation of a project. Thus, the SOW should estimate the
anticipated amount of work and should spell out the conditions under which the laboratory will
have to reanalyze samples due to some nonconformance. Similarly, the estimate should allow the
laboratory to judge whether its facility has the capacity to compete for the work. The estimate
of the number of samples is a starting point, and some revision to the volume of work may
occur, unless the laboratory sets specific limits on the number of samples to be processed.

The SOW should indicate whether samples will be provided on a regular basis, seasonally, or on
some other known or unknown schedule. It should also specify whether some samples may be sent
by overnight carrier for immediate analysis. Holidays may be listed when samples will not be
sent to the laboratory. The SOW should state if Saturday deliveries may be required.
Furthermore, it should specify whether samples will be sent in batches or individually, and
from one location or different locations.

The carrier used to ship samples to the laboratory should be experienced in the delivery of
field samples, provide next-day and Saturday deliveries, have a package tracking system, and be
familiar with hazardous materials shipping regulations.

5.3.8  Quality System

The RFP should require that a copy of the laboratory's Quality System documentation (such as a
Quality Manual), related standard operating procedures (including appropriate methods), and
supporting documentation (such as a summary of the internal QC and external PE sample results)
be included with the proposal submittal, as necessary. Only those radioanalytical laboratories
that adhere to a well-defined quality system can ensure the appropriate quality of
scientifically valid and defensible data. The Quality System (NELAC, 2000; ANSI N42.23; ISO/IEC
17025) for a radioanalytical laboratory should address, at a minimum, the following items:

 • Organization and management;
 • Quality system establishment, audits, essential quality controls and evaluation, and data
   verification;
 • Personnel (qualifications and resumes);
 • Physical facilities—accommodations and environment;
 • Equipment and reference materials;
 • Measurement traceability and calibration;
 • Test methods and standard operating procedures (methods);
 • Sample handling, sample acceptance policy, and sample receipt;
 • Records;
 • Subcontracting analytical samples;
 • Outside support services and supplies; and
 • Complaints.

5.3.9  Laboratory's Proposed Methods

Under the performance-based approach to method selection, the laboratory will select and
identify the radioanalytical methods (Chapter 6) that will meet the MQOs and other performance
specifications of the SOW. MARLAP recommends that the laboratory submit the proposed methods
and the required method validation documentation with the formal response. The SOW should state
that the proposed methods and method validation documentation will be evaluated in accordance
with agency procedures by a Technical Evaluation Committee (TEC), based on experience,
expertise, and professional judgment. MARLAP uses the term TEC for the group that performs this
function. Agencies and other organizations may use various terms and procedures for this
process.

The TEC should provide its findings and recommendations to the organization's contracting
officer for further disposition. In some cases, the organization may inform a laboratory that
the proposed methods were deemed inadequate and, if appropriate, request that the laboratory
submit alternative methods with method validation documentation within a certain time period.

When the methods proposed by the laboratories have been deemed adequate to meet the technical
specifications of the SOW, the TEC may want to rank the proposed methods (and laboratories)
according to various factors (e.g., robustness, or performance in PE programs or on qualifying
samples) as part of the contract scoring process.

5.4   Request for Proposal—Generic Contractual Requirements

Not all quality and administrative aspects of a contract are specified in a SOW. Many quality
(e.g., the requirement for a quality system), administrative, legal, and regulatory items need
to be specified in an RFP and eventually in the contract. Although not all-inclusive, the items
or categories discussed in the following sections should be considered as part of the
contractual requirements and specifications of an RFP.

5.4.1  Sample Management

The RFP should require the laboratory to have an appropriate sample management program that
includes those administrative and quality assurance aspects covering sample receipt, control,
storage, and disposition. The RFP should require the laboratory to have adequate facilities,
procedures, and personnel in place for the following actions:

 • Receive, log in, and store samples in a proper fashion to prevent deterioration,
   cross-contamination, and analyte losses;

 • Verify the receipt of each sample shipment: compare shipping documentation with samples
   actually received; notify the point of contact or designee by telephone within a prescribed
   number of business days, and subsequently provide details in all case narratives of any
   discrepancies in the documentation;

 • Sign, upon receipt of the samples, the sample receipt form or, if required, chain-of-custody
   (COC) form(s) submitted with each sample release. Only authorized laboratory personnel
   should sign the forms. The signature date on the COC form, if required, is normally the
   official sample receipt date. All sample containers should be sealed prior to their removal
   from the site; and

 • Store unused portions of samples in such a manner that the analyses could be repeated or new
   analyses requested, if required, for a certain specified time period following the
   submission of an acceptable data package. Unused sample portions should be stored with the
   same sample handling requirements that apply to samples awaiting analysis. Documentation
   should be maintained pertaining to storage conditions and sample archival or disposal.

5.4.2  Licenses, Permits and Environmental Regulations

Various Federal, State, and local permits, licenses, and certificates (accreditation) may be
necessary for the operation of a radioanalytical laboratory. The RFP should require the
laboratory to have the necessary government permits, licenses, and certificates in place before
the commencement of any laboratory work for an awarded contract. The following sections provide
a partial list of those provisions that may be necessary. Some projects may require special
government permits in order to conduct the work and to transport and analyze related samples.
For these cases, the necessary regulations or permits should be cited in the RFP.

5.4.2.1 Licenses

When required, the laboratory will be responsible for maintaining a relevant Nuclear Regulatory
Commission (NRC) or Agreement State license to accept low-level radioactive samples for
analysis. In certain circumstances, the laboratory may have to meet host nation requirements if
operating outside the United States (e.g., military fixed or deployed laboratories located
overseas).

When necessary, the laboratory should submit a current copy of its radioactive materials
license with the proposal. Some circumstances may require a copy of the original radioactive
materials license. For more complete information on license requirements, refer to 10 CFR 30 or
to the NRC or government offices of the State in which the laboratory resides.

5.4.2.2 Environmental and Transportation Regulations

Performance under a contract or subcontract must be in compliance with all applicable local,
State, Federal, and international laws and regulations. Such consideration must include not
only the relevant laws and regulations currently in effect, but also revisions, or publicly
noticed changes, that may reasonably be anticipated to become effective during the term of the
contract.

The laboratory may be required to receive (and in some cases ship) samples according to
international, Federal, State, and local regulations. In particular, the laboratory should be
aware of the U.S. Postal Service and Department of Transportation (DOT) hazardous materials
regulations applicable to the requirements specified in the SOW, and appropriate personnel
should be trained in these regulations.

5.4.3  Data Reporting and Communications

The type of information, the schedules, and the data reports required to be delivered by the
laboratory, as well as the expected communications between the appropriate staff or
organizations, should be delineated in the RFP. The required schedule and content of the
various reports, including sample receipt acknowledgment, chain of custody, final data results,
data packages, QA/QC project summaries, status reports, sample disposition, and invoices,
should be provided in the RFP. In addition, the expected frequency and lines of communication
should be specified.

In some cases, the RFP may request point-of-contact information for certain key laboratory
positions, such as the Laboratory Director, Project Manager, QA Officer, Sample Manager, Record
Keeping Supervisor, Radiation Safety or Safety Officer, and Contracting Officer. Contact
persons should be identified along with appropriate telephone numbers (office, fax, pager),
e-mail addresses, and postal and courier addresses.

5.4.3.1 Data Deliverables

The SOW should specify what data are required for data verification, validation, and quality
assessment. A data package, the pages of which should be sequentially numbered, may include a
project narrative, the results in a specified format including units, a data review checklist,
any non-conformance memos resulting from the work, the sample receipt acknowledgment or
chain-of-custody form (if required), sample and quality control sample data, calibration
verification data, and standard and tracer information. In addition, the date and time of
analysis, the instrument identification, and the analyst performing the analysis should be
included on the appropriate paperwork. At the inception of the project, initial calibration
data may be required for the detectors used for the work. When a detector is recalibrated, or a
new detector is placed in service, updated calibration data should be required whenever those
changes could affect the analyses in question. In some cases, only the summary or final data
report may be requested. In these cases, the name of the data reviewer, the sample
identification information, the reference and analysis dates, and the analytical results, along
with the reported measurement uncertainties, should be reported.

The laboratory should be informed of the acceptable formats for electronic and hard copy
records. The SOW should state at what intervals the data will be delivered (batch, monthly,
etc.).
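
The deliverable items listed above could also be captured in a structured record for automated
completeness checks. This is a sketch with invented field names, not a prescribed format:

    # Sketch of a machine-readable data package record mirroring the items
    # listed in the text; every field name here is an illustrative assumption.
    from dataclasses import dataclass, field, fields

    @dataclass
    class DataPackage:
        project_narrative: str = ""
        results_with_units: list = field(default_factory=list)
        data_review_checklist: str = ""
        nonconformance_memos: list = field(default_factory=list)
        receipt_or_coc_form: str = ""      # if custody is required
        sample_and_qc_data: list = field(default_factory=list)
        calibration_verification: list = field(default_factory=list)
        standard_and_tracer_info: str = ""
        analysis_date_time: str = ""
        instrument_id: str = ""
        analyst: str = ""

    def empty_fields(pkg):
        """Names of deliverable fields that are missing or empty."""
        return [f.name for f in fields(pkg) if not getattr(pkg, f.name)]

    pkg = DataPackage(project_narrative="Example narrative", analyst="J. Doe")
    print(empty_fields(pkg))   # lists every deliverable still outstanding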

5.4.3.2 Software Verification and Control

The policy for computer software verification, validation, and documentation typically is
included in the laboratory's Quality Manual. If there are specific software verification and
validation requirements germane to the project, the RFP should specify them. ASTM E919,
"Standard Specification for Software Documentation for a Computerized System," describes the
computer program documentation that should be provided by a software supplier. Other sources
for software QC are ANSI ANS 10.3, "Documentation of Computer Software," and IEEE Standard
1063, "IEEE Standard for Software User Documentation."

5.4.3.3 Problem Notification and Communication

Communication is key to the successful management and execution of the contract. Problems,
schedule delays, potential overruns, etc., can be resolved quickly only if communication
between the laboratory and the organization's representative is conducted promptly. The RFP
should state explicitly when, how, and in what time frame communication or notification is
required of the laboratory for special technical events, such as the inability to meet MQO
specifications for a sample or analyte, a QC sample result outside of an acceptance limit or
some other nonconformance, and—if required by the project manager—failure of the laboratory to
meet its internal QC specifications.

The laboratory should document and report all deviations from the method and any unexpected
observations that may be of significance to the data reviewer or user. Such deviations should
be documented in the narrative section of the data package produced by the contract laboratory.
Each narrative should be monitored closely to ensure that the laboratory is documenting
departures from contract requirements or acceptable practice.


Communication from the organization's representative to the laboratory is also important. A key
element in managing a contract is the timely review of the data packages provided by the
laboratory. Early identification of problems allows for corrective actions to improve
laboratory performance and, if necessary, for the cessation of laboratory analyses until
solutions can be instituted, preventing the production of large amounts of unusable data. Note
that some sample matrices and processing methods can be problematic for even the best
laboratories. Thus, the organization's technical representative must be able to discern between
failures due to legitimate reasons and poor laboratory performance.

5.4.3.4 Status Reports

The SOW may require the laboratory to submit, at a specified frequency, sample processing
status reports that include such information as the sample identification number, receipt date,
analyses required, expected analytical completion date, and report date. Depending on the
project's needs, a status report may also include the disposition of remaining portions of
samples following sample processing, or of sample processing wastes.

5.4.4   Sample Re-Analysis Requirements

There may be circumstances when samples should be re-analyzed due to questionable analytical
results or suspected poor quality, as reflected by the laboratory's batch QC or external PT
samples. Specific instructions and contractual language should be included in the RFP that
address such circumstances and the resultant fiscal responsibilities (Appendix E).

5.4.5   Subcontracted Analyses

MARLAP recommends that the RFP state that subcontracting will be permitted only with the
contracting organization's approval. In addition, contract language should be included giving
the contracting organization the authority to approve proposed subcontracting laboratories. For
continuity or for quality assurance, the contract may require one laboratory to handle the
entire analytical workload. However, the need may arise to subcontract work to another
laboratory facility if the project calls for a large number of samples requiring quick
turnaround times or for specific methodologies that are not part of the primary laboratory's
support services. The use of multiple service providers adds complexity to the organization's
tasks of auditing, evaluating, and tracking services.
412     Any intent to use a subcontracted laboratory should be specified in the response to the RFP or
413     specific task orders. The primary laboratory should specify which laboratory(ies) are to be used,


414      should require that these laboratories comply with all contract or task order requirements, and
415      verify that their operations can and will provide data quality meeting or exceeding the SOW
416      requirements. Subcontract laboratories should be required to allow the contracting organization
417      full access to inspect their operations, although it should be understood that the primary
418      laboratory should maintain full responsibility for the performance of subcontract laboratories.
419
420      5.5   Laboratory Selection and Qualification Criteria
421
422      A description of the laboratory qualification and selection process should be stated in the RFP.
423      The initial stages of the evaluation process focus on the technical considerations only. Cost will
424      enter the selection process later. The organization's TEC will consider all proposals and then will
425      make an initial selection (see Figures E.6a and E.6b in Appendix E), whereby some laboratories
426      are eliminated based on the screening process. The laboratory selection process is based on
427      predetermined criteria that are related to the RFP and how a laboratory is technically able to
428      support the contract. A laboratory that is obviously not equipped to perform work according to
429      the RFP is certain to be dropped early in the selection process. In some cases, the stated ability to
430      meet the analysis request may be verified by the organization, through pre-award audits and
431      proficiency testing as described below. Letters notifying unsuccessful bidders may be sent at this
432      time.
433
434      5.5.1  Technical Proposal Evaluation
435
436      The RFP requires each bidding contractor laboratory to submit a technical proposal and a copy of
437      its Quality Manual. This Quality Manual is intended to address all of the technical and general
438      laboratory requirements. As noted previously, the proposal and Quality Manual are reviewed by
439      members of the TEC who are familiar with the proposed project and knowledgeable in the
440      fields of radiochemistry and laboratory management.
441
442      5.5.1.1 Scoring and Evaluation Scheme
443
444      The RFP should include information concerning scoring of proposals or weighting factors for
445      areas of evaluation. This helps a laboratory to understand the relative importance of specific
446      sections in a proposal and how a proposal will be evaluated or scored. This allows the laboratory
447      to focus on those areas of greater importance. If the laboratory submits a proposal that lacks
448      sufficient information to demonstrate support in a specific area, the organization can then
449      indicate how the proposal does not fulfill the need as stated in the request. Because evaluation
450      formats differ from organization to organization, laboratories  may wish to contact the
451      organization for additional organization-specific details concerning this process. A technical

452      evaluation sheet (TES) may be used in conjunction with the Proposal Evaluation Plan as outlined
453      in the next section (see Figures E.6a and E.6b in Appendix E) to list the total weight for each
454      factor and to provide a space for the evaluator's assigned rating. In the event of a protest, the TES
455      can be used to substantiate the selection process. The TES also provides areas to record the RFP
456      number, identity of the proposer, and spaces for total score, remarks, and evaluator's signature.
457      The scoring and evaluation scheme is based on additional, more detailed considerations, which
458      are discussed briefly in Sections E.4 and E.5 of Appendix E.
459
460      Once all proposals are accepted by the organization, the TEC scores the technical portion of the
461      proposal. MARLAP recommends that all members of the TEC have a complete technical
462      understanding of the subject matter related to the proposed work. These individuals are also
463      responsible for responding to any challenge to the organization's selection for the award of the
464      contract. Their answers to such challenges are based on technical merit in relation to the
465      proposed work.
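
         As a simple numerical illustration of such a scheme (the element names, weights, and
         0-to-10 rating scale below are hypothetical examples, not values prescribed by MARLAP
         or any RFP), a weighted technical score might be computed as follows:

             # Hypothetical weighting factors for the evaluation areas (sum to 1.0).
             weights = {
                 "technical merit": 0.30,
                 "resources and equipment": 0.20,
                 "staff qualifications": 0.15,
                 "past performance": 0.25,
                 "other RFP requirements": 0.10,
             }

             # One evaluator's ratings for a single proposal, on a 0-10 scale.
             ratings = {
                 "technical merit": 8,
                 "resources and equipment": 7,
                 "staff qualifications": 9,
                 "past performance": 6,
                 "other RFP requirements": 8,
             }

             # Weighted score as would be recorded on a technical evaluation sheet.
             total = sum(weights[k] * ratings[k] for k in weights)
             print(f"Weighted technical score: {total:.2f} of 10")  # 7.45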
466
467      5.5.1.2 Scoring Elements
468
469      Although each organization may have a different scoring process to evaluate a laboratory's
470      response to a RFP, there are various broad categories or common elements that are typically
471      evaluated. For example, these may include the following:
472
473       •  Technical merit;
474       •  Adequacy and suitability of laboratory resources and equipment;
475       •  Staff qualifications;
476       •  Related experience and record of past performance; and
477       •  Other RFP requirements.
478
479      Although each organization may score or weight these items differently, performance-based
480      contracting requires the weighting of past performance of the contractor as a significant technical
481      element. Each of these elements is considered in the following paragraphs. Outlined below are
482      the key elements that are discussed in more detail in Appendix E.
483
484      TECHNICAL MERIT
485
486      The response to the RFP should include details of the laboratory's Quality System and all the
487      analytical methods to be employed by the laboratory as well as the method validation
488      documentation (Section 6.6). The information provided should outline or demonstrate that the
489      methods proposed are likely to be suitable and meet the APSs. The methods should be evaluated


490      against the APSs and MQOs provided in the SOW. Chapter 7 provides guidance on the
491      evaluation of methods and laboratories. The laboratory's Quality Manual should be reviewed for
492      adequacy and completeness to ensure the required data quality.
493
494      ADEQUACY AND SUITABILITY OF LABORATORY RESOURCES AND EQUIPMENT
495
496      When requested, the laboratory will provide a listing of the available instrumentation or
497      equipment by analytical method category. In addition, the RFP may request information on the
498      available sample processing capacity and the workload for other clients during the proposed
499      contract period. The information provided should be evaluated by the TEC to determine if the
500      laboratory has the sample processing capacity to perform the work. The instrumentation and
501      equipment must be purchased, set up, calibrated, and on-line before award of the contract. In
502      addition, the laboratory should provide information relative to the adequacy and suitability of the
503      laboratory space available for the analysis of samples.
504
505      STAFF QUALIFICATIONS
506
507      The RFP should require the identification of the technical staff and their duties, along with their
508      educational background and experience in radiochemistry, radiometrology or laboratory
509      operations. The laboratory staff that will perform the radiochemical analyses should be employed
510      and trained prior to the award of the contract. Appendix E provides guidance on staff
511      qualifications.
512
513      RELATED EXPERIENCE AND RECORD OF PAST PERFORMANCE
514
515      The RFP should require the laboratory to furnish references in relation to its past or present work.
516      To the extent possible, this should be done with regard to contracts or projects similar in
517      composition, duration and number of samples to the proposed project. In some cases, the
518      laboratory's previous performance for the same Agency may be given special consideration.
519
520      OTHER RFP REQUIREMENTS
521
522      Within the response to the RFP, the laboratory should outline the various programs and
523      commitments (QA, safety, waste management, etc.) as well as submit various certifications,
524      licenses and permits to ensure the requirements of the RFP will be met. The reasonableness of
525      the proposed work schedule, program and commitments should be evaluated by the TEC. In
526      addition, if accreditation is required in the RFP, the TEC should confirm the laboratory's
527      accreditation for radioanalytical services by contacting the organization that provided the


528     certification. If State accredited, a laboratory is typically accredited by the State in which it
529     resides. If the organization expects a laboratory to process samples from numerous States across
530     the United States, then additional accreditations for other States may be required. The TEC
531     should review and confirm the applicability and status of the licenses and permits with respect to
532     the technical scope and duration of the project.
533
534     5.5.2   Pre-Award Proficiency Evaluation
535
536     Some organizations may elect to send proficiency or PT samples (also referred to as "perfor-
537     mance evaluation" samples) to the laboratories that meet certain scoring criteria in order to
538     demonstrate the laboratory's analytical capability. The composition and number of samples
539     should be determined by the nature of the proposed project. The PT sample matrix should be
540     composed of well-characterized materials. It is recommended that site-specific PT matrix
541      samples or method validation reference material (MVRM, see Chapter 6) be used when
542     available.
543
544     Each competing laboratory should receive an identical set of PT samples. The RFP should specify who
545      will bear the cost of analyzing these samples as well as the scoring scheme, e.g., pass/fail or a
546     sliding scale. Any laboratory failing to submit results should be disqualified. The results should
547      be evaluated and each laboratory given a score. This allows the organization to make a second
548      cut—after which only two or three candidate laboratories are considered.
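
        For instance, a pass/fail scheme might score each result by its relative deviation from the
        known value. In this minimal sketch the analytes, values, and the 20 percent acceptance
        limit are all hypothetical:

            # Hypothetical PT results: (analyte, known value, reported value), in Bq/L.
            pt_results = [
                ("3H", 50.0, 46.0),
                ("90Sr", 10.0, 13.5),
                ("137Cs", 25.0, 24.1),
            ]

            ACCEPTANCE_LIMIT = 0.20  # hypothetical: |relative bias| must not exceed 20 %

            for analyte, known, reported in pt_results:
                bias = (reported - known) / known
                verdict = "pass" if abs(bias) <= ACCEPTANCE_LIMIT else "fail"
                print(f"{analyte}: relative bias {bias:+.1%} -> {verdict}")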
549
550      5.5.3  Pre-Award Assessments and Audits
551
552     The RFP should indicate that the laboratories with the highest combined scores for technical
553      proposals and proficiency samples may be given an on-site audit. A pre-award assessment or
554     audit may be performed to provide assurance that a selected laboratory is capable of fulfilling the
555      contract in accordance with the RFP (Appendix E). In other words, is the laboratory's represen-
556     tation on paper (i.e., proposal) realistic when compared to the actual facilities? To answer this
557     question,  auditors should be looking to see that a candidate laboratory appears to have all the
558     required elements to meet the proposed contract's needs. Refer to Appendix E for details on the
559     pre-award assessments and audits.
560
562
563
564
565
566
                        Summary of Recommendations

 •  MARLAP recommends that technical specifications be prepared in writing in a single
    document designated as a SOW for all radioanalytical laboratory services, regardless of
    whether the services are to be contracted out or performed by an Agency's laboratory.

 •  MARLAP recommends that the laboratory submit the proposed methods and required
    method validation documentation with the formal response.

 •  MARLAP recommends that the RFP state that subcontracting will be permitted only with
    the contracting organization's approval.

 •  MARLAP recommends that all members of the TEC have a complete technical
    understanding of the subject matter related to the proposed work.
567
568
569
570
571
572
573
574
575
576
577
578     5.6    References
579
580     5.6.1   Cited References
581
582     American National Standards Institute (ANSI) N42.23. Measurement and Associated
583        Instrumentation Quality Assurance for Radioassay Laboratories. 1996.
584
585     American National Standards Institute (ANSI) ANS 10.3. Documentation of Computer Software.
586
587     American Society for Testing and Materials (ASTM) E919. Standard Test Methods for Software
588        Documentation for a Computerized System.
589
590     U.S. Environmental Protection Agency (EPA). 1998. Guidance on Quality Assurance Project
591        Plans (EPA QA/G-5). EPA/600/R-98/018, Washington, DC. Available from www.epa.gov/
592        quality1/qs-docs/g5-final.pdf.
593
594     Institute of Electrical and Electronics Engineers (IEEE). Standard 1063. Software User
595        Documentation.
596
597     International Organization for Standardization/International Electrotechnical Commission (ISO/IEC)
598         17025. General Requirements for the Competence of Testing and Calibration Laboratories.
599        December 1999, 26 pp.
600
601     National Environmental Laboratory Accreditation Conference (NELAC). 2000. Quality Systems.
602        July. Available from: http://www.epa.gov/ttn/nelac/.
603
604     5.6.2  Other Sources
605
606     U.S. Department of Energy (DOE). Order 414.1-1: Implementation Guide for Use with Indepen-
607        dent and Management Assessment Requirements of 10 CFR Part 830.120 and DOE 5700.6c
608        Quality Assurance. August. Available from www.directives.doe.gov/pdfs/doe/doetext/
609        neword/414/g4141-1.
610
611     U.S. Department of Energy (DOE). 1997. Model Statement of Work for Analytical Laboratories.
612        Albuquerque Operations Office, Prepared by AGRA Earth and Environmental, Inc.,
613        Albuquerque, NM. March.
614
615     U.S. Nuclear Regulatory Commission (NRC). 1996. NRC Acquisition of Supplies and Services.
616        Directive 11.1, July 23.
617
618     U.S. Nuclear Regulatory Commission (NRC). 1994. NRC Procedures for Placement and
619        Monitoring of Work With the U.S. Department of Energy (DOE). Directive 11.7, May 3.
620
621     Office of Federal Procurement Policy (OFPP). 1997. Performance-Based Service Contracting
622        (PBSC) Solicitation/Contract/Task Order Review Checklist. August 8. Available from: http://
623        www.arnet.gov/Library/OFPP/PolicyDocs/pbscckls.html.
624
625
 1                6  SELECTION AND APPLICATION OF AN

 2                             ANALYTICAL METHOD

 3      6.1   Introduction

 4      This chapter provides guidance to both the project manager and the laboratory on the selection
 5      and application of an analytical method. It offers guidance to the project manager on the develop-
 6      ment of the Analytical Protocol Specifications (APSs) from the laboratory's perspective on
 7      method appropriateness and availability. It offers guidance to the laboratory on the key elements
 8      to consider when selecting an analytical method (Chapter 1, Section 1.4.5) to meet the objectives
 9      of the APSs contained in the Statement of Work (SOW). Assuming that the laboratory has
10      received a SOW, certain subsections of Section 6.5 provide guidance on how to review and
11      properly evaluate the APSs therein. However, Section 6.5 also provides guidance for the project
12      planning team on the important laboratory considerations needed to develop the Measurement
13      Quality Objectives (MQOs). Section 6.6 deals with method validation requirements and has been
14      written for both the project planners and the laboratory.

15      Because the method constitutes the major part of the analytical protocol (Chapter 1), this chapter
16      focuses on the selection of a method. However, other parts of the protocol should be evaluated
17      for consistency with the method (Figure 6.1). MARLAP recommends the performance-based
18      approach for method selection. Thus, the laboratory should be able to propose whichever method
19      meets the project's analytical data requirements (MQOs), within constraints of other factors such
20      as regulatory requirements, cost, and project deadlines. The selection of a method by the
21      laboratory is in response to the APSs (Chapter 3) that were formulated during the directed
22      planning process (Chapter 2) and documented in the SOW (Chapter 5). In most project plan
23      documents, the project manager or the project planning team has the authority and responsibility
24      for approving the methods proposed by the laboratory. The APSs will, at a minimum, document
25      the analytes, sample matrices, and the MQOs. An MQO is a statement of a performance objective
26      or requirement for a particular method performance characteristic. The MQOs can be viewed as
27      the analytical portion of the DQOs (Chapter 3).

28      Background material in Section 6.2.1 provides the reader with the subtleties of the performance-
29      based approach to method selection, contrasted with the use of prescribed methods and the
30      importance of the directed planning process and MQOs in the selection of the method. This
31      chapter does not provide a listing of existing methods with various attributes indexed to certain
32      applications. Analytical methods may be obtained from national standards bodies, government
33      laboratories and publications, and  the open literature.
34     In this chapter, method validation is defined as the demonstrated method applicability for a
35     particular project. MARLAP recommends that only methods validated for a project's application
36     be used. This recommendation should not be confused with the generic method validation that all
37     methods should undergo during method development. The laboratory should validate the method
38     to the APS requirements of a SOW for the analyte/matrix combination and provide the method
39     validation documentation to the project manager prior to the implementation of routine sample
40     processing (Section 6.6). If applicable, consideration should be given to the uncertainty of the
41     laboratory's protocol for subsampling (heterogeneity) of the received field sample when selecting
42     a method. Appendix F provides guidance on the minimization of subsampling uncertainty.

43     Section 6.3 provides an overview of the generic application of a method for a project and how a
44     laboratory meets the recommendations of the guidance provided in this and other chapters.
45     Generic considerations for the method selection process that a laboratory should evaluate are
46     provided in Section 6.4. Project-specific considerations for method selection relevant to APSs are
47     discussed in Section 6.5. Recommendations on the degree of method validation specified by the
48      project planning team are outlined in Section 6.6. Sections 6.7, 6.8, and 6.9 provide guidance on
49     analyst qualifications, method control, and continued laboratory performance assessment,
50     respectively. Section 6.10 outlines recommendations for the method proposal and validation
51     documentation that a laboratory should send to the project manager.

52     6.2   Method Definition

53     For this chapter, a laboratory "method" includes all physical, chemical, and radiometric processes
54     conducted at a laboratory in order to provide an analytical result. These processes, depicted in
55     Figure 6.1, may include sample preparation, dissolution, chemical separation, mounting for
56     counting, nuclear instrumentation counting, and analytical calculations. This chapter will
57     emphasize the laboratory's selection of the radioanalytical method that will be proposed in
58      response to a SOW. Each method is assumed to address a particular analyte in a specified
59     matrix or, in some cases, a group of analytes having the same decay emission category that can
60     be identified through spectrometric means (e.g., gamma-ray spectrometry). However, it should be
61     emphasized that the project planning team should have evaluated every component of the APSs
62     for compatibility with respect to all analytes in a sample and the foreseen use of multiple
63     analytical methods by the laboratory. For example, samples containing multiple analytes must be
64     of sufficient size (volume or mass) to ensure proper analysis and to meet detection and quantifi-
65     cation requirements. Multiple analytes in a sample will require multiple analyses for which a
66     laboratory may use a sequential method that addresses multiple analytes or stand-alone individual
67     methods for each analyte. The analytical  protocol must ensure that the samples are properly
                         FIGURE 6.1 — Analytical process. (Flow diagram: Field Sample Preparation
                         and Preservation; Sample Receipt and Tracking; Laboratory Sample
                         Preparation; Sample Dissolution; Chemical Separation; Sample Preparation
                         for Instrument Measurement; Instrument Measurement of Radionuclides;
                         Analytical Calculations and Data Reduction; Data Verification, Validation
                         and Reporting. The steps from Laboratory Sample Preparation through
                         Analytical Calculations and Data Reduction are typically considered to be
                         "the method"; the preceding steps may be included.)
 68      preserved for each analyte and sufficient sample is collected in the field to accommodate the
 69      analytical requirements.

 70      Certain aspects of a method are defined in this chapter in order to facilitate the method selection
 71      process. The following subsections describe the underlying basis of a performance-based
 72      approach to method selection and provide a functional definition related to MARLAP.

 73      Performance-Based Approach and Prescriptive Method Application

 74      MARLAP uses a performance-based approach to select a method, which is based on a
 75      demonstrated capability to meet defined project performance criteria (e.g., MQOs). With a
 76      properly implemented quality system, a validated method should produce appropriate and
 77      technically defensible results under the applicable conditions. The selection of any new method
 78      usually requires additional planning and, in some cases, may result in additional method
 79      development or validation. The selection of a method under the performance-based approach
 80      involves numerous technical,  operational,  quality, and economic considerations. However, the
 81      most important consideration in the selection of a method under the performance-based approach
 82      is compliance with the required MQOs for the analytical data. These requirements should be
 83      defined in the SOW or appropriate project plan document.

 84      When developing the MQOs, the project planning team should have evaluated all processes that
 85      have a potential to affect the analytical data. Those involved in the directed planning process
 86      should understand and communicate the needs of the project. They should also understand how
 87      the sampling (field, process, system, etc.) and analytical activities will interact and the ramifica-
 88      tions that the data may have on the decisionmaking process. These interactive analysis and
 89      communication techniques should be applied in all areas where analytical data are produced. As
 90      new projects are implemented, it should not be assumed that the current methods are necessarily
 91      the most appropriate and accurate; they should be reevaluated based on project objectives. The
 92      application of a performance-based approach to method selection requires the quantitative
 93      evaluation of all aspects of the analytical process.  Once the MQOs for a project have been
 94      determined and incorporated into the APSs, under the performance-based approach, the
 95      laboratory will evaluate its existing methods and propose one or more methods that meet each
 96      APS. This chapter contains guidance on how to use the APSs in the laboratory's method
 97      evaluation process.

 98      The objective of a performance-based approach to method selection is to facilitate the selection,
 99      modification, or development of a method that will reliably produce quality analytical data as
100      defined by the MQOs. Under the performance-based approach, a laboratory, responding to a


101      SOW, will propose a method that best satisfies the requirements of the MQOs and the laboratory's
102      operations.

103      In certain instances, the requirement to use prescribed methods may be included in the SOW. The
104      term "prescribed methods" has been associated with those methods that have been selected by
105      industry for internal use or selected by a regulatory agency, such as the U.S. Environmental
106      Protection Agency (EPA), for specific programs. The methods for analyzing radionuclides in
107      drinking water prescribed by EPA (1980) provides an example of applying a limited number of
108      methods to a well-defined matrix. In many companies or organizations, prescribed methods are
109      widely used. Methods that have been validated for a specific application by national standard
110      setting organizations such as the American Society for Testing and Materials (ASTM), American
111      National Standards Institute  (ANSI), American Public Health Association (APHA), etc., may
112      also be used as prescribed methods by industry and government agencies.

113      Typically, the prescribed methods were selected by an organization to meet specific objectives
114      for a regulation under consideration or for a program need. In most cases, the prescribed methods
115      had undergone some degree  of method validation, and the responsible organization had required
116      a quality system to demonstrate continued applicability and quality, as well as laboratory
117      proficiency. The use of any analytical method, whether prescribed or from the performance-based
118      approach, has a life cycle that can be organized into the major categories of selection, validation,
119      and continued demonstrated capability and applicability. This chapter will cover in detail only
120      the first two of these categories. A discussion on ongoing laboratory evaluations is presented in
121      Chapter 7 and Appendix C.

122      A final note should be made relative to prescribed methods and the performance-based approach
123      to method selection. The performance-based approach for method selection allows more latitude
124      in dealing with the potential  diversity of matrices (such as waste-, sea-, ground- or surface water;
125      biota; air filters; waste streams;  swipes; soil; sediment; or sludge) from a variety of projects, or in
126      dealing with different levels  of data quality requirements or a laboratory's analytical proficiency.
127      Even though the prescribed method approach may initially appear suitable and cost effective, it
128      does not allow a laboratory to select a method from the many possible methods that will meet the
129      MQOs.

130      Many individuals have the wrong impression that prescribed methods do not need to be validated
131      by a laboratory. However, as discussed in this chapter, all methods should be validated to some
132      level of performance for a particular project by the laboratory prior to their use. In addition, the
133      laboratory should demonstrate continued proficiency in using the method through internal QC
134      and external performance evaluation (PE) programs (Chapter 18).

135     6.3   Life Cycle of Method Application

136     In responding to a SOW for a given analyte/matrix combination, a laboratory may have one or
137     more methods that may be appropriate for meeting the MQOs. The final method selected from a
138     set of methods may be influenced by many other technical, operational, or quality considerations.
139     Figure 6.2 provides an overview of the life cycle of the method application. Figure 6.3 expands
140     the life cycle into a series of flow diagrams.
                         FIGURE 6.2 — Method application life cycle. (Flow diagram: analyte/matrix
                         process knowledge and project management feed the Analytical Protocol
                         Specifications; the laboratory selects a method from the available methods
                         (existing, newly developed, or modified), validates it (demonstrated
                         capability), and obtains method approval; analyst selection/qualification,
                         method control (quality system), and continued performance assessments
                         (drawing on project samples, external QC samples, and external PE
                         programs) follow, together with documentation of method validation and
                         performance during the project.)
         FIGURE 6.3 — Expanded Figure 6.2 addressing the laboratory's method evaluation process.
         (First panel: analyte/matrix process knowledge and the Analytical Protocol Specifications,
         covering analyte considerations (health significance, scaling factors for related nuclides,
         chemical species, stability and preservation requirements); matrix considerations (description
         from process knowledge or field collection reports, interferences, analyte distribution, sample
         preparation); data use (final form for analysis); MQOs (action level for each analyte/matrix,
         MDC or MQC, required method uncertainty at the action level, and other method performance
         characteristics such as specificity, ruggedness, and analyte concentration range); and the
         method validation testing protocol (validation level, analyte concentration levels including
         blanks, known interferences, project-specific or surrogate matrix PT samples, acceptable
         chemical/radiotracer yields, and data testing criteria); leading to Method Development/
         Selection, Section 6.5.)
            FIGURE 6.3 (continued) — Expanded Figure 6.2 addressing the laboratory's method
            evaluation process. (Panels: Method Development/Selection (Section 6.5), listing
            considerations such as the MQOs, the analyte and its chemical/physical species,
            sample volume, field and laboratory preservation, interferences, matrix and
            subsampling considerations, method of analyte detection, method complexity,
            required turnaround and radiological holding times, validation status of possible
            methods, availability of qualified staff, hazardous waste production, facility and
            equipment availability, and associated costs; Method Validation (Section 6.6), in
            which the laboratory reviews the specified validation requirements and decides
            whether to use an existing validated method, modify and validate an existing
            method, or develop and validate a new method, then prepares the method
            validation documentation; and Method Approval, in which the laboratory submits
            the SOP, the method validation documentation, and the previous history of method
            use, and the project manager performs a technical review and approval signoff;
            leading to Analyst Selection/Qualifications, Section 6.7.)
      FIGURE 6.3 (continued) — Expanded Figure 6.2 addressing the laboratory's method
      evaluation process. (Panels: Analyst Selection/Qualifications (Section 6.7), covering analyst
      selection consistent with the level of method difficulty (education, experience, and familiarity
      with method concepts), documented training in laboratory safety, radiation safety, chemical
      hygiene, and waste management, documented training on the selected method, and analyst
      proficiency tests whose results meet the quality performance requirements for the MQOs;
      and Method Control (Section 6.8), covering a controlled method manual (latest revision
      applied, signature signoff), instrument calibration and radiotracers using NIST-traceable
      standards, instrumentation quality control (balances, pipettes, volumetric glassware, and
      daily or prior-to-use nuclear and chemistry instrumentation QC checks), radiotracer/
      gravimetric yields within specified ranges, internal batch QC samples, and SOPs for
      troubleshooting "out of control" situations; leading to Continued Performance Assessment,
      Section 6.9.)
      FIGURE 6.3 (continued) — Expanded Figure 6.2 addressing the laboratory's method
      evaluation process. (Panels: Continued Performance Assessment (Section 6.9), covering
      internal batch QC samples meeting quality performance criteria, external double- and
      single-blind QC/PT samples from the contracting organization or an agency monitoring
      laboratory, external single-blind PT samples from a national PE program traceable to NIST,
      data verification and validation, and internal and external assessments, audits, and
      surveillances; and Documentation (Section 6.10), covering method validation records, the
      analyst training program, method manual control and archiving, software verification and
      validation records, instrument calibration and QC records, internal method batch QC sample
      results, internal and external assessments, external double/single blind QC sample results,
      corrective action reports, and analytical results in hard and electronic copy.)

141      6.4   Generic Considerations for Method Development and Selection

142      This section provides guidance on the technical, quality, and operational considerations for the
143      development of a new method or the selection of an existing radioanalytical method. Unless
144      required by a regulatory or internal policy, rarely should a method be specified in an APS or a
145      SOW. MARLAP recommends that a SOW containing the MQOs and analytical process
146      requirements be provided to the laboratory.

147      If the nature of the samples and analytes is known in advance, and variations in a sample matrix
148      and analyte concentration are within a relatively small range, the development or selection of
149      analytical methods is easier. In most situations, however, the number of samples, sample
150      matrices, analyte interferences, chemical form of analytes, and variations among and within
151      samples may influence the selection of a method for a given analyte. A number of radioanalytical
152      methods are available, but no single method provides a general solution (all have advantages and
 153      disadvantages). The method selection process should consider not only the classical
 154      radiochemical methods involving decay emission detection (alpha, beta or gamma) but also non-
 155      nuclear methods, such as mass spectrometric and kinetic phosphorescence analysis.

 156      In the performance-based approach to method selection, the laboratory may select and propose a
 157      gross measurement (alpha, beta, or gamma) method that can be applied to analyte concentrations
 158      well below the action level for the analyte, as well as an analyte specific method for analyte
 159      levels exceeding a proposed "screening level" that is a fraction of the action level. For example,
 160      it may be acceptable to propose a gross measurement method when its combined standard
 161      uncertainty meets the method uncertainty requirement at concentration levels much below the
 162      action level. A gross measurement method may be employed initially for some projects. Such an
 163      approach would have to be agreed to by the laboratory and project manager. The method
 164      validation, discussed in Section 6.6, should demonstrate that the gross measurement method can
 165      measure the analyte of interest (directly or indirectly) at the proposed analyte concentration and
 166      meet the uncertainty requirement in the presence of other radionuclides. Appendix C provides
 167      guidance on how to determine the acceptable method uncertainty at an analyte concentration
 168      relative to the action level.
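
          As a rough numerical sketch of this kind of check (all values and the decision rule below
          are assumed for illustration only; Appendix C gives the actual guidance), a gross
          measurement result might be screened as follows:

              # Hypothetical values, for illustration only.
              action_level = 0.5        # analyte concentration action level, Bq/g
              u_required = 0.05         # required method uncertainty at the action level, Bq/g
              screening_fraction = 0.5  # assumed screening level as a fraction of the action level

              def gross_method_acceptable(result: float, u_combined: float) -> bool:
                  """Accept the gross result only when it falls well below the assumed
                  screening level and its combined standard uncertainty meets the
                  required method uncertainty."""
                  return (result < screening_fraction * action_level
                          and u_combined <= u_required)

              print(gross_method_acceptable(0.10, 0.04))  # True: below screening level
              print(gross_method_acceptable(0.40, 0.04))  # False: above screening level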

 169      In general, the development  or selection of a method follows several broad considerations. These
 170      include analyte and matrix characteristics, technical complexity and practicality of methods,
 171      quality requirements, availability of equipment, facility and staff resources, regulatory concerns,
 172      and economic considerations. Each of the broad considerations can be detailed. The following
 173      list, although not exhaustive, provides insight into the selection of an appropriate method. Many of
 174      these categories are discussed in subsequent MARLAP Part II chapters.

 175       •  Analyte/radionuclide/isotope of interest
 176          o  Decay emission (particle or photon), atom detection, or chemical (photon detection)
 177          o  Half-life of analyte
 178          o  Decay products (progeny); principal detection method or interference
 179          o  Chemical/physical forms (e.g., gas, volatile)
 180          o  Use of nondestructive or destructive sample analysis

 181       •  Level of other radionuclides or chemical interference
 182          o  Level of decontamination or selectivity required, e.g., a decontamination factor of 10³
 183             for an interfering nuclide (60Co) present with the analyte of interest (241Pu); see the example after this list
 184          o  Resolution of measurement technique
 185          o  Robustness of technique for handling large fluctuations in interference levels and
 186             variations in a matrix


187          o  Radionuclides inherent in background

188       •  Matrix
189          o  Destructive testing
190             -  Stable elemental interferences
191             -  Difficulty in dissolution of a matrix
192             -  Difficulty in ensuring homogeneity of aliquant
193             -  Inconsistency in chemical forms and oxidation states of the analyte versus the tracer
194          o  Non-destructive testing
195             -  Heterogeneity of final sample for analysis
196             -  Self absorption of particle/photon emissions within a matrix

197       •  Degree of method complexity
198          o  Level of technical ability required of analysts
199          o  Reproducibility of quality results between analysts
200          o  Method applicability to sample batch processing
201          o  Extensive front-end chemical-processing technique (sample dissolution, analyte
202             concentration and purification/isolation, preparation for final form for radiometrics)
203          o  Nuclear instrumentation oriented technique (minimal chemical processing)

204       •  Required sample turnaround time
205          o  Half-life of analyte
206          o  Sample preparation or chemical method processing time
207          o  Nuclear instrumentation measurement/analysis time
208          o  Chemical or sample matrix preservation time
209          o  Batch processing
210          o  Degree of automation available/possible

211       •  Status of possible methods and applications
212          o  Validated for the intended application
213          o  Staff qualified and trained to use method(s)
214          o  Existing QC program for method(s)
215          o  Specialized equipment, tracers, reagents, or materials available

216       •  Hazardous or mixed waste production
217          o  Older classical techniques versus new advanced chemical technologies
218          o  Availability and expense of waste disposal

219       •  Associated costs
220          o  Labor, instrumentation usage, facilities, radiological waste costs
221          o  Method applicability to portable or mobile laboratory facilities
222          o  Availability of service hookups
223          o  Need for facility environmental controls
224          o  Need for regulatory permitting of mobile laboratory facility
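
         To illustrate the decontamination factor cited in the interference item above (the activities
         are hypothetical), the factor is simply the ratio of the interferent's activity before and after
         the separation:

             # Decontamination factor (DF) for an interfering nuclide across a separation:
             # DF = interferent activity before separation / activity remaining after.
             activity_before = 1.0e3  # Bq of interferent in the prepared sample (hypothetical)
             activity_after = 1.0     # Bq carried into the final test source (hypothetical)

             df = activity_before / activity_after
             print(f"Decontamination factor: {df:.0e}")  # 1e+03, the 10³ of the example above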

225     6.5   Project-Specific Consideration for Method Selection

226     Certain parameters of the APSs (see Chapter 3 and the example in Figure 3.2) within the SOW
227     are important to the method selection process.  These include the analytes, matrix type, matrix
228     characterization, analyte and matrix interferences, analyte speciation information gathered from
229     process knowledge, sample process specifications (such as radiological holding times and sample
230     processing turnaround times), and the MQOs. While these issues should be resolved during
231     project planning, they are presented here as guidance to the laboratory for their review and
232     evaluation of the technical adequacy of the SOW and to provide context for the method
233     evaluation and selection process. Many of the issues from the project planning point of view are
234     discussed in Section 3.3.

235     6.5.1   Matrix and Analyte Identification

236     The first step in selecting a method is knowing what analytes and sample matrices are involved.
237     The following sections discuss what important information should accompany analyte and matrix
238     identification.

239     6.5.1.1 Matrices

240     A detailed identification and description of the sample matrix are important aspects in the
241     selection of an analytical method to meet the MQOs. The SOW should provide the necessary
242     detailed sample matrix description, including those important matrix characteristics gathered
243     from process knowledge. The laboratory should evaluate whether the existing sample preparation
244     and dissolution steps of a method (Chapters 10 and 12 through 15) will be sufficient to meet the
245     MQOs or the method validation requirements. The matrix will also determine, to a certain extent,
246     waste handling and disposal at the laboratory. If the matrix description is too vague or generic,
247     the laboratory should contact the technical representative named in the SOW and request
248     additional information.
249
250     The laboratory should ensure that the sample matrix description in the SOW reflects what is
251     considered to be the "sample" by the project manager and that the description is of sufficient detail to

252      select the method preparation or analyte isolation steps that will meet the MQOs for the matrix.
253      The laboratory should not accept generic sample matrix descriptions such as liquids or solids. For
254      example, the differences between potable water and motor oil are obvious, but both may be
255      described as a "liquid sample." However, there may be only subtle differences between potable
256      surface water and groundwater but major differences between potable and process effluent
257      waters. The laboratory should consider how much method robustness is needed in order to
258      address the varied amounts of possible stable elements or compounds within a non-specified
259      water matrix. Furthermore, when water from a standing pool is received in the laboratory, it may
260     contain some insoluble matter. The question then arises: is the sample the entire contents of
261     the container, what remains in the container, the insoluble material, or just the water? A clay
262      will act as an ion exchange substrate, while a sand may have entirely different retention
263      properties. Both can be described as a soil or sediment, but the properties with which they retain
264      a radionuclide are substantially different; thus, the method to properly isolate a particular
265      radionuclide will vary. The laboratory should ensure that the selected method is consistent with
266      the intended sample matrix, and the analytical results convey analyte concentration related to the
267      proper matrix (i.e., Bq/L dissolved, Bq/L suspended, or Bq/L total). For such cases, the
268      laboratory should request the project manager to clarify the "matrix" or "sample" definition.

269      Matrices generically identified  as "solid" require additional clarification or information in order
270     to select and validate a method properly. For example, sludges from a sewage treatment facility
271      may be classified as a solid, but the suspended and aqueous portions (and possibly the dried
272      residual material) of the sample may have to be analyzed. Normally, the radioanalyte concentra-
273      tion in soils and sediments is reported in terms of becquerels per dry weight. However, certain
274      projects may require additional sample process specifications (Section 6.5.4) related to the soil or
275      sediment matrix identification that will affect the method selection process and the reporting of
276      the data. This may involve sectioning of core samples, specified drying temperature of the
277      sample, determining wet-to-dry weight ratio, removing organic material  or detritus, homogeni-
278      zing and pulverizing, sieving and sizing samples, etc. In order to determine the average analyte
279      concentration of a sample of a given size containing radioactive particles, proper sample
280      preparation and subsampling coupled with the applicable analytical methods are required
281      (Chapter 12 and Appendix F). For alpha-emitting radionuclides, the method selected may only be
282      suitable to analyze a few grams of soil or sediment, depending on the organic content. The
283      laboratory should identify to the project manager the typical subsample or aliquant size that is
284      used for the proposed method. If information provided to the laboratory on process knowledge
285      indicates that there may be a possibility of radioactive particles, or selected analyte adsorption
286      onto soil or sediment particles, the laboratory should propose sample preparation and analytical
287      methods that will address these matrix characteristics. The laboratory should submit the proposed
288      methods annotated with the suspected matrix characterization issues.


289     When selecting the methods for the analysis of flora (terrestrial vegetation, vegetables, aquatic
290     plants, algae, etc.) or fauna (terrestrial or aquatic animals) samples, the detailed information on
291     the matrix or the unique process specifications should be used by the laboratory to select or
292     validate the method, or both. The laboratory should ensure that the specific units for the
293     analytical results are consistent with the matrix identification and unique process specifications
294     stated in the SOW. Most flora and fauna results are typically reported in concentrations of wet
295     stated in the SOW. Flora and fauna results are typically reported as concentrations per wet
296     portion of the sample processed and the results to reflect this portion, e.g., fillet of sport fish,
297     meat and fluid of clams, etc. For the alpha- and beta-emitting radionuclides, aquatic vegetation
298     normally is  analyzed in the dry form, but the analyte concentration is reported as wet weight. The
299     laboratory should ensure that the sample preparation method (Chapter 12) includes the
300     determination of the necessary wet and  dry weights.
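
        As a simple arithmetic illustration of why both weights must be recorded (the masses and
        concentration below are hypothetical), a result measured on the dried material converts to
        the wet-weight basis as follows:

            # Convert a concentration measured on dried material to a wet-weight basis.
            wet_mass = 250.0  # g, sample as received (hypothetical)
            dry_mass = 40.0   # g, same sample after drying (hypothetical)
            conc_dry = 2.0    # Bq/g dry weight, as measured (hypothetical)

            conc_wet = conc_dry * (dry_mass / wet_mass)  # Bq/g wet weight
            print(f"{conc_wet:.2f} Bq/g wet weight")     # 0.32 Bq/g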

301     These considerations bear not only on the method selected but also on how the sample should be
302     collected and preserved during shipment. When possible, the laboratory should evaluate the
303     proposed sample collection and preservation methods, as well as timeliness of shipping, for
304     consistency with the available analytical methods. Discrepancies noted in the SOW for such
305     collateral areas should be brought to the attention of the project manager. For example, sediment
306     samples that have  been cored to evaluate the radionuclide depth profile should have been
307     collected and treated in a fashion to retain the depth profile. A common method is to freeze the
308     core samples in the original plastic coring sleeves and ship the samples on ice. The SOW should
309     define the specifics on how to treat the core samples and the method of sectioning the samples
310     (e.g., cutting the cores into the desired lengths or flash heating the sleeves with subsequent
311     sectioning).

312     The SOW should have clearly delineated the matrix specifications required for method
313     validation. In some cases, sufficient information may have been provided to define the
314     parameters necessary to prepare method validation reference material (MVRM) for method
315     validation purposes (Section 6.6). The laboratory should ensure that sufficient information and
316     clarity have been provided on the matrix to conduct a proper method validation.

317     6.5.1.2  Analytes and Potential Interferences

318     The SOW should describe the analytes of interest and the presence of any other chemical and
319     radionuclide contaminants (potential method interferences and their anticipated concentration)
320     that may be in the  samples. This information should be provided in the SOW to allow the
321     laboratory's radiochemist to determine the specificity and robustness of a method that will
322     address the multiple analytes and their interferences. The delineation of other possible interfering


323     radionuclides is extremely important in the selection of a method to ensure that the necessary
324     decontamination factors and purification steps are considered.

325     The size of the sample needed by the laboratory will depend on the number of analytes and
326     whether the laboratory will select individual methods for each analyte or a possible "sequential"
327     analytical method, where several analytes can be isolated from the same sample and analyzed. If
328     a sample size is listed in the SOW, the laboratory should determine if there will be sufficient
329     sample available to analyze all analytes, the associated QC samples, and any backup sample for
330     re-analyses. Other aspects, such as the presence of short-lived analytes or analytes requiring very
331     low detection limits, may complicate the determination of a proper sample size.

332     The laboratory should ensure that the method validation requirements in the SOW are consistent
333     with the analytes and matrix. The method validation protocols defined in Section 6.6 are
334     applicable to methods for single analyte analyses or to  a "sequential method" where several
335     analytes are isolated and analyzed. The laboratory should develop a well-planned protocol
336     (Section 6.6.2) for method validation that considers the method(s), analyte(s), matrix and
337     validation criteria.

338     6.5.2 Process Knowledge

339     Process knowledge typically is related to facility effluent and environmental surveillance
340     programs, facility decommissioning, and site remediation activities. Important process
341     knowledge may be found in operational history or regulatory reports associated with these
342     functions or activities. It is imperative that the laboratory review the information provided in the
343     SOW to determine whether the anticipated analyte concentration and matrix are consistent with
344     the scope of the laboratory operations. Process  knowledge contained in the SOW should provide
345     sufficient detail for the laboratory to determine, quickly and decisively, whether or not to pursue
346     the work. If sufficient detail is not provided in the SOW, the laboratory should request the project
347     planning documents. Laboratories having specialized sample preparation facilities that screen the
348     samples upon arrival can make the necessary aliquanting or dilutions to permit the processing of
349     all low-level samples in the laboratories. Laboratories that have targeted certain sectors  of the
350     nuclear industry or a particular nuclear facility may be  very knowledgeable in the typical
351     chemical and physical forms of the analytes of a given  sample matrix and may not require
352     detailed process knowledge information. However, under these circumstances, the laboratory's
353     method should be robust and rugged enough to handle  the expected range of analyte concen-
354     trations, ratios of radionuclide and chemical interferences, and variations in the sample matrix.
355      Process knowledge may provide valuable information on the possible major matrix constituents,
356      including major analytes, chemical/physical composition, hazardous components, radiation
357      levels, and biological growth (e.g., bacteria, algae, plankton, etc.) activities. When provided, the
358      laboratory should use this information to determine if the sample collection and preservation
359      methodologies are consistent with the proposed radioanalytical method chosen. In addition, the
360      information also should be reviewed to ensure that the proposed sample transportation or
361      shipping protocols comply with regulations governing the laboratory operation.

362      Process knowledge information in the SOW may be used by the laboratory to refine method
363      selection from possible radiometric/chemical interferences, chemical properties of the analytes or
364      matrix, and hazardous components, among others. Chapter 14 describes the various generic
365      chemical processes that may be used to ensure proper decontamination or isolation of the analyte
366      from other interferences in the sample. These include ion exchange, co-precipitation, oxidation/
367      reduction, and solvent extraction among others. The process knowledge information provided in
368      the SOW should be reviewed to determine whether substantial amounts of a radionuclide that
369      normally would be used as a radiotracer will be present in the sample.  Similarly, information on
370      the levels of any stable isotope of the analyte being evaluated is equally important. Substantial
371      ambient or background amounts of either a stable isotope of the radionuclide or the radiotracer in
372      the sample may produce elevated and false chemical yield factors. In addition, substantial
373      amounts of a stable isotope of the analyte being evaluated may render certain purification
374      techniques inadequate (e.g., ion exchange or solid extractants).
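
         A hypothetical example of the yield distortion described above: if the sample already
         contains ambient activity of the tracer nuclide, the apparent chemical yield is
         inflated, and the analyte result (which is divided by the yield) is biased low. All
         values in the sketch below are assumed for illustration.

             # Sketch: distortion of the apparent chemical yield when the sample
             # already contains ambient activity of the tracer nuclide.
             # All values are hypothetical.

             tracer_added_bq = 1.00      # tracer spike added by the laboratory
             ambient_tracer_bq = 0.25    # same nuclide already present in the sample
             true_yield = 0.80           # actual fraction carried through the chemistry

             recovered_bq = true_yield * (tracer_added_bq + ambient_tracer_bq)
             apparent_yield = recovered_bq / tracer_added_bq  # lab credits only its spike

             print(f"true yield:     {true_yield:.2f}")
             print(f"apparent yield: {apparent_yield:.2f}")
             # The analyte result is divided by the yield, so an inflated apparent
             # yield biases the reported concentration low.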

375      6.5.3   Radiological Holding and Turnaround Times

376      The SOW should contain the requirements for the analyte's radiological holding and sample
377      turnaround times. MARLAP defines radiological holding time as the time differential between
378      the date of sample collection and the date of analysis. It is important that the laboratory review
379      the specifications for radionuclides that have short half-lives (less than 30 days), because the
380      method proposed by the laboratory may depend on the required radiological holding time. For
381      very short-lived radionuclides, such as 131I or 224Ra, it is very important to analyze the samples
382      within the first two half-lives in order to meet the MQOs readily. A laboratory may have
383      several methods for the analysis of an analyte, each having a different analyte detection and
384      quantification capability. Of the possible methods available, the method selected and proposed by
385      the laboratory most likely will be dependent on the radiological holding time requirement, half-
386      life of the analyte, and the time available after sample receipt at the laboratory. When a
387      laboratory has several methods to address variations in these constraints, it is recommended that
388      the laboratory propose more than one method with a clarification that addresses the radiological
389      holding time and MQOs. In some cases, circumstances arise which require the classification of


 390      sample processing into several time-related categories (Chapter 5). For example, the determina-
391      tion of 131I in water can be achieved readily within a reasonable counting time through direct
392      gamma-ray spectrometry (no chemistry) using a Marinelli beaker counting geometry, when the
 393      detection requirement is 0.4 Bq/L and the radiological holding time is short. However, when the
394      anticipated radiological holding time is on the order of weeks, then a radiochemistry method
 395      using beta detection or beta-gamma coincidence counting would be more appropriate to meet the
 396      detection requirement. The more sensitive method also may be used when there is insufficient
 397      sample size or when the analyte has decayed to the point where the less sensitive method cannot
 398      meet the required MQOs. Another example would be the analysis of 226Ra in soil, where the
 399      laboratory could determine the 226Ra soil concentration through the quantification of a 226Ra
 400      decay product by gamma-ray spectrometry after a certain ingrowth period, instead of direct
 401      counting of the alpha particle originating from the final radiochemical product (micro-
402      precipitate) using alpha spectrometry.
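
         The following sketch illustrates why the two-half-life guideline matters, using the
         published half-life of 131I (about 8.02 days); the holding times shown are
         illustrative assumptions.

             import math

             # Sketch: fraction of a short-lived analyte remaining after a given
             # radiological holding time. The 131I half-life (about 8.02 days) is
             # physical data; the holding times are illustrative assumptions.

             HALF_LIFE_DAYS = 8.02
             decay_constant = math.log(2) / HALF_LIFE_DAYS

             for holding_time_days in (2, 8, 16, 32):
                 remaining = math.exp(-decay_constant * holding_time_days)
                 print(f"{holding_time_days:>3} d holding time: "
                       f"{remaining:6.1%} of the activity remains")

         After two half-lives (about 16 days here), only a quarter of the original activity
         remains, which is why a more sensitive method may be needed for longer holding times.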

 403      Sample (processing) turnaround time normally means the time differential from the receipt of the
 404      sample at the laboratory to the reporting of the analytical results. As such, the laboratory should
 405      evaluate the SOW to ensure that the sample turnaround time, radiological holding time, data
 406      reduction and reporting times, and project needs for rapid data evaluation are consistent and
 407      reasonable. Method selection should take into consideration the time-related SOW requirements
 408      and operational aspects. When discrepancies are found in the SOW, the laboratory should
 409      communicate with the project manager and resolve any issue. Additionally, the response to the
 410      SOW should include any clarifications needed for sample turnaround time and/or radiological
411      holding time issues.

 412      6.5.4   Unique Process Specifications

 413      Some projects may incorporate detailed sample processing parameters, specifications, or both
 414      within the SOW. Specifications for parameters related to sample preparation may include the
 415      degree of radionuclide heterogeneity in the final sample matrix prepared at the laboratory, the
 416      length of the sections of a soil or sediment core for processing, analysis of dry versus wet weight
 417      material, partitioning of meat and fluid  of bivalves for analyses, and reporting of results for
 418      certain media as a dry or wet weight.  Specifications related to method analysis could include
 419      radionuclide chemical speciation in the sample matrix. The laboratory must evaluate these
 420      specifications carefully, since various parameters may affect the method proposed by the
 421      laboratory. When necessary, the laboratory should request clarification of the specifications in
 422      order to determine a compatible method. In addition, the laboratory should ensure that the
 423      method validation process is consistent with the unique process requirements. In some cases, not
 424      all special process specifications must be validated and, in other cases, site-specific materials


425     (also referred to as MVRM) will be required for method validation. When necessary, the
426     laboratory also should request site-specific reference materials having the matrix characteristics
427     needed for proper method validation consistent with the special process requirements. It is
428     incumbent upon the laboratory to understand clearly the intent of the special process
429     specifications and how they will be addressed.

430     6.5.5  Measurement Quality Objectives

431     The specific method performance characteristics having a measurement quality objective may
432     include:

433       •  Method uncertainty at a specified analyte concentration level;
434       •  Quantification capability (minimum quantifiable concentration);
435       •  Detection capability (minimum detectable concentration);
436       •  Applicable analyte concentration range;
437       •  Method specificity; and
438       •  Method ruggedness.

439     How each of these characteristics affects the method selection process is discussed in detail
440     in the subsequent sections.

441     6.5.5.1  Method Uncertainty

442     From the directed planning process, the required method uncertainty at a stated analyte
443     concentration should have been determined for each analyte/matrix combination. The method
444     uncertainty requirement may be linked to the width of the gray region (Appendix C). MARLAP
445     recommends that the SOW include the specifications for the action level and the required method
446     uncertainty for the analyte concentration at the action level for each analyte/matrix. For research
447     and baseline monitoring programs, the action level and gray region concepts may not be
448     applicable. However, for  these applications, the project manager should establish a concentration
449     level of interest and a required method uncertainty at that level. The laboratory should ensure that
450     this method uncertainty requirement is clearly stated in the SOW.

451     The laboratory should select a method that will satisfy the method uncertainty requirement at the
452     action level or other required analyte level. MARLAP uses the term "method uncertainty" to
453     refer to the predicted uncertainty of a result that would be measured if a method were applied to a
454     hypothetical laboratory sample with a specified analyte concentration. The uncertainty of each
455     input quantity (method parameter) that may contribute significantly to the total uncertainty


456     should be evaluated. For some methods, the uncertainty of an input quantity may vary by analyst
457     or spectral unfolding software. Chapter 19 provides guidance on how to calculate the combined
458     standard uncertainty of the analyte concentration, and Section 19.6.12 shows how to predict the
459     uncertainty for a hypothetical measurement. For most basic methods, uncertainty values may be
460     included for the following input quantities (parameters):

461       •  Poisson counting statistics (net count rate);
462       •  Detector efficiency, if applicable;
463       •  Chemical yield (when applicable) or tracer yield;
464       •  Sample volume/weight;
465       •  Decay/ingrowth factor; and
466       •  Radiometric interference correction factor.
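
         For a method whose result is essentially a product or quotient of the input quantities
         listed above, the relative combined standard uncertainty of uncorrelated inputs can be
         approximated by adding relative uncertainties in quadrature; Chapter 19 gives the
         rigorous treatment. The sketch below uses assumed, illustrative uncertainty values.

             import math

             # Sketch: first-order uncertainty propagation for a result of the form
             #   A = R_net / (efficiency * yield * mass * decay_factor)
             # Uncorrelated relative standard uncertainties add in quadrature.
             # The values below are illustrative assumptions; Chapter 19 gives the
             # full treatment.

             relative_u = {
                 "net count rate (Poisson)": 0.05,
                 "detector efficiency":      0.03,
                 "chemical/tracer yield":    0.02,
                 "sample weight":            0.001,
                 "decay/ingrowth factor":    0.005,
                 "interference correction":  0.02,
             }

             combined = math.sqrt(sum(u ** 2 for u in relative_u.values()))
             print(f"relative combined standard uncertainty: {combined:.1%}")
             # Inputs well below 1% relative uncertainty (sample weight, decay
             # factor) contribute almost nothing to the combined value.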

467     Typically, for low-level environmental remediation or surveillance activities, only those input
468     quantities having an uncertainty greater than one percent significantly contribute to the combined
469     standard uncertainty. Other than the radiometric interference correction factor and Poisson
470     counting uncertainties, most input quantity uncertainties normally do not vary as a function of
471     analyte concentration. At analyte levels near or below the detection limit, the Poisson counting
472     uncertainty may dominate the method's uncertainty. However, at the action level or above, the
473     Poisson counting uncertainty may not dominate.

474     When appropriate, the laboratory should determine the method uncertainty over the MQO analyte
475     concentration range (Section 6.5.5.4), including the action level or other specified analyte
476     concentration. The laboratory's method validation (Section 6.6) should demonstrate or show
477     through extrapolation or inference (e.g., from a lower or higher range of concentrations) that this
478     method uncertainty requirement can be met at the action level or specified analyte concentration
479     value. Method validation documentation should be provided in the response to the SOW.

480     6.5.5.2 Quantification Capability

481     For certain projects or programs, the project planning team may develop an MQO for the
482     quantification capability of a method. The quantification capability, expressed as the minimum
483     quantifiable concentration (MQC), is the smallest concentration of the analyte that ensures a
484     result whose relative standard deviation is not greater than a specified value, usually 10 percent.
485     Chapter 19 provides additional information on the minimum quantifiable concentration.

486     MARLAP recommends that, when required, a laboratory analyze each sample to meet the MQC
487     requirement. For example, if the MQC requirement for 89Sr is 1.0 Bq/g (with a 10 percent relative


488     standard deviation), the laboratory should select a method that has sufficient chemical yield
489     (Chapter 19), beta detection efficiency, low background, and sample (processing) turnaround time
490     for a given sample mass, accounting for radioactive decay, to achieve a nominal measurement
491     uncertainty of 0.1 Bq/g when the 89Sr concentration is 1.0 Bq/g. The same forethought that a laboratory gives to
492     estimating a method's minimum detectable concentration (MDC) for an analyte should be given
493     to the MQC requirement. The laboratory should consider the uncertainties of all input quantities
494     (detector efficiency, chemical yields, interferences, etc.), including the Poisson counting
495     uncertainty when selecting a method. This is an important consideration, because for some
496     methods, the Poisson counting uncertainty at the MQC level may contribute only 50 percent of
497     the combined standard uncertainty. Therefore, the laboratory may have to select a method that
498     will meet the MQC requirement for a variety of circumstances, including variations in matrix
499     constituents and chemical yields, radionuclide and chemical interferences, and radioactive decay.
500     In addition, sufficient sample size for processing may be critical to achieving the MQC
501     specification.
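
         A sketch of the counting-statistics portion of such an evaluation for the 89Sr example
         follows. The detector efficiency, chemical yield, aliquant mass, background, and
         counting time are assumptions chosen for illustration, not recommended values.

             import math

             # Sketch: counting-statistics check against the example MQC requirement
             # (89Sr at 1.0 Bq/g with a 10 percent relative standard deviation).
             # The detector and chemistry parameters are hypothetical.

             conc_bq_per_g = 1.0        # MQC level from the example
             aliquant_mass_g = 5.0      # mass actually counted (assumed)
             efficiency = 0.40          # beta detection efficiency (assumed)
             chem_yield = 0.85          # strontium chemical yield (assumed)
             count_time_s = 3600.0      # sample and background count times (assumed)
             bkg_rate_cps = 0.02        # detector background rate (assumed)

             source_cps = conc_bq_per_g * aliquant_mass_g * efficiency * chem_yield
             gross_counts = (source_cps + bkg_rate_cps) * count_time_s
             bkg_counts = bkg_rate_cps * count_time_s
             net_counts = gross_counts - bkg_counts
             u_net = math.sqrt(gross_counts + bkg_counts)   # Poisson, equal count times
             rsd = u_net / net_counts

             print(f"net counts: {net_counts:.0f}, counting RSD: {rsd:.1%}")
             # Counting statistics are only one component; yield, efficiency, and
             # interference uncertainties must still leave the total at or below 10%.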

502     During the method validation process, the ability of the method to meet the required MQC
503     specification should be tested. The method validation acceptance criteria presented in Section 6.6
504     have been formulated to evaluate the MQC requirement at the proper analyte concentration level,
505     i.e., action level or other specified analyte concentration.

506     Since the laboratory is to report the analyte concentration value and its measurement uncertainty
507     for each sample, the project manager or data validator easily can evaluate the reported data to
508     determine compliance with the MQC requirement. Some projects may send performance testing
509     (PT) material spiked at the MQC level as a more in-depth verification of the compliance with this
510     requirement.

511     6.5.5.3 Detection Capability

512     For certain projects or programs, the method selected and proposed by the laboratory should be
513     capable of meeting a required MDC  for the analyte/matrix combination for each sample
514     analyzed. For certain monitoring or research projects, the analyte MDC may be the important
515     MQO to be specified in the SOW. For such projects,  the MDC specification may be based on the
516     analyte concentration of interest or the state-of-the-art capability of the employed technology or
517     method. No matter what premise is used to set the value by the project planning team, the
518     definition of, or the equation used to calculate, the analyte MDC  should be provided in the SOW
519     (Chapter 19). Furthermore, the SOW should specify how to treat appropriate blanks or the
520     detector background when calculating the MDC. The laboratory should be aware that not all
521     agencies or organizations define or calculate the MDC in the same manner. It is important for the


522      laboratory to check that the SOW clearly defines the analyte detection requirements. In most
523      cases, it would be prudent for the laboratory to use a method that has a lower analyte MDC than
524      the MDC required by the SOW.
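
         As one example of an a priori formulation (a Currie-style detection limit with
         approximately 5 percent false positive and false negative rates and a well-known
         background; this is not the only formulation in use, which is why the SOW must state
         the governing equation), the sketch below evaluates an MDC from assumed, illustrative
         parameters.

             import math

             # Sketch: one common a priori MDC formulation (a Currie-style detection
             # limit, L_D = 2.71 + 4.65*sqrt(B) counts, with roughly 5% false
             # positive/negative rates and a well-known background). Definitions
             # differ among agencies, so the SOW must state the governing equation.
             # All parameter values are assumptions.

             bkg_counts = 400.0         # expected background/blank counts, B
             efficiency = 0.25          # detector efficiency
             chem_yield = 0.80          # chemical yield
             aliquant_mass_g = 1.0      # aliquant mass
             count_time_s = 60000.0     # count duration
             decay_factor = 0.95        # decay/ingrowth correction

             detection_limit_counts = 2.71 + 4.65 * math.sqrt(bkg_counts)
             mdc_bq_per_g = detection_limit_counts / (
                 efficiency * chem_yield * aliquant_mass_g * count_time_s * decay_factor)
             print(f"a priori MDC: {mdc_bq_per_g:.2e} Bq/g")
             # Interference counts that raise B (e.g., Compton continuum from 137Cs)
             # raise the MDC accordingly.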

525      In some situations, a radiochemical method may not be robust or specific enough to address
526      interferences from other radionuclides in the sample. The interferences may come from the
527      incomplete isolation of the analyte of interest resulting in the detection of the decay emissions
528      from these interfering nuclides. These interferences would increase the background of the
529      measurement for the analyte of interest and, thus, increase the uncertainty of the measurement
530      background. Consequently, an a priori MDC, since it is calculated without prior sample
531      knowledge or inclusion of the interference uncertainties, would underestimate the actual
532      detection limit for the sample under analysis.  Another example of such interferences or increase
533      in an analyte's background uncertainty can be cited when using gamma-ray spectrometry to
534      determine 144Ce in the presence of 137Cs. The gamma energy usually associated with the
535      identification and quantification of 144Ce is 133.5 keV. The gamma energy for 137Cs is  661.6 keV.
536      If a high concentration of 137Cs is present in the sample, the Compton scattering from the 661.6
537      keV into the 133.5 keV region may decrease the ability to detect 144Ce by one to two orders of
538      magnitude over an a priori calculation that uses a nominal non-sample specific background
539      uncertainty. Another example can be cited for alpha-spectrometry and the determination of
540      isotopic uranium. If some interfering metal is present in unexpected quantities and carries onto
541      the final filter mount or electrodeposited plate, a substantial decrease in the peak resolution may
542      occur (resulting in an increased width of the alpha peak). Depending on the severity of the
543      problem, there may be overlapping alpha peaks resulting in additional interference terms that
544      should be incorporated into the MDC equation. In order to avoid subsequent analyte detection
545      issues, it is important for the laboratory to inquire whether or not the project manager has
546      considered all the constituents (analytes and interferences) present in the sample when specifying
547      a detection limit for an analyte.

548      The laboratory should include documentation in the response to the SOW that the method
549      proposed can meet the analyte's MDC requirements for the method parameters (e.g., sample size
550      processed, chemical  yield, detector efficiency, counting times, decay/ingrowth correction factors,
551      etc.). When practicable, care should be given  to ensure the blank or detector background
552      uncertainty includes contributions from possible anthropogenic and natural radionuclide
553      interferences. In addition, any proposed screening method should meet the detection limit
554      requirement in the presence of other radionuclide interferences or natural background
555      radioactivity. When appropriate or required, the laboratory should test the method's capability of
556      meeting the required MDC using MVRMs that have analytes and interferences in the expected
557     analyte concentration range. Upon request, the project manager should arrange to provide
558     MVRMs to the laboratory.

559     6.5.5.4 Applicable Analyte Concentration Range

560     The SOW should state the action level for the analyte and the expected analyte concentration
561     range. The proposed method should provide acceptable analytical results over the expected
562     analyte concentration range for the project. Acceptable analytical results used in this context
563     means consistent method precision (at a given analyte concentration) and without significant
564     bias. The applicable analyte concentration range may be three or four orders of magnitude.
565     However, most radioanalytical methods, with proper analyte isolation and interference-decon-
566     lamination steps, will have a linear relationship between the analytical result and the analyte
567     concentration. For certain environmental monitoring or research projects, the laboratory should
568     ensure that there are no instrument or analytical blank background problems. If the background is
569     not well-defined, there may be an inordinate number of false positive and false negative results.

570     In its response to the SOW, the laboratory should include method validation documentation that
571     demonstrates the method's capability over the expected range. The laboratory's method
572     validation (Section 6.6) should demonstrate or show through extrapolation or inference (e.g.,
573     from a different range of concentrations) that the method is capable of meeting the analyte
574     concentration range requirement.

575     6.5.5.5 Method Specificity

576     The proposed method should have the necessary specificity for the analyte/matrix combination.
577     Method specificity refers to the method's capability, through the necessary decontamination or
578     separation steps, to remove interferences or to isolate the analyte of interest from the sample over
579     the expected analyte concentration range. Method specificity is applicable to both stable and
580     radioactive constituents inherent in the sample. Certain matrices, such as soil and sediments,
581     typically require selective isolation of femtogram amounts of the analyte from milligrams to
582     gram quantities of matrix material. In these circumstances, the method requires both specificity
583     and ruggedness to handle variations in the sample constituents.

584     If other radionuclide interferences are known or expected to be present, the SOW should provide
585     a list of the radionuclides and their expected concentration ranges. This information enables the
586     laboratory to select and propose a method that has the necessary specificity to meet the MQOs.
587     As an alternative, the project manager may specify in the SOW the degree of decontamination a
588     method needs for the interferences present in the samples. If the laboratory is not provided this


589     information, method specificity cannot be addressed properly. The laboratory should ensure that
590     related information on the matrix characteristics, radiometric or chemical interferences, and
591     chemical speciation is provided to properly select a method.

592     6.5.5.6 Method Ruggedness

593     Ruggedness is the ability of the method to provide accurate analytical results over a range of
594     possible sample constituents, interferences, and analyte concentrations, as well as to tolerate
595     subtle variations in the application of the method by various chemists (EPA, 1998; APHA,
596     1989). Ruggedness is somewhat qualitative (Chapter 7). Therefore, the desirable parameters of a
597     rugged method are difficult to specify quantitatively. A ruggedness test usually is conducted by
598     systematically altering the critical variables (or quantities) associated with the method and
599     observing the magnitude of the associated changes in the analytical results. ASTM E1169
600     provides generic guidance on how to conduct method ruggedness tests under short-term, high-
601     precision conditions. In many cases, a rugged method may be developed over time (typically
602     when difficulty is experienced applying an existing method to variations in the sample matrix or
603     when two analysts have difficulty achieving the same level of analytical quality or precision).

604     A laboratory may have several methods for an analyte/matrix combination. Samples from
605     different geographical locations or having different processes may have completely different
606     characteristics. Therefore, the laboratory  should select a method that is rugged enough to meet
607     the APSs in the SOW. As indicated in Section 6.6, the prospective client may send site-specific
608     MVRM samples for the method validation process or for PT samples (Chapter 7).

609     6.5.5.7 Bias Considerations

610     As discussed earlier, the proposed method should provide acceptable analytical results over the
611     expected analyte concentration range for the project. "Acceptable results" in this context
612     means consistent method precision (at a given analyte concentration) without significant
613     bias. According to ASTM (E177, E1488, D2777, D4855), "bias of a measurement process is a
614     generic concept related to a constant or systematic difference between a set of test results from
615     the process and an accepted reference value of the property being measured," or "the difference
616     between a population mean of the measurements or test results and the accepted reference or true
617     value." In contrast, ASTM (D2777) defines precision as "the degree of agreement of repeated
618     measurements of the same property, expressed in  terms of dispersion of test results (measure-
619     ments) about the arithmetical mean result obtained by repetitive testing of a homogeneous
620     sample under specified conditions." MARLAP considers bias to be a persistent difference of the
621     measured result from the true value of the quantity being measured, which does not vary if the

622     measurement is repeated. Normally, bias cannot be determined from a single result or a few
623     results (unless the bias is large) because of the analytical uncertainty component in the measure-
624     ment. Bias may be expressed as the percent deviation from a "known" analyte concentration.
625     Note that the estimated bias, like any estimated value, has an uncertainty—it is not known
626     exactly.

627     If bias is detected in the method validation process or from other QA processes, the laboratory
628     should make every effort to eliminate it when practical. Ideally, bias should be corrected
629     before using the method for routine sample processing. However, in some cases,  the bias may be
630     very small and not affect the overall data quality. The project manager should review the method
631     validation documentation and results from internal QC and external PE programs obtained during
632     the laboratory review process (Chapter 7), determine whether a bias exists, and assess its possible impact
633     on data usability.

634     6.6   Method Validation

635     For the purposes of MARLAP, method validation is the demonstration that the radioanalytical
636     method selected by the laboratory for the analysis of a particular radionuclide in a given matrix is
637     capable of providing analytical results to meet the project's MQOs and any other requirements in
638     the APS. Without reliable analytical methods, all the efforts of the project may be jeopardized.
639     Financial resources, timeliness, and public perception and confidence are at risk,  should the data
640     later be called into question. Proof that the method used is applicable to the analyte and sample
641     matrix of concern is paramount for defensibility. The project manager should ensure the methods
642     used in the analyses of the material are technically sound and legally defensible.

643     The method selected and proposed by the laboratory must be based on sound scientific principles
644     and must be demonstrated to produce repeatable results under  a variety of sample variations.
645     Each step of the method should have been evaluated and tested by a qualified expert (radio-
646     analytical specialist) in order to understand the limits of each step and the overall method in
647     terms of the MQOs. These steps may involve well-known and characterized sample digestion,
648     analyte purification and decontamination steps that use ion exchange, solvent extraction,
649     precipitation and/or oxidation/reduction applications. Method validation will independently test
650     the scientific basis of the method selected for a given analyte and sample matrix.

651     A method validation protocol  should be a basic element in the quality system employed by a
652     laboratory. A proposed method for a specific analyte should be validated in response to the
653     requirements within a SOW. Demonstration of method performance to meet the MQOs prior to
654     processing project samples is a critical part of the MARLAP process. As a result of internal QC

655     and external PE programs, most laboratories normally have documentation on the general or
656     overall performance of a method. As discussed later, this information, depending on many
657     factors, may be sufficient to meet the method validation criteria.

658     Methods obtained from the literature, from recognized industry standards (ASTM, ANSI, APHA)
659     or government method manuals may have been validated for certain general applications by the
660     developing or issuing laboratory. However, other laboratories would have to validate the method
661     for specific project use.

662     6.6.1   Laboratory's Method Validation Protocol

663     During the discussion on method validation, certain terms are used. These include MVRM, QC,
664     and PT materials. QC samples and programs are related to those samples or processes that are
665     used to evaluate the quality of the analytical results for the fundamental purpose of directly
666     controlling the quality of the analytical process by initiating control mechanisms. PT materials
667     are materials prepared for use in a PE program or for validating methods. MVRM refers to site-
668     specific materials that have the same or similar chemical and physical properties as the proposed
669     project samples. Although the MVRM  is the most appropriate material for testing a laboratory's
670     project-specific performance, or for validating a method for a particular project, its availability
671     may be limited depending on the project manager's ability to supply such material.

672     The laboratory's method validation protocol should include the evaluation of the method for
673     project-specific MQOs for an analyte or generic quality performance criteria, as well as other
674     generic parameters. With a properly designed method validation protocol, important information
675     may be ascertained from the analytical  results generated by the method validation process.

676     The parameters that should be specified, evaluated, or ascertained during the method
677     validation process are listed below:

678       •  Defined Method Validation Level (Table 6.1)
679       •  APSs including MQOs for each analyte/matrix
680          o   Chemical or physical characteristics of analyte when appropriate
681          o   Action level (if applicable)
682          o   Method uncertainty at a specific concentration
683          o   MDC or MQC
684          o   Bias (if applicable)
685          o   Applicable analyte concentration range including zero analyte (blanks)
686          o   Other qualitative parameters to measure the degree of method ruggedness or specificity


687       •  Defined matrix for testing, including chemical and physical characteristics that approximate
688          project samples
689       •  Selected project-specific or appropriate alternative matrix PT samples, including known
690          chemical or radionuclide interferences at appropriate levels
691       •  Defined sample preservation
692       •  Stated additional data testing criteria (such as acceptable chemical/radiotracer yield values)

693     In order to properly demonstrate that a method will meet project MQOs, the method should be
694     evaluated over a range of analyte concentrations. The analyte concentration range of the matrix
695     spikes (covering the testing levels) used for method validation should cover the expected analyte
696     concentration range for the project (Section 6.5.5.4), with the middle of the range set near the
697     action level. At the upper end of the range, the method validation samples should be analyzed to
698     have a Poisson counting uncertainty between 1 percent (ANSI N42.23) and 3 percent (1 sigma).
699     Keeping the Poisson uncertainty <3 percent (1  sigma) will ensure the observed precision, as
700     measured by multiple samples, is not dominated by the Poisson counting uncertainty. In addition,
701     anticipated or known chemical and radionuclide interferences should be added in the appropriate
702     "interference to analyte" activity or concentration ratio. Appropriate method blanks (also
703     containing interferences when practical) should be analyzed concurrently with the matrix spikes
704     to determine analyte interferences or biases near the detection limit.
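
         The Poisson requirement translates directly into a minimum number of net counts, since
         the relative standard deviation of N counts is approximately 1/sqrt(N) when background
         is negligible; the sketch below shows the arithmetic.

             import math

             # Sketch: net counts needed so the Poisson relative standard deviation
             # stays at or below a target, with background assumed negligible:
             #   RSD ~ 1/sqrt(N)  =>  N >= (1/RSD)^2

             for target_rsd in (0.01, 0.03):
                 n_required = math.ceil((1.0 / target_rsd) ** 2)
                 print(f"RSD <= {target_rsd:.0%}: at least {n_required} net counts")
             # About 10,000 net counts for 1 percent; about 1,112 for 3 percent.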

705     The number of samples for the method validation process varies according to the method
706     validation level needed. As proposed in Table 6.1, the number of samples may vary from 6 to 21,
707     depending on the robustness of the method validation.

708     6.6.2   Tiered Approach to Validation

709     While MARLAP recommends that as each new project is implemented, the methods used in the
710     analysis of the associated samples undergo some level of validation, it is the project manager's
711     responsibility to assess the level of method validation necessary. Although the end result of
712     method validation is to ensure that the method selected meets the MQOs for an analyte/matrix,
713     the extent of the validation process depends on whether the laboratory elects to develop a
714     new method or whether there is an existing validated method available that can be adapted or
715     validated for another specific project need. Therefore, MARLAP recommends that a tiered
716     approach be taken for method validation. The recommended protocols to be considered for
717     existing methods are provided in the next four sections, requiring from least to most effort: no
718     additional validation, modification of a method for a similar matrix, new application of a method,
719     and newly developed or adapted methods.  Table 6.1 consolidates recommended validation
720     requirements from various government agencies and consensus organizations. The suggested


        JULY 2001                                                                     .  MARLAP
        DRAFT FOR PUBLIC COMMENT"             6-27                    DO NOT CITE OR QUOTE

-------
         Selection and Application of an Analytical Method
721      levels of validation are indicative of the modification required of the method. It should be noted
722      that the method validation requirements of Table 6.1 permit the laboratory to use internal QC, PE
723      program, or site-specific MVRM samples, or permit the project manager to provide PT, PE
724      program, or site-specific MVRM samples for the laboratory to use. Sometimes, a project
725      manager may provide PT samples as part of the qualifying process. In this case, the project
726      manager should ensure consistency with the method validation requirements of Table 6.1.
                       TABLE 6.1 — Tiered method validation approach

         Level A (Without Additional Validation)
            Application:           Existing validated method
            Sample type:           (none)
            Acceptance criteria*:  Method previously validated (by one of
                                   Validation Levels B through H)
            Levels/replicates**:   (none)
            Number of analyses:    (none)

         Level B
            Application:           Similar matrix
            Sample type:           Internal QC
            Acceptance criteria*:  Measured value within ± 3 uMR of known value
            Levels/replicates**:   3 levels, 3 replicates
            Number of analyses:    9

         Level C
            Application:           Similar matrix / new application
            Sample type:           External PE
            Acceptance criteria*:  Measured value within ± 3 uMR of known value
            Levels/replicates**:   3 levels, 7 replicates
            Number of analyses:    21

         Level D (ASTM D2777)
            Application:           New application
            Sample type:           Internal QC
            Acceptance criteria*:  Measured value within ± 3 uMR of known value
            Levels/replicates**:   Three to five groups of two samples with
                                   concentrations within 20% of each other
            Number of analyses:    6-10

         Level E (ASTM D2777)
            Application:           New application
            Sample type:           External PE
            Acceptance criteria*:  Measured value within ± 3 uMR of known value
            Levels/replicates**:   Three to five groups of two samples with
                                   concentrations within 20% of each other
            Number of analyses:    6-10

         Level F (EPA Equivalency)
            Application:           New application / newly developed or adapted method
            Sample type:           MVRM samples
            Acceptance criteria*:  Measured value within ± 3 uMR of known value
            Levels/replicates**:   3 levels, 7 replicates
            Number of analyses:    21

         Level G (ASTM D2777)
            Application:           Newly developed or adapted method
            Sample type:           MVRM samples
            Acceptance criteria*:  Measured value within ± 3 uMR of known value
            Levels/replicates**:   Three to five groups of two samples with
                                   concentrations within 20% of each other
            Number of analyses:    6-10

         Level H (ASTM D2777; involves two testing protocols)
            Application:           Newly developed or adapted method
            Sample type:           MVRM samples for both protocols
            Protocol 1:            Measured value within ± 3 uMR of known value;
                                   three to five groups of two samples with
                                   concentrations within 20% of each other
                                   (6-10 analyses)
            Protocol 2:            Each measured value within ± 30% of known
                                   value at 5 times the MDC; three to five groups
                                   of two samples with concentrations within 20%
                                   of each other, bracketing 5 times the MDC
                                   (6-10 analyses)

         *  Assumes that each sample is counted to have a Poisson counting uncertainty of
            < 3% (1 sigma) when the analyte concentration is near the action level or MQC.
            This criterion is applied to each analysis in the method validation, not to the
            mean of the analyses. uMR is the required method uncertainty at the action level
            or required concentration. uMR is an absolute value for concentrations less than
            the action level and a relative (%) value for concentrations greater than the
            action level. In the absence of a specified value, the default ± 3 uMR acceptance
            criterion is: each measured value at the action level or other specified
            concentration must be within ± 30% of the known value. See references for
            ASTM D2777.
         ** Concentration levels should cover the expected analyte concentration range for a
            project, including the action level concentration. A set of three blanks (not
            considered a level) should be analyzed during the method validation process.
759      The tiered approach to method validation outlined in Table 6.1 was developed to give the project
 760      manager flexibility in the method validation process according to the project requirements. The
 761      degree of method validation increases from the lowest (Level A) to the highest (Level H). The
 762      table's acceptance criteria for the validation process for a given project are based on the MQO for
 763      the method uncertainty at the action level or other stated concentration. Each of the validation
 764      levels evaluates the proposed method over the expected concentration range of the analytes and
765      interferences. The acceptance criterion of having each analytical result fall within ± 3 uMR of the
 766      known value ensures a high degree of confidence that a method will meet the required method
 767      uncertainty (MQO) at the action level or other specified concentration. (See Appendix C for the
768      definition of the method uncertainty at the action level or other stated concentration, uMR.) In
 769      addition to evaluating the method uncertainty, the method should be evaluated for bias.
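
         A sketch of how the ± 3 uMR acceptance criterion might be applied follows, treating
         uMR as absolute below the action level and relative above it, per the table footnote.
         The action level, uMR value, and test data are illustrative assumptions.

             # Sketch: applying the +/- 3 uMR acceptance criterion of Table 6.1,
             # treating uMR as absolute below the action level and relative (%)
             # above it, per the table footnote. All numbers are illustrative.

             action_level = 5.0        # Bq/L (assumed)
             u_mr_abs = 0.5            # required method uncertainty at the action level
             u_mr_rel = u_mr_abs / action_level   # 10 percent, used above the action level

             def within_criterion(known: float, measured: float) -> bool:
                 """True if the measured value lies within +/- 3 uMR of the known value."""
                 if known <= action_level:
                     tolerance = 3.0 * u_mr_abs
                 else:
                     tolerance = 3.0 * u_mr_rel * known
                 return abs(measured - known) <= tolerance

             for known, measured in [(2.0, 3.2), (5.0, 6.2), (20.0, 13.0)]:
                 verdict = "pass" if within_criterion(known, measured) else "fail"
                 print(f"known {known:5.1f}, measured {measured:5.1f}: {verdict}")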

770      During the method validation process, the laboratory should ensure that the observed precision
 771      for the samples processed is consistent with the estimated individual  sample measurement uncer-
 772      tainty. An evaluation should be conducted for replicate sample analyses that have the same
 773      approximate relative measurement uncertainties. Samples having analyte concentrations within a
 774      narrow range of one another (ASTM D2777 Youden Pairs) may be considered when their
 775      relative measurement uncertainties are approximately the same. If the estimated measurement
 776      uncertainty of a given sample is much smaller than the observed method precision for the
 777      replicate samples, then the laboratory may not have properly estimated the uncertainty of one of
778      the input quantities (parameters) or may have omitted an input quantity from the measurement
 779      (combined standard uncertainty).
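
         One simple consistency check along these lines compares the standard deviation
         observed across replicates with the laboratory's stated combined standard uncertainty;
         the data below are hypothetical.

             from statistics import stdev

             # Sketch: comparing the scatter observed across replicate results with
             # the laboratory's stated combined standard uncertainty. A ratio well
             # above 1 suggests an underestimated or omitted input-quantity
             # uncertainty. Data are hypothetical.

             results = [4.8, 5.6, 4.4, 5.9, 4.6, 5.5, 5.2]   # replicate results, Bq/kg
             stated_u = 0.15   # typical combined standard uncertainty reported per result

             observed_sd = stdev(results)
             ratio = observed_sd / stated_u
             print(f"observed SD: {observed_sd:.2f} Bq/kg, "
                   f"stated u: {stated_u:.2f} Bq/kg, ratio: {ratio:.1f}")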

 780      6.6.2.1 Existing Methods Requiring No Additional Validation

 781      For completeness, it is necessary to discuss the possibility that a previously validated method of
 782      choice requires no additional validation (Level A of Table 6.1) for a specific project use. As
 783      noted in the table, the method has undergone some level (Level B through H) of previous
 784      validation. It may be that the samples (matrix and  analyte specific) associated with a new project
 785      are sufficiently similar to past samples analyzed by the same laboratory that the project manager
 786      feels additional validation is unwarranted. The decision to use Level A method validation should
 787      be made with caution. While the sampling scheme may  be a continuation, the analytical
 788      processing capabilities at the laboratory may have changed sufficiently to merit limited method
 789      validation. Without some level of method validation, the project manager has no  assurance that
 790      the analytical laboratory will perform to the same standards as an extension of the earlier work.
791      6.6.2.2 Use of a Validated Method for Similar Matrices

792     When a previously validated method is to be used in the analysis of samples that are similar to
793     the matrix and analyte for which the method was developed, MARLAP recommends that
794     validation of the method be implemented according to Level B or C of Table 6.1. These levels
795     will provide a reasonable assurance to both the laboratory and the project manager that the
796     method will meet the required MQOs associated with the project. Level B may be used if the
797     laboratory has the capability to produce internal QC samples. When the laboratory does not have
798     the capability to produce internal QC samples, the Level C validation protocol should be used.
799     However, PE programs may not provide the necessary matrices needed for the Level C validation
800     protocol.

801      Since a method inherently includes initial sample preparation, projects that have severe
802     differences in analyte heterogeneity may require a moderate change in a radiochemical method's
803     initial sample treatment. A change in the method to address the increased heterogeneity of the
804     analyte distribution within the sample may require another method validation depending on the
805     robustness of the method and the degree of analyte heterogeneity.

806     6.6.2.3 New Application of a Validated Method

807     Methods that have been validated for one application normally require another validation for a
808     different application, such as a different sample matrix. In addition, the MQOs may change from
809     one project to another or from one sample matrix to another. The validation process for an
810     existing validated method should be reviewed to ensure applicability to the new measurement
811     quality objectives (which can be more or less restrictive). In most cases, applying an existing
812     method for one matrix to another matrix is not recommended without another method validation.
813     MARLAP recommends, based on the extent of the modification and the difficulty of the matrix,
814     that Levels C-F of Table 6.1 be used to validate the performance of the modified method. The
815     following paragraphs and the next section provide information on whether a validated method
816     requires a slight modification or a complete revision.

817     Validation of an existing method for a different application depends on the extent of the
818     departure from the original method application, in terms of:

819      •  Dissimilarity of matrices;
820      •  Chemical speciation of the analyte or possible other chemical interference;
821      •  Analyte, chemical or radiometric interferences;
822      •  Complete solubilization of the analyte and sample matrix; and

823      • Degree of analyte or sample matrix heterogeneity.

824     When the chemical species of the analyte in a sample from a new project varies from the
825     chemical species for which the  method was validated, then the method will have to be altered
826     and another validation performed. An example would be when a method had been developed to
827     extract iodide via ion exchange chromatography but the new application may have I2, iodate, or
828     iodide in the sample. Another example would be the initial development of a method for Pu in
829     soil generated from liquid effluents using acid dissolution and then trying to apply the same
830     method to high-fired plutonium oxide in soil. For these two examples, if the original methods
831     were to undergo the validation process for the new application, definite deficiencies and poor
832     results would become evident. Portions of the original method would have to be modified to
833     address the chemical speciation problems. The modified method requires validation to ensure
834     that the measurement quality objectives for the new application can be met.

835     When additional analyte, chemical, or sample matrix interferences are known to exist for a new
836     application compared to the old method application, the previously validated method should
837     undergo another validation, depending on the degree of interference and the problems anticipa-
838     ted. For example, applying a method used for the analysis of an analyte in an environmental
839     matrix containing few interfering radionuclides would typically be inappropriate for the analysis
840     of process waste waters containing many interfering radionuclides at high concentrations. In
841     essence, the degree of decontamination (degree of interference removal) or analyte purification
842     (isolation of the analyte from other radionuclides) necessary for one application may be
843     completely inadequate or inappropriate for another application (an indication of method
844     specificity).

845     Another example would be the  use of a method for soil analysis employing  234Th as a radiotracer
846     for chemical yield for the isotopic analysis of thorium when the soil also has a high concentration
847     of uranium. 234Th is an inherent decay product of 238U and will exist in the sample as a natural
848     analyte, thus creating erroneous chemical yield factors. A third example would be the application
849     of a 90Sr method developed for freshwater to seawater samples for which the amount of chemical
850     interferences and ambient Sr levels are extensive. For these three examples, conducting the
851     validation process for the original methods for the new applications would,  depending on the
852     severity of the analyte and chemical interference, illustrate method deficiencies and the inability
853     to meet measurement quality objectives.

854     Some matrices and analytes may be solubilized easily through acid dissolution or digestion. For
855     some applications, the analyte of interest may be solubilized from the sample matrix through an
856     acid extraction process. Such methods should be chosen carefully and, most


857     important, the method must be validated for each application. Definite problems and
858     misapplication can be the result of using an acid extraction process when a more robust complete
859     sample dissolution is necessary.

860     6.6.2.4 Newly Developed or Adapted Methods

861     MARLAP recommends that methods under development by the laboratory or adapted from the
862     literature that have not been previously validated for a project be validated according to Levels
863     F to H of Table 6.1. These levels provide the most comprehensive testing of method perfor-
864     mance. For low-level environmental surveillance applications, it may be advantageous to use the
865     second set of requirements of Level H (each measured value must be within ± 30 percent of the
866     known value at 5 times the MDC) as part of the other validation levels as well. This requirement
867     will assess the method's ability to perform at the concentration ranges more commonly associated
868     with environmental samples. When process knowledge is available or the matrix under
869     consideration is unique or site-specific, it is best to validate the method using the matrix (e.g.,
870     MVRM) under consideration. This is extremely important for process/effluent waters versus
871     laboratory deionized water and for various heavy metal radionuclides in soils or sediments when
872     compared to spiked sand or commercial topsoil. For site-specific materials containing severe
873     chemical and radionuclide interferences, many methods have been unable to properly address
874     the magnitude of interferences.
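
        For illustration, the following short Python sketch shows how the Level H criterion quoted
        above (each measured value within ± 30 percent of the known value at 5 times the MDC)
        might be checked for a set of replicate results. The MDC, spike level, and results below
        are hypothetical, not taken from this manual:

            # Level H check: each measured value within +/- 30 percent of the
            # known value at 5 times the MDC. All values are hypothetical.
            mdc = 0.4                              # hypothetical MDC, Bq/L
            known = 5 * mdc                        # spike level, 2.0 Bq/L
            measured = [1.8, 2.3, 1.5, 2.1, 2.5]   # hypothetical replicates, Bq/L

            for x in measured:
                bias = (x - known) / known
                status = "pass" if abs(bias) <= 0.30 else "fail"
                print(f"{x:4.2f} Bq/L  bias = {bias:+6.1%}  {status}")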

875     6.6.4  Method Validation Documentation

876     Method validation, depending on the required level  of validation, can be accomplished by the
877     project manager sending PT samples to the laboratory or by the laboratory using internal or
878     external PT/QC samples. When PT samples are sent to a laboratory to evaluate or validate the
879     laboratory's method and capabilities, the appropriate technical representative should retain all
880     records dealing with applicable method validation protocols (Section 6.6.3), PT sample
881     preparation certification, level of validation (from Table 6.1), results, and evaluations. The
882     laboratory should provide the necessary documentation to the project manager for these PT
883     samples as required by the SOW. The laboratory should request feedback from the project
884     manager as to the method performance. This information, along with the sample analytical
885     results documentation, should be retained by the laboratory for future method validation
886     documentation.

887     When the laboratory conducts its own method validation, all records, laboratory workbooks, and
888     matrix spike data used to validate an analytical method should be retained on file and retrievable
889     for a specified length of time after the method has been discontinued.

890     6.7    Analyst Qualifications and Demonstrated Proficiency

891     The required level of qualification of an analyst is commensurate with the degree of difficulty
892     and sophistication of the method in use. The selection of the analyst for the method application is
893     typically based initially on experience, education, and proven proficiency in similar
894     methods. Basic guidance on the minimum education and experience for radioassay laboratory
895     technicians and analysts is provided in Appendix E and ANSI N42.23.

896     For radiochemical methods, there may be several analysts involved. At most major laboratories,
897     different individuals may be involved in the sample preparation, radiochemistry, and radiation
898     detection aspects of the method. In these cases, the entire staff involved in the method should
899     undergo method proficiency tests to demonstrate their ability to meet quality requirements and
900     performance goals. The staff involved in the initial validation of an acceptable method would be
901     considered proficient in their particular role in the method application, and the results of their
902     performance should be documented in their training records.

903     Successful proficiency is established when the performance of the analyst or staff meets
904     predefined quality requirements defined in the laboratory's quality system or a SOW, as well as
905     processing goals. Parameters involved in operational processing goals are typically turnaround
906     time, chemical yields, frequency of re-analyses (percent failure rate), and frequency of errors.

907     Continued analyst proficiency in the method is usually demonstrated through acceptable
908     performance in internal QC and external PE programs associated with routine sample
909     processing.

910     6.8    Method Control

911     Method control is an inherent element of a laboratory's quality system. Simply stated, method
912     control is the ongoing process used to ensure that a validated method continues to meet the
913     expected requirements as the method is routinely used. Method control is synonymous with
914     process control in most quality systems. For a laboratory operation, method control can be
915     achieved by the application of the following:

916      • Controlled method manual (latest revision and signature sign-off);

917      • NIST-traceable calibration standards and the conduct of an instrument QC program that
918        properly evaluates the variable parameters on an appropriate frequency;

919      • Radiotracers or chemical yields for each sample and the evaluation of the measured chemical
920        yield values to expected ranges;

921      • Internal QC and external PT samples to determine deviations from expected quality
922        performance ranges;

923      • Standard operating procedures for troubleshooting "out of control" situations; and

924      • Problem reporting, corrective action, and quality improvement process.

925     The above method control elements are typically addressed in the quality manual of the
926     laboratory or the project plan document for the project under consideration. Refer to Chapter 18
927     for additional information.

928     6.9    Continued Performance Assessment

929     The assessment of a laboratory's continued performance is covered in detail in Chapter 7.
930     However, it is important to briefly discuss certain aspects of evaluating a method's continued
931     performance from a laboratory's perspective.

933     In order to properly perform statistical analyses or compliance interpretation of the analytical data
934     produced from an analytical method, it is assumed that data quality does not vary significantly.
935     Therefore, the user of the data expects that the overall data quality will not change throughout the
936     program or project. From a laboratory management perspective, a performance indicator system
937     should be in place that assesses and provides feedback on the quality of the routine processing.
938     The most useful and cost-effective means of assessing a method's performance is through the
939     implementation of internal QC or external performance evaluation programs or both. Of course,
940     it can be argued that method assessment through a QC or PE program evaluates the combined
941     performance of the method and the analyst. However, statistical and inferential interpretation of
942     the QC/PE data can provide insight into whether the method is failing or whether an analyst is
943     underperfbrming.  Chapters 7 and 18 and Appendix C provides guidance on quality control
944     programs and the  use of the internal laboratory QC or external PE data to assess the laboratory's
945     performance in meeting performance criteria.
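
        As a hedged illustration of such an interpretation (the data, analyst labels, and the
        2-sigma screening limit below are hypothetical assumptions, not MARLAP requirements),
        QC or PE results can be normalized against their known values and quoted uncertainties:

            # z-scores of QC/PE results, grouped by analyst. Hypothetical data.
            results = [
                # (analyst, measured, known value, combined standard uncertainty)
                ("A", 10.4, 10.0, 0.5),
                ("A",  9.7, 10.0, 0.5),
                ("B", 11.6, 10.0, 0.5),
                ("B", 11.9, 10.0, 0.5),
            ]
            for analyst, measured, known, u in results:
                z = (measured - known) / u
                flag = "  <-- investigate" if abs(z) > 2 else ""
                print(f"analyst {analyst}: z = {z:+.1f}{flag}")
            # A consistent offset across all analysts suggests a method problem;
            # an offset confined to one analyst suggests an analyst problem.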

946     The laboratory management should use the internal QC program to detect and address
947     radioanalytical issues before the client does. Many SOWs require the use of internal QC samples
948     for every batch of project samples (Chapter 18). In effect, the client is essentially setting the level
949     of internal quality control and the frequency of method performance evaluation. It should be
950      recognized that an internal QC program evaluates method performance related to the initial
951      calibrations or internal "known values." An external NIST-traceable PE program can reveal
952      method biases relative to the national standard or to the agency's PE program.

 953      Some users of laboratory services have developed "monitoring" laboratory programs (ANSI
 954      N42.23). For these programs, the user engages a recognized independent monitoring laboratory
 955      to intersperse double- and single-blind external PT materials into batches of normal samples
 956      submitted to a laboratory. The complexity and frequency of the monitoring laboratory PT
 957      samples vary among programs, projects, and Federal and state agencies. An external double-blind
 958      PE program conducted by a monitoring laboratory using site-specific matrices probably provides
 959      the most realistic estimate of the method's or laboratory's true performance. When the
 960      monitoring laboratory is traceable to NIST, either directly or through a NIST reference laboratory
961      (ANSI N42.23), the monitoring laboratory program will provide an estimate of any method bias
 962      as related to the national standard.

 963      Method performance can also be determined, although on a less frequent basis, through the
 964      laboratory's participation in the various PE programs. For a laboratory providing services to
 965      government agencies, the participation in such programs is typically a requirement. The PE
 966      programs commonly send out non site-specific PT materials on a quarterly or semiannual basis.

967      The laboratory's performance in certain PE programs is public knowledge. Such information is
 968      useful to project managers in selecting a laboratory during the laboratory selection and qualifying
 969      processes. Similar to the monitoring laboratory, when the laboratory conducting the PE program
 970      is traceable to NIST, either directly or through a NIST reference laboratory (ANSI N42.23), the
 971      PE program may provide an estimate of the bias as related to the national standard as well as the
 972      precision of the method, depending on the distribution of replicate samples.

 973      Some projects require that all analytical results received from a laboratory undergo a data
 974      verification and validation process. Chapter 8 provides more detail on these processes. When
 975      properly conducted, certain aspects and parameters of the method can be assessed during the data
 976      verification and validation process.

 977      Internal and external audits/assessments are also key elements in a laboratory's quality system to
 978      assess the continuing performance of a method (Chapter 7). The level and frequency of the audits
979      and assessments typically vary according to the magnitude and importance of the project and
 980      the performance of the laboratory. Another quality system element that is very effective is a self-
 981      assessment program. A functioning and effective self-assessment program may identify
982     weaknesses or performance issues more readily and in a more timely manner than formal
983     internal and external audits.

 984     6.10  Documentation To Be Sent to the Project Manager
 985     The documentation related to the life cycle of a method application is essentially the information
 986     gathered during the use of the method. A formal method documentation program is unnecessary,
 987     since the information should be part of the quality system documentation. Documented
 988     information available from the quality system, related to a method's development, validation, and
 989     control, includes the following:

 990      • Method validation protocol and results;
 991      • Analyst training and proficiency tests;
 992      • Method manual control program;
 993      • Instrument calibration and QC results;
 994      • Internal QC and external PT sample results;
 995      • Internal and external assessments; and
 996      • Corrective actions.

 997     Data verification and validation information should be kept available and retained for those
 998     projects requiring such processes. In addition to QA documentation, the analytical results, either
 999     in hard copy or electronic form, should be available from the laboratory for a specified length of
1000     time after the completion of a project.

1001                             Summary of Recommendations

1002      • MARLAP recommends the performance-based approach for method selection.

1003      • MARLAP recommends that only methods validated for a project's application be used.

1004      • MARLAP recommends that a SOW containing the MQOs and analytical process
1005        requirements be provided to the laboratory.

1006      • MARLAP recommends that the SOW include the specifications for the action level and
1007        the required method uncertainty for the analyte concentration at the action level for each
1008        analyte/matrix.

1009      • MARLAP recommends that as each new project is implemented, the methods used in the
1010        analysis of the associated samples undergo some level of validation.

1011      • MARLAP recommends that a tiered approach (Table 6.1) be taken for method validation.

1012     6.11  References

1013     American National Standards Institute (ANSI) N42.23. Measurement and Associated
1014        Instrumentation Quality Assurance for Radioassay Laboratories. 1996.

1015     American Public Health Association (APHA). 1989. Standard Methods for the Examination of
1016        Water and Wastewater. Washington, DC.

1017     American Society for Testing and Materials (ASTM) D 2777. Standard Practice for
1018        Determination of Precision and Bias of Applicable Test Methods of Committee D-19 on
1019        Water.

1020     American Society for Testing and Materials (ASTM) D 4855. Standard Practice for Comparing
1021        Methods.

1022     American Society for Testing and Materials (ASTM) E 177. Standard Practice for Use of the
1023        Terms Precision and Bias in ASTM Test Methods.

1024     American Society for Testing and Materials (ASTM) E 1169. Standard Guide for Conducting
1025        Ruggedness Tests.

1026     American Society for Testing and Materials (ASTM) E 1488. Standard Guide for Statistical
1027        Procedures to Use in Developing and Applying ASTM Test Methods.

1028     U.S. Environmental Protection Agency (EPA).  1980. Prescribed Procedures for Measurement of
1029        Radioactivity in Drinking Water. Environmental Monitoring and Support Laboratory,
1030        Cincinnati, OH. EPA 600-4-80-032, August.

1031     U.S. Environmental Protection Agency (EPA). 1998. Guidance for Quality Assurance Project
1032        Plans (EPA QA/G-5). EPA 600-R-98-018, February.

           7  EVALUATING METHODS AND LABORATORIES
 2     7.1 Introduction

 3     This chapter provides guidance for the initial and ongoing evaluation of radioanalytical labora-
 4     tories and methods proposed by laboratories. Appendix E, Contracting Laboratory Services,
 5     provides additional guidance on the initial laboratory evaluation. More details about evaluating
 6     and overseeing a laboratory's performance can be found in ASTM E1691 and ASTM E548.

 7     The performance-based approach to method selection allows a laboratory the freedom to propose
 8     one or several methods for a specific analyte/matrix combination that will meet the needs of the
 9     Analytical Protocol Specifications (APSs) and measurement quality objectives (MQOs)
10     delineated in the Statement of Work (SOW). However, the laboratory should demonstrate,
11     through a method validation process, that the method is capable of producing analytical results of
12     a quality that meets the needs of the SOW (Chapter 5). Guidance and recommendations on the
13     selection of an analytical method based on the performance-based approach were presented in
14     Chapter 6. Section 7.2 of this chapter provides guidance on how to evaluate the methods
15     proposed by a laboratory. Section 7.3 provides guidance on the initial evaluation of a laboratory,
16     and Section 7.4 discusses the continual evaluation of the quantitative measures of quality and
17     operational aspects of the laboratory once sample processing has commenced.

18     Method applicability and performance compliance should be demonstrated prior to the initiation
19     of the sample analyses, as well as during the project period. A defined logical process for demon-
20     strating and documenting that the analytical method selected meets the project's data needs and
21     requirements may involve, for example, a review of the method validation documentation, an
22     evaluation of past performance data from other projects (if available), the analysis of external
23     performance evaluation (PE) program results, the analysis of matrix-specific standard reference
24     materials (or method validation reference materials) sent during the initial work period and
25     throughout the project, and the final evaluation of the protocol's performance during the data
26     verification and validation process. Chapter 8, Radiochemical Data Verification and Validation,
27     covers the final evaluation of the protocol's performance.

28     In addition to the evaluation of the analytical methods, the capability of the laboratory to meet all
29     SOW requirements needs to be reviewed and evaluated. Supporting information, such as method
30     validation documentation, safety manuals, licenses and certificates, and the quality manual are typi-
31     cally submitted with the response to the Request for Proposals (RFP). A generic evaluation of the
32     laboratory operation may be conducted during the initial laboratory audit or assessment. This
33     may be an initial onsite audit. This first evaluation covers those generic SOW requirements
34     dealing with the laboratory's capability and operation, including verification of adequate
35      facilities, instrumentation, staffing, and staff training and qualifications. Following the first
36      audit, emphasis should be on ensuring the laboratory continues to meet the APSs through a
37      continuous or ongoing evaluation effort.

38      7.2   Evaluation of Proposed Analytical Methods

39      A laboratory may submit several methods for a particular APS contained in the SOW, but each
40      method should be evaluated separately and, if appropriate, approved by the project manager or
41      designee. The method should be evaluated for consistency with the overall analytical process
42      that includes the proposed field sampling and preservation protocols (Chapter 1). The project
43      manager may delegate the method review process to a technical evaluation committee (TEC) that
44      has a radioanalytical specialist. MARLAP recommends that a radioanalytical specialist review
45      the methods for technical adequacy. The acceptance, especially of a new method, may be the
46      most critical  aspect of the performance-based approach for method selection. Acceptance of the
47      method requires the project manager to verify that the method is scientifically sound.

48      Each step of the method should be evaluated by a radioanalytical specialist in order to understand
49      how the results are derived. These steps may involve sample digestion, analyte purification and
50      decontamination steps that use ion exchange, solvent extraction, precipitation or oxidation/
51      reduction applications. Once these steps have been  reviewed, and the method evaluation data
52      (e.g., from method validation documentation or various performance evaluation results) confirm
53      that the proposed method is acceptable, the project  manager should have the confidence
54      necessary to endorse and verify the use of the method in the analysis of the routine samples.

55      As discussed in Chapter 6, the laboratory should provide method validation and analytical data
56      that demonstrates method performance. The data should show conclusively that the proposed
57      method meets the requirements as defined by the APSs. If method performance is questionable,
58      additional data may be required. For such cases, the project manager may decide to send per-
59      formance testing (PT) materials to the laboratory in order to evaluate or validate the method. The
60      preparation of the PT material used to evaluate the  method should be based on sound scientific
61      principles and representative of the expected sample matrix (see Chapter 6 on method validation
62      options using site-specific materials). If there is sufficient reason to believe that the PT material
63      is an adequate substitute for the sample matrix and  that the laboratory will follow the same
64      method, then the need to justify each step in the method may be drastically reduced.

65      7.2.1  Documentation of Required Method Performance

66      Certain documentation submitted by the laboratory with the proposed methods, as well as
67      available external information on the laboratory's analytical performance, should be reviewed
68      and evaluated by the radioanalytical specialist. Table 7.1 outlines where such information can be
69      typically found by the TEC. This section will discuss various information categories that may be
70      available during the method evaluation process.

71      7.2.1.1 Method Validation Documentation

72      Chapter 6 outlines the various method validation options that can be specified by the project
73      manager. In the MARLAP process, the method validation requirements will be contained in the
74      SOW. The laboratory must submit the necessary method validation documentation consistent
75      with the SOW specification. The laboratory may choose to validate a method to a higher degree
76      of validation or to submit method validation documentation for a higher degree of validation than
77      that specified by the SOW. The radioanalytical specialist or project manager should review the
78      documentation to ensure that validation criteria for the number of analyte concentration levels
79      and replicates meet or exceed the required validation criteria (Chapter 6, Table 6.1). Although
80      not specified in the method validation protocol, some laboratories may include chemical and
81      analytical interferences in their method validation plan to gain a perspective on the method's
82      specificity and ruggedness. However, it should be noted that the graded approach to method
83      validation presented in Chapter 6 does inherently increase the degree of ruggedness in terms of
84      having the method address site-specific materials which may include chemical and radionuclide
85      interferences.

86      In addition to reviewing the documentation for compliance with the method validation protocol,
87      the results of the method validation process  should be evaluated to determine if the project
88      specific MQOs will be met. The method validation may or may not have been specifically
89      conducted for the project at hand. When the method has been validated (Chapter 6, Section 6.6)
90      to the  SOW specifications (validation level and MQOs), then evaluation of the documentation
91      can be straightforward. If the method has been previously validated for the MQOs of other
92      projects, then the laboratory should provide  a justification and calculations to show that the
93      method validation results will meet the MQOs for the new project. The TEC should verify these
94      calculations and review the assumptions and justifications for reasonableness and technical
95      correctness.

96      7.2.1.2 Internal Quality Control or External  PE Program Reports

97      The documentation of internal QC and external PE program results should be reviewed relative
98      to the MQOs. Method uncertainty and internal biases can be estimated from the information
99      available in the laboratory's internal quality control reports, summaries of batch QC results that
100      may be submitted with the RFP response and external PE program reports. The TEC should
101      review these documents and, when possible, estimate the method uncertainty and bias for various
102      analyte concentration levels. However, it is imperative that no confusion exists in terms of what
103      method produced the results: the proposed method or another method available to the laboratory.
104      This is especially important when reviewing external PE program results. It should also be noted
105      that although a laboratory may meet performance acceptance criteria for an external PE program,
106      this fact may have no bearing on whether the method will meet the MQOs of the SOW.

                TABLE 7.1 — Cross-reference of information available for method evaluation

                                                Internal and                  Internal/      Information
          Evaluation Element        Method      External QC    External PE    External QA    from RFP and
                                    Validation  Reports        Programs       Assessments    Other Sources
          Analyte/Matrix               •                                                          •
          Process Knowledge                                                                       •
          Previous Experience                                                                     •
          Radiological Holding         •             O              O              •              •
            Time
          Turnaround Time                            O              O              •              •
          Unique Process               •                                                          •
            Specifications
          Bias                         •             •              •              O              •
          Method Uncertainty           •             •              •              O              •
            (MQO)/MDC
          Analyte/Interference         •             •              •                             •
            Range
          Method Ruggedness            •             O              •              •              •
          Method Specificity           •             O              •              •              •

          • Denotes that the information relevant to method evaluation should be present.
          O Denotes that the information relevant to method evaluation may be present.

107      Review of the internal batch QC data can provide additional information on typical sample
108      analysis times and rates of blank contamination and sample reanalysis. This information is
109      important when comparing methods (from the same laboratory or between laboratories) in terms of APS
110      characteristics. The frequency of blank contamination would be very important to national char-
111      acterization studies (groundwater or soil analyses) for the determination of ambient analyte
112      levels. Method evaluation for these projects may weight the blank contamination rate more
113      heavily than other SOW parameters. The rate of sample  re-analysis would be important to
114      projects having pending operations that are conducted based on a short sample processing turn-
115      around time (TAT). In some site remediation projects, the contractor may remain onsite pending
116      analytical results. A delay in reporting data or not meeting a TAT due to sample re-analysis may
117      be costly. Projects of this nature may weight TAT and low sample re-analysis rates more heavily than
118      other SOW parameters.

119      7.2.1.4 Method Experience, Previous Projects,  and Clients

120      When permitted by former clients, the laboratory may submit information relative to the previous
121      or ongoing clients and projects for which the proposed method has been used. The TEC should
122      verify with the laboratory's clients that the laboratory has previous experience using the method.
123      When available and allowed, the information should also include the analyte(s) and interferences
124      and their applicable concentration range, matrix type, and project size in terms of the number of
125      samples per week or other time periods. From this information, the TEC can evaluate whether or
126      not to contact the laboratory's client for further information on the operational adequacy of the
127      method. The client may offer some information on the quality of the results based on their
128      external single- or double-blind QC program, percent completion of reports, TAT, and sample re-
129      analysis frequency. The sharing of laboratory assessment reports may be advantageous when
130      reviewing the performance of the laboratory during its use of the method.

131      7.2.1.5 Internal and External Quality Assurance Assessments

132      When available, internal and external quality assurance assessment reports should be evaluated to
133      determine the adequacy of the method performance based on previous projects. Problems with
134      the conduct of the method due to procedural and technical issues may be readily evident. These
135      issues may include an ineffective corrective action program creating delayed remedies to
136      problems, insufficient understanding of the method, inadequate training of staff, internal and
137      project-specific QC issues, and higher-than-expected failure rates for sample TATs and re-
138      analyses. Information in these reports may disclose problems with a particular method that are
139      not common to another proposed method. As such, the TEC may give one method a higher
140      weighting factor than another method.

141      7.2.2  Performance Requirements of the SOW—Analytical Protocol Specifications

142      Under the performance-based approach to method selection, a laboratory will propose one or
143      several analytical methods that can meet the stated APSs and MQOs in the SOW for a given
144      analyte and matrix combination. Chapters 3, 5, and 6 discuss the APSs and MQOs in detail in
145      terms of their basic description, their inclusion in a SOW, and as key considerations for
146      identifying existing validated methods or developing new methods. The purpose of this section is
147      to provide guidance on what available information should be evaluated in order to approve the
148      various proposed methods.

149      The following subsections cover key aspects of the SOW that should be addressed during the
150      method evaluation and approval process.

151      7.2.2.1 Matrix and Analyte Identification

152      The TEC should review the method(s) proposed by the laboratory to determine if the method
153      under evaluation is applicable for the analyte/matrix combination specified in the SOW. In some
154      cases, several methods may be proposed, including gross screening methods and specific
155      radionuclide or isotopic methods having high specificity and ruggedness (Section 6.5.1.1 has
156      additional guidance). Each method should be evaluated on its own application and merit. When
157      methods are proposed by the laboratory that use alternative nuclides (such as decay products) to
158      determine the analyte of interest, the TEC should carefully review  the objective or summary of
159      the method to determine if the proposed method is truly applicable for the analyte of interest
160      given the radiological holding time and MQOs (i.e., can it properly quantify the analyte of
161      interest through decay progeny measurements?). For gross screening techniques, the TEC should
162      evaluate the analyte's decay scheme to determine the underlying gross radiation category (beta,
163      alpha, X-ray, or gamma-ray emitting) and the applicability of the proposed method's radiation
164      detection methodology.
165     Each proposed method should be evaluated to determine if the method can analyze the sample
166     matrix identified in the SOW. A method validated for water cannot be applied to soil samples
167     without modification and validation (Section 6.5). The planning team should have made—
168     through historical process knowledge, previous matrix characterization studies or common
169     experience—a determination on the uniqueness of the site-specific matrices compared to typical
170     matrices and provided guidance in the SOW as to the level of method validation. In addition, if
171     the radioanalytical specialist of the project planning team is concerned that the physicochemical
172     form of the analyte or the sample matrix substrate may present special problems to the radio-
173     analytical process, a detailed description of the analyte and matrix should have been included in
174     the SOW. Chapters 12 and 13 discuss possible sample matrix problems and Section 6.5 provides
175     guidance on the need for method validation. The radioanalytical specialist should carefully
176     review the summary of the method to determine if the proposed method is applicable for the
177     sample matrix.

178     At this point, if it is determined that the proposed method(s) is not applicable and cannot meet
179     the SOW specifications, there is no need to continue the method evaluation process.

180     7.2.2.2 Process Knowledge

181     The radioanalytical specialist should review the process knowledge information and determine if
182     the proposed method is capable of addressing these issues by virtue of its specificity, ruggedness
183     and applicability. Discussions on method specificity and ruggedness may be found in the
184     subsections on pages 7-13 and 7-15, respectively.

185     As discussed in Section 6.5.2 and above, process knowledge is extremely important for identify-
186     ing potential radioanalytical problems on some projects. Historical information or process
187     knowledge may identify chemical and radionuclide interferences, expected analyte and inter-
188     fering radionuclide concentration ranges, sample analyte heterogeneity issues, and the physico-
189     chemical form of the analyte and the sample matrix substrate. In some special cases, it may be
190     necessary to determine if the radiological holding time will be an issue if the laboratory must
191     analyze an alternative nuclide to determine supported and unsupported radionuclides  (decay
192     progeny nuclides) in the matrix.

193     7.2.2.3 Radiological Holding and Turnaround Times

194     The radioanalytical specialist should review the proposed method in  light of the radiological
195     holding time, analyte's half-life and typical sample delivery options and determine if the method
196     is capable of meeting the MQOs in a reasonable counting period given the typical method param-
197      eters (such as sample weight processed, chemical yields, radiation detection efficiency, branching
198      ratio and background, ingrowth periods for decay progeny analysis, etc.). Radiological holding
199      time is defined as the time between the sample collection and the end of the sample analysis (end
200      of final measurement), while sample processing TAT refers to the time between sample receipt at
201      the laboratory and the issuance of an analytical report. The physical (analyte's half-life) and
202      chemical (stability or preservation concerns) characteristics of the analyte, as well as biological
203      degradation for some matrices, usually will dictate the radiological holding time. Project-specific
204      schedules and practicalities related to project and laboratory processing capacities normally enter
205      into establishing TATs. If the radiological holding time appears to be a critical issue, then the
206      laboratory should submit information on the typical batch size being processed by the method.
207      This information is needed in the method evaluation and review process. Without special
208      problems (e.g., inadvertent delay of sample delivery), the laboratory should be able to meet the
209      MQOs with a good margin of error for the majority of the samples processed. For very short-
210      lived analytes, too large a batch size may result in the last samples in the batch having difficulty
211      in meeting the radiological holding time. For short-lived analytes, counting the sample (or final
212      processing products) longer typically is not practical because the analyte decays too rapidly
213      for the longer count to provide any gain.
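
        The decay arithmetic behind this point can be sketched briefly, using 131I (half-life
        approximately 8.0 days) as an example; the holding times below are hypothetical:

            # Fraction of 131I activity remaining after a holding time t (days).
            half_life = 8.0
            for t in (8, 16, 24, 32):
                remaining = 2 ** (-t / half_life)
                print(f"after {t:2d} d: {remaining:.3f} of the activity remains")
            # After four half-lives only ~6 percent remains, so a longer count
            # cannot recover the signal lost to decay.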

214      In some cases, the laboratory may want to propose two methods for a short-lived analyte: one for
215      normal delivery and processing schedules and another method for situations when lower detec-
216      tion limits are needed. An example of such a situation is the analysis of 131I in environmental
217      media. A method with adequate detection limits for reasonable radiological holding times is
218      gamma spectrometry. Another method that can be applied for lower detection limits or longer
219      radiological holding times is radiochemical separation followed by beta-gamma coincidence
220      counting.

221      Certain projects may be concerned with the chemical speciation of the analyte in the sample. For
222      these projects, the radiological holding time should have been specified to ensure that the chem-
223      ical species are not altered prior to processing. The project normally should specify chemical
224      preservation specifications applicable at the time of sample collection.

225      In the case of biological media, sample deterioration (Chapter 10) may become a problem, and
226      biological preservatives should be added to the sample to retard degradation. However, the
227      radiological holding time should be specified to  limit problems with sample degradation. The
228      radioanalytical specialist should evaluate the method in light of the foregoing information and
229      determine its adequacy to meet the radiological holding time and the pertinent MQOs.

230      A laboratory's sample (processing) TAT for a method typically is not related to the method's
231      technical basis unless the radiological holding time and the TAT are nearly equal for a short-
232      lived analyte. However, sufficient time should be available between the completion of sample
233      analysis and the delivery of the analytical report. Meeting the radiological holding time but
234      failing to meet the TAT will not affect the quality of the analytical results but may place a
235      hardship on the project to meet schedules. The TEC should review the proposed method, the
236      radiological holding time and the TAT to determine if the method can process the samples in a
237      reasonable time period to meet the TAT. The sample delivery rate, sample batch size, level of
238      data automation and the laboratory's existing sample processing capacity will affect the
239      laboratory's ability to meet the TAT requirement.

240      7.2.2.4 Unique Processing Specifications

241      The TEC should review the proposed methods for compliance or applicability to unique sample
242      processing specifications stated in the SOW. Chapter 6 provides a limited discussion on what a
243      project may identify as unique or special sample process specifications. Examples may include
244      chemical speciation, analyte depth profiles, analyte particle size distribution, analyte hetero-
245      geneity within the sample, wet-to-dry analyte concentration ratios in biologicals, and possible
246      scaling factors between radionuclides in the sample. In some cases, the proposed method(s) for
247      the analyte(s) may have to be evaluated with respect to all analytes or other sample preparation
248      specifications in order to determine method applicability and adequacy.

249      7.2.2.5 Measurement Quality Objectives

250      Method  performance characteristics (Method Uncertainty, Quantification Capability, Detection
251      Capability, Applicable Analyte Concentration Range, Method Specificity, and Method
252      Ruggedness) will be discussed in the following subsections. For a particular project, MQOs
253      normally will be developed for several (but not all) of the performance characteristics discussed
254      below.

255      METHOD UNCERTAINTY

256      The SOW should specify the required method uncertainty at a stated analyte concentration (or
257      activity level) for each sample matrix and the level of method validation (Section 6.6) needed to
258      qualify the method at the stated analyte concentration.

259      MARLAP uses the term  "method uncertainty" to refer to the predicted uncertainty of a result that
260      would be measured if a method were applied to a hypothetical laboratory sample with a specified
261      analyte concentration. As presented in Chapter 6 and formulated in Chapter 19, the method
262      uncertainty of the analyte concentration for a given method is determined by mathematically
263      combining the standard uncertainties of the many input quantities (parameters) involved in the
264      entire radioanalytical process. This will involve making some assumptions and normally involve
265      using typical or worst case values for a conservative estimate of the method uncertainty. Some of
266      these input quantities, and thus the method uncertainty, vary according to analyte level or concen-
267      tration in the final measured product; others do not. In some  cases, the magnitude of the method
268      uncertainty for an analyte may increase in proportion to the magnitude (concentration/activity) of
269      any interfering radionuclide present in the  final measurement product. Therefore, it is imperative
270      that the TEC evaluate the laboratory's submitted documentation relative to this requirement,
271      especially the information provided on method specificity, given  the historical or expected inter-
272      fering nuclides and the needed decontamination factors (chemical separation factors) to render a
273      good measurement for the analyte of interest.
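
        As a minimal sketch of this combination of standard uncertainties, assume a simple
        activity model A = net counts / (efficiency × yield × mass); the component values are
        hypothetical, and a real uncertainty budget (Chapter 19) contains more terms:

            import math

            # Relative standard uncertainties of the input quantities (hypothetical).
            rel_u = {
                "net counts (Poisson)": 0.08,
                "detection efficiency": 0.03,
                "chemical yield":       0.05,
                "sample mass":          0.01,
            }
            combined = math.sqrt(sum(u ** 2 for u in rel_u.values()))
            print(f"combined relative standard uncertainty: {combined:.1%}")  # ~9.9%
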
275      In evaluating the documentation relevant to meeting the method uncertainty requirement, it is
276      important to determine if the method validation requirements stated in the SOW have been met.
277      The TEC should review the submitted method validation documentation and verify that the
278      method's performance meets the requirements of Table 6.1 (Chapter 6) for the specified valida-
279      tion level. It is important that the laboratory submit definitive documentation of method
280      validation compliance for the method uncertainty requirement.

281      The method performance documentation may include documentation or data from method
282      validation, internal or external (organization sending QC samples) QC data, external PE program
283      data, and results of pre-qualifying laboratories by sample analyses. By evaluating the actual QC
284      and PE program performance data, it can be determined if the quoted measurement uncertainty
285      for a reported QC sample result (calculated by the laboratory) truly reflects the method uncer-
286      tainty under routine processing of samples. The required method  uncertainty can be viewed as a
287      target value for the overall average measurement uncertainty for the samples at a specified
288      analyte concentration. It is important that the precision, as calculated from repeated measure-
289      ments, is consistent with the laboratory's stated measurement uncertainty for a given sample
290      result whose analyte concentration is near the specified concentration. If the measurement
291      uncertainty of a QC or test measurement is quoted as ± 10 percent and QC or PE program data
292      indicate a data set standard deviation of ± 20 percent, then the laboratory may not have
293      identified all possible uncertainty components or may have underestimated the magnitude of a
294      component.
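
        The consistency check described above can be sketched as follows (the known value, quoted
        uncertainty, and QC results are hypothetical, and the 1.5 screening factor is an
        assumption, not a MARLAP criterion):

            import statistics

            known = 100.0
            quoted_rel_u = 0.10                    # laboratory quotes +/- 10 percent
            qc_results = [118.0, 95.0, 124.0, 81.0, 102.0, 79.0]

            observed_rel_sd = statistics.stdev(qc_results) / known
            print(f"observed relative SD: {observed_rel_sd:.1%}")   # ~18.6%
            if observed_rel_sd > 1.5 * quoted_rel_u:
                print("quoted uncertainty may omit or underestimate a component")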

295      QUANTIFICATION CAPABILITY

296      A requirement for the quantification capability of a method and the required method validation
297      criteria may be specified in a SOW. The quantification capability, expressed as the minimum
298      quantifiable concentration (MQC), is the smallest concentration of the analyte that ensures a
299      result whose relative standard deviation is not greater than a specified value, usually 10 percent.
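
        The definition can be illustrated numerically. This hedged sketch assumes a simple
        two-term uncertainty model u(c) = sqrt(a² + (b·c)²) with hypothetical parameters;
        Chapter 19 gives the formal treatment:

            import math

            a, b = 0.05, 0.04      # absolute floor (Bq/L) and proportional part

            def rel_sd(c):
                # Relative standard deviation of the result at concentration c.
                return math.sqrt(a ** 2 + (b * c) ** 2) / c

            c = 0.01
            while rel_sd(c) > 0.10:        # scan upward to the 10 percent target
                c += 0.01
            print(f"approximate MQC: {c:.2f} Bq/L (RSD = {rel_sd(c):.1%})")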

300      The project manager or TEC should review available documentation on the method to determine
301      if the laboratory can meet the method quantification requirement. Method validation documen-
302      tation sent by the laboratory should demonstrate explicitly, or by extrapolation, that the method,
303      using certain input quantities and their uncertainties, can meet the quantification requirement.
304      The method validation acceptance criteria presented in Section 6.6 have been formulated to eval-
305      uate the MQC requirement at the proper analyte concentration level, i.e., action level or other
306      specified analyte concentration.

307      Some projects may send performance testing material spiked at the MQC level as a more in-
308      depth verification of the compliance with this requirement. Laboratories may also submit docu-
309      mentation for internal QC or external PE program results that cover the MQC value. The TEC
310      should evaluate the reported results to determine if the MQC requirement can be met.

311      DETECTION CAPABILITY

312      A radiochemical method's detection capability for an analyte is usually expressed in terms of
313      minimum detectable concentration (MDC) or activity (MDA). Chapter 19 provides the definition
314      and mathematical equations for the MDC1 and MDA. A MDC requirement for each analyte/
315      matrix combination may be stated in a SOW. Any proposed  method should document the basis
316      and equation for calculating the MDC. The supporting documentation on the method should
317      contain the input quantity values that may be entered into the MDC equation to calculate the
318      detection capability under a variety of assumptions. The TEC should evaluate the assumptions
319      and parameter values for reasonableness and practicality. This evaluation is especially important
320      for recently validated methods that have a limited routine processing history. It is recommended
321      that the TEC perform an independent calculation of the method's MDC using laboratory-stated
322      typical or sample-specific parameters.

323      When the proposed method has been  validated recently or previously used on similar projects,
324      sufficient data should exist that either are directly related to testing the method's detection capa-
325      bility or can be used to estimate the method's detection capability. Any data submitted that
326      document direct testing of the method's detection capability should be reviewed for appropri-
327      ateness or applicability, reasonableness, and accuracy. If method detection testing is performed, it
328      normally will be for one analyte concentration level or value. It should not be expected that the
329      MDC testing process included varying the magnitude of the method's many parameters over a
330      wide range.

          ¹ The MDC should not be confused with the concept of the critical value (Chapter 19).

331      The reported quantitative results of the blanks can be used to estimate the MDC to within a
332      certain degree of confidence (for most methods). At or below the MDC value, the majority of the
333      measurement uncertainty typically is due to the Poisson counting uncertainty. For well-controlled
334      methods, the uncertainties of the other method parameters (input quantities), such as sample
335      weight, detection efficiency, and chemical yield, may range up to 10 percent. Therefore, a simple
336      rule of thumb to estimate the MDC for most methods involves reviewing the measurement
337      uncertainty for the reported blank results. If the blanks were analyzed to meet the MDC
338      requirement, then the reported MDC (based on blank and sample paired observations) for most
339      methods should be between 3 and 4 times the measurement uncertainty of the blank when the
340      background counts (per measurement interval) are greater than 10. It is more complicated to
341      estimate the  MDC for methods that use low background detectors (such as alpha spectrometry)
342      having background counts less than 10 per counting interval. The TEC should evaluate the blank
343      data to determine the reasonableness of the quoted MDC values. These  rules of thumb can be
344      applied to actual samples when the quoted analyte concentration value is less than two times its
345      associated combined standard uncertainty value.
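
        Applied numerically, the rule of thumb reads as follows (the blank uncertainty and quoted
        MDC below are hypothetical):

            u_blank = 0.012        # combined standard uncertainty of the blank, Bq/L
            reported_mdc = 0.045   # laboratory's quoted MDC, Bq/L

            low, high = 3 * u_blank, 4 * u_blank
            print(f"expected MDC range: {low:.3f} to {high:.3f} Bq/L")
            print("reasonable" if low <= reported_mdc <= high else "worth questioning")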

346      APPLICABLE ANALYTE CONCENTRATION RANGE

347      The applicable analyte concentration range can vary substantially depending on whether the
348      project deals with process waste streams, environmental remediation or monitoring, or environ-
349      mental or waste tank characterization research. The proposed method being evaluated should
350      provide accurate results over the analyte concentration range stated in the SOW. Accurate
351      analytical results in this context means consistent method uncertainty (at a given analyte
352      concentration) and without significant bias. The range may be over several decades, from a
353      minimum value (the MDC for some projects) to 100 times the action level or MQC.

354      Due to the effects of the Poisson counting uncertainty, most methods will provide more precise
355      results at higher analyte concentration levels compared to those concentration levels near zero. At
356      concentration levels near zero, background effects will render the results less precise. If the
357      background (instrument or ambient levels of analyte in the matrix) is not well characterized,  a
358      bias may also exist. For projects or programs (environmental characterization research) that have
359      no action level requirement, the lower portion of the required concentration range or the MDC
360      requirement  may be most important. For those situations, particular emphasis should be placed
361      on evaluating method and reagent blank data (i.e., net results that take into account inherent
362      analyte content in the reagents or tracers) to ensure that a bias does not exist. Refer to Section
363      7.2.2.6, "Bias Considerations," on page 7-15 for additional guidance.

364      Typically, radiation detection systems are linear in signal response over a very large range of
365      count rates. However, depending on the magnitude of the chemical or radionuclide interferences
366      in the sample, the method may not produce linear results over the entire application range.
367      Therefore, it is critical that when a mixture of radionuclides is present in a sample, the method
368      must provide sufficient "analyte selectivity/isolation or impurity decontamination" to ensure
369      valid results and "method linearity." In some cases, such as that for pure beta-emitting analytes,
370      the degree of needed decontamination from other interfering nuclides may be as much as six
371      orders of magnitude.

372      There are several sources of information available from the laboratory that should be reviewed
373      and possibly evaluated to ensure the method is capable of meeting this MQO. These include
374      method validation documentation, previous projects or experience using the method, PE program
375      results, internal and external QC sample results, and pre-qualifying test samples. When evalua-
376      ting the data, the TEC should evaluate the method's performance as a function of analyte concen-
377      tration with and without interferences. However, this evaluation would be most valid when the
378      samples were processed to the same MQO (especially MDC or MQC), a situation that may not
379      be realistic for different projects. If the MDC requirement results in a longer counting time from
380      one project to another, there may be an impact on the method's uncertainty for a given analyte
381      concentration due to differences in the Poisson counting uncertainty. Bias typically is not affected
382      by increasing the counting time. A graphical plot of this data would be visually helpful and may
383      be used to determine if the method uncertainty requirement would be met at the action level
384      (extrapolation may be necessary).

385      METHOD SPECIFICITY

386      Method specificity refers to the ability of the method to measure the analyte of concern in the
387      presence of other radionuclide or chemical interferences. The need for or degree of method
388      specificity depends on the degree or magnitude of the interferences and their effect on the ability
389      to measure the analyte of interest. Gross alpha, beta, and gamma-ray methods are considered to
390      be methods of low specificity and are used when individual nuclide specificity is not possible or
391      needed. Radiochemical methods involving sample digestion, purification and decontamination
392      steps followed by alpha spectrometry, such as for 239Pu in soil, are considered methods of high
393      specificity. However, the relative degree of specificity of these nuclide-specific methods depends
394      on the number of analyte isolation and interference decontamination steps. High resolution
395"     gamma-ray spectrometry employing a germanium (Ge) detector is considered to have better


         JULY 2001                                                                       MARLAP
         DRAFT FOR PUBLIC COMMENT              7-13                    DO NOT CITE OR QUOTE

-------
         Evaluating Methods and Laboratories
396      specificity than the lower resolution sodium iodide (NaI) gamma-ray spectrometry.

397      The TEC should evaluate the proposed methods for adequacy to meet the specificity require-
398      ments stated in the SOW. As mentioned in Chapter 6, methods of low specificity, such as gross
399      radiation detection methods, may be proposed if the methods meet the MQOs. For example,
400      when a single analyte having a relatively elevated action level needs to be evaluated, such as 60Co
401      in soil at an action level of 26 Bq/kg (0.7 pCi/g), then a method with less specificity (gross
402      counting methods for gamma-ray or beta emitting nuclides) may be sufficient to meet the MQOs.
403      For this example, a less expensive NaI gamma-ray spectrometric analysis with a lower resolution
404      capability may be more desirable compared to a more costly high resolution germanium gamma-
405      ray spectrometric analysis. If greater method specificity for a certain analyte/matrix combination
406      has been required in the SOW, then a high resolution non-destructive sample analysis method
407      (such as high resolution gamma-ray spectrometry) or a destructive sample analysis by a detailed
408      radiochemical method would be appropriate. For proposed methods of high specificity, it is
409      important that the TEC review and evaluate the basic purification and decontamination steps of
410      the method, or the resolution of the radiation detection system, for adequacy in relation to the
411      expected mixture of analytes and interferences. For radiochemical methods, the TEC may be able
412      to estimate the needed distribution/partition coefficients, extraction and solubility factors, etc., of
413      the various purification steps and compare the values against the needed decontamination factors
414      for the expected chemical or radionuclide interferences.

415      The adequacy of method specificity can be evaluated by the analytical results from the analysis of
416      site-specific PT materials during method validation and/or laboratory pre-qualifying tests. A
417      further discussion on the use of these materials is presented below.

418      METHOD RUGGEDNESS

419      Method ruggedness refers to the ability of the method to produce accurate results over wide
420      variations in sample matrix composition and chemical and radionuclide interferences, as well as
421      when steps (such as pH adjustments) in the method are varied slightly by the analyst. For some
422      projects, the matrix composition and levels of analyte or interferences may vary dramatically in a
423      given project.

424      Ruggedness studies have been defined by EPA (1998). A testing protocol for method ruggedness
425      has been outlined by the American Public Health Association (APHA). Some laboratories may
426      have developed methods according to the APHA protocol for method ruggedness or are using
427      methods contained in Standard Methods (APHA, 1989). Documentation on any internal
428      ruggedness study may be  available from the laboratory.


429      As mentioned in Chapters 5 and 6, the use of site-specific PT materials is a means of testing the
430      ruggedness of a method for a defined project. If ruggedness and method specificity are concerns
431      due to the sample matrix of a defined project, then a variety of site-specific performance testing
432      materials should be sent to the laboratory as part of the pre-qualification process or as a method
433      validation requirement. National PE programs, such as DOE's Multiple Analyte Performance
434      Evaluation Program (MAPEP) and Quality Assessment Program (QAP), use generic PT
435      materials and may not be applicable or representative of the matrices for a defined project. The
436      results of the pre-qualifying or method validation processes using site-specific PT materials
437      should be evaluated by the TEC to determine the adequacy of the method to meet this MQO
438      parameter. If the sample matrix and analytes are fairly standard, then no further evaluation of the
439      available information may be necessary.

440      7.2.2.6 Bias Considerations

441      The method proposed by the laboratory should produce analytical results that are unbiased.
442      MARLAP considers bias to be a persistent difference of the measured result from the true value
443      of the quantity being measured, which does not vary if the measurement is repeated. Normally,
444      bias cannot be determined from a single result or a few results (unless the bias is large). Bias may
445      be expressed as the percent deviation in (or deviation from) the "known" analyte concentration.
446      Since bias is estimated by repeated measurements, there will be an uncertainty in the calculated
447      value. It is incumbent upon the project manager or TEC to evaluate the proposed methods for
448      possible bias over the applicable analyte concentration range. A laboratory should eliminate all
449      known biases before using a method. However, there may be circumstances, such as the
450      processing of site-specific sample matrices, that  may produce some inherent bias that is difficult
451      to assess or correct in a reasonable time or economical fashion. For the methods proposed, the
452      project manager must determine  if the magnitude of the bias will significantly affect the data
453      quality.

454      A bias can be positive or negative. Methods may have a bias at all analyte concentration levels
455      due to the improper determination of chemical yield, detector efficiency or resolution, subtrac-
456      tion of interferences, and improper assumptions for the analyte's half-life or an emission
457      branching ratio. When reporting  an analyte concentration based on a decay progeny analysis,
458      improper ingrowth assumptions may lead to a bias.

459      It is recommended that the project manager or TEC evaluate the available data provided by the
460      laboratory or from performance evaluations for bias, based on multiple analyses covering the
461      applicable analyte concentration  range. One means of estimating a bias is through the evaluation
462      of external PE program data.2 For proper evaluation of the PE program sample results, it is
463      essential that the PE program provider use sample preparation techniques that will produce
464      performance testing (PT) samples (or a sample distribution) having insignificant "within or
465      between" sample analyte heterogeneity and whose analyte concentrations are accurately known.

466      For the purpose of evaluating whether a laboratory method has an observable bias based on
467      multiple laboratory internal QC samples (matrix or method spikes) or external PE program
468      samples, the following equations can be used:
                                     D_i = 100 × (X_i − X_known) / X_known                    (7.1)

469      where D_i is the percent deviation, X_i is an individual analytical result, and X_known is the
470      "known" value for the sample analyzed. The D_i should be determined for each test sample in the
471      data set. The mean percent deviation for the method for a series of n analyses in the data set can
472      be estimated by the equation:

                                     D̄ = (1/n) Σ_{i=1}^{n} D_i                                (7.2)
473      Refer to various references (ASTM D2777, NBS 1963, Taylor 1990) for applicable tests that may
474      be performed to determine if there is a statistical difference at a given significance level.
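
For illustration only, a minimal sketch of these two statistics in Python (the function names and
the numeric values below are hypothetical, not taken from MARLAP):

    # Sketch: percent deviation (Eq. 7.1) and mean percent deviation (Eq. 7.2)
    # for QC or PE sample results with accurately known analyte concentrations.

    def percent_deviation(measured, known):
        """D_i = 100 * (X_i - X_known) / X_known for one test sample."""
        return 100.0 * (measured - known) / known

    def mean_percent_deviation(measured_results, known_values):
        """Average of the individual D_i over the data set (Eq. 7.2)."""
        deviations = [percent_deviation(x, k)
                      for x, k in zip(measured_results, known_values)]
        return sum(deviations) / len(deviations)

    # Hypothetical example: five analyses of a PT sample known to be 10.0 Bq/kg.
    results = [9.6, 10.3, 9.8, 10.1, 9.7]
    knowns = [10.0] * len(results)
    print(mean_percent_deviation(results, knowns))   # -1.0 (percent)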

475      There may be a negative or positive bias at low analyte concentrations due to the improper
476      determination of the appropriate detector background or analytical blank value. An individual
477      blank result (net activity or concentration value) would be considered statistically positive if its
478      magnitude is greater than 1.65 times the quoted
479      measurement uncertainty. An older, much more conservative approach was to consider a reported
480      value as a positive value when the magnitude of a result was greater than 3 times the measure-
481      ment uncertainty.
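
As a minimal sketch of this significance check (Python; the 1.65 multiplier is the one stated above,
and the example numbers are hypothetical):

    # Flag an individual blank result as statistically positive when its net
    # value exceeds k times its quoted (1-sigma) measurement uncertainty.
    def blank_is_statistically_positive(net_value, uncertainty, k=1.65):
        return net_value > k * uncertainty

    print(blank_is_statistically_positive(3.6, 2.0))   # True: 3.6 > 1.65 * 2.0 = 3.3
    print(blank_is_statistically_positive(2.9, 2.0))   # False: 2.9 < 3.3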

482      Since the measurement process is statistical in nature and involves the subtraction of an
483      appropriate background or blank, which also has an uncertainty, there is a 50 percent probability
484      (half of the results) that the analytical result for a blank sample will have a negative magnitude,
485      e.g., −1.5 ± 2.0. An individual blank measurement may be considered to be problematic when its
486      negative magnitude is greater than 2 or 3 times the measurement
487      uncertainty.

         2 In order to standardize against the national standard (NIST), an external performance evaluation program should
         be implemented by a well-qualified provider that has standardized its reference materials to NIST or is participating
         in a NIST traceability program.

488      For most radionuclides, other than those that are naturally occurring, the major source of a
489      positive blank is contamination, either cross-contamination from other samples or dirty
490      glassware during sample processing, or from tracer impurities. A poor estimate of the instrument
491      background or ambient analyte levels in the matrix/reagent can lead to results being too negative
492      in magnitude. A statistical test should be performed on a series of the data results to determine if
493      there is a negative bias. The relative importance of the negative bias depends on the magnitude of
494      the negative bias, the magnitude of the action level, and the type of project.

495      7.3    Initial  Evaluation of a Laboratory

496      The basic information to be considered in the initial evaluation of a laboratory has been
497      summarized according to major categories in Figure 7.1. Not all categories will be discussed in
498      detail as subsections. Some categories may be grouped and discussed under a single generic
499      subsection heading. In order to allow for flexibility, no definitive guidance or detailed acceptance
500      criteria for the parameters under discussion will be provided.

501      7.3.1   Review  of Quality System Documents

502      A radiochemical laboratory providing usable analytical data  should have a quality manual. A
503      review of this document by a knowledgeable evaluator can reveal a great deal about the quality
504      and acceptability of the laboratory relative to the work to be  performed. A well-developed quality
505      manual contains a description of the quality system and descriptive material covering most other
506      aspects of a laboratory's operation. The standard operating procedures, method documentation,
507      list of instrumentation, and personnel resumes should be reviewed. For some projects, the project
508      manager may require the laboratory to develop a specific project quality plan, system, and
509      manual. The following items, taken from the NELAC Quality Systems (NELAC 2000), should be
510      discussed at a minimum:

511       •  Organization and management
512       •  Quality system establishment, audits, essential quality controls and evaluation, and data
513         verification
514       •  Personnel (qualifications and resumes)
515       •  Physical facilities (accommodations and environment)

         [Figure 7.1 is a graphic in the original document. It groups the considerations for the initial
         evaluation of a laboratory under generic specifications for laboratory operations (administrative/
         project management; existing performance indicator review; quality system documentation;
         quality manual; external PE program results; proficiency testing samples; demonstrated quality
         on similar projects) and capability to meet SOW specifications (adequacy of facilities,
         instrumentation, and staff levels; experience base for SOW analytical requirements; capability
         to meet sample processing and reporting requirements; radioanalytical methods applicability
         and quality; method review and comparability; method validation documentation).]

                            FIGURE 7.1 — Considerations for the initial evaluation of a laboratory
516        •  Equipment and reference materials
517        •  Measurement traceability and calibration
518        •  Test methods and standard operating procedures (methods)
519        •  Sample handling, sample acceptance policy and sample receipt
520        •  Records
521        •  Subcontracting analytical samples
522        •  Outside support services and supplies
523        •  Complaints

524      The laboratory evaluation should involve a review of the quality system documents for
525      completeness, thoroughness, and clarity.

526      7.3.2  Adequacy of Facilities, Instrumentation, and Staff Levels

527      Many factors enter into a laboratory's ability to meet the analytical requirements of a SOW. The
528      resources and facilities of a laboratory may become stretched depending on the number of clients,
529      the analytical services needed, and the deadlines of the committed work activities. Some SOWs
530      may request information about the current workload of the laboratory and available facilities,
531      staff and nuclear instrumentation for the specified work scope. The resources needed will vary
532      considerably depending on the analysis  and number of samples: from minimal bench space,
533      hoods, and nuclear instrumentation for fairly simple gross analyses to maximum bench space,
534      hoods, staff, and nuclear instrumentation for low-level analyses of soil. In addition, the laboratory
535      capacity also depends on the number of samples that are routinely processed in a batch. Various
536      factors may control the batch size, including the hood processing area, bench space and
537      equipment setup, the available number of radiation detectors, counting times, and the half-life of
538      the radionuclide, among others.

539      The adequacy of the facilities, instrumentation, and staff levels can be estimated by two general
540      mechanisms: detailed supporting information in the SOW and an initial onsite audit. Information
541      received from the prospective laboratory may provide an estimate of the laboratory's resources,
542      but an initial onsite audit verifies the actual existence and maintenance of the resources.

543      7.3.3  Review of Applicable Prior Work

544      If required in a SOW, a laboratory will provide a list of clients for whom radioanalytical services
545      have been performed on projects considered comparable in terms of work scope, DQOs, MQOs,
546      APSs, and project type. A written or oral verification of the client list should be performed. As
547      part of the verification process, the following items related to adherence to contract or project
548      requirements should be discussed and documented:

549       • Radionuclides analyzed;
550       • Sample matrix types;
551       • Laboratory capacity (number of samples per week or another time period);
552       • MQO for method uncertainty, detection and quantification capability;
553       • Radiological holding times;
554       • Sample turnaround times;
555       • Corrective actions; and
556       • Communications related to schedule, capacity, or quality issues.

557      It should be noted that under performance-based contracting, a laboratory's prior work for an
558      agency should be considered, either as a positive or negative performance weighting factor, when
559      scoring a laboratory's performance during the technical evaluation process.

560      7.3.4   Review of Performance Indicators

561      Some laboratories compile a semiannual or annual QA report summarizing the internal QC
562      sample results for the methods used during a given time period, as well as an internal quality
563      assessment report summarizing the internal and external audit findings and corrective actions
564      taken. Although the laboratory's internal quality criteria for a given radionuclide/matrix may be
565      different from the project MQOs, the internal QC sample results can be used to gauge the
566      laboratory's performance capabilities. If these documents are available, they should be reviewed
567      for documentation of process control and pertinent quality parameters such as bias, precision,
568      unusually high numbers of positive blank detections, chemical recoveries, turnaround times,
569      number of recurring deficiencies or findings, and corrective action effectiveness.

570      7.3.4.1 Review of Internal QC Results

571      A quality assessment report may contain a summary of various QA-related activities, including
572      internal audits and surveillance, reports of conditions adverse to quality, investigation requests,
573      corrective actions, and the results of external PE programs and internal QC samples. The content
574      and frequency of the reports normally are outlined in the laboratory's quality manual. Frequently,
575      this type of quality assessment report may be submitted with the laboratory's response to the RFP
576      without request. The TEC may want to specifically request such a report when available.

577      When the laboratory's quality system is effectively implemented, the information contained in
578      these QA reports can be used not only to gauge the quality of the analyses but also the effective-
579      ness and timeliness of such quality system activities as identifying conditions adverse to quality,
580      controlling and monitoring the radioanalytical quality using internal QC samples, and corrective
581      actions. The internal QC sample results can be used to gauge the laboratory's performance
582      capability. Results of the QC  samples for a radionuclide and sample matrix should be reviewed
583      for both the batch QC samples and single- or double-blind samples submitted by the QA officer.
584      Batch QC samples typically include laboratory control samples, method blanks, matrix spikes, and
585      duplicates. Such parameters as acceptable percent deviation for spiked samples, acceptable
586      precision as measured by duplicate sample analyses, false nuclide detection, positive blanks, and
587      compliance to internal quality requirements should be reviewed, depending on the type of QC
588      sample. The single- and double-blind samples submitted independently by the QA officer are
589      considered more operationally independent than the batch QC samples.

590      When quality problems are observed by the reviewer, it is important to check if the laboratory's
591      quality system also has found and reported the same problem and whether an investigation or
592      corrective action has been undertaken.

593      Additional specific guidance is provided in Chapter 18 on evaluating internal QC samples to
594      meet internal laboratory QC performance criteria. It is recommended that the project managers
595      review this chapter to gain a perspective on how to use reported internal QC results to gauge a
596      laboratory's potential to meet project MQOs.

597      7.3.4.2 External PE Program Results

598      Typically, a laboratory's performance or capability to perform high-quality radioanalyses can be
599      evaluated through two external PE program mechanisms. The first mechanism, which may not be
600      available for all projects, is the submittal, as an initial laboratory evaluation process, of project-
601      specific PT samples prepared by the organization or a contracted source manufacturer. When
602      previous knowledge or experience exists, well-characterized site-specific matrix samples
603      containing the nuclides of interest can be used. This approach can use site-specific matrix
604      materials for background samples or for samples spiked with target analytes. For this evaluation
605      mechanism, and depending on the number and type of samples, the laboratory's capability to
606      meet all proposed project MQOs  and quality performance specifications may be evaluated.

607      The second mechanism, available to most projects, is the laboratory's participation in
608      government or commercial PE programs for radioanalyses. Each PE program has its own
609      acceptable performance criteria related to a laboratory's bias with respect to the PE program's
610      "known" analyte concentration value. Acceptable performance criteria are established for each
611      nuclide/matrix combination. A PE program may also evaluate a laboratory based on a false
612      positive analyte detection criterion. Typically, the laboratory's performance data in government
613      PE programs are provided in reports available to the public.
615      The project manager should be aware that the acceptable performance criteria used by the PE
616      programs may be inconsistent with or more lenient than the MQOs of the project. The
617      laboratory's performance should be evaluated in terms of the established MQOs of the project
618      rather than a PE program's acceptable performance criteria. In some cases, the laboratories could
619      be ranked as to their level of performance in these programs.

620      7.3.4.3 Internal and External Quality Assessment Reports

621      Most laboratories undergo several external and internal QA audits per year, with resultant audit
622      reports. Typically, a summary of the findings and commitments of internal and external quality
623      audits or assessments is tracked in some type of QA database as part of the laboratory's
624      corrective action process. Access to the audit reports or database information may be limited.
625      This information is not normally requested as part of the RFP process, nor do most laboratories
626      submit such information with their response to an RFP. Therefore, access to previous QA audit
627      information from a laboratory outside a formal, external, onsite audit process may be limited.

628      7.3.5  Initial Audit

629      An initial assessment or audit may be performed to provide assurance that a potentially selected
630      laboratory is capable of fulfilling the project requirements in accordance with the SOW.
631      Essentially, the objectives of an initial audit are twofold. The first objective is to verify that what
632      the laboratory claims in response to the SOW or RFP, such as the various quality and safety
633      programs, is being correctly and fully implemented and, when used during the project period,
634      will ensure that stipulated requirements will be met. The second objective is to determine if the
635      laboratory has the instruments, facilities, staffing levels and other operational requirements
636      available to handle the anticipated volume of work. In other words, is the laboratory's proposal
637      realistic when compared to the actual facilities? To answer this question, auditors will be looking
638      to see whether a candidate laboratory has all the required elements to meet the project needs.

639      Detailed guidance and information on what should be evaluated in an initial audit has been
640      provided in Appendix E, Section E5.5 and Table E7. This section also contains recommendations
641      on the key items or parameters that should be reviewed during the initial audit. Depending on the
642      project, other quality or operational parameters/requirements (such as requirements related to
643      chemical speciation or subsampling at the laboratory) not covered in Appendix E should be
644      included in the initial audit plan.

645      7.4   Ongoing Evaluation of the Laboratory's Performance

646      The evaluation framework presented here is intended to be sufficiently generic to cover the
647      operations of a laboratory performing work according to a SOW as recommended in Chapter 5.
648      As described in MARLAP, MQOs are a key component of the SOW. Therefore, the sample
649      schedule, analyses to be performed, MQOs, and other analytical requirements have been defined.
650      The methods selected by the laboratory have been demonstrated to meet the MQOs and have
651      been approved by the project manager. In addition, the laboratory and its programs should have
652      undergone an initial audit to ensure that the laboratory has  met or is capable of meeting project
653      requirements, including sample processing capacity, sample TATs, deliverables for analytical
654      reports, etc. This would include maintaining a satisfactory quality system that includes
655      monitoring and controlling the radioanalytical processes through an instrument and internal
656      sample QC program and acceptable performance in an external PE program.

657      The ongoing evaluation of a laboratory's performance includes evaluating the method
658      applicability and the quality of the data produced, and assessing the laboratory's quality system
659      and operations through onsite or desk audits or assessments. The continued method performance
660      can be evaluated through the laboratory's internal sample QC program, a possible external QC
661      program maintained by the project manager, or an external PE program. It should be noted that
662      samples used to control and monitor the quality of laboratory analyses have been defined
663      according to their use. For example, batch or external QC samples are used to control as well as
664      monitor the quality of the analytical process (the process can be stopped immediately if the QC
665      sample results indicate that the process is outside appropriate SOW specifications or laboratory
666      control limits). As defined previously, PT samples are used to compare the performance of the
667      radioanalytical processing to some acceptance criteria but are not used to control the process.

668      The ongoing evaluation of the laboratory quality system and operations is accomplished through
669      a visit to the laboratory or by a desk audit (the review of records and data from the laboratory).
670      These audits or assessments are more focused on whether the laboratory is meeting project
671      specifications rather than whether the laboratory has the capability to meet project or SOW
672      requirements.

673      Once a laboratory has initiated work on a project, the laboratory's performance should be
674      evaluated for the duration of the project. The quality of the radioanalytical measurements, as well
675      as the pertinent key operational aspects of the laboratory, should be evaluated against the
676      requirements of the MQOs and SOW. Both the quantitative and qualitative measures of
677      laboratory performance should be evaluated on a continual basis. In addition, the operational
678      aspects of the laboratory germane to the effective implementation of the project requirements
679      should be evaluated/monitored on a continual basis.

680      7.4.1   Quantitative Measures of Quality

681      The laboratory's ongoing demonstrated ability to meet the MQOs and other APS requirements
682      can be evaluated through various quantitative measures using internal QC data and external PE
683      program QC data. From these data, quantitative tests, as outlined in Appendix C, can be used to
684      measure and monitor the MQO parameters on a short-term basis. Also, the QC and PE program
685      data can be used to evaluate the laboratory's performance, on a long-term trending basis, in
686      meeting other quality-related parameters such as bias and precision, unusually high numbers of
687      positive blank detections, false nuclide detection, MDC or MQC adherence, radiological holding
688      times, etc. The following subsections will discuss the use of data from these samples to evaluate
689      the laboratory's radioanalytical quality with respect to the requirements.

690      7.4.1.1 MQO Compliance

691      MARLAP recommends that project-specific MQOs be established and incorporated into the
692      SOW for laboratory radioanalytical services. Appendix C provides guidance on developing the
693      MQOs for method uncertainty, detection capability, and quantification capability. Establishing a
694      gray region and action level are important to the development of the MQOs. For certain research
695      programs and characterization studies, the concept of an action level may not be applicable. For
696      these studies or programs, the MDC requirement and restrictions on the frequency of false
697      positive detections may be more important. As such, the project planning team for these
698      programs should establish the basis for their own MQOs and develop tests to evaluate a
699      laboratory's performance to meet the requirements. These tests may be different from those
700      presented below.

701      MARLAP recommends that an MQO for method uncertainty be established for each analyte/
702      matrix combination. The method uncertainty is affected by laboratory sample preparation, sub-
703      sampling, and the analytical method. In the absence of other information, the required method
704      uncertainty (u_MR) at the upper bound of the gray region (UBGR) may be defined as:


                                          u_MR = Δ / 10                                        (7.3)

705      where u_MR is the required method uncertainty and Δ is the width of the gray region (difference
706      between the upper and lower bounds of the gray region) as defined in Appendix C. In terms of the
707      relative fraction of the upper bound of the gray region (action level), φ_MR is defined as:

                                          φ_MR = u_MR / UBGR                                   (7.4)
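
As a short numeric sketch of these two definitions (Python; the action level and gray-region
bound are hypothetical values, not taken from any project):

    # Required method uncertainty (Eq. 7.3) and its relative form (Eq. 7.4).
    action_level = 10.0   # UBGR, Bq/kg (hypothetical)
    lbgr = 5.0            # lower bound of the gray region, Bq/kg (hypothetical)

    delta = action_level - lbgr      # width of the gray region
    u_mr = delta / 10.0              # required method uncertainty at the UBGR
    phi_mr = u_mr / action_level     # relative required method uncertainty

    print(u_mr, phi_mr)              # 0.5 Bq/kg and 0.05 (i.e., 5 percent)
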
708      The following subsections describe methods to quantitatively monitor a laboratory's performance
709      relative to meeting this principal MQO through the use of internal or external batch QC samples.
710      In some cases, the laboratory's internal quality program may have more restrictive quality control
71 1      limitations for method performance compared to the proposed control limits used by the project
712      manager to monitor adherence to the MQO for method uncertainty. Evaluation of the labora-
713      tory's performance in NIST-traceable external PE programs will determine the degree of bias of
714      the laboratory's method with respect to the national standard, as opposed to the determination of
715      the laboratory's internal bias through the use of internal QC samples. The tests presented assume
716      that all known internal (related to QC values and calibrations) and external (calibration differ-
717      ences with respect to the national standard) biases have been defined and eliminated and, as such,
718      the difference between the measured result and the "expected known" value is a result of the
719      method uncertainty only.

720      USE OF INTERNAL QC SAMPLE RESULTS

721      For most projects, the SOW will specify that the laboratory incorporate internal QC samples
722      within a defined batch of samples. The QC samples may include a laboratory control sample,
723      sample duplicates, a matrix spike sample and a method or reagent blank, or both. Appendix C
724      provides examples on the use of the following quantitative tests to measure a laboratory's
725      performance in meeting the MQO for method uncertainty.

726      Quality Performance Tests and Acceptance Criteria for Quality Control Samples

727      Laboratory Control Sample (LCS). The analyte concentration of an LCS should be high enough
728      so that the resulting Poisson counting uncertainty is small and the relative uncertainty limit φ_MR is
729      appropriate with respect to the action level and the spike concentration chosen. The percent
730      deviation (%D) for the LCS analysis is defined as

                                      %D = ((SSR − SA) / SA) × 100%                           (7.5)
731      where
732         SSR   is the measured result (spiked sample result) and
733         SA    is the spike activity (or concentration) added.

734      It is assumed that the uncertainty of SA is negligible with respect to the uncertainty of SSR.
735      Refer to Appendix C for the basic assumption and limitation of this test. For long-term trending,
736      the %D results should be plotted graphically in terms of a quality control chart as described in
737      Chapter 18. The warning and control limits on %D are summarized below:
738      Laboratory Control Samples
739         Statistic:        %D
740         Warning limits:   ±2 × (φ_MR × 100%)
741         Control limits:   ±3 × (φ_MR × 100%)
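
A minimal sketch of this LCS test (Python), using the limits in the box above; φ_MR, the helper
name, and the spiked-sample numbers are hypothetical:

    # Percent deviation for an LCS (Eq. 7.5), checked against warning and
    # control limits of +/-2 and +/-3 times (phi_MR x 100%).
    def lcs_percent_deviation(ssr, sa):
        return 100.0 * (ssr - sa) / sa

    phi_mr = 0.05                                  # assumed relative method uncertainty
    warning = 2 * phi_mr * 100.0                   # +/-10 percent
    control = 3 * phi_mr * 100.0                   # +/-15 percent

    d = lcs_percent_deviation(ssr=10.8, sa=10.0)   # +8.0 percent
    if abs(d) <= warning:
        print(d, "in control")
    elif abs(d) <= control:
        print(d, "warning limit exceeded")
    else:
        print(d, "control limit exceeded")
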
742      Duplicate Analyses. The acceptance criteria for duplicate analysis results depend on the analyte
743      concentration of the sample, which is estimated by the average x̄ of the two measured results x₁
744      and x₂:

                                          x̄ = (x₁ + x₂) / 2                                    (7.6)

745      When x̄ < UBGR, the absolute difference |x₁ − x₂| of the two measurements is used in the testing
746      protocol. For these tests, only upper warning and control limits are used, because the absolute
747      value |x₁ − x₂| is being tested.

748      When x̄ > UBGR, the acceptance criteria may be expressed in terms of the relative percent
749      difference (RPD), defined as

                                          RPD = (|x₁ − x₂| / x̄) × 100%                         (7.7)
750     The requirements for duplicate analyses are summarized below.
Duplicate Analyses

   If x̄ ≤ UBGR:
      Statistic:        |x₁ − x₂|
      Warning limit:    2.83 u_MR
      Control limit:    4.24 u_MR

   If x̄ > UBGR:
      Statistic:        RPD
      Warning limit:    2.83 × (φ_MR × 100%)
      Control limit:    4.24 × (φ_MR × 100%)
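
A sketch of the two-branch duplicate test (Python), following the box above; u_MR, φ_MR, and
the UBGR are assumed project inputs, and the duplicate results shown are hypothetical:

    # Duplicate-analysis check per Eqs. 7.6 and 7.7 and the limits above.
    def duplicate_check(x1, x2, u_mr, phi_mr, ubgr):
        mean = (x1 + x2) / 2.0                     # Eq. 7.6
        if mean <= ubgr:
            stat = abs(x1 - x2)                    # absolute difference
            warn, ctrl = 2.83 * u_mr, 4.24 * u_mr
        else:
            stat = 100.0 * abs(x1 - x2) / mean     # RPD, Eq. 7.7
            warn, ctrl = 2.83 * phi_mr * 100.0, 4.24 * phi_mr * 100.0
        if stat <= warn:
            return "in control"
        return "warning limit exceeded" if stat <= ctrl else "control limit exceeded"

    # RPD = 100 * 1.5 / 12.75 ~ 11.8 percent, below the 14.15 percent warning limit.
    print(duplicate_check(12.0, 13.5, u_mr=0.5, phi_mr=0.05, ubgr=10.0))
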
Method Blanks

   Statistic:        Measured concentration value
   Warning limits:   ±2 u_MR
   Control limits:   ±3 u_MR

Matrix Spikes. The acceptance criteria for matrix spikes are more complicated than those
described above for the other laboratory QC samples because of the pre-existing activity that is
inherent to the unspiked sample. The pre-existing activity (or concentration) must be measured
and subtracted from the activity measured after spiking.

771      MARLAP recommends the "Z score," defined below, as the test for matrix spikes:

                          Z = (SSR − SR − SA) / (φ_MR √(SSR² + max(SR, UBGR)²))               (7.8)

where:
   SSR   is the spiked sample result,
   SR    is the unspiked sample result,
   SA    is the spike concentration added (total activity divided by aliquant mass), and
   max(SR, UBGR) denotes the maximum of SR and UBGR.

The warning and control limits for Z are set at ±2 and ±3, respectively. It is assumed that the
uncertainty of SA is negligible with respect to the uncertainty of SSR. For long-term trending, the
Z results should be plotted graphically in terms of a quality control chart, as described in Chapter
18.

782      The requirements for matrix spikes are summarized below.

Matrix Spikes

   Statistic:        Z = (SSR − SR − SA) / (φ_MR √(SSR² + max(SR, UBGR)²))
   Warning limits:   ±2
   Control limits:   ±3
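
A minimal sketch of the Z-score test as reconstructed above (Python; φ_MR, the UBGR, and the
sample values are hypothetical inputs):

    import math

    # Matrix-spike Z score (Eq. 7.8); warning/control limits are +/-2 and +/-3.
    def matrix_spike_z(ssr, sr, sa, phi_mr, ubgr):
        denominator = phi_mr * math.sqrt(ssr**2 + max(sr, ubgr)**2)
        return (ssr - sr - sa) / denominator

    z = matrix_spike_z(ssr=14.7, sr=4.2, sa=10.0, phi_mr=0.05, ubgr=10.0)
    print(round(z, 2), abs(z) <= 2)   # 0.56 True (within the warning limits)
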
787      USE OF EXTERNAL PE PROGRAM AND QC SAMPLE RESULTS

788      Information on a laboratory's performance in an external PE program or from double-blind QC
789      samples is very useful in monitoring a laboratory's ability to meet MQOs. A PE program will
790      provide a snapshot in time, whereas external QC samples included with samples submitted to the
791      laboratory permit a continuous evaluation of the method's performance. When traceable to NIST,
792      the PE program will elucidate any measurement or instrument calibration biases as related to the
793      national standard. An external QC program may not have NIST traceability, and thus calibration
794      biases to the national standard would not be determined.

795      For monitoring the performance of a laboratory using external PE program and QC sample
796      results, the tests provided in the previous subsection ("Use of Internal QC Sample Results," page
797      7-25) may be used when there are sufficient data.  The test equations assume that the project has
798      an MQO for method uncertainty at a specific concentration. In addition, it is assumed that the
799      Poisson counting uncertainty for the radioanalysis of these samples is minimal.

800      Results from PE Programs

801      In many SOWs, the laboratory is required to participate in a recognized PE program for the
802      nuclides and media of interest. In some cases, a certificate of participation may be needed as part
803      of the response to the RFP. However, it also should be noted that although a laboratory may meet
804      performance acceptance criteria for an external PE program, this fact may have no bearing on
805      whether the method  will meet the MQOs of the SOW.

806      Monitoring ongoing laboratory performance is limited by the testing frequency of the PE
807      program, which is usually quarterly or semiannual. Some PE programs require multiple
808      measurements to estimate precision, but most request that only a single result be reported. In
809      addition, the concentration of the analyte typically does not approach an action level value, and
810      the media used are not site specific. For PE program samples, when possible, the laboratory
811      should analyze a sample to reach a 1σ Poisson counting uncertainty of less than five percent.

812      Multiple Analyses and Results

813      When a PE program requires the analysis of multiple samples, the laboratory's measurement
814      precision and bias (to a "known value") at the analyte concentration may be estimated and
815      reported by the PE program provider. When only duplicate sample results are reported, then the
816      tests for laboratory control samples and duplicate  analyses given in the previous section should
817      be used. The duplicate analysis test can be used as is, but the laboratory control sample test


         MARLAP                                                                       JULY 200!
         DO NOT CITE OR QUOTE                   7-28              DRAFT FOR PUBLIC COMMENT

-------
                                                              Evaluating Methods and Laboratories
818      should be evaluated based on the mean of the duplicate results. By using the mean of the two
819      results, the LCS test provides a better estimate of any laboratory measurement bias with respect
820      to the PE program provider. As discussed in Appendix C, the measurement (combined standard)
821      uncertainty of each measured result value should be smaller than the required u_MR or φ_MR.

822      Results from External QC Samples

823      The project manager may elect to establish an external QC program wherein QC samples are
824      submitted to the laboratory with each batch of routine samples for the purpose of "controlling,"
825      rather than monitoring, the quality of the analytical processes. The types of QC samples may
826      include matrix spikes, blanks, and possibly duplicates if prepared under controlled and exacting
827      protocols. An agency may use a qualified reference or monitoring laboratory (ANSI N42.23) to
828      prepare the performance testing materials. When available, these QC samples may be prepared
829      from site-specific materials.

830      When acceptance criteria are not met, the organization may issue a stop-work order and request
831      corrective actions and reanalysis before routine processing can resume. In order to do this, the
832      SOW must define the performance acceptance criteria and stipulate that the agency or
833      organization has the right to stop laboratory processing when the performance requirements are
834      not met. This application is not widespread but may have merit for certain project types. For
835      example, research or national monitoring programs may monitor groundwater for specific
836      naturally occurring radionuclides at state-of-the-art detection levels. For these programs, frequent
837      false positive results, due to the application of incorrect instrument background or an analytical
838      blank to the analytical result, would be unacceptable.  Rather than permit a high rate of false
839      positive results to continue, the agency can use the external batch QC samples to detect problems
840      early and have the laboratory discontinue sample processing until a root cause is discovered and a
841      corrective action undertaken. Non-conformance of a single analysis to performance criteria
842      would not warrant the issuance of a stop-work order unless a severe blunder has occurred.
843      Typically, a certain amount of statistical trending of the data is needed to truly elucidate
844      deficiencies.

845      Since the number of QC samples is  similar to the recommendations for the laboratory's internal
846      batch QC samples, there should be sufficient data for trending. The statistical tests provided in
847      the section on "Use of Internal QC Sample Results," beginning on page 7-25, may be applied to
848      these QC samples.

849      7.4.1.2 Other Parameters

850      The laboratory's performance in meeting the requirements for the other APSs that are listed in
851      the SOW should be evaluated quantitatively when possible. In some cases, the information
852      needed to perform the evaluations may be found in the final analytical results data package. For
853      certain types of evaluations, a follow-up onsite or desk audit may be needed to complete the
854      evaluation, e.g., a review of logbooks on unique processes or software algorithms and the
855      analytical database for proper spectral resolution.

856      RADIOLOGICAL HOLDING AND TURNAROUND TIMES

857      The data packages or analytical results report should contain the sample collection (reference),
858      sample analysis, and reporting dates. From this information, the radiological holding and sample
859      processing TATs can be calculated and compared against requirements. When a method uses a
860      decay progeny to measure the analyte of interest (222Rn to measure 226Ra), the decay of the parent
861      nuclide and ingrowth of the decay progeny are important parameters for evaluation. Unless
862      requested in the SOW, most laboratories do not report the ingrowth factor as a standard output.
863      Therefore, the information on the sample-specific ingrowth factor may be available in the data
864      reports or during audits. When required, these time-related requirements will be evaluated for
865      compliance during data verification and validation.
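
As one illustration of the ingrowth evaluation described above, a minimal sketch (Python) for a
simple parent-progeny pair, assuming the progeny starts at zero and the parent's decay over the
ingrowth period is negligible; the 21-day period is hypothetical:

    import math

    RN222_HALF_LIFE_DAYS = 3.82   # approximate half-life of 222Rn

    # Fraction of the secular-equilibrium progeny activity grown in after t days.
    def ingrowth_factor(t_days, half_life_days=RN222_HALF_LIFE_DAYS):
        decay_constant = math.log(2) / half_life_days
        return 1.0 - math.exp(-decay_constant * t_days)

    print(round(ingrowth_factor(21.0), 3))   # 0.978 after a 21-day ingrowth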

866      CHEMICAL YIELD

867      When appropriate, the SOW may specify limits on the chemical yield for each analyte. For
868      radionuclides, this requirement typically is related to the provision of robust or rugged methods
869      so that extreme yields become flags indicating potential problems. Wide swings in the chemical
870      yield may be indicative of a method's difficulty in handling matrix or radionuclide interferences.
871      data packages or analytical results report should contain the chemical yield for each analyte
872      listed. This reported value can be compared to the SOW yield limit. When required, these
873      requirements will be evaluated for compliance during data verification and validation.

874      SPECTRAL RESOLUTION

875      Problems with spectral resolution of gamma-ray and alpha spectra cannot be evaluated through a
876      review of the analytical results report. If spectral resolution limits have been stated in the SOW,
877      the evaluator should review and evaluate each sample spectrum against the SOW limit. Spectral
878      information may be available in data packages when required or may be obtained during audits.

879      During an initial audit, a preliminary evaluation of the method's SOP and review of past
880      performance data for spectral resolution should be undertaken. The TEC may want to determine
881      the baseline or typical spectral resolution for the radiation detection systems that will be used in
882      the analysis of project samples. Trends of the spectral resolution of each detection system during
883      the conduct of the project may be used to determine compliance with a spectral resolution
884      specification.

885      7.4.2   Operational Aspects

886      Once a laboratory begins providing radioanalytical services, certain operational aspects need to
887      be reviewed and evaluated periodically to determine if the laboratory is maintaining project
888      requirements or if new problems have occurred. It is also important to ensure that the laboratory
889      has been properly maintained and is operated and managed in a manner that will not create a
890      liability to any client. Many of the operational areas that were discussed in Sections 7.3.1 and
891      7.3.2 for the initial evaluation of a laboratory also should be evaluated periodically to ensure
892      commitments are being met. The audit frequency varies according to the organization and  the
893      extent of the project or contract. Desk audits can be  conducted more frequently than onsite audits
894      because they require fewer resources. However, not all operational aspects may be reviewed
895      during desk audits. The operational aspects that may be considered during desk and onsite audits
896      are presented below.

897      7.4.2.1 Desk Audits

898      A desk audit is conducted as an off-site activity, usually by a technical representative of the
899      project manager. A radioanalytical specialist should review all technical aspects of the desk
900      audit, including method and calculation (data reduction) changes, method performance,
901      instrument recalibrations, corrective actions, and case narratives. The desk audit is most useful
902      when performed periodically to monitor certain activities or programs following an extensive
903      onsite laboratory audit. However, for some smaller projects, the desk audit may be the only
904      assessment mechanism used to monitor the laboratory's operations. The desk audit may be used
905      to review or monitor the following operational aspects or items:

906       •  Organization and Management
907         o   Changes in key personnel
908         o   Reassignments

909       •  Quality System
910         o   Internal and external audits conducted, including laboratory certification audits
911         o   Corrective action implementations
912         o   Quality control and performance evaluations
913             -  Instrument and batch sample QC results
914             -  External PE program results
915         o   Laboratory data verification (narrative status reports)
916         o   Additional method validation studies

917       •  Certificates, licenses, equipment, and reference materials
918         o   Standard and tracer certificates
919         o   New instrument calibrations and updates to existing calibrations
920         o   Instrument repairs and new instruments put into service
921         o   NRC/State radioactive materials license updates
922         o   State or EPA drinking water certification status changes

923       •  Personnel
924         o   Updates to staff qualification/proficiency for methods
925         o   Updates to staff training files
926             -  Radiation and chemical safety
927             -  Quality assurance
928             -  Technical principles
929             -  Hands-on training records

930       •  Radioanalytical Methods and Standard Operating Procedures
931         o   Updates to methods and SOPs
932         o   Technical basis for updates
933         o   Detection limits or method uncertainty studies

934       •  Sample Receipt, Handling, and Disposal
935         o   Sample receipt acknowledgment
936         o   Chain-of-custody
937         o   Sample- and waste-disposal tracking logs and manifests

938      Desk audits may also be used to review the data packages provided by the laboratory and,
939      periodically, to verify certain method results by hand calculations. In addition, verification of
940      compliance to radiological holding and turnaround times may be performed during the desk
941      audit. In the absence of a full data verification and validation program (Chapter 8), the desk audit
942      may be used to periodically evaluate the detailed instrument and data reduction reports of the
943      data packages for method adherence, technical correctness and valid application.


944      7.4.2.2 Onsite Audits

945      The onsite laboratory audit is more comprehensive and resource intensive than a desk audit. An
946      onsite audit typically is conducted to assess, periodically and in depth, a laboratory's capability to
947      meet project requirements. Section E.5.5 of Appendix E provides guidance on the conduct of an
948      initial onsite audit during a contract award process. EPA (1997) provides limited guidance on the
949      conduct of an audit for a radiological laboratory. NELAC (2000) provides some generic guidance
950      on laboratory assessments, although not specifically for a radiological laboratory.

951      Onsite audits usually cover the operational aspects delineated in Section 7.4.2.1 and also provide
952      an opportunity to evaluate the physical conditions at the laboratory, in terms of adequacy and
953      upkeep of the facilities, and the full application or conduct of programs and resources.
954      Information sent in data packages or submitted for desk audits can be confirmed or verified
955      during an onsite audit. Furthermore, an onsite audit permits the tracking of a sample from receipt
956      through processing to sample storage and disposition and can verify the related instrument and
957      batch QC samples specific to the sample being tracked. During an onsite audit, the auditors may
958      interview the staff to gauge their technical proficiency and familiarity with methods.

959      For large projects, onsite audits may be formal in nature, following a predefined audit plan
960      developed by a designated audit team for a specific project or program. The audit team
961      typically is composed of qualified QA representatives and technical experts. MARLAP
962      recommends that the audit team include a radioanalytical specialist familiar with the project's or
963      program's technical aspects and requirements.

964      In addition to the items in Section 7.4.2.1 ("Desk Audits"), the following items and programs
965      should be assessed during an onsite laboratory audit:

966       • Organization and Management
967         o   Qualifications of assigned laboratory project manager
968         o   Implementation of management's policy on quality
969         o   Timeliness of addressing client complaints
970         o   Timeliness of implementing corrective actions

971       • Physical Facilities
972         o   Adequacy of facilities (sample receipt, processing, instrumentation and storage areas,
973             waste processing and storage, offices, etc.)
974         o   Physical conditions of facilities including laboratories, hoods, bench tops, floors, offices,
975             etc.


 976         o  Environmental controls, such as climate control (heating, ventilation, air conditioning)
 977            and electrical power regulation
 978         o  Sample processing capacity
 979         o  Sample storage conditions including chain-of-custody lockup areas and cross-
 980            contamination control (separation of samples by project and from radioactive sources or
 981            wastes)

 982      • Instrumentation and Equipment
 983         o  Age of nuclear instrumentation and equipment
 984         o  Functionality of nuclear instrumentation and equipment
 985         o  Calibrations and QC logs
 986         o  Maintenance and repair logs
 987         o  Sample throughput capacity
 988         o  Contamination control for radiation detectors
 989         o  Background spectra of radiation detectors

 990      • Methods and Standard Operating Procedures
 991         o  Use of latest revisions of methods and SOPs (spot check method manuals used by
 992            technical staff)
 993         o  Conformance to method application (surveillance of method implementation)
 994         o  Effectiveness of administering the controlled method manual

 995      • Certifications, Licenses, and Certificates of Traceability
 996         o  Ensure existence and applicability of, and conformance to, certifications and licenses
 997         o  Citations noted during audits related to certifications and licenses
 998         o  Ensure use of NIST-traceable materials (calibration standards); review vendors' reports
 999            of NIST traceability

1000      • Waste Management Practices
1001         o  Adherence to waste management SOPs
1002         o  Proper packaging, labeling, manifests, etc.
1003         o  Sample storage and records
1004         o  Training and qualification records

1005      • Radiological Controls
1006         o  Adherence to radiological safety SOPs
1007         o  Contamination control effectiveness (spill control, survey requirements and adherence,
1008            posted or restricted areas, proper ventilation, cleaning policies, etc.)


1009         o  Badging and survey adherence

1010      • Personnel
1011         o  Number and technical depth of processing staff
1012         o  Training files
1013         o  Testing/qualifications
1014         o  Personal interviews to determine familiarity with methods and safety SOPs

1015      • Quality Systems
1016         o  Performance indicator program (feedback from program)
1017         o  Quality assurance reports (QC and audits) for all laboratory processing
1018         o  Ongoing method evaluations and validations
1019         o  Corrective action program (effectiveness and outstanding issues for all processing; spot
1020            check for implementation of corrective actions)
1021         o  Records/reports related to audits of vendors used by laboratory
1022         o  Reagent control program (spot check conformance for effectiveness)
1023         o  Audits of laboratories that are subcontracted
1024         o  Laboratory's data verification and validation processes

1025      • Software Verification and Validation
1026         o  Spot review of key method calculation and data reduction programs, including MDC,
1027            MQC, and measurement uncertainty; spectral unfolding routines or crosstalk factors;
1028            application of instrument background and analytical blanks; etc. (see the sketch after this list)
1029         o  Spot verification of consistency between the electronic data deliverable and data packages

1030      • Radiological Holding and Sample Turnaround Times
1031         o  Verification of compliance with radiological holding and sample TAT specifications (spot
1032            check samples and confirm paperwork)
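
As an illustration of the kind of spot review named under "Software Verification and
Validation" above, the sketch below recomputes an MDC using a commonly cited Currie-style
detection limit for a paired blank (L_D = 2.71 + 4.65·sqrt(B) counts). Whether a given
laboratory's program uses this exact form is an assumption the auditor must confirm against
the laboratory's documented algorithm; all numbers shown are illustrative.

    # Hypothetical spot check of an MDC calculation during an onsite audit.
    # Uses the widely cited Currie-style detection limit for a paired blank,
    # L_D = 2.71 + 4.65*sqrt(B) counts; confirm the laboratory's documented
    # algorithm before comparing, since programs differ in the form used.

    import math

    def mdc_bq_per_l(bkg_counts, count_time_s, efficiency,
                     chemical_yield, aliquant_l):
        """Minimum detectable concentration (Bq/L) for a counting measurement."""
        detection_limit_counts = 2.71 + 4.65 * math.sqrt(bkg_counts)
        return detection_limit_counts / (count_time_s * efficiency *
                                         chemical_yield * aliquant_l)

    # Compare against the MDC reported by the laboratory's software.
    print(f"Recomputed MDC: {mdc_bq_per_l(720, 3600, 0.25, 0.85, 4.0):.4f} Bq/L")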
                        Summary of Recommendations

MARLAP recommends that a radioanalytical specialist review the methods for technical
adequacy.

MARLAP recommends that project-specific MQOs be established and incorporated into
the SOW for laboratory radioanalytical services.

MARLAP recommends that an MQO for method uncertainty be established for each
analyte/matrix combination.

MARLAP recommends that an audit team include a radioanalytical specialist familiar with
the project's or program's technical aspects and requirements.

1042     7.5   References

1043     American National Standards Institute (ANSI) N42.23. Measurement and Associated
1044        Instrumentation Quality Assurance for Radioassay Laboratories.

1045     American Public Health Association (APHA) 1989. Standard Methods for the Examination of
1046        Water and Waste Water. Washington, DC.

1047     American Society for Testing and Materials (ASTM) D2777. Standard Practice for
1048        Determination of Precision and Bias of Applicable Test Methods of Committee D-19 on
1049        Water.

1050     American Society for Testing and Materials (ASTM) E177. Standard Practice for Use of the
1051        Terms Precision and Bias in ASTM Test Methods.

1052     American Society for Testing and Materials (ASTM) E548. Standard Guide for General Criteria
1053        Used for Evaluating Laboratory Competence.

1054     American Society for Testing and Materials (ASTM) E1580. Standard Guide for Surveillance of
1055        Accredited Laboratories.

1056     American Society for Testing and Materials (ASTM) E1691. Standard Guide for Evaluation and
1057        Assessment of Analytical Chemistry Laboratories.

1058     U.S. Environmental Protection Agency (EPA). 1997. Manual for the Certification of
1059        Laboratories Analyzing Drinking Water. EPA 815-B-97-001.

1060     U.S. Environmental Protection Agency (EPA). 1998. Guidance for Quality Assurance Project
1061        Plans (EPA QA/G-5). EPA 600-R-98-018, February.
1062     International Organization for Standardization (ISO) 17025. General Requirements for the
1063        Competence of Testing and Calibration Laboratories. International Organization for
1064        Standardization, Geneva, Switzerland. 1999.

1065     National Bureau of Standards (NBS). 1963. Experimental Statistics. NBS Handbook 91, National
1066        Bureau of Standards, Gaithersburg, MD.

1067     National Environmental Laboratory Accreditation Conference (NELAC) 2000. Chapter 5,
1068        Quality Systems. July. Available at: http://www.epa.gov/ttn/nelac/.

1069     Taylor, John Keenan. 1990. Statistical Techniques for Data Analysis. Lewis Publishers, Chelsea,
1070        MI.
 1           8 RADIOCHEMICAL DATA VERIFICATION AND

 2                                       VALIDATION


 3      8.1    Introduction

 4      The goal of the data collection process is to produce credible and cost-effective data to meet the
 5      needs of a particular project. The process can be divided into several stages, as illustrated in the
 6      data life cycle (Chapter 1). This chapter is the first of two chapters that address the assessment
 7      phase of the project. Because the efficiency and success of these assessment activities are heavily
 8      dependent on the completion of the preceding steps in the data collection process, especially the
 9      initial planning activity (Chapter 2), the integration of planning and assessment is discussed in
10      Section 8.2 prior to presenting material on data verification and validation.

11      Data verification compares the material delivered by the laboratory to the requirements in the
12      statement of work (SOW) and identifies problems, if present, that should be investigated during
13      data validation. Data validation compares the data produced with the measurement quality
14      objectives (MQOs) and any other analytical process requirements contained in the analytical
15      protocol specifications (APSs) developed in the planning process. It may not be necessary in all
16      instances to validate all project data. This chapter outlines a validation plan that specifies the data
17      deliverables and data qualifiers to be assigned that will facilitate the data quality assessment. The
18      project-specific data validation plan should establish a protocol that prioritizes the data to be
19      validated. This is to eliminate unnecessarily strict requirements that commit scarce resources to
20      the in-depth evaluation of data points with high levels of acceptable uncertainty. For example,
21      results very much above or below an action level may not require rigorous validation, since
22      relatively large measurement uncertainty would not affect the ultimate decision or action.
23      Planners should also identify those samples or data sets that have  less rigorous standards for data
24      quality and defensibility.

25      This chapter presents suggested criteria to evaluate data and addresses the appropriate function
26      and limits of radiochemical techniques and measurements. Since calibration is more efficiently
27      evaluated as part of an audit, this chapter does not recommend that the complete calibration-
28      support documentation be  included as part of the data package. MARLAP recommends that
29      calibration be addressed in a Quality System and through an audit (Chapter 18), although
30     demonstration of calibration may be required as part of a project's deliverables. Detector
31      calibration, self-absorption curves, and efficiencies should be addressed as part of the evaluation
32     of laboratories during the procurement process and continued during subsequent assessments
33     (Chapter 7). Availability and retention of calibration records are decisions that are project-
34     specific, but should be clearly identified for contract clarity and to assure project completeness

35     (i.e., customer needs met). External sources of information, such as performance evaluation
36     sample results and internal laboratory control samples, provide useful interim information on
37     calibration status and accuracy.

38     8.2   Data Assessment Process

39     Figure 1.1 of Chapter 1 graphically depicts the three phases—planning, implementation, and
40     assessment—of the data life cycle, and the associated activities and products of each phase.
41     While these activities are addressed in separate chapters in MARLAP, it should be emphasized
42     that integration of planning, sampling,  and analysis with subsequent data verification, data
43     validation, and data quality assessment (DQA) is essential.

44     This section reviews the data life cycle from the perspective of the assessment phase and focuses
45     on those issues that have the potential to impact the quality and usability of the data. Section
46     8.2.1 addresses the development of the assessment procedures during project planning. Section
47     8.2.2 considers assessment needs for documentation and a quality system during implemen-
48     tation. Section 8.2.3 focuses on the assessment phase and addresses the interrelationship of the
49     three assessment processes. This introduction to the data life cycle process emphasizes the
50     importance of linkages among planning, implementation, and assessment.

51     8.2.1  Planning Phase of the Data Life Cycle

52     Directed project planning and the development of the associated DQOs, MQOs, and other
53     specifications for the project were reviewed in Chapters 2 and 3. These chapters emphasize the
54      need for planners to thoroughly define the assessment processes (i.e., verification, validation,
55      and data quality assessment) in sufficient detail that success or failure in meeting goals can be
56     determined upon project completion. MARLAP recommends that the assessment phase of a
57     project (verification, validation, and DQA processes) be designed during the directed planning
58     process and documented in the respective plans as part of the project plan documents. This
59     requires the project planning team to develop detailed procedures for data verification, data
60     validation, and data quality assessment, as well as identify the actual personnel who will perform
61     assessment or the required qualifications and expertise of the assessors.

62     The development of these procedures during the directed planning process will increase the
63     likelihood that the appropriate documentation will be available for assessment, and that those
64      generating and assessing data will be aware of how the data will be assessed. A secondary
65      advantage of developing assessment plans is that, prior to their completion, they often result in
66      the detection of design flaws (e.g., lack of proper quality control [QC] samples, lack of a field audit)

67     that, upon correction, will yield the complete information necessary for the proper assessment
68     of data usability.

69     The culmination of the planning process is documentation of the outputs of the directed planning
70     process in the project plan documents. The project plan documents should capture the DQOs,
71     MQOs, and the optimized data collection design (i.e., Analytical Protocol Specifications,
72     sampling and analysis plans, and SOPs). The project plans should also include the assessment
73     plans as discussed above, and describe the field, lab, safety, and QA activities in sufficient detail
74     that the project can be implemented as designed. Chapter 4 discusses guidance for the authoring
75     and content of project plan documents.

76     If the directed planning process, its outputs (DQOs, MQOs, optimized sampling and analysis
77     designs), and associated assumptions are not documented well in project plan documents, the
78     assessment phase will have difficulties evaluating the resulting data in terms of the project's
79     objectives.

80     8.2.2  Implementation Phase of the Data Life Cycle

81     The project plans are executed during the implementation phase. Ideally, the plans would be
82     implemented as designed, but due to errors, misunderstandings, the uncontrolled environments
83     under which sampling is implemented, and matrix-specific issues  that complicate sample
84     handling and analysis, most project plans are not implemented without some deviation.

85     Understanding the realities of implementation, the assessment process, in particular the DQA
86     process, will evaluate the project's implementation by considering: (a) if the plans were adequate
87     to meet the project's DQOs, (b) if the plans were implemented as  designed, and (c) if the plans as
88     implemented were adequate to meet the project DQOs. MARLAP recommends that project
89     objectives, implementation activities and QA/QC data be well documented in project plans,
90     reports, and records, since the success of the assessment phase is highly dependent upon the
91     availability of such information.

92     Documentation and record keeping during the planning and implementation phase of the data life
93     cycle are essential to subsequent data verification, data validation, and data quality assessment.
94     Thorough documentation will  allow for a determination of data quality and data usability.
95     Missing documentation can result in uncertainty, and a lack of critical documentation (e.g.,
96     critical quality control results) can result in unusable data. The quality and usability of data
97     cannot be assessed if the supporting documentation is not available.
 98     8.2.2.1 Project Objectives

 99     The DQOs, MQOs, and other specifications, requirements, and assumptions developed during
100     the planning phase will influence the outcomes during the subsequent implementation and
101     assessment phases of the data life cycle. It is important that these objectives, specifications,
102     requirements, and assumptions are well documented and available to those implementing the
103     program so they can make informed decisions. This documentation is reviewed during the DQA
104     process (see discussion of the review of DQOs in Section 9.6.1.1, sampling plan in
105     Section 9.6.2.1, and analysis plan in Section 9.6.3.1).

106     8.2.2.2 Documenting Project Activities

107     The assessment of data in terms of sampling and analytical MQOs requires an accurate record of
108     QC sample data and compliance with specifications and requirements. If these records are
109     missing or inadequate, then compliance with APSs, including the MQOs that were identified
110     during the planning phase, will not be ascertainable and will raise questions regarding quality.

111     Additional documentation is required to assess compliance with plans and contracts, and to
112     assess field and lab activities (e.g., compliance with SOPs) and the associated organizational
113     systems (e.g., laboratory Quality Manual). This information is gleaned from the review of field
114     and laboratory notebooks, deviation reports, chain-of-custody forms, verification reports, audit
115     reports, surveillance reports, performance evaluation sample analyses, corrective action reports
116     and reports to management that may identify deviations, contingencies, and quality problems.
117     Assessment of these types of contemporaneous records allows for the assessment of data in the
118     context of pertinent issues that may have arisen during project implementation.

119     Project records should be maintained for an agreed upon period of time, which should be
120     specified in project plan documents. Record maintenance should comply with all regulatory
121     requirements and parallel the useful life of the data for purposes of re-assessment as questions
122     arise or for purposes of secondary data uses that were not originally anticipated.

123     8.2.2.3 QA/QC

124     To ensure that the data collection activity generates data of known quality, it is essential that the
125     project plan documents specify the requirements for an appropriate quality system that is capable
126     of implementing the quality controls and the quality assurance necessary for success.
127     The quality system will oversee the implementation of QC samples, documentation of QC
128     sample compliance or non-compliance with MQOs, audits, surveillances, performance evaluation
129     sample analyses, corrective actions, quality improvement and reports to management. The
130     documentation generated by these quality assurance activities and their outputs during project
131     implementation will be a key basis for subsequent assessments and data usability decisions.

132     8.2.3   Assessment Phase of the Data Life Cycle

133     Assessment of environmental data currently consists of three separate and identifiable phases:
134     data verification, data validation, and DQA. Verification and validation pertain to evaluation of
135     analytical data. Verification and validation are considered as two separate processes, but as the
136     MARLAP-recommended planning process is implemented, they may be combined—with the
137     verification activities constituting the bulk of the review. DQA considers all sampling, analytical,
138     and data handling details, external QA assessments, and other historical project data to determine
139     the usability of data for decision-making.

140     Figure 8.1 is a graphical depiction of the assessment phase. Although it portrays a linear
141     progression through the various steps, and from verification and validation to data quality
142     assessment, strictly linear advancement is not necessary. It is possible for parallel progress
143     within an assessment process (e.g., existing documents are verified while waiting for the
144     production of others) and between assessment processes (e.g., analysis of the DQOs for data
145     quality assessment while data validation is being completed). Typically, the focus of verification
146     and validation is on the analytical process and on a data point by data point review, while data
147     quality assessment considers the entire data collection process and the entire data set as it
148     assesses data quality.

149     Analytical data verification assures that laboratory conditions and operations were compliant
150     with the SOW based on project plan documents. The updated project plan documents specify the
151     analytical protocols the laboratory should use to produce data of acceptable quality and the
152     content of the analytical data package (see MARLAP Process in Chapter 1). Verification
153     compares the analytical data package delivered by the laboratory to these requirements
154     (compliance), and checks for consistency and comparability of the data throughout the data
155     package, correctness of basic calculations, data for basic calculations, and completeness of the
156     results to ensure all necessary documentation is available. Verification can be accomplished
157     through use of a plan or simply a check list. The verification process produces a report
158     identifying which requirements are not met (i.e., exceptions qualified with an "E" to alert the
159     validator). The verification report is used to confirm laboratory compliance with the SOW and to
160     identify problems that should be investigated during data validation. Verification works



161      iteratively and interactively with the generator (i.e., laboratory) to assure receipt of all necessary
162      data. Although the verification process identifies specific problems, the primary function should
163      be to apply appropriate feedback to the lab, resulting in corrective action that improves the
164      analytical services before the project is completed.

[Figure 8.1 is a flowchart of the assessment process. Verification determines whether the
analytical system was in control (compliance with the SOW) and produces a Verification
Report; validation applies quantitative tests of detection and uncertainty applicable to the
sample matrix and produces a Validation Report. The focus of verification and validation is
typically on the analytical process and the individual datum. Data quality assessment then
reviews the DQOs, determines whether the samples are representative and the data accurate,
and determines whether a decision can be made (usability); its focus is on the entire data
collection process and the entire data set.]

                                  FIGURE 8.1 — The Assessment Process
165     Validation addresses the reliability of the data. The validation process begins with a review of the
166     verification report and laboratory data package to identify areas of strength and weakness.
167     This process involves the application of qualifiers that reflect the impact of not meeting the
168     MQOs. Validation then evaluates the data to determine the presence or absence of an analyte,
169     and the uncertainty of the measurement process. During validation, the technical reliability and
170     the degree of confidence in reported analytical data are considered. The data validator should be
171     a scientist with radiochemistry experience.
172     Validation flags (i.e., qualifiers) are applied to data that do not meet the performance acceptance
173     criteria established in the SOW and the project plan documents. The products of the validation
174     process are validated data and a validation report stating which data are acceptable, which data
175     are sufficiently inconsistent with the validation acceptance criteria in the expert opinion of the
176     validator, and a summary of the QC sample performance. The appropriate data validation tests
177     should be established during the project planning phase. The point of validation is to perform a
178     systematic check on a set of data being used to meet the project MQOs and any other analytical
179     process requirements. Documenting that such a check cannot be done is an appropriate and
180     essential validation activity. (For example, applying numerical tests to data already determined
181     to be unreliable is of no value.)

182     Data Quality Assessment is the last phase of the data collection process, and consists of a
183     scientific and statistical evaluation of project-wide knowledge to assess the usability of data sets.
184     To assess and document overall data quality and usability, the data quality assessor integrates the
185     data validation report, field information, assessment reports, and historical project data, and
186     compares the findings to the original project DQOs. The DQA process uses the combined
187     findings of these multi-disciplinary assessments to determine data usability for the intended
188     decisions, and to generate a report documenting that usability and the causes of any deficiencies.
189     It may be useful for a validator to work with the assessor to assure the value of the validation
190     process (e.g., appropriateness of rejection decision), and to make the process more efficient.
191     DQA will be covered in Chapter 9.

192     8.3 Validation Plan

193     The validation plan should integrate the contributions and requirements of all stakeholders and
194     present this information in a clear, concise format. To achieve this goal, validation planning
195     should be part of initial planning (e.g., directed planning process) to assure that the data will be
196     validated efficiently to determine its reliability and technical defensibility in an appropriate
197     context and to an appropriate degree.

198     The validation plan is an integral part of the project plan documents (Chapter 4), and should be
199     included as either a section within the plan or as a stand-alone document  attached as an  appendix.
200     The validation plan should be approved by an authorized representative of the project, the
201     validation group performing the validation, and any other stakeholder whose agreement is
202     needed.

203     The information and documentation identified in the validation plans should be communicated to
204     the laboratory as part of the SOW. Integration of validation plan specifications, contractual

205      requirements, and validator instructions/contracts is essential to ensure data collection process
206      efficiency. Implementation of the data validation plan will ensure that proper laboratory
207      procedures are followed and data are reported in a format useful for validation and assessment,
208      and will improve cost-effectiveness of the data collection process.

209      The data validation plan should contain the following information:

210       • A summary of the project that provides sufficient detail about the project's technical and
211         quality objectives in terms of sample and analyte lists, required measurement uncertainty,
212         and required detection limit and action level on a sample/analyte-specific basis. The
213         summary should specify the scope of validation, e.g., whether all the raw data will be
214         reviewed and in what detail (see Section 8.3.1).

215       • The necessary validation criteria, as derived from the MQOs, and performance
216         objectives deemed appropriate for achieving project objectives (see Section 8.3.2).

217       • Direction to the validator on what qualifiers are to be used and how final qualifiers are
218         assigned (see Section 8.3.3).

219       • Direction to the validator on the content of the validation report (see Section 8.3.4).

220      8.3.1   Technical and Quality Objectives of the Project

221      The identity of key analytes and how the sample results drive project decisions should be
222      specified in the validation plan. In addition, the plan should define the association of required
223      quality control samples with project environmental samples.

224      This section of the validation plan should specify the following:

225       • Quality control (QC) acceptance criteria;

226       • Level of measurement uncertainty considered unusually high and unacceptable (tests of
227         unusual uncertainty and rejection); and

228       • Action level and MQOs for detection and quantification capability (e.g., required detection
229         and quantification limit) (tests of detection).
230      The quality control acceptance criteria serve two purposes: (1) to establish if the analytical
231      process was in control; and (2) to determine if project requirements were met. If the analytical
232      process is in control, the assumption is that the analysis is performing within established
233      limits, indicating a reasonable match among matrix, analyte, and method. Generally this means that
234      routine data quality expectations are appropriate. The tests of unusual uncertainty (i.e., analysis
235      not in control) should verify that the data meet the statistical confidence limits for uncertainty
236      associated with the planning process. During validation, the uncertainty associated with sampling
237      cannot be estimated. The tests of detection determine the presence or absence of analytes.
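
A minimal sketch of these three test types is given below, assuming simple numerical forms:
two-sided control limits for the QC acceptance test, a required method uncertainty (u_MR)
for the test of unusual uncertainty, and a sample-specific critical value for the test of
detection. The function names and limit values are illustrative; actual limits are derived
from the project's MQOs and documented in the validation plan.

    # Minimal sketch of the three test types described above. Limit values and
    # function names are illustrative; projects derive them from the MQOs.

    def qc_in_control(qc_result, lower_limit, upper_limit):
        """QC acceptance test: result falls within established control limits."""
        return lower_limit <= qc_result <= upper_limit

    def unusually_uncertain(combined_std_uncertainty, required_method_uncertainty):
        """Test of unusual uncertainty: reported u exceeds the project's u_MR."""
        return combined_std_uncertainty > required_method_uncertainty

    def detected(result, critical_value):
        """Test of detection: result exceeds the sample-specific critical value."""
        return result > critical_value

    # Example: a result of 0.12 +/- 0.05 Bq/g against a critical value of
    # 0.08 Bq/g and a required method uncertainty of 0.04 Bq/g would be
    # reported as detected but flagged for unusual uncertainty.
    assert detected(0.12, 0.08) and unusually_uncertain(0.05, 0.04)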

238      8.3.2  Validation Tests

239      Validating data requires three specific decisions that will allow the validator to qualify the data.
240      The project planning team should determine:

241       •  Which QC samples should be employed and how do they relate to the environmental
242         samples?

243       •  Which validation tests are appropriate?

244       •  What validation limits should be used for the specific tests?

245      The answers to these questions are driven by the need to know whether the data meet the MQOs
246      for the project, and the allocation of resources between planning and implementation (i.e.,
247      conservative review may cost more than the real or perceived value added to the decision). This
248      section of the validation  plan should address the following:

249       •  QC sample validation criteria;

250       •  Specific validation tests to be used; and

251       •  Statistical confidence intervals or fixed limit intervals applied to each of the validation tests
252         and criteria based on the MQOs for the project (Appendix  C).

253      8.3.3  Data Qualifiers

254      Data qualifiers are codes placed on an analytical result that alert data users to the validator's or
255      verifier's concern about the result. This section of the validation plan should outline:
256      •  The basis for rejection or qualification of data; and
257      •  The qualification codes that will be assigned.

258     These issues are discussed in detail in Section 8.5, which provides guidance for assigning data
259     qualifiers.

260     The verification process uses a qualifier (E) to alert the validator to non-compliance, including
261     missing documentation, contract compliance, etc. This qualifier may be removed or replaced
262     during validation, based on the validator's interpretation of the effect of the non-compliance on
263     the data's integrity.

264         E  A notice to the validator that something was noncompliant.

265     The validation process uses the qualifiers listed below to identify data points that do not meet the
266     project MQOs or other analytical process requirements listed in the SOW or appropriate project
267     plan document. The assignment of the J and R qualifiers relies heavily on the judgement and
268     expertise of the reviewer and therefore, these qualifiers should be assigned as appropriate at the
269     end of data validation.

270         U  A normal, not detected (< critical value) result.

272         Q  A reported combined standard uncertainty that exceeds the project's required method
273            uncertainty.

274         J   An unusually uncertain or estimated result.

275         R  A rejected result: the problems (quantitative and/or qualitative) are so severe that the data
276            cannot be used.

277     The data validator should be aware that a data qualifier or a set of qualifiers does not apply to all
278     similar data. The data validator should incorporate the project MQOs into the testing and
279     qualifying decision-making process. During the data validation process the data validator may
280     use additional qualifiers based on QC sample results and acceptance criteria. These qualifiers
281     may be summarized as U, J, R or Q in the final validation report. The final validation reports
282     should also include a summary of QC sample performance for use by the data assessor.
283        S  A result with a related spike result (laboratory control sample [LCS], matrix spike [MS],
284            or matrix spike duplicate [MSD]) that is outside the control limit for recovery (%R);
285            S+ or S- is used to indicate high or low recovery.

286        P  A result with an associated replicate result that exceeds the control limit.

287        B  A result with an associated blank result that is outside the control limit; B+ or B- indicates high or low.
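
The sketch below shows one way the QC-sample qualifiers above might be assigned
programmatically. The recovery formula is the conventional one; the control limits are
placeholder assumptions, since actual acceptance criteria are established in the
validation plan.

    # Hedged sketch of assigning the S, P, and B qualifiers defined above.
    # All control limits are placeholder assumptions; a project's validation
    # plan establishes the real acceptance criteria.

    def percent_recovery(spiked_result, unspiked_result, spike_added):
        """Conventional spike recovery: %R = 100 * (spiked - unspiked) / added."""
        return 100.0 * (spiked_result - unspiked_result) / spike_added

    def spike_qualifier(pct_r, low=75.0, high=125.0):
        """S+ / S- when recovery is outside the (assumed) control limits."""
        return "S+" if pct_r > high else "S-" if pct_r < low else None

    def replicate_qualifier(result_1, result_2, control_limit):
        """P when the difference between replicates exceeds the control limit."""
        return "P" if abs(result_1 - result_2) > control_limit else None

    def blank_qualifier(blank_result, low_limit, high_limit):
        """B+ / B- when a blank result is outside the control limits."""
        return ("B+" if blank_result > high_limit
                else "B-" if blank_result < low_limit else None)

    print(spike_qualifier(percent_recovery(9.8, 1.1, 10.0)))  # 87% -> None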

288     8.3.4  Reporting and Documentation

289     The purpose of this section is to define the format and program needs for validation reports and
290     supporting documentation. This section should include:

291      • Documentation and records that should be included in a validation report;

292      • Disposition requirements for records and documents from the project;

293      • Report format, i.e., a summary table with results, uncertainties  and qualifiers; and

294      • Procedures for non-conformance reporting, which detail the means by which the laboratory
295        communicates non-conformances against the validation plan. The procedures should include
296        all instances where the analytical data requirements and validation requirements established
297        by the planning process and  validation plan, respectively, cannot be met due to sample matrix
298        problems and/or unanticipated laboratory issues (loss of critical personnel or equipment).

299     Detailed information about the Validation Report is presented in Section 8.6.

300     8.4    Other Essential Elements

301     Effective data validation is dependent on:

302      • A SOW and project plan  documents that clearly define the data needs and the data quality
303        requirements (i.e., MQOs); and

304      • A data package that has been verified for completeness, consistency, compliance, and
305        correctness.

306     8.4.1  Statement of Work

307     The analytical services procurement options should be considered during the planning process.
308     The SOW should specify the QC requirements that will be evaluated by the validator (see
309     Chapter 5). The elements that should be specified include, but are not limited to:

310       •  External performance evaluation (PE) participation and acceptance criteria;
311       •  Replicate sample frequency and acceptance criteria;
312       •  LCS and acceptance criteria;
313       •  Blank requirements and acceptance criteria;
314       •  MS and MSD samples and acceptance criteria;
315       •  Uncertainty calculations; and
316       •  Sample result equations and calculations including corrections for yield, percent moisture,
317         efficiencies and blank, if applied.

318     Section 8.5.2 provides guidance on evaluating QC sample results based on the project's MQO for
319     measurement uncertainty.

320     8.4.2  Verified Data Deliverables

321     Verification compares the sample receipt information and the sample report delivered by the
322     laboratory against the SOW and produces a report that identifies those requirements that were not
323     met (called exceptions). Verification can be accomplished using a plan or checklist, which
324     doesn't necessarily need to be project-specific. Verification exceptions normally identify:

325       •  Required steps not carried out by the laboratory (e.g., correction for yield, proper signatures);

326       •  Method QC not conducted at the required frequency (e.g., blanks, duplicates); and

327       •  Method QC not meeting pre-set acceptance criteria (e.g., non-compliant laboratory control
328         sample analysis).

329     The verifier checks the data package (paper or electronic) for completeness, consistency,
330     correctness, and compliance.  Completeness means all required information is present.
331     Consistency means values are the same when reported redundantly on different reports, or
332     transcribed from one report to another. Correctness means the reported results are based on
333     properly documented and correctly applied algorithms. Compliance means the data pass
334     numerical QC tests based on parameters or limits derived from the MQOs specified in the SOW.
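
These four checks can be pictured as in the sketch below, applied to a data package
represented as a dictionary. The field names and tolerances are hypothetical; an actual
verification checklist is derived from the SOW.

    # Illustrative sketch of the four verification checks, with hypothetical
    # field names and tolerances; an actual checklist comes from the SOW.

    REQUIRED_FIELDS = ("sample_id", "matrix", "method", "result",
                       "uncertainty", "collection_date", "analysis_date")

    def completeness(package):
        """All required information is present; returns missing fields."""
        return [f for f in REQUIRED_FIELDS if f not in package]

    def consistent(value_a, value_b, tol=1e-6):
        """Redundantly reported values agree across different reports."""
        return abs(value_a - value_b) <= tol

    def correct(reported, recomputed, rel_tol=0.01):
        """Reported result matches a recomputation of the documented algorithm."""
        return abs(reported - recomputed) <= rel_tol * abs(recomputed)

    def compliant(qc_value, low, high):
        """Numerical QC test against limits derived from the MQOs in the SOW."""
        return low <= qc_value <= high

    exceptions = completeness({"sample_id": "S-001", "result": 1.53})
    # Each missing field would be flagged with an "E" qualifier for the validator.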
335     The verifier should provide, within the verification package, checklists for contract or SOW
336     specifications, noted deficiencies related to contract compliance, noted discrepancies or obvious
337     quality related problems, and pertinent external QC results. The verification package notes the
338     deficiencies, discrepancies, and quality-related problems that could not be resolved with the
339     laboratory. The validator should take this information into consideration during the data
340     validation process.

341     8.5    Data Verification and Validation Process

342     In its most basic form, data validation focuses on the reliability of each data point. After each
343     point is evaluated, summary conclusions concerning the validity of groups of data (sets) are
344     drawn and finally, after the reliability of all data sets has been established, an overall conclusion
345     about the quality and defensibility of a project's analytical database is reached (DQA).

346     The first step in establishing the reliability of an analytical measurement is to determine that the
347     analytical process used in making the measurement is in control. That is, the sample handling
348     and analysis system is performing within an accepted operating range (established by instrument
349     manufacturer, method, or contract specifications and/or long-term historical laboratory
350     performance). After it has been determined that the analytical process is in control, it is
351     necessary to demonstrate that the sample is responding as expected when introduced into the
352     measurement system.

353     The measurement process includes devices such as detectors for measuring radioactive decay and
354     balances for determining the mass of materials. The measurement process also includes the
355     software that takes the output from the measurement device and calculates the result as a quantity
356     of target radionuclide (activity/mass or activity/volume). The measurement process performance
357     normally is specified by the SOW and appropriate project plan documents, and monitored by
358     routine laboratory quality control procedures. The verification process uses these requirements
359     to determine laboratory performance.

361     When an environmental sample is analyzed, new sources of variability are encountered in
362     addition to those associated with the measurement process. These sources include laboratory
363     subsampling, sample preparation (e.g., digestion, leaching, etc.), sample matrix effects, and data
364     transcription, to list a few. These processes, taken together with the previously discussed
365     measurement process, comprise the analytical process.
366     The performance of the analysis can be predicted based on previous experience with similar
367     materials. Analysis performance is monitored by laboratory quality control procedures specified
368     in the SOW and appropriate project plan documents. Unlike the measurement process
369     performance, the overall performance of the analysis is not amenable to assessment by the data verification
370     process. Since each sample matrix, analyte, and method set is unique, the evaluation of overall
371     analysis performance and resulting data is the role of a knowledgeable validator.

372     Using the validation plan, which specifies QC samples, validation tests, and validation limits,
373     validation occurs in four stages:

374       •  Determine whether the sample handling and analysis system is in control (Section 8.5.1);
375       •  Determine whether quality control sample analyses meet specified MQOs (Section 8.5.2);
376       •  Apply validation tests of detection and unusual uncertainty (Section 8.5.3); and
377       •  Determine final data qualifiers and document the results (Section 8.5.4).

378     For other chemistry methods, identification of the analyte is also a primary decision. Except for
379     gamma spectroscopy, this is rarely an issue in radiochemistry. For radiochemistry, the
380     laboratory's ability to make reliable identifications is best checked by auditors and verified
381     through calibration check samples.

382     8.5.1  The Sample Handling and Analysis System

383     As described in earlier sections of this guidance, it is necessary to know the extent to which the
384     data delivered for validation meet the requirements of the SOW and appropriate project plan
385     documents. These documents normally specify the minimum acceptable performance of the
386     analytical process. These specifications are the basis of the tests of quality control (QC tests) that
387     establish that the sample handling and analysis system is in control at the time the analyses were
388     performed. It is also necessary to know that all reporting requirements are complete. Normally,
389     this evaluation against the requirements is made during the data verification process. If the data
390     do not conform to the requirements, notification should be provided in the verification report.

391     The review of the verification package (and data package) by the validator determines if
392     sufficient information is provided to proceed with data validation. The outcome of the
393     verification process is the designation of exceptions to the quality control tests. These exceptions
394     should be flagged with a qualifier (to be re-evaluated by the validator), which is appended to a
395     data or report requirement that does not meet specifications to alert the validator to potential problems.
396     The validator should then determine if sufficient reliable data are available to proceed with
397      validation. The validator should use the data requirements and criteria developed in the


398     validation plan to determine if the quality control exceptions have an adverse impact on one or
399     more of the data points being validated.

400     Rarely, if ever, should quality control exceptions result in the decision to reject a complete data
401     set. Those types of situations should have been detected by the laboratory during the analytical
402     process and the samples reanalyzed. The validator should not reject (assign an "R" code) single
403     data points based on a single QC test exception. Normally, only numerous QC exceptions and
404     failures in one or more of the tests of detection and uncertainty are sufficient reason to reject
405     data. The validation report should fully explain the assignment of all qualifiers as previously
406     discussed.

407     The following paragraphs discuss some of the more important evaluations that should be applied
408     to the sample handling and analysis system. Limited guidance is provided on how the QC test
409     may impact data quality and defensibility.

410     8.5.1.1 Sample Descriptors

411     Sample descriptors include sample identification number, analytical method, analyte,  and matrix,
412     among others.

413     Criteria. Each sample should have a unique identifier code that can be cross-referenced to a
414     unique field sample or an internally generated laboratory sample. This unique identifier and
415     associated sample descriptors should be included in all analytical reports to properly document
416     the sample and requested analysis (Chapters 10 and 11).

417     The matrix and other characteristics of the sample that affect method selection and performance
418     should be clearly identified.  The method(s) used in sample preparation and analysis should be
419     identified.

420     If laboratory replicate analyses are reported for a sample, they should be distinguishable by a
421     laboratory-assigned code.

422     Verification. Each of the criteria related to describing the sample should be checked for and
423     found in the analytical data package. If any of the criteria are missing, they should be  flagged
424     with an "E" code.

425     Validation. Missing information will increase the uncertainty on any result reported on a
426     sample(s) and justify the assignment of a "J" code. Missing information may be inferred from

427     other information in the data package and eliminate the added uncertainty. For example, if the
428     sample matrix is not provided, it may be inferred from:

429      •  The aliquant units are expressed in units of mass or volume;
430      •  The sample preparation method is specific for soils;
431      •  The final results are expressed in units of mass; and
432      •  The sampling report describes sampling soil.

433     The majority of related information should support the decision that the exception does not
434     increase the uncertainty of the result. If the supporting information is incomplete or conflicting,
435     the assignment of a "J" code to data points is warranted. If documentation is inadequate to
436     support the reporting of a data point, the data point should be qualified with an "R" code.

437     8.5.1.2 Aliquant Size

438     Criteria. The aliquant or sample size used for analysis should be documented so that it can be
439     checked when reviewing calculations, examining dilution factors, or analyzing any data that
440     requires the aliquant size as an input. It is also imperative that the appropriate unit (liter, kilogram, etc.) is
441     assigned to the aliquant.

442     Verification. The criteria related to describing the sample aliquant should be  checked for and
443     found  in the analytical data package. If the aliquant size is missing, it should be flagged with an
444     "E" code.

445     Validation. The missing  information will increase the uncertainty on any result reported on a
446     sample(s) and justify the assignment of a "J" code.

447     8.5.1.3 Dates of Sample Collection, Preparation, and Analysis

448     Criteria. The analytical data package should report date of sampling, preparation, and analysis.
449     These data are used to calculate radiological holding times, some of which may be specified in
450     the sampling and analysis plan.

451     There are few circumstances where radiological holding times are significant for radionuclides.
452     The best approach to minimize the impact of holding time on analysis is  to analyze the samples
453     as quickly as possible. Holding times may be applied to environmental samples that contain
454     radionuclides with short half-lives. Holding times would apply to these radionuclides to prevent
455     reporting of high measurement uncertainties and MDCs, and to detect the radionuclide, if present
456     at low concentration, before it decays to undetectable levels.

457     Verification. Each of the criteria related to sample holding time should be checked for and found
458     in the analytical data package. If any of the criteria are missing, they should be flagged with
459     an "E" code.

460     If a holding time is specified in the project plan documents or validation plan, the reported values
461     should be compared to this specification. If the holding time is exceeded, the affected criterion
462     (holding time) should be flagged with an "E" code.

463     Validation. The data points impacted by the missed holding time should be flagged with a "J"
464     code by the validator, or the justification for discounting the holding-time impact should be
465     described in the narrative section of the validation report.

466     8.5.1.4 Preservation

467     Criteria. Appropriate preservation is dependent upon analyte and matrix, and should be defined
468     in sampling and analysis documentation. Generally, preservation is applied to samples being
469     analyzed for radionuclides to prevent precipitation, adsorption to container walls, etc. The criteria
470     (required presence or absence) for this QC process should be provided in the sampling and
471     analysis plan (see Chapter 10).

472     Verification. The criteria related to preservation should be checked for and found in the
473     analytical data package. If any of the criteria are missing, they should be flagged with an "E"
474     code.

475     Validation. If exceptions to  the preservation criteria are noted, the validator should decide if a
476     "J" code should be assigned  to data points because the improper preservation increased the
477     overall uncertainty in the data point(s). In some cases where improper preservation severely
478     impacts data quality or defensibility (e.g., the use of acid preservation in water samples being
479     analyzed for 14C), the validator should assign an "R" qualifier. The assessor may elect to use the
480     data but then has the responsibility of addressing the data quality and defensibility in the
481     assessment report.

482      8.5.1.5 Tracking

483      Criteria. Each analytical result should be traceable to the instrument or detector on which it was
484      counted. The requirement for this traceability normally is found in the project plan documents.
485      The analytical sequence log (or some other suitable record) should be available in the data
486      package submitted by the laboratory.

487      Verification. If any of the analytical data are not traceable to the instrument or detector, they
488      should be flagged with an "E" code.

489      Validation. The validator may factor the absence of traceability into the evaluation of data
490      quality and usability. At most, this should result in increasing the uncertainty of the
491      determination and the possible assignment of a "J" code to the data. This would not occur
492      normally unless one or more of the detectors used in analyzing the samples was shown to be
493      unreliable. Then, the inability to trace a reliable detector to a sample increases the uncertainty of
494      the data point(s).

495      8.5.1.6 Traceability

496      Criteria. The traceability of standards and reference materials to be used during the analysis
497      should be specified in the sampling and analysis plan.

498      Verification. The source of the reference materials and standards should be checked for and
499      found or referenced in the analytical data package. If any of the sources are missing, they should
500      be flagged with an "E" code.

501      Validation. The validator may factor the absence of traceability into the evaluation of data
502      quality and usability. At most, this should result in increasing the uncertainty of the
503      determination and the possible assignment of a "J" code to the data. This would not occur
504      normally unless one or more of the standards used in analyzing the samples was shown to be
505      unreliable. Then, the inability to trace a reliable standard to a sample increases the uncertainty of
506      the data point(s).

507      8.5.1.7 QC Types and Linkages

508      Criteria. The type and quantity of QC samples should be identified and listed in the SOW, and
509      the results provided by the laboratory in a summary report. Replicates and matrix spike results
510     should be linked to the original sample results. The approximate level of matrix spike
511      concentrations should be specified in the SOW, but the actual levels should be reported by the
512      laboratory. The QC analyses should be traceable to the original field sample.

513      Verification. Each of the criteria related to the QC samples should be checked for and found in
514      the analytical data package. If any of the criteria are missing, they should be flagged with
515      an "E" code.

516      Validation. The validator should compare any QC sample exceptions to similar QC samples that
517      precede and follow the non-conforming QC sample. If these are in control, the validator can
518      discount the impact of the single QC sample exception on the data results (i.e., an isolated
519      analytical blunder). If a trend of failing values is found, the validator should consider whether
520      the failures affected a group of data points to the extent that the level of uncertainty was
521      increased. This may warrant the assignment of a "J" code to the data.

522     8.5.1.8 Chemical Separation (Yield)

523      Criteria. Yield assesses the effects of the sample matrix and the chemical separation steps on the
524      analytical result and estimates the analyte loss throughout the total analytical process. Yield is
525      typically measured gravimetrically (via a carrier) or radiometrically (via a tracer). All the
526      components in  the calculation of the yield should be identified in a defined sequence. These
527      specifications are found in the project plan documents.

528     Criteria for both the analytical process and sample analysis may be given in the project plan
529     documents. The criteria should be based on historical data for the method and matrix. In that
530     case, yield is determined for both quality control samples and actual field samples.

531      The most important yield-related question is whether the  yield has been determined accurately.
532      Typically, a yield estimate that is much greater than 100 percent cannot be accurate, but the
533      estimate may also be questionable if the yield is far outside its historical range. Extremely low
534      yields also tend to have large measurement uncertainties, which increase the uncertainties of the
535      results. The uncertainties of factors such as the yield, counting efficiency,  and aliquant volume,
536      which affect the sensitivity of the measurement, should be kept relatively small.

537      Verification. Each of the yield-related criteria pertaining to the sample should be checked for
538     and found in the analytical data package. If missing, the data should be returned to the
539     laboratory to be corrected for yield.

540     Validation. The experimentally determined yield is used to normalize the observed sample
541     results to 100% yield. Yield values outside the range specified in the project plan
542     documents may result in the validator assigning a "J" qualifier to otherwise acceptable data.
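
As an illustration of this normalization, the following minimal sketch (Python) applies the yield correction and the range check; the numeric values and acceptance range are assumed examples, since actual limits come from the project plan documents.

    def yield_corrected_result(observed, chemical_yield):
        """Normalize an observed result to 100 percent yield."""
        return observed / chemical_yield

    def yield_qualifier(chemical_yield, low, high):
        """Return "J" when the yield falls outside the plan-specified range."""
        return "J" if not (low <= chemical_yield <= high) else ""

    # Example: 0.42 Bq observed at 70 percent yield is reported as 0.60 Bq,
    # and a 70 percent yield within a 30-110 percent range draws no qualifier.
    print(yield_corrected_result(0.42, 0.70))
    print(yield_qualifier(0.70, 0.30, 1.10))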

543     8.5.1.9 Self-Absorption (Residue)

544     Criteria. For some radiochemical analytical methods, the SOW may specify the generation of a
545     self-absorption curve, which correlates mass of sample deposited in a known geometry to
546     efficiency.

547     Verification. Each self-absorption curve called for in the SOW should be checked for and found
548     in the analytical data package. If missing, they should be flagged with an "E" code.

549     Validation. If required self-absorption curves are missing, the validator may elect to qualify
550     affected data with a "J" qualifier to signify an increased level of uncertainty in the measurement
551     because of the inability to correct the measured value for self-absorption.

552     8.5.1.10 Efficiency, Calibration Curves, and Instrument Background

553     Criteria. For some methods based on decay emission counting, efficiency is reported as count
554     rate divided by disintegration rate. Methods employing radiotracers determine a sample-specific
555     effective efficiency factor that is a product of the chemical yield and the detector efficiency. These
556     criteria may be specified in the SOW. Instrument background count rate is determined for each
557     detector for each region of interest and subtracted from the sample count rate.
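
For illustration, a minimal sketch of these two calculations (Python) follows; the count rates and times are assumed example values.

    def detector_efficiency(net_count_rate_cps, disintegration_rate_dps):
        """Counting efficiency: observed net count rate divided by the known
        disintegration rate of the calibration source."""
        return net_count_rate_cps / disintegration_rate_dps

    def net_count_rate(gross_counts, count_time_s, bkg_counts, bkg_time_s):
        """Sample count rate with the instrument background subtracted."""
        return gross_counts / count_time_s - bkg_counts / bkg_time_s

    # Example: 1800 net cps from a 6000 dps source gives 30 percent efficiency;
    # 5200 gross counts and 1300 background counts, each over 60000 s,
    # give a net count rate of 0.065 cps.
    print(detector_efficiency(1800.0, 6000.0))
    print(net_count_rate(5200, 60000.0, 1300, 60000.0))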

558     Verification. Each efficiency determination, efficiency calibration curve, and instrument
559     background called for in the project plan documents should be checked for and found in the
560     analytical data package. If missing, they should be flagged with an "E" code.

561     Validation. If required factors are missing, the validator may elect to qualify affected data with
562     a "J" qualifier to signify an increased level of uncertainty in the measurement because of the
563     inability to correct the measured value for efficiency.

564     8.5.1.11 Spectrometry Resolution

565     Criteria. The measured resolution of alpha, gamma-ray, and liquid scintillation spectrometers, in
566     terms of the full width of a peak at half maximum (FWHM), can be used to assess the adequacy
567     of instrument setup, detector selectivity, and chemical separation technique that may affect the
568      identification and quantification of the analyte. When sufficient peak definition (i.e., a sufficient
569      number of counts to provide an adequate Gaussian peak shape) has been reached for a sample,
570      the resolution of the analyte peak should be evaluated to determine if proper peak identification
571      and separation or deconvolution was made. Spectral information should be provided in the data
572      packages to accomplish this evaluation.
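
For a well-formed Gaussian peak, the FWHM is related to the peak's standard deviation by a fixed factor, as the following minimal sketch (Python) illustrates; the sigma value is an assumed example.

    import math

    def fwhm_from_sigma(sigma):
        """FWHM of a Gaussian peak: 2 * sqrt(2 * ln 2) * sigma, about 2.355 sigma."""
        return 2.0 * math.sqrt(2.0 * math.log(2.0)) * sigma

    # Example: a gamma-ray peak with sigma = 0.85 keV has a FWHM of about 2.0 keV.
    print(fwhm_from_sigma(0.85))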

573      Verification. There are no established acceptance criteria, but the spectral information should be
574      provided in the data package or be available for audit.

575      Validation. If required calculations are missing, the validator may elect to qualify affected data
576      with a "J"  qualifier to signify an increased level of uncertainty in the measurement because of the
577      inability to evaluate instrument setup and separation technique. An "R" code may be applied if
578      there is no separation.

579      8.5.1.12 Dilution and Correction Factors

580      Criteria. Samples for radiochemistry are usually not diluted, but a larger sample may be
581      digested, with an aliquant taken for analysis to obtain a more representative subsample. The dilution
582      factors are normally used for tracers and carriers. Dilutions of the stock standards are prepared
583      and added to the samples. This dilution normally affects yield calculations, laboratory control
584      samples, and matrix spikes. These data should be provided in the data package so that the final
585      calculations of all data affected by dilution factors can be recalculated and confirmed, if required.

586      Other correction factors that may be applied to the data are dry weight correction, ashed weight
587      correction, and correction for a two-phased sample analyzed as separate phases.

588      Verification. Each dilution and correction factor affecting the sample should be checked for and
589      found in the analytical data package. If any of the factors are missing, they should be flagged
590      with an "E" code.

591      Validation. Those results impacted by missing dilution factors should be flagged with a "J" or
592      "R" qualifier, reflecting increased uncertainty in the data point(s). "R" may be warranted if the
593      calculation cannot be confirmed due to missing data.

594      8.5.1.13 Counts and Count Time (Duration)

595      Criteria. The count time for each sample, QC analysis, and instrument background should be
596      recorded in the data package. The ability to detect radionuclide disintegrations is directly related
597      to the count time. The longer the count time, the lower the detection limit. The project plan
598      documents should specify the MQOs, which will drive the count time for each analyte.
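
The connection between count time and measurement uncertainty follows from Poisson counting statistics, as the following minimal sketch (Python) illustrates; the count rate and times are assumed examples.

    import math

    def relative_counting_uncertainty(count_rate_cps, count_time_s):
        """Relative (1-sigma) Poisson counting uncertainty, 1/sqrt(N), where
        N = count rate x count time; longer counts give smaller uncertainty."""
        counts = count_rate_cps * count_time_s
        return 1.0 / math.sqrt(counts)

    # Example: 0.05 cps counted for 1000 s (50 counts) gives about 14 percent,
    # while the same source counted for 10000 s (500 counts) gives about
    # 4.5 percent.
    print(relative_counting_uncertainty(0.05, 1000.0))
    print(relative_counting_uncertainty(0.05, 10000.0))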

599      Verification. Each count time relating to the sample analysis should be checked for and found in
600      the analytical data package. If any of the criteria are missing, they should be flagged with
601      an "E" code.

602      Validation. The validator should estimate the impact of the actual count times on the ability to
603      detect the target analyte and the impact on the uncertainty of the measurement. If the MQOs are
604      met, the sample should not be qualified for count time. It should be noted that a preset count
605      determination, rather than a preset count time, will result in the same relative counting
606      uncertainty for all the samples. The qualifiers should be adjusted accordingly and the
607      justification provided in the validation report.

608      8.5.1.14 Result of Measurement, Uncertainty, Minimum Detectable Concentration, and Units

609      Criteria. MARLAP recommends that the result of each measurement, its expanded measurement
610      uncertainty, and the estimated sample- or analyte-specific MDC be reported for each sample in
611      the appropriate units.  These values, when compared with each other, provide information about
612      programmatic problems with the calculations, interference of other substances, and bias. The
613      report should state the coverage factor used in calculating expanded measurement uncertainties,
614      and the Type I and Type II error probabilities used to calculate MDCs.

615      Verification. The linkage between the result, measurement uncertainties, MDC, and the sample
616      identification should be checked. If linkage is not evident, data should be flagged with an "E"
617      code.

618      Validation. The validator should assign data qualifiers to those data points for which they feel
619      sufficient justification exists. Each qualifier should be discussed in the validation report.

620      8.5.2  Quality Control Samples

621      Historically, data validation has placed a strong emphasis on review of QC sample data
622      (laboratory control samples, duplicates, etc.). The assumption is that if the analytical process was
623      in control and the QC samples responded properly, then the environmental samples (field
624      samples plus the preparation sequences used to prepare the sample for analysis) would respond
625      properly. It is possible to have excellent performance on simple matrices (e.g., quality control
626      samples), but unacceptable performance on complex matrices (i.e., environmental) reported in
627      the same batch as the QC samples. Directly evaluating the environmental sample performance is
628      essential to determine measurement uncertainty and the likelihood of false positive and negative
629      detection of the target analyte.

630      Method blanks and laboratory control samples relate to the quality control function for the
631      analytical batch (a series of similar samples prepared and analyzed together as a group). They
632      are required by most analytical service contracts, sampling and analysis plans, and project plan
633      documents. They serve a useful function as monitoring tools that track the continuing analytical
634      process during extended analytical sequences. They are the most nearly ideal samples analyzed
635      as part of a project. Normally, their performance is compared to fixed limits derived from
636      historical performance or, additionally, to project-specific limits derived from the MQOs.

637      Laboratory duplicates and matrix spikes are quality control samples that directly monitor sample
638      system performance. Laboratory duplicates (two equal-sized samples of the material being
639      analyzed, prepared and analyzed separately as part of the same batch) measure the overall
640      precision of the sample measurement process beginning with laboratory subsampling of the field
641      sample. Matrix spikes (a known amount of target analyte added to the environmental sample)
642      provide a direct measure of how the target analyte responds when  the environmental sample is
643      prepared and measured, thereby estimating the bias introduced by  the sample matrix.

644      Other QC tests can be applied to determine how the analytical process performs during the
645      analysis of environmental samples. These are yield/recovery, efficiency, self-absorption,
646      resolution, and drift. They are the same QC tests that were applied to routine QC samples (blanks
647      and laboratory control samples) in the previous discussion of the analytical process, but now are
648      applied to environmental samples. The difference lies in how performance is measured. Fixed
649      limits based on historical performance and/or statistics are usually the basis for evaluating the
650      results of routine QC samples.

651      The following paragraphs discuss how QC tests should be used to determine if the results for QC
652      samples meet the project MQOs. Guidance is provided on how to  relate QC sample and
653      environmental sample performance to  determine environmental sample data quality and
654      defensibility. Direction is also given about how to assign data qualifiers to environmental sample
655      data based on the tests of quality control. Appendix C provides guidance on developing criteria
656      for evaluating QC sample results. Specifically, Appendix C contains equations that allow for the
657     determination of warning and control limits for QC sample results based on the project's MQO
658     for measurement uncertainty.

659     8.5.2.1 Method Blank

660     The method blank (Section 18.4.1) is generated by carrying all reagents and added materials
661     normally used to prepare an environmental sample through the same preparation process. It
662     establishes how much, if any, of the measured analyte is contributed by the reagents and
663     equipment used in the preparation process. For an ideal system, there will be no detected
664     concentration or activity.

665     Since measured results are usually corrected for instrument and reagent background levels, it is
666     possible to obtain final results that are less than zero. A method blank result that is much less
667     than zero may indicate that the correction term is too large and therefore analyte concentrations
668     in actual samples may be underestimated.

669     Criteria. The requirement for a method blank is usually established in the SOW and appropriate
670     plan documents. The objective is to establish the target analyte concentration or activity
671     introduced by the sample preparation sequence. Method blanks are normally analyzed once per
672     analytical batch.

673     Other types of blanks, such as field blanks and trip blanks, are used to evaluate aspects of the
674     data collection effort and laboratory operations that are not directly related to the validation of
675     environmental analytical data quality or technical defensibility. They can be important to the
676     overall data assessment effort, but are beyond the scope of this guidance (Chapter 10).

677     See Appendix C for guidance on developing criteria for evaluating blanks based on the project's
678     MQO for method uncertainty.

679     Verification. If a method blank was required but not performed, or if the required data is
680     missing, the verifier flags the missing information with an "E" code.

681     Validation. If a blank result does not comply with the established criteria, the associated samples
682     are flagged "B+" to indicate that the blank result is greater than the upper limit, or "B-" to
683     indicate that the blank result is less than the lower limit.
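
A minimal sketch of this flagging logic (Python) follows; the blank result and limits are assumed examples, since actual limits are developed from Appendix C or the validation plan.

    def blank_qualifier(blank_result, lower_limit, upper_limit):
        """Qualify samples associated with a method blank: "B+" when the blank
        exceeds the upper limit, "B-" when it falls below the lower limit."""
        if blank_result > upper_limit:
            return "B+"
        if blank_result < lower_limit:
            return "B-"
        return ""

    # Example: a blank of -0.8 Bq/L against limits of -0.5 and +0.5 Bq/L is
    # flagged "B-", suggesting the background correction term may be too large.
    print(blank_qualifier(-0.8, -0.5, 0.5))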

684     8.5.2.2 Laboratory Control Samples

685     The laboratory control sample (LCS) is a QC sample of known composition or an artificial
686     sample (created by spiking a clean material similar in nature to the environmental sample), which
687     is prepared and analyzed in the same manner as the environmental sample. In an ideal situation,
688     the LCS would give 100 percent of the concentration or activity known to be present in the
689     fortified sample or standard material. Acceptance criteria for the LCS are based on the
690     complexity of the matrix and the historical capability of the laboratory and method to recover the
691     activity. The result normally is expressed as percent recovery. The LCS recovery differs from the
692     recovery of a matrix spike in that the matrix spike is added directly to the environmental sample
693     and the percent recovery is determined from the difference between the original and spiked
694     sample results.
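
A minimal sketch of the two recovery calculations (Python) follows; all numeric values are assumed examples.

    def lcs_percent_recovery(measured, known):
        """LCS recovery: measured activity as a percentage of the known value."""
        return 100.0 * measured / known

    def matrix_spike_percent_recovery(spiked_result, original_result, spike_added):
        """Matrix spike recovery: the difference between the spiked and original
        sample results as a percentage of the activity added."""
        return 100.0 * (spiked_result - original_result) / spike_added

    # Example: an LCS prepared at 10.0 Bq that measures 9.4 Bq recovers
    # 94 percent; a sample at 2.1 Bq that reads 11.8 Bq after a 10.0 Bq spike
    # recovers 97 percent.
    print(lcs_percent_recovery(9.4, 10.0))
    print(matrix_spike_percent_recovery(11.8, 2.1, 10.0))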

695     Criteria. The objective of the LCS is to measure the response of the analytical process to a QC
696     sample with  a matrix similar to the environmental sample. This will allow inferences to be drawn
697     about the reliability of the analytical process.

698     See Appendix C for guidance on developing control limits for LCS results based on the project's
699     MQO for method uncertainty.

700     Verification. If a required LCS is not analyzed, or if required information is missing, the verifier
701     flags the missing information with an "E" code.

702     Validation. When the measured result for the LCS is outside the control limits, the associated
703     samples are flagged with the "S" qualifier (S+ or S-).

704     8.5.2.3 Laboratory Replicates

705     Replicates are used to determine the precision of laboratory preparation and analytical
706     procedures. Laboratory replicates are two aliquants selected from the laboratory sample and
707     carried through preparation and analysis as part of the same batch.

708     The discussion of field replicates is beyond the scope of this chapter.

709     Criteria. The objective of replicate analyses is to measure laboratory precision based on each
710     sample matrix. The variability of the samples due to field sample heterogeneity is also reflected
711     in the replicate result. The laboratory may not be in control of the precision. Therefore, replicate
712     results are used to evaluate reproducibility of the complete laboratory process that includes
713     subsampling, preparation, and analysis.

714     See Appendix C for guidance on developing control limits for replicate results based on the
715     project's MQO for method uncertainty.
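
Appendix C supplies the project-specific equations; one common replicate statistic, shown here only for illustration, is the relative percent difference. A minimal sketch (Python; the replicate values are assumed examples):

    def relative_percent_difference(result_1, result_2):
        """Relative percent difference between laboratory replicates:
        |x1 - x2| divided by their mean, times 100."""
        mean = (result_1 + result_2) / 2.0
        return 100.0 * abs(result_1 - result_2) / mean

    # Example: replicates of 4.8 and 5.6 Bq/kg give an RPD of about 15 percent,
    # which would be compared against the project's replicate control limit.
    print(relative_percent_difference(4.8, 5.6))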

716     Verification. If replicate analyses are required but not performed, or if the required data is not
717     present in the report, the verifier flags the missing information with an "E" code.

718     Validation. When the replicate analysis is outside the control limit, the associated samples are
719     flagged with the "P" qualifier.

720     8.5.2.4 Matrix Spikes and Matrix Spike Duplicates

721     The matrix spike is an aliquant of a sample, fortified (spiked) with known quantities of target
722     analytes and subjected to the entire analytical procedure to establish if the method or procedure is
723     appropriate for the analysis of the particular matrix.

724     Criteria. Matrix spike samples provide information about the effect of each sample matrix on
725     the preparation and measurement methodology. The test uncovers the possible existence of
726     recovery problems, based on either a statistical test or a specified fixed control limit.

727     See Appendix C for guidance on developing criteria for evaluating matrix spikes based on the
728     project's MQO for method uncertainty.

729     Verification. If a required matrix spike analysis was not performed, or if the required
730     information is missing,  the missing information should be flagged with an "E" code.

731     Validation. If the results of the matrix  spike analysis do not meet the established criteria, the
732     samples should be qualified with an "S+" or "S-" indicating unacceptable spike recoveries.

733     8.5.3  Tests of Detection and Unusual Uncertainty

734     8.5.3.1 Detection

735      The purpose of a test of detection is to decide if each result for a regular sample is significantly
736      different from zero. Since most radiochemistry methods always produce a result, even a very
737      uncertain or negative one, some notion of a non-detected but measured result may be needed for
738      some projects. A non-detected result is generally as valid as any other measured result, but it is
739      too small relative to its measurement uncertainty to give high confidence that a positive amount
740      of analyte was actually present in the sample. Ordinarily, if the material being analyzed is
741      actually analyte-free, most results should be "non-detected."

742      For some projects, detection may not be an important issue. For example, it may be known that
743      all the samples contain a particular analyte, and the only question to be answered is whether the
744      mean concentration is less than an action level. However, all laboratories should be able to
745      perform a test of detection routinely for each analyte in each sample.

746      Criteria. An analyte is considered detected when the measured analyte concentration exceeds its
747      critical value (see Chapter 19). Both values are calculated by the laboratory performing the
748      measurement, so the detection decision can be made at the laboratory and indicated in its report.
749      If there is no evidence of additional unquantified uncertainty in the result (e.g., lack of statistical
750      control or blank contamination), the laboratory's decision may be taken to be final.
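
A minimal sketch of the detection decision (Python) follows; the measured value and critical value are assumed examples.

    def is_detected(measured_concentration, critical_value):
        """Detection decision: the analyte is considered detected when the
        measured concentration exceeds its critical value."""
        return measured_concentration > critical_value

    # Example: 0.12 Bq/L against a critical value of 0.09 Bq/L is a detection;
    # 0.05 Bq/L against the same critical value is a non-detect.
    print(is_detected(0.12, 0.09))
    print(is_detected(0.05, 0.09))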

751      Verification. Typically, the role of  the verifier is limited to checking that required information,
752      such as the critical value, is present in the report. If information is missing, the result should be
753      flagged with an "E" code.

754      Validation. The validator examines the result of the measurement, its critical value, and other
755      information associated with the sample and the batch in which it was analyzed, including method
756      blank results in particular, to make a final determination of whether the analyte has been detected
757      with confidence. If the data indicates the analyte has been detected in both the sample and the
758      method blank, its presence in the sample may be questionable. A quantitative comparison of the
759      total amounts of analyte in the sample and method blank, which takes into account the associated
760      measurement uncertainties, may be needed to resolve the question.

761      8.5.3.2 Detection Capability

762      Criteria. If the project requires a certain detection capability, the requirement should be
763      expressed as a required minimum detectable concentration (RMDC). The data report should
764      indicate the RMDC and the sample-specific estimate of the actual minimum detectable
765      concentration (MDC) for each analyte in each sample.

766      In some situations, it may not be necessary or even possible for a laboratory to meet the MDC
767      requirement for all analytes in all samples. In particular, if the analyte is present and quantifiable
768      at a concentration much greater than the action level, a failure to meet a contract-required
769     detection limit is usually not a cause for concern. A failure to meet the RMDC is more often an
770     important issue when the analyte is not detected.

771     Verification. The RMDC specified in the contract is compared to the sample-specific MDC
772     achieved by the method. The analytes that do not meet the RMDC are flagged with an "E" code.

773     Validation. If the sample-specific MDC estimate exceeds the RMDC, the data user may be
774     unable to make a decision about the sample with the required degree of certainty. A "UJ"
775     qualifier is warranted if the estimated MDC exceeds the RMDC and the analyte was not detected
776     by the analysis. A final decision about the usability of the data should be made during the data
777      assessment phase of the data collection process.

778      An assignment of "R" to the data points affected by this type of exception may be appropriate in
779     some cases, but the narrative report may classify the data as acceptable (no qualifier), "U," or "J,"
780     based on the results of the tests of detection and uncertainty. This allows the assessor to make an
781     informed judgement about the usability of the data point(s) and allows them the opportunity to
782     provide a rationale for why the data can be used in the decision process.
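
A minimal sketch of the "UJ" logic described above (Python) follows; the MDC values are assumed examples.

    def detection_capability_qualifier(sample_mdc, rmdc, detected):
        """Assign "UJ" when the sample-specific MDC exceeds the required MDC
        and the analyte was not detected; otherwise this test adds no qualifier."""
        if sample_mdc > rmdc and not detected:
            return "UJ"
        return ""

    # Example: an achieved MDC of 0.5 Bq/kg against an RMDC of 0.3 Bq/kg,
    # with the analyte not detected, warrants "UJ".
    print(detection_capability_qualifier(0.5, 0.3, detected=False))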

783     8.5.3.3 Large or Unusual Uncertainty

784     When project planners follow MARLAP's recommendations for developing MQOs, they
785     determine a required method uncertainty at a specified analyte concentration. The required
786     method uncertainty is  normally expressed in concentration units, but it may be expressed as a
787     relative method uncertainty (percent based on the upper bound of the gray region, which is
788     normally the action level). It is reasonable to expect the laboratory's combined standard
789     uncertainty at concentrations lower than the action level to be no greater than the required
790     method uncertainty (expressed in concentration units) and to expect the laboratory's relative
791     combined standard uncertainty at concentrations above the action level to be no greater than the
792     required relative method uncertainty (expressed as a percent). Each measured result should be
793     checked against these  expectations (see Appendix C).

794     Criteria. The reported combined standard uncertainty is compared to the maximum allowable
795     standard uncertainty. Either absolute (in concentration units) or relative uncertainties (expressed
796     as a percent) are used  in the comparison, depending on the reported concentration. The result is
797     qualified with a "Q" if the reported uncertainty is larger than the requirement allows.
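
A minimal sketch of this comparison (Python) follows; the result, action level, and uncertainty requirements are assumed examples.

    def uncertainty_qualifier(result, combined_std_u, action_level,
                              required_u, required_rel_u_percent):
        """Assign "Q" when the reported combined standard uncertainty exceeds
        the requirement: the absolute requirement (in concentration units)
        below the action level, or the relative requirement (percent) above it."""
        if result < action_level:
            too_large = combined_std_u > required_u
        else:
            too_large = 100.0 * combined_std_u / result > required_rel_u_percent
        return "Q" if too_large else ""

    # Example: a result of 0.6 Bq/g (action level 1.0 Bq/g) with a combined
    # standard uncertainty of 0.15 Bq/g, against a required method uncertainty
    # of 0.10 Bq/g, is qualified "Q".
    print(uncertainty_qualifier(0.6, 0.15, 1.0, 0.10, 10.0))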

798     Verification. The test for large uncertainty is straightforward enough to be performed during
799     either verification or validation. If there is a contractual requirement for measurement
800      uncertainty, the verifier should perform the test and assign the "E" qualifier to results that do not
801      meet the requirement. Note that it may sometimes happen that circumstances beyond the control
802      of the laboratory make it impossible to meet the requirement.

803      Validation. If a "Q" qualifier is assigned, the validator may consider any special circumstances
804      that tend to explain it, such as interferences, small sample sizes, or long decay times, which were
805      beyond the control of the laboratory. He or she may choose to remove the qualifier, particularly if
806      it is apparent that the original uncertainty requirement was too restrictive.

807      8.5.4  Final Qualification and Reporting

808      The final step of the validation process is to assign and report final qualifiers for all regular
809      sample results. The final qualifiers are based on the qualifiers and reasons from all previous
810      tests, patterns of problems in batches of samples, and validator judgement.

811      The difficult issue during final qualifier assignment is rejecting data. What follows summarizes
812      some of the issues to consider when thinking about rejecting data.

813      Rejecting a result is an unconditional statement that it is not useable for the intended purpose. A
814      result should only be rejected when the risks of using it are significant relative to the benefits of
815      using whatever information it carries. If the DQA team or users feel data is being rejected for
816      reasons that don't affect usability, they may disregard all validation conclusions. Rejected results
817      should be discarded and not used in the DQA phase of the data life cycle.

818      There are three bases on which to reject data:

819         1. Insufficient or only incorrect data are available to make fundamental decisions about data
820            quality.  For example, if correctly computed uncertainty estimates are not available, it is
821            not possible to do most of the suggested tests. If the intended use depends on a consistent,
822            high level of validation, it may be proper to reject such data.

823            The missing data should be fundamental. For example, missing certificates for standards
824             are unlikely to be fundamental if lab performance on spiked samples is acceptable. In
825             contrast, if no spiked sample data is available, it may be impossible to determine if a
826             method gives even roughly correct results, and rejection may be appropriate.

827         2.  Available data indicate that the assumptions underlying the method are not true. For
828             example, QC samples may demonstrate that the lab's processes are out of control.
829             Method performance data may indicate that the method simply does not work for
830             particular samples. These problems should be so severe that it is not possible to make
831             quantitative estimates of their effects.

832         3.  A result is "very unusually uncertain." It is difficult to say what degree of uncertainty
833             makes a result unusable. Whenever possible, uncertain data should be rejected based on
834             multiple problems with one result, patterns in related data, and the validator's judgement,
835             not the outcome of a single test. This requires radiochemistry expertise and knowledge of
836             the intended use.

837      Based on an evaluation of the tentative qualifiers, final qualifiers are assigned to each regular
838      sample result.

839      After all necessary validation tests have been completed and a series of qualifiers assigned to
840      each data point based on the results of the tests, a final judgment to determine which, if any, final
841      qualifiers will be attached to the data should be made. The individual sample data from the
842      laboratory should retain all the qualifiers. The basic decision-making process for each result is
843      always subject to validator judgement (a simplified sketch follows the list):

844       • As appropriate, assign a final "R";

845       • If "S", "P", or "B" were assigned, determine whether the qualifiers warrant the assignment of
846         an "R";

847       • If "R" is not assigned, but some test assigned a tentative S, P, B, Q, or J, or a pattern exists
848         that makes it appropriate, assign a final S, P, B, Q, or J and summarize QC sample
849         performance;

850       • If a final S, B, or J was assigned; + or -, but not both, was tentatively assigned; and the
851         potential bias is not outweighed by other sources of uncertainty, make the + or - final; and

852       • For non-R  results, if any test assigned a tentative "U," make it final.
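
The sketch below (Python) is one simplified reading of this sequence; it is illustrative only, since each step remains subject to validator judgement, and the qualifier sets are assumed examples.

    def final_qualifier(tentative):
        """Collapse a set of tentative qualifiers (e.g., {"S+", "J"}) into a
        final qualifier following the sequence above; validator judgement,
        not modeled here, may override any step."""
        codes = {q.rstrip("+-") for q in tentative}
        signs = {q[-1] for q in tentative if q and q[-1] in "+-"}
        if "R" in codes:
            return "R"                     # rejected; no further qualification
        final = ""
        for code in ("S", "P", "B", "Q", "J"):
            if code in codes:
                final = code
                break
        if final in ("S", "B", "J") and len(signs) == 1:
            final += signs.pop()           # carry + or - only when unambiguous
        if "U" in codes:
            final = (final + " U") if final else "U"
        return final

    # Example: tentative {"S+", "J"} yields a final "S+"; {"U"} alone stays "U".
    print(final_qualifier({"S+", "J"}))
    print(final_qualifier({"U"}))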

853      The final validation decision should address the fact that the broader purpose of validation is to
854      contribute to the total data collection process, i.e., effectively translate and interpret analytical
855      results for efficient use by an assessor. This means the validator should examine the full range of
856      data available to search for and utilize relationships among the data elements to support the
857      acceptance and use of data that falls outside method or contract specifications and data validation
858      plan guidance.

859      8.6    Validation Report

860      The final product of validation is a package that summarizes the validation process and its
861      conclusions in an orderly fashion. This package should include:

862       •  A narrative or summary table written by the validator that summarizes exceptional
863         circumstances: In particular, it should document anything that prevented executing the
864         planned validation tests. Further, the narrative should include an explicit statement explaining
865         why data has been rejected or qualified based on the findings of the validation tests and the
866         validator's judgment.

867       •  A list of validated samples that provides a cross-reference of laboratory and client sample
868         identifiers: This report should also include other identifiers useful in the context of the
869         project, such as reporting batch, chain of custody, or other sample management system
870         sample information.

871       •  A summary of all validated results with associated uncertainty for each regular sample with
872         final qualifiers: Unless specified in the sampling and analysis plan, non-detects are reported
873         as measured, not replaced by a detection limit or other "less than" value.

874       •  A summary of QC sample performance and the potential effect on the data, both qualified and
875         not qualified.

876      Assuming the client wants additional information, the following, more detailed reports can be
877      included in the validation package. Otherwise, they are simply part of the validation process and
878      the verification of contract compliance:

879       •  A detailed report of all tentative qualifiers and associated reasons for their assignment;

880       •  QC sample reports that document analytical process problems; and

881       •  Reports that summarize performance by method—these should support looking across related
882         analyses at values such as yields and result ratios.

883      The data in the summary reports should be available in a computer-readable format. If no result
884      was obtained for a particular analyte, the result field should be left blank. The validation report
885      should package analytical results as effectively as possible for application and use by the
886      individual assembling and assessing all project data.

887      The validation report should contain a discussion describing the problem(s) found during the
888      validation process. For the validation codes, the discussion summarizes the performance criteria
889      established in the validation plan. If the validation test performance criteria were changed (e.g.,
890      increased or decreased level of unusual uncertainty) because the nature of the sample matrix or
891      analyte was different than expected, the new criteria should be explained in the report and the
892      qualifiers applied using the new criteria. The approval of the project manager should be obtained
893     (and documented) before the new criteria are applied. The project manager should communicate
894     the changes to the project planning team to maintain the consensus reached and documented
895     during validation planning.

896     Well-planned and executed analytical activities can be expected to meet reasonable expectations
897     for data reliability. This means that for most data points or data sets, the results of the tests of
898     quality control, detection, and unusual uncertainty will show that the data are of sufficient quality
899     and defensibility to be forwarded to the assessor with little or no qualification for final
900     assessment. A small number of points will be rejected because random errors in the analytical
901     process or unanticipated matrix problems resulted in massive failure of several key validation
902     tests.

903     A smaller number of data points will show conflicting results from the validation tests and
904     present the greatest challenge to the validator. The more important the decision and/or the lower
905     the required detection limit, the more common this conflict will become, and the more critical it
906     is that the data validation plan provide guidance to the validator about how to balance the
907     conflicting results. Is the ability to detect the analyte more important than the associated
908     statistical unusual uncertainty, or is the presence of the analyte relatively definite but the unusual
909     uncertainty around the project decision point critical to major decisions? The necessary guidance
910     should be developed during the planning phase to guide the final judgment of the validator.

911     8.7    Other Sources of Information

912     American National Standards Institute (ANSI) N13.30. 1996. Performance Criteria for
913        Radiobioassay.

914     U.S. Environmental Protection Agency (EPA). 1994. Contract Laboratory Program National
915        Functional Guidelines for Inorganic Data Review. EPA-540/R-94-013 (PB94-963502).
916        February. Available from http://www.epa.gov/oenpage/superfund/programs/clp/download/
917        fginorg.pdf.

                        9 DATA QUALITY ASSESSMENT
 2     9.1   Introduction

 3     This chapter provides an overview of the data quality assessment (DQA) process, the third and
 4     final process of the overall data assessment phase of a project. Assessment is the last phase in the
 5     data life cycle and precedes the use of data. Assessment—in particular DQA—is intended to
 6     evaluate the suitability of project data to answer the underlying project questions or the suitability
 7     of project data to support the project decisions. The output of this final assessment process is a
 8     determination as to whether a decision can or cannot be made within the project-specified data
 9     quality objectives (DQOs).

10     The discussions in this chapter assume that prior to the DQA process, the individual data
11     elements have been subjected to the first two assessment processes, "data verification" and "data
12     validation" (see Chapter 8, Radiochemical Data Verification and Validation). The line between
13     these three processes has been blurred for some time and varies from guidance to guidance and
14     practitioner to practitioner. Although the content of the various processes is the most critical
15     issue, a common terminology is necessary to minimize confusion and to improve communication
16     among planning team members, those who will implement the plans, and those responsible for
17     assessment. MARLAP defines these terms in Section 1.4 and discusses assessment in Section 8.2.

18     This chapter is not intended to  address the detailed and specific technical issues needed to assess
19     the data from a specific project but rather to impart a general understanding of the DQA process
20     and its relationship to the other assessment processes, as well as of the planning and implemen-
21     tation phases of the project's data life cycle. The target audience for this chapter is the project
22     planner, project manager, or other member of the planning team who wants to acquire a general
23     understanding of the DQA process; not the statistician, engineer, or radiochemist who is seeking
24     detailed guidance for the planning or implementation of the assessment phase. Guidance on
25     specific technical issues is available  (EPA, 2000; MARSSIM, 2000; NRC, 1998).

26     This chapter emphasizes that assessment, although represented as the last phase of the project's
27     data life cycle, should be planned for during the directed planning process, and the needed
28     documentation should be provided during the implementation phase of the project.

29     Section 9.2 reviews the role of DQA in the assessment phase. Section 9.3 discusses the graded
30     approach to DQA. The role of the  DQA team is discussed in Section 9.4. Section 9.5 describes
31     the content of DQA plans. Section 9.6 details the activities that are involved in the DQA process.

32     9.2   Assessment Phase

33     The assessment phase was discussed in Section 8.2. This subsection provides a brief overview of
34     the individual assessment processes, their distinctions, and how they interrelate.

35     "Data verification" generally evaluates compliance of the analytical process with project-plan
36     and other project-requirement documents, and the statement of work (SOW), and documents
37     compliance and noncompliance in a data verification report. Data verification is a separate
38     activity in addition to the checks and review done by field and laboratory personnel during
39     implementation.

40     Documentation generated during the implementation phase will be used to determine if the
41     proper procedures were employed and to determine compliance with project plan documents
42     (e.g., QAPP), contract-specified requirements, and measurement quality objectives (MQOs). Any
43     data associated with noncompliance will be identified as an "exception," which should elicit
44     further investigation during data validation.

45     Compliance, exceptions, missing documentation, and the resulting inability to verify compliance
46     should be recorded in the data verification report. Validation and DQA employ the verification
47     report as they address the usability of data in terms of the project DQOs.

48     "Data validation" qualifies the usability of each datum after interpreting the impacts of
49     exceptions identified during verification. The validation process should be well defined in a
50     validation plan that was completed during the planning phase. The validation plan, as with the
51     verification plan or checklist, can range from sections of a project plan to large and detailed
52     stand-alone documents.  Regardless of its size or format, the validation plan should address the
53     issues presented in Section 8.3. Data validation begins with a review of project objectives and
54     requirements, the data verification report, and the identified exceptions. The data validator
55     determines if the analytical process was in statistical control (Section 8.5.1) at the time of sample
56     analysis, and whether the analytical process as implemented was appropriate for the sample
57     matrix and analytes of interest (Section  8.5.2). If the system being validated is found to be under
58     control and applicable to the analyte and matrix, then the individual data points can be evaluated
59     in terms of detection (Section 8.5.3.1), detection capability (Section 8.5.3.2), and unusual
60     uncertainty (Section 8.5.3.3). Following these determinations, the data are assigned qualifiers
61     (Section 8.5.4) and a data validation report is completed (Section 8.6). Validated data are rejected
62     only when the impact of an exception is so significant that the datum is unreliable.

63     While both data validation and DQA processes address usability, the processes address usability
64     from different perspectives. "Data validation" attempts to interpret the impacts of exceptions
65     identified during verification and the impact of project activities on the usability of an individual
66     datum. In contrast, "data quality assessment" considers the results of data validation while
67     evaluating the usability of the entire data set.

68     During data validation, the guidance in Chapter 8 strongly advises against the rejection of data
69     unless there is a significant argument to do so. As opposed to rejecting data, it is generally
70     preferable that data are qualified and that the data validator details the concerns in the data
71     validation report. However, there are times when data should be rejected, and the rationale for the
72     rejection should be explained in the data validation report. There are times when the data
73     validator may have believed data should be rejected based on a viable concern, yet during DQA,
74     a decision could be made to employ the rejected data.

75     In summary, data validation is a transition from the compliance testing of data verification to
76     usability determinations. The results of data validation, as captured in the qualified data and
77     validation reports, will greatly influence the decisions made during the final assessment process,
78     data quality assessment, which is discussed in Section 9.6.

79     9.3   Graded Approach to Assessment

80     The sophistication of the assessment phase—and in particular DQA and the resources applied—
81     should be appropriate for the project (i.e., a "graded approach"). Directed planning for small or
82     less complex projects usually requires fewer resources and typically involves fewer people and
83     proceeds faster. This graded approach to plan design is also applied to the assessment phase.
84     Generally, the greater the importance of a project, the more complex a project, or the greater the
85     ramifications of an incorrect decision, the more resources will be expended on assessment in
86     general and DQA in particular.

87     It is important to note that the depth and thoroughness of a DQA will be affected by the
88     thoroughness of the preceding verification and validation processes. Quality control or statement
89     of work (SOW) compliance issues that are not identified as an "exception" during verification, or
90     qualified during validation, will result in potential error sources not being reviewed and their
91     potential impact on data quality will not be evaluated. Thus, while the graded approach to
92     assessment is a valid and necessary management tool, it is necessary to consider all assessment
93     phase processes (data verification, data validation, and data quality assessment) when assigning
94     resources to assessment.

 95     9.4    The Data Quality Assessment Team

 96     The project planning team is responsible for ensuring that its decisions are scientifically sound
 97     and comply with the tolerable decision-error rates established during planning. MARLAP
 98     recommends the involvement of the data assessment specialist(s) on the project planning team
 99     during the directed planning process. This should result in a more efficient assessment plan and
100     should increase the likelihood that flaws in the design of the assessment processes will be
101     detected and corrected during planning. Chapter 2.4 noted that it is important to have an
102     integrated team of operational and technical experts. The data assessment specialist(s) who
103     participated as members of the planning team need not be the final assessors. However, using the
104     same assessors who participated in the directed planning process is advantageous, since they will
105     be aware of the complexities of the project's goals and activities.

106     The actual personnel who will perform data quality assessment, or their requisite qualifications
107     and expertise, should be specified in the project plan documents. The project planning team
108     should choose a qualified data assessor (or team of data assessors) who is technically competent
109     to evaluate the project's activities and the impact of these activities on the quality and usability of
110     data. Multi-disciplinary projects may require a team of assessors (e.g., radiochemist, engineer,
111     statistician) to address the diverse types of expertise needed to assess properly the representa-
112     tiveness of samples, the accuracy  of data, and whether decisions can be made within the specified
113     levels of confidence. Throughout  this manual, the term "assessment team" will be used to refer to
114     the assessor expertise needed.

115     9.5    Data Quality Assessment Plan

116     To implement the assessment phase as designed and ensure that the usability of data is assessed
117     in terms of the project objectives, a detailed DQA plan should be completed during the planning
118     phase of the data life cycle. This section focuses on the development of the DQA plan and its
119     relation to DQOs and MQOs.

120     The DQA plan should address the concerns and requirements of all stakeholders and present this
121     information in a clear, concise format. Documentation of these DQA specifications,
122     requirements, instructions, and procedures is essential to ensure process efficiency and that
123     proper procedures are followed. Since the success of a DQA depends upon the prior two
124     processes of the assessment phase, it is key that the verification and validation processes also be
125     designed and documented in respective plans during the planning phase. Chapter 8 lists the types
126     of guidance and information that  should be included in data verification and validation plans.

127     MARLAP recommends that the DQA process should be designed during the directed planning
128     process and documented in a DQA plan. The DQA plan is an integral part of the project plan
129     documents  and can be included as either a section or appendix to the project plan or as a cited
130     stand-alone document. If a stand-alone DQA plan is employed, it should be referenced by the
131     project plan and subjected to a similar approval process.

132     The DQA plan should contain the following information:

133      • A short summary and citation to the project documentation that provides sufficient detail
134        about the project objectives (DQOs), sample and analyte lists, required detection limit, action
135        level, and level of acceptable uncertainty on a sample- or analyte-specific basis;

136      • Specification of the necessary sampling and analytical assessment criteria (typically
137        expressed as MQOs for selected parameters such as method uncertainty) that are appropriate
138        for measuring the achievement of project objectives and constitute a basis for usability
139        decisions;

140      • Identification of the actual assessors, or the qualifications and expertise
141        required of the assessment team performing the DQA (Section 9.4);

142      • A description of the steps and procedures (including statistical tests) that will constitute the
143        DQA, from reviewing plans and implementation to authoring  a DQA report;

144      • Specification of the documentation and information to be collected during the project's
145        implementation;

146      • A description of any project-specific notification requirements or procedures for
147        documenting the usability or non-usability of data for the project's decision making;

148      • A description of the content of the DQA report;

149      • A list of recipients for the DQA report; and

150      • Disposition and record maintenance requirements.

151     9.6   Data Quality Assessment Process

152     MARLAP's guidance on the DQA process has the same content as other DQA guidance (ASTM
153     D6233; EPA, 2000; MARSSIM, 2000; NRC, 1998; USACE, 1998); however, MARLAP
154     presents these issues in an order that parallels project implementation more closely. The
155     MARLAP guidance on the DQA process can be summarized as an assessment process that—
156     following the review of pertinent documents (Section 9.6.1)—answers the following questions:

157       •  Are the samples representative? (Section 9.6.2)
158       •  Are the analytical data accurate? (Section 9.6.3)
159       •  Can a decision be made? (Section 9.6.4)

160     Each of these questions is answered first by reviewing the plan and then evaluating the
161     implementation. The process concludes with the documentation of the evaluation of the data
162     usability in a DQA Report (Section 9.7).

163     The DQA Process is more global in its purview than the previous verification and validation
164     processes. The DQA process should consider the combined impact of all project activities in
165     making a data usability determination. The DQA process, in addition to reviewing the issues
166     raised during verification and validation, may be the first opportunity to review other issues, such
167     as field activities and their impact on data quality and usability. A summary of the DQA steps
168     and their respective output is presented in Table 9.1.

                              TABLE 9.1 — Summary of the DQA process

        1. Review Project Plan Document
           Input:  the project plan document (or a cited stand-alone document) that addresses:
               (a) the Directed Planning Process Report, including DQOs, MQOs, and the
                   optimized Sampling and Analysis Plan;
               (b) revisions to the documents in (a) and problem or deficiency reports; and
               (c) the DQA Plan.
           Output for DQA Report:
               • Identification of project documents.
               • Clear understanding by the assessment team of the project's DQOs and MQOs.
               • Clear understanding of assumptions made during the planning process.
               • If a clear description of the DQOs does not exist, the assessment team
                 should record the DQOs (as they were established for assessment).

        2. Are the Samples Representative?
           Input:  the project plan document (or a cited stand-alone document) that addresses:
               (a) the sampling portion of the Sampling and Analysis Plan;
               (b) SOPs for sampling; and
               (c) sample handling and preservation requirements of the APS.
           Output for DQA Report:
               • Documentation of all assumptions as potential limitations and, if
                 possible, a description of their associated ramifications.
               • The determination of whether the design resulted in a representative
                 sampling of the population of interest.
               • The determination of whether the sampling locations introduced bias.
               • The determination of whether the sampling equipment and their use, as
                 described in the sampling procedures, were capable of extracting a
                 representative set of samples from the material of interest.
               • An evaluation of the necessary (documented) deviations, as well as those
                 deviations resulting from misunderstanding or error, and a determination
                 of their impact on the representativeness of the affected samples.

        3. Are the Data Accurate?
           Input:  the project plan documents (or a cited stand-alone document) that address:
               (a) the analysis portion of the Sampling and Analysis Plan;
               (b) the MQOs;
               (c) SOPs for analysis;
               (d) Analytical Protocol Specifications, including quality control
                   requirements and MQOs;
               (e) the SOW;
               (f) the selected analytical protocols;
               (g) ongoing evaluations of performance; and
               (h) Data Verification and Validation plans and reports.
           Output for DQA Report:
               • A determination of whether the selected method was appropriate for the
                 intended application.
               • The identification of any potential sources of inaccuracy.
               • An assessment of whether the sample analyses were implemented according
                 to the analysis plan.
               • An evaluation of the impact of any deviations from the analysis plan on
                 the usability of the data set.

        4. Can a Decision be Made?
           Input:  the project plan document (or a cited stand-alone document) that addresses:
               (a) the DQA plan, including the statistical tests to be used; and
               (b) the DQOs and the tolerable decision error rates.
           Output for DQA Report:
               • Results of the statistical tests. If new tests were selected, the
                 rationale for their selection and the reason for the inappropriateness
                 of the statistical tests selected in the DQA plan.
               • Graphical representations of the data set and parameter(s) of interest.
               • Determination of whether the DQOs and tolerable decision error rates
                 were met.
               • A final determination as to whether the data are suitable for decision-
                 making, estimation, or answering questions within the levels of
                 certainty specified during planning.

179     9.6.1   Review of Project Documents

180     The first step of the DQA process is for the team to identify and become familiar with the DQOs
181     of the project and the DQA plan. Like the planning process, the steps of the DQA process are
182     iterative, but they are presented in this text in a step-wise fashion for discussion purposes.
183     Members of the assessment team may focus on different portions of the project plan documents
184     and different elements of the planning process. Some may do an in-depth review of the directed
185     planning process during this step; others will perform this task during a later step. The
186     assessment team should receive revisions to the project planning documents and should review
187     deficiency reports associated with the project. The subsections below discuss the key project
188     documents that, at a minimum, should be reviewed.
189
190     9.6.1.1 The Project DQOs and MQOs

191     Since the usability of data is measured in terms of the project DQOs, the first step in the DQA
192     process is to acquire a thorough understanding of the DQOs. If the DQA will be performed by
193     more than one assessor, it is essential that the assessment team shares a common understanding
194     of the project DQOs and tolerable decision error rates. The assessment team will refer to these
195     DQOs continually as they make determinations about data usability. The results of the directed
196     planning process should have been documented in the project plan documents. The project plan
197     documents, at a minimum, should describe the DQOs and MQOs clearly and in enough detail
198     that they are not subject to misinterpretation or debate at this  last phase of the project.

199     If the DQOs and MQOs are not described properly in the project plan documents or do not
200     appear to support the project decision, or if questions arise, it may be necessary to review other
201     planning documents (such as memoranda) or to consult the project planning team or the core
202     group (Section 2.4). If a clear description of the DQOs does not exist, the assessment team
203     should record any clarifications it made to the DQO statement as part of the
204     DQA report.

205     9.6.1.2 The DQA Plan

206     If the assessment team was not part of the directed planning process, the team should familiarize
207     itself with the DQA plan and become clear on the procedures and criteria that are to be used for
208     the DQA Process. If the assessment team was part of the planning process, but sufficient time has
209     elapsed since the conclusion of planning, the assessment team should review the DQA plan. If
210     the process is not clearly described in a DQA plan or does not appear to support the project
211     decision, or if questions arise, it may be necessary to consult the project planning team or the
212     core group. If necessary, the DQA plan should be revised. If it cannot be, any deviations from it
213     should be recorded in the DQA report.

214     During DQA, it is important for the team, including the assessors and statistician, to be able to
215     communicate accurately. Unfortunately, this communication can be complicated by the different
216     meanings assigned  to common words (e.g., samples, homogeneity). The assessment team should
217     be alert to these differences during their deliberations. The assessment team will need to
218     determine the usage intended by the planning team.

219     It is important to use a directed planning process to ensure that good communications exist from
220     planning through data use. If the statistician and other experts are involved through the data life
221     cycle and commonly understood terms are employed, chances for success are increased.

222     9.6.1.3 Summary of the DQA Review

223     The review of project documents should result in:

224      • An identification and understanding of project plan documents, including any changes made
225        to them and any problems encountered with them;

226      • A clear understanding of the DQOs for the project. If a clear description of the DQOs does not
227        exist, the assessment team should reach consensus on the DQOs prior to commencing the
228        DQA and record the DQOs (as they were established for assessment) as part of the DQA
229        report; and

230      • A clear understanding of the terminology, procedures, and criteria for the DQA process.

231     9.6.2  Sample Representativeness

232     MARLAP does not provide guidance on developing sampling designs or a sampling plan. The
233     following discussion of sampling issues during a review of the DQA process is included for
234     completeness.

235     "Sampling" is the process of obtaining a portion of a population (i.e., the material of interest as
236     defined during the planning process) that can be used to characterize populations that are too
237     large or complex to be evaluated in their entirety. The information gathered from the samples is
238     used to make inferences whose validity reflects how closely the samples represent the properties
239     and analyte concentrations of the population. "Representativeness" is the term employed for the
240     degree to which samples properly reflect their parent populations. A "representative sample," as
241     defined in ASTM D6044, is "a sample collected in such a manner that it reflects one or more
242     characteristics of interest  (as defined by the project objectives) of a population from which it was
243     collected" (Figure 9.1). Samples collected in the field as a group and subsamples generated as a
244     group in the laboratory (Appendix F) should reflect the population physically and chemically. A
245     flaw in any portion of the sample collection or sample analysis design or their implementation
246     can impact the representativeness of the data and the correctness of associated decisions.
247     Representativeness is a complex issue related to analyte of interest, geographic and temporal
248     units of concern, and project objectives.

249     The remainder of this subsection discusses the issues that should be considered in assessing the
250     representativeness of the samples: the sampling plan (Section 9.6.2.1) and its implementation
251     (Section 9.6.2.2). MARLAP recommends that all sampling design and statistical assumptions be
252     identified clearly in project plan documents along with the rationale for their use.

253     9.6.2.1 Review of the Sampling Plan

254     The sampling plan and its ability to generate representative samples are assessed in terms of the
255     project DQOs. The assessors review the project plan with a focus on the approach to sample
256     collection, including sample preservation, shipping and subsampling in the field and laboratory,
257     and sampling standard operating procedures (SOPs). Ideally the assessors would have been
258     involved in the planning process and would be familiar with the DQOs  and MQOs and the
259     decisions made during the selection of the sampling and analysis design. If the assessors were
260     part of the project planning team, this familiarization review of the project plan will go
261     quickly, and the team can focus on deviations from the plan that will  introduce unanticipated
262     imprecision or bias (Section 9.6.2.2).

                [Figure 9.1 appears here. It depicts field samples drawn from the
                population and analytical subsamples drawn from those samples: the
                field samples collectively represent the population, the subsamples
                collectively represent the samples, and the resulting database
                accurately represents the measured population characteristic.]

    FIGURE 9.1 — Using physical samples to measure a characteristic of the population representatively.

APPROACH TO SAMPLE COLLECTION
264     Project plan documents (e.g., QAPP, SAP, Field Sampling Plan) should provide details about the
265     approach to sample collection and the logic that was employed in its development. At this stage,
266     the assessment team should evaluate whether the approach, as implemented, resulted in
267     representative samples. For example, if the approach was probabilistic, the assessment team
268     should determine if it was appropriate to assume that spatial or temporal correlation is not a
269     factor, and if all portions of the population had an equal chance of being sampled. If an
270     "authoritative" sample collection approach was employed (i.e., a person uses his knowledge to
271     choose sample locations and times), the assessment team—perhaps in consultation with the
272     appropriate experts (e.g., an engineer familiar with the waste generation process)—should
273     determine if the chosen sampling conditions do or do not result in a "worst case" or "best case."

274     The assessment team should evaluate whether the chosen sampling locations resulted in a
275     negative or positive bias, and whether the frequency and location of sample collection accounted
276     for the population heterogeneity.
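
        For illustration only (this sketch is not MARLAP guidance; the area dimensions,
        sample count, and random seed are hypothetical), the following Python fragment shows
        one way a probabilistic design can give every location in a rectangular study area an
        equal chance of being sampled:

            import random

            def simple_random_locations(n, x_max, y_max, seed=None):
                """Draw n (x, y) sampling coordinates uniformly over a rectangle,
                so that every portion of the area has an equal chance of selection."""
                rng = random.Random(seed)
                return [(rng.uniform(0, x_max), rng.uniform(0, y_max)) for _ in range(n)]

            # Example: 20 locations over a hypothetical 100 m by 50 m area; a fixed
            # seed makes the design reproducible for later review by assessors.
            for x, y in simple_random_locations(20, 100.0, 50.0, seed=42):
                print(f"x = {x:6.1f} m, y = {y:5.1f} m")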

 277      Optimizing the data collection activity (Section 2.5.4 and Appendix B3.8) involved a number of
 278      assumptions. These assumptions are generally employed to manage a logistical, budgetary, or
 279      other type of constraint, and are used instead of additional sampling or investigations. The
 2SO      assessment team needs to understand these assumptions in order to fulfill its responsibility to
281      review and evaluate their continued validity based on the project's implementation. The
 282      assessment team should review the bases for the assumptions made by the planning team because
 283      they can result in biased samples and incorrect conclusions. For example, if samples are collected
 284      from the perimeter of a lagoon to characterize the contents of the lagoon, the planning team's
 285      assumption was that the waste at the lagoon perimeter has the same composition as that waste
 286      located in the less-accessible center of the lagoon. In this example, there should be information to
287      support the assumption, such as historical data, indicating that the waste is relatively
288      homogeneous and well-mixed. Some assumptions will be stated clearly in project plan documents. Others
289      may only come to light after a detailed review. The assessment team should review assumptions
 290      for their scientific soundness and potential impact on the representativeness of the samples.

 291      Ideally, assumptions would be identified clearly in project plan documents, along with the
 292      rationale for their use. Unfortunately, this is uncommon, and in some cases, the planners may be
 293      unaware of some of the implied assumptions associated with a design choice. The assessment
 294      team should document any such assumptions in the DQA report as potential limitations and, if
 295      possible, describe their associated ramifications. The assessment team may also suggest
 296      additional investigations to verify the validity of assumptions which are questionable or key to
 297      the project.

 298      SAMPLING SOPs

 299      Standard operating procedures for sampling should be assessed for their appropriateness and
 300      scientific soundness. The assessment team should assess whether the sampling equipment and
 301      their use, as described in the  sampling procedures, were capable of extracting a representative set
 302      of samples from the material of interest. The team also should assess whether the equipment's
 303      composition was compatible with the analyte of interest. At this stage, the assessment team
 304      assumes the sampling device was employed according to the appropriate SOP. Section 9.6.2.2
 305      discusses implementation and deviations from the protocols.

 306      In summary, the assessment team should investigate whether:

 307       • The sampling device was compatible with the material being sampled and with the analytes of
 308         interest;

309      • The sampling device accommodated all particle sizes and did not discriminate against
310        portions of the material being sampled;

311      • The sampling device resulted in contamination or loss of sample components;

312      • The sampling device allowed access to all portions of the material of interest;

313      • The sample handling, preparation, and preservation procedures maintained sample integrity;
314        and

315      • The field and laboratory subsampling procedures resulted in a subsample that accurately
316        represents the contents of the original sample.

317     These findings should be detailed in the DQA report.

318     9.6.2.2 Sampling Plan Implementation

319     The products of the planning phase are integrated project plan documents that define how the
320     planners intend the data collection process to be implemented. At this point in the DQA process,
321     the assessment team determines whether sample collection was done according to the plan,
322     reviews any noted deviations from the protocols, identifies any additional deviations, and
323     evaluates the impact of these deviations on sample representativeness and the usability of the
324     data. The success of this review will be a function of the documentation requirements specified
325     during the planning process, and how thoroughly these requirements were met during sample
326     collection.

327     The determination as to whether the plans were implemented as written typically will be based
328     on a review of documentation generated during the implementation phase, through on-site
329     assessments, and during verification, if sampling activities (e.g., sample preservation) were
330     subjected to verification. In some instances, assessment team members may have firsthand
331     knowledge from an audit that they performed, but in general the assessment team will have to
332     rely upon documentation generated by others. The assessment team will review field notes,
333     sample forms, chain-of-custody forms, verification reports, audit reports, deviation reports,
334     corrective action documentation, QA reports, and reports to management. The assessment team
335     also may choose to interview field personnel to clarify issues or to account for missing
336        documentation.

337     Due to the uncontrolled environments from which most samples are collected, the assessment
338     team expects to find some deviations even from the best-prepared plans. Those not documented
339     in the project deficiency and deviation reports should be detailed in the DQA report. The
340     assessment team should evaluate these necessary deviations, as well as those deviations resulting
341     from misunderstanding or error, and determine their impact on representativeness of the affected
342     samples. These findings also should be detailed in the DQA report.

343     In summary, the assessment team will develop findings and determinations regarding any
344     deviations from the original plan, the rationale for the deviations, and whether the deviations
345     raise questions of representativeness.

346     9.6.2.3 Data Considerations

347     Sample representativeness also can be evaluated in light of the resulting data. Favorable
348     comparisons of the data to existing data sets (especially those data sets collected by different
349     organizations and by different methods) offer encouraging evidence of representativeness, but
350     not absolute confirmation of sample representativeness, since both data sets could suffer from the
351     same bias and imprecision. The project plan documents should have referenced any credible and
352     applicable existing data sets identified by the planning team. Comparisons to existing data sets
353     may offer mutual support for the accuracy of both, while differences between them tend to
354     raise questions about both data sets. Quite often, the DQA assessors are looking for confirmatory
355     or conflicting information. How existing data sets are used during the DQA will be determined
356     by how much confidence the assessors place in them. If they are very confident  in the accuracy of
357     existing data sets, then they may classify the new data as unusable if they differ from the existing
358     data. If there is little confidence in the existing data set, then the assessors may just mention in
359     the DQA report that the new data set was in agreement or not in agreement. However, if the
360     planning team has determined that additional data were needed, they probably will not have
361     sufficient confidence in the existing data set for purposes of decision-making.

362     Data comparison is an issue that could be addressed during validation to some degree, depending
363     on the validation plan. However, at this point in the DQA, comparable data sets serve a different
364     purpose. For example, the MDCs, concentration units, and the analytical methods may be the
365     same and allow for data comparison in validation. However, the assessors during DQA would
366     look for similarities and dissimilarities in reported concentrations for different areas of the
367     populations, and whether any differences might be an indication of a bias or imprecision that
368     makes the samples less representative. Temporal and spatial plots of the data also may be helpful
369     in identifying portions of the sampled population that were over- or under-represented by the data
370     collection activity.
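
        As a rough illustration of such a comparison (the data values below are hypothetical,
        and the nonparametric two-sample test shown is only one of several reasonable
        choices), an assessor might screen a new data set against an existing one as follows:

            from scipy import stats

            new_data = [0.82, 0.91, 0.78, 1.05, 0.88, 0.95]       # new results, e.g., Bq/g
            existing_data = [0.80, 0.86, 0.92, 0.99, 0.84, 0.90]  # prior study, same population

            # Nonparametric two-sample test; agreement is encouraging evidence, not
            # absolute confirmation, since both sets could share the same bias.
            stat, p_value = stats.mannwhitneyu(new_data, existing_data, alternative="two-sided")
            print(f"Mann-Whitney U = {stat:.1f}, p = {p_value:.3f}")
            if p_value < 0.05:
                print("Data sets differ significantly; examine both for bias or imprecision.")
            else:
                print("No significant difference detected between the data sets.")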

371      The planning process and development of probabilistic sampling plans typically require
372      assumptions regarding average concentrations and variances. If the actual average concentrations
373      and variances are different than anticipated, it is important for the assessment team to evaluate
374      the ramifications of these differences on sample representativeness. The closer reported values
375      are to an action level, the greater the need for the sample collection activities to accurately represent the
376      population characteristics of interest.

377      During the evaluation of sample representativeness, as discussed in the previous subsections, the
378      assessment team has the advantage of hindsight, since they review the sample collection design
379      in light of project outcomes and can determine if the sample collection design could have been
380      optimized differently to better achieve project objectives. Findings regarding the representative-
381      ness of samples and how sampling can be optimized should be expeditiously passed to project
382      managers if additional sampling will be performed.

383      In summary, results of the evaluation of the sample representativeness are:

384       • An identification of any assumptions that present limitations and, if possible, a description of
385        their associated ramifications;

386       • A determination of whether the design resulted in a representative sampling of the population
387        of interest;

388       • A determination of whether the specified sampling locations, or alternate locations as
389        reported, introduced bias;

390       • A determination of whether the sampling equipment and their use, as described in the
391        sampling procedures or as implemented, were capable of extracting a representative set of
392        samples from the material of interest;  and

393       • An evaluation of the necessary deviations from the plan, as well as those deviations resulting
394        from misunderstanding or error, and a determination of their impact on the representativeness
395        of the affected samples.

396      The product of this step is a set of findings regarding the impact of representativeness—or the
397      lack thereof—on data usability. Findings and determinations regarding representative-
398      ness will impact the usability of the resulting data to varying degrees. Some findings may be so
399      significant (e.g., the wrong waste stream was sampled) that the samples can be determined to be
400      non-representative and the associated data  cannot be used; as a result, the DQA need not progress


401      any further. Typically, findings will be subject to interpretation, and the impacts on representa-
402      tiveness will have to be evaluated in light of other DQA findings to determine the usability of
403      data.

404      9.6.3   Data Accuracy

405      The next step in the DQA process is the evaluation of the analysis process and accuracy of the
406      resulting data. The term "accuracy" describes the closeness of the result of a measurement to the
407      true value of the quantity being measured. The accuracy of results may be affected by both
408      imprecision and bias in the measurement process, and by blunders and loss of statistical control
409      (see Chapter 19, Measurement Statistics, for a discussion on measurement error).

410      Since MARLAP uses "accuracy" only as a qualitative concept, in accordance with the
411      International Vocabulary of Basic and General Terms in Metrology (ISO, 1993), the agreement
412      between measured results and true values is evaluated quantitatively in terms of the
413      "imprecision" and "bias" of the measurement process. "Imprecision" usually is expressed as a
414      standard deviation, which measures the dispersion of results about their mean. "Bias" is a
415      persistent distortion of results from the true value.

416      During the directed planning process, the project planning team should have made an attempt to
417      identify and control sources of imprecision and bias (Appendix B3.8). During DQA, the
418      assessment team should evaluate the degree of imprecision and bias and determine its impact on
419      data usability. Quality control samples are analyzed for the purpose of assessing imprecision and
420      bias. Spiked samples and method blanks typically are used to assess bias, and duplicates are used
421      to assess imprecision. Since a single measurement of a spike or blank cannot, in principle,
422      distinguish between imprecision and bias, a reliable estimate of bias requires a data set that
423      includes many such measurements. Control charts of quality control (QC) data, such as field
424      duplicates, matrix spikes, and laboratory control samples, are graphical representations and
425      primary tools for monitoring the control of sampling and analytical methods and identifying
426      precision and bias trends (Chapter 18, Quality Control).
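
         For illustration (the QC results and acceptance limits below are hypothetical, not
         MARLAP criteria), two of the routine calculations behind such evaluations are the
         relative percent difference of a duplicate pair, used to monitor imprecision, and the
         percent recovery of a spiked sample, used to monitor bias:

             def relative_percent_difference(result_1, result_2):
                 """RPD of a sample/duplicate pair, in percent (monitors imprecision)."""
                 mean = (result_1 + result_2) / 2.0
                 return abs(result_1 - result_2) / mean * 100.0

             def percent_recovery(spiked_result, unspiked_result, spike_added):
                 """Recovery of a matrix spike, in percent of the amount added (monitors bias)."""
                 return (spiked_result - unspiked_result) / spike_added * 100.0

             rpd = relative_percent_difference(10.2, 9.6)   # duplicate pair, e.g., pCi/g
             rec = percent_recovery(19.4, 10.0, 10.0)       # spiked sample results
             print(f"RPD = {rpd:.1f}%   (hypothetical limit: <= 20%)")
             print(f"Recovery = {rec:.1f}%   (hypothetical limits: 80-120%)")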

427      Measurable types  of bias are identified and controlled through the application of quantitative
428      MQOs to QC samples, such as blanks, standard reference materials, performance evaluation
429      samples, calibration check standards, and spiked samples. Non-measurable forms of bias (e.g., a
430      method being implemented incorrectly, such as reagents being added in the incorrect order) are
431      usually identified  and  controlled by well-designed plans that specify quality assurance systems
432      that detail needed training, use of appropriate SOPs, deficiency reporting systems, assessments,
433      and quality improvement processes.

434      Bias in a data set may be produced by measurement errors that occur in steps of the measurement
435      process that are not repeated. Imprecision may be produced by errors that occur in steps that are
436      repeated many times. The distinction between bias and imprecision is complicated by the fact
437      that some steps, such as instrument calibration and tracer preparation and standardization, are
438      repeated at varying frequencies. For this reason, the same source of measurement error may
439      produce an apparent bias in a small data set and apparent imprecision in a larger data set. During
440      data assessment, an operational definition of bias is needed. A bias may exist if results for
441      analytical spikes (i.e., laboratory control samples, matrix spike, matrix spike duplicate),
442      calibration checks, and performance evaluation samples associated with the data set are mostly
443      low or mostly high, if the results of method blank analyses tend to be positive, or if audits
444      uncover certain types of biased implementation of the SOPs. At times, the imprecision of small
445      data sets can incorrectly indicate a bias, while at other times, the presence of bias may be masked
446      by imprecision. Statistical methods can be applied to imprecise data sets and used to determine if
44?      there are statistically significant differences between data sets or between a data set and an
448      established value. If the true value or reference value (e.g., verified concentration for a standard
449      reference material) is known, then statistics can be used to determine whether there is a bias.
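
         A minimal sketch of such a test, assuming hypothetical replicate results for a
         standard reference material with a known certified value, is:

             from scipy import stats

             srm_results = [98.1, 97.4, 99.0, 96.8, 97.9, 98.5]  # measured, e.g., Bq/kg
             certified_value = 100.0                              # known reference value

             # One-sample t-test of whether the mean result differs from the certified value.
             t_stat, p_value = stats.ttest_1samp(srm_results, certified_value)
             print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
             if p_value < 0.05:
                 print("Mean differs significantly from the certified value: evidence of bias.")
             else:
                 print("No statistically significant bias detected.")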

450      Figure 9.2 employs targets to depict the impacts of imprecision and bias on measurement data.
451      The true value is portrayed by the bulls-eye and is 100 units (e.g., dpm, Bq, pCi/g). Ideally, all
452      measurements with the same true value would be centered on the target, and after analyzing a
453      number of samples with the same true value, the reported data would be 100 units for each and
454      every sample. This ideal condition of precise and unbiased data is pictured in Figure 9.2(a). If the
455      analytical process is very precise but suffers from a bias, the situation could be as pictured in
456      Figure 9.2(b), in which the data are very reproducible but depart from the true value by 50
457      percent (an average of 150 versus a true value of 100)—a significant bias. The opposite situation is depicted in Figure
458      9.2(c), where the data are not precise and every sample yields a different concentration. However,
459      as more samples are analyzed, the effects of imprecision tend to  average out, and lacking any
460      bias, the average measurement reflects the true concentration. Figure 9.2(d) depicts a situation
461      where the analytical process suffers from both imprecision and bias, and even if innumerable
462      samples with the same true value are collected and analyzed to control the impact of imprecision,
463      the bias would result in the reporting of an incorrect average concentration.

                [Figure 9.2 appears here. Four target diagrams, each with an
                associated frequency curve, depict measurements of a true value of
                100: (a) precise and unbiased (average = 100), (b) precise but biased
                (average = 150), (c) imprecise but unbiased (average = 100), and
                (d) both imprecise and biased (average = 150).]

                    FIGURE 9.2 — Types of sampling and analytical errors.
464      Each target in Figure 9.2 has an associated frequency distribution curve. Frequency curves are
465      made by plotting a concentration value versus the frequency of occurrence for that concentration.
466      Statisticians employ frequency plots to display the imprecision of a sampling and analytical
467      event, and to identify the type of distribution. The curves show that as imprecision increases, the
468      curves flatten out and there is a greater frequency of measurements that are distant from the
469     average value (Figures 9.2c and d). More precise measurements result in sharper curves (Figures
470     9.2a and b), with the majority of measurements relatively closer to the average value. The greater
471     the bias (Figures 9.2b and d), the further the average of the measurements is shifted from the true
472     value. The smaller the bias (Figures 9.2a and c), the closer the average of the measurements is to
473     the true value.
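
        The four situations can be mimicked with simulated data (a sketch, not MARLAP
        guidance; the bias and standard deviation values are arbitrary). With a true value of
        100 units, the standard deviation of the simulated results reflects imprecision, and
        the shift of their mean away from 100 reflects bias:

            import random
            import statistics

            rng = random.Random(1)
            true_value, n = 100.0, 1000

            cases = {                                # (bias, standard deviation)
                "(a) precise, unbiased":   (0.0, 2.0),
                "(b) precise, biased":     (50.0, 2.0),
                "(c) imprecise, unbiased": (0.0, 20.0),
                "(d) imprecise, biased":   (50.0, 20.0),
            }
            for label, (bias, sd) in cases.items():
                data = [rng.gauss(true_value + bias, sd) for _ in range(n)]
                print(f"{label}: mean = {statistics.fmean(data):6.1f}, "
                      f"s = {statistics.stdev(data):5.1f}")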

474     The remainder of this subsection focuses on the review of analytical plans (Section 9.6.3.1) and
475     their implementation (Section 9.6.3.2) as a mechanism to assess the accuracy of analytical data
476     and their suitability for supporting project decisions.

477     9.6.3.1 Review of the Analytical Plan

478     The analytical plan is that portion of the project plan documentation (e.g., in QAPP or SAP) that
479     addresses the optimized analytical design and other analytical issues (e.g., analytical protocol
480     specifications, SOPs). Its ability to generate accurate data is assessed in terms of the project
481     DQOs. The assessment team will refer to the DQOs and the associated MQOs as they review the
482     analytical protocol specifications to understand how the planning team selected methods and
483     developed the analytical plan. If the assessors were part of the project planning team, this review
484     process will go quickly and the team can focus on deviations from the plan that will  introduce
485     unanticipated imprecision or bias.  (The term "analytical plan" is not meant to indicate a separate
486     document.)

487     REVIEW OF THE MQOs, ANALYTICAL PROTOCOL SPECIFICATIONS, AND OPTIMIZED ANALYTICAL
488     DESIGN

489     The assessment team's review of the analytical plan first should focus on the analytical protocol
490     specifications, including the MQOs, which were established by the project planning team
491     (Chapter 3). The team should understand how the analytical protocol specifications were used to
492     develop the SOW (Chapter 5) and select the method (Chapter 6).  If the project and contractual
493     documentation are silent or inadequate on how they address these key issues, the assessment
494     team may be forced to review the analytical results in terms of the project DQOs and determine if
495     the MQOs achieved were sufficient to meet the project's objectives.
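
        As a simplified illustration of screening results against an uncertainty MQO (the
        numbers are hypothetical, and a real evaluation would follow the project's documented
        criteria, including a relative uncertainty requirement above the action level):

            required_u = 0.5     # required method uncertainty below the action level (result units)
            action_level = 5.0   # action level, same units as the results

            results = [          # (reported result, reported combined standard uncertainty)
                (1.2, 0.3), (4.8, 0.7), (6.5, 0.9), (2.1, 0.4),
            ]
            for value, u in results:
                # Flag results at or below the action level whose reported uncertainty
                # exceeds the MQO; above the action level a relative criterion would
                # typically be evaluated instead (omitted here for brevity).
                flag = "REVIEW" if value <= action_level and u > required_u else "ok"
                print(f"result = {value:4.1f} +/- {u:3.1f}  ->  {flag}")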

496     As with the approach to sample collection, optimizing the analytical activity involved a number
497     of assumptions. Assumptions were made when analytical issues were resolved during planning
498     and the decisions were documented in the analytical protocol specifications (Chapter 3). It is
499     important for the assessment team to be aware of these assumptions because they can result in
500     biases and incorrect conclusions. Some assumptions will be clearly stated in the project plan
501     documents. Others may only come to light after a detailed review. The assessment team should
502     review assumptions for their scientific soundness and potential impact on the data results.

503     Ideally, assumptions would be identified clearly in project plan documents, along with the
504     rationale for their use. Unfortunately, this is uncommon, and in some cases, the planners may be
505     unaware of some of the implied assumptions associated with a design choice. The assessment
506     team should document any such assumptions in the DQA report as potential limitations and, if
507     possible, describe their associated ramifications. The assessment team may also suggest
508     additional investigations to verify the validity of assumptions which are questionable or key to
509     the project.

510     REVIEW OF THE ANALYTICAL PROTOCOLS

511     The analytical plan and the associated analytical protocols will be reviewed and assessed for their
512     scientific soundness, applicability to the sample matrix and the ability to generate precise and
513     unbiased data. The analytical protocols review should consider the entire analytical process, from
514     sample preparation through dissolution and separations, counting, data reduction, and reporting.
515     MARLAP, whose focus is on the analytical process, defines "analytical process" as including
516     sample handling in the field (e.g., filtration, sample preservation) to ensure that all activities that
517     could impact analyses would be considered. The assessment team should consider both sampling
518     and analytical processes in assessing data quality—and such field activities as sample preserva-
519     tion—along with other issues that can affect representativeness (Section 9.6.2). The assessment
520     team also should review the contract evaluation (under the performance-based approach) for the
521     selection of the analytical protocols to assure that the documentation showed that the protocol
522     could meet the analytical protocol specifications (which define the MQOs).

523     Since the review of the analytical protocols will be performed with the advantage of hindsight
524     gained from the data verification and data validation reports, the assessment team also should
525     attempt to  identify any flaws in the analytical protocols that may have resulted in  noncompliance
526     with MQOs. The identification of these flaws is essential if future analyses will be required.

527     REVIEW OF VERIFICATION AND VALIDATION PLANS

528     To understand how the verification and validation processes were implemented and the degree
529     to which the assessors can rely upon their findings, the assessors should familiarize themselves
530     with the verification and validation plans that were developed during the planning phase. A
531     review of these plans will indicate whether the issues deemed of importance to the assessors were
532     evaluated and the thoroughness of the evaluations.


533     9.6.3.2 Analytical Plan Implementation

534     After reviewing the analytical plan, the assessment team should assess whether sample analyses
535     were implemented according to the analysis plan. Typically, the first two steps of the assessment
536     phase—data verification and data validation—have laid most of the groundwork for this
537     determination. However, the issue of whether the plan was implemented as designed needs to be
538     reviewed one final time during the DQA process. This final review is needed since new and
539     pertinent information may have been uncovered during the first steps of the DQA process.

540     The goal of this assessment of the analytical protocols and associated MQOs is to confirm that
541     the selected method was appropriate for the intended application and to identify any potential
542     sources of inaccuracy, such as:

543      • Laboratory subsampling procedures that resulted in a subsample that may not accurately
544        represent the contents of the original sample;

545      • Sample dissolution methods that may not have dissolved sample components quantitatively;

546      • Separation methods whose partitioning coefficients were not applicable to the sample matrix;

547      • Unanticipated self-absorption that biased counting data;

548      • Non-selective detection systems that did not resolve interferences; or

549      • Data reduction routines that lacked needed resolution or appropriate interference corrections.
551     The success of the assessment of the analytical plan implementation will be a function of the
552     documentation requirements specified during the planning process, and how thoroughly these
553     requirements were met during sample analysis. In some instances, assessment team members
554     may have firsthand knowledge from an audit that they performed, but in general the assessment
555     team will have to rely upon documentation generated by others.

556     In addition to verification  and validation reports, the  assessment team will review pertinent
557     documents such as: laboratory notebooks, instrument logs, quality control charts, internal
558     sample-tracking documentation, audit  reports, deviation reports, corrective action documentation,
559     performance evaluation sample reports, QA reports, and reports to management provided for
560     verification and validation. To clarify issues or to account for missing documentation, the
561     assessment team may choose to interview laboratory personnel.


562     Verification and validation reports will be used to identify nonconformance, deviations, and
563     problems that occurred during the implementation of the analytical plan. The challenge during
564     DQA is to evaluate the impact of nonconformance, deviations, problems, and qualified data on
565     the usability of the overall data set and the ability of the data set to support the decision.

566     Deviations from the plan will commonly be encountered, and the assessment team will evaluate
567     the impact of these deviations upon the accuracy of the analytical data. The deviations and the
568     assessment team's related findings should be detailed in the data quality assessment report.

569     The prior verification and validation processes and the prior DQA steps involving the evaluation
570     of sampling are all an attempt to define the quality of data by (1) discovering sources of bias,
571     quantifying their impact, and correcting the reported data; and (2) identifying and quantifying
572     data imprecision. The products of this step are a set of findings regarding the analytical process
573     and their impact on data usability. Some findings may be so significant (e.g., the wrong analytical
574     method was employed) that the associated data cannot be used, and as a result, the DQA need not
575     progress any further. Typically, findings will be subject to interpretation and a final
576     determination as to the impacts will have to wait until the data have been subjected to evaluations
577     described in Section 9.6.4.

578     After reviewing the verification and validation reports,  the outputs of the analytical data
579     evaluation are:

580      • A determination of whether the selected analytical protocols and analytical performance
581        specifications were appropriate for the intended application;

582      • An identification of any potential sources of inaccuracy; and

583      • A determination of whether sample analyses were implemented according to the analysis plan
584        and the overall impact of any deviations on the usability of the data set.

585     9.6.4  Decisions and Tolerable Error Rates

586     A goal of DQA is to avoid making a decision based on inaccurate data generated by analytical
587     protocols found to be out of control or on data generated from samples found to be nonrepresen-
588     tative, and to avoid making decisions based on data of unknown quality. Preferably, a decision
589     should be made with data of known quality (i.e., with data of known accuracy from samples of
590     known representativeness) and within the degree of confidence specified during the planning
591     phase.

592      This section focuses on the final determination by the assessment team, who uses information
593      taken from the previous assessment processes, together with statistical evaluations, to determine
594      whether the data are suitable for decision-making, estimating, or answering questions within the
595      levels of certainty specified during planning.
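
        As a minimal sketch of such a determination (the measurements, action level, and
        alpha below are hypothetical), a one-sided test can compare the mean concentration to
        the action level at the decision error rate specified during planning:

            from scipy import stats

            measurements = [3.1, 2.8, 3.5, 2.9, 3.3, 3.0, 2.7, 3.2]  # e.g., pCi/g
            action_level = 4.0
            alpha = 0.05  # tolerable rate for wrongly deciding "below the action level"

            # Null hypothesis: the true mean is at or above the action level.
            t_stat, p_value = stats.ttest_1samp(measurements, action_level, alternative="less")
            if p_value < alpha:
                print(f"Decide: mean is below the action level (p = {p_value:.4f} < {alpha}).")
            else:
                print("Cannot conclude the mean is below the action level at the planned alpha.")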

596      9.6.4.1 Statistical Evaluation of Data

597      Statistics are used for the collection, presentation, analysis, and interpretation of data. The two
598      major branches of statistics, "descriptive statistics" and "inferential statistics," are applicable to
599      data collection activities. "Descriptive statistics" are those methods that describe populations of
600      data. For example, descriptive statistics include the mean, mode, median, variance, and
601      correlations between variables, tables, and graphs to describe a set of data. "Inferential statistics"
602      use data taken from population samples to make estimates about the whole population
603      ("inferential estimations") and to make decisions ("hypothesis testing"). Descriptive statistics is
604      an important tool for managing and investigating data so that their implications and
605      significance to the project goals can be understood.

606      Sampling and inferential statistics have identical goals—to use samples to make inferences about
607      a population of interest and to use sample data to make defensible decisions. This similarity is
608      the reason why planning processes, such as those described in Chapter 2, couple sample
609      collection activities with statistical techniques to maximize the representativeness of samples, the
610      accuracy of data, and the certainty of decisions.

611      Due to the complexity of some population distributions (Appendix 19A) and the complex
612      mathematics needed to treat these distributions and associated data, it is often best to consult
613      with someone familiar with statistics to ensure that statistical issues have been addressed
614      properly. However, it is critical for the non-statistician to realize that statistics has its limitations.
615      The following statistical limitations should be considered when assessment teams and the project
616      planning team are planning the assessment phase and making decisions:

617       • Statistics are used to measure precision and, when true or reference values are known,
618         statistics can be applied to imprecise data to determine if a bias exists. Statistics do not
619         address all types of sampling or measurement bias directly.

620       • If the characteristic of interest in a sample is more similar to that of samples adjacent to it than
621         to samples that are further removed, the samples are deemed to be "correlated" and are not
622         independent of each other (i.e., there is a serial correlation such that samples collected close in
623         time or space have more similar concentrations than those samples further removed).
624         Conventional parametric and non-parametric statistics require that samples be independent
625         and are not applicable to populations that have significantly correlated concentrations (a simple check for serial correlation is sketched below).
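
By way of illustration, the following sketch (Python standard library only; the measurement
series is hypothetical) estimates the lag-1 serial correlation of results listed in collection
order. A value near zero is consistent with independence, while values near one indicate that
adjacent samples are strongly correlated.

        import statistics

        # Hypothetical results (Bq/L) listed in the order of collection
        series = [5.1, 5.3, 5.6, 5.4, 4.9, 4.7, 4.8, 5.0, 5.2, 5.5]

        def lag1_autocorrelation(x):
            """Estimate the lag-1 serial correlation coefficient of a sequence."""
            n = len(x)
            mean = statistics.fmean(x)
            numerator = sum((x[i] - mean) * (x[i + 1] - mean) for i in range(n - 1))
            denominator = sum((xi - mean) ** 2 for xi in x)
            return numerator / denominator

        r1 = lag1_autocorrelation(series)
        print(f"estimated lag-1 serial correlation: {r1:.2f}")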

626      The statistical tests typically are chosen during the directed planning process and are documented
627      in the project plan documents (e.g., DQA plan, QAPP). However, there are occasions when the
628      conditions encountered during the implementation phase differ from those anticipated (e.g., data
629      were collected without thorough planning, or data are being subjected to an unanticipated
630      secondary data use). Under these latter conditions, the statistical tests will be chosen following
631      data collection.

632      The statistical analysis of data consists of a number of steps. The following outline of these steps
633      is typical of the analyses that a statistician would implement in support of a data quality
634      assessment.

635      CALCULATE THE BASIC STATISTICAL PARAMETERS

636      Chapter 19 provides a detailed discussion of statistical issues, so only a few key concepts
637      are summarized here. Statistical "parameters" are fundamental quantities that are used to describe
638      the central tendency or dispersion of the data being assessed. The mean, median, and mode are
639      examples of statistical parameters that are used to describe the central tendency, while range,
640      variance, standard deviation, coefficient of variation, and percentiles are statistical parameters
641      used to describe the dispersion of the data. These basic parameters are used because they offer a
642      means of understanding the data, facilitating communication and data evaluation, and generally
643      are necessary for subsequent statistical tests.
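
As a simple illustration, the sketch below (Python standard library only; the data values are
hypothetical) computes several of these basic parameters for a small data set.

        import statistics

        # Hypothetical activity concentrations (Bq/kg)
        data = [8.2, 9.5, 7.7, 10.1, 9.5, 8.8, 11.4, 9.0]

        mean = statistics.fmean(data)         # central tendency
        median = statistics.median(data)      # central tendency
        mode = statistics.mode(data)          # central tendency
        data_range = max(data) - min(data)    # dispersion
        variance = statistics.variance(data)  # dispersion (sample variance)
        std_dev = statistics.stdev(data)      # dispersion
        cv = std_dev / mean                   # coefficient of variation
        quartiles = statistics.quantiles(data, n=4)  # 25th, 50th, 75th percentiles

        print(f"mean={mean:.2f}  median={median:.2f}  mode={mode}  "
              f"range={data_range:.1f}")
        print(f"variance={variance:.2f}  std dev={std_dev:.2f}  cv={cv:.2%}")
        print("quartiles:", [f"{q:.2f}" for q in quartiles])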

644      GRAPHICAL REPRESENTATIONS

645      Graphical representations of the data are similar to basic statistical parameters in that they are a
646      means of describing and evaluating data sets. Graphical representations of QC-sample results
647      used to evaluate project-specific control limits and warning limits derived from the MQO criteria
648      are discussed in Appendix C. Graphical representations of field data over space or time offer the
649      additional benefit of revealing temporal and spatial patterns, trends, and
650      correlations. Graphical depictions are also an excellent means of communicating and
651      archiving information.
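
For illustration, the sketch below (assuming the third-party matplotlib library is available;
the results and action level are hypothetical) plots field data in collection order so that a
temporal trend can be seen at a glance.

        import matplotlib.pyplot as plt

        # Hypothetical monthly results (Bq/L) in collection order
        results = [4.8, 5.0, 5.3, 5.1, 5.6, 5.9, 6.2, 6.0, 6.4, 6.7]
        months = range(1, len(results) + 1)

        plt.plot(months, results, marker="o", label="measured result")
        plt.axhline(7.0, color="red", linestyle="--", label="action level (assumed)")
        plt.xlabel("Month")
        plt.ylabel("Concentration (Bq/L)")
        plt.title("Hypothetical time series of field data")
        plt.legend()
        plt.show()
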
652     REVIEW AND VERIFY TEST ASSUMPTIONS

653     Statistical tests are the mathematical structure that will be employed to evaluate the project's data
654     in terms of the project decision, question, or parameter estimate. Statistical tests are not
655     universally applicable, and their choice and suitability are based on certain assumptions. For
656     example:

657       • Some tests are suitable for "normal" distributions, while others are designed for other types of
658        distributions.

659       • Some tests assume that the data are random and independent of each other.

660       • Assumptions that underlie tests for "outliers" should be understood to ensure that hot spots or
661        the high concentrations symptomatic of skewed distributions (e.g., lognormal) are not
662        incorrectly censored.

663       • Assumptions are made regarding the types of population distributions whenever data are
664        transformed before being subjected to a test.

665       • Assumptions of test robustness need to be reviewed in light of the analyte. For example,
666        radiological data require statistical tests that can accommodate positive and negative numbers.

667     It is important that a knowledgeable person identify all assumptions that underlie the chosen
668     statistical tests, and that the data are tested to ensure that the assumptions are met. If any of the
669     assumptions made during planning prove to be untrue, the assessment team should evaluate
670     the appropriateness of the selected statistical tests. Any decision to change statistical tests should
671     be documented in the DQA report.
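
As one illustration of verifying a test assumption, the sketch below (assuming the third-party
SciPy library is available; the data are hypothetical) applies the Shapiro-Wilk test for
normality before a normal-theory test is selected.

        from scipy import stats

        # Hypothetical results (Bq/kg)
        data = [8.2, 9.5, 7.7, 10.1, 9.4, 8.8, 11.4, 9.0, 8.5, 9.9]

        # Shapiro-Wilk test; the null hypothesis is that the data were drawn
        # from a normal distribution
        statistic, p_value = stats.shapiro(data)

        if p_value < 0.05:
            print(f"p = {p_value:.3f}: normality rejected; consider a "
                  "non-parametric test or a different distributional model")
        else:
            print(f"p = {p_value:.3f}: the data are consistent with normality")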

672     APPLYING STATISTICAL TESTS

673     The chosen statistical tests will be a function of the data properties, statistical parameter of
674     interest, and the specifics of the decision or question. For example, the choice of appropriate tests
675     will vary according to whether the data are continuous or discrete; whether the tests will be
676     one-tailed or two-tailed; whether a population is being compared to a standard or to a
677     second population; and whether stratified sampling or simple random sampling was employed.
678     Once the statistical tests are deemed appropriate, they should be applied to the data by an
679     assessor who is familiar with statistics. The outputs from applying the statistical tests and
680     comparisons to project DQOs are discussed in the following section.
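
To make this concrete, the following sketch (assuming a recent version of the third-party
SciPy library, which supports the "alternative" argument; the data and action level are
hypothetical) applies a one-sample, one-tailed t-test of whether the mean concentration is
below an action level.

        from scipy import stats

        # Hypothetical survey-unit results (Bq/kg) and assumed action level
        data = [31.5, 28.9, 33.2, 30.4, 29.8, 32.1, 27.6, 30.9]
        action_level = 35.0

        # One-tailed, one-sample t-test of H0: mean >= action level
        # against H1: mean < action level
        t_stat, p_value = stats.ttest_1samp(data, action_level, alternative="less")

        alpha = 0.05  # tolerable Type I decision error rate from planning
        if p_value < alpha:
            print(f"p = {p_value:.4f}: conclude the mean is below the action level")
        else:
            print(f"p = {p_value:.4f}: cannot conclude the mean is below "
                  "the action level")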

681     9.6.4.2 Evaluation of Decision Error Rates

682     The heterogeneity of the material being sampled and the imprecision of the sampling and
683     analytical processes generate uncertainty in the reported data and in the associated decisions and
684     answers. The project planning team, having acknowledged this decision uncertainty, will have
685     chosen "tolerable decision error rates" during the planning process, balancing resource
686     costs against the risk of making a wrong decision or arriving at a wrong answer. During this final
687     step of the DQA process, the assessment team will use the project's tolerable decision error
688     rates as the metric of success.

689     The DQA process typically corrects data for known biases and then subjects the data to the
690     appropriate statistical tests to make a decision, answer a question, or supply an estimate of a
691     parameter. The assessment team will compare statistical parameters—such as the sample mean
692     and sample variance estimates employed during the planning process—to those that were
693     actually obtained from sampling. If the observed distribution is different, if the mean is closer to the
694     action level, or if the variance is greater or less than estimated, any of these factors could
695     affect the certainty of the decision. The assessment team also will review the results
696     of the statistical tests in light of missing data, outliers, and rejected data. The results of the
697     statistical tests are then evaluated in terms of the project's acceptable decision error rates. The
698     assessment team determines whether a decision can be made within the project-specified
699     decision error rates and, if it cannot, why not.
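
As a rough illustration of re-examining decision error rates, the sketch below (Python
standard library only; all values are hypothetical) recomputes the power of a one-sided test
for the mean using the standard deviation actually observed, so that the achieved Type II
error rate can be compared with the rate assumed during planning.

        import statistics

        # Hypothetical planning inputs and observed statistics
        alpha = 0.05          # tolerable Type I error rate from planning
        action_level = 35.0   # Bq/kg
        true_mean = 30.0      # mean at which the planned power was specified
        n = 16                # number of samples actually collected
        observed_sd = 6.5     # standard deviation observed in the data

        # Normal-approximation power of a one-sided test of
        # H0: mean >= action level, evaluated at the assumed true mean
        z_alpha = statistics.NormalDist().inv_cdf(1 - alpha)
        shift = (action_level - true_mean) / (observed_sd / n ** 0.5)
        power = statistics.NormalDist().cdf(shift - z_alpha)

        print(f"achieved power is approximately {power:.2f} "
              f"(Type II error rate approximately {1 - power:.2f})")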

700     In summary, outputs from this step are:

701       • Generated statistical parameters;

702       • Graphical representations of the data set and parameters of interest;

703       • If new tests were selected, the rationale for their selection and the reason the statistical
704        tests specified in the DQA plan were inappropriate;

705       • Results of application of the statistical tests; and

706       • A final determination as to whether the data are suitable for decision making, estimating, or
707        answering questions within the levels of certainty specified during planning.
708     9.7    Data Quality Assessment Report

709     The DQA process concludes with the assessment team documenting the output of the statistical
710     tests and the rationale for whether a decision could be made within the project-specified
711     decision error rates and, if not, why not. The DQA report will document
712     findings and recommendations and include or reference the supporting data and information. The
713     DQA report will summarize the use of the data verification and data validation reports for data
714     sets of concern, especially any data rejected as unusable for the project's decision making. The report
715     also will document the answers to the three DQA questions:

716      • Are the samples representative?
717      • Are the data accurate?
718      • Can a decision be made?

719     Although there is little available guidance on the format for a DQA report, the report should
720     contain, at a minimum:

721      • An executive summary that briefly answers the three DQA questions and highlights major
722        issues, recommendations, deviations, and needed corrective actions;

 • A summary of the project DQOs used to assess data usability, as well as pertinent
724        documentation such as the project plan document, contracts, and SOW;

725      • A listing of those people who performed the DQA;

726      • A summary description of the DQA process, as employed, with a discussion of any deviations
727        from the DQA plan designed during the planning process (the DQA plan  should be appended
728        to the report);

729      • A summary of the data verification and data validation reports that highlights significant
730        findings and a discussion of their impact on data usability (the data verification and data
731        validation reports should be appended to the DQA report);

732      • A discussion of any missing documentation or information and the impact of their absence on
733        the DQA process and the usability of the data;
734      • A thorough discussion of the three DQA questions addressing the details considered in
735        Sections 9.6.2 through 9.6.4 (possible outputs to be incorporated in the report are listed at the
736        conclusion of each of these sections);

737      • A discussion of deviations; sampling, analytical, and data management problems; concerns;
738        action items; and suggested corrective actions (the contents of this section should be
739        highlighted in the executive summary if the project is ongoing and corrections or changes are
740        needed to improve the quality and usability of future data); and

741      • A recommendation or decision on the usability of the data set for the project's decision
742        making.

743     Upon completion, the DQA report should be distributed to the appropriate personnel as specified
744     in the DQA plan and archived along with supporting information for the period of time specified
745     in the project plan document. Completion of the DQA report concludes the assessment phase and
746     brings the data life cycle to closure.

747                             Summary of Recommendations

748        MARLAP recommends that the assessment phase of a project (verification, validation, and
749        DQA processes) be designed during the directed planning process and documented in the
750        respective plans as part of the project plan documents.

751        MARLAP recommends that project objectives, implementation activities, and QA/QC data
752        be well documented in project plans, reports, and records, since the success of the
753        assessment phase is highly dependent upon the availability of such information.

754        MARLAP recommends the involvement of the data assessment specialist(s) on the project
755        planning team during the directed planning process.

756        MARLAP recommends that the DQA process should be designed during the directed
757        planning process and documented in a DQA plan.

758        MARLAP recommends that all sampling design and statistical assumptions be clearly
759        identified in project plan documents along with the rationale for their use.

760     9.8   References

761     9.8.1  Cited Sources

762     American Society for Testing and Materials (ASTM) D6044. 1996. Guide for Representative
763       Sampling and Management of Waste and Contaminated Media.

764     American Society for Testing and Materials (ASTM) D6233. 1998. Standard Guide for Data
765       Assessment for Environmental Waste Management Activities.

766     International Organization for Standardization (ISO). 1993. International Vocabulary of Basic
767       and General Terms in Metrology.

768     MARSSIM. 2000. Multi-Agency Radiation Survey and Site Investigation Manual, Revision 1.
769       NUREG-1575 Rev. 1, EPA 402-R-97-016 Rev. 1, DOE/EH-0624 Rev. 1. August. Available
770       from http://www.epa.gov/radiation/marssim/filesfin.htm.

771     U.S. Army Corps of Engineers (USACE). 1998. Technical Project Planning (TPP) Process.
772       Engineer Manual EM 200-1-2.

773     U.S. Environmental Protection Agency (EPA). 2000. Guidance for the Data Quality Objective
774       Process (EPA QA/G-4). EPA/600/R-96/055, Washington, DC. Available from
775       www.epa.gov/quality1/qa_docs.html.

776     U.S. Nuclear Regulatory Commission (NRC). 1998. A Nonparametric Statistical Methodology
777       for the Design and Analysis of Final Status Decommissioning Surveys. NUREG-1505, Rev. 1.

778     9.8.2  Other Sources

779     American Society for Testing and Materials (ASTM). 1997. Standards on Environmental
780       Sampling, 2nd Edition, ASTM PCN 03-418097-38. West Conshohocken, PA.

781     American Society for Testing and Materials (ASTM) D5956. Standard Guide for Sampling
782       Strategies for Heterogeneous Wastes.

783     American Society for Testing and Materials (ASTM) D6051. 1996. Guide for Composite
784       Sampling and Field Subsampling for Environmental Waste Management Activities.
785      American Society for Testing and Materials (ASTM) D6311. 1998. Standard Guide for
786        Generation of Environmental Data Related to Waste Management Activities: Selection and
787        Optimization of Sampling Design.

788      American Society for Testing and Materials (ASTM) D6323. 1998. Standard Guide for
789        Laboratory Subsampling of Media Related to Waste Management Activities.

790      Taylor, John Keenan. 1987. Quality Assurance of Chemical Measurements. Lewis Publishers,
791        Chelsea, MI. ISBN 0-87371-097-5.

792      U.S. Environmental Protection Agency (EPA). 1994. Guidance for the Data Quality Objective
793        Process, EPA QA/G-4, EPA/600/R-96/055, September.

794      U.S. Environmental Protection Agency (EPA). 1997. Guidance for Quality Assurance Project
795        Plans, EPA QA/G-5, August.