Publication Numbers:

                                 EPA: EPA-505-B-04-900A
                                 DoD: DTIC ADA427785
     Intergovernmental Data Quality Task Force


         Uniform Federal Policy for
       Quality Assurance Project Plans

Evaluating, Assessing, and Documenting Environmental
         Data Collection and Use Programs
            Part 1: UFP-QAPP Manual
                     Final
                   Version 1
                  March 2005

                                    FOREWORD

Part 1 of the Uniform Federal Policy for Quality Assurance Project Plans (the UFP-QAPP Manual)
is a consensus document prepared by the Intergovernmental Data Quality Task Force (IDQTF). It
provides instructions for preparing Quality Assurance Project Plans (QAPPs) for any environmental
data collection operation.  The purpose of the UFP-QAPP Manual is to implement the project-
specific requirements of ANSI/ASQ E4, Quality Systems for Environmental Data and Technology
Programs - Requirements with guidance for use, Section 6 (Part B). The Uniform Federal Policy
for Implementing Environmental Quality Systems (UFP-QS) was developed by the IDQTF to
implement Section 5 (Part A) of ANSI/ASQ E4.

Though the UFP-QAPP Manual is a consensus policy document, it becomes mandatory for new
QAPP development for any government department, agency, or program that voluntarily adopts the
policy. As is described in the Executive Summary and Introduction, use of the UFP-QAPP Manual
will be phased in over time as contracts allow and new QAPPs are required.

Although this UFP-QAPP Manual was initially written for hazardous waste programs and federal
agencies, the IDQTF recognizes that it provides suitable guidance for a wide range of other
environmental data collection activities (e.g., permitting, compliance) and anticipates that the policy
may be adopted by other programs (e.g., water, air) and by private parties as well as by other Federal
agencies, States, and Tribes.  Programs and agencies that voluntarily adopt the policy requirements
of the UFP-QAPP Manual will be noted on the following Web site:
http://www.epa.gov/fedfac/documents/intergov_qual_task_force.htm.

Because adoption of the requirements of this UFP-QAPP Manual by Federal departments, agencies,
or programs is voluntary, failure to use this guidance is not subject to enforcement action.
However, once adopted, the use of this UFP-QAPP Manual and oversight by the adopting Federal
department, agency, or program is required to ensure a consistent approach to QAPP development.
Thomas Dunne                                             Date
Acting Assistant Administrator for Solid Waste and Emergency Response
U.S. Environmental Protection Agency
 Alex Beehler                                               Date
 Assistant Deputy Undersecretary of Defense
 (Environment, Safety, and Occupational Health)
John Spitaleri Shaw                                         Date
Acting Assistant Secretary for Environment, Safety and Health
U.S. Department of Energy

                              TABLE OF CONTENTS

QUESTIONS AND  ANSWERS REGARDING  THE UNIFORM FEDERAL POLICY FOR
QUALITY ASSURANCE PROJECT PLANS (UFP-QAPP MANUAL)	  vii

ACRONYMS	xi

EXECUTIVE SUMMARY	xiii

1.0   INTRODUCTION	1
      1.1 Uniform Federal Policy for Quality Assurance Project Plans	1
          1.1.1      Scope  	1
          1.1.2      Purpose	2
          1.1.3      Organization	3
      1.2 Quality Assurance Project Plans 	3
          1.2.1      Purpose	5
          1.2.2      Types of Quality Assurance Project Plans	5
          1.2.3      Required QAPP Element Groups and the Systematic Planning Process  . . 6
          1.2.4      QAPP Requirements 	8
          1.2.5      Graded Approach	16
          1.2.6      Review and Approval of QAPPs	16
                1.2.6.1     Lead Organization Review and Approval	16
                1.2.6.2     Regulatory Review and Approval of QAPPs	17
          1.2.7      QAPP Implementation and Modification 	17
          1.2.8      QAPP Archiving  	18
      1.3 Roles and Responsibilities	18
          1.3.1      Lead Organization	18
          1.3.2      Project Manager	19
          1.3.3      Project Team  	19
          1.3.4      QAPP Preparation Team/Writer  	20
          1.3.5      Project Personnel	20

2.0   PROJECT MANAGEMENT AND OBJECTIVES ELEMENTS	21
      2.1 Title and Approval Page  	21
      2.2 Document Format and Table of Contents  	22
          2.2.1      Document Control Format	22
          2.2.2      Document Control Numbering System	23
          2.2.3      Table of Contents	23
          2.2.4      QAPP Identifying Information  	24
      2.3 Distribution List and Project Personnel Sign-Off Sheet 	24
          2.3.1      Distribution List	24
          2.3.2      Project Personnel Sign-Off Sheet  	25
      2.4 Project Organization  	26
          2.4.1      Project Organizational Chart	26
          2.4.2      Communication Pathways	27
          2.4.3      Personnel Responsibilities and Qualifications	29
          2.4.4      Special Training Requirements and Certification	30
       2.5 Project Planning/Problem Definition	30
          2.5.1      Project Planning (Scoping) 	30
          2.5.2      Problem Definition, Site History, and Background	35
       2.6 Project Quality Objectives and Measurement Performance Criteria	36
          2.6.1      Development of Project Quality Objectives Using the Systematic Planning
                    Process 	 36
          2.6.2      Measurement Performance Criteria	41
                 2.6.2.1     Precision	43
                 2.6.2.2     Accuracy/Bias  	44
                 2.6.2.3     Sensitivity and Quantitation Limits	45
                 2.6.2.4     Representativeness	47
                 2.6.2.5     Comparability  	49
                 2.6.2.6     Completeness	54
       2.7 Secondary Data Evaluation	54
       2.8 Project Overview and Schedule	58
          2.8.1      Project Overview (Outcome of Project Scoping Activities)	58
          2.8.2      Project Schedule 	62

3.0    MEASUREMENT AND DATA ACQUISITION ELEMENTS	63
       3.1 Sampling Tasks  	64
          3.1.1      Sampling Process Design and Rationale	64
          3.1.2      Sampling Procedures and Requirements	66
                 3.1.2.1     Sample Collection Procedures	67
                 3.1.2.2     Sample Containers, Volume, and Preservation	67
                 3.1.2.3     Equipment/Sample  Containers Cleaning  and  Decontamination
                           Procedures 	67
                 3.1.2.4     Field Equipment Calibration, Maintenance, Testing, and Inspection
                           Procedures 	68
                 3.1.2.5     Sampling Supply Inspection and Acceptance Procedures	68
                 3.1.2.6     Field Documentation Procedures	69
       3.2 Analytical Tasks   	69
          3.2.1      Analytical SOPs	71
          3.2.2      Analytical Instrument Calibration Procedures	71
          3.2.3      Analytical Instrument and Equipment Maintenance, Testing and Inspection
                    Procedures 	72
          3.2.4      Analytical Supply Inspection and Acceptance Procedures	72
       3.3 Sample Collection Documentation, Handling, Tracking, and Custody Procedures  . 73
          3.3.1      Sample Collection Documentation  	73
          3.3.2      Sample Handling and Tracking System	73
                3.3.2.1    Sample Handling	74
                3.3.2.2    Sample Delivery	74
          3.3.3     Sample Custody	75
      3.4 Quality Control Samples 	75
          3.4.1     Sampling Quality Control Samples	78
          3.4.2     Analytical Quality Control Samples 	78
      3.5 Data Management Tasks 	88
          3.5.1     Project Documentation and Records	88
          3.5.2     Data Package Deliverables  	90
                3.5.2.1    Sample  Collection  and   Field  Measurements  Data  Package
                          Deliverables	90
                3.5.2.2    On-site Analysis Data Package Deliverables	94
                3.5.2.3    Off-site Laboratory Data Package Deliverables 	94
          3.5.3     Data Reporting Formats	95
          3.5.4     Data Handling and Management	96
          3.5.5     Data Tracking and Control  	97

4.0   ASSESSMENT AND OVERSIGHT ELEMENTS	99
      4.1 Assessment and Response Actions 	99
          4.1.1     Planned Assessments	100
          4.1.2     Assessment Findings and Corrective Action Responses	102
      4.2 QA Management Reports  	103
      4.3 Final Project Report	104

5.0   DATA REVIEW ELEMENTS  	107
      5.1 Overview	107
      5.2 Data Review Steps	111
          5.2.1     Step I: Verification	111
          5.2.2     Step II: Validation	114
                5.2.2.1    Step IIa Validation Activities   	116
                5.2.2.2    Step IIb Validation Activities   	118
          5.2.3     Step III: Usability Assessment	119
                5.2.3.1    Data Limitations and Actions from Usability Assessment  .... 120
                5.2.3.2    Activities 	125
      5.3 Streamlining Data Review  	126
          5.3.1     Data Review Steps To Be Streamlined 	127
          5.3.2     Criteria for Streamlining Data Review 	128
          5.3.3     Amounts and Types of Data Appropriate for Streamlining	129

REFERENCES  	131
GLOSSARY OF QUALITY ASSURANCE AND RELATED TERMS  	135
APPENDIX A - STANDARD OPERATING PROCEDURES 	  A-1

                                 LIST OF TABLES

Table 1.   Comparison of UFP-QS and UFP-QAPP Systematic Planning Process	7
Table 2.   QAPP Requirement Summary	9
Table 3.   Tracking of QAPP Requirements: Example Crosswalk to Other Project Documents 12
Table 4.   Recommended Types and Frequencies of Sampling QC Samples for Chemical
          Data Collection  	76
Table 5.   Recommended Types and Frequency of Analytical QC Samples for Chemical Data
          Collection	77
Table 6.   Information Derived from Quality Control Samples	79
Table 7.   Example Analytical Data Deliverable Elements  	91
Table 8.   Data Review Process Summary	109
Table 9.   Example Inputs to Data Review Process	112
Table 10.  Step IIa Validation Activities  	117
Table 11.  Step IIb Validation Activities  	118
Table 12.  Considerations for Usability Assessment 	  125

                                 LIST OF FIGURES

Figure 1. Life Cycle of a QAPP 	4
Figure 2. QAPP Process Elements 	7
Figure 3. Distribution List  	25
Figure 4. Project Personnel Sign-Off Sheet	25
Figure 5. Example Project Organizational Chart	26
Figure 6. Communication Pathways	27
Figure 7. Personnel Responsibilities and Qualifications  	30
Figure 8. Special Personnel Training Requirements 	30
Figure 9. Example Data Needs Worksheet - Compliance Perspective  	32
Figure 10. Example Data Needs Worksheet - Remedy Perspective	33
Figure 11. Example Data Needs Worksheet-Risk Assessment Perspective	34
Figure 12. Project Scoping Session Participants Sheet	35
Figure 13. Systematic Planning Process  	37
Figure 14. Example Measurement Performance Criteria  	42
Figure 15. Relationships to Project Quantitation Limits	48
Figure 16. Example Data Comparison Flow Diagram and Criteria for Results from Two
          Different Laboratories Analyzing Individual Aqueous Split Samples (generated
          using equivalent analytical methods and achieving equivalent QLs)  	51
Figure 17. Example Comparability Determination	53
Figure 18. Secondary Data Evaluation Process	55
Figure 19. Example Secondary Data Criteria and Limitations  	56
Figure 20. Example Summary of Project Tasks 	59
Figure 21. Example Reference Limits and Evaluation 	61
Figure 22. Project Schedule/Timeline	62
Figure 23. Example Sampling Locations and Methods/SOP Requirements	65
Figure 24. Analytical SOP Requirements  	66
Figure 25. Field Quality Control Sample Summary 	66
Figure 26. Project Sampling SOP References	67
Figure 27. Field Equipment Calibration, Maintenance, Testing, and Inspection Procedures . . 68
Figure 28.  Analytical SOP References  	71
Figure 29.  Analytical Instrument Calibration 	72
Figure 30.  Analytical Instrument and Equipment Maintenance, Testing, and Inspection .... 72
Figure 31.  QC Samples  	88
Figure 32.  Project Documents and Records	90
Figure 33.  Analytical Services	95
Figure 34.  Planned Project Assessments	101
Figure 35.  Assessment Findings & Corrective Action Responses  	103
Figure 36.  QA Management Reports	105
Figure 37.  Data Review Process  	110
Figure 38.  Example Verification (Step I) Process   	114
Figure 39.  Validation (Steps IIa and IIb) Process	115
Figure 40.  Example Validation Summary (Steps IIa and IIb)	116
Figure A-1. Sample Handling System	 A-5

 QUESTIONS AND ANSWERS REGARDING THE UNIFORM FEDERAL POLICY FOR
           QUALITY ASSURANCE PROJECT PLANS (UFP-QAPP MANUAL)

I.    Background

Q. 1  What is a Quality Assurance Project Plan (QAPP)?

A.1  A QAPP is a formal document describing in comprehensive detail the necessary quality
     assurance (QA), quality control (QC), and other technical activities that must be implemented
     to ensure that the results of the work performed will satisfy the stated performance criteria.
     A QAPP presents the steps that should be taken to ensure that environmental data collected
     are of the correct type and quality required for a specific decision or use.  It presents an
     organized and systematic description of the ways in which QA and QC  should be applied to
     the collection and use of environmental data. A QAPP integrates technical and quality control
     aspects of a project throughout its life cycle, including planning, implementation, assessment,
     and corrective actions.  ANSI/ASQ E4 Section 6 (Part B) requires that a QAPP be approved
     for all data collection projects.  While this UFP-QAPP Manual uses the term  QAPP, the
     information may be incorporated into other planning documents, such as a Sampling and
     Analysis Plan (SAP), Work Plan, and Field Sampling Plan.  This Manual focuses on the
     required content of any such document, regardless of its name.

Q.2  What is the Intergovernmental Data Quality Task Force (IDQTF)?

A.2  The IDQTF consists of representatives from the U.S. Environmental Protection Agency (EPA),
     the Department of Defense (DoD), and the Department of Energy (DOE). It was  established
     to address real and perceived inconsistencies and deficiencies in quality control for laboratory
     data, within and across governmental organizations, that result in greater costs, time delays,
     and increased risk. The task force is working to ensure that environmental data are of known
     and documented quality and suitable for their intended uses. It is chaired by the Director of the
     Federal Facilities Restoration and Reuse Office  (FFRRO) and operates as a partnership,
     reaching decisions through consensus.

Q.3  What is the purpose of the UFP-QAPP Manual?

A.3  The Manual's  purpose is  to act as a single national consensus guidance document for
     implementing the requirements of ANSI/ASQ E4, Quality Systems for Environmental Data
      and Technology Programs - Requirements with guidance for use, Section 6 (Part B),
     consistently and systematically across the Federal agencies involved in the IDQTF (currently
     EPA, DoD, and DOE). Although ANSI/ASQ E4 Section 6 establishes standards describing the
     essential elements of a QAPP, it lacks sufficient detail to promote the degree of consistency
     needed to address the issues of common expectations, conflict, and rework.

Q.4  Will  compliance with  the UFP-QAPP Manual meet the requirements  of QA/R-5, EPA
     Requirements for Quality Assurance Project Plans and QA/G-5, EPA Guidance for Quality
      Assurance Project Plans?

A.4  The UFP-QAPP Manual is consistent with the QAPP requirements outlined in Chapter 5 of
      the EPA Quality Manual 5360 A1, EPA Quality Manual for Environmental Programs. The
     EPA Quality Staff recognizes that adherence to the Manual will result in compliance with both
      QA/R-5 and QA/G-5 for environmental data collection efforts under CERCLA and RCRA at
      Federal facilities.

 Q.5  Why do Federal agencies need another QAPP guidance document?

 A.5  Because approaches and requirements for QAPPs differ among Federal agencies, the IDQTF
      believes it is necessary to implement a QAPP guidance that is applicable to  any Federal
      department, agency, or program. This UFP-QAPP Manual was developed by the IDQTF to
      provide a common organizational framework and approach to QAPPs. It will reduce conflict
      by providing all who are involved at Federal facilities with a common set of guidelines and
      expectations.

 II.   Basis of the IDQTF UFP-QAPP Manual

Q.6  What is the basis of the UFP-QAPP Manual?

A.6  The basis of the UFP-QAPP Manual is the ANSI/ASQ E4 Section 6 (Part B) requirements.
     However, because ANSI/ASQ E4 Section 6 lacks sufficient detail, the IDQTF considered other
     existing QAPP guidance and selected the Region 1 (New England) QAPP guidance as the point
     of departure for creation of the UFP-QAPP Manual. The Region 1  QAPP guidance has
     significant breadth of coverage, level of detail, and structured implementation tools, which the
     IDQTF believes helps to minimize inconsistencies among QAPPs and makes them easier to
     review (and therefore quicker and  cheaper). However, the Region 1 QAPP guidance  was
     significantly modified to make the UFP-QAPP Manual applicable across all Federal agencies,
     including all EPA Regions.

Q.7  Why wasn't EPA Requirements for Quality Assurance Project Plans (EPA QA/R-5) or EPA
     Guidance for Quality Assurance Project Plans (QA/G-5) used as the base document?

A.7  Although the UFP-QAPP Manual is consistent with both EPA QA/R-5 and QA/G-5, it provides
     a greater level of detail and more implementation tools than either of those documents. In
     addition, QA/R-5 applies specifically to EPA-funded projects and often uses EPA-specific
     language and processes. QA/G-5 is a broad guidance document that lacks the specificity and
     the implementation tools that the IDQTF believes will make this Federal consensus document
     so useful.

Q.8  Doesn't the use of the Contract Laboratory Program (CLP) provide sufficient quality assurance
     for environmental data by ensuring data of known and documented quality?

A.8  No. The CLP provides a series of contract specifications, in the form of a statement of work,
     that covers laboratory services purchased under specific contracts for Superfund sites. The CLP
     also provides guidelines for evaluating laboratory conformance to its contract specifications;
     however, it does not address any of the data usability requirements and therefore does not
     provide assurance that collected data are appropriate for their intended uses. Many
     environmental programs are not covered by the CLP, and many aspects of environmental data
     collection (e.g., the systematic planning process, sampling activities, QA oversight) are
     outside its scope. The CLP does not address overall quality systems.

III.  Implementation Issues

Q.9  How will the UFP-QAPP Manual be used and implemented?

A.9  The UFP-QAPP Manual is expected to be used to develop QAPPs for managing the collection
     and use of environmental data at Federal facilities. Each  participating Federal department,
     agency, or program will develop its own implementation  plans that recognize the contracts
     through which the UFP-QAPP Manual will be implemented, the status of previously approved
     QAPPs, and the stage of the data collection effort.

     Since the vast majority of these QAPPs will be generated by contractors, implementation of the
     UFP-QAPP Manual will be, at least in part, through contracts. Federal Acquisition Regulations
     (FARs) already require, when appropriate, that contractors maintain high-level quality
     standards with a quality system based on existing standards, such as E4.

     Part 2 of the UFP-QAPP provides three tools to assist in implementation of the UFP-QAPP
     Manual: Part 2A, the QAPP Workbook; Part 2B, the QA/QC Compendium: Minimum QA/QC
     Activities; and Part 2C, the Example QAPPs. The QA/QC Compendium outlines minimum
     QA/QC activities that should be included in a QAPP for all CERCLA projects. These minimum
     QA/QC activities, although developed for the CERCLA  process,  are transferable to other
     environmental data collection and  use programs. In addition, the IDQTF is developing a
     training program to facilitate consistent implementation of the UFP-QAPP Manual.

Q.10 What is the timeframe for implementation?

A.10 Because DoD components and DOE offices each have unique contracting practices, each
     agency will need to determine its strategy and timeframe for implementation. Implementation
     could be conducted in phases as existing contracts expire and new ones are instituted.

Q.11 What if I already have an approved QAPP? Will I have to totally rewrite it to comply with the
     UFP-QAPP Manual?

A.11 No. Previously written and approved QAPPs will not require revision. The UFP-QAPP Manual
     is aimed at future data collection  efforts.  Approved project-specific QAPPs  will remain
     acceptable for ongoing data collection activities until the projects are complete.

     The UFP-QAPP Manual requires that all QAPPs be reviewed every 5 years and updated if
     necessary. This requirement applies to both generic and project-specific QAPPs. If revisions
     are necessary after the 5-year review, the revised QAPP must comply with the UFP-QAPP
     Manual. The implementation plan for each participating Federal department, agency, or
     program should specify exactly how and when QAPPs will be revised.

Q.12 What if I have an approved generic basewide or facilitywide QAPP?

A.12 Generic QAPPs are written to address elements of data collection that generally don't change
     from site to  site or activity to activity. They are always  supplemented by project-specific
     QAPPs, standard operating procedures (SOPs), sampling and analysis plans (SAPs), and field
     sampling and analysis plans (FSAPs) that address issues that cannot be addressed by the generic
     QAPP. The UFP-QAPP Manual specifically  allows cross-referencing to other documents that
      contain relevant information. Approved generic QAPPs should not be discarded, but rather
     should be referenced  in appropriate parts of the project-specific QAPP to help  reduce
     redundancies and create a more focused document.

Q.13 If the UFP-QAPP Manual is the required guidance document for developing QAPPs, what
     happens if an agency fails to follow it? Will noncompliance result in a Notice of Violation?

A.13 Failure to implement the UFP-QAPP Manual will not subject an agency to a Notice of
     Violation because implementation of the Manual is voluntary and would apply only under
     future intergovernmental MOUs. Because the Manual was not developed or promulgated
     through the Federal rule-making process, its requirements do not have the force of regulation
     and are not subject to regulatory enforcement or a Notice of Violation. The purpose of the UFP-
     QAPP Manual is to assist project teams in creating consistent, high-quality, and easy-to-review
     documents. Each  department, agency,  or program must develop its  own procedures for
     assessing nonconformance and initiating appropriate corrective action. Notice of Violation
     would be given only in circumstances  in which two parties have chosen to make the use
     of the UFP-QAPP Manual part of an enforceable agreement (such as a Federal Facilities
     Agreement). In general, the consequences of not using the UFP-QAPP Manual will be
     continuation of the conflicts and rework that currently permeate the data collection and review
     process.

                                    ACRONYMS

AA           Atomic absorption
AFCEE       Air Force Center for Environmental Excellence
ANSI/ASQ    American National Standards Institute/American Society for Quality
ASTM         American Society for Testing and Materials
BOD         Biological oxygen demand
CERCLA     Comprehensive Environmental Response, Compensation, and Liability Act of 1980
CLP          Contract Laboratory Program
COC         Contaminants of concern
CRDL        Contract-required detection limit
CWA         Clean Water Act
DoD          Department of Defense
DOE         Department of Energy
DQI          Data quality indicator
DQO         Data quality objective
EPA          Environmental Protection Agency
FS            Feasibility study
FSAP         Field sampling and analysis plan
GC           Gas chromatograph
GC/MS       Gas chromatograph/mass spectrometer
GIS           Geographic information system
GPC          Gel permeation chromatography
GPS          Global positioning system
GW           Groundwater
ICP           Inductively coupled plasma
IDQTF        Intergovernmental Data Quality Task Force
LCS          Laboratory control sample
LFB          Laboratory fortified blank
LIMS         Laboratory information management systems
MARLAP     Multi-Agency Radiological Laboratory Analytical Protocols (Manual)
MARSSIM      Multi-Agency Radiation Survey and Site Investigation Manual
MCL         Maximum contaminant level
MDL         Method detection limit
MOU         Memorandum of understanding
MPC          Measurement performance criteria
MQO         Measurement quality objectives
MS/MSD     Matrix spike/matrix spike duplicate
MSR         Management systems review
NEIC         National Enforcement Investigations Center
NIST         National Institute of Standards and  Technology
NPDES       National Pollutant Discharge Elimination System
NPL          National Priorities List
PA/SI         Preliminary assessment/site investigation
PARCC       Precision, accuracy, representativeness, completeness, and comparability
PCBs         Polychlorinated biphenyls
PDF          Portable document format
PT            Proficiency testing (previously known as performance evaluation (PE) sample)
PQOs         Project quality objectives
PRP          Potentially responsible party
QA           Quality assurance
QC           Quality control
QS            Quality system
QAPP         Quality Assurance Project Plan
QL           Quantitation limit
QMP         Quality management plan
RCRA        Resource Conservation and Recovery Act
RI            Remedial investigation
RIC          Reconstructed ion chromatogram
RPD          Relative percent difference
RSD          Relative standard deviation
RT            Retention time
SAP          Sampling and analysis plan
SD            Standard deviation
SDG          Sample delivery group
SDWA        Safe Drinking Water Act
SOP          Standard operating procedure
SQLs         Sample quantitation limits
SRM         Standard reference material
SVOC         Semivolatile organic compound
SW           Surface water
TCLP         Toxicity characteristic leaching procedure
TSA          Technical systems audit
UFP          Uniform Federal Policy
USACE        United States Army Corps of Engineers
VOA         Volatile organic analytes
VSP          Visual Sample Plan

                               EXECUTIVE SUMMARY

Introduction

Part 1 of the Uniform Federal Policy for Quality Assurance Project Plans (the UFP-QAPP Manual),
prepared  by the Intergovernmental Data Quality Task Force (IDQTF), provides instructions for
preparing Quality Assurance Project Plans (QAPPs).1 It is the companion document to the IDQTF's
Uniform Federal Policy for Implementing Environmental Quality Systems (UFP-QS). The UFP-QS
was developed to consistently implement the quality system requirements of ANSI/ASQ E4, Quality
Systems for Environmental Data and Technology Programs - Requirements with guidance for use,
Section 5 (Part A).  Similarly, this UFP-QAPP Manual has been developed to consistently implement
the project-specific requirements in Section 6 of that standard (ANSI/ASQ E4). This Manual requires
that a QAPP be approved for all environmental data collection projects. The QAPP will integrate
technical  and quality control aspects of a project throughout its life cycle, including planning,
implementation, assessment, and corrective actions.  The QAPP will present the steps that will be
taken to ensure that environmental data collected are of the correct type and quality required for a
specific decision or use. It will present an organized and systematic description of the ways in which
quality assurance  (QA) and quality control (QC) will be applied  to the collection and use of
environmental data.

The UFP-QAPP Manual  was developed as a joint initiative between  the U.S.  Environmental
Protection Agency (EPA), the Department of Defense (DoD), and the Department of Energy (DOE).
The purpose of the Manual is to provide a single national consensus document for consistently  and
systematically implementing the project-specific requirements of ANSI/ASQ E4 across the Federal
agencies involved in the IDQTF (currently EPA, DoD, and DOE). It is consistent with EPA's existing
QAPP guidance (QA/G-5) and QAPP requirements (QA/R-5). In addition, implementation of the
UFP-QAPP Manual will help to ensure the quality, objectivity, utility, and integrity of environmental
information that the Federal  government  disseminates as required by  the Information Quality
Guidelines (EPA/260R-02-008, October 2002).

Part 2 of the UFP-QAPP provides three tools to assist in implementing this UFP-QAPP Manual. Part
2A is the QAPP Workbook. Part 2B, the QA/QC Compendium: Minimum QA/QC Activities, outlines
QA/QC activities that should be included in a QAPP for all CERCLA projects. These  activities,
although developed for the CERCLA process, are transferable to other environmental data collection
and use programs. Part 2C presents Example QAPPs.

Background

Many Federal agencies have independently  created their own QAPP guidance. EPA has one QAPP
requirements  document and one guidance  document that encompass all the systematic planning
process elements  that should be addressed in a QAPP.  These documents are QA/R-5, EPA
Requirements for Quality Assurance Project Plans (requirements), and QA/G-5, EPA Guidance for
Quality Assurance Project Plans (guidance). The EPA Quality Staff recognizes that adherence to
the UFP-QAPP Manual  will  result in full compliance  with  both QA/R-5 and  QA/G-5  for
environmental data collection efforts under CERCLA and RCRA at Federal facilities.
 1 This UFP-QAPP Manual is based on guidance developed by EPA Region 1.  It has been modified by a workgroup of
 the IDQTF to reflect EPA, DoD, and DOE comments on the Region 1 guidance, to remove Region 1-specific elements,
 and to address data review approaches identified by members of the IDQTF.

Because approaches to and requirements for QAPPs differ among Federal agencies, the IDQTF
believes it is necessary to implement consistent QAPP requirements that are applicable to any Federal
department, agency, or program that adopts the UFP-QAPP Manual. The Manual provides an
organizational framework and approach to QAPP preparation with a common set of expectations and
guidelines.

Although the States were not involved in the development of this Manual, the IDQTF recognizes the
importance of their role in the review and approval of QAPPs. States are encouraged to review and
approve QAPPs based on the requirements in this Manual.

Scope

This Manual provides Federal departments, agencies, and programs with policy and guidelines for
developing QAPPs for management of environmental data collection and use.  This  document
represents a voluntary consensus policy. Implementation is therefore not subject to oversight by
another Federal department, agency, or program, or to a Notice of Violation if one department,
agency, or program fails to implement all or part of the policy.  However, once the  requirements of
this Manual are adopted by a Federal department, agency, or program, its use is mandatory within
that department, agency, or program.

Each participating Federal department, agency, or program will develop its own implementation plan.
It is anticipated that the use of the Manual will be phased in and that it will be used to develop the
initial and revised versions of QAPPs for managing the collection and use of environmental data at
Federal facilities. It is not intended to apply retroactively to previously approved QAPPs.

Overview of This Manual

The UFP-QAPP Manual covers a variety of topics regarding QAPP preparation, some of which are
often included in other documents (e.g., sampling and analysis plans, work plans), and encompasses
EPA's Systematic Planning Process. Several principles are important for understanding the Manual:

•  Although designed for use in support of hazardous waste programs  (CERCLA and RCRA) at
   Federal facilities, the UFP-QAPP Manual is applicable to any environmental program for which
   field data will be collected and analyzed.

•  Part 2B of the UFP-QAPP outlines minimum QA/QC activities that should be included in all
   CERCLA project QAPPs. These minimum QA/QC activities, although developed for the
   CERCLA process, are transferable to other environmental data collection and use programs.

•  The content and level of detail required for individual QAPPs will vary according to the work
   being performed.  Project planners are encouraged to use a "graded approach"  when preparing
   QAPPs. In other words, the degree of documentation, level of effort, and level of detail will  vary
   based on the complexity of the project.

•  The UFP-QAPP Manual recommends, but does not require, the use of tables and charts (called
   worksheets) to document the requirements of the QAPP. The specific elements of the various
   tables and charts are outlined in the Manual, and templates are provided in the QAPP Workbook
   (Part 2A of the UFP-QAPP). The use of the worksheets is expected to expedite the review of
   QAPPs by an approval authority.

•  The UFP-QAPP Manual is designed to be used to generate both generic and project-specific
   QAPPs. When elements required by the Manual are present in other documents (e.g., SOPs),
   careful cross-referencing of these other documents can be used in lieu of repeating information.

•  Depending on the implementation plan of each Federal department, agency, or program, existing
   QAPPs should not have to be rewritten when the requirements of the UFP-QAPP Manual are
   adopted. The Manual's requirements will be applicable to QAPP revisions and to new QAPPs.

Related IDQTF Documents and Products

Part 2 of the UFP-QAPP provides supplementary materials for use with the UFP-QAPP Manual:

•  Part 2A, "QAPP Workbook," contains blank worksheets that will assist with the preparation of
   QAPPs by addressing specific requirements of the Manual.

•  Part 2B, "Quality Assurance/Quality Control (QA/QC) Compendium:  Minimum QA/QC
   Activities," specifies minimum QA/QC activities for environmental data collection and use for
   hazardous waste projects.

•  Part 2C, "Example QAPPs," provides several example QAPPs that are based on the requirements
   in the Manual. They use the worksheets recommended in this Manual to demonstrate how they
   may be completed to write a QAPP.

Organization of This Manual

This document is organized into five major sections. The introductory section describes the nature
of this policy and provides detail on the overall approach. Each of the subsequent four sections
addresses  one of the four major QAPP element  groups: Project Management and Objectives,
Measurement/Data Acquisition, Assessment and Oversight, and Data Review. In addition, Appendix
A provides additional details for the content of SOPs.

1.0    INTRODUCTION
The complexity of environmental data collection operations demands that a systematic planning
process and structure for quality be  established if decision-makers are to have the necessary
confidence in the quality of data that support their decisions. This process and structure must include
the means to determine whether the data are fully usable and what to do if they are not. This process
and structure are  provided by  the  quality system used by the organization conducting the
environmental data operations.

The lead organization (see Section 1.3.1) must develop, operate, and document quality systems to
ensure that environmental data collected or compiled for environmental programs are scientifically
sound, of known and documented quality, and suitable for their intended use.

In order to assist lead organizations with the implementation of their quality systems, the Uniform
Federal Policy for Implementing Environmental Quality Systems (UFP-QS)  was developed to
facilitate consistent implementation of the quality system requirements of Section 5 (Part A) of
ANSI/ASQ E4, Quality Systems for Environmental Data and Technology Programs - Requirements
with guidance for use (January 1995, superseded by February 2004). Similarly, the Uniform Federal
Policy for Quality Assurance Project Plans (UFP-QAPP) has been developed to facilitate consistent
implementation of the project-specific requirements of Section 6 (Part B) of E4, as well as EPA
Requirements for Quality Assurance Project Plans (EPA QA/R-5, EPA/240/B-01/003, March 2001)
and EPA Guidance for Quality Assurance Project Plans (EPA QA/G-5, EPA/600/R-98/018, February
1998). This Manual is Part 1 of the UFP-QAPP. It provides instructions for the preparation of quality
assurance project plans (QAPPs). Part 2 provides implementation tools for Part  1.

This introduction outlines the purpose and organization of the UFP-QAPP as well as the purpose and
content of a QAPP; the QAPP review, approval, and  modification processes; and the roles and
responsibilities of those involved in QAPP development.

1.1    Uniform Federal Policy for Quality Assurance Project Plans

1.1.1   Scope

This document provides policy and guidelines to Federal departments, agencies, and programs for
developing QAPPs for the management of environmental data collection and use. This document
represents a voluntary consensus policy; therefore, implementation is not subject to oversight by
another Federal department, agency, or program or to a Notice of Violation for failure to implement
all or part of the policy. However, once a Federal department, agency, or program adopts the
requirements of this document, its use is mandatory  within  that department, agency, or
program.

Each participating Federal department, agency, or program will develop its own implementation plan.
It is anticipated that use of the Manual will be phased in as it is used to develop the initial and revised
versions of QAPPs for managing the collection and use of environmental data at Federal facilities.
It is not intended to apply retroactively to previously approved QAPPs.

1.1.2   Purpose

This UFP-QAPP Manual is intended to provide instructions for QAPP preparation in accordance with
Section 6 (Part B) of ANSI/ASQ E4. Once adopted, the requirements presented in this UFP-QAPP
Manual must be adhered to by the lead organization and its contractors when developing new QAPPs
that guide the performance of environmental data collection operations. The requirements of the
UFP-QAPP Manual must also be adhered to by regulatory entities collecting environmental data for
oversight purposes. In addition, review and approval of QAPPs by EPA must be in accordance with
the requirements of this UFP-QAPP Manual.

This UFP-QAPP Manual is not program-specific and is intended to be as comprehensive as possible.
Since the content and level of detail in QAPPs vary according to the work being performed and the
intended use of the data,  parts of this UFP-QAPP Manual may not be applicable to all programs.
However, each of the sections and subsections in the Manual must be addressed in the QAPP to the
degree appropriate for the data collection activity, even if only by the statement "not applicable."
To the extent practicable, information should be provided in tabular format. However, sufficient
written discussion should accompany the tables to facilitate understanding.

To assist  in compiling critical QAPP information, Part 2 of this  UFP-QAPP provides three
supplemental documents:

•  Part 2A, "QAPP Workbook," provides blank worksheets.
•  Part 2B, "Quality Assurance/Quality  Control Compendium: Minimum QA/QC Activities,"
   outlines QA/QC activities that should be included in a  QAPP for all CERCLA projects.
•  Part 2C, "Example QAPPs," provides examples of completed worksheets and shows how to
   fulfill the requirements of this UFP-QAPP Manual.

The QAPP worksheets can be taken to project scoping sessions and completed during the project
planning stage. Subsequently, the worksheet information can be presented in tabular format in the
QAPP. The worksheets are designed to ensure consistent content and presentation of information in
a project-specific QAPP. Use of a consistent format for QAPPs is expected to streamline the review
of QAPPs by regulators and  others. If the QAPP worksheets are not used, information required by
the worksheets must still be presented in the QAPP, as appropriate to the project.

1.1.3   Organization
The remainder of this UFP-QAPP Manual is organized in accordance with the four elements of a
QAPP (see Section 1.2.3):

•  Project Management and Objectives
•  Measurement and Data Acquisition
•  Assessment and Oversight
•  Data Review

Appendix A to this Manual provides additional guidance for Standard Operating Procedures.

1.2    Quality Assurance Project Plans

The QAPP integrates all  technical and quality aspects for the life cycle of the project, including
planning, implementation, and assessment. The ultimate success of an environmental program or
project depends on the quality of the environmental data collected and used in decision-making, and
this quality depends significantly on the adequacy of the QAPP and on its effective implementation.
The QAPP documents how quality assurance and quality control are applied to an environmental data
collection operation to ensure that the results obtained will satisfy the stated performance criteria.
It is important to note that quality assurance and quality control are defined and used differently.
Quality assurance refers to the system of management activities, whereas quality control refers to the
system of technical activities that measure performance against defined standards.
All QAPPs must, at a minimum, address all elements detailed in this UFP-QAPP Manual. In some
cases, certain elements will not be appropriate for a particular project. Requirements of the Manual
that do not apply can be addressed with a simple statement of why the information is not relevant
or with a cross-reference to another approved document in which the information appears.

The QAPP document may be referred to by another name or incorporated into other project planning
documents. The document for some programs or projects may be referred to as a Sampling and
Analysis Plan (SAP), Work Plan, Field Sampling Plan, etc. This UFP-QAPP Manual focuses on the
required content of any such document.
A QAPP is often subject to regulatory review and approval by EPA or the other appropriate approval
authority  (including, but not limited to, EPA-delegated approval authorities) prior to  sample
collection.
Figure 1 presents the life cycle of a QAPP.

                                  Environmental Problem
                                  Assemble Project Team
                          Schedule and Conduct Scoping Sessions
             Plan Project and Compile Information Required by QAPP Worksheets
                         Prepare Project-Specific or Generic QAPP
                                 Perform Internal Review
                           Submit QAPP for External Approval
                 Revise QAPP as Required and Submit for External Approval
                                *** QAPP APPROVED ***
                              Implement QAPP as Prescribed
                  Amend QAPP as Needed To Address Unexpected Conditions
     Submit Amendments for Approval or Obtain and Document Verbal or Electronic Approval
                       Modify Project Work After Approval Received
                        Archive QAPP in Project or Program File
  Review QAPP Annually. Revise QAPP if Necessary, When Directed by the Approval Authority
                             or, at a Minimum, Every 5 Years

                              Figure 1.  Life Cycle of a QAPP
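
For teams that track QAPP status electronically, the life cycle in Figure 1 can be represented as an
ordered checklist with a reminder for the periodic review requirement. The following minimal Python
sketch is illustrative only and is not part of this policy; the step names paraphrase Figure 1, and the
QappStatus class and its methods are hypothetical.

    from dataclasses import dataclass, field
    from datetime import date
    from typing import List, Optional

    # Ordered steps paraphrased from Figure 1; the data structure is illustrative,
    # not a requirement of the UFP-QAPP Manual.
    LIFE_CYCLE_STEPS: List[str] = [
        "Assemble project team",
        "Schedule and conduct scoping sessions",
        "Plan project and compile QAPP worksheet information",
        "Prepare project-specific or generic QAPP",
        "Perform internal review",
        "Submit QAPP for external approval",
        "Revise QAPP as required and resubmit",
        "Implement QAPP as prescribed",
        "Amend QAPP as needed and document approval of amendments",
        "Archive QAPP in project or program file",
    ]

    @dataclass
    class QappStatus:
        """Hypothetical tracker for where a QAPP sits in the Figure 1 life cycle."""
        approval_date: Optional[date] = None
        completed: List[str] = field(default_factory=list)

        def complete_step(self, step: str) -> None:
            # Record a finished life-cycle step, rejecting names not in Figure 1.
            if step not in LIFE_CYCLE_STEPS:
                raise ValueError(f"Unknown life-cycle step: {step}")
            self.completed.append(step)

        def five_year_review_due(self, today: date) -> bool:
            # The Manual calls for annual review and for revision, if necessary,
            # at least every 5 years after approval.
            if self.approval_date is None:
                return False
            return (today - self.approval_date).days >= 5 * 365

    # Example: a QAPP approved in March 2005 is flagged for its 5-year review by 2010.
    status = QappStatus(approval_date=date(2005, 3, 31))
    print(status.five_year_review_due(date(2010, 6, 1)))  # True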

1.2.1    Purpose
The purpose of a QAPP is to document the planned activities for environmental data collection
operations and to provide a project-specific "blueprint" for obtaining the type and quality of
environmental  data needed  for  a specific  decision or  use. The planning  should include the
"stakeholders" (e.g., data users, data producers, decision-makers) to ensure that all needs are defined
adequately and that the planning for quality addresses those needs. While time spent on such planning
may initially seem unproductive and costly, the penalty for ineffective planning often  is greater
conflict and extensive reworking, which results in increased cost and lost time.

The QAPP serves several purposes:

•   As a technical planning document,  it identifies the purpose of the project, defines the project
    quality objectives, and  outlines the  sampling, analytical,  and quality assurance/quality control
    (QA/QC) activities that will be used to support environmental decisions.
•   As  an organizational document,  it  identifies key  project personnel,  thereby facilitating
    communication.
•   As an assessment and oversight document, it provides the criteria for assessment of project
    implementation and for QA and contractor oversight.
   QAPPs and Quality Management Plans

   The Uniform Federal Policy for Implementing Environmental Quality Systems requires documentation of an
   organization's quality system in a quality management plan (QMP). A QMP is a formal document that describes
   the quality system in terms of the organization's structure, the functional responsibilities of management and staff,
   the lines of authority, and the required interfaces for those planning, implementing, and assessing all activities
   conducted. Organizations participating in the project (e.g., Federal agency, prime contractor, laboratory) must
   have a QMP or some other documentation of a quality system. The management, organization, and personnel
   responsibilities outlined in the QAPP should be consistent with that quality system.
1.2.2  Types of Quality Assurance Project Plans

QAPPs can be of two types:

 •    A generic QAPP is an overarching plan that describes the quality objectives and documents the
     comprehensive set of standard operating procedures (SOPs) for sampling, analysis, QA/QC, and
     data review that are specific to a site (e.g., facility, base) or to an activity (e.g., compliance with
     an environmental program such as Safe Drinking Water Act, repetitive groundwater monitoring).
     A generic  QAPP  may  be applicable to  a single  site with  multiple activities (e.g., soil,
     groundwater,  and surface water sampling) or to a single activity that will be implemented at
     multiple sites  (e.g., same type of air monitoring at several Air Force bases) or at multiple times
   (e.g., a groundwater monitoring program that will sample the same locations every 3 months for
   5 years).

   A generic program QAPP may serve as an umbrella under which project-specific tasks are
   conducted over an extended period of time. Project or task-specific information not covered by
   the umbrella is documented in detailed sampling and analysis plans (SAPs) or work plans, which
   use the generic QAPP as an informational reference whenever appropriate. The use of generic
    QAPPs, with supplemental project-specific QAPPs as needed, is a significant opportunity to use
   a graded approach, reducing repetition and streamlining the QAPP development,  review, and
   approval process (see Section 1.2.4).

   When  a generic  QAPP is being developed that will apply across multiple EPA Regions or
   regulatory approval authorities, the scoping process must involve those  entities  early in the
   development of the QAPP. Receiving input early will help streamline review and approval of
   the generic QAPP.

•  A project-specific QAPP provides a QA blueprint specific to one project or task. Project-specific
   QAPPs are used  for projects of limited scope and time and, in general, can be considered the
   SAP or work plan for the project. A project-specific QAPP for each site or activity may be
   needed to supplement a generic QAPP.

1.2.3   Required QAPP Element Groups and the Systematic Planning Process

There are four basic element groups addressed in a QAPP: Project Management and Objectives
(Section 2), Measurement/Data Acquisition (Section 3), Assessment/Oversight (Section 4), and Data
Review (Section 5). As shown in Figure 2, the four QAPP element groups represent the pieces of
a project's life cycle, which are integrated through the use of scoping sessions.

                            Figure 2. QAPP Process Elements
     (the four element groups, Project Management and Objectives, Measurement/Data
     Acquisition, Assessment/Oversight, and Data Review, integrated through scoping sessions)

The four basic element groups of a QAPP present a framework consistent with EPA Requirements
for Quality Assurance Project Plans (EPA QA/R-5), which requires use of a systematic planning
process.  The inconsistencies in language between the UFP-QS and the UFP-QAPP Manual are due
to the different purposes of the documents. The UFP-QS, which outlines a six-step process, serves
as a high-level policy document for implementing quality systems, as defined in ANSI/ASQ E4.
Thus, it describes the systematic planning process at a conceptual level. This UFP-QAPP Manual
is an implementation guide that significantly expands on the UFP-QS.  Table 1 identifies how the
four QAPP element groups outlined in the UFP-QAPP Manual and the detailed systematic planning
process shown in Figure 13 (Section 2.6.1) address the six planning process elements outlined in the
UFP-QS Section 7.2.
        Table 1. Comparison of UFP-QS and UFP-QAPP Systematic Planning Process

 UFP-QS systematic planning process element: Establishment of a Team-Based Approach to Planning
      QAPP element group: Project Management and Objectives
      UFP-QAPP systematic planning process steps: ID Lead Organization, Approval Authority, and
      Project Team; ID Project Organization and Responsibilities; Schedule and Convene Scoping
      Sessions (Sections 2.3, 2.4)

 UFP-QS systematic planning process element: Description of the Project Goal, Objectives, and
 Questions and Issues To Be Addressed
      QAPP element group: Project Management and Objectives
      UFP-QAPP systematic planning process steps: Define Environmental Problem (Sections 2.5, 2.6,
      2.7)

 UFP-QS systematic planning process element: Identification of Project Schedule, Resources
 (Including Budget), Milestones, and Any Applicable Requirements
      QAPP element group: Project Management and Objectives
      UFP-QAPP systematic planning process steps: Develop Project Schedule (Section 2.8)

 UFP-QS systematic planning process element: Matching of the Data Collection and Analysis Process
 to Project Objectives
      QAPP element group: Measurement/Data Acquisition
      UFP-QAPP systematic planning process steps: Determine the "Type" of Data Needed; Determine
      the "Quality" of Data Needed; Determine the "Quantity" of Data Needed; Develop Sampling
      Design Rationale (Sections 3.1.1, 3.4)

 UFP-QS systematic planning process element: Identification of Collection and Analysis Requirements
      QAPP element group: Measurement/Data Acquisition
      UFP-QAPP systematic planning process steps: Determine Sampling Requirements; Select Sampling
      SOPs that have Documented QC Limits Supporting the MPC (Obtain Services of On-Site Sampling
      Group); Develop Analytical Requirements; Select On-site/Off-site Analytical Methods/SOPs that
      have Documented QC Limits Supporting the MPC (Sections 3.1.2, 3.2, 3.3, 3.5, Appendix A)

 UFP-QS systematic planning process element: Description of the Generation, Evaluation, and
 Assessment of Collected Data
      QAPP element groups: Assessment/Oversight; Data Review
      UFP-QAPP systematic planning process steps: Determine Quality Assurance Assessments that will
      be Performed and Identify Organizations Performing Assessments; Decide How Project Data will
      be Evaluated After Review to Determine if the User's Needs Have Been Met (Sections 4.0, 5.0)
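
The crosswalk in Table 1 is, in effect, a lookup from each UFP-QS planning element to a QAPP element group and a set of planning steps. Purely as an illustration (and not part of this Manual's requirements), the hypothetical Python sketch below shows one way a planning team might encode that lookup to confirm that every UFP-QS element is covered; the element, group, and section names are taken from Table 1, while the code itself is invented for the example.

    # Hypothetical sketch (not part of this Manual): encode the Table 1 crosswalk
    # so a planning team can confirm that each UFP-QS systematic planning element
    # maps to a QAPP element group and its planning steps.

    UFP_QS_TO_QAPP = {
        "Establishment of a team-based approach to planning":
            ("Project Management and Objectives", "Sections 2.3, 2.4"),
        "Description of the project goal, objectives, and questions/issues to be addressed":
            ("Project Management and Objectives", "Sections 2.5, 2.6, 2.7"),
        "Identification of project schedule, resources, milestones, and requirements":
            ("Project Management and Objectives", "Section 2.8"),
        "Matching of the data collection and analysis process to project objectives":
            ("Measurement/Data Acquisition", "Sections 3.1.1, 3.4"),
        "Identification of collection and analysis requirements":
            ("Measurement/Data Acquisition", "Sections 3.1.2, 3.2, 3.3, 3.5, Appendix A"),
        "Description of the generation, evaluation, and assessment of collected data":
            ("Assessment/Oversight and Data Review", "Sections 4.0, 5.0"),
    }

    def report_coverage() -> None:
        """Print the QAPP element group and sections that address each UFP-QS element."""
        for element, (group, sections) in UFP_QS_TO_QAPP.items():
            print(f"{element}\n  -> {group} ({sections})")

    if __name__ == "__main__":
        report_coverage()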
1.2.4  QAPP Requirements

The information specified in Table 2 must be provided in all QAPPs submitted to EPA or the
delegated regulatory authority. Table 2 also provides a crosswalk between the QAPP element
groups, the required QAPP sections, and the QAPP worksheets, which are in the QAPP Workbook
that accompanies this Manual (Part 2A). It is important to remember that worksheet use is optional,
 although desirable, and that required information will be project-specific. As the following text box
states, other project documents may be cross-referenced and, as appropriate, provided for approval.
 Note: All QAPP worksheets, when used, should be completed with project-specific information.  If the QAPP
 worksheets are not used, relevant information required on the worksheets must still be presented in the QAPP.
 In addition, QAPP preparers are encouraged to develop additional tables, as appropriate to the project. Sufficient
 written discussion in text format should accompany all tables. Certain sections, by their nature, will require more
 written discussion than others. In particular, Section 3.1.1 should provide an in-depth explanation of the sampling
 design rationale, and Section 5.2 should describe the procedures and criteria that will be used for data review.
                            Table 2. QAPP Requirement Summary

 Project Management and Objectives

 2.1  Title and Approval Page
      Optional QAPP Workbook worksheet: #1
      Required information: Title and Approval Page

 2.2  Document Format and Table of Contents
      2.2.1  Document Control Format
      2.2.2  Document Control Numbering System
      2.2.3  Table of Contents
      2.2.4  QAPP Identifying Information
      Optional QAPP Workbook worksheet: #2
      Required information: Table of Contents; QAPP Identifying Information

 2.3  Distribution List and Project Personnel Sign-Off Sheet
      2.3.1  Distribution List
      2.3.2  Project Personnel Sign-Off Sheet
      Optional QAPP Workbook worksheets: #3, #4
      Required information: Distribution List; Project Personnel Sign-Off Sheet

 2.4  Project Organization
      2.4.1  Project Organizational Chart
      2.4.2  Communication Pathways
      2.4.3  Personnel Responsibilities and Qualifications
      2.4.4  Special Training Requirements and Certification
      Optional QAPP Workbook worksheets: #5, #6, #7, #8
      Required information: Project Organizational Chart; Communication Pathways; Personnel
      Responsibilities and Qualifications Table; Special Personnel Training Requirements Table

 2.5  Project Planning/Problem Definition
      2.5.1  Project Planning (Scoping)
      2.5.2  Problem Definition, Site History, and Background
      Optional QAPP Workbook worksheets: #9, #10
      Required information: Project Planning Session Documentation (including Data Needs Tables);
      Project Scoping Session Participants Sheet; Problem Definition, Site History and Background;
      Site Maps (historical and present)

 2.6  Project Quality Objectives and Measurement Performance Criteria
      2.6.1  Development of Project Quality Objectives Using the Systematic Planning Process
      2.6.2  Measurement Performance Criteria
      Optional QAPP Workbook worksheets: #11, #12
      Required information: Site-Specific PQOs; Measurement Performance Criteria Table

 2.7  Secondary Data Evaluation
      Optional QAPP Workbook worksheet: #13
      Required information: Sources of Secondary Data and Information; Secondary Data Criteria and
      Limitations Table

 2.8  Project Overview and Schedule
      2.8.1  Project Overview
      2.8.2  Project Schedule
      Optional QAPP Workbook worksheets: #14, #15, #16
      Required information: Summary of Project Tasks; Reference Limits and Evaluation Table;
      Project Schedule/Timeline Table

 Measurement/Data Acquisition

 3.1  Sampling Tasks
      3.1.1  Sampling Process Design and Rationale
      3.1.2  Sampling Procedures and Requirements
             3.1.2.1  Sampling Collection Procedures
             3.1.2.2  Sample Containers, Volume, and Preservation
             3.1.2.3  Equipment/Sample Containers Cleaning and Decontamination Procedures
             3.1.2.4  Field Equipment Calibration, Maintenance, Testing, and Inspection Procedures
             3.1.2.5  Supply Inspection and Acceptance Procedures
             3.1.2.6  Field Documentation Procedures
 3.2  Analytical Tasks
      3.2.1  Analytical SOPs
      3.2.2  Analytical Instrument Calibration Procedures
      3.2.3  Analytical Instrument and Equipment Maintenance, Testing, and Inspection Procedures
      3.2.4  Analytical Supply Inspection and Acceptance Procedures
 3.3  Sample Collection Documentation, Handling, Tracking, and Custody Procedures
      3.3.1  Sample Collection Documentation
      3.3.2  Sample Handling and Tracking System
      3.3.3  Sample Custody
 3.4  Quality Control Samples
      3.4.1  Sampling Quality Control Samples
      3.4.2  Analytical Quality Control Samples
      Optional QAPP Workbook worksheets for Sections 3.1 through 3.4: #17 through #28
      Required information: Sampling Design and Rationale; Sample Location Map; Sampling Locations
      and Methods/SOP Requirements Table; Analytical Methods/SOP Requirements Table; Field Quality
      Control Sample Summary Table; Sampling SOPs; Project Sampling SOP References Table; Field
      Equipment Calibration, Maintenance, Testing, and Inspection Table; Analytical SOPs;
      Analytical SOP References Table; Analytical Instrument Calibration Table; Analytical
      Instrument and Equipment Maintenance, Testing, and Inspection Table; Sample Collection
      Documentation, Handling, Tracking, and Custody SOPs; Sample Container Identification; Sample
      Handling Flow Diagram; Example Chain-of-Custody Form and Seal; QC Samples Table;
      Screening/Confirmatory Analysis Decision Tree

 3.5  Data Management Tasks
      3.5.1  Project Documentation and Records
      3.5.2  Data Package Deliverables
      3.5.3  Data Reporting Formats
      3.5.4  Data Handling and Management
      3.5.5  Data Tracking and Control
      Optional QAPP Workbook worksheets: #29, #30
      Required information: Project Documents and Records Table; Analytical Services Table; Data
      Management SOPs

 Assessment/Oversight

 4.1  Assessments and Response Actions
      4.1.1  Planned Assessments
      4.1.2  Assessment Findings and Corrective Action Responses
 4.2  QA Management Reports
 4.3  Final Project Report
      Optional QAPP Workbook worksheets for Sections 4.1 through 4.3: #31, #32, #33
      Required information: Assessments and Response Actions; Planned Project Assessments Table;
      Audit Checklists; Assessment Findings and Corrective Action Responses Table; QA Management
      Reports Table

 Data Review

 5.1  Overview
 5.2  Data Review Steps
      5.2.1  Step I: Verification
      5.2.2  Step II: Validation
             5.2.2.1  Step IIa: Validation Activities
             5.2.2.2  Step IIb: Validation Activities
      5.2.3  Step III: Usability Assessment
             5.2.3.1  Data Limitations and Actions from Usability Assessment
             5.2.3.2  Activities
 5.3  Streamlining Data Review
      5.3.1  Data Review Steps To Be Streamlined
      5.3.2  Criteria for Streamlining Data Review
      5.3.3  Amounts and Types of Data Appropriate for Streamlining
      Optional QAPP Workbook worksheets for Sections 5.1 through 5.3: #34, #35, #36, #37
      Required information: Verification (Step I) Process Table; Validation (Steps IIa and IIb)
      Process Table; Validation (Steps IIa and IIb) Summary Table; Usability Assessment
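
Because the worksheets are optional but the information in Table 2 is not, a reviewer can treat the table as a checklist: every required section must either appear in the QAPP or be cross-referenced to another project document. The sketch below is a minimal, hypothetical Python illustration of that check; the section numbers come from Table 2, and the function name and example inputs are invented.

    # Hypothetical completeness check built from Table 2 (not part of this Manual):
    # every required QAPP section must either appear in the QAPP itself or be
    # cross-referenced to another project document.

    REQUIRED_SECTIONS = [
        "2.1", "2.2", "2.3", "2.4", "2.5", "2.6", "2.7", "2.8",  # Project Management and Objectives
        "3.1", "3.2", "3.3", "3.4", "3.5",                       # Measurement/Data Acquisition
        "4.1", "4.2", "4.3",                                     # Assessment/Oversight
        "5.1", "5.2", "5.3",                                     # Data Review
    ]

    def missing_sections(sections_in_qapp, cross_referenced):
        """Return required sections that are neither in the QAPP nor cross-referenced."""
        covered = set(sections_in_qapp) | set(cross_referenced)
        return [s for s in REQUIRED_SECTIONS if s not in covered]

    # Example usage with invented inputs:
    draft = {"2.1", "2.2", "2.3", "2.4", "2.5", "2.6", "3.1", "3.2"}
    referenced_elsewhere = {"2.7", "2.8", "3.3"}
    print(missing_sections(draft, referenced_elsewhere))
    # -> ['3.4', '3.5', '4.1', '4.2', '4.3', '5.1', '5.2', '5.3']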

It is recommended that QAPPs be identified as generic or project-specific and be prepared using the
format described in this Manual. However, if some or all of the required QAPP element groups are
incorporated into other project planning documents (such as SAPs, field sampling plans, field
operations plans, project operations plans, or general project work plans), then a cross-reference
table similar to Table 3 must be provided to identify where each required QAPP element is located
in the appropriate project document. The reference should specify the complete document title, date,
section number, page numbers, and location of the information in the document. Table 3 provides
an example using a fictitious project in which several elements of the project-specific QAPP are
found in existing facilitywide project planning documents. This table cross-references the required
QAPP element groups with information found in documents such as generic facilitywide QAPPs,
SAPs, and others.

      Table 3. Tracking of QAPP Requirements: Example Crosswalk to Other Project Documents

 Project Management and Objectives

 2.1  Title and Approval Page
 2.2  Document Format and Table of Contents
      2.2.1  Document Control Format
      2.2.2  Document Control Numbering System
      2.2.3  Table of Contents
      2.2.4  QAPP Identifying Information
 2.3  Distribution List and Project Personnel Sign-Off Sheet
      2.3.1  Distribution List
      2.3.2  Project Personnel Sign-Off Sheet
 2.4  Project Organization
      2.4.1  Project Organizational Chart
      2.4.2  Communication Pathways
      2.4.3  Personnel Responsibilities and Qualifications
      2.4.4  Special Training Requirements and Certification
 2.5  Project Planning/Problem Definition
      2.5.1  Project Planning (Scoping)
      2.5.2  Problem Definition, Site History, and Background
 2.6  Project Quality Objectives and Measurement Performance Criteria
      2.6.1  Development of Project Quality Objectives Using the Systematic Planning Process
      2.6.2  Measurement Performance Criteria
      Required information: Title and Approval Page; Table of Contents; QAPP Identifying
      Information; Distribution List; Project Personnel Sign-Off Sheet; Project Organizational
      Chart; Communication Pathways; Personnel Responsibilities and Qualifications Table; Special
      Personnel Training Requirements Table; Project Planning Session Documentation (including
      Data Needs Tables); Project Scoping Session Participants Sheet; Problem Definition, Site
      History and Background; Site Maps (historical and present); Site-Specific PQOs; Measurement
      Performance Criteria Table

 2.7  Secondary Data Evaluation
 2.8  Project Overview and Schedule
      2.8.1  Project Overview
      2.8.2  Project Schedule
      Required information: Sources of Secondary Data and Information; Secondary Data Criteria and
      Limitations Table; Summary of Project Tasks; Reference Limits and Evaluation Table; Project
      Schedule/Timeline Table
      Crosswalk to related documents: Generic Facilitywide QAPP, Section 3.0

 Measurement/Data Acquisition

 3.1  Sampling Tasks
      3.1.1  Sampling Process Design and Rationale
      3.1.2  Sampling Procedures and Requirements
             3.1.2.1  Sampling Collection Procedures
             3.1.2.2  Sample Containers, Volume, and Preservation
             3.1.2.3  Equipment/Sample Containers Cleaning and Decontamination Procedures
             3.1.2.4  Field Equipment Calibration, Maintenance, Testing, and Inspection Procedures
             3.1.2.5  Supply Inspection and Acceptance Procedures
             3.1.2.6  Field Documentation Procedures
 3.2  Analytical Tasks
      3.2.1  Analytical SOPs
      3.2.2  Analytical Instrument Calibration Procedures
      3.2.3  Analytical Instrument and Equipment Maintenance, Testing, and Inspection Procedures
      3.2.4  Analytical Supply Inspection and Acceptance Procedures
      Required information: Sampling Design and Rationale; Sample Location Map; Sampling Locations
      and Methods/SOP Requirements Table; Analytical Methods/SOP Requirements Table; Field Quality
      Control Sample Summary Table; Sampling SOPs; Project Sampling SOP References Table; Field
      Equipment Calibration, Maintenance, Testing, and Inspection Table; Analytical SOPs;
      Analytical SOP References Table; Analytical Instrument Calibration Table; Analytical
      Instrument and Equipment Maintenance, Testing, and Inspection Table
      Crosswalk to related documents: Generic Facilitywide QAPP, Volume 3; Approved Field Sampling
      Plan for Base, Pages 12-18; Approved Field Sampling Plan for Base, Pages 24-28; Approved
      Field Sampling Plan for Base, Pages 32-38; Generic Facilitywide QAPP, Volume 2; Approved
      Field Sampling Plan for Base, Pages 40-43; Generic Facilitywide QAPP, Volume 4

 3.3  Sample Collection Documentation, Handling, Tracking, and Custody Procedures
      3.3.1  Sample Collection Documentation
      3.3.2  Sample Handling and Tracking System
      3.3.3  Sample Custody
 3.4  Quality Control Samples
      3.4.1  Sampling Quality Control Samples
      3.4.2  Analytical Quality Control Samples
 3.5  Data Management Tasks
      3.5.1  Project Documentation and Records
      3.5.2  Data Package Deliverables
      3.5.3  Data Reporting Formats
      3.5.4  Data Handling and Management
      3.5.5  Data Tracking and Control
      Required information: Sample Collection Documentation, Handling, Tracking, and Custody SOPs;
      Sample Container Identification; Sample Handling Flow Diagram; Example Chain-of-Custody Form
      and Seal; QC Samples Table; Screening/Confirmatory Analysis Decision Tree; Project Documents
      and Records Table; Analytical Services Table; Data Management SOPs
      Crosswalk to related documents: Approved Field Sampling Plan for Base, Pages 50-52; Approved
      Field Sampling Plan for Base, Pages 54-58; Approved Field Sampling Plan for Base, Pages
      60-66; Generic Facilitywide QAPP, Section 6.1, Pages 6-4 to 6-5, Table 6-2; Generic
      Facilitywide QAPP, Section 6.2, Pages 6-8 to 6-9; Generic Facilitywide QAPP, Section 6.3,
      Pages 6-12 to 6-14, Table 6-3; Generic Facilitywide QAPP, Section 6.4, Pages 6-20 to 6-23,
      Table 6-4; Generic Facilitywide QAPP, Section 8, Page 8-2, Table 8-1; Generic Facilitywide
      QAPP, Volume 6

 Assessment/Oversight

 4.1  Assessments and Response Actions
      4.1.1  Planned Assessments
      4.1.2  Assessment Findings and Corrective Action Responses
      Required information: Assessments and Response Actions; Planned Project Assessments Table;
      Audit Checklists; Assessment Findings and Corrective Action Responses Table
      Crosswalk to related documents: Generic Facilitywide QAPP, Section 11.1, Page 11-2,
      Table 10-1
 4.2  QA Management Reports
      Required information: QA Management Reports Table
 4.3  Final Project Report

 Data Review

 5.1  Overview
 5.2  Data Review
      5.2.1  Step I: Verification
      5.2.2  Step II: Validation
             5.2.2.1  Step IIa: Validation Activities
             5.2.2.2  Step IIb: Validation Activities
      5.2.3  Step III: Usability Assessment
             5.2.3.1  Data Limitations and Actions from Usability Assessment
             5.2.3.2  Activities
 5.3  Streamlining Data Review
      5.3.1  Data Review Steps To Be Streamlined
      5.3.2  Criteria for Streamlining Data Review
      5.3.3  Amounts and Types of Data Appropriate for Streamlining
      Required information: Verification (Step I) Process Table; Validation (Steps IIa and IIb)
      Process Table; Validation (Steps IIa and IIb) Summary Table; Usability Assessment
Note: Table 3 represents a fictitious site created to demonstrate how to crosswalk QAPP requirements with
other project documents.
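
A crosswalk such as Table 3 can also be maintained as a simple lookup so that a reviewer can quickly find where a required element is documented. The hypothetical Python sketch below illustrates the idea; the document and page references echo the fictitious examples in Table 3, but the pairings shown are only illustrative and the code is not part of this Manual.

    # Hypothetical sketch of a Table 3-style crosswalk (not part of this Manual):
    # map each QAPP section whose content lives in another project document to the
    # document (and pages) where that information can be found. The references
    # below echo the fictitious examples in Table 3; the pairings are illustrative.

    CROSSWALK = {
        "2.8 Project Overview and Schedule": "Generic Facilitywide QAPP, Section 3.0",
        "3.1 Sampling Tasks": "Approved Field Sampling Plan for Base, Pages 12-18",
        "3.2 Analytical Tasks": "Generic Facilitywide QAPP, Volume 2",
        "3.5 Data Management Tasks": "Generic Facilitywide QAPP, Volume 6",
    }

    def locate(section: str) -> str:
        """Return where a section's required information is documented."""
        return CROSSWALK.get(section, "Documented directly in the project-specific QAPP")

    print(locate("3.2 Analytical Tasks"))
    print(locate("2.1 Title and Approval Page"))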
1.2.5   Graded Approach
Since the content and level of detail in individual QAPPs will vary according to the work being
performed and the intended use of the data, planners will want to use a "graded approach" when
preparing QAPPs. A graded approach is the process of establishing the project requirements and
level of effort according to the intended use of the results and the degree of confidence needed in
the quality of the results. In other words, the degree of documentation, level of effort, and detail
will vary based on the complexity and cost of the project. Appropriate and objective
consideration should be given to the significance of the environmental problems to be investigated,
the environmental decisions to be made, and the impact on human health and the environment.
Documentation will consist of a concise explanation whenever the particular project does not need
to address a specific area. In addition, by cross-referencing to approved generic QAPPs, project-
specific QAPPs may need less detail in certain areas. Throughout the remainder of the document,
examples of the graded approach are provided in text boxes.

1.2.6   Review and Approval of QAPPs

Depending on  the nature of the QAPP, the environmental program it implements,  and various
enforceable agreements, QAPP review and approval may be required by several different entities.
This section discusses:

•  Review and approval of QAPPs within the lead organization
•  Regulatory review and approval of QAPPs

The following sections also present a timeline for the review, implementation, and archival of a
QAPP, including the lead time needed for review, approval, and implementation:

•  Review and Approval of QAPPs (Section 1.2.6.1) - Organization personnel, contractors, and
   subcontractors review the QAPP before submittal to the regulatory approval authority
•  Regulatory Review and Approval of QAPPs (Section 1.2.6.2) - Submitted to the regulatory
   authority at least 30 days in advance of scheduled data collection
•  QAPP Implementation and Modification (Section 1.2.7) - Verbal authorization of a modification
   is documented; an amendment to the QAPP is submitted for signature approval within 7 working days
•  QAPP Annual Review (Section 1.2.7) - One year from the date the QAPP was approved
•  QAPP Revision at Least Every 5 Years (Section 1.2.7)
•  QAPP Archival (Section 1.2.8) - Reviewer's comments and responses
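
Several of the timing rules above (submission at least 30 days before data collection, annual review, and revision at least every 5 years) reduce to simple date arithmetic. As a hypothetical illustration only, the Python sketch below computes those milestone dates from a planned data collection date and an approval date; the function name and example dates are invented.

    # Hypothetical sketch of the timing rules described above (not part of this
    # Manual): submit at least 30 days before data collection, review annually,
    # and revise at least every 5 years.
    from datetime import date, timedelta

    def qapp_milestones(data_collection_start: date, approval_date: date) -> dict:
        """Return the latest submission date and the review/revision deadlines."""
        return {
            # Submit to the approval authority no later than 30 days before fieldwork.
            "submit_no_later_than": data_collection_start - timedelta(days=30),
            # Annual review falls one year after approval; revision at least every 5 years.
            # (A leap-day approval date would need special handling.)
            "first_annual_review": approval_date.replace(year=approval_date.year + 1),
            "revise_no_later_than": approval_date.replace(year=approval_date.year + 5),
        }

    # Example with invented dates:
    print(qapp_milestones(date(2005, 6, 15), date(2005, 4, 1)))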

1.2.6.1 Lead Organization Review and Approval

The QAPP should undergo internal  review at all appropriate levels.  The lead organization is
responsible for ensuring that the QAPP is accurate and complete, that it  conforms to the
requirements stated in this UFP-QAPP Manual, and that all project quality objectives (PQOs),
technical activities, and related QA/QC will result in data of known and documented quality. To that
end, the lead organization  should require  that organizational personnel,  contractors, and
subcontractors review applicable sections of the  QAPP prior to submitting it to EPA or other
regulatory approval authority.

1.2.6.2 Regulatory Review and Approval of QAPPs

It is EPA policy that a QAPP be reviewed and approved prior to initiation of fieldwork. The type
of regulatory review and approval required is project-specific. In many instances, requirements for
such review and approval will be specified in legal agreements. EPA is responsible for reviewing
and approving all CERCLA QAPPs, except in cases where the review and approval authority has
been delegated by EPA to a non-EPA partner organization such as a State, Tribe, or other Federal
department, agency, or program, or where EPA has chosen not to review QAPPs for sites not on the
National Priorities List (NPL). EPA's delegation  of this authority to a non-EPA organization is
contingent on that organization  having an  acceptable quality system  documented in an EPA-
approved Quality Management Plan (QMP).

All QAPPs and related documents subject to regulatory approval are reviewed to ensure that they
are complete, are technically adequate, and meet the requirements of this UFP-QAPP Manual. The
QAPP is reviewed to ensure that PQOs, technical activities, and related QA/QC will result in data
of known and documented quality that can be used in environmental decision-making. Review and
approval of QAPPs by EPA and EPA-delegated approval authorities  must be in accordance with
requirements of this UFP-QAPP Manual. States, Tribes, and other Federal departments, agencies,
or programs are encouraged to use this Manual when reviewing and approving QAPPs. Although
the format established in the Manual and worksheets is not required, it will facilitate review by the
approval authority.

When developing generic QAPPs that will apply across multiple EPA Regions or regulatory
approval authorities, the scoping process must involve those entities early in the development of the
QAPP.  Receiving input early will help streamline review and approval of the generic QAPP.

All comments provided by EPA or the approval authority must be acceptably addressed in writing
prior to beginning field activities. The response  document (either a revised QAPP or a letter
responding to specific deficiencies) should contain complete  identifying information as it is
presented on the original QAPP. Any revisions to the original QAPP should be identified to expedite
document review and approval.

1.2.7   QAPP Implementation and Modification

The approved QAPP must be implemented as prescribed; however, the implementation process is
intended to be flexible and nonrestrictive. If an approved QAPP must be altered in  response to
project needs (including changes to project team members), the amended QAPP must be reviewed
and approved by the original approval authority in the same manner as the original  QAPP. The
amendment must contain complete identifying information, as presented on the original QAPP Title
and Approval Page, with updated signatures and dates. Only after the amendment has been approved
can the change be implemented.

Verbal or electronic approval of modifications may be obtained to expedite project work. Verbal
approvals should be documented in telephone logs. Both verbal and electronic approvals should be
retained in the project file. Subsequently, the approved modification must be documented in an
amendment to the QAPP and submitted to EPA (or other approval authority, if applicable) for
signature approval within 7 working days.
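
The 7-working-day window for submitting the written amendment can be tracked the same way. The hypothetical Python sketch below counts working days as Monday through Friday and ignores holidays for simplicity; the function name and example date are invented.

    # Hypothetical sketch (not part of this Manual): compute the deadline for
    # submitting the written QAPP amendment, 7 working days after a verbal or
    # electronic approval. Working days are counted as Monday-Friday; holidays
    # are ignored for simplicity.
    from datetime import date, timedelta

    def add_working_days(start: date, working_days: int) -> date:
        """Advance a date by the given number of Monday-Friday working days."""
        current = start
        remaining = working_days
        while remaining > 0:
            current += timedelta(days=1)
            if current.weekday() < 5:  # weekday() returns 0-4 for Monday-Friday
                remaining -= 1
        return current

    # Example: verbal approval documented on Friday, 4 March 2005 (invented date).
    print(add_working_days(date(2005, 3, 4), 7))  # -> 2005-03-15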

Corrective actions must  be implemented when deviations from the QAPP are noted by project
personnel outside of the formal assessment process. Corrective actions need to be initiated whenever
project personnel identify field sampling or analytical problems that could potentially affect  data
quality or usability. Such incidents should be documented and resolved using the procedures and
personnel for planned assessments described in the QAPP (see Section 4.1.2).

Both project-specific and generic  QAPPs should be reviewed annually by the lead organization's
project manager. Project-specific  and generic QAPPs must be kept current and be revised when
necessary, when directed by the approval authority, or at least every 5 years.

1.2.8   QAPP Archiving

All QAPPs, including reviewers' comments and responses to reviewers' comments (revised QAPPs
or response  letters addressing  specific issues),  must be archived in the appropriate project or
program file according to the procedures specified by the lead organization in its QAPP or QMP.

Project files must be retained for the period of time specified in the interagency agreement, MOU,
cooperative agreement, financial agreement, contract, or voluntary or enforcement consent decree,
agreement, or order.

1.3    Roles and Responsibilities

1.3.1   Lead Organization

The lead organization for environmental data collection operations, as defined by this UFP-QAPP
Manual, will be a Federal department, agency, or program and will usually be the entity that owns
the facility or installation where work is being done. The lead organization is responsible for all
phases of environmental data  collection,  as well as for ensuring that organization personnel,
contractors, and subcontractors perform project work as prescribed in the approved QAPP. The lead
organization may perform the project work directly or contract for field sampling, analytical
services, or data review.
1.3.2  Project Manager
The project manager is responsible for directing or overseeing and coordinating all project activities
for the lead organization, including assembling a project team (see Section 1.3.3). He or she is
responsible for submitting QAPPs and QAPP revisions and amendments to appropriate personnel
for review and approval.

The project manager must ensure that all technical issues identified during QA review are
satisfactorily addressed and documented prior to beginning the data collection activity. The project
manager is also responsible for reviewing the QAPP annually and documenting this review in a
letter to the approval authority.
 Note: QAPPs should be submitted to the approval authority for review and approval no less than 30 days in
 advance of the scheduled environmental data collection or in whatever time period is specified by project-specific
 agreements (e.g., MOU, Federal facilities agreement, permits). All QAPP revisions and amendments should be
 submitted in a timely way, so that the approval authority has sufficient time to complete the review and approval
 process prior to the collection of environmental data.
1.3.3  Project Team

The project team consists of technical personnel, including data generators, QA scientists, and data
users (e.g., geologists, chemists, risk assessors).  The project team may  include contractors and
subcontractors. The size and makeup of the project team should reflect the size, complexity, and
needs of the project. For example, small projects may have project teams that consist of only two
or three people (a situation for which the graded approach may be appropriate). Individuals
responsible for the following tasks are critical to the success of the project and should be selected
as project team members by the lead organization: project management, health and safety, field
mobilization, sampling, geotechnical operations, sample analysis, QA activities (including field and
laboratory assessment), data review, and risk assessment.

During planning (scoping), the project team identifies project and data quality objectives, decisions
to be made, project "action limits," the type and quantity of data, and how "good" the data must be
(the data quality) to ensure that scientifically defensible environmental decisions are made. The
project team defines the quality of the data by setting acceptance limits for the project to meet the
project quality objectives (PQOs). Once the acceptance limits, also known as measurement
performance criteria, have been decided on, the project team can select sampling and analytical
methods that have appropriate quantitation limits and QC limits to achieve project objectives.

The project team is responsible for providing all the  information required by this UFP-QAPP
Manual and for resolving all technical issues prior to QAPP preparation. Ultimately, it is the
responsibility of the project team (not the QAPP preparer) to design a QA "blueprint" that meets
project objectives.
1.3.4   QAPP Preparation Team/Writer
The QAPP should be written by a team or individual that has been involved in the project planning
phase and has experience or training with QAPP preparation. Members of the QAPP preparation
team  should be experienced in many aspects  of environmental science, including chemistry,
engineering, hydrogeology, and risk assessment. In addition, the QAPP preparation team should be
experienced with the sample collection procedures, analytical methods, and data review procedures
that will be used for the project.

1.3.5   Project Personnel

An organizational chart must clearly  show the reporting relationships  among all of the lead
organization's project personnel, including contractors and subcontractors.

All project personnel are responsible for reading and understanding applicable sections of the QAPP
before beginning fieldwork. All individuals who have project responsibilities should sign a Project
Personnel Sign-Off Sheet to document that they have read all relevant portions of the QAPP.

All project personnel are responsible for implementing the QAPP as prescribed, and for reporting
all deviations from the QAPP to the  project manager. Corrective action procedures must be
implemented when deviations from the QAPP are noted or whenever project personnel identify field
sampling or analytical problems that could potentially affect data quality or usability.


2.0    PROJECT MANAGEMENT AND OBJECTIVES ELEMENTS

The project management and objectives elements of a QAPP ensure that the project has a defined
purpose by documenting the environmental problem, the environmental questions being asked, and
the environmental decisions that need to be made. The elements in this part of the QAPP identify
the project quality objectives necessary to answer those questions and support those environmental
decisions. They also address management considerations, such as roles and responsibilities, for the
project.
  QAPP Worksheets

  The following sections provide a comprehensive list of the information required in a QAPP. As much as possible,
  this information should be presented in tabular format for ease of review. However, sufficient written discussion
  in text format should accompany all tables. To assist with this process, worksheets for optional use are provided
  in Part 2A of the UFP-QAPP, the QAPP Workbook. In addition, examples of QAPPs developed for different
  programs are available in Part 2C of the UFP-QAPP, and examples of selected worksheets are presented throughout
  this Manual. Although the examples are from typical chemical sampling and analysis projects, worksheets can and
  should be modified to  reflect project-specific requirements and  address the type of investigation (e.g.,
  radiochemical, biological, ordnance and explosives).
2.1    Title and Approval Page

The Title and Approval Page is the first page of the QAPP. It documents that the QAPP has received
proper regulatory approval prior to implementation.

The Title and Approval Page should contain the required approval signatures and other information
shown below. (Note: In the QAPP Workbook, this information corresponds with QAPP Worksheet
#1.)

•   Site name/project name
•   Site location
•   Document title
•   Lead organization (see Section 1.3.1)
•   Preparer's name and organizational  affiliation
•   Preparer's address, telephone number, and e-mail address
•   Preparation date (day/month/year)
•   Investigative organization's project manager's signature and printed name/organization/date²
•   Investigative organization's project QA officer's signature and printed name/organization/date
•   Lead organization's project manager's signature and printed name/organization/date
•   Appropriate approval signatures and printed names/titles/dates

²The investigative organization is an entity contracted by the lead organization for one or more phases of a data
collection operation.

2.2    Document Format and Table of Contents

The organization of the QAPP should  be easy to understand and should follow the format and
section headings as described in this UFP-QAPP Manual to expedite review. All tables, diagrams,
charts, worksheets (if used), and other deliverables, which are itemized in this Manual, should be
included as components of the QAPP and listed  in the Table of Contents. Any required QAPP
element groups that are not applicable to the  project should be identified either on the QAPP
Requirements Summary table (Table 2  in Section 1.2.4) or in some other format provided by the
QAPP preparer, along with a justification for their exclusion.

2.2.1   Document Control Format

Document control procedures are used to identify the most current version of the QAPP and to
ensure that only that version of the QAPP is used by all project participants.

The QAPP preparer should use a consistent document control format (e.g., in the upper right-hand
corner of each page of the document) to present the following information:

•  The title of the document (abbreviations may be used).
•  The original version number or revision number, whichever is applicable, including document
   status (i.e., draft, interim draft, interim final, final).
•  The date of the  original version or current revision, whichever is applicable.
•  The page number in relation to the total number of pages. Alternatively, pages may be numbered
   as part of the total pages for a discrete section. (In the case of the second option, the Table of
   Contents should list inclusive page numbers for each subsection, e.g., 1-1 through 1-9).

The document control  procedures should be applied to the QAPP beginning on the Title and
Approval Page and including the Table of Contents and all figures, tables, and diagrams. Each
revision of the QAPP should be differentiated by a new revision number and date.
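
As a hypothetical illustration of the document control format described above, the Python sketch below renders the kind of block (title, revision and status, date, and page numbering) that might appear in the upper right-hand corner of each page; the function name, field layout, and example values are invented and are not prescribed by this Manual.

    # Hypothetical sketch of a document control block of the kind described in
    # Section 2.2.1 (not part of this Manual): title, revision and status, date,
    # and "page X of Y", rendered for one page.

    def document_control_header(title: str, revision: str, status: str,
                                revision_date: str, page: int, total_pages: int) -> str:
        """Return a right-aligned document control block for a single page."""
        lines = [
            f"Title: {title}",
            f"Revision: {revision} ({status})",
            f"Date: {revision_date}",
            f"Page {page} of {total_pages}",
        ]
        width = max(len(line) for line in lines)
        return "\n".join(line.rjust(width) for line in lines)

    # Example with invented values:
    print(document_control_header("Site X RI/FS QAPP", "1", "Final", "March 2005", 7, 149))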

2.2.2  Document Control Numbering System

A document control numbering system accounts for all copies of the QAPP provided to project
personnel and helps to ensure that the most current version is in use. A sequential numbering system
is used to identify controlled copies of the QAPP. Controlled copies should be assigned to
individuals within an organization or team. Individuals receiving a controlled copy of the QAPP are
provided with all revisions, addenda, and amendments to the QAPP. Individuals who receive a
controlled copy are responsible for removing all outdated material from circulation.
Graded Approach

The use of a document control numbering system may
not always be necessary, such as for small projects or
projects that do not require the distribution of copies
across multiple organizations.
The document control system does not preclude making and using copies of the QAPP; however,
holders of controlled copies are responsible for distributing revised or added material to update any
copies within their organizations. The distribution list for controlled copies should be maintained
by the organization that prepares the QAPP, and a copy of that distribution list should be provided
to the lead organization.
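
A sequential controlled-copy register can be kept in something as simple as the hypothetical Python sketch below, which assigns copy numbers in order and records who holds each controlled copy so that revisions can be distributed and outdated material recalled; the class, method, and holder names are invented for the example.

    # Hypothetical sketch of a controlled-copy register for the numbering system
    # described in Section 2.2.2 (not part of this Manual): copies are numbered
    # sequentially and each number is tied to the individual who must keep that
    # copy current and remove outdated material from circulation.

    class ControlledCopyRegister:
        def __init__(self):
            self._holders = []  # position in the list + 1 is the controlled copy number

        def issue_copy(self, holder: str) -> int:
            """Assign the next sequential controlled copy number to a holder."""
            self._holders.append(holder)
            return len(self._holders)

        def distribution_list(self):
            """Return (copy number, holder) pairs for distributing revisions."""
            return [(i + 1, holder) for i, holder in enumerate(self._holders)]

    # Example with invented holders:
    register = ControlledCopyRegister()
    register.issue_copy("Lead organization project manager")
    register.issue_copy("Regulatory approval authority project officer")
    register.issue_copy("Laboratory QA officer")
    print(register.distribution_list())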

2.2.3   Table of Contents

A Table of Contents clearly outlines the organization of the QAPP and makes project information
easy to reference. The QAPP should include a Table of Contents that is comprehensive and contains
the title and location (i.e., page number, appendix or attachment number) of the following items:
•  Major sections
•  Subsections
•  References
•  Appendices and/or attachments (e.g., SOPs)
•  Tables
•  Figures
•  Diagrams

  References

  The reference section of this Manual lists applicable national requirements and guidance
  documents. Web site addresses are noted for most documents.
 Note: Applicable appendices and/or attachments include but are not limited to the following:
 •  SOPs for sampling, drilling, sample preparation and analysis, etc., that are included as attachments
 •  The completed QAPP worksheets, if the QAPP worksheets are used and not included as tables in the QAPP
 •  Laboratory quality assurance plans or quality assurance manuals for participating laboratories
2.2.4   QAPP Identifying Information
The QAPP identifying information prefaces the content of the QAPP and places the document in
context for the reviewer. It identifies the key project players, previous site work, if any, and the
program for which the current project is being performed. QAPP identifying information should be
consolidated in one table (QAPP Worksheet #2). The identifying information required includes:

•  Site name/project name
•  Site location
•  Site number/code
•  Operable unit
•  Contractor name
•  Contractor number
•  Contract title
•  Work assignment number
•  Guidance used to prepare QAPP
•  Regulatory program (e.g., RCRA, CERCLA, Clean Water Act)
•  Approval entity
•  Data users
•  Identification as a generic or project-specific QAPP
•  Scoping session dates

Additional information that may be presented includes:

•  Dates and titles of QAPP documents written for previous site work
•  Organizational partners (stakeholders) and connection with lead organization
•  QAPP element groups and required information not applicable to this project (Exclusions can be
   identified in QAPP Worksheet #2 and described in the text.)

 Graded Approach

 QAPP Worksheet #2, equivalent to Table 3 in Section 1.2.4, may be used to identify opportunities
 for the graded approach.
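
Because the identifying information is tabular by nature, some teams keep it as a structured record before transferring it to Worksheet #2. The Python sketch below is a hypothetical illustration using a dataclass; the field names mirror a representative subset of the items listed above, and the example values are invented.

    # Hypothetical sketch (not part of this Manual): QAPP identifying information
    # of the kind consolidated in Worksheet #2, kept as a structured record.
    # Field names mirror a representative subset of the items listed above;
    # the example values are invented.
    from dataclasses import dataclass, asdict

    @dataclass
    class QappIdentifyingInformation:
        site_name: str
        site_location: str
        site_number: str
        operable_unit: str
        contractor_name: str
        contract_title: str
        work_assignment_number: str
        guidance_used: str
        regulatory_program: str
        approval_entity: str
        data_users: str
        generic_or_project_specific: str
        scoping_session_dates: str

    example = QappIdentifyingInformation(
        site_name="Example Site", site_location="Anytown, USA", site_number="XX-001",
        operable_unit="OU-1", contractor_name="Example Contractor, Inc.",
        contract_title="Remedial Investigation", work_assignment_number="WA-123",
        guidance_used="UFP-QAPP Manual, March 2005", regulatory_program="CERCLA",
        approval_entity="EPA Region X", data_users="Risk assessors, remedial design engineers",
        generic_or_project_specific="Project-specific", scoping_session_dates="To be determined",
    )
    print(asdict(example))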
2.3    Distribution List and Project Personnel Sign-Off Sheet

2.3.1   Distribution List

The Distribution List documents those entities to whom copies of the approved QAPP and any
subsequent revisions will be sent (see Figure 3 and QAPP Worksheet #3). A complete copy of the
QAPP should be sent to the project manager and key project personnel for the lead organization and
to EPA or the delegated approval authority. Key project personnel are those working for the lead
organization, including contractors and subcontractors. Examples include the lead field sampler,
project manager, data reviewer, statistician, risk assessor, assessment personnel, and laboratory QC
manager. In addition, a complete copy of the original version and all revisions of the QAPP,
including addenda and amendments, should be maintained on file by the lead organization and made
available to approval authorities upon request. The distribution list may change and should be
revised for each QAPP revision submitted. Each revision of the QAPP should contain the
information shown in Figure 3.

 Graded Approach

 In those cases where the QAPP will have a limited distribution, such as within a single facility, a
 simple sign-out sheet could be sufficient.

 QAPP Recipients | Title | Organization | Telephone Number | Fax Number | E-mail Address | Document Control Number

                                Figure 3. Distribution List
                                  (QAPP Worksheet #3)
  Note: Examples of select worksheets that are complicated and require additional information are provided
  throughout this Manual to help the reader understand their intended content. Worksheets that are self-
  explanatory do not include examples, only table headings.
2.3.2   Project Personnel Sign-Off Sheet

The Project Personnel Sign-Off Sheet documents that all key project personnel performing work
have read the applicable sections of the QAPP and will perform the tasks as described. For example,
the laboratory manager who receives the QAPP should have all supervisory personnel sign off on
the applicable analysis sections of the QAPP before beginning sample analysis. Supervisory or
oversight personnel are responsible for communicating the requirements of the applicable portions
of the QAPP to those doing work. Although it is not always possible to identify people by name
early in the planning stages, the project team should identify by function (e.g., laboratory  QC
manager) all personnel who are to read and sign off on the applicable sections of the QAPP. Figure
4 (QAPP Worksheet #4) shows what information to include in the original QAPP and all revisions.
 Project Personnel | Title | Telephone Number | Signature | Date QAPP Read

                        Figure 4. Project Personnel Sign-Off Sheet
                                  (QAPP Worksheet #4)
2.4    Project Organization
The project team should identify the reporting relationships between the organizations, project team
members, and other key project personnel and describe their specific roles, responsibilities, and
qualifications. In addition, the QAPP text should include an explanation of the lines of authority and
paths of communication.

2.4.1  Project Organizational Chart

The Project Organizational Chart shows reporting relationships between all organizations involved
in the project, including the lead organization and all contractors and subcontractors. Project team
members should identify the organizations providing field sampling, on-site and off-site analysis,
and data review services, including the names and telephone numbers of all project managers,
project team members, and project contacts for each organization. See Section 1.3.3 of this Manual
for a discussion of the project team. The types of information required in an organizational chart are
shown in Figure 5 (QAPP Worksheet #5).
 [Figure 5 is an example organizational chart showing the approval authority, the lead organization's project manager and QA officer, and the contractor organizations' project managers, with lines indicating reporting relationships.]

                      Figure 5. Example Project Organizational Chart
                                   (QAPP Worksheet #5)
  Graded Approach

  Generally, for smaller, less complex projects, the organizational chart will be considerably smaller. Project
  personnel may be assigned multiple responsibilities. However, in all cases the QA officer should be independent
  of data collection activities.
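
For smaller projects, the reporting relationships shown in Figure 5 can be captured in a simple nested structure before a formal chart is drawn. The Python sketch below is hypothetical; the roles are generic placeholders and the code is not part of this Manual.

    # Hypothetical sketch (not part of this Manual): reporting relationships of
    # the kind shown in Figure 5, kept as a nested structure. Roles are generic
    # placeholders.

    org_chart = {
        "role": "Lead organization project manager",
        "reports": [
            {"role": "Lead organization QA officer", "reports": []},
            {"role": "Contractor organization project manager (field sampling)", "reports": []},
            {"role": "Contractor organization project manager (analytical services)", "reports": []},
        ],
    }

    def print_chart(node: dict, indent: int = 0) -> None:
        """Print the reporting hierarchy, indenting one level per tier."""
        print("  " * indent + node["role"])
        for child in node["reports"]:
            print_chart(child, indent + 1)

    print_chart(org_chart)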
2.4.2   Communication Pathways

One of the keys to a successful project is communication. Communication pathways and modes of
communication (faxes, newsletters,  electronic mail, reports) should be delineated in the project
planning stage and  documented in the QAPP. These pathways include the points of contact for
resolving sampling and analysis problems and for distributing preliminary, screening, and definitive
data to managers, users, and the public. The project team should describe the proper procedures for
soliciting  concurrence and obtaining approval between project  personnel, between different
contractors, and between samplers and laboratory staff. Figure 6 (QAPP Worksheet #6) may be
used to capture these communication pathways.
 Communication Drivers | Responsible Entity | Name | Phone Number | Procedure (Timing, Pathways, etc.)

                            Figure 6.  Communication Pathways
                                   (QAPP Worksheet #6)
Communication drivers are those activities that necessitate communication between different
responsible entities. These drivers can include, but are not limited to:

•  Approval of amendments to the QAPP
•  Initiation, notification, and/or approval of real-time modifications
•  Notification of delays or changes to field work
•  Recommendations to stop work and initiation of corrective action
•  Reporting of issues related to analytical data quality, including, but not limited to, ability to
   meet reporting limits

Responsible entities are the project personnel that may be responsible for initiating, communicating,
or approving one of the communication drivers. Example responsible entities include, but are not
limited to:

•  Regulatory approval authority
•  Lead organization project manager

•  Contracting officer representative
•  Lead organization QA officer
•  Field sampling project manager
•  Field sampling QA officer
•  Health and safety officer
•  Laboratory project manager
•  Laboratory QA officer

Procedures (timing, pathways and types of acceptable communications) must be outlined in
sufficient detail to ensure  that users of the QAPP understand the processes  and the roles and
responsibilities associated with those processes when communication is necessary. Issues that should
be addressed include, but are not limited to:

•  Who may approve specific types of real-time modifications, including who will be notified and
   the timing of such notification
•  Nature of required communication forms (e.g., electronic, verbal, written)
•  Processes and authorities for recommending work stoppage and corrective action.

The following statements are examples of communication processes that may  be documented as
procedures in QAPP Worksheet #6:

•  If field sampling will be delayed, then the project manager from the field sampling contractor
   organization will notify	.
•  No data may be released to the public until	.
•  If the laboratory fails to accurately analyze a proficiency testing (PT) sample, then the project
   manager from the lead organization will	.
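
Worksheet #6 pairs each communication driver with a responsible entity, a contact, and a procedure. Purely as a hypothetical illustration, the Python sketch below records two of the drivers listed above in that shape; the entities, contacts, and procedures are placeholders to be replaced with project-specific information.

    # Hypothetical sketch (not part of this Manual): Worksheet #6-style entries
    # pairing a communication driver with a responsible entity, a contact, and a
    # procedure. All values below are placeholders for project-specific content.

    communication_pathways = [
        {
            "driver": "Approval of amendments to the QAPP",
            "responsible_entity": "Regulatory approval authority",
            "name": "(project-specific contact)",
            "phone": "(project-specific number)",
            "procedure": "Written approval obtained before the amendment is implemented",
        },
        {
            "driver": "Notification of delays or changes to field work",
            "responsible_entity": "Field sampling project manager",
            "name": "(project-specific contact)",
            "phone": "(project-specific number)",
            "procedure": "Verbal notification, followed by written documentation for the project file",
        },
    ]

    for entry in communication_pathways:
        print(f"{entry['driver']} -> {entry['responsible_entity']}: {entry['procedure']}")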

The project team also should document the procedures that will be followed when any project
activity originally documented in an approved QAPP requires real-time modification to achieve
project goals. These project activities include, but are not limited to:

•  Sampling design
•  Sample collection procedures
•  Sample analysis procedures
•  Data review and reporting

All significant QAPP modifications must be documented and submitted for approval in accordance
with the original QAPP (see Section 1.2.7). The person requesting the modification, the person who
must approve it, and the rationale for the modification must all be documented. All changes,
including minor changes or changes dictated by field conditions, must be reported to approving
organizations and documented.

The project team also should describe the procedures for initiating modifications to project activities,
name the individual who has the authority to initiate procedural modifications, and describe how
amendments to the QAPP will be documented and submitted to EPA, or the delegated authority, for
approval. All amendments must be included with the final version of the QAPP that is maintained
by the lead organization as part of the official project records.

The QAPP should spell out the difference between a modification and a one-time deviation, and
between a significant deviation and one considered minor. All deviations and the reasons for them
must be documented in writing and incorporated into the project files. In the case of a time-sensitive
issue, verbal or electronic approval for the change may be given; however, any such change must
subsequently be documented in writing and included in the project files. The QAPP must specify
who has the authority to request and to issue verbal or electronic approvals for modifications or one-
time deviations from the approved QAPP.

2.4.3   Personnel Responsibilities and Qualifications

Project personnel in responsible roles may include both prime contractors and subcontractors.
Project personnel's responsibilities and qualifications can be presented in a table identifying project
team members. Resumes for each person identified should be attached to the QAPP or their location
noted. Figure 7 (QAPP Worksheet #7) shows what information to include in each revision of the
QAPP. The lead organization must ensure that the responsible project personnel meet any specific
QAPP qualifications, such as laboratory certification or professional engineer (PE).

The table should include the name, title, and affiliation of the following:

•  Data users - Technical personnel who use the collected data to perform their responsibilities
   (e.g., risk assessment, remedial design, legal compliance).
•  Decision-makers - Individuals who will make decisions based on the collected data.
•  Lead organization's project manager - Individual with the responsibility and authority to
   allocate resources and personnel to accomplish the project tasks as documented in the QAPP.
•  Lead organization's quality assurance officer - Individual who provides QA oversight of project
   activities and who works independently of those performing project tasks.
•  Project manager(s) and/or project contact(s) for other organizations involved in the project.
•  QA manager or officer or QA contact for other organizations involved in the project -
   Individual responsible for checking that correct procedures are used; is independent of the group
   performing the task; and has the authority to initiate a work stoppage to correct quality concerns.
•  Project health and safety officer - Individual certified in health and safety; has the authority to
   initiate a work stoppage due to health and safety concerns.
•  Geotechnical engineers and hydrogeologists.
•  Field operation personnel, including field sampling coordinator, drillers, direct-push technology
   operators (Geoprobes, Cone Penetrometers), and field sampling personnel.
•  Analytical services, including on-site analytical support and off-site laboratory services.
•  Data reviewers.

 Graded Approach

 The actual requirements for responsible project personnel depend on the complexity, type, and size
 of the project. For example, requirements for a sampling technician for a routine compliance project
 may be quite different from the requirements for a technician collecting samples that may be used
 in a complex Superfund risk assessment project.
 Name | Title | Organizational Affiliation | Responsibilities | Education and Experience Qualifications

                  Figure 7. Personnel Responsibilities and Qualifications
                                  (QAPP Worksheet #7)

2.4.4   Special Training Requirements and Certification

Certain projects may require uniquely trained personnel to perform specialized field reconnaissance,
sampling, on-site or off-site analysis, data review, and other project functions. All project personnel
must be qualified and experienced in the project tasks for which they are responsible. A table
showing any specialized training needed to  achieve project objectives can be provided. Training
records and/or certificates should be attached to the QAPP, or their location noted. If training
records or certificates do not exist or are unavailable, this should be noted in the QAPP. Figure 8
(QAPP Worksheet #8) shows what information to include in the  Special Personnel Training
Requirements table.



 Project Function | Specialized Training - Title or Description of Course | Training Provider | Training Date | Personnel/Groups Receiving Training | Personnel Titles/Organizational Affiliation | Location of Training Records/Certificates
                   Figure 8. Special Personnel Training Requirements
                                 (QAPP Worksheet #8)

2.5    Project Planning/Problem Definition

To ensure QAPP approval, the QAPP should provide a regulatory, programmatic, and historical
context for the project and convey to the reviewer a clear understanding of the project background
and the environmental problems that exist. The QAPP must address project planning, identify the
environmental problem, define the environmental questions that need to be answered, and provide
background information.
2.5.1   Project Planning (Scoping)
Project scoping is key to the success of any project.  Scoping defines the purpose and expected
results of the project; the environmental decisions that need to be made; the project quality
objectives necessary to achieve expected results and support environmental decisions; the sampling,
analytical, and data review activities that will be performed; and the final products and deliverables
for the project. Prior to QAPP preparation, the project team  should hold one or more scoping
sessions. The QAPP should document all project planning sessions held during the initial planning
phase.

If the  project team is using worksheets from the QAPP Workbook, the worksheets should be
completed at the initial scoping session using as much information as is available. The worksheets
should be finalized at subsequent sessions and included as tables, diagrams, and figures in the
QAPP. The QAPP should include explanatory text for tables, figures, and diagrams whenever
necessary.  If the worksheets are not used, the project team members must produce a QAPP that
addresses the information required by this UFP-QAPP Manual (see Table 3). Alternatively, the
project team may create or modify the worksheets in the Workbook to meet their specific needs.

Other worksheets that focus on the data user's perspective may be used during the scoping sessions.
Examples for compliance, remedy, and risk assessment scenarios are shown in Figures 9 through
11. (These examples were adapted from USACE Manual No. 200-1-2, Technical Project Planning
Process, August 31, 1998, Appendix F.) Using additional worksheets will help prepare data users
for the scoping session and focus the planning on the most appropriate sampling and analytical
strategy to obtain the data needed to support the project's environmental decisions.
 Note: The following example worksheets are not included in the QAPP Workbook because they illustrate specific
 user needs that will not always be applicable. They are  presented here as a demonstration of the kinds of
 additional worksheets that may be useful.
Scoping  session   participants   should  be
documented in the QAPP or in the project file
and  should include project  managers, data
generators (including sampling and laboratory
analysis personnel), data reviewers,  quality
assurance personnel, data users, and all other
stakeholders. The project team members who
are responsible for planning the project should
be identified. Figure 12 (QAPP Worksheet #9)
shows what information to include in the
Project Scoping Session Participants Sheet.
Graded Approach

Note that the type and frequency of scoping sessions
and the type and number of persons who participate in
scoping sessions are related to the size and complexity
of the project, the technical components of the project,
and the  number of organizations  involved. For
example, small projects may use project teams that
consist of only two or three people and convene via
teleconference.
Target Analyte | Matrix | Regulatory Program or Statute and Citation | Specific Use | Number or Frequency of Samples | Compliance Reference Concentration | Point(s) of Compliance/Sample Location(s) and Depth
Chromium, Cr | Soil | RCRA, 40 CFR 261.24 | Determine if IDW is hazardous waste. | 1 composite sample per rolloff container | 5.0 mg/L (TCLP Cr) | Representative sample of waste stream (soil).
Total Chromium, Cr | GW | RCRA, 40 CFR 261.24 | Determine if IDW is hazardous waste. | 1 sample per drum | 5.0 mg/L (Total Cr) | Representative sample of waste stream (purge water).
Chromium, Cr III | Water | CWA, 40 CFR 131 | Determine if treatment plant effluent requires pretreatment prior to discharge to surface water. | 1 sample (timeframe is to be determined) | 180 µg/L | Groundwater treatment plant effluent at point source discharge location.
Chromium, Cr VI | Water | CWA, 40 CFR 131 | Determine if treatment plant effluent requires pretreatment prior to discharge to surface water. | 1 sample (timeframe is to be determined) | 10 µg/L | Groundwater treatment plant effluent at point source discharge location.
Chromium, Cr | GW | SDWA, 40 CFR 141 | Determine if GW concentrations exceed maximum contaminant levels. | 1 per well | 0.1 mg/L | Required at the point-of-use tap, but sampling at monitoring wells is adequate.
                         Figure 9. Example Data Needs Worksheet - Compliance Perspective
Target Analyte or Characteristic of Interest | Matrix | Remedy Method(s) of Interest | Criteria to be Considered | Number or Frequency of Samples | Concentration of Interest or Sensitivity of Measurement(s) | Remediation Area(s)/Sample Location(s) and Depth
Vinyl chloride | Air | Air stripping | Effectiveness control | 3 over 3-day operating period | 2.0 gm/hr | At stack emissions after air stripper.
Depth to bedrock | Soil | Slurry wall; treatment wall | Implementability and conceptual cost estimate | 1 location every 100 ft. | Measurements should be within +/- 1 ft. | Along planned alignments of slurry wall and treatment wall as shown on attached figure.
Hydraulic conductivity, grain size distribution, and porosity | GW | Treatment wall | Effectiveness, implementability, and conceptual cost estimate | 1 location every 25 ft. | ASTM, +/- 0.1% | Preferred locations distributed along middle of planned alignment of treatment wall.
Lead and cadmium | Soil | Off-site disposal | Removal action estimate of transportation and disposal costs | 5 (composite, 1 per 100 cubic yards of stockpiled soils) | TCLP | Random, composite samples from within each stockpiled soil pile (i.e., BV2, BV4, BV7-9, and BV12) on the attached figure.
pH, total dissolved solids, and total organic carbon | SW | On-site water treatment by electrochemical precipitation or ion exchange | Effectiveness, implementability, cost, and O&M | 5 | pH within +/- 0.5; TDS and TOC within +/- 0.5 mg/L | Surface water samples halfway down water column; 2 in the center of basin 15, and 3 along the edges.
                           Figure 10.  Example Data Needs Worksheet - Remedy Perspective
Target Analyte | Matrix | Current or Future Use | Receptor Group(s) | Receptor's Exposure Route(s) | Number of Samples | Risk Action Level(s), Human Health | Risk Action Level(s), Ecological | Exposure Area(s)/Sample Location(s) and Depth
Vinyl chloride | GW | Current use | Industrial workers | Incidental ingestion, dermal, and inhalation | 20 | N/A | N/A | The two worst-case downgradient wells found during PA/SI.
Vinyl chloride | GW | Future use | Residents | Incidental ingestion, dermal, and inhalation | 20 | ___ µg/L | N/A | The two worst-case downgradient wells found during PA/SI.
Lead and cadmium | Soil | Current use | Industrial workers | Ingestion and dermal | 20 | 1,000 mg/kg | N/A | Within area outlined on attached figure and at 0-24 inches.
Lead and cadmium | Soil | Future use | Residents | Ingestion and dermal | 20 | 400 and 39 mg/kg | 0.1 and 2.5 mg/kg | Within area outlined on attached figure and at 0-24 inches.
                       Figure 11. Example Data Needs Worksheet - Risk Assessment Perspective
Project Name:
Projected Date(s) of Sampling:
Project Manager:
Site Name:
Site Location:
Date of Session:
Scoping Session Purpose:
Columns: Name | Title | Affiliation | Phone # | E-mail Address | Project Role
                  Figure 12. Project Scoping Session Participants Sheet
                                 (QAPP Worksheet #9)

2.5.2   Problem Definition, Site History, and Background

The QAPP should frame, for the reader or reviewer, the reasons for conducting the project, including
historical information, current site conditions, and other existing data applicable to the project. This
information can be used to clearly define the problem and the environmental questions that should
be answered for the current investigation, as well as to develop the project "If..., then..." statements
in the QAPP, linking data results with possible actions.
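As an informal illustration only (not a requirement of this Manual), the sketch below shows how one such "If..., then..." statement might be written as an explicit decision rule; the analyte, action level, sample results, and names used are hypothetical.

```python
# Illustrative sketch of a QAPP "If..., then..." statement expressed as a decision rule.
# The analyte, action level, and sample results below are hypothetical examples.

ACTION_LEVEL_UG_L = 5.0  # hypothetical action level for benzene in drinking water (ug/L)

def decision_rule(result_ug_l):
    """If the result exceeds the action level, then further action is indicated;
    otherwise, no further action is indicated for that well."""
    if result_ug_l > ACTION_LEVEL_UG_L:
        return "exceeds action level: evaluate source investigation / response action"
    return "at or below action level: no further action for this well"

# Hypothetical results from two residential wells
for well, result in [("RW-01", 7.2), ("RW-02", 1.1)]:
    print(well, decision_rule(result))
```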

The following information should be summarized in the text of the QAPP or presented in QAPP
Worksheet  #10:

•  The problem to be addressed by the project. For example, "Residential drinking water
   wells in Toadville have shown increasing levels of benzene over the past two years."

•  The environmental questions being asked. For example, "What is the source of the benzene
   contamination in the residential drinking water wells of Toadville, NH?"

•  Observations from any site reconnaissance reports. Information about pertinent existing site
   conditions (e.g.,  evident soil staining and the presence of free product materials, odors, and
   other known hazards) should be identified and their location specified. Physical objects (e.g.,
   metallic debris, drums, dilapidated buildings, processing equipment, and known safety hazards)
   also should be identified and their location specified.

•  A synopsis  of secondary data or information from all site  reports. Existing reports (e.g.,
   monitoring  reports  and remedial investigation/remedial  action reports) that  describe site
   conditions and indicator chemicals for long-term remediation or monitoring projects should be
   cited. Refer to Section  2.7 for a complete discussion of  the identification and use of data
   acquired from secondary sources.

•  The possible classes of contaminants and the affected matrices. The past and current
   chemical use information will be the basis for deciding the target analytes/contaminants of
   concern to be investigated during the project. Information to consider includes historical site
   usage, site neighbors, industrial processes, process by-products, waste disposal practices, and
   possible contaminant breakdown products.

•  The rationale for inclusion of chemical and nonchemical analyses.

•  Information concerning various environmental indicators. These indicators describe the
   present condition of the environment (e.g., water, soil, sludge, sediment, air,  and biota) and
   provide a benchmark to monitor changes in the condition of the environment.

Additionally, the following site maps and/or figures should be provided in the QAPP, as available:

•  A detailed site map that shows the site in its present state and specifies its boundaries
•  A map that places the site in geographical context
•  Historical maps or plans of the site prior to the investigation
•  Maps identifying past and planned sampling locations
•  Historical and current aerial photographs

An 8½" x 11" copy of all site maps and drawings should be included in the QAPP in addition to
larger foldout maps and drawings.

2.6    Project Quality Objectives and Measurement Performance Criteria

The QAPP must document the environmental decisions that need to be made and the level of data
quality needed to ensure that those decisions are based on sound scientific data.

2.6.1   Development of Project Quality Objectives Using the Systematic Planning Process

Project quality objectives (PQOs) define the type, quantity, and quality of data that are needed to
answer specific environmental questions and support proper environmental decisions. The project
team should determine and agree on PQOs during the initial scoping sessions using a systematic
planning process. A  team can  develop acceptance or performance criteria specific to the type,
quality, and quantity  of the data needed for the decision that is to be made. Figure 13 diagrams a
systematic planning process. Although the activities presented in Figure 13 are  sequential, the
planning process is iterative and project planners are advised to revisit relevant activities whenever
necessary.

The systematic planning process is based on the scientific method and includes  concepts such as
objectivity of approach and acceptability of results. It uses a commonsense graded approach to
ensure that the level of detail in planning is commensurate with the importance and intended
purpose of the work and the use of available resources. This framework promotes communication
between all organizations and individuals involved in an environmental project.

[Figure 13, originally a two-page flowchart, depicts the systematic planning process. Its major steps include: identify the lead organization, approval authority, and project team; identify the project organization and responsibilities (project management, data users, data generators, the QAPP preparation team, and stakeholders); schedule and convene scoping sessions; define the environmental problem; develop the project schedule; determine the "type," "quality," and "quantity" of data needed; determine data review requirements and criteria, obtaining the services of a data review group if required; develop the sampling design; identify the quality assurance assessments that will be performed and the organizations performing them; describe how project data will be evaluated to determine whether the users' needs have been met; document the QAPP requirements (e.g., complete worksheets); and complete the QAPP and submit it.]

                                    Figure 13. Systematic Planning Process

When critical environmental decisions need to be made (e.g., final decision-making or compliance
with a standard), the project team should follow a formal systematic planning process such as the
data quality objectives (DQO) process described in the Guidance for the Data Quality Objectives
Process (EPA QA/G-4), August 2000, EPA/600/R-96/055. The formal DQO process as described
in EPA QA/G-4 requires statistical expertise to define the amount of error acceptable when making
an environmental decision and includes the following seven steps:
    Step 1.  State the problem
    Step 2.  Identify the decision
    Step 3.  Identify the inputs to the decision
    Step 4.  Define the study boundaries
    Step 5.  Develop a decision rule
    Step 6.  Specify tolerable limits on
            decision error
    Step 7.  Optimize the design
Graded Approach

For data collection activities that are either exploratory
or small in nature, or where specific decisions cannot
be identified, the formal DQO process is not necessary.
For these projects, the  project team should use an
abbreviated systematic planning process (e.g., Steps 1-
4) to help identify the PQOs and action limits, and to
select appropriate sampling, analytical, and assessment
activities.
  Estimating Measurement Error

  This Manual requires that error be addressed at several key points in the QAPP. See Project Quality Objectives
  and Measurement Performance Criteria (Section 2.6), Field Documentation Procedures (Section 3.1.2.6),
  Quality Control Samples (Section 3.4), QA Management Reports (Section 4.2), and Data Review (Section 5.2).

  Estimation of the amount of error that will be acceptable to meet the goals of the project is essential for proper
  planning. Measurement error is influenced by imperfections in the measurement and analysis system. Sampling
  error is generally thought to contribute the majority of the measurement error associated with project data,
  where:

                   Measurement Error = Sampling Error + Analytical Error

  Random and systematic measurement errors are introduced in the measurement process during physical sample
  collection, sample handling, sample preparation, sample analysis, data reduction, transmission, and storage.

  Once data have been generated, calculating the impact of the measurement error is a significant part of the data
  usability assessment. Measurement error can, and often does, lead to decision errors. Therefore, it is essential
  to reduce total project error to a minimum. This is done during planning by choosing an appropriate sample
  design and measurement system that will reduce the possibility of making a decision error.

  Potentially relevant guidance documents have been added to the reference section.

Statistical analysis is beyond the scope of many projects; therefore, whether formal DQOs should
be developed using the process described in EPA QA/G-4 will depend on the critical nature of the
environmental decisions to be  made as determined by the project team.
PQOs developed using a systematic planning process are presented as qualitative and quantitative
statements that answer questions such as the following:
•  Who will use the data?
•  What will the data be used  for? Simple, clear statements, such as the following should be used
   to describe anticipated data uses:
   — "These data will be used to determine if there is a potential current risk to human health
       during recreational use  in the top foot of soil from contaminants exceeding specified action
       levels."
   — "These data will be used to determine the location of the leading edge of the contaminated
       plume, as measured by concentrations of 1A the action level  at a	% confidence interval."
   — "These data will be  used to identify the presence or absence of DNAPL that may be a
       continuing  source of contamination in groundwater."
 Note: The following are poor examples of PQOs because they are too vague and do not truly address the purpose
 of the data:

 •• "These data will be used to determine the nature and extent of contamination."
 •• "These data will be used to determine regulatory compliance with CERCLA statutes."
 •• "These data will be used to assess the quality of the data generated by potentially responsible parties (PRPs)."
•  What type of data are needed?
   —  Target analytes and analytical groups
   —  Field screening, on-site analytical, and/or off-site laboratory techniques
   —  Type of sampling techniques (e.g., low-flow sampling)
•  How "good" do the data need to be in order to support the environmental decision?
   — The quality of the data is determined by establishing criteria for performance measures,
       including precision, accuracy/bias, sensitivity (quantitation limits), data comparability,
       representativeness, and completeness.
•  How much data are needed?
   — The number of samples needed for each analytical group, matrix, and concentration level.
•  Where, when, and how should the data be collected or generated?
•  Who will collect or generate the data?
•  How will the data be reported?
•  How will the data be managed and archived?

Site-specific PQOs identified at the scoping sessions should be documented in the QAPP using
Worksheet #11 or a similar format.

2.6.2  Measurement Performance Criteria

Once the project team has defined the environmental decisions and identified the PQOs, the data
users and QA personnel can determine the measurement performance criteria that should be satisfied
in order to support defensible decisions.

Measurement  performance  criteria should be  determined for each matrix,  analytical group,
concentration  level, and  analyte,  if applicable. The criteria should  relate to the parameters of
precision, accuracy/bias,  representativeness, comparability, sensitivity (quantitation limits), and
completeness. The parameters indicate the qualitative and quantitative degree of quality associated
with measurement data and, hence, are referred to as data quality indicators (DQIs).3

The QAPP should document the performance criteria for both the project-specific sampling and the
analytical measurement systems that will be used to judge whether the project objectives have been
met. For example, to determine whether the monitoring wells were installed correctly and will yield
representative samples, the project team should identify appropriate performance criteria (e.g.,
during purging prior to sample collection, the monitoring wells must recover within ____ minutes
in order to obtain an acceptable sample).

After measurement  performance  criteria have  been established, the  data generators and QA
personnel should select sampling and analytical procedures and methods that have QC acceptance
limits that support the achievement of established performance criteria.
 Note: The determination of the analytical data validation criteria should be concurrent with the development of
 measurement performance criteria and the selection of sampling and analytical procedures and methods. To ensure
 that only data meeting project-required measurement performance criteria are used in decision-making, data users
 and QA personnel should select data validation criteria that support both the established project-specific
 measurement performance criteria and the analytical procedure and method QC acceptance limits (see Section
 5.0).
Figure 14 (QAPP Worksheet #12) provides an example of the Measurement Performance Criteria
table. This table should be completed for each matrix (soil, groundwater, sediment), analytical
group, and concentration level. The analytical group can be described by common compound
groupings such as metals or semivolatile organic compounds. The concentration level may be a
qualitative description (i.e., low, medium, or high) as long as the terms are used consistently and are
defined and agreed to by the project team.

3 Data quality indicators should not be confused with the overall project quality objectives that are developed using the
formal data quality objective process.

Matrix: Ground Water
Analytical Group(1): VOA
Concentration Level: Low
Sampling Procedure(2): S-1
Analytical Method/SOP(3): L-1

Data Quality Indicators (DQIs) | Measurement Performance Criteria | QC Sample and/or Activity Used to Assess Measurement Performance | QC Sample Assesses Error for Sampling (S), Analytical (A), or Both (S+A)
Precision-Overall | RPD ≤ 30% when VOA detects for both field duplicate samples are ≥ QL; RPD ≤ 40% when gaseous VOA detects for both field duplicate samples are ≥ QL | Field Duplicates | S+A
Precision-Lab | RPD ≤ 20% when VOC detects for both laboratory duplicate samples are ≥ QL; RPD ≤ 30% when gaseous VOC detects for both laboratory duplicate samples are ≥ QL | Laboratory Duplicates | A
Accuracy/Bias | ± 20% for VOAs, except volatile gases ± 40% | Surrogate Spikes | A
Accuracy/Bias | No false negatives, no false positives, quantitation within warning limits (± 20) | Single-Blind PT | A
Accuracy/Bias (Contamination) | No target compounds ≥ QL | Equipment Blanks, Field Blanks, Method Blanks & Instrument Blanks | S+A
Sensitivity | ± 40% at QL | Laboratory Fortified Blank at QL | A

(1) If information varies within an analytical group, separate by individual analyte.
(2) Reference number from QAPP Worksheet #21 (see Section 3.1.2).
(3) Reference number from QAPP Worksheet #23 (see Section 3.2).

                                  Figure 14.  Example Measurement Performance Criteria
                                                   (QAPP Worksheet #12)
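As an informal illustration (not a prescribed format), measurement performance criteria of the kind shown in Figure 14 could be organized by matrix, analytical group, and concentration level so they can be looked up during data review; the structure and example limits below are a hypothetical sketch only.

```python
# Hedged sketch: organizing measurement performance criteria (MPC) by matrix,
# analytical group, and concentration level, loosely following QAPP Worksheet #12.
# The entries below are hypothetical placeholders, not required values.
from dataclasses import dataclass

@dataclass
class MeasurementPerformanceCriterion:
    matrix: str
    analytical_group: str
    concentration_level: str
    dqi: str             # data quality indicator, e.g., "Precision-Overall"
    criterion: str       # text of the measurement performance criterion
    qc_sample: str       # QC sample or activity used to assess it
    error_assessed: str  # "S", "A", or "S+A"

mpc_table = [
    MeasurementPerformanceCriterion("Groundwater", "VOA", "Low", "Precision-Overall",
                                    "RPD <= 30% for field duplicates with both detects >= QL",
                                    "Field Duplicates", "S+A"),
    MeasurementPerformanceCriterion("Groundwater", "VOA", "Low", "Sensitivity",
                                    "Recovery within +/- 40% at the QL",
                                    "Laboratory Fortified Blank at QL", "A"),
]

# Example lookup: criteria applicable to low-concentration groundwater VOA analyses
for row in mpc_table:
    if (row.matrix, row.analytical_group, row.concentration_level) == ("Groundwater", "VOA", "Low"):
        print(f"{row.dqi}: {row.criterion} ({row.qc_sample}, {row.error_assessed})")
```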

A discussion of the DQIs for which measurement performance criteria should be developed follows.

2.6.2.1 Precision

Precision is the degree to which a set of observations or measurements of the same property,
obtained under similar conditions, agree with one another. Precision is usually expressed as standard
deviation, variance, percent difference, or range, in either absolute or relative terms. Precision data
indicate how consistent and reproducible the field sampling or analytical procedures have been.

The project team should determine  and document the following:

• •  Quantitative measurement performance criteria for acceptable sampling and analytical precision
    for each matrix, analytical group, and concentration level.
••  Analyte-specific measurement performance criteria, if applicable.
••  QA/QC activities, or QC samples, that should be performed or analyzed to measure precision
    for each matrix, analytical group, and concentration level.

Overall project precision is measured by collecting data from co-located field duplicate (or replicate)
samples. Precision specific to the laboratory is measured by analyzing laboratory duplicate (or
replicate) samples. Comparing overall project precision with laboratory precision will help to identify
sources of imprecision if a problem exists.
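One simple way to make that comparison, sketched below under the assumption that error variances are approximately additive (consistent with Measurement Error = Sampling Error + Analytical Error), is to subtract the laboratory duplicate variance from the field duplicate variance; the %RSD values shown are hypothetical.

```python
# Hedged sketch: estimating the sampling contribution to overall imprecision by
# comparing field duplicate precision with laboratory duplicate precision.
# Assumes variances are approximately additive (overall ~ sampling + analytical).
# The %RSD inputs are hypothetical.

def estimated_sampling_rsd(overall_rsd_pct, lab_rsd_pct):
    """Estimate the sampling-only %RSD by variance subtraction, floored at zero."""
    sampling_variance = max(overall_rsd_pct ** 2 - lab_rsd_pct ** 2, 0.0)
    return sampling_variance ** 0.5

overall_rsd = 28.0  # %RSD from co-located field duplicates (hypothetical)
lab_rsd = 12.0      # %RSD from laboratory duplicates (hypothetical)
print(f"Estimated sampling %RSD: {estimated_sampling_rsd(overall_rsd, lab_rsd):.1f}%")
```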

If only two separate samples are collected from  adjacent locations and analyzed, these samples are
referred to as co-located field duplicates. If two representative portions taken from a single sample
are analyzed by the same laboratory, these are referred to as subsample field duplicates. If two
aliquots of the same sample are prepared and analyzed by a laboratory, these samples are referred
to as laboratory duplicates. If two aliquots of the same prepared sample are analyzed in duplicate,
these samples are referred to as analytical duplicates. Duplicate precision is evaluated by calculating
a relative percent difference (RPD)  using the following equation (the smaller the RPD, the greater
the precision):

                          RPD = |x1 - x2| / [(x1 + x2)/2]  x  100%
where:

   x1 = original sample concentration
   x2 = duplicate sample concentration
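A minimal sketch of the RPD calculation, with hypothetical duplicate concentrations:

```python
# Minimal sketch of the relative percent difference (RPD) calculation for a
# duplicate pair; the concentrations are hypothetical and share the same units.

def rpd(x1, x2):
    """RPD = |x1 - x2| / mean(x1, x2) * 100%; a smaller RPD indicates greater precision."""
    return abs(x1 - x2) / ((x1 + x2) / 2.0) * 100.0

original, duplicate = 10.0, 12.0  # hypothetical original and duplicate results
print(f"RPD = {rpd(original, duplicate):.1f}%")
```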

If more than two duplicate samples are collected from  adjacent locations and analyzed,  these
samples are referred to as co-located field replicates. If more than two representative portions are
taken from a single sample and analyzed by the same laboratory, these samples are referred to as
subsample field replicates. If two or more aliquots of the same sample are prepared and analyzed
by a laboratory, these samples are referred to as laboratory replicates. If more than two aliquots of
the same prepared sample are analyzed in  replicate, these samples are referred to as analytical
replicates. Replicate precision is evaluated by calculating the relative standard deviation (RSD), also
referred to as the coefficient of variation, of the samples using the following equation (the smaller
the RSD, the greater the precision):
                             %RSD = (Standard Deviation / Mean)  x  100%
where:

                             SD = sqrt[ Σ(xi - x̄)² / (n - 1) ]

        xi = each individual value used for calculating the mean
        x̄ = the mean of n values
        n = the total number of values

Several software programs are available that will perform these calculations (RPD, RSD, SD).  The
type of software program used should be documented in the QAPP.
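As a hedged example of such a calculation, the sketch below computes the %RSD of a set of replicate results using the sample standard deviation; the replicate values are hypothetical.

```python
# Sketch of the replicate precision calculation: %RSD (coefficient of variation)
# using the sample standard deviation; the replicate concentrations are hypothetical.
import statistics

def percent_rsd(values):
    """%RSD = (sample standard deviation / mean) * 100%."""
    return statistics.stdev(values) / statistics.mean(values) * 100.0

replicates = [9.8, 10.4, 10.1, 11.0]  # hypothetical replicate results (same units)
print(f"%RSD = {percent_rsd(replicates):.1f}%")
```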

2.6.2.2 Accuracy/Bias

Accuracy is the degree of agreement between an observed value (sample result) and an accepted
reference value; bias describes the systematic or persistent distortion associated with a measurement
process. The terms accuracy and bias are used interchangeably in this document.

The project team should determine and document the following:

••  Quantitative measurement performance criteria for acceptable accuracy/bias for each matrix,
    analytical group, and concentration level.
••  Analyte-specific measurement performance criteria, if applicable.

••  QA/QC activities,  or QC  samples, that should  be performed or  analyzed to  measure
    accuracy/bias for each matrix, analytical group, and concentration level.

Analyte accuracy/bias can be evaluated using different types  of QC  samples.  For example, a
standard reference  material  or a laboratory control  sample (LCS) that contains  a known
concentration of analyte(s) spiked into  contaminant-free water or other blank  matrix  provides
information about how accurately the laboratory (analysts, equipment, reagents, etc.) can analyze
for a specific analyte(s) using a selected method. Single-blind and double-blind proficiency testing
(PT) samples also provide information on how accurately the laboratory can analyze for a specific
analyte using a selected method. The cumulative laboratory and method accuracy/bias is calculated
as a percentage using the following equation:
                          Accuracy/Bias = (Measured Value / True Value)  x  100%

Because environmental samples contain interferences (i.e., other compounds that may interfere with
the analysis of a specific analyte), the accuracy/bias for a specific analyte should be evaluated in
relation to  the  sample matrix.  This is  done by analyzing matrix spike samples.  A known
concentration of the analyte is added to an aliquot of the  sample. The difference between  the
concentration of the analyte in the unspiked  sample and the concentration of the  analyte in  the
spiked sample should be equal to the concentration of the analyte that was spiked into the sample.
The spike recovery is calculated as a percentage using the following equation:


          %Recovery = [(Spiked Sample Conc. - Unspiked Sample Conc.) / Spike Conc. Added]  x  100%

Frequently, matrix spike samples are prepared and analyzed in duplicate, especially for organic
analyses, to provide sufficient precision and accuracy data to evaluate achievement of project quality
objectives.
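A minimal sketch of the accuracy/bias and matrix spike recovery calculations described above; all concentrations are hypothetical and assumed to share the same units.

```python
# Sketch of the accuracy/bias and matrix spike recovery calculations.
# All concentrations are hypothetical and share the same units.

def percent_accuracy(measured, true_value):
    """Accuracy/Bias = (measured value / true value) * 100%, e.g., for an LCS or PT sample."""
    return measured / true_value * 100.0

def percent_recovery(spiked_result, unspiked_result, spike_added):
    """%Recovery = (spiked sample conc. - unspiked sample conc.) / spike conc. added * 100%."""
    return (spiked_result - unspiked_result) / spike_added * 100.0

print(f"LCS accuracy: {percent_accuracy(47.5, 50.0):.0f}%")                  # hypothetical LCS
print(f"Matrix spike recovery: {percent_recovery(92.0, 5.0, 100.0):.0f}%")   # hypothetical spike
```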
  Note: In general, published methods provide precision and accuracy/bias statements that are supported by data
  generated during method validation studies. Additionally, laboratories  should track and maintain records of
  precision and accuracy/bias trends for their QC samples (such as laboratory duplicates/replicates, standard
  reference materials, LCSs, and matrix spike analyses) and include acceptable precision and accuracy /bias ranges
  in their analytical SOPs. Published QC data and familiarity with routine method performance will allow project
  planners to choose project-required measurement performance criteria that are technically feasible.
2.6.2.3 Sensitivity and Quantitation Limits
Sensitivity is the ability of the method or instrument to detect the target analytes at the level of
interest. The quantitation limit (QL) is the minimum concentration of an analyte that can be
routinely identified and quantified above the method detection limit  (MDL) by  a laboratory.
Sensitivity can be measured by calculating the percent recovery of the analytes at the QL. The
project team should document the project-required  QLs for each matrix, analytical group,
concentration level, and analyte.

The project team should determine and document the following:

• • Quantitative measurement performance criteria for acceptable sensitivity to ensure that QLs can
   be routinely achieved  for each matrix, analytical group, and concentration level.
•• Analyte-specific measurement performance criteria, if applicable.
•• QA/QC activities, or QC samples, that will be performed or analyzed to measure sensitivity.

The following issues should be considered when selecting project-specific QLs:

•• A laboratory MDL is a  statistically derived  detection limit that  represents a 99 percent
   confidence level that the reported signal is different from a blank sample. The MDL is lower
   than the concentration at which the laboratory can quantitatively report. Laboratories determine
   their "best  case" sensitivity for analytical methods by performing MDL studies.
• • Laboratory achievable QLs should be at least 3 times the achievable laboratory MDL and ideally
   10 times the achievable laboratory MDL.
• • In the UFP-QAPP, the reporting limit is the quantitation limit achievable by the laboratory. The
   reporting limit must be at or below the project quantitation limit.
•• Frequently, QLs for specific samples are adjusted for dilutions, changes to sample volume/size
   and extract/digestate volumes, percentage of solids, and cleanup procedures. These QLs are
   referred to as sample quantitation limits (SQLs).
• • The action limit (AL) for a target analyte is the numerical value the decision-maker uses as the
   basis for choosing one  of the alternate actions. It may be a regulatory threshold such as
   maximum  contaminant  levels  (MCLs), a risk-based concentration level, a reference-based
   standard, or a technological limitation.
•• SQLs must be less than the action limits for  project quality objectives to be definitively met.
   Sample results that are reported to SQLs that are higher than the action limits cannot be used to
   determine whether the action limit has been exceeded. Thus, environmental decision-making
   may be adversely affected by the failure to meet QLs.
• • Because of uncertainty at the quantitation limit, project QLs should be no greater than one-third
    of the action limit and ideally one-tenth of the action limit.

Method and instrument sensitivity may be evaluated by preparing and analyzing a laboratory
fortified blank (LFB). An LFB is a blank matrix that is spiked at the QL with the target analytes.
Calibration curves should always include a standard concentration at the QL to ensure sensitivity.
Low-point calibration standards should produce a signal at least 10 times the background level and
should be part of a linear calibration curve.

The QAPP should differentiate between action limits and project-required QLs. The QAPP should
also differentiate between MDLs and QLs that are documented in a published analytical method and
MDLs and  QLs that an  individual laboratory  can routinely achieve.  Figure 15  shows the
relationships between MDLs, laboratory achievable QLs, SQLs, project-specific QLs, and action
limits.
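As an informal check of the numerical relationships discussed in this section (laboratory QL versus MDL, project QL versus action limit, and reporting limit versus project QL), the sketch below flags combinations that do not satisfy them; the example values are hypothetical.

```python
# Hedged sketch of the quantitation limit relationships discussed above:
#   - the laboratory QL should be at least 3x (ideally 10x) the laboratory MDL;
#   - the project QL should be no greater than 1/3 (ideally 1/10) of the action limit;
#   - the reporting limit (laboratory achievable QL) must be at or below the project QL.
# The example values are hypothetical and share the same units (e.g., ug/L).

def check_quantitation_limits(mdl, lab_ql, project_ql, action_limit):
    findings = []
    if lab_ql < 3 * mdl:
        findings.append("Laboratory QL is less than 3 times the MDL.")
    if project_ql > action_limit / 3:
        findings.append("Project QL is greater than one-third of the action limit.")
    if lab_ql > project_ql:
        findings.append("Laboratory reporting limit exceeds the project QL.")
    return findings or ["Stated QL relationships are satisfied."]

for message in check_quantitation_limits(mdl=0.5, lab_ql=2.0, project_ql=3.0, action_limit=10.0):
    print(message)
```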

2.6.2.4 Representativeness

Representativeness is a qualitative term that describes the extent to which a sampling design
adequately reflects the environmental conditions of a site. It takes into consideration the magnitude
of the site area represented by one sample and indicates the feasibility and reasonableness of that
design rationale. Representativeness also reflects the ability of the  sample team to collect samples
and the ability of the laboratory personnel to analyze those samples so that the generated data
accurately and precisely reflect site conditions.  In other words, a discrete sample that is collected
and then subsampled by the laboratory is   representative when  its measured  contaminant
concentration equates to the contaminant concentration of some predefined vertical and horizontal
spatial area at the site. Sample homogeneity, and sampling and subsampling variability, should be
considered when developing criteria for representativeness. The use of statistical sampling designs
and standardized SOPs for sample collection and analysis  help to ensure  that samples are
representative of site conditions.

The project team should determine and document the following:

••  Qualitative measurement performance criteria for acceptable representativeness for each
    matrix, analytical group, and concentration level.
••  Analyte-specific measurement performance criteria, if applicable.
••  QA/QC activities, or QC  samples, that should be performed or analyzed to measure
    representativeness for each matrix, analytical group, and concentration level.
[Figure 15 illustrates the relationships among detection and quantitation limits. Its key points:

 •  The statistical Method Detection Limit (MDL) is determined to be the laboratory's "best case" sensitivity for a given analytical method.
 •  The laboratory achievable Quantitation Limit (QL) is the minimum concentration of an analyte that can be routinely identified and quantified above the MDL by the laboratory; it should be at least 3 to 10 times greater than the MDL and should be associated with the lowest calibration standard or a low-level calibration check standard.
 •  Sample Quantitation Limits (SQLs) are laboratory QLs that are adjusted for dilutions, changes to sample volumes/sizes and extract/digestate volumes, percent solids, and cleanup procedures.
 •  The Project Quantitation Limit should be 3 to 10 times lower than the AL, 3 to 10 times greater than the achievable laboratory MDL, and verified by the analysis of a standard at that concentration in the calibration curve.
 •  The Action Limit (AL) may be based on a regulatory standard, a reference-based standard, a technological limitation, etc.

 Laboratory capability spans the MDL, laboratory QL, and SQL; project requirements span the project QL and the AL.]
                                           Figure 15. Relationships to Project Quantitation Limits
2.6.2.5 Comparability
Comparability is the degree to which different methods or data agree or can be represented as
similar. It describes the confidence that two data sets can contribute to a common analysis and
interpolation.

The project team should determine and document the following:

•• Quantitative performance criteria for acceptable data comparability for each matrix, analytical
   group, and concentration level.
•• Analyte-specific measurement performance criteria, if applicable.
•• QA/QC activities, or QC samples, that should be performed or analyzed to measure data
   comparability for each matrix, analytical group, and concentration level.

The QAPP should address issues such as consistency in sampling and analytical procedures within
and between data sets. For example, to ensure data comparability for repeated monitoring well
sampling, SOPs should require that well casings be notched or permanently marked so that the water
level measurement is taken from the same spot for each sampling event.

2.6.2.5.1 Split Sampling Data Comparability

Split samples are two or more representative portions taken from a sample in the field or laboratory
and analyzed by at  least  two different laboratories  to  assess precision, variability, and data
comparability between laboratories and/or methods.

Whenever split sampling and analysis are performed (e.g., multiple data generators on the same
project or as part of EPA oversight of the lead organization and its contractors and subcontractors),
comparability criteria must be established and documented in the QAPP or the oversight QAPP prior
to data collection. Comparability criteria should be determined for each matrix, analytical group (and
analyte, if applicable), and concentration level. Split sampling comparability criteria must specify
the following:

1. Acceptable  relative percent difference  (RPD) for individual analyte  comparisons  (for
   combinations of nondetects, detects close to the QLs, and detects sufficiently greater than the
   QLs).
2. Acceptable percentage of analytes (per matrix, analytical group, and concentration level) with
   acceptable RPDs.
3. Acceptable magnitude  and direction of bias for comparisons performed in 1 and 2 above.
4. Acceptable overall comparability criteria for all data generated for use in the project.
5. Corrective action and process for reconciliation of any differences, if overall comparability
   criteria are not met.

PT samples should be used to identify the magnitude and direction of bias for each data generator.
The  results  should be  compared  with 3,  above, and  with  the project-specific measurement
performance criteria so that data usability decisions can be made.

Whenever split sampling is performed, a comparability flow diagram must be included in the QAPP
(see Figure 16). The equation used to calculate RPD between split sample results generated by two
different laboratories is at the bottom of Figure 16. This equation uses absolute values since it
assumes that values generated by equivalent methods used by multiple entities are equally accurate.
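A hedged sketch of how split sample comparability might be tallied against project-specific criteria follows; the paired results, the per-analyte RPD limit, and the overall passing percentage used here are hypothetical examples, not requirements of this Manual.

```python
# Hedged sketch: tallying split-sample comparability between two laboratories
# against project-specific criteria. The RPD uses absolute values, as described above.
# The acceptance criteria (RPD <= 30%, at least 80% of analytes passing) are hypothetical.

def rpd(a, b):
    return abs(a - b) / ((a + b) / 2.0) * 100.0

# Hypothetical paired detects (lab 1 result, lab 2 result) per analyte
splits = {"benzene": (5.2, 6.0), "toluene": (12.0, 11.4), "xylenes": (3.0, 5.1)}

MAX_RPD = 30.0            # hypothetical per-analyte criterion
MIN_PASS_FRACTION = 0.80  # hypothetical overall criterion

passes = {name: rpd(x1, x2) <= MAX_RPD for name, (x1, x2) in splits.items()}
fraction_passing = sum(passes.values()) / len(passes)

print(passes)
print(f"{fraction_passing:.0%} of analytes meet the RPD criterion "
      f"({'acceptable' if fraction_passing >= MIN_PASS_FRACTION else 'not acceptable'})")
```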

2.6.2.5.2 Screening Versus Definitive Data Comparability

Whenever definitive analysis is performed to confirm screening results, comparability criteria must
be established and documented in the QAPP  prior to data collection. Comparability criteria must be
determined for each matrix, analytical group (and analyte, if applicable), and concentration level.
The most important factor for determining whether screening data will meet the PQOs and be usable
for project decision-making is the comparability of the screening data and the split sample
confirmation data generated using definitive analytical methods. Because data comparability
decisions are based on a limited number of samples analyzed by the definitive analytical methods,
the methods that are used to confirm screening results must be scientifically valid, well-documented
methods that have routinely been accepted by regulators.

 Screening Data Versus Definitive Data

 Screening data are analytical data that are of sufficient quality to support an intermediate or preliminary decision
 but must eventually be supported by definitive data before a project is complete. Definitive data are analytical
 data that are suitable for final decision-making.

When developing comparability acceptance criteria for screening and definitive data, the following
issues should be considered:

•• Are the screening  and definitive methods based on the same analytical principles? If the
   screening and definitive methods measure target analytes using different principles, then a one-
   to-one correlation should not be assumed.
• • Do the screening and definitive methods analyze for the same list of target analytes? If not, then
   a one-to-one correlation should not be assumed.
•• Do the screening and definitive methods  report to the same QL? If not, how will data that are
   reported below the  QL of either method be handled? Also,  are the QLs for the screening and
   definitive methods  significantly less than the action limits?
•• Do the screening and definitive methods have the same extraction efficiencies, use the same
   sample volumes, and perform similar  sample pretreatment  and  sample cleanup? These
   differences may also account for correlations that are not one-to-one.



•• How will percentage of moisture be accounted for in both screening and definitive samples?
•• Are the calibration procedures the same for the screening and definitive methods? That is,
   will standard calibration curves or single-point calibrations be generated?

Screening versus definitive comparability criteria must specify the following:

   1.  Acceptable percent differences for individual analyte comparisons (for combinations of
       nondetects, detects close to the QLs, and detects sufficiently greater than the QLs).
   2.  Acceptable percentage of analytes (per matrix, analytical group, and concentration level)
       with acceptable percent differences.
   3.  The acceptable magnitude and direction of bias for comparisons performed in 1 and 2
       above.
   4.  Acceptable overall comparability criteria for all data generated for use in the project.

Whenever screening versus definitive split sampling is performed, a comparability flow diagram
must be included in the QAPP. Multiple flow diagrams may be needed to address QL differences
between screening and definitive methods.

Figure 17, Example Comparability Determination, illustrates two approaches that can be used for
determining the comparability of screening and definitive  data. One approach involves the
generation and application of predesign correlation factors to adjust screening sample results prior
to performing data comparability calculations. Correlation factor adjustment of screening sample
results can be critical when a one-to-one correlation does not exist for data generated  with the
screening  and definitive  methods (depending  on differences in method selectivity,  sensitivity,
precision, and accuracy, as well as on the relationship of the laboratory achievable QLs to the action
limits). The equation used to define percent difference when comparing screening and definitive
results is included at the bottom of Figure 17. The equation  assumes that values generated by the
definitive method are more accurate than those generated by the screening method. While this may
not always be true, the equation serves to standardize reporting conventions and to promote data
comparability. Note that this equation retains the sign of the difference, thus absolute values are not
used.

The other approach for determining the comparability of screening and definitive data does not use
correlation factor adjustment of screening sample results prior to performing data comparability
calculations. Comparability calculations that are performed with screening and definitive data for
which correlation  factors have  not  been generated  or applied may  result in project-specific
comparability criteria being exceeded (especially if those criteria are tight).

Both approaches require that data comparability acceptance criteria be developed and documented
in an approved project QAPP prior to initiation of field sampling activities.
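The sketch below illustrates, with hypothetical numbers, the two approaches described above: computing a signed percent difference against the definitive result with and without a predesign correlation factor applied to the screening result. The exact form of the percent difference calculation is an assumption consistent with the description above (definitive result as the reference value, sign of the difference retained); the correlation factor and acceptance criterion are placeholders developed project by project.

```python
# Hedged sketch of the two screening-versus-definitive comparability approaches.
# The signed percent difference uses the definitive result as the reference value.
# The correlation factor and acceptance criterion are hypothetical placeholders.

def percent_difference(screening, definitive):
    """Signed %D relative to the definitive result (sign retained, no absolute value)."""
    return (screening - definitive) / definitive * 100.0

CORRELATION_FACTOR = 0.85   # hypothetical predesign correlation factor
MAX_ABS_PD = 50.0           # hypothetical project-specific acceptance criterion (%)

pairs = [(14.0, 10.0), (6.0, 5.5), (2.0, 3.0)]  # hypothetical (screening, definitive) results

for screening, definitive in pairs:
    raw = percent_difference(screening, definitive)
    adjusted = percent_difference(screening * CORRELATION_FACTOR, definitive)
    print(f"unadjusted %D = {raw:+.0f}%, correlation-adjusted %D = {adjusted:+.0f}%, "
          f"adjusted result {'meets' if abs(adjusted) <= MAX_ABS_PD else 'exceeds'} criterion")
```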
[Figure 17 (flow diagram): One path performs correlation factor adjustment of screening sample results before the overall evaluation of comparability; the other determines the percentage of splits that meet the project-specific comparability acceptance criteria from the pre-approved QAPP. The figure defines %D (percent difference) in terms of the concentration determined by definitive analysis, taken as the reference, and the concentration determined by screening analysis. Percentages shown in the figure are for illustration purposes only; site-specific values must be developed for each project by the project team.]

                                                Figure 17. Example Comparability Determination

2.6.2.6 Completeness
Completeness is a measure of the amount of valid data obtained from a measurement system,
expressed as the percentage of valid measurements obtained relative to the number of measurements
specified in the QAPP.

The QAPP should address how completeness will be determined by specifying the following:

•  Performance criteria for  acceptable completeness  for  each  matrix,  analytical group, and
   concentration level.
•  Analyte-specific measurement performance criteria,  if applicable.
•  QA/QC activities that should be performed to measure completeness.

Separate completeness values should be provided for the whole data set, not just for the critical data
subset. Since lack of data completeness may require resampling and additional costs, the QAPP
should discuss how the collection of sufficient data for critical sample locations will be ensured.
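As a minimal illustration of the calculation, the sketch below tallies completeness per matrix and analytical group and flags where a goal is not met; the 90 percent goal, matrices, and counts are placeholders, not requirements of this Manual.

    # Minimal sketch; the 90% goal, matrices, and counts are hypothetical placeholders.
    planned = {("groundwater", "VOA"): 10, ("surface soil", "Metals"): 100}
    valid = {("groundwater", "VOA"): 9, ("surface soil", "Metals"): 97}

    def completeness(valid_count, planned_count):
        """Percentage of the planned measurements that yielded valid data."""
        return 100.0 * valid_count / planned_count

    for (matrix, group), n_planned in planned.items():
        pct = completeness(valid.get((matrix, group), 0), n_planned)
        status = "meets goal" if pct >= 90.0 else "resampling may be needed"
        print(f"{matrix}/{group}: {pct:.1f}% complete ({status})")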

2.7    Secondary Data Evaluation

Previous sections discussed project quality objectives, measurement performance criteria, and
associated data quality indicators. In determining what data must be collected, the first step should
be an evaluation of existing data to determine whether they meet project needs. Secondary data may
include data generated for or by external, independent parties and then transmitted to the current user.
Secondary data may also include data collected in other investigations designed to answer different
questions than those posed in the current investigation. Using data and information that are not
generated for the same  quality  objectives as the current investigation may result in erroneous
decisions; therefore, it is essential to identify use limitations for secondary data. Figure 18 outlines
the process used to evaluate secondary data. All items listed under "Information Needed" may not
be available; however, the project team should evaluate  whatever information is available.

The QAPP should identify sources of previously collected data and other information that will be
used to  make project  decisions. Sources of secondary data and information include, but are not
limited to:

•  Historical data (e.g., from an organization's or facility's corporate records and/or Federal, State,
   or local records pertaining to previous monitoring events, site assessments, or investigations).
   Historical data may be used to describe the site history and define the environmental problem
   (see Section 2.5.2  of this Manual).
•  Background information and data from an organization's or facility's corporate records and/or
   Federal, State, or local records pertaining to site-specific industrial processes, process by-
   products, past and current chemical uses, raw material and finished product testing, waste testing
   and disposal practices, and potential chemical breakdown products.
•  Data generated to verify innovative technologies and methods.
•  Data generated from computer databases (such as manufacturers' process and product
   information, or waste management or effluent information).
•  Environmental data obtained from Federal, State, or local records.
•  Computer models or algorithms.
•  Literature files and searches.
•  Publications.
•  Photographs.
•  Topographical maps.

[Figure 18 (flow diagram): Generation/Collection of Original Data -> Evaluation of Secondary Data for Project Use -> Identification and Documentation of Limitations on the Use of Secondary Data for the Project. Information needed includes data generator information (originating organization; QAPP approval date; data generation/collection date(s); data types; data generation format), data source information (originating organization; data reporting date(s); data reporting format, i.e., report, laboratory data result sheets, etc.), and the sources identified (historical data; background information; data generated to verify innovative technologies and methods; data generated from computer databases; environmental indicator data; computer models or algorithms; literature files/searches; publications; photographs; topographical maps). Information evaluated includes the original data generation requirements, criteria, and results (purpose and scope of the original study; effectiveness of sampling design and procedures; effectiveness of analytical procedures; QC procedures, checks, and samples; method- and/or laboratory-specific QC acceptance criteria, documentation, and results; data review procedures, criteria, documentation, and results; other assessment procedures, criteria, documentation, and results; software validation procedures, criteria, documentation, and results), documentation completeness (original approved QAPP, final report, data generation and reporting formats, etc.), quality considerations/problems/uncertainties, and the organizational lead. Information documented includes project acceptance criteria, secondary data uses, secondary data limitations, and the organizational lead.]

                                      Figure 18. Secondary Data Evaluation Process
 Note: To ensure that correct environmental decisions are made, the same care should be taken when using
 secondary data as is taken when generating new data.
  Note: The information may be presented in tabular format; however, since the table will not be able
  to present all required information regarding secondary data, it will be necessary to provide
  additional information in the text.
All secondary data and information that will be
used for the project, their originating sources, their
planned uses, and any limitations on their use
should be provided in the QAPP. Figure 19 (QAPP
Worksheet #13)  provides  an  example of the
Secondary  Data  Criteria  and  Limitations table.
Secondary Data | Data Source (Originating Organization, Report Title, and Date) | Data Generator(s) (Originating Org., Data Types, Data Generation/Collection Dates) | How Data Will Be Used | Limitations on Data Use

Soil Gas Data | BioWatch Consulting, LTD: "Titanic Shipyard Investigation Report," 11/20/95 | BioWatch Consulting, LTD: VOC Soil Gas Data, Sample Collection Dates: 10/19-23/95 | To assess the potential sources of contaminated soil and resultant groundwater migration | 1. Unvalidated data used to generate report. 2. Insufficient data points to fully characterize on-site contamination and off-site migration.

Municipality Drinking Water Data | XYZ Municipality: Quarterly Drinking Water Check Report, 6/95 - 6/96 | Smith Laboratories, Inc.: VOC Drinking Water Data, Sample Collection Dates: 6/12/95, 9/15/95, 12/10/95, 3/6/96, 6/12/96 | To assess existing groundwater contamination | 1. Unvalidated data used to generate report. 2. Limited number of wells exist to sample.

               Figure 19. Example Secondary Data Criteria and Limitations
                                  (QAPP Worksheet #13)
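Projects that track many secondary data sources may also find it convenient to hold the Worksheet #13 entries in a structured form. The sketch below simply mirrors the worksheet columns using the soil gas example from Figure 19; the field names are illustrative, not prescribed by this Manual.

    # Illustrative only: fields mirror the Worksheet #13 columns shown in Figure 19.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class SecondaryDataEntry:
        title: str                  # Secondary data set (e.g., "Soil Gas Data")
        source: str                 # Originating organization, report title, and date
        generator: str              # Originating org., data types, generation/collection dates
        intended_use: str           # How the data will be used
        limitations: List[str] = field(default_factory=list)

    soil_gas = SecondaryDataEntry(
        title="Soil Gas Data",
        source='BioWatch Consulting, LTD: "Titanic Shipyard Investigation Report," 11/20/95',
        generator="BioWatch Consulting, LTD: VOC soil gas data, collected 10/19-23/95",
        intended_use="Assess potential sources of contaminated soil and resultant groundwater migration",
        limitations=["Unvalidated data used to generate report",
                     "Insufficient data points to fully characterize on-site contamination "
                     "and off-site migration"],
    )
    print(f"{soil_gas.title}: {len(soil_gas.limitations)} documented limitations")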

Once the secondary data sources have been identified, the project team should evaluate and discuss
how well the quality of the data meets the Project Quality Objectives and associated MPCs, as well
as the completeness of the data's documentation. The QAPP should identify the following:

•   Generator(s) of the data.
•   Dates the data were generated, collected, and reported.
•   Sources from which the data were obtained.
•   Procedures originally used to generate and collect the data (including sampling, analytical, and
    assessment procedures).
•   All QC procedures, checks, and samples that were analyzed with the data set, if known.
•   Method and/or laboratory-specific QC acceptance criteria used for data generation and whether
    or not data were reviewed.
•   If data were reviewed, the criteria and procedures used, the documentation provided, and the
    results obtained from previous data review activities (see Section 5.0 for a complete discussion
    of data review).

Additional items to address in the text related to the quality of the  secondary data include the
following:

•   If the data were generated under an approved QAPP or other sampling document, reference the
    document by title, date, originating organization,  and approving organization.
•   Evaluate the purpose and scope of previous studies and compare with current study objectives.
•   Evaluate similarities and differences of the measurement performance criteria and data quality
    indicators.
•   Evaluate the design and implementation of previous studies by examining the following:
    -  Whether the study was conducted properly
    -  Whether control responses were within acceptable limits
    -  Whether standard sampling and analytical methods and standard QA/QC protocols were
       available and  followed
•   Include a brief description of the sampling procedures for each matrix type (e.g., grab/grid for
    surficial soils) and analytical procedures for each matrix type (e.g., SW-846 Method 3550/8270
    for surficial soils).
•   If performance or system audits or split sampling activities were performed, provide a synopsis
    of the results of those  audits or activities.
•   If data were reviewed, reference the  data  review procedure  by title, date, and originating
    organization.
•   If data were obtained  from a computer model or algorithm, provide a brief description of the
    validation of that computer software.
•   If data were obtained from a database, provide a brief discussion on the integrity  and accuracy
    of the database information.

•  Discuss the adequacy of the original QA documentation under which  secondary data were
   generated.  For example, if sufficient raw analytical data are not available to verify that an
   instrument was calibrated accurately, then the secondary data may not be usable  for their
   intended purpose.
•  Relate the secondary data back to the PQOs and MPCs.

The QAPP should discuss all possible limitations on the use of secondary data for the project based
on the uncertainty surrounding their quality, including the following:

•• The nature and magnitude of the uncertainty. For example, discuss the impact of using
   unreviewed historical monitoring data to answer project questions and support project decisions.
   Unreviewed data may be scientifically inaccurate or may not meet the objectives of the user.
•• The impact of using secondary data with known analytical or sampling inaccuracy or bias, or
   known imprecision. For example, document the sampling and analytical methods used to collect
   and analyze soil VOA samples, and discuss possible low bias in sample results.
•• The acceptance criteria used to determine whether the secondary data and information are usable
   for the project. For example, if secondary drinking water data will be used to answer project
   questions, the QAPP should state that only data generated by EPA/State-certified or NELAP-
   accredited Safe Drinking Water Act laboratories will be used for the project.
•• The comparability criteria for secondary data (e.g., historical routine monitoring data) and the
   data generated for the current project.

2.8    Project Overview and Schedule

The QAPP should provide a general overview of the activities that will be performed, and how and
when they will be performed, based on background information and data, preplanning site visits, and
scoping sessions. Specific details  for the individual project activities will be provided in later
sections of the QAPP.

2.8.1   Project Overview (Outcome of Project Scoping Activities)

Through  project planning, the project  team should agree  on the purpose of the  project, the
environmental questions that are being asked, and the environmental decisions that must be made.
The project team should establish the PQOs (i.e., specify the type, quantity, and quality of data
needed to ensure that project data can be used  for the  intended purpose) to answer  specific
environmental questions, support environmental decisions, and determine technical activities that
will be conducted. Figure 20 (QAPP Worksheet #14) provides an example of the Summary of
Project Tasks table.
Sampling Tasks:      10 groundwater (GW) samples; 2 existing wells; and 8 newly installed wells
          24 soil boring (SB) samples; 3 from each of 8 borings during well installation
          100-120 surface soil (SS) samples; 50 collected at 0-1' depths on grid; 50 random; 20 contingency samples
          6 surface water (SW) samples; 2 from Ruby Brook; 4 from water in depressions
          6 sediment (SED) samples from Ruby Brook; 6 leachate samples on steep hill east of site
Analysis Tasks:       GW - VOC 5030B/8260B, SVOC 3520C/8270C, Pest 3510C/8081A, PCB 3510C/8082, Metals 3051/6010B, Mercury 7470A
          SW - VOC 5030B/8260B, SVOC 3520C/8270C, Pest 3510C/8081A, PCB 3510C/8082, Metals 3051/6010B, Mercury 7470A
          SS - VOC 5035/8260B, SVOC 3550C/8270C, Pest 3541/8081A, PCB 3541/8082, Metals 3051/6010B, Mercury 7471A
          SB - VOC 5035/8260B, SVOC 3550C/8270C, Pest 3541/8081A, PCB 3541/8082, Metals 3051/6010B, Mercury 7471A
          SED - SVOC 3550C/8270C, Pest 3550B/8081A, PCB 3541/8082, Metals 3051/6010B, Mercury 7471A
          Leachate - SVOC 3520C/8270C, Pest 3510C/8081A, PCB 3510C/8082, Metals 3051/6010B, Mercury 7470A
Quality Control Tasks: All matrices will have the following QC samples analyzed: duplicates, matrix spikes, matrix spike duplicates, VOA trip blanks, equipment blanks,
   bottle blanks, and PE samples.
  All analytical methods will include initial calibrations, continuing calibrations, tuning, reagent blanks, surrogates, replicates, laboratory control spikes, and all other
  applicable QC defined in the method.
Secondary Data: Data for the 1982-83 inspections, the 1985 initial investigation, Fire Department records, waste manifests, and company records will be
  reviewed. All data will be evaluated for project use. Limitations to documentation will be noted. Data deemed valuable will be added to the database.
Data Management Tasks: Analytical data will be placed in a database after validation. The database will also compile field measurements. Secondary data deemed
   usable will be added to the database. All data will be assessed by the Case Team.
Documentation and Records: All sample locations will be GPS documented, records of each sample collected will be kept in notebooks, and all field
  measurements will be documented in notebooks. COCs, airbills, and sample logs will be collected for each sample.
Data Packages: All data packages will include all elements listed in Table 6 of the Compendium.
Assessment/Audit Tasks: Sampling SOPs reviewed; PRPs will be notified March 1, 15, 22, 27, and 30 by phone and letter; Field Sample Collection and Documentation
  Audits: April 1, 15, 22, 27, and 30, 2000; no laboratory TSA.
Data Review Tasks: Each laboratory performing analyses of samples will verify that all data are complete for samples received. Data will be
   validated using Tier II, Region 1, EPA-New England Data Validation Functional Guidelines for Evaluating Environmental Analyses.
   All deliverables required. Validated data will be reviewed. Data usability will be assessed: MPCs met? Sampling and analytical error? Spatial variability?
  Measurement performance criteria set in the QAPP checked. Were QL requirements met? Data limitations will be determined. Data compared to project objectives.
  Corrective action initiated; data are placed in the database; tables, charts, and graphs are generated. Data compared to historical data.
                                                    Figure 20. Example Summary of Project Tasks
                                                                (QAPP Worksheet #14)



The project team should also agree on what environmental characteristics  of interest or target
analytes/contaminants of concern (COCs) will be measured. The list of target analytes/COCs should
be refined as much as possible using the information available and may increase or decrease as the
project progresses.
  Graded Approach

  The identification of target analytes/contaminants of concern represents one of the greatest opportunities for
  focusing a project, thereby saving time and money. Whenever possible, the list of target analytes/COCs should
  include those most likely to be found on the site. In some cases (e.g., an old fire training pit where a wide range
  of analytes may have been burned), devising a short list early in the project is difficult. Some people mistakenly
  believe, however, that if you have identified one or two analytes from an analyte group with a long list of analytes
  (e.g., semivolatile compounds), you might as well analyze the whole list of SVOCs (more than 70 compounds).
  Using  fewer analytes  provides the potential for significantly  improving the quality of the analysis  (e.g.,
  improvements in accuracy by optimizing the method for the specific chemical) as well as saving time and money.
The project team should determine the quality criteria that the data must meet to achieve the project
objectives, and document those measurement performance criteria in the QAPP. Project-required
QLs and action limits must be established prior to the selection of sampling and analytical methods.
To compensate for potential analytical inaccuracy at the QL, project-required QLs should be at least
3 to 10 times less than the action limits, if achievable.

The QLs from individual methods and laboratories are evaluated relative to project-required action
limits to determine their suitability to meet PQOs. If the published method QL exceeds the action
limit for a target analyte/COC, that analytical method is unacceptable for the analysis of that analyte.
However, if a laboratory has modified the published method to achieve QLs that are less than the
action limits, and it has documented this modification in its laboratory SOP, that laboratory SOP
might constitute an acceptable method. See Section 2.6.2.3 for additional guidance on QLs.
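The QL evaluation described above can be reduced to a simple check, sketched below in Python with values patterned on the example in Figure 21. The factor-of-3 margin reflects the lower end of the 3-to-10-times guideline; the function and its wording are illustrative assumptions, not part of this Manual.

    # Minimal sketch of the QL-versus-action-limit check described above.
    # The margin of 3 reflects the lower end of the "3 to 10 times less than the
    # action limit" guideline; all numeric values are illustrative.

    def ql_assessment(action_limit, method_ql=None, lab_ql=None, margin=3.0):
        """Return a short usability statement for one target analyte/COC."""
        ql = lab_ql if lab_ql is not None else method_ql
        if ql is None:
            return "No QL available - method/laboratory cannot be evaluated"
        if ql > action_limit:
            return "QL exceeds the action limit - unacceptable for this analyte"
        if ql * margin > action_limit:
            return "QL meets the action limit but without the desired margin - evaluate uncertainty"
        return "QL is sufficiently below the action limit"

    # Example: benzene in low-level groundwater, patterned on Figure 21 (units in ug/L)
    print(ql_assessment(action_limit=5.0, method_ql=None, lab_ql=0.5))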

Figure 21  (QAPP Worksheet #15)  shows  an  example Reference Limits and Evaluation  table.
Separate tables should be provided for each matrix, concentration level, and analytical group.
 Note: Achievable MDLs and QLs are those that an individual laboratory can achieve when performing a specific
 analytical method. An individual laboratory may not always be able to achieve the MDLs and QLs that are in a
 published method. Therefore, even though a published analytical method may meet project requirements, a
 laboratory may not necessarily perform the analytical method satisfactorily. Laboratory-achievable MDLs and
 QLs must be documented in the laboratory's SOP for each analytical method used by the laboratory for the
 project.
Matrix: Groundwater
Analytical Group: VOA
Concentration Level: Low

Analyte | CAS Number | Project Action Limit (µg/L) | Project Quantitation Limit (µg/L) | Analytical Method Limits1: MDL | Analytical Method Limits1: Method QL | Achievable Laboratory Limits2: MDL | Achievable Laboratory Limits2: QL
Benzene | 71-43-2 | 5 | 1 | 0.03 | Not provided in method | 0.1 | 0.5
Trichloroethene | 79-01-6 | 5 | 1 | 0.02 | Not provided in method | 0.11 | 0.5
Vinyl Chloride | 75-01-4 | 2 | 1 | 0.04 | Not provided in method | 0.11 | 0.5
1,2-Dichloroethane | 107-06-2 | 5 | 1 | 0.02 | Not provided in method | 0.11 | 0.5
Carbon Tetrachloride | 56-23-5 | 5 | 1 | 0.08 | Not provided in method | 0.12 | 0.5
1,2-Dichloropropane | 78-87-5 | 5 | 1 | 0.02 | Not provided in method | 0.11 | 0.5
1,1,2-Trichloroethane | 79-00-5 | 5 | 1 | 0.03 | Not provided in method | 0.13 | 0.5

1Analytical method MDLs and QLs are those documented in published methods.
2Achievable MDLs and QLs are limits that an individual laboratory can achieve when performing a specific analytical method.

                                    Figure 21. Example Reference Limits and Evaluation
                                                   (QAPP Worksheet #15)
If the laboratory and method cannot achieve the project QLs and action limits, one of the following
options should be pursued:

   Option 1 — Use a different laboratory.
   Option 2 — Use an alternative analytical method or a modified method.
   Option 3 — Accept a higher level of uncertainty for data falling between the MDL and QL.
   Option 4 — Adjust the action limits to reflect the capability of available methods to
               detect the target analytes/COCs.

2.8.2  Project Schedule

The QAPP should include a schedule of the work to be performed using a timeline or tabular format
(see Figure 22, QAPP Worksheet #16). The timeline must include the start and completion dates for
all project activities, as well as the quality assurance assessments that will be performed during the
course of the project. Sufficient time for document review  and implementation  of effective
corrective actions should be scheduled.
In addition to the timeline, the procedure for
notifying project participants of project
schedule delays should be included in the
QAPP. This description should identify, by
job function and organization name, the
personnel responsible for providing and
receiving such notification and the personnel
responsible for approving schedule delays.
Graded Approach

For projects that involve only routine monitoring, such
as NPDES compliance monitoring, the schedule may
include only the dates of the sampling and the date the
results are due to the regulatory oversight authority.
Activities | Organization | Anticipated Date(s) of Initiation (MM/DD/YY) | Anticipated Date of Completion (MM/DD/YY) | Deliverable | Deliverable Due Date

                           Figure 22. Project Schedule/Timeline
                                  (QAPP Worksheet #16)

The QAPP should include a discussion of all project-related resource, political, and time constraints,
along with seasonal sampling restrictions and considerations. All regulatory requirements and/or
restrictions, and any other factors that will affect the project schedule, should be identified.
3.0    MEASUREMENT AND DATA ACQUISITION ELEMENTS

This QAPP element group covers how project data will be collected, measured, and documented.
Proper implementation of these activities will help ensure that resulting data are scientifically sound,
of known and documented quality, and suitable for their intended use.

This section of the Manual addresses the quality control activities that will be performed during each
phase of data collection and generation, from sampling through data reporting; the evaluation of QC
acceptance limits; and the performance of corrective actions for nonconformances. It is important to remember
that each phase of data collection and generation is interdependent and that, therefore, quality must
be  factored  into  all project activities or  tasks.  The  final  two QAPP  element  groups,
Assessment/Oversight and Data Review, evaluate the activities or tasks  described in this element
group.
All sampling and analysis procedures that
will  be  used  in  the  project  must  be
documented  in the  QAPP  or attached
documents. Attachments must be provided
with, or clearly referenced in, the QAPP to
allow for review and approval. SOPs must
provide a detailed step-by-step description
of each procedure, in addition to acceptable
limits   of  performance   and  required
corrective actions, if applicable. Analytical
methods and SOPs must specify appropriate
QC  checks  and  samples  with  explicit
concentration and frequency requirements
for preparation and analysis. When a single
SOP has multiple options for a given procedure, which is common, the QAPP must identify the
option that applies to the project. If routine SOPs are modified to meet PQOs, the modification(s)
must be described in the QAPP along with an indication that a modification occurred. Appendix A
contains  examples of the types of SOPs that should be included in the QAPP, with additional detail
on their content.
Graded Approach

To simplify QAPP preparation, written SOPs should be
included as attachments to the QAPP whenever possible.
If procedures are documented in a separate document, that
document  should  be cross-referenced, as shown in the
example in Table 2, and either attached for review and
approval (if  not already approved)  or referenced with
sufficient specificity that it can be easily found. Information
in  attachments to the QAPP can be provided in an
electronic format (such as portable document format (PDF);
on CD-ROM, DVD-R, or other storage media; or on a
website).
 Writing and Formatting SOPs and Methods

 EPA QA/G-6, Guidance for the Preparation of Standard Operating Procedures (SOPs) for Quality-Related
 Documents (EPA/240/B-01/004, March 2001 or most recent revision) provides guidance for writing and formatting
 SOPs. The Environmental Monitoring Management Council Methods Format (www.epa.gov/ttn/emc/guidlnd/gd-
 045.pdf) provides guidance for writing and formatting analytical methods.
3.1 Sampling Tasks

The sampling sections of the QAPP  include all
components  of the  project-specific  sampling
system, including  process  design  and rationale,
procedures, and requirements.

The QAPP must contain sufficient documentation
to assure the reviewer that representative samples
from the appropriate matrix will be properly and
consistently collected at the appropriate locations
and that preventive and corrective action plans are
in place prior to initiation of the sampling event.

3.1.1   Sampling Process Design and Rationale
Definition of Sample

Since  the  definition  of  sample  is program-
dependent,  the term must be defined or the
regulatory definition referenced in the QAPP to
ensure the correct usage. For example, if a soil
sample is defined in the field by mesh size, then this
should be noted. If a laboratory then subsamples
this field sample based on the criteria of mesh size,
then those activities and definitions should also be
documented.
The outcome of the project scoping activities, including project quality objectives, measurement
performance criteria, and the acceptable level of uncertainty (see Section 2.8.1), should be used to
identify appropriate sampling design(s). The QAPP should describe the project team's rationale for
choosing the sampling process design methodology (e.g., grid system, biased statistical approach),
and describe the sampling system (design) in terms of what matrices will be sampled, where the
samples will be taken, the number of samples to be taken, and the sampling frequency (including
seasonal considerations). Whether the QAPP applies to an initial site investigation,  a large-scale
remedial investigation/feasibility study (RI/FS), a long-term treatment monitoring program, or a
volunteer monitoring program, the  rationale for sampling specific points or  locations must be
explained in the QAPP.

For each matrix,  a detailed  rationale for selection of the sampling design should be  provided,
including critical  and background sample locations. The QAPP should describe the logic used to
determine sample locations, analytical groups, and concentration levels, as well as the type, number,
and frequency of field samples and field QC samples to be collected (include statistical tests used,
conceptual site models, etc.). If software products, such as Visual Sample Plan  (VSP), are used to
provide project-specific, statistically derived rationales, the QAPP must document the name and
version number of the software and describe how it will be used to determine the sampling design.
QAPP Worksheet #17 may be used to present this information.
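As a simple illustration of one such design, the Python sketch below generates node coordinates for a rectangular sampling grid from a chosen grid spacing. The site dimensions, spacing, and sample ID convention are hypothetical, and a statistical design tool such as VSP would normally be used and documented as described above.

    # Illustrative sketch: node coordinates for a rectangular sampling grid.
    # Site dimensions, grid spacing, and the local coordinate origin are hypothetical.

    def grid_locations(width_m, length_m, spacing_m):
        """Return (x, y) grid-node coordinates, in meters from a site-local origin."""
        xs = [i * spacing_m for i in range(int(width_m // spacing_m) + 1)]
        ys = [j * spacing_m for j in range(int(length_m // spacing_m) + 1)]
        return [(x, y) for x in xs for y in ys]

    locations = grid_locations(width_m=100, length_m=60, spacing_m=20)
    print(f"{len(locations)} grid sampling locations")        # 6 x 4 = 24 nodes
    for sample_id, (x, y) in enumerate(locations[:3], start=1):
        print(f"SS-{sample_id:03d}: x={x} m, y={y} m")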

Examples of the information to be provided on the selection of sampling locations include:

••  The basis  for selecting the size of the grid, if a grid system will  be used to select random
    sampling locations.
••  Decision trees that document the critical decision points of the location selection process, if on-
    site analytical measurements or screening techniques will be used to identify sample locations.
•• The rationale for choosing a nonstatistical approach, if a biased sampling approach will be used
   to select sampling locations.
•• The criteria for selecting "hot spots," if biased or judgmental sampling will be performed.

Additional information  to  explain the sampling  design  rationale  may  be necessary,  such  as
compositing rationale and procedures, if samples will be composited.

Selected sample locations should be identified and documented on additional site maps, charts, and
plans. An SOP should be  attached or  referenced that documents how the sampling points  or
locations will be precisely determined, for example by using a geographic information system (GIS),
global positioning system (GPS), or physical markers, and if used, the GIS that will be used to store
and display site information. Site maps should include the site borders; well borings and test pit
installations from previous investigations; areas with known or suspected oil or chemical spills or
toxic substance releases; and buildings, hills, water bodies, depressions, etc.

Figure 23 (QAPP Worksheet #18) provides an example summary table. Selected information needed
to complete this table is  discussed in  Section 3.1.2 of this  Manual. Only a short reference for the
sampling  location rationale is necessary for  the table,  such as background,  grid, hot spot,
downgradient of release, VSP output, representativeness, or completeness. The text of the QAPP
should clearly identify the detailed rationale associated with each reference.
Sample Location/ID Number | Matrix | Depth (units) | Analytical Group | Concentration Level | Number of Samples (identify field duplicates) | Sampling SOP Reference1 | Rationale for Sampling Location
MW-1 | GW | 20-30 ft | VOA | Low | 1 | S-1 | Background
MW-1 | GW | 20-30 ft | SVOC | Low | 1 | S-2 | Background

1Specify the appropriate reference letter or number from the Project Sampling SOP References table (QAPP Worksheet #21).

         Figure 23. Example Sampling Locations and Methods/SOP Requirements
                                  (QAPP Worksheet #18)
 Graded Approach

 For a site with a large number of sample locations or ID numbers, the range of location or ID numbers can be
 grouped by similar matrix, analytical group, or concentration level.
A summary of the analytical SOP requirements should be provided (see Figure 24, QAPP Worksheet
#19), and the QAPP should document the rationale for selecting the sample volume, container,
preservation, and holding times requirements (e.g., EPA method or regulation). Information needed
to complete this table is discussed in Sections 3.1.2 and 3.2 of this Manual. Information concerning
sample containers, volume, and preservation is discussed in Section 3.1.2.2.

Matrix | Analytical Group | Concentration Level | Analytical and Preparation Method/SOP Reference1 | Sample Volume | Containers (number, size, and type) | Preservation Requirements (chemical, temperature, light protected) | Maximum Holding Time (preparation/analysis)

1Specify the appropriate reference letter or number from the Analytical SOP References table (QAPP Worksheet #23).

                          Figure 24. Analytical SOP Requirements
                                   (QAPP Worksheet #19)

The number of field QC samples for each matrix, analytical group, and concentration level should
also be provided (see Figure 25, QAPP Worksheet #20). The QAPP should document the rationale
for selecting the number and type of field QC samples (e.g., EPA method or regulation), provide a
complete detailed description of the analytical tasks and associated analytical control, and identify
all analytical  SOPs and methods in the appropriate sections of the QAPP (see Sections 3.2.1 and
3.4).
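Where the project's documented rationale sets field QC frequencies (for example, one field duplicate per 10 field samples and one MS per 20, which are offered here only as common conventions and are not requirements stated in this Manual), the worksheet totals can be tallied along the lines of the sketch below.

    # Illustrative sketch only: the QC frequencies are assumed conventions, not
    # requirements of this Manual; the project's documented rationale governs.
    import math

    def field_qc_counts(n_field_samples, dup_per=10, ms_per=20,
                        field_blanks=5, equip_blanks=5, pt_samples=1):
        """Rough tally for one matrix/analytical group/concentration level."""
        duplicates = math.ceil(n_field_samples / dup_per)
        matrix_spikes = math.ceil(n_field_samples / ms_per)
        total_to_lab = (n_field_samples + duplicates + matrix_spikes
                        + field_blanks + equip_blanks + pt_samples)
        return {"field samples": n_field_samples, "field duplicate pairs": duplicates,
                "MS": matrix_spikes, "field blanks": field_blanks,
                "equipment blanks": equip_blanks, "PT samples": pt_samples,
                "total to laboratory": total_to_lab}

    print(field_qc_counts(n_field_samples=50))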
Matrix | Analytical Group | Concentration Level | Analytical and Preparation SOP Reference1 | No. of Sampling Locations2 | No. of Field Duplicate Pairs | No. of MS (Inorganic) | No. of Field Blanks | No. of Equip. Blanks | No. of PT Samples | Total No. of Samples to Lab

1Specify the appropriate reference letter or number from the Analytical SOP References table (QAPP Worksheet #23).
2If samples will be collected at different depths at the same location, count each discrete sampling depth as a
separate sampling location or station.

                     Figure 25. Field Quality Control Sample Summary
                                  (QAPP Worksheet #20)

3.1.2   Sampling Procedures and Requirements

All sampling procedures that will be used in the project must be  documented  in the  QAPP or
attached  documents. Attachments must be provided with or referenced in the QAPP to allow for
review and approval. Standardized sampling procedures provide consistency between  samplers;
facilitate collection of accurate, precise,  and representative  samples; and  help to ensure data
comparability and usability. Although it may be possible to comprehensively describe the sampling
procedures for small projects within the text of the QAPP, the most efficient and cost-effective way
to document project-specific sampling techniques is to include sampling SOPs as attachments to the
QAPP.

Sampling procedures should include SOPs for sampling each matrix and each analytical parameter
for each type of equipment and technique. The SOPs must detail the appropriate number, size, and
type of sample containers to be used for collection of each field sample and field QC sample and the
proper temperature, light, and chemical preservation procedure for those samples.
 Note: All project sampling SOPs must be listed, including, but not limited to, sample collection, sample
 preservation, equipment cleaning and decontamination, equipment testing, inspection and maintenance, supply
 inspection and acceptance, and sample handling and custody SOPs.
The  QAPP should provide a table that contains the information shown in Figure 26 (QAPP
Worksheet #21). Sequentially number sampling SOP references in the Reference Number column.
The reference number can be used throughout the QAPP to refer to a specific SOP.
Reference Number | Title, Revision Date, and/or Number | Originating Organization | Equipment Type | Modified for Project Work? (Y/N) | Comments

                       Figure 26. Project Sampling SOP References
                                  (QAPP Worksheet #21)

3.1.2.1 Sample Collection Procedures

The QAPP must describe how samples will be collected. The selected sample collection procedures
must be appropriate to ensure that project personnel collect representative samples  in a consistent
manner for all required sample matrices and locations, that contamination is not introduced during
collection, and that sample volumes are properly preserved in order to meet project objectives.

3.1.2.2  Sample Containers, Volume, and Preservation

The QAPP should include a description of preservation procedures (temperature, light, chemical)
that maintain sample integrity in the field, prior to and during shipment to,  and immediately upon
receipt by, the off-site or mobile on-site laboratory. The QAPP should document requirements for
sample volumes, container types,  numbers of containers, and preservation procedures for each
analytical group, matrix, and concentration level (see Figure 24, QAPP Worksheet #19).

3.1.2.3 Equipment/Sample Containers Cleaning and Decontamination Procedures

The QAPP should provide details on the procedures for both the initial cleaning of sampling
equipment and subsequent decontamination procedures that will be followed during the sampling
event. These procedures will help ensure that collected samples are representative of the sampling
location by  verifying that sampling equipment is  clean and free of target analytes/COCs or
interferences. Cleaning and decontamination procedures should cover all equipment that contacts


a sample. If the sampling equipment is disposable ("one use only"), procedures for cleaning and
decontamination are not necessary; however, the QAPP should state that disposable equipment will
be used.

3.1.2.4 Field Equipment Calibration, Maintenance, Testing, and Inspection Procedures

The QAPP should describe all procedures and documentation activities that will be performed to
ensure that field sampling equipment is available and in working order when needed; that all field
equipment, including tools, gauges, pumps, etc., is calibrated to perform within specified limits;
and that corrective action is taken to fix problems prior to and during field operations. The
procedures should  include record-keeping  for documenting  field  equipment  calibration,
maintenance,  testing, and inspection activities and discuss the availability  of spare parts and
equipment  to ensure that project schedules  are met. Figure 27 (Worksheet #22)  shows  what
information to include in the Field Equipment Calibration, Maintenance, Testing, and Inspection
table. The  information  provided  should  demonstrate the  ability of the equipment  to  collect
appropriate samples and data during field operations. All field equipment (other than analytical
instrumentation) should be listed, including but not limited to tools, gauges, and pumps.
Field Equipment | Calibration Activity | Maintenance Activity | Testing Activity | Inspection Activity | Frequency | Acceptance Criteria | Corrective Action | Responsible Person | SOP Reference1

1Specify the appropriate reference letter or number from the Project Sampling SOP References table (QAPP Worksheet #21).

       Figure 27. Field Equipment Calibration, Maintenance, Testing, and Inspection
                                  (QAPP Worksheet #22)

3.1.2.5 Sampling Supply Inspection and Acceptance Procedures

The QAPP should document the procedures and activities that will be performed to ensure that all
sampling supplies are free of target analytes/COCs and interferences and provide inspection  and
acceptance requirements  for any supplies  or consumables that  could  affect data  quality. The
documentation should include the following:

•• Supplies that will be used during sampling
•• All vendors for supplies and reagents
•• Specifications for all  supplies and reagents that could affect data quality (such as  level of
   contamination, pesticide versus reagent-grade).
•• Procedures that will be used to ensure supply cleanliness and reagent purity (such as recording
   reagent lot numbers)
•• Procedures for measuring supply cleanliness
•• Corrective action procedures for preventing the use of unacceptable supplies

The individuals responsible for checking supplies and implementing corrective actions should be
identified. This information may be contained in an SOP attached to or referenced in the QAPP.

3.1.2.6 Field Documentation Procedures

To provide a permanent record of field activities and possible introduction of sampling error,
observations and measurements taken in the field must be recorded. Typically, field data are
recorded in field logbooks, on field data collection forms, or electronically.

As part of the overall project data tracking and management system (described in Section 3.5), the
QAPP should describe the field documentation tracking and management system. For example, the
title of each field notebook should indicate its function, and each notebook used for a specific site
or project should be referenced to all the other project notebooks, including the project manager's
daily  log.  Also,  each  notebook should be tracked and archived with other project records in
accordance with the project data management system.

Since field information depends on the specific matrix and procedure, the QAPP should describe the
field information that will be recorded for each matrix and each type of sampling procedure.  If field
data collection forms will be used, examples of the forms should be included as figures in the QAPP
or as attachments to the QAPP,  and the forms referenced (this also applies to electronic forms). If
field notebooks will be used, the requirements for the notebooks should be described in the  QAPP.
Bound notebooks with water-resistant, sequentially numbered pages and indelible ink entries should
be used.

Regardless of the means used to record sampling information, copies of field data records should
be included with the associated data review reports to facilitate the identification of sampling error.

3.2     Analytical Tasks

The following sections address all components of the project-specific analytical measurement
system, including on-site and off-site laboratory analytical SOPs; method- and laboratory-specific
QC measurements, acceptance criteria, and  corrective  actions;  calibration  procedures; and
instrument, equipment, and supply maintenance, testing, and inspection requirements. The following
sections apply to both on-site and off-site analytical procedures. Different types of analyses can be
addressed in separate sections within the QAPP.

On-site analysis includes both semiquantitative and semiqualitative field screening techniques and
definitive analytical methods. Definitive data may be generated for field parameters, including
specific conductance, temperature, dissolved oxygen,  pH, turbidity, and oxidation/reduction
potential using field instrumentation. Definitive inorganic and organic data may be generated in a

mobile on-site  laboratory equipped with a gas chromatograph (GC),  gas  chromatograph/mass
spectrometer (GC/MS), inductively coupled plasma (ICP), etc.
                                            Screening and Definitive Data

                                            Screening data may be  used to  support intermediate
                                            decisions, while only definitive data may be used to
                                            support final decisions.
The QAPP should  differentiate between
screening procedures and procedures used to
generate definitive data. Definitive data can
be generated by a field method, an on-site
laboratory,  or  an  off-site  laboratory. If
definitive data will be generated in the field,
documentation  (e.g.,  a QA  plan,  SOP,
sampling and analysis plan, field QA plan)
must be referenced or attached to the QAPP. If definitive data will be generated by a laboratory (on-
site or off-site), the equivalent of a laboratory QA plan should be provided as an attachment to the
QAPP. This document may not be necessary if only screening data are being generated. However,
the SOPs for generating screening data must be referenced in the QAPP, and they must be available
to the personnel performing the screening and the reviewer upon request. If the analytical procedures
are documented in the laboratory's QA plan or manual, it may be easiest to reference the appropriate
sections of those documents or include only the relevant sections in the QAPP. This would eliminate
the need to include separate analytical SOPs (assuming that those relevant sections of the
laboratory's QA plan contain all of the required information). Laboratory QA plans or manuals must
be included for each laboratory retained for analytical services.

The QAPP must provide sufficient documentation to assure the reviewer that accurate, precise, and
usable data will be generated and that preventive and corrective action plans are in place prior to the
initiation of the sampling event.
Where contractual, regulatory, or programmatic
requirements  specify  that  a laboratory be
accredited, documentation of the laboratory
accreditation  should  be  included  as  an
attachment to the QAPP.
                                              Contracting Services

                                              All contracted or subcontracted on-site analytical and
                                              off-site laboratory services should be in place before
                                              the final QAPP is approved.
The QAPP  should describe the  analytical
techniques that will be used to generate screening and definitive data for the project. It also should
document the analytical SOPs that will be used to meet measurement performance criteria and
achieve project-required QLs for the target analytes/COCs at target concentration levels and in
specific matrices.
                                               Methods Versus Analytical SOPs

                                               Note the difference between methods and analytical
                                               SOPs: Methods are published procedures that
                                               describe preparatory and analytical determinative
                                               techniques used in target analyte identification and
                                               quantitation.  Analytical SOPs  document how a
                                               particular  laboratory  will perform  a  specific
                                               analytical method.
3.2.1   Analytical SOPs

All analytical procedures that will be used in the
project must be documented in the QAPP  or
attached  document(s) to  allow  for review and
approval. Attachments must be provided with or
referenced  in the QAPP.  Although it may  be
possible to describe simple analytical procedures
within the text of the QAPP, the most efficient and
cost-effective way to document project-specific
measurement procedures is to include analytical SOPs as attachments to the QAPP. The QAPP
should include SOPs and reference the methods they are based on for each analytical group, matrix,
and concentration level that will be investigated. Proprietary SOPs may be submitted as confidential
business information. All  SOPs must specify the maximum allowable holding time from sample
collection to sample preparation or analysis, as appropriate.
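Because every SOP specifies a maximum holding time, conformance is straightforward to check once collection and preparation/analysis dates are recorded; a minimal sketch follows (the 14-day limit and the dates are placeholders, not values taken from this Manual).

    # Minimal sketch: compares elapsed time against the maximum holding time given
    # in the applicable SOP. The 14-day limit and dates are hypothetical placeholders.
    from datetime import datetime

    def holding_time_ok(collected, prepared_or_analyzed, max_days):
        """True if the sample was prepared/analyzed within the SOP's maximum holding time."""
        return (prepared_or_analyzed - collected).days <= max_days

    collected = datetime(2005, 3, 1, 10, 30)
    analyzed = datetime(2005, 3, 12, 16, 0)
    print(holding_time_ok(collected, analyzed, max_days=14))   # True: about 11 days elapsed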

Figure 28 (QAPP Worksheet #23)  shows what information on analytical SOPs to include in the
references table. References to analytical SOPs should be sequentially numbered in the Reference
Number column. The reference number can be used throughout the QAPP to refer to a specific SOP.
Reference Number | Title, Revision Date, and/or Number | Definitive or Screening Data | Analytical Group | Instrument | Organization Performing Analysis | Modified for Project Work? (Y/N)

                          Figure 28. Analytical SOP References
                                 (QAPP Worksheet #23)

3.2.2   Analytical Instrument Calibration Procedures

To ensure that the analytical methods and the selected instrumentation meet the project requirements
for selective, sensitive, accurate, and precise detection and quantitation of the analytes of interest,
it is necessary to describe completely the calibration procedures for each analytical instrument, as
well as demonstrate the ability of the analytical technique to accurately and precisely identify and
quantitate the target analytes/COCs at the required QLs and within the required measurement ranges.

All instruments must be calibrated according to a schedule specified by the method and instrument
manual or SOPs. Calibration procedures may be documented separately in the QAPP or cross-
referenced and included with the analytical SOPs as attachments to the QAPP.

The QAPP should contain the information shown in Figure 29 (QAPP Worksheet #24), and a list of
all analytical instrumentation should be provided.
Instrument | Calibration Procedure | Frequency of Calibration | Acceptance Criteria | Corrective Action (CA) | Person Responsible for CA | SOP Reference1

1Specify the appropriate reference letter or number from the Analytical SOP References table (QAPP Worksheet #23).

                       Figure 29. Analytical Instrument Calibration
                                 (QAPP Worksheet #24)

3.2.3   Analytical  Instrument  and  Equipment Maintenance,  Testing, and  Inspection
       Procedures

The QAPP should describe the procedures and documentation activities that will be performed to
ensure that all analytical instrumentation and equipment are available and in working order when
needed. The QAPP should contain the information shown in Figure 30 (QAPP Worksheet #25).

Instrument and equipment maintenance logs must be kept to document analytical instrumentation
and equipment maintenance, testing,  and inspection activities.

The QAPP should discuss the ability to ensure that project schedules are met (e.g., availability of
spare  parts or spare instruments, instrument control (on-site and during storage), security, and
availability (e.g., log-in/log-out procedures)).

Instrument/Equipment | Maintenance Activity | Testing Activity | Inspection Activity | Frequency |
Acceptance Criteria | Corrective Action | Responsible Person | SOP Reference1

1Specify the appropriate reference letter or number from the Analytical SOP References table (Figure 28).

  Figure 30. Analytical Instrument and Equipment Maintenance, Testing, and Inspection
                                (QAPP Worksheet #25)

3.2.4   Analytical Supply Inspection and Acceptance Procedures

The QAPP should document the procedures and activities that will be performed to ensure that all
supplies used  in analytical work will be available when needed and will be free of  target
analytes/COCs and interferences. The documentation should include the following:

   •  Supplies that will be used in the performance of analytical work
   •  All vendors for supplies and reagents
   •  Specifications for all supplies and reagents that could affect data quality (such as level of
      contamination, pesticide-grade versus reagent-grade)
   •  Procedures that will be used to ensure supply cleanliness and reagent purity (such as recording
      reagent lot numbers)
   •  Procedures for measuring supply cleanliness
   •  Corrective action procedures for preventing the use of unacceptable supplies

The individuals responsible for checking supplies and implementing corrective actions should be
identified. This information may be contained in an SOP attached to or referenced in the QAPP.

3.3    Sample Collection Documentation, Handling, Tracking, and Custody Procedures

The QAPP must include all sample collection documentation and sample handling, tracking, and
custody procedures used to ensure that sample integrity and custody are maintained. The procedures
should address sample collection, packaging, handling, and shipping, as well as records, receipt of
laboratory samples, archiving, and disposal. Chain-of-custody SOPs should include those procedures
associated with sampling and on-site and off-site laboratory analysis. The procedures may be
included as attachments, and cross-referenced in the QAPP.

3.3.1   Sample Collection Documentation

Proper field sampling and on-site and off-site analytical  documentation help ensure sample
authenticity (i.e., the sample identity is correct) and data integrity. On-site analytical and off-site
laboratory documentation procedures  are discussed in Section 3.5, in conjunction  with  data
management and proj ect records. The QAPP should describe sample documentation procedures that
will be followed for the project.

Documentation for sample collection includes sample container identification. The QAPP should
specify the required sample  identification  information and include an example. An  electronic
system, such as a bar code or FORMS II Lite (which retrieves information stored elsewhere), may
be used. The QAPP should describe how the information  on the label will be preserved (e.g., by
covering the label with clear tape to minimize water damage during transit). Refer to Section 3.1.2.6
for information about field documentation (e.g.,  field logbooks, field  data collection forms)
procedures.

3.3.2   Sample Handling and Tracking System

Proper sample tracking systems support the chain-of-custody procedures, which in turn help to
ensure sample authenticity and data defensibility. The QAPP should document the procedures that
will be followed to identify and track samples that are collected in the field, analyzed on-site, and
delivered or shipped to an off-site laboratory for analysis, as well as samples transferred throughout
the laboratory. If samples are shipped to an off-site laboratory, then the laboratory' s sample handling
and tracking system should also be described.

The sample handling and tracking procedures should do the following:

•  Describe the sample numbering system for field sample collection and provide an example (see the
   sketch following this list). If applicable, the numbering system should follow specific
   programmatic requirements that apply to the project. A systematic approach for numbering samples
   should be used so that each sampling location, matrix type, sample depth or height, and date and
   time of collection can be uniquely identified and cross-referenced to the programmatic sample
   number.

•  Describe the sample container identification information. See Appendix A, Section A.3.1.

•  Describe the laboratory sample tracking procedures. If laboratory identification numbers will be
   used to track samples internally, the laboratory procedure must describe how these laboratory
   identification numbers will be cross-referenced with the sample number assigned in the field.

•  Describe sample storage procedures used by the off-site or mobile on-site laboratory.
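
The sketch below is illustrative only; it shows one way a field team might generate sample numbers
of the kind described above and cross-reference them to laboratory identification numbers. The
numbering scheme (site-location-matrix-date-sequence) and all identifiers are hypothetical and would
be replaced by the programmatic convention that applies to the project.

    def make_field_sample_id(site, location, matrix_code, date_yyyymmdd, sequence):
        """Build a hypothetical field sample number, e.g. 'SITE1-MW01-GW-20050301-001'."""
        return f"{site}-{location}-{matrix_code}-{date_yyyymmdd}-{sequence:03d}"

    # Cross-reference between field sample numbers and laboratory identification numbers
    cross_reference = {}

    def log_laboratory_receipt(field_sample_id, laboratory_id):
        """Record the laboratory's internal ID against the field sample number."""
        cross_reference[field_sample_id] = laboratory_id

    field_id = make_field_sample_id("SITE1", "MW01", "GW", "20050301", 1)
    log_laboratory_receipt(field_id, "L0503-0147")   # hypothetical laboratory ID
    print(field_id, "->", cross_reference[field_id])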

3.3.2.1 Sample Handling

To demonstrate the project's sample handling process, the QAPP should include a table that shows
the flow of samples from the time of collection to laboratory delivery to final sample disposal (see
Appendix A, QAPP Worksheet # 26, for an example). The table should identify each component of
the  project-specific  sample handling  system; indicate the personnel (and their organizational
affiliations) who are primarily responsible for ensuring proper sample handling, custody, storage,
and disposal; and specify  the length of time that samples, digestates and extracts, and biological
collections will be retained by the laboratory prior to disposal.

3.3.2.2 Sample Delivery

The QAPP should describe how samples  will be delivered or shipped to the laboratory.  The
description should include the name of the carrier service, if applicable, and define how samples will
be batched or grouped when sent to the laboratory. Samples can be grouped in sample  delivery
groups (SDGs), defined as a group of 20 or fewer field samples within a project. Proficiency testing
samples and other field QC samples (e.g., equipment blanks, volatile organic analytes [VOA] trip
blanks) are counted as field samples in the 20-sample SDG total.
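
As a minimal sketch only (not a procedure prescribed by this Manual), samples can be batched into
SDGs of 20 or fewer as defined above; the sample identifiers are hypothetical, and the only
project-specific value is the 20-sample limit stated in the text.

    def assign_sample_delivery_groups(sample_ids, max_per_sdg=20):
        """Split a list of field sample IDs (including field QC samples) into SDGs of 20 or fewer."""
        return [sample_ids[i:i + max_per_sdg] for i in range(0, len(sample_ids), max_per_sdg)]

    samples = [f"SS-{n:03d}" for n in range(1, 46)]   # 45 hypothetical sample IDs
    sdgs = assign_sample_delivery_groups(samples)
    print(len(sdgs), [len(g) for g in sdgs])          # 3 SDGs of 20, 20, and 5 samples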

The QAPP should include provisions for packaging, marking and labeling, and shipping samples
in compliance  with the most recent U.S. Department of  Transportation  regulations for shipping
hazardous and nonhazardous materials. Air carriers  that transport hazardous materials require
compliance with the current edition of the International Air Transport Association Dangerous Goods
Regulations, which applies to shipment and transportation of hazardous materials by air carriers.
Shipment papers, including bills of lading and airbills, should be retained by the laboratory with
chain-of-custody records.  Examples of all sample shipment forms to be used should be included.
These may be the same as the chain-of-custody forms, which are discussed in Section 3.3.3.

3.3.3   Sample Custody
A sample is in "custody" if it is in the actual physical possession of authorized personnel or in a
secured area that is restricted to authorized personnel. For some projects, an evidentiary paper trail
documenting sample custody is required in order to meet project quality objectives. Since it is often
difficult to predict what samples or projects will require proof of custody after the fact, all  data
collection  events  should employ  documented  chain-of-custody  procedures to ensure  data
authenticity and defensibility.
The QAPP should describe the procedures that will be used to maintain sample custody and integrity
and to document the implementation of chain-of-custody procedures. QAPP Worksheet #27 may be used to
present sample custody information. The evidentiary trail from sample collection through data
generation and archiving is maintained using sample custody procedures and documented by complete
chain-of-custody records.

Note: Only through complete documentation can the end-user prove that the individual sample results
reflect a particular sample (collected at a specific location at a specific date and time) and that
the sample was handled as prescribed.

Chain-of-custody procedures ensure accountability for the location and integrity of the sample at
all times. (The ASTM document D4840-99, Standard Guide for Sampling Chain-of-Custody Procedures,
contains information regarding chain-of-custody procedures.)

Sample custody procedures should include the field sampling team's procedures for maintaining and
documenting sample custody from the time samples are collected in the field through packaging,
shipment, and delivery to the laboratory. Field sampling documents that describe chain-of-custody
procedures, including  SOPs, should be  attached to or referenced in the QAPP. The laboratory's
procedures for maintaining and documenting sample custody from the time the samples are received
at the laboratory through archiving and disposal should also be attached to or referenced in the
QAPP. The use of software, such as FORMS II Lite, must be documented, if applicable.

Examples of all chain-of-custody documentation that will be used during the project should be
provided in the QAPP, including chain-of-custody forms, traffic reports,  sample identification,
custody seals, laboratory sample receipt forms, laboratory sample transfer forms, etc.
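
For illustration only (not a requirement of this Manual), the information typically captured on a
chain-of-custody record can be represented as a simple data structure. Every field name and value
below is a hypothetical placeholder rather than a prescribed form field.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class CustodyTransfer:
        relinquished_by: str
        received_by: str
        date_time: str          # e.g., "2005-03-01 16:45"
        purpose: str            # e.g., "shipment to off-site laboratory"

    @dataclass
    class ChainOfCustodyRecord:
        field_sample_id: str
        matrix: str
        analytical_group: str
        preservation: str
        custody_seal_number: str
        transfers: List[CustodyTransfer] = field(default_factory=list)

    # Example: one sample relinquished by the field team and received by the laboratory
    record = ChainOfCustodyRecord("SS-MW01-0305-001", "groundwater", "VOA",
                                  "HCl, 4 degrees C", "CS-0147")
    record.transfers.append(CustodyTransfer("J. Field", "Lab Sample Custodian",
                                            "2005-03-01 16:45",
                                            "shipment to off-site laboratory"))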

3.4    Quality Control Samples

This section addresses quality control samples only. Quality control (QC) is the set of activities that
are performed for the  purposes of monitoring, measuring, and controlling the performance of a
measurement process.  QC samples provide measurable data quality indicators used to evaluate the
different components of the measurement system, including sampling and analysis.

During the systematic planning process, each QC sample's value should be determined based on its
contribution to measuring precision, accuracy/bias, contamination, and sensitivity. QC samples may
impose significant costs; therefore, it is important to identify which of those samples are not cost-
effective (i.e., which provide little additional information regarding data quality, or which duplicate
information provided by other QC samples). Project QC needs must be determined  based on the
decision to be made and the related level of data quality required. Deciding the most appropriate QC
samples and setting appropriate acceptance limits are a key part of project planning and frequently
require some professional judgment.
  QA/QC Compendium

  Part 2B of the UFP-QAPP, the QA/QC Compendium, provides background material to this Manual. The
  Compendium identifies as minimum activities those QC samples that either provide the most reliable information
  on overall data quality or identify specific sources of error. See Section 2.2 of the Compendium for further rationale
  for QC sample selection. Note: Many QC samples that are standard requirements in analytical methods are listed
  in Tables 4 and 5 but are not included in the QA/QC Compendium.

  Many (but not all) analytical methods will also specify QC practices. The minimum QC activities for screening and
  definitive data collected for use in the CERCLA process are provided in the Compendium. These activities may
  be appropriate for other environmental programs.
Tables 4 and 5 provide examples of QC samples that are frequently incorporated into chemical data
collection and analysis activities. Samples that commonly originate in the field are listed in Table
4; samples that are usually initiated in the laboratory are listed in Table 5. The QC needs for each
sampling project are unique. Although Tables 4  and 5 contain the most frequently run QC samples,
not all samples are applicable to every analytical procedure.  Those QC samples that are minimum
activities identified in Part 2B of the UFP-QAPP are identified.

Table 4. Recommended Types and Frequency of Sampling QC Samples for Chemical Data Collection

Sampling QC1 | Data Quality Indicator2 | Recommended Frequency
Field Blank (including VOA Trip Blank)3 | Contamination (Accuracy/Bias) | Minimum 1 per shipment cooler per analytical group per concentration level
Equipment Blank (rinsate blank)3 | Contamination (Accuracy/Bias) | Minimum 5% per analytical group per matrix per sampling procedure per sampling team
Proficiency Testing Sample3,4 | Accuracy/Bias | Minimum 1 per SDG per analytical group per matrix per concentration level
Field Duplicates (Co-located Samples3; Subsamples) | Precision | Minimum 5% per analytical group per matrix per sampling procedure per sampling team
Split Samples | Interlaboratory Comparability | As specified by method and based on PQOs

1Collected and analyzed to measure errors introduced during sampling and other field activities.
2See Table 6 for additional DQI information.
3Minimum QC activity from Part 2B of the UFP-QAPP.
4Proficiency testing samples have been included under field sampling QC samples since they may be introduced during that
stage. They primarily measure analytical error, since their composition is unknown to the laboratory and they originate
outside of the laboratory.


Table 5. Recommended Types and Frequency of Analytical QC Samples for Chemical Data Collection

Analytical QC | Data Quality Indicators1 | Recommended Frequency
Method Blank | Accuracy/Bias (Contamination) | Minimum 1 per SDG per analytical group per matrix per concentration level
Instrument (System) Blank | Accuracy/Bias (Contamination) | As specified by method and based on PQOs
Laboratory Duplicates2 | Precision | Minimum 1 per inorganic SDG per analytical group per matrix per concentration level
Internal Standards | Precision and Accuracy/Bias | As specified by method and based on PQOs
Matrix Spike (inorganics only)2 | Bias | Minimum 1 per inorganic SDG per analytical group per matrix per concentration level
PT Sample - Single Blind and Double Blind2 | Bias | Minimum 1 per SDG per analytical group per matrix per concentration level
Surrogate Spikes | Bias | As specified by method and based on PQOs
Laboratory Control Sample (LCS) | Bias | As specified by method and based on PQOs
Laboratory Fortified Blank (LFB) | Bias and Sensitivity | Minimum 1 per aqueous low-concentration organic SDG/analytical group; as specified by method and based on PQOs for other analytical groups, matrices, and concentration levels
Instrument Performance Check Samples | Sensitivity | As specified by method and based on PQOs
Initial Calibration | Accuracy | After initial instrument setup, as specified by method, and when calibration verification fails
Continuing Calibration or Calibration Verification Checks | Accuracy | Minimum 1 per analytical shift and more frequently as specified by method and based on PQOs

1See Table 6 for additional DQI information.
2Minimum QC activity from Part 2B of the UFP-QAPP.

Table 6 provides a more extensive list of QC samples and summarizes the information derived from
different sampling, transportation, and laboratory QC samples. QC samples that provide the best
overall measure of data quality and identify critical sources of error are recommended in the table.
Other QC samples may be appropriate on some projects but should not be considered minimum
samples for every project.

3.4.1   Sampling Quality Control Samples

To monitor the quality of various aspects of the sampling event, the QAPP should identify the QC
samples and their respective acceptance limits for the project. The QAPP should also document the
required analysis frequency and corrective actions (see Figure 31, QAPP Worksheet #28). This
information should correspond with Figure 14 in Section 2.6.2, which identifies the QC samples
associated with the selected measurement performance criteria.

Table 4  provides a list of recommended sampling  QC  samples. However,  actual types  and
frequencies are determined during planning based on project-specific needs.
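
The Table 4 frequencies lend themselves to a simple planning calculation. The sketch below is
illustrative only; the one-per-cooler, one-per-SDG, and 5% rules are taken from the recommended
frequencies above, and the rounding-up behavior is an assumption that the project team would confirm
against its own program requirements.

    import math

    def minimum_field_qc_counts(n_field_samples, n_coolers, n_sdgs):
        """Illustrative counts of field QC samples implied by the Table 4 recommendations."""
        return {
            # Minimum 1 field blank per shipment cooler (per analytical group/concentration level)
            "field_blanks": n_coolers,
            # Minimum 5% equipment blanks and 5% co-located field duplicates, rounded up
            "equipment_blanks": math.ceil(0.05 * n_field_samples),
            "field_duplicates": math.ceil(0.05 * n_field_samples),
            # Minimum 1 proficiency testing sample per SDG
            "proficiency_testing_samples": n_sdgs,
        }

    # Example: 90 field samples shipped in 6 coolers as 5 SDGs
    print(minimum_field_qc_counts(90, 6, 5))
    # {'field_blanks': 6, 'equipment_blanks': 5, 'field_duplicates': 5, 'proficiency_testing_samples': 5}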

3.4.2   Analytical Quality Control Samples

The QAPP should identify the QC samples, and their respective acceptance limits, that will be used
during the project to monitor the quality of various preparatory and analytical steps, and should
identify the project personnel expected to perform the QC activities.

                  Table 6. Information Derived from Quality Control Samples

Sources of measurement error considered in Table 6: sampling equipment; sample container;
preservation technique; sample matrix; shipment process; sample storage at the laboratory; sample
preparation reagents and equipment; and analytical method reagents and equipment.

Accuracy/Bias (Contamination)
  - Field Blank: To evaluate contamination introduced during sampling, storage, and transport.
  - Equipment Blank (rinsate blank): To evaluate carryover contamination resulting from successive
    use of sampling equipment.
  - Bottle Blank (per lot number): To evaluate contamination introduced from the sample container.
  - VOA Trip Blank: To evaluate contamination introduced during shipment.
  - Storage Blank: To evaluate cross-contamination introduced during sample storage.
  - Method Blank: To evaluate contamination introduced during sample preparation or analysis by the
    laboratory, including reagents, equipment, sample handling, and ambient laboratory conditions.
  - Reagent Blank (per lot number): To evaluate contamination introduced by specific method reagents.
  - Instrument (System) Blank: To evaluate contamination originating from the analytical reagents
    and instrumentation.

Accuracy/Bias (Preservation)
  - Cooler Temperature Blank: To evaluate whether or not samples were adequately cooled during
    shipment.

Accuracy/Bias
  - Matrix Spike (inorganics only): To determine laboratory preparatory and analytical bias for
    specific compounds in specific sample matrices.
  - Surrogate Spike (organics only): To evaluate laboratory preparatory and analytical bias for
    specific sample matrices.
  - Laboratory Control Sample (LCS): To evaluate the laboratory's ability to accurately identify and
    quantitate target compounds in a reference matrix at a known concentration, usually midrange of
    the calibration curve.
  - Proficiency Testing Samples (ampulated single blind; full volume single blind; double blind): To
    evaluate sample handling procedures from field to laboratory and the laboratory's ability to
    accurately identify and quantitate target compounds in a reference matrix; frequently used for
    data review and for laboratory self-assessments and external assessments (e.g., preawards and
    laboratory technical system audits (TSAs)).
  - Laboratory Fortified Blank (LFB): A type of LCS used to evaluate laboratory (preparatory and
    analytical) sensitivity and bias for specific compounds in a reference matrix at QL
    concentrations.
  - Initial Calibration: To ensure that the instrument is capable of producing acceptable
    qualitative and quantitative data.
  - Continuing Calibration/Continuing Calibration Verification: To ensure the accuracy and stability
    of the instrument response.
  - Instrument Performance Check Sample: To verify that an instrument can accurately identify and
    quantitate target analytes at specific concentration levels.

Sensitivity
  - Laboratory Fortified Blank (LFB): A type of LCS used to evaluate laboratory (preparatory and
    analytical) sensitivity and bias for specific compounds in a reference matrix at QL
    concentrations.
  - MDL Studies: A statistical determination that defines the minimum concentration of a substance
    that can be measured and reported with 99% confidence that the analyte concentration is greater
    than zero. QLs/practical QLs (PQLs) are generally 3-10 times the method detection limit (MDL).
  - Low Point of Initial Calibration Curve: To ensure that the instrument is capable of producing
    acceptable qualitative and quantitative data at the lowest concentration at which sample results
    will be reported (the quantitation limit).

Precision
  - Field Duplicates: To measure overall precision by evaluating the cumulative effects of both
    field and laboratory precision.
  - Laboratory Duplicates: To evaluate laboratory preparatory and analytical precision.
  - Matrix Spike Duplicates: To determine laboratory preparatory and analytical bias and precision
    for specific compounds in specific sample matrices.
  - Analytical Replicates (duplicate injections): To evaluate analytical precision for determinative
    instrumentation.
  - Internal Standards: To evaluate instrument precision and stability.

Interlaboratory Comparability
  - Split Samples: To evaluate sample handling procedures from field to laboratory and to evaluate
    interlaboratory comparability and precision.

Reproducibility
  - Biological QC Check: To evaluate biological sorting reproducibility between laboratories or
    analysts.

In addition, the QAPP must include the following:

• An explicit description of the QC samples to be collected/analyzed
• The required frequency at which each QC sample must be collected or analyzed
• A description, usually in mathematical terms, of what constitutes acceptable performance for the
  QC sample
• Corrective actions to be taken if the QC sample fails these criteria
• A description of how the QC data and results are to be documented and reported to the data user
  Note: Many analytical methods provide QC acceptance limits for most of the QC samples required by those
  methods. Certain methods require that laboratories generate their own specific QC acceptance limits for the QC
  samples required. These method- and laboratory-specific limits, however, may not be "tight" enough to support
  the project quality objectives. In other words, QC sample results may meet method/SOP QC acceptance limits
  but fail to meet the measurement performance criteria of the project as defined and documented in Section 2.6.2.
  Therefore, it is important to select analytical methods having QC acceptance limits that support the collection
  and analysis of usable project data. Subsequently, choosing a laboratory that is capable of meeting the project-
  required QC acceptance limits is critical. Again, method- and laboratory-specific QC acceptance limits, project
  measurement performance criteria, and project review criteria must be complementary for project objectives to
  be achieved.

  For some projects, the selected analytical method may not have sufficient QC samples built into the method.
  In those cases, the project team will need to  specify what additional QC samples must be analyzed by the
  laboratory.  The laboratory should document additional project-required QC activities in its analytical SOPs,
  along with the required frequency, acceptance criteria, and corrective actions for those QC samples.
Table 5 lists types of analytical QC samples but does not include all possible QC samples that are
available to the user.  Also, analytical methods may define the purpose of specific QC samples
differently (e.g., dioxin methodologies); therefore, it is necessary to adhere to the QC definitions of
the specific methods used.

The QAPP should contain the information shown in Figure 31 (QAPP Worksheet #28), including
both sampling and analytical QC samples. This information should correspond with Figure 14 in
Section 2.6.2, which identifies the QC samples associated with selected measurement performance
criteria.

If screening analyses are performed, a decision tree or logic diagram  should be provided to describe
how samples will be selected for subsequent confirmation with definitive data analysis (see Section
2.6.2.5.2).  If  method/SOP QC  acceptance limits exceed  the  project-specific measurement
performance criteria, the data obtained may be unusable for making project decisions.
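
For illustration only (not a requirement of this Manual), the comparison between method/SOP QC
acceptance limits and project measurement performance criteria described above can be expressed as a
simple check. The recovery limits shown are hypothetical placeholders, not values from any method or
program.

    def limits_support_project(mpc_lower, mpc_upper, method_lower, method_upper):
        """Return True if the method/SOP QC acceptance limits are at least as tight as the
        project's measurement performance criteria (MPC)."""
        return method_lower >= mpc_lower and method_upper <= mpc_upper

    # Example: project MPC for LCS recovery is 80-120%; a method limit of 70-130% is too wide.
    print(limits_support_project(80, 120, 70, 130))   # False -> limits do not support project use
    print(limits_support_project(80, 120, 85, 115))   # True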

Matrix | Analytical Group | Concentration Level | Sampling SOP | Analytical Method/SOP Reference |
Sampler's Name | Field Sampling Organization | Analytical Organization | No. of Sample Locations |
QC Sample | Frequency/Number | Method/SOP QC Acceptance Limits | Corrective Action |
Person(s) Responsible for Corrective Action | Data Quality Indicator (DQI) |
Measurement Performance Criteria

                                   Figure 31. QC Samples
                                  (QAPP Worksheet #28)
3.5    Data Management Tasks

All project data and information must be documented in a format that is usable by project personnel.
Therefore, the QAPP should describe how project data and information will be documented, tracked,
and managed, from generation in the field to final use and storage, in a manner that ensures data
integrity, defensibility, and retrieval. If electronic data are required, the QAPP should specify
the requirements.

3.5.1  Project Documentation and Records

All project documents and records that will be generated for every aspect of the project should be
identified in the QAPP. These include but are not limited to the following:

1.    Sample Collection and Field Measurement Records

   •  Field data collection sheets
   •  Chain-of-custody records
   •  Airbills
   •  Communication logs
   •  Corrective action reports
   •  Documentation of corrective action results
   •  Documentation of deviation from methods
  •  Documentation of internal QA review
  •• Electronic data deliverables
  •• Identification of QC samples
  •• Meteorological data from field (e.g., wind, temperature)
  •• Sampling instrument decontamination records
  •• Sampling instrument calibration logs
  •• Sampling location and sampling plan
  •• Sampling notes and drilling logs
  •• Sampling report

2.    Analytical Records

  •• Chain-of-custody records
  •• Sample receipt forms and sample tracking forms
  •• Preparation and analysis forms and/or logbooks
  •• Tabulated data summary forms and raw data for field samples, standards, QC checks, and QC
     samples
  •• Case narrative
  •• Sample chronology (time of receipt, extraction, and analysis)
  •• Identification of QC samples
  •• Communication logs
  •• Corrective action reports
  •• Definitions of laboratory qualifiers
  •• Documentation of corrective action results
  •• Documentation of laboratory method deviations
  •• Electronic data deliverables
  •• Instrument calibration reports
  •• Laboratory name
  •• Laboratory sample identification numbers
  •• Reporting forms, completed with actual results
  •• Signatures for laboratory sign-off (e.g., laboratory QA manager)
  •• Standards traceability records
   ••  Other project-specific documents in the laboratory's possession, such as telephone logs, MDL
      studies, initial precision and accuracy tests, laboratory preaward documentation (including
      preaward PT sample data and relevant copies of the proposal package), and corrective action
      reports

3.    Project Data Assessment Records

  •• Field sampling audit checklists
  •• Analytical audit checklists
  •• PT sample results

   ••  Data review reports
   ••  Telephone logs
   ••  Corrective action reports
   ••  Laboratory assessment
   ••  Laboratory QA plan
   ••  MDL study information
   ••  NELAP accreditation

Figure 32 (QAPP Worksheet #29) shows what information to include in the Project Documents and
Records table.

Sample Collection Documents and Records | On-site Analysis Documents and Records |
Off-site Analysis Documents and Records | Data Assessment Documents and Records | Other

                       Figure 32. Project Documents and Records
                                (QAPP Worksheet #29)

3.5.2 Data Package Deliverables

The requirements for data package deliverables are project-specific and will vary by analytical
group. Table 7 presents examples of laboratory data deliverable elements selected on the basis of
analyte group.

3.5.2.1    Sample Collection and Field Measurements Data Package Deliverables

The QAPP should itemize  all required elements of data package deliverables  for all sample
collection activities (see Section 3.5.1 item 1 for examples). If field measurements are taken (e.g.,
specific conductance, temperature, dissolved oxygen, pH, turbidity, oxidation/reduction potential,
and residual chlorine), then all field and QC sample results, calibrations, and calibration verifications
should be recorded in a field logbook to ensure proper verification of sample results.

                   Table 7. Example Analytical Data Deliverable Elements

The elements required vary by analytical group: VOA, SVOC, PEST/PCB, METALS, CN, and OTHER
(abbreviations are defined at the end of the table).

• INVENTORY SHEET (Org. and Inorg. DC-2 Form)
• NARRATIVE (Org. Narrative, Inorg. Cover Page)
• EPA SHIPPING/RECEIVING DOCUMENTS AND INTERNAL LABORATORY CHAIN-OF-CUSTODY RECORDS:
  - Airbills
  - Chain-of-Custody Records/Forms (Traffic Report)
  - Sample Tags
  - Sample Log-In Sheet (Org. and Inorg. DC-1 Form)
  - Miscellaneous Shipping/Receiving Records
  - Internal Lab. Sample Transfer Records and Tracking Sheets
• SAMPLE DATA:
  - Tabulated Summary Form for Field Sample and PT Sample Results (Org. and Inorg. Form I)
  - Tentatively Identified Compounds Tabulated Summary Form (Org. Form I TIC)
  - Reconstructed total ion chromatogram (RIC) for each sample
  - Raw spectra of target compound and background-subtracted spectrum of target compound for each sample
  - Mass spectra of all reported TICs/three best library matches for each sample
  - Chromatograms from both columns for each sample
  - GC integration report or data system printouts and calibration plots for each sample
  - PEST/PCB Identification Tabulated Summary Form (Org. Form X)
  - For PEST/PCBs confirmed by GC/MS, copies of raw spectra and background-subtracted spectrum of target compounds
  - Gel permeation chromatography sample chromatograms
  - UFP-QAPP Manual worksheets
  - Sample preparation/extraction/digestion log (Inorg. Form XIII) and logbook pages
  - Sample analysis run log (Inorg. Form XIV) and logbook pages
  - ICP raw data
  - Furnace atomic absorption raw data
  - Mercury raw data
  - Cyanide raw data
  - Other analytical raw data
• STANDARDS DATA:
  - Method Detection Limit Study Tabulated Summary Form
  - Initial Calibration Tabulated Summary Form (Org. Form VI, Inorg. Form IIA)
  - Continuing Calibration Tabulated Summary Form (Org. Form VII, Inorg. Form IIA)
  - RICs and quantitation reports for all GC/MS standards
  - Pesticide Analyte Resolution Tabulated Summary Form (Org. Form VI, Pest-4)
  - Pesticides Calibration Verification Tabulated Summary Form (Org. Form VII, Pest-1 and Pest-2)
  - Pesticide Analytical Sequence Tabulated Summary Form (Org. Form VIII-Pest)
  - GC chromatograms and data system printouts for all GC standards
  - For pesticides/aroclors confirmed by GC/MS, copies of spectra for standards used
  - GPC Calibration Tabulated Summary Form (Org. Form IX, Pest-2)
  - Florisil Cartridge Check Tabulated Summary Form (Org. Form IX, Pest-1)
  - Instrument Detection Limits Tabulated Summary Form (Inorg. Form X)
  - ICP Interelement Correction Factors Tabulated Summary Form (Inorg. Form XIA and XIB)
  - ICP Linear Ranges Tabulated Summary Form (Inorg. Form XII)
  - CRDL Standards for AA and ICP Tabulated Summary Form (Inorg. Form IIB)
  - Standards preparation logbook pages
• QC DATA:
  - Tuning and Mass Calibration Tabulated Summary Form (Org. Form V)
  - Surrogate Percent Recovery Tabulated Summary Form (Org. Form II)
  - MS/MSD Recovery Tabulated Summary Form (Org. Form III)
  - Method Blank Tabulated Summary Form (Org. Form IV and Inorg. Form III)
  - Internal Standard Area and RT Tabulated Summary Form (Org. Form VIII)
  - QC Raw Data - RICs, chromatograms, quantitation reports, integration reports, mass spectra, etc.
  - ICP Interference Check Sample Tabulated Summary Form (Inorg. Form IV)
  - Spike Sample Recovery Tabulated Summary Form (Inorg. Form VA)
  - Post Digest Spike Sample Recovery Tabulated Summary Form (Inorg. Form VB)
  - Duplicates Tabulated Summary Form (Inorg. Form VI)
  - Internal Laboratory Control Sample Tabulated Summary Form (Inorg. Form VII)
  - Standard Addition Results Tabulated Summary Form (Inorg. Form VIII)
  - ICP Serial Dilutions Tabulated Summary Form (Inorg. Form IX)
  - QC raw data - ICP, furnace, mercury, computer printouts, etc.
  - QC sample preparation logbook pages
• MISCELLANEOUS DATA:
  - Original preparation and analysis forms or copies of preparation and analysis logbook pages
  - Screening records
  - All instrument output, including strip charts, from screening activities
  - Preparation logs raw data
  - Percent solids determination log
  - Other records (e.g., telephone communication log)

VOA = volatile organic analytes
SVOC = semivolatile organic compounds
PEST = pesticide organic compounds
PCB = polychlorinated biphenyls
CN = cyanide
Other = other parameters
( ) = Form Number, refer to CLP SOW forms if CLP is used

3.5.2.2    On-site Analysis Data Package Deliverables
The QAPP should list the required data package deliverables for all on-site analytical data generated
in the  field, with required data package turnaround times specified for each analytical group
measured on-site.

•  On-site Analytical Screening Data. The requirements for on-site screening data packages are
   project-specific. In addition, the usability of on-site screening data depends on the PQOs and the
   comparability of those data to the definitive confirmatory data generated by an on-site mobile
   laboratory or off-site laboratory.
•  On-site Analytical Definitive Data. If on-site analytical data are generated for definitive
   purposes, then a complete data package should be generated to ensure that data can be properly
   reviewed.

If complete on-site analysis data packages (i.e., original raw data) are not required deliverables, the
QAPP should justify  this decision and specify which project data will be kept by the on-site
analytical unit, where the data will be stored (the organization's name, address, and exact location
in building), and how long it will be stored (the length of time required records must be stored is
program-dependent).

Even if complete data packages are not required deliverables in  the QAPP, all hard-copy and
electronic data and information relevant to the project must be archived in one location by the on-
site analytical unit to ensure the data's availability for potential future retrieval and use.

In order to facilitate possible future review, it is strongly recommended that raw data (including
electronic media) of all field samples, QC checks and samples, standards,  and blanks from all data
collection events be archived, if applicable, and be available on request for a minimum of 5 years
from the date of generation.

3.5.2.3    Off-site Laboratory Data Package Deliverables

Required data package deliverables for all data generated by off-site laboratories retained to provide
analytical services should be itemized in the QAPP, with required data package turnaround times
specified for each analytical group.

For all data collection events, a laboratory data package should be provided for each set of samples
designated as a sample delivery group (SDG).

Laboratory data package deliverables may include the documents shown in Table 7.

Note: A good example of the data package requirements for 18 different analytical methods is found
in the EPA Region 9 draft report, Laboratory Documentation Requirements for Data Validation (July
1997; 9QA-07-97; available at http://www.epa.gov/region9/qa/pdfs/ldrdv.pdf).

Graded Approach

Depending on the data needs of the project, such as radiological sampling and analysis, information
beyond that shown in Table 7 may be required. For other projects, fewer items may be required.

It is strongly recommended that raw data (including electronic media) of all field samples, QC
samples, standards, and blanks be archived, if applicable, and be available upon request for 5 years
from the date of generation.

Figure 33 (QAPP Worksheet #30) shows what information to include in the Analytical Services table.
The organizations or laboratories that will provide the analytical services (for all on-site
screening, on-site definitive, and off-site laboratory analytical work, including all prime
laboratories, subcontractor laboratories, and backup laboratories) should be identified, grouped by
matrix, analytical group, concentration level, and sample locations or ID numbers.

Matrix | Analytical Group | Concentration Level | Sample Location/ID Number | Analytical SOP |
Data Package Turnaround Time | Laboratory/Organization (Name and Address, Contact Person and
Telephone Number) | Backup Laboratory/Organization (Name and Address, Contact Person and
Telephone Number)

                               Figure 33. Analytical Services
                                  (QAPP Worksheet #30)

3.5.3 Data Reporting Formats

The QAPP should discuss procedures and/or SOPs for recording data, including guidelines for
recording (e.g., manually, legibly in ink, and initialed and dated by the responsible person) and
correcting data (e.g., single line drawn through errors, initialed and dated by the responsible person).
The QAPP should include examples of hard-copy data reporting forms and all verification checklists
and forms (as an attachment to the QAPP or by referencing the laboratory QA plan or manual). If
applicable, the QAPP should discuss specifications for format, content, and computer configuration
of electronic data deliverables, and examples of all electronic data deliverable forms should be
attached.

3.5.4 Data Handling and Management
The QAPP should describe all computerized and manual procedures that trace the paths of all data
from generation to final use and storage, as well as the associated quality checks for error detection
that are performed to ensure data integrity. Applicable SOPs may be attached to or referenced in the
QAPP. The following data management steps should be addressed:

1.    Data Recording

  ••  Provide examples of data entry forms.
  ••  Describe internal checks to detect errors such as transcription and calculation errors, the
     resultant documentation generated, and responsible personnel.
  ••  Provide examples of all verification checklists and forms.

2.    Data Transformations and Data Reduction

  ••  Provide formulas used in data conversions (e.g., calculation of dry weight field sample
      concentrations; see the example following this list).
  ••  Describe when and how data conversion procedures are performed, how they are checked, the
     resultant documentation generated, and responsible personnel.
  ••  Describe all data manipulations involved in reducing raw data to reportable data, as well as
     responsible personnel.
  ••  Provide an example of how raw data are reduced for all manual and automated calculations
     (e.g., calculation of sample concentrations from peak areas, manual integration procedures).
  ••  Provide references to specific software documentation for automated data processing.
  ••  Describe internal checks to  detect errors, the  resultant documentation generated,  and
     responsible personnel. Provide examples of all verification checklists and forms.
  ••  Indicate the number of significant figures.
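
As an illustration of the kind of data conversion formula the QAPP should document, the sketch below
performs a dry-weight correction. It is a generic example rather than a formula prescribed by this
Manual, and the percent-solids handling would follow the project's own SOPs.

    def dry_weight_concentration(wet_weight_conc_mg_per_kg, percent_solids):
        """Convert an as-received (wet weight) result to a dry weight basis.

        dry weight concentration = wet weight concentration / (percent solids / 100)
        """
        if percent_solids <= 0:
            raise ValueError("percent solids must be greater than zero")
        return wet_weight_conc_mg_per_kg / (percent_solids / 100.0)

    # Example: 5.0 mg/kg as received with 80% solids -> 6.25 mg/kg dry weight
    print(dry_weight_concentration(5.0, 80.0))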

3.    Data Transfer and Transmittal

  ••  Identify electronic data transfer software.
  ••  Provide examples of electronic data transfer forms.
  ••  Describe manual data  transcription and electronic transmittal procedures, the resultant
     documentation generated, and responsible personnel.
  ••  Describe internal checks to  detect errors, the  resultant documentation generated,  and
     responsible personnel. Provide examples of all verification checklists and forms.

4.    Data Analysis

  •• Identify and describe the data equipment and computer hardware and software that will be
     used to  process, compile, and analyze project data (e.g.,  the  Laboratory Information
     Management Systems, or LIMS) and secondary data (see Section 2.7).
  •• Describe in detail the computer models and/or algorithms that will be used for data analysis
     and justify their use for this project.

5.    Data Review

  •• Describe in detail the computer programs that will be used to review data.
  •• Describe in detail statistical computer programs that will be used to assess data.
  •• Indicate the anticipated organization that will be performing data review (see Section 5.0 for
     details on the data review process).

3.5.5 Data Tracking and Control

The QAPP should describe the procedures for data tracking, storage, archiving, retrieval, and
security, including both hard-copy and electronic data and information, and should identify the
personnel responsible.

1.    Data Tracking

  •• Describe  procedures for tracking data  as they are collected, transformed or reduced,
     transmitted, and  analyzed; the resultant documentation generated; and  the responsible
     personnel.

2.    Data Storage, Archiving, and  Retrieval

  •• Describe data storage, archiving, and retrieval procedures for all  project data, documents,
     records, and reports. Differentiate between hard-copy and electronic data and information.
  •• Identify specific project data, documents, records, reports, etc. that will be stored and/or
     archived. Differentiate between hard-copy and electronic data and information, and between
     documentation stored at a subcontracted laboratory and documentation archived by the lead
     organization. If data package deliverables do not include all project data documentation,
     describe what data (for on-site screening, on-site definitive, and off-site laboratory) will be
     kept by which laboratory or other organization and the exact physical location for each (i.e.,
     complete laboratory  or organization name, address, and specific location in the building).
  •• Identify the organizations and personnel responsible for storing, archiving,  and retrieving
     specific project documents.  Identify responsible document  control personnel, including
     organizational affiliation, telephone, e-mail address, and fax number (see Sections 2.4.1 and
     2.4.3).

  ••  Describe where the documents will be stored during the project and where the documents will
      be archived. Provide exact locations (organization name, complete address, and specific
      location in building) and timeframes in which documents will be moved from one location to
      another.
  ••  Indicate when documents will be archived at a final location.

3.    Data Security

  ••  Describe procedures for data security.
  ••  Describe procedures for computer security.
4.0  ASSESSMENT AND OVERSIGHT ELEMENTS

This QAPP element group ensures that planned project activities are implemented as described in
the QAPP and that reports are provided to apprise management of the project status and any QA
issues that arise during implementation. Assessment activities help to ensure that the resultant data
quality is adequate  for its intended use, and that appropriate responses are in place to address
nonconformances and deviations from the QAPP.

Frequently, deviations from the QAPP are identified by project personnel without the benefit of
formal, scheduled assessments. This section also addresses those situations and describes the process
by which the need for corrective action is documented, reported, and implemented and its
effectiveness assessed.

4.1  Assessments and Response Actions

Appropriately scheduled assessments allow management to implement corrective action measures
in a timely manner, thereby minimizing the impact of nonconformance on achieving project quality
objectives. Periodic internal and/or external assessments should be conducted throughout the
project to ensure that usable data are being generated. In addition, oversight assessments should
be performed by the approval authority to identify and correct nonconformances.

   Graded Approach: The PQOs dictate the type, frequency, and extent of the assessments that
   should be performed.

The number, frequency, and types of planned assessment activities that will be performed for the
project should be identified in the QAPP. Descriptions should include activities for identifying and
correcting any  problems  encountered  during the project. The project team should  choose
assessments that identify activities with the most influence on data quality and provide information
about potential  problems and  mistakes.  Sampling error is generally thought to contribute the
majority of the measurement error associated with project data, where:

                    Measurement Error = Sampling Error + Analytical Error

Therefore, all data generation and collection operations should include at least one field sampling
technical systems audit (TSA)  at the start of field sampling activities so that effective corrective
action measures  can  be  implemented  to mitigate the  extent  and  impact  of identified
nonconformances. Both investigative and routine monitoring projects should also include field
analytical, on-site laboratory, and off-site laboratory TSAs, as appropriate. An RI/FS with known
human health or ecological risks should include comprehensive assessments of field sampling, on-
site analytical  and  off-site laboratory  measurement procedures,  and proposed remediation
technologies, as well as an evaluation of the risk assessment procedures that will be employed.
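
As a hedged numerical illustration of the relationship above (the percentages are assumed example
values, not figures from this Manual): if the sampling and analytical components are treated as
independent random errors expressed as relative standard deviations, their variances add, and the
sampling component quickly dominates the total:

    % Hedged illustration (assumption: independent error components expressed as variances)
    \sigma_{\mathrm{meas}} = \sqrt{\sigma_{\mathrm{samp}}^{2} + \sigma_{\mathrm{anal}}^{2}}
                           = \sqrt{(20\%)^{2} + (10\%)^{2}} \approx 22.4\%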
4.1.1 Planned Assessments

Many different types of assessments are used for evaluating the effectiveness of project activities.
The following may be performed by project participants as internal or external assessments, or by
the approval authority as oversight audits.

Readiness Review - A systematic, documented review of readiness for the startup or continued use
of a facility, process, or activity. Readiness reviews are typically conducted before proceeding
beyond project milestones and prior to initiating a major phase of work.

Field Sampling TSA -  A thorough on-site audit during which sampling design, equipment,
instrumentation, supplies, personnel, training, sampling procedures,  chain-of-custody,  sample
handling and tracking, data reporting, data handling and management, data tracking and control, and
data review procedures are examined for conformance with the QAPP. At least one field sampling
TSA should be performed at the start of field sampling activities.

On-site Analytical TSA - A thorough audit of on-site analytical procedures during which the
facility (e.g., mobile  lab, trailer), equipment, instrumentation,  supplies,  personnel, training,
analytical  methods and procedures, laboratory procedures, sample handling and tracking, data
reporting,  data handling and management, data tracking and control, and data review procedures are
checked for conformance with the QAPP. An on-site analytical  TSA can be performed prior to, at
the start of, or at any time during field sampling activities. However, at least one on-site analytical
TSA should be performed prior to the start of the field sampling activities so that effective corrective
action  measures  can  be  implemented  to  mitigate  the  extent  and  impact  of identified
nonconformances.

Off-site Laboratory TSA - A thorough  audit of an off-site laboratory during which the facility,
equipment, instrumentation, supplies, personnel, training, analytical  methods and procedures,
laboratory procedures, sample handling and tracking, data reporting, data deliverables, data handling
and management,  data tracking  and control, and data  review procedures  are  checked for
conformance with the QAPP. An off-site laboratory TSA can be performed prior to, at the start of,
or at any time during field sampling activities. However, it is recommended that at least one off-site
laboratory TSA be performed prior to the start of the field sampling activities so that effective
corrective action measures can be  implemented to mitigate the extent and  impact of identified
nonconformances.

Split Sampling and Analysis Audit - A comparison study to assess interlaboratory precision and
accuracy. The sampler collects one field sample and then physically splits it into two representative
sample aliquots. The samples are  then sent to different laboratories for analysis. Split samples
quantitatively assess the measurement error introduced by the organization's sample shipment and
analysis system and must be accompanied by a PT sample to establish the acceptance criteria. Split
sample comparability criteria must be generated prior to sample collection and documented in the
QAPP (see Figure 15, Section 2.6.2.5.1).
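
For illustration only (the 30 percent acceptance criterion below is an assumed example; actual
split-sample comparability criteria must be documented in the QAPP, as discussed above),
split-sample agreement is often expressed as a relative percent difference between the two
laboratories' results:

    # Hedged illustration: comparing split-sample results from two laboratories using
    # relative percent difference (RPD). The 30% acceptance criterion is an assumed example;
    # project-specific criteria must be documented in the QAPP.

    def split_sample_rpd(result_lab_a, result_lab_b):
        """RPD between the two laboratories' results for the same split sample."""
        return abs(result_lab_a - result_lab_b) / ((result_lab_a + result_lab_b) / 2.0) * 100.0

    ACCEPTANCE_CRITERION_RPD = 30.0   # assumed example value (%)

    rpd = split_sample_rpd(42.0, 55.0)
    print(f"RPD = {rpd:.1f}% -> {'acceptable' if rpd <= ACCEPTANCE_CRITERION_RPD else 'exceeds criterion'}")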

PT Sample  Tracking and Analysis - Statistical analysis of PT sample results, which provides
information  on routine laboratory performance and the overall accuracy and bias of the analytical
method. The QAPP should address the selection of appropriate PT samples.  Factors to consider
include analyte selection; whether PT samples are single or double blind, native or synthetic matrix,
or spiked or  natively contaminated or both; multiple matrices and concentrations; total number of
PT samples;  and analytical methods.

Data Review TSA - A thorough review of the complete data review process, including a review of
the sampling and  analysis  verification,  sampling  and analysis validation,  and data usability
assessment steps, to ensure that the process conforms to the procedures specified in the QAPP (for
more information, see  Section 5.0). The data review  TSA  may also include an audit of the
performance of the data reviewer.

Management Systems Review (MSR) - A review  of an organization or organizational subset to
determine if the management structure, policies, and procedures are  sufficient to ensure that an
effective quality system is in place that supports the  generation of usable project data. This review
is performed against the organization's QMP.

If assessments (audits) are planned, the QAPP should contain the information shown in Figure 34
(QAPP Worksheet #31). If no assessments are planned, the QAPP must contain documentation and
justification of that fact. The worksheet contains the following columns:

  ••  Assessment Type
  ••  Frequency
  ••  Internal or External
  ••  Organization Performing Assessment
  ••  Person(s) Responsible for Performing Assessment (Title and Organizational Affiliation)
  ••  Person(s) Responsible for Responding to Assessment Findings (Title and Organizational
      Affiliation)
  ••  Person(s) Responsible for Identifying and Implementing Corrective Actions (CA) (Title and
      Organizational Affiliation)
  ••  Person(s) Responsible for Monitoring Effectiveness of CA (Title and Organizational
      Affiliation)

                         Figure 34. Planned Project Assessments
                                (QAPP Worksheet #31)

For each planned assessment, the following information should be recorded:

•• Assessed organization
•• Internal, external or EPA oversight
• • Location of assessment
•• Dates of assessment
•• Assessment team members
•• Type of assessment
•• Assessment scope
•• Documents to be reviewed
•• Notification date(s)
•• Proposed schedule
•• Assessment number
•• Contract number

Project-specific questionnaires and audit checklists used for performing assessments should be
attached to or referenced in the  QAPP. Completed checklists should be attached to the  QA
management reports, as described in Section 4.2.
   Note: Written oversight reports and split sampling results, and subsequent corrective action responses
   generated by the investigative organization, should be included in QA management and final project reports.
4.1.2 Assessment Findings and Corrective Action Responses

Assessment findings that require corrective action initiate a sequence of events that include
documentation of deficiencies, notification of findings, request for corrective action, implementation
of corrective action, and follow-up assessment of the corrective action's effectiveness. The QAPP
should describe how QAPP deviations and project deficiencies that are identified through the
planned project assessments will be handled. For each type of assessment, the QAPP should do the
following:

•• Describe how deficiencies will be documented and communicated (e.g., verbal debriefing after
   audit and/or written audit report).
• • Describe what type of corrective  action responses will be required, and how the responses will
   be documented.
•• Identify individuals  who will be notified of audit findings  (the name, title, organizational
   affiliation, position, e-mail address, and telephone and fax numbers of all individuals who should
   be notified of deficiencies and nonconformances).
•• Identify who should  receive the corrective action responses.
•• Include timeframes allowed for  the notification of audit findings, the request for corrective
   action, the transmittal of corrective action responses, and completion of corrective action.

Figure 35 (QAPP Worksheet #32) provides an example of the Assessment Findings and Corrective
Action Responses table.
The content and format of corrective action responses should be tailored to suit the PQOs. In certain
situations, a letter documenting specific procedural changes may be a sufficient corrective action
response. Appropriate procedural  changes  can include, but  are not limited to, additional staff
training, revision of SOPs, and rescheduling of sampling and analytical activities (e.g., to ensure that
holding times are met). Corrective actions that must be implemented immediately to ensure that
PQOs are met may require that work cease until the corrective actions are implemented and their
effectiveness verified.



Assessment Type: Off-Site Laboratory TSA
  Nature of Deficiencies Documentation: Written audit report
  Individual(s) Notified of Findings (Name, Title, Org.): Jay Strong, QA Officer, Ringer
      Laboratories
  Contact Information: JS@ringer.com; Phone: 755-555-1212; Fax: 755-555-1280
  Timeframe of Notification: 5 days after audit
  Nature of Corrective Action Response Documentation: Corrective Action Plan
  Individual(s) Receiving Corrective Action Response (Name, Title, Org.): Jake Feathers, QA
      Officer, Butts Engineering
  Contact Information: JakeF@buttseng.com; 808-555-8000
  Timeframe for Response: Two weeks after receiving notification

Assessment Type: Split Sampling and Analysis Audit
  Nature of Deficiencies Documentation: Memo
  Individual(s) Notified of Findings (Name, Title, Org.): Jay Strong, Ringer Labs; Jane Black,
      AAA Laboratories
  Contact Information: JS@ringer.com, 755-555-1212; Jane.Black@aaa.com, 755-594-0000
  Timeframe of Notification: 48 hours after receiving results
  Nature of Corrective Action Response Documentation: Letter
  Individual(s) Receiving Corrective Action Response (Name, Title, Org.): Jake Feathers, QA
      Officer, Butts Engineering
  Contact Information: Jake@buttseng.com; 808-555-8000
  Timeframe for Response: 5 days after receiving notification

         Figure 35. Example Assessment Findings and Corrective Action Responses
                                (QAPP Worksheet #32)

4.2  QA Management Reports

Periodic QA management reports ensure that managers and stakeholders are updated on project
status and results of all QA assessments. Efficient communication of project status and problems
allows project managers to implement timely and effective corrective actions so that data generated can
meet PQOs.

The QAPP should describe the content of each QA management report that will be generated for the
project,  including an  evaluation of measurement error  as determined  from the  assessments.
Assessment checklists, reports, requests for corrective action letters, and the corrective response
letters (refer to Section  4.1.2) should  be included  as attachments to or referenced in the QA
management reports. All QA management reports should be included as attachments to the final
project report.

The following issues  should be included in the final project report, either as part of the QA
management report or  in a QA/QC section of the final project report:

•• Summary of project QA/QC programs and trainings conducted during the project
•• Conformance of project activities to QAPP requirements and procedures
•• Status of project and schedule delays

•• Deviations from the approved QAPP and approved amendments to the QAPP
• • Results and trends of PT samples performed by all laboratories (per analytical group, matrix, and
   concentration level)
•• Description and findings of TSAs and other assessments
•• Results of data review activities in terms of amount of usable data generated
•• Required corrective actions and effectiveness of corrective action implementation
•• Data usability assessments in terms of precision, accuracy, representativeness, completeness,
   comparability, and sensitivity (refer to Section 5.2)
•• Limitations on the use of measurement data generated

Figure 36 (QAPP Worksheet #33) provides an example QA Management Reports table, identifying
the frequency and types of reports planned, projected delivery dates, personnel responsible for report
preparation,  and  report recipients.

4.3  Final Project Report

The issues listed above must be addressed in the QA management reports (as attachments to the final
project report) or the QA/QC section of the  final project report. The final project report must also
address additional data quality concerns, including but not limited to the following:

•• Narrative and timeline of project activities
•• Summary of PQO development
•• Reconciliation of project data with PQOs
•• Summary of major problems encountered and their resolution
•• Data summary, including tables, charts, and graphs with appropriate sample identification or
   station location numbers, concentration units, percent solids (if applicable), and data quality
   flags
•• Conclusions and recommendations
Type of Report: Field Sampling Technical Systems Audit Report
  Frequency: 1/At startup of sampling
  Projected Delivery Date(s): 3/14/04
  Person(s) Responsible for Report Preparation (Title and Organizational Affiliation):
      Claire Carpenter, Project QA Officer, Chaucer Engineering
  Report Recipients (Title and Organizational Affiliation): Dorothy Parker, Project
      Manager/Geotechnical Engineer & James Keller, Field Sampling Coordinator, Chaucer
      Engineering; Howard Fast, Poe Recycling Project Manager, Poe Recycling

Type of Report: Off-site Laboratory Technical Systems Audit Report
  Frequency: 1/Prior to sample receipt
  Projected Delivery Date(s): 2/14/04
  Person(s) Responsible for Report Preparation (Title and Organizational Affiliation):
      Claire Carpenter, Project QA Officer, Chaucer Engineering
  Report Recipients (Title and Organizational Affiliation): John Grissom, Laboratory QA/QC
      Manager & Robert Galvani, Laboratory Manager, Austin Labs; Howard Fast, Poe Recycling
      Project Manager, Poe Recycling; Dorothy Parker, Project Manager/Geotechnical Engineer,
      Chaucer Engineering

Type of Report: Data Review Report
  Frequency: 1/After all data generated and reviewed
  Projected Delivery Date(s): 6/6/04
  Person(s) Responsible for Report Preparation (Title and Organizational Affiliation):
      Brendan Rivers, Data Validator, EDO Quality Services; Claire Carpenter, Project QA
      Officer, Chaucer Engineering
  Report Recipients (Title and Organizational Affiliation): Dorothy Parker, Project
      Manager/Geotechnical Engineer, Chaucer Engineering; Howard Fast, Poe Recycling Project
      Manager, Poe Recycling; Henry Thoreau, EPA Project Manager, EPA-NE

Type of Report: Final Project Report
  Frequency: 1/After QA Management Reports and Risk Assessment completed
  Projected Delivery Date(s): 7/5/04
  Person(s) Responsible for Report Preparation (Title and Organizational Affiliation):
      Dorothy Parker, Project Manager/Geotechnical Engineer, Chaucer Engineering
  Report Recipients (Title and Organizational Affiliation): Howard Fast, Poe Recycling Project
      Manager, Poe Recycling; Henry Thoreau, EPA Project Manager, EPA-NE

                        Figure 36. Example QA Management Reports
                                (QAPP Worksheet #33)
5.0  DATA REVIEW ELEMENTS

Data review is the process by which data are examined and evaluated to varying levels of detail and
specificity by a variety of personnel who have different responsibilities within the data management
process. It includes verification, validation, and usability assessment. This QAPP element group
encompasses the data review activities used to ensure that only scientifically sound data that are of
known and documented quality and meet project quality objectives (PQOs) are used in making
environmental decisions. The approach used for data review of a project must be appropriate to the
project requirements.

This section of the Manual  defines the steps of data review and describes their implementation.
Although data review takes  place after the data have been generated, determination of the type of
data review that is required to meet PQOs begins during the planning phase of the project. Key
questions regarding data review that must be answered during the project planning stage include,
but are not limited to, the following:

••  What  PQOs are  necessary to achieve  the   appropriate level  of precision,  accuracy,
    representativeness,  comparability, sensitivity, and  completeness? (See Section  2.6.1 for a
    discussion of PQOs.)
••  What data review inputs, activities, and outputs will be required for this project? (See Tables 8
    and 9 and Section 5.2 for examples.)
••  What entities will be responsible for each step of the data review process and what are  their
    relationships to those responsible for the data generation process?
• •  How will the implementation of the data review process and its results integrate with the overall
    project decision timeline?
• •  What is the extent of data review and the availability and appropriate use of streamlining tools?
    (See Section 5.3.)
  Note: Although the data review process outlined in the following sections is portrayed as a sequential process, it
  may be beneficial (and more cost-effective) for many projects to combine steps. For example, the entity conducting
  the verification could also conduct the first step of the validation process.
5.1  Overview

This UFP-QAPP Manual defines three distinct evaluative steps that are used to ensure that project
data quality needs are met. These data review steps are required for all data collected and used in
environmental projects (see the QA/QC Compendium, Part 2B of the UFP-QAPP).

  Note: All three data review steps apply to all aspects of data generation, including field sampling
  and analytical activities.

    Step I: Verification (review for completeness) - Confirmation by examination and provision
    of objective evidence that the specified requirements (sampling and analytical) have been
    completed.

    Step II: Validation - Confirmation by examination and provision of objective evidence that the
    particular requirements for a specific intended use are fulfilled. Validation is a sampling and
    analytical process that includes evaluating compliance with method, procedure, or contract
    requirements and extends to evaluating against criteria based on the quality objectives developed
    in the QAPP (e.g., the QAPP measurement performance criteria [MPC]). The purpose of
    validation is to assess the performance of the sampling and analysis processes to determine the
    quality of specified data. It is divided into two subparts:

    - Step IIa assesses and documents compliance with methods, procedures, and contracts.
    - Step IIb assesses and documents a comparison with MPC in the QAPP.

    Step III: Usability Assessment - Determination of the adequacy of data, based on the results
    of validation and verification, for the decisions being made. The usability step involves assessing
    whether the process execution and resulting data meet project quality objectives documented in
    the QAPP.
  Consistency with EPA Documents

  Although the requirements in this Manual are consistent with QA/R-5 and QA/G-5, the definitions and scope for
  data review outlined in this section are different from those in QA/R-5, QA/G-5, and EPA QA/G-8, Guidance on
  Environmental Data Verification and Data Validation (November 2002). The IDQTF purposely expanded the
  scope to initiate change in the current process and to better ensure that only data of known and documented quality
  are used to make environmental decisions.
The table below describes the objectives, scope, steps, and output of data review associated with
each process term. The table identifies where the scope of the terms used or the steps involved in
the process are expansions of current practice.
                         Table 8. Data Review Process Summary

Verification
  Objective:        Review to see if data required for the project are available.
  Scope:            - Sampling*
                    - Analysis
  Data Review Step: I. Completeness check
  Output:           Verification Report
                    - May be checklist form
                    - Package includes all documentation

Validation
  Objective:        - Assess and document the performance of the field sample collection process.
                    - Assess and document the performance of the analytical process.
  Scope:            - Sampling*
                    - Analysis
  Data Review Step: IIa. Check compliance with method, procedure, and contract requirements
                    IIb. Compare with measurement performance criteria from the QAPP*
  Output:           Validation Report
                    - Includes qualified data
                    - May be part of other report such as RI/FS

Usability Assessment*
  Objective:        Assess and document usability to meet project quality objectives.
  Scope:            - Sampling
                    - Analysis
  Data Review Step: III. Assess usability of data by considering project quality objectives and
                    the decision to be made*
  Output:           Usability Report
                    - May be part of other report such as RI/FS

The expansion of scope of terms and steps from current data review practice encompasses the
following:

•• The terms verification and validation apply to field sampling activities, as well as to the
   analytical component of data generation.
• • Validation assesses not only compliance with method, procedure, and contract requirements, but
   also compliance with QAPP-specific requirements.
• • Usability assessments are a minimum requirement for all environmental project phases and data
   uses. This is the final step of data review: assessing whether the data are suitable as a basis for
   the decision.

Figure 37 outlines the data review process described in this UFP-QAPP Manual. Each step of the
process is critical to the overall assessment of data quality and each step builds on the outcome of
the previous  step.  The level  of data review  (types and amount of data  reviewed) should be
appropriate to the PQOs. Streamlining data review (validation in particular) is an option to consider
to potentially eliminate some validation requirements, if allowed by the project's data quality needs
(see Section 5.3).
  [Flow diagram showing the data review steps in sequence: Verification (Step I), Validation
  (Step IIa), Validation (Step IIb), and Usability Assessment (Step III).]

  Notes:
  * Does not have to be a separate report; may be part of RI/FS or other document.
  Although the steps shown here are presented in a sequential manner, certain steps in the data
  review process may be performed simultaneously.

                              Figure 37. Data Review Process


In order to perform the data review steps described  above, reported analytical data must be
supported by complete data packages, as defined in the QAPP (see Table 7, Section 3.5.2). Data
packages include sample receipt and tracking information, chain-of-custody records, tabulated data
summary forms, and raw analytical data for all field samples, standards, QC samples, and all other
project-specific documents that are generated. If relevant raw data or sample information are not
available or adequate to document data quality, then data review cannot be performed,  and
resampling or reanalysis must be considered. Secondary data should also be evaluated during data
review, if available.

5.2    Data Review Steps

This section of the Manual describes what data review information must be included in the QAPP
and presents procedures for implementing each of the three data review steps: verification (step I),
validation (steps IIa and IIb), and usability assessment (step III). Example activities are provided
to clarify the types of procedures that may be performed.

Table 9 lists example inputs for data review and identifies the step of the data review process to
which each input applies. These are only examples and are not intended to be either a minimum or
comprehensive list of inputs.

5.2.1   Step I: Verification

Verification is a completeness check that is performed before the data review process continues in
order to determine whether the required information (the complete data package) is available for
further review. It involves a review of all data inputs to ensure that they are present. The question
answered by this  step is: Are the inputs present? (Yes  or no). Table 9 provides examples of the
inputs  for conducting the completeness check. Although this  step is not  designed  for use in
qualitative review (e.g., a compliance check that takes place during step IIa of the validation
process), it is essential for ensuring the availability of sufficient information for subsequent steps
of the data review process.
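
For illustration only, the sketch below shows a minimal automated version of this yes/no
completeness check. The required input names are assumed examples drawn from Table 9; a real
project would use the items defined in its own QAPP.

    # Hedged illustration: a Step I completeness check that answers, for each required input,
    # the question "is it present?" The required input names are assumed example items.
    REQUIRED_INPUTS = ["chain_of_custody", "case_narrative", "raw_data",
                       "qc_summary_report", "sampling_notes"]

    def verify_completeness(data_package):
        """Return yes/no for each required input: is it present in the data package?"""
        return {item: item in data_package and data_package[item] is not None
                for item in REQUIRED_INPUTS}

    package = {"chain_of_custody": "COC-014.pdf", "raw_data": "run_077.csv",
               "qc_summary_report": None}              # missing or empty items fail the check
    for item, present in verify_completeness(package).items():
        print(f"{item}: {'present' if present else 'MISSING'}")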
                   Table 9. Example Inputs to Data Review Process

  (The steps listed after each item indicate the data review steps to which that input applies:
  I = Step I, Verification; IIa = Step IIa, Compliance; IIb = Step IIb, Comparison. Step III,
  Usability, uses the outputs from the previous steps for all items.)

Planning Documents
  1.  Evidence of required approval of plan (QAPP): I
  2.  Identification of personnel (those involved in the project and those conducting
      verification steps): I
  3.  Laboratory name: I
  4.  Methods (sampling and analysis): I, IIa
  5.  Performance requirements (including QC criteria) for all inputs: I, IIa, IIb
  6.  Project quality objectives: I, IIb
  7.  Reporting forms: I, IIa
  8.  Sampling plans, location, maps, grids, and sample ID numbers: I, IIa
  9.  Site identification: I
  10. SOPs (sampling and analytical): I, IIa
  11. Staff training and certification: I
  12. List of project-specific analytes: I, IIa

Analytical Data Package
  13. Case narrative: I, IIa, IIb
  14. Internal laboratory chain of custody: I, IIa
  15. Sample condition upon receipt, and storage records: I, IIa
  16. Sample chronology (time of receipt, extraction, and analysis): I, IIa
  17. Identification of QC samples (sampling or lab, temporal, and spatial): I, IIa
  18. Associated (batch or periodic) PT sample results: I, IIa, IIb
  19. Communication logs: I, IIa
  20. Copies of laboratory notebook, records, prep sheets: I, IIa
  21. Corrective action reports: I, IIa
  22. Definitions of laboratory qualifiers: I, IIa, IIb
  23. Documentation of corrective action results: I, IIa, IIb
  24. Documentation of individual QC results (e.g., spike, duplicate, LCS): I, IIa, IIb
  25. Documentation of laboratory method deviations: I, IIa, IIb
  26. Electronic data deliverables: I, IIa
  27. Instrument calibration reports: I, IIa, IIb
  28. Laboratory name: I, IIa
  29. Laboratory sample identification numbers: I, IIa
  30. QC sample raw data: I, IIa, IIb
  31. QC summary report: I, IIa, IIb
  32. Raw data: I, IIa, IIb
  33. Reporting forms, completed with actual results: I, IIa, IIb
  34. Signatures for laboratory sign-off (e.g., laboratory QA manager): I, IIa
  35. Standards traceability records (to trace standard source from NIST, for example): I, IIa, IIb

Sampling Documents
  36. Chain of custody: I, IIa
  37. Communication logs: I, IIa
  38. Corrective action reports: I, IIa, IIb
  39. Documentation of corrective action results: I, IIa, IIb
  40. Documentation of deviation from methods: I, IIa, IIb
  41. Documentation of internal QA review: I, IIa, IIb
  42. Electronic data deliverables: I, IIa
  43. Identification of QC samples: I, IIa, IIb
  44. Meteorological data from field (e.g., wind, temperature): I, IIa, IIb
  45. Sampling instrument decontamination records: I, IIa
  46. Sampling instrument calibration logs: I, IIa
  47. Sampling location and plan: I, IIa, IIb
  48. Sampling notes and drilling logs: I, IIa, IIb
  49. Sampling report (from field team leader to project manager describing sampling
      activities): I, IIa, IIb

External Reports
  50. External audit report: I, IIa, IIb
  51. External PT sample results: I, IIa
  52. Laboratory assessment: I, IIa
  53. Laboratory QA plan: I, IIa
  54. MDL study information: I, IIa, IIb
  55. NELAP accreditation: I, IIa

The QAPP planning process must establish verification procedures, which should be documented
in the QAPP to ensure that data are evaluated properly, completely, and consistently for use in
meeting PQOs. The procedures should address the following:

•• The process that will be used to verify sample collection, handling, field analysis, and analytical
   laboratory project data.
•• The procedures and criteria that will be used to verify data information operations. These
   operations include, but are not limited to, the electronic and/or manual transfer, entry, use, and
   reporting of data for computer models, algorithms, and databases; correlation studies between
   variables; data plotting and so forth.

Figure 38 (QAPP Worksheet #34) provides an example Verification (Step I) Process table that can
be used to present the process that will be followed to verify project data. Verification inputs include
items such as those listed in Table 9. The description should detail how each item will be verified,
when the activity will occur, and what documentation is necessary. Internal or external is defined in
relation to the data generator. The resulting tables will describe the following:

•• How sample collection, handling, and analysis procedures will be verified.
•• How verification of field sampling, handling, and analysis activities will be documented
   (e.g., QC signatures in field logs, QC checklist, etc.).
•• Which sampling, handling, on-site analytical, and off-site laboratory data will be verified
   internally at the data generator level.
•• The end product of laboratory verification (e.g., laboratory-qualified data).
•• Which sampling, on-site analytical, and off-site laboratory data will be verified by entities
   external to the data generator.

Verification Input: Chain of custody
  Description: Chain-of-custody forms will be reviewed internally upon their completion and
      verified against the packed sample coolers they represent. When everything checks out, the
      shipper's signature on the chain-of-custody form will be initialed by the reviewer, a copy
      of the form will be retained in the site file, and the original and remaining copies will
      be taped inside the cooler for shipment. See SOPs for further details.
  Internal/External: I
  Responsible for Verification (Name, Organization): Cole Lector, Jewel Engineering

Verification Input: Analytical data package
  Description: All analytical data packages will be verified internally by the laboratory
      performing the work for completeness prior to submittal. The laboratory shall complete the
      appropriate form documenting the organization and complete contents of each data package.
  Internal/External: I
  Responsible for Verification (Name, Organization): Jasper Sanquin, Emerald Environmental Lab

Verification Input: QC summary report
  Description: A summary of all QC sample results will be verified for completeness by the prime
      contractor upon receipt of data packages from the laboratory.
  Internal/External: E
  Responsible for Verification (Name, Organization): Tammy Finsk, Whole World Consulting, Inc.

                     Figure 38. Example Verification (Step I) Process
                                 (QAPP Worksheet #34)

5.2.2   Step II: Validation

The QAPP planning process must establish validation procedures and criteria. Project-specific
validation procedures are developed to identify and qualify data that do not meet the measurement
performance criteria as  established  in Section 2.6.2. Validation procedures  and criteria are
documented in the QAPP to ensure that data are evaluated properly, completely, and consistently
for use in meeting PQOs. Validation guidance and documents may be attached to or referenced in
the QAPP and should address the following:

•• The process that will be used to validate sample collection, handling, field analysis, and
   analytical laboratory project data (example activities are listed in Section 5.2.2.1 and 5.2.2.2).
•• The specific validation  process that will be used for each analytical group, matrix, and
   concentration level.
• • The procedures and criteria used to validate data information operations, which may include, but
   are not limited to, the electronic  or manual transfer,  entry, use, and reporting of data for
   computer models, algorithms, and databases; correlation studies between variables; data plotting
   and so forth.

Figure 39 (QAPP Worksheet #35) shows what information to include in the Validation Process
(Steps IIa and IIb) table that can be used to present the process that will be followed to validate
project data.

Validation inputs include items such as those listed in Table 9. The description should detail how
each item will be validated, when the activity will occur, and what documentation is necessary. The
resulting tables will describe the following:

•• How sample collection, handling, and analysis procedures will be validated  against the
   measurement performance criteria specified in Section 2.6.2.
•• How validation of field sampling, handling, and analysis activities will be documented (e.g., QC
   signatures in field logs, QC checklist, etc.).
•• Which sampling, on-site analytical, and off-site laboratory data will be validated.
• • The evaluative procedures used in validation to assess overall measurement error associated with
   the project, including the data quality indicators (DQIs) described in Section 5.2.3.2.
•• The individual, identified by title and organizational affiliation,  who is ultimately responsible
   for data validation. This is the person (lead chemist, project chemist, etc.) who will sign the
   project validation reports.
The worksheet columns are: Step IIa/IIb; Validation Input; Description; and Responsible for
Validation (Name, Organization).

                     Figure 39. Validation (Steps IIa and IIb) Process
                                 (QAPP Worksheet #35)

In addition, the QAPP should identify the matrices, analytical groups, and concentration levels that
each entity performing validation will be responsible for, as well as the criteria that will be used to
validate those data. Figure 40 (QAPP Worksheet #36) provides an example of the Validation
Summary (Steps IIa and IIb) table. In a table, the validation criteria column may reference an outside
guidance document (e.g., CLP Functional Guidelines) or different section of the QAPP. The title and
affiliation of the person who will perform the validation should be included for each entry, since this
may be different from the person ultimately responsible for the entire validation.

Step       Matrix   Analytical   Concentration   Validation Criteria          Validator (title and
IIa/IIb             Group        Level                                        organizational affiliation)
IIa        Soil     VOA          Low             SW-846 Method 8260B, SOPs    Tom Lee, Chemist, Best
                                                                              Review Company
IIa        GW       Metal        Low/Medium      SW-846 Method 6010B, SOPs    Tom Lee, Chemist, Best
                                                                              Review Company
IIb        Soil     VOA          Low             See QAPP Section 2.7         Paula Simpson, Sr. Chemist,
                                                                              Whatayuk Consulting

                Figure 40. Example Validation Summary (Steps IIa and IIb)
                                   (QAPP Worksheet #36)
 Note: Sources of sampling and analytical error should be identified and corrected as soon as possible after sample
 collection activities have begun. Incorporating an ongoing usability assessment process throughout the project,
 rather than just as a final step, will facilitate the early detection and correction of problems, thereby ensuring that
 PQOs are met.
 Streamlining Validation

 Some requirements for validation may be eliminated depending on project-specific data needs. Section 5.3 addresses
 the criteria for streamlining and amounts and types of data to be streamlined.
5.2.2.1 Step IIa Validation Activities

The examples listed in Table 10 are of specific activities that may occur during an environmental
project under step IIa of the validation process (compliance with methods, procedures, and contracts)
for both sampling and analytical data. Although these activities are organized separately, they may
be performed at the same time and/or by the same people as step I and step IIb activities.
                       Table 10. Step IIa Validation Activities
                (Compliance with Methods, Procedures, and Contracts)

Data Deliverables and QAPP: Ensure that all required information on sampling and analysis from
    step I was provided (including planning documents).
Analytes: Ensure that required lists of analytes were reported as specified in governing
    documents (i.e., method, procedure, or contract).
Chain-of-Custody: Examine the traceability of the data from time of sample collection until
    reporting of data. Examine chain-of-custody records against contract, method, or procedural
    requirements.
Holding Times: Identify holding time criteria, and either confirm that they were met or document
    any deviations. Ensure that samples were analyzed within holding times specified in method,
    procedure, or contract requirements. If holding times were not met, confirm that deviations
    were documented, that appropriate notifications were made (consistent with procedural
    requirements), and that approval to proceed was received prior to analysis.
Sample Handling: Ensure that required sample handling, receipt, and storage procedures were
    followed, and that any deviations were documented.
Sampling Methods and Procedures: Establish that required sampling methods were used and that any
    deviations were noted. Ensure that the sampling procedures and field measurements met
    performance criteria and that any deviations were documented.
Field Transcription: Authenticate transcription accuracy of sampling data (i.e., from field
    notebook to reports).
Analytical Methods and Procedures: Establish that required analytical methods (off-site
    laboratory and on-site analytical) were used and that any deviations were noted. Ensure that
    the QC samples met performance criteria and that any deviations were documented.
Data Qualifiers: Determine that the laboratory data qualifiers were defined and applied as
    specified in methods, procedures, or contracts.
Laboratory Transcription: Authenticate accuracy of the transcription of analytical data (i.e.,
    laboratory notebook to reporting form, or instrument to LIMS).
Proficiency Testing: Confirm acceptance of PT sample results against performance requirements as
    specified in methods, procedures, or contracts.
Standards: Determine that standards are traceable and meet contract, method, or procedural
    requirements.
Communication: Establish that required communication procedures were followed by field or
    laboratory personnel.
Audits: Review field and laboratory audit reports and accreditation and certification records for
    the laboratory's performance on specific methods.
Step IIa Validation Report: Summarize deviations from methods, procedures, or contracts. Include
    qualified data and explanation of all data qualifiers.
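
One of the Step IIa activities above, holding-time compliance, lends itself to a simple automated
screen. The sketch below is illustrative only; the analytical groups and holding-time limits shown
are assumed example values, and the limits actually applied must come from the governing method,
procedure, or contract.

    from datetime import datetime

    # Hedged illustration of a Step IIa holding-time compliance screen. The limits below are
    # assumed example values, not requirements of this Manual.
    HOLDING_TIME_DAYS = {"VOA": 14, "Metals": 180}   # assumed example limits per analytical group

    def check_holding_time(group, collected, analyzed):
        """Return (met, elapsed_days) comparing elapsed time against the assumed limit."""
        elapsed = (analyzed - collected).days
        return elapsed <= HOLDING_TIME_DAYS[group], elapsed

    met, days = check_holding_time("VOA",
                                   datetime(2004, 3, 1, 10, 30),
                                   datetime(2004, 3, 12, 9, 0))
    print(f"Holding time {'met' if met else 'NOT met'} ({days} days elapsed)")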

5.2.2.2 Step IIb Validation Activities

The examples listed in Table 11 are of specific activities that may occur during an environmental
project under step IIb of the validation process (comparison with measurement performance criteria
in the QAPP) for both sampling and analytical data. These activities require that the validators have
a complete copy of the QAPP, and they often involve all or parts of the project team. Some of the
activities listed for step IIa have a QAPP-specific review element and are therefore also listed as
activities under step IIb.

                         Table 11. Step IIb Validation Activities
           (Comparison with Measurement Performance Criteria in the QAPP)

Data Deliverables and QAPP: Ensure that the data report from step IIa was provided.
Deviations: Determine the impacts of any deviations from sampling or analytical methods and SOPs.
    For example, confirm that the methods given in the QAPP were used and, if they were not,
    determine if data still meet MPCs. Consider the effectiveness and appropriateness of any
    corrective action.
Sampling Plan: Determine whether the sampling plan was executed as specified (i.e., the number,
    location, and type of field samples were collected and analyzed as specified in the QAPP).
Sampling Procedures: Evaluate whether sampling procedures were followed with respect to equipment
    and proper sampling support (e.g., techniques, equipment, decontamination, volume,
    temperature, preservatives, etc.).
Co-located Field Duplicates: Compare results of collocated field duplicates with criteria
    established in the QAPP.
Project Quantitation Limits: Determine that quantitation limits were achieved, as outlined in the
    QAPP, and that the laboratory successfully analyzed a standard at the QL.
Confirmatory Analyses: Evaluate agreement of laboratory results.
Performance Criteria: Evaluate QC data against project-specific performance criteria in the QAPP
    (i.e., evaluate quality parameters beyond those outlined in the methods).
Data Qualifiers: Determine that the data qualifiers applied in step IIa were those specified in
    the QAPP and that any deviations from specifications were justified.
Step IIb Validation Report: Summarize outcome of comparison of data to MPC in the QAPP. Include
    qualified data and explanation of all data qualifiers.
5.2.3   Step III: Usability Assessment

A usability assessment considers whether data meet project quality objectives as they relate to the
decision to be made, and evaluates whether data are suitable for making that decision. All types of
data (e.g., sampling, on-site analytical, off-site laboratory) are relevant to the usability assessment.
The usability assessment is the final step of data review and  can be performed only on data of
known and documented quality (i.e., verified and validated data).

To accomplish this step of data review, the project team should do the following:

• •  Summarize the usability assessment process and all usability assessment procedures, including
    interim steps and any statistics, equations, and computer algorithms that will be used to assess
    data (example activities are listed in Section 5.2.3.2).
••  Describe the documentation that will be generated during usability assessment.
••  Identify the personnel (by title and organizational affiliation) responsible for performing the
    usability assessment.
••  Describe how usability assessment results will  be presented so that they identify trends,
    relationships (correlations), and anomalies.
••  Describe the evaluative procedures used to assess overall measurement error associated with
    the project and include the DQIs described in Section 5.2.3.1.

QAPP Worksheet #37, Usability Assessment, may be used for this purpose.

5.2.3.1 Data Limitations and Actions from Usability Assessment

The following data quality indicators (precision, accuracy/bias, representativeness, comparability,
completeness, and sensitivity) are important components of validation and usability assessment.
A description of how they should be incorporated into the usability report is found under each
parameter heading.  Further discussion of the importance of these parameters as  they relate to
specific QC samples can be found in Sections 2.6.2 and 3.4 of this UFP-QAPP Manual, and Section
2.2 of the QA/QC Compendium.

When project-required measurement performance criteria are not achieved and project data are not
usable to adequately address environmental questions (i.e., to determine if regulatory or technical
action limits have been exceeded) or to support project decision-making, then the usability report
should address how this problem will be resolved and discuss the potential need for resampling.

5.2.3.1.1  Precision

Precision is the degree to which a set of observations or measurements of the same property,
obtained under similar conditions,  conform to themselves. Precision is usually  expressed as
standard deviation, variance, percent difference, or range, in  either absolute or relative terms.
Examples of QC measures for precision include field duplicates, laboratory duplicates, matrix spike
duplicates, analytical replicates, and surrogates.
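
For illustration only, the sketch below computes two commonly used expressions of precision:
relative percent difference (RPD) for a field duplicate pair and percent relative standard
deviation (%RSD) for replicate results. The numerical values are assumed examples, and acceptance
limits come from the QAPP.

    import statistics

    # Hedged illustration of two common precision measures; acceptance limits are
    # project-specific and defined in the QAPP.

    def relative_percent_difference(a, b):
        """RPD between a sample and its duplicate, as a percentage of their mean."""
        return abs(a - b) / ((a + b) / 2.0) * 100.0

    def percent_rsd(values):
        """Percent relative standard deviation for three or more replicate results."""
        return statistics.stdev(values) / statistics.mean(values) * 100.0

    print(round(relative_percent_difference(12.0, 10.0), 1))   # 18.2 (% RPD, duplicate pair)
    print(round(percent_rsd([9.8, 10.4, 10.1]), 1))            # 3.0 (% RSD, replicates)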

In order to meet the needs of the data users, project data must meet the measurement performance
criteria for precision specified in the QAPP (see Section 2.6.2.1). Section 2.2.2 and Table A-l of
the QA/QC Compendium identify QC samples required for projects in the CERCLA process that
contribute to the measurement of precision.

Poor overall precision may be the result of one or more of the following:  field instrument variation,
analytical measurement variation, poor sampling technique, sample transport problems, or spatial
variation (heterogeneous sample matrices). To identify the cause of imprecision, the field sampling
design rationale and sampling techniques should be evaluated by the reviewer, and both field and
analytical duplicate/replicate sample results  should be reviewed. If poor precision is indicated in
both the field and analytical duplicates/replicates, then the laboratory may be the source of error.
If poor precision is limited to the field duplicate/replicate results, then the sampling technique, field
instrument variation, sample transport, and/or spatial variability may be the source of error.

If data validation reports indicate that analytical imprecision  exists for a  particular data set or
sample delivery group (SDG), then the impact of that imprecision on usability must be discussed
in the usability report.
 Usability Report

 The usability report should discuss and compare overall field duplicate precision data from multiple data sets
 collected for the project for each matrix, analytical group, and concentration level. Usability reports should describe
 the limitations on the use of project data when overall precision is poor or when poor precision is limited to a
 specific sampling or laboratory (analytical) group, data set or SDG, matrix, analytical group, or concentration level.
5.2.3.1.2  Accuracy/Bias

Accuracy is the degree of agreement between an observed value and an accepted reference value.
Accuracy includes a combination of random error (precision) and systematic error (bias) that are
due to sampling and analytical operations. Examples of QC measures for accuracy include PT
samples, matrix spikes, laboratory control  samples (LCSs), and equipment blanks.
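
For illustration only, the sketch below computes percent recovery, a common quantitative
expression of accuracy/bias for matrix spikes and laboratory control samples. The numerical values
are assumed examples, and acceptance limits come from the QAPP.

    # Hedged illustration: percent recovery for spiked QC samples. Acceptance limits are
    # project-specific and come from the QAPP; the values below are assumed example numbers.

    def matrix_spike_recovery(spiked_result, unspiked_result, spike_added):
        """Percent recovery of a matrix spike relative to the amount of analyte added."""
        return (spiked_result - unspiked_result) / spike_added * 100.0

    def lcs_recovery(measured, true_value):
        """Percent recovery of a laboratory control sample against its certified value."""
        return measured / true_value * 100.0

    print(round(matrix_spike_recovery(48.0, 10.0, 40.0), 1))   # 95.0 (% recovery)
    print(round(lcs_recovery(19.2, 20.0), 1))                  # 96.0 (% recovery)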

In order to meet the needs of the data users, project data must meet the measurement performance
criteria for accuracy/bias specified in the QAPP (see Section 2.6.2.2). Section 2.2.2 and Tables A-2
and A-3 of the QA/QC Compendium identify QC samples required for projects in the CERCLA
process that contribute to the measurement of accuracy.
 Usability Report

 The usability report should:

 •  Discuss and compare overall contamination and accuracy/bias data from multiple data sets collected for the
    project for each matrix, analytical group, and concentration level.
 •  Describe the limitations on the use of project data if extensive contamination and/or inaccuracy or bias exist,
    or when inaccuracy is limited to a specific sampling or laboratory group, data set or SDG, matrix, analytical
    group, or concentration level.
 •  Identify qualitative and/or quantitative bias trends in multiple proficiency testing (PT) sample results for each
    matrix, analytical group, and concentration level.
 •  Discuss the impact of any qualitative and quantitative trends in bias on the sample data.

 Any PT samples that have false positive or false negative results should be reported, and the impact on
 usability should be discussed in the usability report.
5.2.3.1.3  Representativeness

Representativeness is the measure of the degree to which data accurately and precisely represent
a characteristic of a population, a parameter variation at a sampling point, a process condition, or
an environmental condition. In order to meet the needs of the data users, project data must meet the
measurement performance criteria for sample representativeness specified in the QAPP (see Section
2.6.2.4).

The QAPP should discuss how the QA/QC activities (review of sampling design and SOPs, field
sampling TSAs, split sampling and analysis audits, etc.) and QC sample data will be reviewed to
assess sample representativeness. If field duplicate precision checks indicate potential spatial
variability, additional scoping meetings and subsequent resampling may be needed in order to
collect data that are more representative of a nonhomogeneous site.
 Usability Report

 The usability report should discuss and compare overall sample representativeness for each matrix, analytical
 group, and concentration level. Usability reports should describe the limitations on the use of project data when
 overall nonrepresentative sampling has occurred, or when nonrepresentative sampling is limited to a specific
 sampling group, data set or SDG, matrix, analytical group, or concentration level.
5.2.3.1.4   Comparability

Comparability is the degree to which different methods, data sets, and decisions agree or can be
represented as similar. Comparability describes  the  confidence (expressed qualitatively or
quantitatively) that two data sets can contribute to a common analysis and interpretation. In order
to meet the needs of the data users, project data must meet the measurement performance criteria
for comparability specified in the QAPP (see Section 2.6.2.5).

The QAPP should include methods and formulas for assessing data comparability for each matrix,
analytical group, and concentration level. Different  situations require different assessments of
comparability, as in the following:

•  If two or more sampling procedures or sampling teams will be used to collect samples, describe
   how comparability will be assessed for each matrix, analytical group, and concentration level.
•  If two or more analytical methods or SOPs will be used to analyze samples of the same matrix
   and concentration level for the same analytical group, describe how comparability will be
   assessed between the two data sets.
•  If split samples are analyzed, document the specific method and percent difference formula that
   will be used to assess split sample comparability for individual data points (refer to Section
   2.6.2.5.1). To document overall comparability, describe the procedures used to perform overall
   assessment of oversight split sampling comparability and include mathematical and statistical
   formulas for evaluating oversight split sampling data comparability (a minimal example of a
   percent difference calculation follows this list). Section 2.2.2 of the QA/QC Compendium
   recommends that, for proper evaluation of results, split samples should be used only when
   accompanied by a batch-specific PT sample.
•  If screening data will be confirmed by definitive methods, document the specific method and
   percent difference formula that will be used to assess comparability for individual data points
   (refer to Section 2.6.2.5.2). To document overall comparability, describe the procedures used
   to perform overall assessment of comparability and include mathematical and statistical
   formulas for evaluating screening and confirmatory data comparability.
•  If the project is long-term monitoring, project data should be compared with previously
   generated data to ascertain the possibility of false positives and false negatives, and positive and
   negative trends in bias. Data comparability is extremely important in these situations.
   Anomalies detected in the data may reflect a changing environment or indicate sampling and/or
   analytical error. Comparability criteria should be established to evaluate these data sets to
   identify outliers and the need for resampling as warranted.
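
The sketch below shows one hypothetical way to compute a split-sample percent difference in
Python. The convention shown (difference relative to the primary laboratory result) and the example
values are assumptions for illustration; the formula actually documented in the QAPP under Section
2.6.2.5.1 governs the project.

    def split_sample_percent_difference(primary_result: float,
                                        split_result: float) -> float:
        """Percent difference between a primary laboratory result and the
        oversight split result, relative to the primary result."""
        if primary_result == 0:
            raise ValueError("Primary result is zero; percent difference undefined.")
        return abs(primary_result - split_result) / primary_result * 100.0

    # Example: primary laboratory reports 50 mg/kg; the oversight split
    # laboratory reports 42 mg/kg for the same sample.
    pd = split_sample_percent_difference(50.0, 42.0)
    print(f"Split-sample percent difference = {pd:.0f}%")  # 16%
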
 Usability Report

 The usability report should:

 •  Discuss and compare overall comparability between multiple data sets collected for the project for each matrix,
   analytical group, and concentration level.
 •  Describe the limitations on the use of project data when project-required data comparability is not achieved for
   the overall project or when comparability is limited to a specific sampling or laboratory group, data set or SDG,
   matrix, analytical group, or concentration level.
 •  Document the failure to meet screening/confirmatory comparability criteria and discuss the impact on usability.
 •  Document the failure to meet split sampling comparability criteria and discuss the impact on usability.
 •  If data are  not usable  to adequately address environmental questions or support project decision-making,
   address how this problem will be resolved and discuss the potential need for resampling.
 •  If long-term monitoring data are not comparable, address whether the data indicate a changing environment or
   are a result of sampling or analytical error.
5.2.3.1.5   Sensitivity and Quantitation Limits

Sensitivity is the capability of a test method or instrument to discriminate between measurement
responses representing different levels (e.g., concentrations) of a variable of interest. Examples of
QC measures for determining sensitivity include laboratory fortified blanks, a method detection
limit study, and calibration standards at the quantitation limit (QL).

In order to meet the needs of the data users, project data must meet the measurement performance
criteria for sensitivity and project QLs specified in the QAPP (see Section 2.6.2.3). Section 2.2.2
and Table A-4 of the QA/QC Compendium identify QC samples required for projects in the
CERCLA process that contribute to the measurement of sensitivity.


The QAPP should include the following:

•  Methods and formulas for calculating analytical sensitivity that ensure QLs are achieved (e.g.,
   percent recovery of laboratory fortified blank compounds)
•  Procedures for calculating MDLs, QLs, and SQLs (refer to Figure 14 in Section 2.6.2.3; a
   minimal sketch follows this list)
•  Procedures for evaluating low-point calibration standards run at the QL
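
As a point of reference only, one widely used MDL procedure (the single-laboratory replicate
approach of 40 CFR Part 136, Appendix B) multiplies the standard deviation of low-level spiked
replicates by a one-sided 99 percent Student's t value. The sketch below is a hypothetical Python
illustration of that calculation and assumes the SciPy library is available; the procedures actually
specified in the QAPP and in Figure 14 of Section 2.6.2.3 govern the project.

    import statistics
    from scipy import stats  # assumed available for the Student's t quantile

    def method_detection_limit(replicate_results: list[float],
                               confidence: float = 0.99) -> float:
        """MDL = t(n-1, 0.99) * s, where s is the standard deviation of n
        replicate analyses of a low-level spiked sample."""
        n = len(replicate_results)
        s = statistics.stdev(replicate_results)   # sample standard deviation
        t = stats.t.ppf(confidence, n - 1)        # one-sided 99% t quantile, n-1 df
        return t * s

    # Example: seven replicate analyses of a low-level spike, in ug/L.
    replicates = [0.52, 0.46, 0.55, 0.49, 0.51, 0.47, 0.53]
    print(f"MDL = {method_detection_limit(replicates):.3f} ug/L")
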
  Usability Report

  The usability report should:

  •  Discuss and compare overall sensitivity and QLs from multiple data sets collected for the project for each
    matrix, analytical group, and concentration level.
  •  If validation reports indicate that sensitivity or QLs were not achieved, discuss the impact of the
     resulting lack of sensitivity or higher QLs on data usability.
  •  Describe the limitations on the use of project data if project-required sensitivity and QLs are not achieved for
    all project data, or when sensitivity is limited to a specific sampling or laboratory group, data set or SDG,
    matrix, analytical group, or concentration level.
5.2.3.1.6   Completeness

Completeness is a measure of the amount of valid data obtained from a measurement system
compared with the amount that was expected to be obtained under correct, normal circumstances.

In order to meet the needs of the data users, project data must meet the measurement performance
criteria for data completeness specified in the QAPP (see Section 2.6.2.6).

The QAPP should:

•  Include the methods and formulas for calculating data completeness (a minimal sketch follows
   this list).
•  Describe how the amount of valid data will be determined as a percentage of the number of
   valid measurements that are specified in the QAPP for each matrix, analytical group, and
   concentration level.
•  Describe how critical data will be assessed for completeness when certain sample locations or
   analytes and matrices are more critical than others in making project decisions.
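
A minimal, hypothetical Python sketch of a completeness calculation follows; the function name and
the example counts are assumptions for illustration, and the completeness criterion itself comes from
the QAPP (Section 2.6.2.6).

    def percent_completeness(valid_results: int, planned_results: int) -> float:
        """Completeness = valid results obtained / results planned in the QAPP * 100."""
        return valid_results / planned_results * 100.0

    # Example: 47 of 50 planned results for one matrix, analytical group, and
    # concentration level survived verification and validation as usable.
    print(f"Completeness = {percent_completeness(47, 50):.0f}%")  # 94%
    # Critical samples and analytes identified in the QAPP would be evaluated
    # separately, since a single missing critical result can be decisive.
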
  Usability Report

  The usability report should:

  •  Discuss and compare overall completeness of multiple data sets collected for the project for each matrix,
    analytical group, and concentration level.
  •  Describe the limitations on the use of project data if project-required completeness is not achieved for the overall
    project, or when completeness is limited to a specific sampling or laboratory group, data set or SDG, matrix,
    analytical group, or concentration level.
5.2.3.2 Activities

The entire project team should reconvene to perform the usability assessment to ensure that the
PQOs are understood and the full scope of the project is considered. The items listed in Table 12
are examples of specific items that should be considered during the usability assessment of an
environmental project.
                     Table 12. Considerations for Usability Assessment

 Item                             Assessment Activity

 Data Deliverables and QAPP       Ensure that all necessary information was provided, including
 Deviations                       but not limited to validation results. Determine the impact of
                                  deviations on the usability of data.

 Sampling Locations, Deviation    Determine if alterations to sample locations continue to
                                  satisfy the project objectives.

 Chain-of-Custody, Deviation      Establish that any problems with documentation or custody
                                  procedures do not prevent the data from being used for the
                                  intended purpose.

 Holding Times, Deviation         Determine the acceptability of data where holding times were
                                  exceeded.

 Damaged Samples, Deviation       Determine whether the data from damaged samples are usable.
                                  If the data cannot be used, determine whether resampling is
                                  necessary.

 PT Sample Results, Deviation     Determine the implications of any unacceptable analytes (as
                                  identified by the PT sample results) on the usability of the
                                  analytical results. Describe any limitations on the data.

 SOPs and Methods, Deviation      Evaluate the impact of deviations from SOPs and specified
                                  methods on data quality.

 QC Samples                       Evaluate the implications of unacceptable QC sample results
                                  on the data usability for the associated samples. For example,
                                  consider the effects of observed blank contamination.

 Matrix                           Evaluate matrix effects (interference or bias).

 Meteorological Data and          Evaluate the possible effects of meteorological (e.g., wind,
 Site Conditions                  rain, temperature) and site conditions on sample results.
                                  Review field reports to identify whether any unusual
                                  conditions were present and how the sampling plan was
                                  executed.

 Comparability                    Ensure that results from different data collection activities
                                  achieve an acceptable level of agreement.

 Completeness                     Evaluate the impact of missing information. Ensure that
                                  enough information was obtained for the data to be usable
                                  (completeness as defined in PQOs documented in the QAPP).

 Background                       Determine if background levels have been adequately
                                  established (if appropriate).

 Critical Samples                 Establish that critical samples and critical target
                                  analytes/COCs, as defined in the QAPP, were collected and
                                  analyzed. Determine if the results meet criteria specified in
                                  the QAPP.

 Data Restrictions                Describe the exact process for handling data that do not meet
                                  PQOs (i.e., when measurement performance criteria are not
                                  met). Depending on how those data will be used, specify the
                                  restrictions on use of those data for environmental
                                  decision-making.

 Usability Decision               Determine if the data can be used to make a specific decision
                                  considering the implications of all deviations and corrective
                                  actions.

 Usability Report                 Discuss and compare overall precision, accuracy/bias,
                                  representativeness, comparability, completeness, and
                                  sensitivity for each matrix, analytical group, and
                                  concentration level. Describe limitations on the use of
                                  project data if criteria for data quality indicators are not
                                  met.
5.3    Streamlining Data Review

Streamlining data review refers to a process of eliminating some requirements for validation (steps
IIa and IIb) that are deemed no longer necessary to preserve data integrity. Streamlining data review
is  meant to reduce time and costs while still confirming the  quality of the data.  Thus,  any
streamlining option should recognize that:

•  The types and amounts of data reviewed should be sufficient to develop a clear understanding
   of the quality of the data.
•  The practice of reviewing a subset of data (or a data indicator such as a successful PT sample)
   as a substitute for reviewing all data should be reevaluated if problems are detected that call into
   question the quality of the data set.

Streamlining data review occurs when efficiencies are created in the data review process by the
following actions:

•  Looking at a subset of data that is representative of a larger universe.
•  Examining the data in an alternative manner (e.g., through the use of batch-specific PT samples).

Different EPA Regions, DoD  components, and  DOE facilities  have negotiated a  variety of
streamlining options with different projects. The decision as to the nature and type of streamlining
to be conducted is determined by the project team on a site-by-site or facility-by-facility basis and
must be documented in the QAPP. The QAPP should also contain decision criteria that allow for
revision of the initial streamlining plan. For example, decision criteria contained in the QAPP could
specify that if problems are identified in the investigation, then streamlining cannot occur. Other
factors may also lead to a revision of the initial streamlining decision, such  as intense political
interest and concern on the part of the community.  The QAPP should contain a statement that
prohibits streamlining when conditions are not optimal.

Applicability of streamlining options is addressed in three ways: data review  steps for which
streamlining may be applicable,  criteria for considering the streamlining of data review, and  level
and type of streamlining to be applied. Each of these is addressed below.

5.3.1   Data Review Steps To Be Streamlined

Streamlining of data review steps is negotiated on a project-specific basis, in accordance with
the criteria outlined below, and is documented in the project-specific QAPP. The decision of
whether to streamline data review occurs during step IIa of the validation process
(compliance with method, procedural, and contractual requirements) and subsequent steps that rely
on outputs from step IIa. The level of streamlining in the data review steps (IIa, IIb, and III) is
evaluated on a project-specific basis.

Verification. Step I (verification) is not subject to streamlining. This step is a completeness check
of all of the sampling and analytical data associated with the project.  It is conducted by the
environmental  laboratory  (for analytical data) and by the prime contractor (for both sample
collection and analytical data). It may be conducted externally.

Validation. Step IIa (validation) may be streamlined based on criteria described in Section 5.3.2.
The amount of streamlining and the type of information to be streamlined is negotiated on a project-
by-project basis that takes into account the cost savings of streamlining analytical data validation
and review, while maintaining sufficient representativeness to ensure quality. Validation step IIb
(consistency with QAPP-specific requirements) can be streamlined to the same degree as step IIa,
since it relies on the outputs of step IIa.

Usability Assessment. Step III (usability assessment) can be streamlined only to the degree that step
IIa is streamlined, given that usability relies on outputs from previous steps.

5.3.2  Criteria for Streamlining Data Review

For each project, the following criteria are used to qualitatively evaluate the extent to which a
streamlined data review process for validation steps IIa and IIb is appropriate:

•  Level of risk associated with the target analytes/COCs at the site (not always known in the
   planning stage).
•  Cost and schedule demands of the overall project (could drive a decision to implement
   streamlining that may speed up the project and reduce costs).
•  The specific decisions for which the data will be used (e.g., risk assessment or determination of
   whether further investigation is required).
•  Complexity of analysis (more streamlining may be acceptable for simple analyses; less
   streamlining may be appropriate for highly complex analyses).
•  Ability to identify critical (most significant) samples and focus data review on those samples.
•  Political attention to the project (could drive more streamlining, in the case of time pressures, or
   less streamlining, in the case of potentially elevated risks).
•  Results of project-specific audits suggesting that data quality problems exist or that contractors
   are performing high-quality work.
•  Sampling events that include recurring samples (i.e., monthly or quarterly long-term monitoring
   of the same chemicals could lead to streamlined validation for these events).
•  Proximity of results to action levels. For example, analytical levels that are close to action levels
   may require a higher level of confidence (and a greater amount of validation) than levels that
   are considerably above action levels and for which validation is not likely to show a difference
   in the presence or absence of risk.
•  Availability of successfully performed batch-specific PT samples. The PT sample should be of
   a similar matrix, contaminant makeup, and concentration as the environmental samples being
   tested, and quantitative acceptance criteria should be established. Batch-specific PT samples
   may be used to streamline the analytical portion of validation only. Section 2.2.3 of the QA/QC
   Compendium summarizes the issues surrounding a requirement for batch-specific PT samples,
   and Section 2.3.3.2 describes the circumstances that may allow their use as a tool to streamline
   validation.

5.3.3  Amounts and Types of Data Appropriate for Streamlining

The amounts and types of data to be streamlined (for steps IIa and IIb), as well as the nature of the
streamlining activity, will  be determined by  site-specific  circumstances.  Some examples of
streamlining options are presented below:

•  Only a specific percentage of all data sets will be validated (e.g., 10 percent), unless a problem
   is identified (a sketch of this option follows the note below).
•  Only a specific percentage of all data sets will be validated, but all critical samples, as identified
   in the QAPP, will undergo full data review.
•  Only a specific percentage of all data sets will be validated, but that validation will include
   recalculation of raw data.
•  All data will be validated, but only a percentage of raw data will be reviewed and recalculated.
•  Successful batch-specific PT samples may substitute for validation of all or some of the
   analytical data.
  Note: The term validation has traditionally applied to analytical data. As used here, the term applies both to data
  from field sampling and analytical activities. Since the environmental community has more experience with
  validation for analytical data, it is easier to identify some logical options for that process. The examples
  described above therefore involve analytical data.
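
A minimal, hypothetical Python sketch of the first option above (validating a fixed percentage of
data sets unless a problem is identified) is shown below. The 10 percent figure, the SDG identifiers,
and the use of simple random selection are illustrative assumptions only; the actual percentage,
selection method, treatment of critical samples, and escalation criteria must be documented in the
QAPP.

    import math
    import random

    def select_sdgs_for_validation(sdg_ids: list[str], fraction: float = 0.10,
                                   seed: int = 0) -> list[str]:
        """Randomly select at least ceil(fraction * N) SDGs for full validation.
        Critical samples identified in the QAPP would be added regardless."""
        n_to_validate = max(1, math.ceil(fraction * len(sdg_ids)))
        rng = random.Random(seed)  # fixed seed so the selection is reproducible
        return sorted(rng.sample(sdg_ids, n_to_validate))

    # Example: 25 hypothetical sample delivery groups for one sampling event.
    sdgs = [f"SDG-{i:03d}" for i in range(1, 26)]
    print(select_sdgs_for_validation(sdgs))  # selects 3 of the 25 SDGs
    # If validation of the selected subset identifies problems, the decision
    # criteria in the QAPP would expand validation to additional or all SDGs.
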
                                    REFERENCES
American Association for Laboratory Accreditation, A2LA Guide for the Estimation of Measurement
Uncertainty in Testing, July 2002. Web site: http://www.a21a2.net/guidance/est_mu_testing.pdf.

American National Standards Institute, Quality Systems for Environmental Data and Technology
Programs - Requirements with guidance for use, American National Standard, ANSI/ASQ E4-2004.

ASTM International, D4840-99 Standard Guide for Sampling Chain-of-Custody Procedures, 2003.

Department of Defense, Quality Systems Manual for Environmental Laboratories, Version 2, June
2002 or most recent revision.

Eurachem/CITAC, Quantifying Uncertainty in Analytical Measurement, 2nd Edition, July 2002.
Web site: http://www.eurachem.ul.pt/guides/QUAM2000-1.pdf.

Ingersoll, William (NAVSEA), Environmental Analytical Measurement Uncertainty Estimation:
Nested Hierarchical Approach, 2001.

National Environmental Laboratory Accreditation Conference (NELAC), 2003 NELAC Standard,
Chapters 1 to 6, EPA/600/R-04/003, June 5, 2003, or most recent version. Web site:
http://www.epa.gov/nerlesd1/land-sci/nelac/.

U.S. Air Force, Quality  Assurance Project Plan,  HQ Air Force Center for Environmental
Excellence, July 2001 or most recent revision. Web site: http://www.afcee.brooks.af.mil/er/qfw.htm.

U.S. Army Corps of Engineers, Chemical Quality Assurance for HTRW Projects, EM 200-1-6,
October 10, 1997, or most recent revision. Web site: http://www.usace.army.mil/inet/usace-
docs/eng-manuals/em.htm.

U.S. Army Corps of Engineers, Requirements for the Preparation of Sampling and Analysis Plans,
USACE EM 200-1-3. Web site: http://www.usace.army.mil/inet/usace-docs/eng-manuals/em.htm.

U.S. Army Corps of Engineers, Technical Project Planning Guidance for HTRW Data Quality
Design, USACE EM 200-1-2. Web site: http://www.usace.army.mil/inet/usace-docs/eng-
manuals/em200-1-2/toc.htm.

U.S. EPA, Data Quality Objectives Decision Error Feasibility Trials Software (DEFT) - USER'S
GUIDE (EPA QA/G-4D), EPA/240/B-01/007, September 2001 or most recent revision. Web site:
http://www.epa.gov/quality1/qa/pdfs/ldrdv.pdf.

U.S. EPA, Guidance for the Data Quality Assessment Process: Practical Methods for Data Analysis,
EPA/600/R-96/084, July 2000 or most recent revision. Web site:
http://www.epa.gov/quality1/qa_docs.html.

U.S. EPA, Guidance for the Data Quality Objectives Process (EPA QA/G-4), EPA/600/R-96/055,
July 2000 or most recent revision. Web site: http://www.epa.gov/quality1/qa_docs.html.

U.S. EPA, Guidance for Geospatial Data Quality Assurance Project Plans (EPA QA/G-5G),
EPA/240/R-03/003, March 2003 or most recent version. Web site:
http://www.epa.gov/quality1/qa_docs.html.

U.S. EPA, Guidance for Monitoring at Hazardous  Waste Sites: Framework for Monitoring Plan
Development and Implementation, OSWER Directive No. 9355.4-28, July 2002.

U.S. EPA, Guidance for the Preparation of Standard Operating Procedures for Quality-Related
Operations (EPA QA/G-6), EPA/240/B-01/004, March 2001 or most recent revision. Web site:
http://www.epa.gov/quality1/qa_docs.html.

U.S. EPA, Guidance for Quality Assurance Project Plans (EPA QA/G-5), EPA/240/R-02/009,
December 2002 or most recent revision. Web site: http://www.epa.gov/quality1/qa_docs.html.

U.S. EPA, Guidance for Quality Assurance Project Plans for Modeling (EPA QA/G-5M),
EPA/240/R-02/007, December 2002 or most recent version. Web site:
http://www.epa.gov/quality1/qa_docs.html.

U.S. EPA, Guidance on Choosing a Sampling Design for Environmental Data Collection for Use
in Developing a Quality Assurance Project Plan (EPA QA/G-5S), EPA/240/R-02/005, December
2002 or most recent version. Web site: http://www.epa.gov/quality1/qa_docs.html.

U.S. EPA, Information Quality Guidelines, EPA/260R-02-008, October 2002.

U.S. EPA, National Enforcement Investigations Center  (NEIC) Policies  and Procedures,
EPA/330/9-78/001-R, May 1978, rev. December 1981, or most recent revision. NTIS: 1-800-553-
6847.

U.S. EPA, Requirements for Quality Assurance Project Plans (EPA QA/R-5), March 2001 or most
recent revision. Web site: http://www.epa.gov/quality1/qa_docs.html.

U.S. EPA, Intergovernmental Data Quality Task Force, Uniform Federal Policy for Implementing
Environmental Quality Systems: Evaluating, Assessing and Documenting Environmental Data
Collection/Use and Technology Programs, Final, Version 2, March 2005.

U.S. EPA, Region 9, Draft Laboratory Documentation Requirements for Data Validation (9QA-07-
97), July 1997 or most recent revision. Web site: http://www.epa.gov/region9/qa/rq_qadocs.html.

U.S. EPA, Department of Defense, Department of Energy, U.S. Nuclear Regulatory Commission,
Multi-Agency Radiation Survey and Site Investigation Manual (MARSSIM), July 2000. Web site:
http://www.epa.gov/radiation/marssim/.

U.S. EPA, Department of Defense, Department of Energy, U.S. Nuclear Regulatory Commission,
Multi-Agency Radiological Laboratory Analytical Protocols Manual (MARLAP), July 2001. Web
site: http://www.eml.doe.gov/marlap.

U.S. Navy, Navy Installation Restoration Chemical Data Quality Manual (IR CDQM), September
1999. Web site: http://enviro.nfesc.navy.mil/ps/guidance/ircdqm/.

           GLOSSARY OF QUALITY ASSURANCE AND RELATED TERMS

Acceptance criteria — Specified limits placed on characteristics of an item, process, or service
defined in requirements documents.

Accuracy — The degree of agreement between an observed value and an accepted reference value.
Accuracy includes a combination of random error (precision) and systematic error (bias) components
that are due to sampling and analytical operations; a data quality indicator. Examples of QC measures
for accuracy include proficiency testing samples, matrix spikes, laboratory control samples (LCSs),
and equipment blanks.

Action limit/level — The numerical value that causes a decision maker to choose or accept one of
the alternative actions. It may be a regulatory threshold standard, such as a maximum contaminant
level for  drinking water; a risk-based concentration level; a technology limitation;  or a reference-
based standard.

Activity  — An all-inclusive term describing a specific set of operations or related tasks to be
performed, either serially or in parallel (e.g., research and development, field sampling, analytical
operations, equipment fabrication), that, in total, result in a product or service.

Aliquot — A measured portion of a sample taken for analysis.

Analyte — A property that is to be measured.

Analytical batch — A group of samples, including quality control samples, which are processed
together using the same method, the same lots of reagents, and at the same time or in continuous,
sequential time periods. Samples in each batch should be of similar composition and share common
internal quality control standards.

Assessment —  As  defined in the UFP-QAPP, the  evaluation process used to measure the
performance or effectiveness of a system and its elements against specific criteria.1 Examples include,
but are not limited to, audits, proficiency  testing,  management systems reviews,  data quality
assessments, peer reviews, inspections, or surveillance.

Audit (quality) — A systematic and independent examination to determine whether QA/QC and
technical activities are being conducted as planned  and whether these activities will effectively
achieve quality objectives. See also Technical Systems Audit.
  1 This differs from the definition for Assessments in the UFP-QS due to the different scope of the two documents.

Bias — The systematic or persistent distortion of a measurement process, which causes errors in one
direction (i.e., the expected sample measurement is different from the sample's true value).

Blank — A sample subjected to the usual analytical or measurement process to establish a zero
baseline or background value; a sample that is intended to contain none of the analytes of interest.
A blank is used to detect contamination during sample handling, preparation, and/or analysis.

Bottle blank — A sample designed to  evaluate contamination introduced from  the  sample
container(s) in a particular lot.

Calibration — A comparison of a measurement standard, instrument, or item with a standard or
instrument of higher accuracy to detect and quantify inaccuracies and to report or eliminate those
inaccuracies by adjustments.

Calibration  standard — A  substance  or  reference material used for  calibration.  See  also
Calibration.

Certification — The process of testing and evaluation against specifications designed to document,
verify, and recognize the competence of a person, organization, or other entity to perform a function
or service, usually for  a specified time.

Chain of custody — An  unbroken trail  of accountability that ensures the physical security of
samples, data, and records.

Characteristic — Any property or attribute of a datum,  item, process, or service that is distinct,
describable, and/or measurable.

Coefficient of variation (CV) — A measure of precision (relative dispersion).  It is equal to the
standard deviation divided by the arithmetic mean. See also Relative standard deviation.

Co-located samples — See Field duplicates, co-located samples.

Comparability — The degree to which different methods or data agree or  can be represented as
similar. Comparability describes the confidence that two data sets  can contribute to a common
analysis and interpretation.

Completeness — A measure of the amount of valid data obtained from a measurement system
compared with the amount that was expected to be obtained under correct, normal conditions.

Configuration — The functional, physical, and procedural characteristics of an item, experiment,
or document.

Conformance — An affirmative indication or judgment that a product or service has met the
requirements of the relevant specification, contract, or regulation; also,  the state of meeting the
requirements.

Contaminants of concern (COC) — The matrix-specific list of chemical compounds and analytes
determined to be pertinent to a specific site or project; sometimes used interchangeably with target
analytes.

Continuing calibration verification — A check of the initial calibration that is performed during
the course of an analytical shift at periodic intervals using a Calibration Check Standard. Continuing
calibration verification applies to both external standard and internal standard calibration techniques,
as well as to linear and non-linear calibration models. The purpose is to assess the continued
capability of the measurement system to generate accurate and precise data over a period of time.

Contractor — Any organization or individual contracting to furnish services or items or to perform
work.

Corrective action — Any measures taken to rectify conditions adverse to quality and, where
possible, to preclude their recurrence.

Data quality indicators (DQIs) — The quantitative statistics and qualitative descriptors that are used
to interpret the  degree of acceptability or utility of data to the user. The principal data quality
indicators  are precision,  accuracy/bias,  comparability, completeness,  representativeness,  and
sensitivity.2 Also referred to as data quality attributes.

Data quality objectives (DQOs) — Qualitative and quantitative statements derived from the data
quality objectives (DQO) process, as defined by EPA QA/G-4. DQOs can be used as the basis for
establishing the quality and quantity of data needed to support decisions.

Data quality objective (DQO) process — A systematic planning tool based on the scientific method
that clarifies study objectives, defines the appropriate type, quantity and quality of data and specifies
tolerable levels of potential decision errors needed to answer specific environmental questions and
to support proper environmental decisions. The DQO process is one type of systematic planning
process. See also Systematic planning process.

Data reduction — The process of transforming the number of data items by arithmetic or statistical
calculations, standard curves, and concentration factors, and collating them into a more useful form.
Data reduction is irreversible and generally results in a reduced data set and an associated loss of
detail.

 2 The definition in the UFP-QS does not include sensitivity; however, sensitivity is considered a principal DQI in this
 Manual.

Data review — The process of examining and/or evaluating data to varying levels of detail and
specificity by a variety of personnel who have different responsibilities within the data management
process. It includes verification, validation, and usability assessment.

Data  user — Technical and other personnel responsible  for engineering, scientific, and legal
evaluations that are the basis for site decisions. Data users are responsible for determining data needs
required to satisfy project objectives from their perspective (remedy, risk, compliance, etc.).

Decision-maker — Project manager, stakeholder, regulator, etc., who has specific interests in the
outcome of site-related activities and will use the collected data to make decisions regarding the
ultimate disposition of the site or whether to proceed to the next study phase.

Definitive data — Analytical data of known quality, concentration, and level of uncertainty. The
levels of quality and uncertainty of the  analytical data are consistent with the requirements for the
decision to be made. Suitable for final decision-making. See also Screening data.

Design — The specifications, drawings,  design criteria, and performance requirement; also, the result
of deliberate planning, analysis, mathematical manipulations, and design processes.

Detection limit — A measure of the capability of an analytical method to distinguish samples that
do not contain a specific analyte from samples that contain low concentrations  of the analyte; the
lowest concentration or amount of the target analyte that can be determined to be  different from zero
by a single measurement at a stated level of probability. Detection limits are analyte- and matrix-
specific and may be laboratory-dependent. See also Method detection limit, Quantitation limit, and
Sample quantitation limit.

Distribution — (1) The apportionment of an environmental contaminant at a point over time, over an
area, or within a volume; (2) a probability function (density function, mass function, or distribution
function) used to describe a set of observations (statistical sample) or a population from which the
observations are generated.

Document — Written text such as a report, standard operating procedure, or plan. Once written,
documents can be revised or amended, unlike records, which are not revised once written.

Document control — The policies  and procedures used by an organization to  ensure that its
documents and their revisions are proposed, reviewed, approved for release, inventoried, distributed,
archived, stored, and retrieved in accordance with the organization's requirements.

Environmental conditions — The description of a physical matrix (e.g., air, water, soil, sediment)
or a biological  system expressed in terms of its physical, chemical, radiological, or biological
characteristics.

Environmental data  — Any parameters or pieces of information collected  or produced from
measurements, analyses, or models of environmental processes, conditions, and effects of pollutants
on human health and the ecology, including results from laboratory analyses or  from experimental
systems representing such processes and conditions. It also includes information collected directly
from measurements, produced from models, and compiled from other sources such as databases or
the literature.

Environmental data  operations — Any work performed to obtain, use,  or report information
pertaining to environmental processes and conditions.

Environmental monitoring — The process of measuring or collecting environmental data.

Environmental processes — Any manufactured or natural processes that produce discharges to, or
that impact, the ambient environment.

Environmental programs — An all-inclusive term pertaining to any work or activities involving
the environment, including but  not limited to  characterization  of environmental processes  and
conditions;  environmental monitoring; environmental research and development; the  design,
construction,  and  operation  of environmental  technologies;  and  laboratory  operations  on
environmental samples.

Equipment blank — A sample  of water free  of measurable contaminants poured over or through
decontaminated field sampling equipment that is considered ready to collect or process an additional
sample. The purpose of this blank is to assess the adequacy of the decontamination process. Also
called rinse blank or rinsate blank.

Estimate — A characteristic from the sample  from which inferences on parameters can be made.

Field blank — A blank used to provide information about contaminants that may be introduced
during sample collection, storage, and transport; also a clean sample exposed to sampling conditions,
transported to the laboratory, and treated as an environmental  sample.

Field duplicate (replicate) samples — 1) A generic term for two (or more)  field samples taken at
the same time in the same location. They are intended to represent the same population and are taken
through all steps of the analytical procedure in an identical manner and provide precision information
for the data collection activity. 2) The UFP-QAPP recognizes two categories of Field Duplicate
Samples, defined by the collection method: field duplicate, co-located samples and field duplicate,
subsamples. See also Field duplicate, co-located samples and Field duplicate, subsamples.

Field duplicate, co-located samples — Two or more independent samples collected from side-by-
side locations at the same point in time and space so as to be considered identical. These separate
samples are said to represent the same population and are carried through all steps of the sampling
and analytical procedures in an identical manner. These samples are used to assess precision of the
total method, including sampling, analysis, and site heterogeneity. Examples of co-located samples
include ambient air monitoring samples, surface water grab samples, and side-by-side sample core
soil samples.

Field duplicate (replicate), subsamples—Duplicate (replicate) samples resulting from one sample
collection at one sample location. For example, duplicate (replicate) subsamples may be taken from
one soil boring or sediment core.

Finding — An assessment conclusion that identifies a condition having a significant effect on an item
or activity. An assessment finding may be positive  or negative and  is  normally accompanied by
specific examples of the observed condition.

Graded approach — The objective process of establishing the project requirements and level of
effort according to the intended use of the results and the degree of confidence needed in the quality
of the results.

Guidance — A suggested practice that is not mandatory, intended as an aid or example in complying
with a standard or requirement.

Guideline — A suggested practice that is not mandatory in programs  intended to comply with a
standard.

Hazardous waste — Any waste material that satisfies the definition of hazardous waste given in 40
CFR 261, "Identification and Listing of Hazardous Waste."

Holding time — The period of time a sample may be stored prior to its  required analysis.

Inspection — The examination or measurement of an item or activity to verify conformance to
specific requirements.

Instrument blank — An aliquot of analyte-free water or solvent processed through the instrumental
steps of the measurement process to determine the presence of carryover  from the previous analysis.
Analysis does not include any  sample preparation.

Instrument performance check sample — A sample of known composition analyzed concurrently
with environmental samples to verify the performance of one or more components of the analytical
measurement process.  Those  components can  include  retention time, resolution, recovery,
degradation, etc.

Interference — A positive or negative effect on a measurement caused by an analyte other than the
one being investigated or other factors.

Internal standard — A standard added to a test portion of a sample in a known amount and carried
through the entire determination procedure as a reference for calibrating and controlling the precision
and bias of the applied analytical method.

Investigative organization — An entity contracted by the lead organization for one or more phases
of a data collection operation.

Laboratory control sample — A sample of known composition prepared using contaminant-free
water or an inert solid that is spiked with analytes of interest at the midpoint of the calibration curve
or at the level of concern.  It is analyzed using the same sample preparation, reagents, and analytical
methods  employed for regular samples.

Laboratory duplicates/replicates — Two or more representative  portions taken from one
homogeneous sample by the laboratory and analyzed in the same laboratory. Laboratory duplicate/
replicate samples are quality control samples that are used to assess intralaboratory preparatory and
analytical precision.

Laboratory fortified blank — A low-level laboratory control sample (e.g., at the quantitation limit)
used to evaluate laboratory preparatory and analytical sensitivity and bias for specific compounds.

Lead organization — An entity responsible for all phases  of the data collection operation.

Management — Those individuals directly responsible and accountable for planning, implementing,
and assessing work.

Management  system — A structured, nontechnical system  describing the policies, objectives,
principles, organizational authority, responsibilities, accountability, and implementation plan of an
organization for conducting work and producing items and services.

Matrix —  The material of which the sample is composed, such as water, soil/sediment, or other
environmental medium.

Matrix spike — A sample prepared by adding a known concentration of a target analyte to an aliquot
of a specific homogenized environmental sample for which an independent estimate of the target
analyte concentration is available. The matrix spike is accompanied by an independent analysis of
the unspiked aliquot of the environmental sample. Spiked samples are used to determine the effect
of the matrix on a method's recovery efficiency.

Matrix spike duplicate — A homogeneous sample used  to  determine the  precision of the
intralaboratory analytical process for specific analytes (organics only) in a sample matrix. The
duplicate sample is prepared simultaneously as a split with the matrix spike sample, and each is
spiked with identical, known concentrations of targeted analyte(s).

Mean (arithmetic) — The sum of all the values of a set of measurements divided by the number of
values in the set; a measure of central tendency.

Measurement performance criteria — Acceptance limits selected for project-specific sampling and
analytical systems that will be used to judge whether project quality objectives are met. See also data
quality indicators.

Method — A body of procedures and techniques for performing an activity (e.g., sampling, chemical
analysis, quantification), systematically presented in the order in which they are to be executed.

Method blank — A sample of a matrix similar to the batch of associated samples (when available)
in which no target analytes or interferences are present at concentrations that impact the analytical
results. It is processed and analyzed simultaneously with samples of similar matrix and under the
same conditions as the samples.

Method  detection limit — Minimum concentration  of a substance that can be  reported with 99
percent confidence that the analyte concentration is greater than zero. See also Detection limit and
Quantitation limit.

Method  detection  limit  studies — A  statistical  determination that  defines the  minimum
concentration of a substance that can be measured and reported with 99 percent confidence that the
analyte concentration is greater than zero.

Must — When used in a sentence, a term denoting a requirement that has to be met.

Nonconformance — A deficiency in a characteristic, documentation, or a procedure that renders the
quality  of an item or activity  unacceptable  or indeterminate; nonfulfillment of a  specified
requirement.

Observation — An assessment conclusion that identifies a condition (either positive or negative) that
does not represent a significant effect on an item or activity. An observation may identify a condition
that has not yet caused a degradation of quality.

Organization — A public or private company, corporation, firm, enterprise, or institution, or part
thereof, whether incorporated or not, that has its own functions and administration.3

Outlier — A  data point that is shown to have a low probability of belonging to a specified data
population.

Parameter — A quantity, usually unknown, such as a mean or a standard deviation characterizing
a population. Parameter is commonly misused for variable, characteristic, or property.

Precision — The degree to which  a set of observations or measurements of the  same property,
obtained under similar conditions, conform to themselves. Precision is usually expressed as standard
deviation, variance, or range, in either absolute or relative terms. Examples of QC measures for
precision include field duplicates, laboratory duplicates, analytical replicates, and internal standards.

Procedure — A specified way to perform an activity.

Process — A set of interrelated resources and activities that transforms inputs into outputs. Examples
of processes include analysis, design, data collection, operation, fabrication, and calculation.

Proficiency testing  (PT) sample — A sample, the composition of which  is unknown to  the
laboratory or analyst, which is provided to that laboratory or analyst to assess capability to produce
results within  acceptable criteria. PT  samples can fall into three categories: (1) prequalification,
conducted prior to a laboratory beginning project work, to establish initial proficiency; (2) periodic
(e.g., quarterly, monthly, or episodic), to establish ongoing laboratory proficiency; and (3) batch-
specific, which is  conducted simultaneously with analysis of a sample batch. A PT sample is
sometimes called a performance evaluation  sample.

Proficiency testing sample, ampulated — A PT sample that is received as a concentrate and must
be diluted to volume before being treated as an analytical sample. It can only be single blind.

Proficiency testing sample, full volume — A PT sample that is received by the laboratory ready to
be treated as an analytical sample. It does not require dilution and therefore can be single or double
blind.
 3When used in this Manual, the term is not limited to entities within a Federal Agency, as it is in the UFP-QS.

Proficiency testing sample, site-specific — A PT  sample created  using a well-characterized
contaminated matrix and treated as an analytical sample by the laboratory to test its capabilities.

Project — An organized set of activities within a program.

Project quality objectives (PQOs) — Qualitative and quantitative  statements derived from  a
Systematic Planning Process (e.g., EPA QA/G-4 DQO process) that clarify study objectives, define
the appropriate type of data, and specify tolerable levels of potential decision errors. PQOs will be
used as the basis for establishing the quality and quantity of data needed to support decisions.

Project quantitation limit — The lowest concentration or amount of the target analyte required to
be reported from a data collection project.

Quality — The totality of features and characteristics of a product or service that bears on its ability
to meet the stated or implied needs and expectations of the user.

Quality  assurance — An integrated system  of management activities involving planning,
implementation, assessment, reporting, and quality improvement to ensure that a process,  item, or
service is of the type and quality needed and expected by the client.

Quality assurance project plan (QAPP) — A formal document describing in comprehensive detail
the necessary quality assurance (QA), quality control (QC), and other technical activities that must
be implemented to ensure that the results of the work performed will satisfy the stated performance
criteria.

Quality  control — The  overall system  of technical  activities that measures the attributes and
performance of a process, item, or service against defined standards to verify that they meet the stated
requirements established by the customer; operational techniques and activities that are used to fulfill
requirements for quality; also the system of activities and checks used to ensure that measurement
systems  are maintained within prescribed limits, providing protection against "out of control"
conditions and ensuring that the results are of acceptable quality.

Quality  control sample  — One of any  number of samples, such as a PT sample, intended to
demonstrate that a measurement system or activity is in control.

Quality management — That aspect of the overall management system of the organization that
determines and implements the quality policy. Quality management includes strategic planning,
allocation of resources,  and other systematic activities  (e.g., planning, implementation, and
assessment) pertaining to the quality system.

Quality Management Plan — A formal document that describes the quality system in terms of the
organization's structure,  the  functional responsibilities  of management and  staff, the lines  of
authority, and the required interfaces for those planning, implementing, and assessing all activities
conducted.

Quality  system — A structured and documented management system describing the  policies,
objectives, principles, organizational authority, responsibilities, accountability, and implementation
plan of an organization for ensuring quality in its work processes, products (items), and services. The
quality system provides the framework for planning, implementing, and assessing work performed
by the organization and for carrying out required quality assurance (QA) and quality control (QC)
activities.

Quantitation limit — The minimum concentration of an analyte or category of analytes in a specific
matrix that can be identified and quantified above the method detection limit and within specified
limits of precision and bias during routine analytical operating conditions.

Raw  data —  The documentation generated during sampling and analysis. This documentation
includes, but is not limited to, field notes, hard copies of electronic data, magnetic tapes, untabulated
sample results, QC sample results, printouts of chromatograms, instrument outputs, and handwritten
notes.

Readiness review — A systematic, documented review of the readiness for the start-up or continued
use of a facility, process,  or activity. Readiness reviews are typically conducted before proceeding
beyond project milestones and prior to initiation of a major phase of work.

Reagent blank — An aliquot of water or solvent free of measurable contaminants analyzed with the
analytical batch and containing all the reagents in the same volume as used in the processing of the
samples. The method blank goes through preparatory steps; the reagent blank does not.

Record (quality) — A document that furnishes objective evidence  of the quality of products,
services, or activities and that has been verified and authenticated as technically complete and correct.
Records may include photographs, drawings, magnetic tape, and other data recording media.

Recovery — A measure of bias. Typically, a known concentration of analyte is spiked into an aliquot
of sample. Both the spiked aliquot and an unspiked aliquot of sample are analyzed and the percent
recovery is calculated.
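
For illustration only (not part of the original definition), the percent recovery calculation can be
sketched as follows; the function and variable names are assumptions chosen for readability.

    # Illustrative sketch: percent recovery for a spiked sample.
    def percent_recovery(spiked_result, unspiked_result, amount_spiked):
        """Return the portion of the known spike amount that was measured, as a percentage."""
        return (spiked_result - unspiked_result) / amount_spiked * 100.0

    # Example: 9.2 ug/L in the spiked aliquot, 1.0 ug/L in the unspiked aliquot,
    # with 10.0 ug/L spiked, gives (9.2 - 1.0) / 10.0 * 100 = 82% recovery.
    print(round(percent_recovery(9.2, 1.0, 10.0), 1))  # 82.0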

Relative percent difference (RPD) — A unit-free measure of precision between duplicate analyses.
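
For illustration only, the common formulation of RPD (the absolute difference between two duplicate
results expressed as a percentage of their mean) can be sketched as follows; names are assumed.

    # Illustrative sketch: relative percent difference between duplicate results.
    def relative_percent_difference(result_1, result_2):
        """Absolute difference expressed as a percentage of the mean of the two results."""
        mean = (result_1 + result_2) / 2.0
        return abs(result_1 - result_2) / mean * 100.0

    # Example: duplicate results of 10 and 12 give 2 / 11 * 100, or about 18.2%.
    print(round(relative_percent_difference(10.0, 12.0), 1))  # 18.2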

Relative standard deviation (RSD) — A unit-free measure of precision or variability. The RSD is
also known as the Coefficient of Variation (CV) which is the standard deviation expressed as a
percentage of the mean.
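
For illustration only, the RSD (or CV) calculation can be sketched as follows, using the sample
standard deviation and mean of a set of replicate results; names are assumed.

    # Illustrative sketch: relative standard deviation (coefficient of variation) of replicates.
    import statistics

    def relative_standard_deviation(results):
        """Sample standard deviation expressed as a percentage of the mean."""
        return statistics.stdev(results) / statistics.mean(results) * 100.0

    # Example: replicates 9.8, 10.1, and 10.4 have a mean of 10.1 and a standard
    # deviation of 0.3, so the RSD is about 3.0%.
    print(round(relative_standard_deviation([9.8, 10.1, 10.4]), 1))  # 3.0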

Remediation — The process of reducing the concentration of a contaminant (or contaminants) in air,
water, or soil matrices to a level that poses an acceptable risk to human health.

Replicate samples — Multiple duplicate samples.

Representativeness — A measure of the degree to which data accurately and precisely represent a
characteristic of a population, a parameter variation at a sampling point, a process condition, or an
environmental condition.

Reproducibility — The precision, usually expressed as variance, that measures the variability among
the results of measurements of the same sample at different laboratories.

Requirement — A formal statement of a need and the expected manner in which it is to be met;
documented statements that specify activities that must be done; the mandated activities.

Sample quantitation limit (SQL) — The quantitation limit adjusted for dilutions, changes in sample
volume or size, extract and digestate volumes, percent solids, and cleanup procedures.
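
The adjustment described above is arithmetic. The sketch below illustrates how such sample-specific
factors might be applied; the factors shown (dilution, sample size relative to a nominal size, and
percent solids) and the form of the adjustment are assumptions for illustration, not a prescribed
formula.

    # Illustrative sketch only: scaling a quantitation limit by sample-specific factors.
    def sample_quantitation_limit(method_ql, dilution_factor=1.0,
                                  nominal_mass_g=30.0, actual_mass_g=30.0,
                                  fraction_solids=1.0):
        """Scale a method quantitation limit for dilution, sample size, and percent solids."""
        return (method_ql * dilution_factor
                * (nominal_mass_g / actual_mass_g)
                / fraction_solids)

    # Example: a 5x dilution of a soil sample reported at 80% solids raises a
    # 0.5 mg/kg quantitation limit to 0.5 * 5 / 0.8 = 3.125 mg/kg.
    print(sample_quantitation_limit(0.5, dilution_factor=5.0, fraction_solids=0.8))  # 3.125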

Scientific method — The principles and processes regarded as necessary for scientific investigation,
including rules for formulation of a concept or hypothesis, conduct of experiments,  and validation
of hypotheses by analysis of observations.

Screening  data — Analytical data of known quality, concentration, and level of uncertainty. The
levels of quality and uncertainty of the analytical data are consistent with the requirements for the
decision to be made. Screening data are of sufficient quality to support an intermediate or preliminary
decision but must eventually be supported by definitive data before a project is complete.

Secondary Data — Data not originally collected for the purpose for which they are now being used.
In addition, the level of QA/QC provided at the time of the original data collection may be unknown.

Self-assessment — The assessments of work conducted by individuals, groups, or organizations
directly responsible for overseeing or performing the  work.

Sensitivity — The capability of a test method or instrument to discriminate between measurement
responses representing different levels (e.g., concentrations) of a variable of interest. Examples of QC
measures for determining sensitivity include laboratory-fortified blanks, a method detection limit
study, and initial calibration low standards at the quantitation limit.
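
As one familiar example of a method detection limit study, the replicate-spike calculation multiplies
the standard deviation of low-level replicate results by a one-tailed 99 percent Student's t value.
The sketch below illustrates only the arithmetic and is not a substitute for the applicable
procedure; the example values are assumed.

    # Illustrative sketch: MDL from replicate low-level spiked results.
    import statistics

    def method_detection_limit(replicate_results, t_99):
        """Student's t (99%, n - 1 degrees of freedom) times the standard deviation of replicates."""
        return t_99 * statistics.stdev(replicate_results)

    # Example: seven replicates (t = 3.143) with a standard deviation of about
    # 0.052 ug/L give an MDL of roughly 0.16 ug/L.
    replicates = [0.48, 0.55, 0.52, 0.60, 0.45, 0.51, 0.57]
    print(round(method_detection_limit(replicates, t_99=3.143), 2))  # 0.16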

Service — The result generated by activities at the interface between the supplier and the customer;
the supplier's internal activities to meet customer needs. Such activities in environmental programs
include design, inspection, laboratory and/or field analysis, repair, and installation.

Shipping container temperature blank — A container of water designed to evaluate whether samples
were adequately cooled during sample shipment.

Specification — A document stating requirements and referring to or including drawings or other
relevant  documents.  Specifications should  indicate the means and criteria  for  determining
conformance.

Spike — A substance that is added to an environmental sample to increase the concentration of target
analytes by known amounts. A spike is used to assess measurement accuracy (spike recovery). Spike
duplicates are used to assess measurement precision.

Split sample — Two or more representative portions taken from a sample in the field or laboratory,
analyzed by  at least two different laboratories and/or methods. Prior to splitting, a sample is mixed
(except volatiles,  oil and grease, or when otherwise directed) to minimize sample heterogeneity.
These  are quality control  samples used  to assess precision, variability,  and data comparability
between different laboratories. (Split samples should be used when accompanied by a PT sample.)

Standard deviation  — A measure of the dispersion or imprecision  of a sample or population
distribution;  expressed as the positive square root of the variance, with the same unit of measurement
as the mean.
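
Purely as an illustration of the relationship between the variance and the standard deviation, the
sample statistics (using n - 1 degrees of freedom) can be sketched as follows; names are assumed.

    # Illustrative sketch: sample variance and its positive square root, the standard deviation.
    import math

    def sample_variance(results):
        """Average squared deviation from the mean, using n - 1 degrees of freedom."""
        mean = sum(results) / len(results)
        return sum((x - mean) ** 2 for x in results) / (len(results) - 1)

    def sample_standard_deviation(results):
        """Positive square root of the sample variance, in the same units as the mean."""
        return math.sqrt(sample_variance(results))

    # Example: results of 8, 10, and 12 have a variance of 4 and a standard deviation of 2.
    print(sample_standard_deviation([8.0, 10.0, 12.0]))  # 2.0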

Standard Operating Procedures (SOPs) — A written document that details the method for an
operation, analysis, or action, with thoroughly prescribed techniques and steps.  SOPs are officially
approved as  the methods for performing certain routine or repetitive tasks.

Storage blank — A sample composed of water free of measurable contaminants and stored with a
sample set in the same kind of sample container. Storage begins upon receipt of sample shipment at
the laboratory. The storage blank is analyzed at the end of the sample storage period to assess cross-
contamination occurring during sample  storage (typically  analyzed only for volatile  organic
compounds).

Supplier — Any individual  or organization furnishing items or services or performing work
according to a procurement document or a financial assistance agreement. Supplier is an all-inclusive
term used in place of any of the following: vendor, seller, contractor, subcontractor, fabricator, or
consultant.

Surrogate spike or analyte — A pure substance with properties that mimic the analyte of interest
(organics only). Surrogates are brominated, fluorinated, or isotopically labeled compounds unlikely
to be found in environmental  samples. These analytes are added to samples to evaluate analytical
efficiency by measuring recovery.

Systematic planning process — Systematic planning is a process that is based on the scientific
method and includes concepts such as objectivity of approach and acceptability of results. Systematic
planning is based on a common sense, graded approach to ensure that the level of detail in planning
is commensurate with the importance and intended use of the work and the available resources. This
framework promotes communication among all organizations  and individuals involved  in an
environmental program. Through a  systematic planning process, a team can develop acceptance or
performance criteria for the quality  of the data collected and for the quality of the decision.

Target analytes — The project-specific list of analytes for which laboratory analysis is required;
sometimes used interchangeably with contaminants of concern.

Technical Systems Audit (TSA) — A thorough, systematic, on-site  qualitative audit of facilities,
equipment, personnel, training, procedures, record-keeping, data validation, data management, and
reporting aspects of a system.

Traceability — The ability to trace the history, application,  or location of an  entity by means of
recorded identifications. In a calibration sense, traceability relates measuring equipment to national
or international standards, primary  standards, basic physical  constants or properties,  or reference
materials.  In a data collection sense, it relates calculations and data generated throughout the project
back to the requirements for the quality of the project.

Trip blank — A clean sample of water free of measurable contaminants that is taken to the sampling
site  and transported to the laboratory for analysis without having been exposed  to sampling
procedures. Trip blanks are analyzed to assess whether contamination was introduced during sample
shipment (typically analyzed for volatile organic compounds only).

Usability assessment — Evaluation of data based upon the results of data validation and verification
for the decisions being made.  In the usability step, reviewers assess whether the process execution
and resulting data meet quality objectives based on criteria established in the QAPP.

Validation — Confirmation by examination and provision of objective evidence that the particular
requirements for a specific intended use are fulfilled. Data validation is a sampling and analytical
process evaluation that includes evaluating compliance with methods, procedures, or contracts, and
comparison with criteria based upon the quality objectives developed in the project QAPP. The
purpose of data validation is to assess the performance associated with the sampling and analysis to
determine the quality of specified data.

Variance (statistical) — A measure of the dispersion of a sample or population distribution.

Verification — Confirmation by examination and provision of objective evidence that the specified
requirements (sampling and analytical) have been completed; verification is essentially a completeness check.

              APPENDIX A - STANDARD OPERATING PROCEDURES

As described in Section 3.0, Measurement and Data Acquisition Elements, all sampling and analysis
procedures that will be used in the project must be documented in the QAPP or SOPs provided with
or referenced in the QAPP. This appendix provides examples of SOP types and additional detail on
their content.

A.1    Sampling Procedures

A.1.1  Sample Collection SOPs

Examples of sample collection  SOPs include, but are not limited to, the following:

•• Low Stress (low flow) Purging and Sampling Procedure for the Collection of Ground Water
   Samples from Monitoring Wells
•• SOPs for Soil Sampling during Monitoring Well Installation
•• Sampling SOPs for Surface and Subsurface Soils
•• SOPs for the Collection of Sediments
•• SOPs for the Collection of Surface Water Samples from Lakes, Ponds, and Streams
•• SOPs for the Collection of Drinking Water from Residential Homes
•• Sampling SOPs for Ambient Air, Stack Gases, and Soil Gas
•• SOPs for Collection of Samples from Waste Storage Tanks and Waste Drums
•• Sample  Compositing SOPs
•• Split Sampling SOPs

A.1.2  Equipment Cleaning and Decontamination SOPs

A.1.2.1 Equipment Cleaning SOPs

SOPs for equipment cleaning may be attached to or referenced in the QAPP. They should also be
listed on the sampling SOP reference table (Figure 22 in Section 3.1.2). Initial equipment cleaning
should address:

•• How equipment will be cleaned prior to field activities
•• Frequency at which equipment will undergo full cleaning protocols
•• Criteria for measuring cleanliness

If precleaned bottles are used, the QAPP should identify the vendor and describe where the
certificates of cleanliness will be maintained.

A.1.2.2 Equipment Decontamination SOPs

SOPs for equipment decontamination should be attached to or referenced in the QAPP. They should
also be listed on the sampling SOP reference table (Figure 22). Decontamination procedures for each
type of equipment should address:

••  How equipment will be decontaminated in the field
••  Frequency at which equipment will be decontaminated
••  Criteria for measuring the effectiveness of the decontamination procedures
••  Disposal of decontamination by-products, if applicable

A.1.3  Inspection and Acceptance SOPs for Supplies and Sample Containers

SOPs for inspection and acceptance of supplies and sample containers should include the following:

••  Itemization of the supplies and sample containers  that will be used when performing field
    activities, including sampling activities
••  List of all supply and sample container vendors
••  Description of the procedures that will be used to ensure that adequate supplies and sample
    containers are on hand and that sample containers are traceable and clean
••  Procedures for tracking, storing, and recording supplies and lot numbers for sample containers
••  Procedures for verifying container cleanliness, such as bottle blank analysis
••  Frequency of inspection activities, acceptance criteria, and corrective action procedures employed
    to prevent the use of unacceptable supplies or sample containers
••  List of personnel  responsible for checking supplies, sample containers, and sample container
    certificates of cleanliness, by job function and organizational affiliation, and the personnel
    responsible for implementing corrective actions (If information is in an SOP, the SOP reference
    number should be cited and the SOP attached to the QAPP)

A.1.4  Field Documentation SOPs

The following information should be included in the field logbooks, field data collection forms, or
electronic data instruments, if applicable:

••  Site name and location
••  Sample identification number
••  Names, job functions, and organizational affiliations of personnel on-site
••  Dates (month/day/year) and times (military) of all entries made in logbooks/forms
••  User signatures
••  Descriptions of all site activities, including site entry and exit times
••  Site location by longitude and latitude, if known
••  Weather conditions, including temperature and relative humidity
••  Site observations
••  Identification and description of sample morphology and collection locations
••  Sample collection information, including dates (month/day/year) and times (military) of sample
    collections, sample collection methods and devices, station location numbers, sample collection
    depths/heights, sample preservation information, sample pH (if applicable), analysis requested
    (analytical groups), etc., as well as chain-of-custody information  such as  sample  location
    identification numbers cross-referenced to field sample numbers
••  Laboratories receiving samples and shipping information, such as carrier, shipment time, number
    of sample containers shipped, and analyses requested
••  Contractor and subcontractor  information  (address,  names  of personnel,  job  functions,
    organizational affiliations, contract number, contract name, and work assignment number)
••  Records of photographs taken
••  Site sketches and diagrams made on-site

Because field information is matrix- and procedure-dependent, the information that will be recorded
should  be  described for each matrix  and  each  type of sampling  procedure.  For example,
documentation of monitoring well sampling should include screen interval, pump intake, purge rate,
purge volume, temperature, relative humidity, specific conductance, pH, redox potential, dissolved
oxygen, and turbidity. For a soil boring, the documented field information should include drilling
method, borehole diameter, ground elevation, and water level; soil descriptions should be in
accordance with the Unified Soil Classification System or applicable ASTM procedures.

A.2   Analytical SOPs

Examples of analytical SOPs include, but are not limited to, the following:

••  On-site Analytical SOPs
••  Off-site Laboratory  SOPs
••  Sample Preparation SOPs
••  Glassware Cleaning SOPs
••  Calibration SOPs
••  Maintenance, Testing, and Inspection Activities SOPs
••  Analytical Standards Preparation and Traceability SOPs
••  Data Reduction Procedures
••  Documentation Policies/Procedures
••  Data Review Procedures
••  Data Management Procedures
••  Sample and Sample Extract/Digestate Disposal SOPs

Calibration procedures may be documented separately in the QAPP or included in the appropriate
analytical SOPs as attachments to the QAPP. In either case, the following items, where appropriate,
must be addressed for each analytical procedure (an illustrative calculation sketch follows this list):

••  Frequency of initial and continuing calibrations
••  Number of calibration points, calibration levels for multipoint curves, and calibration standards
    at the required quantitation limit concentration for each target analyte/contaminant of concern
••  Linearity calculation techniques
•• Acceptance criteria for calibrations
•• Calibration level for calibration verification standards (To assess instrument drift, a calibration
   verification standard should be run periodically during the analytical sequence and at the end of
   the analytical sequence.)
•• Corrective actions for nonconformances
•• Calibration and standards documentation, including a description of what documentation will be
   generated for calibrations and standards for each instrument (A plot for each regression curve
   should be provided for all nonlinear curves that will be used to quantitate field samples.)
•• A description of the procedures to be used to ensure traceability of standards (Standards must be
   traceable to a verifiable source such as a NIST standard, if applicable.)
•• A description of the use of second source verification standards
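
The sketch below is an illustration only of two of the calculations listed above: a least-squares
linear calibration with its correlation coefficient, and the percent difference of a calibration
verification standard from its true value. The example data and the acceptance limit mentioned in
the comments are assumptions, not requirements of this manual. (Requires Python 3.10 or later.)

    # Illustrative sketch: linear calibration fit and calibration verification drift check.
    import statistics

    def linear_calibration(concentrations, responses):
        """Return (slope, intercept, correlation coefficient) for a least-squares linear fit."""
        slope, intercept = statistics.linear_regression(concentrations, responses)
        return slope, intercept, statistics.correlation(concentrations, responses)

    def verification_percent_difference(measured, true_value):
        """Percent difference of a calibration verification result from its true value."""
        return (measured - true_value) / true_value * 100.0

    conc = [1.0, 5.0, 10.0, 20.0, 50.0]            # standard concentrations (example values)
    resp = [102.0, 498.0, 1010.0, 1985.0, 5020.0]  # instrument responses (example values)
    slope, intercept, r = linear_calibration(conc, resp)
    print(round(r, 4))                                            # linearity check, e.g., ~1.0
    print(round(verification_percent_difference(10.8, 10.0), 1))  # 8.0, within an assumed +/-20% limit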

A.3    Sample Collection Documentation, Handling, Tracking, and Custody SOPs

Examples of sample collection documentation, handling, tracking, and custody SOPs include, but are
not limited to, the following:

•• Field Documentation SOPs and Records Management SOPs
•• Sample Custody/Sample Security SOPs (field sampling)
•• Sample Handling and Tracking SOPs (field sampling)
•• Sample Packaging and Shipping SOPs (field sampling)
•• Sample Receipt and Storage SOPs (laboratory analysis)
•• Sample Custody/Sample Security SOPs (laboratory analysis)
•• Sample Tracking SOPs (laboratory analysis)
•• Sample Disposal or Archiving SOPs (laboratory analysis)

A.3.1  Sample Container Identification

Sample containers should be identified with the following minimum information:

•• Site name and location
•• Sample identification number
•• Sample collection location and depth/height
•• Collection date (month/day/year) and time (military)
•• Sample collection method (composite or grab) and device
• • Sample preservation method (chemical or physical, such as ice; indicate if sample must be light-
   protected)
•• Sample pH, if applicable
•• Analysis requested (analytical group)
•• Sampler's signature

A.3.2  Sample Handling Procedures

Figure A-1 (QAPP Worksheet #26) provides an example of the Sample Handling System table that
shows the flow of samples from the time of collection to laboratory delivery to final sample disposal.

 SAMPLE COLLECTION, PACKAGING, AND SHIPMENT
 Sample Collection (Personnel/Organization):
 Sample Packaging (Personnel/Organization):
 Coordination of Shipment (Personnel/Organization):
 Type of Shipment/Carrier:
 SAMPLE RECEIPT AND ANALYSIS
 Sample Receipt (Personnel/Organization):
 Sample Custody and Storage (Personnel/Organization):
 Sample Preparation (Personnel/Organization):
 Sample Determinative Analysis (Personnel/Organization):
 SAMPLE ARCHIVING
 Field Sample Storage (No. of days from sample collection):
 Sample Extract/Digestate Storage (No. of days from extraction/digestion):
 Biological Sample Storage (No. of days from sample collection):
 Personnel/Organization:
 Number of Days from Analysis:
                             Figure A-1. Sample Handling System
                                    (QAPP Worksheet #26)