Environmental Technology Verification Program
Site Characterization and Monitoring Technologies Pilot
Oak Ridge National Laboratory and Sandia National Laboratories
Quality Management Plan

Prepared by: Oak Ridge National Laboratory, Chemical and Analytical Sciences Division, and Sandia National Laboratories, Environmental Characterization and Monitoring Department

Prepared for: Eric Koglin, Pilot Manager, U.S. Environmental Protection Agency, Environmental Sciences Division, National Exposure Research Laboratory, Las Vegas, Nevada 89193-3478

APPROVAL SIGNATURES

The signatures of the individuals below indicate concurrence with, and agreement to operate in compliance with, the procedures specified in this document.

U.S. ENVIRONMENTAL PROTECTION AGENCY
Pilot Manager: Eric Koglin / Date
ESD Quality Manager: George Brilis / Date

OAK RIDGE NATIONAL LABORATORY
Program Manager: Roger Jenkins / Date
Technical Lead: Amy Dindal / Date
QA Specialist: Janet Wagner / Date
Head, Research Support Section: Fred Smith / Date
CASD Division Director: Marv Poutsma / Date

SANDIA NATIONAL LABORATORIES
Program Manager: Wayne Einfeld / Date
QA Specialist: Tom Burford / Date
Department Manager: Dan Horschel / Date

Oak Ridge National Laboratory, Chemical and Analytical Sciences Division, and Sandia National Laboratories, Environmental Characterization and Monitoring Department
QMP-X-98-CASD-001, Rev.
0
Date: November 24, 1998
Title: Quality Management Plan for the Environmental Technology Verification Program's Site Characterization and Monitoring Technologies Pilot

TABLE OF CONTENTS

ACRONYMS AND ABBREVIATIONS
INTRODUCTION
1 Quality Management System Overview
1.1 Policy Statement
1.1.1 ORNL
1.1.2 SNL
1.2 Purpose
1.3 Scope
2 Organization
2.1 Organizational Structure
2.2 Roles and Responsibilities
Figure 1: Organizational chart for ORNL SCMT activities
Figure 2: Organizational chart for SNL SCMT activities
2.3 Personnel Qualification and Training
2.4 Customer Needs, Expectations, and Satisfaction
3 Procurement of Items and Services
4 Documents and Records
5 Computer Hardware and Software
6 Planning
7 Implementation of Work Processes
7.1 Design and Implementation of Technology Verification Tests
7.2 Assessment and Verification of Reference Laboratory Data Usability
7.3 Data Quality Indicators
7.4 Existing Data
8 Assessment and Response
8.1 Technical systems audit
8.2 Performance evaluation audit
8.3 Audits of data quality
8.4 Surveillance of technology performance
8.5 External Peer Review
9 Quality Improvement
10 References
ACRONYMS AND ABBREVIATIONS

CASD - Chemical and Analytical Sciences Division
EPA - U.S. Environmental Protection Agency
ESD - Environmental Sciences Division
ETV - Environmental Technology Verification program
ETVR - Environmental Technology Verification Report
ETVS - Environmental Technology Verification Statement
HASP - Health and Safety Plan
NERL - National Exposure Research Laboratory (EPA)
ORNL - Oak Ridge National Laboratory
PARCC - precision, accuracy, representativeness, completeness, and comparability
PE - performance evaluation sample
QA - quality assurance
QAPP - Quality Assurance Project Plan
QAS - quality assurance specialist
QC - quality control
QMP - quality management plan
SARA - Superfund Amendments and Reauthorization Act
SCMT - Site Characterization and Monitoring Technologies Pilot (ETV)
SITE - Superfund Innovative Technology Evaluation
SNL - Sandia National Laboratories

INTRODUCTION

The performance evaluation of innovative and alternative environmental technologies is an integral part of the U.S. Environmental Protection Agency's (EPA's) mission. Early efforts focused on evaluating technologies that supported the implementation of the Clean Air and Clean Water Acts. In 1987, the Agency began to evaluate the cost and performance of remediation and monitoring technologies under the Superfund Innovative Technology Evaluation (SITE) program. This was in response to the mandate in the Superfund Amendments and Reauthorization Act (SARA) of 1986.
In 1990, the U.S. Technology Policy was announced. This policy placed a renewed emphasis on "making the best use of technology in achieving the national goals of improved quality of life for all Americans, continued economic growth, and national security." In the spirit of the Technology Policy, the Agency began to direct a portion of its resources toward the promotion, recognition, acceptance, and use of U.S.-developed innovative environmental technologies both domestically and abroad.

The Environmental Technology Verification (ETV) Program was created by the Agency to facilitate the deployment of innovative technologies through performance verification and information dissemination. The goal of the ETV Program is to further environmental protection by substantially accelerating the acceptance and use of improved and cost-effective technologies. The ETV Program is intended to assist and inform those involved in the design, distribution, permitting, and purchase of environmental technologies. The ETV Program capitalizes upon and applies the lessons that were learned in the implementation of the SITE Program to the verification of twelve categories of environmental technology: Drinking Water Systems, Pollution Prevention/Waste Treatment, Pollution Prevention/Innovative Coatings and Coatings Equipment, Indoor Air Products, Air Pollution Control, Advanced Monitoring Systems, EvTEC (an independent, private-sector approach), Wet Weather Flow Technologies, Pollution Prevention/Metal Finishing, Source Water Protection Technologies, Site Characterization and Monitoring Technologies (SCMT), and Climate Change Technologies.

For each pilot, EPA utilizes the expertise of partner "verification organizations" to design efficient procedures for conducting performance tests of environmental technologies. To date, EPA has partnered with federal laboratories and state, university, and private sector entities.
Verification organizations oversee and report verification activities based on testing and quality assurance protocols developed with input from all major stakeholder/customer groups associated with the technology area. Two verification organizations are utilized for the SCMT pilot, which is managed by EPA's National Exposure Research Laboratory's (NERL) Environmental Sciences Division (ESD): Sandia National Laboratories (SNL) and Oak Ridge National Laboratory (ORNL). The purpose of this plan is to document the quality assurance (QA) and quality control (QC) activities of these verification organizations performed under the auspices of the ETV program. This plan, which is modeled after the EPA's required quality system for cooperative agreements (American National Standard Specifications and Guidelines for Quality Systems for Environmental Data Collection and Environmental Technology Programs, ANSI/ASQC E4-1994) and EPA/600/R-98/064, Environmental Technology Verification Program Quality and Management Plan for the Pilot Period, has been developed to ensure that the program meets these quality objectives.

1 Quality Management System Overview

1.1 Policy Statement

1.1.1 ORNL: The Sampling and Analysis Group, a group within the Organic Chemistry Section of the Chemical and Analytical Sciences Division (CASD), conducts applied research on the development and application of sampling and analytical methods for the determination of toxic species in complex matrices, with special emphasis on airborne and environmental mixtures.
With the assistance of other ORNL organizations, the ORNL SCMT pilot staff serves as one of the verification organizations in the Site Characterization and Monitoring Technologies (SCMT) Pilot of EPA's Environmental Technology Verification (ETV) Program. This policy for operation is in congruence with the statement of policy described in the Chemical and Analytical Sciences Division Quality Assurance Plan, QAP-X-94-CASD-001, which states that the Division will maintain a cost-effective, graded-level QA program, appropriate to its operating style, that will aid in assuring reliable and efficient operation of all facilities.

1.1.2 SNL: It is the policy of the Environmental Characterization and Monitoring Department to ensure that all work achieves its intended objectives and that all activities are adequately documented so that they can later be reproduced. To accomplish this, management will define requirements to meet objectives; properly train, motivate, and empower personnel; provide appropriate resources and budget; and assess performance to ensure the requisite quality of products and services. Application of quality assurance principles is intended to instill a culture in which there is a commitment to achieve a rising standard of excellence. Quality assurance management controls shall reflect anticipated risks (both the probability and the consequences) of an event that could adversely affect quality, safety, health, or the environment. Achievement of quality is a personal responsibility wherein each individual is independently accountable for the quality of his or her work. Although they may delegate quality management functions to staff, management retains the ultimate responsibility for the success of the quality system. The principles and practices specified in this document apply to all personnel and to all aspects of activities performed for the Environmental Characterization and Monitoring Department.
This policy for operation is in congruence with the statement of policy described in SNL's PR9202C, Implementation Plan for DOE 5700.6C and DOE 5480.19.

1.2 Purpose

The purpose of this plan is to describe the quality management elements of the verification testing performed by ORNL and SNL under the SCMT pilot of ETV.

1.3 Scope

This quality management plan (QMP) supplements the minimum quality assurance requirements described in ORNL's QAP-X-94-CASD-001 for the activities of the SCMT pilot. Requirements implemented by this QMP include those identified in ORNL-QA-P01, ORNL Quality Assurance Program; ETVP-QA-5.1, Sandia National Laboratories Quality Assurance Procedure, Document Control; ETVP-QA-5.2, Sandia National Laboratories Quality Assurance Procedure, Records; applicable ORNL and SNL plans and procedures; and EPA/600/R-98/064, Environmental Technology Verification Program Quality and Management Plan for the Pilot Period (1995-2000). The preeminent document that outlines the specific quality system requirements for each verification test is the technology demonstration1 plan.

2 Organization

2.1 Organizational Structure

Figure 1 presents an organizational chart for the execution of verification testing at ORNL, consisting of a program manager, a technical lead, and other ORNL staff (such as a quality assurance specialist and a statistician). The overall division organization is shown in QAP-X-94-CASD-001. Figure 2 presents SNL's organizational chart.
The organizational structure ensures that management controls are established, levels of accountability and authority are defined, responsibilities are assigned, and lines of communication are identified.

2.2 Roles and Responsibilities

Functional responsibilities of the staff involved in verification testing are described in this section.

2.2.1 SCMT program manager - The overall responsibility for verification activities and reporting belongs to the program manager. The program manager ensures that staff members receive training required by the quality assurance programs; approves staff members for performing the various aspects of the verification process; approves technology demonstration plans; investigates and reports irreparable quality failures as identified by the QA specialist; ensures proper handling of QA records; and reviews and approves verification reports and verification statements.

1 The term "demonstration" is used in this pilot to describe the actual verification testing process, which usually involves the evaluation of more than one technology.

Figure 1. Organizational chart for ORNL SCMT activities. [Chart elements: EPA/NERL/ESD; CASD Division Director; Organic Chemistry Section Head; Research Support Section Head; ORNL Office of Quality Services QAS; ORNL Computer Science and Mathematics Division Statistician; ETV Program Manager, ORNL SCMT Pilot; Technology Demonstration Technical Lead; ORNL Staff.]
Figure 2. Organizational chart for SNL SCMT activities. [Chart elements: EPA/NERL/ESD; Vice President, Energy, Information, and Infrastructure; Director, Energy and Critical Infrastructure; Department Manager, Environmental Monitoring and Characterization Department; ETV Program Manager, SNL SCMT Pilot; Technology Demonstration Technical Lead; Statistician; QAS; Demonstration Site Personnel; Technology Vendors.]

2.2.2 SCMT technical lead - The technical lead serves as the program manager during the latter's absence.
In addition, the technical lead has coordination and oversight responsibilities for the following: scheduling and coordinating the activities of all demonstration participants; preparing the demonstration plan, including developing a quality assurance project plan (QAPP) (Section 8 of the demonstration plan) and preparing a health and safety plan (HASP) (Section 10 of the demonstration plan) for the demonstration activities; acquiring the necessary reference analysis data; sampling activities (including collecting, homogenizing, dividing into replicates, bottling, labeling, and distributing); site access; characterization information for the site; other logistical information and support needed to coordinate access to the site for the field portion of the demonstration, such as waste disposal; auditing the on-site activities; managing, evaluating, interpreting, and reporting on the performance of the technologies; and providing information to the ETV webmasters. The technical lead will work with the program manager to identify the appropriate staff to assist with verification testing.

2.2.3 Statistician - The statistician will have the following responsibilities: assist with the development of the demonstration plan, particularly the experimental design; and assist with the evaluation of and reporting on the performance of the technologies.

2.2.4 Quality Assurance Specialist (QAS) - The QAS provides general QA support to the SCMT pilot staff. Responsibilities include but are not limited to: tracking nonconformance reports and corrective actions; communicating quality concerns to management; conducting independent assessments; providing QA training when feedback and corrective action reports indicate training is necessary; and reviewing and approving assessment reports.

2.3 Personnel Qualification and Training

It is the policy of the SCMT pilot to select qualified personnel on the basis of verifiable education and/or demonstrated proficiency in a given job field.
Facility and generic training will be carried out as described in ORNL's QAP-X-94-CASD-001, Chemical and Analytical Sciences Division Quality Assurance Plan, and SNL's CPR500.2.1, Sandia National Laboratories Procurement Manual.

2.4 Customer Needs, Expectations, and Satisfaction

For verification testing performed under ETV, SCMT pilot staff will fully implement this QMP unless exclusions are agreed upon between the ETV pilot manager and the program managers of each laboratory. The ETV pilot manager and the program manager may also agree on additional quality assurance requirements. Exclusions and/or additional requirements will be documented appropriately. A project-specific demonstration plan will identify customers, both internal and external, as well as their anticipated needs and expectations. As the demonstration progresses, the needs and expectations will be refined, as appropriate. The program manager of each laboratory shall document any refinements that occur based on meetings, discussions, participant recommendations, or any other source. Any refinements made must have the concurrence of the EPA pilot manager prior to any actions that have a direct bearing on meeting the pertinent expectations and needs. Should a situation arise where constraints of time, cost, or other problems could affect the capability to satisfy the identified program needs and expectations, the program manager will notify the EPA pilot manager and the ESD QA manager. The program manager will then attempt to negotiate an acceptable quality level for the work relative to the constraints.
If an agreement cannot be reached, the negotiation shall be elevated to the appropriate EPA branch chief or division director. No work is allowed unless an agreement can be negotiated.

3 Procurement of Items and Services

During verification testing, the most significant procurements are acquiring analytical services and subcontracting with external personnel (e.g., peer reviewers, technical experts). SCMT pilot staff will follow the procurement guidelines established in ORNL's QAP-X-94-CASD-001, Chemical and Analytical Sciences Division Quality Assurance Plan, and SNL's CPR 500.2.1, Sandia National Laboratories Procurement Manual.

4 Documents and Records

SCMT pilot staff will follow the records requirements outlined in QAP-X-94-CASD-001, Chemical and Analytical Sciences Division Quality Assurance Plan; ETVP-QA-5.1, Sandia National Laboratories Quality Assurance Procedure, Document Control; ETVP-QA-5.2, Sandia National Laboratories Quality Assurance Procedure, Records; and EPA/600/R-98/064, Environmental Technology Verification Program Quality and Management Plan for the Pilot Period (1995-2000). Compliance with these documents will be periodically verified by the QAS. The current minimum requirement is that the ETV records will be retained for seven years after the final payment of the interagency agreement.

5 Computer Hardware and Software

SCMT pilot staff will follow the computer hardware and software requirements outlined in EPA/600/R-98/064, Environmental Technology Verification Program Quality and Management Plan for the Pilot Period (1995-2000). SCMT pilot staff will submit information to the ETV webmaster via electronic mail, with copies to the EPA pilot manager. Posting of verification reports and verification statements will be approved by the EPA pilot manager prior to submission.

6 Planning

All work involved in the verification of field technologies will be planned and documented.
Each verification organization will follow the planning requirements outlined in EPA/600/R-98/064, Environmental Technology Verification Program Quality and Management Plan for the Pilot Period (1995-2000). The preeminent document for accomplishing this is the demonstration plan. As described in A Guidance Manual for the Preparation of Site Characterization and Monitoring Technology Demonstration Plan, a typical technology demonstration plan would have the following outline:

1.0 INTRODUCTION
2.0 DEMONSTRATION RESPONSIBILITIES AND COMMUNICATION
3.0 TECHNOLOGY DESCRIPTION
4.0 DEMONSTRATION SITE DESCRIPTIONS
5.0 CONFIRMATORY PROCESS
6.0 DEMONSTRATION DESIGN
7.0 FIELD OPERATIONS
8.0 QUALITY ASSURANCE PROJECT PLAN
9.0 DATA MANAGEMENT AND ASSESSMENT
10.0 HEALTH AND SAFETY PLAN

While most demonstration plans will follow a similar outline, many factors (such as contaminant class, technology type, and experimental design) can change the information that will be included in the plan. Therefore, the outline will be used only as a guide.

7 Implementation of Work Processes

The technology verification process is intended to serve as a template for conducting demonstrations that will generate high-quality data to verify technology performance. Each verification test is unique. Therefore, the requirements for each verification test will be specified in a technology demonstration plan. This plan will be developed according to A Guidance Manual for the Preparation of Site Characterization and Monitoring Technology Demonstration Plan.
The demonstration plan will be approved by the EPA pilot manager, the SCMT program manager, the technical lead, the QAS, and the vendor(s). Where possible, the information contained in the plan will be included in the verification report.

7.1 Design and Implementation of Technology Verification Tests

The following is a list of typical tasks that are necessary for verification testing. Due to the diverse nature of the technology areas that could be evaluated by this pilot, some of the tasks could be amended or not performed.

7.1.1 Technology Area Selection: With input from the SCMT Stakeholder group, the EPA pilot manager will identify technology areas that are in need of verification. SNL and ORNL will provide recommendations and assist in the selection process, as required.

7.1.2 Site Selection: SCMT pilot staff will visit potential sites or, at a minimum, discuss the site characteristics with site personnel. The site(s) will be selected based on relevance to the technology, the types of analytes present, the concentration range of the analytes, and geological conditions.
SCMT pilot staff will also consider the following: features of facilities (power availability, sample storage locations, water availability, building/trailer access); training required for site entry; availability of competent site contractor(s) who will be dedicated to meeting the needs of the demonstration; likely weather/environmental conditions during the demonstration period; mode of waste deposition; ease of sample collection, including radiological considerations; and access to the site (physical location relative to hotels, stores, and restaurants). The technology may be demonstrated at more than one site to test performance under varied geological and environmental conditions. Candidate sites will be identified prior to the initial vendor meeting.

7.1.3 Announcement of Verification Testing: The announcement of the verification testing will be published in the Commerce Business Daily, on the ETV web site, and in appropriate trade journals.

7.1.4 Vendor Solicitation: SCMT pilot staff will perform an independent literature search, utilizing the Internet and other resources, such as trade journals and major technical meetings (e.g., The Pittsburgh Conference), to identify potential participants. Interested vendors will inform the technical lead of their desire to participate. SCMT pilot staff will gather preliminary information concerning each potential participant. The potential vendors will be invited to attend an initial meeting at SNL, ORNL, or some other appropriate location. At the initial meeting, the vendors will learn about the verification process and the general nature of the demonstration. The vendors will also be briefed about the potential sites. The vendors will provide information regarding their technologies to assist the SCMT pilot staff in the vendor selection process.
7.1.5 Vendor Selection: After the initial vendor meeting, SCMT pilot staff and the EPA pilot manager will review the available technologies and select the vendors to participate in the demonstration. To receive verification, the technology must be commercially available, and it must fulfill the technology area need requested by EPA in Step 7.1.1. Additional selection criteria will include those stated in A Guidance Manual for the Preparation of Site Characterization and Monitoring Technology Demonstration Plan and performance on predemonstration samples. A letter of intent that clearly specifies the vendor's acceptance of participation in the demonstration must be signed by the vendor.

7.1.6 Development of Experimental Design and Technology Demonstration Plan: At the vendor meeting (or shortly thereafter), a draft demonstration plan will be distributed. SCMT project staff are responsible for preparing the document, which will include a QAPP and a HASP. The vendors are required to prepare a technology description for the plan. As necessary, the SCMT program manager will utilize subject matter experts, such as the QAS and project statisticians, to finalize the document. The vendors will also work with SCMT pilot staff by reviewing, revising, and providing feedback. The EPA pilot manager and the SCMT program manager will approve the final demonstration plan prior to initiation of the demonstration. The finalized plan will include a list of performance characteristics of the technology (e.g., accuracy, precision, sample throughput, and comparability with the reference laboratory results)
which will be reported in the verification report from the data collected during the field demonstration.

7.1.7 Reference Laboratory and Contingency Laboratory Selection: For some demonstrations, the field technology's performance will be compared to the performance of a reference analytical laboratory. If so, the reference laboratory selection will include the following steps: a predemonstration audit of the laboratory (e.g., a readiness review teleconference); evidence that the laboratory's analytical methods are proven; and successful analysis of predemonstration samples. The possibility of having an on-site field laboratory will be considered. However, the quality of the reference laboratory will supersede physical location as a priority in the selection process. A contingency laboratory will also be selected should the primary reference laboratory be unable to analyze the samples at the time of the demonstration. The contingency laboratory may also be asked to analyze samples if there is a large discrepancy between the field technology and reference laboratory analytical results. In cases where sample holding times are an issue, the contingency laboratory may be asked to analyze samples simultaneously with the reference laboratory. If such is the case, the number of samples that the contingency laboratory must analyze will be predetermined by the statistician.

7.1.8 Predemonstration Study: As appropriate, approximately six weeks prior to the demonstration, SCMT pilot staff may travel to the site to survey the site infrastructure and identify any problems that might be encountered during the actual demonstration. Predemonstration samples may be collected and distributed (shipped, if necessary) to the technology vendors and the reference laboratory to allow for an initial evaluation of matrix effects, establishment of the calibration range, and identification of instrumental difficulties.
Poor performance on the predemonstration samples may indicate that the technology should not participate. At this time, the technology vendor can choose to withdraw from participation in the demonstration if it feels such action is appropriate. The technology demonstration plan may be amended based on predemonstration sample results.

7.1.9 Perform Site Demonstration: Sufficient SCMT pilot staff will attend all or part of the demonstration. SCMT pilot staff will be available at the demonstration to assist the vendors, perform field audits of the technologies, and oversee the demonstration. The demonstration will be organized such that sufficient "buffer" time is incorporated into each day to allow for difficulties and problems. Acquisition, homogenization, and distribution of blind environmental samples will be performed by SCMT pilot staff or its designee. Performance evaluation (PE) samples, labeled as actual samples, may be given to the vendors for blind analysis during the demonstration. PE sample preparation will be conducted by a vendor whose procedures are documented and available for inspection by the QAS. The PE samples may also be prepared by the SCMT staff, using materials traceable to NIST. SCMT pilot staff are responsible for ensuring that all samples for the reference laboratory are properly preserved and are transported to the laboratory within established holding times. Meetings may be held daily to communicate changes, answer questions, and collect data. All vendor data will be due upon completion of the demonstration.
The vendor may be allowed to make justified changes to its data within a time interval specified in the demonstration plan. SCMT pilot staff may request an electronic version of results from the vendor, depending on the amount of data generated.

7.1.10 Audit of Reference Laboratory and Receipt of Results: At some time during the demonstration process, an on-site audit of the reference laboratory will be conducted. The audit may consist of observing the execution of analytical methods and reviewing QC results. Analytical results from the reference laboratory (and contingency laboratory, as necessary) will be in a report format requested by the SCMT pilot staff. The reference laboratory results will be validated by SCMT pilot staff or its designee (see Section 7.2).

7.1.11 Prepare Environmental Technology Verification Reports (ETVR) and Statements (ETVS): SCMT pilot staff will prepare an ETVR for each technology. The primary focus of the report will be the evaluation of the performance characteristics described in the demonstration plan. A preliminary ETVR will be distributed to the vendors and the EPA pilot manager for comment and review. A verification statement (ETVS) will be developed from each ETVR. The ETVS will be a 3- to 5-page summary of the results. The final ETVR and ETVS will be peer-reviewed by at least one reviewer who is external to the verification organization.

7.1.12 ETVR and ETVS Distribution: Once the final ETVR and ETVS have been prepared, the EPA pilot manager will have the documents approved by the appropriate EPA management. Following final EPA approval, the ETVR will be assigned an EPA publication number. Each ETVR and accompanying ETVS will be posted on the ETV web site. The EPA may also have copies of the reports printed for distribution. As requested by the EPA pilot manager, SCMT pilot staff will assist with this process.
7.2 Assessment and Verification of Reference Laboratory Data Usability

Validation determines the quality of the results relative to the end use of the data. All data obtained from verification testing will be validated according to the procedure described below. As necessary, more specific validation procedures will be described in the demonstration plan. The vendor is responsible for validating its own data prior to final submission. SCMT pilot staff, or its designee, are responsible for validating the reference laboratory data. Due to the nature of the technologies evaluated in this pilot, many of the vendor technologies will be compared to results generated by a reference laboratory. Several aspects of the data (listed below) will be reviewed. The findings of the review will be documented in the validation records. As appropriate, the ETVR will describe instances of failure to meet quality objectives and the potential impact on data quality.

7.2.1 Completeness of laboratory records: This qualitative review ensures that all of the samples sent to the laboratory were analyzed, and that all of the applicable records and relevant results are included in the data package.

7.2.2 Holding times: Sample holding times will be reviewed relative to the specifications in the demonstration plan and/or the contractual arrangement with the laboratory. Holding times may differ significantly from test to test, based on the analyte(s) and matrix(ces) evaluated.
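As an illustration only, a holding-time review of this kind reduces to comparing the elapsed time between collection and analysis against a per-analyte limit. The analytes, matrices, and limits below are hypothetical placeholders; the actual values would come from the demonstration plan or the laboratory contract.

```python
from datetime import datetime

# Hypothetical holding-time limits in days, keyed by (analyte, matrix).
# Real limits are set in the demonstration plan or laboratory contract.
HOLDING_LIMIT_DAYS = {
    ("VOC", "water"): 14,
    ("PCB", "soil"): 365,
}

def within_holding_time(analyte, matrix, collected, analyzed):
    """Return True if the sample was analyzed within its holding time."""
    limit = HOLDING_LIMIT_DAYS[(analyte, matrix)]
    return (analyzed - collected).days <= limit

# A sample analyzed 10 days after collection, against a 14-day limit.
print(within_holding_time("VOC", "water",
                          datetime(1998, 6, 1), datetime(1998, 6, 11)))
```

Samples failing such a check would be noted in the validation records rather than silently discarded.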
7.2.3 Correctness of data: So as not to bias the assessment of the technology's performance, errors in the reference laboratory data will be corrected as necessary. Corrections may be made to data with transcription errors, calculation errors, and interpretation errors. These changes will be made conservatively and will be based on the guidelines provided in the method used. The changes will be justified and documented in the validation records.

7.2.4 Correlation between related results: Normally, one would not know that a single sample result was "suspect" unless (a) the sample was a PE sample, where the concentration is known, or (b) a result was reported and flagged by the reference laboratory as suspect for some obvious reason (e.g., no quantitative result was determined). The experimental design implemented in the verification study may provide an additional indication of abnormal data through inspection of the replicate results from homogeneous sample sets. In these cases, criteria may be established to determine whether data are suspect. For example, data sets could be considered suspect if the standard deviation of the replicate results was greater than 30 ppm and the percent relative standard deviation was greater than 50%. These criteria would indicate imprecision in the sample replicate set. These data would be flagged so as not to bias the assessment of the technology's performance. Precision and accuracy evaluations may be made with and without these suspect values to represent the best- and worst-case scenarios. If both the reference laboratory and the vendor(s) report erratic results, the data may be discarded if it is suspected that the erratic results are due to a sampling error.

7.2.5 Evaluation of QC results: Quality control (QC) samples will be analyzed by the reference laboratory with every batch of samples to indicate whether or not the samples were analyzed properly. These may be lab internal, external, or both.
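The example screening criteria given in Section 7.2.4 (standard deviation above 30 ppm together with percent relative standard deviation above 50%) could be implemented along the following lines; the thresholds shown are the illustrative values from that section, and the actual criteria would be set in each demonstration plan.

```python
from statistics import mean, stdev

def is_suspect(replicates, sd_limit_ppm=30.0, rsd_limit_pct=50.0):
    """Flag a replicate set as suspect when BOTH the standard deviation
    (in ppm) and the percent relative standard deviation exceed their
    limits, indicating imprecision in the sample replicate set."""
    sd = stdev(replicates)
    rsd = 100.0 * sd / mean(replicates)
    return sd > sd_limit_ppm and rsd > rsd_limit_pct

print(is_suspect([10.0, 100.0, 150.0]))   # widely scattered -> True
print(is_suspect([100.0, 101.0, 102.0]))  # tight replicates -> False
```

Flagged sets would then be carried through the precision and accuracy evaluations both with and without the suspect values, as described above.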
Acceptable QC results will be specified in either the reference laboratory's procedure or the demonstration plan. QC samples include, but are not limited to: instrument tuning, calibration, continuing calibration verification, laboratory control samples, matrix spikes and matrix spike duplicates, surrogate recoveries, and blank results.

7.2.6 Evaluation of performance evaluation data: PE samples are homogeneous samples containing certified concentrations of known analyte(s). The vendor who prepares the PE samples usually provides performance acceptance limits, which are guidelines established to gauge acceptable analytical results. The performance of the reference laboratory will be evaluated relative to the PE samples. Results for these samples represent the best estimate of accuracy and precision for verification testing.

7.3 Data Quality Indicators

The data obtained during verification testing must be of sufficient quality for the appropriate conclusions to be drawn. Five PARCC parameters may be evaluated as a measure of data quality: Precision, Accuracy, Representativeness, Completeness, and Comparability. As applicable, these data quality parameters will be assessed for both the vendor's technology and the reference laboratory. The statistics used to evaluate each of these parameters will be described in the demonstration plan, where necessary.

7.3.1 Precision: Precision, in general, refers to the degree of mutual agreement among measurements of the same materials and contaminants.
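As a sketch of how three of the PARCC parameters listed above are commonly quantified: percent relative standard deviation for precision, percent recovery against a PE sample's certified value for accuracy, and a simple percentage for completeness. These formulas are illustrative assumptions; the statistics actually used will be defined in each demonstration plan.

```python
from statistics import mean, stdev

def precision_pct_rsd(replicates):
    """Precision: percent relative standard deviation of replicate results."""
    return 100.0 * stdev(replicates) / mean(replicates)

def accuracy_pct_recovery(measured, certified):
    """Accuracy: measured result as a percentage of the certified value."""
    return 100.0 * measured / certified

def completeness_pct(valid_results, planned_results):
    """Completeness: usable results as a percentage of those planned."""
    return 100.0 * valid_results / planned_results

print(accuracy_pct_recovery(95.0, 100.0))  # -> 95.0
print(completeness_pct(19, 20))            # -> 95.0
```

Representativeness and comparability are qualitative judgments about experimental design and are not reduced to single statistics here.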
In environmental applications, precision is often specified as a percentage of contaminant concentration.

7.3.2 Accuracy: Accuracy is a measure of how close measured values are to true values. Inaccuracies, or biases, are the result of systematic differences between these values. The incorporation of blanks, replicates, and performance evaluation samples in the experimental design will enable a determination of the technology's accuracy under the demonstration conditions.

7.3.3 Representativeness: Representative samples, in general, are samples that contain a reasonable cross-section of the "population" about which they are to be used to make inferences. Representativeness may also express the degree to which the sample data represent the capability of the technology. For this reason, the SCMT demonstrations may include a variety of media and contaminants.

7.3.4 Completeness: Completeness refers to the amount of data collected from a measurement process, expressed as a percentage of the data that would be obtained using an ideal process under ideal conditions. The completeness objective, which is usually 95% or better, will be discussed in the project-specific demonstration plan.

7.3.5 Comparability: Comparability refers to the confidence with which one data set can be compared to another. If possible, the field technology will be compared in some way to a reference or baseline method.

7.4 Existing Data

Existing data provided by the vendor will not be evaluated in the SCMT pilot.
The existing data may be used to describe the performance of the technology in the technology description sections of the demonstration plan and ETVR, but it will not be evaluated by the SCMT pilot. All evaluations will be based on data generated in the presence of SCMT pilot staff, or its designee, during verification testing.

8 Assessment and Response

Activities performed during verification testing shall be assessed, and the findings reported, to ensure that the requirements stated in the demonstration plan and this quality management plan are being implemented as prescribed. Appropriate corrective actions shall be taken, and their adequacy verified and documented, in response to the findings of an assessment.

8.1 Technical systems audit: The QAS will perform a surveillance during verification testing to assess compliance with the demonstration plan. If it is not possible for the QAS to perform the audit (due to the verification testing occurring at an off-site location), the QAS may develop and supply a written checklist so that the SCMT pilot staff conducting the verification test can perform a self-assessment. The QAS is responsible for reviewing, approving, and reporting on the self-assessment at the completion of verification testing.

8.2 Performance evaluation audit: This audit will be performed by SCMT pilot staff as part of the preparation of the ETVR. For certain demonstrations, both the field technology and the reference laboratory will evaluate PE samples, which will be of known concentration. The results will be compared to the range of acceptable results for the PE samples, as determined by the provider of the PE material and verified by the statistician. This evaluation will serve as a measure of accuracy and precision, and will be reported in the ETVR.

8.3 Audits of data quality: If possible, the reference laboratory will be selected using the same resources used by ORNL or SNL organizations in need of analytical work.
If so, pre-existing qualification information will be evaluated to support selection of the laboratory. Additionally, the reference laboratory may be audited prior to, during, or after sample analyses. The audit will usually focus on adherence to method requirements and procedures, particularly in sample preparation, sample management, and quality control. For each applicable verification test, the reference laboratory data will be validated by SCMT pilot staff, or its designee, according to Section 7.2. Results of the validation will be reviewed and approved by the QAS and documented in the ETVR. The vendor organization will be responsible for reviewing its own data. Once the final data set has been submitted to the SCMT pilot by the vendor, no changes to the data will be allowed.

8.4 Surveillance of technology performance: During verification testing, SCMT pilot staff or its designee will observe the operation of the field technology, for example by observing the vendor operations, photo-documenting the demonstration activities, surveying calibration procedures, and reviewing sample data. The observations will be documented in a laboratory notebook or on a field audit form.

8.5 External Peer Review: A final check of overall data quality will be performed by the external peer reviewer of the ETVR.

9 Quality Improvement

9.1 The program manager, or its designee, and the QAS will review this plan annually and revise it as necessary. The revised plan will be reviewed and approved by the SCMT program manager, the EPA pilot manager, and the EPA quality manager.
9.2 All laboratory personnel have the responsibility to immediately stop activities when, in their judgment, further work could result in a significant condition adverse to personnel safety or data quality. The suspect activity will be reviewed by the program manager and/or the QAS to determine the need for a Stop Work Order. The EPA pilot manager will be notified within 24 hours of the issuance of a Stop Work Order. Restart activities will commence upon concurrence of the program manager and/or the QAS. The work activity in question and its final resolution shall be documented.

9.3 Corrective actions for conditions adverse to quality will be established, documented, implemented, and reported according to ORNL's QAP-X-94-CASD-001, Chemical and Analytical Sciences Division Quality Assurance Plan, and SNL's PR9202C, Implementation Plan for DOE 5700.6C and DOE 5480.19. The QAS shall review the corrective action plan for appropriateness and accuracy, and will follow up to verify timely completion of corrective actions.

9.4 Lessons learned from verification testing will be assessed and may be reported at ETV team meetings. Additionally, the program manager will ensure that all SCMT pilot staff are informed of lessons learned.

9.5 All SCMT personnel are responsible for identifying and reporting to responsible management any conditions that are adverse to quality. Items and processes that do not meet established requirements must be identified, documented, reviewed, and resolved with the goal of promoting higher quality.
Problems will be controlled to an extent commensurate with the significance of the problem.

10 References

This plan is based on the guidance provided in the documents listed below. At the time of publication, the editions indicated were valid, but note that all standards are subject to revision.

10.1 A Guidance Manual for the Preparation of Site Characterization and Monitoring Technology Demonstration Plans. Interim Final Report, Version 5.0. National Exposure Research Laboratory, US Environmental Protection Agency, 1996.

10.2 American Society for Quality Control, Energy and Environmental Quality Division, Environmental Issues Group. American National Standard: Specifications and Guidelines for Quality Systems for Environmental Data Collection and Environmental Technology Programs. ANSI/ASQC E4-1994. American Society for Quality, 1994.

10.3 Berger, Walter, et al. Environmental Laboratory Data Evaluation. Genium Publishing, 1996.

10.4 Chemical and Analytical Sciences Division Quality Assurance Plan. QAP-X-94-CASD-001, Rev. 2. Oak Ridge National Laboratory, 1997.

10.5 Environmental Technology Verification Program Quality and Management Plan for the Pilot Period (1995-2000). EPA/600/R-98/064. National Risk Management Research Laboratory and National Exposure Research Laboratory, US Environmental Protection Agency, 1998.

10.6 ORNL-QA-P01, ORNL Quality Assurance Program. Rev. 0, July 1998.

10.7 PR9202C. Sandia National Laboratories' Implementation Plan for DOE 5700.6C and DOE 5480.19.

10.8 CPR 500.2.1. Sandia National Laboratories Procurement Manual.