United States Environmental Protection Agency
Office of Research and Development
Washington, D.C. 20460
January 2008

EPA Quality Management Plan
National Homeland Security Research Center
Technology Testing and Evaluation Program (TTEP)
Version 3

QUALITY MANAGEMENT PLAN (QMP)
for the Technology Testing and Evaluation Program
Version 3

Eric Koglin, EPA Program Manager   Date
Eletha Brady-Roberts, NHSRC Quality Assurance Manager   Date
Karen Riggs, Battelle TTEP Manager   Date
Z. Willenberg, Battelle Quality Assurance Manager   Date

TABLE OF CONTENTS

1.0 MANAGEMENT AND ORGANIZATION
  1.1 Policy Statement
  1.2 Organization and Communication
  1.3 Technical Activities
  1.4 Communicating the Quality System
  1.5 Resources
  1.6 Authority to Stop Work
  1.7 Management Assessment of the Quality System
2.0 QUALITY SYSTEM AND DESCRIPTION
  2.1 Quality System Elements
  2.2 Quality Management Plan Reviews and Revisions
3.0 PERSONNEL RESPONSIBILITIES, QUALIFICATIONS, AND TRAINING
  3.1 Responsibilities
  3.2 Qualifications and Training
  3.3 Formal Qualifications and Certifications
  3.4 Training Documentation
  3.5 Job Proficiency
  3.6 Retraining
4.0 PROCUREMENT OF ITEMS AND SERVICES
  4.1 Planning and Control
  4.2 Technical and Quality Requirements
  4.3 Verifying Supplier's Conformance
  4.4 Document Review
  4.5 Review of Changed Documents
  4.6 Review of Items and Services
5.0 DOCUMENTS AND RECORDS
  5.1 Records Management Procedures
  5.2 Document Control
6.0 COMPUTER HARDWARE AND SOFTWARE
  6.1 Conformance to User and EPA Requirements
  6.2 Configuration Testing
  6.3 Change Assessment
  6.4 Retesting and Redocumentation
7.0 PROGRAM PLANNING
  7.1 Planning and Documenting the Generation, Acquisition, and Use of Environmental Data
  7.2 Identifying and Documenting Environmental Data Needed
  7.3 Key Users, Customers, and Technical Staff
  7.4 Review and Approval of Planning Documentation
8.0 IMPLEMENTATION
  8.1 Implementation Procedures
  8.2 Standard Operating Procedures
9.0 ASSESSMENT AND RESPONSE
  9.1 Scope
  9.2 Assessment Planning and Procedures
  9.3 Personnel Qualifications for Assessment
  9.4 Responsibility and Authority to Stop Work
  9.5 Documentation, Reporting, and Review
  9.6 Responses and Follow-up Actions
10.0 VALIDATION OF DATA USABILITY
  10.1 Assessing, Verifying, and Qualifying Data
  10.2 Documenting Limitations on Data
  10.3 Independent Review of Reports
  10.4 Management Approval
11.0 QUALITY SYSTEM IMPROVEMENT
  11.1 Quality Improvement Process
  11.2 Preventing, Detecting, and Correcting Quality System Problems
  11.3 Response Actions

LIST OF TABLES
Table 3-1. Personnel Responsibilities for TTEP Evaluation Activities
Table 5-1. Records Management Responsibilities for TTEP
Table 9-1. TTEP Assessments

LIST OF FIGURES
Figure 1-1. TTEP Organization
Figure 7-1. Systematic Planning of Technology Evaluations

APPENDICES
APPENDIX I. NAMES, DESCRIPTIONS, ADDRESSES, AND PHONE NUMBERS OF KEY TTEP STAFF
APPENDIX II. QAPP FORMAT PAGES FROM THE NHSRC QMP (APPENDIX B)
APPENDIX III. AMENDMENT AND DEVIATION FORMS
1.0 MANAGEMENT AND ORGANIZATION

1.1 POLICY STATEMENT

The Environmental Protection Agency's (EPA) National Homeland Security Research Center (NHSRC) maintains a mission statement which specifies: "NHSRC will manage, coordinate, and support a wide variety of homeland security research and technical assistance efforts. The NHSRC will provide the U.S. Environmental Protection Agency (EPA) and Office of Research and Development (ORD) with a management structure that ensures effective design and oversight of research, provide clear lines of communication, and facilitates interaction with EPA Program Offices and Regions, other federal agencies, the private sector, and research partners. By bringing together a critical mass of research talent, the Center will integrate and unify ORD homeland security research and provide an easily identifiable entity for communication and coordination."

The NHSRC policy on quality assurance (QA) is to ensure that NHSRC meets EPA QA requirements as defined in EPA Order 5360.1 A2 (May 2000). The NHSRC management and QA staff work to ensure the following:
- Standard policies and procedures are in place for the quality system elements described in the NHSRC quality management plan
- All environmental data collection, evaluation, and use are performed in accordance with an approved planning document, and
- All NHSRC projects produce defensible data with the level of QA determined prior to data collection.

The Technology Testing and Evaluation Program (TTEP), which is administered by the EPA's NHSRC, produces high-quality performance data on homeland security-related technologies through a rigorous and peer-reviewed testing and evaluation process. A contractor manages the day-to-day activities of the TTEP. The contractor's management is responsible for committing to a quality policy and for creating work environments in which all personnel strive for the highest quality of services and products. The management also provides the contractor TTEP Manager the authority to ensure the following:
- That all applicable elements of the quality system described in this Quality Management Plan (QMP) are understood and implemented in the program
- That adequate personnel and resources are available to plan, implement, assess, and improve services and products relevant to the program
- That staff are clearly directed to stop unsafe work and work of inadequate quality.

1.2 ORGANIZATION AND COMMUNICATION

This document is the QMP for the TTEP. In this program, technologies for use in various homeland security applications are subjected to performance evaluation in tests that often use chemical and/or biological warfare agents or surrogates. Currently, staff involved in the program include those with expertise in technology testing for water security, safe buildings design, modeling, building detection, and building decontamination for chemical and/or biological agents. Key facilities used for the program include comprehensive laboratory analysis equipment; aerosol sciences laboratories and instrumentation; Bio-Safety Level (BSL) 2 and 3 facilities; certified chemical surety and biological containment facilities; and facilities capable of evaluating homeland security technologies under conditions that simulate terrorist attack scenarios.
The organization chart for this program is provided in Figure 1-1 and shows the current staff in the key program positions and their reporting and communication lines. For this program the key staff are:
- Contractor TTEP Manager: The contractor TTEP Manager is the main point of contact for the EPA TTEP Program Manager and directs activities of contractor TTEP personnel during the performance of the contract.
- Contractor Technology Area Leaders: The TTEP Technology Area Leaders (TALs) are responsible for planning and leading evaluations and report directly to the contractor TTEP Manager on their TTEP activities.
- Contractor Quality Assurance Manager: The program's QA Manager reports to management outside of the reporting line for TTEP management and keeps this management informed of all program-level quality items and issues.
- Contractor Stakeholder Involvement Leader: The TTEP Stakeholder Involvement Leader's primary responsibility is stakeholder recruitment, communications, and meetings. This individual reports directly to the contractor TTEP Manager on this program.

Names, descriptions, mailing/email addresses, and phone/facsimile numbers of the key TTEP staff are included in Appendix I.

Figure 1-1. TTEP Organization (dotted lines indicate indirect reports). The chart shows the following positions and incumbents: EPA/NHSRC Quality Assurance Manager (E. Brady-Roberts); EPA TTEP Program Manager (E. Koglin); Contractor Senior Management (M. Toomajian); Stakeholder Involvement Leader (R. Sell); Contractor TTEP Manager (K. Riggs); Contractor Quality Assurance Manager (Z. Willenberg); Water Security Technology Area Leader (R. James); Building Decontamination Technology Area Leader (M. Taylor); Building Detection Technology Area Leader (T. Kelly); Building Filtration/Air Cleaning Technology Area Leader (D. Franke, currently inactive); and Technical Staff.

1.3 TECHNICAL ACTIVITIES

This document encompasses all testing and evaluation activities that the contractor utilizes to assure the quality of products and services provided for this program. This QMP applies to personnel involved in and activities conducted for TTEP. The document contains the minimum specifications and guidelines that are applicable to the program's quality management functions and activities. These include, but are not limited to, personnel qualification and training, procurement of items and services, documents and records, computer hardware and software, planning, implementation of work processes, assessment and response, and quality improvement provisions.

1.4 COMMUNICATING THE QUALITY SYSTEM

The contractor TTEP QA Manager is responsible for the development, implementation, and maintenance of the TTEP Quality System, defined within this QMP. The contractor TTEP Manager is responsible for assuring that all technical work performed within the program is performed under the guidance of the QMP. The contractor recognizes the importance of having a QMP that is as current as possible and is in compliance with the EPA's needs, contractual requirements, and QA policy. Therefore, the contractor TTEP Manager, the contractor QA Manager, the EPA Program Manager, and the EPA NHSRC QA Manager (hereafter referred to as the EPA QA Manager) review this QMP at least annually to reconfirm the suitability and effectiveness of program quality management practices.
When revisions to this QMP are deemed necessary and have been made, the individuals listed on the plan's Signature Page review and approve the revisions, and the contractor QA Manager issues a memo or e-mail message to the entire program team (including any subcontractors) that notifies them that the QMP has been revised, describes the types of changes made to the QMP, and makes the revised QMP available to the staff. Each revision to the QMP is assigned a unique version number, which is clearly documented on the title page and in the header block of each page of the QMP. Each key staff member in TTEP has access to this QMP through receipt of a paper copy of the plan and/or a non-changeable version of the plan (stored as an Adobe Acrobat PDF file) from the contractor QA Manager. Each staff member is expected to be aware of and work in accordance with the Quality System policies defined in this QMP.

1.5 RESOURCES

The contractor's corporate quality policy requires that management staff allocate sufficient resources toward performing necessary QA procedures on each program, such as performing audits and collecting quality control (QC) data and reporting these data to the client. As such, a QA Manager is assigned to TTEP whose primary responsibility is to ensure that the goals and procedures of the TTEP Quality System are being achieved and to provide technical staff with guidance on appropriate QA procedures to implement through the program life cycle (i.e., from experimental design and sample analysis to data generation, reduction, and reporting).

1.6 AUTHORITY TO STOP WORK

Authority to stop work during an evaluation for safety and quality considerations is delegated to the contractor TTEP Manager, or subcontractor designee, who must ensure compliance with all applicable federal, state, and local safety policies during the performance of an evaluation. In addition, should any technical staff member suspect compromise to personal health or test objectives during the conduct of a technology evaluation, that staff member shall immediately contact the contractor TTEP Manager, who has the authority to issue the stop work. Further, should an EPA Task Order Project Officer (TOPO) or EPA Manager become aware that safety measures or data quality are being compromised during an evaluation, he or she shall report that to the EPA TTEP Program Manager. The EPA TTEP Program Manager shall contact the EPA Contracting Officer to implement a stop work order. See Section 9.4 for more information.

1.7 MANAGEMENT ASSESSMENT OF THE QUALITY SYSTEM

The aim of management assessment is to protect the contractor's business interest by assuring that all contractual, financial, risk, and quality issues are reviewed for correctness, compliance, completeness, and accuracy, and that products are appropriate for the EPA's intended use. All test/QA plans, reports, data, and other deliverables to the EPA receive prior management approval by the contractor TTEP Manager, or appropriate subcontractor designee, before they leave the contractor. Technology Area Leaders are appropriate substitutes to provide management review when the contractor TTEP Manager is not available. In addition, annually, as part of this contract, the contractor QA Manager performs a Quality System Audit (QSA) of the program, with emphasis on one or more components. This audit focuses on the Quality System defined by this document to ensure its continuing suitability and effectiveness, and to introduce necessary changes or improvements.
EPA feedback on the contractor's performance is obtained, when possible, to provide additional detail on compliance with quality expectations and requirements, such as through annual external QSAs or evaluation-specific Technical Systems Audits (TSAs). The contractor TTEP Manager reviews the input and provides guidance for corrective actions regarding any EPA concerns.

2.0 QUALITY SYSTEM AND DESCRIPTION

2.1 QUALITY SYSTEM ELEMENTS

The contractor Quality System to be implemented for this program according to this QMP is intended to conform to the specifications listed in:
- ANSI/ASQC E4-1994, "Specifications and Guidelines for Quality Systems for Environmental Data Collection and Environmental Technology Programs"
- EPA QA/R-2, EPA Requirements for Quality Management Plans, March 2001
- Quality Management Plan for the National Homeland Security Research Center (NHSRC), August 2003.

This document, the QMP for TTEP, is the principal quality system document governing general and specific responsibilities for program management and staff, responsibility and authority for all technical activities, and reporting lines. Individual task orders conform both to this QMP and to the applicable test/QA plan document(s) and applicable standard operating procedures (SOPs). The test/QA plans contain the specific information needed to conduct a technology evaluation. If another level of detail is required for describing technology evaluation activities, for example operation of an instrument, an SOP is to be written and attached to the test/QA plan (if possible).

Quality procedures documentation includes maintenance of all inspection and review/assessment records, a listing of all controlled documents (see Section 5.2), and retention of records pertaining to personnel training and qualification, instrument maintenance and calibration, and test methods/operating procedures. Implementation of a complete and consistent assessment of technical operations provides overall control of program activities. This is accomplished by the contractor QA Manager according to Section 9.0 of this QMP.

2.1.1 Test/QA Plans

Test/QA plans are the responsibility of the TALs and/or Task Order Leaders and are reviewed by the contractor TTEP Manager, contractor QA Manager, EPA Task Order Project Officer (TOPO), and NHSRC QA Manager. One EPA source of guidance for writing test/QA plans is EPA QA/G-5, Guidance for Quality Assurance Project Plans (December 2002). However, other test/QA plan templates are available for use upon request of the EPA TOPOs and with approval from the EPA Program Manager. These templates are included in Appendix II of this document (from Appendix B of the QMP for NHSRC).

Planned changes to an approved test/QA plan are made by written amendment. Deviations from the plan must be fully documented, including the date and description of the deviation and the impact on the quality of the affected data. Examples of TTEP amendment and deviation forms are included in Appendix III.

Below are the section headings of a test/QA plan as guided by the EPA QA/G-5 document. It is possible that not all of the sections would be applicable for every technology evaluation. If that is the case, omissions can be listed within the test/QA plan.

Group A: Evaluation Management - This group of elements covers the general areas of evaluation management, history and objectives, and roles and responsibilities of the participants.
The following nine elements ensure that the evaluation's goals are clearly stated, that all participants understand the goals and the approach to be used, and that evaluation planning is documented:
- A1 Title and Approval Sheet
- A2 Table of Contents and Document Control Format
- A3 Distribution List
- A4 Task Organization and Schedule
- A5 Problem Definition/Background
- A6 Task Description
- A7 Quality Objectives and Criteria for Measurement Data
- A8 Special Training Requirements/Certification
- A9 Documentation and Records.

Group B: Measurement/Data Acquisition - This group of elements covers all of the aspects of measurement system design and implementation, ensuring that appropriate methods for sampling, analysis, data handling, and QC are employed and will be thoroughly documented:
- B1 Sampling Process Design (Experimental Design)
- B2 Sampling Methods Requirements
- B3 Sample Handling and Custody Requirements
- B4 Analytical Methods Requirements
- B5 Quality Control Requirements
- B6 Instrument/Equipment Testing, Inspection, and Maintenance Requirements
- B7 Instrument Calibration and Frequency
- B8 Inspection/Acceptance Requirements for Supplies and Consumables
- B9 Data Acquisition Requirements (Non-Direct Measurements)
- B10 Data Management.

Group C: Assessment/Oversight - The purpose of assessment is to ensure that the test/QA plan is implemented as prescribed. This group of elements addresses the activities for assessing the effectiveness of the implementation of the evaluation and the associated QA/QC activities:
- C1 Assessments and Response Actions
- C2 Reports to Management.

Group D: Data Validation and Usability - Implementation of Group D elements ensures that the individual data elements conform to the specified criteria, thus enabling reconciliation with the evaluation's objectives. This group of elements covers the QA activities that occur after the data collection phase of the evaluation has been completed:
- D1 Data Review, Validation, and Verification Requirements
- D2 Validation and Verification Methods
- D3 Reconciliation with Data Quality Objectives.

2.1.2 Standard Operating Procedures

If an SOP is attached to a test/QA plan, the following topics, from EPA QA/G-6, Guidance for Development of Standard Operating Procedures (SOPs) (March 2001), may be included (or a reference provided). These topics may be appropriate for inclusion in technical SOPs. Not all will apply to every procedure or work process detailed:
- Title Page
- Table of Contents
- Scope & Applicability
- Summary of Method
- Definitions
- Health & Safety Warnings (indicating operations that could result in personal injury or loss of life)
- Cautions (indicating activities that could result in equipment damage, degradation of sample, or possible invalidation of results)
- Interferences (describing any component of the process that may interfere with the accuracy of the final product)
- Personnel Qualifications
- Equipment and Supplies
- Procedure - identifying all pertinent steps, in order, and materials needed to accomplish the procedure, such as: Instrument or Method Calibration and Standardization; Sample Collection; Sample Handling and Preservation; Sample Preparation and Analysis; Troubleshooting; Data Acquisition, Calculations, and Reduction
- Requirements for Computer Hardware and Software Used in Data Reduction and Reporting
- Data and Records Management
- Quality Control and Quality Assurance Section
- References.
2.2 QUALITY MANAGEMENT PLAN REVIEWS AND REVISIONS

This QMP and subsequent revisions are controlled documents identified by a unique document number (see Section 5.2) and are distributed according to a published list as part of the Master List of Controlled Documents maintained by the contractor QA Manager.

At a minimum, the QMP is reviewed on an annual basis to ensure that it is current and representative of the activities of the program. Upon such a review, the contractor QA Manager develops a brief letter report containing recommendations for any revisions and convenes a teleconference with the EPA Program Manager and NHSRC QA Manager to discuss the recommendations. The contractor QA Manager identifies and explains the rationale for changes. Subsequent to the review and teleconference, the contractor QA Manager revises the QMP and provides a revised draft to the EPA Program Manager within three weeks. The EPA Program Manager reviews and provides comments within two weeks of receipt. If necessary, a second teleconference may be scheduled to discuss unresolved issues. The final QMP, ready for signatures, is provided to the EPA Program Manager within two weeks after receipt of the EPA Program Manager's comments or after the teleconference (whichever is later).

The initial approved QMP served as Version 1, which was designated, along with its effective date, in the upper right corner of each document page. Revisions are so designated beginning with Version 2 and are subsequently numbered and dated as applicable. TTEP staff to whom controlled copies are issued are responsible for disposal of outdated QMP versions.

3.0 PERSONNEL RESPONSIBILITIES, QUALIFICATIONS, AND TRAINING

3.1 RESPONSIBILITIES

Key Staff Responsibilities. The contractor is responsible for operating an effective quality system that ensures compliance with all program requirements. The responsibilities of key TTEP staff who perform or assist in performing technology evaluations addressed by this QMP are listed in Table 3-1. Responsibilities of evaluation staff who might conduct or assist in a technology evaluation are also listed in Table 3-1.

Stakeholder Responsibilities. The responsibilities of TTEP stakeholders include the following:
- Assist in prioritizing the types of technologies to be evaluated
- Review program-specific procedures and TTEP documents, including test/QA plans and evaluation reports, as requested
- Assist in the definition and conduct of outreach activities appropriate to the technology area and customer groups
- Serve as information conduits to the particular constituencies that each member represents.

3.2 QUALIFICATIONS AND TRAINING

TTEP personnel qualifications and training target technical work performed directly in support of program activities. These qualifications and training may include:
- Formal education in physical and/or biological sciences (e.g., chemistry, physics, engineering, molecular biology, toxicology, biochemistry)
- Experience in chemical and biological (CB) agent sampling and analysis
- Training on standard analytical instrumentation such as gas chromatographs, mass spectrometers, and flame ionization detectors
- Experience in designing experiments to evaluate various monitoring, detection, decontamination, air cleaning, and/or water treatment technologies.
The contractor or subcontractor permanently maintains documentation about the TTEP personnel working on program activities, which includes:
- Education history, which can include formal qualification or certification relevant to technical, QA, or management disciplines
- Work experience, as academic or on-the-job performance in technical and/or management areas
- Experience in the application of QA/quality control (QC) requirements in technical performance or data verification.

3.3 FORMAL QUALIFICATIONS AND CERTIFICATIONS

Formal qualifications and certifications, in the form of actual or verified-copy documentation for specific disciplines, are maintained in the TTEP staff member's qualification/training file.

Table 3-1. Personnel Responsibilities for TTEP Evaluation Activities

EPA Program Manager
- Ultimate responsibility for all aspects of TTEP
- Review and approve the TTEP QMP and subsequent revisions
- Ensure that the EPA TOPOs implement the quality system as specified by the TTEP QMP
- Manage the release and approval of task orders
- Oversee amendments and extensions to task orders
- Ensure that necessary funds are available for the performance of TTEP activities
- Inform EPA Contracts to issue a stop work order if made aware that safety measures or data quality are being compromised during an evaluation

EPA Task Order Project Officers
- Have overall responsibility for directing the evaluation process
- Review and approve test/QA plans and subsequent revisions
- Review and approve evaluation reports
- Oversee the EPA review process on the draft test/QA plans and evaluation reports
- Coordinate submission of evaluation reports for final EPA approval

EPA QA Manager
- Review and approve the TTEP QMP and subsequent revisions
- Review and approve test/QA plans and subsequent revisions
- Perform one external TSA per year
- Notify the EPA TTEP Program Manager upon becoming aware that safety measures or data quality are being compromised during an evaluation; the TTEP Program Manager would then contact EPA Contracts to issue a stop work order
- Prepare and distribute an assessment report summarizing the results of the external TSA
- Review evaluation reports

Contractor TTEP Manager
- Ultimate responsibility for all aspects of contractor, and possible subcontractor, activities
- Conduct and oversee activities to establish and maintain active stakeholder committees
- Maintain adequate communication with the EPA Program Manager
- Manage oversight and conduct of evaluation activities
- Assure that quality procedures are incorporated and implemented
- Conduct management review and approval of test/QA plans
- Issue stop work orders if data quality, health, or safety issues call for it
- Ensure TTEP activities are operated within the documented quality system
- Conduct management review and approval of evaluation reports

TTEP Technology Area Leaders
- Coordinate planning, performance, and data reviews of technology evaluations consistent with TTEP QMP requirements
- Solicit technology vendors
- Perform pre-evaluation kick-off meetings to review technical, project management, and QA aspects of testing with evaluation staff
- Work with stakeholders and EPA to identify and prioritize technologies for evaluation
- Schedule technology evaluations
- Select/assemble technical staff to perform specific technology evaluations/data reviews
- Coordinate development and implementation of test/QA plans
- Review and approve amendments and deviations to test/QA plans prepared by Task Order Leaders
- Prepare, review, and/or approve evaluation reports
- Oversee/assist in problem resolution involving evaluations
- May serve as Task Order Leader with responsibility for ensuring that task order financial, contractual, reporting, and schedule requirements are met

Contractor QA Manager
- Ensure that the quality system is compliant with EPA-specified standards
- Advise the contractor TTEP Manager of any QA/QC problems and oversee corrective actions
- Ensure the TTEP QMP includes sufficient and appropriate specifications for QA/QC as required for TTEP
- Interact with TTEP management and technical staff to ensure that QA/QC procedures are understood
- Ensure the TTEP QMP and NHSRC QMP are followed by performing system assessments and audits
- Perform or oversee a TSA and audit of data quality (ADQ) for every evaluation
- Participate in pre-evaluation kick-off meetings to review QA requirements with evaluation staff
- Review training records of evaluation staff
- Notify the contractor TTEP Manager to issue a stop work order if assessments indicate health, safety, or quality concerns
- Review QA documentation of reference laboratories for each evaluation, as appropriate
- Review QC data (including reference laboratories and vendor technologies) generated during evaluations
- Ensure that assessment reports detailing appropriate corrective actions are prepared and distributed, that the appropriate Task Order Leader or Technology Area Leader responds with implementation, and that responses are returned to the contractor QA Manager; problems that are not addressed are to be brought to the attention of management
- Review and approve test/QA plans, SOPs, and evaluation reports
- Review and approve amendments and deviations to test/QA plans
- Review all quality system documentation, including this document, at intervals necessary to ensure its integrity; such reviews are recorded, documents are revised if necessary, and all previous original (i.e., signed) revisions are retired and archived
- Act as a QA resource to respond to quality needs and problems; answer questions and train evaluation staff in QA/QC requirements and procedures
- Remove outdated controlled documents from circulation
Evaluation Staff
- Provide technical support to technology evaluations as needed, and interact with the contractor QA Manager during assessments and implementation of corrective actions when needed
- Perform QA/QC activities specified in this document, applicable test/QA plans, and pertinent SOPs
- Conduct QC measures and activities required for sample analyses
- Verify 100% of data and evaluate results of QC analyses to determine if quality goals and objectives have been met
- Inform the appropriate Technology Area Leader and/or Task Order Leader of potential QC problems
- Perform corrective action at the direction of TTEP management and the contractor QA Manager in response to TSA and ADQ audit reports
- Document results of QC analyses and include them with sample results and historical data files
- Maintain instrumentation (vendor and/or reference instrumentation) in accordance with the QMP, test/QA plan, SOPs, and the manufacturer's instructions
- Prepare test/QA plans and amendments and deviations to these plans (Task Order Leaders only), as appropriate
- Attend pre-evaluation kick-off meetings to review technical, project management, and QA aspects of testing
- Perform performance evaluation (PE) audits of subcontractor and reference laboratories and calibrated equipment for each evaluation, as appropriate
- Maintain evaluation records (in bound Laboratory Record Books and/or test binders) that adequately capture the quality of data collected
- Prepare technology evaluation reports
- May serve as Task Order Leader with responsibility for ensuring that task order financial, contractual, reporting, and schedule requirements are met

3.4 TRAINING DOCUMENTATION

Each contractor or subcontractor evaluation staff member working on TTEP activities has his or her training experience documented within a training record that is stored and maintained within the contractor's or subcontractor's training files. A staff member's training experience, as well as educational background and employment skill experience (i.e., skill experience acquired prior to establishment of the training record), is summarized in the staff member's training record. Management and technical training received in-house or off-site is recorded, and forms, memos, or certificates are retained. Performance on program assignments is considered as part of training. In addition to the items discussed above, other documentation may be stored with the training record related to specific program requirements, including biosketches, curricula vitae, certifications, accreditations, licenses, SOP read lists, and other necessary formal qualifications.

The contractor QA Manager reviews the qualifications of TTEP staff within a TSA to ensure that program staff are adequately trained to perform program-related tasks and have maintained any necessary quality-related qualifications. This is accomplished by reviewing the staff members' training records, along with any of the other above-mentioned items when additional details are needed for a specific training activity.

3.5 JOB PROFICIENCY

Personnel job proficiency, based on on-the-job performance witnessed by a qualified trainer/staff member designee, is documented. Specific method requirements for instrument inspection, performance, and maintenance are objective measures that could be considered. Specific performance based on national certification requirements can be recorded with certificates or other documentation.

Participants (e.g.,
subcontractors) working on behalf of the contractor in overall support of TTEP and/or in support of individual technology evaluations are expected to provide a training record or biosketch to the Technology Area Leader, or designee, before their participation in a technology evaluation, indicating:
- Educational background and/or degree(s) relevant to technical areas represented in this program
- Work experience related to chemical and biological (CB) agent protection, sampling, and analysis; experience in designing experiments to evaluate various monitoring, detection, or decontamination technologies; and/or experience with standard analytical instrumentation
- Experience in quality management.

3.6 RETRAINING

Retraining needs based on job requirements are determined by the staff member working on TTEP and respective program management. To maintain staff proficiency, training opportunities provided by the contractor or other sources are made available, preferably on an annual basis.

4.0 PROCUREMENT OF ITEMS AND SERVICES

4.1 PLANNING AND CONTROL

Procurement technical and quality requirements are generally based upon value (cost, durability, maintainability); performance (specification compliance, operating conditions, calibration capacity); delivery (timeliness, ease of ordering); customer support (responsiveness, technical ability); and completeness and coherence of instructions (clarity, accuracy).

4.2 TECHNICAL AND QUALITY REQUIREMENTS

Contractor staff members must follow their current procurement system. Technical and quality requirements for items (e.g., equipment used for performance evaluations) and services (e.g., subcontractors) procured for each technology evaluation are included in the test/QA plan. The request for items or services initiates from the Task Order Leader or technical staff, with approval for purchase from the contractor TTEP Manager or designee (e.g., the appropriate Technology Area Leader).

4.3 VERIFYING SUPPLIER'S CONFORMANCE

Testing equipment procured for activities affecting quality is received calibrated from the supplier, or will be calibrated by an instrument services laboratory prior to any use on an evaluation to ensure accuracy within the required specifications listed in the test/QA plan. Discrepancies result in the return of the item to the supplier. Test, storage, and maintenance records are included in individual test records.

Testing materials procured for activities affecting quality (e.g., reference standards or gases) are accompanied by a Certificate of Analysis (COA). The COA is examined to ensure that the listed specifications are within those necessary for the evaluation. The COA is retained and included in the evaluation records.

4.4 DOCUMENT REVIEW

All procurement documentation is reviewed and approved by the contractor TTEP Manager or designee to ensure completeness and accuracy before these requests are forwarded to the contractor's Procurement Office for processing.

4.5 REVIEW OF CHANGED DOCUMENTS

Any procurement documentation that changes after its initial review, before being sent to the contractor's Procurement Office, has to undergo the same review process described in Section 4.4 above.

4.6 REVIEW OF ITEMS AND SERVICES

Procured items and services are reviewed to ensure compliance with each technology evaluation's requirements and specifications. When receiving items, an inspection is performed in comparison with the supplier's supplied documentation for the deliverable.
All purchased items are inspected as necessary to verify compliance with specified requirements. When reagents or standards are purchased, it is necessary to ensure NIST traceability of the item, when applicable. Methods to accept procurement of services (i.e., subcontractors; installation, repair, or maintenance work; etc.) include at least one of the following:
- Technical verification of the data produced
- Surveillance and/or audit of the activity being performed, and
- Review of objective evidence for conformance to procurement document requirements.

5.0 DOCUMENTS AND RECORDS

5.1 RECORDS MANAGEMENT PROCEDURES

5.1.1 Test Records

Active Technology Evaluation Records. All technology evaluation records shall carry minimum identification pertaining to title, responsible person or author, and date. All manual entries are entered using ink and are signed/initialed and dated by the individual recording the entry. Changes to entries, manual or electronic, are not to obscure the original record during the correction process, and are to be initialed and dated by the individual recording the correction. A short explanation is added to non-obvious corrections.

Storage of Technology Evaluation Records. Technology evaluation records are retained for at least seven years after final payment under the BPA. All program records are retained that are necessary to reconstruct evaluation activities and to verify that reported data were collected in a quality manner. The contractor maintains all records in a file storage area that is accessible only to staff working on projects supported through the BPA. At the conclusion of the BPA, all contractor-maintained and stored records shall be transferred to the EPA TOPO for storage. The contractor QA Manager retains, as a permanent record, documentation of the transfer of program records.

5.1.2 Program Records

The following program records are retained for at least seven years after final payment under the BPA. These records include paper and electronic copies or versions:
- Minutes of stakeholder meetings
- BPA records
- Test/QA plans, amendments, and deviations
- Technology evaluation reports
- Contractor quality assessment reports.

5.1.3 Document and Record Preparation, Review, Approval, and Distribution

Document and record review and approval are summarized in Table 5-1 and are performed as described below.

Preparation. Individual case requirements and this QMP guide document and record content and/or format. For TTEP, guidance for content and/or format is derived from the following documents:
- ANSI/ASQC E4-1994, Specifications and Guidelines for Quality Systems for Environmental Data Collection and Environmental Technology Programs
- EPA QA/R-2, EPA Requirements for Quality Management Plans, March 2001
- EPA QA/R-5, EPA Requirements for Quality Assurance Project Plans, March 2001
- EPA QA/G-4, EPA Guidance for the Data Quality Objectives Process, February 2006
- EPA QA/G-5, EPA Guidance for Quality Assurance Project Plans, December 2002
- EPA QA/G-6, EPA Guidance for Preparing Standard Operating Procedures (SOPs), April 2007
- EPA QA/G-7, EPA Guidance on Technical Audits and Related Assessments for Environmental Data Operations, January 2000
- EPA QA/G-9R, Data Quality Assessment: A Reviewer's Guide, February 2006
- EPA QA/G-9S, Data Quality Assessment: Statistical Tools for Practitioners, February 2006
- QMP for the National Homeland Security Research Center, August 2003, Appendix B.

Table 5-1. Records Management Responsibilities for TTEP

TTEP Quality Management Plan
- Preparation/Updating: Contractor QA Manager
- Review: EPA Program Manager; EPA QA Manager; Contractor TTEP Manager
- Approval: EPA Program Manager; EPA QA Manager; Contractor TTEP Manager; Contractor QA Manager
- Finals Distributed to: EPA Program Manager; EPA TOPOs; EPA QA Manager; Key TTEP Staff

Minutes of Stakeholder Meetings
- Preparation/Updating: Contractor Stakeholder Coordinator
- Review: EPA Program Manager; Attending EPA Task Order Project Officers (TOPOs); Contractor TOLs; Stakeholders
- Approval: No approval required
- Finals Distributed to: EPA Program Manager; Contractor TTEP Manager; Applicable EPA TOPOs; Stakeholders

Test/QA Plan
- Preparation/Updating: Contractor Task Order Leader (TOL) or Subcontractor
- Review: EPA TOPO; NHSRC QA Manager; Contractor TTEP Manager; Contractor QA Manager; Assigned Stakeholders; Peer Reviewers
- Approval: EPA TOPOs; EPA QA Manager; Contractor TTEP Manager; Contractor QA Manager; Vendors
- Finals Distributed to: EPA Program Manager; Contractor TTEP Manager; Contractor QA Manager; EPA TOPO; EPA QA Manager; Technical staff; Vendors

Test/QA Plan Amendments
- Preparation/Updating: Contractor TOL
- Review: Contractor TTEP Manager or designee; Contractor QA Manager
- Approval: EPA TOPO*; Contractor TTEP Manager or designee; Contractor QA Manager
- Finals Distributed to: Contractor TTEP Manager; Contractor or Subcontractor Technical Staff; Contractor QA Manager; Contractor TAL; Applicable EPA TOPOs; EPA QA Manager; Vendors

Test/QA Plan Deviations
- Preparation/Updating: Contractor TOL
- Review: Contractor TTEP Manager or designee; Contractor QA Manager
- Approval: Contractor TTEP Manager or designee; Contractor QA Manager
- Finals Distributed to: Contractor TTEP Manager; Contractor QA Manager; documented in evaluation report (EPA can request copies)

Raw Data
- Preparation/Updating: Contractor or Subcontractor
- Review: Internal Technical Peer Review
- Approval: No approval required

Evaluation Report
- Preparation/Updating: Contractor or Subcontractor
- Review: EPA TOPO; NHSRC QA Manager or designee; Contractor TTEP Manager; Appropriate Contractor Technology Area Leader (TAL); Contractor QA Manager; Vendor; Peer Reviewers
- Approval: EPA TOPO; EPA QA Manager
- Finals Distributed to: Applicable EPA TOPOs; Vendors

EPA QA Reviews/Audit Reports
- Preparation/Updating: EPA QA Manager or designee
- Review: EPA Task Order Project Officer; Contractor TTEP Manager; Contractor QA Manager
- Approval: EPA TOPOs
- Finals Distributed to: EPA Program Manager; Contractor TTEP Manager; Contractor QA Manager

Contractor QA Reviews/Audit Reports
- Preparation/Updating: Contractor QA Manager or designee
- Review: Contractor TTEP Manager; Appropriate Contractor TAL
- Approval: No approval required
- Finals Distributed to: EPA TOPOs; EPA QA Manager

* TOPO approval of amendments only when a contractually required deliverable.

Review/Approval. Record review/approval is performed by qualified technical and/or management personnel and the contractor QA Manager, as appropriate. The individual reviewer has access to all needed references.
All draft test/QA plans and evaluation reports are required to be reviewed by at least one technical reviewer and by the contractor QA Manager and contractor TTEP Manager, or their designees at a subcontractor, as appropriate, prior to external distribution to the EPA.

Distribution. Once records are reviewed and approved as required, distribution is made through a distribution list maintained as part of the document. Program documents specifically requiring EPA approval before release include:
- QMP for the Technology Testing and Evaluation Program (this document)
- Test/QA plans
- Evaluation reports.

5.2 DOCUMENT CONTROL

Document control is the system that ensures that only the latest revisions of controlled documents are used by staff participating in TTEP. The system includes retention of the document with original signed page(s) in a limited-access storage area, a unique numbering system for all documents (typically identified by revision number and/or date), and an issue list for each document. Such documents are defined as "controlled documents" and can be revised only by the personnel listed within each document or this QMP. The following is a list of the controlled documents under this QMP:
- QMP for the Technology Testing and Evaluation Program
- Standard Operating Procedures
- Test/QA Plans, including amendments and deviations.

Controlled document identification consists of a number, date, and version, if applicable, assigned to the document by the contractor QA Manager or designee. A current Master List of Controlled Documents and Distribution is maintained by the contractor QA Manager. As a controlled document, approved copies of the QMP are maintained and issued to TTEP key staff by the contractor QA Manager or designee.

Obsolete or superseded documents are removed from operations when new documents are provided. Notification accompanies new document versions that the previous version is to be removed from use and destroyed. Staff members are responsible for destroying outdated versions of documents assigned to them. The contractor QA Manager is authorized to remove outdated documents observed during inspections and reviews. All controlled documents, including historical revisions, are retained as records per Section 5.1.1, with the exception of SOPs, which are permanently archived in the QA records at the contractor's or their subcontractor's facility.

6.0 COMPUTER HARDWARE AND SOFTWARE

This QMP requires that TTEP staff understand the necessity for all computer hardware and software specifications. Staff must utilize computer hardware and software within the acceptance criteria specified, and assure that hardware and software are installed, maintained, and used according to specifications. Any time a change in hardware components or configuration or a software modification is needed, retesting and recalibration must be performed and documentation included with facility records. The description below of conformance to user and EPA requirements for hardware and software pertains to the contractor and any subcontractors.

6.1 CONFORMANCE TO USER AND EPA REQUIREMENTS

The contractor's Information Management (IM) department is responsible for determining the standard machine configurations used across the contractor's facility. The standard configurations are leading-edge or near-leading-edge computers.
PCs attached to analytical instruments have all raw data backed up onto DVD, CD, or ZIP disc at a minimum quarterly (more frequently depending on instrument use and/or hard drive space). All backed-up data are stored with test records if the backup is for a specific evaluation, or in the laboratory as part of a general backup covering multiple projects.

Hardware. All computer hardware at the contractor's facility contains Intel-based Pentium processors running a Microsoft operating system. It is expected that the contractor's computer hardware is continually upgraded to improve performance and provide complete compatibility with current standards. Documentation of assessments and upgrades is maintained by the contractor's IM department.

Software. Each PC is primarily equipped with a standard complement of Microsoft software (e.g., Word, Excel, Access, PowerPoint, and Outlook) with capabilities of running other commercial software (e.g., WordPerfect, Quattro, Lotus, SAS) and delivering data in any standard format. Specific software required for an evaluation is identified in the test/QA plan. Most software used by the contractor is acquired commercially, loaded, and tested as specified by the publisher. Independently developed software is not used within the program; only commercial products are used. Software used for data management activities includes Microsoft Excel and Access. Standard word processing software (e.g., WordPerfect, Word) is used to create reports.

6.2 CONFIGURATION TESTING

The extent to which commercially available hardware and software are validated before use on technology evaluations in TTEP will vary from one product to another. Because wide public use and continued market viability can be considered proof of software dependability, configuration testing is not considered necessary. However, it is the policy of TTEP to verify that data analysis techniques (e.g., formulas, macros) are accurately and correctly coded within custom-designed programs. For spreadsheets, databases, or other user-produced programs prepared within TTEP, the individual who prepared the program is to complete a computer performance test form, which includes the following information:
- An overview of the application. The overview describes what the application is required to do and specifies the methods used to meet the predetermined requirements
- References to the productivity software used (e.g., Excel, SAS) and the operating system (e.g., Windows 2000, Windows NT)
- A description of important equations used to derive data
- A description of what test(s) were conducted to confirm the accuracy of the application.

The performance test form is to be completed before the use of the application and must be in the program files before any Audit of Data Quality (ADQ).

6.3 CHANGE ASSESSMENT

This program does not expect any software development; therefore, no change in hardware components or configuration, or software modification, is expected.

6.4 RETESTING AND REDOCUMENTATION

In the event that a change in hardware components or configuration or a software modification is needed, retesting and recalibration will be performed, and documentation of this process must be included with program records. If there are changes within spreadsheets or databases that were previously checked according to Section 6.2 above, then the spreadsheet or database would need to undergo the performance test again.
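The verification of custom formulas described in Section 6.2 amounts to recomputing a spreadsheet-derived result by an independent route and confirming agreement. The short Python sketch below illustrates one way such a check could be performed and recorded on a performance test form; the workbook name, worksheet layout, and agreement tolerance shown here are hypothetical examples, not TTEP-prescribed values.

```python
# Illustrative sketch only: independently recompute a statistic that a project
# spreadsheet derives and compare it with the spreadsheet's reported value.
# The workbook name, cell layout, and tolerance below are hypothetical.
import statistics

import openpyxl  # third-party package for reading .xlsx workbooks

TOLERANCE = 1e-6  # hypothetical agreement criterion

workbook = openpyxl.load_workbook("evaluation_data.xlsx", data_only=True)
sheet = workbook["Replicates"]

# Assumed layout: replicate results in column A (rows 2-8); the spreadsheet's
# computed percent relative standard deviation (%RSD) is reported in cell B2.
replicates = [sheet.cell(row=r, column=1).value for r in range(2, 9)]
reported_rsd = sheet.cell(row=2, column=2).value

# Independent recomputation of %RSD from the raw replicate values
recomputed_rsd = 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)

difference = abs(recomputed_rsd - reported_rsd)
print(f"Reported %RSD: {reported_rsd:.4f}   Recomputed %RSD: {recomputed_rsd:.4f}")
print("PASS" if difference <= TOLERANCE else "FAIL: investigate the spreadsheet formula")
```

The result of such a check, along with the equations tested, would be recorded on the performance test form kept in the program files.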
7.0 PROGRAM PLANNING

In the planning stage for each technology evaluation, Technology Area Leaders, working with EPA, stakeholders (as necessary or applicable), and the technology vendors, determine the type, quality, and quantity of data that will be necessary to meet the needs of the EPA. A systematic process is the foundation of the planning stage.

7.1 PLANNING AND DOCUMENTING THE GENERATION, ACQUISITION, AND USE OF ENVIRONMENTAL DATA

The systematic planning to prepare for a technology evaluation is illustrated in Figure 7-1. The planning process for a technology evaluation starts with EPA, the contractor, and the stakeholders identifying a technology category. Once a technology category has been identified, the identification and recruitment of potential vendors begins. Once all participating vendors are identified, the test/QA plan is developed with input from EPA, stakeholders, and the vendors, as appropriate. Each test/QA plan is first reviewed internally, then reviewed by the EPA TOPO, then reviewed by the vendor (depending on the task order and whether the vendor is voluntarily participating), after which it is subjected to peer review by one EPA and one non-EPA reviewer. After the test/QA plan is finalized, signed by EPA and contractor representatives, and in place at the technology evaluation location, the evaluation can begin.

During development of the test/QA plan, how environmental data are generated, acquired, and used is planned and documented. This process is similar to the seven-step Data Quality Objectives (DQO) Process presented in the Guidance for the Data Quality Objectives Process (EPA QA/G-4). The seven steps of the DQO process are:
1. State the Problem
2. Identify the Questions
3. Identify the Types of Information Needed
4. Establish Program Design Constraints
5. Specify Information Quality
6. Specify a Strategy for Information Synthesis
7. Optimize the Design for Collecting Information.

Once the problem is stated and the questions and target audience are identified, the data planning and documenting process establishes performance or acceptance criteria that detail the type and amount of data or other information needed to fulfill the evaluation's objectives, the levels of quality that these data need to achieve, how the data are analyzed to address the evaluation's objectives, and how uncertainty or variability associated with statistics calculated from the data will be estimated. Performance criteria will be placed on new ("primary") data to be collected for the evaluation's specific use.
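To illustrate how performance or acceptance criteria established during systematic planning can be checked against collected data, the sketch below computes three commonly used data quality indicators (precision, bias, and completeness) and compares them with hypothetical measurement quality objectives; the data values and limits are examples only and would be defined in each evaluation's test/QA plan.

```python
# Illustrative sketch only: computes three common data quality indicators and compares
# them to hypothetical measurement quality objectives (MQOs). Actual MQOs, data, and
# acceptance criteria are defined in each evaluation's test/QA plan.
import statistics

# Hypothetical example data for a single analyte
replicate_results = [9.8, 10.1, 10.3, 9.9, 10.0]   # replicate measurements
reference_value = 10.0                              # certified reference concentration
valid_results, planned_results = 48, 50             # counts used for completeness

mqos = {"precision_rsd_pct": 15.0, "bias_pct": 10.0, "completeness_pct": 90.0}  # hypothetical

mean_result = statistics.mean(replicate_results)
indicators = {
    # Precision as percent relative standard deviation of the replicates
    "precision_rsd_pct": 100.0 * statistics.stdev(replicate_results) / mean_result,
    # Bias as percent deviation of the mean from the reference value
    "bias_pct": 100.0 * abs(mean_result - reference_value) / reference_value,
    # Completeness as percent of planned results that are valid
    "completeness_pct": 100.0 * valid_results / planned_results,
}

for name, value in indicators.items():
    limit = mqos[name]
    # Completeness must meet or exceed its MQO; precision and bias must not exceed theirs
    met = value >= limit if name == "completeness_pct" else value <= limit
    print(f"{name}: {value:.1f} (MQO {limit}) -> {'meets MQO' if met else 'does not meet MQO'}")
```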
7.2 IDENTIFYING AND DOCUMENTING ENVIRONMENTAL DATA NEEDED

The outcome of the systematic planning process contributes to the contractor TTEP Manager's and Technology Area Leader's decisions on the types of data collection and analysis methods to implement, by determining what approaches are necessary in order to yield data that meet these quality criteria. This planning process also contributes to establishing data quality assessment procedures that are used to evaluate the collected data based upon whether they actually achieved the performance or acceptance criteria relative to their intended use.

Figure 7-1. Systematic Planning of Technology Evaluations. The figure shows the following planning steps: Identification of Technology Categories for Evaluation (input from EPA/Stakeholders/Contractor); Recruitment of Vendors (performed mostly by Contractor); Draft Test/QA Plan (input from Contractor/EPA/Stakeholders/Vendors*); Vendor Review of Test/QA Plan* (review coordinated by Contractor); Peer and QA Review of Revised Test/QA Plan (reviews coordinated by Contractor or EPA); Coordination with Host Facility/Subcontractors (performed by Contractor); Final Test/QA Plan (final revisions made by Contractor); and Finalize Host Facility/Subcontractor Roles (performed by Contractor), leading to the Technology Evaluation. (*Optional; depends on task order and whether vendor participation is voluntary.)

The outcome of the systematic planning process, along with information on the data collection and assessment methods determined during this planning process, is documented in the test/QA plan. The test/QA plan also delineates the management authorities, personnel, schedule, policies, and procedures associated with data collection identified during the planning of the evaluation. If waste is expected to be generated as part of a test, the procedures for minimization and disposal in accordance with local, state, and Federal laws will be included in each test/QA plan.

Occasionally, existing data that were originally collected for some other use can be utilized. In this case, the systematic planning process requires the following components of the acquisition and analysis process for existing data: 1) determining the data needs for the technology evaluations; 2) identifying existing data sources that might meet those needs; 3) evaluating existing data relative to the evaluation's data quality specifications; and 4) documenting quality issues in test/QA plans and the final report.

When the systematic planning process establishes criteria on the quality of data utilized on a particular evaluation, it also typically establishes measurement quality objectives (MQOs) that address the quality of individual data values. MQOs specify acceptable levels for indicators of selected data quality attributes such as precision, bias, representativeness, comparability, completeness, and sensitivity. Appropriate QA and QC procedures, such as the collection and analysis of certain types of QC samples, are also defined that provide information on whether MQOs and other quality performance criteria are being achieved in the measurement process.

7.3 KEY USERS, CUSTOMERS, AND TECHNICAL STAFF

Evaluation planning is coordinated by the contractor among the participating organizations, including EPA, the program stakeholders, the technology vendors (if they are participating voluntarily), and any organizations that may be supporting or collaborating in the evaluation, e.g.,
by providing a test site. The contractor, with the concurrence and oversight of the EPA Program Manager, identifies the planning roles of the participants and conducts planning activities by shared communication via teleconference, video conference, and in-person meetings, as appropriate, and within the constraints of the budget.

7.4 REVIEW AND APPROVAL OF PLANNING DOCUMENTATION

All documents require review by at least one technical reviewer and the contractor QA Manager and/or contractor TTEP Manager, as appropriate, prior to external distribution. Document and record reviews are performed at the request of the contractor TTEP Manager, contractor QA Manager, Technology Area Leaders, or other personnel.

8.0 IMPLEMENTATION

The Technology Area Leaders oversee the implementation stage of evaluations, which involves efforts to collect data according to the methods and procedures documented in technical documents (test/QA plans, SOPs, etc.) that are prepared or obtained and approved during the planning phase of the evaluation. The Technology Area Leaders provide technical staff with access to these planning documents, approved changes to planning documents, and all referenced documents. Work on individual technology evaluations is not initiated until the Technology Area Leaders notify the assigned technical staff that all official approvals have been obtained on the test/QA plan and the approved test/QA plan is in place. Current versions of test/QA plans and any applicable methods and SOPs are required to be physically in place at each evaluation testing site.

8.1 IMPLEMENTATION PROCEDURES

All operations are implemented according to specific procedures identified through discussions with the program team and the technology vendor during the systematic planning process, and these procedures are documented, along with their policy for use, in SOPs or other recommended documents. Time is allowed for the composition, review, and approval of technical procedures to be completed in advance of the actual performance. Review of evaluation-specific technical procedures is done by personnel technically competent with respect to the procedure. The Technology Area Leaders will work with the contractor TTEP Manager to identify qualified personnel to conduct such activities.

Before the initiation of an evaluation, a kick-off meeting is held by the Technology Area Leader. The contractor TTEP Manager, contractor QA Manager, and all pertinent technical staff who are involved in the evaluation attend the kick-off meeting. Subjects discussed at the meeting, at a minimum, are as follows:
- Management: planning documents; roles and responsibilities; staff assignments; evaluation schedule
- Quality Assurance: training records; internal TSA; chain-of-custody; amendments and deviations; PE audits (if applicable)
- Technical: experimental procedures and design; documentation
- Data: handling/review of data

Additional assessment kick-off topics are covered in Section 9.2.

When work cannot be implemented according to the approved test/QA plan, the contractor is responsible for providing a written amendment to the test/QA plan or a deviation report for the evaluation records. Amendments are produced for planned changes that are made to the test/QA plan before the proposed change is begun.
Amendments are approved internally by the Technology Area Leader, contractor QA Manager, and/or contractor TTEP Manager (if the Technology Area Leader is the originator). Following approval, the amendment is distributed to all internal personnel holding a copy of the parent test/QA plan and to the EPA TOPO and NHSRC QA Manager or designee. A deviation report is produced for any change to the test/QA plan that occurred during the evaluation. Deviation reports are retained in the evaluation test records and summarized in the evaluation report. Frequent deviations from established procedures should result in a retrospective review of the written document and possible revision. Amendments and deviations include all of the information displayed on the example forms shown in Appendix III; a hypothetical record structure based on the deviation form is sketched at the end of this section.

The Technology Area Leaders work with the contractor QA Manager to ensure that any revisions to procedures are properly documented and approved, that program personnel are properly notified of the revisions and are provided with the revised documentation, and that obsolete versions of the documentation are removed from work areas (as described in Section 5.2).

Technical staff document all implementation activities. Suitable documents include bound notebooks, field and laboratory data sheets, spreadsheets, computer records, and output from instruments (both electronic and hardcopy). All documentation is maintained as described in the planning documents. All implementation activities are traceable to the planning documents and to TTEP personnel.

Contractor oversight and assessment of an evaluation are provided by the contractor QA Manager or designee at intervals prescribed in each test/QA plan. This frequency, at a minimum, is once for each evaluation of a technology category. To verify full implementation of the test/QA plan, the assessment includes the testing process and any documentation associated with the process, such as sample tracking records; instrument maintenance and calibration; sample preparation and actual analysis; and data records. The contractor QA Manager provides a written report, verifies the completion of any corrective actions needed, and retains the original report with the permanent contractor QA records. The contractor TTEP Manager is included in the routing of the assessment results, and a written copy is provided to EPA. The oversight and assessment of the work process are done to ensure:
- Satisfactory performance based on requirements
- Required actions (as specified in implementation documents) are performed so that any specifications are met
- Preventive maintenance is performed and documented as specified in facility and evaluation records
- Calibrations are performed as planned and prescribed, and
- Corrective actions are implemented and documented as planned in response to items of nonconformance.

The contractor QA Manager receives copies of monthly or other periodic progress reports submitted to the EPA Program Manager in order to keep abreast of the status of each technology evaluation.
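As an illustration of the information such records carry, the sketch below (Python) mirrors the fields of the Appendix III deviation report form. This is a hypothetical representation for clarity only, not part of any actual TTEP records system, and the example values are invented.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List

@dataclass
class DeviationReport:
    """Hypothetical record mirroring the Appendix III deviation report form."""
    test_qa_plan_title_and_date: str
    deviation_number: int
    date_of_deviation: date
    description: str
    cause: str
    impact_on_evaluation: str
    corrective_action: str
    originated_by: str                                          # Task Order Leader
    acknowledged_by: List[str] = field(default_factory=list)    # TAL or QA Manager, TTEP Manager
    distribution_documented: bool = False

# Example entry; all values are invented for illustration.
report = DeviationReport(
    test_qa_plan_title_and_date="Example Test/QA Plan, 01/01/08",
    deviation_number=1,
    date_of_deviation=date(2008, 2, 15),
    description="Sample extraction started one day later than scheduled.",
    cause="Instrument maintenance overran its planned window.",
    impact_on_evaluation="None; holding times were still met.",
    corrective_action="Schedule revised and affected staff notified.",
    originated_by="Task Order Leader",
)
```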
8.2 STANDARD OPERATING PROCEDURES

See Section 2.1.2 for Standard Operating Procedures.

9.0 ASSESSMENT AND RESPONSE

9.1 SCOPE

Assessments (e.g., audits) are planned, scheduled, conducted, and reported in order to measure the efficacy of the TTEP quality system. Assessment and response elements include assigning appropriate, qualified persons to conduct assessments at planned, scheduled intervals; having provisions for timely responses and implementation of corrective actions if needed; and completing the evaluation process with written reports to technical and management staff. Assessment types and responsibilities are shown in Table 9-1 and defined as follows:

Quality Systems Audit (QSA) - an on-site review of the implementation of the TTEP quality system as documented in this QMP. This review verifies the existence of, and evaluates the adequacy of, the internal quality system. The contractor QA Manager and contractor TTEP Manager are responsible for ensuring that this audit is conducted on an annual basis. The EPA QA Manager has the option to conduct an independent QSA annually.

Technical Systems Audit (TSA) - a qualitative on-site evaluation of sampling and/or measurement systems associated with a particular evaluation. The objective of the TSA is to assess and document the acceptability of all facilities, maintenance, calibration procedures, reporting requirements, sampling and analytical activities, and quality control procedures in the evaluation. Conformance with the test/QA plan and associated methods and/or SOPs is the basis for this assessment. The contractor QA Manager, or designee, conducts a TSA at least once for each technology evaluation, as applicable. The NHSRC QA Manager or designee has the option to conduct an independent TSA at least once a year. For the NHSRC QA Manager to consider scheduling a TSA, he or she shall be notified about upcoming evaluations at least four weeks before they occur.

Performance Evaluation (PE) Audit - a quantitative evaluation of a measurement system. The type and frequency of PE audits performed by the Technology Area Leaders, or designee, are specified in the test/QA plan for each technology evaluation, as applicable. The value or composition of reference materials must be certified or verified prior to use, and the certification or verification must be adequately documented. Results of PE audits are reviewed by the contractor QA Manager, or designee, for conformance to the acceptance criteria specified in the applicable test/QA plan. The need for independent PE audits will be determined by the NHSRC QA Manager or designee.

Audit of Data Quality (ADQ) - an examination of the test data after they have been collected and 100 percent verified by technical staff. The contractor QA Manager, or designee, audits at least 10 percent of all technology evaluation data. The need for independent ADQs will be determined by the NHSRC QA Manager or designee.
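Because at least 10 percent of the verified data must be audited, one simple way to draw an audit subset is shown below (Python). The record list, fraction handling, and fixed seed are hypothetical and only illustrate the 10-percent selection; they do not represent an actual TTEP procedure.

```python
import math
import random

def select_adq_subset(record_ids, fraction=0.10, seed=None):
    """Randomly select at least `fraction` of verified data records for an ADQ."""
    n_audit = max(1, math.ceil(fraction * len(record_ids)))
    rng = random.Random(seed)
    return sorted(rng.sample(record_ids, n_audit))

# Hypothetical verified data records from one technology evaluation
verified_records = [f"REC-{i:04d}" for i in range(1, 238)]  # 237 records
audit_subset = select_adq_subset(verified_records, seed=42)
print(f"Auditing {len(audit_subset)} of {len(verified_records)} records "
      f"({100 * len(audit_subset) / len(verified_records):.1f}%)")
```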
Table 9-1. TTEP Assessments

Quality Systems Audit (Program level)
- Assessors: Contractor - Contractor QA Manager; EPA - NHSRC QA Manager
- Responders: Contractor
- Subject of assessment: Program QMP
- Minimum frequency: Contractor - annually; EPA - option, once per year
- Reason for assessment: Assess quality management practices of the program quality system

Technical Systems Audits (Task Order level)
- Assessors: Contractor - Contractor QA Manager or designee; EPA - NHSRC QA Manager or designee
- Responders: Contractor or Subcontractor
- Subject of assessment: Test/QA plans
- Minimum frequency: Contractor - once per technology evaluation; EPA - annually, as applicable
- Reason for assessment: Assess technical quality of evaluations

Performance Evaluation Audits (Task Order level)
- Assessors: Contractor - Contractor QA Manager or designee; EPA - NHSRC QA Manager or designee
- Responders: Contractor or Subcontractor
- Subject of assessment: Test/QA plans
- Minimum frequency: Contractor - each technology evaluation, as applicable; EPA - each technology evaluation, as applicable
- Reason for assessment: Assess measurement performance

Audits of Data Quality (Task Order level)
- Assessors: Contractor - Contractor QA Manager or designee; EPA - NHSRC QA Manager or designee
- Responders: Contractor or Subcontractor
- Subject of assessment: Raw data and summary data
- Minimum frequency: Contractor - at least 10% of the data in each technology evaluation; EPA - for each technology evaluation, as applicable
- Reason for assessment: Assess data calculations and reporting

9.2 ASSESSMENT PLANNING AND PROCEDURES

Assessment planning is performed by the contractor QA Manager and the contractor TTEP Manager prior to the performance of assessments. Planning the assessment scope helps provide the information needed to determine whether procedural compliance and technical requirements are being met during each evaluation. Assessment planning by the contractor includes a kick-off meeting with the technical staff assigned to the technology evaluation at which at least the following information is discussed:
- Schedule of assessment(s)
- Notification to affected parties
- Specific assessment requirements (training records, methods and/or SOPs, and availability of test/QA plans)

9.3 PERSONNEL QUALIFICATIONS FOR ASSESSMENT

The principal assessor for TTEP is the contractor QA Manager, who has an extensive quality assurance laboratory and field inspection background, technical and management experience, and direct familiarity with the program's assessment requirements. Should the need arise, the contractor QA Manager will designate an individual to perform scheduled assessments, based upon that person's technical skill and knowledge of QMP compliance requirements and test/QA plan specifications. TTEP personnel conducting assessments have the responsibility and authority to:
- Identify and document problems affecting the quality of test results
- Propose recommendations for resolving these problems
- Independently confirm implementation and effectiveness of solutions.

9.4 RESPONSIBILITY AND AUTHORITY TO STOP WORK

Responsibility and authority to stop work during an evaluation for safety and quality considerations are delegated to the contractor TTEP Manager, or subcontractor designee, who must ensure compliance with all applicable federal, state, and local safety policies during the performance of an evaluation. In addition, should any technical staff suspect compromise to personal health or test objectives during the conduct of a technology evaluation, that staff member shall immediately contact the contractor TTEP Manager, who shall have the authority to issue the stop work order.
Further, should an EPA TOPO or EPA QA Manager become aware that safety measures or data quality are being compromised during an evaluation, he or she shall report this to the EPA TTEP Program Manager. The EPA TTEP Program Manager shall contact the EPA Contracting Officer to implement a stop work order. Should it be determined during an assessment that adverse health effects could result, or that test objectives of acceptable quality cannot be achieved during performance of an evaluation, the contractor QA Manager is responsible for immediately notifying the contractor TTEP Manager of the need to consider a stop work order. The contractor TTEP Manager shall then direct the technical staff accordingly. Any stop work order and the corrective action implemented must be documented, and that documentation shall be maintained as part of the contractor quality records, with a copy provided to the EPA Program Manager and EPA QA Manager.

9.5 DOCUMENTATION, REPORTING, AND REVIEW

Authority to effectively report internal TSAs, PE audits, and ADQs is assigned to the contractor QA Manager or designee. The outcome of each assessment is fully documented. The contractor QA Manager archives all assessment documentation collected for TTEP. The contractor QA Manager reports the findings of each assessment to the appropriate level of management, who then addresses the assessment findings and provides an appropriate response. Quality assessment reports require a written response by the person performing the inspected activity and acknowledgment of the assessment by the contractor TTEP Manager and the Technology Area Leader. Assessment reports:
- Identify and document problems that affect quality and the achievement of objectives required by the QMP, test/QA plan, and any associated SOPs
- Identify and cite noteworthy practices that may be shared with others to improve the quality of their operations and products
- Propose recommendations (if requested) for resolving problems that affect quality
- Independently confirm implementation and effectiveness of solutions
- Provide documented assurance (if requested) to line management that, when problems are identified, further work performed is monitored carefully until the problems are suitably resolved.

9.6 RESPONSES AND FOLLOW-UP ACTIONS

Findings in TSA reports that have a direct impact on the conduct of a technology evaluation are corrected immediately following notification of the finding. Documented responses to TSA adverse findings are typically provided within 10 working days after the TSA is completed. Findings in ADQ or QSA reports that have a direct impact on the quality of results or the operations of the program are corrected immediately following notification of the finding. Documented responses to ADQ and QSA adverse findings are typically provided within 10 working days after completion of the audit.

Normally, results of PE audits are assessed during a TSA, and any findings are addressed as part of the TSA. If the PE audit is not performed by the Technology Area Leader, or designee, during the TSA, then the Technology Area Leader supplies the contractor QA Manager, or designee, with the results for review against the acceptance criteria; in that case, any adverse findings are handled in the same way as TSA findings.

Responses to each adverse finding are documented in the assessment report (QMP Section 9.5). Assessment reports provide space after each adverse finding for a response to be recorded. The response indicates the corrective action taken or planned to address the adverse finding. The response is signed and dated by the staff responsible for implementing the corrective action. Corrective actions that cannot be immediately implemented are verified following completion by the contractor QA Manager or designee. Once all corrective actions associated with an assessment report have been taken, the contractor QA Manager or designee initials the corrective action in the assessment report, thus documenting verification of the corrective action. Any impact that an adverse finding had on the quality of evaluation data is addressed in the evaluation report. The TSA assessment report, with responses to adverse findings recorded within, is sent to EPA within 10 working days after the contractor QA Manager has verified all corrective actions.
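For scheduling purposes, the 10-working-day response window can be computed as in the brief sketch below (Python). It counts weekdays only and ignores holidays, which is a simplifying assumption rather than a TTEP rule; the example completion date is invented.

```python
from datetime import date, timedelta

def response_due_date(completion_date, working_days=10):
    """Date by which a documented response is due, counting weekdays only.

    Holidays are ignored here for simplicity; an actual schedule would
    account for them.
    """
    current = completion_date
    remaining = working_days
    while remaining > 0:
        current += timedelta(days=1)
        if current.weekday() < 5:  # Monday through Friday
            remaining -= 1
    return current

# Example: an audit completed on Friday, 2008-02-01
print(response_due_date(date(2008, 2, 1)))  # -> 2008-02-15
```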
10.0 VALIDATION OF DATA USABILITY

10.1 ASSESSING, VERIFYING, AND QUALIFYING DATA

The assessment, verification, and qualification of data collected for technology evaluations are performed through the use of audits, statistical analyses, and quality control (QC) samples. The specific assessments, statistical analyses, and QC samples used for an evaluation are specified in the test/QA plan and are summarized below.

During the course of evaluations, the assessments used during the collection of data are TSAs and PE audits. TSAs are performed by the contractor QA Manager, or designee, to ensure compliance with the methods and/or procedures specified in the test/QA plan. PE audits are performed by the Technology Area Leader, or designee, on those measurements that factor into the data used for the evaluation (i.e., the vendor instruments undergoing evaluation are not the subject of the PE audit). After collection of data, all results are 100% verified by the appropriate Task Order Leader, and 10% are verified by the contractor QA Manager, or designee. Validation of the data is performed through the use of various statistical measurements including, but not limited to, accuracy, precision, linearity, comparison to reference standards or methods, and completeness (see the illustrative sketch at the end of this section). Finally, data are qualified through the use of QC samples, including, but not limited to, blanks, spikes, duplicates, and standard reference materials (SRMs). To ensure the validity of the standards used for spikes and SRMs, the standards and SRMs are certified and traceable (where applicable).

10.2 DOCUMENTING LIMITATIONS ON DATA

Any limitations on the data, and recommendations for limitations on their usability, are documented initially in the test/QA plan for each evaluation and discussed in the final report.

10.3 INDEPENDENT REVIEW OF REPORTS

All documents produced by TTEP are peer reviewed and administratively reviewed as necessary and as described in EPA's Peer Review Handbook (EPA/100/B-06/002) and the internal NHSRC SOPs for peer review.

10.4 MANAGEMENT APPROVAL

Review and approval procedures for evaluation data and reports are given as part of the records management responsibilities in Table 5-1.
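As a concrete illustration of the statistical measures named in Section 10.1, the sketch below (Python) computes accuracy against reference values, precision of duplicate results, and linearity of a response. The measured, reference, and duplicate values are invented for illustration and do not come from any TTEP evaluation.

```python
from statistics import mean

# Invented example data for one measured parameter
reference = [2.0, 5.0, 10.0, 20.0, 40.0]      # reference standard concentrations
measured  = [2.1, 5.2,  9.7, 19.4, 41.0]      # technology/instrument results
duplicates = [(9.7, 10.1), (19.4, 19.9)]      # duplicate analyses

# Accuracy: mean percent recovery relative to the reference values
recovery = mean(100.0 * m / r for m, r in zip(measured, reference))

# Precision: relative percent difference (RPD) of each duplicate pair
rpd = [200.0 * abs(a - b) / (a + b) for a, b in duplicates]

# Linearity: coefficient of determination (r^2) of measured vs. reference
n = len(reference)
sx, sy = sum(reference), sum(measured)
sxx = sum(x * x for x in reference)
syy = sum(y * y for y in measured)
sxy = sum(x * y for x, y in zip(reference, measured))
r = (n * sxy - sx * sy) / (((n * sxx - sx**2) * (n * syy - sy**2)) ** 0.5)

print(f"Mean recovery: {recovery:.1f}%")
print(f"Duplicate RPDs: {[f'{v:.1f}%' for v in rpd]}")
print(f"Linearity r^2: {r**2:.4f}")
```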
11.0 QUALITY SYSTEM IMPROVEMENT

A continuous quality improvement process is considered essential for TTEP staff to develop a more responsive quality system in all aspects of technical and management activities.

11.1 QUALITY IMPROVEMENT PROCESS

To ensure that TTEP can consistently achieve its goal of designing and operating high-quality evaluations that meet the needs and requirements of the EPA, contractor and EPA TTEP management continually identify ways in which the quality system can be improved. Quality processes are continually monitored, and both short-term and long-term quality issues are identified through the regular quality reviews and audits that are performed.

11.1.1 Annual QMP Review

An annual review of the QMP for TTEP is conducted by the contractor QA Manager and technical and management staff in order to incorporate improvements to the quality system process. See Section 2.2 for additional information on the annual QMP review.

11.1.2 Annual Quality Systems Audit

In addition to the annual QMP review, a QSA is performed internally each year by the contractor QA Manager, and the NHSRC QA Manager has the option to conduct an independent QSA. See Section 9.1 for additional information on QSAs.

11.2 PREVENTING, DETECTING, AND CORRECTING QUALITY SYSTEM PROBLEMS

Detecting and correcting quality system problems is a result of qualified TTEP technical and management staff implementing not only this QMP but also the test/QA plans and other procedures. All staff are encouraged to identify problems and offer solutions in the following quality areas:
- Adequacy of the quality system, as defined in the QMP
- Consistency of the quality system
- Implementation of the quality system for specific evaluations
- Correction of quality system procedures
- Completeness of documented information
- Quality of data
- Quality of planning documents, such as the test/QA plans
- Implementation of the work process.

Cause-and-effect relationships of significant problems are documented by the contractor QA Manager. When significant problems are reported to the contractor QA Manager, attempts to determine the root cause during the performance of planned and documented procedures are made through intensified observations of technology evaluation activities and audits of technology evaluation data. Collaboration with trained technical/management staff associated with or performing the activity can provide insight and determine whether any of the following is required:
- A test/QA plan change
- A management system change
- A quality system change within TTEP.

Assessment reports also serve as tools to determine cause-and-effect relationships for significant problems that might require technology evaluation, management system, or quality system changes. For example, the repeated inclusion of the same problem or problem resolution in the various quality assessment reports may indicate to EPA or the contractor TTEP Manager that a program-wide change is warranted. This might include corrective actions taken by the contractor or a modification to the quality system requested by EPA, whichever is most appropriate. The root cause determination is immediately reported by the contractor to EPA prior to any planned implementation of preventive measures. Once the root cause determination is verified, appropriate actions can be planned, documented, and implemented by TTEP staff.

11.3 RESPONSE ACTIONS

The contractor QA Manager reports the findings of the QSA, in a format similar to that described in Section 9.5, to the contractor TTEP Manager and Technology Area Leaders.
Responses to identified quality system problems are addressed in a format similar to that described in Section 9.6. The QSA report, with responses to adverse findings recorded within, is sent to EPA within 10 working days after the contractor QA Manager has verified all corrective actions.

APPENDIX I

NAMES, DESCRIPTIONS, ADDRESSES, AND PHONE NUMBERS OF KEY TTEP STAFF

KEY TTEP STAFF

TTEP is managed within Battelle, which includes approximately 16,000 scientists, engineers, and support personnel. Staff and facilities are drawn from within Battelle to support the program. RTI International has provided staff and facilities for TTEP as a major subcontractor to Battelle; however, that work is currently inactive. Listed below are the key TTEP staff and their contact information.

Battelle TTEP Manager: Ms. Karen Riggs is the Battelle TTEP Manager. She has over 25 years of experience in managing contracts and projects for EPA and is very familiar with EPA contractual, reporting, and quality assurance requirements. Ms. Riggs has managed two multi-year task order contracts for NHSRC's Safe Buildings Program and served as program manager for EPA's ETV program for 8 years. Ms. Riggs has extensive experience in the development and implementation of QMPs and test/QA plans for EPA.

Ms. Karen Riggs
Battelle
505 King Avenue
Columbus, OH 43201
Phone: 614-424-7379
Fax: 614-424-3638
email: riggsk@battelle.org

Battelle Technology Area Leaders: Dr. Ryan James, Dr. Thomas Kelly, Ms. Deborah Franke, and Dr. Michael Taylor are the TTEP Technology Area Leaders (TALs) and have responsibility for planning and leading evaluations within their respective technology areas. Dr. James is TAL for Water Security, Dr. Taylor for Building Decontamination, Dr. Kelly for Building Detection, and Ms. Franke for Building Filtration/Air Cleaning. Each of them reports directly to Ms. Riggs on their TTEP activities.

Dr. Ryan James
Battelle
505 King Avenue
Columbus, OH 43201
Phone: 614-424-7954
Fax: 614-424-3638
email: jamesr@battelle.org

Dr. Thomas J. Kelly
Battelle
505 King Avenue
Columbus, OH 43201
Phone: 614-424-3495
Fax: 614-424-3638
email: kellyt@battelle.org

Dr. Michael Taylor
Battelle
Suite 155
10300 Alliance Rd
Cincinnati, OH 45242
Phone: 513-265-2600 ext. 15
Fax: 513-265-2610
email: taylorm@battelle.org

Ms. Deborah Franke*
RTI International
3040 Cornwallis Rd
Research Triangle Park, NC 27709
Phone: 919-541-6826
Fax: 919-541-6936
email: dlf@rti.org
* Currently inactive

Quality Assurance Manager: Mr. Zachary Willenberg is the Quality Assurance (QA) Manager for TTEP. He is a QA Officer for Battelle, and in his capacity as the program's QA Manager he will report to Mr. Martin Toomajian, a Senior Manager at Battelle. Mr. Willenberg will keep Mr. Toomajian informed of all program-level quality items and issues.

Mr. Zachary J. Willenberg
Battelle
505 King Avenue
Columbus, OH 43201
Phone: 614-424-5795
Fax: 614-424-3638
email: willenbergz@battelle.org

Stakeholder Involvement Leader: Ms. Rachel Sell is the TTEP Stakeholder Involvement Leader with primary responsibility for stakeholder recruitment, communications, and meetings. She also reports directly to Ms. Riggs on this program.

Ms.
Rachel Sell Battelle 505 King Avenue Columbus, OH 43201 Phone: 614-424-3579 Fax: 614-458-3579 email: sellr@battelle.org ------- APPENDIX II APPENDIX B QAPP FORMAT PAGES FROM NHSRC QMP ------- QAPP REQUIREMENTS FOR SAMPLING AND ANALYSIS PROJECTS A sampling and analysis activity or project is typically defined as a study performed to generate data to either monitor parameters on a routine basis or to characterize a particular population for later studies. The following requirements should be addressed as applicable. SECTION 1.0, PROJECT DESCRIPTION AND ORGANIZATION 1.1 The purpose of the study shall be clearly stated in the sampling and analysis plan (SAP). 1.2 Responsibilities and points of contact for each organization shall be identified in the SAP. This should include identification of key personnel and/or organization(s) responsible for sample collection and custody, analytical and/or process measurements, data reduction, report preparation, and quality assurance. SECTION 2.0, SAMPLING 2.1 Sampling points for all measurements (i.e., analytical, physical, and process, including locations and access points) shall be identified in the SAP whenever possible. If the specific locations cannot be identified at the time of plan generation, discuss the documentation and/or communication mechanism(s) for ensuring adequate information is captured to later identify sampling points. 2.2 The anticipated sampling frequency (e.g., how many sampling events and how often events occur), the number of sample types (e.g., metals, VOCs, SVOCs, etc.), and the minimum number of samples of each type taken at each event shall be provided. 2.3 The expected measurements (i.e., specific analytes) planned for each sample type shall be summarized. 2.4 If applicable, known site-specific factors that may affect sampling procedures shall be described. 2.5 If applicable, any site preparation (e.g., sampling device installation, sampling port modifications) needed prior to sampling shall be described. 2.6 Each sampling procedure (including a list of equipment needed and the calibration of this equipment as appropriate) to be used shall be discussed or referenced. Maintenance requirements/procedures (as appropriate) must also be addressed in this section. 2.7 If compositing or splitting of samples is planned, the applicable procedures shall be described. 2.8 A list of sample quantities to be collected, and the sample amount required for each analysis, including QC sample analysis, shall be specified. B-2 NHSRC QMP - August 2003 ------- 2.9 Containers used for sample collection, transport, and storage for each sample type shall be described. 2.10 Sample preservation methods (e.g., refrigeration, acidification, etc.) shall be described. 2.11 Requirements for shipping samples shall be described. 2.12 Holding times requirements shall be noted. 2.13 Procedures for tracking samples in the laboratory and for maintaining chain-of-custody when samples are shipped shall be described. COC procedures shall be described to ensure that sample integrity is maintained (labeling, seals, records). 2.14 Information to be recorded and maintained by field personnel shall be discussed. SECTION 3.0, TESTING AND MEASUREMENT PROTOCOLS 3.1 Each analytical method to be used shall be referenced. This includes EPA-approved and other validated nonstandard methods. 3.2 If applicable, modifications to EPA-approved or other validated nonstandard methods shall also be described. 
SECTION 4.0, QA/QC CHECKS
4.1 The SAP shall list and define all calibrations and QC checks and/or procedures used for the project, both field and laboratory, as needed.
4.2 For each specified calibration, QC check, or procedure, required frequencies and acceptance criteria shall be included.

SECTION 5.0, DATA REDUCTION AND REPORTING
5.1 Data reduction procedures specific to the project, and also specific to each organization, shall be summarized.
5.2 The reporting requirements (e.g., units, reporting method [e.g., wet or dry]) for each measurement and matrix shall be identified.

SECTION 6.0, REPORTING REQUIREMENTS
The deliverables expected from each organization responsible for field and/or analytical activities shall be described.

QAPP REQUIREMENTS FOR BASIC RESEARCH PROJECTS

A basic research project is a study performed to generate data used to evaluate unproven theories, processes, or technologies.

SECTION 1.0, PROJECT OBJECTIVES AND ORGANIZATION
1.1 State the project objectives.
1.2 Identify the responsibilities of all project participants (e.g., QAPP preparation, sample collection and analyses, data reduction/validation/analysis, report preparation, QA).

SECTION 2.0, EXPERIMENTAL APPROACH
2.1 Describe the process, site, facility, apparatus, and/or environmental system to be tested.
2.2 Describe all known or pre-established test conditions and variables, including replicate experimental runs.
2.3 Describe the planned approach (statistical and/or non-statistical) for evaluating project objectives (i.e., data analysis).

SECTION 3.0, SAMPLING AND MEASUREMENT APPROACH AND PROCEDURES
3.1 Complete a table similar to the following to summarize the experimental sampling strategy to be used. Table columns: Sample/Measurement Location | Matrix | Measurement | Frequency | Experimental QC(1) | Total No. Samples.
(1) QC samples generated during the experiment, as applicable (e.g., blanks, replicate samples, spikes).
3.2 Complete a table similar to the following to summarize the experimental sampling and analytical procedures to be used. Table columns: Matrix | Measurement | Sampling/Measurement Method(1) | Analysis Method(1) | Sample Container/Quantity of Sample | Preservation/Storage | Holding Time(s)(2).
(1) Provide details in text, as necessary, if a standard method or SOP cannot be referenced.
(2) Both to extraction and analysis, if applicable.

SECTION 4.0, QA/QC CHECKS
Complete a table similar to the following to summarize QA/QC checks. Table columns: Matrix | Measurement | QA/QC Check(1) | Frequency | Acceptance Criteria | Corrective Action.
(1) Include all QA/QC checks (experimental and analytical, as applicable) for accuracy, precision, detection limits, mass balance, etc. (e.g., matrix spikes, lab control samples, blanks, replicates, surrogates).

SECTION 5.0, DATA REPORTING
Describe data reduction procedures specific to the project.

SECTION 6.0, REFERENCES
Provide references to methods and germane prior publications.

IN ADDITION, WHEN APPLICABLE...
If bulk sample(s) will be collected in the field for use in laboratory experiments, include applicable information from Section 2.0 of QAPP Requirements for Sampling and Analysis Projects.
List all project-specific target analytes (i.e., when a class of compounds is specified in the table).
Indicate if reporting is on a wet or dry weight basis (solid matrices only).
Describe the method used to establish steady-state conditions.
Describe how sampling equipment is calibrated.
Describe how cross-contamination between samples is avoided.
Describe the procedures used to collect representative samples. Describe sample packing and shipping procedures. Describe instrument calibration procedures and acceptance criteria if not included in a referenced method or SOP. B-5 NHSRC QMP - August 2003 ------- QAPP REQUIREMENTS FOR APPLIED RESEARCH PROJECTS An applied research project is a study to demonstrate the performance of technologies under defined conditions. These studies are often pilot- or field-scale. The following requirements should be addressed as applicable. SECTION 0.0, DISTRIBUTION LIST A distribution list shall be provided to facilitate the distribution of the most recent current version of the QAPP to all the principal project participants. SECTION 1.0, PROJECT DESCRIPTION AND OBJECTIVES 1.1 The purpose of study shall be clearly stated. 1.2 The process, site, facility, and/or environmental system to be tested shall be described. 1.3 Project objectives shall be clearly stated and identified as primary or non-primary. SECTION 2.0, PROJECT ORGANIZATION 2.1 Key points of contact for each organization involved in the project shall be identified. 2.2 All QA Managers and their relationship in the organizations (i.e., location within each organization) shall be identified with evidence that the QA Manager is independent of project management. 2.3 Responsibilities of all other project participants and their relationship to other project participants shall be identified, meaning that organizations responsible for planning, coordination, sample collection, sample custody, measurements (i.e., analytical, physical, and process), data reduction, data validation, and report preparation shall be clearly identified. SECTION 3.0, EXPERIMENTAL APPROACH 3.1 The general approach and the test conditions for each experimental phase shall be provided. The statistical methods that will be used to evaluate the data (i.e., ANOVA, or summary statistics) should be identified. (NOTE: As deemed appropriate to the project by the TLP, the information requested in Sections 3.2, 3.3, and 3.4 may be presented here or in Section 4; the information requested in Sections 3.5 may be presented here or in Section 5; and the information requested in Sections 3.6 may be presented here or in Section 7.) 3.2 The sampling strategy shall be included and evidence must be presented to demonstrate that the strategy is appropriate for meeting primary project objectives, i.e., a description B-6 NHSRC QMP - August 2003 ------- of the statistical method or scientific rationale used to select sample sites and number of samples shall be provided. 3.3 Sampling/monitoring points for all measurements (i.e., including locations and access points) shall be identified. 3.4 The frequency of sampling/monitoring events, as well as the numbers for each sample type and/or location shall be provided, including QC and reserve samples. 3.5 All measurements (i.e., analytical [chemical, microbiological, assays], physical, and process) shall be identified for each sample type or process, and project-specific target analytes shall be listed and classified as critical or noncritical in the QAPP. 3.6 The planned approach (statistical and/or non-statistical) for evaluating project objectives shall be included. SECTION 4.0, SAMPLING PROCEDURES 4.1 Whenever applicable, the method used to establish steady-state conditions shall be described. 4.2 Known site-specific factors that may affect sampling/monitoring procedures shall be described. 
4.3 Any site preparation needed prior to sampling/monitoring shall be described. 4.4 Each sampling/monitoring procedure to be used shall be discussed or referenced. If compositing or splitting samples, those procedures shall be described. 4.5 For samples requiring a split sample for either QA/QC purposes or for shipment to a different laboratory, the QAPP shall identify who is responsible for splitting samples, and where the splitting is performed (e.g., field versus lab). 4.6 If sampling/monitoring equipment is used to collect critical measurement data (i.e., used to calculate the final concentration of a critical parameter), the QAPP shall describe how the sampling equipment is calibrated, the frequency at which it is calibrated, and the acceptance criteria for calibration or calibration verification, as appropriate. 4.7 If sampling/monitoring equipment is used to collect critical measurement data, the QAPP shall describe how cross-contamination between samples is avoided. 4.8 The QAPP shall include a discussion of the procedures to be used to assure that representative samples are collected. 4.9 A list of sample quantities to be collected, and the sample amount required for each analysis, including QC sample analysis, shall be specified. B-7 NHSRC QMP - August 2003 ------- 4.10 Containers used for sample collection, transport, and storage for each sample type shall be described. 4.11 The method for uniquely identifying each samples shall be described. 4.12 Sample preservation methods (e.g., refrigeration, acidification, etc.), including specific reagents, equipment, and supplies required for sample preservation shall be described. 4.13 Holding time requirements shall be noted. 4.14 Procedures for packing and shipping samples shall be described. 4.15 Procedures to maintain chain-of-custody (e.g., custody seals, records) during transfer from the field to the laboratory, in the laboratory, and among contractors and subcontractors shall be described to ensure that sample integrity is maintained. 4.16 Sample archival requirements for each relevant organization shall be provided. SECTION 5.0, TESTING AND MEASUREMENT PROTOCOLS 5.1 Each measurement method to be used shall be described in detail or referenced. Modifications to EPA-approved or similarly validated methods shall be specified. 5.2 For unproven methods, verification data applicable to expected matrices shall be included in the QAPP meaning the QAPP shall provide evidence that the proposed method is capable of achieving the desired performance. 5.3 For measurements which require a calibrated system, the QAPP shall include specific calibration procedures applicable to each project target analyte, and the procedures for verifying both initial and continuing calibrations (including frequency and acceptance criteria, and corrective actions to be performed if acceptance criteria are not met). SECTION 6.0, QA/QC CHECKS 6.1 At a minimum, the QAPP shall include quantitative acceptance criteria for QA objectives associated with accuracy, precision, detection limits, and completeness for critical measurements (process, physical, and analytical, as applicable) for each matrix. 6.2 Any additional project-specific QA objectives shall be presented, including acceptance criteria. This includes items such as mass balance requirements. 6.3 The specific procedures used to assess all identified QA objectives shall be fully described. 6.4 The QAPP shall list and define all other QC checks and/or procedures (e.g., blanks, surrogates, controls, etc.) 
used for the project, both field and laboratory. B-8 NHSRC QMP - August 2003 ------- 6.5 For each specified QC check or procedure, required frequencies, associated acceptance criteria, and corrective actions to be performed if acceptance criteria are not met shall be included. SECTION 7.0, DATA REPORTING, DATA REDUCTION, AND DATA VALIDATION 7.1 The reporting requirements (e.g., units, reporting method [wet or dry]) for each measurement and matrix shall be identified. 7.2 The deliverables expected from each organization responsible for field and laboratory activities shall be listed. 7.3 Data reduction procedures specific to the project, and also specific to each organization, shall be summarized. 7.4 Data validation procedures specific to each organization used to ensure the reporting of accurate project data to internal and external clients shall be summarized. 7.5 Data storage requirements for each organization shall be provided. 7.6 The product document that will be prepared for the project shall be specified (e.g., journal article, final report, etc.). The contents of this document can be referenced to a NHSRC or program-specific QMP, if appropriate. SECTION 8.0, ASSESSMENTS 8.1 The QAPP shall identify all scheduled audits (i.e., both technical system audits [TSAs] and performance evaluations [PEs]) to be performed, who will perform these audits, and who will receive the audit reports. 8.2 The QAPP shall provide procedures that are to be followed that will ensure that necessary corrective actions will be performed. 8.3 The responsible party(-ies) for implementing corrective actions shall be identified. SECTION 9.0, REFERENCES References shall be provided either in the body of the text as footnotes or in a separate section. B-9 NHSRC QMP - August 2003 ------- QAPP REQUIREMENTS FOR PROJECTS USING SECONDARY DATA A secondary data project involves the gathering and/or use of existing environmental data for purposes other than those for which they were originally collected. These secondary data may be obtained from many sources, including literature, industry surveys, compilations from computerized databases and information systems, and computerized or mathematical models of environmental processes. For these projects, a QAPP shall be prepared to include the requirements identified below. If primary data will also be generated as part of the project, then the information below can be incorporated into the associated QAPP to address the secondary data. The following requirements should be addressed as applicable. SECTION 1.0, PROJECT OBJECTIVES, ORGANIZATION, AND RESPONSIBILITIES 1.1 The purpose of study shall be clearly stated. 1.2 Project objectives shall be clearly stated. 1.3 The secondary data needed to satisfy the project objectives shall be identified. Requirements relating to the type of data, the age of data, geographical representation, temporal representation, and technological representation, as applicable, shall be specified. 1.4 The planned approach for evaluating project objectives, including formulas, units, definitions of terms, and statistical analysis, if applicable, shall be included. 1.5 Responsibilities of all project participants shall be identified, meaning that key personnel and their organizations shall be identified, along with the designation of responsibilities for planning, coordination, data gathering, data analysis, report preparation, and quality assurance, as applicable. 
SECTION 2.0, SOURCES OF SECONDARY DATA 2.1 The source(s) of the secondary data must be specified. 2.2 The rationale for selecting the source(s) identified shall be discussed. 2.3 The sources of the secondary data will be identified in any project deliverable. SECTION 3.0, QUALITY OF SECONDARY DATA 3.1 Quality requirements of the secondary data must be specified. These requirements must be appropriate for their intended use. Accuracy, precision, representativeness, completeness, and comparability need to be addressed, if applicable. (If appropriate, a related QAPP containing this information can be referenced.) 3.2 The procedures for determining the quality of the secondary data shall be described. B-10 NHSRC QMP - August 2003 ------- 3.3 If no quality requirements exist, this shall be stated in the QAPP. If no quality requirements exist or if the quality of the secondary data will not be evaluated by EPA, the QAPP shall require that a disclaimer be added to any project deliverable to indicate that the quality of the secondary data has not been evaluated by EPA for this specific application. The wording for the disclaimer shall be defined. SECTION 4.0, DATA REPORTING, DATA REDUCTION, AND DATA VALIDATION 4.1 Data reduction procedures specific to the project shall be described, including calculations and equations. 4.2 The data validation procedures used to ensure the reporting of accurate project data shall be described. 4.3 The expected product document that will be prepared shall be specified (e.g., journal article, final report, etc.). B-11 NHSRC QMP - August 2003 ------- QAPP REQUIREMENTS FOR METHOD DEVELOPMENT PROJECTS A method development project is typically needed in situations for which there exists no standard or known method, or when an existing method needs to be modified to meet a project-specific need. The following requirements should be addressed as applicable. SECTION 1.0, BACKGROUND A description of the situation that requires the generation of a new or modified method shall be clearly stated. Why are we doing this? SECTION 2.0, SCOPE AND APPLICATION The scope and application of the method shall be clearly stated. Specifically, to what matrices, conditions, etc., will this method apply for this project? What detection limits and/or practical quantitation limits are needed? How is this method intended to be used in the future (e.g., research only, potential regulatory usage, etc.)? SECTION 3.0, PROJECT ORGANIZATION Responsibilities of all project participants shall be identified, meaning that key personnel and their organizations shall be identified, along with the designation of responsibilities for planning, coordination, sample collection, measurements (i.e., analytical, physical, and process), data reduction, data validation (independent of data generation), data analysis, report preparation, and quality assurance, SECTION 4.0, EXPERIMENTAL APPROACH INCLUDING SAMPLING AND ANALYTICAL SPECIFICATIONS 4.1 A description of the test(s) to be conducted in order to support the development of the method shall be included. All known or preestablished test conditions and variables shall be provided. 4.2 All planned measurements (i.e., analytical [chemical, microbiological, assays, etc.], physical, and process) shall be identified, and project-specific target analytes shall be listed. 4.3 Any known restrictions/specifications for sampling (e.g., collecting soil samples from a site or water samples from a port, etc.) 
or subsampling (e.g., mixing sample before taking subsample for analysis, etc.) shall be documented. Include specifications for: type and size of sample containers; amount of sample needed for preparation and analysis; preservation; holding times; representativeness; compositing; QC samples; etc. 4.4 The type of instrumentation that will be used and any required instrument conditions shall be documented. Include a discussion of calibration and calibration verification including frequency, acceptance criteria, and corrective action to be taken if acceptance criteria are B-12 NHSRC QMP - August 2003 ------- not met. SECTION 5.0, QA/QC CHECKS Any planned QC checks and criteria that must be met for the method to be considered successful shall be specified. QC checks may include spikes, replicates, blanks, controls, surrogates, etc. Note: For chemical methods, quality control procedures to determine the precision, accuracy, and method detection limit should be described. For microbiological methods, positive and negative control procedures should be described. SECTION 6.0, METHOD VERIFICATION The tests that will be used to verify the method's performance once it's been developed shall be specified. SECTION 7.0, REPORT The report for a successful method development project will be a method written in a format appropriate for the application e.g., SW-846 for RCRA applications, Standard Methods for bacteria in drinking water, a SOP for a specific application (with supporting method performance data appended), etc. SECTION 8.0, REFERENCES References shall be provided either in the body of the text as footnotes or in a separate section. B-13 NHSRC QMP - August 2003 ------- QAPP REQUIREMENTS FOR SOFTWARE AND DATA MANAGEMENT PROJECTS Types of projects to which this guidance applies include the following: software development, software/hardware systems development, data base design and maintenance, and data validation and verification systems. The QAPP requirements for software development in this appendix do not mandate a particular method for software development. Project managers should choose software development and QA methods best suited to their individual projects within the parameters set forth here. Table D-l provides a set of alternative QAPP elements for situations in which the elements applicable to measurement projects are not appropriate. The applicability of different elements is based on (1) the QA category and (2) the size or complexity of the task. Projects that involve both measurement and software/systems development should have plans addressing all applicable QA elements. Main issues to consider for inclusion in a QAPP for software and data management are listed in the following sections. Additional guidance for software and data management projects is available from the QAMs. SECTION 0.0, APPROVAL BY PROJECT PARTICIPANTS The EPA Technical Lead Person (TLP) shall be responsible for obtaining signatures of appropriate project participants on the signature page of the QA plan, documenting agreement to project objectives and the approach for evaluating these objectives. SECTION 1.0, PROJECT DESCRIPTION This section should provide an overview of the project, its intended uses, quality objectives, schedules and appropriate milestones, information about the hardware and operating systems, and planning documents. 
SECTION 2.0, PROJECT ORGANIZATION AND RESPONSIBILITIES This section should discuss all important intramural and extramural project personnel and should show the relationship between the development team and the personnel responsible for QA and testing. SECTION 3.0, FUNCTIONAL REQUIREMENTS This section should provide a list of the most important functions that the software system must address. This section can also state any quantitative or qualitative data quality objectives (DQOs) that might apply to the software. SECTION 4.0, SYSTEM DESIGN OVERVIEW (HIGH LEVEL DESIGN) A brief description of the system design is all that is necessary in the QAPP, if additional design documentation is planned. SECTION 5.0, DETAILED DESIGN B-14 NHSRC QMP - August 2003 ------- Complex projects and those with significant defensibility requirements should have a detailed design document. SECTION 6.0, IMPLEMENTATION Written standard operating procedures (SOPs) for software development should be provided for extremely large and complex software projects. The internal checks applied during development should also be described. SECTION 7.0, TESTING The QAPP should outline the testing strategy to be used. SECTION 8.0, DATA VALIDATION AND VERIFICATION The QAPP must describe the means for checking the correctness of outputs. SECTION 9.0, CHANGE CONTROL AND CONFIGURATION MANAGEMENT This section should describe the procedures for controlling and documenting all significant changes to software and hardware. SECTION 10.0, AUDITS AND REVIEWS This section should describe planned assessments, including performance evaluation audits (PEAs), technical systems audits (TSAs), quality systems audits (QSAs), and audits of data quality (ADQs). Additional types of reviews applicable to these projects include peer reviews and beta testing. SECTION 11.0, MAINTENANCE AND USER SUPPORT Where software or data generated by the project will be distributed outside NHSRC, maintenance and user support must be addressed. SECTION 12.0, SYSTEM DOCUMENTATION AND ARCHIVING Documentation is required for software projects in all QA categories. Table D-2 gives documentation requirements by QA Category. SECTION 13.0, QA PROGRESS REPORTS TO MANAGEMENT System development QA and QC results and plans should be reported regularly, particularly in projects in Categories I and II and where contractually required. 
APPENDIX III

AMENDMENT AND DEVIATION FORMS

TEST/QA PLAN AMENDMENT

TEST/QA PLAN TITLE AND DATE:
AMENDMENT NUMBER:
EFFECTIVE DATE:
PART TO BE CHANGED/REVISED:
CHANGE/REVISION:
REASON FOR CHANGE:
ORIGINATED BY: Task Order Leader / DATE
APPROVED BY: Technology Area Leader or Battelle Quality Assurance Manager / DATE; Battelle TTEP Manager / DATE

Required Distribution - All individuals/organizations listed on distribution for the applicable Test/QA Plan, including but not limited to:
TTEP Management
TTEP Technical staff
Battelle QA Manager
Subcontractors (if any)
EPA Task Order Project Officer
NHSRC QA Manager
Vendors
Distribution must be documented

TEST/QA PLAN DEVIATION REPORT

TEST/QA PLAN TITLE AND DATE:
DEVIATION NUMBER:
DATE OF DEVIATION:
DESCRIPTION OF DEVIATION:
CAUSE OF DEVIATION:
IMPACT OF DEVIATION ON THE EVALUATION:
CORRECTIVE ACTION:
ORIGINATED BY: Task Order Leader / DATE
ACKNOWLEDGED BY: Technology Area Leader or Battelle Quality Assurance Manager / DATE; Battelle TTEP Manager / DATE

Required Distribution - All individuals/organizations listed below:
TTEP Management
Battelle QA Manager
Distribution must be documented