United States Environmental Protection Agency
Office of Environmental Information
Washington, DC 20460
EPA/600/R-99/080
January 2000

Guidance on Technical Audits and Related Assessments for Environmental Data Operations

EPA QA/G-7
Final

FOREWORD

The U.S. Environmental Protection Agency (EPA) has developed several types of technical audits and related assessments as important tools for the systematic and objective examination of a project to determine whether environmental data collection activities and related results comply with the project's quality assurance project plan, are implemented effectively, and are suitable to achieve data quality goals. The mandatory Agency-wide Quality System requires that all organizations performing work for EPA develop and operate management processes and structures for ensuring that data or information collected are of the needed and expected quality for their desired use. Technical audits and related assessments are an integral part of the fundamental principles of quality management that form the foundation of the Agency's Quality System.

This document is one of the U.S. Environmental Protection Agency Quality System Series requirements and guidance documents. These documents describe the EPA policies and procedures for planning, implementing, and assessing the effectiveness of the Quality System. Questions regarding this document or other Quality System Series documents may be directed to the Quality Staff:

U.S. EPA Quality Staff (2811R)
Office of Environmental Information
Ariel Rios Building
1200 Pennsylvania Avenue, NW
Washington, DC 20460
Phone: (202) 564-6830
Fax: (202) 565-2441
email: quality@epa.gov

Copies of the EPA Quality System Series may be obtained from the Quality Staff or by downloading them from the Quality Staff Home Page: www.epa.gov/quality

TABLE OF CONTENTS

CHAPTER 1. INTRODUCTION 1
1.1 PURPOSE AND OVERVIEW 1
1.2 THE EPA QUALITY SYSTEM 2
1.3 SCOPE AND BACKGROUND 3
1.4 INTENDED AUDIENCE 7
1.5 SPECIFIC DEFINITIONS 7
1.6 PERIOD OF APPLICABILITY 8
1.7 ORGANIZATION OF THE DOCUMENT 9
CHAPTER 2. USING TECHNICAL AUDITS AND RELATED ASSESSMENTS 11
2.1 WHAT IS A TECHNICAL AUDIT OR ASSESSMENT? 11
2.1.1 General Characteristics of Technical Audits and Assessments 11
2.1.2 Self-Assessments versus Independent Audits and Assessments 12
2.1.3 Role of Technical Audits and Assessments in the EPA Quality System 13
2.2 WHY CONDUCT A TECHNICAL AUDIT OR ASSESSMENT? 13
2.3 AUTHORITY TO AUDIT OR ASSESS 15
2.4 ATTRIBUTES OF AUDITORS 17
2.4.1 Education and Training 18
2.4.2 Independence and Objectivity 19
2.4.3 Experience 19
2.4.4 Personal Attributes of Auditors 20
2.5 MANAGEMENT OF AUDIT PROGRAMS 21
2.6 AUDIT COSTS 21
2.6.1 Budget 21
2.6.2 Cost Considerations 22
CHAPTER 3.
STEPS IN THE GENERAL TECHNICAL AUDIT AND ASSESSMENT PROCESS 23
3.1 PLANNING 23
3.1.1 Decision to Conduct an Audit 23
3.1.2 Selection of Audit Type 25
3.1.3 Selection of Audit Team 25
3.1.4 Planning Meeting 26
3.1.5 Confidentiality and Dissemination of Audit Results 27
3.1.6 Review of Project Documents 29
3.1.7 Contact with Auditee 30
3.1.8 Audit Plan and Other Preparation 33
3.1.9 Audit Questionnaire 36
3.1.10 Audit Checklist 36
3.2 PERFORMANCE OF THE AUDIT 38
3.2.1 Audit Protocol 38
3.2.2 Opening Meeting 40
3.2.3 Audit Activities 41
3.2.4 Observation of Work 43
3.2.5 Interviews 43
3.2.6 Document Review 45
3.2.7 Objective Evidence Compilation 46
3.2.8 Closing Meeting 46
3.3 EVALUATION 48
3.3.1 Identification of Findings 48
3.3.2 Evaluation of Findings 49
3.4 DOCUMENTATION 50
3.4.1 Draft Findings Report 51
3.4.2 Final Report 52
3.5 CORRECTIVE ACTION 54
3.6 CLOSEOUT 55
CHAPTER 4. TYPES OF TECHNICAL AUDITS 57
4.1 INTRODUCTION TO AUDIT TYPES 57
4.2 READINESS REVIEWS 57
4.3 TECHNICAL SYSTEMS AUDITS 59
4.4 SURVEILLANCE 62
4.5 PERFORMANCE EVALUATIONS 63
CHAPTER 5. RELATED TECHNICAL ASSESSMENTS 67
5.1 INTRODUCTION TO ASSESSMENT TYPES 67
5.2 AUDITS OF DATA QUALITY 67
5.3 DATA QUALITY ASSESSMENTS 68
CHAPTER 6. GUIDANCE FOR AUDITEES 71
6.1 PREAUDIT PARTICIPATION 71
6.2 AUDIT READINESS 72
6.3 AUDIT PARTICIPATION 72
6.3.1 Opening Meeting 72
6.3.2 Audit Activities 73
6.3.3 Closing Meeting 73
6.4 DRAFT FINDINGS REPORT REVIEW 74
6.5 CONFIDENTIALITY AND DISSEMINATION OF AUDIT RESULTS 74
REFERENCES AND SUPPLEMENTAL INFORMATION SOURCES 75
APPENDIX A. GLOSSARY A-1
APPENDIX B. EXAMPLE OF A TECHNICAL SYSTEMS AUDIT CHECKLIST B-1

LIST OF FIGURES

Figure 1. EPA Quality System Components 4
Figure 2. A Generalized Flowchart of a Technical Audit 24
Figure 3. Example of an Audit Plan 28
Figure 4. Suggested Format for the Notification Letter 31
Figure 5. Suggested Format for the Follow-up Letter 32
Figure 6. Example of a Technical Audit Agenda 34
Figure 7. Example of a Logistical Letter 35
Figure 8. Characteristics of an Opening Meeting 41
Figure 9. Interviewing Considerations and Techniques 44
Figure 10. Characteristics of a Closing Meeting 47
Figure 11. Types of Findings 49
Figure 12. Example Draft Findings Report Format 52
Figure 13. Example of a Draft Findings Report Transmittal Letter 53
Figure 14. Example of a Closeout Letter 56
Figure 15. Example of a Flowchart of Technical Systems Audit Implementation 60

CHAPTER 1
INTRODUCTION

1.1 PURPOSE AND OVERVIEW

This document provides general guidance for selecting and performing technical audits and related assessments of environmental data operations. The document also provides guidance on the general principles of audits and assessments in the context of environmental programs associated with the U.S. Environmental Protection Agency (EPA). Such environmental programs include: intramural programs performed by EPA organizations, programs performed under EPA extramural agreements [i.e., contracts, grants, cooperative agreements, and interagency agreements (IAGs)], and programs required by environmental regulations, permits, and enforcement agreements and settlements. This guidance is nonmandatory and is intended to help organizations plan, conduct, evaluate, and document technical audits and related assessments for their programs.
It contains tips, advice, and case studies to help users develop processes for conducting technical audits and related assessments. Because of the diversity of environmental programs that may use one or more of the technical audits and assessments described, it is not possible to provide all of the implementation details in this document. Additional guidance has been developed for some of these "tools," and further guidance is likely. The user is referred to those documents for additional information.

Establishing and implementing an effective assessment program are integral parts of a quality system. Audits and assessments, both management and technical, provide management with the information needed to evaluate and improve an organization's operation, including: the organization's progress in reaching strategic goals and objectives, the adequacy and implementation of management or technical programs developed to achieve the mission, the quality of products and services, and the degree of compliance with contractual and regulatory requirements.

1.2 THE EPA QUALITY SYSTEM

A quality system is a structured and documented management system describing the policies, objectives, principles, organizational authority, responsibilities, accountability, and implementation plan of an organization for ensuring quality in its work processes, products, and services. A quality system provides the framework for planning, implementing, documenting, and assessing work performed by the organization and for carrying out required quality assurance (QA) and quality control (QC) activities. Audits and assessments are an integral part of a quality system.

Since 1979, EPA policy has required participation in an Agency-wide quality system by all EPA organizations (i.e., offices, regions, national centers, and laboratories) supporting intramural environmental programs and by non-EPA organizations performing work funded by EPA through extramural agreements. The EPA Quality System operates under the authority of Order 5360.1 CHG 1, Policy and Program Requirements for the Mandatory Agency-wide Quality System (U.S. EPA, 1998c), hereafter referred to as the Order. The implementation requirements of the Order for EPA organizations are provided in EPA Manual 5360, EPA Quality Manual for Environmental Programs (U.S. EPA, 1998a). The Order and applicable extramural agreement regulations define EPA's authority to conduct technical audits and assessments.

The Order requires every EPA organization collecting and using environmental data to document its quality system in an approved quality management plan (QMP) and requires that QA Project Plans for all environmental data collection be developed, reviewed, approved, and then implemented as approved, before work commences. The Order defines environmental data as any measurement or information that describes environmental processes, locations, or conditions; ecological or health effects and consequences; or the performance of environmental technology. For EPA, environmental data include information collected directly from measurements, produced from models, and compiled from other sources such as databases or the existing literature.
The Order applies (but is not limited) to the following environmental programs:

the characterization of environmental or ecological systems and the health of human populations;

the direct measurement of environmental conditions or releases, including sample collection, analysis, evaluation, and reporting of environmental data;

the use of environmental data collected for other purposes or from other sources (also termed "secondary data"), including literature, industry surveys, compilations from computerized databases and information systems, and results from computerized or mathematical models of environmental processes and conditions; and

the collection and use of environmental data pertaining to the occupational health and safety of personnel in EPA facilities (e.g., indoor air quality measurements) and in the field (e.g., chemical dosimetry, radiation dosimetry).

EPA bases its quality system on the American National Standard for quality systems in environmental programs, ANSI/ASQC E4-1994, Specifications and Guidelines for Quality Systems for Environmental Data Collection and Environmental Technology Programs (ASQ, 1994), developed by the American National Standards Institute (ANSI) and American Society for Quality (ASQ). Non-EPA quality systems that demonstrate objective evidence of conformance to ANSI/ASQC E4-1994 also are in compliance with EPA policy.

The EPA Quality System comprises three structural levels: policy, organization/program, and project (see Figure 1). This structure promotes consistency among quality systems at the higher management levels, while allowing project managers the flexibility necessary to adapt the EPA Quality System components to meet the individual and often unique needs of their work. The EPA Quality System encompasses many functions, including the following: establishing quality management policies and guidelines for the development of organization- and project-specific quality plans; establishing criteria and guidelines for planning, implementing, documenting, and assessing activities to obtain sufficient and adequate data quality; providing an information focal point for QA and QC concepts and practices; performing management and technical audits or assessments to ascertain the effectiveness of QA and QC implementation; and identifying and developing training programs related to QA and QC implementation.

The EPA Quality System also is characterized by the principle of the "graded approach," which allows managers to base the degree of quality controls applied to an organizational area or project on the intended use of the results and on the confidence that is needed and expected in the quality of the results.

[Figure 1. EPA Quality System Components: the policy level (consensus standards ANSI/ASQC E4 and the ISO 9000 series; internal EPA policies EPA Order 5360.1 and EPA Manual 5360; external policies, including contracts under 48 CFR 46 and assistance agreements under 40 CFR 30, 31, and 35), the organization/program level (supporting system elements), and the project level (planning, implementation, and assessment, including standard operating procedures, conduct of the study or experiment, and technical assessments), leading to defensible products and decisions.]

1.3 SCOPE AND BACKGROUND

A technical audit or assessment is a systematic and objective examination of a program or project to determine whether environmental data collection activities and related results comply with the project's QA Project Plan and other planning documents, are implemented effectively, and are suitable to achieve its data quality goals. Technical audits and assessments may also be used as an investigative tool where problems may be suspected.
A technical audit or assessment is not a management assessment (e.g., the management systems review process, which is used to conduct quality system audits, and which occurs at the organization/program level), nor is it data verification/validation, which occurs during the assessment phase of the project. However, in certain circumstances, a technical audit or assessment may be performed with or as an adjunct to a management assessment.

Technical audits and assessments are conducted as part of an organization's quality system. They are dependent on other components of the system, including effective planning and QA documentation like the QA Project Plan, to guide them and to provide evaluation or performance criteria for them. The absence of one or more components, such as an approved QA Project Plan, may limit the effectiveness of audits and assessments. In most cases, the QA Project Plan will be the basis for planning and conducting the technical audits and assessments, although other planning documentation also may be used. There also must be clearly designated authority for conducting the audits and assessments and for confirming implementation of any corrective actions taken from them.

While most audits and assessments will be scheduled in advance with an organization, there may be occasions when unannounced technical audits or assessments are needed. Such needs are identified in planning and are usually based on historical evidence of problems. Unannounced audits and assessments are particularly effective in getting the "true" picture of project implementation. However, there are downsides to unannounced audits. For example, a visit may occur during a period when the project is not active or when project staff are very busy with routine project activities that would be disrupted by an unannounced audit. In either case, the unannounced audit or assessment may not reveal a representative picture of project activities.

Several types of technical audits and related assessments can be used to evaluate the effectiveness of project implementation, as follows:

Readiness reviews are conducted before specific technical activities (e.g., sample collection, field work, and laboratory analysis) are initiated to assess whether procedures, personnel, equipment, and facilities are ready for environmental data to be collected according to the QA Project Plan.

Technical systems audits (TSAs) qualitatively document the degree to which the procedures and processes specified in the approved QA Project Plan are being implemented.

Surveillance is used to continuously or periodically assess the real-time implementation of an activity or activities to determine conformance to established procedures and protocols.

Performance evaluations (PEs) quantitatively test the ability of a measurement system to obtain acceptable results (a minimal illustration of such a quantitative check follows this list).

Audits of data quality (ADQs) are conducted on verified data to document the capability of a project's data management system (hardcopy and/or electronic) to collect, analyze, interpret, and report data as specified in the QA Project Plan.

Data quality assessments (DQAs) are scientific and statistical evaluations of validated data to determine if the data are of the right type, quality, and quantity to support their intended use.
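The quantitative character of a PE, in contrast to the qualitative character of a TSA or surveillance, can be illustrated with a short sketch. The example below shows one way a single PE sample result might be scored against an acceptance window; the function names, the analyte, and the acceptance limit are hypothetical illustrations rather than requirements of this guidance, and actual acceptance criteria come from the approved QA Project Plan.

```python
# Illustrative sketch only: scoring a performance evaluation (PE) sample result
# against a hypothetical acceptance window. Names and numbers are examples,
# not EPA requirements; real criteria come from the project's QA Project Plan.

def percent_bias(measured, reference):
    """Relative bias of a measured PE result, as a percentage of the reference value."""
    return 100.0 * (measured - reference) / reference

def evaluate_pe_result(measured, reference, acceptance_window_pct):
    """Return a simple pass/fail evaluation for one PE analyte."""
    bias = percent_bias(measured, reference)
    return {
        "measured": measured,
        "reference": reference,
        "percent_bias": round(bias, 1),
        "acceptable": abs(bias) <= acceptance_window_pct,
    }

if __name__ == "__main__":
    # Hypothetical PE sample: certified lead concentration of 50 ug/L, the
    # laboratory reports 57 ug/L, and the plan allows +/-20% bias.
    result = evaluate_pe_result(measured=57.0, reference=50.0, acceptance_window_pct=20.0)
    print(result)  # {'measured': 57.0, 'reference': 50.0, 'percent_bias': 14.0, 'acceptable': True}
```

In this illustration the reported result would be judged acceptable because the 14% bias falls inside the hypothetical +/-20% window; had it not, a TSA or corrective action request might follow to determine the cause.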
Technical audits are discussed in greater detail in Chapter 4, and related assessments are discussed in Chapter 5. Because they accomplish different objectives, different types of audits and assessments often can be integrated into one effort, depending on the needs of the project. For instance, a PE can quantitatively indicate a problem with the bias of a measurement system, and a TSA can determine the cause of the problem. Some of these types of audits and assessments (i.e., TSAs, readiness reviews, and surveillance) are primarily conducted at the project site. Other types of audits and assessments (i.e., PEs, ADQs, and DQAs) are primarily conducted away from the project site, but may have a component for gathering objective evidence or for delivering a PE sample at the project site.

For the purposes of this document, inspection is not included as a technical audit or assessment. The International Organization for Standardization defines inspection as "an activity such as measuring, examining, testing, or gauging one or more characteristics of an entity and comparing the results with specified requirements in order to establish whether conformance is achieved for each characteristic" (International Organization for Standardization, 1994d). This definition suggests that inspection is part of the routine quality control practices that may be applied to an operation. While several of the audit and assessment types discussed in this document involve elements of inspection, they are more specific in their scope and application than the more general term, inspection, and are used in this document instead.

Peer reviews may be used along with other technical audits and assessments to corroborate scientific defensibility. A peer review is a documented critical review of a specific major scientific and/or technical work product. It is conducted by qualified individuals (or organizations) who are independent of those who performed the work but who are collectively equivalent in technical expertise (i.e., peers) to those who performed the original work. A peer review is an in-depth assessment of the assumptions, calculations, extrapolations, alternative interpretations, methodology, acceptance criteria, and conclusions pertaining to the specific scientific and/or technical work products and of the documentation that supports them.

EPA conducts peer reviews of major scientifically and technically based work products used to support its decisions. Additionally, peer reviews are used by EPA to evaluate applications for research assistance agreements, graduate fellowships, and environmental research centers. More generally, peer reviews are used in the academic community to evaluate the quality of research publications. Despite the widespread use of peer reviews, they are outside the scope of this document. Guidance for EPA peer reviews is given in EPA's Peer Review Handbook (U.S. EPA, 1998b).

1.4 INTENDED AUDIENCE

This document is intended primarily for organizations conducting technical audits and related assessments for environmental programs associated with EPA. Such organizations include EPA itself, since the Agency collects environmental data through a variety of intramural activities. Environmental data are also collected by other organizations through extramural agreements with EPA, including State program grants, contracts, cooperative agreements, and interagency agreements.
EPA quality requirements apply to regulatory activities including permits, enforcement consent orders and agreements, and enforcement settlements when included in the applicable agreement. As noted earlier, this document provides general guidance for selecting and performing technical audits and related assessments of environmental data operations. The document also provides guidance on the general principles of audits and assessments from the context of environmental programs associated with EPA and includes a section on expectations from the perspective of the organization being assessed. While this guidance is general, its application and use is not confined to environmental programs associated with EPA, but may be used by any organization implementing environmental programs. 1.5 SPECIFIC DEFINITIONS For purposes of consistency, the following terms are used throughout this document. These definitions do not constitute the Agency's official use of terms for regulatory purposes and should not be construed to alter or supplant other terms in use. 1. EPA QA Manager: The EPA QA Manager, or the person assigned equivalent duties, responsible for ensuring the implementation of the organization's quality system as defined by the project's approved QMP. The QA Manager should: function independently of direct environmental data generation, model development, or technology development; Final EPA QA/G-7 7 January 2000 ------- report on quality issues to the senior manager with executive leadership authority for the organization; and have sufficient technical and management expertise and authority to independently oversee and implement the organization's quality system in the environmental programs of the organization. 2. Project QA Manager: The project QA Manager, or the person assigned equivalent duties, responsible for ensuring the implementation of the QA and QC procedures and activities defined in the project's approved QA Project Plan. 3. Client: Any individual or organization for whom items or services are furnished or work is performed in response to defined requirements and expectations. 4. Quality Assurance Project Plan: A formal planning document for an environmental data collection operation that describes the data collection procedures and the necessary QA and QC activities that must be implemented to ensure that the results are sufficient and adequate to satisfy the stated performance criteria. A QA Project Plan documents the outputs of the systematic planning process and the resulting sampling design. It is the blueprint for identifying how the quality system of the organization performing the work is reflected in a particular project and in associated technical goals. It also details the management authorities, personnel, schedule, policies, and procedures for the data collection. Often, QA Project Plans incorporate or cite standard operating procedures, which help to ensure that data are collected consistently using approved protocols and quality measures. 5. EPA Project Officer: The EPA person responsible for the overall technical and administrative aspects of the project. Such persons may be referred to as project manager, project officer (PO), work assignment manager, or similar title, for extramural projects. For intramural projects, other titles may include principal investigator (PI) and team leader. NOTE: Conventional terminology generally refers to "audits" and "auditors" when discussing technical audits and assessments. 
Although other types of assessments are discussed in this document, the conventional terms auditor, auditee, and audit will be used for consistency.

1.6 PERIOD OF APPLICABILITY

Based on the Manual (U.S. EPA, 1998a), this document will be valid for a period of five years from the official date of publication. After five years, this document will either be reissued without modification, revised, or removed from the EPA Quality System.

1.7 ORGANIZATION OF THE DOCUMENT

The organization of this document is designed to first acquaint the reader with the basic principles common to all types of technical audits and assessments. While there are many excellent books and manuals in the literature on general audit and assessment principles, they are discussed here in the context of environmental programs typically performed by EPA, through extramural agreements, and by the regulated community. The discussion of such underlying principles will then serve as the basis for discussing the differences among the various types of technical audits and assessments available in environmental programs. For additional clarity, the document will generally use the term audit to mean audit or assessment, except where use of assessment is necessary. Similarly, the terms auditor and auditee will refer to the person or organization conducting the assessment and the organization being assessed, respectively.

After this introductory chapter, subsequent chapters present the following material:

Chapter 2 discusses the philosophy common to all technical audit and assessment types, including what an audit and an assessment are and why they are performed.

Chapter 3 discusses general steps in an audit or assessment, which may be applicable to different types of audits and assessments, with several case histories presented in italics and indented.

Chapter 4 describes specific types of technical audits, including readiness reviews, surveillance, TSAs, and PEs, and provides guidance on how and when they should be used.

Chapter 5 describes related technical assessments, including ADQs and DQAs.

Chapter 6 provides guidance for those being audited.

This document includes cited references and supplemental information sources and two appendices. Appendix A contains a glossary of key terms, and Appendix B contains an example of a TSA checklist.

NOTE: Once the user has become familiar with the underlying principles of technical audits and assessments and how they are planned and implemented, it is likely that Chapter 4 and Chapter 5 will serve as a reference for details on the specific audit or assessment type needed for a particular application.

CHAPTER 2
USING TECHNICAL AUDITS AND RELATED ASSESSMENTS

2.1 WHAT IS A TECHNICAL AUDIT OR ASSESSMENT?

A technical audit or assessment is a systematic and objective examination of an intramural or extramural project to determine: whether environmental data collection activities and related results comply with the project's QA Project Plan, whether the procedures defined by the QA Project Plan are implemented effectively, and whether they are sufficient and adequate to achieve the QA Project Plan's data quality goals. This definition is based on ISO's definition of "quality audit" as given in Guidelines for Auditing Quality Systems - Auditing (ISO, 1994a).
Proper use of technical audits and assessments provides important information to management to help ensure that collected environmental data are defensible. Audits and assessments can uncover deficiencies in physical facilities, equipment, project planning, training, operating procedures, technical operations, custody procedures, documentation of QA and QC activities as well as quality system aspects applying to more than one project. Audits and assessments can be performed before, during, and after environmental data collection. 2.1.1 General Characteristics of Technical Audits and Assessments A technical audit or assessment is primarily a management tool and secondarily a technical tool. EPA conducts environmental data collection activities in support of its regulatory decision making. As a component of the EPA Quality System, technical audits and assessments provide management with a tool to determine whether data collection activities are being or have been implemented as planned. They also provide the basis for taking action to correct any deficiencies that are discovered. Unless management sees the benefit of conducting an audit or assessment, there will be little impetus to complete the entire process. The ownership of quality for a project remains with the manager, rather than the auditor. The auditor acts as the manager's agent because the manager has the authority, which the auditor lacks, to implement corrective action. Both the manager and the auditor need to understand that an audit or assessment is a means for keeping a manager informed about a project. Such an understanding helps ensure that the audit or assessment will produce positive changes in the Final EPA QA/G-7 11 January 2000 ------- project. The auditor can identify problems, but the manager is better equipped to develop the specific solutions to those problems that work best for a particular project. Consequently, audit and assessment results should be presented in management's terms, should generate management interest, and should convince management that any proposed corrective actions are necessary and will benefit the project. Although all audits and assessments are evaluations, the results should be regarded by all interested parties as an opportunity for project and personnel improvement, rather than as a judgment of the quality of the project and personnel. The findings of technical audits and assessments should be focused on the future, rather than the past. Typically, projects are audited or assessed during their implementation phase. The goal is to find practical solutions to problems, not to affix blame. The results should provide management with information that can be used for improving projects as well as providing detailed descriptions of any technical deficiencies. Audits and assessments are most effective early in a project's implementation phase so that corrective action may be taken before the project has been completed and all environmental data have been collected. It is better for an auditor to report earlier that a project can generate better data than to report later that a project has inadequate data. The findings of technical audits and assessments should pass the "so what" test. The "so what" test focuses attention on significant findings, helps to eliminate trivial issues, and prioritizes findings. The existence of a technical deficiency is not as important to a manager as is the impact of the technical deficiency on the conclusions drawn from the environmental data. 
Audits and their reports should emphasize what is important to a project. Management does not have time to deal with minutia. For example, the discovery of a spelling error in a laboratory notebook is not likely to affect the outcome of the data collection activities. The error can be corrected on the spot without any other action. However, the discovery that an instrument calibration procedure has a conceptual flaw may have a significant effect on these activities and should be reported to management immediately. 2.1.2 Self-Assessments versus Independent Audits and Assessments Self-assessments are audits or assessments of work conducted by individuals, groups, or organizations directly responsible for overseeing and/or performing the work. Independent assessments are audits or assessments performed by a qualified individual, group, or organization that is not part of the organization directly performing and accountable for the work being assessed. The basic principles for assessments do not change for self-assessments and independent assessments. Auditors from within the audited organization may be more knowledgeable about the technical aspects of the program and they may be able to assess more readily the suitability and effectiveness of the data collection activity. Independent auditors may more easily maintain their objectivity and independence. Independent assessments are typically more formal than self-assessments. Final EPA QA/G-7 12 January 2000 ------- A similar classification system for audits is internal versus external. Internal audits are first-party audits (self-assessments), while external audits can be second- or third-party audits. A second-party audit is an external audit performed on a supplier by a customer or a contracted organization on behalf of a customer. A third-party audit is performed on a supplier or regulated entity by an external participant other than a customer (Russell, 1997). 2.1.3 Role of Technical Audits and Assessments in the EPA Quality System Technical audits and assessments are used to check that an environmental data collection activity is conducted as planned and that it is producing data of the type and quality specified in project planning documents, such as QA Project Plans, SOPs, or sampling and analysis plans. Technical audits and assessments should be based on performance criteria, such as data quality objectives, that are developed during the project planning process and documented in the project's planning documents. Technical audits and assessments play an important role in documenting the implementation of the QA Project Plan and are used to check whether performance criteria and other data quality indicator goals are being met. QA Project Plans may be used to develop the checklists that guide certain types of audits, such as TSAs. 2.2 WHY CONDUCT A TECHNICAL AUDIT OR ASSESSMENT? Managers should answer this question for each program under their supervision. When independent professionals assess scientific data, they may detect irregularities in the experimental results, inappropriate practices, misinterpretations of data, inaccurate reporting of data, and departures from planning documents. 
There are good and practical reasons for conducting technical audits and assessments of EPA projects, including the following: confirming the implementation of prescribed planning documents (e.g., QA Project Plans, work plans), ensuring that measurement systems attain stated performance criteria, and identifying potential problems early in the course of a project. Audits and assessments are important for all types of EPA projects. For example, EPA cannot afford for its technical evidence to be wrong in legal proceedings. For research and technical analysis projects, scientific defensibility is crucial because the results of these projects may be used to determine future Agency directions or to provide the scientific basis for regulatory decision making.

ISO guidelines for auditing quality systems (ISO, 1994a) list the following objectives:

Audits are normally designed for one or more of the following purposes:

to determine the conformity or nonconformity of the quality system elements with specified requirements,
to determine the effectiveness of the implemented quality system in meeting specified quality objectives,
to provide the auditee with an opportunity to improve the quality system,
to meet regulatory requirements, and
to permit the listing of the audited organization's quality system in a register.

Audits are generally initiated for one or more of the following reasons:

to initially evaluate a supplier where there is a desire to establish a contractual relationship,
to verify that an organization's own quality system continues to meet specified requirements and is being implemented,
within the framework of a contractual relationship, to verify that the supplier's quality system continues to meet specified requirements and is being implemented, and
to evaluate an organization's own quality system against a quality system standard.

Effective technical audits and assessments contribute to a reduction in the occurrences of questionable data, faulty conclusions, and inappropriate practices. Key purposes of technical audits and assessments include the discovery and characterization of sources of measurement error, the reduction of deficiencies, and the safeguarding of EPA's decision-making process. Audits and assessments help to ensure that approved QA Project Plans are being followed and that the resulting data are sufficient and adequate for their intended use. Proper use of technical audits and assessments can provide increased confidence that the collected environmental data are defensible and properly documented. Audits and assessments can uncover deficiencies in physical facilities, equipment, project planning, training, operating procedures, technical operations, custody procedures, documentation, QA and QC activities, and reporting.

One of the most significant questions associated with initiating a technical audit or assessment is whether the cost can be justified. One might think, "A lot of money has been spent on planning and implementing this project. Why should I spend more money on performing an audit of this project?" First, good project planning is necessary, but it is no guarantee that a project will be implemented as planned. A situation may arise during the implementation phase that was not anticipated during the planning phase and may lead to changes in the project. Second, personnel implementing the project may make errors that would go undiscovered without an audit or assessment.
Money can be saved by discovering errors early; however, management should balance the costs and benefits of conducting audits and assessments.

2.3 AUTHORITY TO AUDIT OR ASSESS

The authority and independence of auditors, and the limits on their authority, must be clearly defined in the organization's QMP and the project-specific QA Project Plan. Prior to an assessment, it is important to establish whether the auditors have the authority to stop or suspend work if they observe conditions that present a clear danger to personnel health or safety or that adversely affect data quality. According to ANSI/ASQC E4-1994, auditors should have sufficient authority, access to programs and managers, and organizational freedom to:

identify and document problems that affect quality;
identify and cite noteworthy practices that may be shared with others to improve the quality of their operations and products;
propose recommendations (if requested) for resolving problems that affect quality;
independently confirm implementation and effectiveness of solutions; and
when problems are identified, provide documented assurance (if requested) to line management that further work performed will be monitored carefully until the deficiencies are suitably resolved.

While not explicitly a quality issue, ANSI/ASQC E4-1994 indicates that auditors may be granted authority to stop work when unsafe conditions exist. Moreover, inasmuch as many organizations operate integrated management systems for quality, environment, and health and safety, this responsibility could be an integral function for the auditor. If not, auditors need to know what communication they may need to have with the authorized EPA project officer who can stop work.

Many quality system activities involving environmental data collection are inherently governmental functions and must be performed only by EPA personnel or by non-EPA personnel explicitly authorized by EPA (U.S. EPA, 1998a). This authorization may be based on statute or regulation or on the terms of an extramural agreement. Such non-EPA personnel may include other governmental personnel and, with specific authorization, contractor personnel. When such quality management tasks are performed by a contractor, the contract must be managed and remain under the control of the authorized EPA contracting representatives.

Two types of quality management functions exist, as follows:

exclusively EPA functions: activities that are inherently governmental work, which must be performed only by responsible EPA officials, such as QA Managers, or authorized representatives.

discretionary functions: activities that may be performed either by EPA personnel or by non-EPA personnel under the specific technical direction of, and performance monitoring by, the QA Manager or another responsible EPA or government official under an approved contract, work assignment, delivery order, task order, and so on.

Exclusively EPA functions include the following:

planning and directing periodic technical audits and assessments of ongoing environmental data collection activities to provide information to management to ensure that technical and quality objectives are being met and that the needs of the client are being satisfied. Such audits and assessments may include TSAs, surveillance, PEs, and DQAs.

determining conclusions and necessary corrective actions (if any) based on the findings of the audits or assessments.
Discretionary functions that may be performed by either EPA personnel or non-EPA personnel include the following: performing technical audits or assessments of intramural and extramural environmental data collection activities, either on-site and off-site, according to a specific plan approved by the QA Manager. Preparations for such audits and assessments may include the acquisition or development of audit materials and standards. Results (findings) are summarized, substantiated, and presented to the QA Manager or another authorized EPA representative. A determination of whether an authorized EPA representative should accompany contractor personnel should be made on a case-by-case basis only after coordination between the responsible organization and the contracting officer. Such coordination should include consideration of the purpose of the accompaniment and clear definition of the EPA representative's role and responsibility during the contractor's performance of the technical assessment. Such coordination avoids giving the appearance of a personal services relationship. Final EPAQA/G-7 16 January 2000 ------- 2.4 ATTRIBUTES OF AUDITORS Assessment findings are only as good as the auditors who perform the audits and assessments and the organizations that support them. Three general standards (U.S. General Accounting Office, 1994) for auditors are as follows: The auditors assigned to conduct a specific audit should collectively possess adequate professional proficiency for the audit. This standard places responsibility on the auditors' organization to ensure that the audit is conducted by auditors who collectively have the technical knowledge and auditing skills necessary for the audit. This standard applies to the auditors as a group, not necessarily to every individual auditor. The auditors should be free from personal and external barriers to independence, organizationally independent, and able to maintain an independent attitude and appearance. This standard places responsibility on the auditors' organization and on individual auditors to maintain independence so that audit findings will be both objective and viewed as objective by knowledgeable third parties. The auditors should consider not only whether they are independent and whether their attitudes and beliefs permit them to be independent but also whether there is anything about their situation that might lead others to question their independence. All situations deserve consideration because it is essential not only that auditors be independent and objective in fact but that knowledgeable third parties consider them to be so as well. The auditors should use due professional care in conducting the audit and in preparing related reports. This standard places responsibility on the auditors' organization and on individual auditors to follow all applicable standards in conducting audits. Auditors should use sound professional judgment in determining the standards that are to be applied to the audit. Exercising due professional care means using sound judgment in establishing the scope, selecting the methodology, and choosing the tests and procedures for the audit. The same sound judgment should be applied in conducting the audit and in reporting the findings. In addition, auditors should be proficient in the use of computers and their applications (e.g., word processing, spreadsheets). 
The following sections discuss in greater detail the education, training, independence, objectivity, experience, and personal attributes of successful auditors. Final EPAQA/G-7 17 January 2000 ------- 2.4.1 Education and Training Auditors should be technically competent. Auditors should be qualified to perform their duties by virtue of education, training, and/or experience. The audit team should understand audit techniques, should be competent in the subject under assessment, and should be familiar with basic quality system concepts and principles. Not everyone on the audit team needs equal expertise in all these areas. Auditors should have training and technical competence in the project's discipline so that scientific integrity and credibility are maintained. Suitable auditors can be biologists, chemical engineers, chemists, physicists, and statisticians, along with other types of engineers and scientists. The key is that the audit team as a whole should possess the needed audit expertise and subject matter knowledge for the scope of the audit. The lead auditor should be trained in audit techniques and in quality systems and should have the experience to direct the audit. The lead auditor should be technically educated, although the lead auditor's technical field of expertise need not apply to all aspects of the audit. Experience in auditing is more critical for the lead auditor than for the other members of the audit team. Auditor training should be provided by qualified and experienced instructors. Auditors can become qualified to assess by participating in training programs designed to provide knowledge of the audit process. A training program for auditors may include audit planning, conduct of audits, documentation, reporting, and followup. ANSI/ASQC E4-1994 states: Personnel performing work shall be trained and qualified based on project- specific requirements prior to the start of the work or activity. The need to require formal qualification or certification of personnel performing certain specialized activities shall be evaluated and implemented where necessary. . . . Objective evidence of personnel job proficiency shall be documented and maintained for the duration of the project or activity affected, or longer if required by statute or organizational policy. Resources for training shall be provided. Training should reflect the particular needs of the organization, including the organization's mission and the types of work performed. ISO 10011-2-1994 (ISO, 1994c) on auditor qualifications states: Auditor candidates should have undergone training to the extent necessary to ensure their competence in the skills required for carrying out audits and for managing audits. Training in the following areas should be regarded as particularly relevant: Final EPAQA/G-7 18 January 2000 ------- knowledge and understanding of the standards against which quality systems may be assessed; assessment techniques of examining, questioning, evaluating, and reporting; and additional skills required for managing an audit, such as planning, organizing, communicating, and directing. 2.4.2 Independence and Objectivity Independence means that the auditor is neither the person responsible for the project being audited nor the immediate supervisor of the person responsible for the project. If possible, the auditor's only connection to the project is to the manager to whom the audit results will be reported. 
The auditor should be free of any conflicts of interest, such as might occur by close association with the environmental data collection activity. Despite the best intentions to remain objective, an auditor with a conflict of interest may unconsciously bias the audit by overlooking problems due to overfamiliarity with the data collection effort or may unconsciously minimize the impact of any discovered deficiencies on the outcome of the project. The independence requirement helps to ensure that the auditor has no stake in the outcome of the audit, other than an interest that the environmental data be collected objectively and in accordance with the QA Project Plan. Independence is needed for both self-assessments and independent assessments. ISO 10011-2-1994 states that auditors should be free from bias and influences that could affect objectivity. If the results of a technical audit are to be treated seriously, everyone involved in the process should accept the audit as being objective. Although an auditor should have technical knowledge of the equipment and procedures being used in the project, this knowledge should not influence the objectivity of the auditor's observations. The auditor should not begin the audit with preconceived notions about the project's quality system or its technical aspects. The auditor should use the project's QA Project Plan as the criteria to measure the project. The QA Project Plan documents the project personnel's and management's commitment to data quality. Upon approval, it also documents the EPA project officer's agreement that this commitment is acceptable. The auditor's role is to check whether these commitments have been kept. The objectivity requirement helps to ensure that the findings of the audit are not biased by the auditor. 2.4.3 Experience ISO 10011-2-1994 states: "Auditor candidates should have a minimum of four years' full- time appropriate practical workplace experience (not including training), at least two years of which should have been in quality assurance activities." Final EPAQA/G-7 19 January 2000 ------- Lead auditors should have assessment, technical, and quality system experience. Other auditors in the audit team also may have such experience, or they may have only technical experience and be currently receiving auditing and QA training. Diversity of experience among audit team members is important to cover all facets of the facility or project that the team may encounter. Audit experience is more important than technical experience for the successful conduct of an audit, although technical experience is necessary for the team to understand the project's technical aspects in full detail. Lead auditors should have some knowledge and understanding of the applicable environmental statutes and regulations. They should be familiar with management systems and with the organizational and operating procedures for environmental data collection. Lead auditors should have a working knowledge of the technical audit techniques for examining, questioning, evaluating, and reporting environmental data operations and for following up on corrective actions. They need to understand the audit planning process. They also need some technical understanding of the project being audited. In general, they need to be able to evaluate a project's scope of work, management system structure, and operating procedures and to judge the project's adequacy compared to its QA Project Plan. 
Audit team members should be familiar with technical audit concepts and techniques and with the structure and operating procedures for environmental data collection. They should have some technical knowledge of the project being audited or of the scientific basis for the project. Depending on the scope of the technical audit, auditors may need to meet additional qualifications, including health and safety requirements. For example, audits at a hazardous waste site may require auditors to successfully complete certain safety training, including the use of protective clothing and breathing equipment.

Technical specialists, who have specialized knowledge of the project being audited and basic knowledge of audit techniques and procedures, may participate in audits. They may need basic training in audit techniques and procedures. Under the direct supervision of the lead auditor, they may help prepare the technical portions of audit checklists and may conduct the technical portions of an audit. They can verify findings and observations that are made by other audit team members concerning any specialized technical aspects of the project being audited.

2.4.4 Personal Attributes of Auditors

An auditor should possess integrity, remain objective, and report only what is observed. An auditor should maintain proper confidentiality of the findings. An auditor should have good information gathering and communication skills: the demonstrated ability to extract and provide information, analyze that information, and report the results accurately. An auditor should be able to assimilate information, to formulate pertinent questions, to present questions clearly during interviews with project personnel, to listen carefully to the information being provided, to verify the information from written documentation, and to report the situation as found.

An auditor should be even-tempered. When confrontational circumstances arise, the auditor should remain calm and keep the audit under control. The auditor should defuse potentially problematic situations with tact and reason. If the situation cannot be defused, the auditor should make a judgment to defer the audit to another day or to terminate the audit entirely, if necessary.

An auditor should be organized. All audits should have a structure that reflects the type of audit undertaken. Each step of the audit should be carried out as prescribed. If clients or auditees fail to respond on time, the auditor should be persistent in obtaining the necessary information to the extent of the auditor's defined authority.

An auditor should have the stamina to travel the distances necessary and to perform audits under physically demanding circumstances, as necessary. Visits to remote engineering demonstration projects or environmental measurement projects can often be physically challenging.

2.5 MANAGEMENT OF AUDIT PROGRAMS

Auditors should be supervised and evaluated to ensure high-quality and professional work. ISO 10011-3-1994 (ISO, 1994b) states that management of an audit program should be carried out by personnel with a practical knowledge of technical audit procedures and practices. Management should be independent of direct responsibility for implementing the projects being audited. The auditor's performance should be regularly evaluated by management. Management should establish procedures and criteria for comparing auditor performance to achieve consistency among auditors to the extent possible.
Auditor training needs should be regularly evaluated, and auditors should be provided with appropriate training opportunities. Management should ensure that adequate resources are available for the audit program, that procedures are in place for planning and scheduling audits, and that report formats are formalized.

2.6 AUDIT COSTS

An element of primary importance to EPA managers and project officers is audit cost. They need to know what audits cost to ensure that adequate resources can be made available to support the needed audits.

2.6.1 Budget

The budget for an audit depends upon the audit objectives, the criticality or visibility of the project, the project's objectives, and the complexity and type of audit desired. Audit objectives may be general or specific. For example, an audit may need to confirm that the data from an entire project are usable or that only certain data sets are usable. The budget for an audit of a project in support of enforcement, compliance, or litigation activities is likely to be larger than the budget for an audit of a basic research project. The complexity and duration of audits should reflect the nature of the project, which may range from very small (e.g., bench-level research) to very large (e.g., control technology demonstrations). The audit type also will determine the duration, complexity, and cost of the audit.

2.6.2 Cost Considerations

Factors that affect audit costs include the number of auditors needed (including labor, travel, and lodging costs), as well as possible costs associated with the purchase and shipment of equipment and PE samples. Auditors need time to prepare for the audit, conduct the audit, generate the findings report, and perform any specified verification of corrective actions. Any special equipment needed for the audit should be procured, calibrated, and verified before and after the audit. The equipment should be shipped to the audit site and back, stored upon return, and properly maintained for future use. PE samples are a special consideration discussed further in Chapter 4. Audits also impact those being audited in terms of the time needed for audit meetings and activities, the time taken away from other projects, and the time and costs to support other audits conducted by other organizations. All these considerations affect the cost of audits.

CHAPTER 3
STEPS IN THE GENERAL TECHNICAL AUDIT AND ASSESSMENT PROCESS

3.1 PLANNING

Decisions about what projects to assess, what specific aspects of projects to assess, what types of audits to perform, and when and how often to perform the audits or assessments should be made by management in the context of the organization's overall quality system. Many organizations develop comprehensive SOPs to ensure uniform application and conduct of audits and assessments and to hold down costs. These procedures guide the auditors in the planning, execution, and completion of audits and assessments. Such general SOPs may cover pre-audit planning, audit preparation, audit activities, followup, and QC activities. This chapter is intended to supplement such organization-specific SOPs when they are available. It presents one of several possible approaches to audits and assessments. This approach may require adjustment to fit into a specific organization's quality system.

This chapter discusses the technical audit and assessment process in the most general terms.
The independent auditors' organization is assumed to be a third party (i.e., neither EPA nor the auditee). This chapter describes an audit or assessment as a very formal process involving written communications between the various parties. This discussion is based on the assumption that the auditors and their equipment must travel to an assessment site, which requires that the chapter address logistical steps associated with the travel. Moreover, the project being assessed likely has a limited duration and should have an EPA-approved QA Project Plan to describe current project activities accurately. Figure 2 presents some of the major steps in a technical audit or assessment. In specific instances, such as a self-assessment within EPA, some steps in the assessment process may be unnecessary and can be omitted. For example, the auditors may already be familiar with project planning documents prior to the decision to conduct the audit. It may not be necessary to notify the auditee by letter concerning an upcoming self-assessment or to make travel arrangements. Similarly, documentation of the audit and determination of corrective actions may involve fewer procedural steps for self-assessments. Periodic audits of ongoing projects may be simpler to plan because problem areas may have been identified in previous audits. 3.1.1 Decision to Conduct an Audit Audits are generally performed early in a project to identify and correct deficiencies. A balance should exist between the actual cost of the audits and the potential savings associated with not having to repeat measurements that have deficiencies. This balance favors conducting such audits early in a project. If deficiencies with sampling and analysis are not uncovered until the end of a project, resampling and reanalysis may be needed and can be considerably more expensive than the audits.
Figure 2. A Generalized Flowchart of a Technical Audit. The flowchart shows six sequential phases and their steps: Planning (decision to conduct an audit; selection of audit type; selection of audit team; planning meeting; confidentiality and audit results; review of project documents; contact with auditee; audit plan and other preparation; audit questionnaire/checklist); Performance (audit protocol; opening meeting; audit activities; observation of work; interviews; document review; objective evidence compilation; closing meeting); Evaluation (identification of findings; determination of corrective action); Documentation (draft findings report; final report); Corrective Action (implementation; verification of effectiveness); and Closeout (closeout memo; audit file closure).
3.1.2 Selection of Audit Type The EPA guidance on QA Project Plans (U.S. EPA, 1997) states: "A graded approach is the process of basing the level of application of managerial controls applied to an item or work according to the intended use of the results and the degree of confidence needed in the quality of the results." The graded approach concept is an integral part of the EPA Quality System. It starts with systematic project planning (i.e., development of DQOs or other objectives on performance criteria) and continues through the preparation of QA Project Plans and the planning of technical audits. This approach means that the content and level of detail in each QA Project Plan will vary according to the nature of the work being performed and the intended use of the data. The approach is implicit in Step 6 of the DQO process (U.S. EPA, 1994), in which decision makers specify tolerable limits on decision errors.
The graded approach should be used to guide audit planning decisions and to achieve the desired information. It helps to ensure that audit resources are used effectively and efficiently where they are needed most. Because technical audits are closely tied to QA Project Plans, the level of effort in a technical audit will be determined, to a first approximation, by the complexity and detail of the QA and QC procedures described in the corresponding QA Project Plan. For instance, a high-visibility project to determine compliance with regulations may have an extensive and complex QA Project Plan and will probably be assessed more frequently than a proof-of-concept research project with a simpler QA Project Plan. A program using a new laboratory to perform analyses that will be used to develop regulations or to determine compliance with regulations would probably have a higher priority for audits in most organizations. It is unlikely that any organization can afford to perform unlimited technical audits. Each organization should determine its priorities and budgets for audits. Management should decide what types of audits to perform on different programs with what frequency. A detailed description of the types of audits and assessments available for environmental programs is provided in Chapters 4 and 5. 3.1.3 Selection of Audit Team Once an organization has decided what audits to perform, teams for particular audits can be established. Most audit teams will consist of at least two individuals: the lead auditor and at least one supporting audit team member. They should be qualified to perform audits and have technical knowledge of the project being audited. They should not have a vested interest in the project being audited. Technical audits may be conducted by auditors from the audited organization or by independent auditors. Information on auditor attributes is provided in Chapter 2. ISO guidelines for management of audit programs (ISO, 1994b) state that audit program management should consider the following factors when selecting auditors and lead auditors for particular assignments: the type of quality system against which the audit is to be conducted (e.g., environmental data collection, computer software, or service standards); the type of service or product and its associated regulatory requirements (e.g., field sampling, laboratory analyses, instrumentation); the need for professional qualifications or technical expertise in a particular discipline; the size and composition of the audit team; the need for skill in managing the team; the ability to make effective use of the skills of the various team members; the personal skills needed to deal with a particular organization being assessed; the necessary language and communication skills; the absence of any real or perceived conflict of interest; and other relevant factors. 3.1.4 Planning Meeting Planning for an audit is critical for satisfactory assessment performance. Unless an audit is planned thoroughly, the effort of conducting it may be wasted. Negative experiences resulting from poorly prepared and poorly performed audits tend to frustrate and disillusion auditees. Worse yet, they may confirm the perception that audits are a waste of time and do not make a positive contribution to the project being audited. An audit should be properly planned to achieve quality results.
Planning begins after the decision to perform an audit has been made by management. Sometimes the need for an audit is identified at the planning stage of an intramural project or at the procurement stage of an extramural project. The audit team, the EPA QA Manager, and the EPA project officer should hold a preaudit planning meeting. The lead auditor and the supporting members of the audit team should be selected prior to this meeting. These individuals should review the project to be audited, the auditee and project personnel, the audit site, and any other information pertinent to planning for the audit during this meeting. The QA Project Plan and other project planning documents should have been reviewed prior to this meeting. The following important decisions should be made at this meeting: Final EPA QA/G-7 26 January 2000 ------- the authority for the audit (e.g., an explicit statement in a QMP or QA Project Plan), the purpose and scope of the audit, the type of audit to be conducted, the performance standards for the audit, the expected product of the audit (e.g., an audit report), any requirement for conclusions, recommendations, and suggested corrective actions, the confidentiality and dissemination of the audit results, the identification of the client, the expected budget for the audit, and the schedule for the audit and its documentation/report. Unusual circumstances may arise in which an unannounced audit is needed. In such cases, the audit team should meet with the EPA project officer and the EPA QA Manager to develop a strategy for carrying out the audit. The authority to conduct the audit must be documented in this meeting. Clear and objective audit criteria must be developed from the QA Project Plan and other project planning documents and documented in the audit checklist because the surprise audit may be conducted under a confrontational atmosphere. Observation and document review are the most important activities during an unannounced audit. The auditors may not gain much information from interviews under these circumstances. The EPA project officer and the EPA QA Manager should balance the information that might be gained from an unannounced audit with the disruption to routine project activities that might be caused by an unannounced audit. The product of the preplanning meeting should be the draft audit plan (see an example of this plan in Figure 3), which summarizes the decisions made in this meeting. The draft audit plan is the first document to be placed in the audit file for this project. The draft audit plan should be presented to management for review and approval. The audit file should be kept in an area accessible by the audit team; eventually, it should contain all of the working documents of the audit. 3.1.5 Confidentiality and Dissemination of Audit Results The confidentiality and dissemination of the audit results and other audit documents should be addressed in the preaudit planning meeting. The EPA project officer and the EPA QA Manager should decide whether these documents should remain confidential. This decision should be communicated explicitly to the auditors and the auditee prior to the audit. Planning documents also should define a policy for the dissemination of information about the findings of audits. Audit reports are generally considered internal reports and do not receive wide distribution in most cases. The auditee should be informed about how and to whom the reports will be distributed before the audit begins. 
123 Environmental Corporation Audit
Audit Number: 99-068
Contract Number: 68-G-9999
EPA QAM: Michael O'Brien
EPA Project Officer: Julia Bennett
Auditee: XYZ Corporation
Location of Audit: Anytown, CA
Dates of Audit: July 21 and 22, 1999
Audit Team Members: Susan Davis, 123 Environmental; Frank Michael, 123 Environmental
Purpose: To verify conformance with project documentation for EPA contract number 68-G-9999 and to report any nonconformances
Audit Scope: Sample receiving, storage, and extraction; sample analysis (including PEs); data management
Requirements: As specified in existing policies and procedures.
Documents to be Reviewed: Work plan, quality assurance project plan, reports from previous audits.
Proposed Schedule:
June 30 - Telephone call to XYZ Corporation
July 1 - Notification letter sent to XYZ Corporation
July 20, morning - Performance evaluation samples arrive on-site
July 20, afternoon - Audit team travels to site
July 21, morning - Opening meeting (Attendees: Jim Boyd, Elizabeth Wright, Susan Davis, Frank Michael)
July 21, morning - Audit sample handling facilities
July 21, afternoon - Audit laboratory, review documents
July 22, morning - Audit data management section, review documents
July 22, afternoon - Audit team meeting
July 22, afternoon - Closing meeting (Attendees: Jim Boyd, Elizabeth Wright, Susan Davis, Frank Michael)
July 22, afternoon - Audit team returns from site
August 5 - Submit draft findings report to EPA QA Manager
August 19 - EPA QA Manager sends draft findings report to XYZ Corporation
September 22 - Receive comments from XYZ Corporation
October 7 - Submit final report to EPA QA Manager
October 15 - EPA QA Manager submits final report to XYZ Corporation
October 22 - Add closeout memorandum to audit file
Figure 3. Example of an Audit Plan
The confidentiality decision may not be easy. At first glance, the findings of an EPA-funded audit should be made available for public inspection because public funds are involved. However, businesses have a legitimate right to preserve the confidentiality of trade secrets and other confidential business information and to limit its use or disclosure by others so that the business may retain the business advantages it derives from its rights to the information. EPA's regulations relating to this issue are found in "Confidentiality of Business Information" in 40 CFR 2.201. Other statutory requirements and Federal Acquisition Regulations also could apply to the confidentiality decision. EPA rules under the Clean Air Act; the Federal Insecticide, Fungicide, and Rodenticide Act; the Resource Conservation and Recovery Act; the Toxic Substances Control Act; and other environmental statutes regarding confidential business information may apply to some or all items observed during an audit. The planning documents should indicate if this type of confidentiality must be maintained. For particularly sensitive projects, nondisclosure agreements between EPA and the auditee may be needed and should be considered during the preaudit planning. The use of an auditor from within the audited organization may be more appropriate than an independent auditor for particularly sensitive projects.
As a practical matter, auditors may find that an auditee may not be completely candid about its management and technical procedures if it believes that the dissemination of such information by the auditors might give an unfair procurement advantage to a competitor or result in adverse publicity for the auditee. The auditors may find it difficult to gain access to project- related documents containing trade secrets in a nonconfidential audit. The auditors should remember that the primary purpose of the audit is to correct deficiencies in the project, not to affix blame on the auditees. They should also remember that the clients for the audit are the EPA project officer and the EPA QA Manager. 3.1.6 Review of Project Documents Project planning documents usually consist of work plans and QA Project Plans or monitoring sampling and analysis plans. These documents may refer to other supporting documents such as SOPs. There may be prior publications or reports on earlier phases of the project or reports of previous audits of the project. All of these materials may help the audit personnel plan the audit. The purpose of the preaudit document review is to provide the auditors with as much information as is available about the auditee's quality system and about the basic technical aspects, performance criteria (e.g., DQOs), and DQI goals of the project. The auditors should understand the basic technical aspects prior to the audit because they will be expected to start the audit soon after arriving at the audit site. Time spent reviewing documents before the audit reduces the extent that project activities are disrupted during the audit. The auditors should know the names of the project manager, the project QA Manager, and project personnel as well as the roles of these individuals in the project. The auditors should review project planning documents Final EPA QA/G-7 29 January 2000 ------- for information about the physical layout of the audit site and for descriptions of the equipment that will be audited. The auditors should develop an audit questionnaire/checklist that contains performance criteria that are based on the project's specific performance criteria (e.g., DQOs) and DQI goals, rather than subjective criteria, which may not be appropriate for this project. 3.1.7 Contact with Auditee Before the preaudit planning meeting, the EPA project officer should have notified the auditee of the upcoming audit (see Figure 4). After this meeting, the lead auditor should inform the appropriate project management and audit site management (if the auditee does not control the site) about the scope of the audit, the approximate date of the audit, the identification of the audit team personnel, and any special needs for the audit (see Figure 5). Notifying the auditee and the audit site organization gives them the opportunity to more effectively prepare for the audit, perhaps by focusing on critical areas pertaining to the audit subject. This notification should be in writing and should allow the auditee sufficient time to prepare for the arrival of the audit team, if the audit involves travel. If the initial contact is made by telephone, it should always be followed by written confirmation. The project management and the audit site management should confirm that the proposed audit date is suitable relative to the project schedule or they should propose alternative dates. 
The lead auditor may wish to discuss a proposed audit schedule with the project management so that specific audit activities can be coordinated with project activities and with project personnel availability. Audit activities should disrupt project activities only to the extent necessary for performing the audit. This discussion would also allow for revisions to the audit schedule should any conflicts with original schedule be identified. The lead auditor should contact the project manager to request any additional information needed for the audit. This information may consist of project-specific information such as the laboratory management plan or reports of ongoing interlaboratory QA programs in which the auditee may be participating. Useful information that is typically requested includes a list of the particular equipment or instrumentation used for the project, the date of the last calibration of each piece of measurement equipment, recent QC data, and the identity and background of the project personnel. It is usually useful to know the current status of the project; that is, have all sampling activities been completed and has analytical work begun? Sometimes a preaudit site visit is useful for audits involving travel. The lead auditor should determine the appropriate point-of-contact at the audit site, the travel route to the audit site, and the procedure for gaining entry to the audit site. The lead auditor should identify the safety requirements for the audit site (if any) and determine what personal protective and safety equipment is needed. Final EPA QA/G-7 30 January 2000 ------- UNITED STATES ENVIRONMENTAL PROTECTION AGENCY National Laboratory for Environmental Research Research Triangle Park, NC 27799 ECOSYSTEM RESEARCH DIVISION June 25, 1999 Mr. Jim Boyd XYZ Corporation 10 Main Street Anytown, CA Dear Mr. Boyd: As authorized by the Inspections Clause in the Federal Acquisition Regulations, the U.S. Environmental Protection Agency (EPA) will conduct a technical systems audit and a performance evaluation of your quality assurance program activities under contract number 68-G-9999 on July 21 and 22, 1999. These audits will be conducted by representatives of 123 Environmental Corporation under a support contract with EPA. 123 Environmental is authorized to represent EPA in this audit. Ms. Susan Davis, the 123 Environmental Corporation lead auditor, will be contacting you shortly with additional details about the audits and to confirm the agendas for the audits. If you have any questions before then, however, please contact me at 001-222-3333 or by Email at julia.bennett@gov. Sincerely yours, Julia Bennett EPA Project Officer cc: Susan Davis, 123 Environmental Corporation Frank Michael, 123 Environmental Corporation Michael O'Brien, EPA QA Manager Figure 4. Suggested Format for the Notification Letter Final EPA QA/G-7 31 January 2000 ------- 123 Environmental Corporation July 1, 1999 Mr. Jim Boyd XYZ Corporation 10 Main Street Anytown, CA Dear Mr. Boyd, Representatives of 123 Environmental Corporation will conduct a technical systems audit (TSA) and a performance evaluation of your quality assurance program activities under U.S. Environmental Protection Agency (EPA) contract number 68-G-9999 on July 21 and 22 as agreed upon during our telephone conversation on June 30. The enclosed audit plan describes the audit scope, the activities to be assessed, the applicable documents, the personnel conducting the audit, and the proposed schedule. 
Please ensure that adequate facilities are available for conducting the opening and closing meetings and for the audit team to caucus and review documents. Please notify cognizant project management and other appropriate project personnel of the proposed audit schedule so that they are available to attend the opening and closing meetings and to escort the auditors, as necessary. The escorts should be knowledgeable about the technical aspects of the project activities being assessed. The auditors will want to see project personnel performing their normal project activities and will want the opportunity to interview them. The auditors also may want to observe project documentation, storage facilities, packing and transport of samples, calibration standards, quality control measurements, and routine project data. The auditors will attempt to accomplish their work with as much efficiency and as little disruption of the daily routine as possible. If you require additional information about the upcoming technical audit, please contact me or Frank Michael at 012-345-6789 or qa@123env.com. Sincerely, Susan Davis, Lead Auditor Enclosure cc: Frank Michael, 123 Environmental Michael O'Brien, EPA QA Manager Julia Bennett, EPA Project Officer Figure 5. Suggested Format for the Follow-up Letter Final EPA QA/G-7 32 January 2000 ------- After the above information-gathering discussions, the lead auditor should set the agenda for the audit (see Figure 6). The agenda should consist of a complete outline of the audit activities. If possible, the agenda can contain a proposed audit schedule. Everyone involved with the audit should understand that the actual timing of audit activities may vary from the proposed schedule due to last-minute changes in project schedules or the need for more extensive discussion of specific audit items. For audits involving travel, the lead auditor should write a letter to the point-of-contact at the audit site and send a copy to the appropriate project and site management. The letter should discuss the logistics associated with the audit team's travel plans, equipment shipments, audit team lodging, dates and times of arrival and departure, and a telephone number where audit personnel can be reached (see Figure 7). The lead auditor should include a copy of the audit agenda with this letter. The lead auditor will make logistical arrangements, such as finalizing the audit date with the auditee, selecting a hotel (often the auditee will offer good suggestions), making travel arrangements, shipping and delivering any needed equipment to the point-of-contact at the audit site, and providing any needed PE samples and supplies. Copies of all correspondence pertaining to the audit should be provided to all audit team members. 3.1.8 Audit Plan and Other Preparation The audit plan is essentially a work plan that documents what, when, and how the audit will be done. It should incorporate any changes based on preaudit contacts before being finalized. The client and the auditee should receive the final audit plan prior to the audit to ensure that all interested groups understand and agree to what the auditors plan to do before the audit begins. The audit file for this project should be created before the audit. 
Prior to the audit, this file should contain the audit plan, a questionnaire for the interviews, a checklist for recording key items, the letter to the auditee, the audit agenda, the preaudit information supplied by the auditee, the auditors' travel arrangements, and any other information specific to the audit. After the audit, notes, complete checklists, the closeout memo, and all reports should be added to this file. Other preparations may include preparing any special forms. For example, PE samples may be regarded as "hazardous cargo" that cannot be transported on commercial airlines. Such samples must be shipped in accordance with applicable regulations. Also, the equipment to be used may require calibration. A checklist should be prepared by the lead auditor based on review of the QA Project Plan and other project documentation. This checklist should be reviewed by the audit team and by other appropriate individuals, such as the EPA QA Manager, before the audit begins. All members of the audit team should have a clear understanding of their individual Final EPA QA/G-7 33 January 2000 ------- 123 Environmental Corporation Agenda for Technical Audit of XYZ Corporation Audit Team Susan Davis, lead auditor Frank Michael, auditor July 20, 1999 10:30 a.m. Performance evaluation samples and associated equipment should arrive by air freight delivery at XYZ's Anytown facility; Jim Boyd is listed as the XYZ Corporation point-of-contact. 9:10 p.m. The 123 Environmental Corporation audit team flies to Anytown on Jumbo flight 0000. July 21, 1999 9:00 a.m. Audit team arrives at the facility and attends safety orientation. 9:30 a.m. Opening meeting with XYZ personnel (Jim Boyd, Elizabeth Wright). 10:15 a.m. Team begins audit in the sample receiving area; relinquishes custody of the PE samples; and continues into sample storage and extraction facilities. 12:15 p.m. Team breaks for lunch. 1:00 p.m. Team resumes audit in the analysis laboratory. 4:00 p.m. Team collects data and documents and begins preliminary review for completeness. 6:30 p.m. Team breaks for dinner. July 22, 1999 8:30 a.m. Audit team arrives at the audit site; updates the point-of-contact at the site on the status of the audit. 9:00 a.m. Team resumes audit in the project's data management section. 11:00 a.m. Team collects any additional data and documents and continues preliminary review. 12:15 p.m. Team breaks for lunch. 1:00 p.m. Audit team meets to discuss observations thus far and prepares list of any additional materials required. 2:45 p.m. Closing meeting with XYZ personnel (Jim Boyd, Elizabeth Wright). 4:00 p.m. Audit team leaves for airport. Figure 6. Example of a Technical Audit Agenda Final EPA QA/G-7 34 January 2000 ------- 123 Environmental Corporation July 10, 1999 Mr. Jim Boyd XYZ Corporation 10 Main Street Anytown, CA Dear Mr. Boyd: We will be shipping performance evaluation samples and associated equipment for the audit directly to your facility for receipt on the morning of July 20, 1999. The samples will be delivered to your attention. Please contact me by noon if you have not yet received them. The audit team from 123 Environmental Corporation (Frank Michael and myself) will be arriving in Anytown on the evening of Wednesday, July 20, on Jumbo flight 0000. We will be staying at the Anytown Starlight Motel. Please call us there at 111-555-1000 if there are any changes we need to know of before we arrive at your facility. As we have discussed, we plan to arrive at the facility at 9:00 a.m. 
on July 21 and will need to leave by 4:00 p.m. on July 22 to catch our flight. We are scheduled to attend a safety orientation as soon as we arrive at the facility. Please see the enclosed agenda for more details. Please contact me at 012-345-6789 or qa@123env.com if you have any questions about the audit. Sincerely, Susan Davis, Lead Auditor Enclosure cc: Frank Michael, 123 Environmental Michael O'Brien, EPA QA Manager Julia Bennett, EPA Project Officer Figure 7. Example of a Logistical Letter
responsibilities before the audit. They should be aware that health and safety may be key factors in some audits and that appropriate preparations to safeguard the auditors' health and safety should be made before visiting the site. 3.1.9 Audit Questionnaire The results of an audit should be based on objective evidence or observations about the project being assessed. The audit questionnaire used for some types of audits is a document for systematically recording objective evidence from interviews. It is useful as a means to obtain information that has not been documented by the auditee. When completed, the questionnaire should demonstrate that an audit was conducted, that it was conducted in an orderly and complete manner, and that it examined all important aspects of the project. It should consist of a series of specific questions about the project or issues to examine during the audit. These questions should be based on documented statements in the QA Project Plan or related project documents about specific, observable activities that will be performed during the project, rather than on more general principles that may be hard to define in practice. (For example, "How are measurements recorded?" is a better question than "Are good record-keeping procedures being followed?") When broader explanation of an issue is needed, close-ended questions (i.e., yes/no) should be avoided and open-ended questions should be used. Such questions allow the interviewee to explain the answer more completely. The questions may be qualitative or quantitative as needed, but they should be written to include specifications for acceptable performance or conformance. If possible, cite the specific section of the QA Project Plan or other project document that is the basis for the specification. Time spent developing these performance specifications in the planning phase can save time during the audit. It is better to have more questions, each with a narrow focus, than a few overly broad questions, which may be difficult to answer succinctly. Questioning is discussed in more detail in Section 3.2.5. 3.1.10 Audit Checklist An audit checklist is another method for describing project elements to be assessed. It provides a commonly accepted method for documenting the elements and the audit findings for these elements. The completed checklist provides objective evidence that an audit was performed, that it was performed in an orderly manner, and that all applicable aspects of a quality system were addressed during the audit. Anyone who performs audits frequently will probably find it difficult to retain the detailed findings of every audit in memory. The completed checklist also provides a basis for the closing meeting and the audit report. See Appendix B for an example of a TSA checklist that illustrates one possible format for such documents.
The questions in the audit checklist should be based on performance criteria, such as DQOs, that are listed in the project's QA Project Plan and other planning documents. They should be written so that they can be answered by observation of objective evidence during the audit. A question requiring an objective response (such as "Is the spectrophotometer calibrated daily using certified reference standards at three different concentration levels?") is better than a question requiring a subjective response (such as "Is the spectrophotometer calibration procedure adequate?"). The format of a specific checklist should be dictated by the needs of the audit. Some checklists are in a question-and-answer format where the answer is either "yes" or "no." Such checklists are easy to review because the "no" answers attract attention. The inherent assumption is that if all the questions have "yes" answers, the quality system elements assessed are adequate. This format does not allow the auditor to be analytical in documenting the findings of an audit. Many elements are not cut-and-dried and do require some detailed evaluation. Also, a relatively inexperienced auditor can be misled or confused looking only for "yes" or "no" answers. Other checklists provide a question-and-answer format in which the auditor must write out the answer and any qualifying remarks. This format is more difficult to review than the "yes/no" format, but it provides more information with which to judge the performance of a quality system element. Regardless of format, a checklist should always have enough room for comments and information on which the audit findings will be based. It also should provide room for identifying the audit particulars, such as the auditee, the auditors, the audit date, and the audit site. This information helps avoid confusion of audit records in the future. The use of generic checklists for multiple audits is discouraged. A "one-size-fits-all" checklist may overlook important, unique features of a project even for routine, repetitive activities. A checklist should be tailored to the specific project being assessed. It is not appropriate to take a checklist developed for one project and apply it to a different project. During the process of developing a project-specific checklist, auditors will review project-specific documents and become better prepared to conduct the audit. A project-specific checklist developed for a long-term project may be reused in subsequent audits of that project, but it should be revised to incorporate information gained during the earlier audits. In one case history, an audit team involving several auditors used a generic checklist to conduct the audit. When the auditors went over the checklist, they discovered that they had trusted their own expertise to simply know what was right during the course of the audit. When the auditors talked as a group, they realized that each auditor was inadvertently using individual and subjective audit criteria. The credibility of the audit team suffered because of this inconsistency. After this incident, the audit team developed a much more specific checklist, based on a very specific set of requirements. This approach gave the audit team a solid foundation and improved consistency. The auditors were still allowed to use their professional judgment. The requirements formed the basis of the program, the checklists, and the reports. The team continued to fine-tune its requirements.
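Organizations that maintain their checklists electronically may find it convenient to record each item together with the QA Project Plan citation on which it is based, the objective evidence observed, any qualifying comments, and the audit particulars noted above. The following minimal sketch (Python) is offered only as an illustration; the field names, section citation, and example entries are hypothetical and would be adapted to an organization's own SOPs and checklist format.

    # A minimal sketch, not part of this guidance, of one way a project-specific
    # checklist might be kept electronically. All field names, the QA Project Plan
    # citation, and the example entries are hypothetical illustrations.
    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class ChecklistItem:
        item_id: str                      # e.g., "CAL-01" (hypothetical)
        qapp_citation: str                # QA Project Plan section stating the requirement
        question: str                     # written to be answerable by objective evidence
        response: Optional[str] = None    # what the auditor observed, not just yes/no
        objective_evidence: List[str] = field(default_factory=list)  # notes, copies, logs
        comments: str = ""                # room for qualifying remarks

    @dataclass
    class AuditChecklist:
        auditee: str
        audit_site: str
        audit_dates: str
        auditors: List[str]
        items: List[ChecklistItem] = field(default_factory=list)

        def open_items(self) -> List[ChecklistItem]:
            """Items not yet answered or not yet supported by objective evidence."""
            return [i for i in self.items if i.response is None or not i.objective_evidence]

    # Hypothetical usage
    checklist = AuditChecklist(
        auditee="XYZ Corporation", audit_site="Anytown, CA",
        audit_dates="July 21-22, 1999", auditors=["S. Davis", "F. Michael"])
    checklist.items.append(ChecklistItem(
        item_id="CAL-01",
        qapp_citation="QA Project Plan calibration section (hypothetical citation)",
        question=("Is the spectrophotometer calibrated daily using certified "
                  "reference standards at three different concentration levels?")))
    print(len(checklist.open_items()), "item(s) still need objective evidence")

A structure of this kind preserves room for comments and audit particulars, consistent with the guidance above, while still allowing a quick scan for items that lack objective evidence.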
Final EPA QA/G-7 37 January 2000 ------- Although the checklist is a guide for the audit, the audit should not be limited to only issues mentioned in the checklist. During an audit, an auditor may need to deviate from the checklist to determine if nonconformances exist and what their significance might be. Important features of the project may be observed that were not considered while the checklist was being developed. The format of the checklist should provide space for comments and observations of any unexpected phenomena. 3.2 PERFORMANCE OF THE AUDIT This section discusses the elements of audit performance: audit protocol, the opening meeting, audit activities and techniques, and the closing meeting. Each audit differs depending on the project, the site, the type of audit, and the circumstances. A typical audit of an external organization will possess certain common characteristics, including an opening meeting, the audit activities, and a closing meeting. The audit should always begin with an opening meeting with management and key project personnel and end with a closing meeting with the same individuals if possible. The suggested timing in the audit agenda lets project managers get their personnel started on the day's work without any disruption by the auditors' presence, allows them to focus on the audit when the audit team arrives, and minimizes the necessary disruption in the auditee's activities. Separating the audit into multiple days allows audit team members to assimilate their thoughts, review their notes, consult with each other, and prepare questions to clarify their findings without unnecessarily prolonging the audit. 3.2.1 Audit Protocol Audit teams should remain calm and be professional at all times, particularly during interviews. The audit team should not have preconceived notions about what it will find during the audit. The audit should be understood as an objective evaluation of the project to help the project attain its stated goals, rather than as a means of criticizing the project or the project staff. It is the responsibility of the audit team to establish an atmosphere of trust and cooperation. An audit should be conducted in accordance with established requirements developed by the auditors' organization. Any agenda or specific protocols established during preaudit planning should be used to ensure that the audit is conducted effectively and safely. Auditors should keep their points-of-contact at the audit site informed of their activities to preclude surprises during the closing meeting. This communication may include requests for additional assistance or expressions of concerns that require immediate action on the part of the auditee. Timely communication allows the auditee to verify the accuracy of the observations and to provide relevant facts and background information on the issues. Daily audit team meetings may be useful to ensure continuity and overall focus. Daily meetings provide the lead auditor with information on the completion status of the audit checklist Final EPAQA/G-7 38 January 2000 ------- and identify issues requiring additional action (e.g., clearances, access, requests for personnel or material, and impasse resolution). These meetings also provide the setting for informing other team members of issues that may be of interest in their assigned scope or for integrating data gathered by the various auditors. The meetings should be brief so that they do not reduce the team members' time with the processes and people they are assessing. 
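Where checklist items are tracked electronically, the completion status mentioned above can be summarized quickly for a brief daily team meeting. The short sketch below (Python) is purely illustrative; the area names and counts are hypothetical.

    # A minimal, hypothetical sketch of how a lead auditor might tally checklist
    # completion status by assigned area for a daily team meeting.
    from collections import Counter

    def completion_status(items):
        """items: iterable of (area, completed) pairs; returns per-area tallies."""
        done, total = Counter(), Counter()
        for area, completed in items:
            total[area] += 1
            done[area] += 1 if completed else 0
        return {area: f"{done[area]}/{total[area]} complete" for area in total}

    status = completion_status([
        ("sample receiving", True), ("sample receiving", True),
        ("laboratory analysis", True), ("laboratory analysis", False),
        ("data management", False),
    ])
    for area, summary in sorted(status.items()):
        print(f"{area}: {summary}")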
It is important that sufficient information be gathered during the audit to determine whether project requirements are being met and whether a process meets the performance criteria that have been established for it. The auditor should be able to clearly state the criterion impacted by the process. To accomplish this, the auditor may deviate from the audit agenda to determine the extent and significance of an issue; however, such deviations from the plan should be carefully weighed to determine if the expected results justify the changes. Deficiencies that affect the auditor's ability to complete the audit agenda should be communicated to the team leader and the auditee. An audit can be stressful for all parties involved. The auditees will almost always be anxious and may feel defensive. Auditors may feel pressure to develop an understanding of a complex system in a short period of time and may not be sensitive to the perspective of the individuals being assessed. In view of their greater familiarity with the audit process, auditors should take the initiative to put everyone at ease and to make the audit a positive experience. Stress can be reduced if the entire audit team understands the agenda before arriving at the audit site, if the agenda is clearly explained to the auditee during the opening meeting, and if the audit is executed according to the agenda. Stress, fear of the unknown, or natural resistance to change may, in some instances, be expressed as defensiveness or confrontational behavior on the part of auditees. The audit team should be aware that such behavior is possible and should not react to it with similar behavior. Such behavior by any party interferes with the goals of the audit. After the audit team has done all that it can to reduce stress and defuse any difficult situations, it may be necessary to postpone the audit to the next day to allow auditees an opportunity to regain their perspective and composure. The audit should be resumed by restating the purpose and objectives of the audit. If the situation does not improve the next day, the audit team should consult with its client and then make a decision about whether the audit should proceed. One of the most readily perceived qualities by interviewees is attitude. Being in a stressful situation may require more mental fortitude by the interviewer than technical knowledge. Although an auditor's lack of knowledge on a particular subject can be compensated for by other team members, attitude problems cannot. Each audit team member should contribute to a harmonious demeanor in order to ensure an effective audit. Each member should be supportive of one another and should avoid disagreements in the presence of the auditee. In general, the auditors should not do anything during an audit that has not been described in the agenda. For example, do not bring a PE sample to a TSA unless it is covered by the scope Final EPA QA/G-7 39 January 2000 ------- of the audit. By following the audit agenda, auditors can avoid increasing the innate stress of the audit process. 3.2.2 Opening Meeting The opening meeting should be attended by the audit team, the project manager and the project QA Manager of the auditee, and a representative of the audit site or facility. Key project personnel should attend the meeting if the meeting can be scheduled to avoid disrupting essential project activities. In any case, project personnel should already be aware that an audit is being conducted. The lead auditor should lead this brief meeting. 
The lead auditor should introduce the members of the audit team and review the purpose and scope of the audit. General areas of the project that will be examined should be described. Additional copies of the audit agenda and the audit questionnaire and/or checklist should be distributed at the meeting. Introductions of project and host site personnel and comments from them should be solicited. Logistical details, such as the locations of various facilities at the audit site and safety precautions, should be discussed in the meeting. Most importantly, the auditee should be informed about what will and will not be done with the findings of the audit. Last, the auditee should be invited to ask any questions about the audit. Figure 8 gives the framework and characteristics of an opening meeting. One useful audit technique during an opening meeting is to enlist the project personnel by asking their advice on what would be the best manner to conduct the audit. Auditors often spend a great deal of time during opening meetings explaining their view of the audit and creating an agenda and a schedule for the audit. The project personnel may be able to suggest a more efficient audit agenda and/or audit schedule. In addition, the auditee may review the checklist during the opening meeting and answer some key audit questions before leaving the meeting. These answers should then be confirmed during the course of the audit. It also can be useful to ask project personnel if there are any specific areas where they expect the auditors to identify findings. This approach allows personnel from the auditee to share their own concerns up-front, smooths the flow of the audit, and frees time for more meaningful discussions. Consider this case history of two auditors. Emissions testing was being conducted at an industrial facility by two EPA contractors. Because each contractor was funded by a different group inside EPA, a TSA of each contractor was conducted by a different EPA auditor. On one day, experienced Auditor A arrived to assess Contractor A. She began the audit with an opening meeting where she introduced herself, met Contractor A personnel, learned the scheduled measurement activities for the day, and reviewed the TSA agenda. Later that day, she interviewed Contractor A personnel at convenient lulls in their activities. On the next morning, inexperienced Auditor B arrived at the audit site and began to conduct the TSA without holding an opening meeting. He approached Contractor Final EPA QA/G-7 40 January 2000 ------- Figure 8. Characteristics of an Opening Meeting Introduce all members of the audit team. Introduce auditee management and key project personnel. Distribute the audit questionnaire and checklist (as applicable). Discuss logistical details and safety precautions. Describe: - What will be done. - How it will be done. - What will/will not be done with the results. Emphasize: - Audit plan agenda. - Findings based on objective evidence. - "No surprises" style. A personnel, interrupted instrument calibration activities that had to be completed before measurements could begin, and began to ask questions. He was told that he was assessing the wrong contractor and was directed to Contractor B personnel, whose calibration activities also were interrupted by his questions. Auditor B lost credibility before his TSA started. 3.2.3 Audit Activities The audit should follow the agenda as closely as possible, both to minimize disruption of project work and to ensure that all activities are performed. 
A typical audit may begin with a tour of the project and nearby areas of the audit site to familiarize the auditors with the facilities. This tour should be conducted by the project manager and an audit site representative. Depending on the circumstances, a pretour safety orientation may be necessary. If so, all members of the audit team should use appropriate safety gear, which they have brought to the site or which the host site has provided. The audit team should be prepared to take notes during audits. They should have the audit checklist in hand to remind them of the necessary questions to ask during each part of the audit. The performance standards against which the project's activities are being measured should have been reviewed in detail by the audit team; they form the basis for the checklists. Throughout the audit, the auditors should compare what they observe with what the QA Project Plan requires. EPA QA/G-7 41 Final January 2000 ------- The audit should follow the agenda to the extent possible based on current conditions at the audit site. For example, for facilities that process samples from remote sampling sites, a typical place to start the audit is at the receiving station for the incoming samples. The auditor should observe the samples as they are unpacked, logged in, handled, and stored. The auditor should have notes about the identities of samples that have passed through the receiving station earlier and should verify proper implementation of chain-of-custody procedures. A good audit technique is to: ask receiving personnel to look up a few samples in the incoming logs to compare the information with that contained in the project planning documents, note the information contained in the incoming sample log and ask for copies of appropriate pages of the log, ask for the current location and status of the samples, and trace the progress of samples from receiving through storage through processing to retention and/or disposal. There will be many opportunities to collect copies of the samples' paper trail along the way. Any facility that handles the project samples should be investigated and information on it gathered. Typical facilities include refrigerators, extraction or digestion laboratories, and analysis laboratories. There may be storage facilities for calibration standards, samples, and chemicals; balance rooms; special water treatment systems; and specialized climate-controlled instrument rooms. Auditors should have access to and examples of any electronic tracking systems, such as laboratory information management systems. All these facilities should meet the specifications given in the project planning documentation. The auditors should observe project personnel performing their duties; interview them to determine their understanding of the activities they are performing; and examine records of their education, training, and experience in similar work. During another TSA, the project manager insisted on leading the "tour " of the laboratory and explained each function in each area in detail. The auditor discovered during a subsequent review of notes that the real areas of interest to the auditor were not covered in sufficient detail. The project manager had treated the audit as if it were a site visit by a client. The auditor returned to each area and asked the laboratory personnel to describe the details of the operation. 
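The sample-tracing technique described above lends itself to a simple cross-check of the facility's incoming sample log against the sample list in the project planning documents. The following sketch (Python) is illustrative only; the sample identifiers, matrices, and log fields are hypothetical, and any discrepancies it flags would still need to be verified with project personnel and the underlying records.

    # A minimal sketch (hypothetical identifiers and fields) of the sample-tracing
    # cross-check: compare a few entries from the facility's incoming sample log
    # against the sample list in the project planning documents and note any
    # discrepancies for follow-up during the audit.
    planned_samples = {          # sample ID -> expected matrix, from planning documents
        "S-1001": "soil", "S-1002": "soil", "S-1003": "groundwater"}

    receiving_log = [            # entries copied from the incoming sample log
        {"id": "S-1001", "matrix": "soil", "custody_signed": True,  "location": "cooler 3"},
        {"id": "S-1003", "matrix": "soil", "custody_signed": False, "location": "bench"},
    ]

    discrepancies = []
    logged_ids = {entry["id"] for entry in receiving_log}
    for sample_id in planned_samples:
        if sample_id not in logged_ids:
            discrepancies.append(f"{sample_id}: planned but not found in receiving log")
    for entry in receiving_log:
        expected = planned_samples.get(entry["id"])
        if expected and entry["matrix"] != expected:
            discrepancies.append(f"{entry['id']}: logged matrix differs from planning documents")
        if not entry["custody_signed"]:
            discrepancies.append(f"{entry['id']}: chain-of-custody form not signed at receipt")

    for note in discrepancies:   # items to verify with project personnel and documents
        print(note)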
During the audit, the auditors may encounter situations they believe require corrective action to maintain data quality or to protect health and safety. They should use their best professional judgment to approach these situations. If immediate action is not required, they should discuss the problem with the other team members and with the EPA QA Manager and Final EPA QA/G-7 42 January 2000 ------- EPA project officer before deciding on the best course of action. For most situations involving audits of extramural activities, third-party auditors do not have the authority to issue stop-work orders. Only the EPA project officer can do this. If the auditors determine that an unsafe or severely deficient condition exists, the management of the organization being assessed and the EPA project officer should be notified immediately. 3.2.4 Observation of Work The observation of actual work activities as part of a technical audit is often considered the most effective technique for determining whether performance of these activities is adequate. Surveillance, discussed in Chapter 4, is a type of technical audit involving observation of a specific technical activity on an extended basis. Auditors should understand the effect of their presence on the person being observed. They should convey an attitude that is helpful, constructive, positive, and unbiased. The primary goal during observation is to obtain the most complete picture possible of the performance, which should then be put into perspective relative to the overall program, system, or process. Before drawing final conclusions, the auditor should verify the results through at least one other technique, such as review of the documentation. The auditors should observe the activities of the project being performed (according to the methods discussed in the audit plan and QA Project Plan) by the project personnel who regularly perform the activities. This observation time usually provides a good opportunity to complete questions on the checklist about the activity and to ask questions of the operator during lulls in the activity. The auditor may make sketches and take notes in order to determine nonconformances from the QA Project Plan implementation (or lack thereof) and gather objective evidence for his or her findings. When nonconformances are discovered, the auditor should attempt to determine the impact of the nonconformance (i.e., deficiency or weakness) by questioning the operator and examining past records. The auditor should not enter into the activity or interfere with the operator of the instrument. The auditor should not make suggestions to the operator or attempt to correct deficiencies. Suggestions and corrections should be evaluated thoroughly and discussed with the EPA project officer before being conveyed to the project manager for appropriate corrective action. 3.2.5 Interviews Interviews provide a means to verify the results of observation, document review, and inspection during audits. In addition, interviews allow the responsible person to explain and clarify those results. The interview helps to eliminate misunderstandings about program implementation, provides a dialogue between the auditor and the auditee, identifies who can explain apparent conflicts or recent changes, and describes the functional organization and program expectations. Checklists developed during audit planning are used to prepare for the interview, which may address points not considered when the checklist was being developed. 
Final EPA QA/G-7 43 January 2000 ------- Auditors also should prepare questions in advance to keep the interview focused. Interview considerations and techniques are given in Figure 9. A minimum of two people should conduct an interview to provide corroboration and allow for efficient questioning. While one interviewer is asking a question and recording the responses, the other is able to listen actively to the response and to formulate a more thoughtful followup question based on the response. The use of two interviewers helps to ensure that the statements by the respondent are recorded accurately. The corroboration provided by the second interviewer reduces the likelihood of anyone claiming later that a particular answer was never given. Moreover, if there is any confusion about what was said, the two interviewers can discuss the response and agree on what was said. Figure 9. Interviewing Considerations and Techniques Interview considerations: - Group questions by topic or subject. - Maintain a conversational pace. - Consider stress and fatigue (both yours and theirs). Questioning techniques: - Talk less, listen more. - Speak slowly and clearly. - Speak to the plan. - Avoid yes/no questions. - Repeat answers for verification or clarification. In some audits, a significant portion of time is spent interviewing personnel. How questions are asked affects the amount and quality of information received. Preparation for interviewing personnel is the first and most important step to ensure that the proper questions are asked. Auditors should take notes during interviews to ensure that accurate information is being recorded. The importance of active listening, instead of talking and formulating the next question, cannot be emphasized enough. It is important to encourage personnel to respond in as much detail as possible by asking open-ended questions, often those starting with how, when, why, who, and what, instead of yes/no questions. Statements made during interviews should be verified. For example, if, in answer to a question, the interviewee claims to have a document, ask to see it or ask for a copy of it. Clarification techniques such as probing, paraphrasing, and summarizing can be used to make sure that the information received from interviews is clear and complete. Probing is asking followup questions to further explore something the interviewee said. Paraphrasing is repeating and rewording important points. Summarizing is recapping and repeating a set of major points to make sure that all of the important information has been noted correctly. EPA QA/G-7 44 Final January 2000 ------- It is important for auditors to consider the fine line between assessing and consulting. They should ensure that this line is not accidentally crossed in casual conversation. Auditors are often tempted to offer advice based on their own experience. This temptation is particularly strong for auditors who have extensive experience. They are even encouraged by the auditee to share their knowledge. The openly shared thought of the auditor may be taken as a recommendation of significance by the auditee. During one TSA, the auditor asked the laboratory manager why he was erecting a wire cage wall in the sample receiving area. The auditor had never seen a similar arrangement. The laboratory manager answered that the last auditor in the laboratory had told him that the wall was a good idea. 
3.2.6 Document Review Document review may be used extensively during an audit to substantiate the information obtained during interviews and observations. During the course of an audit, questions may arise concerning what is heard and seen. The review of documents provides a method for answering these questions and validating the audit results. Document review alone, however, cannot usually ascertain the degree to which documents accurately reflect work activities. This technique should be combined with interviews, observation, and/or inspection to complete the performance picture. Records and documents should be selected carefully to ensure that they are representative and adequately characterize the program, system, or process being assessed. Written documents that are examined, depending on the scope of the audit, may include project records such as QA Project Plans, reports of prior audits, bench sheets, calibration readouts, QC charts, process data readouts, sample logs, custody papers, instrument logs, printouts from spreadsheets, and maintenance notebooks. Moreover, such records may be in electronic form. In addition, they may include general records such as a laboratory QA management plan or quality manual. The auditors may ask to see the results of periodic interlaboratory round-robin performance tests. The auditors may compare the analyst's handwritten notes with the spreadsheet data to verify data entry. They also may ask for copies of materials such as notebook pages with specific entries that may provide objective evidence for a particular finding. If possible, auditors should review any available documents before the audit to reduce the time spent at the audit site. To accomplish such preaudit document review, the auditee should provide the appropriate documents to auditors in a timely fashion. The auditors should be reasonable in their preaudit requests for documents because many documents are too bulky or too extensive to photocopy. If possible, auditors should review all important documents or obtain copies of these documents before leaving the audit site. Only as a last resort should auditors accept promises that copies will be sent to them afterward. Consider the case history of the undelivered calibration records. Important emissions testing was being conducted at a remote industrial facility by a contractor. A veteran emissions tester served as the contractor's QA coordinator at the audit site and was the point-of-contact for auditors who were present to observe the testing. When the auditors asked to see the calibration records for the emissions samplers, the tester said that these records were back at the contractor's office and would be sent to the auditors when he returned. Despite subsequent attempts to obtain the records, the auditors were unable to confirm that the records existed. The final report was submitted with a statement that some of the checklist items could not be completed due to the lack of records at the audit site; that is, the absence of calibration records. The report noted that if the instruments were not calibrated as prescribed, then the data obtained could be questionable and not be usable for their intended purpose. 3.2.7 Objective Evidence Compilation During the audit, each auditor should collect objective evidence in the form of notes: copies of notebook pages, logs, instrument and model outputs, and QC charts. The auditor should record general impressions and answers to checklist questions.
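The data-entry verification mentioned in the document review discussion above (comparing the analyst's handwritten bench-sheet values with the corresponding spreadsheet entries) can likewise be recorded systematically as objective evidence. The brief sketch below (Python) is illustrative; the sample identifiers and values are hypothetical, and any mismatch found would be documented and confirmed against the original records.

    # A minimal, hypothetical sketch of a data-entry check: compare values transcribed
    # into a project spreadsheet against the analyst's bench-sheet entries and record
    # any mismatches for follow-up as objective evidence.
    bench_sheet = {"S-1001": 4.2, "S-1002": 3.9, "S-1003": 7.1}   # handwritten results
    spreadsheet = {"S-1001": 4.2, "S-1002": 3.7, "S-1003": 7.1}   # electronic data system

    mismatches = [
        (sample_id, bench_value, spreadsheet.get(sample_id))
        for sample_id, bench_value in bench_sheet.items()
        if spreadsheet.get(sample_id) != bench_value
    ]
    for sample_id, bench_value, entered in mismatches:
        print(f"{sample_id}: bench sheet {bench_value} vs spreadsheet {entered}")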
It is important for the audit team to retire from the presence of the auditee personnel to consult with each other, to review the accumulated material, and to agree on the initial findings that the objective evidence supports. Some tentative findings may be developed early and others in later stages of the audit. Verifying the tentative findings as the audit progresses is an ongoing activity. In some cases, it may be necessary to make copies of certain records or to obtain copies of some documents. The audit team may need to make arrangements for copying if reproduction facilities are not readily available. When necessary, measures to protect confidentiality must be taken.

3.2.8 Closing Meeting

The closing meeting is an important and visible component of an audit. Audit findings and the objective evidence for them are the subject of the closing meeting. The auditee expects a formal presentation of the audit findings, although it may generally know the findings from informal information exchanges during the audit. The lead auditor should lead the closing meeting and present a summary of the findings, speaking clearly, concisely, and slowly. The lead auditor should state that discussion of findings should wait until after the summary has been presented to ensure that all findings are presented. Individual audit team members may clarify specific findings. Project managers, who have the authority to implement corrective actions, should attend the closing meeting to hear the summary. Project personnel also may benefit from attending the meeting. Characteristics of a closing meeting are given in Figure 10.

Figure 10. Characteristics of a Closing Meeting
Present the audit findings:
- Be objective.
- Describe what was done during the audit.
- Summarize what was found during the audit.
- Characterize the overall audit briefly; base this on preliminary findings.
- Give credit for observed good practice.
- Address the overall effectiveness of the project.
- Present each finding in clear, concise terms.
- Check to make sure that each finding is understood.
Don't speculate:
- Emphasize fact-based findings.
Don't present any big surprises:
- Significant findings should have already been communicated.
Don't make final conclusions/recommendations/suggested corrective actions:
- Further evaluation of results may be needed.
- The final conclusions/recommendations/suggested corrective actions should be presented to the client first.
Provide an estimate of when the draft findings report will be available for review and comment.
Thank everyone for their cooperation and assistance. When appropriate, give out compliments for help, attitude, and so on.

Before the closing meeting, the audit team members should meet among themselves to discuss their individual findings. This discussion allows the team to resolve any uncertainties or inconsistencies regarding individual findings and to determine their relative importance. Time limitations may prevent extensive discussion of minor findings at the closing meeting. The lead auditor should review these findings and develop a summary or overview of the audit and the findings. The audit team should be well prepared to discuss the findings at the closing meeting. The lead auditor should provide the auditee with an opportunity to respond to the findings after the summary of findings has been presented.
The audit team should listen to and document any explanations or contrary objective evidence presented by the project personnel. There may be additional objective evidence that the audit team did not examine during the audit such as the results of PE sample analyses. Such additional objective evidence may be a legitimate subject for Final EPA QA/G-7 47 January 2000 ------- discussion, but it may not be readily available. The process by which the additional objective evidence will be presented to the auditors should be discussed during the closing meeting, for example, the auditee may promise to send information to the lead auditor within a specified number of days. The lead auditor should not finalize findings for that portion of the audit until the additional objective evidence is in hand. The lead auditor should discuss the audit report that will be prepared by the auditors. The report will contain specific findings based on the objective evidence the auditors have examined. The auditee should understand the logistics and schedule for delivering the report to the EPA project officer and the EPA QA Manager. The auditee should understand that it will have the chance to review and comment on a draft version of the report before it is finalized. When the closing meeting ends, both auditors and auditees should have a clear understanding of the initial findings. However, a copy of the completed audit questionnaire and/or checklist should not be left with the auditee. The auditee should be reminded that additional findings may occur after further examination of the objective evidence obtained during or after the audit. When practical, a summary of the preliminary findings should be left with the auditee. When the auditors return from an audit, they should meet with the EPA project officer and the EPA QA Manager to present a summary of the findings of the audit. This meeting usually involves an informal oral presentation during which the highlights of the audit are given and the EPA project officer's and the EPA QA Manager's questions are answered. 3.3 EVALUATION Throughout the audit, the auditors should be comparing what they observe with what the QA Project Plan requires. They should be evaluating the importance of any deficiencies that are found relative to the overall data quality goals for the project. This evaluation should continue after the auditors leave the audit site. This phase of the audit may be the most critical to its success. The information that is collected during the audit or made available later, such as PE results, should be compiled in an orderly fashion. It should be evaluated according to performance criteria, such as DQOs or DQI goals, that are stated in the QA Project Plan or other project planning document. This evaluation provides the basis for the audit findings. 3.3.1 Identification of Finding "Finding" is a value-neutral term. It is a statement based on objective evidence found during the audit relative to a specific criterion or question. For example, consider the following question: "Are daily calibrations of the monitoring instruments performed as prescribed by the approved QA Project Plan?" Depending on the objective evidence obtained during the audit, the finding would indicate whether the calibrations were performed and whether they were suitable and effective. Figure 11 lists the various types of findings. Final EPA QA/G-7 48 January 2000 ------- Figure 11. Types of Findings Findings are audit results that can generally be divided into three categories: 1. 
noteworthy practices or conditions (i.e., strengths), which are positive;
2. observations, which are neutral (neither positive nor negative); and
3. nonconformances, which are negative deviations from standards and documented practices (e.g., QAPPs, SOPs, reference methods).

Nonconformances can be divided into two subcategories:
a. deficiencies, which adversely impact the quality of results, and
b. weaknesses, which do not necessarily (but could) result in unacceptable data.

It is almost impossible to be completely objective when identifying findings. Invariably, findings lead to conclusions, and probably recommendations, in the minds of auditors. Such is the nature of most auditors. This tendency is not entirely undesirable, but auditors should be cautioned not to allow their conclusions to overly influence the evaluation of findings by the client. In some cases, conclusions and recommendations may have been specifically requested by the client. It is then proper to develop conclusions, based on the objective evidence that was compiled, and to prepare recommendations, based on those conclusions, to be delivered to the client rather than the auditee.

3.3.2 Evaluation of Findings

To ensure the relevance of the findings, the "so what" test should be applied. This test helps to determine whether a finding is significant relative to the overall goals of the project, whether it affects the quality of the environmental data collection operation, or whether it affects the quality of the data being produced. The "so what" test is a simple check on the relevance of a finding; a finding may not be significant if it has no implications.

The credibility of the findings will largely rest on how they are received by the auditee. Frivolous or irrelevant findings can easily destroy the credibility of the audit. It is essential that the findings reflect significant issues. Insignificant findings should not be included in the findings report because they obscure the findings that really matter. Management of the auditee should not have to correct trivial nonconformances but should focus on the significant findings that contribute to its understanding of the project's performance. Auditors should remember that a technical audit is a management tool. The findings report should be stated in terms that relate to the managers' interests and that allow them to apply the audit findings to the project.

During the evaluation, the audit team should meet to review and discuss the audit questionnaire, checklist, and/or other documents collected during the audit. The time needed to complete the evaluation will be determined largely by the amount of information to be reviewed and analyzed, but the evaluation should be completed in a timely manner. Not all audit team members need to review the information together, but all members should be together when a consensus on the findings is reached and when draft recommendations and conclusions are developed. If such a meeting of the audit team is not possible, a telephone conference call may suffice.

All nonconformances that the audit team identified should be classified as deficiencies or weaknesses, based on their significance. If several types of audits have been conducted on the project (e.g., TSAs and PEs), the audit team should consider all of them in the evaluation. For instance, a PE might indicate that the accuracy of one measurement does not meet the goals stated in the QA Project Plan.
During the TSA, the auditor may have observed a deficiency that explains the accuracy problem. For quantitative audits, performance is evaluated by comparing the known value for a measurable material to the value determined by the measurement system being evaluated. For example, one auditor may calculate the difference between the verified value and the measured value for a PE sample. Another auditor may determine whether an analyst has correctly sorted and counted the microorganisms in a sediment sample. A third auditor may be an experienced taxonomist, whose identifications of fresh-water mollusk species are accepted as true and who is verifying the identification by a field biologist. Spreadsheets may be used for statistical analysis of the findings of quantitative audits. The results of the statistical analysis should be compared with the performance criteria and DQI goals listed in the QA Project Plan. If there are any questions about the accuracy of the PE samples, confirmatory analysis of the reserved split-samples may be necessary to verify that the samples' composition has not changed. The goals of the project and the intended use of the generated data are critical considerations in evaluating audit results. Obviously, deficiencies in critical measurements are of more concern than weaknesses in supplemental measurements. It is also important to consider how widespread are the nonconformances-are all data affected or only those data collected on Saturday when the substitute chemist worked? Any nonconformances from the approved methods should be evaluated, especially with regard to data comparability. 3.4 DOCUMENTATION The product of an audit is a written report to the client. The objective of the report is to communicate audit results to the proper levels of management. Different organizations use different formats for these reports, but they all should clearly state the type of audit, the auditor, the auditee, what was assessed, the findings, and the conclusions and recommendations (if these were also requested by the client). The auditee should have an opportunity to comment before the report is finalized to confirm the accuracy of the information contained in the report. Typically, two reports are produced: the draft findings report and the final report. Final EPA QA/G-7 50 January 2000 ------- 3.4.1 Draft Findings Report The lead auditor is responsible for producing the draft findings report and should organize the work to get the report written. Preparation of the draft findings report may be delayed while additional objective evidence is obtained from the auditee. The lead auditor should contact the auditee if the additional objective evidence does not arrive on the promised deadline. Usually, the draft findings report should be prepared within approximately 30 days of completion of the audit, although it may need to be prepared more quickly in some cases. The draft findings report should give enough detail about the audit to enable readers to understand the current status of the project and to estimate whether project goals will be met. It usually consists of an introduction describing the date, location, purpose, and scope of the audit; a summary of the findings; a list of the auditors; and a discussion of any findings requiring corrective action. It also may include conclusions, recommendations, and suggested corrective actions if it is appropriate for the auditors to prepare these items. 
In some cases, it may be useful to include appendices that describe how a particular type of audit was performed, a list of the project management and personnel interviewed, a detailed account of the findings, and a copy of the completed questionnaire/checklist. An example draft findings report format is given in Figure 12.

Figure 12. Example Draft Findings Report Format
Chapter 1: Introduction
- Auditee.
- Audit location and date.
- Purpose and objectives of the audit.
- Type of audit(s) performed.
- List of auditors.
Chapter 2: Summary of Findings
- Summary of findings and their basis.
- List of findings requiring corrective action and why.
Chapter 3: Conclusions, Recommendations, and Suggested Corrective Actions (if it is appropriate for the auditors to formulate these items)
- List of conclusions.
- List of recommendations.
- List of suggested corrective actions.
Appendices:
- A: Description of audit(s) performed.
- B: List of personnel interviewed.
- C: Detailed findings and rationale (as needed) or the completed audit questionnaire/checklist.

The written word has a much greater impact than the spoken word. As a consequence, great care and thoughtfulness should go into preparing the draft findings report. The draft findings report should be clearly and concisely written, without unsubstantiated generalizations or ambiguous remarks. Use plain English. Avoid using words that could be misinterpreted, particularly if some elements of the audit are questionable. Corrective actions will not be implemented if the findings are not effectively communicated to the auditee through the draft findings report. Generalities or "loopholes" in the draft findings report generally indicate either that the audit was not properly conducted or that the findings were not based solely on objective evidence. The draft findings report should not gloss over significant deficiencies remaining after the audit has been completed.

The draft findings report may contain recommendations and suggested corrective actions if such items were requested by the client at the beginning of the audit. In cases in which non-EPA support personnel have performed an EPA-related audit, the EPA QA Manager or the EPA project officer must prepare the conclusions, recommendations, and necessary corrective actions based on the findings presented in the draft findings report. See Section 2.3 for a discussion of inherently governmental functions.

The draft findings report should be submitted for review by the EPA QA Manager and the EPA project officer. These individuals are the principal clients of any third-party audit. The scope of the audit identifies who should receive the final report. The EPA QA Manager should send the draft findings report, through the EPA project officer, to the auditee for comments. The auditor should not, at this stage of the audit, send any reports directly to the auditee. An example of a transmittal letter for a draft findings report is given in Figure 13.

The auditee should be given the maximum opportunity to respond to the draft findings report. This response should address the findings and discuss how any corrective actions will be resolved. If the auditee disagrees with the findings, the response can contain a rebuttal. Upon receipt of this response, the lead auditor should determine if the response adequately addresses the findings, if a followup audit is necessary, and when it is appropriate to close out the audit.
3.4.2 Final Report

After the auditee's comments have been addressed, the final report should be prepared. The final report should be similar in format to the draft findings report and should be based on it. Typically, the auditee's response will be integrated into the summary of findings and corrective actions chapters. The lead auditor is responsible for correcting any findings that are demonstrated to be incorrect by objective evidence to the contrary supplied by the auditee. Opinions of the auditee that differ from those of the auditors are not valid reasons to alter the report. The final report should be submitted to the EPA QA Manager, and copies should be sent to the EPA project officer and to the auditee.

UNITED STATES ENVIRONMENTAL PROTECTION AGENCY
National Laboratory for Environmental Research
Research Triangle Park, NC 27799
ECOSYSTEM RESEARCH DIVISION

September 1, 1999

Mr. Jim Boyd
XYZ Corporation
10 Main Street
Anytown, CA

Dear Mr. Boyd:

Please find enclosed the draft findings report for the technical systems audit (TSA) and performance evaluation (PE) performed at your facility on July 21-22, 1999. This report summarizes the findings of the TSA and PE and lists the specific corrective actions needed to resolve the deficiencies identified. Please recall that these deficiencies were identified and discussed with you during the exit briefing on July 22.

Please review this report in detail and identify any errors in fact that may have occurred. Please also include a schedule for addressing the necessary corrective actions, identify how the corrective actions will be addressed, and describe how their effectiveness will be determined. We would appreciate having your response by October 1, 1999.

Please accept our thanks for the cooperation and assistance provided during the assessment. If you have any questions, please let me know. You may contact me at 001-222-3333 or by email at julia.bennett@gov.

Sincerely,
Julia Bennett
EPA Project Officer

Enclosure
cc: Frank Michael, 123 Environmental
Susan Davis, 123 Environmental
Michael O'Brien, Quality Assurance Manager

Figure 13. Example of a Draft Findings Report Transmittal Letter

An audit file should have been created during the preaudit planning phase. At the completion of the audit, it should contain the audit plan, the QA Project Plan and any other project planning documents, all relevant correspondence, the certificates for any PE samples, the records of important conversations and meetings, the audit questionnaires and/or checklists, the draft findings report, the final report, the closeout memorandum, and any other documents collected or produced as part of the audit.

3.5 CORRECTIVE ACTION

After an audit, any necessary corrective action should be timely and effective. In certain cases, it may be necessary to perform corrective action as quickly as possible. Such cases may include adverse impacts on data quality and threats to personnel health and safety. Verbal approval from responsible parties suffices under these conditions.

In this case history, a TSA for an epidemiology study revealed that staff members were working very long days to get the work completed. Given that the review of this 5-week project took place in its opening days, this situation was considered to be a threat to the continuing quality of the data. Because of this situation, the auditors submitted their draft findings report at the closing meeting.
In this way, the issue was brought to the immediate attention of the EPA project officer and associated management. An immediate response supplied additional staff who relieved the workload.

In some situations, additional audits may be needed to verify the effectiveness of the corrective actions. Many organizations use a corrective action form to document any nonconformances that require action and their resolution. These forms generally include the signatures of the individual identifying the need for corrective action, the EPA project officer, the EPA QA Manager, and the individual responsible for implementing the corrective action. The problem requiring corrective action, the proposed corrective action, and the approach for evaluating the corrective action should be described.

In some cases, the audit team may be needed to confirm the successful implementation of corrective actions, even though the authority for implementing corrective action rests with the client and the principal responsibility is that of the auditee. When this occurs, the proposed corrective action should be reviewed for concurrence by the lead auditor. This review helps ensure that the planned actions will be effective in resolving the problem areas and nonconformances reported by the audit team.

The auditee management responsible for the assessed activities also is responsible for ensuring that effective and timely corrective action occurs. Project management should provide a written response to all audit findings. Each finding should be addressed with specific corrective action steps and with a schedule for implementing them. The corrective actions should address the following:
- measures to correct each nonconformance,
- identification of all root causes for significant deficiencies,
- determination of the existence of similar deficiencies,
- corrective actions to preclude recurrence of like or similar deficiencies,
- assignment of corrective action responsibility, and
- completion dates for each corrective action.

Auditee project management should implement corrective action and provide objective evidence of the effectiveness of the correction. Once such objective evidence is received, the audit will be closed unless a reaudit is planned. If a reaudit is needed, project management should cooperate in the new audit. The implementation of corrective actions can be verified in several ways, including: reassessing the deficient areas; reviewing new or revised quality-affecting documents such as manuals, procedures, and training records; confirming the actions during the next scheduled audit; and conducting a surveillance covering the areas of concern.

Verification of the effectiveness of corrective actions is necessary. A solution to a problem may look good on paper, but it may not be able to be implemented readily or effectively. The failure to adequately identify and correct all root causes most likely will result in a recurrence of the deficiency. Therefore, an appropriate amount of followup is necessary to ensure the effectiveness of the corrective action process and to re-establish confidence in the measurement system assessed.

3.6 CLOSEOUT

Closeout of an audit is the last formal action of the audit process. It occurs after all corrective actions have been implemented and confirmed as effective. The audit file for the project should be verified as complete and retained for future use as a quality record. A closeout memorandum should be added to the audit file.
An example of a closeout letter for an audit is given in Figure 14. Final EPA QA/G-7 55 January 2000 ------- UNITED STATES ENVIRONMENTAL PROTECTION AGENCY National Laboratory for Environmental Research Research Triangle Park, NC 27799 ECOSYSTEM RESEARCH DIVISION October 22, 1999 Mr. Jim Boyd XYZ Corporation 10 Main Street Anytown, CA Dear Mr. Boyd: This letter is to confirm the closeout of the technical systems audit and the associated performance evaluation that 123 Environmental Corporation conducted at your facility on July 21-22, 1999. Based on our evaluation of your response to the draft findings report, we have determined that all deficiencies have been resolved through effective corrective action. This is reflected in the final report on these technical audits, which is enclosed. As we had discussed previously, the final report and other records associated with the technical audits will not be disseminated to other individuals. Again, we thank you very much for your cooperation and assistance during the audit. Please contact me at 001-222-3333 or julia.bennett@gov if you have any further questions about the audit. Sincerely, Julia Bennett, Lead Auditor Enclosure cc: Susan Davis, 123 Environmental Frank Michael, 123 Environmental Michael O'Brien, EPA QA Manager Figure 14. Example of a Closeout Letter Final EPA QA/G-7 56 January 2000 ------- CHAPTER 4 TYPES OF TECHNICAL AUDITS 4.1 INTRODUCTION TO AUDIT TYPES Different types of audits may have different objectives, may be implemented at different times in the project cycle, and may be conducted by different personnel. When appropriate, different types of technical audits may be performed concurrently. Although there are many types and names for technical audits, the types and names used predominantly at EPA, and discussed in more detail in this chapter, follow: readiness reviews, which are conducted before projects are implemented to assess whether procedures, personnel, equipment, and facilities are ready for environmental data to be collected; TSAs, which qualitatively assess the degree to which the procedures and processes specified in the QA Project Plan are being implemented; surveillance, which relies on continuous or frequent assessment of the implementation of an activity to determine conformance with established procedures; and PEs, which quantitatively assess the ability of a measurement system to obtain acceptable results. 4.2 READINESS REVIEWS A readiness review is a technical audit that is planned and performed prior to the initiation of a project to verify that project management has brought the facility and/or applicable measurement systems to a state of readiness to begin the project. EPA has not traditionally used readiness reviews, but the principles can be used to ensure that all parts of an EPA program are in place and ready for the start of a project. The U.S. Department of Energy is a regular user of this audit type (U.S. DOE, 1993). The majority of DOE basic research facilities use readiness reviews for activities that present higher risk (Oak Ridge Institute for Science and Education, 1993). The criteria that these facilities use for determining the need for readiness reviews include: perceived risk, criteria imposed by an external client, internal management goals and external requirements, Final EPA QA/G-7 57 January 2000 ------- past performance and baseline checklists, and initial startup when an operation includes a safety analysis. 
Most DOE basic research facilities have a formal readiness review, and a few have both formal and informal readiness reviews. According to DOE policy, program work shall not start or resume in nuclear facilities until the facility has been brought to a state of readiness to safely conduct that program work and the state of readiness to operate has been verified. These same principles are applicable to environmental work in a laboratory or at a hazardous waste site.

Readiness means achieving a configuration that "puts the right people in the right place at the right time working with the right hardware according to the right procedures and management controls." There are two kinds of criteria in this determination, as follows:
- Functional criteria indicate that the system is accomplishing its functions in an acceptable manner and operating at an acceptable risk level in terms of environmental, safety, and health risks as well as business risks.
- Other criteria address whether the applicable codes, standards, and regulations are applied and functioning at all control levels inside and outside of the project's organization.
Although the intent of both types of criteria is often the same, their applicability and relevance for specific systems often are quite different.

The three basic components of any system are personnel, infrastructure (e.g., buildings and grounds, process equipment, and tools), and procedures and management controls. Each component of the system progresses from initial specification to a final state of readiness. However, the three components cannot reach a state of readiness individually. They should be in a collective state of readiness in accordance with overall system operational requirements. The interfaces among these components should go through a similar progression. Can the analytical instrumentation be operated correctly by the technicians who have been selected and trained to operate it? Do the available operating procedures match the analytical instrumentation? Can the procedures be understood and used by the technicians who will be operating the instrumentation?

The following two case histories illustrate situations in which a readiness review would have helped during the startup phase of a project:

A field study in North Carolina lost 2 days at the audit site because an adequate electrical power supply was not available. The project personnel and auditors had to wait for a generator to be shipped to the site.

Staff working on a stationary source emissions sampling project in the rural Pacific Northwest had to buy all of the commercially available ice in a three-county area for use in cooling the project's sampling trains.

4.3 TECHNICAL SYSTEMS AUDITS

Technical systems audits (TSAs) are thorough, systematic, and qualitative audits of the measurement system used in environmental data operations. They are usually performed on the site of the project. TSAs may be known by other names, such as technical system reviews. In a TSA, auditors may examine facilities, equipment, personnel, training, procedures, record-keeping, data validation, data management and analysis, and reporting aspects of a measurement system. A TSA is often conducted shortly after a project starts to allow for early corrective action. For longer projects, TSAs should be performed on a regular schedule throughout the project's life. TSAs may be performed in conjunction with PEs.
Typically, the approved QA Project Plan provides the performance criteria for the TSA. The two main purposes of a TSA are to determine that project personnel and equipment are functioning properly and that all procedures are being implemented as prescribed in the QA Project Plan and other project planning documents. Objective evidence is gathered by interviewing personnel, examining records, and observing project activities.

The planning phase of a TSA includes the following steps:
- deciding to conduct the audit;
- selecting an audit team with experience in both audits and the technical area of the project;
- holding a preaudit meeting;
- reviewing project planning documents;
- contacting the auditee;
- preparing the audit plan, agenda, and questionnaire/checklist; and
- making the necessary logistical arrangements to conduct the TSA.

Figure 15 gives an example of a flowchart for the implementation phase of TSA activities. The steps in this phase of the TSA include the following:
- traveling to the audit site (as necessary),
- holding an opening meeting,
- observing project activities,
- interviewing project personnel,
- reviewing documents, and
- holding a closing meeting.

Figure 15. Example of a Flowchart of Technical Systems Audit Implementation

Checklists are frequently used to guide the TSA. The checklists are prepared based on performance criteria, such as DQOs, that are listed in the project's QA Project Plan and other planning documents. An example TSA checklist is presented in Appendix B. Upon their return from the audit site, the auditors should evaluate the objective evidence that they collected and prepare a report summarizing the TSA's findings. In some situations, however, it may be necessary to make a determination before leaving the site, particularly if a serious deficiency has been found. In such cases, the findings may need to be presented at the site and the corrective action applied immediately.

The technique of tracing samples through a laboratory can be used by auditors to gain an understanding of the overall analysis process during a TSA. The samples should arrive with chain-of-custody papers. The project member who was designated as the sample custodian should take custody of the samples while they are at the audit site. When the samples arrive at the laboratory, they should be logged in and tracked, and data may be generated. Typical sample handling steps may be digestion, extraction, concentration, and splitting or aliquoting of the extract for various analyses. One way to trace an audit trail is to physically follow the trail of the samples through the facility, looking in the records for their fate at each step of the trail.
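As a rough illustration of that record check, the following minimal sketch assumes the auditor has transcribed the laboratory's custody and log entries into a simple list of sample ID and handling step pairs; it then flags any sample whose recorded trail is missing an expected step. The sample IDs, step names, and records shown are hypothetical examples, not values taken from this guidance.

```python
# Minimal sketch of a records-based audit trail check: confirm that each sample
# appears in the tracking records at every expected handling step.
# Sample IDs, step names, and records below are hypothetical examples.

expected_steps = ["receipt", "log-in", "extraction", "analysis", "data package"]

# (sample_id, step) pairs as they might be transcribed from custody and log records
records = [
    ("S-001", "receipt"), ("S-001", "log-in"), ("S-001", "extraction"),
    ("S-001", "analysis"), ("S-001", "data package"),
    ("S-002", "receipt"), ("S-002", "log-in"), ("S-002", "analysis"),
]

samples = {sid for sid, _ in records}
for sid in sorted(samples):
    steps_found = {step for s, step in records if s == sid}
    missing = [step for step in expected_steps if step not in steps_found]
    if missing:
        print(f"{sid}: trail incomplete, no record of {', '.join(missing)}")
    else:
        print(f"{sid}: trail complete")
```

A gap flagged this way is not itself a finding; it simply points the auditor to the records to examine, or the personnel to interview, next.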
Another effective way to trace the trail is to bring all the documentation associated with a sample into the same room at the same time. This technique is particularly effective if the audit requires checking data entry, spreadsheet output, transfer to summary sheets, and data calculations and statistics.

Tracking the actual documents used to record each activity and listing them can be a useful audit approach. During a TSA, an auditor might list those documents that are described by laboratory personnel in each technical area. Subsequently, the auditor might read the list back to confirm that these documents are the only ones being used in this area. This approach allows the personnel to think about the documents in a new way and often results in the identification of additional records and information about activities in the technical area.

The following case history was drawn from a recent TSA and involves physically tracing the samples. While the TSA was in progress, samples arrived via overnight courier from the field crews. The auditor was able to observe the receipt of samples by the sample custodian, including verification of the package contents against the chain-of-custody information. After the samples were logged into the laboratory, a central file was opened for the sample set. This new file contained the unique sample identifiers supplied by the field crews and pertinent sample information, and it indicated what analyses were required. This file also served as the central tracking system for the laboratory. By examining previously run project samples, the auditor could follow sample progress through the laboratory to specific instrumental analysis records and finally to the final data packages.

A different type of TSA was performed on a study headquartered in a one-room office. The audit took place at a large conference table and included personnel from the auditee to provide specific documentation and answer related questions. The main thrust of the review was to examine record-keeping procedures and adherence to the QA Project Plan and study protocols. This approach allowed for close examination of numerous files for different aspects of the study. The completeness of the study record was examined with traceability in mind. From this in-depth file review, a number of findings were produced.

4.4 SURVEILLANCE

Surveillance is the observation of ongoing work to document conformance with specified requirements and/or procedures, such as those given in a QA Project Plan or SOP. Surveillance is focused on a particular technical activity, rather than on the entire measurement system. It is typically less formal than other types of audits, but it should also include appropriate preparation, conduct, reporting, and followup phases. As appropriate, surveillance may be employed as part of a TSA.

The objective of surveillance is to provide confidence through real-time observations that an activity has been performed in accordance with approved and specified methods and procedures. It allows for immediate identification of any deficiency and initiation of action to correct the deficiency and its underlying cause. When deficiencies are identified, the client's management should be notified promptly so that corrective action may be implemented. Surveillance may allow for immediate notification of the status and performance of the project to management if authorized by the scope of the audit.
It also can include followup to verify that corrective action was implemented.

Surveillance typically begins with an initial communication a few days before the site visit. Audit activities should include an opening meeting in which the auditor explains the purpose and scope of the surveillance. The auditor should then observe the project staff performing the technical activity in question for an agreed-upon time period. The auditor should compare the staff's actions with the description of these actions in the QA Project Plan or SOP. At the end of the surveillance, the auditor should present an informal summary of the findings to the project management and personnel. Surveillance findings should be documented in writing and submitted immediately to the auditee and the client. The auditee should provide a written response to the client that discusses the action taken to correct any observed deficiencies and to prevent similar deficiencies in the future.

As with other types of audits, it is critical that the evaluator for surveillance be technically proficient in or knowledgeable about the activity being monitored. It may be effective to team technical experts with management personnel to perform surveillance.

4.5 PERFORMANCE EVALUATIONS

A PE is a quantitative audit in which analytical results are generated by a measurement system for a sample that originates outside of a project. PEs may be known by other names, such as proficiency tests. A PE sample mimics routine field samples in all possible respects, except that its composition is unknown to the analyst and known to the auditor. In the context of the quality system, a PE is used to determine if a measurement system's results are within the data quality goals specified in the QA Project Plan. PE results are often used to estimate the degree of bias in the measurement system; a simple sketch of such a comparison is given below. Although a PE can identify a problem quantitatively, it typically cannot determine the cause of the problem.

The timely acquisition of PE samples is crucial for the success of the PE and should begin early in the audit planning process. The QA Project Plan and other project documentation should be reviewed to determine the appropriate choices for the PE samples' composition. To the extent possible, PE samples should have the same matrices and analytes, in approximately the same concentrations, as expected in the field samples. PE samples analyzed with routine field samples provide a more realistic picture of the biases in the measurement system than those that are analyzed separately with special precautions by the analysts.

Once suitable PE samples have been identified, their availability and cost should be researched, which requires advance planning for the audit. They may have to be ordered from the National Institute of Standards and Technology (NIST), from a commercial vendor, or from other sources. Whenever possible given other constraints, PE samples should be traceable to NIST or to some other independent organization. Constraints may include the need for PE samples with the same matrices and the same analytes of interest as the field samples. If PE samples are purchased from an external source, the certification documentation of their contents should be kept with the audit records. Depending on the source and traceability of the PE sample, it may be necessary for the auditors to independently verify the PE samples' certification. Alternatively, the PE samples may be prepared and verified by the auditors.
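Whether the auditors are confirming a PE sample's certified value or evaluating the result the auditee later reports for that sample, the underlying check is the same: compare the measured value with the certified value and judge the difference against the acceptance criteria in the QA Project Plan. The following minimal sketch illustrates that comparison; the analyte names, concentrations, and the 10 percent acceptance limit are hypothetical examples, not values taken from this guidance.

```python
# Minimal sketch of a quantitative PE evaluation: compare measured results
# against certified values and flag analytes whose percent difference exceeds
# an acceptance limit. All names and numbers below are hypothetical examples.

certified = {"lead": 25.0, "cadmium": 5.0, "zinc": 110.0}   # certified values (ug/L)
measured  = {"lead": 27.1, "cadmium": 4.2, "zinc": 112.0}   # reported results (ug/L)
acceptance_limit = 10.0   # example DQI goal: percent difference within +/- 10%

for analyte, true_value in certified.items():
    result = measured[analyte]
    percent_diff = 100.0 * (result - true_value) / true_value
    status = "acceptable" if abs(percent_diff) <= acceptance_limit else "exceeds limit"
    print(f"{analyte:8s} certified={true_value:6.1f}  measured={result:6.1f}  "
          f"bias={percent_diff:+6.1f}%  ({status})")
```

In practice, the acceptance limits would come from the DQI goals in the approved QA Project Plan, and the same calculation can be carried out in a spreadsheet.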
Even if the sample is verified before its use, saving an aliquot is strongly recommended in case questions arise later about the sample. Some samples may not be stable, and analysis of a reserved aliquot may be the only way to determine this instability. Aliquots of the PE samples should be stored until the results of the PEs are received, in case further verification of the PE samples is necessary. The auditors should make suitable arrangements with the auditee to set up and perform the PE. The logistics for conducting a PE are different from an audit involving auditor travel, such as a TSA. PE samples may be shipped to the audit site when only a PE is being performed, Final EPA QA/G-7 63 January 2000 ------- or they may be transported there by an auditor when a PE is being integrated into a TSA. Arrangements should be made to transport the PE samples to the site of the audit in a manner that preserves their integrity and meets all shipping requirements. The samples may be shipped to the site by commercial carrier or carried by auditors to the audit site in order to track them through the entire PE process. The auditors and the auditee should designate a specific individual to receive PE samples shipped to the audit site by a commercial carrier. Such a designation reduces the likelihood that the samples will be lost at the audit site. Appropriate transfer of custody papers and applicable written instructions for use of the PE samples should accompany the shipment. Upon arrival at the audit site, the auditors should retrieve the samples from the PE recipient. The auditors should note the time of receipt and the conditions of PE sample storage. For instance, compressed gas cylinders may need to be warmed to room temperature or liquid solutions of semivolatile organic compounds may require refrigeration. The schedule for performing the PE or transferring the custody of PE samples should be included in the audit agenda. The auditors should formally transfer custody of PE samples and any ancillary equipment. Additionally, they should transmit written instructions for use of the PE samples and equipment. They should review these instructions with the project personnel at the time of transfer. If the auditors are at the audit site for a PE, they should observe the handling and analysis of the PE samples by the project personnel who regularly perform these activities on routine field samples. The personnel should follow as closely as possible their routine operating procedures and the instructions conveyed with the PE samples. The results from analysis of the PE samples may be available immediately, or the raw data may require reduction and summary before transmittal to the auditors. The auditors should make arrangements to return any ancillary equipment at the end of the audit. The auditee's sample custodian should arrange to dispose of the remaining sample. Single-blind PE samples are ones that the analyst knows are PE samples but does not know their analytes or their concentrations. Double-blind samples are analyzed without the analyst being aware that they are PE samples. Double-blind samples should not be distinguishable from routine field samples in any way. Thus, double-blind PE samples are processed routinely and are not subjected to any special treatment. Whenever possible, double-blind PE samples should be introduced into batches of routine field samples before they are shipped to the laboratory. 
To do this, the PE sample's container, medium, and label, for instance, should be indistinguishable from those of the routine field samples in the batch. In some instances, it may not be possible to introduce a double-blind PE sample before the routine field samples are received at the laboratory, but an independent person at the laboratory (such as the project's QA Manager or an independent auditor who is at the audit site for a TSA) may add the samples to a batch without the analyst's knowledge. This addition is possible for water samples, but it is not possible for real-time analyses of gaseous samples.

Although PEs are typically used to assess bias in measurement systems employing chemical or physical methods, they also may be applied to biological measurement systems. For example, the accuracy of taxonomic identifications of fish species can be assessed by conducting independent identifications and enumerations of a subset of routine samples. The independent check is conducted by an experienced taxonomist, whose identifications are accepted as the true values for the sample. A quantitative value for taxonomic accuracy is calculated from the percentage of assessed identifications that agree with the true identifications. Similarly, the accuracy of analyses of species composition, abundance, and biomass in benthic communities can be assessed by independent sorting, counting, and weighing of organisms from sediment samples collected at the same location.

CHAPTER 5
RELATED TECHNICAL ASSESSMENTS

5.1 INTRODUCTION TO ASSESSMENT TYPES

As discussed earlier, not all technical assessments fit the definition and attributes of "audits" precisely. In the context of EPA use and the application of these assessments to environmental programs, they contain some of the general characteristics of audits but usually are more subjective and lack the specific measurable criteria typically expected of audits. Two types of technical assessments are discussed in this chapter, as follows:
- ADQs, which document the capability of a project's data management system to collect, analyze, interpret, and report data as specified in the QA Project Plan; and
- DQAs, which are conducted on a validated data set to determine if the data are sufficient and adequate for their intended use as defined by the DQO process.

While these assessments may not have the quantitative evaluation criteria expected in audits, they are equally useful for the qualitative assessment of conditions affecting environmental programs. The principles of auditing discussed in Chapter 3 apply to these assessments as well as to audits.

It is appropriate to note again that peer review may be a viable assessment tool. As discussed briefly in Chapter 1, there are circumstances when peer review may be used along with audits and related technical assessments. In fact, there may be occasions when peer review is the only viable technical assessment mechanism available to decision makers. This most often occurs in research programs, particularly when leading-edge scientific issues are under investigation. Again, quality assessment is not the principal purpose of peer review and, for this reason, it is not addressed here. However, its potential use as a technical assessment tool must be acknowledged.

5.2 AUDITS OF DATA QUALITY

The ADQ is an examination of data after they have been collected and verified by project personnel.
It is conducted to determine how well the measurement system performed with respect to the performance goals specified in the QA Project Plan and whether the data were accumulated, transferred, reduced, calculated, summarized, and reported correctly. It documents and evaluates the methods by which decisions were made during treatment of the data. Final EPA QA/G-7 67 January 2000 ------- Primary questions to be answered in an ADQ are as follows: Is there sufficient documentation of all procedures used in the data collection effort to allow for repetition of the effort by a person or team with technical qualifications similar to those of the original data collector? Can the data be replicated by the original data collector? Is there sufficient documentation to verify that the data have been collected and reported according to these procedures? Is enough information provided to allow a potential user to determine the quality and limitations of the data and whether the intended use of the data is appropriate? Are the data of sufficient quality with respect to DQI goals and other performance criteria for their intended use? ADQs entail tracing data through their processing steps and duplicating intermediate calculations. A representative set of the data is traced in detail from raw data and instrument readouts through data transcription or transference through data manipulation (either manually or electronically by commercial or customized software) through data reduction to summary data, data calculations, and final reported data. The focus is on identifying a clear, logical connection between the steps. Particular attention is paid to the use of QC data in evaluating and reporting the data set. For a large project, a statistical approach may be necessary to determine a representative number of data sets to be examined. Often, however, the number of sets is limited by the budget of the assessment. An ADQ may occur at different stages of the project. For example, it may be performed after field analysis but prior to off-site laboratory analyses. A typical ADQ begins by reviewing available data from a project, by determining needed missing data, and by devising a plan for the assessment. The plan usually includes steps involving pursuing all available needed data, collecting them, and conducting an extensive review of the entire collection. The products of the ADQ are a report detailing the results of custody tracing, a study of data transfer and intermediate calculations, a review of QA and QC data, a study of project incidents that resulted in lost data, and a review of study statistics. The ADQ report ends with conclusions about the quality of the data from the project and their fitness for their intended use. 5.3 DATA QUALITY ASSESSMENTS DQA is a scientific and statistical evaluation to determine if validated data obtained from environmental data operations are of the right type, quality, and quantity to support their intended use. DQA, which occurs during the assessment phase of a project, is more comprehensive than an Final EPA QA/G-7 68 January 2000 ------- ADQ, which typically occurs during the implementation phase. The relationship between a DQA and an ADQ is analogous to the relationship between DQOs and DQI goals. DQA is described in Guidance for Data Quality Assessment: Practical Methods for Data Analysis (EPA QA/G-9), (U.S. EPA, 1996). DQA is built on the fundamental premise that data quality is meaningful only when it relates to the intended use of the data. 
The five steps of DQA are: review the DQOs or performance criteria and sampling design, conduct a preliminary data review, select the statistical test, verify the assumptions of the statistical test, and draw conclusions from the data. DQA helps to determine the usability of a data set and provides project management with information on the effectiveness of the measurement systems used. During the planning phase of a project, quantitative and qualitative criteria are defined for determining when, where, and how many samples to collect and the desired level of confidence. DQA provides the assessment needed to determine if the DQOs or performance criteria were achieved. Final EPA QA/G-7 69 January 2000 ------- Final EPA QA/G-7 70 January 2000 ------- CHAPTER 6 GUIDANCE FOR AUDITEES 6.1 PREAUDIT PARTICIPATION The auditee (either intramural or extramural) should be aware that any EPA-funded project can be assessed. In addition to determining conformance to a set of plans or specifications, an audit is an opportunity for the auditee to obtain independent feedback about the suitability and effectiveness of its own quality system. The auditee will most likely first learn of specific plans for an audit from the EPA project officer, who will discuss the type and scope of the audit and the approximate date for the audit. The auditee will be advised that the auditors will be making contact to discuss the details of the audit. After the preaudit planning meeting discussed in Chapter 3, the lead auditor will make informal contact with the auditee's project manager. Their conversation will focus on the audit type, objectives, scope, and schedule, as well as the makeup of the audit team and the project staff who will need to be present at the audit. The project manager should be candid about the status of the project, avoiding any waste of time by not agreeing to a premature audit schedule. The project manager will usually be alerted if there are PE samples to be analyzed so that preparations can be made to receive and analyze the samples. There also will be a discussion about gaining entrance to the audit site, the name of the point-of-contact at the audit site, and any safety requirements. This information should be provided to the lead auditor in a timely manner. If the audit will involve confidential business information, the project manager should notify the lead auditor, who will initiate the confidential business information process. The project manager may wish to provide helpful information to the auditors about travel logistics and the availability of local accommodations. After reviewing the available project documents, the lead auditor may need additional information for preparation of the audit plan. The lead auditor may contact the EPA project officer or the auditee's project manager with a request for site-specific or project-specific information such as the physical layout of the site, workstations, personnel specifics, and the operational status of the project. The project manager should respond in a timely manner to these requests. Note that making this information available to auditors before the audit will reduce the extent that project activities are disrupted during the audit. After the informal conversation, the lead auditor should prepare an audit agenda and notify the auditee of the impending audit by letter (with the agenda attached). 
The auditee's project manager should then arrange for appropriate personnel to be present at the audit's entrance and closing meetings and for the usual project staff to perform their usual project activities during the audit. The project manager should arrange for the site tour and for an appropriate escort or succession of escorts for the duration of the audit. The project manager also should make sure that appropriate equipment is online and functioning on the day of the audit Final EPAQA/G-7 71 January 2000 ------- and that appropriate project measurements will be performed for the auditors to observe. The project manager should give the auditors' names and schedule to the point-of-contact at the audit site, arrange an on-site safety orientation, and provide identification badges, as necessary, so the auditors can gain entrance to the audit site. 6.2 AUDIT READINESS The project manager should inform the project personnel of the impending audit and should arrange for their participation in the audit. Project personnel should be familiar with the sections of the QA Project Plan and other project planning documents that are relevant to the project activities that they perform. Project personnel should understand that the audit will be based on performance criteria in the QA Project Plan. The auditee may prepare for an independent audit by conducting periodic self-audits during the project. By practicing self-audit, project staff will be aware of audit procedures and prepared for independent audits. Regular self-audits foster upkeep of sample and data logs, project notebooks, QC control charts, and routine recordings of daily observations. Awareness of audit procedures encourages personnel to develop project forms such as field data forms or laboratory data forms. Written protocols and operating procedures should be reviewed and updated routinely. Inspection and testing records for incoming materials should be organized and updated regularly. Participation of the auditee in interlaboratory, round-robin QC programs helps prepare the auditee's analysts to accept the need for PE samples and to be successful in their analyses. 6.3 AUDIT PARTICIPATION 6.3.1 Opening Meeting The auditee should provide a meeting room for the opening meeting and should arrange for the project management, the project QA Manager, the key project personnel, and the audit site representative to attend. After the lead auditor finishes making an opening presentation, all project personnel should introduce themselves. The project management should indicate the readiness of project personnel for the audit. They should be prepared to ask any questions that they have about the audit agenda and schedule. They should be ready to respond to the requests of the audit team according to the agenda and schedule. Although it is quite natural for an auditee to feel anxious about the audit, these feelings should not result in defensiveness or confrontational behavior. Project management can set an example for project staff by projecting a positive attitude toward the audit and the auditors. The audit should be approached as something that will benefit the project. The auditors can look at the project with fresh eyes and provide assistance to the project based on experiences from other audits. This approach helps to ensure that the audit will be a success for everyone. Final EPA QA/G-7 72 January 2000 ------- Occasionally, an unexpected event occurs, and there is a sudden change of plans. 
Some of the audit activities on the agenda may require rearrangement, or there may be a substitution of personnel due to illness. The audit team will appreciate timely notification of such occurrences. If the audit team expects the audit to follow the agenda and repeatedly encounters obstacles, the auditee can expect a discussion of these problems in the report.

6.3.2 Audit Activities

Responsible project personnel (e.g., the project's QA Manager) may be expected to accompany the auditors during the audit. Host site personnel are expected to be available. Escort duty can be alternated among project management and staff as appropriate. The escorts should be knowledgeable about the project activities being assessed. In some instances, it may not be appropriate for the immediate supervisor of project staff to serve as the escort because the supervisor's presence may cause the staff to bias their responses to questions. The escort should be prepared to intervene if project staff begin to react negatively during the audit.

For some audits, the auditors will want to observe project personnel performing the tasks being assessed and have an opportunity to interview them. They also may want to observe project activity, documentation, storage facilities, packing and transport of project samples, and handling of calibration standards, QC measurements, and project data. The auditee should try to accommodate the auditors so that they can accomplish their work with as much efficiency and as little disruption of the daily routine as possible.

Project management and staff should cooperate with the auditors during the audit. They should respond appropriately and fully to the auditors' questions. Their responses should remain focused on the topic of the question and should not include tangential material, which may confuse the issue. It is possible that an auditor may misunderstand a response, and the respondent should attempt to correct any apparent errors in the auditor's understanding of the project. An appropriate question to the auditor may help to clarify the auditor's understanding. Remember that project management and staff are more familiar with the project than the auditor, who is attempting to cover a lot of material in a short time. They should be prepared to produce project records and to demonstrate procedures being used in the project.

The escorts should make notes during the audit that parallel the audit checklist. In addition to providing the auditee with documentation of the audit, these notes will be useful for clarifying points during the closing meeting and when implementing corrective actions.

6.3.3 Closing Meeting

The closing meeting is attended by project management and the project QA Manager. Project personnel may benefit from attending the meeting. At this meeting, the lead auditor discusses the findings and the objective evidence with them. If contrary objective evidence exists that the auditors have not seen but should know about, this is the opportunity to present it. If the auditors have misunderstood anything, this is the opportunity to offer correction. If the auditors have made requests during the audit for information that was not immediately available, project management should take careful note and provide the information on a realistic timetable. If the information will not be available, project management should not offer to provide it and should state candidly why it is not available.
If auditors leave the closing meeting expecting material that never arrives, the audit report is delayed and an unfavorable impression of the auditee remains.

6.4 DRAFT FINDINGS REPORT REVIEW

The draft findings report will be submitted to the auditee for review by the EPA project officer. This submission is the opportunity for the auditee to correct erroneous information in the report. A thorough and timely review with a written response will be helpful to the auditors.

6.5 CONFIDENTIALITY AND DISSEMINATION OF AUDIT RESULTS

The EPA project officer and the EPA QA Manager should make a decision regarding the confidentiality and dissemination of audit results during preaudit planning. This decision should be communicated explicitly to the auditors and the auditee prior to the audit. Businesses have a legitimate right to preserve the confidentiality of trade secrets and other business information and to limit its use or disclosure by others so that the business may obtain or retain the business advantages it derives from its rights to the information. Regulations relating to this issue are found in "Confidentiality of Business Information" in 40 CFR 2.201. Other statutory requirements and Federal Acquisition Regulations also could apply to the confidentiality decision. EPA rules under environmental statutes regarding confidential business information may apply to some or all items observed during an audit. For particularly sensitive projects, nondisclosure agreements between EPA and the auditee may be needed and should be considered during preaudit planning.

Final audit reports are generally considered internal reports and, in most cases, do not receive wide distribution. The auditee should be informed before the audit begins about how the reports will be distributed and to whom. The auditee should remember that the primary purpose of the audit is to correct weaknesses or deficiencies in the project, not to affix blame on the auditee. It should also remember that the clients for the audit are the EPA project officer and the EPA QA Manager.

REFERENCES

40 CFR 2.201, Code of Federal Regulations, "Confidentiality of Business Information."

ASQ (American Society for Quality). 1994. Specifications and Guidelines for Quality Systems for Environmental Data Collection and Environmental Technology Programs. ANSI/ASQC E4-1994. Milwaukee, WI.

ISO (International Organization for Standardization). 1994a. Guidelines for Auditing Quality Systems - Auditing. ISO Standard 10011-1-1994. Geneva, Switzerland. Available from ASQ as ANSI/ISO/ASQC Q10011-1-1994.

ISO (International Organization for Standardization). 1994b. Guidelines for Auditing Quality Systems - Management of Audit Programs. ISO Standard 10011-3-1994. Geneva, Switzerland. Available from ASQ as ANSI/ISO/ASQC Q10011-3-1994.

ISO (International Organization for Standardization). 1994c. Guidelines for Auditing Quality Systems - Qualification Criteria for Quality Systems Auditors. ISO Standard 10011-2-1994. Geneva, Switzerland. Available from ASQ as ANSI/ISO/ASQC Q10011-2-1994.

ISO (International Organization for Standardization). 1994d. Quality Management and Quality Assurance - Vocabulary. ISO Standard 8402:1994. Geneva, Switzerland. Available from American National Standards Institute, New York, NY.

Oak Ridge Institute for Science and Education. 1993. Quality Management Graded Approach Working Paper. Oak Ridge, TN.

Russell, J.P., editing director. 1997. The Quality Audit Handbook. Milwaukee, WI: ASQ Quality Press.
U.S. DOE (Department of Energy). 1993. Planning and Conduct of Operational Readiness Reviews (ORR). DOE-STD-3006-95. Washington, DC.

U.S. EPA (Environmental Protection Agency). 1994. Guidance for the Data Quality Objectives Process (EPA QA/G-4). EPA/600/R-96/055. Washington, DC.

U.S. EPA (Environmental Protection Agency). 1996. Guidance for Data Quality Assessment: Practical Methods for Data Analysis (EPA QA/G-9). EPA/600/R-96/084. QA97 Version. Washington, DC.

U.S. EPA (Environmental Protection Agency). 1997. EPA Guidance for Quality Assurance Project Plans for Environmental Data Operations (EPA QA/G-5). EPA/600/R-98/018. Washington, DC.

U.S. EPA (Environmental Protection Agency). 1998a. EPA Quality Manual for Environmental Programs. Order 5360. Washington, DC.

U.S. EPA (Environmental Protection Agency). 1998b. Peer Review Handbook. EPA 100-B-98-001. Washington, DC.

U.S. EPA (Environmental Protection Agency). 1998c. Policy and Program Requirements for the Mandatory Agency-wide Quality System. Order 5360.1 CHG 1. Washington, DC.

U.S. GAO (General Accounting Office). 1994. Government Auditing Standards. Washington, DC.

Supplemental Information Sources

Arter, D. R. 1994. Quality Audits for Improved Performance. 2nd ed. Milwaukee, WI: ASQ Quality Press.

Dzus, G., and G. Sykes, Sr. 1993. How to survive ISO 9000 surveillance. Quality Progress 26 (10): 109-112.

Goldblum, D. K., J. Fernando, D. E. Lundquist, B. Smith-Townsend, and J. D. Erving. 1997. Air Force hits QA and QC data targets for labs. Today's Chemist at Work 6 (10): 16-19.

Kostoff, R. N. 1997. Peer review: The appropriate GPRA metric for research. Science 277 (5326): 651-652.

Kostoff, R. N. 1997. The principles and practices of peer review. Science and Engineering Ethics 3 (1): 19-34.

Kostoff, R. N. 1997. Research program peer review: principles, practices, and protocols. Available online at http://www.dtic.mil/dtic/kostoff/index.html.

Luedtke, N. A. 1993. Integrity and auditing. Environmental Testing & Analysis 2 (4): 56-60.

Mills, Charles A. 1989. The Quality Audit: A Management Evaluation Tool. Milwaukee, WI: ASQ Quality Press.

Pomales, T. J. 1997. PM10 mass analysis system audit findings: A prelude to PM2.5 (fine) mass analysis. Presented at the Air & Waste Management Association's 90th Annual Meeting & Exhibition, Toronto, Ontario, Canada.

Sayle, A. J. 1985. Management Audits: The Assessment of Quality Management Systems. Milwaukee, WI: ASQ Quality Press.

U.S. DOE (Department of Energy). 1996. Implementation Guide for Use with Independent and Management Assessment Requirements of 10 CFR Part 831.120 and DOE 5700.6C Quality Assurance. DOE G 414.1-1. Washington, DC.

U.S. DOE (Department of Energy). 1996. Guide to Good Practices for Operational Readiness Reviews (ORR) Team Leader's Guide. DOE-HDBK-3012-96. Washington, DC.

U.S. EPA (Environmental Protection Agency). 1997. Laboratory Data Quality at Federal Superfund Sites. E1SKB6-09-0041-7100132. Washington, DC.

U.S. GAO (General Accounting Office). 1993. An Audit Quality Control System: Essential Elements. GAO/OP-4.1.6. Gaithersburg, MD.

Worthington, J. C. 1998. Continuous improvement in quality audit systems. Environmental Testing and Analysis 7 (1): 23-26.

APPENDIX A

GLOSSARY

assessment - the evaluation process used to measure the performance or effectiveness of a system and its elements.
As used here, assessment is an all-inclusive term used to denote any of the following: audit, performance evaluation, management review, peer review, inspection, or surveillance. audit - a systematic and independent examination to determine whether quality activities and related results comply with planned arrangements and whether these arrangements are implemented effectively and are suitable to achieve objectives. auditee - the organization being assessed. auditor - a person qualified to perform audits. audit of data quality (ADQ) - an examination of data after they have been collected to determine how well the measurement system performed with respect to the data quality goals specified in the quality assurance project plan. ADQs entail tracing data through processing steps and duplicating intermediate calculations and focus on identifying a clear, logical connection between the steps. blind sample - a subsample submitted for analysis with a composition and identity known to the submitter but unknown to the analyst. Blind samples are used to test the analyst's or laboratory's proficiency in the execution of the measurement process. Samples may be either single blind (the analyst knows the sample is a PE sample but does not know what analytes at what concentrations it contains) or double-blind (the analyst does not know the sample is a PE sample). client - any individual or organization for whom items or services are furnished or work is performed in response to defined requirements and expectations. Compare with user below. confidential business information (CBI) - any information, in any form, received by the U.S. Environmental Protection Agency from a person, firm, partnership, corporation, association, or local, State, or Federal agency that relates to trade secrets or commercial or financial information and that has been claimed as confidential by the person submitting it under the procedures in Code of Federal Regulations. contractor - any organization or individual that contracts to furnish services or items or perform work; a supplier in a contractual situation. Final EPA QA/G-7 A-l January 2000 ------- corrective action - an action taken to eliminate the causes of an existing nonconformance, deficiency, or other undesirable situation in order to prevent recurrence. data quality assessment (DQA) - a scientific and statistical evaluation of validated data to determine if the data are of the right type, quality, and quantity to support their intended use. data quality indicators (DQIs) - quantitative statistics and qualitative descriptors used to interpret the degree of acceptability or utility of data to the user. The principal DQIs are bias, precision, accuracy, comparability, completeness, and representativeness. data quality objectives (DQOs) - qualitative and quantitative statements derived from the DQO Process that clarify study technical and quality objectives, define the appropriate type of data, and specify tolerable levels of potential decision errors that will be used as the basis for establishing the quality and quantity of data needed to support. deficiency - an unauthorized deviation from acceptable procedures or practices, or a defect in an item. environmental data - any measurement or information that describes environmental processes, location, or conditions; ecological or health effects and consequences; or the performance of environmental technology. 
For EPA, environmental data include information collected directly from measurements, produced from models, and compiled from other sources such as databases or the available literature. environmental programs - work or activities involving the environment, including but not limited to characterization of environmental processes and conditions; environmental monitoring; environmental research and development; the design, construction, and operation of environmental technologies; and laboratory operations on environmental samples. environmental technology - an all-inclusive term used to describe pollution control devices and systems, waste treatment processes and storage facilities, and site remediation technologies and their components that may be used to remove pollutants or contaminants from or to prevent them from entering the environment. Examples include wet scrubbers (air), soil washing (soil), granulated activated carbon units (water), and filtration (air, water). Usually, this term applies to hardware-based systems; however, it also applies to methods or techniques used for pollution prevention, pollutant reduction, or containment of contamination to prevent further movement of the contaminants, such as capping, solidification or vitrification, and biological treatment. EPA project officer - the EPA person responsible for the overall technical and administrative aspects of the project. Such persons may be referred to as project manager, project officer, work assignment manager, or similar title for extramural projects. For intramural projects, other titles may include principal investigator and team leader. Final EPA QA/G-7 A-2 January 2000 ------- extramural agreement - a legal agreement between EPA and an organization outside EPA for items or services to be provided. Such agreements include contracts, work assignments, delivery orders, task orders, cooperative agreements, research grants, State and local grants, and EPA- funded interagency agreements. finding - an assessment conclusion that identifies a condition having a significant effect on an item or activity. An assessment finding may be positive or negative, and is normally accompanied by specific examples of the observed condition. good laboratory practices (GLPs) - a quality system concerned with the organizational process and the conditions under which nonclinical health and environmental safety studies are planned, performed, monitored, archived, and reported. graded approach - the process of basing the level of application of managerial controls applied to an item or work product according to the intended use of the results and the degree of confidence needed in the quality of the results. guideline - a suggested practice that is non-mandatory in programs intended to comply with a standard. independent assessment - an assessment performed by a qualified individual, group, or organization that is not a part of the organization directly performing and accountable for the work being assessed. inspection - an examination such as measuring, examining, testing, or gauging one or more characteristics of an entity and comparing the results with specified requirements in order to establish whether conformance is achieved for each characteristic. lead auditor - an individual qualified to organize and direct a technical assessment, to report assessment findings and observations, and to evaluate corrective actions. 
management system - a structured, nontechnical system describing the policies, objectives, principles, organizational authority, responsibilities, accountability, and implementation plan of an organization for conducting work and producing items and services. nonconformance - a deficiency in characteristic, documentation, or procedure that renders the quality of an item or activity unacceptable or indeterminate; nonfulfillment of a specified requirement. objective evidence - any documented statement of fact, other information, or record, either quantitative or qualitative, pertaining to the quality of an item or activity, based on observations, measurements, or tests which can be verified. Final EPA QA/G-7 A-3 January 2000 ------- observation - an assessment conclusion that identifies a condition (either positive or negative) which does not represent a significant impact on an item or activity. An observation may identify a condition which does not yet cause a degradation of quality. organization - a company, corporation, firm, enterprise, or institution, or part thereof, whether incorporated or not, public or private, that has its own functions and administration. In the context of EPA Order 5360.1 CHG1, an EPA organization is an office, region, national center, or laboratory. peer review - a documented critical review of work by qualified individuals (or organizations) who are independent of those who performed the work, but are collectively equivalent in technical expertise. A peer review is conducted to ensure that activities are technically adequate, competently performed, properly documented, and satisfy established technical and quality requirements. The peer review is an in-depth assessment of the assumptions, calculations, extrapolations, alternate interpretations, methodology, acceptance criteria, and conclusions pertaining to specific work and of the documentation that supports them. performance evaluation (PE) - a type of audit in which the quantitative data generated in a measurement system are obtained independently and compared with routinely obtained data to evaluate the proficiency of an analyst or laboratory. performance evaluation (PE) sample - A sample that mimics actual samples in all possible aspects, except that its composition is known to the auditor and unknown to the auditee. PE samples are provided to test whether a measurement system can produce analytical results within specified performance goals. See also blind sample and performance evaluation. process - a set of interrelated resources and activities that transforms inputs into outputs. Examples of processes include analysis, design, data collection, operation, fabrication, and calculation. program - any work involving the environment, including characterization of environmental processes and conditions; environmental monitoring; environmental research and development; design, construction, and operation of environmental technologies; and laboratory operations on environmental samples. project - an organized set of activities within a program. project manager - the individual in the auditee who has responsibility and accountability for planning and implementing the project and who has authority to implement corrective action. Final EPA QA/G-7 A-4 January 2000 ------- project quality assurance manager - the individual in the auditee who has responsibility for planning, documenting, coordinating, and assessing the effectiveness of the quality system for the auditee. 
quality - the totality of features and characteristics of a product or service that bears on its ability to meet the stated or implied needs and expectations of the user. quality assurance (QA) - an integrated system of management activities involving planning, implementation, documentation, assessment, reporting, and quality improvement to ensure that a process, item, or service is of the type and quality needed and expected by the client. quality assurance manager - the individual designated as the principal manager within the organization having management oversight and responsibility for planning, documenting, coordinating, and assessing the effectiveness of the quality system for the organization. quality assurance project plan - a document describing in comprehensive detail the necessary QA and QC and other technical activities that must be implemented to ensure that the results of the work performed will satisfy the stated performance criteria. quality control (QC) - the overall system of technical activities that measures the attributes and performance of a process, item, or service against defined standards to verify that they meet the stated requirements established by the customer; operational techniques and activities that are used to fulfill requirements for quality. quality management - that aspect of the overall management system of an organization that determines and implements the quality policy. Quality management includes strategic planning, allocation of resources, and other systematic activities (e.g., planning, implementation, documentation, and assessment) pertaining to the quality system. quality management plan (QMP) - a document that describes the quality system in terms of the organizational structure, policy and procedures, functional responsibilities of management and staff, lines of authority, and required interfaces for those planning, implementing, documenting, and assessing all activities conducted. quality system - a structured and documented management system describing the policies, objectives, principles, organizational authority, responsibilities, accountability, and implementation plan of an organization for ensuring quality in its work processes, products (items), and services. The quality system provides the framework for planning, implementing, documenting, and assessing work performed by the organization and for carrying out required QA and QC activities. Final EPA QA/G-7 A-5 January 2000 ------- quality system audit- a documented activity performed to verify, by examination and evaluation of objective evidence, that applicable elements of the quality system are suitable and have been developed, documented, and effectively implemented in accordance with specified requirements. readiness review - a systematic, documented review of the readiness of the start-up or continued use of a facility, process, or activity. Readiness reviews are typically conducted before proceeding beyond project milestones and prior to initiation of a major phase of work. sampling and analysis plan (SAP) - a detailed document describing the procedures used to collect, preserve, handle, ship, and analyze samples for detection or assessment monitoring parameters. The plan should detail all chain-of-custody and QA and QC measures that will be implemented to ensure that sample collection, analysis, and data presentation activities meet the prescribed requirements. 
self-assessment - an assessment of work conducted by individuals, groups, or organizations directly responsible for overseeing and/or performing the work.

standard operating procedure (SOP) - a written document that details the method for an operation, analysis, or action with thoroughly prescribed techniques and steps; a procedure that is officially approved as the method for performing certain routine or repetitive tasks.

surveillance - continual or frequent monitoring and verification of the status of an entity and the analysis of records to ensure that specified requirements are being fulfilled.

technical assessment - a systematic and objective examination of a project to determine whether environmental data collection activities and related results comply with the project's QA Project Plan, whether the activities are implemented effectively, and whether they are sufficient and adequate to achieve the QA Project Plan's data quality goals. Technical assessments document the implementation of the QA Project Plan.

technical specialist - an active participant in a technical assessment who has specialized technical knowledge of the project being assessed and basic knowledge of assessment techniques and procedures.

technical systems audit (TSA) - a thorough, systematic, on-site, qualitative audit of facilities, equipment, personnel, training, procedures, recordkeeping, data validation, data management, and reporting aspects of a system.

weakness - a negative assessment finding (i.e., a nonconformance) that does not necessarily result in unacceptable data.

APPENDIX B

EXAMPLE OF A TECHNICAL SYSTEMS AUDIT CHECKLIST FOR A LABORATORY MEASUREMENT SYSTEM

Audited Project:
Auditee:
Audit Location:
Auditors:
Audit Dates:
Brief Project Description:

Each audit question is answered in a Response column (Y, N, or NA) and a Comment column; an illustrative sketch of recording such responses electronically follows the checklist.

A. QUALITY SYSTEM DOCUMENTATION

1. Is there an approved QA Project Plan for the overall project and has it been reviewed by all appropriate personnel?
2. Is a copy of the current approved QA Project Plan maintained at the site? If not, briefly describe how and where quality assurance (QA) and quality control (QC) requirements and procedures are documented at the site.
3. Is the implementation of the project in accordance with the QA Project Plan?
4. Are there deviations from the QA Project Plan? Explain.
5. Do any deviations from the QA Project Plan affect data quality?
6. Are written and approved current standard operating procedures (SOPs) used in the project? If so, list them and note whether they are available at the field site. If not, briefly describe how and where the project procedures are documented.
7. Is the anticipated use of the data known and documented in the QA Project Plan?
8. What are the critical measurements? (List under Comments.)
9. Have performance goals for each critical measurement been documented clearly and explicitly in the QA Project Plan?
10. Do the above performance goals appear to be based on documented performance criteria or on actual QC data compiled for the measured parameter?
11. Are there established procedures for corrective or response actions when performance goals (e.g., out-of-control calibration data) are not met? If yes, briefly describe them.
12. Are corrective action procedures consistent with the QA Project Plan?
13. Have any such corrective actions been taken during the project?
14. Has the performance of each of the critical measurements been assessed and documented during the project?
15. For each critical measurement, does the QA Project Plan specify the frequency of calibration, the acceptance criteria for the calibration, and the process for calibration data reduction and review?
16. Briefly describe how calibration and other QC data are documented.
17. Does the calibration documentation show that calibrations are being performed at the required frequency and in the required manner?
18. Are there standard paper or electronic forms to record QC data and operational data?
19. Are the standard forms dated?
20. Is the person who recorded the data identified on the form?
21. Are paper records written in indelible ink?
22. Are the QC data reviewed by another qualified person such as the QA manager or the project manager? Who is this individual?
23. Is the project team adhering to the planned schedule? If not, explain the new schedule. Verify that all schedule changes have been authorized.

Additional Questions or Comments:

B. ORGANIZATION AND RESPONSIBILITIES

Identify the following personnel and determine whether they have the listed responsibilities.

1. Project Manager: (name)
   - Responsible for overall performance of the project, and
   - Communicates with EPA.
2. Project Quality Assurance Manager (QAM): (name)
   - Reviews instrumentation and QC data, and
   - Performs QC activities.
3. EPA QA Representative: (name)
   - Assists with and will be responsible for review and monitoring of all QA and QC activities.
4. Project Manager at Site: (name)
   - Coordinates with project manager, and
   - Plans and schedules the project.
5. Analytical Instrumentation Operator(s): (name) (name)
   - Operate the instrumentation,
   - Calibrate the instrumentation, and
   - Record operational parameters.
6. Who is authorized to halt the project in the event of a health or safety hazard?
7. Does the project maintain descriptions of the project organization and personnel responsibilities?

Additional Questions or Comments:

C. TRAINING AND SAFETY

1. Do the instrument operators have special training or experience for the operation of the instruments?
2. Do the project files contain current summaries of the training and qualifications of project personnel?
3. Is there special safety equipment required to ensure the health and safety of project personnel?
4. Is each project team member appropriately outfitted with safety gear?
5. Are project personnel adequately trained for their safety during the performance of the project?
6. Is there evidence of conditions that present a clear danger to the health and safety of project personnel? If so, take appropriate steps to stop work or to inform the appropriate responsible parties of the danger.

Additional Questions or Comments:

D. ANALYTICAL INSTRUMENTATION

1. Describe the analytical instrumentation. List the brand, model number, serial number, and range for each instrument. Do the instruments use EPA standard methods?
2. Describe the sampling probe for the instrumentation.
3. Describe the sampling lines for the instrumentation.
4. Does the sample probe have a calibration valve assembly for sampling system bias tests?
5. Is the sampling system maintained according to the prescribed schedule?
6. Describe the sampling system filter. Is the filter changed according to the prescribed schedule?
7. Describe the sample pump.
8. Describe the sample flow rate control system. List the sample flow rate.
9. Describe the sample distribution manifold.
10. How are data recorded (e.g., the data acquisition system)? Briefly describe the system, giving its brand, model, and serial number.
11. Does the data recording system have a provision for documenting changes in operating parameters? If not, are changes in operating parameters documented in some other manner?
12. Is there a hardcopy backup for the data recording system?
13. Can data be recovered from the hardcopy backup?
14. Is there a schedule for preventive maintenance for the instrumentation?
15. Are calibration and maintenance logs kept for the instrumentation?
16. Review the maintenance and operational records for the instrumentation. Based on your findings, do all instruments appear to be in good operating condition?
17. Are the manufacturer's operating manuals readily available to the instrumentation operators?
18. Describe the routine calibration procedure.
19. Does the calibration documentation show that the calibration procedures are being followed?
20. Do the calibration standards have the appropriate levels?
21. Are the calibration standards traceable to standards from the National Institute of Standards and Technology (NIST) or to other accepted standards organizations?
22. What is the instrumentation calibration error according to the calibration documentation?
23. What is the instrumentation linearity error according to the calibration documentation?
24. What are the instrumentation zero and calibration drifts according to the calibration documentation?
25. What is the sampling system bias according to the calibration documentation?
26. Do the instruments have any interferences? How are the data corrected for interferences?
27. Are the calibration standards and delivery system properly maintained?

Additional Questions or Comments:
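For auditees that keep checklist responses electronically (Section A asks whether standard paper or electronic forms are used to record QC and operational data), the following minimal sketch shows one way such records might be structured. It is a hypothetical illustration, not part of this guidance: the field names, the example entries drawn from Sections A and D above, and the percent-difference convention used to illustrate a calibration-error comment (question D.22) are all assumptions made for the example.

```python
# Hypothetical sketch only -- not part of EPA QA/G-7. It shows one way an
# auditee might record checklist responses (Y/N/NA plus a comment) and how a
# simple percent-difference figure could be tabulated for a calibration check.

from dataclasses import dataclass


@dataclass
class ChecklistItem:
    section: str      # e.g., "A. QUALITY SYSTEM DOCUMENTATION"
    number: int       # question number within the section
    question: str
    response: str     # "Y", "N", or "NA", matching the Response column
    comment: str = ""


def percent_calibration_error(measured: float, reference: float) -> float:
    """Percent difference between an instrument reading and the reference
    standard. This formula is an illustrative convention, not a requirement
    stated in the guidance."""
    return abs(measured - reference) / reference * 100.0


# Example records based on questions that appear in the checklist above.
records = [
    ChecklistItem(
        "A. QUALITY SYSTEM DOCUMENTATION", 1,
        "Is there an approved QA Project Plan for the overall project?",
        "Y", "Approved plan on file at the laboratory."),
    ChecklistItem(
        "D. ANALYTICAL INSTRUMENTATION", 22,
        "What is the instrumentation calibration error according to the "
        "calibration documentation?",
        "Y",
        f"{percent_calibration_error(measured=49.1, reference=50.0):.1f}% "
        "at the mid-range standard (hypothetical numbers)."),
]

for item in records:
    print(f"[{item.response}] {item.section} Q{item.number}: {item.comment}")
```

Keeping the Y/N/NA response and the comment together in one record mirrors the Response and Comment columns of the printed checklist and makes it straightforward to summarize open items before the closing meeting.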