ELAB-9901
February 1999

RECOMMENDATIONS FOR THE IMPLEMENTATION OF PERFORMANCE-BASED MEASUREMENT SYSTEMS (PBMS)

A Report Prepared by The Environmental Laboratory Advisory Board, A Federal Advisory Committee Sponsored by the U.S. Environmental Protection Agency

FOREWORD

This report was prepared as part of the activities of the Environmental Laboratory Advisory Board, a Federal Advisory Committee sponsored by the U.S. Environmental Protection Agency. This report has not been reviewed for approval by the U.S. Environmental Protection Agency; hence, the contents do not necessarily represent the views and policies of the U.S. Environmental Protection Agency, nor of other agencies in the Executive Branch of the federal government, nor does mention of trade names or commercial products constitute a recommendation for use.

For further information about this report, or other activities of ELAB, please contact the Designated Federal Officer (DFO) for the Environmental Laboratory Advisory Board:

Ms. Elizabeth Dutrow, DFO
USEPA/ORD
401 M St. SW (8724R)
Washington, D.C. 20460

ACKNOWLEDGMENTS

The Environmental Laboratory Advisory Board (ELAB) wishes to recognize the efforts of those individuals who served as members of the Performance-Based Measurement System (PBMS) Workgroup and developed the Essential Elements described in this report.

PBMS Workgroup Members

Name                      Affiliation
Ms. Lara Autry            USEPA/Office of Air and Radiation
Dr. Richard Burrows       Quanterra, Inc.
Mr. Ray Frederici         Severn-Trent Laboratories
Dr. Zoe Grosser           The Perkin-Elmer Corporation
Ms. Sylvia Labie          Florida Department of Environmental Protection
Mr. Larry LaFleur         NCASI
Mr. Jerry Parr            Catalyst Information Resources
Mr. Bob Runyon            USEPA/Region II
Dr. Barton Simmons        California EPA
Dr. Al Verstuyft          Chevron Research and Technology
Dr.
Llewellyn Williams        USEPA/Office of Research and Development

EXECUTIVE SUMMARY

This report presents the recommendations of the Environmental Laboratory Advisory Board (ELAB) on the implementation of a Performance-Based Measurement System (PBMS). As permitted by federal charter, ELAB provides advice and counsel concerning the systems and standards of accreditation for environmental laboratories to the United States Environmental Protection Agency's (USEPA) Administrator, Deputy Administrator, and Environmental Monitoring Management Council; to the National Environmental Laboratory Accreditation Conference (NELAC) Board of Directors; and to other federal agencies. ELAB was established on July 31, 1995, in accordance with the Federal Advisory Committee Act (FACA), 5 U.S.C. Appendix 2, Section 9(c).

ELAB Recommendations on Implementation of PBMS

1. USEPA should establish a consistent approach for PBMS, addressing the Essential Elements of a Successful PBMS described in this report, across all of its Program Offices.

2. Each USEPA Program Office should prepare a public report on how the Essential Elements will be included in its PBMS Implementation Plan. The USEPA reports should include specific actions and a schedule for incorporation.

3. The Critical Elements (described in sections A-1 through A-6 of this report) should be specifically addressed in guidance, plans, policies, or regulations in any Performance-Based Measurement System. Interim measures underway at USEPA, such as the Office of Water's plan to issue a final rule, should move forward provided each Program Office includes a commitment or plan to expeditiously phase in these Critical Elements.

4. ELAB supports NELAC's commitment to incorporate PBMS, consistent with USEPA's implementation, within the NELAC standards as a foundation for PBMS.
While the current standards may not satisfy all the anticipated needs of PBMS, NELAC should prepare to address future needs within the context of State statutory and regulatory requirements and the finalized USEPA implementation plans for PBMS.

BACKGROUND

On October 6, 1997, the United States Environmental Protection Agency (USEPA) provided public notification (62 FR 52098) of a plan to implement performance-based measurement systems (PBMS) for "environmental monitoring in all of its media programs to the extent feasible." USEPA defined PBMS as "a set of processes wherein the data quality needs, mandates or limitations of a program or project are specified, and serve as criteria for selecting appropriate methods to meet those needs in a cost-effective manner." The notice indicated that the regulated community would be able to select any appropriate analytical test method for use in complying with USEPA's regulations. It further indicated that implementation of PBMS would improve data quality and encourage the advancement of analytical technologies.

In anticipation of USEPA's announcement of a plan to implement PBMS, the Environmental Laboratory Advisory Board (ELAB) expressed potential concerns about implementing PBMS and, on July 21, 1996, established an ad hoc workgroup to study this matter, led by Dr. Kathy Hillig of the BASF Corporation, representing the Chemical Manufacturers Association. The workgroup presented a report to ELAB in July 1997. Based on the workgroup's study, ELAB presented several issues for USEPA to address regarding its implementation of PBMS. The July 1997 report is presented in Attachment 1 to this report.

On July 1, 1998, ELAB decided that USEPA's PBMS efforts, as of that time, had not sufficiently addressed the issues set forth in the earlier report, and voted to form a new ad hoc workgroup to develop a report with specific recommendations to ELAB on the implementation of PBMS.
The new workgroup was charged with assembling a small group of individuals including key stakeholders from organizations such as laboratories, instrument manufacturers, the regulated community, States, and USEPA. Mr. Jerry L. Parr, of the Global Institute of Environmental Scientists and Catalyst Information Resources, was selected to lead the effort. The workgroup of 11 individuals began its assignment in early July 1998. Meetings were convened every other week through January 4, 1999, with additional effort on individual assignments. The workgroup performed the following activities:

• Developed a list of essential elements for a successful PBMS,
• Reviewed USEPA's PBMS Implementation Plans,
• Reviewed USEPA's goals for PBMS, and
• Developed and evaluated a PBMS Case Study.

From these activities, a draft report was prepared with recommendations for ELAB consideration. The draft report was discussed by ELAB on December 10, 1998, and made available for public comment. Based on comments from ELAB and others, the report was revised and presented to ELAB on January 14, 1999, where it was approved and adopted as a final ELAB product after minor editorial revisions. Overall, ELAB estimates that over 500 hours of effort were expended to produce this report.

SUMMARY OF WORKGROUP ACTIVITIES

Essential Elements

ELAB places a great deal of importance on the fundamental principles of a successful PBMS program. These principles are referred to as Essential Elements (see Table 1). While all of the Elements are essential to PBMS, six are vital to proper implementation; these are the Critical Elements described further below. There is no particular importance to the order of the elements within each subgroup. The Essential Elements were developed using a "wide perspective" approach to assist USEPA in recognizing key features while not becoming embroiled in details that could restrict its options.
ELAB is committed to working with USEPA on the details of these Elements as USEPA considers how it will respond to or approach these issues. The report provides examples to clarify the intent and illustrate the main point of several of the Elements. These examples should not be assumed to be the only, or the best, illustration of an Element.

Table 1. Essential Elements for a Successful PBMS Implementation

A. Critical Elements
• Legal Standing
• Cost Effectiveness
• Scientifically Sound and Relevant Validation Process
• Clearly Articulated and Appropriate Performance Criteria
• Regulatory Development
• Documentation

B. Important Elements
• Flexibility
• EPA Optional Approval Process
• Consistency
• Simplicity
• Clarity of Intent
• Careful Implementation
• Widely Available Reference Materials

Note: These Essential Elements are applicable to compliance monitoring independent of PBMS.

Review of EPA Implementation Plans

ELAB reviewed the USEPA Office of Air and Radiation (OAR) and Office of Solid Waste and Emergency Response (OSWER) PBMS Implementation Plans. The Office of Water (OW) plan was not available for review due to regulatory development constraints, but ELAB was fully briefed on its elements. After extensive discussions, ELAB determined there was little value in providing comments on these plans. Rather, as discussed in Recommendation 2, ELAB has requested each USEPA Program Office to consider how the Essential Elements will be addressed in its PBMS Implementation Plan.

Review of EPA Goals for PBMS

USEPA announced six goals (see Attachment 2) for its PBMS program. ELAB reviewed these goals and, while it supports them, believes the goals alone will not result in a successful program. This is because the goals, while useful concepts, focus only on the laboratory portion of the program. Several other goals need to be added; these additional goals relate to the needs of regulated industry and of regulators.
PBMS Case Study

A complex, but realistic, hypothetical Case Study was developed. ELAB's development and evaluation of the Case Study, summarized in Attachment 3, convinced ELAB of the need for a sound PBMS program and led to a better understanding of the Essential Elements. ELAB believes the Case Study will aid USEPA as it considers the full scope of the Essential Elements. Further details on the Case Study are available in the ELAB workgroup's meeting minutes.

ESSENTIAL ELEMENTS FOR SUCCESSFUL IMPLEMENTATION OF PBMS

The following discussion defines each Essential Element and provides additional discussion and examples where appropriate.

A. CRITICAL ELEMENTS

A-1. Legal Standing: Data generated in compliance with the PBMS framework must have the same legal standing as data generated using a promulgated USEPA method.

Equal legal standing is the key issue requiring resolution for the development and use of new measurement methods under PBMS. Laboratories and regulated entities will only use measurement methods that are known to be acceptable to the ultimate customer, USEPA. The Daubert principle is widely recognized as the basis for legal standing of scientific information; its elements include publication in peer-reviewed journals, presentation at conferences, or USEPA review of the validation data. PBMS should allow laboratories to use any method modifications or new methods that meet the following requirements:

1) The methods should use techniques that are generally accepted by the scientific community (examples of techniques are gas chromatography, enzyme immunoassay, etc.). Courts may require a demonstration that techniques are sufficiently established to have gained general acceptance in their field (People v. Kelly (1976) 17 Cal.3d 24).
This requirement should rarely be an issue, since laboratories will generally use techniques that have been published in peer-reviewed scientific journals or have been included in USEPA methods or other recognized approved methods, such as Standard Methods or ASTM methods.

2) The methods should be demonstrated to be applicable for their intended use. This demonstration can be accomplished by documented statements of method performance contained in published methods or by a laboratory demonstration of the method's performance for its intended use.

The outcome of this element is that any regulated entity meeting the PBMS requirement, and whose laboratory results demonstrate compliance, should be judged to be in compliance. As stated in the USEPA Federal Register notice on PBMS, "where PBMS is implemented, the regulated community would be able to select any appropriate analytical test method for use in complying with USEPA's regulations."

A-2. Cost Effectiveness: Requirements for PBMS for method validation, demonstration of capability, and ongoing quality control should be consistent across all USEPA programs and should apply consistently to USEPA-published methods, modifications to USEPA-published methods, and new methods. Such requirements should be cost-effective for small laboratories performing limited analyses, for large complex laboratories working nationwide, and for instrument manufacturers.

The cost of demonstrating compliance under PBMS could be prohibitively high and preclude use of any new methods or method modifications, particularly by very small operations such as Publicly Owned Treatment Works (POTWs). For example, extensive quality control activities could be required if a laboratory has to demonstrate that the method is adequate for every sample that is analyzed.
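The routine quality control checks discussed in this element can be made concrete with a small sketch. The following is illustrative only and not part of the ELAB report: it evaluates the percent recovery of a laboratory control sample (LCS) against an acceptance window; the 70-130% limits are placeholders, not values prescribed by ELAB or USEPA.

```python
# Illustrative sketch: LCS percent recovery evaluated against acceptance
# limits. The 70-130% window is a placeholder, not a regulatory value.

def percent_recovery(measured: float, spiked: float) -> float:
    """Recovery of a spiked QC sample, as a percentage of the true value."""
    return 100.0 * measured / spiked

def lcs_in_control(measured: float, spiked: float,
                   low: float = 70.0, high: float = 130.0) -> bool:
    """True if the LCS recovery falls within the acceptance window."""
    return low <= percent_recovery(measured, spiked) <= high

# Example: 45 ug/L recovered from a 50 ug/L spike -> 90% recovery, in control.
print(percent_recovery(45.0, 50.0))   # 90.0
print(lcs_in_control(45.0, 50.0))     # True
print(lcs_in_control(30.0, 50.0))     # False (60% recovery)
```

The point of the sketch is that a check of this kind is cheap per sample; the cost concern arises only if full method validation, rather than such routine QC, were required for every sample.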
Quality control (QC) activities, such as an initial demonstration of capability in the matrix of interest for the intended purpose, should be required uniformly for approved USEPA methods as well as for other methods selected under PBMS. If every facility had to undertake a full method validation for every sample, monitoring costs could increase. In that situation, PBMS could ultimately stifle innovation, because the cost of changing to a new method (thus mandating another validation study) would serve as a disincentive to implement new methodology.

Where many samples of a similar nature are to be monitored, it would be desirable to have some means of validating a method on a general matrix. The matrices defined in Chapter 5 of the National Environmental Laboratory Accreditation Conference (NELAC) standards represent the types of matrices that should be considered in this type of general method validation. A tiered validation scheme, with the level of validation varying by the proposed or intended scope of applicability, should be considered. An example of such a tiered system could be:

1. nationwide use: multiple laboratories, wide variety of matrices
2. limited use: single laboratory, wide variety of matrices
3. single use: single laboratory, limited matrices

In the context of PBMS, method validation is performed to document that the required data quality (e.g., measurement quality objectives, MQOs, or data quality objectives, DQOs) can be met and that the methodology is suitable for its intended purpose. Internal laboratory QC (e.g., a laboratory control sample, LCS) is used to demonstrate that the laboratory performed a validated method in a state of control, while project QC samples (e.g., matrix spikes) are used to provide an estimate of uncertainty in the measurement. Successful analyses of Proficiency Test samples are also useful to demonstrate that a laboratory is competent to perform a given method.

A-3. Scientifically Sound and Relevant Validation Process: Both the method validation and the PBMS documentation requirements should be based on principles that are widely accepted in the scientific community and on the intended use of the data.

For the results of compliance monitoring to have credibility with the public, USEPA, and the academic community, method validation should follow sound scientific principles. This includes appropriate tests for all normal method performance characteristics (e.g., accuracy, bias, precision, selectivity, sensitivity). Requirements to achieve non-essential quality control criteria, such as an absolute retention time, would provide no meaningful data on the reliability or suitability of the measurements and thus would have no purpose in the validation or documentation process. Incorporation of such unnecessary and inappropriate requirements in USEPA regulations would undermine the credibility of PBMS and thus erode public confidence.

Examples of accepted principles might include the use of reference materials, comparison to other methods, or interlaboratory validation per ASTM D2777 or the Association of Official Analytical Chemists (AOAC) protocol. These are intended as examples, not as minimum requirements. USEPA should establish a consistent approach for validation, and this approach should conform to accepted scientific practices. Validation represents the activities required to show that a method has the capability to generate data of the quality needed, and should be differentiated from the QC activities performed to document the ongoing quality achieved in routine sample analysis. However, validation should be performed using sample types that are as representative as possible of those for which the method will be used. All methods, including those published by USEPA, should be validated according to the approach developed.

A-4. Clearly Articulated and Appropriate Performance Criteria: USEPA should develop and publish PBMS performance criteria appropriate to the anticipated regulatory use. PBMS performance criteria are the sensitivity, selectivity, precision, and accuracy of the data needed to demonstrate compliance with the regulation.

The success of PBMS will depend on relevant performance criteria (DQOs/MQOs). These criteria should be published in the regulation and be based solely on regulatory or other programmatic needs. Alternative approaches, such as method performance criteria, could be used until PBMS is fully developed. Ideally, if the performance criteria are based on USEPA-published method performance data, USEPA should develop and publish such criteria for each analyte based upon a multi-laboratory method validation study that uses challenge samples appropriate to the anticipated regulatory use of the method. Alternative processes used to establish such criteria should be founded on well-established scientific principles. Where the performance criteria are founded on the analysis of a reference material or audit sample, the criteria should be based on the normal distribution of results achieved on such samples. If criteria are based on method performance, EPA should demonstrate that the criteria are achievable in the matrix of concern (see A-5 below).

A-5. Regulatory Development: In support of new regulations, USEPA should employ or develop laboratory methods that have been demonstrated to be capable of achieving the regulatory compliance monitoring requirements.

To assure the quality of the science used in the development of regulations, USEPA should submit all the technical studies used to develop a regulation to peer review as part of the regulatory process, prior to finalizing any such regulation.
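Element A-4's suggestion that criteria be based on the normal distribution of results achieved on a reference or audit sample can be sketched as follows. This is illustrative only: the mean plus or minus three standard deviations convention is an assumption of this sketch, and the recovery values are hypothetical; the report does not prescribe a multiplier.

```python
# Illustrative sketch: deriving acceptance limits from the distribution of
# results on a reference or audit sample. The +/- 3 standard deviation
# convention and the data are assumptions, not ELAB or USEPA requirements.

import statistics

def acceptance_limits(results, k=3.0):
    """Lower/upper limits as mean +/- k sample standard deviations."""
    mean = statistics.mean(results)
    s = statistics.stdev(results)
    return mean - k * s, mean + k * s

# Hypothetical recoveries (%) from repeated analyses of one audit sample:
recoveries = [96.0, 101.0, 99.0, 103.0, 98.0, 100.0, 102.0, 97.0]
low, high = acceptance_limits(recoveries)
print(round(low, 1), round(high, 1))  # 92.2 106.8
```

Criteria set this way describe what laboratories actually achieve on the sample, which is the property A-4 asks for when method performance data are the basis.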
USEPA should demonstrate that any new or revised regulatory measurement requirements are achievable on samples that represent the same level of analytical challenge as the matrix for which the regulation is intended. (Ideally, this would be done with samples of the actual matrix to be monitored, as defined by the regulation.) A peer review process for evaluating measurement requirements in USEPA regulations may assist in this context. Supporting data should address not only method development but also the successful application of the method in the context of its intended regulatory use.

A-6. Documentation: The documentation required under PBMS must be sufficient for independent verification (i.e., auditing) and for reproduction by another laboratory skilled in the art.

The documentation element is the key link to laboratory accreditation under NELAC. PBMS should be independent of NELAC, as the enforceability of PBMS rests with USEPA or the States independent of NELAC; however, when PBMS is implemented, NELAC should develop a system to incorporate it. The documentation element should be appropriate for use either under NELAC or in enforcement by USEPA or other entities.

B. IMPORTANT ELEMENTS

B-1. Flexibility: Regulated entities should have flexibility to modify methods or use new methods, as long as the PBMS requirements are met.

No barriers or restrictions on methods or modifications, other than those imposed by conformance with the other key elements (e.g., scientifically sound, legally defensible), should be imposed. There should be no limitations on isolation, concentration, enrichment, digestion, or analysis and detection, either individually or in any combination. As long as the total method meets the quality requirements, it should be considered acceptable. Laboratories in compliance with PBMS requirements should also be able to exercise this flexibility.
Laboratories performing work under contract should be able to use this flexibility within the constraints of their clients' needs and with their clients' approval.

B-2. EPA Optional Approval Process: The scientific community should have an effective system for optional USEPA approval of new analytical methods. There should be no unnecessary barriers to these approvals.

The scientific community (i.e., instrument companies, laboratories, regulated entities, and others who may develop methods) sees a need to have USEPA fulfill the role of approving new methods or modifications to methods that are important for compliance monitoring. Such a procedure would be used to verify a method modification or new method when the regulated entity or the laboratory (because of contractual arrangements) is not completely confident that a change it has made will be accepted. This USEPA role is important for the global competitiveness of U.S. technology and provides an additional level of confidence. The approval should simply reflect that the method has been validated according to the guidelines in PBMS and that the method has been shown to provide performance as stated. Clear rules for the level of data required, and a process for reviewing the data submission, are needed. Ideally, USEPA should have a clearinghouse to coordinate the review of method approvals. A website should be established to provide an opportunity for the public to comment on methods under review by USEPA. USEPA could also provide bulletins and/or guidance documents on this website geared toward clarifying technical and policy issues associated with PBMS. Methods developed and/or written by consensus organizations (e.g., those from the American Society for Testing and Materials, ASTM) should be recognized automatically by USEPA as approved methods, provided these methods have been demonstrated to meet the specific quality requirements.

B-3. Consistency: Consistency in definitions, objectives, and criteria for all aspects of PBMS among Program Offices, USEPA Regions, and States is essential.

Consistent definitions (such as measurement quality objectives and method validation criteria) and consistent approaches are needed among USEPA Program Offices implementing PBMS. The NELAC standards, especially the Glossary, should be used to help promote consistency.

B-4. Simplicity: The implementation of PBMS should be made as simple as possible without departing from the Essential Elements and the PBMS goals. Guidance developed for PBMS should be written with simplicity and clarity to ensure consistent interpretation and implementation.

B-5. Clarity of Intent: Performance criteria must be represented by unambiguous requirements or objectives that can be easily understood, applied, and demonstrated, and that are readily auditable by the laboratory community (laboratories, data users, laboratory assessors).

While this element relates to enforceability, there is a need for all stakeholders to understand the intent of the various PBMS components as they are interpreted and implemented. This element calls for clear language and parity with regulatory and other programmatic needs.

B-6. Careful Implementation: Implementation of PBMS should consider how existing regulations and/or monitoring requirements will be affected.

A laboratory operates under contractual arrangements with regulated entities that are disposed to insist that approved USEPA methods be used. Hence, even as USEPA moves to fully implement PBMS, regulated entities may be unwilling to accept the perceived risk of using non-USEPA methods. This unwillingness is likely due to their lack of information about the acceptability of results generated under PBMS. Even if a regulated entity is willing to permit a laboratory to use a new or modified method, the laboratory may be unwilling to risk using a process that is not fully developed or accepted by the measurement community.
Training is needed to give the stakeholder community time to implement the program, and experience is needed to instill trust in the process on both sides. Where current regulations specify methods rather than performance criteria, a careful process will need to be implemented to protect the interests of both USEPA and the regulated industry during this transition period.

B-7. Widely Available Reference Materials: Readily affordable reference materials should be widely available to assist in the method validation effort.

The proper level and scope of method validation is a critical issue that must be addressed (see Elements A-2 and A-3). Too little, and PBMS will be subject to challenge and perhaps unnecessary legal battles associated with enforcement actions. Too much, and the system will become too costly to be effective. A sound way to control costs while providing scientifically credible validation is through the use of reference materials. This is a classic approach to method validation. A variety of reference materials, ranging from internally prepared materials, to well-characterized and stable environmental samples, to certified reference materials, should be considered for these purposes. These reference materials should be as similar to real-world samples as is reasonably possible. Further, such reference materials should be representative of samples analyzed in environmental regulatory programs, agencies, and communities.

Ideally, certified reference materials should be used to assist in the method validation process. The term "certified reference material" is defined in ISO Guide 30:1992 as a "reference material, accompanied by a certificate, one or more of whose property values are certified by a procedure which establishes its traceability to an accurate realization of the unit in which the property values are expressed, and for which each certified value is accompanied by an uncertainty at a stated level of confidence."
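Because each certified value carries a stated uncertainty, a laboratory result can be compared to it quantitatively. The sketch below is illustrative and not from the report: it uses the common convention of comparing the difference to a coverage factor times the combined standard uncertainty; the convention, the analyte, and all numbers are assumptions.

```python
# Illustrative sketch: does a laboratory result agree with a certified
# reference material (CRM) value within combined uncertainty? The k=2
# coverage factor and the example values are assumptions.

import math

def agrees_with_crm(measured: float, u_measured: float,
                    certified: float, u_certified: float,
                    k: float = 2.0) -> bool:
    """True if |measured - certified| <= k * combined standard uncertainty."""
    u_combined = math.sqrt(u_measured**2 + u_certified**2)
    return abs(measured - certified) <= k * u_combined

# Hypothetical: a metal in water, CRM certified at 10.0 ug/L (u = 0.2);
# the laboratory reports 10.5 ug/L (u = 0.3).
print(agrees_with_crm(10.5, 0.3, 10.0, 0.2))  # True
```

A check of this form is one way a CRM supports the validation role described in B-7: it gives a pass/fail statement that is traceable to the certificate rather than to an arbitrary target.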
USEPA should develop guidance on the proper use of reference materials (including which types are most valuable and/or preferable) and their application within the PBMS framework. USEPA should also facilitate the development of new reference materials, to significantly increase their availability, through the National Institute of Standards and Technology or through third-party vendors, and perhaps provide guidance on how to prepare and use suitable QC materials in-house when such materials are otherwise unavailable.

The performance of existing USEPA-published methods should also be established using well-characterized reference materials. This performance data would allow a new or modified method to be directly compared to the USEPA method using the same material(s) where that might be useful and, more importantly, would provide a more consistent approach when USEPA methods are used to assist in establishing MQOs. In the absence of good performance-based selection criteria, there is value in being able to relate a new method's performance to some benchmark, such as a peer-reviewed published method (but only if that method has been evaluated with appropriate challenge samples that relate to the study at hand). Finally, the day-to-day performance of the laboratory, with whatever method is selected under PBMS, can be evaluated using the same reference material. Where no better performance specifications can be provided under PBMS, such documented performance of existing USEPA methods may offer a useful target.

BIBLIOGRAPHY

ELAB reviewed USEPA Federal Register notices and proposals, other USEPA documents, and the NELAC standards related to PBMS. In addition to references cited in this report, the documents listed below were consulted.

Keith, L.H., et al., Principles of Environmental Analysis, Anal. Chem., 1983, 55, 2210-2218.
Meeder, J.L., Legal Issues Governing Data and EPA Methods, in Seminar for Soil Sampling for Volatile Organics, Berkeley, CA, May 13, 1998.

Robertson, M., Case Examples of Faulty Laboratory Results Obtained with Commonly Used EPA Methods, J. Env. Reg., Summer 1995.

Sleevi, P., D. Loring, J. Parr, and N. Rothman, Developing a Uniform Approach for Complying With EPA Methods, presented at the 7th Annual Waste Testing and Quality Assurance Symposium, Washington, DC, July 8-12, 1991.

Taylor, J.K., Validation of Analytical Methods, Anal. Chem., 1983, 55, 600A-608A.

USEPA, Availability, Adequacy, and Comparability of Testing Procedures for the Analysis of Pollutants Established Under Section 304(h) of the Federal Water Pollution Control Act, January 1988.

Attachment 1
ELAB Performance Based Measurement System (PBMS) Issues Workgroup Report, July 1997

ELAB approved the recommendations listed below as developed by this workgroup:

1. Senior EPA officials should advocate the highest level of implementation of PBMS.

2. PBMS training programs for State and/or federal assessors or inspectors should be established prior to implementation of PBMS.

3. Before EPA promulgates a regulation, it must demonstrate and document that MQOs are achievable using available measurement technology.

4. EPA must demonstrate that any new or revised regulatory measurement requirements are achievable on samples that represent the same level of analytical challenge as the matrix for which the regulation is intended. (Ideally, this would be samples of the actual matrix to be monitored, as defined by the regulation.)

5. EPA should consider the remaining important unresolved issues listed below.

UNRESOLVED PBMS ISSUES

PB Measurement System vs. PB Method

There is some confusion about what is a PBMS and what is a PBM. One way to differentiate them is to consider PBMS as allowing any method to be used to satisfy the objectives of the analysis.
Under PBMS, each variation to a method would be described or labeled. Under PBM, modifications to existing methods would be allowed, and the method name and number could still be used; variations of a method retain the method name and number and are equivalent to the original method. This would be important in the case of permits where methods are specified.

Sample Matrix

Validation of a method is usually described for a particular matrix. What are equivalent matrices for QC purposes? What characteristics should be considered? This will be a serious issue if regulators and assessors don't agree.

Method Validation

A definition is needed so that both laboratories and assessors know what criteria are needed to validate a method. This is critical, since only a validated method can be considered equivalent to existing methods.

Method Compliance

Will PBMS methods be approved or equivalent to existing or reference methods, and be as legally defensible? Provisions are needed to guarantee that any method meeting a given Program Office's PBMS criteria will have completely equal legal authority. If this issue is not adequately addressed to assure the permittee full acceptance of their data, the regulated community is not likely to undertake the risk of having their data judged unacceptable.

Interlaboratory Comparability

There is concern that using different variations of a method will give different results in different laboratories. How is industry to be assured that data are comparable? Which would be the "correct" result?

Cost

Expectations are for cost savings, but an increase in QA/QC samples may increase cost. Increased validation needs, due to more matrices or higher levels of validation, may increase cost as well.

Laboratory-Client Relationship

The role of the laboratory is changing from merely analyzing samples to being involved with sampling, choosing appropriate methods, and defining data packages. Does the laboratory move away from doing unbiased, objective testing?
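The interlaboratory comparability concern above is often quantified with the relative percent difference (RPD) between two laboratories' results for the same sample. The sketch below is illustrative and not from the report; the 20% acceptance threshold is a placeholder, not an ELAB or USEPA limit.

```python
# Illustrative sketch: relative percent difference (RPD) between two
# laboratories' results. The 20% threshold is a placeholder.

def rpd(a: float, b: float) -> float:
    """Relative percent difference between two results."""
    return 100.0 * abs(a - b) / ((a + b) / 2.0)

def comparable(a: float, b: float, limit: float = 20.0) -> bool:
    return rpd(a, b) <= limit

# Two labs report 48 and 52 ug/L for the same sample:
print(round(rpd(48.0, 52.0), 1))  # 8.0
print(comparable(48.0, 52.0))     # True
```

A statistic of this kind answers the "how comparable are the data" question numerically, though it cannot by itself say which result is "correct."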
Attachment 2

Goals for the EPA Performance Based Measurement System (PBMS)

1. Provide a simple, straightforward way for the regulatory community to respond to specific measurement needs with reliable, cost-effective methods.

2. Emphasize project- or application-specific method performance needs, rather than requiring that specific measurement technologies be used, in order to avoid costly measurement overkill.

3. Encourage the laboratory community to use professional judgment in modifying or developing alternatives to established USEPA methods.

4. Employ a consistent way to express method performance criteria that is independent of the type of method or technology, including articulating measurement needs in qualitative and quantitative terms.

5. Foster new technology development and continuous improvement in measurement methodology by providing qualitative and quantitative targets for identified measurement gaps to method developers and other researchers.

6. Encourage the measurement community to give USEPA feedback on new monitoring approaches, successes as well as failures, in order to expand knowledge of new or modified approaches and to assist others by disseminating this information to the wider monitoring community.

Attachment 3

Hypothetical Case Study: Analysis of a Water Sample

As an exercise to evaluate any gaps in the Essential Elements, the workgroup considered a hypothetical example of how PBMS might work.

Background

A pharmaceutical manufacturing plant also makes batteries. It must comply with the NPDES regulations for a combined wastewater discharge covering both pharmaceutical manufacturing (40 CFR 439) and battery manufacturing (40 CFR 461). The wastewater is also subject to a new air regulation (40 CFR 63, Subpart GGG). Some of the wastewater, which cannot be mixed with the primary discharge, is disposed of as a RCRA hazardous waste and must meet the universal treatment standards (40 CFR 268).
Finally, as the company is considering internal treatment to drinking water standards (to address public concerns), that regulation (40 CFR 141) is of interest as well.

The customer wants to send a water sample to a laboratory for analysis. The primary concern is volatile organics from pharmaceutical processes, along with a few metals and inorganics.

Workgroup Findings

Based on a series of teleconferences and individual efforts, the PBMS Workgroup concluded:

• None of the regulations provided any guidance on performance criteria other than detection level.
• Several different methods would be required under the current system to measure the same analytes.
• The performance data in the methods are outdated and, in some cases, not relevant to the regulation.
• Some analytes (e.g., ethylene oxide) are probably not measurable at the regulated level.
• Many current methodologies (e.g., inductively coupled plasma/mass spectrometry, ICP/MS) could not be used.
• Special methods, established for this application alone, would not be cost-effective for routine laboratories, which must meet many different types of data needs from many customers.

Recommendations

For PBMS to be effective, the laboratory must have defined measurement objectives. These should appear in the regulations, in an approved project plan, or in a related document. Failing this fundamental effort, performance criteria in USEPA published methods can be used with caution, since different methods have different criteria and the selection process could be biased. If no criteria are established, the laboratory could establish such criteria based on reasonable scientific principles (e.g., accuracy of 70-130%) and a QC program that generates data of known quality.

The workgroup also found that statements of method sensitivity (e.g., the method detection limit) within methods were the least useful QC criteria, as all of the regulations contained action levels many times greater than those in the method.
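The workgroup's point about method sensitivity can be made concrete. The conventional method detection limit (MDL) of 40 CFR 136 Appendix B is the Student's t value (99% confidence, n-1 degrees of freedom) multiplied by the standard deviation of at least seven replicate low-level spike results, and the result is typically far below regulatory action levels. The sketch below illustrates the calculation; the replicate values and the 50 ug/L action level are invented for illustration.

```python
import statistics

# Student's t value for 99% confidence and 6 degrees of freedom,
# i.e., the 7-replicate case described in 40 CFR 136 Appendix B.
T_99_6DF = 3.143

def method_detection_limit(replicates):
    """MDL = t * s over replicate low-level spike results.

    This sketch hardcodes the t value for exactly 7 replicates;
    other replicate counts would need the matching t value.
    """
    if len(replicates) != 7:
        raise ValueError("this sketch assumes exactly 7 replicates")
    return T_99_6DF * statistics.stdev(replicates)

# Hypothetical replicate spike results, ug/L
reps = [1.9, 2.1, 2.0, 2.3, 1.8, 2.2, 2.0]
mdl = method_detection_limit(reps)

action_level = 50.0  # hypothetical regulatory action level, ug/L
print(f"MDL = {mdl:.2f} ug/L; action level = {action_level} ug/L")
```

With numbers like these, the action level sits nearly two orders of magnitude above the MDL, which is why a method's stated detection limit says little about its fitness for a given regulatory use.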
USEPA published methods should clearly articulate the performance of the method, in the matrix in which it was validated, at the time of the study. Assertions of performance in other matrices should be avoided. If a laboratory wishes to use current technology (e.g., axial vs. radial ICP), that laboratory is responsible for documenting the method's performance.

The primary differences between the various USEPA methods appear to lie in their QC requirements rather than in the technologies employed. A consistent QC program, such as that in NELAC, would help provide a framework for PBMS.
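In practice, a laboratory documenting the performance of a modified technology (the axial vs. radial ICP example above) would typically run an initial demonstration of capability: analyzing several replicate spiked samples and checking mean percent recovery and precision against the applicable criteria. The sketch below shows the arithmetic; the 70-130% recovery window, the 20% RSD limit, and the replicate results are illustrative assumptions, not requirements stated in this report.

```python
import statistics

def demonstrate_capability(measured, true_value,
                           recovery_limits=(70.0, 130.0), max_rsd=20.0):
    """Check mean % recovery and % RSD of replicate spikes against limits.

    Returns (mean_recovery_pct, rsd_pct, passed).
    """
    recoveries = [m / true_value * 100.0 for m in measured]
    mean_rec = statistics.mean(recoveries)
    mean_meas = statistics.mean(measured)
    rsd = statistics.stdev(measured) / mean_meas * 100.0
    passed = (recovery_limits[0] <= mean_rec <= recovery_limits[1]
              and rsd <= max_rsd)
    return mean_rec, rsd, passed

# Hypothetical results from four replicate spikes at 20 ug/L
mean_rec, rsd, ok = demonstrate_capability([18.5, 19.2, 20.1, 18.9], 20.0)
print(f"mean recovery {mean_rec:.1f}%, RSD {rsd:.1f}%, pass={ok}")
```

A record of this kind, kept for each matrix and technology variation, is the sort of documentation a consistent QC framework would make routine.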