EPA600/B-22/001 | January 2022 | www.epa.gov/research Guidelines on Validation of Non-Regulatory Chemical and Radiochemical Methods ------- EPA/600/B-22/001 Guidelines on Validation of Non-Regulatory Chemical and Radiochemical Methods Prepared by: EPA Environmental Methods Forum (EMF) Method Validation Workgroup January 2022 Workgroup Members: John Griggs, Chair Office of Air and Radiation William Adams Office of Water Stephen Blaze Office of Land and Emergency Management Margie St. Germain Region 7 Jennifer Gundersen Office of Research and Development Adrian Hanley Office of Water Keith McCroan Office of Air and Radiation Anand Mudambi Office of Research and Development Barry Pepich Region 10 Yaorong Qian Office of Chemical Safety and Pollution Prevention Robin Segall Office of Air and Radiation Kent Thomas Office of Research and Development Contributors: Charles Cook Region 7 Troy Strock Office of Land and Emergency Management ------- Guidelines on Validation for Non-Regulatory Chemical and Radiochemical Methods January 2022 Preface The goal of this document is to provide a consistent general approach for the validation and communication of newly developed, adopted (e.g., from another agency, voluntary consensus standard development body such as ASTM International, not previously validated, etc.), or modified chemical and radiochemical methods for non-regulatory purposes. Specifically, this document provides collected information on critical areas of method performance assessment for validation studies. This document also introduces the following new concepts: • Lifecycle of a method, which identifies the typical activities a method goes through during its development and use. • Validation Design, which is a short descriptor indicating the number of laboratories and matrices in a validation study. • Method Validation Summary, which is designed to provide consistency in delivery of summary method validation results in a concise, easy-to-prepare and share format. Use of this guidance will assist the Environmental Protection Agency (EPA) in both validating methods for non-regulatory purposes and communicating the results in a consistent manner, allowing them to be used or further developed for other purposes. It will also serve to assist external parties that develop methods to communicate their method validation results in a standard format to make comparisons between similar validation studies easier. This document was prepared by the Environmental Methods Forum (EMF) Method Validation Workgroup. The EMF is a cross-Agency forum chartered under the EPA's Laboratory Enterprise Council (LEC). For more information on the EMF and LEC, please go to: https://www.epa.gov/labs/national-program-manager-regional-laboratories-activities EPA extends its appreciation to Stephanie Buehler and Ryan James from Battelle for their diligent work and support. 11 ------- Guidelines on Validation for Non-Regulatory Chemical and Radiochemical Methods January 2022 Disclaimer This document is intended to provide internal guidance to U.S. Environmental Protection Agency (EPA) personnel engaged in method validation activities, and for the understanding of those who use EPA methods. This document can also be used by external parties for informational purposes (e.g., private and state laboratories, consensus standard bodies, etc.) engaged in method validation efforts. 
This document is not in any way binding and EPA retains the discretion, however, to adopt, on a case-by-case basis, approaches that differ from this guidance. The guidance set forth in this document does not create any rights, substantive or procedural, enforceable by law for a party in litigation with EPA or the United States. The intent of the document is not to supersede established practices. It does not replace existing validation practices used by the EPA national program offices for published EPA methods. Rather, the intent is to collect information from various documents and assemble them in one place. The use of mandatory language such as "must" and "require" in this guidance manual reflects sound scientific practice and does not create any legal rights or requirements. The use of non-mandatory language such as "may," "can" or "should" in this guidance does not connote a requirement but does indicate EPA's strong preference for validation and peer review of methods prior to publication for general use. EPA is publicly releasing this document to increase transparency in agency activities and this document may provide useful information for external parties engaged in method validation. References within this document to any specific commercial product, process or service by trade name, trademark, manufacturer or otherwise does not necessarily imply its endorsement or recommendation by EPA. Neither EPA nor any of its employees make any warranty, expressed or implied, nor assume any legal liability of responsibility for any third party's use, or the results of such use, of any information, apparatus, product, or process disclosed in this manual, nor represent that its use by such third party would not infringe on privately owned rights. ------- Guidelines on Validation for Non-Regulatory Chemical and Radiochemical Methods January 2022 Executive Summary Method validation is an important aspect of establishing chemical and radiochemical laboratory methods. EPA methods used for regulatory purposes rely on program-specific assessment criteria and documentation to guide the conduct of method validation studies. However, EPA also has wide-ranging needs for developing/modifying and validating laboratory methods for non-regulatory purposes to address measurement gaps for both current and emerging contaminants of concern. The guidance in this document is specific to non-regulatory methods. It provides general explanations and concepts collected from a variety of references from standard setting organizations on critical areas of method validation performance assessment to provide guidance on a consistent, general approach. While step-by-step guidance or a requirement for implementing a method validation is not included here, basic method validation principles and possible areas of assessment, as drawn from Agency and non-Agency programs and international programs and guidance bodies, are provided. These principles and areas of assessment are discussed according to the method lifecycle, a novel concept that depicts the steps and processes involved with a method, from the method's beginning by determining its need, to its retirement. A method is developed, depending on needs and intended uses of data, and then validated, taking into account implementation considerations such as holding times and cost. Method validation can be conducted by a single laboratory or multiple laboratories (interlaboratory) on a specific set of analytes in a defined matrix or matrices. 
Matrix variations should also be part of a well-planned validation study. The number of matrices tested and laboratories participating in a method validation study will vary and are not dictated or defined in this document. In order to appropriately validate a method, method performance characteristics are used. Performance characteristics are a set of parameters that can directly and quantitatively assess the performance of a method, demonstrating if it is fit for its intended purpose. The following typical method performance characteristics are discussed in this document: • Bias/Trueness • Detection and Quantification Capability • Instrument Calibration/Verification • Measurement Uncertainty • Precision • Range • Ruggedness • Selectivity iv ------- Guidelines on Validation for Non-Regulatory Chemical and Radiochemical Methods January 2022 Each of these performance characteristics are defined along with a discussion of their use and implementation, as well as a list of relevant resources for more details and information. Once completed, consistent and concise communication of method validation studies is important to ensure an accurate and thorough understanding of the method's performance and application. To this end, this document proposes the use of two new communication tools for reporting method validation results: the Validation Design and the Method Validation Summary to convey both the level of validation performed and pertinent information regarding the validation: • The Validation Design describes the validation in a succinct descriptor presented as [aL, bM], where aL is the number (a) of laboratories (L) that participated in the method validation, and bM is the number (b) of different matrices (M) that were used in the method validation. It provides a standardized, easily reported, and easily understood format to convey the level of validation performed based on the number of laboratories that participated and the number of different matrices evaluated. • The Method Validation Summary is a stand-alone table that provides a brief synopsis of the method validation process and results. The Method Validation Summary is intended to be placed at the front of a method validation report to provide the reader with easy access to pertinent information concerning the method validation in a format that is consistent across all reports. In this way, the Method Validation Summary will allow for easier sharing of method validation results across the Agency and provide consistency across all documents and offices. 
-------

Table of Contents

Preface ii
Disclaimer iii
Executive Summary iv
Table of Contents vi
List of Acronyms viii
1 Introduction 1
1.1 Purpose 2
1.2 Intended Audience 2
1.3 Scope of Guidance 2
2 Activities and Processes Preceding Method Validation 4
3 Interlaboratory and Single Laboratory Method Validation Studies 5
3.1 Interlaboratory and Single Laboratory Considerations 5
3.2 Matrix Definition and Variability 6
4 Method Performance Characteristics 8
4.1 Bias/Trueness 8
4.2 Detection Capability and Quantification Capability 11
4.3 Instrument Calibration 17
4.4 Measurement Uncertainty 19
4.5 Precision 21
4.6 Range 23
4.7 Ruggedness 24
4.8 Selectivity in the Presence of Interferences 27
5 Statistical Evaluation and References 29
6 Method Validation Report 30
7 Consistently and Concisely Communicating Method Validation Studies 31
7.1 Validation Design 31
7.2 Method Validation Summary 32
8 References 34

Tables
Table 1. Cover template, with structure guidelines, for the Method Validation Summary 33

Figures
Figure 1. Diagram of the typical lifecycle of a method 3

Appendices
Appendix A: Method Validation Matrix Considerations for Individual EPA Offices
Appendix B: Analyte Detection and Quantitation
Appendix C: Detection and Quantitation Limit Definitions
Appendix D: Example Method Validation Summary and Associated Full Method Validation Report
Appendix E: EPA Offices Method Validation References
Appendix F: Non-EPA Method Validation References

-------

List of Acronyms

ASME American Society of Mechanical Engineers
CDC Centers for Disease Control and Prevention
CFR Code of Federal Regulations
CITAC Co-operation on International Traceability in Analytical Chemistry
COA certificate of analysis
CRM certified reference material
CWA Clean Water Act
EMF Environmental Methods Forum
EPA Environmental Protection Agency
EU European Union
FEM Forum on Environmental Measurements
GC/HSD gas chromatography/halogen-specific detector
GC/MS gas chromatography/mass spectrometry
GUM Guide to the Expression of Uncertainty in Measurement
IDL instrument detection limit
IEC International Electrotechnical Commission
ILS interlaboratory study
ISO International Organization for Standardization
IUPAC International Union of Pure and Applied Chemistry
JCGM Joint Committee for Guides in Metrology
LCMRL lowest concentration minimum reporting level
LEC Laboratory Enterprise Council
LLOQ Lower Limit of Quantitation
LOD limit of detection
LOQ limit of quantification
MARLAP Multi-Agency Radiological Laboratory Analytical Protocols
MDC minimum detectable concentration
MDL method detection limit
ML minimum level
MRL minimum reporting level
NIST National Institute of Standards and Technology
NPDES National Pollutant Discharge Elimination System
PQL practical quantitation limit
QA quality assurance
ReMAP Reference Method Accuracy and Precision
RM reference material
SI International System of Units
SOP standard operating procedure
UCMR Unregulated Contaminant Monitoring Rule
UTL upper tolerance limit

-------

1 Introduction

Validation is defined as confirmation
by examination and provision of objective evidence that the requirements for a specific intended use or application have been fulfilled (1, 2). Method validation applies the concept of validation to laboratory chemical and/or radiochemical methods. Thus, per the International Organization for Standardization (ISO) definition, method validation is the confirmation (verification and demonstration are considered as alternative terms here), through the provision of objective evidence, that the requirements for a specific intended use or application of a method have been fulfilled (1).

Different organizations have various definitions of method validation. One such definition provides that method validation is basically the process of defining an analytical requirement and confirming that the method under consideration has capabilities consistent with what the application requires (3). Method validation is further defined as a process of demonstrating that the method meets the required performance capabilities (4). That is, it makes use of a set of tests that verify any assumptions on which the analytical method is based and establishes and documents the performance characteristics of a method, thereby demonstrating whether the method is fit for a particular analytical purpose (5). United States Environmental Protection Agency (EPA or 'the Agency') and other documents have previously described method validation as the process of demonstrating that an analytical method is suitable for its use; it involves conducting a variety of studies to evaluate method performance under defined conditions (3, 6).

EPA methods used for regulatory purposes cover a wide range of matrices, methodological approaches, and objectives. Program-specific documentation guides the conduct of method validation studies and the development of assessment criteria for such studies. However, EPA also has wide-ranging needs for laboratory methods (e.g., radiochemical and chemical) for non-regulatory purposes. This includes a significant need for internal agency method development and method validation from EPA Offices, Regions, and Programs, which are most aware of the Agency's evolving needs to define new methods and modify existing ones to equip the EPA to best measure contaminants of concern. Efforts by parties outside of EPA (e.g., private and state laboratories, voluntary consensus standard bodies, etc.) may need to be leveraged by the Agency to help fill gaps in method development and method validation to effectively and efficiently address both current and emerging contaminants. Consistent processes and approaches for validating these new non-regulatory laboratory methods are critical to ensuring that efforts by external parties are in harmony with internal agency approaches and criteria, and that methods developed for non-regulatory purposes can potentially be adapted for regulatory purposes as needed.

This document presents basic method validation principles and areas of assessment to consider or address when validating laboratory methods intended for non-regulatory purposes. This information is based on approaches and guidelines set forth in documents gathered across Agency programs, international programs and guidance bodies, and non-Agency programs, such as state laboratories and non-EPA federal agencies.
1 ------- Guidelines on Validation for Non-Regulatory Chemical and Radiochemical Methods January 2022 1.1 Purpose The purpose of this document is to provide a consistent general approach for the validation of newly developed, adopted (e.g., from another agency, voluntary consensus standard development body such as ASTM International, not previously validated, etc.), or modified1 chemical and radiochemical methods for non-regulatory uses. Specifically, this document provides collected information on critical areas of method performance assessment for validation studies. This document introduces the following new concepts: • Lifecycle of a method, which identifies the typical activities a method goes through during its development and use. • Validation Design, which is a short descriptor indicating the number of laboratories and matrices in a validation study. • Method Validation Summary, which is designed to provide consistency in delivery of summary method validation results in a concise, easy-to-prepare and share format. This document is not meant to provide prescriptive or step-by-step guidance on conducting method validation studies. Instead, the intent of this document is to provide an overview of the general principles and important areas of consideration for method validation and to provide lists and, in some cases, links to more detailed resources (e.g., guides and standards, some of which may need to be purchased, and Agency documents) to assist the user in conducting a method validation study. 1.2 Intended Audience This document is intended for use by internal EPA personnel engaged in method validation activities, and for the understanding of those who use EPA methods. This document can also be used by external bodies (e.g., private and state laboratories, voluntary consensus standard bodies, etc.) engaged in method validation efforts. 1.3 Scope of Guidance The lifecycle of a method starts with identifying a need for a method followed by several activities, including defining the method purpose and the intended use of the data, method development, implementation considerations, method validation, release/adoption, use, method review/revision, and method retirement (see Figure 1). It should be noted that many activities in the method lifecycle overlap and are interrelated. This document focuses on the validation of laboratory methods for non-regulatory purposes. It is intended for use across different types of chemical and radiochemical methods and provides a general approach for the validation of a newly developed, adopted, or modified method as well as guidelines for communicating the validation 1 Method modifications that are within the accepted flexibilities of the applied method do not require additional method validation; however, if modifications are made outside the accepted flexibilities of the applied method, then users are responsible for ensuring that the modifications are documented and are supported by a validation study that addresses those modifications. 2 ------- Guidelines on Validation for Non-Regulatory Chemical and Radiochemical Methods January 2022 performed in a consistent manner. The other activities of the method lifecycle are outside the scope of this document, but some are briefly described in Section 2.0. In addition, sampling methods are outside the scope of this document. Figure 1. 
Diagram of the typical lifecycle of a method

(*) NOTE: Validation is not needed when non-technical edits, clarifications, or grammatical edits result in the release of a method revision that does not change any technical aspects of the method protocol.

-------

2 Activities and Processes Preceding Method Validation

Several activities of the method lifecycle precede method validation, and some of these are briefly described in this section. One of the first steps in developing a method involves defining the reason for, or purpose of, the method. A method may be developed to identify and measure a new or emerging analyte, achieve a lower detection capability, meet stricter quality control objectives, or address any number of reasons or needs. The initial use of the data and, if applicable, fitness for purpose criteria should be considered prior to undertaking any method development or validation activities.

Method development and method validation are generally not viewed as completely separate processes but rather are considered to be significantly interrelated. Method development is often a complex iterative process. A detailed discussion of method development is outside the scope of this document. However, there are several similarities between the types of experiments conducted during method development and the tests performed during method validation. For example, many of the same performance characteristics assessed in method validation (see Section 4) are evaluated to some extent during method development (3, 6).

Before method validation, several aspects related to method implementation are often considered. These include, but are not limited to, holding times, sample preservation, cost, and waste generation. These and other implementation aspects relate to how the method will be used in a laboratory and could have implications for its use and applicability. A detailed discussion of these and other method implementation issues is outside the scope of this document. However, consideration should be given to including information on at least the implementation aspects mentioned in this paragraph in the Method Validation Summary (see Section 7.2).

All laboratory activities related to the development of a method, including method development and method validation activities, should be conducted in accordance with the laboratory's quality system. In general, this means compliance with an applicable Quality Assurance Plan (e.g., https://www.epa.gov/quality/epa-qar-5-epa-requirements-quality-assurance-project-plans) and any applicable standard operating procedures (SOPs).

-------

3 Interlaboratory and Single Laboratory Method Validation Studies

3.1 Interlaboratory and Single Laboratory Considerations

An interlaboratory study, as defined by ASTM E691-19e1, Standard Practice for Conducting an Interlaboratory Study to Determine the Precision of a Test Method (7), measures the variability of results when a test method is applied many times in multiple laboratories. Thus, interlaboratory validation studies are designed to evaluate the performance of a method, in particular its precision (e.g., reproducibility and repeatability), across multiple laboratories, when the method is performed as written (6-8).
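As an illustration of the precision statistics that an interlaboratory validation study is designed to produce, the following minimal sketch (Python, with entirely hypothetical data) estimates a repeatability (within-laboratory) and a reproducibility (within- plus between-laboratory) standard deviation using the basic one-way variance decomposition. It is a generic illustration only; an actual study should follow the calculation and consistency-checking procedures of ASTM E691 or an equivalent practice.

```python
import statistics

# Hypothetical balanced data set: one analyte in one material,
# three replicate results (mg/L) from each of four laboratories.
results = {
    "Lab A": [10.1, 10.3, 9.9],
    "Lab B": [10.6, 10.4, 10.7],
    "Lab C": [9.8, 10.0, 9.7],
    "Lab D": [10.2, 10.1, 10.4],
}

n = len(next(iter(results.values())))               # replicates per laboratory
lab_means = [statistics.mean(v) for v in results.values()]
lab_vars = [statistics.variance(v) for v in results.values()]

# Repeatability variance: pooled within-laboratory variance.
s_r_sq = statistics.mean(lab_vars)

# Between-laboratory variance component, estimated from the spread of lab means.
s_L_sq = max(statistics.variance(lab_means) - s_r_sq / n, 0.0)

# Reproducibility variance combines the within- and between-laboratory terms.
s_R_sq = s_r_sq + s_L_sq

print(f"repeatability standard deviation s_r   = {s_r_sq ** 0.5:.3f} mg/L")
print(f"reproducibility standard deviation s_R = {s_R_sq ** 0.5:.3f} mg/L")
```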
Laboratories that participate in an interlaboratory validation study should be representative of the kind of laboratory expected to use the method and should be considered qualified to perform the method (e.g., have qualified staff, correct instrumentation, and appropriate facilities to support the conduct of the method) (7-9). Laboratories selected to participate in the interlaboratory validation should not be limited to those that are exceptionally well qualified or equipped to perform the method (7). Participating laboratories should be expected to conduct the method as written, without deviations, while paying close attention to the flexibilities allowed within the method. This is important for assessing the performance of the method in different settings with different operators and for providing appropriate results for statistical analyses. If deviations from the provided interlaboratory study protocol are made, they should be documented (6) and considered when evaluating the study results.

Appropriate test materials should be selected for the interlaboratory validation study. The matrices selected should be within the scope of the method and of the type expected to be encountered when using the method. ASTM E1601-19, Standard Practice for Conducting an Interlaboratory Study to Evaluate the Performance of an Analytical Method (9), recommends that all test materials included in the expected scope of the method be included in the interlaboratory validation. Samples of a certain matrix and concentration should be as homogeneous as possible prior to aliquoting into individual test samples and distributing to individual laboratories (7). The AOAC International Official Methods of Analysis notes that any heterogeneity between test samples generated from a single test material must be negligible as compared to any analytical variability, so as not to be a factor in the test results (8). Samples should cover the range of concentrations applicable to the method, and, if possible, should include levels near the upper and lower limits of the established concentration range of the compound(s) being evaluated (9).

The number of laboratories that participate in an interlaboratory validation and the number of samples that are used in each validation study may vary. The following discussion on the number and type of materials used in an interlaboratory study comes from ASTM E691-19e1 (7):

The number and type of materials to be included in an interlaboratory study (ILS) will depend on the range of the levels in the class of materials to be tested and likely relation of precision to level over that range, the number of different types of materials to which the test method is to be applied, the difficulty and expense involved in obtaining, processing, and distributing samples, the difficulty of, length of time required for, and expense of performing the test, the commercial or legal need for obtaining a reliable and comprehensive estimate of precision, and the uncertainty of prior information on any of these points.

Sometimes a formal collaborative study is not practical (4) and a single laboratory validation is performed. This can be considered a special case of an interlaboratory validation where only one laboratory conducts the validation activities. Most, if not all, of the same laboratory tests performed in interlaboratory studies are performed during a single laboratory validation study.
However, single laboratory validations may need to consider potential limitations for some validation activities. The International Union of Pure and Applied Chemistry (IUPAC) Technical Report, Harmonized Guidelines for Single-Laboratory Validation of Methods of Analysis (5), notes:

It is critically important in single-laboratory method validation to take account of method bias and the laboratory effect. There are a few laboratories with special facilities where these biases can be regarded as negligible, but that circumstance is wholly exceptional. (However, if there is only one laboratory carrying out a particular analysis, then method bias and laboratory bias take on a different perspective.) Normally, method and laboratory effects have to be included in the uncertainty budget, but often they are more difficult to address than repeatability error and the run effect. In general, to assess the respective uncertainties it is necessary to use information gathered independently of the laboratory. The most generally useful sources of such information are (i) statistics from collaborative trials (not available in many situations of single-laboratory method validation), (ii) statistics from proficiency tests, and (iii) results from the analysis of certified reference materials.

When available, certified reference materials (CRMs) can be used in single laboratory validations to assess laboratory and method bias in combination (5). If appropriate CRMs are not available, a laboratory could alternatively use spiked samples (4, 5). In addition, results from a single laboratory validation could be compared to published results or statistics for the method (4) to further confirm the method performance. Also, to better ascertain method bias, an internal round robin could be performed within the laboratory using multiple qualified analysts following general interlaboratory validation guidelines (4).

3.2 Matrix Definition and Variability

Method validation is conducted for a specific analyte or set of analytes in a defined matrix or matrices. When planning and conducting a method validation study, the matrix or matrix variations that will be included in the validation need to be defined. The determination of what matrices and associated variables to include in the method validation study depends on the intended use and application of the final method and should be contemplated during method development.

Matrix variability can have significant implications for method performance. The matrix will have an impact on the sample from collection through analysis. Matrix variation has been cited as one of the most important but least acknowledged sources of error in analytical measurements (5). Variations in matrix can impact detection limits and introduce bias into measurements, and thus are an important consideration in any method validation study. Matrix variability should be an important factor in method ruggedness testing (see Section 4.7). As the Multi-Agency Radiological Laboratory Analytical Protocols (MARLAP) Manual notes, samples collected from different geographical regions or different processes may have differing characteristics that can impact the performance of a method (10). Thus, the ruggedness of a technique for handling variations of a matrix should be investigated, and matrix variations should be a part of any well-planned interlaboratory collaborative study or single laboratory study (10).
Additional method application criteria or quality assurance measures (e.g., matrix or surrogate spikes; internal and external calibration techniques; performance reference compounds) may be needed to account for matrix impacts.

No universal guidelines or algorithms define how many matrices or types of a matrix (e.g., different variations of soil) should be included in a method validation study. However, individual EPA offices may have specific guidelines on matrix variability. Matrix selection will be method-specific and should be based on the scope and application of the method. For example, when validating a method for use nationally, one may need to include a larger number of matrix variations to cover the range of anticipated method applications, while validation of a method developed for samples from a specific or local site or region may need a much more limited range of matrices for method validation. These aspects of the method should be considered when determining the matrix variations to test during a method validation study and be included in the method validation report. In addition, the use of varied matrices should be considered in conducting different method performance characteristic evaluations (see Section 4). Appendix A provides examples of method validation matrix considerations used by some EPA offices.

-------

4 Method Performance Characteristics

The purpose, scope, and applicability of a method determine the method performance characteristics necessary to properly validate the method. Performance characteristics offer a defined and quantitative set of parameters against which a method can be validated, directly assessing the method and demonstrating that it is fit for its intended purpose. Typical method performance characteristics evaluated during a method validation study include:
• Bias/Trueness
• Detection and Quantification Capability
• Instrument Calibration
• Measurement Uncertainty
• Precision
• Range
• Ruggedness
• Selectivity

The following subsections on each of the performance characteristics provide a definition of the performance characteristic, a discussion of its use and implementation, and a list of relevant resources for more details and information. The performance characteristics are provided in alphabetical order, which is not meant to imply order of importance. The performance characteristics listed here and discussed in the following subsections are not an exhaustive list but rather those that are typically used and found in various method validation guidelines (3-5). Other performance characteristics may be applicable or specific to certain methods and should be considered for evaluation as part of those method validation studies.

4.1 Bias/Trueness

Definition

The following definition for bias, specifically for methods, is from ASTM E177-20, Standard Practice for Use of the Terms Precision and Bias in ASTM Test Methods (11): Bias is the difference between the expectation of the test result and an accepted reference value. ASTM Practice E177 also defines an alternative term for bias: trueness (11). The standard notes that trueness has a more positive connotation. In some EPA programs, accuracy is used as a synonym for bias, but this will not be how it is used in this document.
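The Discussion that follows reproduces the Eurachem bias and recovery expressions (Eq. 1 through Eq. 4). As a concrete illustration of that arithmetic, the short sketch below applies the same expressions to hypothetical replicate data; the values, units, and variable names are illustrative only and do not come from any EPA method.

```python
import statistics

# Hypothetical replicate results (mg/L) for a sample with an accepted
# reference value, plus results for the same sample after spiking.
unspiked = [4.7, 4.9, 4.8, 4.6, 4.8]   # candidate-method results (mean = x̄)
spiked = [9.5, 9.7, 9.6, 9.4, 9.6]     # spiked-sample results (mean = x̄')
x_ref = 5.0                            # accepted reference value
x_spike = 5.0                          # added (spiked) concentration

x_bar = statistics.mean(unspiked)
x_bar_spiked = statistics.mean(spiked)

bias_abs = x_bar - x_ref                                   # Eq. 1: b = x̄ − x_ref
bias_rel = 100 * (x_bar - x_ref) / x_ref                   # Eq. 2: b(%)
spike_recovery = 100 * (x_bar_spiked - x_bar) / x_spike    # Eq. 3: R'(%)
apparent_recovery = 100 * x_bar / x_ref                    # Eq. 4: R(%)

print(f"absolute bias     b  = {bias_abs:+.2f} mg/L")
print(f"relative bias     b% = {bias_rel:+.1f}%")
print(f"spike recovery    R' = {spike_recovery:.1f}%")
print(f"apparent recovery R  = {apparent_recovery:.1f}%")
```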
More information about the definition and use of these terms can be found using the following link: https://www.itl.nist.gov/div898/handbook/mpc/section1/mpc113.htm#:~:text=In%20particular%2C%20for%20a%20measurement,on%20the%20same%20test%20item

Discussion

The following general overview of the determination of bias for a method is taken from the Eurachem Guide: The Fitness for Purpose of Analytical Methods, A Laboratory Guide to Method Validation and Related Topics (3).

A practical determination of bias relies on comparison of the mean of the results (x̄) from the candidate method with a suitable reference value (x_ref). The reference value is sometimes referred to as a 'true value' or a 'conventional true value'. Three general approaches are available: a) analysis of reference materials; b) recovery experiments using spiked samples; and c) comparison with results obtained with another method. Bias studies should cover the method scope and may therefore require the analysis of different sample types and/or different analyte levels. To achieve this, a combination of these different approaches may be required.

The bias can be expressed in absolute terms

b = x̄ − x_ref (Eq. 1)

or relative in percent

b(%) = ((x̄ − x_ref)/x_ref) × 100 (Eq. 2)

or as a relative spike recovery

R'(%) = ((x̄' − x̄)/x_spike) × 100 (Eq. 3)

where x̄' is the mean value of the spiked sample and x_spike is the added concentration. However, in some sectors of analytical measurement, the relative recovery ('apparent recovery') in percent is also used:

R(%) = (x̄/x_ref) × 100 (Eq. 4)

To determine the bias using reference material (RM), the mean and standard deviation of a series of replicate measurements are determined and the results compared with the assigned property value of the RM. The ideal RM is a certified matrix reference material [CRM] with property values close to those of the test samples of interest. CRMs are generally accepted as providing traceable values. It is also important to remember that an RM should only be used for one purpose during a validation study. For example, an RM used for calibration shall not also be used to evaluate bias.

Compared to the wide range of sample types and analytes encountered by laboratories, the availability of RM is limited, but it is also important that the chosen material is appropriate to the use. It may be necessary to consider how the RM was characterized (for example, if the sample preparation procedure used during characterization of the material is not intended to give the total analyte concentration, but the amount extracted under certain conditions). For regulatory work, a relevant certified material (ideally matrix matched if available) should be used. For methods used for long-term, in-house work, a stable in-house material can be used to monitor bias, but a CRM should be used in the initial assessment.

In the absence of suitable RMs, recovery studies (spiking experiments) may be used to give an indication of the likely level of bias. Analytes may be present in a variety of forms in the sample, and sometimes only certain forms are of interest to the analyst. The method may thus be deliberately designed to determine only a form of the analyte. A failure to determine part of, or all, the analyte present may reflect an inherent problem with the method.
Hence, it is necessary to assess the efficiency of the method for detecting all the analyte present. Because it is not usually known how much of a particular analyte is present in a test portion, it is difficult to be certain how successful the method has been at extracting it from the sample matrix. One way to determine the efficiency of extraction is to spike test portions with the analyte at various concentrations, then extract the spiked test portions and measure the analyte concentration. The inherent problem with this is that an analyte introduced in such a way will probably not be bound as strongly as that which is naturally present in the test portion matrix and so the technique will give an unrealistically high impression of the extraction efficiency.

It may be possible to assess bias by comparing results from the candidate method with those obtained from an alternative method.

There are challenges associated with evaluating bias. The following discussion of some of the challenges, which parallels the discussion on the three approaches to determine bias in the Eurachem Guide (3), is taken from Validation and Peer Review of U.S. Environmental Protection Agency Chemical Methods of Analysis (6):

The reference material approach introduces the important uncertainty of matrix matching, and the reliability of a bias estimate depends upon the relationship between the composition of the reference material and the samples.

When using the alternate method approach, the reliability of a bias estimate is dependent on how much is known about the performance characteristics of the alternate method.

The matrix spiking approach introduces uncertainty regarding the behavior of spiked materials, compared to materials containing native analyte. This may be particularly problematic for solid materials.

RMs are a critical tool in method validation, particularly in regard to assessing method bias. RMs are also important in instrument calibration and are described in more detail for this application in Section 4.3.

Useful Resources

The following resources can provide further details and information on bias/trueness.
• ASTM E177-20, Standard Practice for Use of the Terms Precision and Bias in ASTM Test Methods (11)
• Eurachem Guide: The Fitness for Purpose of Analytical Methods, A Laboratory Guide to Method Validation and Related Topics (3)
• IUPAC Technical Report, Harmonized Guidelines for Single-Laboratory Validation of Methods of Analysis (5)
• ASTM E1601-19, Standard Practice for Conducting an Interlaboratory Study to Evaluate the Performance of an Analytical Method (9)

4.2 Detection Capability and Quantification Capability

Definition

For both analyte detection capability and quantification capability, a variety of definitions and terms are used by EPA, as well as the national and international analytical/metrology community. Regarding detection capability, the IUPAC Technical Report, Harmonized Guidelines for Single-Laboratory Validation of Methods of Analysis (5), notes:

There are several possible conceptual approaches to the subject, each providing a somewhat different definition of the limit. Attempts to clarify the issue seem ever more confusing.
If a method is being validated for use in a particular EPA program area, that program area's definitions, terms, and calculational procedures related to detection capability and quantification capability should be incorporated into the method validation process. If appropriate, the method-specific detector and related systems should be identified.

Discussion

The following general overview of the concepts of detection capability and quantification capability is from the Eurachem Guide (3).

Where measurements are made at low concentrations, there are three general concepts to consider. First, it may be necessary to establish a value of the result which is considered to indicate an analyte level that is significantly different from zero. Often some action is required at this level, such as declaring a material contaminated. This level is known as the 'critical value', 'decision limit', or in [European Union] EU directives, CCα. Second, it is important to know the lowest concentration of the analyte that can be detected by the method at a specific level of confidence. That is, at what true concentration will we confidently exceed the critical value described above? Terms such as 'limit of detection' (LOD), 'minimum detectable value', 'detection limit', or, in EU directives, CCβ are used for this concept. Third, it is also important to establish the lowest level at which the performance is acceptable for a typical application. This third concept is usually referred to as the limit of quantification (LOQ).

More detailed discussions on the concepts of detection capability and quantification capability are provided in Appendix B. While it is beyond the scope of this document to include a detailed discussion of the different definitions and terms for detection capability and quantification capability used by EPA and the analytical/metrology community, a few examples have been included to provide greater context for this discussion.

Method Detection Limit (MDL). The U.S. EPA Office of Water MDL procedure is worth noting because it is codified in federal regulations (40 Code of Federal Regulations [CFR] Part 136, Appendix B) (12). The MDL is required for most chemical analyses that support National Pollutant Discharge Elimination System (NPDES) permits. It would be appropriate to use the MDL when validating most methods that are applicable to wastewater or Clean Water Act (CWA) compliance monitoring. Types of methods to which the MDL does not apply are detailed in the Scope and Application section of the MDL procedure (12):

The MDL procedure is not applicable to methods that do not produce results with a continuous distribution, such as, but not limited to, methods for whole effluent toxicity, presence/absence methods, and microbiological methods that involve counting colonies. The MDL procedure also is not applicable to measurements such as, but not limited to, biochemical oxygen demand, color, pH, specific conductance, many titration methods, and any method where low-level spiked samples cannot be prepared. Except as described in the addendum, for the purposes
MDL determinations using spiked samples may not be appropriate for all gravimetric methods (e.g., residue or total suspended solids), but an MDL based on method blanks can be determined in such instances. The EPA MDL procedure uses low-level spikes of a clean reference matrix and method blanks to calculate a detection limit. The procedure defines the MDL as (12): the minimum measured concentration of a substance that can be reported with 99% confidence that the measured concentration is distinguishable from method blank results. It is important to note that the MDL incorporates every step of the analytical method, including extractions and any mandatory cleanups, not just the final instrumental analysis. The MDL procedure has two subcategories: an initial MDL and an ongoing MDL. The initial MDL is used when a laboratory is first implementing a method, or if the laboratory does not already have data available to calculate an MDL. The ongoing MDL is used for analyses that are run routinely, using low-level spike data that are collected quarterly along with routinely collected method blank data. The MDL(s) is calculated using the spike data. The MDL(b) is calculated using the blank data. The final MDL is higher of the two MDL(s) and MDL(b) calculations. The procedure is relatively short, seven pages, and is available to the public at 40 CFR Part 136 in the eCFR at https://www.ecfr.eov/cei-bin/text- d!\ h U ? '<0. I m i 101> > \ _ * _ * ••'. An MDL frequent questions webpage is also available at https://www.epa.eov/cwa-methods/method-detection-limit-firequent-questions. In the EPA Office of Air and Radiation's ambient air monitoring program, "detection limit" is defined in the quality assurance guidance handbook as "[t]he lowest concentration or amount of the target analyte that can be determined to be different from zero by a single measurement at a stated level of probability" (13). In addition, the MDL procedure (12) is used in the ambient air monitoring program Photochemical Assessment Monitoring (14). The ambient air monitoring data collected by the states is submitted as measured, even if below the MDL. In the EPA Office of Air and Radiation's stationary source regulatory program which includes the New Source Performance Standards of 40 CFR Part 60 and the National Emissions Standards for Hazardous Air Pollutants of 40 CFR Parts 61 and 63, the "limit of detection" is defined in Method 301, the method validation protocol, as "the minimum concentration of a substance that can be measured and reported with 99 percent confidence that the analyte is greater than zero" (75). The Part 136, Appendix B MDL procedure (12) is also specifically referenced by several of the standards as well as_Method 301 (75). Neither the ambient air monitoring program nor the station source program utilizes the LOQ in data reporting. 13 ------- Guidelines on Validation for Non-Regulatory Chemical and Radiochemical Methods January 2022 Limit of Detection (LOD) is used in some EPA methods and is recognized by several standards organizations such as ISO and ASTM International as a means to express the detection capability of a method. LOD is defined as the lowest concentration of the analyte that can be detected by the method at a specified level of confidence. ISO standards note that the LOD should be estimated taking both type I (alpha [a]) and type II (beta [|3]) errors into account (see Appendix B). IUPAC recommends default values for alpha error and beta error equal to 0.05 (5). 
Other approaches to estimating the LOD involve calculating the standard deviation of replicate measurements of blank samples or replicate measurements of test samples with low concentrations of the analyte, in which the LOD equals some multiple (e.g., 3 times) of the standard deviation. Given that the definition of the LOD is somewhat general and that there are different means of calculating the LOD, it is critical that a precise definition of LOD be clearly stated during a method validation effort and that the means of calculating the LOD be fully documented.

Minimum Detectable Concentration (MDC) is used in most EPA radiochemical methods (drinking water methods are a notable exception) and is used extensively in the environmental radiochemistry community as a means to express the detection capability of a method. The following definition of the MDC is taken from the MARLAP Manual (10):

The minimum analyte concentration that must be present in a sample to give a specified probability, 1 − β, of measuring a response greater than the critical value, leading one to conclude correctly that there is analyte in the sample.

The value of β that appears in the definition above, like α, is usually chosen to be 0.05 or is assumed to be 0.05 by default if no value is specified.

Lowest Concentration Minimum Reporting Level (LCMRL) and Multi-Laboratory Minimum Reporting Level (MRL) are used in the EPA Unregulated Contaminant Monitoring Rule (UCMR) Program, a national occurrence study of unregulated contaminants in drinking water. The LCMRL is defined in Statistical Procedures for Determination and Verification of Minimum Reporting Levels for Drinking Water Methods (16) as:

A single laboratory Lowest Concentration Minimum Reporting Level (LCMRL) is the lowest true concentration for which the future recovery is predicted to fall between 50% and 150% with 99% confidence.

It is a statistically-based quantitation procedure that accounts for both precision and accuracy. The determined concentration is based on a statistical calculation by a single laboratory where multiple concentration replicates are processed through the entire analytical method and the data are plotted as measured sample concentration (y-axis) versus true concentration (x-axis). If the data support an assumption of constant variance over the concentration range, an ordinary least-squares regression line is drawn; otherwise, a variance-weighted least-squares regression is used. Prediction interval lines of 99% confidence are drawn about the regression. At the points
The statistical parameters of the multi-laboratory MRL are based on the EPA established practical quantitation limit (PQL) determination, where the PQL is set at the concentration that 75% of participating laboratories are predicted to meet an analyte's acceptance criteria using a specified regression procedure. The calculation itself is defined in Technical Basis for the LCMRL Calculator (77): The MRL is calculated in three steps whenever there are three or more laboratories providing data with valid LCMRLs or calculated LCMRLs that are below the lowest non- zero spiking level. In the first step, 200 BB LCMRL replicates are calculated for each laboratory data set. In the second step a predicted distribution of some unknown and yet to be observed laboratory is built from the population of replicate laboratory LCMRLs using a random effects model. In the third and last step the MRL is taken to be the upper 95% one-sided confidence interval on the 75th percentile of the predicted distribution referred to as the 95-75 upper tolerance limit (95-75 UTL). Minimum Level (ML) refers to either the sample concentration equivalent to the lowest calibration point in a method or a multiple of the MDL, whichever is higher. Minimum levels can be obtained in several ways: they may be published in a method; they may be based on the lowest acceptable calibration point used by a laboratory; or they may be calculated by multiplying the MDL in a method, or the MDL determined by a laboratory, by a factor of three. For the purposes of NPDES compliance monitoring (under the CWA), EPA considers the following terms to be synonymous: quantitation limit, reporting limit, and minimum level (18-20). Lower Limit of Quantitation (LLOQ). EPA's Office of Resource Conservation and Recovery (part of Office of Land and Emergency Management) publishes the SW-846 Compendium, which uses the LLOQ concept for quantitative analysis (21). The SW-846 Compendium defines the LLOQ as the lowest concentration at which the laboratory has demonstrated that a target analyte can be reliably measured and reported with a certain degree of confidence (21). The LLOQ must be greater than or equal to the lowest initial calibration standard concentration, and each laboratory is required to establish and periodically verify LLOQs at concentrations at which both qualitative and quantitative requirements can routinely be met using the instrumentation, equipment, reagents, supplies, and personnel specific to that laboratory. 2 EPA provides software that will calculate LCMRL values. Instructions regarding the download and use of the software can be found at the following URL: https://www.epa.gov/dwanalvticalmethods/lowest-concentration-minimum-reporting-level- Icmrl-calculator 15 ------- Guidelines on Validation for Non-Regulatory Chemical and Radiochemical Methods January 2022 LLOQs are established or verified with spiked blanks or representative sample matrices prepared at or near expected target analyte LLOQs, and these quality control samples are processed through all sample preparation and analysis steps used for field samples. SW-846 methods provide default acceptance criteria for establishing or verifying LLOQs, and laboratories are encouraged to use statistically-based limits once they have acquired sufficient data (21). 
SW-846 methods also recommend including LLOQ verifications on a project-specific basis, as needed (e.g., to evaluate the potential for measurement bias when decision limits are near established LLOQs), and SW- 846 methods defer to project planning documents regarding whether and how to report concentrations below the LLOQ in field samples (21). Appendix C is a compilation of most of the terms used in EPA methods for analyte detection capability and quantification capability. Useful Resources The following resources can provide further details and information on detection capability and quantification capability. • IUPAC Technical Report, Harmonized Guidelines for Single-Laboratory Validation of Methods of Analyses (5) • Eurachem Guide: The Fitness for Purpose of Analytical Methods, A Laboratory Guide to Method Validation and Related Topics (3) • 40 CFRPart 136, Appendix B (12) • Multi-Agency Radiological Laboratory Analytical Protocols (MARLAP) Manual (10) • Currie, L.A. Limits for Qualitative Detection and Quantitative Determination: Application to Radiochemistry (22) • Currie, L.A., Detection: Overview of Historical, Societal, and Technical Issues, in Detection and Analytical Chemistry (23) • Currie, L.A., Presentation of the Results of Chemical Analysis in IUPAC Compendium of Analytical Nomenclature (24) • Currie, L.A. Quality Assurance of Analytical Processes, in IUPAC Compendium of Analytical Nomenclature (25) • Lanier, S. W., Hendrix, C. D. Reference Method Accuracy and Precision (ReMAP): Phase 1 (26) 16 ------- Guidelines on Validation for Non-Regulatory Chemical and Radiochemical Methods January 2022 4.3 Instrument Calibration Definition Instrument calibration refers to the procedures used for correlating instrument response to an amount of analyte (concentration or other quantity) using measurements of suitable RMs. Discussion An instrument calibration approach is established for a particular application during method development and is confirmed during single laboratory and/or multi-laboratory method validation. There are two major components of instrument calibration: calibration approach and RMs, both of which are described below. Calibration Approach. The calibration approach often includes the selection and application of a calibration model. The following excerpt on calibration model is from Validation and Peer Review of U.S. Environmental Protection Agency Chemical Methods of Analysis (6): The characteristics of a calibration function and justification for a selected calibration model should be demonstrated during an intra-laboratory method validation study. The performance of a calibration technique and the choice of calibration model (e.g., first-order linear, curvilinear, or nonlinear) are criticalfor minimizing sources of instrument bias and optimizing precision. A calibration model is a mathematical function that relates composition to instrument response. The parameters of the model are usually estimated from the responses of known, pure analytes. Calibration errors can result from failure to identify the best calibration model; inaccurate estimates of the parameters of the model; or inadequately studied, systematic effects from matrix components. 
The following excerpt on calibration is from ASTM E2857-11, Standard Guide for Validating Analytical Methods (4): Methods require calibration using measurements of suitable reference materials and mathematical fitting of the measured responses to an algorithm, that is, an equation thought to describe adequately the relationship between the amount of analyte and the measured response. Algorithms are almost always an approximation of the real world, and as such, their ability to fit the data has limits that can be tested by a variety of means. The process of method validation includes evaluation of the mathematical model or models used for instrument calibration. However, it is beyond the scope of this document to describe all possible models or algorithms that might be used for calibration, or the approaches that might be used for their evaluations. Reference Materials (RMs). RMs are often used to verify instrument calibration. In ISO/International Electrotechnical Commission (IEC) 17025, General Requirements for the Competence of Testing and Calibration Laboratories (2), the standard notes that RMs shall, where possible, be traceable to SI units through a National Metrology Institute (e.g., National Institute of Standards and Technology [NIST] in the United States) or, if that is not possible, then traceable to CRMs. It also states that internal RMs shall be checked as far as technically and economically practicable. The following excerpt on RMs and CRMs is from the Eurachem Guide (3): RMs can be virtually any material used as a basis for reference, and could include laboratory reagents of known purity, industrial chemicals, or other artefacts. The property or analyte of interest needs to be stable and homogenous, but the material does not need to have the high degree of characterization, metrological traceability, uncertainty and documentation associated with CRMs. The characterization of the parameter of interest in a CRM is generally more strictly controlled than for an RM, and in addition the characterized value is certified with a documented metrological traceability and uncertainty. There are generally three options for suitable RMs for instrument calibration: 1) CRMs; 2) RMs with traceability to CRMs; and 3) RMs from other sources. For chemical analysis, it is often the case that CRMs are not available, but it may be possible to obtain RMs with traceability to CRMs from a manufacturer, or the laboratory conducting the method validation may prepare RMs with traceability to CRMs. In many cases, it may not be possible to obtain or produce RMs with traceability to CRMs. In those instances, the laboratory conducting the method validation may prepare internal RMs from other sources or obtain them from a manufacturer. These RMs prepared from other sources should have a supporting certificate of analysis (COA) and should meet certain purity acceptance criteria based on the intended use of the method. Given that there are significantly fewer radiochemical analytes than chemical analytes, CRMs are much more likely to be available for the calibration of radiation instruments for a radiochemical method validation, though CRM availability for chemical analytes may also be related to difficulties in synthesizing or purifying a chemical of interest.
The radiochemical CRMs can be used directly for instrument calibration, or as is most often the case, RMs with traceability to CRMs can be used for calibration. Useful Resources The following resources can provide further details and information on instrument calibration. • Eurachem Guide: The Fitness for Purpose of Analytical Methods, A Laboratory Guide to Method Validation and Related Topics (3) 18 ------- Guidelines on Validation for Non-Regulatory Chemical and Radiochemical Methods January 2022 • IUPAC Technical Report, Harmonized Guidelines for Single-Laboratory Validation of Methods of Analysis (5) • ISO/IEC 17025 General Requirements for the Competence of Testing and Calibration Laboratories (2) • ASTM E2857-11, Standard Guide for Validating Analytical Methods (4) • Validation and Peer Review of U.S. Environmental Protection Agency Chemical Methods of Analysis (6) 4.4 Measurement Uncertainty Definition The Joint Committee for Guides in Metrology (JCGM) Guide to the Expression of Uncer- tainty in Measurement (27), often abbreviated as GUM, defines measurement uncertainty as follows: a parameter, associated with the result of a measurement, that characterizes the dispersion of the values that could reasonably be attributed to the measurand NOTE 1 The parameter may be, for example, a standard deviation (or a given multiple of it), or the half-width of an interval having a stated level of confidence. NOTE 2 Uncertainty of measurement comprises, in general, many components. Some of these components may be evaluatedfrom the statistical distribution of the results of series of measurements and can be characterized by experimental standard deviations. The other components, which also can be characterized by standard deviations, are evaluated from assumed probability distributions based on experience or other information. NOTE 3 It is understood that the result of the measurement is the best estimate of the value of the measurand, and that all components of uncertainty, including those arising from systematic effects, such as components associated with corrections and reference standards, contribute to the dispersion. Discussion The result of a measurement is never exactly equal to the true value of the measurand (the particular quantity subject to measurement). The difference between the result and the true value is called the error of the measurement. Since the true value is always unknown, so is the error; however, a properly determined measurement uncertainty allows one to put bounds on the likely magnitude of the error. In conjunction with the measurement result, the uncertainty allows one to find bounds for the most likely values of the measurand. In this regard, it is conceptually similar to the "margin of error" that is commonly reported with statistical polling results. 19 ------- Guidelines on Validation for Non-Regulatory Chemical and Radiochemical Methods January 2022 When one follows the guidance of the GUM (27), the uncertainty of a measurement is expressed first as a standard deviation, called the combined standard uncertainty. The combined standard uncertainty may be multiplied by a coverage factor, k, to obtain an expanded uncertainty, which describes an interval about the measurement result that is believed to contain the true value with high confidence. The most commonly used coverage factor is k = 2, which is assumed to provide approximately 95 % confidence. 
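A minimal sketch of the GUM-style combination described above follows: standard uncertainty components are combined in quadrature (assuming they are uncorrelated and are already expressed as standard uncertainties of the result) and multiplied by a coverage factor k = 2. The component names and values are hypothetical.

```python
# Sketch of the GUM approach described above: combine uncorrelated standard
# uncertainty components in quadrature and apply a coverage factor k = 2.
# The component values are hypothetical and expressed in result units.
import math

components = {
    "calibration standard": 0.8,
    "volumetric steps":     0.5,
    "repeatability":        1.1,
}

u_combined = math.sqrt(sum(u**2 for u in components.values()))
k = 2.0
U_expanded = k * u_combined

print(f"combined standard uncertainty u_c = {u_combined:.2f}")
print(f"expanded uncertainty U (k = 2)    = {U_expanded:.2f}")
```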
The uncertainty of a measurement is intended to describe the quality of that measurement and should be considered when making decisions about the true value of the quantity being measured (for example, when comparing a single measurement result to an action level). It can also be useful when assessing whether a measurement process is producing results of the expected or required quality. Measurement uncertainties may or may not be considered when using statistical tests to make decisions about sampled populations. All or nearly all methods of radiochemical analysis used in the US include procedures for calculating measurement uncertainties. It has become common for laboratories to specify requirements for radioactive standards, calibrations, and calibration verifications in terms of their measurement uncertainties and also to include measurement uncertainties in the evaluation of radiochemical method quality control parameters. The Stationary Source and Ambient Air Monitoring Programs in EPA's Office of Air and Radiation, Office of Air Quality Planning & Standards utilize uncertainty in qualifying RMs used in ambient air and pollutant emissions measurements. The regulatory programs determine the quality (uncertainty) of the gas needed for a particular application, and the EPA Traceability Protocol for Assay and Certification of Gaseous Calibration Standards (28) sets forth the procedures to be followed in certifying the uncertainty of the gases manufactured for use in meeting the program requirements. Measurement uncertainty is most properly understood as a property of a measurement, not a measurement method or even a measurement process. However, an analytical method can still be evaluated in terms of the measurement uncertainty that it is expected to be achieved when used for analysis of samples at specified analyte levels under specified measurement conditions. Note that the predicted uncertainty may vary with the analyte level and may depend on other factors, including interferences. Predicting the uncertainty for a hypothetical measurement requires making assumptions about all such factors. Measurement uncertainty accounts for the effects of both random and systematic measurement errors, that is, for both imprecision and bias in the measurement process. (See Sections 4.5 and 4.1, which describe precision and bias for chemical and radiochemical methods.) When an analytical method provides estimates of measurement uncertainty, the validation process for the method can test the plausibility of the uncertainty estimates by comparing them to either the estimated precision of the method or its estimated root-mean-squared error. More extensive and rigorous testing of the uncertainty can be based on a large number of repeated analyses of the 20 ------- Guidelines on Validation for Non-Regulatory Chemical and Radiochemical Methods January 2022 same material, accounting not only for the uncertainty of each result but also for estimated measurement correlations among the results. Useful Resources The following resources can provide further details and information on measurement uncertainty. 
• JCGM 100:2008 Evaluation of Measurement Data: Guide to the Expression of Uncertainty in Measurement (27) • JCGM 101:2008 Evaluation of Measurement Data: Supplement 1 to the Guide to the Expression of Uncertainty in Measurement — Propagation of Distributions Using a Monte Carlo Method (29) • JCGM 102:2011 Evaluation of Measurement Data: Supplement 2 to the Guide to the Expression of Uncertainty in Measurement — Extension to Any Number of Quantities (30) • Eurachem/Co-Operation on International Traceability in Analytical Chemistry (CITAC) Guide, Quantifying Uncertainty in Analytical Measurement, CG 4, Third edition (31) • ASTM D8293-19, Guide for Evaluating and Expressing the Uncertainty of Radiochemical Measurements (32) • Multi-Agency Radiological Laboratory Analytical Protocols (MARLAP) Manual (10) • NIST "Uncertainty of Measurement Results" (33) • The NIST Traceable Reference Material Program for Gas Standards (34) 4.5 Precision Definition Precision, specifically for methods, is defined by ASTM E177-20 (11) as: Precision is the closeness of agreement between independent test results under stipulated conditions. ASTM Practice E177 notes that quantitative measures depend on the stipulated conditions and that independent test results mean that results are obtained in a manner not influenced by any previous result on the same or similar test object (11). Discussion The following discussion of precision as a method performance characteristic is taken from the Eurachem Guide (3): Precision (measurement precision) is a measure of how close results are to one another. It is usually expressed by statistical parameters which describe the spread of results, typically the standard deviation (or relative standard deviation), calculated from results obtained by carrying out replicate measurements on a suitable material under specified conditions. Deciding on the 'specified conditions' is an important aspect of evaluating measurement precision - the conditions determine the type of precision estimate obtained. 'Measurement repeatability' and 'measurement reproducibility' represent the two extreme measures of precision which can be obtained. Documentation of standard methods (e.g. from ISO) will normally include both repeatability and reproducibility data where applicable. Repeatability, expected to give the smallest variation in results, is a measure of the variability in results when a measurement is performed by a single analyst using the same equipment over a short timescale. Repeatability is sometimes referred to as 'within-run', 'within-batch' or 'intra-assay' precision. Reproducibility, expected to give the largest variation in results, is a measure of the variability in results between laboratories. In validation, reproducibility refers to the variation between laboratories using the same method. Between these two extremes, 'intermediate (measurement) precision' gives an estimate of the variation in results when measurements are made in a single laboratory but under conditions that are more variable than repeatability. The exact conditions used should be stated in each case. The aim is to obtain a precision estimate that reflects all sources of variation that will occur in a single laboratory under routine conditions (different analysts, extended timescale, different pieces of equipment etc.).
'Intermediate precision' is sometimes referred to as 'within-laboratory reproducibility', 'between-run variation', 'between batches variation' or 'inter-assay variation'. ASTM E177-20 (11) has similar definitions for repeatability and reproducibility as they pertain to replicate measurements using a method. The Eurachem Guide (3) provides a spectrum (measurement repeatability, intermediate measurement precision, reproducibility) for the range of experiments involving replicate analyses designed to take into account variations in operational conditions, which can be expected during routine use of a method. The extent of replicate analysis should be based on the intended use and application of the method and the need to adequately demonstrate that a method is fit for its intended purpose. Useful Resources The following resources can provide further details and information on precision. • ASTM E177-20, Standard Practice for Use of the Terms Precision and Bias in ASTM Test Methods (11) • Eurachem Guide: The Fitness for Purpose of Analytical Methods, A Laboratory Guide to Method Validation and Related Topics, Second Edition (3) • IUPAC Technical Report, Harmonized Guidelines for Single-Laboratory Validation of Methods of Analysis (5) • ASTM E1601-19, Standard Practice for Conducting an Interlaboratory Study to Evaluate the Performance of an Analytical Method (9) • ASTM E691-19e1, Standard Practice for Conducting an Interlaboratory Study to Determine the Precision of a Test Method (7) 4.6 Range Definition The range is defined as the interval of analyte concentrations for which there is a meaningful response from the analytical system (in other words, from the characterized detection level (e.g., LOD) at the low end to below saturation at the high end). The following types of ranges are normally confirmed during method validation. The calibration range is defined by the lowest and highest standards used for calibration that meet calibration performance criteria (e.g., linearity check, precision). The quantitation range is the range of analyte concentrations for which acceptable quantitative measurement results are obtained and reported. It spans from the characterized lower quantitation level (e.g., LOQ), through the level of interest (e.g., action levels), to an upper quantitation level (e.g., the highest calibration standard). This range may be extended through sample preparation techniques such as sample dilution or concentration. Discussion Where applicable, method validation should start by determining if the level(s) of interest (e.g., action level, risk limit, target level) can be reliably detected and quantitated on a candidate instrument, followed by determining the range of concentrations centered on the level(s) of interest. This defines the range and acceptability of the method for the desired sample results. The characterized detection level (e.g., MDL or LOD) should be well below the level(s) of interest, if possible. The upper limit should be well above any expected levels and is bounded by analytical system constraints (e.g., detector saturation). The inclusion of additional factors, such as sample size, dilutions, or concentrations, helps define the full range of the method. The determination of the range accounts for all steps of sample preparation and instrument conditions.
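The following sketch illustrates how preparation factors such as sample size, final extract volume, and dilution translate an instrument calibration range into the sample-concentration range it supports. The function name, units, and values are hypothetical and shown only to make the arithmetic concrete.

```python
# Illustrative sketch: translate an instrument calibration range into the
# sample-concentration range it supports, given hypothetical preparation factors.
def sample_range(cal_low, cal_high, final_volume_mL, sample_mass_g, dilution=1.0):
    """Scale an instrument calibration range (e.g., ug/mL of extract) to a
    sample-basis range (e.g., ug/g), accounting for preparation and dilution."""
    factor = (final_volume_mL / sample_mass_g) * dilution
    return cal_low * factor, cal_high * factor

# A 10 g sample extracted to a 1 mL final volume, analyzed undiluted and at 1:10.
print(sample_range(0.05, 20.0, final_volume_mL=1.0, sample_mass_g=10.0))                 # (0.005, 2.0) ug/g
print(sample_range(0.05, 20.0, final_volume_mL=1.0, sample_mass_g=10.0, dilution=10.0))  # (0.05, 20.0) ug/g
```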
The quantitation range defines the range that meets known performance criteria and is unique to each specific combination of sample preparation and analysis procedures. The quantitation range produces acceptable calibration (linear or other) and is limited by the method/instrument's range of concentrations. This range begins at the characterized lower quantitation level and ends at the upper concentration of the calibration curve, which does not saturate the detector. The low point of the quantitation range can be between the characterized detection level and the lower characterized quantitation level. This quantitation range must be validated for all methods (except for single-point calibrations or presence/absence tests). This range must be confirmed each time the method is applied after the validation. This quantitation range must also be confirmed if the method is applied using different instrumentation (within the same laboratory) or in a different laboratory (with the same or different instrumentation). This range may be extended through sample preparation techniques such as sample dilution or concentration. Note that the characterized detection limit may not meet the performance criteria of the calibration curve. The lower characterized quantitation limit is a concentration above the characterized detection limit and has defined performance levels. When the detector becomes saturated, the response plateaus, and it is difficult to differentiate concentrations in the upper part of the range. It is important not to include any plateauing part of the range in the quantitation range. Useful Resources The following resources can provide further details and information on range. • ASTM E2857-11, Standard Guide for Validating Analytical Methods (4) • IUPAC Technical Report, Harmonized Guidelines for Single-Laboratory Validation of Methods of Analysis (5) • Validation and Peer Review of U.S. Environmental Protection Agency Chemical Methods of Analysis (6) 4.7 Ruggedness Definition The following definition of ruggedness comes from the EPA Forum on Environmental Measurements (FEM) Validation and Peer Review of U.S. Environmental Protection Agency Chemical Methods of Analysis (6): the extent to which an analytical method remains unaffected by minor variations in operating conditions. The document further describes that: Ruggedness testing involves experimental designs for examining method performance when minor changes are made in operating or environmental conditions. The changes should reflect expected, reasonable variations that are likely to be encountered in different laboratories. Discussion The following discussion of ruggedness testing comes from ASTM E1169-18, Standard Practice for Conducting Ruggedness Tests (35): A ruggedness test is a special application of a statistically designed experiment. It is generally carried out when it is desirable to examine a large number of possible factors to determine which of these factors might have the greatest effect on the outcome of the test method. Statistical design enables more efficient and cost-effective determination of the factor effects that would be achieved if separate experiments were carried out for each factor. Ruggedness testing approaches can be univariate or multivariate in nature.
In a univariate approach, one factor is changed at a time and the method performance assessed. Multivariate testing involves changing more than one factor at a time and is a more efficient way to assess ruggedness of a method. Two commonly used multivariate approaches are fractional factorial design (6, 35, 36) and Plackett-Burman design (3, 35, 37). These two designs are used to identify a smaller number of important factors from a list of many potential ones. In a fractional factorial design, only an adequately chosen fraction of the treatment combinations required for the complete factorial experiment is selected to be run (35). Appropriately chosen fractional factorial designs for two-level experiments have the desirable properties of being both balanced and orthogonal. Plackett-Burman designs are very economical designs with a run number that is a multiple of four (rather than a power of 2 for a complete factorial design) (37). Plackett-Burman designs are very useful for economically detecting large main effects, assuming all interactions are negligible. Additional information on the Plackett-Burman design can be found in the original paper describing this approach (37). An appropriate statistical design for conducting the ruggedness testing experiments should be completed prior to the start of any testing. The statistical design used may depend on the ruggedness testing needs that are determined for an individual method. Ruggedness tests should be designed to evaluate a range of possible variables that may impact the method. The IUPAC Technical Report (5) provides the following examples of variables or factors to consider including in a ruggedness test: changes in the instrument operator, changes in brand of reagent, concentration of a reagent, pH of a solution, temperature of a reaction, time allowed for completion of the process, etc. Other variables or factors that could be considered may include sample preparation variations, instrument settings, instrument conditions such as temperatures and flows, sample and/or extract holding times, or sample and/or extract additives. Matrix variability, as discussed in Section 3, may also be a factor that is incorporated into 25 ------- Guidelines on Validation for Non-Regulatory Chemical and Radiochemical Methods January 2022 ruggedness testing. Different variables may be more important to certain methods and not as important to others. Decisions surrounding which variables to test should be made in consideration of the method's operational needs and critical factors that may be most influential on method performance. A skilled method operator as well as someone with experience in designing ruggedness testing can play an important role in effectively determining which variables to test. The following note about the variables or factors chosen for ruggedness testing comes from ASTM El 169-18 (35): The factors chosen for ruggedness testing are those believed to have the potential to affect the results. However, since no limits may be provided in the standard for these factors, ruggedness testing is intended to evaluate this potential. ASTM El 169-18 (35) recommends testing two levels for each factor evaluated. The following description of these factors comes from ASTM El 169-18 (35): In ruggedness testing, the two levels for each factor are chosen to use moderate separation between the high and low setting. 
In general, the size effects, and the likelihood of interactions between the factors, will increase with increased separations between the high and low settings of the factors. Experimental runs of each factor of interest should be conducted in random order. Once test results are obtained, statistical analysis should be used to determine the effects of factors on the test method (35). In evaluating the results, ASTM El 169-18 notes that statistical significance is not the same as practical significance (35). There may be practical significance in differences smaller than those determined by the ruggedness tests and statistical evaluations that may need consideration and additional experimentation (35). Ruggedness testing can be conducted towards the end of the method development effort. The following overview of the rationale behind this approach is taken from ASTM El 169-18 (35): Ruggedness testing is usually done within a single laboratory on uniform material, so the effects of changing only the factors are measured. The results may then be used to assist in determining the degree of control required offactors described in the test method. Any operational aspects determined to have a critical impact on the method should be discussed in the method distributed for interlaboratory evaluation. Such aspects will be important for collaborating laboratories to take note of and document during validation efforts to help in assessing overall method performance across different laboratories. Useful Resources The following resources can provide further details and information on ruggedness. 26 ------- Guidelines on Validation for Non-Regulatory Chemical and Radiochemical Methods January 2022 • Validation and Peer Review of U.S. Environmental Protection Agency Chemical Methods of Analysis (6) • Eurachem Guide: The Fitness for Purpose of Analytical Methods, A Laboratory Guide to Method Validation and Related Topics (3) • IUPAC Technical Report, Harmonized Guidelines for Single-Laboratory Validation of Methods of Analysis (5) • ASTM Ell 69-18, Standard Practice for Conducting Ruggedness Tests (35) • Youden, W.J. Statistical techniques for collaborative tests. In: Statistical Manual of the AOAC (36) • Plackett, R.L. et al. The Design of Optimum Multifactorial Experiments (37) 4.8 Selectivity in the Presence of Interferences Definition The following definition for selectivity is from ASTM E2857-11 (4): The selectivity of a method is its ability to produce a result that is not subject to change in the presence of interfering constituents. Discussion At a minimum, a qualitative assessment of selectivity should be conducted during method validation. This section provides a general description of the qualitative assessment of selectivity. The following overview of evaluating selectivity is taken from the IUPAC Technical Report (5): Ideally, selectivity should be evaluated for any important interferent likely to be present. It is particularly important to check interferents that are likely, on chemical principles, to respond to the test. For example, colorimetric tests for ammonia might reasonably be expected to respond to primary aliphatic amines. It may be impracticable to consider or test every potential interferent; where that is the case, it is recommended that the likely worst cases are checked. As a general principle, selectivity should be sufficiently good for any interferences to be ignored. 
The following overview of the assessment of selectivity is taken from the Eurachem Guide (3): The selectivity of a method is usually investigated by studying its ability to measure the analyte of interest in samples to which specific interferences have been deliberately introduced (those thought likely to be present in samples). Where it is unclear whether interferences are already present, the selectivity of the method can 27 ------- Guidelines on Validation for Non-Regulatory Chemical and Radiochemical Methods January 2022 be investigated by studying its ability to measure the analyte compared to other independent methods. Confirmatory techniques can be useful as a means of verifying identities. The more evidence one can gather, the better. Inevitably there is a trade-off between costs and time taken for analyte identification, and the confidence with which one can decide if the identification has been made correctly. Whereas evaluation of repeatability requires the measurement to be repeated several times by one technique, confirmation of analyte identity requires the measurement to be performed by several, preferably independent, techniques. Confirmation increases confidence in the technique under examination and is especially useful when the confirmatory techniques operate on significantly different principles. In some applications, for example, the analysis of unknown organics by gas chromatography, the use of confirmatory techniques is essential. When the measurement method being evaluated is highly selective, the use of other confirmatory techniques may not be necessary. An important aspect of selectivity which must be considered is where an analyte may exist in the sample in more than one form such as: bound or unbound; inorganic or organometallic; or different oxidation states. The definition of the measurand is hence critical to avoid confusion. Typically, selectivity is expressed qualitatively. As with any method performance characteristic that is expressed qualitatively, it is critical that the conditions under which the testing was performed be thoroughly described and documented. The following discussion of a "qualitative selectivity statement" is taken from Validation and Peer Review of U.S. Environmental Protection Agency Chemical Methods of Analyses (6): A qualitative selectivity statement includes a description of known interferences, interference effects, and the nature of the analytical data and information that substantiates the identity of the analyte (s) in the matrix of concern (e.g., elemental or molecular structure data, retention times from chromatographic separations, selective reaction chemistry, and results from reference standards, reference materials, matrix blanks, other blanks, or matrix fortifications). Quantitative measures of selectivity may also be used, although there is no generally accepted approach for the quantitative treatment of selectivity data. Therefore, the basis for quantitative selectivity measures should be thoroughly described. Useful Resources The following resources can provide further details and information on selectivity. • ASTM E2857-11, Standard Guide for Validating Analytical Methods (4) 28 ------- Guidelines on Validation for Non-Regulatory Chemical and Radiochemical Methods January 2022 • IUPAC Technical Report, Harmonized Guidelines for Single-Laboratory Validation of Methods of Analyses (5) • Validation and Peer Review of U.S. 
Environmental Protection Agency Chemical Methods of Analysis (6) • Eurachem Guide: The Fitness for Purpose of Analytical Methods, A Laboratory Guide to Method Validation and Related Topics (3) 5 Statistical Evaluation and References A method validation study seeks to demonstrate that a method is fit for a purpose through the generation of results from method performance characteristic testing. These results should be appropriately evaluated and assessed to determine the validity and acceptability of the method. This assessment will generally rely on statistical evaluations of the generated data. However, ASTM E1488-12, Standard Guide for Statistical Procedures to Use in Developing and Applying Test Methods (38), notes that: Statistical procedures often result in interpretations that are not absolutes. Sometimes the information obtained may be inadequate or incomplete, which may lead to additional questions and the need for further experimentation. Statistical evaluations need proper test planning and data collection to ensure the generation of reliable results. For example, ruggedness testing entails the consideration of the best statistical design for generating results necessary to appropriately evaluate the method of interest, as discussed in Section 4.7. Any statistical evaluations of method validation study data should be conducted by someone (from the originating office or organization) knowledgeable about the statistical methods and theories being used. A variety of statistical approaches are available, but there are often some that are more widely accepted or suggested for use (38). In addition, different method performance characteristics may involve implementing different statistical methods to appropriately analyze the results. It is a good idea to review existing methods or relevant literature to evaluate the statistical methods used. The goal of this document is not to provide detailed, prescriptive guidelines on statistical analyses and procedures to be used for evaluating method validation study data. Rather, this section provides a list of suggested resources for use in understanding and implementing the necessary statistical assessment of generated study data. Any statistical approaches or analysis guidelines or criteria that are particular to an agency/program or its method development strategy should also be referred to and used as appropriate in analyzing method validation data. Review of existing agency method validation reports or scientific literature reporting method validation results could also be helpful in determining what statistical approaches might be applicable for use. The following references provide further information and details on general statistical evaluations of method validation study data. This is not an exhaustive list. • ASTM E1488-12, Standard Guide for Statistical Procedures to Use in Developing and Applying Test Methods (38) • Belouafa, S., et al. Statistical Tools and Approaches to Validate Analytical Methods: Methodology and Practical Examples (39) • Lynch, J.M. Use of AOAC International Method Performance Statistics in the Laboratory (40) • Ravisankar, P., et al. A Review on Step-by-Step Analytical Method Validation (41) • Wernimont, G.
Use of Statistics to Develop and Evaluate Analytical Methods (42) • Guidance for Data Quality Assessment: Practical Methods for Data Analysis (43) • Data Quality Assessment: Statistical Methods for Practitioners (44) • ASTM E1601-19, Standard Practice for Conducting an Interlaboratory Study to Evaluate the Performance of an Analytical Method (9) Other references noted throughout this document may also have sections describing the application of statistical procedures for more specific data evaluation efforts and should be considered as well. In addition, specific EPA program offices may have documents that contain program-specific guidance on statistical procedures that are recommended/should be used in validating related methods. For example, the Emissions Measurement Center of the Office of Air Quality Planning and Standards provides guidance on statistical calculations and comparisons to be made to data in Method 301 (15). 6 Method Validation Report A method validation report should be prepared and structured in accordance with the expectations and guidelines/protocols of individual offices and/or programs. This report should address the scope and purpose of the method as well as detail the results from all method performance and application characteristic validations performed, as described previously. The method validation report may also include the following information, as indicated in the Validation and Peer Review of U.S. Environmental Protection Agency Chemical Methods of Analysis (6): • background information on method development • details on the method validation techniques employed • changes made to the method as a result of the method validation studies, and • any recommendations for future work. This is not a definitive list. Information and results to be included in the method validation report are at the discretion of each office and/or agency developing the method. Review and release of a method validation report should be conducted in accordance with applicable office and/or agency protocols. 7 Consistently and Concisely Communicating Method Validation Studies Effective communication of method validation study results is an important part of the release/adoption step of the method lifecycle (see Figure 1). Consistent and concise communication practices can help to advance agency-wide awareness and understanding of validated methods. To aid in the communication of method validation study results, this document includes two newly developed tools: • Validation Design • Method Validation Summary The Validation Design is a very short alphanumeric descriptor meant to concisely describe important aspects of the method validation study design. The Method Validation Summary is a concise synopsis of the method validation results that can be easily prepared and included in any method validation report. Both provide mechanisms for consistent sharing of method validation results. The following sections explain these two new tools in more detail. 7.1 Validation Design The Validation Design provides a standardized format to convey the extent of validation performed for the method based on the number of laboratories that participated in the validation and the number of different matrices evaluated by each participating laboratory.
By using the Validation Design in communicating method validation results, readers and users of the method will be easily able to understand how the validation was performed and compare validations across similar methods. The Validation Design represents the results of the validation in a succinct descriptor and is presented as [aL, bM] where aL is the number (a) of laboratories (L) that participated in the method validation, and bM is the number (b) of different matrices (M) that were used in the method validation. For example, a multi-laboratory method validation that used four laboratories, where each laboratory evaluated three different matrices would be reported as [4L, 3M], The number of laboratories used in any given method validation may vary from a single laboratory to multi-laboratory validation, depending on what was determined to be appropriate in activities preceding method validation (see Section 2.0). The number of different representative matrices used for validation samples can also vary. Each matrix should be different and, as previously described, be typical of matrix types applicable to the method under validation. Pertinent references are available to aid in the design and conduct of an interlaboratory or single laboratory validation study (7-9). Results from the interlaboratory or single laboratory study should be statistically evaluated to determine if the performance of the method is acceptable. These same references provide an excellent resource for performing and evaluating these calculations. It is not the intent of this document to provide guidelines for the number of laboratories that should participate in a validation study or the number of samples or different types of matrices that should be used in each study. Useful references are available that define expectations for numbers of participating laboratories and matrices used for non-government organizations and provide reasoning behind these recommendations or requirements (7-9). In addition, the purpose 31 ------- Guidelines on Validation for Non-Regulatory Chemical and Radiochemical Methods January 2022 and future use of the method may guide the interlaboratory or single laboratory validation study needs. Rather, this document presents the Validation Design that communicates the characteristics of an interlaboratory or single laboratory method validation in an easily reported and understood format. 7.2 Method Validation Summary It is the recommendation of this document that each method validation report that is developed include a Method Validation Summary to serve as a brief synopsis of the method validation process and results. Its intent is to provide a brief overview of the validation at the front of the method validation report to provide the reader with easy access to pertinent and important information. Furthermore, final Method Validation Summary documents can be easily shared across the Agency, and then be linked or referred to the originating office for details to allow for additional collaboration and discussions, especially under emergency situations. In order to facilitate sharing of these summaries across the EPA, the format of the Method Validation Summary should be the same across all methods, such that each Method Validation Summary contains the same types of summary information and details for a method in the same structure. This will allow the reader to access needed information quickly across summaries. 
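As a purely illustrative sketch, the Validation Design descriptor described in Section 7.1 and a few summary fields of the kind collected in the Method Validation Summary could be assembled as simple structured data. The field names and values below are hypothetical examples, not a prescribed electronic format.

```python
# Minimal illustrative sketch: build the [aL, bM] Validation Design descriptor
# and a skeleton of summary fields (names and values are hypothetical).
def validation_design(n_labs: int, n_matrices: int) -> str:
    """Format the Validation Design descriptor, e.g., [4L, 3M]."""
    return f"[{n_labs}L, {n_matrices}M]"

summary = {
    "Validation Design": validation_design(4, 3),
    "Types of Matrices Tested": ["reagent water", "surface water", "groundwater"],
    "Method title": "Hypothetical example method",
    "Qualitative or Quantitative": "Quantitative",
}
print(summary["Validation Design"])   # [4L, 3M]
```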
A Method Validation Summary should be included in all method validation reports prepared across the Agency, so that every report conveys the same type of information for a given method, providing consistency across all documents and offices. Table 1 provides the structure, format, and content of the Method Validation Summary. The Method Validation Summary should be brief (e.g., two pages in length) and serve as a stand-alone document. The summary format is designed to be easy to complete and quickly capture all of the pertinent information from the Method Validation Report with minimal additional effort. Thus, method development and validation efforts should produce two documents: a Method Validation Report and a Method Validation Summary. It is anticipated that both the report and the summary for a given method would reside in the records storage of the originating office. EPA will internally publish Method Validation Summaries separately for Agency staff to access in order to inform other method development activities across the Agency. Appendix D provides an example of a completed Method Validation Summary.
Table 1. Cover template, with structure guidelines, for the Method Validation Summary
A. Validation Design (Description)
1. Number of Laboratories
2. Number of Matrices
3. Types of Matrices Tested (water, soil, sediment, etc.)
B. Method Validation Overview (Description)
1. Method title
2. Author(s) list
3. Date
4. Purpose
5. Qualitative or Quantitative
6. Target Analytes/Parameters
NOTES
C. Method Development Considerations (Description and/or Results)
1. Sample Cost
2. Sample Holding Times
3. Sample Preservation
4. Waste Generation
NOTES
D. Method Performance Characteristic (Description and/or Results)
1. Bias/Trueness
2. Detection Capability and Quantification Capability
3. Instrument Calibration
4. Measurement Uncertainty
5. Precision
6. Range
7. Ruggedness
8. Selectivity in the Presence of Interferences
NOTES
The Method Validation Summary should contain all categories listed here.
8 References 1. International Organization for Standardization. (2015). ISO 9000:2015, Quality Management Systems, Fundamentals and Vocabulary. 2. International Organization for Standardization and International Electrotechnical Commission. (2017). ISO/IEC 17025:2017, General Requirements for the Competence of Testing and Calibration Laboratories. 3. Magnusson, B., Ornemark, U. (eds.). (2014) Eurachem Guide: The Fitness for Purpose of Analytical Methods: A Laboratory Guide to Method Validation and Related Topics. (ISBN 978-91-87461-59-0). Available from www.eurachem.org. 4. ASTM International. (2016). ASTM E2857-11(2016), Standard Guide for Validating Analytical Methods. 5. Thompson, M.; Ellison, S.L.R.; Wood, R. (2002). Harmonized Guidelines for Single-Laboratory Validation of Methods of Analysis, IUPAC Technical Report. Pure Appl. Chem., 74 (5), 835-855. 6. U.S. Environmental Protection Agency (EPA) Forum on Environmental Measurements (FEM) Method Validation Team; Principal Authors: Mishalanie, E. A., Lesnik, B., Araki, R., and Segall, R. (2005). Validation and Peer Review of US Environmental Protection Agency Chemical Methods of Analysis. FEM Document Number 2005-01.
https://www.epa.gov/sites/production/files/2016- 02/documents/chemicat method guide i c\ ised 020 '< I |;df 7. ASTM International. (2019). ASTM E691-19el, Standard Practice for Conducting an Interlaboratory Study to Determine the Precision of a Test Method. 8. AO AC International. (2005). Appendix D: Guidelines for Collaborative Study Procedures to Validate Characteristics of a Method of Analysis. Official Methods of Analysis of A OA (' International. 9. ASTM International. (2019). ASTM E1601-19, Standard Practice for Conducting an Interlaboratory Study to Evaluate the Performance of an Analytical Method. 10. U.S. Environmental Protection Agency, U.S. Department of Defense, U.S. Department of Energy, U.S. Department of Homeland Security, U.S. Nuclear Regulatory Commission, U.S. Food and Drug Administration, U.S. Geological Survey, and National Institute of Standards and Technology. (2004). Multi-Agency Radiological Laboratory Analytical Protocols Manual (MARLAP). NUREG-1576. EPA 402-B-04-001C, NTIS PB2004- 105421. https://www.epa.gov/radiation/marlap-manual-and-supporting-documents 11. ASTM International. (2020). ASTM E177-20, Standard Practice for Use of the Terms Precision and Bias in ASTM Test Methods. 12. Definition and Procedure for the Determination of the Method Detection Limit. (2017). 40 CFR Part 136, Appendix B. https://www.ecfr.gov/cgi-bin/text- ic^ * M i 0:s m 11014 L'%7c5f?cd528d526a&MC=tme&no ^ M _ s s U.fg n=div9 34 ------- Guidelines on Validation for Non-Regulatory Chemical and Radiochemical Methods January 2022 13. U.S. Environmental Protection Agency. (2017). Quality Assurance Handbook for Air Pollution Measurement Systems, Volume 11, Ambient Air Quality Monitoring Program. EPA-454/B-17-001. https://www.epa.gov/sites/production/files/2Q2Q- 1 Q/documents/fmal handbook document 1 17.pdf 14. U.S. Environmental Protection Agency. (2019). Technical Assistance Document for Sampling and Analysis of Ozone Precursors for the Photochemical Assessment Monitoring Stations Program, Revision 2, April 2019. EPA 454/B-10-004 ¦https://www.epa.gov/sites/production/files/2Q19- 11/documents/pams technical assistance document revision 2 april 2019.pdf 15. Method 301 - Field Validation of Pollutant Measurement Methods from Various Waste Media. (DATE?). 40 CFR Part 63, Appendix A. https://www.epa.gov/emc/method-301- field-validation-pollutant-measurement-methods-various-waste-media 16. Winslow, SD; Pepich, BV; Martin, JJ; Hallberg, GR.; Munch, DJ; Frebis, CP; Hedrick, EJ; Krop, RA. (2006). Statistical procedures for determination and verification of minimum reporting levels for drinking water methods. Environ Sci Technol. 40 (1), 281-8. 17. U.S. Environmental Protection Agency. Technical Basis for the Lowest Concentration Minimum Reporting Level (LCMRL) Calculator. (2010). EPA-815-R-11-001. https://nepis.epa. gov/Exe/ZyPDF.cgi?Dockev=Pl 00J7CA.txt 18. Method 608.3: Organochlorine Pesticides and PCBs by GC/HSD. EPA 821-R-16-009. (2016). 40 CFR Part 136, Appendix A. https://www.epa.gov/sites/default/files/ 08/docum ents/m ethod 601 idf 19. Method 624.1: Purgeables by GC/MS. EPA 821-R-16_008. (2016). 40 CFR Part 136, Appendix A. https://www.epa.gov/sites/default/files/2017-08/documents/method 624- 1 2016.pdf 20. Method 625.1: Base/Neutrals and Acids by GC/MS. EPA 821-R-16-007. (2016). 40 CFR Part 136, Appendix A. https://www. epa.gov/ sites/default/files/ 08/docum ents/m eth od 62. idf 21. U.S. Environmental Protection Agency. (2015). 
Test Methods for Evaluating Solid Waste, Physical/Chemical Methods, EPA publication SW-846, Third Edition, Final Updates I (1993), II (1995), IIA (1994), IIB (1995), 111 (1997), 111 A (1999), II IB (2005), IV (2008), and V (2015). https://www.epa.eov/hw-sw846/sw-846-compendium 22. Currie, L.A. (1968). Limits for Qualitative Detection and Quantitative Determination: Application to Radiochemistry. Anal. Chem. 40, 586-593. 23. Currie, L.A. (1988). Detection: Overview of Historical, Societal, and Technical Issues. In L.A. Currie (Ed.), Detection and Analytical Chemistry (ACS Symp. Ser. 361) (pp. 1-62). American Chemical Society, Washington, DC. 24. Currie, L.A. (1998). Presentation of the Results of Chemical Analysis. In J. Inczedy, T. Lengyel, and A.M. Ure, (Eds.), IUPAC Compendium of Analytical Nomenclature (Chapter 2). Blackwell Science, Oxford. 35 ------- Guidelines on Validation for Non-Regulatory Chemical and Radiochemical Methods January 2022 25. Currie, L.A. (1998). Quality Assurance of Analytical Processes. In J. Inczedy, T. Lengyel, and A.M. Ure, (Eds.), IUPAC Compendium of Analytical Nomenclature (Chapter 18). Blackwell Science, Oxford. 26. Lanier, W.S. and Hendrix, C. D, (2001). Reference Method Accuracy and Precision (ReMAP): Phase 1 - Precision of Manual Stack Emission Measurements. American Society of Mechanical Engineers (ASME) Report. CRTD Vol. 60. 27. Joint Committee for Guides in Metrology. (2008). JCGM 100:2008, Evaluation of Measurement Data: Guide to the Expression of Uncertainty in Measurement. https://www.bipm.ore/iitils/common/documents/icgm/JCG s08 E.pdf 28. U.S. Environmental Protection Agency. (2012). Traceability Protocol for Assay and Certification of Gaseous Calibration Standards. EPA/600/R-12/53. https://www.epa.gov/air-research/epa-traceabilitv-protocol-assav-and-cert.ification- gaseous-calibration-standards 29. Joint Committee for Guides in Metrology. (2008). JCGM 101:2008, Evaluation of Measurement Data: Supplement 1 to the "Guide to the Expression of Uncertainty in Measurement," Propagation of Distributions Using a Monte Carlo Method. https://www.bipm.org/utils/common/documents/icgm/JCGM 101 JOOS E.pdf 30. Joint Committee for Guides in Metrology. (2011). JCGM 102:2011 Evaluation of Measurement Data: Supplement 2 to the "Guide to the Expression of Uncertainty in Measurement," Extension to Any Number of Quantities. https://www.bipm.org/utils/common/documents/icgm/JCG df 31. Ellison, S. L. R., Williams, A. (eds.). (2012). Eurachem/Co-Operation on International Traceability in Analytical Chemistry (CITAC) Guide CG 4 Quantifying uncertainty in analytical measurement. Third Edition. (QUAM:2012.P1). Available from www.eurachem .org 32. ASTM International. (2019). ASTM D8293-19, Guide for Evaluating and Expressing the Uncertainty of Radiochemical Measurements. 33. National Institute of Standards and Technology (NIST). Uncertainty of Measurement Results. Retrieved March 31, 2020, from https://phvsics.nist.gov/ciiu/Uncertaintv/index.html 34. The NIST Traceable Reference Material Program for Gas Standards. (2013). NIST Special Publication 260-126, available at https://nvlpiibs.nist.gOv/nistpiibs/SpecialPublications/.NIST.SP.260-126rev2013.pdf. 35. ASTM International. (2018). ASTM El 169-18, Standard Practice for Conducting Ruggedness Tests. 36. Youden, W.J. (1975). Statistical Techniques for Collaborative Tests. In: Statistical Manual of the AOAC, AO AC International, Washington, DC, pp. 33-36. 37. Plackett, R.L. and Burman, J.P. (1946). 
The Design of Optimum Multifactorial Experiments. Biometrika, 33(4), pp.305-325. 36 ------- Guidelines on Validation for Non-Regulatory Chemical and Radiochemical Methods January 2022 38. ASTM International. (2012). ASTM E1488-12, Standard Guide for Statistical Procedures to Use in Developing and Applying Test Methods. 39. Belouafa, S., Habti, F., Benhar, S., Belafkih, B., Tayane, S., Hamdouch, S., ... & Abourriche, A. (2017). Statistical Tools and Approaches to Validate Analytical Methods: Methodology and Practical Examples. International Journal of Metrology and Quality Engineering, 8, 9. 40. Lynch, J. M. (1998). Use of AO AC International Method Performance Statistics in the Laboratory. Journal of AO AC International, 57(3), 679-684. 41. Ravisankar, P., Navva, C. N., Pravallika, D., & Sri, D. N. (2015). A Review on Step-by- Step Analytical Method Validation. IO SR Journal of Pharmacy, 5( 10), 7-19. 42. Wernimont, G. Use of Statistics to Develop and Evaluate Analytical Methods; AOAC: Arlington, VA, 1985. (Fourth printing 1993; ISBN 0-935584-31-5) 43. U.S. EPA. (2000). Guidance for Data Quality Assessment: Practical Methods for Data Analysis. (EPA QA/G-9; QA00 Update) (EPA/600/R-96/084). Washington, DC; US EPA, Office of Environmental Information, https://www.epa.gov/sites/production/files/2015- 06/docum ents/g9-final. pdf 44. U. S. EPA. (2006). Data Quality Assessment: Statistical Methodfor Practitioners. (EPA QA/G- 9S) (EPA/240/B-06/003). Washington, DC; US EPA, Office of Environmental Information. https://www.epa.gOv/sites/production/files/2 /documents/g9s-final.pdf 37 ------- Appendix A Method Validation Matrix Considerations for Individual EPA Offices A-l ------- U.S. EPA Office of Water, Office of Groundwater and Drinking Water Drinking Water Method Development and Validation General validation guidelines for types of matrices and number of laboratories used are provided below. Note: Method validation is incorporated as part of the method development process. Matrix Testing (three, including reagent water) • Reagent water or laboratory water is used as a matrix baseline (no expected matrix effects). • Finished drinking water matrices generally fall into two categories: a surface source tap water (high organic carbon content) and a ground source tap water (high hardness). Usually, at least one of each is used as a test matrix. Note: Some methods analyze for target analytes that may only appear in one type of matrix, e.g., cyanotoxins that would only exist in finished water from surface water sources. In those cases, the type of matrix used in method validation may only include reagent water and finished drinking water from different sources of the same type. Similarly, if specific matrices are expected to have significant effects on method measurements, more of those types of test matrices may be evaluated, e.g., the effect of total organic carbon from surface water sources on early eluting organic analytes in liquid chromatography. Multi-laboratory Study (three, including in-house method performance) • At a minimum for statistical purposes, three laboratories are used for drinking water method validation. More may be used but it depends on availability and the resources of the laboratories since the participation of laboratories is voluntary. A-2 ------- U.S. EPA Office of Water, Office of Science and Technology Wastewater Matrix Types Recommended for Multiple Matrix Type Validation Studies 1. Effluent from a publicly owned treatment works (POTW) 2. 
ASTM D5905-98 (Reapproved 2018), Standard Specification for Substitute Wastewater 3. Sewage sludge, if sludge will be in the permit 4. ASTM D1141 - 98 (Reapproved 2021), Standard Specification for Substitute Ocean Water, if ocean water will be in the permit 5. Untreated and treated wastewaters up to a total of nine matrix types (see https://www.epa.gov/eg/industrial-effluent-guidelines for a list of industrial categories with existing effluent guidelines) At least one of the above wastewater matrix types should have at least one of the following characteristics: • Total suspended solids (TSS) greater than 40 mg/L • Total dissolved solids (TDS) greater than 100 mg/L • Oil and grease greater than 20 mg/L • NaCl greater than 120 mg/L • CaC03 greater than 140 mg/L A-3 ------- U.S. EPA Office of Air and Radiation, Office of Air Quality Planning and Standards Sample Matrix Considerations for Validation of Emission Test Methods Section 17.1.1 of Method 301 (40 CFR 63, Appendix A), a protocol for validation of stationary source emission test methods, recognizes that validation of a method at a 'similar source' with a similar emission matrix may be adequate to justify application of a candidate method to other similar sources. Because of the wide range of sample matrices that may be encountered from emission sources, there are no formal guidelines in place for addressing matrix differences. However, there is general agreement among those in the emission measurement community who have developed and validated methods that there are a number of sample matrix constituents that must be considered/assessed in determining the breadth of applicability of a particular candidate method. These include: • Acid Gases including NOx, SO2, HC1, HF, and H2SO4 • Other Reactive Gases including NH3, sulfur compounds • Particulate Matter including carbon, metals, and salts • Organic compounds • High moisture If it is possible and resources allow, it is preferable to evaluate the performance of the candidate method on samples from a number of types of emission sources to include various combinations of these matrix constituents and thus challenge the method capability. However, the cost of mobilization and sampling for collection of emission samples for validation purposes can often exceed $100,000, so this is often not a possibility. Thus, Section 14.0 of Method 301 suggests use of ruggedness testing1 as a potential tool to collect data to support a broader application of the candidate method. Sampling for ruggedness testing of emission test methods can often be conducted in the laboratory and multiple variables evaluated using a set of nine test runs, which can be a significant cost savings. 1 Youden, W.J. Statistical techniques for collaborative tests. In: Statistical Manual of the Association of Official Analytical Chemists, Association of Official Analytical Chemists, Washington, DC, 1975, pp. 33-36. A-4 ------- U.S. EPA Office of Resource Conservation and Recovery Matrix Types for SW-846 Method Validation: • Sample matrices should be selected to represent those regulated under the Resource Conservation and Recovery Act (RCRA) (e.g. soil, oily waste, wastewater). • Developers should analyze different types of matrices included in the scope of the method. Matrix types refer to different matrices within a particular medium, e.g. water, soil and ash. 
Appropriate RCRA matrix types might include:
o Aqueous: groundwater, toxicity characteristic leaching procedure (TCLP) leachate, and wastewater
o Soil: sand, loam, and clay soils
o Ash: bottom ash, fly ash, and/or combined ash
• Samples should be well-characterized reference materials and/or spiked matrices containing known amounts of target analytes.
• Bulk samples should be carefully homogenized where appropriate to reduce sub-sampling errors.
Summarized from: https://www.epa.gov/sites/production/files/2015-10/documents/methdev.pdf
A-5 ------- Appendix B Analyte Detection and Quantitation
B-1 ------- The purpose of analyzing an environmental sample may be either (1) to make a qualitative decision about the presence or absence of a particular analyte in the sample or (2) to quantify the amount of analyte that is present. The first of these goals is referred to as detection, while the second is called quantitation (or quantification). One analysis can often serve both purposes simultaneously.
Detection
When analyte detection is of interest, two relevant aspects of the analytical method are its detection rule and its detection capability. The detection rule is the statistical test that is used to determine whether the measurement data justify a decision that the analyte is present in a sample. Typically, the test is designed to limit the probability of a false detection in a truly analyte-free sample to a specified small value such as 1 % or 5 %. This probability of false detection is the significance level of the test, often denoted by α. The detection rule is usually implemented as a straightforward comparison of the measured result to a calculated threshold value. This detection threshold goes by various names, including critical level, critical value, decision threshold, and decision level, to name a few. The method detection limit (MDL) defined in 40 Code of Federal Regulations (CFR) 136 Appendix B is also an example. The procedures for calculating the detection threshold may be program-specific.
The detection capability of a measurement process—its ability to detect the analyte—is often described in terms of the minimum detectable value, which is defined as the true value of the analyte that must be present in a sample to ensure a specified high probability of detection using the given detection rule. This probability is often denoted by 1 − β, where β denotes the probability of a "false negative" result (non-detection). Assuming the detection rule involves comparison to a detection threshold, the minimum detectable value is the smallest true value of the analyte needed to ensure a specified probability 1 − β of observing a result greater than the detection threshold.
Detection capability may be relevant even when the question to be answered is not whether the analyte is present but whether it exceeds a specified action level, as long as the expected background level of the analyte in typical samples is very low relative to the action level. In this case, to ensure adequate measurement capability at the action level, it may suffice to ensure adequate detection capability.
Note—The term detection limit, depending on the field of measurement, may be used to mean either a detection threshold or a minimum detectable value, creating opportunities for miscommunication between workers in different fields. The same term is also defined in 40 CFR 141.25(c) with a somewhat different definition in the context of measuring radionuclides in drinking water.
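To make these concepts concrete, the short sketch below illustrates one common way the quantities defined above can be computed under the normality assumptions used in Figure B1 below. It is an illustration only, not an EPA-prescribed procedure: the standard deviation, α, β, and replicate values are hypothetical, and the final calculation simply shows the general Student's-t form used for the MDL in 40 CFR 136 Appendix B.

```python
# Illustrative sketch (hypothetical inputs): a blank-based detection threshold,
# the corresponding minimum detectable value, and a replicate-based MDL of the
# general Student's-t form used in 40 CFR 136 Appendix B.
from statistics import stdev
from scipy.stats import norm, t

# --- Detection threshold (critical value) and minimum detectable value ---
sigma0 = 0.15            # assumed standard deviation for analyte-free samples, ug/L
alpha = 0.01             # allowed false-detection probability
beta = 0.05              # allowed false-negative probability

x_c = norm.ppf(1 - alpha) * sigma0        # detection threshold, x_C
# If the measurement standard deviation near x_D is about the same as at zero,
# the minimum detectable value is the true concentration that yields a result
# above x_C with probability 1 - beta:
x_d = x_c + norm.ppf(1 - beta) * sigma0   # minimum detectable value, x_D

# --- Replicate-based MDL (Student's t multiplier on the replicate std. dev.) ---
low_level_spikes = [0.42, 0.51, 0.38, 0.47, 0.44, 0.40, 0.49]   # hypothetical results, ug/L
s = stdev(low_level_spikes)
mdl = t.ppf(0.99, df=len(low_level_spikes) - 1) * s

print(f"x_C = {x_c:.2f} ug/L, x_D = {x_d:.2f} ug/L, MDL = {mdl:.2f} ug/L")
```

Individual programs may prescribe different estimators (for example, blank-based versus spiked-replicate-based), which is one reason the procedures are described in this appendix as program-specific.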
B-2 ------- Assuming the measurement process produces normally distributed results with negligible bias and a well-characterized standard deviation, the relationship between the detection threshold and the minimum detectable value is illustrated by Figure B1.
Figure B1 - Detection threshold, xC, and minimum detectable value, xD (two measurement distributions plotted against analyte concentration)
The curve centered at x = 0 represents the distribution of measurement results that can be expected when analyte-free samples are analyzed. A specified percentile of this distribution is identified as the detection threshold, xC. The analyte is considered to be detected if the measured result exceeds the detection threshold. So, for example, if the threshold is set at the 99th percentile of the distribution (as shown in Fig. B1), then 99 % of all measured results for analyte-free samples should be less than the detection threshold, and approximately 1 % should exceed it, producing false detections.
The second curve, on the right, represents the distribution of results that can be expected when the true level of the analyte in a sample equals the minimum detectable value, xD. For such a sample, there is a specified high probability, 1 − β, of obtaining a result greater than the detection threshold. For example, if the minimum detectable value is defined as the analyte level at which the probability of detection (1 − β) equals 95 %, then the area under the curve to the right of the detection threshold is 95 %, and the area to the left (β) is 5 % (as shown in Fig. B1).
When the measurement standard deviation is not well known and must be estimated—by replicate measurements, for example—Fig. B1 is not completely valid but may still be illustrative.
Quantitation (Quantification)
When quantitation is of interest, the most relevant characteristic of the measurement process is its quantification capability, which is typically defined as the smallest true value of the analyte that ensures a specified acceptable level of relative measurement precision or uncertainty. This value of the analyte might be called by various names, including limit of quantitation (LOQ), quantitation limit, quantification limit, or minimum quantifiable value. The procedures for estimating quantification capabilities may be program-specific.
Note—As used here, the terms quantitation and quantification have identical meanings. Each has been used extensively in the literature—for example, in the terms listed above.
B-3 ------- Quantitation is typically of interest whenever decisions are being made about the average analyte concentration in a sampled population rather than an individual environmental sample or specimen. It may also be of interest if the action level for decision-making about an individual sample does not greatly exceed the expected background level of the analyte in the sample matrix.
B-4 ------- Appendix C Detection and Quantitation Limit Definitions
C-1 ------- Appendix C includes some of the detection and quantitation limit definitions that are used by EPA programs and appear in the literature. This is not a comprehensive list.
C-2 ------- Each entry below gives the term (Type), its full name, an explanation, and notes where applicable.
AML (Alternate Minimum Level): A regression approach that provides for the case of nonconstant variance throughout the instrument calibration working range; calculated Currie-type Lc, Ld, and Lq; Lq is based on the standard deviation only (C1).
IDL (Instrument Detection Limit): The concentration equivalent to a signal, due to the analyte of interest, which is the smallest signal that can be distinguished from background noise by a particular instrument. The IDL should always be below the MDL, and is not used for compliance data reporting, but may be used for statistical data analysis and comparing the attributes of different instruments. The IDL is similar to the "critical level" and "criterion of detection" as defined in the literature (C2). Notes: Similar to an MDL but is only intended as an instrumental measurement; samples are not processed through the entire method.
IQEZ% (Interlaboratory Quantitation Estimate): Regression approach that provides for nonconstant variance throughout the working range; an interlaboratory quantitation level is determined on the basis of the use of the standard deviation only (C3).
Lc, Ld, Lq (Critical Level, Detection Level, Quantitation Level): Lc, critical level (low false positive error); Ld, detection level (low false negative error); Lq, quantitation level, defined as a multiple (default 10 times) of the standard deviation; the standard deviation is determined from method blank replicates (C4, C5).
LCMRL (Lowest Concentration Minimum Reporting Level): The lowest true concentration for which the future recovery is predicted to fall between 50% and 150% with 99% confidence (C6). Notes: Typically used by EPA internally (UCMR Program); it takes into account both precision and accuracy.
C-3 -------
LLOQ (Lower Limit of Quantitation): The LLOQ is the lowest concentration at which the laboratory has demonstrated that target analytes can be reliably measured and reported with a certain degree of confidence, which must be > the lowest point in the calibration curve.
LOD (Limit of Detection): The lowest concentration level that can be determined to be statistically different from a blank (99% confidence). The LOD is typically determined to be in the region where the signal-to-noise ratio is greater than 5. Limits of detection are matrix, method, and analyte specific (C2). Notes: 5:1 signal to noise.
LOQ/MQL (Limit or Level of Quantitation/Minimum Quantitation Level): The level above which quantitative results may be obtained with a specified degree of confidence. The LOQ is mathematically defined as equal to 10 times the standard deviation of the results for a series of replicates used to determine a justifiable LOD. Limits of quantitation are matrix, method, and analyte specific (C2). Recommended LOQ = 10σ, where σ is the standard deviation of the samples (C7). Typically, it is the concentration that produces a signal 10σ above the reagent water blank signal, and it should have a defined precision and bias at that level (C8). Notes: 10 times the standard deviation, or 10:1 signal to noise.
LT-MDL and LRL (Long-Term MDL, Laboratory Reporting Level): LT-MDL is calculated as the MDL of Glaser et al.; additional variance is included from multiple instruments, different matrices, and over time; LRL = 2(LT-MDL) (C9).
C-4 -------
MDL (Method Detection Limit): The MDL is defined as the minimum measured concentration of a substance that can be reported with 99% confidence that the measured concentration is distinguishable from method blank results. Reference C11 contains the necessary equations for calculating MDLs (C10, C11). Notes: Does not take into account accuracy; the resulting concentration may significantly deviate from the true concentration.
ML (Minimum Level of Quantitation): The term "minimum level" refers to either the sample concentration equivalent to the lowest calibration point in a method or a multiple of the MDL, whichever is higher. Minimum levels can be obtained in several ways: they may be published in a method; they may be based on the lowest acceptable calibration point used by a laboratory; or they may be calculated by multiplying the MDL in a method, or the MDL determined by a laboratory, by a factor of 3. For the purposes of NPDES compliance monitoring, EPA considers the following terms to be synonymous: "quantitation limit," "reporting limit," and "minimum level" (C12).
C-5 -------
MRL (Minimum Reporting Level/Limit): The minimum concentration that can be reported by a laboratory as a quantified value for the method analyte in a sample following analysis. This concentration must meet MRL confirmation criteria in the method and must be no lower than the concentration of the lowest calibration standard for each method analyte (C13). Data quality objectives are specified for accuracy (70-130% recovery) and precision (10% RSD), though independently applied (C14). MRL = 3 × MDL, where MDL = t·s, t is a Student's t value, and s is the estimated standard deviation (C15). Notes: For the EPA MRL, the concentration is assigned by EPA but is based on a multi-laboratory study using the LCMRL (precision and accuracy). Current EPA DW methods require an MRL confirmation, which also takes into account precision and accuracy.
PQL (Practical Quantitation Limit): A quantitation limit that represents a practical and routinely achievable quantitation limit with a high degree of certainty (>99.9% confidence) in the results. The PQL appears in older Department of Natural Resources literature and in some current EPA methods; however, its use is being phased out by the DNR (C2). Notes: The PQL, which is about three to five times larger than the MDL, is a practical and routinely achievable detection level with a relatively good certainty that any reported value is reliable (C16).
C-6 -------
PQLs (Practical Quantitation Levels): Criteria of quantitation level set for accuracy at ±40% and for precision at less than 20% RSD (C17).
QLs and RDL (Quantitation Levels and Reliable Detection Level): Interlaboratory quantitation level: the median interlaboratory MDL is multiplied by a variable determined from laboratory performance data (values of approximately 4-7); RDL = 2(MDL) (C18).
yc, xD (Hubaux-Vos Detection Limits): yc is the decision limit corresponding to Currie's Lc; xD is a detection limit corresponding to Currie's Ld; calibration design for detection limits; based on the standard deviation only (C19).
C-7 -------
C1 Gibbons, R. D., Coleman, D. E., Maddalone, R. F. (1997). An Alternative Minimum Level Definition for Analytical Quantification. Environmental Science & Technology, 31(7), 2071-2077. doi:10.1021/es960899d.
C2 Ripp, J. (1996). Analytical Detection Limit Guidance & Laboratory Guide for Determining Method Detection Limits. (PUBL-TS-056-96). Madison, WI: Wisconsin Department of Natural Resources.
C3 ASTM International. (2014). ASTM D6512-07, Standard Practice for Interlaboratory Quantitation Estimate. West Conshohocken, PA: ASTM International.
C4 Currie, L. A. (1968). Limits for qualitative detection and quantitative determination. Application to radiochemistry. Analytical Chemistry, 40(3), 586-593. doi:10.1021/ac60259a007.
C5 Currie, L. A. (1999). Detection and quantification limits: origins and historical overview. Analytica Chimica Acta, 391(2), 127-134. doi:10.1016/S0003-2670(99)00105-1.
C6 Winslow, S. D., Pepich, B. V., Martin, J. J., Hallberg, G. R., Munch, D. J., Frebis, C. P., Krop, R. A. (2006). Statistical Procedures for Determination and Verification of Minimum Reporting Levels for Drinking Water Methods. Environmental Science & Technology, 40(1), 281-288. doi:10.1021/es051069f.
C7 Keith, L. H., Crummett, W., Deegan, J., Libby, R. A., Taylor, J. K., Wentler, G. (1983). Principles of environmental analysis. Analytical Chemistry, 55(14), 2210-2218. doi:10.1021/ac00264a003.
C8 American Public Health Association (APHA), American Water Works Association (AWWA), Water Environment Federation (WEF). (2017). 1010 INTRODUCTION. Standard Methods for the Examination of Water and Wastewater. Retrieved from https://www.standardmethods.org/doi/abs/10.2105/SMWW.2882.004.
C9 Childress, C. J. O., Foreman, W. T., Connor, B. F., Maloney, T. J. (1999). New reporting procedures based on long-term method detection levels and some considerations for interpretations of water-quality data provided by the US Geological Survey National Water Quality Laboratory. US Geological Survey Open-File Report 99-193.
C10 Glaser, J. A., Foerst, D. L., McKee, G. D., Quave, S. A., Budde, W. L. (1981). Trace analyses for wastewaters. Environmental Science & Technology, 15(12), 1426-1435. doi:10.1021/es00094a002.
C11 U.S. Environmental Protection Agency. (2016). Definition and Procedure for the Determination of the Method Detection Limit, Revision 2. (EPA 821-R-16-006). Washington, DC: U.S. Environmental Protection Agency, OW/OST/EAD.
C12 U.S. Environmental Protection Agency. (2016). Method 608.3: Organochlorine Pesticides and PCBs by GC/HSD. (EPA Document No. 821-R-16-009). Washington, DC: US Environmental Protection Agency, OW/OST/EAD.
C-8 -------
C13 Adams, W. A., Wendelken, S. C. (2015). EPA Method 545: Determination of cylindrospermopsin and anatoxin-a in drinking water by liquid chromatography electrospray ionization tandem mass spectrometry (LC/ESI-MS/MS). (EPA Document No. 815-R-15-009). Cincinnati, OH: U.S. Environmental Protection Agency, OGWDW/SRMD/TSC.
C14 Hertz, C., Brodovsky, J., Marrollo, L., Harper, R. (1992). Minimum Reporting Levels Based on Precision and Accuracy for Inorganic Parameters in Water. Proc. WQTC, Toronto.
C15 Urbansky, E. T. (2000). Perchlorate in the Environment (Vol. 440). Springer.
C16 APHA, AWWA, WEF. (2017). 1030 DATA QUALITY. Standard Methods for the Examination of Water and Wastewater. Retrieved from https://www.standardmethods.org/doi/abs/10.2105/SMWW.2882.006.
C17 Oxenford, J. L., McGeorge, L. J., Jenniss, S. W. (1989). Determination of Practical Quantitation Levels for Organic Compounds in Drinking Water. Journal - AWWA, 81(4), 149-154. doi:10.1002/j.1551-8833.1989.tb03193.x.
C18 Sanders, P. F., Lippincott, R. L., Eaton, A. (1996). Determining quantitation levels for regulatory purposes. Journal - AWWA, 88(3), 104-114. doi:10.1002/j.1551-8833.1996.tb06523.x.
C19 Hubaux, A., Vos, G. (1970). Decision and detection limits for calibration curves. Analytical Chemistry, 42(8), 849-855. doi:10.1021/ac60290a013.
C-9 ------- Appendix D Example Method Validation Summary and Associated Full Method Validation Report
D-1 ------- Appendix D provides an example of a Method Validation Summary developed from a Method Validation Report previously prepared by EPA Region 7.
This is meant to provide guidance on how a Method Validation Summary is constructed and derived based on Table 1 (in Section 7.2 of the document), the information provided in Section 7.2, and the full Method Validation Report. Below is the example Method Validation Summary. Though information for an individual Method Validation Summary will change based on the method it is describing, the same table should be used to prepare all Method Validation Summaries. Note that the Method Validation Report provided following the Method Validation Summary serves to provide background for the construction of the Method Validation Summary and is not meant to dictate how Method Validation Reports should be prepared. The format of such reports will vary and should be based on program requirements. Example Method Validation Summary EPA Region 7 "Stir Bar Sorptive Extraction (SBSE or Twister™) Collaborative Research Project A Validation Design Description 1 Number of Laboratories 1 2 Number of Matrices 1 (surface water) 3 Types of Matrices Tested (water, soil, sediment, etc.) Surface water in the Kansas City Urban area; Tested three locations on 12 different streams in the Kansas City area. Noted that 1 -2 streams had high chlorine which impacted IS results. B Method Validation Overview Description 1 Method title Stir Bar Sorptive Extraction (SBSE or Twister) 2 Author(s) list Lorraine Iverson (Kimball), EPA Region 7 Science and Technology Center 3 Date January 8, 2010 (Final Internal Report with Attachments— Region 7) 4 Purpose Test new sorptive extraction technique that reduces the use of methylene chloride while providing better sample results. Develop alternative test procedure for polycyclic aromatic hydrocarbons. 5 Qualitative or Quantitative Quantitative 6 Target Analytes/Parameters 66 (45) semi-volatile organic compounds including PAHs (18), 17 (14) pesticides, 4 pharmaceutical and personal care products, 5 brominated flame retardants NOTES Have expanded the list to include selected herbicides. Evaluating in-situ sample collection. C Method Development Considerations Description and/or Results D-2 ------- 1 Sample Cost Significant reduction in costs for sample shipment, waste disposal, and solvent purchases; Annualized savings over traditional techniques of up to $2162 in solvent and glassware costs and 75% reduction in shipping costs 2 Sample Holding Times Tested for holding time—results good for 14 days without preservation 3 Sample Preservation Tested for holding time—results good for 14 days without preservation 4 Waste Generation Significant reduction in solvent usage and corresponding waste disposal; Annualized savings of up to 32 gallons of solvent, hundreds of glassware NOTES D Method Performance Characteristic Description and/or Results 1 Bias/Trueness Met SW846 8270 and EPA 625 criteria 2 Detection Capability and Quantification Capability Detection limit is 10-100 times lower than SW-846 8270 and EPA 625, pesticide results are comparable to 608 by gas chromatography/electron capture detection 3 Instrument Calibration For polycyclic aromatic hydrocarbons: Linearity of the calibration curves was excellent for the range of 0.2 ug/L to 8 ug/L - a factor of 40. 
Overall Summary: Linear range varied from 40-fold to only 4-fold 4 Measurement Uncertainty Excellent internal standard area reproducibility, at <10% with no interferents 5 Precision Met SW846 8270 and EPA 625 criteria 6 Range 0.1-20 (ig/L 7 Ruggedness Eight extraction parameters were tested: liners, split flow rates, range of sample volumes and stir times, temperature for desorption, extraction additives (methanol or salt), immediate removal or wait time, reanalysis of stir bar for removal rates 8 Selectivity in the Presence of Interferences Consistent with traditional semi-volatile organic compound and pesticide methods on gas chromatography/mass spectrometry NOTES This method has also been tested on three water sources as part of a multi-laboratory study and is one of the accepted solid phase extraction techniques in the updated EPA Method 625. D-3 ------- The following is the Method Validation Report, previously prepared by EPA Region 7, that was used to construct the Method Validation Summary presented above. This Report is being used to illustrate the development of a Method Validation Summary and is not indicative of how a Method Validation Report must be prepared as such reports should be structured based on program-specific requirements. D-4 ------- Stir Bar Sorptive Extraction (SBSE or Twister™) Collaborative Research Project July 2009 to September 2009 Final Internal Report with Attachments Margie St. Germain Lorraine Iverson ENSV/CARB/ORCS Jeff Robichaud Laura Webb ENSV/EAMB January 8, 2010 D-5 ------- Executive Summary EPA Region 7 completed a 90 day research project to test the feasibility of using a new and novel technology: Stir Bar Sorptive Extraction (SBSE or Twister™) for aqueous environmental samples. We accomplished the tasks that we planned and generated data to show the limits of feasibility using surface water samples. Twisters™ work well for neutral compounds that have limited solubility in water. Because of the benefits of this technology, we would like to purchase the equipment, implement sample analysis for PAHs while continuing work with other compounds. Accompli shments • Analyzed 66 Semi-volatile compounds, meeting method requirements for 42 compounds, including all 18 polyaromatic hydrocarbons. • Analyzed 17 UAA pesticides, meeting method requirements for 14 compounds. • Analyzed 4 pharmaceutical and personal care products with mixed success because of their solubility in water. • Analyzed 5 brominated flame retardants with an indication of good success. • Completed the documentation necessary for an Alternate Test Procedure for polyaromatic hydrocarbons. • Drafted a Standard Operating Procedure for future use. Benefits of Technique • Significant reduction in solvent usage and corresponding waste disposal. • Significant reduction in staff time to perform extraction and analysis. • Significant reduction in costs for sample shipment, waste disposal, and solvent purchases. • Significant reduction in staff exposure to repetitive movements and toxic solvents. • Faster turn-around time for emergency response and screening unknowns. • Field portability of the extraction process. • Ability to analyze semi-volatile compounds, pesticides, and brominated flame retardants in a single aqueous sample. Possible Future Direction • Re-evaluate 12 semi-volatile compounds at a lower concentration to meet method detection criteria. • Evaluate remaining pesticides, and gather data for an ATP for pesticides. 
• Select an appropriate list of compounds and perform the analyses needed for the ATP for the brominated flame retardants. • Identify other environmental contaminants that would be amenable to this novel technology. • Investigate LC/MS methods and instrumentation in order to analyze samples for the water soluble contaminants that are not amenable to Twister™. D-6 ------- Outline Executive Summary 2 I. Introduction 4 II. Background 4 III. Detailed Overview A. Field Procedures 7 B. Overview of Conditions and Compounds Evaluated 7 C. Sample Preparation Procedures 11 D. Analysis Procedures 11 E. Analytes Tested and Overall Results 1. Semi-volatile Organic Compounds 13 2. Polyaromatic Compounds 17 3. UAA Pesticides 19 4. Pharmaceutical and Personal Care Products 20 5. Brominated Flame Retardants 21 F. Holding Time Study for Semi-volatile Organic Compounds 21 G. Comparability Study 24 H. Matrix Interferences 25 IV. Cost Benefit Analysis 25 V. Conclusions 27 VI. Future Work and Development 30 VII. Acknowledgement and Disclaimer 31 VIII. References 32 IX. Attachments A. Overview of Project Activities A-l B. SVOC Data B-l C. PAH Data C-l D. UAA Pesticide Data D-l E. PPCP and BFR Data E-l F. Holding Time Study Data F-l G. Comparability Study Data G-l H. ATP for PAHs and selected SVOCs H-1 I. Draft SOP for Twister 1-1 D-7 ------- I. Introduction On July 6, 2009, EPA Region 7 entered into a loan agreement with Gerstel, Inc. for a period of 60 days which was later extended to 90 days. During this period we had free use of the Gerstel equipment to perform studies related to environmental analysis of surface waters. The Gerstel equipment included the autosampler, thermal desorption unit (TDU), cooled injection system (CIS), 15 position stir plate, and 10 Twisters™. We purchased an additional 40 Twisters™ for this project. The initial focus of the study was on the semi-volatile compound list, hoping to obtain satisfactory recovery and precision data to warrant an Alternate Test Procedure for the analysis in semi-volatile compounds in water. When the loan agreement was extended, we shifted our focus to add several compounds of concern for urban stream studies. These compound classes included traditional pesticides, polyaromatic hydrocarbons (PAHs), selected pharmaceutical and personal care products (PPCPs), and brominated flame retardants (BFRs). This report summarizes the work and the findings of this 90 day project, and provides our recommendations for future work and directions. The report details are summarized in the attachment while the body of this report provides a detailed overview. The attachments are scientific reports of the actual work, and should be able to stand on their own for the specific topic. II. Background Traditional extraction techniques suffer from several problems and limitations. First, the technique uses dichloromethane, a toxic solvent. The preferred technique is a liquid/liquid extraction using a mechanical shaking motion. This technique depends on comparatively large volumes of dichloromethane as the solvent (approximately 500 milliliters). Most laboratories are moving towards "greener" techniques including liquid-solid extraction (C- 18 impregnated Teflon discs or columns), micro-extraction, and solid phase micro-extraction (SPME). Under executive order 13423, section 2. (e.) (i.), all EPA laboratories are charged with reducing the use of 15 toxic chemicals, including dichloromethane. Second, the liquid/liquid extraction technique is not efficient in either time or labor. 
This is a relatively labor-intensive process, taking roughly 8 hours to complete an extraction of a set of 12 samples. Shaking 1 liter (L) of sample with solvent is physically demanding, and the effort is multiplied by the number of samples extracted. Additional time is needed to perform cleanup steps, if needed, and to concentrate the extract to 1 milliliter (mL), resulting in staff time of 2-4 days for a set of 12 samples.
Third, the sensitivity is moderate, and occasionally it is barely sufficient for the lower action levels. One liter of sample is extracted and concentrated to 1 mL of extract. Then, only 1 microliter (µL) of the final extract is injected for analysis. This translates to essentially extracting and analyzing only 1 milliliter of sample (a short worked comparison with the Twister™ approach appears at the end of this section). Our reporting limits for most analytes are 2 micrograms per liter (µg/L).
Fourth, our traditional extraction technique is not portable. It cannot be done anywhere except within a permanent laboratory. Therefore, field samplers must collect the samples in one-gallon D-8 ------- glass containers and transport them back to the laboratory. These samples are very heavy, and if they need to be shipped, the cost of shipping is high. The field samplers must also be cognizant of the seven-day holding time from the time of collection to the time of extraction. The field samplers do not have the luxury of keeping the samples with them for several days until it is convenient to return to the laboratory.
Stir bar sorptive extraction (SBSE or Twister™) is a relatively new technique to extract organic compounds from aqueous or other liquid samples for analysis by instrumentation such as gas chromatography/mass spectrometry (GC/MS). Twister™ employs an adsorptive coating on a magnetic stir bar. The technique is similar to solid phase micro-extraction (SPME) in theory, but in practice it is considerably different due to the difference in physical design. The Twister™ technique provides enhanced sensitivity over traditional extraction techniques such as liquid/liquid because the complete extracted fraction is quantitatively introduced into the GC/MS system by thermal desorption. While stirring the sample solution, the Twister™ efficiently extracts organic compounds from aqueous or other liquid samples such as milk. After extracting analytes from the sample, each Twister™ is placed into a sealed Twister™ desorption liner, placed in a TDU, and introduced directly into the GC/MS. Additionally, Twister™ has an advantage over SPME due to the larger amount of sorptive material (polydimethylsiloxane) in contact with the sample on a physically stable form.
Currently, this technique is used in the food manufacturing industry for quality control purposes. There has not been an extensive evaluation of this technique in the environmental chemistry industry in the United States. However, there are three articles published from European sources which show that Twister™ is promising in the area of environmental chemistry. EPA action limits have traditionally been set at the analytical method detection limits, parts per billion (ppb). Recently, EPA has been shifting towards human risk values as action levels, requiring that the analysis be pushed to lower concentrations, parts per trillion (ppt). In a publication about research in Germany using the Twister™ [1], the method detection limits for PAHs are in the low nanograms per liter (ng/L) range instead of the traditional low µg/L range. These lower detection limits approach the concentrations related to risk factors for humans.
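Before turning to the other publications, a rough back-of-the-envelope comparison helps show why desorbing the complete extracted fraction can lower detection limits by orders of magnitude relative to injecting 1 µL of a 1 mL extract. This is a sketch only: the volumes are the nominal ones quoted above, and the assumed Twister™ sorption recovery is illustrative rather than measured.

```python
# Rough, illustrative comparison of analyte mass reaching the GC/MS for the two
# workflows described above. Volumes are the nominal values quoted in this
# section; the 50% Twister(TM) recovery is an assumption for illustration only.
conc_ng_per_mL = 2.0                      # hypothetical sample concentration (2 ug/L = 2 ng/mL)

# Traditional liquid/liquid: 1 L sample -> 1 mL extract, 1 uL injected.
sample_mL, extract_mL, injected_uL = 1000.0, 1.0, 1.0
extracted_ng = conc_ng_per_mL * sample_mL                  # assume essentially complete extraction
on_column_ng_lle = extracted_ng * (injected_uL / 1000.0) / extract_mL

# Twister(TM)/SBSE: a small aliquot is stirred, then the entire sorbed fraction
# is thermally desorbed into the GC/MS.
aliquot_mL = 5.0
assumed_recovery = 0.5                                     # illustrative sorption recovery
on_column_ng_sbse = conc_ng_per_mL * aliquot_mL * assumed_recovery

print(f"liquid/liquid: ~{on_column_ng_lle:.1f} ng on column")
print(f"SBSE (5 mL aliquot, 50% assumed recovery): ~{on_column_ng_sbse:.1f} ng on column")
# ~2 ng versus ~5 ng here; with a 100 mL aliquot the SBSE figure grows
# proportionally, consistent with the much lower detection limits reported.
```

The same logic explains why the larger 100 mL aliquots with long stir times were reserved for the lowest-concentration (parts-per-trillion) work described later in this report.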
Currently, pesticides and polychlorobiphenyls (PCBs) are analyzed by a non-specific method of gas chromatography (GC) to reach the medium ng/L concentrations required in the regulations. A second publication from Spain [2] shows data that the method detection limits from Twister™ and gas chromatography/mass spectrometry (GC/MS) can reach low ng/L using full scan and acquiring confirmatory mass spectra for each compound. A third publication also from Spain shows that three classes of compounds can be detected on the Twister™ in a single analysis, thus eliminating multiple extractions. Again, the method detection limits reach the low ng/L or parts per trillion range. Frequently, when EPA analyzes surface water samples for environmental contaminants, EPA looks for a variety of compounds requiring multiple extractions and analyses. These three articles indicate that the use of Twister™ for environmental sample testing would be feasible. The under-appreciated attribute of Twister™ is the potential application for "unknown" analysis. For example, working with the water emergency response group, a water sample could be extracted and analyzed within one or two hours and produce a tentatively identified compound report where the unknown spectra are compared with a spectral library. Any number of organic D-9 ------- contaminants could be quickly detected and identified with this approach at parts per trillion (ppt) levels. Staff members from the EPA Region 7 branches (ENSV/EAMB and ENSV/CARB) worked jointly to plan and implement the research study to evaluate the Twister™ technique. Laura Webb led the field activities including the sample extraction in the field. Lorraine Iverson led the laboratory evaluation. Margie St. Germain coordinated the project and assisted where needed. During the planning meeting, the team of chemists developed a list of questions that needed to be answered in determining the merits of the Gerstel system. Those questions were • What are the optimum conditions for extraction? • What are the optimum conditions of the instrument for analysis? • What is the precision and accuracy of the Gerstel technique? • What are the method detection limits for semi-volatile compounds? • Can traditional method criteria be met using this technique? • What is the holding time of the Twisters™ once the samples were extracted? • Can all semi-volatile compounds be extracted by this technique? • What other compounds would be amenable to this technique? • Does the matrix interfere with this technique? • Would it be possible to gather enough data for an Alternate Test Procedure for the target analytes in the Kansas Urban Stream Studies? The ENSV/EAMB monitoring team, led by Laura Webb, collected surface water samples from twelve different surface waters in the Kansas Urban Stream study. The water monitoring team collected the normal two gallons of sample to be sent back to the laboratory for the semi- volatiles and pesticides analysis in water using the traditional liquid/liquid extraction technique. In addition, the team exposed a separate much smaller portion of the stream sample (10 milliliters initially) to the Twister™ in the field at the time of sample collection. The ENSV/CARB team, led by Lorraine Iverson, analyzed the samples, both for the traditional method and for the Gerstel method. The gallon water sample was extracted and analyzed in the laboratory by following existing methodology. 
The field extracted Twisters™ were brought back to the laboratory also and analyzed with the GC/MS system. Two additional samples were laboratory extracted Twisters™ using a small aliquot from the gallon samples. The quality assurance requirements for this activity are described in the corresponding Quality Assurance Project Plan (QAPP) for the Kansas Urban Streams Study. The same quality control procedures were followed for both techniques. Method blanks, lab control samples, and matrix spikes were prepared and extracted for the Twister™ technique in the field. Additional quality control samples were also prepared in the laboratory. Finally, a method detection limit study was performed for the Twister™ method concurrently with the Kansas Urban Stream sampling events. D-10 ------- III. Detailed Overview Because of the sheer volume of data that was generated by this project, discussion of each data point would be extensive. This detailed overview will summarize the results in sufficient detail to provide an understanding of the project results. More detailed discussions are provided in the subsequent attachments. 1. Field Procedures The Kansas Urban Streams project was chosen to initially test the Gerstel technique. This is an on-going project of the EAMB branch with the purpose of monitoring urban streams throughout the Kansas City Metro area. There are a series of 12 streams, with three sampling sites on each stream, which have been sampled for chemical, physical, and biological parameters for a period of 4 years [4], These streams were scheduled for sampling at the same time as the equipment loan. These surface water samples provided a perfect media for testing the durability of the equipment and feasibility of the project. Specifically, samples provided an environmental media where there is potential for typical urban analytes, such as pesticides, pharmaceuticals, personal care products, and other industrial chemicals. The water sampling was done with the water monitoring mobile laboratory, thus offering a good platform for the field portion of the study. The samples were collected beginning in mid-July through late August. Each sample was collected in two 4-L amber bottles for the traditional laboratory procedures, and in a 40 mL VOA vial for the Twister™ study. The volatile organic analysis (VOA) vials were immediately placed in a dark cooler, and later stored in the refrigerator in the mobile lab. With one exception, samples were extracted via the Twister™ technique on the same day as sampling in the mobile lab. The final set of samples was extracted in the regional laboratory on the following day. The techniques used in the mobile lab were identical to those used in the regional laboratory, with the exception that the field sample vials were pre-loaded with modifier in the regional laboratory prior to field sampling to reduce the material needed in the mobile lab. B. Overview of Conditions and Compounds Evaluated Once the instrument was set-up, preliminary conditions were determined and tested. With each series of tests, the conditions were further optimized. Five problems were identified and resolved during this project. Finally, additional environmental contaminants were evaluated. A timeline of events and general data results are presented in Attachment A. 1. Conditions and Parameters Tested Eight parameters and conditions were tested; ensuring the analysis of water samples would provide accurate and reproducible data. Different liners were available for the injector. 
These liners were open tubular, beveled tubular, glass bead filled, glass wool filled, tenax filled, and quartz wool filled. Initial analysis used the default tenax filled liner resulting in many of the compounds not being transferred to the GC/MS. Next we used the beveled tubular and the open tubular. However, there was not enough surface area to capture the high boiling compounds in the semi-volatile list. Next we tried the glass filled liner, and the very reactive compounds, such as 2,4-dinitrophenol and 4,6-dinitro-2- methylphenol, were being captured or destroyed on the liner. Finally, we tried the quartz wool D-ll ------- liner which provided a large surface area for capturing the compounds including the reactive compounds. We tried several variations of the split flow and splitless flow at the injector. We determined that for parts per billion concentrations, that split flows at 80 ml/min and a 1:5 split were optimum for full scan quantitation. However, as the concentrations dropped to the parts per trillion, then splitless flow was optimal. When using the splitless option, the standards had to be less concentrated to obtain a linear range for quantitation. We evaluated a range of sample volumes and stir times. We tried 5 ml, 10 ml, and 100 ml sample volumes. When evaluating samples for semi-volatile compounds, we found that 10 ml samples did not significantly improve the results, so the samples were analyzed with 5 ml aliquots. With this volume, we were able to use 60 minutes (min) as the optimum stir time. We also tried 90 min and 120 min stir times, and again did not observe significant improvements for these times. Any stir times greater than 120 min would not be amenable to field extractions. When we began testing the brominated flame retardants, we used 100 ml samples with a 24 hour stir time allowing us to reach parts per trillion concentrations. Therefore, for routine analyses, 5 ml sample aliquots stirred for 60 minutes would be the optimum time and volume, while for extremely low concentrated analyses, 100 ml sample aliquots stirred for 24 hours would be optimum. We used a stir rate of 1200 rpm for all tests. We did not evaluate stir rates for this project. We evaluated the temperatures during desorption. We tried -120°C and -70°C for the low temperature on the CIS. When we saw no difference between these two temperatures, we used the -70°C for all analyses. For the high temperature, we began at 300°C. When the high molecular weight compounds, high boiling PAHs, were not efficiently removed from the Twister™, the temperature was raised to 310°C and held for 15 min. This solved the carryover problem of the late eluting compounds in the SVOC standards. We evaluated the need for additives to the samples to enhance the adsorption onto the Twister™. The additives we considered were methanol, salt, acid, and base. All tests with base eliminated all the acidic compounds; therefore, base was not used with the project. Methanol improved the adsorption of the PAHs onto the Twister™ by preventing adsorption onto the glassware. We tested 20% methanol and 50% methanol as an additive. The 50% methanol did not significantly improve the recovery of the compounds, so 20% methanol was used for all the tests. Salt improved the adsorption of the compounds with a low octanol-water partition co-efficient (Ko/w) or the compounds that were water soluble. We tested 30% salt and 50% salt, discovering that the 50% salt additive did not significantly improve the recovery of compounds. 
Therefore, 30%) salt was used for all tests. Finally, we tested the addition of acid in order to enhance the recovery of acidic compounds. Acid did improve the recovery of acidic compounds more so when added to the salt portion rather than the methanol portion. When we added the acid first, we lost all basic compounds. Therefore, when evaluating acidic compounds in a mixture, we first added salt and stirred for 60 min, then added the acid and stirred for an additional 60 min. This technique provided the best recovery of salt and acid combinations. D-12 ------- We wanted to know if removal of the Twister™ from the sample was critical for reproducibility. We tested replicate samples, some where the Twister™ was removed immediately and some where the Twister™ was allowed to sit for an additional 60 min without stirring. We observed no significant differences between the replicate sets. We were also curious if all the compounds were removed from the sample with the first Twister™ run. We set up a series of analyses where the samples had one Twister™ spun. Upon completion, the first Twister™ was removed from the sample, and a second was added and the sample stirred for an additional 60 minutes. We learned that the basic compounds and the neutral compounds generally had 90% of the compounds adsorbed onto the first Twister™ and less than 10% adsorbed onto the second Twister™. The acids and the water soluble compounds were extracted from the sample with more difficulty and ranged from 10% to 50% onto the first Twister™ with something less onto the second Twister™. Finally, we were concerned with the holding time after a Twister™ has been used to extract a sample. Discussion of the results is presented Section F of this report. Generally, the Twisters™ reproduced the results for a 14 day period without any loss. 2. Problems Resolved We identified and resolved five problems with varying degrees of success. The problems included phthalate contamination, carryover of the high boiling compounds, poor recoveries of the basic compounds, poor recoveries of several acidic compounds, and poor precision of the internal standards. We observed sporadic phthalate contamination during the first six weeks of the project. Initially, some of the Twisters™ had not been properly cleaned and contained traces of phthalate, a common contaminant from plastics. The phthalate contamination was reduced by changing the cleaning technique but still observable. Next, we identified possible sources of contamination in the laboratory. By modifying our handling protocols for the Twisters™ and their containers, the phthalate concentration was further reduced but not eliminated. In discussions with Gerstel representatives, we agreed that the new, 1 mL vials could have contamination that would transfer to the Twisters™ while being stored. Future work will ensure that the Twisters™ are stored in the transfer tube and adapter which have been cleaned of phthalates. These practices should eliminate any laboratory and handling sources of contamination. During the initial testing we observed non-reproducibility of the high boiling compounds and the high molecular weight, aromatic compounds. We determined that the CIS temperature of 300°C held for 1 min was not sufficient. Increasing the temperature to 310°C and holding for 15 minutes improved the results; however, the precision was still not sufficient. 
Late in the study, the transfer temperature between the TDU and CIS inlet was raised from 275°C to 325°C, which improved the results further. In the semi-volatile list there are five very basic compounds which never performed well. The compounds were N-nitroso-di-n-propylamine, 4-chloroaniline, 2-nitroaniline, 3-nitroaniline, and 4-nitroaniline. These compounds are all water soluble. N-nitroso-di-n-propylamine is easily D-13 ------- degraded at high temperatures. A different sorbent or a different analytical technique would need to be employed to obtain acceptable data for these compounds. In the semi-volatile list there are 17 acidic compounds. We were able to improve the recoveries of 11 compounds by using a two step extraction with Twisters™, first with 30% salt followed by acid addition to pH<2. Six acidic compounds are also water soluble and good recoveries were never obtained for these compounds (phenol, benzyl alcohol, benzoic acid, 2-methylphenol, 4- methylphenol, 2,4-dinitrophenol). Throughout the first eight weeks, we observed that the internal standard areas were not very precise as with standard methods, specifically l,4-dichlorobenzene-D4 and perylene-D12. We had been spiking the samples with the method prescribed concentrations using the traditional methods. We also noticed the normal high concentrations of the calibration curves were not consistent with the traditional methods. When we finally reduced the concentration by a factor of ten, and used lower concentrations for the calibration curves, we were able to obtain precise internal standard areas for the calibration curves and for subsequent samples. We were able to see standard deviations less than 10% for the areas of all six internal standards. Using the traditional high level standard, we calculated that we were loading a single Twister with 75,000 ng of contaminants and internal standards. We determined that we had saturated the Twisters with the concentration of possible compounds. Therefore, for future work, the calibration curves and the internal standard areas should be reduced in order to properly function. 3. Rationale for the Change in Scope When we were nearing the original deadline, we realized that we would not be able to resolve all the issues for the complete semi-volatile list of 66 compounds. At that point, we were routinely getting acceptable results for 45 compounds out of 66 compounds. Our ultimate goal was to develop an Alternate Test Procedure (ATP) document for one list of compounds. We realized that the list of Kansas Urban Stream analytes contained mostly neutral compounds, including pesticides and emerging contaminants. Therefore, we were able to receive an additional 30 days on the loan of the equipment. During this time we focused on completing the components of the ATP for PAHs, and determining the feasibility of adding phthalates, pesticides and other emerging contaminants such as pharmaceutical and personal care products (PPCP) and brominated flame retardants (BFR). By determining the feasibility of these additional compounds, we would be able to define the method development for an additional ATP. Detailed discussions are provided in Section E on each of the categories of compounds investigated. C. Sample Preparation Procedures The samples were measured into extraction vials (either 20 mL VOA vials if using 5 or 10 mL sample sizes, or larger bottles when using 100 mL sample size). 
Four reagent water aliquots were prepared, two for the method blank and two for the laboratory control sample (LCS). One of the samples was chosen for matrix spike (MS) and matrix spike duplicate (MSD); four additional aliquots of this sample were also prepared. To each aliquot, 10 |iL of the internal standard/surrogate solution was added. To the LCS, MS, and MSD 10|iL of spiking solution was added. The Kansas Urban Stream samples were all analyzed using two sample aliquots - one with 20% methanol as the modifier, and one with 3g of salt and concentrated sulfuric acid to pH < 2. Initially, this salt/acid combination was performed at one time and the sample stirred for the D-14 ------- same time period as the methanol portion at 90 minutes. However, during the project it was determined that improved recoveries of basic compounds were obtained if the salt were added first, the sample stirred for a period of time, then the acid added and the sample stirred again for a total time of 120 min. The final sets of samples used this procedure: 5 mL sample with 20% methanol for 1 hour, 5 mL sample with 3g salt for 1 hour, then several drops of acid added and this portion stirred for an additional hour. The methanol portion was held until the other portion was finished, and then both stir bars were removed, rinsed with de-ionized (DI) water, and placed in a TDU tube. It was determined that the stir bars would be placed in the tubes in the same position for all samples and standards, thus the methanol stir bar was inserted first, then followed by the salt/acid stir bar. The TDU tube was then sealed with a transport adaptor and placed in the sampler tray. D. Analysis Procedures The final analytical conditions are listed below. We evaluated several conditions, such as transfer temperature, split and splitless desorption, and CIS initial temperature. These evaluations are summarized Section B. a. TDU • Initial Temperature: 40°C • Delay Time: 0.5 min • Initial Time: 0.01 min • Temperature Program: 60°C/min to 310°C, hold 5 min • Transfer Temperature: 325°C, fixed • Splitless Desorption b. CIS • Initial Temperature: -70°C • Equilibrium Time: 0.5 min • Initial Time: 0.2 min • Temp Program: 12°C/s to 310°C, hold 15 min c. GC • Column: HP 5 MS, 30m x ,25mm x .25 |im • Initial Column Temperature Hold: 40°C for 2 min • Column Temp Program: 40-284 @ 8°/min, 15°/minto 310° • Final Column Temperature Hold: 310° C for 2.9 min • Transfer line Temperature 280°C • Carrier Gas: Helium at lmL/min constant flow d. MS • Source Temperature 230°C • Electron Energy: 70 V (nominal) • Mass Range: 35-450 amu (except for Brominated Flame Retardants which require mass range 100 - 660 amu) • Scan Time: 1.81 scans/s (1.36 scans/s for BFR) e. Selected Ion Monitoring for Brominated Flame Retardants (SIM, Table 1) f. Selected Ion Monitoring for Pharmaceutical and Personal Care Products (SIM, Table 2) D-15 ------- Table 1. Group SIM Conditio Start Time ns for Brominated Flame Compound ietardants Quant Ion Other Ions 1 10.00 Naphthalene-d8 (IS) 136.2 107.9 2-Fluorobiphenyl (s) 172.0 171.1 2 16.00 Acenaphthalene-DIO (IS) 164.2 162.2 Phenanthrene-dlO (IS) 188.2 184.2 Terphenyl-dl4 (s) 244.2 3 28.00 BDE-47 138.1 Chrysene-dl2 (IS) 240.2 236.2 BDE-47 325.9 485.8 BDE-100/BDE-99 403.7 405.7, 563.9 4 33.00 Perylene-dl2 (IS) 264.2 132.2 BDE-153 483.8 140.9, 643.4 IS = inte rnal standard, s Hexabromobiphenyl = surrogate, BDE - bromin 307.8 ated dipheny 467.8, 627.4 ether Table 2. 
SIM Conditions for Pharmaceuticals and Personal Care Products (PPCP) Group Start Time Compound Quant Ion Other Ions 1 10.00 Naphthalene-d8 (IS) 136.2 107.9 2-Fluorobiphenyl (s) 172.0 171.1 2 17.00 Bisphenol A 213.2 119.1, 228.1 Phenanthrene-dlO (IS) 188.2 184.2 Terphenyl-dl4 (s) Y 244.2 Triclosan 217.9 288.0, 289.9 3 28.00 Ethinyl Estradiol 213.1 160.0, 296.1 Estrone 270.2 146.0, 185.1 Chrysene-dl2 (IS) 240.2 236.2 IS = internal standard, s = surrogate E. Analytes Tested and Overall Results We fully tested the semi-volatile compounds listed in CARB Method 3230. 2 and EPA Method 625 against method performance criteria. The method performance criteria were calibration curve linearity, method detection limit determinations, precision of four replications (initial demonstration of proficiency-IDP), and matrix spike/matrix spike duplicate recovery and bias. The semi-volatile list contains acidic compounds, basic compounds, and neutral compounds. Within the list of neutral compounds, there is a shorter list of polyaromatic hydrocarbons (PAHs). We tested the feasibility of additional compounds, including pesticides listed in EPA Method 608, pharmaceutical and personal care products (PPCP), and brominated flame retardants (BFRs) not previously analyzed by EPA Region 7. This section of the report summarizes the specific compounds within each of these lists, and the successes and failures. D-16 ------- 1. Semi-volatile Organic Compounds (SVOCs) We tested the semi-volatile compounds, not including the PAHs, summarized in Table 3 and detailed in Attachment B. We were able to show Twister™ method comparability with EPA 625 for 23 compounds which are listed in Table 4. The extraction procedure was 20% methanol, 30% salt, or 4 drops 1:1 sulfuric acid to a 5 mL sample size. These compounds met four method performance criteria listed above. Table 5 lists the 12 additional compounds that met three of four method performance criteria and should be considered acceptable once the additional criteria are met. Generally, either the precision was not within the strict ranges set by the method or the method detection limit determinations needed to be performed at a different concentration. For the compounds in Tables 4 and 5, the linear range varied from 40 fold to only 4 fold. The internal standard areas were generally stable; however, the salt and acid extractions proved to be susceptible to matrix effects. The base/neutral surrogate recoveries were generally comparable with what one might obtain from CARB Method 3230.2F; however, the acid surrogates often proved difficult due to lack of sensitivity or due to matrix effects. If these compounds are needed by a client for a certain site, they would prove to be more difficult because multiple extractions and analyses would be required. The time spent would likely still be less than with the traditional technique, because extraction could be done all three ways (methanol, salt, and acid) simultaneously on the same stir plate. The analysis time would be tripled, but this problem is minimal if an auto sampler is used. Finally, the analytes listed in Table 6 were unsuccessful. The majority of the unsuccessful compounds fall into the acidic or basic category. Two unsuccessful compounds (benzyl alcohol and benzoic acid) are not listed in the Clean Water Act for regulatory purposes, but are listed in EPA Method 625. N-nitroso-di-n-propylamine can be thermally reactive, and may never do well with the Twisters™. 
Some of the unsuccessful analytes are difficult at best when using the traditional extraction and analysis method. Typically these compounds have low sensitivity and/or poor chromatography. Some of the unsuccessful analytes are much more soluble in water and may be best analyzed by an alternate method such as LC/MS. As different sorbents are added to the Twister™ technology, these compounds could be re-evaluated Table 3. Semi-volatile Compounds (not including PAHs) Analyte Name CAS# Log Ko/w Type Phenol 108-95-2 1.46 Acid bis-(2-Chloroethyl)ether 111-44-4 1.29 Neutral 1,3 -Di chl orob enzene 541-73-1 3.53 Neutral 2-Chlorophenol 95-57-8 2.15 Acid 1,4-Dichlorobenzene 106-46-7 3.44 Neutral 1,2-Dichlorobenzene 95-50-1 3.43 Neutral Benzyl alcohol 100-51-6 1.10 Acid 2-Methylphenol 95-48-7 1.95 Acid bi s(2-Chloroi sopropyl)ether 108-60-1 2.48 Neutral Hexachl oroethane 67-72-1 4.14 Neutral D-17 ------- Table 3 (continued). Semi-volatile Compounds (not including PAHs) Analyte Name CAS# Log Ko/w Type 4-Methylphenol 106-44-5 1.94 Acid N-nitroso-di-n-propylamine 621-64-7 1.36 Base Nitrobenzene 98-95-3 1.85 Base Isophorone 78-59-1 1.70 Neutral 2-Nitrophenol 88-75-5 1.79 Acid 2,4-Dimethylphenol 105-67-9 2.30 Acid bis(2-Chloroethoxy)methane 111-91-1 1.30 Neutral 2,4-Dichlorophenol 120-83-2 3.06 Acid Benzoic acid 65-85-0 1.87 Acid 1,2,4-Trichlorobenzene 120-82-1 4.02 Neutral 4-Chloroaniline 106-47-8 1.83 Base Hexachl orobutadi ene 87-68-3 4.78 Neutral 4-Chloro-3-methylphenol 59-50-7 2.70 Acid Hexachl orocy cl opentadi ene 77-47-4 5.04 Neutral 2,4,6-Trichlorophenol 88-06-2 3.69 Acid 2,4,5-Trichlorophenol 95-95-4 3.72 Acid 2-Chloronaphthalene 91-58-7 3.90 Neutral 2-Nitroaniline 88-74-4 1.85 Base Dimethylphthalate 131-11-3 1.60 Neutral 2,6-Dinitrotoluene 606-20-2 2.10 Base 3-Nitroaniline 99-09-2 1.37 Base 2,4-Dinitrophenol 51-28-5 1.67 Acid Dibenzofuran 132-64-9 4.12 Neutral 4-Nitrophenol 100-02-7 1.91 Acid 2,4-Dinitrotoluene 121-14-2 1.98 Base Diethylphthalate 84-66-2 2.42 Neutral 4-Chlorophenyl-phenyl ether 7005-72-3 4.70 Neutral 4-Nitroaniline 100-01-6 1.39 Base 4,6-Dinitro-2-methylphenol 534-52-1 2.13 Acid N-nitrosodiphenylamine 86-30-6 3.13 Base 4-Bromophenyl-phenyl ether 101-55-3 4.94 Neutral Hexachl orob enzene 118-74-1 5.73 Neutral Pentachl orophenol 87-86-5 5.12 Acid Carbazole 86-74-8 3.72 Neutral Di -n-buty lphthal ate 84-74-2 4.50 Neutral Buty lb enzylphthal ate 85-68-7 4.73 Neutral 3,3'-Dichlorobenzidine 91-94-1 3.51 Basic bis(2-Ethylhexyl)phthalate 117-81-7 7.60 Neutral Di -n-octy 1 phthal ate 117-84-0 8.10 Neutral D-18 ------- Table 4. 
Summary of Successful Semi-volatile Compounds CARB Type of Twister™ EPA Method extraction MDL, Method 625 3230.2F Analyte ug/L MDL, ug/L MDL, ug/L bis(2-Chloroethyl) ether 0.23 5.7 1 Salt 1,3 -Di chl orob enzene 0.14 1.9 1 Methanol 1,4-Dichlorobenzene 0.15 4.4 1 Methanol 1,2-Dichlorobenzene 0.18 1.9 1 Methanol Nitrobenzene 0.50 1.9 1 Methanol bis(2-Chloroethoxy) methane 0.53 5.3 1 Methanol 1,2,4-Trichlorobenzene 0.14 1.9 1 Methanol Hexachl orocy cl opentadi ene 0.40 1 Methanol 2,4,6-Trichlorophenol 0.36 2.7 1.6 Acid 2,6-Dinitrotoluene 0.19 1.9 1 Salt Dibenzofuran 0.08 1 Methanol 2,4-Dinitrotoluene 0.51 5.7 1 Methanol 4-Chlorophenyl-phenyl ether 0.07 4.2 1 Methanol 4,6-Dinitro-2-methylphenol 0.21 24 1.8 Salt N-nitrosodiphenylamine 0.32 1.9 1 Methanol 4-Bromophenyl-phenyl ether 0.08 1.9 1 Methanol Hexachl orob enzene 0.13 1.9 0.8 Methanol Pentachl orophenol 0.25 3.6 2 Acid Carbazole 0.36 2.5 Methanol Di -n-buty lphthal ate 0.09 2.5 1.9 Methanol Buty lb enzylphthal ate 0.06 2.5 1.6 Methanol bis(2-Ethylhexyl)phthalate 0.79 2.5 1.1 Salt Di -n-octy 1 phthal ate 0.29 2.5 1.1 Methanol D-19 ------- Table 5. Summary of Compounds that May be Successful Analyte Log Ko/w Type Comment 2-Chlorophenol 2.15 Acid Low sensitivity; Repeat MDL with higher spike level bis(2- Chl oroi sopropy l)ether 2.48 Neutral Repeat MDL with higher spike level Hexachl oroethane 4.14 Neutral Good performance; Recovery higher than method upper limit (100%) Isophorone 1.70 Neutral Low sensitivity; Spike levels need to be increased 2-Nitrophenol 1.79 Acid Low sensitivity; Repeat MDL with higher spike level 2,4-Dichlorophenol 3.06 Acid Low sensitivity; Repeat MDL with higher spike level Neutral Poor sensitivity and linearity; Recoveries were above the method Hexachl orobutadi ene 4.78 upper limit 4-Chloro-3- methylphenol 2.70 Acid Low sensitivity Neutral Poor sensitivity and linearity; Recoveries were above the method Dimethylphthalate 1.60 upper limit Neutral Poor sensitivity and linearity; Recoveries were above the method Diethylphthalate 2.42 upper limit 3,3'-Dichlorobenzidine 3.51 Base Low sensitivity; Repeat MDL with higher spike level Table 6. Summary of Unsuccessful Compounds Analyte Log Ko/w Type Comment Phenol 1.46 Acid Poor chromatography; Poor sensitivity Benzyl alcohol 1.10 Acid Not regulated; Poor sensitivity 2-Methylphenol 1.95 Acid Water soluble; Poor sensitivity 4-Methylphenol 1.94 Acid Water soluble; Poor sensitivity N-nitroso-di-n- propylamine 1.36 Base Thermally reactive; Poor sensitivity Benzoic acid 1.87 Acid Not regulated; Poor sensitivity 4-Chloroaniline 1.83 Base Poor sensitivity 2,4,5-Trichlorophenol 3.72 Acid Easily complexes with phosphates; Poor sensitivity 2-Nitroaniline 1.85 Base Water soluble; Poor sensitivity 3-Nitroaniline 1.37 Base Water soluble; Poor sensitivity 2,4-Dinitrophenol 1.67 Acid Poor sensitivity 4-Nitrophenol 1.91 Acid Poor sensitivity 4-Nitroaniline 1.39 Base Water soluble; Poor sensitivity D-20 ------- 2. Polycyclic Aromatic Hydrocarbons (PAHs) We tested the polycyclic aromatic hydrocarbons, detailed summary is provided in Attachment C. The PAH compounds (Table 7) are neutral compounds that can be extracted with a single stir bar, 20% methanol as the matrix modifier, and 5 mL sample size. The sensitivity of the PAH compounds was excellent, and reporting limits of 0.2 to 0.4 ug/L were obtained with this small sample size. Table 8 shows the PAH compounds with the lowest MDL obtained by the Twister™, and compared to the traditional method. 
The MDLs obtained by the Twister™ method were a factor of 10 or more lower than those of EPA Method 625 and CARB Method 3230.2F. Once all of the system temperatures were optimized, all method performance criteria were met for all of the PAH compounds attempted. Linearity of the calibration curves was excellent over the range of 0.2 ug/L to 8 ug/L, a 40-fold range. Reproducibility of the internal standard areas was excellent, typically varying by less than 10% when interferences such as isooctane were not present. Laboratory control samples and matrix spikes typically gave recovery and precision comparable to Methods 625 and 3230.2F. Surrogate recoveries (base/neutral surrogates only) were similar to those expected in EPA Method 625 and CARB Method 3230.2F. Because the Twister™ technique uses a procedural calibration, the recoveries seen were slightly higher than those obtained by the reference method, EPA Method 625.

Table 7. Polyaromatic Compounds
Analyte Name    CAS#    Log Ko/w
Naphthalene 91-20-3 3.30
2-Methylnaphthalene 91-57-6 3.86
2-Chloronaphthalene 91-58-7 3.90
Acenaphthylene 208-96-8 3.94
Acenaphthene 83-32-9 3.92
Fluorene 86-73-7 4.18
Phenanthrene 85-01-8 4.46
Anthracene 120-12-7 4.45
Fluoranthene 206-44-0 5.16
Pyrene 129-00-0 4.88
Benzo(a)anthracene 56-55-3 5.76
Chrysene 218-01-9 5.81
Benzo(a)pyrene 50-32-8 6.13
Benzo(b)fluoranthene 205-99-2 5.78
Benzo(k)fluoranthene 207-08-9 6.11
Indeno(1,2,3-cd)pyrene 193-39-5 6.70
Dibenz(a,h)anthracene 53-70-3 6.75
Benzo(g,h,i)perylene 191-24-2 6.63

Table 8. Method Detection Limits for PAH Compounds
Analyte    Twister™ MDL (ug/L)    EPA Method 625 MDL (ug/L)    CARB Method 3230.2F MDL (ug/L)
Naphthalene 0.07 1.6 1.00
2-Methylnaphthalene 0.07 -- 1.00
2-Chloronaphthalene 0.06 1.9 1.00
Acenaphthylene 0.06 3.5 1.00
Acenaphthene 0.04 1.9 1.00
Fluorene 0.04 1.9 1.00
Phenanthrene 0.04 5.4 1.00
Anthracene 0.04 1.9 1.00
Fluoranthene 0.05 2.2 0.98
Pyrene 0.05 1.9 0.96
Benzo(a)anthracene 0.06 7.8 0.93
Chrysene 0.05 2.5 0.99
Benzo(b)fluoranthene 0.05 4.8 0.82
Benzo(k)fluoranthene 0.05 2.5 0.93
Benzo(a)pyrene 0.07 2.5 0.84
Indeno(1,2,3-cd)pyrene 0.28 3.7 1.00
Dibenz(a,h)anthracene 0.25 2.5 0.76
Benzo(g,h,i)perylene 0.20 4.1 0.92

3. UAA Pesticides

We spent much less time on the pesticide list compounds and thus have much less data with which to judge success or failure. Table 9 summarizes the pesticides, and Attachment D provides details. The log Ko/w values are promising for the majority of the analytes. As a reminder, pesticides are traditionally analyzed by dual-column GC/ECD, which typically achieves MDLs 100-1000 times lower than full-scan GC/MS. Several compounds compare quite favorably to the traditional techniques. Table 10 summarizes the pesticides that gave good results (all but three) and compares the MDLs obtained with those of the traditional technique. The successful compounds would need to be extracted two different ways (methanol and salt) and analyzed separately. The time needed for extraction and analysis would likely still be less than with the traditional methods, because the two extractions can be done simultaneously and an autosampler minimizes the chemist's effort during analysis. However, it should be noted that we performed the UAA pesticide list tests concurrently with the semi-volatile list. Thus, pesticides and semi-volatiles that are amenable to the methanol extraction can be combined into one extraction/analysis; the same is true for the analytes that are amenable to the salt extraction.
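Because methanol-amenable and salt-amenable analytes can be pooled, planning a combined run reduces to grouping the target list by the modifier that worked best for each compound. The sketch below illustrates that bookkeeping; the helper function is hypothetical, and only the modifier assignments (taken from Tables 4 and 10) come from this study.

```python
# Illustrative sketch: group target analytes by the sample modifier that performed
# best in this study (Tables 4 and 10), so compatible analytes can share one
# Twister(TM) extraction. The helper below is hypothetical, not from the report.
from collections import defaultdict

BEST_MODIFIER = {
    # semi-volatiles (Table 4)
    "1,2,4-Trichlorobenzene": "methanol",
    "2,6-Dinitrotoluene": "salt",
    "Pentachlorophenol": "acid",
    # UAA pesticides (Table 10)
    "Trifluralin": "methanol",
    "Alachlor": "salt",
    "Chlorpyrifos": "methanol",
    "Metolachlor": "salt",
}

def plan_extractions(target_analytes):
    """Return {modifier: [analytes]} so each group is extracted and analyzed together."""
    batches = defaultdict(list)
    for analyte in target_analytes:
        batches[BEST_MODIFIER.get(analyte, "unassigned")].append(analyte)
    return dict(batches)

print(plan_extractions(["Trifluralin", "Alachlor", "Chlorpyrifos", "2,6-Dinitrotoluene"]))
# -> {'methanol': ['Trifluralin', 'Chlorpyrifos'], 'salt': ['Alachlor', '2,6-Dinitrotoluene']}
```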
Tebuthiuron, Chlorothalonil, and the Pyrethrins were unsuccessful. The first two gave poor response, and the Pyrethrins had problems with linearity. It should be noted, however, that the Pyrethrins show the same linearity problems on GC/ECD.

Table 9. Summary of Tested Pesticides
Analyte    CAS#    Log Ko/w
Diethyltoluamide (DEET) 134-62-3 2.20
Propachlor 1918-16-7 2.20
Trifluralin 1582-09-8 5.30
Simazine 122-34-9 2.20
Atrazine 1912-24-9 2.61
Diazinon 333-41-5 3.80
Alachlor 15972-60-8 3.50
Malathion 121-75-5 2.30
Metolachlor 51218-45-2 3.10
Chlorpyrifos 2921-88-2 4.70
Pendimethalin 40487-42-1 4.80
Bromacil 314-40-9 1.70
Bifenthrin 82657-04-3 8.10
Permethrin 52645-53-1 6.50
Chlorothalonil 1897-45-6 3.10
Pyrethrins 8003-34-7 5.00
Tebuthiuron 34014-18-1 1.78

Table 10. Summary of Successful Pesticides
Compound    Best technique    Twister™ MDL (ug/L)    CARB Method 3240.2 MDL (ug/L)    CARB Method 3250.4 MDL (ug/L)
Diethyltoluamide (DEET) Salt 0.09
Propachlor Salt 0.09 0.1
Trifluralin Methanol 0.02 0.025
Simazine Salt 0.17 0.065
Atrazine Salt 0.85 1.5 0.056
Diazinon Methanol 0.22 0.5
Alachlor Salt 0.05 0.065 0.047
Malathion Salt 0.17
Metolachlor Salt 0.21 0.25 0.058
Chlorpyrifos Methanol 0.03
Pendimethalin Methanol 0.24
Bromacil Methanol 0.41
Bifenthrin Methanol 0.95
Permethrin Methanol 4.11

4. Pharmaceutical and Personal Care Products

There were mixed results for the pharmaceutical and personal care product (PPCP) list. The PPCP list includes a wide variety of compounds, such as everyday compounds like caffeine, over-the-counter pharmaceuticals like ibuprofen, wastes from prescription drugs such as hormone supplements, and ordinary contaminants like phthalates from plastic materials. This wide variety of compounds creates the potential need for various analytical techniques. For example, many pharmaceuticals are best analyzed by liquid chromatography/mass spectrometry because of their solubility in water. Compounds that have a low log Ko/w and are readily soluble in water, like caffeine, do not perform well with the Twister™ technique. To get a sense of the feasibility of using the Twister™ technique, we selected six compounds from the extensive PPCP list to evaluate (Table 11). Details are provided in Attachment E.

Table 11. Summary of PPCPs Considered
Compound    CAS#    Log Ko/w    Reporting Limit (SIM)    Comment
Caffeine -- -- -- Water soluble
Ibuprofen -- -- -- Low sensitivity
Triclosan 3380-34-5 4.66 0.02 ug/L
Bisphenol A 80-05-7 3.64 0.02 ug/L
Estrone 53-16-7 3.43 0.05 ug/L
Ethinyl Estradiol 57-63-6 4.12 0.05 ug/L

Three compounds performed well. The two estrogen compounds, estrone and ethinyl estradiol, were chosen from a larger list of hormones, and bisphenol A was chosen to represent contaminants from commonly used plastics. These compounds showed good linearity using both full-scan GC/MS and selected ion monitoring (SIM) GC/MS. The estimated reporting limits are listed in Table 11. The compounds were not detected in any samples using the standard 5 mL aliquot; the concentrations of these compounds may be much lower than anticipated. Triclosan appears to extract with any modifier and behaves well in both SIM and full-scan modes, with small or large sample volumes. Using the salt additive, linearity was acceptable; however, only one MS/MSD pair was analyzed, and the recovery was not reproducible. Unfortunately, this compound was not detected in any samples and may be present at lower concentrations. The other compounds did not perform well: ibuprofen was inconsistent and showed poor sensitivity, and caffeine was too water soluble to be extracted with the Twister™. These compounds may need to be analyzed by LC/MS.
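The feasibility argument above is driven largely by the octanol-water partition coefficient: highly water-soluble compounds (low log Ko/w) partition poorly into the PDMS stir bar. The following is a minimal, hypothetical sketch of that screening logic; the 2.5 cutoff is an illustrative assumption rather than a criterion from this report, and caffeine's log Ko/w is a literature value not listed in Table 11.

```python
# Hypothetical screening sketch: flag candidate analytes whose log Ko/w suggests
# poor partitioning into the PDMS stir bar. The 2.5 cutoff is an illustrative
# assumption, not a study criterion.
LOG_KOW = {
    "Triclosan": 4.66,        # values from Table 11
    "Bisphenol A": 3.64,
    "Estrone": 3.43,
    "Ethinyl Estradiol": 4.12,
    "Caffeine": -0.07,        # literature value; not listed in Table 11
}

def screen_candidates(analytes, cutoff=2.5):
    """Split candidates into likely-amenable and likely-problematic for SBSE."""
    amenable = [a for a in analytes if LOG_KOW.get(a, float("-inf")) >= cutoff]
    problematic = [a for a in analytes if a not in amenable]
    return amenable, problematic

ok, needs_lc_ms = screen_candidates(list(LOG_KOW))
print("Likely amenable to Twister(TM):", ok)
print("Consider LC/MS instead:", needs_lc_ms)
```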
5. Brominated Flame Retardants

Brominated flame retardants (BFRs) include the polybrominated diphenyl ethers (BDEs), which are used to treat textiles such as infant clothing. For convenience, a commercially available mix of several brominated congeners was chosen to evaluate the method. The five compounds that were evaluated were linear, especially those eluting early, and showed decent sensitivity using either a larger sample volume or SIM. The proposed regulatory detection limits would require that these compounds be analyzed using 100 mL samples or SIM. The compounds were evaluated on only a few samples and were not detected. The one MS/MSD set that was run showed promising results using SIM. There was also one sample/MS pair run using 100 mL, and acceptable recovery was obtained. Estimated reporting limits for the BFRs are summarized in Table 12, and details are provided in Attachment E.

Table 12. Summary of Brominated Flame Retardants Tested
Compound    CAS#    Log Ko/w    Reporting Limit-SIM (ug/L)
BDE-47 5436-43-1 6.77 0.01
BDE-100 189084-64-8 7.66 0.01
BDE-99 60348-60-9 7.66 0.01
Hexabromobiphenyl 59080-40-9 9.10 0.05
BDE-153 68631-49-2 8.55 0.01

F. Holding Time Study for Semi-volatile Organic Compounds

We began a holding time study on July 22nd by preparing, extracting, and storing at 4 °C replicates of samples that had been spiked with a known amount of each compound from the semi-volatile list. We wanted to assess whether it would be acceptable to extract samples with the Twister™ and then store them at 4 °C before analysis. We analyzed the stored Twisters™ on days 0, 1, 2, 5, 8, 14, 21, and 28. The analytes, once captured on the Twister™ and stored, are stable for at least 14 days. Benzoic Acid, 4-Chloroaniline, 3-Nitroaniline, and 4-Nitroaniline were excluded from the study because they could not be effectively extracted and analyzed. Detailed information about the holding time study can be found in Attachment F. Charts 1 through 6 show rolling averages, with the analytes grouped by internal standard. Each rolling average combines the current day's analysis with the previous two analyses; for example, "to day 14" is an average of days 5, 8, and 14, and "to day 28" is an average of days 14, 21, and 28. The problem compounds Benzoic Acid, 4-Chloroaniline, 3-Nitroaniline, and 4-Nitroaniline have been excluded from the rolling averages. Visual inspection of the charts confirms that the analytes are stable for at least 14 days. We observed a drop in concentration for the late-eluting compounds, especially the PAHs, as seen by the decrease after Day 14 in Charts 5 and 6. Because system optimization continued during these 28 days, the holding time study may need to be repeated to determine whether analytes within the PDMS phase on the Twisters™ are stable for more than 14 days.
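The rolling averages plotted in Charts 1 through 6 are simple three-point moving averages over the analysis events. A minimal sketch of that calculation is shown below, assuming the results live in a pandas DataFrame indexed by analysis day; the concentrations are illustrative placeholders, not the study data.

```python
# Minimal sketch (illustrative data, not study results): three-point rolling
# averages over the analysis days used in the holding time study, matching the
# "to day N" convention described above (current analysis plus the previous two).
import pandas as pd

days = [0, 1, 2, 5, 8, 14, 21, 28]            # analysis days for the stored Twisters
results = pd.DataFrame(
    {"Nitrobenzene": [52, 50, 51, 49, 48, 47, 40, 36],   # hypothetical ug/L values
     "Pyrene":       [55, 54, 55, 53, 52, 50, 41, 35]},
    index=pd.Index(days, name="day"),
)

# window=3 over analysis events (not calendar days); e.g., "to day 14" averages
# the day 5, 8, and 14 analyses, and "to day 28" averages days 14, 21, and 28.
rolling = results.rolling(window=3, min_periods=1).mean()
rolling.index = ["day 0", "to day 1", "to day 2", "to day 5",
                 "to day 8", "to day 14", "to day 21", "to day 28"]
print(rolling.round(1))
```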
Chart 1. Semi-volatile Compounds in Group 1 (Dichlorobenzene-d4 - IS). [Rolling-average chart not reproduced; plotted analytes: Phenol, bis(2-Chloroethyl)ether, 1,3-Dichlorobenzene, 2-Chlorophenol, 1,4-Dichlorobenzene, 1,2-Dichlorobenzene, Benzyl alcohol, 2-Methylphenol, bis(2-Chloroisopropyl)ether, Hexachloroethane, 4-Methylphenol, and N-nitroso-di-n-propylamine.]

Chart 2. Semi-volatile Compounds in Group 2 (Naphthalene-d8 - IS). [Rolling-average chart not reproduced; plotted analytes: Nitrobenzene, Isophorone, 2-Nitrophenol, 2,4-Dimethylphenol, bis(2-Chloroethoxy)methane, 2,4-Dichlorophenol, 1,2,4-Trichlorobenzene, Naphthalene, Hexachlorobutadiene, 4-Chloro-3-methylphenol, and 2-Methylnaphthalene.]

Chart 3. Semi-volatile Compounds in Group 3 (Acenaphthene-d10 - IS). [Rolling-average chart not reproduced; plotted analytes: Hexachlorocyclopentadiene, 2,4,6-Trichlorophenol, 2,4,5-Trichlorophenol, 2-Chloronaphthalene, 2-Nitroaniline, Acenaphthylene, Dimethylphthalate, 2,6-Dinitrotoluene, Acenaphthene, 2,4-Dinitrophenol, Dibenzofuran, 4-Nitrophenol, 2,4-Dinitrotoluene, Fluorene, Diethylphthalate, and 4-Chlorophenyl-phenyl ether.]

Chart 4. Semi-volatile Compounds in Group 4 (Phenanthrene-d10 - IS). [Rolling-average chart not reproduced; plotted analytes: 4,6-Dinitro-2-methylphenol, N-nitrosodiphenylamine, 4-Bromophenyl-phenyl ether, Hexachlorobenzene, Pentachlorophenol, Phenanthrene, Anthracene, Carbazole, Di-n-butylphthalate, and Fluoranthene.]

Chart 5. Semi-volatile Compounds in Group 5 (Chrysene-d12 - IS). [Rolling-average chart not reproduced.]

Chart 6. Semi-volatile Compounds in Group 6 (Perylene-d12 - IS). [Rolling-average chart not reproduced.]

G. Comparability Study

The comparability portion of our study had two parts. First, since this study utilized a mobile laboratory to extract the majority of the field samples, we wanted to show that extractions performed in that mobile laboratory were essentially the same as those that would take place in a traditional laboratory setting. Second, we wanted to show that the Twister™ extraction and thermal desorption analysis is essentially equivalent to, or an improvement on, the traditional liquid/liquid extraction and liquid-injection analysis.

For the comparison of field data to laboratory data, we compared field-spiked samples with laboratory-spiked samples. We also looked at method blanks that were prepared in the field and in the laboratory. Because of the analytical variability we experienced during July and August 2009, it was difficult to demonstrate comparability. The internal standards were fluctuating such that replicate analyses of the same sample (duplicate continuing calibration checks) did not necessarily agree with each other. We continued to optimize instrument conditions and did not have ideal conditions on the days that the field and laboratory spikes were analyzed. In the end, we determined that we were spiking too much material into the samples for the Twister™ technique; we should be able to get better results with lower concentrations.

For the comparison of the Twister™ technique to the traditional laboratory method, we were more successful: in September 2009 we showed that the PAHs and several other compounds gave data of a quality equivalent to EPA Method 625. These data are sufficient to apply for an Alternate Test Procedure (ATP) in Region 7. For a detailed discussion of the data, see Attachment G.
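The field-versus-laboratory comparison above amounts to comparing two sets of spike recoveries. A minimal sketch of one way to summarize such a comparison is shown below; the recovery values and the use of a two-sample t-test are illustrative assumptions, not the statistics reported in Attachment G.

```python
# Illustrative sketch: compare recoveries of field-prepared and lab-prepared spikes.
# The recovery values below are hypothetical placeholders, not project data.
from statistics import mean, stdev
from scipy.stats import ttest_ind

field_recoveries_pct = [72, 85, 79, 90, 68, 81]   # field-spiked samples
lab_recoveries_pct = [88, 84, 91, 86, 83, 89]     # laboratory-spiked samples

def summarize(label, recoveries):
    print(f"{label}: mean = {mean(recoveries):.1f}%, RSD = "
          f"{100 * stdev(recoveries) / mean(recoveries):.1f}%")

summarize("Field spikes", field_recoveries_pct)
summarize("Lab spikes", lab_recoveries_pct)

# Welch's t-test (unequal variances) as one simple check of comparability.
t_stat, p_value = ttest_ind(field_recoveries_pct, lab_recoveries_pct, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}  (p > 0.05 suggests no significant difference)")
```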
H. Matrix Interferences (Brenner Heights)

During the research project, one sample site exhibited matrix interferences. The sample, labeled 1001 and BH1, was from a small urban stream in Kansas City, Kansas, named Brenner Heights Creek. This sampling location has repeatedly posed problems with bacteria sampling results, giving zero E. coli colonies in each of the last three sampling seasons, while upstream portions of the creek yield high levels of E. coli. As a result, residual chloride measurements were taken this season, and the downstream results were significantly higher than those from the upstream portions of the creek. There is a large influx of chlorine into the creek somewhere between BH1 and the next upstream sample, labeled 1005 or BH2. That is the only significant difference among the three sampling sites on this creek.

During the first analysis on July 31, 2009, there was a dramatic drop in the last two internal standards for the Brenner Heights samples run with dual stir bars. However, these samples were run during the period of high internal standard concentrations, which resulted in variability in the last two internal standard areas for all samples. On September 17, 2009, Brenner Heights was re-sampled, and residual chloride levels were also measured. These samples were run several times over the course of the next week with various modifiers. The results show that while the methanol modifier does not seem to be affected, the samples that used the salt and/or acid modifiers gave poor precision for the last two internal standards from Brenner Heights. It appears that changing the ionic strength of the sample with either acid or salt caused significant matrix effects, especially toward the latter part of the chromatogram, and that residual chloride in surface water samples interferes with the analyses, especially when salt or acid is used as a modifier. This phenomenon should be further investigated.

IV. Cost/Benefit Analysis

During this project, EPA Region 7 tested and validated the extraction method without purchasing the needed equipment. The basic Gerstel equipment, which was loaned to us for 90 days, lists for $55,688; the Gerstel equipment with an autosampler lists for approximately $75,000. We purchased some liners and stir bars for approximately $1,000. EPA evaluated the realistic savings in time, materials, and manpower during this project. With an average load of 140 water samples per year for semi-volatile organic analysis, using the Gerstel Twister™ with an autosampler we would be able to:
• Reduce dichloromethane usage by 6-13%, or by $700-$1,400;
• Reduce staff time per sample and effectively increase sample capacity by 25%;
• Reduce glassware costs for sample containers from $890 to $128; and
• Reduce sample shipping costs by 75% or more.

As other analytical methods are tested, additional cost savings may be realized. The sample data would have to be evaluated in a new way. For example, a water sample could be analyzed for most semi-volatile compounds, pesticides, and PAHs within a single analysis instead of 3-4 separate extractions and analyses. The few water-soluble compounds that do not perform well with the Twister™ would have to be analyzed by a different technique, such as liquid chromatography/tandem mass spectrometry (LC/MS/MS).
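A back-of-the-envelope sketch of the annual materials savings implied by the figures above is shown below, assuming the 140-sample-per-year semi-volatile load stated in this section and the per-sample solvent volumes given in Table 13; the dichloromethane price is an illustrative assumption, since the report gives only the resulting dollar range.

```python
# Back-of-the-envelope sketch of the annual savings cited above, assuming the
# 140-sample/year semi-volatile load stated in this section. The solvent figures
# (300 mL -> 1 mL per sample) come from Table 13; the dichloromethane price is
# an illustrative assumption, not a value given in the report.
SAMPLES_PER_YEAR = 140

DCM_PER_SAMPLE_TRADITIONAL_ML = 300
DCM_PER_SAMPLE_TWISTER_ML = 1
DCM_PRICE_PER_L = 25.0            # assumed price, USD/L (illustrative only)

GLASSWARE_TRADITIONAL = 890.0     # 140 one-gallon jars (from the bullets above)
GLASSWARE_TWISTER = 128.0         # 140 VOA vials

dcm_saved_L = SAMPLES_PER_YEAR * (DCM_PER_SAMPLE_TRADITIONAL_ML
                                  - DCM_PER_SAMPLE_TWISTER_ML) / 1000.0
print(f"Dichloromethane avoided: {dcm_saved_L:.1f} L/year "
      f"(~${dcm_saved_L * DCM_PRICE_PER_L:,.0f} at the assumed price)")
print(f"Glassware savings: ${GLASSWARE_TRADITIONAL - GLASSWARE_TWISTER:,.0f}/year")
print(f"Effective capacity with a 25% increase: {SAMPLES_PER_YEAR * 1.25:.0f} samples/year")
```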
Specific examples with immediate impact include:

1. The UAA stream samples are collected during the summer, typically 40-60 samples spread out over several months. Without additional funding, all sample preparation and analysis must be performed by EPA staff only. The maximum number of samples that can be collected in any given two-week period is 15, not because of the collection process but because of the extraction process in the laboratory. Samples are analyzed for SVOCs, PAHs, and pesticides in both water and soil. With extraction holding times of 7 days for water and 14 days for soil, the first two weeks after sample collection are spent extracting, cleaning, and concentrating extracts prior to analysis. With this type of schedule, the samples for this project are collected over a minimum of two months, with about 60 days to analyze and report the data after receipt of the last sample. This project takes three full-time staff and one part-time technician about 4 months to complete the work. By performing the Gerstel analysis on the water samples for this one project, one chemist could extract and analyze the water samples, potentially for all analytes, on a weekly basis. This would leave the remaining two chemists to perform extractions on the solid samples only, prior to analysis. Samples could be collected on a weekly basis (15/week), shortening the analysis and reporting time to 2.5-3 months. This is a staff-time savings of three FTEs for one month, or more.

2. A second stream study, which looks at water quality issues, occurs every summer. In 2009, it was the Kansas tributaries project. This project collects 3-4 samples per week until 35-50 samples are collected. Because these samples trickle in, sample sets are very small, with 50% or more of the analyses being QC data rather than field sample data. Using the traditional method, approximately 200 staff hours were spent extracting and analyzing samples for this project. Our projection of staff time using the Gerstel system is only 80 hours, mainly the time to analyze the samples.

Table 13. Summary of Cost Benefits of the Twister™ Process
Process | Per Sample Savings | Annual Savings (a) | Comments
Solvent reduction | 300 mL -> 1 mL | (S) 10 gal; (A) 32 gal | Hazardous waste disposal
Sample transport costs | 1-3 L -> 40 mL | (A) 109 gal -> 4 gal |
Staff exposure to solvents | Almost no exposure | |
Staff time | 3 days -> minutes per batch of samples | (A) 65 days -> 4 days | Sample turn-around and scheduling; potential to increase load with the same number of staff by 40% or more
Glassware | (S) 140 one-gallon jars ($890) -> 140 VOA vials ($128) | |
Twisters | Replace as needed; estimated 50 or more uses within the life of a stir bar | |
(a) Annual savings is based on either (S) 140 samples per year for SVOCs or (A) 375 samples per year for pesticides, PAHs, and SVOCs. These numbers include the required quality control samples, assuming a maximum batch size of 20 samples.

V. Conclusions

We were able to complete all the tasks we had planned and answer the questions listed in Section II. The results indicate that some compounds can be extracted from water samples no matter how the sample is modified, while other compounds are very particular and succeed with only one type of modifier. Essentially all 18 PAHs met the EPA Method 625 and CARB Method 3230.2 performance criteria, and some of the remaining semi-volatile compounds met or nearly met the method criteria. There were 12 semi-volatile compounds that were not amenable to the Twister™ extraction, resulting in a potential of 54 semi-volatile compounds meeting method performance criteria. These compounds generally were water soluble and either acidic or basic. We also had time to test pesticides used for UAA evaluations, pharmaceutical and personal care products, and brominated flame retardants. Preliminary data for the UAA pesticides indicate that at least 14 of the 18 pesticides can meet the Method 608 performance criteria; the remaining pesticides need to be evaluated. Preliminary data for the pharmaceutical and personal care products gave very mixed results, with many compounds not meeting default method criteria (calibration curve, extraction recovery). Review of these compounds indicates that they have a low log Ko/w; they may benefit from another analytical technique, such as LC/MS. Preliminary data for the brominated flame retardants show that these compounds are amenable to the Twister™ technology. In fact, they tend to perform better at lower concentrations (ppt).
Some of the overarching accomplishments and findings include:
• Neutral compounds that are insoluble in water can be extracted with Twisters™.
• Method detection limits were a factor of 10 lower for all compound classes compared with the liquid/liquid extraction technique.
• Many classes of compounds can be analyzed simultaneously in the same sample.
• All but three pesticides from the UAA list met method performance criteria.
• The process was fast and efficient, providing rapid turn-around for screening unknown aqueous samples.
• The process eliminated hazardous solvents and tedious bench work, reducing staff time and strain.
• The process may allow us to meet the risk-level concentrations for PAHs for Superfund projects.
• A draft standard operating procedure documented the optimum conditions for the Twister™ technique.

Alternate Test Procedure for the UAA list for stream studies

Most of the UAA list pesticides that were attempted were successful in either a methanol- or salt-modified extraction. The MDLs rivaled those published in CARB Methods 3240.2 and 3250.4, which use GC/ECD. The Twister™ technique gave data that met the performance criteria of these two methods, although a formal demonstration of proficiency has not been completed. One change for future work may be finding pesticide standards in an alternate solvent that causes less interference than iso-octane. Based on the preliminary success of the pesticide analysis, the full pesticide compound list may be evaluated, and the successful compounds will be submitted for an Alternate Test Procedure.

Alternate Test Procedure for PAHs

Enough data were produced to show equivalency to EPA Method 625 for the PAH target compounds. The method detection limits are between 5 and 25 times lower than those listed in CARB Method 3230.2. The Twister™ technique was shown to be accurate and reproducible enough to meet the method performance requirements of both EPA Method 625 and CARB Method 3230.2. We will be submitting this report with the raw data to management and the Region 7 Regional Administrator for review and approval as an Alternate Test Procedure for PAHs. Table 14 summarizes the compounds that meet or nearly meet the requirements for the ATP. Table 14.
Summary of Proposed Compounds for ATP Compound Group (a) Modifier (b) MDL (ug/L) (c) Comment (d) bis(2-Chloroethyl)ether svoc Salt 0.23 1,3 -Di chl orob enzene svoc Methanol 0.14 2-Chlorophenol svoc Acid Need MDL 1,4-Di chl orob enzene svoc Methanol 0.15 1,2-Di chl orob enzene svoc Methanol 0.18 bi s(2-Chl oroi sopropyl)ether svoc Methanol Need MDL Hexachl oroethane svoc Methanol 0.10 Nitrobenzene svoc Methanol 0.50 Isophorone svoc Salt 0.10 2,4-Dimethylphenol svoc Salt 0.24 bis(2-Chloroethoxy)m ethane svoc Methanol 0.53 2,4-Dichlorophenol svoc Acid Need MDL 1,2,4-Trichlorobenzene svoc Methanol 0.14 Naphthalene PAH Methanol 0.07 Hexachl orobutadi ene svoc Salt 0.19 Need IDP 4-Chloro-3-methylphenol svoc Methanol Need MDL 2-Methylnaphthalene PAH Methanol 0.07 Hexachl orocy cl opentadi ene svoc Methanol 0.06 2,4,6-Trichlorophenol svoc Acid 0.36 2,4,5-Trichlorophenol svoc Acid 2-Chl oronaphthal ene PAH Methanol 0.06 2-Nitroaniline svoc Salt Acenaphthylene PAH Methanol 0.06 Dimethylphthalate SVOC Methanol 0.72 Need IDP 2,6-Dinitrotoluene svoc Methanol 0.19 Acenaphthene PAH Methanol 0.04 Dibenzofuran PAH Methanol 0.08 2,4-Dinitrotoluene SVOC Methanol 0.51 Fluorene PAH Methanol 0.04 Diethylphthalate SVOC Salt 0.22 Need IDP 4-Chlorophenyl-phenyl ether SVOC Methanol 0.07 4,6-Dinitro-2-methylphenol SVOC Salt 0.21 N-nitrosodiphenylamine SVOC Methanol 0.15 4-Bromophenyl-phenyl ether SVOC Methanol 0.08 Hexachl orob enzene svoc Methanol 0.13 Pentachlorophenol svoc Acid 0.25 Phenanthrene PAH Methanol 0.04 Anthracene PAH Methanol 0.04 Carbazole PAH Methanol 0.36 Di-n-butylphthalate SVOC Methanol 0.09 Fluoranthene PAH Methanol 0.05 D-32 ------- Table 14 (continued). Summary of Proposed Compounds for ATP Compound Group (a) Modifier (b) MDL (ug/L) (c) Comment (d) Pyrene PAH Methanol 0.05 Butylbenzylphthalate SVOC Methanol 0.06 Benzo(a)anthracene PAH Methanol 0.06 Chrysene PAH Methanol 0.05 3,3'-Dichlorobenzidine SVOC Salt 0.52 Need MS bi s(2-Ethylhexyl)phthalate SVOC Salt 0.79 Di-n-octylphthalate SVOC Methanol 0.29 Benzo(b)fluoranthene PAH Methanol 0.05 Benzo(k)fluoranthene PAH Methanol 0.05 Benzo(a)pyrene PAH Methanol 0.07 Indeno( 1,2,3 -cd)pyrene PAH Methanol 0.28 Dib enz(a, h)anthracene PAH Methanol 0.25 B enzo(g,h,i)peryl ene PAH Methanol 0.20 (a) SVOC = Semi-volatile compound, PAH = Polyaromatic Hydrocarbon (b) Sample modifier that provides the best data: Methanol, Salt, or Acid (c) Method Detection Limit (MDL) obtained, using the lowest observed during the study (d) If a comment is provided, this compound could be added to the final ATP. Generally, these compounds need to have the optimum calibration range and spiking amount in order to get acceptable data. IDP = Initial Demonstration of Proficiency, MS = Matrix spike. For a detailed discussion of the data for the ATP and the Standard Operating Procedure, see Attachments H and I. VI Future Work/Development The Twister™ extraction and thermal desorption analysis can likely be used for many more compounds than those that we tried during the trial period. Future work could fall into three categories: refine what we have accomplished, develop a screening method, and pushing the detection limits. The three areas that need more work are the 12 semi-volatile compounds that should be successful, pesticide analysis, and the brominated flame retardants. The remaining semi-volatiles that met three of four criteria need some additional work where different concentrations are used, generally lower concentrations. 
Pesticides that are neutral and relatively water insoluble should perform well. We could investigate the full EPA Method 608 pesticide list and develop an ATP for pesticides. We should keep in mind that the typical standard solvent, iso-octane, would need to be substituted with another solvent that is amenable to the Twister™. Brominated flame retardants include many more congeners that could be tested and evaluated; the full list would need to be identified prior to starting any work in this area.

Occasionally, we have emergencies where answers to sample data are needed as fast as possible. The traditional method, working 24 hours a day, can provide analytical results in 3 days. Using the Twister™ technique and an autosampler, answers for up to a dozen samples could be provided in 24 hours while working 10-12 hour days. If screening data are all that is required, the rigorous QC procedures we adhered to need not be followed, and a quick estimate can be provided to customers within hours. This technique could also be used to screen samples prior to full extraction using the traditional techniques; the data from the screened samples could provide an immediate determination, while the data from the traditional method would provide the compliance data for court litigation.

If low detection limits are needed, the sample volume and extraction time can be increased to improve by orders of magnitude on the MDLs we obtained from a mere 5 mL sample. Region 7 has been asked by our Superfund customers in the past to do method development to achieve much lower detection limits for the PAH compounds. Their request is based on the Region 9 preliminary remediation goals (PRGs), which are based on human risk assessments. The goals for the PAH compounds are summarized in Table 15. The lowest goals are for benzo(a)pyrene and dibenz(a,h)anthracene, at 0.0092 ug/L in water. During our study, we were able to show MDLs for these compounds of 0.07 and 0.25 ug/L, respectively, using a 5 mL sample volume. It is reasonable to think that the Region 9 PRGs can be nearly met by increasing the sample volume and extraction time.

Table 15. Summary of PRGs and MDLs for PAHs
Compound    Region 9 PRG (ug/L)    Traditional technique MDL (ug/L) (a)    Twister™ MDL (ug/L, 5 mL sample)
Acenaphthene 370 1.0 0.04
Anthracene 1800 1.0 0.04
Benzo(a)anthracene 0.092 0.93 0.06
Benzo(b)fluoranthene 0.092 0.82 0.05
Benzo(k)fluoranthene 0.92 0.93 0.05
Benzo(a)pyrene 0.0092 0.84 0.07
Chrysene 0.56 0.99 0.05
Dibenz(a,h)anthracene 0.0092 0.76 0.25
Fluoranthene 1500 0.98 0.05
Fluorene 240 1.0 0.04
Indeno(1,2,3-cd)pyrene 0.092 1.0 0.28
Naphthalene 0.093 1.0 0.07
Pyrene 180 0.96 0.05
(a) CARB Method 3230.2

VII. Acknowledgement and Disclaimer

We would like to thank Gerstel and its employees, specifically Ed Pfannkoch, Tim Pence, and Terry Stevens, for the opportunity to evaluate the equipment for 3 months. Ed Pfannkoch was very helpful in troubleshooting and method development, spending countless hours on the phone assisting us with instrument conditions. Tim Pence was a life-saver when we had a TDU malfunction, because he quickly provided us with a spare TDU and 506C controller the next day. Terry Stevens was crucial in making the loan agreement a reality and a success.

We would like to acknowledge the support of EPA management in the Environmental Services Division, who allowed us to pursue this project, and the support of the Organic Chemistry Section, who helped by sharing more of the sample load so that the project team was free to carry out this project.
Special note of appreciation goes to Karen Johnson for helping with the traditional semi-volatile analysis, assisting with the Gerstel analysis, and reviewing the final report. Special note of appreciation goes to Jenn Boggess who helped with glassware washing so that the chemists could perform an inordinately large number of extractions in the normal time frame The views of the scientists who worked on this project are not necessarily the views of the Environmental Protection Agency. Mention of specific products does not constitute an endorsement of the product by the United States Environmental Protection Agency. It should be noted that the Gerstel Twister™ is patented technology, and is sold by a single company. At this time there is no other product of this type available. VIII. References [1] Prieto, A; Zuloaga, O.; Usobiaga, A.; Etxebarria, N.; Fernandez, L.A.: "Development of a stir bar sorptive extraction and thermal desorption-gas chromatography-mass spectrometry method for the simultaneous determination of several persistent organic pollutants in water samples.", Journal of Chromatography A, 1174 (2007) 40 - 49. [2] Heiden, A.; Hoffmann, A.; Kolahgar, B.: "Application of stir bar sorptive extraction to the determination of polycyclic aromatic hydrocarbons in aqueous samples.", Journal of Chromatography A, 963 (2002) 225-230. [3] Perez-Carrera, E.; Leon Leon, V.M.; Gomez Parra, A.; Gonzalez-Mazo, E.: "Simultaneous determination of pesticides, polycyclic aromatic hydrocarbons and polychlorinated biphenyls in seawater and interstitial marine water samples, using stir bar sorptive extraction-thermal desorption-gas chromatography-mass spectrometry.", Journal of Chromatography A, 1170 (2007) 82 - 90. [4] Kansas Urban Streams, http//:kcwaters.org. D-35 ------- Appendix E EPA Offices Method Validation References E-l ------- Office/ Organization Document or Reference URL Category Guidance Type Statute/ Application Office of Air and Radiation (OAR) Office of Radiation and Indoor Air, National Air and Radiation Environmental Laboratory Method Validation Guide for Qualifying Methods Used by Radiological Laboratories Participating in Incident Response Activities June 2009 httpsi//www, epa.gov/site s/production/files/2015- 05/documents/method v alidation final with web cover 6~24~03.pdf Radiochemical Method validation and peer review policies and guidelines Methods for processing samples during a response to a radiochemical incident, including radiochemical incidents of national significance. Office of Radiation and Indoor Air, National Air and Radiation Environmental Laboratory Radiological Laboratory Sample Analysis Guide for Incident Response - Radionuclides in Soil EPA 402-R-12-006 September 2012 httpsi//www, epa.gov/site s/production/files/2015- 05/documents/402-r-12- 006 soil guide sept 201 2.pdf Radiochemical General method development guidelines Methods for processing samples during a response to a radiochemical incident, including radiochemical incidents of national significance. Office of Radiation and Indoor Air, National Air and Radiation Environmental Laboratory Radiological Laboratory Sample Analysis Guide for Incident Response - Radionuclides in Water EPA 402-R-07-007 January 2008 https://nepis.epa.gov/Exe /ZyPDF.cgi/60000LAW.PD F?Dockev=60000LAW.PD F Radiochemical General method development guidance Methods for processing samples during a response to a radiochemical incident, including radiochemical incidents of national significance. 
Office of Radiation and Indoor Air, National Air and Radiation Environmental Laboratory Radiological Laboratory Sample Analysis Guide for In ciden ts of Nation al Significance - Radionuclides in Air EPA 402-R-09-007 June 2009 https://cfpub.epa.gov/si/ si public record report.c fm?Lab=ORIA&dirEntryld =2.31510 Radiochemical General method development guidance Methods for processing samples during a response to a radiochemical incident, including radiochemical incidents of national significance. Office of Research and Development (ORD), Office of the Science Advisor (OSA), Forum on Environmental Measurement (FEM) - OAR Method Detection and Quantitation - Program Use and Needs October 2010 https://www.epa.gov/site s/production/fiies/2016- 11/documents/mth det auant-guide-ref- revision nov2016.pdf Chemical Method quantitation Common practices for determining method detection and quantitation E-2 ------- Office/ Organization Document or Reference URL Category Guidance Type Statute/ Application ORD, OSA, FEM - OAR Calibration Curves - Program Use and Needs October 2010 httpsi//www, epa.gov/site s/production/fiIes/2014- 05/documents/calibratio n-guide-ref-final- oct2010.pdf Chemical Method quantitation Common practices for instrument calibration EPA (General) - OAR Detection Limit and Quantitation Limit Summary Table November 2016 httpsi//www, epa.gov/site s/production/fiIes/2016- 11/documents/mdlmql- toolbox- final nov2016 0.pdf Chemical Method quantitation Common practices for determining detection limits OAR; Office of Air Quality, Planning and Standards (OAQPS); Ambient Air Regulatory Program Ambient Air Monitoring Reference and Equivalent Methods, 40 CFR Part 53 July 2002 httpsi //www .ecfr.gov/cgi -bin/text- idx?SID=6711885e63e97d cf6c9c2c94b0237ab9&mc =true&node=pt40,6,S3&r gn=div5#sp40.6.53.c Chemical Method validation and peer review policies and guidelines Procedural requirements for demonstrating reference and equivalent methods for ambient air OAR, OAQPS, Air Quality Assessment Division, Measurement Technology Group Method 301 - Field Validation of Pollutant Measurement Methods from Various Waste Media March 2018 https://www.epa.gov/site s/production/files/2018- 03/documents/method 3 01 3-26-2018 l.pdf Chemical Method validation and peer review policies and guidelines Procedures for validating an alternative candidate test method for an affected source Office of Chemical Safety and Pollution Prevention (OCSPP) Office of Research and Development (ORD), Office of the Science Advisor (OSA), Forum on Environmental Measurement (FEM) - OCSPP Method Detection and Quantitation - Program Use and Needs October 2010 https://www.epa.gov/site s/production/files/2016- 11/documents/mth det quant-guide-ref- revision nov2016.pdf Chemical Method quantitation Common practices for determining method detection and quantitation ORD, OSA, FEM - OCSPP Calibration Curves - Program Use and Needs October 2010 https://www.epa.gov/site s/production/files/2014- 05/documents/calibratio n-guide-ref-final- oct2010.pdf Chemical Method quantitation Common practices for instrument calibration E-3 ------- Office/ Organization Document or Reference URL Category Guidance Type Statute/ Application EPA (General) - OCSPP Detection Limit and Quantitation Limit Summary Table November 2016 httpsi//www, epa.gov/site Chemical Method quantitation Common practices for determining detection limits s/production/files/2016- 11/documents/mdlmql- toolbox- Office of Enforcement and Compliance Assurance (OECA) Office of Research and 
Development (ORD), Office of the Science Advisor (OSA), Forum on Environmental Measurement (FEM) - Office of Enforcement and Compliance Assurance (OECA) Method Detection and Quantitation - Program Use and Needs October 2010 httpsi//www, epa.gov/site s/production/files/2016- 11/documents/mth det quant-guide-ref- revision nov2016.pdf Chemical Method quantitation Common practices for determining method detection and quantitation ORD, OSA, FEM - OECA Calibration Curves - Program Use and Needs October 2010 https://www.epa.gov/site s/production/files/2014- 05/documents/calibratio n-guide-ref-final- oct2010.pdf Chemical Method quantitation Common practices for instrument calibration EPA (General), OCSPP Detection Limit and Quantitation Limit Summary Table November 2016 https://www.epa.gov/site s/production/fiies/2016- 11/documents/mdlmql- toolbox- Dffice of Land and Emergenc Chemical y Management (( Method quantitation DLEM) Common practices for determining detection limits Office of Resource Conservation and Recovery (ORCR) Guidance for Methods Development and Methods Validation for the Resource Conservation and Recovery Act (RCRA) Program https://www.epa.gov/site s/production/fiies/2015- 10/documents/methdev, pdf Chemical General method development guidance SW- 846 Methods E-4 ------- Office/ Organization Document or Reference URL Category Guidance Type Statute/ Application Office of Research and Development (ORD), Office of the Science Advisor (OSA), Forum on Environmental Measurement (FEM) - OLEM Method Detection and Quantitation - Program Use and Needs October 2010 httpsi//www, epa.gov/site s/production/fiIes/2016- 11/documents/mth det quant-guide-ref- revision nov2016.pdf Chemical Method quantitation Common practices for determining method detection and quantitation ORD, OSA, FEM - OLEM Calibration Curves - Program Use and Needs October 2010 httpsi//www, epa.gov/site s/production/files/2014- 05/documents/calibratio n-guide-ref-final- oct2010.pdf Chemical Method quantitation Common practices for instrument calibration EPA (General)-OLEM Detection Limit and Quantitation Limit Summary Table November 2016 https://www.epa.gov/site s/production/files/2016- 11/documents/mdlmql- toolbox- final nov2016 O.pdf Chemical Method quantitation Common practices for determining detection limits Office of Research and Development (ORD) Office of the Science Advisor (OSA), Forum on Environmental Measurements (FEM) Flexible Approaches to Environmental Measurement 2008, 2015 https://www.epa.gov/site s/production/files/2014- OS/documents/faem- webinar-presentation- notes.pdf; https://www.epa.gov/me asurements- modeling/flexible- approaches- environmental- measurement Chemical, Biological, Radiochemical General method development guidance Flexible approaches to measurements and methods E-5 ------- Office/ Organization Document or Reference URL Category Guidance Type Statute/ Application National Homeland Security Research Center (NHSRC) Magnuson, M., Campisano, R., Griggs, J., Fitz-James, S., Hall, K., Mapp, L., Mullins, M., Nichols, T., Shah, S., Silvestri, E. and Smith, T., 2014. Analysis of environmental contamination resulting from catastrophic incidents: Part 2. Building laboratory capability by selecting and developing analytical methodologies. Environment international, 72, pp.90-97. 
httpsi//www,sciencedirec t.com/science/article/pii/ S0160412014000263 Chemical and biological General method development guidance Method Development development process for chemical, biologicals, and biotoxins Office of the Science Advisor (OSA), Forum on Environmental Measurement (FEM) Method Detection and Quantitation - Program Use and Needs October 2010 https://www.epa.gov/site s/production/files/2016- 11/documents/mth det quant-guide-ref- revision nov2016.pdf Chemical Method quantitation Common practices for determining method detection and quantitation OSA, FEM Calibration Curves - Program Use and Needs October 2010 https://www.epa.gov/site s/production/files/2014- 05/documents/calibratio Chemical Method quantitation Common practices for instrument calibration n-guide-ref-final- oct2010.pdf EPA (General) - ORD Detection Limit and Quantitation Limit Summary Table November 2016 https://www.epa.gov/site s/production/files/2016- 11/documents/mdlmql- toolbox- final nov2016 O.pdf Chemical Method quantitation Common practices for determining detection limits OSA, FEM Validation and Peer Review of U.S. Environmental Protection Agency Chemical Methods https://www.epa.gov/site s/production/files/2015- 01/documents/chemmet hod validity guide.pdf Chemical Method validation and peer review policies and guidelines Chemical methods of analysis E-6 ------- Office/ Organization Document or Reference URL Category Guidance Type Statute/ Application of Analysis October 2005 OSA, FEM Validation and Peer Review of U.S. Environmental Protection Agency Radiochemical Methods of Analysis September 2006 httpsi//www, epa.gov/site Radiochemical Method validation and peer review policies and guidelines Radiochemical methods of analysis s/production/files/2014- 05/documents/radioche mmethod validity guide, pdf OSA, FEM Emergency Response Methods Validation Policy July 2010, 2016 httpsi//www, epa.gov/site s/production/files/2016- 11/documents/emergenc y response validity polic y reaffirmed nov2016,pd f Chemical, Biological, Radiochemical Method validation and peer review policies and guidelines All environmental methods of analysis (chemical, radiochemical, microbiological) developed for emergency response situation (e.g., natural disaster, homeland security) National Exposure Research Laboratory, Human Exposure and Atmospheric Sciences Division Guidelines for FRM and FEM Applicants September 2011 https://www3.epa,gov/tt n/amtic/files/ambient/cri teria/frmfemguidelines.p df Chemical General method development guidance NAAQS (Clean Air Act)- Manual Reference Methods National Homeland Security Research Center (NHSRC) Risk-Based Criteria to Support Validation of Detection Methods for Drinking Water and Air October 2008 httpsi//cfpub, epa.gov/si/ si public file download.c fm?p download id =4982 47&La b=NHSRC Chemical, Biological, Radiochemical Method validation and peer review policies and guidelines Validation of analytical methods for threat contaminants under the U.S. EPA NHSRC program. 
E-7 ------- Office/ Organization Document or Reference URL Category Guidance Type Statute/ Application Air Pollution Prevention and Control Division, National Risk Management Research Laboratory Interlaboratory Validation of the Leaching Environmental Assessment Framework (LEAF) Method 1314 and Method 1315 September 2012 httpsi//nepis, epa.gov/Ad obe/PDF/PlOOFAFC.pdf Chemical, Radiochemical Method validation and peer review policies and guidelines Validation report to summarize a validation study that was performed on Methods 1314 and 1315 (for EPA to review and approve for the purpose of inclusion into EPA's Test Methods for Evaluating Solid Waste, Physical/Chemical Methods [SW-846 Methods]) Office of Water (OW) Office of Ground Water and Drinking Water Methods Development for Unregulated Contaminants in Drinking Water: Public Meeting and Webinar June 2018 httpsi//www, epa.gov/site s/production/files/2018- 07/documents/method- development- unregulated- contaminants-drinking- water-meeting-materials- june2018.pdf Chemical General methods development guidance Safe Drinking Water Act 500 series Office of Science and Technology (OST), Engineering and Analysis Division (EAD), Engineering and Analytical Support Branch Quality Assurance and Quality Control Requirements in Methods Not Published by EPA May 2009 https://www.epa.gov/site s/production/files/2015- 10/documents/qa-qc-in- methods memo 05-07- 2009.pdf Chemical Method quantitation 40 CFR Part 136 Methods OST, Engineering and Analytical Support Branch/ EAD (4303T), Clean Water Act (CWA) Methods Team Definition and Procedure for the Determination of the Method Detection Limit, Revision 2 December 2016 https://www.epa.gov/site s/production/files/2016- 12/documents/mdl- procedure rev2 12-13- 2016.pdf Chemical Method quantitation Revision 2 of MDL procedure from 40 CFR Part 136 Appendix B E-8 ------- Office/ Organization Document or Reference URL Category Guidance Type Statute/ Application EAD Protocol for Review and Validation of New Methods for Regulated Organic and Inorganic Analytes in Wastewater under EPA's Alternate Test Procedure Program February 2018 httpsi//www, epa.gov/site s/production/files/2018- 03/documents/chemical- Chemical General methods development guidance New methods for organic and inorganic analytes used in CWA programs (specifically, while operating within the Alternate Test Procedure Program) atp-protocol feb- 2018.pdf Office of Research and Development (ORD), Office of the Science Advisor (OSA), Forum on Environmental Measurement (FEM) - OW Method Detection and Quantitation - Program Use and Needs October 2010 httpsi//www, epa.gov/site s/production/files/2016- 11/documents/mth det quant-guide-ref- revision nov2016.pdf Chemical Method quantitation Common practices for determining method detection and quantitation ORD, OSA, FEM - OW Calibration Curves - Program Use and Needs October 2010 https://www.epa.gov/site s/production/files/2014- 05/documents/calibratio n-guide-ref-final- oct2010.pdf Chemical Method quantitation Common practices for instrument calibration EPA (General)-OW Detection Limit and Quantitation Limit Summary Table November 2016 https://www.epa.gov/site s/production/files/2016- 11/documents/mdlmql- toolbox- final nov2016 O.pdf Chemical Method quantitation Common practices for determining detection limits EPA (General)-OW Protocol for the Evaluation of Alternate Test Procedures for Analyzing Radioactive Contaminants in Drinking Water February 2015 http://nepis.epa.gov/Exe/ ZvPDF.cgi?Dockev=P100 MESN.txt 
Radiochemical General method development guidance; Method validation and peer review policies and guidelines New drinking water methods for Safe Drinking Water Act compliance monitoring submitted to the Drinking Water Alternate Test Procedure Program E-9 ------- Office/ Organization Document or Reference URL Category Guidance Type Statute/ Application EPA (General)-OW Protocol for the Evaluation of Alternate Test Procedures for Organic and Inorganic Analytes in Drinking Water February 2015 https://nepis.epa.gov/Exe /ZyPDF,cgi?Dockey=P100 MERX.txt Chemical General method development guidance; Method validation and peer review policies and guidelines New drinking water methods for Safe Drinking Water Act compliance monitoring submitted to the Drinking Water Alternate Test Procedure Program EPA General EPA (General) Performance Based Measurement System, 62 Fed. Reg. 193 (October 6, 1997) October 1997 httpsi//archive,epa,gov/e pawaste/hazard/web/pdf Z97-26443.pdf Chemical, Biological, Radiochemical General method development guidance Physical, Chemical, and Biological measurements. (Performance Based Measurement System Approach does not apply when a specific method is prescribed in a regulation itself) Office of Research and Development (ORD), Office of the Science Advisor (OSA), Forum on Environmental Measurement (FEM) - Office of Cooperative Environmental Management (OCEM) Method Detection and Quantitation - Program Use and Needs October 2010 httpsi//www, epa.gov/site s/production/files/2016- 11/documents/mth det quant-guide-ref- revision nov2016.pdf General Method quantitation Common practices for determining method detection and quantitation Federal Advisory Committee on Detection and Quantitation Approaches and Uses in Clean Water Act Programs (FACDQ) Report of the Federal Advisory Committee on Detection and Quantitation Approaches and Uses in Clean Water Act Programs December 2007 httpsi//www, epa.gov/site s/production/files/2015- 10/documents/detection- quant-faca final- report 2012.pdf Chemical Method quantitation CWA Methods E-10 ------- Office/ Organization Document or Reference URL Category Guidance Type Statute/ Application EPA (This document was developed in conjunction with multiple federal agencies) Multi-Agency Radiological Laboratory Analytical Protocols Manual (MARLAP) July 2004 httpsi//www, epa.gov/rad Radiochemical General method development guidance; method quantitation; method validation and peer review policies and guidelines Performance-based approach to radioanalytical methods iation/marlap-manual- and-supporting- documents Forum of Environmental Management (FEM) Validation and Peer Review of U.S. EPA Sampling Methods for Chemical and Radiochemical Parameters February 2016 https://www.epa, gow/site s/production/files/2016- 02/documents/radioche m method guide revise Chemical, Radiochemical Method validation and peer review policies and guidelines Validation of new sampling methods for chemical and radiochemical parameters before publication for general use d 020316 00000002.t>df Forum of Environmental Management (FEM) Validation and Peer Review of U.S. 
Environmental Protection Agency Chemical Methods of Analysis February 2016 https://www.epa.gov/site s/production/files/2016- 02/documents/chemical method guide revised 0 Chemical Method validation and peer review policies and guidelines Validation of new chemical methods of analysis before publication for general use 20316.pdf E-ll ------- Appendix F Non-EPA Method Validation References F-l ------- Office/Organization Information Source or Reference URL Category Guidance Type Statute/Application Non-EPA Federal Agencies US Department of Health and Human Services (HHS), Centers for Disease Control and Prevention (CDC) Centers for Disease Control and Prevention (CDC), National Institute for Occupational Safety and Health (NIOSH) NIOSH Manual of Analytical Methods - Chapter E - Development and Evaluation of Methods April 2016 https://www.cdc.g ov/niosh/nmam/p Chemical General method development guidelines and method validation and peer review policies and guidelines Generalized set of evaluation criteria prepared by NIOSH researchers for the evaluation of sampling and analytical methodology dfs/NMAM 5thEd EBook.pdf CDC, NIOSH Guidelines for Air Sampling and Analytical Method Development and Evaluation May 1995 https://www.cdc.g ov/niosh/docs/95- liy/defaulthtml Chemical General method development guidelines and method validation and peer review policies and guidelines Sampling and analytical methods for workplace compliance determinations CDC, NIOSH Development and Validation of Methods for Sampling and Analysis of Workplace Toxic Substances September 1980 https://www.cdc.g ov/niosh/docs/80- 133/pdfs/80~ 133.pdf Chemical General method development guidelines and method validation and peer review policies and guidelines Sampling and analytical methods of workplace toxic substances National Institutes of Health (NIH) National Institutes of Health (NIH), The National Institute of Environmental Health Sciences (NIEHS), National Toxicology Program (NTP) Validation and Regulatory Acceptance ofToxicological Test Methods - A Report of the ad hoc Interagency Coordinating March 1997 httpsi//ntp,niehs,n ili.gov/iccvam/docs /about docs/valid ate.pdf Chemical Method validation and peer review policies and guidelines Criteria and processes for validation and regulatory acceptance of toxicological testing methods United States Department of Commerce, National Institute of Standards and Technology (NIST) F-2 ------- Office/Organization National Institute of Standards and Technology (NIST) Information Source or Reference Sander, L.C., 2017. Liquid Chromatography: Introduction to Method Development. Journal of Research of the National Institute of Standards and Technology, 122. March 2017 URL https://nvlpubs.nis t.gov/nistpubs/ires /122/ires, 122,018, pdf Category Guidance Type Statute/Application Liquid chromatography and analytical instrumentation methods Chemical General method development guidelines NIST Procedure for Method Validation 2018 https://www.nist, g ov/system/files/do cuments/2018/01/ 12,/procedure-for- method-validation- 20180101.pdf Chemical Method validation and peer review policies and guidelines Laboratory methods United States Department of Labor, Occupational Safety and Health Administration (OSHA) Occupational Safety and Health Administration (OSHA) Validation Guidelines for Air Sampling Methods Utilizing Chromatographic Analysis May 2010 https://www.osha. 
Office/Organization: OSHA, Methods Development Team, Industrial Hygiene Chemistry Division
Information Source or Reference: Evaluation Guidelines for Air Sampling Methods Utilizing Spectroscopic Analysis, October 2010
URL: https://www.osha.gov/dts/sltc/methods/spectroguide/spectroguide.pdf
Category: Chemical
Guidance Type: Method validation and peer review policies and guidelines
Statute/Application: Air sampling methods utilizing spectroscopic analysis

Office/Organization: U.S. Department of Health and Human Services, FDA, Center for Drug Evaluation and Research (CDER), Center for Veterinary Medicine (CVM)
Information Source or Reference: Bioanalytical Method Validation - Guidance for Industry, May 2018
URL: https://www.fda.gov/files/drugs/published/Bioanalytical-Method-Validation-Guidance-for-Industry.pdf
Category: Biological and Chemical
Guidance Type: Method validation and peer review policies and guidelines
Statute/Application: Bioanalytical procedures

Office/Organization: US FDA, Office of Foods and Veterinary Medicine
Information Source or Reference: Guidelines for the Validation of Chemical Methods for the FDA FVM Program, 2nd Edition, April 2015
URL: https://www.fda.gov/media/81810/download
Category: Chemical
Guidance Type: Method validation and peer review policies and guidelines
Statute/Application: Chemical methods for foods and feeds

Office/Organization: US FDA, ORA
Information Source or Reference: Laboratory Procedure Methods, Method Verification and Validation - ORA-LAB.5.4.5, Revised 08-29-14
URL: https://www.fda.gov/media/73920/download
Category: Chemical and Biological
Guidance Type: Method validation and peer review policies and guidelines

United States Geological Survey (USGS)

Office/Organization: United States Geological Survey (USGS)
URL: https://minerals.usgs.gov/science/analytical-methods-development/index
Category: N/A
Guidance Type: General method development guidelines
Statute/Application: Macro analytical methods development

Non-EPA State Agencies

Office/Organization: Florida Department of Environmental Protection (FDEP)
Information Source or Reference: New and Alternative Laboratory Methods, DEP-QA-001/01, February 2004
URL: .../gateway/readRefFile.asp?refId=4359&filename=New%20and%20Alternative%20Laboratory%20Methods.pdf
Category: Chemical and Biological
Guidance Type: Method validation and peer review policies and guidelines
Statute/Application: Analytical laboratory methods

Office/Organization: FDEP
Information Source or Reference: Procedure and Policy for Demonstration of Capability for Methods, Instruments, and Laboratory Staff, November 2018
URL: https://fldeploc.dep.state.fl.us/sop/sopl.asp?sect=BUREAU
Category: Chemical and Biological
Guidance Type: Method validation and peer review policies and guidelines
Statute/Application: Analytical laboratory methods

Office/Organization: FDEP
Information Source or Reference: Quality Manual for State of Florida Environmental Protection Laboratory, January 2019
URL: http://publicfiles.dep.state.fl.us/dear/labs/lab_qualitymanual_19.pdf
Category: Chemical and Biological
Guidance Type: Method validation and peer review policies and guidelines
Statute/Application: Analytical laboratory methods

Office/Organization: California Department of Pesticide Regulation, Environmental Hazards Assessment Branch
Information Source or Reference: California Department of Pesticide Regulation - SOP - Chemistry Laboratory Quality Control, July 1995
URL: https://www.waterboards.ca.gov/water_issues/programs/tmdl/records/region_8/ref2442.pdf
Category: Chemical
Guidance Type: General method development guidelines and method validation and peer review policies and guidelines
Statute/Application: Analytical laboratory methods

Office/Organization: California Department of Pesticide Regulation, Environmental Monitoring Branch
Information Source or Reference: California Department of Pesticide Regulation - SOP - Guide for Analytical Method Development, 2017
URL: https://www.cdpr.ca.gov/docs/emon/pubs/sops/qaqc01200.pdf
Category: Chemical
Guidance Type: General method development guidelines and method validation and peer review policies and guidelines
Statute/Application: Quantification of pesticides in aqueous and/or sediment samples
Office/Organization: New Jersey Department of Health, Public Health and Environmental Laboratories
Information Source or Reference: Quality Manual, Environmental and Chemical Laboratory Services, July 2014
URL: https://www.nj.gov/health/phel/documents/ecls_qm_7-2014.pdf
Category: Chemical, Radiochemical
Guidance Type: Method validation and peer review policies and guidelines
Statute/Application: Analytical laboratory methods

Private Sector

ASTM International

Office/Organization: ASTM International
Information Source or Reference: ASTM E2857-11, Standard Guide for Validating Analytical Methods, Reapproved 2016
URL: N/A
Category: Chemical
Guidance Type: Method validation and peer review policies and guidelines
Statute/Application: Chemical and spectrochemical analytical methods for metals, ores, and related materials

Office/Organization: ASTM International
Information Source or Reference: ASTM E691-19e1, Standard Practice for Conducting an Interlaboratory Study to Determine the Precision of a Test Method, April 2020
URL: N/A
Category: Chemical
Guidance Type: Method validation and peer review policies and guidelines
Statute/Application: Interlaboratory study

Office/Organization: ASTM International
Information Source or Reference: ASTM E1169-18, Standard Practice for Conducting Ruggedness Tests, April 2018
URL: N/A
Category: Chemical, Biological, and Radiochemical
Guidance Type: Method validation and peer review policies and guidelines
Statute/Application: Ruggedness tests

Office/Organization: ASTM International
Information Source or Reference: ASTM E1601-19, Standard Practice for Conducting an Interlaboratory Study to Evaluate the Performance of an Analytical Method, November 2019
URL: N/A
Category: Chemical
Guidance Type: Method validation and peer review policies and guidelines
Statute/Application: Interlaboratory study for an analytical method

International Union of Pure and Applied Chemistry (IUPAC)

Office/Organization: International Union of Pure and Applied Chemistry (IUPAC)
Information Source or Reference: Thompson, M., Ellison, S.L. and Wood, R., 2002. Harmonized guidelines for single-laboratory validation of methods of analysis (IUPAC Technical Report). Pure and Applied Chemistry, 74(5), pp. 835-855
URL: http://publications.iupac.org/pac/2002/pdf/7405x0835.pdf
Category: Chemical
Guidance Type: Method validation and peer review policies and guidelines
Statute/Application: Methods of analysis
Eurachem

Office/Organization: Eurachem
Information Source or Reference: The Fitness for Purpose of Analytical Methods - A Laboratory Guide to Method Validation and Related Topics, 2014
URL: https://www.eurachem.org/images/stories/Guides/pdf/MV_guide_2nd_ed_EN.pdf
Category: Chemical
Guidance Type: Method validation and peer review policies and guidelines
Statute/Application: Analytical chemistry

AOAC International

Office/Organization: AOAC International
Information Source or Reference: How to Meet ISO 17025 Requirements for Method Verification, 2007
URL: http://www.aoac.org/aoac_prod_imis/aoac_docs/lptp/alacc_guide_2008.pdf
Category: Chemical and Biological
Guidance Type: Method verification guidance (not to be confused with method validation guidance)
Statute/Application: Chemical and microbiological methods

Office/Organization: AOAC International
Information Source or Reference: Appendix F: Guidelines for Standard Method Performance Requirements, 2016
URL: http://www.eoma.aoac.org/app_f.pdf
Category: Chemical and Biological
Guidance Type: Method validation and peer review policies and guidelines
Statute/Application: Chemical and biological methods

Office/Organization: AOAC International
Information Source or Reference: Appendix D: Guidelines for Collaborative Study Procedures To Validate Characteristics of a Method of Analysis, 2005
URL: http://www.aoac.org/aoac_prod_imis/AOAC_Docs/StandardsDevelopment/Collaborative_Study_Validation_Guidelines.pdf
Category: Chemical
Guidance Type: Method validation and peer review policies and guidelines
Statute/Application: Methods of analysis

National Association of Testing Authorities (NATA)

Office/Organization: National Association of Testing Authorities (NATA)
Information Source or Reference: Technical Note 17 - Guidelines for the validation and verification of quantitative and qualitative test methods, 2004, 2012
URL: http://www.demarcheiso17025.com/document/Guidelines%20for%20the%20validation%20and%20verification%20of%20quantitative%20and%20qualitative%20test%20methods.pdf
Category: Chemical, Biological
Guidance Type: Method validation and peer review policies and guidelines
Statute/Application: Quantitative and qualitative test methods

-------