United States Environmental Protection Agency
Environmental Monitoring Systems Laboratory
Las Vegas, NV 89193-3478

Research and Development
EPA/600/S4-88/010  Mar. 1988

EPA Project Summary

Guide to the Application of Quality Assurance Data to Routine Survey Data Analysis

S. G. Paulsen, C. L. Chen, K. J. Stetzenbach, and M. J. Miah

The National Surface Water Survey of the National Acid Precipitation Assessment Program was designed to evaluate the present status of our nation's surface waters with regard to the problem of acidic precipitation. In this program, extensive effort has been directed toward assuring and quantifying the quality of the data produced during the surveys. This report provides assistance in using the quality assurance data when interpreting the routine survey data.

The quality assurance reports for each of the surface water surveys (Eastern Lake Survey—Phase I, Western Lake Survey—Phase I, National Stream Survey—Phase I, and Eastern Lake Survey—Phase II) provide detailed information on the detectability, accuracy, and precision of the routine lake data collected within each of these surveys. The data contained in the quality assurance reports on each of these issues can enhance the analysis of the routine lake data and extend the applicability of the survey data beyond its original intent, by giving future investigators the kind of detailed information about data quality that is necessary when applying the data to studies for which it was not designed.

This Project Summary was developed by EPA's Environmental Monitoring Systems Laboratory, Las Vegas, NV, to announce key findings of the research project that is fully documented in a separate report of the same title (see Project Report ordering information at back).
Introduction

This document is designed to assist the end user of the National Surface Water Survey (NSWS) data with the interpretation and application of the quality assurance data within each survey. A quality assurance program is used not only to ensure that the data produced from these surveys meet some predetermined standards, but also to measure their accuracy and precision so that inter- and intra-survey comparisons are possible. Most users of the survey data will be concerned with the accuracy and precision of any given measurement. The precision of a measurement cannot be changed; however, a lack of accuracy resulting from an identifiable error may be correctable.

This report is designed to help the user understand and apply the information available in the QA reports. It is hoped that the reader will be able to use the methods provided in this document to evaluate the bias in sample data and correct for it when possible, and to apply precision data to population estimates and to temporal and spatial comparisons. The information in this report will also give the users of these QA data an idea of what can and cannot be accomplished with QA data. We will try to identify components of error and their relative magnitudes.

All of the data (with the exception of Chapter 5) used for this report come from the quality assurance data collected during the Eastern Lake Survey Phase I (ELS-I) and Western Lake Survey Phase I (WLS-I) of the Aquatic Effects Research Program (AERP). The QA/QC data collected in these surveys have three major functions:

1. Ensure and identify the precision, accuracy, representativeness, completeness, and comparability of the survey data;
2. Improve the interpretation of the survey data; and
3. Provide a means of assessing the risk of altering current QA/QC procedures and improving future sampling efforts.
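The two uses of QA data highlighted above (correcting routine data for an identified bias, and applying precision data to population estimates) can be sketched in a few lines of Python. All values below are invented for illustration; they are not NSWS results.

```python
import math
import statistics

# Routine lake measurements of one analyte (hypothetical values, ueq/L).
routine = [52.0, 47.5, 61.2, 55.3, 49.8]

# Suppose the QA audit data showed a consistent, quantifiable
# laboratory bias of +2.4 ueq/L for this analyte (invented number).
lab_bias = 2.4

# 1. Correct the routine data for the identified bias.
corrected = [x - lab_bias for x in routine]

# 2. Apply precision data to a population estimate: the variance
#    observed among single measurements of different lakes includes
#    measurement imprecision, so a QA-derived measurement variance
#    (invented: sd = 1.1 ueq/L from field duplicates) can be removed
#    to estimate the true among-lake variability.
observed_var = statistics.variance(corrected)
measurement_var = 1.1 ** 2
among_lake_var = max(observed_var - measurement_var, 0.0)

print(f"bias-corrected mean: {statistics.mean(corrected):.2f}")
print(f"among-lake sd:       {math.sqrt(among_lake_var):.2f}")
```

The second step assumes the measurement variance is independent of the among-lake variability, which is the usual rationale for subtracting the two variance components.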
The types of quality assurance and quality control samples collected, their intended functions, and their general frequency of collection are presented in Table 1.

Table 1. Quality Assurance and Quality Control Samples Used During Phase I Surveys of the NSWS (from Best et al. 1986)

Quality Assurance Samples

  Field Blank
    Description: Reagent-grade deionized water subjected to sample collection, processing, and analysis
    Function: Used in estimating background due to sample collection, processing, and analysis
    Frequency of use: One per sampling crew per day

  Field Duplicate
    Description: Duplicate lake or stream sample
    Function: Used in estimating overall within-batch precision
    Frequency of use: One per field station per day

  Field Audit
    Description: Synthetic sample or natural lake sample processed at the field lab
    Function: Used in estimating overall among-batch precision and lab bias
    Frequency of use: As scheduled

  Laboratory Audit
    Description: Synthetic sample or natural lake sample prepared at the support lab
    Function: Used in estimating analytical among-batch precision and lab bias
    Frequency of use: As scheduled

Quality Control Samples

  Calibration Blank
    Description: Reagent-grade deionized water
    Function: Used in identifying signal drift and contamination of sample
    Frequency of use: One per lab batch

  Reagent Blank
    Description: Reagent-grade deionized water plus reagents, for total Al and SiO2 analyses
    Function: Used in identifying contamination of reagents
    Frequency of use: One per lab batch

  Quality Control Check Sample (QCCS)
    Description: Standard solution from a source other than the calibration standard
    Function: Used in determining accuracy and consistency of instrument calibration
    Frequency of use: Before the first measurement and as specified

  Detection Limit QCCS
    Description: Standard solution at 2 to 3 times the required detection limit
    Function: Used in determining accuracy at the lower end of the linear dynamic range of the measurement method
    Frequency of use: One per lab batch

  Field Laboratory (Trailer) Duplicate
    Description: Split of lake or stream sample
    Function: Used in determining analytical within-batch precision of field lab measurements
    Frequency of use: One per field batch

  Analytical Laboratory Duplicate
    Description: Split of sample aliquot
    Function: Used in determining analytical within-batch precision of analytical lab measurements
    Frequency of use: One per lab batch

  Matrix Spike
    Description: Sample aliquot plus a known quantity of analyte
    Function: Used in determining sample matrix effect on analytical lab measurement
    Frequency of use: One per lab batch

Currently the QA/QC data have been used primarily for the first function, identifying and ensuring data quality. This report is designed to provide guidance in using the QA/QC data in interpreting lake sample data and is directed primarily at the data users. A second report, in which the QA/QC data are used to optimize future sampling efforts, will follow and will be directed toward program managers and planners.

The "Guide to the Application of Quality Assurance Data to Routine Survey Data Analysis" is divided into five chapters to better assist the users of the Aquatic Effects Research Program (AERP) quality assurance data. A summary of each chapter is given below.

Chapter Two—Detection Limits

Two major classifications of limits appear in the NSWS quality assurance reports: method limits and system limits. Method limits identify detection limits applicable to a method under laboratory conditions and represent the lowest level of analyte detectable. Instrumental detection limits are an example of method detection limits and are determined by using reagent or calibration blanks. They are primarily of interest to program and laboratory managers and are used to determine whether the analytical laboratory is meeting required specifications. System limits are those limits which apply to the complete measurement process, from sample collection in the field through laboratory analysis. The system decision limit and the system detection limit are two examples of system-level limits.

Some confusion existed during the NSWS about the use of decision and detection limits. In general, the decision limit, as used during the NSWS, is the conceptual point that allows one to distinguish individual sample measurements from the measurements found for blank samples. The confusion arose because this conceptual point has most frequently been referred to in the past by others simply as the limit of detection. The detection limit, as used during the NSWS, is somewhat different: it is the conceptual point that allows the user to determine the lowest true (theoretical) concentration which can be distinguished with 95% confidence from blanks. It cannot be used to determine whether measurements which have already been taken are different from background. It is recommended that in future reports the current usage of decision and detection limits be abandoned in favor of the more standard and accepted definitions.

Chapter Three—Inaccuracy

When repeated measurements are taken on a sample, the difference between the mean of the repeated measurements and the true (theoretical) concentration is defined as bias, and bias implies inaccuracy. In the NSWS, synthetic audit samples are used to estimate inaccuracy.
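This bias calculation can be sketched as follows. The laboratories, measurements, and theoretical concentration are hypothetical, invented only to illustrate the arithmetic; they are not NSWS audit results.

```python
import statistics

def estimate_bias(measurements, theoretical):
    """Bias = mean of repeated measurements minus the known
    (theoretical) concentration of the synthetic audit sample."""
    return statistics.mean(measurements) - theoretical

# Hypothetical repeated measurements of one synthetic audit sample
# by two laboratories; the theoretical concentration is 100.0.
lab_a = [101.8, 102.3, 101.5, 102.0]
lab_b = [98.9, 99.4, 99.1, 98.8]

bias_a = estimate_bias(lab_a, 100.0)
bias_b = estimate_bias(lab_b, 100.0)

# A consistent difference between the two biases suggests an
# interlaboratory bias that routine data analysis must account for.
interlab_bias = bias_a - bias_b

print(f"lab A bias: {bias_a:+.2f}")
print(f"lab B bias: {bias_b:+.2f}")
print(f"interlaboratory bias: {interlab_bias:+.2f}")
```

In practice one would repeat this over many audit samples and concentration levels before concluding that the bias is consistent enough to correct for.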
Synthetic audit samples are prepared in a laboratory by using different dilutions of standard materials, so the true (theoretical) concentrations are known. Inaccuracy can be estimated by subtracting the theoretical concentration from the mean of the measurements on each sample. When a consistent inaccuracy exists between laboratories, it is considered interlaboratory bias. If this interlaboratory bias is consistent and can be quantified, then the routine data can be analyzed taking the bias into account.

Chapter Four—Imprecision

A measurement from the analytical laboratory is the combination of the true value, systematic biases (determinate errors), and random errors (indeterminate errors). Repetitive measurements of the same sample will not normally give the same answer, because of measurement imprecision. Commonly used measures of imprecision are the variance, the coefficient of variation (relative standard deviation), and the fourth spread (the difference between the upper and lower quartiles). When the magnitude of the imprecision varies with the concentration of the analyte, some adjustment must be made in order to use standard statistical techniques that assume constant variance. One approach is to stabilize the variance with data transformation techniques. The use of the quality assurance data in identifying suitable data transformations which tend to stabilize the variance is examined in this chapter.

Measurement precision consists of a variety of components. When evaluating the analytical results from a survey, these components of variance can indicate which steps along the processing and analytical procedure contribute the most to the variance. Efforts can then be directed toward tightening these procedures to reduce the variance in future studies.

Chapter Five—Comparison of Analyte Concentration

The QA/QC plan is a strategy for monitoring laboratory performance and an attempt to guarantee the quality of the measurements.
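The variance-stabilizing transformation discussed under Chapter Four can be illustrated with a short sketch: when the standard deviation grows roughly in proportion to concentration (constant coefficient of variation), a log transformation makes the spread approximately constant. All values below are invented for illustration, not NSWS duplicate data.

```python
import math
import statistics

# Hypothetical replicate measurements at three concentration levels;
# the raw sd grows with concentration (roughly constant CV of ~5%).
levels = {
    10.0: [9.5, 10.4, 10.1],
    100.0: [95.2, 104.1, 100.9],
    1000.0: [951.0, 1042.0, 1008.0],
}

for true_conc, reps in levels.items():
    sd_raw = statistics.stdev(reps)                     # grows ~10x per level
    cv = sd_raw / statistics.mean(reps)                 # relative sd, ~constant
    sd_log = statistics.stdev([math.log(x) for x in reps])  # ~constant
    print(f"{true_conc:7.1f}: sd={sd_raw:7.2f}  cv={cv:.3f}  sd(log)={sd_log:.3f}")
```

Because the raw standard deviation spans two orders of magnitude while the standard deviation of the logged values stays nearly constant, standard constant-variance techniques can be applied to the log-transformed data.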
The ultimate purpose of the survey is to compare analyte concentrations from different regions or different times using the routine field samples. A preliminary nested model is proposed in this chapter to describe the routine field samples collected in ELS-I.

Chapter Six—Comparison of Surveys

Four surveys were conducted during Phase I of the AERP (ELS-I, WLS-I, NSS-P, and NSS-I). This chapter presents the data on detectability, precision, and accuracy from these surveys so that the data can be more easily examined and compared. The ideal situation is one in which the estimates of detectability, precision, and accuracy are the same across surveys, or similar enough that the QA data can be pooled for all surveys. If, for example, the decision limits differ significantly between surveys, then analysis and comparison of the routine lake data across surveys will be a more involved process. The data for all Phase I surveys are presented for comparative purposes.

References

Best, M. D., S. K. Drouse, L. W. Creelman, and D. J. Chaloud. 1986. National Surface Water Survey, Eastern Lake Survey (Phase I—Synoptic Chemistry) Quality Assurance Report. EPA/600/4-86/011. U.S. Environmental Protection Agency, Environmental Monitoring Systems Laboratory, Las Vegas, Nevada.

Box, G. E. P., W. G. Hunter, and J. S. Hunter. 1978. Statistics for Experimenters. John Wiley & Sons, New York.

Christian, G. 1986. Analytical Chemistry, 4th ed., p. 64. J. Wiley & Sons, New York.

Draper, N. R. and H. Smith. 1981. Applied Regression Analysis. John Wiley & Sons, New York.

Drouse, S. K., D. C. Hillman, L. W. Creelman, and S. J. Simon. 1986. National Surface Water Survey, Eastern Lake Survey (Phase I—Synoptic Chemistry) Quality Assurance Plan. U.S. Environmental Protection Agency, Las Vegas, Nevada.

Hoaglin, D. C., F. Mosteller, and J. W. Tukey. 1983. Understanding Robust and Exploratory Data Analysis. John Wiley & Sons, New York.

Hubaux, A. and G. Vos. 1970. Decision and detection limits for linear calibration curves. Anal. Chem. 42:849-855.

Keith, L. H., W. Crummett, J. Deegan, Jr., R. A. Libby, J. K. Taylor, and G. Wentler. 1983. Principles of environmental analysis. Anal. Chem. 55:2210-2218.

Linthurst, R. A., D. H. Landers, J. M. Eilers, D. F. Brakke, W. S. Overton, E. P. Meier, and R. E. Crowe. 1986. Characteristics of Lakes in the Eastern United States. Volume I: Population Descriptions and Physico-Chemical Relationships. EPA/600/4-86/007. U.S. Environmental Protection Agency, Washington, D.C.

Long, G. L. and J. D. Winefordner. 1983. Limit of detection: a closer look at the IUPAC definition. Anal. Chem. 55:712-724.

Miller, J. C. and J. N. Miller. 1984. Statistics for Analytical Chemistry. Ellis Horwood, Ltd., Chichester, U.K. 202 pp.

SAS Institute Inc. 1985. SAS Procedures Guide for Personal Computers, Version 6 Edition. SAS Institute Inc., Cary, NC. 373 pp.

Silverstein, M. E., M. L. Faber, S. K. Drouse, and T. E. Mitchell-Hall. 1987. National Surface Water Survey, Western Lake Survey (Phase I—Synoptic Chemistry) Quality Assurance Report. U.S. Environmental Protection Agency, Environmental Monitoring Systems Laboratory, Las Vegas, Nevada.

S. G. Paulsen, C. L. Chen, and K. J. Stetzenbach are with the University of Nevada, Las Vegas, NV 89154; M. J. Miah is with Lockheed Engineering and Management Services Company, Las Vegas, NV 89119.

Robert D. Schonbrod is the EPA Project Officer (see below).

The complete report, entitled "Guide to the Application of Quality Assurance Data to Routine Survey Data Analysis" (Order No. PB 88-166 863/AS; Cost: $14.95, subject to change), will be available only from:

    National Technical Information Service
    5285 Port Royal Road
    Springfield, VA 22161
    Telephone: 703-487-4650

The EPA Project Officer can be contacted at:

    Environmental Monitoring Systems Laboratory
    U.S. Environmental Protection Agency
    P.O. Box 93478
    Las Vegas, NV 89193-3478