STATISTICAL ANALYSIS
OF
GROUNDWATER
AT
RCRA FACILITIES
-DRAFT GUIDANCE-
OFFICE OF SOLID WASTE
WASTE MANAGEMENT DIVISION
U.S. ENVIRONMENTAL PROTECTION AGENCY
401 M STREET, S.W.
WASHINGTON, D.C. 20460
OCTOBER 1988

DISCLAIMER
This document is intended to assist Regional and State personnel in
evaluating ground-water monitoring data from RCRA facilities. Conformance
with this guidance is expected to result in statistical methods and sampling
procedures that meet the regulatory standard of protecting human health and
the environment. However, EPA will not in all cases limit its approval of
statistical methods and sampling procedures to those that comport with the
guidance set forth herein. This guidance is not a regulation (i.e., it does
not establish a standard of conduct which has the force of law) and should not
be used as such. Regional and State personnel should exercise their discre-
tion in using this guidance document as well as other relevant information in
choosing a statistical method and sampling procedure for evaluating ground-
water monitoring data from RCRA facilities.
This document has been reviewed by the Office of Solid Waste, U.S. Envi-
ronmental Protection Agency, Washington, D.C., and approved for publication.
Approval does not signify that the contents necessarily reflect the views and
policies of the U.S. Environmental Protection Agency, nor does mention of
trade names, commercial products, or publications constitute endorsement or
recommendation for use.
ACKNOWLEDGMENT
This document was developed by EPA's Office of Solid Waste under the
direction of Dr. Vernon Myers, Chief of the Ground-Water Section of the Waste
Management Division. The document was prepared by the joint efforts of
Dr. Vernon B. Myers, Mr. James R. Brown of the Waste Management Division,
Mr. James Craig of the Office of Policy Planning and Information, and
Mr. Barnes Johnson of the Office of Policy, Planning, and Evaluation.
PREFACE
This guidance document has been developed primarily for
evaluating ground-water monitoring data at RCRA (Resource
Conservation and Recovery Act) facilities. The statistical
methodologies described in this document can be applied to both
hazardous (Subtitle C of RCRA), and municipal (Subtitle D of
RCRA) waste land disposal facilities.
The guidance has wider applications, however, if one
examines the spatial relationships involved between the
monitoring wells and the potential contaminant source. For
example, Section 4 of the guidance describes background well
(upgradient) versus compliance well (downgradient) comparisons.
This scenario can be applied to other "non-RCRA" situations
involving the same spatial relationships and the same null
hypothesis. The explicit null hypothesis (Ho) for testing
contrasts between means, or where appropriate between medians,
is that the means between groups (here monitoring wells) are
equal (i.e., no release has been detected), or that the group
means are below a prescribed action level (e.g., the
ground-water protection standard). Statistical methods that can
be used to evaluate these conditions are described in Sections
4.2 (Analysis of Variance), 4.3 (Tolerance Intervals), and 4.4
(Prediction Intervals).
When compliance wells (downgradient) are compared to a fixed
standard (e.g., the ground-water protection standard), Section 5
of the guidance should be consulted. The value to which the
compliance wells are compared can be any standard established by
a Regional Administrator, State or county health official, or
another appropriate official.
A note of caution applies to Section 5. The examples are
used to determine whether or not ground-water has been
contaminated as a result of a release from a facility. When the
upper confidence limit is exceeded, further action or assessment
may be warranted. If one wishes to determine whether or not a
cleanup standard has been attained for a Superfund site or a
RCRA facility in corrective action, another EPA guidance
document entitled, "Statistical Methods for the Attainment of
Superfund Cleanup Standards - Draft," should be consulted. This
draft Superfund guidance is a multi-volume set that addresses
questions regarding the success of air, ground-water, and soil
remediation efforts. Information about the availability of this
draft guidance that is currently being developed can be obtained
by calling the RCRA/Superfund Hotline, telephone (800) 424-9346
or (202) 382-3000. Those interested in evaluating individual
wells or an intra-well comparison are referred to Section 6 of
the guidance which describes the use of Shewhart-CUSUM control
charts and trend analysis.
Municipal water supply engineers, for example, who wish to
monitor water quality parameters in supply wells may find this
section useful.
Other sections of the guidance have wide applications in
the field of applied statistics, regardless of the intended use
or purpose. Sections 3.2 and 3.3 provide information on
checking distributional assumptions and equality of variance,
while Sections 7.1 and 7.2 cover limit of detection problems and
outliers. Anyone conducting an experiment involving the use of
statistics may consult these sections for helpful advice and references.
Finally, it should be noted that this guidance is not
intended to be the final chapter on the statistical analysis of
ground-water monitoring data, nor should it be used as such. 40
CFR Part 264 Subpart F offers an alternative (§264.97(h)(5)) to
the methods suggested and described in this guidance document.
In fact, the guidance recommends a procedure (confidence
intervals) for comparing monitoring data to a fixed standard
that is not mentioned in the Subpart F regulations. This is
neither contradictory nor inconsistent, but rather
epitomizes the complexities of the subject matter and
exemplifies the need for flexibility due to the site specific
monitoring requirements of the RCRA program.
CONTENTS
1.	Introduction	 1-1
2.	Regulatory Overview	 2-1
2.1	Background	 2-1
2.2	Overview of Methodology	 2-3
2.3	General Performance Standards	 2-3
2.4	Basic Statistical Methods and Sampling
Procedures	 2-6
2.5	Choosing a Sampling Interval	 2-10
2.6	Example Calculations	 2-16
3.	Choosing a Statistical Method	 3-1
3.1	Flow Charts—Overview and Use	 3-1
3.2	Checking Distributional Assumptions	 3-3
3.3	Checking Equality of Variance: Bartlett's Test	 3-16
4.	Background Well/Compliance Well Comparisons	 4-1
4.1	Summary Flow Chart for Background Well/Compliance
Well Comparisons	 4-2
4.2	Analysis of Variance	 4-5
4.3	Tolerance Intervals Based on the Normal
Distribution			 4-18
4.4	Prediction Intervals	 4-21
5.	Comparisons with MCLs or ACLs		 5-1
5.1 Summary Chart for Comparison with MCLs or ACLs...... 5-1
5.2	Statistical Procedures	 5-1
6.	Control Charts for Intra-Well Comparisons	 6-1
6.1	Advantages of Plotting Data	 6-1
6.2	Correcting for Seasonality	 6-2
6.3	Combined Shewhart-Cusum Control Charts for Each
Well and Constituent	 6-5
6.4	Update of a Control Chart	 6-11
6.5	Nondetects in a Control Chart	 6-12
7.	Miscellaneous Topics		 7-1
7.1	Limit of Detection	 7-1
7.2	Outliers	 7-10
Appendices
A.	General Statistical Considerations and Glossary of
Statistical Terms	 A-1
B.	Statistical Tables	 B-1
C.	General Bibliography	 C-1
FIGURES
Number	Page
2-1 Hydraulic conductivity of selected rocks	 2-12
2-2 Total porosity and drainable porosity for typical geologic
materials			 2-15
2-3	Potentiometric surface map for computation of hydraulic
gradient	 2-17
3-1	Flowchart overview	 3-4
3-2 Probability plot of raw chlordane concentrations	 3-11
3-3	Probability plot of log-transformed chlordane concentrations.. 3-13
4-1	Background well/compliance well comparisons	 4-3
4-2	Tolerance limits: alternate approach to background
well/compliance well comparisons	 4-4
5-1	Comparisons with MCLs/ACLs	 5-2
6-1	Plot of unadjusted and seasonally adjusted monthly
observations	 6-6
6-2	Combined Shewhart-CUSUM chart	 6-2
TABLES
Number	Page
2-1 Summary of Statistical Methods		2-7
2-2 Default Values for Effective Porosity for Use in Time of
Travel (TOT) Analyses		2-13
2-3 Specific Yield Values for Selected Rock Units		2-14
2-4	Determining a Sampling Interval		2-19
3-1	Example Data for Coefficient of Variation Test		3-8
3-2 Example Data Computations for Probability Plotting		3-10
3-3 Cell Boundaries for the Chi-Squared Test		3-17
3-5	Example Data for Bartlett's Test		3-18
4-1	One-Way Parametric ANOVA Table		4-8
4-2 Example Data for One-Way Parametric Analysis of Variance		4-10
4-3 Example Computations in One-Way Parametric ANOVA Table		4-11
4-4 Example Data for One-Way Nonparametric ANOVA—Benzene
Concentrations (in ppm)		4-16
4-5 Example Data for Normal Tolerance Interval		4-22
4-6	Example Data for Prediction Interval—Chlordane Levels		4-25
5-1	Example Data for Normal Confidence Interval—Aldicarb
Concentrations in Compliance Wells (ppb)		5-4
5-2 Example Data for Log-Normal Confidence Interval—EDB
Concentrations in Compliance Wells (ppb)		5-6
5-3 Values of M and n+l-M and Confidence Coefficients for
Small Samples		5-9
5-4 Example Data for Nonparametric Confidence Interval—Silvex
Concentrations (ppm)		5-10
5-5 Example Data for a Tolerance Interval Compared to an ACL		5-12
TABLES (continued)
Number	Page
6-1 Example Computation for Deseasonalizing Data			6-5
6-2	Example Data for Combined Shewhart-Cusum Chart—Carbon
Tetrachloride Concentration (µg/L)		6-9
7-1	Methods for Below Detection Limit Values				7-2
7-2 Example Data for a Test of Proportions		7-5
7-3 Example Data for Testing Cohen's Test				7-8
7-4 Example Data for Testing for an Outlier		7-12
SECTION 1
INTRODUCTION
The U.S. Environmental Protection Agency (EPA) promulgated regulations
for detecting contamination of ground water at hazardous waste land disposal
facilities under the Resource Conservation and Recovery Act (RCRA) of 1976.
The statistical procedures specified for use to evaluate the presence of con-
tamination have been criticized and require improvement. Therefore, EPA has
proposed to revise those statistical procedures 1n 40 CFR Part 264, "Statis-
tical Methods for Evaluating Ground-Water Monitoring Data from Hazardous Waste
Facilities."
In 40 CFR Part 264, EPA has proposed statistical procedures that are
appropriate for evaluating ground-water monitoring data under a variety of
situations. The purpose of this document is to provide guidance in deter-
mining which situation applies and consequently which statistical procedure
may be used. In addition to providing guidance on selection of an appropriate
statistical procedure, this document provides instructions on carrying out the
procedure and interpreting the results.
The regulations provide three levels of monitoring for a regulated
unit: detection monitoring; compliance monitoring; and corrective action.
The regulations define conditions for a regulated unit to be changed from one
level of monitoring to a more stringent level of monitoring (e.g., from detec-
tion monitoring to compliance monitoring). These conditions are that there is
statistically significant evidence of contamination.
The regulations allow the benefit of the doubt to reside with the current
stage of monitoring. That is, a unit will remain in its current monitoring
stage unless there is convincing evidence to change it. This means that a
unit will not be changed from detection monitoring to compliance monitoring
(or from compliance monitoring to corrective action) unless there is statisti-
cally significant evidence of contamination (or contamination above the com-
pliance limit).
The main purpose of this document is to guide owners, operators, regional
administrators, and other interested parties in the selection, use, and inter-
pretation of appropriate statistical methods for monitoring the ground water
at each specific regulated unit. Topics to be covered include the sampling
needed, sample sizes, selection of an appropriate statistical design, matching
the analysis of data to the design, and interpretation of results. Specific
recommended methods are detailed and a general discussion of evaluation of
alternate methods is provided. Statistical concepts are discussed in an
appendix. References for suggested procedures are provided as well as
references to alternative procedures and general statistics texts. Situations
calling for external consultation are mentioned as well as sources for obtain-
ing expert assistance when needed.
SECTION 2
REGULATORY OVERVIEW
EPA promulgated ground-water monitoring and response
standards for permitted facilities in 1982 (47 FR 32274, July 26,
1982), for detecting releases of hazardous wastes into ground
water from storage, treatment, and disposal units.
The Subpart F regulations required ground-water data to be
examined by statistical procedures to determine whether there was
a significant exceedance of background levels, or other allowable
levels, of specified chemical parameters and hazardous waste
constituents.	One concern was that the procedure in the
regulations could result in a high rate of "false positives"
(Type I error), thus requiring an owner or operator
unnecessarily to advance into a more comprehensive and expensive
phase of monitoring. More importantly, another concern was that
the procedure could result in a high rate of "false negatives"
(Type II error), i.e., instances where actual contamination would
go undetected.
As a result of these concerns, EPA amended the procedure
with five different statistical methods that are more appropriate
for ground-water monitoring (53 FR 39720; October 11, 1988).
These amendments also outline sampling procedures and performance
standards that are designed to help minimize the chance that a
statistical method will indicate contamination when it is not
present (Type I error) or fail to detect contamination when it
is present (Type II error).
2.1 BACKGROUND
Subtitle C of the Resource Conservation and Recovery Act of 1976
(RCRA) creates a comprehensive program for the safe management of
hazardous waste. Section 3004 of RCRA requires owners and
operators of facilities that treat, store, or dispose of
hazardous waste to comply with standards established by EPA that
are "necessary to protect human health and the environment."
Section 3005 provides for implementation of these standards under
permits issued to owners and operators by EPA or authorized
States. Section 3005 also provides that owners and operators of
existing facilities that apply for a permit and comply with
applicable notice requirements may operate until a permit
determination is made. These facilities are commonly known as
"interim status" facilities. Owners and operators of interim
status facilities also must comply with standards set under
Section 3004.
EPA promulgated ground-water monitoring and response
standards for permitted facilities in 1982 (47 FR 32274, July 26,
1982), codified in 40 CFR Part 264, Subpart F. These standards
established programs for protecting ground water from releases of
hazardous wastes from treatment, storage, and disposal units.
Facility owners and operators were required to sample ground
water at specified intervals and to use a statistical procedure
to determine whether or not hazardous wastes or constituents from
the facility are contaminating ground water. As explained in
more detail below, the Subpart F regulations regarding
statistical methods used in evaluating ground-water monitoring
data that EPA promulgated in 1982 have generated criticism.
The former Part 264 regulations provided that the Cochran's
Approximation to the Behrens Fisher Student's t-test (CABF) or an
alternate statistical procedure approved by EPA be used to
determine whether there is a statistically significant exceedance
of background levels, or other allowable levels, of specified
chemical parameters and hazardous waste constituents. Although
the regulations have always provided latitude for the use of an
alternate statistical procedure, concerns were raised that the
CABF statistical procedure in the regulations was not
appropriate. It was pointed out that: (1) the replicate
sampling method is not appropriate for the CABF procedure, (2)
the CABF procedure does not adequately consider the number of
comparisons that must be made, and (3) the CABF does not control
for seasonal variation.	Specifically, the concerns were that
the CABF procedure could result in "false positives" (Type I
error), thus requiring an owner or operator unnecessarily to
collect additional ground-water samples, to further characterize
ground-water quality, and to apply for a permit modification,
which is then subject to EPA review. In addition, there was
concern that CABF may result in "false negatives" (Type II
error), i.e., instances where actual contamination goes
undetected. This occurred because the background data, which is
often used as the basis of the statistical comparisons, was
highly variable due to temporal, spatial, analytical, and
sampling effects.
As a result of these concerns, EPA amended both the
statistical method and the sampling procedures of the
regulations, by requiring (if necessary) that owners or operators
more accurately characterize the hydrogeology and potential
contaminants at the facility, and by including in the regulations
performance standards that all the statistical methods and
sampling procedures must meet (53 FR 39720; October 11, 1988).
Statistical methods and sampling procedures meeting these
performance standards would have a low probability of indicating
contamination when it is not present, and of failing to detect
contamination that actually is present. The facility owner or
operator would have to demonstrate that a procedure is
appropriate for the conditions at the facility and to ensure that
it meets the performance standards outlined below. This
demonstration holds for any of the four statistical methods and
sampling procedures outlined in this regulation as well as any
alternate methods or procedures proposed by facility owners and
operators.
EPA recognizes that the selection of appropriate monitoring
parameters is also an essential part of a reliable statistical
evaluation. The Agency addressed this issue in a previous
Federal Register notice (52 FR 25942, July 9, 1987).
2.2 OVERVIEW OF METHODOLOGY
EPA has elected to retain the idea of general performance
requirements that the regulated community must meet. This
approach allows for flexibility in designing statistical methods
and sampling procedures to site-specific considerations.
EPA has tried to bring a measure of certainty to these methods,
while accommodating the unique nature of many of the regulated
units in question. Consistent with this general strategy, the
Agency is establishing several options for the sampling
procedures and statistical methods to be used in detection
monitoring and, where appropriate, in compliance monitoring.
The owner or operator shall submit, for each of the chemical
parameters and hazardous constituents listed in the facility
permit, one or more of the statistical methods and sampling
procedures described in today's regulations. In deciding which
statistical test is appropriate, he or she will consider the
theoretical properties of the test, the data available, the site
hydrogeology, and the fate and transport characteristics of
potential contaminants at the facility.	The Regional
Administrator will review and, if appropriate, approve the
proposed statistical methods and sampling procedures when issuing
the facility permit.
The Agency recognizes that there may be situations where any
one statistical test may not be appropriate. This is true of new
facilities with little or no ground-water monitoring data. If
insufficient data prevent the owner or operator from specifying
a statistical method of analysis, then contingency plans
containing several methods of data analysis and the conditions
under which the method can be used will be specified by the
Regional Administrator in the permit. In many cases, the
parametric ANOVA can be performed after six months of data has
been collected. The owner or operator may propose modifying the
permit at a later date when more data are available and he or she wishes
to use a specific method of analysis.
2.3 GENERAL PERFORMANCE STANDARDS
EPA's basic concern in establishing these performance
standards for statistical methods is to achieve a proper balance
between the risk that the procedures will falsely indicate that a
regulated unit is causing background values or concentration
limits to be exceeded (false positives), and the risk that the
procedures will fail to indicate that background values or
concentration limits are being exceeded (false negatives). EPA's
approach is designed to address that concern directly. Thus, any
statistical method or sampling procedure, whether specified here
or as an alternative to those specified, should meet the
following performance standards:
1.	The statistical test is to be conducted separately for each
hazardous constituent in each well. If the distribution of the
chemical parameters or constituents is shown by the owner or
operator to be inappropriate for a normal theory test, then the
data should be transformed or a distribution-free (nonparametric)
test should be used. If the distributions for the constituents
differ, more than one statistical method may be needed.
2.	If an individual well comparison procedure is used to compare
an individual compliance well constituent concentration with
background constituent concentrations or a ground-water
protection standard, the test shall be done at a Type I error
level of no less than 0.01 for each testing period. If a
multiple comparisons procedure is used, the Type I experimentwise
error rate shall be no less than 0.05 for each testing period;
however, the Type I error level of no less than 0.01 for individual
well comparisons must be maintained. This performance standard does
not apply to control charts, tolerance intervals, or prediction
intervals unless they are modeled after hypothesis testing
procedures that involve setting significance levels.
3.	If a control chart approach is used to evaluate ground-water
monitoring data, the specific type of control chart and its
associated parameters shall be proposed by the owner or operator
and approved by the Regional Administrator if he or she finds it
to be protective of human health and the environment.
4.	If a tolerance interval or a prediction interval is used to
evaluate ground-water monitoring data, the levels of confidence,
and for tolerance intervals the percentage of the population that
the interval must contain, shall be proposed by the owner or
operator and approved by the Regional Administrator if he or she
finds these parameters to be protective of human health and the
environment.	These parameters will be determined after
considering the number of samples in the background data base,
the distribution of the data, and the range of the concentration
values for each constituent of concern.
5.	The statistical method will include procedures for handling
data below the limit of detection with one or more procedures
that are protective of human health and the environment. Any
practical quantification limit (pql) approved by the Regional
Administrator under section 264.97(h) that is used in the
statistical method shall be the lowest concentration level that
can be reliably achieved within specified limits of precision and
accuracy during routine laboratory operating conditions that are
available to the facility.
6.	The statistical method will consider, and if necessary
control or correct for, seasonal and spatial variability and
temporal correlation in the data.
In referring to "statistical methods", EPA means to emphasize
that the concept of "statistical significance" must be reflected
in several aspects of the monitoring program. This involves not
only the choice of a level of significance, but also the choice
of a statistical test, the sampling requirements, the number of
samples, and the frequency of sampling. Since all of these
interact to determine the ability of the procedure to detect
contamination. The statistical methods, like a comprehensive
ground-water monitoring program, must be evaluated in their
entirety, not by individual components. Thus a systems approach
to ground-water monitoring is endorsed.
The second performance standard requires further
delineation. For individual well comparisons in which an
individual compliance well is compared to background, the Type I
error level shall be no less than 0.01 for each testing period.
In other words, the probability of the test resulting in a false
positive is no less than 1 in 100. EPA believes that this
significance level is sufficient in limiting the false positive
rate while at the same time controlling the false negative
(missed detection) rate.
Owners and operators of facilities that have an extensive
network of ground-water monitoring wells may find it more
practical to use a multiple well comparisons procedure. Multiple
comparisons procedures control the experimentwise error rate for
comparisons involving multiple upgradient and downgradient
wells. If this method is used, the Type I experimentwise error
rate for each constituent shall be no less than 0.05 for each
testing period.
In conducting a multiple well comparisons procedure, if the
owner or operator chooses to use a t-statistic rather than an
F-statistic, the individual well Type I error level must be
maintained at no less than 0.01. This provision should be
considered if a facility owner or operator wishes to use a
procedure that distributes the risk of a false positive evenly
throughout all monitoring wells and monitoring parameters (e.g.,
Bonferroni t-test).
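To make the interplay between the 0.05 experimentwise rate and the 0.01
individual floor concrete, the short Python sketch below (an illustrative
example, not part of the regulation or this guidance) computes a
Bonferroni-style per-comparison significance level and applies the 0.01 floor
described above; the number of comparisons is a hypothetical input.

    # Illustrative sketch: Bonferroni-style allocation of an experimentwise
    # Type I error rate across individual well comparisons, keeping each
    # individual comparison at no less than the 0.01 level described above.

    def per_comparison_alpha(experimentwise_alpha=0.05, n_comparisons=10,
                             individual_floor=0.01):
        """Divide the experimentwise rate evenly among comparisons, but do not
        let any individual comparison fall below the regulatory floor."""
        return max(experimentwise_alpha / n_comparisons, individual_floor)

    # Hypothetical testing period with 10 compliance-well comparisons
    print(per_comparison_alpha(0.05, 10))   # 0.005 raised to the 0.01 floor -> 0.01
    print(per_comparison_alpha(0.05, 3))    # about 0.0167 (already above the floor)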
Setting these levels of significance at 0.01 and 0.05,
respectively, raises an important question of how the false
positive rate will be controlled at facilities with a large
number of ground-water monitoring wells and monitoring
constituents. The Agency set these levels of significance on the
basis of a single testing period, and not on the entire operating
life of the facility. Further, large facilities can reduce the
false positive rate by implementing a unit-specific monitoring
approach. Nonetheless it is evident that facilities with an
extensive number of ground-water monitoring wells and that are
monitoring for many constituents will still generate a large
number of comparisons during each testing period. At these
facilities, it may be difficult to keep the false positive error
rate at an acceptable level.
Such cases may require the Regional Administrator to use
discretion in deciding if a statistically significant result is
indicative of an actual release from the facility or if it is a
false positive.	In making this decision, the Regional
Administrator may note the relative magnitude of the
concentration of the constituent(s). If the exceedance is based
on an observed compliance well value that has the same relative
magnitude as the pql (practical quantification limit), or the
background concentration level, then a false positive is more
likely to be observed, and further sampling and testing may be
appropriate. If, however, the background concentration level or an
action level is exceeded by an order of magnitude, then the
exceedance is more likely to be indicative of a release from the
facility.
2.4 BASIC STATISTICAL METHODS AND SAMPLING PROCEDURES
The Final Rule specifies five types of statistical methods to
detect contamination in ground water. EPA believes that at least
one of these types of procedures will be appropriate for a wide
variety of situations. To address situations where these methods
may not be appropriate, EPA has included a provision for the
owner or operator to select an alternate method which is subject
to approval by the Regional Administrator.
1. A parametric analysis of variance (ANOVA) followed by
multiple comparison procedures to identify specific sources of
difference. The procedures will include estimation and testing
of the contrasts between the mean of each compliance well and the
background mean for each constituent.
2.	An analysis of variance (ANOVA) based on ranks followed by
multiple comparison procedures to identify specific sources of
difference. The procedure will include estimation and testing of
the contrasts between the median of each compliance well and the
median background levels for each constituent.
3.	A procedure in which a tolerance interval or a prediction
interval for each constituent is established from the background
data, and the level of each constituent in each compliance well
is compared to its upper tolerance or prediction limit.
4.	A control chart approach which will give control limits for
each constituent. If any compliance well has a value or a
sequence of values that lie outside the control limits for that
constituent, it may constitute statistically significant
evidence of contamination.
5.	Another statistical method submitted by the owner or operator
and approved by the Regional Administrator.
A summary of these statistical methods and their
applicability is presented in Table 2-1. The table lists types
of comparisons and the recommended procedure and refers the
reader to the appropriate section where a discussion and example
can be found.
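As an illustrative preview of the first method listed above (the parametric
ANOVA described in detail in Section 4.2), the hedged sketch below compares
one background well with two compliance wells using a standard one-way ANOVA
routine from scipy; the well labels and concentration values are hypothetical.

    from scipy import stats

    # Hypothetical quarterly concentrations (ppm) of one constituent.
    background   = [3.1, 2.8, 3.4, 3.0]    # upgradient (background) well
    compliance_1 = [3.2, 3.0, 3.5, 3.1]    # downgradient well 1
    compliance_2 = [5.9, 6.4, 6.1, 5.7]    # downgradient well 2

    # One-way ANOVA across the three wells; a small p-value indicates that at
    # least one well mean differs from the others, after which contrasts of
    # each compliance well against background would be examined (Section 4.2).
    f_stat, p_value = stats.f_oneway(background, compliance_1, compliance_2)
    print(f"F = {f_stat:.2f}, p = {p_value:.4f}")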
EPA is specifying multiple statistical methods and sampling
procedures and has allowed for alternatives because no one method
or procedure is appropriate for all circumstances. EPA believes
that the suggested methods and procedures are appropriate for the
site-specific design and analysis of data from ground-water
monitoring systems, and that they can account for more of the
site-specific factors than Cochran's Approximation to the Behrens
Fisher Student's t-test (CABF) and the accompanying sampling
procedures in the past regulations. The statistical methods
specified here address the multiple comparison problems and
provide for documenting and accounting for sources of natural
variation. EPA believes that the specified methods and
TABLE 2-1. SUMMARY OF STATISTICAL METHODS

                                                                    Section of
Compound            Type of comparison      Recommended method      guidance document

Any compound in     Background vs.          ANOVA                   4.2
background          compliance well         Tolerance limits        4.3
                                            Prediction intervals    4.4

                    Intra-well              Control charts          6

ACL/MCL specific    Fixed standard          Confidence intervals    5.2.1
                                            Tolerance limits        5.2.2

Synthetic           Many nondetects         See Below Detection     7.1
                    in data set             Limit, Table 7-1
procedures consider and control for natural temporal and spatial variation.
The decision on the number of wells needed in a monitoring system will be made
on a site-specific basis by the Regional Administrator and will consider the
statistical method being used, the site hydrogeology, the fate and transport
characteristics of potential contaminants, and the sampling procedure. The
number of wells must be sufficient to ensure a high probability of detecting
contamination when it is present. To determine which sampling procedure
should be used, the owner or operator shall consider existing data and site
characteristics, including the possibility of trends and seasonality. These
sampling procedures are:
1.	Obtain a sequence of at least four samples taken at an interval that
ensures, to the greatest extent technically feasible, that an inde-
pendent sample is obtained, by reference to the uppermost aquifer's
effective porosity, hydraulic conductivity, and hydraulic gradient,
and the fate and transport characteristics of potential contami-
nants. The sampling interval that is proposed must be approved by
the Regional Administrator.
2.	An alternate sampling procedure proposed by the owner or operator
and approved by the Regional Administrator if he or she finds it to
be protective of human health and the environment.
EPA believes that the above sampling procedures will allow the use of
statistical methods that will accurately detect contamination. These sampling
procedures may be used to replace the sampling method present in the former
Subpart F regulations. Rather than taking a single ground-water sample and
dividing it into four replicate samples, a sequence of at least four samples
taken at intervals far enough apart in time (daily, weekly, or monthly,
depending on rates of ground-water flow and contaminant fate and transport
characteristics) will help ensure the sampling of a discrete portion (i.e., an
independent sample) of ground water. In hydrogeologic environments where the
ground-water velocity prohibits one from obtaining four independent samples on
a semiannual basis, the replicate sampling method described in the former
Subpart F regulations or an alternate sampling procedure that is approved by
the Regional Administrator may be utilized.
The Regional Administrator shall approve an appropriate sampling proce-
dure and interval submitted by the owner or operator after considering the
effective porosity, hydraulic conductivity, and hydraulic gradient in the
uppermost aquifer under the waste management area, and the fate and transport
characteristics of potential contaminants. Most of this information is
already required to be submitted in the facility's Part B permit application
under §270.14(c) and may be used by the owner or operator to make this deter-
mination. Further, the number and kinds of samples collected to establish
background concentration levels should be appropriate to the form of statisti-
cal test employed, following generally accepted statistical principles. For
example, the use of control charts presumes a well-defined background of at
least eight samples per well. By contrast, ANOVA alternatives might require
only four samples per well.
It seems likely that most facilities will be sampling monthly over four
consecutive months, twice a year. In order to maintain a complete annual
record of ground-water data, the facility owner or operator may find it
desirable to obtain a sample each month of the year. This will help identify
seasonal trends in the data and permit evaluation of the effects of auto-
correlation and seasonal variation if present in the samples.
The concentrations of a constituent determined in these samples are
intended to be used in one-point-in-time comparisons between background and
compliance wells. Some facility owners or operators may want to use the con-
centrations to establish a "moving average" in the background well data base
for comparison to the compliance well values at the frequency required in the
facility permit. Using several background values to establish a "moving aver-
age" is an acceptable method of analysis; however, the number of degrees of
freedom will be increased, making this method more sensitive to changes in
constituent concentrations. Further, this method does not account for sea-
sonal variation as effectively as one-point-in-time comparison procedures.
Therefore, most owners and operators will find one-point-in-time comparisons
to be the preferred method of analysis. This approach will help reduce the
components of seasonal variation by providing for simultaneous comparisons
between background and compliance well information.
The different sampling intervals were chosen to allow for the unique
nature of the hydrogeologic systems beneath hazardous waste sites. This sam-
pling scheme will give proper consideration to the temporal variation of and
autocorrelation among the ground-water constituents. The specified procedure
requires sampling data from background wells, at the compliance point, and
according to a specific test protocol. The owner or operator should use a
background value determined from data collected under this scenario if a test
approved by the Regional Administrator requires it or if a concentration limit
in compliance monitoring is to be based upon background data.
EPA recognizes that there may be situations where the owner or operator
can devise alternate statistical methods and sampling procedures that are more
appropriate to the facility and that will provide reliable results. There-
fore, today's regulations allow the Regional Administrator to approve such
procedures if he or she finds that the procedures balance the risk of false
positives and false negatives in a manner comparable to that provided by the
above specified tests and that they meet specified performance standards. In
examining the comparability of the procedure to provide a reasonable balance
between the risk of false positives and false negatives, the owner or operator
will specify in the alternate plan such things as the sampling frequency and
the sample size.
The methods indicate that the procedure must provide reasonable confi-
dence that the migration of hazardous constituents from a regulated unit into
and through the aquifer will be detected. (The reference to hazardous con-
stituents does not mean that this option applies only to compliance monitor-
ing; the procedure also applies to monitoring parameters and constituents in
the detection monitoring program since they are surrogates indicating the
presence of hazardous constituents.) The protocols for the specific tests,
however, will be used as a general benchmark to define "reasonable confidence"
in the proposed procedure. If the owner or operator shows that his suggested
test is comparable in its results to one of the specified tests, then it is
likely to be acceptable under the "reasonable confidence" test. There may be
situations, however, where it will be difficult to directly compare the per-
formance of an alternate test to the protocols for the specified tests. In
such cases the alternate test will have to be evaluated on its own merits.
2.5 CHOOSING A SAMPLING INTERVAL
Section 264.97(g) of 40 CFR Part 264 Subpart F provides the owner or
operator of an RCRA facility with a flexible sampling schedule that will allow
him or her to choose a sampling procedure that will reflect site-specific con-
cerns. This section specifies that the owner or operator shall, on a semi-
annual basis, obtain a sequence of at least four samples from each well, based
on an interval that is determined after evaluating the uppermost aquifer's
effective porosity, hydraulic conductivity, and hydraulic gradient, and the
fate and transport characteristics of potential contaminants. The intent of
this provision is to set a sampling frequency that allows sufficient time to
pass between sampling events to ensure, to the greatest extent technically
feasible, that an independent ground-water sample is taken from each well.
For further information on ground-water sampling, refer to the EPA "Practical
Guide for Ground-Water Sampling," Barcelona et al., 1985.
The sampling frequency of the four semiannual sampling events required in
Part 264 Subpart F can be based on estimates using the average linear velocity
of ground water. Two forms of the Darcy equation stated below relate ground-
water velocity to effective porosity (Ne), hydraulic gradient (i), and hydrau-
lic conductivity (K):

    Vh = (Kh * i)/Ne    and    Vv = (Kv * i)/Ne

where Vh and Vv are the horizontal and vertical components of the average
linear velocity of ground water, respectively; Kh and Kv are the horizontal
and vertical components of hydraulic conductivity; i is the head gradient; and
Ne is the effective porosity. In applying these equations to ground-water
monitoring, the average linear horizontal velocity can be used to determine an
appropriate sampling interval. Usually, field investigations will yield bulk
values for hydraulic conductivity. In most cases, the bulk hydraulic conduc-
tivity determined by a pump test, tracer test, or a slug test will be suffi-
cient for these calculations. The vertical component of velocity (Vv),
however, should be considered in estimating flow velocities in areas with sig-
nificant components of vertical velocity, such as recharge and discharge zones.
To apply the Darcy equation to ground-water monitoring, one needs to
determine the parameters K, i, and Ne. The hydraulic conductivity, K, is the
volume of water at the existing kinematic viscosity that will move in unit
time under a unit hydraulic gradient through a unit area measured at right
angles to the direction of flow. The reference to "existing kinematic vis-
cosity" relates to the fact that hydraulic conductivity is not only determined
by the media (aquifer), but also by fluid properties (ground water or poten-
tial contaminants). Thus, it is possible to have several hydraulic conduc-
tivity values for many different chemical substances that are present in the
same aquifer. In either case, it is advisable to use the greatest value of
velocity calculated using the Darcy equation to determine sampling
intervals. This will provide for the earliest detection of a leak from a
hazardous waste facility and expeditious remedial action procedures. A range
of hydraulic conductivities (the transmitted fluid is water) for various aqui-
fer materials is given in Figure 2-1. The top line has units of m/d; the
middle line, in ft/d, is commonly used; the bottom line has units of gal/d
per square foot of area.
The hydraulic gradient, i, is the change in hydraulic head per unit of
distance in a given direction. It can be determined by dividing the differ-
ence in head between two points on a potentiometric surface map by the ortho-
gonal distance between those two points (see example calculation). Water
level measurements are normally used to determine the natural hydraulic gradi-
ent at a facility. However, the effects of mounding in the event of a leak
from a waste disposal facility may produce a steeper local hydraulic gradient
in the vicinity of the monitoring well. These local changes in hydraulic
gradient should be accounted for in the velocity calculations.
The effective porosity, Ne, is the ratio, usually expressed as a per-
centage, of the total volume of voids available for fluid transmission to the
total volume of the porous medium dewatered. It can be estimated during a
pump test by dividing the volume of water removed from an aquifer by the total
volume of aquifer dewatered (see example calculation). Table 2-2 presents
approximate effective porosity values for a variety of aquifer materials. In
cases where the effective porosity is unknown, specific yield may be substi-
tuted into the equation. Specific yields of selected rock units are given in
Table 2-3. In the absence of measured values, drainable porosity is often
used to approximate effective porosity. Figure 2-2 illustrates representative
values of drainable porosity and total porosity as a function of aquifer
particle size.
Once the values for K, i, and Ne are determined, the average linear
ground-water velocity can be calculated. Using the Darcy equation, we can
determine the time required for ground water to pass through the complete
monitoring well diameter by dividing the monitoring well diameter by the aver-
age linear velocity of ground water. This value will represent the minimum
time interval required between sampling events that will yield an independent
ground-water sample. (Three-dimensional mixing of ground water in the vicin-
ity of the monitoring well will occur when the well is purged before sampling,
which is one reason why this method only provides an estimation of travel
time.)
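A minimal Python sketch of this estimate follows; the hydraulic conductivity,
gradient, porosity, and 4-inch well diameter used here are hypothetical
illustration values, not recommended defaults.

    # Illustrative sketch: minimum time between sampling events, estimated as
    # the time for ground water to travel one monitoring-well diameter at the
    # average linear horizontal velocity from the Darcy equation.

    def darcy_velocity_ft_per_day(k_ft_per_day, hydraulic_gradient,
                                  effective_porosity):
        """Average linear horizontal velocity, Vh = (Kh * i) / Ne."""
        return k_ft_per_day * hydraulic_gradient / effective_porosity

    def min_sampling_interval_days(vh_ft_per_day, well_diameter_in=4.0):
        """Days for ground water to move one well diameter."""
        return (well_diameter_in / 12.0) / vh_ft_per_day

    vh = darcy_velocity_ft_per_day(k_ft_per_day=10.0, hydraulic_gradient=0.005,
                                   effective_porosity=0.20)
    print(f"Vh = {vh:.2f} ft/d")                                       # 0.25 ft/d
    print(f"minimum interval = {min_sampling_interval_days(vh):.1f} d")  # about 1.3 d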
In determining these sampling intervals, one should note that many chemi-
cal compounds will not travel at the same velocity as ground water. Chemical
characteristics such as adsorptive potential, specific gravity, and molecular
size will influence the way chemicals travel in the subsurface. Large mole-
cules, for example, will tend to travel slower than the average linear veloc-
ity of ground water because of matrix interactions. Compounds that exhibit a
strong adsorptive potential will undergo a similar fate that will dramatically
change time of travel predictions using the Darcy equation. In some cases
chemical interaction with the matrix material will alter the matrix structure
and its associated hydraulic conductivity, which may result in an increase in
contaminant mobility.
[Figure 2-1 (chart): ranges of hydraulic conductivity for selected geologic
materials (igneous and metamorphic rocks, basalt, sandstone, shale, carbonate
rocks, clay, silt and loess, silty sand, clean sand, glacial till, and gravel),
plotted on logarithmic scales in m/d, ft/d, and gal/d per square foot.]
Source: Heath, R. C. 1983. Basic Ground-Water Hydrology. U.S. Geological
Survey Water Supply Paper 2220, 84 pp.
Figure 2-1. Hydraulic conductivity of selected rocks.
TABLE 2-2. DEFAULT VALUES FOR EFFECTIVE POROSITY FOR USE
IN TIME OF TRAVEL (TOT) ANALYSES

Soil textural classes                              Effective porosity
                                                   of saturation(a)

Unified soil classification system
  GW, GP, GM, GC, SW, SP, SM, SC                   0.20  (20%)
  ML, MH                                           0.15  (15%)
  CL, OL, CH, OH, PT                               0.01  (1%)(b)

USDA soil textural classes
  Clays, silty clays, sandy clays                  0.01  (1%)(b)
  Silts, silt loams, silty clay loams              0.10  (10%)
  All others                                       0.20  (20%)

Rock units (all)
  Porous media (nonfractured rocks such as
    sandstone and some carbonates)                 0.15  (15%)
  Fractured rocks (most carbonates, shales,
    granites, etc.)                                0.0001  (0.01%)

Source: Barari, A., and L. S. Hedges. 1985. Movement of Water in Glacial
Till. Proceedings of the 17th International Congress of the International
Association of Hydrogeologists, pp. 129-134.

(a) These values are estimates and there may be differences between similar
units. For example, recent studies indicate that weathered and unweathered
glacial till may have markedly different effective porosities (Barari and
Hedges, 1985; Bradbury et al., 1985).

(b) Assumes de minimus secondary porosity. If fractures or soil structure
are present, effective porosity should be 0.001 (0.1%).
TABLE 2-3. SPECIFIC YIELD VALUES FOR
SELECTED ROCK UNITS

Rock type                        Specific yield (%)

Clay                                    2
Sand                                   22
Gravel                                 19
Limestone                              18
Sandstone (semiconsolidated)            6
Granite                                 0.09
Basalt (young)                          8

Source: Heath, R. C. 1983. Basic Ground-Water
Hydrology. U.S. Geological Survey, Water Supply
Paper 2220, 84 pp.
[Figure 2-2 (chart): total porosity and specific yield (drainable porosity),
in percent, plotted against the maximum 10% grain size in millimeters for
typical geologic materials.]
Source: Todd, D. K. 1980. Ground Water Hydrology. John
Wiley and Sons, New York. 534 pp.
Figure 2-2. Total porosity and drainable porosity for
typical geologic materials.
This effect has been observed with certain organic solvents in clay units (see
Brown and Andersen, 1981). Contaminant fate and transport models may be use-
ful in determining the influence of these effects on movement in the sub-
surface. A variety of these models are available on the commercial market for
private use.
2.6 EXAMPLE CALCULATIONS
2.6.1 Example 1: Determining the Effective Porosity (Ne)
The effective porosity, Ne, can be determined during a pump test using
the following method:
    Ne = Volume of water removed / Volume of aquifer dewatered
Volume of water removed:
    Pumping rate of pump:  50 gpm
    Pumping duration:      30 min
    50 gpm x 30 min = 1,500 gal
Volume of aquifer dewatered:
    V = (1/3) pi r^2 h
where r is the radius of the area affected by pumping and h is the drop in the
water level. If, for example, h = 3 ft and r = 18 ft, then:
    V = (1/3)(3.14)(18^2)(3) = 1,018 ft^3
Next,
    (1,018 ft^3)(7.48 gal/ft^3) = 7,615 gal
from which:
    Ne = 1,500 gal / 7,615 gal = 19.7%
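A short Python sketch of this pump-test calculation follows; it uses the same
cone-of-depression approximation and the example values given above, and the
function name is illustrative.

    import math

    # Illustrative sketch of Example 1: effective porosity from a pump test,
    # approximating the dewatered zone as a cone of radius r and depth h.

    GALLONS_PER_CUBIC_FOOT = 7.48

    def effective_porosity(pump_rate_gpm, duration_min, radius_ft, drawdown_ft):
        """Ne = volume of water removed / volume of aquifer dewatered."""
        water_removed_gal = pump_rate_gpm * duration_min
        aquifer_volume_ft3 = (1.0 / 3.0) * math.pi * radius_ft**2 * drawdown_ft
        return water_removed_gal / (aquifer_volume_ft3 * GALLONS_PER_CUBIC_FOOT)

    # Values from the worked example: 50 gpm for 30 min, r = 18 ft, h = 3 ft
    print(f"Ne = {effective_porosity(50, 30, 18, 3):.1%}")   # about 19.7%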
2.6.2 Example 2: Determining the Hydraulic Gradient (i)
The hydraulic gradient can be determined from a potentiometric surface
map (Figure 2-3 below) using the following method.
[Figure 2-3 (map): potentiometric surface contours showing the two piezometers,
Pz1 and Pz2, used to compute the hydraulic gradient.]
Figure 2-3. Potentiometric surface map for computation
of hydraulic gradient.
    i = Δh/l = (29.2 ft - 29.1 ft)/100 ft = 0.001 ft/ft
Here, Δh is the difference in head measured at Pz1 and Pz2, and l is the
distance between the two points.
2.6.3 Example 3: Determining the Average Linear Velocity of Ground Water
A land disposal facility has ground-water monitoring wells that are
screened in an unconfined silty sand aquifer. Slug tests, pump tests, and
tracer tests conducted during a hydrogeologic site investigation have
revealed that the aquifer has a hydraulic conductivity (Kh) of 15 ft/d and an
effective porosity (Ne) of 15%. Using a potentiometric map (as in Example 2),
the regional hydraulic gradient has been determined to be 0.003 ft/ft.
What is the minimum time interval between sampling events that will allow
one to obtain an independent sample of ground water?
Calculate the average linear horizontal component of the ground-water
velocity (Vh):
    Kh = 15 ft/d
    Ne = 0.15
    i  = 0.003 ft/ft
    Vh = (Kh)(i)/Ne = (15)(0.003)/(0.15) = 0.3 ft/d
    (0.3 ft/d)(12 in/ft) = 3.6 in/d
Discussion: The average linear horizontal velocity of ground water has
been calculated and is equal to 3.6 in/d. Monitoring well diameters at this
particular facility are 4 in. We can determine the minimum time interval
between sampling events that will allow one to obtain an independent sample of
ground water by dividing the monitoring well diameter by the average linear
velocity:
    (4 in)/(3.6 in/d) = 1.1 d
Based on the above calculations, the owner or operator could sample every
other day. However, because the velocity can vary seasonally with recharge
rates, a weekly sampling interval would be advised.
Suggested Sampling Interval

    Date        Obtain Sample No.
    June 1             1
    June 8             2
    June 15            3
    June 22            4
Table 2-4 gives some results for common situations.
TABLE 2-4. DETERMINING A SAMPLING INTERVAL

Unit                          Kh (ft/d)   Ne (%)   Vh (in/mo)    Sampling interval

Gravel                        10^4         19      9.5 x 10^4    Daily
Sand                          10^2         22      8.2 x 10^2    Daily
Silty sand                    10^1         14      1.3 x 10^2    Weekly
Till                          10^-3         2      3.6 x 10^-2   Monthly*
Sandstone (semiconsolidated)  10^0          6      3.0 x 10^1    Weekly
Basalt                        10^-1         8      2.25          Monthly*

The average linear velocities assume i = 0.005.
* Use a monthly sampling interval or an alternate sampling procedure.
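The Vh column of Table 2-4 can be reproduced from the same Darcy relationship.
The short Python sketch below does so for two of the listed units (sand and
young basalt) under the gradient of i = 0.005 stated beneath the table; the
30-day month used for the unit conversion is an assumption.

    # Illustrative check of the Vh column of Table 2-4 for two units, using
    # Vh = (Kh * i) / Ne with i = 0.005 and an assumed 30-day month.

    def vh_inches_per_month(k_ft_per_day, effective_porosity, gradient=0.005,
                            days_per_month=30):
        vh_ft_per_day = k_ft_per_day * gradient / effective_porosity
        return vh_ft_per_day * 12.0 * days_per_month

    print(f"sand:   {vh_inches_per_month(1e2, 0.22):.0f} in/mo")    # about 8.2 x 10^2
    print(f"basalt: {vh_inches_per_month(1e-1, 0.08):.2f} in/mo")   # 2.25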
SECTION 3
CHOOSING A STATISTICAL METHOD
This chapter discusses the choice of an appropriate statistical method.
Section 3.1 includes a flow chart to guide this selection. Section 3.2 con-
tains procedures to test the distributional assumptions of statistical methods
and Section 3.3 has procedures to test specifically for equality of variances.
The choice of an appropriate statistical test depends on the type of mon-
itoring and the nature of the data. The proportion of values in the data set
that are below detection 1s one important consideration. If most of the
values are below detection, a test of proportions is suggested.
One set of statistical procedures is suggested when the monitoring con-
sists of comparisons of water sample data from the background (hydraulically
upgradlent) well with the sample data from compliance (hydraulically down-
gradient) wells. The recommended approach is analysis of variance. Also, for
a facility with limited amounts of data, it is advisable to initially use the
ANOVA method of data evaluation, and later, when sufficient amounts of data
are collected, to change to a tolerance interval or a control chart approach
for each compliance well. However, alternative approaches are allowed. These
include adjustments for seasonality, use of tolerance intervals, and use of
prediction intervals. These methods are discussed in Chapter 4.
When the monitoring objective is to compare the concentration of a haz-
ardous constituent to a fixed level such as a maximum concentration limit
(MCL), a different type of approach is needed. This type of comparison com-
monly serves as a basis of compliance monitoring. Control charts may be used,
as may tolerance or confidence intervals. Methods for comparison with a fixed
level are presented in Chapter 5.
When a long history of data from each well is available, intra-well com-
parisons are appropriate. That is, the data from a single well are compared
over time to detect shifts in concentration, or gradual trends in concentra-
tion that may indicate contamination. Methods for this situation are pre-
sented in Chapter 6.
3.1 FLOW CHARTS—OVERVIEW AND USE
The selection and use of a statistical procedure for ground-water moni-
toring is a detailed process. Because a single flow chart would become too
complicated for easy use, a series of flow charts has been developed. These
flow charts are found at the beginning of each chapter and are intended to
guide the user in the selection and use of procedures in that chapter. The
more detailed flow charts can be thought of as attaching to the general flow
charts at the indicated points.
There are three general types of statistical procedures discussed. One
type of procedure is appropriate for comparisons among wells; that is, back-
ground well to compliance well data comparisons. The second type of procedure
is to compare compliance well data with a constant limit such as an alternate
concentration limit (ACL) or a maximum concentration limit (MCL). The third
type of procedure is designed for intra-well comparisons. This method of
analysis may be used when sufficient data from an individual well exist and
the data allow for the identification of trends. A recommended control chart
procedure (Starks, 1988) suggests that a minimum background sample of eight
observations is needed. Thus an intra-well control chart approach could begin
after the first complete year of data collection.
The first question to be asked in determining the appropriate statistical
procedure is the type of monitoring program specified in the facility permit.
The type of monitoring program may determine if the appropriate comparison is
among wells, comparison of downgradient data to a constant, intra-well com-
parisons, or a special case.
If the facility is in detection monitoring, the appropriate comparison is
between wells that are hydraulically upgradient from the facility and those
that are hydraulically downgradient. The statistical procedures for this type
of monitoring are presented in Chapter 4. In detection monitoring, it is
likely that many of the monitored constituents may result in few quantified
results (i.e., much of the data are below the limit of analytical detec-
tion). If this is the case, then the test of proportions (Section 7.1.3) may
be recommended. If the constituent occurs in measurable concentrations in
background, then analysis of variance (Section 4.2) is recommended. This
method of analysis is preferred when the data lack sufficient quantity to
allow for the use of tolerance intervals or control charts.
If the facility is in compliance monitoring, the permit will specify the
type of compliance limit. If the compliance limit is determined from the
background, the statistical method is chosen from those that compare back-
ground well to compliance well data. Statistical methods for this case are
presented in Chapter 4. The preferred method is the appropriate analysis of
variance method in Section 4.2, or if sufficient data permit, tolerance inter-
vals or control charts. The flow chart in Chapter 4 aids in determining which
method is applicable.
If a facility in compliance monitoring has a constant maximum concentra-
tion limit (MCL) or alternate concentration limit (ACL) specified, then the
appropriate comparison is with a constant. Methods for comparison with MCLs
or ACLs are presented in Chapter 5, which contains a flow chart to aid in
determining which method to use.
Finally, when more than one year of data have been collected from each
well, the facility owner or operator may find it useful to perform intra-well
comparisons over time to supplement the other methods. This is not a regula-
tory requirement, but it could provide the facility owner or operator with
information about the site hydrogeology. These methods are presented in
Chapter 6.
The user should refer to Figure 3-1 to determine the type of comparison
required. Initially this is either a comparison of background well to com-
pliance well data or a comparison of compliance well data with an MCL or
ACL. This leads the user to either Chapter 4 or 5. Eventually, there will be
sufficient data to add the intra-well comparison presented in Chapter 6.
3.2 CHECKING DISTRIBUTIONAL ASSUMPTIONS
The purpose of this section is to provide users with methods to check the
distributional assumptions of the statistical procedures recommended for
ground-water monitoring. It is emphasized that one need not do an extensive
study of the distribution of the data unless a nonparametric method of analy-
sis is used to evaluate the data. If the owner or operator wishes to trans-
form the data, it must first be shown that the untransformed data are inappro-
priate for a normal theory test. Similarly, if the owner or operator wishes
to use nonparametric methods, he or she must demonstrate that the data do
violate normality assumptions.
EPA has adopted this approach because most of the statistical procedures
that meet the criteria set forth in the regulations are robust with respect to
departures from many of the distributional assumptions. That is, only extreme
violations of assumptions will result in an incorrect outcome of a statistical
test. Moreover, it is only in situations where it is unclear whether contami-
nation is present that departures from assumptions will alter the outcome of a
statistical test. EPA therefore believes that it is protective of the envi-
ronment to adopt the approach of not requiring testing of assumptions on a
wide scale.
It should be noted that the distributional assumptions for statistical
procedures apply to the errors of the observations. Application of the dis-
tributional tests to the observations themselves may lead to the conclusion
that the distribution does not fit the observations. In some cases this lack
of fit may be due to differences in means for the different wells or some
other cause. The tests for distributional assumptions are best applied to the
residuals from a statistical analysis. The residuals are the differences be-
tween the original observations and the values predicted by a model. For
example, in analysis of variance, the predicted values are the group means and
the residual is the difference between each observation and its group mean.
If the conclusion from testing the assumptions is that the assumptions
are not adequately met, then a transformation of the data may be used or a
nonparametric statistical procedure selected. Many types of concentration
data have been reported in the literature to be adequately described by a log-
normal distribution. That is, the natural logarithm of the original observa-
tions has been found to follow the normal distribution. Consequently, if the
distributional assumptions are found to be violated for the original data, a
transformation by taking the natural logarithm of each observation is sug-
gested. This assumes that the data are all positive. If the log transforma-
tion does not adequately normalize the data or stabilize the variance, one
should use the nonparametric procedure or seek the consultation of a profes-
sional statistician to determine an appropriate statistical procedure.

Figure 3-1. [Overview flow chart: detection monitoring leads to background
well/compliance well comparisons (Section 4); compliance monitoring or
corrective action with an MCL/ACL compliance limit leads to comparisons with
MCLs/ACLs (Section 5); a background-based compliance limit leads back to
background/compliance well comparisons (Section 4); with more than 1 yr. of
data, intra-well comparisons using control charts (Section 6) may be added.]
The following sections present four selected approaches to check for
normality. The first option refers to a literature citation; the other three
are statistical procedures. The choice is left to the user. The availability
of statistical software and the user's familiarity with it will be a factor in
the choice of a method. The coefficient of variation method, for example,
requires only the computation of the mean and standard deviation of the
data. Plotting on probability paper can be done by hand but becomes tedious
with many data sets. However, the commercial Statistical Analysis System
(SAS) software package provides a computerized version of a probability plot
in its PROC UNIVARIATE procedure. SYSTAT, a package for PCs also has a prob-
ability plot procedure. The chi-squared test is not readily available through
commercial software but can be programmed on a PC (for example in LOTUS 1-2-3)
or in any other (statistical) software language with which the user is
familiar. The amount of data available will also influence the choice. All
tests of distributional assumptions require a fairly large sample size to
detect moderate to small deviations from normality. The chi-squared test
requires a minimum of 20 samples for a reasonable test.
Other statistical procedures are available for checking distributional
assumptions. The more advanced user is referred to the Kolmogorov-Smirnov
test (see, for example, Lindgren, 1976), which is used to test the hypothesis
that data come from a specific (that is, completely specified) distribution.
The normal distribution assumption can thus be tested. A minimum sample
size of 50 is recommended for using this test.
A modification to the Kolmogorov-Smirnov test has been developed by
Lilliefors, who uses the sample mean and standard deviation from the data as
the parameters of the distribution (Lilliefors, 1967). Again, a sample size
of at least 50 is recommended.
Another alternative for testing normality is provided by the rather
involved Shapiro-Wilk's test. The interested user is referred to the relevant
article in Biometrika by Shapiro and Wilk (1965).
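For users with access to modern statistical software, these tests are
straightforward to run. The following is a minimal illustrative sketch, not
part of the guidance, assuming Python with the numpy, scipy, and statsmodels
packages; the data array is hypothetical and far smaller than the sample sizes
of about 50 recommended above.

    import numpy as np
    from scipy import stats
    from statsmodels.stats.diagnostic import lilliefors

    # Hypothetical concentration data, for illustration only.
    x = np.array([2.1, 3.4, 1.8, 2.9, 5.6, 4.4, 3.1, 2.2, 6.3, 3.8, 2.7, 4.9])

    # Kolmogorov-Smirnov test: the distribution must be completely specified,
    # so a hypothesized mean and standard deviation are supplied (here 3.5 and 1.5).
    ks_stat, ks_p = stats.kstest(x, 'norm', args=(3.5, 1.5))

    # Lilliefors' modification: mean and standard deviation estimated from the sample.
    lf_stat, lf_p = lilliefors(x, dist='norm')

    # Shapiro-Wilk test.
    sw_stat, sw_p = stats.shapiro(x)

    print(f"K-S p = {ks_p:.3f}, Lilliefors p = {lf_p:.3f}, Shapiro-Wilk p = {sw_p:.3f}")

A small p-value from any of these tests is evidence against normality of the
data (or residuals) being checked.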
3.2.1 Literature Citation
PURPOSE
An owner or operator may wish to consult literature to determine what
type of distribution the ground-water monitoring data for a specific con-
stituent are likely to follow. This may avoid unnecessary computations and
make it easier to determine whether there is statistically significant evi-
dence of contamination.
PROCEDURE
One simple way to select a procedure based on a specific statistical dis-
tribution is by citing a relevant published reference. The owner or operator
may find papers that discuss data resulting from sampling ground water and
conclude that such data for a particular constituent follow a specified dis-
tribution. Citing such a reference may be sufficient justification for using
a method based on that distribution, provided that the data do not show evi-
dence that the assumptions are violated.
A literature citation should consider the distribution of data for the
specific compound being monitored. In addition, it should consider sites with
similar hydrogeologic characteristics to the extent possible. However,
because many of the compounds may not be studied in the literature, extrapola-
tions to compounds with similar chemical characteristics and to sites with
similar hydrogeologic conditions are also acceptable. Basically, the owner or
operator needs to provide some reason or justification for choosing a par-
ticular distribution.
3.2.2 Coefficient of Variation
Many statistical procedures assume that the data are normally distrib-
uted. The concentration of a hazardous constituent in ground water is inher-
ently nonnegative, while the normal distribution allows for negative values.
However, if the mean of the normal distribution is sufficiently above zero,
the distribution places very little probability on negative observations and
is still a valid approximation.
One simple check that can rule out use of the normal distribution is to
calculate the coefficient of variation of the data. The use of this method
was required by the former Part 264 Subpart F regulations pursuant to Sec-
tion 264.97(h)(1). Because most owners and operators as well as regional
personnel are already familiar with this procedure, it will probably be used
frequently. The coefficient of variation, CV, is the standard deviation of
the observations divided by their mean. If the normal distribution is to be
a valid model, there should be very little probability of negative values.
The number of standard deviations by which the mean exceeds zero determines
the probability of negative values. For example, if the mean exceeds zero by
one standard deviation, the normal distribution will have less than 0.159
probability of a negative observation.
Consequently, one can calculate the standard deviation of the observa-
tions, calculate the mean, and form the ratio of the standard deviation di-
vided by the mean. If this ratio exceeds 1.00, there is evidence that the
data are not normal and the normal distribution should not be used for those
data. (There are other possibilities for nonnormality, but this is a simple
check that can rule out obviously nonnormal data.)
PURPOSE
This test is a simple check for evidence of gross nonnormality in the
ground-water monitoring data.
PROCEDURE
To apply the coefficient of variation check for normality, proceed as
follows.
Step 1. Calculate the sample mean, Xbar, of the n observations X_i,
i = 1, ..., n:

    Xbar = [Σ(i=1 to n) X_i] / n

Step 2. Calculate the sample standard deviation, S:

    S = [Σ(i=1 to n) (X_i - Xbar)^2 / (n - 1)]^(1/2)

Step 3. Divide the sample standard deviation by the sample mean. This
ratio is the CV:

    CV = S/Xbar
Step 4. Determine if the result of Step 3 exceeds 1.00. If so, this is
evidence that the normal distribution does not fit the data adequately.
EXAMPLE
Table 3-1 is an example data set of chlordane concentrations in 24 water
samples from a fictitious site. The data are presented in order from least to
greatest.
Applying the procedure steps to the data of Table 3-1, we have:
Step 1. Xbar = 1.52
Step 2. S = 1.56
Step 3. CV = 1.56/1.52 = 1.03
Step 4. Because the result of Step 3 was 1.03, which exceeds 1.00, we
conclude that there is evidence that the data do not adequately follow the
normal distribution. As will be discussed in other sections, one would then
either transform the data, use a nonparametric procedure, or seek professional
guidance.
NOTE. The owner or operator may choose to use parametric tests since
1.03 is so close to the limit, but should use a transformation or a nonpara-
metric test if he or she believes that the parametric test results would be
incorrect due to the departure from normality.
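The coefficient of variation check reduces to a few lines of code. The
following is a minimal sketch, not part of the guidance, assuming Python with
numpy and scipy; applied to the chlordane data of Table 3-1 it reproduces the
example values (mean about 1.52, standard deviation about 1.56, CV about 1.03).

    import numpy as np
    from scipy import stats

    # Chlordane concentrations (ppm) from Table 3-1.
    chlordane = np.array([0.04, 0.18, 0.18, 0.25, 0.29, 0.38, 0.50, 0.50,
                          0.60, 0.93, 0.97, 1.10, 1.16, 1.29, 1.37, 1.38,
                          1.45, 1.46, 2.58, 2.69, 2.80, 3.33, 4.50, 6.60])

    mean = chlordane.mean()          # Step 1: sample mean
    sd = chlordane.std(ddof=1)       # Step 2: sample standard deviation
    cv = sd / mean                   # Step 3: coefficient of variation

    # Step 4: a CV above 1.00 is evidence of gross nonnormality.
    print(f"mean = {mean:.2f}, S = {sd:.2f}, CV = {cv:.2f}, exceeds 1.00: {cv > 1.00}")

    # Probability of a negative value if the data really were normal with this
    # mean and standard deviation (compare the 0.159 figure quoted above for CV = 1).
    print(f"P(negative value) = {stats.norm.cdf(-1.0 / cv):.3f}")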
TABLE 3-1. EXAMPLE DATA FOR COEFFICIENT
OF VARIATION TEST
Chlordane concentration (ppm)
0.04
0.18
0.18
0.25
0.29
0.38
0.50
0.50
0.60
0.93
0.97
1.10
1.16
1.29
1.37
1.38
1.45
1.46
2.58
2.69
2.80
3.33
4.50
6.60
3.2.3 Plotting on Probability Paper
PURPOSE
Probability paper is a visual aid and a diagnostic tool in determining
whether a set of data follows a normal distribution. Also, approximate esti-
mates of the mean and standard deviation of the distribution can be read from
the plot.
PROCEDURE
Let X be the variable and X_1, X_2, ..., X_i, ..., X_n the set of n observations.
The values of X can be raw data, residuals, or transformed data.
Step 1. Rearrange the observations in ascending order:
X(1), X(2), ..., X(n).
Step 2. Compute the cumulative frequency for each distinct value X(i)
as (i/(n+1)) x 100%. The divisor of (n+1) is a plotting convention to avoid
cumulative frequencies of 100%, which would be at infinity on the probability
paper.
If a value of X occurs more than once, then the corresponding value of i
increases appropriately. For example, if X(2) = X(3), then the cumulative
frequency for X(1) is 100 x 1/(n+1), but the cumulative frequency for X(2) or
X(3) is 100 x 3/(n+1).
Step 3. Plot the distinct pairs [X(i), (i/(n+1)) x 100%] on prob-
ability paper (this paper is commercially available) using an appropriate
scale for X on the horizontal axis. The vertical axis for the cumulative
frequencies is already scaled from 0.01 to 99.99%.
If the points fall roughly on a straight line (the line can be drawn by
hand with a ruler), then one can conclude that the underlying distribution is
approximately normal. Also, an estimate of the mean and standard deviation
can be made from the plot. The horizontal line drawn through 50% cuts the
plotted line at the mean of the X values. The horizontal line going through
84% cuts the line at a value corresponding to the mean plus one standard devi-
ation. By subtraction, one obtains the standard deviation.
REFERENCE
Dixon, W. J., and F. J. Massey, Jr. Introduction to Statistical Analysis.
McGraw-Hill, Fourth Edition, 1983.
EXAMPLE
Table 3-2 lists 22 distinct chlordane concentration values (X) along with
their frequencies. These are the same values as those listed in Table 3-1.
There is a total of n = 24 observations.
Step 1. Sort the values of X in ascending order (column 1).
Step 2. Compute [100 x (i/25)], column 4, for each distinct value of X,
based on the values of i (column 3).
Step 3. Plot the pairs [X(i), 100 x (i/25)] on probability paper (Fig-
ure 3-2).
INTERPRETATION
The points in Figure 3-2 do not fall on a straight line; therefore, the
hypothesis of an underlying normal distribution is rejected. However, the
shape of the curve indicates a lognormal distribution. This is checked in the
next step.
TABLE 3-2. EXAMPLE DATA COMPUTATIONS FOR PROBABILITY PLOTTING

Concentration   Absolute
      X         frequency       i     100 x (i/(n+1))     ln(X)
    0.04            1           1             4           -3.22
    0.18            2           3            12           -1.71
    0.25            1           4            16           -1.39
    0.29            1           5            20           -1.24
    0.38            1           6            24           -0.97
    0.50            2           8            32           -0.69
    0.60            1           9            36           -0.51
    0.93            1          10            40           -0.07
    0.97            1          11            44           -0.03
    1.10            1          12            48            0.10
    1.16            1          13            52            0.15
    1.29            1          14            56            0.25
    1.37            1          15            60            0.31
    1.38            1          16            64            0.32
    1.45            1          17            68            0.37
    1.46            1          18            72            0.38
    2.58            1          19            76            0.95
    2.69            1          20            80            0.99
    2.80            1          21            84            1.03
    3.33            1          22            88            1.20
    4.50            1          23            92            1.50
    6.60            1          24            96            1.89

Figure 3-2. Probability plot of raw chlordane concentrations. [Axes:
concentration (horizontal) versus cumulative frequency, 100 x (i/(n+1))
(vertical).]
Next, take the natural logarithm of the X-values (ln(X), column 5 in
Table 3-2). Repeat Step 3 above using the pairs [ln(X), 100 x (i/25)]. The re-
sulting plot is shown in Figure 3-3. The points fall approximately on a
straight line (hand-drawn), and the hypothesis of lognormality of X, i.e.,
that ln(X) is normally distributed, can be accepted. The mean can be estimated at
slightly below 0 and the standard deviation at about 1.2 on the log scale.
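On a computer the probability-paper plot can be approximated without special
paper: plotting the ordered concentrations (or their logarithms) against the
standard normal quantiles of the cumulative frequencies i/(n+1) gives the same
straight-line diagnostic. The following is a minimal sketch, not part of the
guidance, assuming Python with numpy, scipy, and matplotlib; tied values are
plotted at consecutive positions rather than once at their largest index,
which does not change the visual impression.

    import numpy as np
    from scipy import stats
    import matplotlib.pyplot as plt

    # Chlordane concentrations (ppm) from Table 3-1, sorted in ascending order.
    x = np.sort([0.04, 0.18, 0.18, 0.25, 0.29, 0.38, 0.50, 0.50,
                 0.60, 0.93, 0.97, 1.10, 1.16, 1.29, 1.37, 1.38,
                 1.45, 1.46, 2.58, 2.69, 2.80, 3.33, 4.50, 6.60])
    n = len(x)
    cum_freq = np.arange(1, n + 1) / (n + 1)   # i/(n+1), as in column 4 of Table 3-2
    z = stats.norm.ppf(cum_freq)               # position on the probability-paper scale

    fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(9, 4))
    ax1.plot(x, z, 'o')
    ax1.set_xlabel('Concentration'); ax1.set_ylabel('Normal quantile of i/(n+1)')
    ax1.set_title('Raw data (curved, as in Figure 3-2)')
    ax2.plot(np.log(x), z, 'o')
    ax2.set_xlabel('ln(Concentration)')
    ax2.set_title('Log data (roughly linear, as in Figure 3-3)')
    plt.tight_layout()
    plt.show()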
3.2.4 The Chi-Squared Test
The chi-squared test can be used to test whether a set of data fits a
specified distribution. Most introductory courses in statistics explain the
chi-squared test, and its familiarity among owners and operators as well as
regional personnel may make it a frequently used method of analysis. In this
application the assumed distribution is the normal distribution, but other
distributions could also be used. The test consists of defining cells or
ranges of values and determining the expected number of observations that
would fall in each cell according to the hypothesized distribution. The
actual number of data points in each cell is compared with that predicted by
the distribution to judge the adequacy of the fit.
PURPOSE
The chi-squared test is used to test the adequacy of the assumption of
normality of the data.
PROCEDURE
Step 1. Determine the appropriate number of cells, K. This number
usually ranges from 5 to 10. Divide the number of observations, N, by 4 and
use the integer part of the result, or 10 if the result exceeds 10.
Step 2. Standardize the data by subtracting the sample mean and divid-
ing by the sample standard deviation:

    Z_i = (X_i - Xbar)/S
Step 3. Determine the number of observations that fall in each of the
cells defined according to Table 3-3. The expected number of observations for
each cell is N/K, where N is the total number of observations and K is the
number of cells. Let N_i denote the observed number in cell i (for i taking
values from 1 to K) and let E_i denote the expected number of observations in
cell i. Note that in this case the cells are chosen to make the E_i's equal.
Step 4. Calculate the chi-squared statistic by the formula below:

    X^2 = Σ(i=1 to K) (N_i - E_i)^2 / E_i
Step 5. Compare the calculated result to the table of the chi-squared
distribution with K-3 degrees of freedom (Table 1, Appendix B). Reject the
hypothesis of normality if the calculated value exceeds the tabulated value.
Figure 3-3. Probability plot of log-transformed chlordane concentrations.
[Axes: log concentration (horizontal) versus cumulative frequency,
100 x (i/(n+1)) (vertical); the fitted line is marked at the mean and at the
mean plus one standard deviation.]
TABLE 3-3. CELL BOUNDARIES FOR THE CHI-SQUARED TEST
(cell boundaries for equal expected cell sizes with the normal distribution)

                        Number of cells (K)
       5        6        7        8        9       10
    -0.84    -0.97    -1.07    -1.15    -1.22    -1.28
    -0.25    -0.43    -0.57    -0.67    -0.76    -0.84
     0.25     0.00    -0.18    -0.32    -0.43    -0.52
     0.84     0.43     0.18     0.00    -0.14    -0.25
              0.97     0.57     0.32     0.14     0.00
                       1.07     0.67     0.43     0.25
                                1.15     0.76     0.52
                                         1.22     0.84
                                                  1.28
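The boundaries in Table 3-3 are simply the standard normal quantiles that
divide the distribution into K cells of equal probability, that is, the
(i/K)-quantiles for i = 1, ..., K-1. A minimal sketch, not part of the
guidance, assuming Python with numpy and scipy, regenerates them:

    import numpy as np
    from scipy import stats

    for K in range(5, 11):
        bounds = stats.norm.ppf(np.arange(1, K) / K)
        print(K, np.round(bounds, 2))
    # K = 5 gives [-0.84 -0.25 0.25 0.84]; K = 10 gives [-1.28 ... 1.28].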
REFERENCE
Remington, R. D., and M. A. Schork. Statistics with Applications to the
Biological and Health Sciences. Prentice-Hall, 1970. pp. 235-236.
EXAMPLE
The data in Table 3-4 are N = 21 residuals from an analysis of variance
on dioxin concentrations. The analysis of variance assumes that the errors
(estimated by the residuals) are normally distributed. The chi-squared test
is used to check this assumption.
Step 1. Divide the number of observations, 21, by 4 to get 5.25. Keep
only the integer part, 5, so the test will use K = 5 cells.
Step 2. The sample mean and standard deviation are calculated and found
to be: Xbar = 0.00, S = 0.24. The data are standardized by subtracting the mean
(0 in this case) and dividing by S. The results are also shown in Table 3-4.
Step 3. Determine the number of (standardized) observations that fall
into the five cells determined from Table 3-3. These divisions are: (1) less
than or equal to -0.84, (2) greater than -0.84 and less than or equal to
-0.25, (3) greater than -0.25 and less than or equal to +0.25, (4) greater
than 0.25 and less than or equal to 0.84, and (5) greater than 0.84. We find
4 observations in cell 1, 6 in cell 2, 2 in cell 3, 4 in cell 4, and 5 in
cell 5.
Step 4. Calculate the chi-squared statistic. The expected number in
each cell is N/K = 21/5 = 4.2.

    X^2 = [(4-4.2)^2 + (6-4.2)^2 + (2-4.2)^2 + (4-4.2)^2 + (5-4.2)^2]/4.2 = 2.10
TABLE 3-4. EXAMPLE DATA FOR CHI-SQUARED TEST

Observation     Residual     Standardized residual
     1           -0.45              -1.90
     2           -0.35              -1.48
     3           -0.35              -1.48
     4           -0.22              -0.93
     5           -0.16              -0.67
     6           -0.13              -0.55
     7           -0.11              -0.46
     8           -0.10              -0.42
     9           -0.10              -0.42
    10           -0.06              -0.25
    11           -0.05              -0.21
    12            0.04               0.17
    13            0.11               0.47
    14            0.13               0.55
    15            0.16               0.68
    16            0.17               0.72
    17            0.20               0.85
    18            0.21               0.89
    19            0.30               1.27
    20            0.34               1.44
    21            0.41               1.73
Step 5. The critical value at the 5% level for a chi-squared test with
2 (K-3 = 5-3 = 2) degrees of freedom is 5.99 (Table 1, Appendix B). Because
the calculated value of 2.10 is less than 5.99, there is no evidence that these
data are not normal.
INTERPRETATION
The cell boundaries are determined from the normal distribution so that
equal numbers of observations should fall in each cell. If there are large
differences between the number of observations in each cell and that predicted
by the normal distribution, this is evidence that the data are not normal.
The chi-squared statistic is a nonnegative statistic that increases as the
difference between the predicted and observed number of observations in each
cell increases.
If the calculated value of the chi-squared statistic exceeds the tabu-
lated value, there is statistically significant evidence that the data do not
follow the normal distribution. In that case, one would need to do a trans-
formation, use a nonparametric procedure, or seek consultation before inter-
preting the results of the test of the ground-water data. If the calculated
value of the chi-squared statistic does not exceed the tabulated critical
value, there is no significant lack of fit to the normal distribution and one
can proceed assuming that the assumption of normality is adequately met.
Remark: The chi-squared statistic can be used to test whether the re-
siduals from an analysis of variance or other procedure are normal. In this
case the degrees of freedom are found as (number of cells minus one minus the
number of parameters that have been estimated). This may require more than
the suggested 10 cells. The chi-squared test does require a fairly large sam-
ple size in that there should generally be at least four observations per
cell.
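The full procedure is easy to automate. The following is a minimal sketch,
not part of the guidance, assuming Python with numpy and scipy; applied to
the 21 residuals of Table 3-4 it reproduces the worked example (observed
counts 4, 6, 2, 4, 5 and a chi-squared value near 2.10). The Table 3-3
boundaries for K = 5 are entered directly.

    import numpy as np
    from scipy import stats

    # Residuals from Table 3-4.
    resid = np.array([-0.45, -0.35, -0.35, -0.22, -0.16, -0.13, -0.11, -0.10,
                      -0.10, -0.06, -0.05, 0.04, 0.11, 0.13, 0.16, 0.17,
                      0.20, 0.21, 0.30, 0.34, 0.41])
    N = len(resid)
    K = min(N // 4, 10)                                  # Step 1: K = 5 cells here
    z = (resid - resid.mean()) / resid.std(ddof=1)       # Step 2: standardize

    # Step 3: count observations in the K = 5 cells of Table 3-3.
    bounds = np.array([-0.84, -0.25, 0.25, 0.84])
    observed = np.bincount(np.searchsorted(bounds, z), minlength=K)
    expected = N / K                                     # 4.2 per cell

    chi2 = np.sum((observed - expected) ** 2 / expected) # Step 4
    crit = stats.chi2.ppf(0.95, df=K - 3)                # Step 5: 5.99 with 2 d.f.
    print(f"counts = {observed}, chi-squared = {chi2:.2f}, 5% critical value = {crit:.2f}")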
3.3 CHECKING EQUALITY OF VARIANCE: BARTLETT'S TEST
The analysis of variance procedures presented in Chapter 4 are often more
sensitive to unequal variances than to moderate departures from normality.
The procedures described in this section allow for testing to determine
whether group variances are equal or differ significantly. Often in practice
unequal variances and nonnormality occur together. Sometimes a transformation
to stabilize or equalize the variances also produces a distribution that is
more nearly normal. This sometimes occurs if the initial distribution was
positively skewed with variance increasing with the number of observations.
Only Bartlett's test for checking equality, or homogeneity, of variances is
presented here. It encompasses checking equality of more than two variances
with unequal sample sizes. Other tests are available for special cases. The
F-test applies to the special situation when there are only two groups to be
compared. The user is referred to classical textbooks for this test (e.g.,
Snedecor and Cochran, 1980). In the case of equal sample sizes but more than
two variances to be compared, the user might want to use Hartley's, or maximum
F-ratio, test (see Nelson, 1987). This test provides a quick procedure to test
for variance homogeneity.
PURPOSE
Bartlett's test is a test of homogeneity of variances. In other words,
it is a means of testing whether a number of population variances of normal
distributions are equal. This is an assumption made in analysis of variance
when comparing concentrations of constituents between background and compli-
ance wells, or among compliance wells. It should be noted that Bartlett's
test is sensitive to nonnormality in the data. With long-tailed distributions
the test too often rejects equality (homogeneity) of the variances.
PROCEDURE
Assume that data from k wells are available and that there are n_i data
points for well i.
Step 1. Compute the k sample variances S_1^2, ..., S_k^2. Each variance has
associated with it f_i = n_i - 1 degrees of freedom. Take the natural logarithm
of each variance: ln(S_1^2), ..., ln(S_k^2).
Step 2. Compute the test statistic

    X^2 = f ln(S_p^2) - Σ(i=1 to k) f_i ln(S_i^2)

where f = Σ(i=1 to k) f_i = Σ(i=1 to k) n_i - k; thus f is the total sample size
minus the number of wells (groups); and

    S_p^2 = (1/f) Σ(i=1 to k) f_i S_i^2 , the pooled variance across wells.

Step 3. Using the chi-squared table (Table 1, Appendix B), find the
critical value for X^2 with (k-1) degrees of freedom at a predetermined signif-
icance level, for example, 0.05.
INTERPRETATION
If the calculated value X^2 is larger than the tabulated value, then con-
clude that the variances are not equal at that significance level.
REFERENCE
Johnson, N. L., and F. C. Leone. Statistics and Experimental Design in
Engineering and the Physical Sciences. Vol. I, John Wiley and Sons, New York,
1977.
EXAMPLE
Manganese concentrations are given for k=6 wells in Table 3-5 below.
TABLE 3-5. EXAMPLE DATA FOR BARTLETT'S TEST

Sampling date      Well 1    Well 2    Well 3       Well 4    Well 5    Well 6
January 1              50        46       272           34        48        68
February 1             73        77       171        3,940        52       991
March 1               244                  32                               54
April 1               202                  53

n_i                     4         2         4            2         2         3
f_i = n_i - 1           3         1         3            1         1         2
S_i                    95        22       112        2,762         3       537
S_i^2               9,076       481    12,454    7,628,418         8   288,349
f_i S_i^2          27,229       481    37,362    7,628,418         8   576,698
ln(S_i^2)               9         6         9           16         2        13
f_i ln(S_i^2)          27         6        28           16         2        25
Step 1. Compute the six sample variances and take their natural
logarithms, ln(S_1^2), ..., ln(S_6^2), giving 9, 6, ..., 13, respectively.
Step 2. Compute

    Σ(i=1 to 6) f_i ln(S_i^2) = 105

This is the sum of the last line in Table 3-5 (computed before rounding).

    Compute f = Σ f_i = 3 + 1 + ... + 2 = 11

    Compute S_p^2 = (1/f) Σ f_i S_i^2
                  = (27,229 + 481 + 37,362 + 7,628,418 + 8 + 576,698)/11 = 751,836
    Take the natural logarithm of S_p^2: ln(S_p^2) = 13.53

    Compute X^2 = f ln(S_p^2) - Σ f_i ln(S_i^2) = 11(13.53) - 105 = 44
    (to the nearest whole number)

Step 3. The critical X^2 value with 6-1 = 5 degrees of freedom at the 5%
significance level is 11.1 (Table 1 in Appendix B). Since 44 is larger than
11.1, we conclude that the six variances S_1^2, ..., S_6^2 are not homogeneous
at the 5% significance level.
INTERPRETATION
The sample variances of the data from the six wells were compared by
means of Bartlett's test. The test was significant at the 5% level, suggest-
ing that the variances are significantly unequal (heterogeneous). A log
transform of the data can be applied and the same test performed on the trans-
formed data. Generally, if the data follow a skewed distribution, this ap-
proach resolves the problem of unequal variances and the user can proceed with
an ANOVA, for example.
On the other hand, unequal variances among well data could be a direct
indication of well contamination, since the individual data could come from
different distributions (i.e., different means and variances). In that case
the user may wish to test which variance differs from which. The reader is
referred to the literature for a gap test of variance (Tukey, 1949;
David, 1956; or Nelson, 1987).
NOTE
In the case of k = 2 variances, the test of equality of variances is
the F-test (Snedecor and Cochran, 1980).
Bartlett's test simplifies in the case of equal sample sizes, n_i = n,
i = 1, ..., k. The test used then is Cochran's test. Cochran's test focuses on
the largest variance and compares it to the sum of all the variances. Hartley
introduced a quick test of homogeneity of variances that uses the ratio of the
largest to the smallest variance. Technical aids for the procedures under
the assumption of equal sample sizes are given by L. S. Nelson in the Journal
of Quality Technology, Vol. 19, 1987, pp. 107 and 165.
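As a check on the hand computation, Bartlett's statistic can be computed
directly from the well data. The following is a minimal sketch, not part of
the guidance, assuming Python with numpy and scipy. It implements the Step 2
formula exactly as given above (without the small-sample correction factor
that some texts, and the scipy.stats.bartlett routine, apply); run on the
Table 3-5 manganese data it gives a chi-squared value of about 44, in
agreement with the worked example.

    import numpy as np
    from scipy import stats

    # Manganese concentrations by well, from Table 3-5.
    wells = [
        [50, 73, 244, 202],     # Well 1
        [46, 77],               # Well 2
        [272, 171, 32, 53],     # Well 3
        [34, 3940],             # Well 4
        [48, 52],               # Well 5
        [68, 991, 54],          # Well 6
    ]

    f_i = np.array([len(w) - 1 for w in wells])           # degrees of freedom per well
    s2_i = np.array([np.var(w, ddof=1) for w in wells])   # sample variances S_i^2

    f = f_i.sum()                                         # total d.f. = N - k = 11
    s2_pooled = np.sum(f_i * s2_i) / f                    # pooled variance S_p^2
    chi2 = f * np.log(s2_pooled) - np.sum(f_i * np.log(s2_i))

    crit = stats.chi2.ppf(0.95, df=len(wells) - 1)        # 11.07 with 5 d.f.
    print(f"Bartlett chi-squared = {chi2:.1f}, 5% critical value = {crit:.2f}")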
SECTION 4
BACKGROUND WELL/COMPLIANCE WELL COMPARISONS
There are many situations in ground-water monitoring that call for the
comparison of data from different wells. The assumption is that a set of
wells can be defined that are not contaminated. Generally these are back-
ground wells that have been sited hydraulically upgradient from the
regulated unit. A second set of wells is sited hydraulically downgradient
from the regulated unit; these are known as compliance wells. The data
from the compliance wells are compared to the data from the background wells
to determine whether there is any evidence of contamination in the compliance
wells that would presumably result from a release from the regulated unit.
If the owner or operator of a hazardous waste facility does not have
reason to suspect that the test assumptions of equal variance or normality
will be violated, then he or she may simply choose the parametric analysis of
variance as a default method of statistical analysis. In the event that this
method indicates a statistically significant difference between the groups
being tested, then the test assumptions should be evaluated.
This situation, where the relevant comparison is between data from back-
ground wells and data from compliance wells, is the topic of this section.
Comparisons between background well data and compliance well data may be
called for in all phases of monitoring. This is the general case for detec-
tion monitoring. It is also the case for compliance monitoring if the com-
pliance limits are determined by the background well constituent concentration
levels. It may also be the case in corrective action. Compounds that are
present in background wells (e.g., naturally occurring metals) are most appro-
priately evaluated using this comparison method.
The procedures described in this section are applicable whenever the
relevant comparison is between background well data and compliance well
data. Section 4.1 provides a flow chart and overview for the selection of
methods for comparison of background well and compliance well data. Sec-
tion 4.2 contains analysis of variance methods. These provide methods for
directly comparing background well data to compliance well data. Section 4.3
describes a tolerance interval approach, where the background well data are
used to define the tolerance limits for comparison with the compliance well
data. Section 4.4 contains an approach based on prediction intervals, again
using the background well data to determine the prediction interval for com-
parison with the compliance well data. Methods for comparing data to a fixed
compliance limit (an MCL or ACL) will be described in Section 5.
4.1 SUMMARY FLOW CHART FOR BACKGROUND WELL/COMPLIANCE WELL COMPARISONS
Figure 4-1 is a flow chart to aid in selecting the appropriate statisti-
cal procedure for background well to compliance well comparisons. The first
step is to determine whether most of the observations are quantified (that is,
above the detection limits) or not. Generally, if more than 50% of the obser-
vations are below the detection limit (as might be the case with detection or
compliance monitoring for volatile organics), then the appropriate comparison
is a test of proportions. The test of proportions compares the proportion of
detected values in the background wells to those in the compliance wells. See
Section 7.1 for a discussion of dealing with data below the detection limit.
If the proportion of detected values is 50% or more, then an analysis of
variance procedure is the first choice. Tolerance limits or prediction inter-
vals are acceptable alternative choices that the user may select. The analy-
sis of variance procedures give a more thorough picture of the situation at
the facility; however, the tolerance limit or prediction interval approach is
acceptable and requires less computation in many situations.
Figure 4-2 is a flow chart to guide the user if a tolerance limits
approach is selected. The first step in using Figure 4-2 is to determine
whether the facility is in detection monitoring. If so, much of the data may
be below the detection limit. See Section 7.1 for a discussion of this case,
which may call for consulting a statistician. If most of the data are quanti-
fied, then follow the flow chart to determine if normal tolerance limits can
be used. If the data are not normal (as determined by one of the procedures
in Section 3.2), then the logarithm transformation may be done and the trans-
formed data checked for normality. If the log data are normal, the lognormal
tolerance limit should be used. If neither the original data nor the log-
transformed data are normal, seek consultation with a professional
statistician.
If a prediction interval is selected as the method of choice, see Sec-
tion 4.4 for guidance in performing the procedure.
If analysis of variance is to be used, then continue with Figure 4-1 to
select the specific method that is appropriate. A one-way analysis of vari-
ance is recommended. If the data show evidence of seasonality (observed, for
example, in a plot of the data over time), a trend analysis or perhaps a two-
way analysis of variance may be the appropriate choice. These instances may
require consultation with a professional statistician.
If the one-way analysis of variance is appropriate, the computations are
performed, then the residuals are checked to see if they meet the assumptions
of normality and equal variance. If so, the analysis concludes. If not, a
logarithm transformation may be tried and the residuals from the analysis of
variance on the log data are checked for assumptions. If these still do not
adequately satisfy the assumptions, then a one-way nonparametric analysis of
variance may be done, or professional consultation may be sought.
Figure 4-1. Background well/compliance well comparisons. [Flow chart: if the
proportion of nondetects is 50% or more, use the test of proportions;
otherwise perform a one-way ANOVA and save the residuals. If the residuals
(for the original or log-transformed data) are normally distributed, use the
parametric one-way ANOVA; otherwise use the nonparametric one-way ANOVA.
Tolerance limits, prediction intervals, and control charts are acceptable
alternate approaches.]
Figure 4-2. Tolerance limits: alternate approach to background
well/compliance well comparisons. [Flow chart: if the data are normal, use
normal tolerance limits; if not, take logs of the data; if the log data are
normal, use lognormal tolerance limits; if neither, consult with a
professional statistician.]
4.2 ANALYSIS OF VARIANCE
If contamination of the ground water occurs from the waste disposal
facility and if the monitoring wells are hydraulically upgradient and
hydraulically downgradient from the site, then contamination is unlikely to
change the levels of a constituent in all wells by the same amount. Thus,
contamination from a disposal site can be seen as differences in average con-
centration among wells, and such differences can be detected by analysis of
variance.
Analysis of variance (ANOVA) is the name given to a wide variety of sta-
tistical procedures. All of these procedures compare the means of different
groups of observations to determine whether there are any significant differ-
ences among the groups, and if so, contrast procedures may be used to
determine where the differences lie. Such procedures are also known in the
statistical literature as general linear model procedures.
Because of its flexibility and power, analysis of variance is the pre-
ferred method of statistical analysis when the ground-water monitoring is
based on a comparison of background and compliance well data. Two types of
analysis of variance are presented: parametric and nonparametric one-way
analyses of variance. Both methods are appropriate when the only factor of
concern is the different monitoring wells at a given sampling period.
The hypothesis tests with parametric analysis of variance usually assume
that the errors (residuals) are normally distributed with constant variance.
These assumptions can be checked by saving the residuals (the differences
between the observations and the values predicted by the analysis of variance
model) and using the tests of assumptions presented in Section 3. Since the
data will generally be concentrations, and since concentration data are often
found to follow the lognormal distribution, the log transformation is sug-
gested if substantial violations of the assumptions are found in the analysis
of the original concentration data. If the residuals from the transformed
data do not meet the parametric ANOVA requirements, then nonparametric
approaches to analysis of variance are available using the ranks of the obser-
vations. A one-way analysis of variance using the ranks is presented.
When several sampling periods have been used and it is important to con-
sider the sampling periods as a second factor, then two-way analysis of vari-
ance, parametric or nonparametric, is appropriate. This would be one way to
test for and adjust the data for seasonality. Also, trend analysis (e.g.,
time series) may be used to identify seasonality in the data set. If neces-
sary, data that exhibit seasonal trends can be adjusted. Usually, however,
seasonal variation will affect all wells at a facility by nearly the same
amount, and in most circumstances, corrections will not be necessary. Fur-
ther, the effects of seasonality will be substantially reduced by simultane-
ously comparing aggregate compliance well data to background well data.
Situations that require an analysis procedure other than a one-way ANOVA
should be referred to a professional statistician.
4.2.1 One-Way Parametric Analysis of Variance
In the context of ground-water monitoring, two situations exist for which
a one-way analysis of variance is most applicable:
*	Data for a water quality parameter are available from several wells
but for only one time period (e.g., monitoring has just begun).
*	Data for a water quality parameter are available from several wells
for several time periods. However, the data do not exhibit sea-
sonality.
In order to apply a parametric one-way analysis of variance, a minimum
number of observations is needed to give meaningful results. At least p ≥ 2
groups are to be compared (i.e., two or more wells). It is recommended that
each group (here, wells) have at least three observations and that the total
sample size, N, be large enough so that N-p > 5. A variety of combinations of
groups and numbers of observations per group will fulfill this minimum. One
sampling interval with four independent samples per well and at least three
wells would fulfill the minimum sample size requirements. The wells should be
spaced so as to maximize the probability of intercepting a plume of contamina-
tion. The samples should be taken far enough apart in time to guard against
autocorrelation.
PURPOSE
One-way analysis of variance is a statistical procedure to determine
whether differences in mean concentrations among wells, or groups of wells,
are statistically significant. For example, is there significant contamina-
tion of one or more compliance wells as compared to background wells?
PROCEDURE
Suppose the regulated unit has p wells and that n_i data points (concen-
trations of a constituent) are available for the ith well. These data can be
from either a single sampling period or from more than one. In the latter
case, the user could check for seasonality before proceeding by plotting the
data over time. Usually the computation will be done on a computer using a
commercial program. However, the procedure is presented so that computations
can be done using a desk calculator, if necessary.
Step 1. Arrange the N = Σ(i=1 to p) n_i data points in a data table as
follows:
Well No.      Observations                          Well total    Well mean

    1         X_11, X_12, ..., X_1n_1                  X_1.         Xbar_1.
    2         X_21, X_22, ..., X_2n_2                  X_2.         Xbar_2.
    .             .                                      .             .
    .             .                                      .             .
    p         X_p1, X_p2, ..., X_pn_p                  X_p.         Xbar_p.

                                        Grand total:   X_..
                                        Grand mean:    Xbar_..
Step 2. Compute well totals and well means as follows:

    X_i.    = Σ(j=1 to n_i) X_ij , the total of all n_i observations at well i
    Xbar_i. = X_i./n_i , the average of all n_i observations at well i
    X_..    = Σ(i=1 to p) Σ(j=1 to n_i) X_ij , the grand total of all N observations
    Xbar_.. = X_../N , the grand mean of all observations

These totals and means are shown in the last two columns of the table above.
Step 3. Compute the sum of squares of differences between well means
and the grand mean:

    SS_Wells = Σ(i=1 to p) n_i (Xbar_i. - Xbar_..)^2 = Σ(i=1 to p) X_i.^2/n_i - X_..^2/N

(The formula on the far right is usually most convenient for calculation.)
This sum of squares has (p-1) degrees of freedom associated with it and is a
measure of the variability between wells.
Step 4. Compute the corrected total sum of squares

    SS_Total = Σ(i=1 to p) Σ(j=1 to n_i) (X_ij - Xbar_..)^2 = Σ Σ X_ij^2 - X_..^2/N

(The formula on the far right is usually most convenient for calculation.)
This sum of squares has (N-1) degrees of freedom associated with it and is a
measure of the variability in the whole data set.
Step 5. Compute the sum of squares of differences of observations
within wells from the well means. This is the sum of squares due to error and
is obtained by subtraction:
    SS_Error = SS_Total - SS_Wells
It has associated with it (N-p) degrees of freedom and is a measure of the
variability within wells.
Step 6. Set up the ANOVA table as shown below in Table 4-1. The sums
of squares and their degrees of freedom were obtained from Steps 3 through 5.
The mean square quantities are simply obtained by dividing each sum of squares
by its corresponding degrees of freedom.
TABLE 4-1. ONE-WAY PARAMETRIC ANOVA TABLE

Source of variation    Sums of squares   Degrees of freedom   Mean squares                  F
Between wells             SS_Wells            p-1             MS_Wells = SS_Wells/(p-1)     F = MS_Wells/MS_Error
Error (within wells)      SS_Error            N-p             MS_Error = SS_Error/(N-p)
Total                     SS_Total            N-1


Step 7. To test the hypothesis of equal means for all p wells, compute
F = MS_Wells/MS_Error (last column in Table 4-1). Compare this statistic to the
tabulated F statistic with (p-1) and (N-p) degrees of freedom (Table 2, Appen-
dix B) at the 5% significance level. If the calculated F value exceeds the
tabulated value, reject the hypothesis of equal well means. Otherwise, con-
clude that there is no significant difference between the concentrations at
the p wells and thus no evidence of well contamination.
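The tabulated F value can also be obtained computationally rather than read
from Table 2 of Appendix B. A minimal sketch, not part of the guidance,
assuming Python with scipy, is shown for the degrees of freedom used in the
example that follows (p = 6 wells, N = 24 observations):

    from scipy import stats

    # Upper 5% point of the F distribution with (p-1, N-p) = (5, 18) degrees of freedom.
    print(round(stats.f.ppf(0.95, dfn=5, dfd=18), 2))    # prints 2.77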
In the case of a significant F (calculated F greater than tabulated F in
Step 7), the user will conduct the next few steps to determine which compli-
ance well(s) is (are) contaminated. This will be done by comparing each com-
pliance well with the background well(s). Concentration differences between a
pair of background wells and compliance wells or between a compliance well and
a set of background wells are called contrasts in the ANOVA and multiple com-
parisons framework.
Step 8. Determine whether the significant F is due to differences between
background and compliance wells (computation of Bonferroni t statistics).
Assume that of the p wells, u are background wells and m are compliance
wells (thus u + m = p). Then m differences (each compliance well compared
with the average of the background wells) need to be computed and tested for
statistical significance. If there are more than five downgradient wells, the
individual comparisons are done at the comparisonwise significance level of
1%, which may make the experimentwise significance level greater than 5%.
Obtain the total sample size of all u background wells:

    n_up = Σ(i=1 to u) n_i

Compute the average concentration from the u background wells:

    Xbar_up = (1/n_up) Σ(i=1 to u) Σ(j=1 to n_i) X_ij

Compute the m differences between the average concentration at each
compliance well and the average background concentration:

    Xbar_i. - Xbar_up , for i = 1, ..., m

Compute the standard error of each difference as

    SE_i = [MS_Error (1/n_up + 1/n_i)]^(1/2)

where MS_Error is determined from the ANOVA table (Table 4-1) and n_i
is the number of observations at well i.
Obtain the critical t value from Bonferroni's t-table (Table 3, Appendix B)
with (N-p) degrees of freedom, m comparisons, and alpha = 0.05.
Compute the m quantities D_i = SE_i x t for each compliance well i.
If m > 5, use the entry for t_(N-p),(1-0.01); that is, use the entry for m = 5.
INTERPRETATION
If the difference Xbar_i. - Xbar_up exceeds the value D_i, conclude that the ith
compliance well has significantly higher concentrations than the average back-
ground wells. Otherwise conclude that the well is not contaminated. This
comparison needs to be performed for each of the m compliance wells individu-
ally. The test is designed so that the overall experimentwise error is 5% if
there are no more than five compliance wells.
CAUTIONARY NOTE
Should the regulated unit consist of more than five compliance wells,
then the Bonferroni t-test should be modified by doing the individual compari-
sons at the 1% level so that the Part 264 Subpart F regulatory requirement
pursuant to §264.97(i)(2) will be met. Alternatively, a different analysis of
contrasts, such as Scheffe's, may be used. The more advanced user is referred
to the second reference below for a discussion of multiple comparisons.
REFERENCES
Johnson, Norman L., and F. C. Leone. 1977. Statistics and Experimental
Design in Engineering and the Physical Sciences. Vol. II, Second Edition,
John Wiley and Sons, New York.
Miller, Rupert G., Jr. 1981. Simultaneous Statistical Inference. Second
Edition, Springer-Verlag, New York.
EXAMPLE
Four lead concentration values at each of six wells are given in
Table 4-2 below. The wells consist of u = 2 background and m = 4 compliance
wells. (The values in Table 4-2 are actually the natural logarithms of the
original lead concentrations.)
Step 1. Arrange the 4 x 6 = 24 observations in a data table as follows:
TABLE 4-2. EXAMPLE DATA FOR ONE-WAY PARAMETRIC ANALYSIS OF VARIANCE

                        Log of Pb concentrations (ug/L)
                                                          Well total   Well mean   Well
Well No.   Date:   Jan 1    Feb 1    Mar 1    Apr 1         (X_i.)     (Xbar_i.)   std. dev.
Background wells
   1                4.06     3.99     3.40     3.83          15.28        3.82     0.295
   2                3.83     4.34     3.47     4.22          15.86        3.96     0.398
Compliance wells
   3                5.61     5.14     3.47     3.97          18.18        4.55     0.996 (max)
   4                3.53     4.54     4.26     4.42          16.75        4.19     0.456
   5                3.91     4.29     5.50     5.31          19.01        4.75     0.771
   6                5.42     5.21     5.29     5.08          21.01        5.25     0.142 (min)

                                             X_.. = 106.08      Xbar_.. = 4.42
Step 2. The calculations are shown on the right-hand side of the data
table above. Sample standard deviations have been computed also.
Step 3. Compute the between-well sum of squares:

    SS_Wells = (1/4)(15.28^2 + ... + 21.01^2) - (1/24)(106.08^2) = 5.76

with [6 (wells) - 1] = 5 degrees of freedom.
Step 4. Compute the corrected total sum of squares:

    SS_Total = 4.06^2 + 3.99^2 + ... + 5.08^2 - (1/24)(106.08^2) = 11.94

with [24 (observations) - 1] = 23 degrees of freedom.
Step 5. Obtain the within-well or error sum of squares by subtraction:

    SS_Error = 11.94 - 5.76 = 6.18

with [24 (observations) - 6 (wells)] = 18 degrees of freedom.
Step 6. Set up the one-way ANOVA as in Table 4-3 below:
TABLE 4-3. EXAMPLE COMPUTATIONS IN ONE-WAY PARAMETRIC ANOVA TABLE

Source of variation    Sums of squares   Degrees of freedom   Mean squares        F
Between wells               5.76                5             5.76/5  = 1.15      1.15/0.34 = 3.38
Error (within wells)        6.18               18             6.18/18 = 0.34
Total                      11.94               23
Step 7. The calculated F statistic is 3.38. The tabulated F value with
5 and 18 degrees of freedom at the alpha = 0.05 level is 2.77 (Table 2, Appen-
dix B). Since the calculated value exceeds the tabulated value, the hypothe-
sis of equal well means must be rejected, and post hoc comparisons are
necessary.
Step 8. Computation of Bonferroni t statistics.
Note that there are four compliance wells, so m = 4 comparisons will
be made.

    n_up = 8          total number of samples in background wells
    Xbar_up = 3.89    average concentration of background wells

Compute the differences between the four compliance wells and the
average of the two background wells:

    Xbar_3. - Xbar_up = 4.55 - 3.89 = 0.66
    Xbar_4. - Xbar_up = 4.19 - 3.89 = 0.30
    Xbar_5. - Xbar_up = 4.75 - 3.89 = 0.86
    Xbar_6. - Xbar_up = 5.25 - 3.89 = 1.36

Compute the standard error of each difference. Since the number of
observations is the same for all compliance wells, the standard
errors for the four differences will be equal:

    SE_i = [0.34 (1/8 + 1/4)]^(1/2) = 0.357

From Table 3, Appendix B, obtain the critical t with (24 - 6) = 18
degrees of freedom, m = 4, and alpha = 0.05. The approximate value
is 2.43, obtained by linear interpolation between 15 and 20 degrees
of freedom.
Compute the quantities D_i. Again, due to equal sample sizes, they
will all be equal:

    D_i = SE_i x t = 0.357 x 2.43 = 0.868
INTERPRETATION
The F test was significant at the 5% level. The Bonferroni multiple
comparisons procedure was then used to determine for which wells there was
statistically significant evidence of contamination. Of the four differences
Xbar_i. - Xbar_up, only Xbar_6. - Xbar_up = 1.36 exceeds the critical value of
0.868. From this it is concluded that there is significant evidence of
contamination at Well 6. Well 5 is right on the boundary of significance. It
is likely that Well 6 has intercepted a plume of contamination, with Well 5
being on the edge of the plume.
All the compliance well concentrations were somewhat above the mean con-
centration of the background levels. The well means should be used to indi-
cate the location of the plume. The findings should be reported to the
Regional Administrator.
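The entire worked example can be reproduced in a few lines. The following is
a minimal sketch, not part of the guidance, assuming Python with numpy and
scipy; applied to the log lead data of Table 4-2 it gives F of about 3.35
(versus the reported 3.38, the small difference coming from rounding in the
hand computation) and a Bonferroni critical difference of about 0.88, flagging
Well 6 and leaving Well 5 just at the boundary. The Bonferroni critical t is
approximated here by the upper (alpha/m) point of Student's t rather than read
from Table 3 of Appendix B.

    import numpy as np
    from scipy import stats

    # Natural logs of Pb concentrations (ug/L) from Table 4-2; wells 1-2 are background.
    data = {
        1: [4.06, 3.99, 3.40, 3.83],
        2: [3.83, 4.34, 3.47, 4.22],
        3: [5.61, 5.14, 3.47, 3.97],
        4: [3.53, 4.54, 4.26, 4.42],
        5: [3.91, 4.29, 5.50, 5.31],
        6: [5.42, 5.21, 5.29, 5.08],
    }
    background, compliance = [1, 2], [3, 4, 5, 6]

    groups = [np.array(v) for v in data.values()]
    all_obs = np.concatenate(groups)
    N, p = len(all_obs), len(groups)

    # Steps 3-6: sums of squares and the ANOVA table quantities.
    grand_mean = all_obs.mean()
    ss_wells = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
    ss_total = np.sum((all_obs - grand_mean) ** 2)
    ss_error = ss_total - ss_wells
    ms_wells, ms_error = ss_wells / (p - 1), ss_error / (N - p)

    # Step 7: F test.
    F = ms_wells / ms_error
    print(f"F = {F:.2f}, 5% critical value = {stats.f.ppf(0.95, p - 1, N - p):.2f}")

    # Step 8: Bonferroni comparisons of each compliance well with the background average.
    bg = np.concatenate([data[i] for i in background])
    m = len(compliance)
    t_crit = stats.t.ppf(1 - 0.05 / m, df=N - p)          # roughly 2.43-2.45
    for i in compliance:
        diff = np.mean(data[i]) - bg.mean()
        se = np.sqrt(ms_error * (1 / len(bg) + 1 / len(data[i])))
        print(f"Well {i}: difference = {diff:.2f}, D = {se * t_crit:.2f}")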
4.2.2 One-Way Nonparametric Analysis of Variance
This procedure is appropriate for interwell comparisons when the data or
the residuals from a parametric ANOVA have been found to be significantly dif-
ferent from normal and when a log transformation fails to adequately normalize
the data. In one-way nonparametric ANOVA, the assumption under the null
hypothesis is that the data from each well come from the same continuous dis-
tribution and hence have the same median concentrations of a specific hazard-
ous constituent. The alternatives of interest are that the data from some
wells show increased levels of the hazardous constituent in question.
The procedure is called the Kruskal-Wallis test. For meaningful results,
there should be at least three groups with a minimum sample size of three in
each group. For large data sets, use of a computer program is recommended. In
the case of large data sets, a good approximation to the procedure is to re-
place each observation by its rank (its numerical place when the data are
ordered from least to greatest) and perform the (parametric) one-way analysis
of variance (Section 4.2.1) on the ranks. Such an approach can be done with
some commercial statistical packages such as SAS.
PURPOSE
The purpose of the procedure is to test the hypothesis that all wells (or
groups of wells) around regulated units have the same median concentration of
a hazardous constituent. If the wells are found to differ, post-hoc compari-
sons are again necessary to determine whether contamination is present.
Note that the wells define the groups. All wells will have at least four
observations. Denote the number of groups by K and the number of observations
in each group by n_i, with n being the total number of all observations. Let
X_ij denote the jth observation in the ith group, where j runs from 1 to the
number of observations in the group, n_i, and i runs from 1 to the number of
groups, K.
PROCEDURE
Step 1. Rank all observations from least to greatest. Let R_ij denote
the rank of the jth observation in the ith group. As a convention, denote the
background well(s) as group 1.
Step 2. Add the ranks of the observations in each group. Call the sum
of the ranks for the ith group R_i. Also calculate the average rank for each
group, Rbar_i = R_i/n_i.
Step 3. Compute the Kruskal-Wallis statistic:

    H = [12/(n(n+1))] Σ(i=1 to K) R_i^2/n_i - 3(n+1)
Step 4. Compare the calculated value H to the tabulated chi-squared
value with (K-1) degrees of freedom, where K is the number of groups (Table 1,
Appendix B). Reject the null hypothesis if the computed value exceeds the
tabulated critical value.
Step 5. If the computed value exceeds the value from the chi-squared
table, compute the critical difference for each well comparison to the background,
assumed to be group 1:

    C_i = Z_(alpha/(K-1)) [n(n+1)/12 (1/n_i + 1/n_1)]^(1/2)

for i taking values 2, ..., K,
where Z_(alpha/(K-1)) is the upper (alpha/(K-1)) percentile from the standard normal
distribution found in Table 4, Appendix B. Note: If there are more than five
compliance wells at the regulated unit (K > 6), use Z_0.01, the upper one-
percentile from the standard normal distribution.
Step 6. Form the differences of the average rank for each group to the
background and compare these with the critical values found in Step 5 to de-
termine which wells give evidence of contamination. That is, compare Rbar_i - Rbar_1
to C_i for i taking the values 2 through K. (Recall that group 1 is the
background.)
While the above steps are the general procedure, some details need to be
specified further to handle special cases. First, it may happen that two or
more observations are numerically equal, or tied. When this occurs, determine
the ranks that the tied observations would have received if they had been
slightly different from each other, but still in the same places with respect
to the rest of the observations. Add these ranks and divide by the number of
observations tied at that value to get an average rank. This average rank is
used for each of the tied observations. This same procedure is repeated for
any other groups of tied observations.
The effect of tied observations is to increase the value of the sta-
tistic, H. Unless there are many observations tied at the same value, the
effect of ties is negligible (in practice, the effect of ties can probably be
neglected unless some group contains 10 percent of the observations all tied,
which is most likely to occur for concentrations below the detection limit). If
the statistic is significant without any correction for ties, it will also be
significant if corrected for ties.
Second, if there are any values below detection, consider all values
below detection as tied at zero. (It is irrelevant what number is assigned to
nondetected values as long as all such values are assigned the same number,
and it is smaller than any detected or quantified value.)
ADJUSTMENT FOR TIES
If there are 50% or more observations that fell below the detection
limit, then this method is inappropriate. The user is referred to Section 7,
"Miscellaneous Topics." Otherwise, if there are tied values present in the
data, use the following correction for the H statistic:

    H' = H / [1 - (Σ(i=1 to g) T_i)/(n^3 - n)]

where g = the number of groups of distinct tied observations and T_i = (t_i^3 - t_i),
where t_i is the number of observations in tied group i. Note that unique
observations can be considered groups of size 1, with the corresponding T_i =
(1^3 - 1) = 0.
REFERENCE
Hollander, Myles, and D. A. Wolfe. 1973. Nonparametric Statistical
Methods. John Wiley and Sons, New York.
EXAMPLE
The data in Table 4-4 represent benzene concentrations in water samples
taken at one background and five compliance wells.
Step 1. The 20 observations have been ranked from least to greatest.
The limit of detection was 1.0 ppm. Note that two values in Well 4 were below
detection and were assigned the value zero. These two are tied for the smallest
value and have consequently been assigned the average of the two ranks 1 and
2, or 1.5. The ranks of the observations are indicated in parentheses after
the observations in Table 4-4. Note that there are 3 observations tied at 1.3
that would have had ranks 4, 5, and 6 if they had been slightly different.
These three have been assigned the average rank of 5, resulting from averaging
4, 5, and 6. Other ties occurred at 1.5 (ranks 7 and 8) and 1.9 (ranks 11 and
12).
Step 2. The values of the sums of ranks and average ranks are indicated
at the bottom of Table 4-4.
Step 3. Compute the Kruskal-WalUs statistic
H - 2q(204) <342/4 + + 35.5V3) - 3(20+1) - 14.68
TABLE 4-4. EXAMPLE DATA FOR ONE-WAY NONPARAMETRIC ANOVA: BENZENE
CONCENTRATIONS (in ppm)

               Background                           Compliance wells
Date            Well 1        Well 2       Well 3        Well 4        Well 5        Well 6
Jan 1          1.7 (10)     11.0 (20)     1.3 (5)       0 (1.5)       4.9 (17)      1.6 (9)
Feb 1          1.9 (11.5)    8.0 (18)     1.2 (3)       1.3 (5)       3.7 (16)      2.5 (15)
Mar 1          1.5 (7.5)     9.5 (19)     1.5 (7.5)     0 (1.5)       2.3 (14)      1.9 (11.5)
Apr 1          1.3 (5)                                  2.2 (13)

               n_1 = 4       n_2 = 3      n_3 = 3       n_4 = 4       n_5 = 3       n_6 = 3
Sum of ranks:  R_1 = 34      R_2 = 57     R_3 = 15.5    R_4 = 21      R_5 = 47      R_6 = 35.5
Average rank:  Rbar_1 = 8.5  Rbar_2 = 19  Rbar_3 = 5.17 Rbar_4 = 5.25 Rbar_5 = 15.67 Rbar_6 = 11.83

K = 6, the number of wells; n = Σ n_i = 20, the total number of observations.
ADJUSTMENT FOR TIES
There are four groups of ties in the data of Table 4-4:

    T_1 = (2^3 - 2) = 6      for the 2 observations of 1.9
    T_2 = (2^3 - 2) = 6      for the 2 observations of 1.5
    T_3 = (3^3 - 3) = 24     for the 3 observations of 1.3
    T_4 = (2^3 - 2) = 6      for the 2 observations of 0

Thus Σ T_i = 6 + 6 + 24 + 6 = 42

and H' = 14.68/[1 - 42/(20^3 - 20)] = 14.76, a negligible change from 14.68.
Step 4. To test the null hypothesis of no contamination, obtain the
critical chi-squared value with (6-1) = 5 degrees of freedom at the 5% signif-
icance level from Table 1, Appendix B. The value is 11.07. Compare the cal-
culated value, H', with the tabulated value. Since 14.76 is greater than
11.07, reject the hypothesis of no contamination at the 5% level. If the site
was in detection monitoring, it should move into compliance monitoring. If the
site was in compliance monitoring, it should move into corrective action. If
the site was in corrective action, it should stay there.
In the case where the hydraulically upgradient wells serve as the back-
ground against which the compliance wells are to be compared, comparisons of
each compliance well with the background wells should be performed in addition
to the analysis of variance procedure. In this example, data from each of the
compliance wells would be compared with the background well data. This com-
parison is accomplished as follows. The average ranks for each group, Rbar_i,
are used to compute differences. If a group of compliance wells for a regulated
unit has larger concentrations than those found in the background wells, the
average rank for the compliance wells at that unit will be larger than the
average rank for the background wells.
Step 5. Calculate the critical values to compare each compliance well
to the background well.
In this example, K = 6, so there are 5 comparisons of the compliance wells
with the background well. Using an experimentwise significance level of
alpha = 0.05, we find the upper 0.05/5 = 0.01 percentile of the standard normal
distribution to be 2.33 (Table 4, Appendix B). The total sample size, n, is
20. The approximate critical value, C_2, is computed for compliance Well 2,
which has the largest average rank, as:

    C_2 = 2.33 [20(21)/12 (1/3 + 1/4)]^(1/2) = 10.5

The critical values for the other wells are: 10.5 for Wells 3, 5, and 6; and
9.8 for Well 4.
Step 6. Compute the differences between the average rank of each com-
pliance well and the average rank of the background well, and compare them
with the critical values:

    Differences                          Critical values
    D_2 = 19.0  - 8.5 =  10.5            C_2 = 10.5
    D_3 = 5.17  - 8.5 = -3.33            C_3 = 10.5
    D_4 = 5.25  - 8.5 = -3.25            C_4 =  9.8
    D_5 = 15.67 - 8.5 =  7.17            C_5 = 10.5
    D_6 = 11.83 - 8.5 =  3.33            C_6 = 10.5
Compare each difference with the corresponding critical difference. D2 * 10.5
equals the critical value of C2 » 10.5. We conclude that the concentration of
benzene averaged over compliance Well 2 1s significantly greater than that at
the background well. None of the other compliance well concentration of
benzene is significantly higher than the average background value. Based upon
these results, only compliance Well 2 can be singled out as being
contaminated.
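The comparisons in Steps 5 and 6 can be sketched in the same way (illustration only; the average ranks and sample sizes are those of Table 4-4, and the flag uses the one-decimal rounding convention of the worked example, which treats a difference equal to the critical value as significant).

    # Sketch of Steps 5 and 6: approximate critical values and rank differences
    # for comparing each compliance well with the background well.  z = 2.33 is
    # the upper 0.01 point of the standard normal distribution cited above.
    import math

    n_total = 20
    z = 2.33
    background = {"n": 4, "avg_rank": 8.5}
    compliance = {
        "Well 2": {"n": 3, "avg_rank": 19.0},
        "Well 3": {"n": 3, "avg_rank": 5.17},
        "Well 4": {"n": 4, "avg_rank": 5.25},
        "Well 5": {"n": 3, "avg_rank": 15.67},
        "Well 6": {"n": 3, "avg_rank": 11.83},
    }

    for well, info in compliance.items():
        # Critical value C_i = z * sqrt( n(n+1)/12 * (1/n_i + 1/n_background) )
        crit = z * math.sqrt(n_total * (n_total + 1) / 12.0
                             * (1.0 / info["n"] + 1.0 / background["n"]))
        diff = info["avg_rank"] - background["avg_rank"]
        # The guidance rounds to one decimal place before comparing.
        flag = "significant" if round(diff, 1) >= round(crit, 1) else "not significant"
        print(f"{well}: difference = {diff:.2f}, critical value = {crit:.1f}, {flag}")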
For data sets with more than 30 observations, the parametric analysis of
variance performed on the rank values is a good approximation to the Kruskal-
Wallis test (Quade, 1966). If the user has access to SAS, the PROC RANK pro-
cedure is used to obtain the ranks of the data. The analysis of variance pro-
cedure detailed in Section 4.2.1 is then performed on the ranks. Contrasts
are tested as in the parametric analysis of variance.
INTERPRETATION
The Kruskal-Wallis test statistic is compared to the tabulated critical
value from the chi-squared distribution. If the test statistic does not
exceed the tabulated value, there is no statistically significant evidence of
contamination and the analysis would stop and report this finding. If the
test statistic exceeds the tabulated value, there is significant evidence that
the hypothesis of no differences in compliance concentrations from the back-
ground level is not true. Consequently, if the test statistic exceeds the
critical value, one concludes that there is significant evidence of contami-
nation. One then proceeds to investigate where the differences lie, that is,
which wells are indicating contamination.
The multiple comparisons procedure described in Steps 5 and 6 compares
each compliance well to the background well. This determines which compliance
wells show statistically significant evidence of contamination at an experi-
mentwise error rate of 5 percent. In many cases, inspection of the mean or
median concentrations will be sufficient to indicate where the problem lies.
4.3 TOLERANCE INTERVALS BASED ON THE NORMAL DISTRIBUTION
An alternative approach to analysis of variance to determine whether
there is statistically significant evidence of contamination is to use toler-
ance intervals. A tolerance interval is constructed from the data on
(uncontaminated) background wells. The concentrations from compliance wells
are then compared with the tolerance interval. With the exception of pH, if
the compliance concentrations do not fall in the tolerance interval, this pro-
vides statistically significant evidence of contamination.
Tolerance intervals are most appropriate for use at facilities that do
not exhibit high degrees of spatial variation between background wells and
compliance wells. Facilities that overlie extensive, homogeneous geologic
deposits (for example, thick, homogeneous lacustrine clays) that do not natu-
rally display hydrogeochemical variations may be suitable for this statistical
method of analysis.
A tolerance interval establishes a concentration range that is con-
structed to contain a specified proportion, P%, of the population
with a specified confidence coefficient, Y. The proportion of the population
included, P, is referred to as the coverage. The probability with which the
tolerance interval includes the proportion P% of the population is referred to
as the tolerance coefficient.
A coverage of 95% is recommended. If this is used, random observations
from the same distribution as the background well data would exceed the upper
tolerance limit less than 5% of the time. Similarly, a tolerance coefficient
of 95% is recommended. This means that one has a confidence level of 95% that
the upper 95% tolerance limit will contain at least 95% of the distribution of
observations from background well data. These values were chosen to be con-
sistent with the performance standards described in Section 2. The use of
these values corresponds to the selection of an α of 5% in the multiple well
testing situation.
The procedure can be applied with as few as three observations from the
background distribution. However, doing so would result in a large upper
tolerance limit. A sample size of eight or more results in an adequate toler-
ance interval. The minimum sampling schedule called for in the regulations
would result in at least four observations from each background well. Only if
a single background well is sampled at a single point in time is the sample
size so small as to make use of the procedure questionable.
Tolerance intervals can be constructed assuming that the data or the
transformed data are normal. Tolerance intervals can also be constructed
assuming other distributions. It is also possible to construct nonparametric
tolerance intervals using only the assumption that the data came from some
continuous population. However, the nonparametric tolerance intervals require
such a large number of observations to provide a reasonable coverage and
tolerance coefficient that they are impractical in this application.
The range of the concentration data in the background well samples should
be considered in determining whether the tolerance interval approach should be
used, and if so, what distribution is appropriate. The background well con-
centration data should be inspected for outliers and tests of normality
applied before selecting the tolerance interval approach. Tests of normality
were presented in Section 3.2. Note that in this case, the test of normality
would be applied to the background well data that are used to construct the
tolerance interval. These data should all be from the same normal
distribution.
In this application, unless pH is being monitored, a one-sided tolerance
interval or an upper tolerance limit is desired, since contamination is indi-
cated by large concentrations of the hazardous constituents monitored. Thus,
for concentrations, the appropriate tolerance interval is (0, TL), with the
comparison of importance being the larger limit, TL.
PURPOSE
The purpose of the tolerance interval approach is to define a concentra-
tion range from background well data, within which a large proportion of the
monitoring observations should fall with high probability. Once this is done,
data from compliance wells can be checked for evidence of contamination by
simply determining whether they fall in the tolerance interval. If they do
not, this is evidence of contamination.
In this case the data are assumed to be approximately normally distrib-
uted. Section 3.2 provided methods to check for normality. If the data are
not normal, take the natural logarithm of the data and see if the transformed
data are approximately normal. If so, this method can be used on the loga-
rithms of the data. Otherwise, seek the assistance of a professional
statistician.
PROCEDURE
Step 1. Calculate the mean, X, and the standard deviation, S, from the
background well data.
Step 2. Construct the one-sided upper tolerance limit as
    TL = X + KS,
where K is the one-sided normal tolerance factor found in Table 5, Appendix B.
Step 3. Compare each observation from compliance wells to the tolerance
limit found in Step 2. If any observation exceeds the tolerance limit, that
is statistically significant evidence that the well is contaminated. Note
that if the tolerance interval was constructed on the logarithms of the orig-
inal background observations, the logarithms of the compliance well observa-
tions should be compared to the tolerance limit. Alternatively, the tolerance
limit may be transferred to the original data scale by taking the anti-
logarithm.
REFERENCE
Lieberman, Gerald J. 1958. "Tables for One-sided Statistical Tolerance
Limits." Industrial Quality Control. Vol. XIV, No. 10.
EXAMPLE
Table 4-5 contains example data that represent lead concentration levels
in parts per million in water samples at a hypothetical facility. The
background well data are in columns one and two, while the other four columns
represent compliance well data.
Step 1. The mean and standard deviation of the n = 8 observations have
been calculated for the background well. The mean is 51.4 and the standard
deviation is 16.3.
Step 2. The tolerance factor for a one-sided normal tolerance interval
is found from Table 5, Appendix B as 3.188. This is for 95% coverage with
probability 95% and for n = 8. The upper tolerance limit is then calculated
as 51.4 + (3.188)(16.3) = 103.4.
Step 3. The tolerance limit of 103.4 is compared with the compliance
well data. Any value that exceeds the tolerance limit indicates evidence of
contamination. Two observations from Well 1, two observations from Well 3,
and all four observations from Well 4 exceed the tolerance limit. Thus there
is evidence of contamination at Wells 1, 3, and 4.
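A minimal scripted version of this check, using the summary statistics and compliance data of Table 4-5 (illustration only; the tolerance factor 3.188 is the tabulated value quoted in Step 2, and the variable names are invented for this sketch), is:

    # Sketch of the normal tolerance-limit check for the Table 4-5 lead data (ppm).
    background_mean = 51.4
    background_sd = 16.3
    k_factor = 3.188                      # tabulated factor for n = 8, 95% coverage / 95% confidence

    tolerance_limit = background_mean + k_factor * background_sd   # about 103.4

    compliance = {
        "Well 1": [273.1, 170.7, 32.1, 53.0],
        "Well 2": [34.1, 93.7, 70.8, 83.1],
        "Well 3": [49.9, 73.0, 244.7, 202.4],
        "Well 4": [225.9, 183.1, 198.3, 160.8],
    }

    for well, values in compliance.items():
        exceedances = [v for v in values if v > tolerance_limit]
        if exceedances:
            print(f"{well}: {len(exceedances)} value(s) above {tolerance_limit:.1f} -- evidence of contamination")
        else:
            print(f"{well}: no values above the tolerance limit")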
INTERPRETATION
A tolerance limit with 95% coverage gives an upper bound below which 95%
of the observations of the distribution should fall. The tolerance coeffi-
cient used here is 95%, implying that at least 95% of the observations should
fall below the tolerance limit with probability 95%, if the compliance well
data come from the same distribution as the background data. In other words,
in this example, we are 95% certain that 95% of the background lead concentra-
tions are below 104 ppm. If observations exceed the tolerance limit, this is
evidence that the compliance well data are not from the same distribution, but
rather are from a distribution with higher concentrations. This is inter-
preted as evidence of contamination.
4.4 PREDICTION INTERVALS
A prediction interval is a statistical interval calculated to include one
or more future observations from the same population with a specified confi-
dence. In ground-water monitoring, a prediction interval approach may be used
to make comparisons between background and compliance well data. The concen-
trations of a hazardous constituent in the background wells are used to estab-
lish an interval within which K future observations from the same population
are expected to lie with a specified confidence. Then each of K future obser-
vations of compliance well concentrations is compared to the prediction inter-
val. The interval is constructed to contain all of K future observations with
the stated confidence. If any future observation exceeds the prediction
interval, this is statistically significant evidence of contamination. In
application, the number of future observations to be collected, K, must be
specified. Thus, the prediction interval is constructed for a specified time
period in the future. One year is suggested. The interval can be constructed
either to contain all K individual observations with a specified probability,
or to contain the K' means observed at the K' sampling periods.
TABLE 4-5. EXAMPLE DATA FOR NORMAL TOLERANCE INTERVAL
Lead concentrations (ppm)

              Background well            Compliance wells
  Date          A       B        Well 1    Well 2    Well 3    Well 4

  Jan 1        58.0    46.1      273.1*     34.1      49.9     225.9*
  Feb 1        54.1    75.7      170.7*     93.7      73.0     183.1*
  Mar 1        30.0    32.1       32.1      70.8     244.7*    198.3*
  Apr 1        46.1    68.0       53.0      83.1     202.4*    160.8*

  n = 8                The upper 95% coverage tolerance limit
  Mean = 51.4          with tolerance coefficient of 95% is
  SD = 16.3            51.4 + (3.188)(16.3) = 103.4

  * Indicates contamination
The prediction interval presented here is constructed assuming that the
background data all follow the same normal distribution. If that is not the
case (see Section 3.2 for tests of normality), but a log transformation
results in data that are adequately normal on the log scale, the interval may
still be used. In this case, use the data after transforming by taking the
logarithm. The future observations also need to be transformed by taking
logarithms before comparison to the interval. (Alternatively, the end points
of the interval could be converted back to the original scale by taking their
anti-logarithms.)
PURPOSE
The prediction interval is constructed so that K future compliance well
observations can be tested by determining whether they lie in the interval or
not. If not, evidence of contamination is found. Note that the number of
future observations, K, for which the interval is to be used, must be speci-
fied in advance. In practice, an owner or operator would need to construct
the prediction interval on a periodic (at least yearly) basis, using the most
recent background data. The interval is described using the 99% confidence
factor appropriate for individual well comparisons. It is recommended that a
one-sided prediction interval be constructed for the mean of the four observa-
tions from each compliance well at each sampling period.
PROCEDURE
Step 1. Calculate the mean, X, and the standard deviation, S, for the
background well data (used to form the prediction interval).
Step 2. Specify the number of future observations for a compliance well
to be included in the interval, K. Then the interval is given by

    X + S [1/m + 1/n]^(1/2) t(n-1, K, 0.95)

where it is assumed that the mean of the m observations taken at each of the K
sampling periods will be used. Here n is the number of observations in the back-
ground data, and t(n-1, K, 0.95) is found from Table 3 in Appendix B. The
table is entered with K as the number of future observations, and degrees of
freedom, v = n-1. If K > 5, use the column for K = 5.
Step 3. Once the interval has been calculated, at each sampling period,
the mean of the m compliance well observations is obtained. This mean is com-
pared to see if it falls in the interval. If it does, this is reported and
monitoring continues. If a mean concentration at a sampling period does not
fall in the prediction interval, this is statistically significant evidence of
contamination. This is also reported and the appropriate action taken.
REMARK
For a single future observation, t is given by the t-distribution found
in Table 6 of Appendix B. In general, the interval to contain K future means
of sample size m each is given by

    X + S [1/m + 1/n]^(1/2) t(n-1, K, 0.95)

where t is as before from Table 3 of Appendix B and where m is the number of
observations in each mean. Note that for K single observations, m = 1, while
for the mean of four samples from a compliance well, m = 4.
Note, too, that the prediction intervals are one-sided, giving a value
that should not be exceeded by the future observations. The 5% experimentwise
significance level is used with the Bonferroni approach. However, to ensure
that the significance level for the individual comparisons does not go below
1%, α/K is restricted to be 1% or larger. If more than five comparisons are
used, the comparisonwise significance level of 1% is used, implying that the
experimentwise level may exceed 5%.
EXAMPLE
Table 4-6 contains chlordane concentrations measured at a hypothetical
facility. Twenty-four background observations are available and are used to
develop the prediction interval. The prediction interval is applied to K = 2
sampling periods with m = 4 observations at a single compliance well each.
Step 1. Find the mean and standard deviation of the 24 background well
measurements. These are 101 and 11, respectively.
Step 2. There are K = 2 future observations of means of 4 observations
to be included in the prediction interval. Entering Table 3 of Appendix B at
K = 2 and 20 degrees of freedom (the nearest entry to the 23 degrees of
freedom), we find t(20, 2, 0.95) = 2.09. The interval is given by

    101 + (11)(2.09)(1/4 + 1/24)^(1/2) = 113.4, that is, (0, 113.4).

Step 3. The mean of the four compliance well observations at sampling
periods one and two is found and compared with the interval found in
Step 2. The mean for the first sampling period is 122 and that for the second
sampling period is 113. Comparing the first of these to the prediction inter-
val for two means based on samples of size 4, we find that the mean exceeds
the upper limit of the prediction interval. This is statistically significant
evidence of contamination and should be reported to the Regional Administra-
tor. Since the second sampling period mean is within the prediction interval,
the Regional Administrator may allow the facility to remain in its current
stage of monitoring.
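A short sketch of this calculation (illustration only; the tabulated factor 2.09 and the background summary statistics are those cited in Steps 1 and 2, and the names are invented for this sketch) is:

    # Sketch of the prediction-interval check for the chlordane example (Table 4-6).
    import math

    n_background = 24
    background_mean = 101.0
    background_sd = 11.0
    m = 4                 # observations averaged at each sampling period
    t_factor = 2.09       # tabulated value for 20 degrees of freedom, K = 2

    upper_limit = background_mean + t_factor * background_sd * math.sqrt(1.0 / m + 1.0 / n_background)
    print(f"upper prediction limit: {upper_limit:.1f}")   # about 113.4

    for period, mean_conc in {"July 1, 1986": 122.0, "October 1, 1986": 113.0}.items():
        status = "exceeds the limit -- evidence of contamination" if mean_conc > upper_limit else "within the interval"
        print(f"{period}: mean = {mean_conc}, {status}")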
TABLE 4-6. EXAMPLE DATA FOR PREDICTION INTERVAL--CHLORDANE LEVELS

       Background well data--Well 1          Compliance well data--Well 2
                     Chlordane                                Chlordane
                     concentration                            concentration
  Sampling date      (ppb)                Sampling date       (ppb)

  January 1, 1985      97                 July 1, 1986          123
                      103                                       120
                      104                                       116
                       85                                       128
  April 1, 1985       120                                       m = 4
                      105                                       Mean = 122
                      104                                       SD = 5
                      108
  July 1, 1985        110                 October 1, 1986       116
                       95                                       117
                      102                                       119
                       78                                       101
  October 1, 1985     105                                       m = 4
                       94                                       Mean = 113
                      110                                       SD = 8
                      111
  January 1, 1986      80
                      106
                      115
                      105
  April 1, 1986       100
                       93
                       89
                      113

  n = 24
  Mean = 101
  SD = 11
INTERPRETATION
A prediction interval is a statistical interval constructed from back-
ground sample data to contain a specified number of future observations from
the same distribution with specified probability. That is, the prediction
interval is constructed so as to have a 95% probability of containing the next
K sampling period means, provided that there is no contamination. If the
future observations are found to be in the prediction interval, this is evi-
dence that there has been no change at the facility and that no contamination
is occurring. If the future observation falls outside of the prediction
interval, this is statistical evidence that the new observation does not come
from the same distribution, that is, from the population of uncontaminated
water samples previously sampled. Consequently, if the observation is a con-
centration above the prediction interval's upper limit, it is evidence of
contamination.
The prediction interval could be constructed in several ways. It can be
developed for means of observations at each sampling period, or for each in-
dividual observation at each sampling period.
It should also be noted that the estimate of the standard deviation, S,
that is used should be an unbiased estimator. The usual estimator, presented
above, assumes that there is only one source of variation. If there are other
sources of variation, such as time effects or spatial variation in the data
used for the background, these should be included in the estimate of the vari-
ability. This can be accomplished by use of an appropriate analysis of vari-
ance model to include the other factors affecting the variability. Determina-
tion of the components of variance in complicated models is beyond the scope
of this document and requires consultation with a professional statistician.
REFERENCE
Gibbons, Robert D. 1987. "Statistical Prediction Intervals for the
Evaluation of Ground-Water Quality." Ground Water. Vol. 25, pp. 455-465.
SECTION 5
COMPARISONS WITH MCLs OR ACLs
This section includes statistical procedures appropriate when the moni-
toring aims at determining whether ground-water concentrations of hazardous
constituents are below or above fixed concentration limits. In this situation
the maximum concentration limit (MCL) or alternate concentration limit (ACL)
is a specified concentration limit rather than being determined by the back-
ground well concentrations. Thus the applicable statistical procedures are
those that compare the compliance well concentrations estimated from sampling
with the prespecified fixed limits. Methods for comparing compliance well
concentrations to a (variable) background concentration were presented in
Section 4.
The methods applicable to the type of comparisons described in this sec-
tion include confidence intervals and tolerance intervals. A special section
deals with cases where the observations exhibit very small or no variability.
5.1	SUMMARY CHART FOR COMPARISON WITH MCLs OR ACLs
Figure 5-1 is a flow chart to aid the user in selecting and applying a
statistical method when the permit specifies an MCL or ACL.
As with each type of comparison, a determination is made first to see if
there is enough data for intra-well comparisons. If so, these should be done
in parallel with the other comparisons.
Here, whether the compliance limit is a maximum concentration limit (MCL)
or an alternate concentration limit (ACL), the recommended procedure to com-
pare the mean compliance well concentration against the compliance limit is
the construction of a confidence interval. This approach is presented in
Section 5.2.1. Section 5.2.2 adds a special case of limited variance in the
data. If the permit requires that a compliance limit is not to be exceeded
more than a specified fraction of the time, then the construction of tolerance
limits is the recommended procedure, discussed in Section 5.2.3.
5.2	STATISTICAL PROCEDURES
This section presents the statistical procedures appropriate for compari-
son of ground-water monitoring data to a constant compliance limit, a fixed
standard. The interpretation of the fixed compliance limit (MCL or ACL) is
that the mean concentration should not exceed this fixed limit. An alterna-
tive interpretation may be specified. The permit could specify a compliance
limit as a concentration not to be exceeded by more than a small, specified
[Figure 5-1 is a flow chart. Its main elements are: intra-well comparisons
using control charts (Section 6) if more than one year of data is available;
comparison of the mean with the MCL/ACL using normal, log-normal, or
nonparametric confidence intervals, depending on whether the data (or the log
data) are normal and whether enough data are available; comparison with the
upper 95th percentile using tolerance limits; and consultation with a
professional statistician otherwise.]
Figure 5-1. Comparisons with MCLs/ACLs.
proportion of the observations. A tolerance interval approach for such a
situation is also presented.
5.2.1 Confidence Intervals
When a regulated unit is in compliance monitoring with a fixed compliance
limit (either an MCL or an ACL), confidence intervals are the recommended pro-
cedure pursuant to §264.97(h)(5) in the Subpart F regulations. The unit will
remain in compliance monitoring unless there is statistically significant evi-
dence that the mean concentration at one or more of the downgradient wells
exceeds the compliance limit. A confidence interval for the mean concentra-
tion is constructed from the sample data for each compliance well individu-
ally. These confidence intervals are compared with the compliance limit. If
the entire confidence interval exceeds the compliance limit, this is statisti-
cally significant evidence that the mean concentration exceeds the compliance
limit.
Confidence intervals can generally be constructed for any specified dis-
tribution. General methods can be found in texts on statistical inference,
some of which are referenced in Appendix C. A confidence limit based on the
normal distribution is presented first, followed by a modification for the
log-normal distribution. A nonparametric confidence interval is also
presented.
5.2.1.1 Confidence Interval Based on the Normal Distribution
PURPOSE
The confidence interval for the mean concentration is constructed from
the compliance well data. Once the interval has been constructed, it can be
compared with the MCL or ACL by inspection to determine whether the mean con-
centration significantly exceeds the MCL or ACL.
PROCEDURE
Step 1. Calculate the mean, X, and standard deviation, S, of the sample
concentration values. Do this separately for each compliance well.
Step 2. For each well calculate the confidence interval as

    X ± t(0.99, n-1) S/√n

where t(0.99, n-1) is obtained from the t-table (Table 6, Appendix B).
Generally, there will be at least four observations at each sampling period,
so t will usually have at least 3 degrees of freedom.
Step 3. Compare the intervals calculated in Step 2 to the compliance
limit (the MCL or ACL, as appropriate). If the compliance limit is contained
in the interval or is above the upper limit, the unit remains in compliance.
If any well confidence interval's lower limit exceeds the compliance limit,
this is statistically significant evidence of contamination.
REMARK
The 99th percentile of the t-distribution is used in constructing the
confidence interval. This is consistent with an alpha (probability of Type I
error) of 0.01, since the decision on compliance is made by comparing the
lower confidence limit to the MCL or ACL. Although the interval as con-
structed with both upper and lower limits is a 98% confidence interval, the
use of it is one-sided, which is consistent with the 1% alpha level of
individual well comparisons.
EXAMPLE
Table 5-1 lists hypothetical concentrations of Aldicarb in three compli-
ance wells. For illustration purposes, the MCL for Aldicarb has been set at
7 ppb. There is no evidence of nonnormality, so the confidence interval based
on the normal distribution is used.
TABLE 5-1. EXAMPLE DATA FOR NORMAL CONFIDENCE INTERVAL--ALDICARB
CONCENTRATIONS IN COMPLIANCE WELLS (ppb)

  Sampling date    Well 1    Well 2    Well 3

  Jan. 1            19.9      23.7      5.6
  Feb. 1            29.6      21.9      3.3
  Mar. 1            18.7      26.9      2.3
  Apr. 1            24.2      26.1      6.9

  X =               23.1      24.6      4.5
  S =                4.9       2.3      2.1

  MCL = 7 ppb
Step 1. Calculate the mean and standard deviation of the concentrations
for each compliance well. These statistics are shown in the table above.
Step 2. Obtain the 99th percentile of the t-distribution with (4-1) = 3
degrees of freedom from Table 6, Appendix B as 4.541. Then calculate the con-
fidence interval for each well's mean concentration.

    Well 1: 23.1 ± 4.541(4.9)/√4 = (12.0, 34.2)
    Well 2: 24.6 ± 4.541(2.3)/√4 = (19.4, 29.8)
    Well 3:  4.5 ± 4.541(2.1)/√4 = (-0.3, 9.3)
where the usual convention of expressing the upper and lower limits of the
confidence interval in parentheses separated by a comma has been followed.
Step 3. Compare each confidence interval to the MCL of 7 ppb. When this
is done, the confidence interval for Well 1 lies entirely above the MCL of 7,
indicating that the mean concentration of Aldicarb in Well 1 significantly
exceeds the MCL. Similarly, the confidence interval for Well 2 lies entirely
above the MCL of 7. This is significant evidence that the mean concentration
in Well 2 exceeds the MCL. However, the confidence interval for Well 3 is
mostly below the MCL. Thus, there is no evidence that the mean concentration
in Well 3 exceeds the MCL.
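The same comparison can be scripted as follows (illustration only; the intervals may differ from the worked example in the last decimal because the example rounds S to one decimal before use, and the names are invented for this sketch).

    # Sketch of the normal confidence-interval comparison for the Table 5-1
    # Aldicarb data (ppb); 4.541 is the tabulated 99th percentile of t with
    # 3 degrees of freedom cited in Step 2.
    import math

    MCL = 7.0
    t_99 = 4.541
    wells = {
        "Well 1": [19.9, 29.6, 18.7, 24.2],
        "Well 2": [23.7, 21.9, 26.9, 26.1],
        "Well 3": [5.6, 3.3, 2.3, 6.9],
    }

    for well, values in wells.items():
        n = len(values)
        mean = sum(values) / n
        sd = math.sqrt(sum((v - mean) ** 2 for v in values) / (n - 1))
        half_width = t_99 * sd / math.sqrt(n)
        lower, upper = mean - half_width, mean + half_width
        verdict = "lower limit exceeds the MCL -- noncompliance" if lower > MCL else "no evidence that the mean exceeds the MCL"
        print(f"{well}: ({lower:.1f}, {upper:.1f}) vs MCL {MCL} -- {verdict}")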
INTERPRETATION
The confidence interval is an interval constructed so that it should con-
tain the true or population mean with specified confidence (98% in this
case). If this interval does not contain the compliance limit, then the mean
concentration must differ from the compliance limit. If the lower end of the
interval is above the compliance limit, then the mean concentration must be
significantly greater than the compliance limit, indicating noncompliance.
5.2.1.2 Confidence Interval for Log-Normal Data
PURPOSE
The purpose of a confidence interval for the mean concentration is to
determine whether there is statistically significant evidence that the mean
concentration exceeds a fixed compliance limit. The interval gives a range
that includes the true mean concentration with confidence 98%. The lower
limit will be below the true mean with confidence 99%, corresponding to an
alpha of 1%.
PROCEDURE
This procedure is used to construct a confidence interval for the mean
concentration from the compliance well data when the data are log-normal (that
is, when the logarithms of the data are normal). Once the interval has been
constructed, it can be compared with the MCL or ACL by inspection to determine
whether the mean concentration significantly exceeds the MCL or ACL.
Step 1. Take the natural logarithm of each data point (concentration
measurement). Also, take the natural logarithm of the compliance limit.
Step 2. Calculate the sample mean and standard deviation of the data (on
the log scale) from each compliance well. (This is Step 1 of the previous
section, working now on the logarithms.)
Step 3. Form the confidence intervals for each compliance well as

    X ± t(0.99, n-1) S/√n
where t(0.99, n-1) is from the t-distribution in Table 6 of Appendix B. Here
t will typically have 3 degrees of freedom.
Step 4. Compare the confidence intervals found in Step 3 to the
logarithm of the compliance limit found in Step 1. If any of the intervals
lies entirely above the logarithm of the compliance limit, there is evidence
that the unit is out of compliance. Otherwise, the unit is in compliance.
EXAMPLE
Table 5-2 contains EDB concentration data from three compliance wells at
a hypothetical site. The MCL is assumed to be 20 ppb. For demonstration pur-
poses, the data are assumed not normal; a log-transformation normalized them
adequately. The lower part of the table contains the natural logarithms of
the concentrations.
Step 1. The logarithms of the data are used to calculate a confidence
interval. Take the natural log of the concentrations in the top part of
Table 5-2 to find the values given in the lower part of the table. For exam-
ple, ln(24.2) = 3.19, . . ., ln(25.3) = 3.23. Also, take the logarithm of the
MCL to find that ln(20) = 2.99.
TABLE 5-2. EXAMPLE DATA FOR LOG-NORMAL CONFIDENCE INTERVAL--EDB
CONCENTRATIONS IN COMPLIANCE WELLS (ppb)

  Sampling date    Well 1    Well 2    Well 3

  Jan. 1            24.2      39.7      55.7
  Apr. 1            10.2      75.7      17.0
  Jul. 1            17.4      60.2      97.8
  Oct. 1            39.7      10.9      25.3

  X =               22.9      46.6      49.0
  S =               12.6      28.0      36.6

  MCL = 20 ppb

                      Log concentrations

  Jan. 1            3.19      3.68      4.02
  Apr. 1            2.32      4.33      2.84
  Jul. 1            2.85      4.10      4.58
  Oct. 1            3.68      2.39      3.23

  X =               3.01      3.62      3.67
  S =               0.57      0.86      0.78

  Log(MCL) = 2.99
Step 2. Calculate the mean and standard deviation of the log concentra-
tions for each compliance well. These are shown in the table.
Step 3. Form the confidence intervals for each compliance well.

    Well 1: 3.01 ± 4.541(0.57)/√4 = (1.72, 4.30)
    Well 2: 3.62 ± 4.541(0.86)/√4 = (1.67, 5.57)
    Well 3: 3.67 ± 4.541(0.78)/√4 = (1.90, 5.44)

where 4.541 is the value obtained from the t-table (Table 6 in Appendix B) as
in the previous example.
Step 4. Compare the individual well confidence intervals with the MCL
(expressed on the log scale). The natural log of the MCL of 20 ppb is 2.99.
None of the individual well confidence intervals for the mean has a lower
limit that exceeds this value, so none of the individual well mean concentra-
tions is significantly different from the MCL.
Note: The lower and upper limits of the confidence interval for each
well's mean concentration could be converted back to the original scale by
taking antilogs. For example, on the original scale, the confidence intervals
would be:

    Well 1: (exp(1.72), exp(4.30)) or (5.58, 73.70)
    Well 2: (exp(1.67), exp(5.57)) or (5.31, 262.43)
    Well 3: (exp(1.90), exp(5.44)) or (6.69, 230.44)

These limits could be compared directly with the MCL of 20 ppb. It is gen-
erally easier to take the logarithm of the MCL rather than the antilogarithm
of all of the intervals for comparison.
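The log-scale version of the calculation is sketched below for Well 1 of Table 5-2 (illustration only; the limits agree with the worked example up to rounding, and the names are invented for this sketch).

    # Sketch of the log-normal confidence interval for one compliance well
    # (EDB, ppb), working on natural logarithms as in Steps 1-4.
    import math

    MCL = 20.0
    t_99 = 4.541
    well_1 = [24.2, 10.2, 17.4, 39.7]

    logs = [math.log(v) for v in well_1]
    n = len(logs)
    mean = sum(logs) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in logs) / (n - 1))
    half_width = t_99 * sd / math.sqrt(n)
    lower, upper = mean - half_width, mean + half_width

    print(f"log-scale interval: ({lower:.2f}, {upper:.2f}) vs ln(MCL) = {math.log(MCL):.2f}")
    # Equivalent comparison on the original scale after exponentiating the limits:
    print(f"original-scale interval: ({math.exp(lower):.1f}, {math.exp(upper):.1f}) ppb vs MCL = {MCL} ppb")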
INTERPRETATION
If the original data are not normal, but the log-transformation ade-
quately normalizes the data, the confidence interval (on the log scale) is an
interval constructed so that the lower confidence limit should be less than
the true or population mean (on the log scale) with specified confidence (99%
In this case). If the lower limit exceeds the appropriate compliance limit,
then the mean concentration must exceed that compliance limit. If the lower
end of the Interval 1s above the compliance limit, the mean concentration must
be significantly greater (at the 1% level) than the compliance limit.
5.2.1.3 Nonparametric Confidence Interval
If the data do not adequately follow the normal distribution even after
the logarithm transformation, a nonparametric confidence interval can be con-
structed. This interval is for the median concentration (which equals the
mean if the distribution is symmetric). It requires a minimum of seven (7)
observations in order to construct an interval with a two-sided confidence
coefficient of 98%, corresponding to a one-sided confidence coefficient of
99%. Consequently, it is applicable only for the pooled concentration of
compliance wells at a single point in time or for special sampling to produce
a minimum of seven observations at a single well during the sampling period.
PURPOSE
The nonparametric confidence interval is used when the data have been
found to violate the normality assumption, a log-transformation fails to
normalize the data, and no other specific distribution is assumed. It pro-
duces a simple confidence interval that 1s designed to contain the true or
population median concentration with specified confidence. If this confidence
interval contains the compliance limit, 1t 1s concluded that the median con-
centration does not differ significantly from the compliance limit. If the
interval's lower limit exceeds the compliance limit, this is statistically
significant evidence that the concentration exceeds the compliance limit and
the unit is out of compliance.
PROCEDURE
Step 1. Within each compliance well, order the n data from least to
greatest, denoting the ordered data by X(1), . . ., X(n), where X(i) is the ith
value in the ordered data.
Step 2. Determine the critical values of the order statistics. If the
minimum of seven observations is used, the critical values are 1 and 7. Other-
wise, find the smallest integer, M, such that the cumulative binomial distri-
bution with parameters n (the sample size) and p = 0.5 is at least 0.99.
Table 5-3 gives the values of M and n+1-M together with the exact confidence
coefficient for sample sizes from 4 to 11. For larger samples, take as an
approximation the nearest integer value to

    M = n/2 + 1 + Z(0.99) √(n/4)

where Z(0.99) is the 99th percentile from the normal distribution (Table 4,
Appendix B) and equals 2.33.
Step 3. Once M has been determined in Step 2, find n+1-M and take as the
confidence limits the order statistics, X(M) and X(n+1-M). (With the minimum
seven observations, use X(1) and X(7).)
Step 4. Compare the confidence limits found in Step 3 to the compliance
limit. If the lower limit, X(n+1-M), exceeds the compliance limit, there is evi-
dence of contamination. Otherwise, the unit remains in compliance.
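A brief sketch of Steps 2 and 3 for a larger sample, using the normal approximation for M given above and the Well 1 Silvex data of Table 5-4 from the example that follows (illustration only; names are invented for this sketch), is:

    # Sketch of the nonparametric confidence limits: compute M and pick the
    # order statistics X(M) and X(n+1-M).  z = 2.33 is the 99th percentile
    # of the standard normal distribution.
    import math

    well_1 = [3.17, 2.32, 7.37, 4.44, 9.50, 21.36, 5.15, 15.70,
              5.58, 3.39, 8.44, 10.25, 3.65, 6.15, 6.94, 3.74]

    n = len(well_1)                                   # 16 observations
    z_99 = 2.33
    M = round(n / 2 + 1 + z_99 * math.sqrt(n / 4))    # 13.7 rounds to 14

    ordered = sorted(well_1)
    lower = ordered[(n + 1 - M) - 1]                  # X(n+1-M), here the 3rd smallest
    upper = ordered[M - 1]                            # X(M), here the 14th smallest
    print(f"M = {M}; confidence limits = ({lower}, {upper})")   # (3.39, 10.25)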
TABLE 5-3. VALUES OF M AND n+1-M AND CONFIDENCE
COEFFICIENTS FOR SMALL SAMPLES

   n      M     n+1-M     Two-sided confidence

   4      4       1            87.5%
   5      5       1            93.8%
   6      6       1            96.9%
   7      7       1            98.4%
   8      8       1            99.2%
   9      9       1            99.6%
  10      9       2            97.9%
  11     10       2            98.8%
REMARK
The nonparametric confidence interval procedure requires at least seven
observations in order to obtain a (one-sided) significance level of 1% (confi-
dence of 99%). This means that data from two (or more) wells or sampling
periods would have to be pooled to achieve this level. If only the four
observations from one well taken at a single sampling period were used, the
one-sided significance level would be 6.25%. This would also be the false
alarm rate.
Ties do not affect the procedure. If there are ties, order the observa-
tions as before, including all of the tied values as separate observations.
That is, each of the observations with a common value is included in the
ordered list (e.g., 1, 2, 2, 2, 3, 4, etc.). The order statistics are found
by counting positions up from the bottom of the list as before. Multiple
values from separate observations are counted separately.
EXAMPLE
Table 5-4 contains concentrations of Silvex in parts per million from two
hypothetical compliance wells. The data are assumed to consist of four sam-
ples taken each quarter for a year, so that sixteen observations are available
from each well. The data are neither normal nor log-normal, so the non-
parametric confidence interval is used. The MCL is taken to be 25 ppm.
Step 1. Order the 16 measurements from least to greatest within each
well separately. The numbers in parentheses beside each concentration in
Table 5-4 are the ranks or order of the observations. For example, in Well 1,
the smallest observation is 2.32, which has rank 1. The second smallest is
3.17, which has rank 2, and so forth, with the largest observation of 21.36
having rank 16.
TABLE 5-4. EXAMPLE DATA FOR NONPARAMETRIC CONFIDENCE
INTERVAL--SILVEX CONCENTRATIONS (ppm)

                       Well 1                      Well 2
  Sampling      Concentration               Concentration
  date          (ppm)        Rank           (ppm)        Rank

  Jan. 1         3.17         (2)            3.52         (6)
                 2.32         (1)           12.32        (15)
                 7.37        (11)            2.28         (4)
                 4.44         (6)            5.30         (7)
  Apr. 1         9.50        (13)            8.12        (11)
                21.36        (16)            3.36         (5)
                 5.15         (7)           11.02        (14)
                15.70        (15)           35.05        (16)
  Jul. 1         5.58         (8)            2.20         (3)
                 3.39         (3)            0.00         (1.5)
                 8.44        (12)            9.30        (12)
                10.25        (14)           10.30        (13)
  Oct. 1         3.65         (4)            5.93         (8)
                 6.15         (9)            6.39         (9)
                 6.94        (10)            0.00         (1.5)
                 3.74         (5)            6.53        (10)
Step 2. The sample size is large enough so that the approximation is
used to find M.

    M = 16/2 + 1 + 2.33 √(16/4) = 13.7, rounded to 14

Step 3. The approximate 95% confidence limits are given by the
16 + 1 - 14 = 3rd smallest observation and the 14th smallest observation. For
Well 1 the 3rd smallest observation is 3.39 and the 14th smallest observation is
10.25. Thus the confidence limits for Well 1 are (3.39, 10.25). Similarly
for Well 2, the 3rd smallest observation and the 14th smallest observation are
found to give the confidence interval (2.20, 11.02). Note that for Well 2
there were two values below detection. These were assigned a value of zero
and received the two smallest ranks. Had there been three or more values
below the limit of detection, the lower limit of the confidence interval would
have been the limit of detection because these values would have been the
smallest values and so would have included the third order statistic.
Step 4. Neither of the two confidence intervals has a lower limit that exceeds the
MCL of 25. In fact, both upper limits are less than the MCL, implying that the
concentration in each well is significantly below the MCL.
INTERPRETATION
The order statistics used to form the confidence interval in the nonpara-
metric confidence interval procedure will contain the population median with
confidence coefficient of 98%. The population median equals the mean whenever
the distribution is symmetric. The nonparametric confidence interval is gen-
erally wider and requires more data than the corresponding normal distribution
interval, and so the normal or log-normal distribution interval should be used
whenever it is appropriate.
If the confidence interval contains the compliance limit (either MCL or
ACL), then it is reasonable to conclude that the median compliance well con-
centration does not differ significantly from the compliance limit. If the
lower end of the confidence interval exceeds the compliance limit, this is
statistically significant evidence at the 1% level that the median compliance
well concentration exceeds the compliance limit and the unit is out of
compliance.
5.2.2 Tolerance Intervals for Compliance Limits
In some cases a permit may specify that a compliance limit (MCL or ACL)
is not to be exceeded more than a specified fraction of the time. Since lim-
ited data will be available from each monitoring well, these data can be used
to estimate a tolerance interval for concentrations from that well. If the
upper tolerance limit is less than the compliance limit, the data indicate
that the unit is in compliance. That is, concentrations should be less than
the compliance limit at least a specified fraction of the time. If the upper
tolerance limit of the interval exceeds the compliance limit, then the concen-
tration of the hazardous constituent could exceed the compliance limit more
than the specified proportion of the time.
This procedure compares an upper tolerance limit to the MCL or ACL. With
small sample sizes the upper tolerance limit can be fairly large, particularly
if large coverage with high confidence is desired. If the owner or operator
wishes to use a tolerance limit 1n this application, he/she should suggest
values for the parameters of the procedure subject to the approval of the
Regional Administrator. To exemplify the procedure a 95% coverage is used
with 95% confidence. This means that the upper tolerance limit is a value
which, with 95% confidence, will be exceeded less than 5% of the time.
PURPOSE
The purpose of the tolerance interval approach is to construct an inter-
val that should contain a specified fraction of the concentration measurements
from compliance wells with a specified degree of confidence. In this appli-
cation it is generally desired to have the tolerance interval contain 95% of
the measurements of concentration with confidence at least 95%.
PROCEDURE
It is assumed that the data used to construct the tolerance interval are
approximately normal. The data may consist of the concentration measurements
themselves if they are adequately normal (see Section 3.2 for tests of normal-
ity), or the data used may be the natural logarithms of the concentration
data. It is important that the compliance limit (MCL or ACL) be expressed in
the same units (either concentrations or logarithm of the concentrations) as
the observations.
Step 1. Calculate the mean, X, and the standard deviation, S, of the
compliance well concentration data.
Step 2. Determine the factor, K, from Table 5, Appendix B, for the sam-
ple size, n, and form the one-sided tolerance interval
    X + KS

Table 5, Appendix B contains the factors for a 95% coverage tolerance interval
with confidence factor 95%.
Step 3. Compare the upper limit of the tolerance interval computed in
Step 2 to the compliance limit. If the upper limit of the tolerance interval
exceeds that limit, this is evidence of contamination.
EXAMPLE
Table 5-5 contains Aldicarb concentrations at a hypothetical facility in
compliance monitoring. The data are concentrations in parts per million (ppm)
and represent observations at three compliance wells. Assume that the permit
establishes an ACL of 50 ppm that is not to be exceeded more than 5% of the
time.
TABLE 5-5. EXAMPLE DATA FOR A TOLERANCE
INTERVAL COMPARED TO AN ACL

                   Aldicarb concentrations (ppm)
  Sampling date     Well 1    Well 2    Well 3

  Jan. 1             19.9      23.7      25.6
  Feb. 1             29.6      21.9      23.3
  Mar. 1             18.7      26.9      22.3
  Apr. 1             24.2      26.1      26.9

  Mean =             23.1      24.7      24.5
  SD =               4.93      2.28      2.10

  ACL = 50 ppm
Step 1. Calculate the mean and standard deviation of the observations
from each well. These are given in the table.
Step 2. For n = 4, the factor, K, in Table 5, Appendix B, is found to
be 5.145. Form the upper tolerance interval limits as:

    Well 1: 23.1 + 5.145(4.93) = 48.5
    Well 2: 24.7 + 5.145(2.28) = 36.4
    Well 3: 24.5 + 5.145(2.10) = 35.3

Step 3. Compare the tolerance limits with the ACL of 50 ppm. Since the
tolerance limits are below the ACL, there is no statistically significant
evidence of contamination at any well. The site remains in compliance
monitoring.
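A brief sketch of this check (illustration only; the factor 5.145 is the tabulated value for n = 4 quoted in Step 2, and the names are invented for this sketch):

    # Sketch of the Table 5-5 comparison of upper tolerance limits with a fixed
    # ACL of 50 ppm (95% coverage, 95% confidence).
    k_factor = 5.145
    acl = 50.0
    wells = {"Well 1": (23.1, 4.93), "Well 2": (24.7, 2.28), "Well 3": (24.5, 2.10)}

    for well, (mean, sd) in wells.items():
        upper_limit = mean + k_factor * sd
        verdict = "exceeds the ACL" if upper_limit > acl else "below the ACL -- no evidence of contamination"
        print(f"{well}: upper tolerance limit = {upper_limit:.1f} ppm, {verdict}")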
INTERPRETATION
It may be desirable in a permit to specify a compliance limit that is not
to be exceeded more than 5% of the time. A tolerance interval constructed
from the compliance well data provides an estimated interval that will contain
95% of the data with confidence 95%. If this interval is below the selected
compliance limit, concentrations measured at the compliance wells should
exceed the compliance limit less than 5% of the time. If the upper limit of
the tolerance interval exceeds the compliance limit, then more than 5% of the
concentration measurements would be expected to exceed the compliance limit.
5.2.3 Special Cases with Limited Variance
Occasionally, all four concentrations from a compliance well at a par-
ticular sampling period could be identical. If this is the case, the formula
for estimating the standard deviation at that specific sampling period would
give zero, and the methods for calculating parametric confidence intervals
would give the same limits for the upper and lower ends of the intervals,
which is not appropriate.
In this case it is assumed that there is some variation in the data, but
that the concentrations were rounded and gave the same values after round-
ing. To account for the variability that was present before rounding, take
the least significant digit in the reported concentration as having resulted
from rounding. It is assumed that rounding results in a uniform error on the
interval centered at the reported value, with the interval ranging up or down
one half unit from the reported value. This assumed rounding is used to
obtain a nonzero estimate of the variance for use in cases where all the mea-
sured concentrations were found to be identical.
PURPOSE
The purpose of this procedure is to obtain a nonzero estimate of the
variance when all observations from a well during a given sampling period gave
identical results. Once this modified variance is obtained, its square root
is used in place of the usual sample standard deviation, S, to construct con-
fidence intervals or tolerance intervals.
PROCEDURE
Step 1. Determine the least significant value of any data point. That
is, determine whether the data were reported to the nearest 10 ppm, nearest 1
ppm, nearest 100 ppm, etc. Denote this value by 2R.
Step 2. The data are assumed to have been rounded to the nearest 2R, so
each observation is actually the reported value ±R. Assuming that the obser-
vations were identical because of rounding, the variance is estimated to be
R²/3, assuming the uniform distribution for the rounding error. This gives
the estimated standard deviation as

    S' = R/√3

Step 3. Take this estimated value from Step 2 and use it as the estimate
of the standard deviation in the appropriate parametric procedure. That is,
replace S by S'.
EXAMPLE
In calculating a confidence interval for a single compliance well, sup-
pose that four observations were taken during a sampling period and all
resulted in 590 ppm. There is no variance among the four values 590, 590,
590, and 590.
Step 1. Assume that each of the values 590 came from rounding the con-
centration to the nearest 10 ppm. That is, 590 could actually be any value
between 585 and 595. Thus, 2R is 10 ppm, so R is 5 ppm.
Step 2. The estimate of the standard deviation is

    S' = 5/√3 = 5/1.732 = 2.89 ppm

Step 3. Use S' = 2.89 and X = 590 to calculate the confidence interval
(see Section 5.2.1) for the mean concentration from this well. This gives

    590 ± (4.541)(2.89/√4) = (583.4, 596.6)

as the 98% confidence interval for the average concentration. Note that 4.541
is the 99th percentile from the t-distribution (Table 6, Appendix B) with 3
degrees of freedom since the sample size was 4.
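This adjustment is easily scripted (illustration only; the reported values, reporting unit, and t-value are those of the example above, and the names are invented for this sketch).

    # Sketch of the limited-variance adjustment: when all reported values are
    # identical, the rounding half-width R gives S' = R / sqrt(3), which then
    # replaces S in the usual confidence interval.
    import math

    reported = [590.0, 590.0, 590.0, 590.0]
    reporting_unit = 10.0                 # data reported to the nearest 10 ppm (this is 2R)
    R = reporting_unit / 2.0

    s_prime = R / math.sqrt(3.0)          # about 2.89 ppm
    n = len(reported)
    mean = sum(reported) / n
    t_99 = 4.541                          # tabulated 99th percentile, 3 degrees of freedom
    half_width = t_99 * s_prime / math.sqrt(n)
    print(f"S' = {s_prime:.2f} ppm; 98% confidence interval = ({mean - half_width:.1f}, {mean + half_width:.1f})")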
INTERPRETATION
When identical results are obtained from several different samples, the
interpretation is that the data are not reported to enough significant figures
to show the random differences. The data are regarded as having resulted from
rounding more precise results to the reported observations. The rounding is
assumed to result in variability that follows the uniform distribution on the
range ±R, where 2R is the smallest unit of reporting. This assumption is used
to calculate a standard deviation for the observations that otherwise appear
to have no variability.
REMARK
Assuming that the data are reported correctly to the units indicated,
other distributions for the rounding variability could be assumed. The max-
imum standard deviation that could result from rounding when the observation
is ±R is the value R.
SECTION 6
CONTROL CHARTS FOR INTRA-WELL COMPARISONS
The previous sections covered various situations where the compliance
well data were compared to the background data to detect possible contamina-
tion. This section discusses the case where the level of each constituent
within a single well is being monitored over time. In essence, the data for
each constituent in each well are plotted on a time scale and inspected for
obvious features such as trends or sudden changes in concentration levels.
The method suggested here is a combined Shewhart-CUSUM control chart for each
well and constituent.
The control chart method is recommended when data comprising at least
eight independent samples over a one-year period are available. This require-
ment is specified under current RCRA regulations and applies to each constit-
uent in each well.
As discussed in Section 2, a common sampling plan will obtain four inde-
pendent samples from each well on a semi-annual basis. With this plan a con-
trol chart can be implemented when one year's data are available. As a result
of Monte Carlo simulations, Starks (1988) recommended at least four sampling
periods at a unit of eight or more wells, and at least eight sampling periods
at a unit with fewer than four wells.
The use of control charts can be an effective technique for monitoring
the levels of a constituent at a given well over time. It also provides a
visual means of detecting deviations from a "state of control." It is clear
that plotting of the data is an important part of the analysis process. Plot-
ting is an easy task, although time-consuming if many data sets need to be
plotted. Advantage should be taken of graphics software, since plotting of
time series data will be an ongoing process. New data points will be added to
the already existing data base each time new data are available. The follow-
ing few sections discuss, in general terms, the advantages of plotting
time series data; the corrective steps one could take to adjust when season-
ality in the data is present; and finally, the detailed procedure for con-
structing a Shewhart-CUSUM control chart, along with a demonstration of that
procedure.
6.1 ADVANTAGES OF PLOTTING DATA
While analyzing the data by means of any of the appropriate statistical
procedures discussed in earlier sections is recommended, it is also recom-
mended to plot the data. Each data point should be plotted against time using
a time scale (e.g., month, quarter). A plot should be generated for each
6-1

-------
constituent measured in each well. For visual comparison purposes, the scale
should be kept identical from well to well for a given constituent.
Another important application of the plotting procedure is for detecting
possible trends or drifts in the data from a given well. Furthermore, when
visually comparing the plots from several wells within a unit, possible con-
tamination of one rather than all downgradient wells could be detected, which
would then warrant a closer look at that well. In general, graphs can provide
highly effective illustrations of the time series, allowing the analyst to
obtain a much greater feeling for the data. Seasonal fluctuations or sudden
changes, for example, may become quite evident, thereby supporting the analyst
in his/her decision of which statistical procedure to use. General upward or
downward trends, if present, can be detected and the analyst can follow up
with a test for trend, such as the nonparametric Mann-Kendall test (Mann,
1945; Kendall, 1975). If, in addition, seasonality is suspected, the user can
perform the seasonal Kendall test for trend developed by Hirsch et al.
(1982). The reader is also referred to Chapters 16 and 17 of Gilbert's
"Statistical Methods for Environmental Pollution Monitoring," 1987. In any of
the above cases, the help of a professional statistician is recommended.
Another important use of data plots is that of identifying unusual data
points (e.g., outliers). These points should then be investigated for pos-
sible QC problems, data entry errors, or whether they are truly outliers.
Many software packages are available for computer graphics, developed for
mainframes, mini-, or microcomputers. For example, SAS features an easy-to-
use plotting procedure, PROC PLOT; where the hardware and software are avail-
able, a series of more sophisticated plotting routines can be accessed through
SAS/GRAPH. On microcomputers, most users have their own favorite graphics
software, and no recommendation will be made as to the most appropriate one.
The plots shown in this document were generated in LOTUS 1-2-3.
Once the data for each constituent and each well are plotted, the plots
should be examined for seasonality, and a correction is recommended should
seasonality be present. A fairly simple-to-use procedure for deseasonalizing
data is presented in the following paragraphs.
6.2 CORRECTING FOR SEASONALITY
A necessary precaution before constructing a control chart is to take
into account seasonal variation of the data, to minimize the chance of mistak-
ing seasonal effects for evidence of well contamination. If seasonality is
present, then deseasonalizing the data prior to using the combined Shewhart-
CUSUM control chart procedure is recommended.
Many approaches to deseasonalize data exist. If the seasonal pattern is
regular, it may be modeled with a sine or cosine function. Moving averages
can be used, or differences (of order 12 for monthly data, for example) can be
used. However, time series models may include rather complicated methods for
deseasonalizing the data. Another, simpler method exists which should be ade-
quate for the situations described in this document. It has the advantage of
being easy to understand and apply, and of providing natural estimates of the
monthly or quarterly effects via the monthly or quarterly means. The method
proposed here can be applied to any seasonal cycle—typically an annual cycle
for monthly or quarterly data.
NOTE
Corrections for seasonality should be used with great caution as they
represent extrapolation into the future. There should be a good scientific
explanation for the seasonality as well as good empirical evidence for the
seasonality before corrections are made. Larger than average rainfalls for
two or three Augusts in a row do not justify the belief that there will
never be a drought in August, and this idea extends directly to ground-water
quality. In addition, the quality (bias, robustness, and variance) of the
estimates of the proper corrections must be considered even in cases where
corrections are called for. If seasonality is suspected, the user might want
to seek the help of a professional statistician.
PURPOSE
When seasonality is known to exist in a time series of concentrations,
the data should be deseasonalized prior to constructing control charts in
order to take into account seasonal variation, rather than mistaking seasonal
effects for evidence of contamination.
PROCEDURE
The following instructions to adjust a time series for seasonality are
based on monthly data with a yearly cycle. The procedure can be easily modi-
fied to accommodate a yearly cycle of quarterly data.
Assume that N years of monthly data are available. Let x(i,j) denote the
unadjusted observation for the ith month during the jth year.
Step 1. Compute the average concentration for month i over the N-year
period:

    X(i) = [x(i,1) + . . . + x(i,N)]/N

This is the average of all observations taken in different years but during
the same month. That is, calculate the mean concentration for January, then
the mean for February, and so on for each of the 12 months.
Step 2. Calculate the grand mean, X, of all N×12 observations.
Step 3. Compute the adjusted concentrations

    z(i,j) = x(i,j) - (X(i) - X)
Subtracting the monthly mean X(i) removes the average effect of month i from the monthly
data, and adding X, the overall mean, places the adjusted z(i,j) values about the
same mean, X. It follows that the overall mean of the adjusted observations, Z,
equals the overall mean of the unadjusted values, X.
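The procedure can be sketched as follows (illustration only; the function and data layout are invented for this sketch, with the Table 6-1 values from the example that follows used as input).

    # Sketch of the deseasonalizing procedure for monthly data with a yearly
    # cycle: subtract the monthly mean and add the grand mean.
    def deseasonalize(monthly_data):
        """monthly_data[j][i] is the observation for month i (0-11) in year j."""
        n_years = len(monthly_data)
        n_months = len(monthly_data[0])

        # Step 1: average concentration for each month across the N years.
        month_means = [sum(year[i] for year in monthly_data) / n_years
                       for i in range(n_months)]
        # Step 2: grand mean of all N * 12 observations.
        grand_mean = sum(sum(year) for year in monthly_data) / (n_years * n_months)
        # Step 3: adjusted value z(i,j) = x(i,j) - (monthly mean - grand mean).
        return [[x - (month_means[i] - grand_mean) for i, x in enumerate(year)]
                for year in monthly_data]

    # Unadjusted Table 6-1 concentrations, one row per year (1983, 1984, 1985).
    data = [
        [1.99, 2.10, 2.12, 2.12, 2.11, 2.15, 2.19, 2.18, 2.16, 2.08, 2.05, 2.08],
        [2.01, 2.10, 2.17, 2.13, 2.13, 2.18, 2.25, 2.24, 2.22, 2.13, 2.08, 2.16],
        [2.15, 2.17, 2.27, 2.23, 2.24, 2.26, 2.31, 2.32, 2.28, 2.22, 2.19, 2.22],
    ]
    adjusted = deseasonalize(data)
    print(round(adjusted[0][0], 2))   # January 1983 adjusted value, about 2.11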
EXAMPLE
Columns 2 through 4 of Table 6-1 show monthly unadjusted concentrations
of a fictitious analyte over a 3-year period.
TABLE 6-1. EXAMPLE COMPUTATION FOR DESEASONALIZING DATA

                 Unadjusted concentrations    3-year      Monthly adjusted concentrations
                  1983     1984     1985      average       1983     1984     1985

  January         1.99     2.01     2.15       2.05         2.10     2.13     2.27
  February        2.10     2.10     2.17       2.12         2.14     2.15     2.21
  March           2.12     2.17     2.27       2.19         2.10     2.15     2.25
  April           2.12     2.13     2.23       2.16         2.13     2.14     2.24
  May             2.11     2.13     2.24       2.16         2.12     2.13     2.25
  June            2.15     2.18     2.26       2.20         2.12     2.15     2.23
  July            2.19     2.25     2.31       2.25         2.11     2.16     2.23
  August          2.18     2.24     2.32       2.25         2.10     2.16     2.24
  September       2.16     2.22     2.28       2.22         2.11     2.17     2.22
  October         2.08     2.13     2.22       2.14         2.10     2.16     2.24
  November        2.05     2.08     2.19       2.11         2.11     2.14     2.25
  December        2.08     2.16     2.22       2.16         2.09     2.17     2.23

  Overall 3-year average = 2.17
Step 1. Compute the monthly averages across the 3 years. These values
are shown in the fifth column of Table 6-1.
Step 2. The grand mean over the 3-year period is calculated to be 2.17.
Step 3. Within each month and year, subtract the average monthly
concentration for that month and add the grand mean. For example, for January
1983, the adjusted concentration becomes
1.99 - 2.05 + 2.17 » 2.11
The adjusted concentrations are shown in the last three columns of Table 6-1.
The reader can check that the average of all 36 adjusted concentrations
equals 2.17, the average unadjusted concentration. Figure 6-1 shows the plot
of the unadjusted and adjusted data. The raw data clearly exhibit seasonality
as well as an upward trend that is less evident from simply looking at the
data table.
INTERPRETATION
As can be seen in Figure 6-1, seasonal effects were present in the
data. After adjusting for monthly effects, the seasonality was removed as can
be seen in the adjusted data plotted in the same figure.
6.3 COMBINED SHEWHART-CUSUM CONTROL CHARTS FOR EACH WELL AND CONSTITUENT
Control charts are widely used as a statistical tool in industry as well
as research and development laboratories. The concept of control charts is
relatively simple, which makes them attractive to use. From the population
distribution of a given variable, such as concentrations of a given constit-
uent, repeated random samples are taken at intervals over time. Statistics,
for example the mean of replicate values at a point in time, are computed and
plotted together with upper and/or lower predetermined limits on a chart where
the x-axis represents time. If a result falls outside these boundaries, then
the process is declared to be "out of control"; otherwise, the process is
declared to be "in control." The widespread use of control charts is due to
their ease of construction and the fact that they can provide a quick visual
evaluation of a situation, and remedial action can be taken, if necessary.
In the context of ground water monitoring, control charts can be used to
monitor the inherent statistical variation of the data collected and to flag
anomalous results. Further investigation of data points lying outside the
established boundaries will be necessary before any direct action is taken.
A control chart that can be used on a real time basis must be constructed
from a data set large enough to characterize the behavior of a specific
well. It is recommended that data from a minimum of eight samples within a
year be collected for each constituent at each well to permit an evaluation of
the consistency of monitoring results with the current concept of the hydro-
geology of the site. Starks (1988) recommends a minimum of four sampling
periods at a unit with eight or more wells and a minimum of eight sampling
periods at a unit with fewer than four wells. Once the control chart for the
specific constituent at a given well is acceptable, subsequent data points can
be plotted on it to provide a quick evaluation of whether the process is in
control.
The standard assumptions in the use of control charts are that the data
generated by the process, when it is in control, are independently and
normally distributed with a fixed mean μ and constant variance σ². The most
important assumption is that of independence; control charts are not robust
with respect to departures from independence (e.g., serial correlation). In
general, the sampling scheme will be such that the possibility of obtaining
serially correlated results is minimized, as noted in Section 2.
The assumption of normality is of somewhat less concern, but should be
investigated before plotting the charts.
[Figure 6-1. Time series of monthly observations (unadjusted, adjusted, and
3-year mean), January 1983 through September 1985: plot of the unadjusted and
seasonally adjusted monthly observations.]
A transformation (e.g., log-transform, square
root transform) can be applied to the raw data so as to obtain errors normally
distributed about the mean. An additional situation which may decrease the
effectiveness of control charts is seasonality in the data. The problem of
seasonality can be handled by removing the seasonality effect from the data,
provided that sufficient data to cover at least two seasons of the same type
are available (e.g., 2 years of data when there is an annual cycle of monthly or quarterly seasonal effects). A
procedure to correct a time series for seasonality was shown above in
Section 6.2.
PURPOSE
Combined Shewhart-cumulative sum (CUSUM) control charts are constructed
for each constituent at each well to provide a visual tool for detecting both
trends and abrupt changes in concentration levels.
PROCEDURE
Assume that data from at least eight independent monitoring samples are
available to provide reliable estimates of the mean, μ, and standard
deviation, σ, of the constituent's concentration levels in a given well.
Step 1. To construct a combined Shewhart-CUSUM chart, three parameters
need to be selected prior to plotting:
h = a decision interval value
k = a reference value
SCL = the Shewhart control limit (denoted by U in Starks (1988))
The parameter k of the CUSUM scheme is obtained directly from the value,
D, of the displacement that should be quickly detected; k = D/2. It is recom-
mended to select k = 1, which will allow a displacement of two standard devia-
tions to be detected quickly.
When k is selected to be 1, the parameter h is usually set at a value of 4
or 5. The parameter h is the value against which the cumulative sum in the
CUSUM scheme will be compared. In the context of groundwater monitoring, a
value of h = 5 is recommended (Starks, 1988; Lucas, 1982).
The upper Shewhart limit is set at SCL = 4.5 in units of standard devia-
tion. This combination of k = 1, h = 5, and SCL = 4.5 was found most appro-
priate for the application of combined Shewhart-CUSUM charts to groundwater
monitoring (Starks, 1988).
Step 2. Assume that at time period T_i, n_i concentration measurements,
X_1, ..., X_ni, are available. Compute their average, X̄_i.
Step 3. Calculate the standardized mean
    Z_i = (X̄_i - μ)/σ
where μ and σ are the mean and standard deviation obtained from prior monitor-
ing at the same well (at least four sampling periods in a year).
Step 4. Compute the quantity
    S_i = max {0, Z_i - k + S_(i-1)}
where max {A, B} is the maximum of A and B. At sampling period T_1, S_0 = 0.
Step 5. Plot the values of S_i and Z_i versus T_i on a time chart for this com-
bined Shewhart-CUSUM scheme. Declare an "out-of-control" situation at sam-
pling period T_i if, for the first time, S_i > h or Z_i > SCL. This will indicate
probable contamination at the well, and further investigation will be
necessary.
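For illustration only, Steps 2 through 5 can be scripted as follows. This is a
minimal Python sketch, not part of the guidance; the function name
shewhart_cusum is ours, the default parameters are the recommended k = 1,
h = 5, and SCL = 4.5, and mu and sigma are assumed to come from prior
monitoring at the well.

    def shewhart_cusum(xbars, mu, sigma, k=1.0, h=5.0, scl=4.5):
        """Combined Shewhart-CUSUM scheme for a series of sampling-period means.

        Returns one (Z_i, S_i, out_of_control) triple per sampling period.
        """
        results = []
        s_prev = 0.0                          # S_0 = 0
        for xbar in xbars:
            z = (xbar - mu) / sigma           # Step 3: standardized mean
            s = max(0.0, z - k + s_prev)      # Step 4: cumulative sum
            results.append((round(z, 2), round(s, 2), s > h or z > scl))
            s_prev = s
        return results

    # Monthly means from Table 6-2; an out-of-control signal first appears at period 8
    means = [4.28, 5.21, 4.77, 4.71, 6.16, 5.84, 6.53, 7.38, 6.43, 6.29, 6.21, 6.20]
    for period, row in enumerate(shewhart_cusum(means, mu=5.5, sigma=0.4), start=1):
        print(period, row)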
REFERENCES
Lucas, J. M. 1982. "Combined Shewhart-CUSUM Quality Control Schemes." Jour-
nal of Quality Technology. Vol. 14, pp. 51-59.
Starks, T. H. 1988 (Draft). "Evaluation of Control Chart Methodologies for
RCRA Waste Sites."
Hockman, K. K., and J. M. Lucas. 1987. "Variability Reduction Through Sub-
vessel CUSUM Control." Journal of Quality Technology. Vol. 19, pp. 113-121.
EXAMPLE
The procedure is demonstrated on a set of carbon tetrachloride measure-
ments taken monthly at a compliance well over a 1-year period. The monthly
means are presented in the second column of Table 6-2 below. Estimates of μ
and σ, the mean and standard deviation of carbon tetrachloride measurements at
that particular well, were obtained from a preceding monitoring period at that
well: μ = 5.5 µg/L and σ = 0.4 µg/L.
Step 1. The three parameters necessary to construct a combined
Shewhart-CUSUM chart were selected as h = 5, k = 1, and SCL = 4.5 in units of
standard deviation.
Step 2. The monthly means are presented in Table 6-2.
Step 3. Standardize the mean observations within each sampling
period. These computations are shown in the third column of Table 6-2. For
example, Z_1 = (4.28 - 5.50)/0.4 = -3.05.
TABLE 6-2. EXAMPLE DATA FOR COMBINED SHEWHART-CUSUM CHART--
CARBON TETRACHLORIDE CONCENTRATION (µg/L)

           Sampling    Mean                  Standardized
  Date     period Ti   concentration, X̄i     Zi              Zi - k     CUSUM, Si
  Jan 6        1          4.28                 -3.05          -4.05       0
  Feb 3        2          5.21                 -0.73          -1.73       0
  Mar 3        3          4.77                 -1.83          -2.83       0
  Apr 7        4          4.71                 -1.98          -2.98       0
  May 5        5          6.16                  1.65           0.65       0.65
  Jun 2        6          5.84                  0.85          -0.15       0.50
  Jul 7        7          6.53                  2.58           1.58       2.08
  Aug 4        8          7.38                  4.70           3.70       5.78 a,b
  Sep 1        9          6.43                  2.32           1.32       7.10 b
  Oct 6       10          6.29                  1.98           0.98       8.08 b
  Nov 3       11          6.21                  1.78           0.77       8.85 b
  Dec 1       12          6.20                  1.75           0.75       9.60 b

  Parameters: mean = 5.50; std = 0.4; k = 1; h = 5; SCL = 4.5.
  a Indicates an "out-of-control" process via the Shewhart control limit (SCL).
  b Indicates a CUSUM "out-of-control" signal.
Step 4. Compute the quantities S_i, i = 1, ..., 12. For example,
    S_1 = max {0, -4.05 + 0} = 0
    S_2 = max {0, -1.73 + 0} = 0
    S_5 = max {0, 0.65 + S_4} = max {0, 0.65 + 0} = 0.65
    S_6 = max {0, -0.15 + 0.65} = max {0, 0.50} = 0.50
etc.
These quantities are shown in the last column of Table 6-2.
Step 5. Construct the control chart. The y-axis is in units of stan-
dard deviation. The x-axis represents time, or the sampling periods. For
each sampling period, T_i, record the values of Z_i and S_i. Draw horizontal
lines at the values h = 5 and SCL = 4.5. These two lines represent the upper con-
trol limit for the CUSUM scheme and the Shewhart control limit, respec-
tively. The chart for this example data set is shown in Figure 6-2.
[Figure 6-2. Combined Shewhart-CUSUM chart (mean = 5.5; std = 0.4; k = 1;
h = 5; SCL = 4.5): standardized mean and CUSUM plotted against sampling
period, with horizontal control lines at h and SCL.]
The combined chart indicates possible contamination starting at sampling
period T8. Both the CUSUM scheme and the Shewhart control limit were exceeded
by S_8 and Z_8, respectively. An investigation should begin to confirm
contamination, and action should be taken to bring the variability of the data
back to its previous level.
INTERPRETATION
The combined Shewhart-CUSUM control scheme was applied to an example data
set of carbon tetrachloride measurements taken on a monthly basis at a well.
The statistic used in the construction of the chart was the mean measurement
per sampling period. (It should be noted that this method can be used on an
individual measurement as well.) Estimates of the mean and standard deviation
of the measurements were available from previous data collected at that well
over at least four sampling periods.
The parameters of the combined chart were selected to be k = 1 unit, the
reference value or allowable slack for the process; h = 5 units, the decision
interval for the CUSUM scheme; and SCL = 4.5 units, the upper Shewhart control
limit. All parameters are in units of σ, the standard deviation obtained from
the previous monitoring results. Various combinations of parameter values can
be selected. The particular values recommended here appear to be the best for
the initial use of the procedure from a review of the simulations and recom-
mendations in the references. A discussion of this subject is given by Lucas
(1982), Starks (1988), and Hockman and Lucas (1987). The choice of the param-
eters h and k of a CUSUM chart is based on the desired performance of the
chart. The criterion used to evaluate a control scheme is the average number
of samples or time periods before an out-of-control signal is obtained. This
criterion is denoted the ARL, or average run length. The ARL should be large
when the mean concentration of a hazardous constituent is near its target
value and small when the mean has shifted too far from the target. Tables
have been developed by simulation methods to estimate ARLs for given combina-
tions of the parameters (Lucas, 1982; Starks, 1988; Hockman and Lucas, 1987).
The user is referred to these articles for further reading.
6.4 UPDATE OF A CONTROL CHART
The control chart is based on preselected performance parameters as well
as on estimates of μ and σ, the parameters of the distribution of the measure-
ments in question. As monitoring continues and the process is found to be in
control, these parameters need periodic updating so as to incorporate this new
information into the control charts. Starks (1988) has suggested that, in
general, adjustments in sample means and standard deviations be made after
sampling periods 4, 8, 12, 20, and 32, following the initial monitoring period
recommended to be at least eight sampling periods. Also, the performance
parameters h, k, and SCL would need to be updated. The author suggests that
h = 5, k = 1, and SCL = 4.5 be kept at those values for the first 12 sampling
periods following the initial monitoring plan, and that k be reduced to 0.75
and SCL to 4.0 for all subsequent sampling periods. These values and sampling
period numbers are not mandatory. In the event of an out-of-control state or
a trend, the control chart should not be updated.
6.5 NONDETECTS IN A CONTROL CHART
Regulations require that four independent water samples be taken at each
well at a given sampling period. It is the mean of the four concentration
measurements of a particular constituent that is used in the construction of a
control chart. Now situations will arise when the concentration of a constit-
uent is below detection limit for one or more samples. The following approach
is suggested for treating nondetects when plotting control charts.
If only one of the four measurements is a nondetect, then replace it with
one half of the detection limit (MDL/2) or with one half of the practical
quantitation limit (PQL/2) and proceed as described in Section 6.3.
If either two or three of the measurements are nondetects, use only the
quantitated values (two or one, respectively) for the control chart and pro-
ceed as discussed earlier in Section 6.3.
If all four measurements are nondetects, then use one half of the detec-
tion limit or practical quantitation limit as the value for the construction
of the control chart. In this situation there is clearly no evidence of
contamination at the well.
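The three replacement rules above can be collected in one small routine. The
sketch below is illustrative only; it assumes that nondetects are coded as
None and that a single detection limit (or practical quantitation limit), dl,
applies to the four measurements.

    def control_chart_value(measurements, dl):
        """Value to plot for one sampling period of four measurements (Section 6.5 rules)."""
        detects = [m for m in measurements if m is not None]
        n_nd = 4 - len(detects)
        if n_nd == 0:
            return sum(detects) / 4.0                   # no nondetects: ordinary mean
        if n_nd == 1:
            return (sum(detects) + dl / 2.0) / 4.0      # one nondetect replaced by DL/2
        if n_nd in (2, 3):
            return sum(detects) / len(detects)          # use only the quantified values
        return dl / 2.0                                 # all four nondetects: plot DL/2

    # e.g. control_chart_value([0.12, None, 0.08, 0.10], dl=0.02)
    #      = (0.12 + 0.01 + 0.08 + 0.10)/4 = 0.0775   (hypothetical values)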
In the event that a control chart requires updating and a certain propor-
tion of the measurements is below the detection limit, adjust the mean and
standard deviation needed for the control chart by using Cohen's method,
described in Section 7.1.3. In that case, the proportion of nondetects
applies to the pool of data available at the time of the updating and would
include all nondetects up to that time, not just the four measurements taken
at the last sampling period.
SECTION 7
MISCELLANEOUS TOPICS
This section contains a variety of special topics that are relatively
short and self-contained. These topics include methods to deal with data
below the limit of detection and methods to check for, and deal with, outliers
or extreme values in the data.
7.1 LIMIT OF DETECTION
In a chemical analysis some compounds may be below the detection limit
(DL) of the analytical procedure. These are generally reported as not
detected (rather than as zero or not present) and the appropriate limit of
detection is usually given. Data that include not detected results are a
special case referred to as censored data in the statistical literature. For
compounds not detected, the concentration of the compound is not known.
Rather, it is known that the concentration of the compound is less than the
detection limit.
There are a variety of ways to deal with data that include values below
detection. There is no general procedure that is applicable in all cases.
However, there are some general guidelines that usually prove adequate. If
these do not cover a specific situation, the user should consult a profes-
sional statistician for the most appropriate way to deal with the values below
detection.
A summary of suggested approaches to deal with data below the detection
limit is presented in Table 7-1. The method suggested depends on the amount
of data below the detection limit. For small amounts of below-detection
values, simply replacing an "ND" (not detected) report with a small number, say
the detection limit divided by two, and proceeding with the usual analysis is
satisfactory. For moderate amounts of below-detection-limit data, a more
detailed adjustment is appropriate, while for large amounts one may need to
consider only whether a compound was detected or not as the variable of
analysis.
The meaning of small, moderate, and large above is subject to judgment.
Table 7-1 contains some suggested values. It should be recognized that these
values are not hard and fast rules, but are based on judgment. If there is a
question about how to handle values below detection, consult a statistician.
TABLE 7-1. METHODS FOR BELOW DETECTION LIMIT VALUES

  Percentage of nondetects     Statistical analysis                Section of
  in the data base             method                              guidance document

  Less than 15%                Replace NDs with MDL/2 or PQL/2,    Section 7.1.1
                               then proceed with parametric
                               procedures:
                                 •  ANOVA                          Section 4.2.1
                                 •  Tolerance limits               Section 4.3
                                 •  Prediction intervals           Section 4.4
                                 •  Control charts                 Section 6

  Between 15% and 50%          Use NDs as ties, then proceed       Section 4.2.2
                               with nonparametric ANOVA, or
                               use Cohen's adjustment,             Section 7.1.3
                               then proceed with:
                                 •  Tolerance limits               Section 4.3
                                 •  Confidence intervals           Section 5.2.1
                                 •  Control charts                 Section 6

  More than 50%                Test of proportions                 Section 7.1.2
It should be noted that the nonparametric methods presented earlier auto-
matically deal with values below detection by regarding them as all tied at a
level below any quantitated results. The nonparametric methods may be used if
there is a moderate amount of data below detection. If the proportion of non-
quantified values in the data exceeds 25%, these methods should be used with
caution. They should probably not be used if less than half of the data con-
sists of quantified concentrations.
7.1.1 The DL/2 Method
The amount of data that are below detection plays an important role in
selecting the method to deal with the limit of detection problem. If a small
proportion of the observations are not detected, these may be replaced with a
small number, usually the method detection limit divided by 2 (MDL/2), and the
usual analysis performed. This is the recommended method for use with the
analysis of variance procedure of Section 4.2.1. Seek professional help if in
doubt about dealing with values below the detection limit. The results of the
analysis are generally not sensitive to the specific choice of the replacement
number.
As a guideline, if 15% or fewer of the values are not detected, replace
them with the method detection limit divided by two and proceed with the
appropriate analysis using these modified values. Practical quantitation
limits (PQL) for Appendix IX compounds were published by EPA in the Federal
Register (Vol. 52, No. 131, July 9, 1987, pp. 25947-25952). These give practical
quantitation limits by compound and analytical method that may be used in
replacing a small amount of nondetected data with the quantitation limit
divided by 2. If approved by the Regional Administrator, site-specific PQLs
may be used in this procedure. If more than 15% of the values are reported as
not detected, it is preferable to use a nonparametric method or a test of pro-
portions.
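The substitution described above is a one-line operation. The sketch below is
illustrative only; nondetects are assumed to be coded as None, and mdl stands
for whichever limit (MDL or PQL) applies.

    def replace_nondetects(values, mdl):
        """Replace nondetects (coded as None) with MDL/2 before a parametric analysis.

        Intended for data sets with roughly 15% nondetects or fewer (Table 7-1).
        """
        return [mdl / 2.0 if v is None else v for v in values]

    # e.g. replace_nondetects([0.12, None, 0.08], mdl=0.01) returns [0.12, 0.005, 0.08]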
7.1.2 Test of Proportions
If more than 50% of the data are below detection but at least 10% of the
observations are quantified, a test of proportions may be used to compare the
background well data with the compliance well data. Clearly, if none of the
background well observations were above the detection limit, but all of the
compliance well observations were above the detection limit, one would suspect
contamination. In general the difference may not be as obvious. However, a
higher proportion of quantitated values in compliance wells could provide evi-
dence of contamination. The test of proportions is a method to determine
whether a difference in the proportion of detected values in the background well
observations and compliance well observations provides statistically signifi-
cant evidence of contamination.
The test of proportions should be used when the proportion of quantified
values is small to moderate. If very few quantified values are found, a
method based on the Poisson distribution may be used as an alternative
approach. A method based on a tolerance limit for the number of detected
compounds and the maximum concentration found for any detected compound has
been proposed by Gibbons (1988). This alternative would be appropriate when
the number of detected compounds is quite small relative to the number of
compounds analyzed for, as might occur in detection monitoring.
PURPOSE
The test of proportions tests whether the proportion of compounds
detected in the compliance well data differs significantly from the proportion
of compounds detected in the background well data. If there is a significant
difference, this is evidence of contamination.
PROCEDURE
The procedure uses the normal distribution approximation to the binomial
distribution. This assumes that the sample size is reasonably large. Gener-
ally, if the proportion of detected values is denoted by P, and the sample
size is n, then the normal approximation is adequate, provided that nP and
n(1-P) both are greater than or equal to 5.
Step 1. Determine X, the number of background well samples in which the
compound was detected. Let n be the total number of background well samples
analyzed. Compute the proportion of detects:
    Pu = x/n
Step 2. Determine Y, the number of compliance well samples in which the
compound was detected. Let M be the total number of compliance well samples
analyzed. Compute the proportion of detects:
    Pd = y/m
Step 3. Compute the standard error of the difference in proportions:
    SD = {[(x+y)/(n+m)][1 - (x+y)/(n+m)][1/n + 1/m]}^1/2
and form the statistic:
    Z = (Pd - Pu)/SD
Step 4. Compare the absolute value of Z to the 97.5th percentile from
the standard normal distribution, 1.96. If the absolute value of Z exceeds
1.96, this provides statistically significant evidence at the 5% significance
level that the proportion of compliance well samples where the compound was
detected exceeds the proportion of background well samples where the compound
was detected. This would be interpreted as evidence of contamination. (The
two-sided test is used to provide information about differences in either
direction.)
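Steps 1 through 4 amount to a short calculation. The sketch below is a Python
illustration, not part of the guidance; the function name proportion_test is
ours, and the inputs are the counts of detected values and sample sizes for
the two groups of wells.

    import math

    def proportion_test(x, n, y, m):
        """Two-sided test of proportions for detected values (Steps 1-4).

        x, n: detects and sample size at background wells;
        y, m: detects and sample size at compliance wells.
        """
        pu = x / n                                    # Step 1: background proportion
        pd = y / m                                    # Step 2: compliance proportion
        p_pool = (x + y) / (n + m)
        sd = math.sqrt(p_pool * (1.0 - p_pool) * (1.0 / n + 1.0 / m))   # Step 3
        z = (pd - pu) / sd                            # Step 4: test statistic
        return z, abs(z) > 1.96

    # Cadmium example of Table 7-2: 8 of 24 background and 24 of 64 compliance detects.
    # Z is about 0.36 (0.37 with the rounded proportions used in the text); not significant.
    print(proportion_test(8, 24, 24, 64))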
EXAMPLE
Table 7-2 contains data on cadmium concentrations measured in background
well and compliance wells at a facility. In the table, "BDL" is used for
below detection limit.
TABLE 7-2. EXAMPLE DATA FOR A TEST OF PROPORTIONS

Cadmium concentration (µg/L) at background well (24 samples):
  0.1     BDL*    0.12    BDL     BDL     BDL     0.26    BDL
  BDL     0.1     BDL     0.014   BDL     BDL     BDL     BDL
  BDL     BDL     0.12    BDL     0.21    BDL     0.12    BDL

Cadmium concentration (µg/L) at compliance wells (64 samples):
  0.12    BDL     0.024   0.08    BDL     BDL     BDL     BDL
  BDL     0.2     0.11    BDL     BDL     0.06    BDL     0.1
  BDL     BDL     BDL     0.23    0.1     0.012   BDL     0.04
  BDL     0.11    BDL     BDL     BDL     BDL     BDL     0.031
  0.1     BDL     BDL     BDL     BDL     BDL     0.01    0.12
  BDL     BDL     0.07    BDL     BDL     BDL     BDL     BDL
  0.19    0.12    BDL     BDL     0.08    BDL     0.1     BDL
  BDL     0.26    0.01    BDL     BDL     0.02    BDL     BDL

* BDL means below detection limit.
Step 1. Estimate the proportion above detection in the background
wells. As shown in Table 7-2, there were 24 samples from background wells
analyzed for cadmium, so n = 24. Of these, 16 were below detection and x = 8
were above detection, so Pu = 8/24 = 0.333.
Step 2. Estimate the proportion above detection in the compliance
wells. There were 64 samples from compliance wells analyzed for cadmium, with
40 below detection and 24 detected values. This gives m = 64, y = 24, so
Pd = 24/64 = 0.375.
Step 3. Calculate the standard error of the difference.
    SD = {[(8+24)/(24+64)][1 - (8+24)/(24+64)][1/24 + 1/64]}^1/2 = 0.115
Step 4. Form the statistic Z and compare it to the normal
distribution.
    Z = (0.375 - 0.333)/0.115 = 0.37
which is less in absolute value than the value from the normal distribution,
1.96. Consequently, there is no significant evidence that the proportion of
samples with cadmium levels above the detection limit differs in the
background well and compliance well samples.
INTERPRETATION
Since the proportion of water samples with detected amounts of cadmium in
the compliance wells was not significantly different from that in the
background wells, the data are interpreted to provide no evidence of contam-
ination. Had the proportion of samples with detectable levels of cadmium in
the compliance wells been significantly higher than that in the background
wells this would have been evidence of contamination. Had the proportion been
significantly higher in the background wells, additional study would have been
required. This could indicate that contamination was migrating from an off-
site source, or it could mean that the hydraulic gradient had been incorrectly
estimated or had changed and that contamination was occurring from the facil-
ity, but the ground-water flow was not in the direction originally estimated.
Mounding of contaminants in the ground water near the background wells could
also be a possible explanation for this observation.
7.1.3 Cohen's Method
If a confidence interval or a tolerance interval based upon the normal
distribution is being constructed, a technique presented by Cohen (1959)
specifies a method to adjust the sample mean and sample standard deviation to
account for data below the detection limit. The only requirements for the use
of this technique are that the data be normally distributed and that the
detection limit always be the same. This technique is demonstrated below.
PURPOSE
Cohen's method provides estimates of the sample mean and standard devia-
tion when some observations are below detection. These estimates can then be
used to construct tolerance, confidence, or prediction intervals.
PROCEDURE
Let n be the total number of observations, m the number of data points
above the detection limit (DL), and x_i the ith value above the detection
limit.
Step 1. Compute the sample mean, x̄_d, from the data above the detection
limit as follows:
    x̄_d = (1/m)(x_1 + x_2 + ... + x_m)
Step 2. Compute the sample variance, s_d², from the data above the detection
limit as follows:
    s_d² = [Σ x_i² - (Σ x_i)²/m] / (m - 1)
where the sums run over the m values above the detection limit.
Step 3. Compute the two parameters, h and γ, as follows:
    h = (n - m)/n
and
    γ = s_d²/(x̄_d - DL)²
where n is the total number of observations (i.e., above and below the
detection limit) and DL is the detection limit.
These values are then used to determine the value of the parameter λ from
Table 7 in Appendix B.
Step 4. Estimate the corrected sample mean, which accounts for the data
below detection limit, as follows:
    x̄ = x̄_d - λ(x̄_d - DL)
Step 5. Estimate the corrected sample standard deviation, which accounts
for the data below detection limit, as follows:
    s = [s_d² + λ(x̄_d - DL)²]^1/2
Step 6. Use the modified values of x̄ and s in the procedure for con-
structing a tolerance interval (Section 4.3) or a confidence interval (Sec-
tion 5.2.1).
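Once λ has been read from Table 7, Steps 1 through 5 are a short calculation.
The sketch below is illustrative only; the function name cohen_adjusted is
ours, and λ must still be obtained from Table 7 of Appendix B (or by
interpolation, as in the example that follows).

    def cohen_adjusted(detects, n, dl, lam):
        """Cohen's adjusted mean and standard deviation (Steps 1-5).

        detects: the m quantified values; n: total observations, detected or not;
        dl: the common detection limit; lam: lambda read from Table 7 of Appendix B.
        """
        m = len(detects)
        xd = sum(detects) / m                                    # Step 1: mean of detects
        sd2 = sum((v - xd) ** 2 for v in detects) / (m - 1)      # Step 2: variance of detects
        h = (n - m) / n                                          # Step 3: fraction censored
        gamma = sd2 / (xd - dl) ** 2                             #          and gamma
        xbar = xd - lam * (xd - dl)                              # Step 4: corrected mean
        s = (sd2 + lam * (xd - dl) ** 2) ** 0.5                  # Step 5: corrected std. dev.
        return h, gamma, xbar, s

For the sulfate data of Table 7-3 (m = 21, n = 24, DL = 1,450, λ = 0.14986)
this returns h = 0.125, γ near 0.083, a corrected mean near 1,724, and a
corrected standard deviation near 155, matching the example that follows.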
REFERENCE
Cohen, A. C., Jr. 1959. "Simplified Estimators for the Normal Distribution
When Samples are Singly Censored or Truncated." Technometrics. 1:217-237.
EXAMPLE
Table 7-3 contains data on sulfate concentrations. Three observations of
the 24 were below the detection limit of 1,450 mg/L and are denoted by
"< 1,450" in the table.
TABLE 7-3. EXAMPLE DATA FOR TESTING COHEN'S TEST

Sulfate concentration (mg/L)

  1,850      1,760      < 1,450a   1,710      1,575      1,475
  1,780      1,790      1,780      < 1,450    1,790      1,800
  < 1,450    1,800      1,840      1,820      1,860      1,780
  1,760      1,800      1,900      1,770      1,790      1,780

a DL = 1,450 mg/L.
Note: A "<" symbol before a number indicates that the value is not
detected. The number following is the limit of detection.
Step 1. Calculate the mean from the m = 21 values above detection:
    x̄_d = 1,771.9
Step 2. Calculate the sample variance from the 21 quantified values:
    s_d² = 8,593.69
Step 3. Determine
    h = (24 - 21)/24 = 0.125
and
    γ = 8,593.69/(1,771.9 - 1,450)² = 0.083
Enter Table 7 of Appendix B at h = 0.125 and γ = 0.083 to determine the
value of λ. Since the table does not contain these entries exactly, double
linear interpolation was used to estimate λ = 0.14986.
REMARK
For the interested reader, the details of the double linear interpolation
are provided.
The values from Table 7 between which the user needs to interpolate are:

                  h = 0.10     h = 0.15
    γ = 0.05      0.11431      0.17935
    γ = 0.10      0.11804      0.18479

There are 0.025 units between 0.10 and 0.125 on the h-scale, and 0.05 units
between 0.10 and 0.15. Therefore, the value of interest (h = 0.125) lies
(0.025/0.05 x 100) = 50% of the distance along the interval between 0.10
and 0.15. To linearly interpolate between the tabulated values on the h-axis,
the range between the values is calculated, the value that is 50% of the
distance along the range is computed, and that value is added to the lower
tabulated value. The result is the interpolated value. The interpolated
points on the h-scale for the current example are:

    0.17935 - 0.11431 = 0.06504        0.06504 x 0.50 = 0.03252
    0.11431 + 0.03252 = 0.14683

    0.18479 - 0.11804 = 0.06675        0.06675 x 0.50 = 0.033375
    0.11804 + 0.033375 = 0.151415

On the γ-axis, there are 0.033 units between 0.05 and 0.083, and 0.05 units
between 0.05 and 0.10. The value of interest (γ = 0.083) lies
(0.033/0.05 x 100) = 66% of the distance along the interval between 0.05 and
0.10. The interpolated point on the γ-axis is:

    0.151415 - 0.14683 = 0.004585      0.004585 x 0.66 = 0.0030261
    0.14683 + 0.0030261 = 0.14986

Thus, λ = 0.14986.
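The double linear interpolation used here is ordinary bilinear interpolation
and can be checked with a short routine. The sketch below is illustrative
only; the function name and the dictionary holding the four tabulated values
are ours.

    def double_linear_interpolation(h, gamma, h_lo, h_hi, g_lo, g_hi, table):
        """Interpolate a 2 x 2 block of Table 7 at the point (h, gamma)."""
        fh = (h - h_lo) / (h_hi - h_lo)        # fraction of the distance along the h-scale
        fg = (gamma - g_lo) / (g_hi - g_lo)    # fraction of the distance along the gamma-scale
        low = table[(h_lo, g_lo)] + fh * (table[(h_hi, g_lo)] - table[(h_lo, g_lo)])
        high = table[(h_lo, g_hi)] + fh * (table[(h_hi, g_hi)] - table[(h_lo, g_hi)])
        return low + fg * (high - low)

    tab = {(0.10, 0.05): 0.11431, (0.15, 0.05): 0.17935,
           (0.10, 0.10): 0.11804, (0.15, 0.10): 0.18479}
    print(double_linear_interpolation(0.125, 0.083, 0.10, 0.15, 0.05, 0.10, tab))  # about 0.1499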
Step 5. The corrected sample mean and standard deviation are then esti-
mated as follows:
    x̄ = 1,771.9 - 0.14986(1,771.9 - 1,450) = 1,723.66
    s = [8,593.69 + 0.14986(1,771.9 - 1,450)²]^1/2 = 155.31
Step 6. These modified estimates of the mean, x̄ = 1,723.66, and of the
standard deviation, s = 155.31, would be used in the tolerance or confidence
interval procedure. For example, if the sulfate concentrations represent
background at a facility, the upper 95% tolerance limit becomes
    1,723.7 + (155.3)(2.309) = 2,082.3 mg/L
Observations from compliance wells in excess of 2,082 mg/L would give sta-
tistically significant evidence of contamination.
INTERPRETATION
Cohen's method provides maximum likelihood estimates of the mean and
variance of a censored normal distribution. That is, of observations that
follow a normal distribution except for those below a limit of detection,
which are reported as "not detected." The modified estimates reflect the fact
that the not detected observations are below the limit of detection, but not
necessarily zero. The large sample properties of the modified estimates allow
for them to be used with the normal theory procedures as a means of adjusting
for not detected values in the data. Use of Cohen's method in more compli-
cated calculations, such as those required for analysis of variance procedures,
requires special consideration from a professional statistician.
7.2 OUTLIERS
A ground-water constituent concentration value that is much different
from most other values in a data set for the same ground-water constituent
concentration can be referred to as an "outlier." Possible reasons for
outliers can be:
•  A catastrophic unnatural occurrence such as a spill;
•  Inconsistent sampling or analytical chemistry methodology that may
   result in laboratory contamination or other anomalies;
•  Errors in the transcription of data values or decimal points; and
•  True but extreme ground-water constituent concentration measure-
   ments.
There are several tests to determine if there is statistical evidence
that an observation is an outlier. The reference for the test presented here
is ASTM paper E178-75.
PURPOSE
The purpose of a test for outliers is to determine whether there is
statistical evidence that an observation that appears extreme does not fit the
distribution of the rest of the data. If a suspect observation is identified
as an outlier, then steps need to be taken to determine whether it is the
result of an error or a valid extreme observation.
PROCEDURE
Let the sample of observations of a hazardous constituent of ground water
be denoted by X_1, ..., X_n. For specificity, assume that the data have been
ordered and that the largest observation, denoted by X_n, is suspected of being
an outlier. Generally, inspection of the data suggests values that do not
appear to belong to the data set. For example, if the largest observation is
an order of magnitude larger than the other observations, it would be suspect.
Step 1. Calculate the mean, X̄, and the standard deviation, S, of the data,
including all observations.
Step 2. Form the statistic
    T_n = (X_n - X̄)/S
Note that T_n is the difference between the largest observation and the sample
mean, divided by the sample standard deviation.
Step 3. Compare the statistic T_n to the critical value for the sample
size, n, given in Table 8 in Appendix B. If the statistic exceeds the critical
value from the table, this is evidence that the suspect observation, X_n, is a
statistical outlier.
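Steps 1 and 2 can be computed directly; the critical value in Step 3 must
still be taken from Table 8 of Appendix B. The sketch below is illustrative
only (the function name outlier_statistic is ours).

    import statistics

    def outlier_statistic(values):
        """T_n = (largest observation - mean) / S, as in Steps 1 and 2."""
        xbar = statistics.mean(values)
        s = statistics.stdev(values)       # sample standard deviation (n - 1 denominator)
        return (max(values) - xbar) / s

    # TOC data of Table 7-4: T_19 is about 3.74, which exceeds the Table 8 critical value of 2.532
    toc = [1700, 1900, 1500, 1300, 11000, 1250, 1000, 1300, 1200, 1450,
           1000, 1300, 1000, 2200, 4900, 3700, 1600, 2500, 1900]
    print(outlier_statistic(toc))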
Step 4. If the value is identified as an outlier, one of the actions
outlined below should be taken. (The appropriate action depends on what can
be learned about the observation.) The records of the sampling and analysis
of the sample that led to it should be investigated to determine whether the
outlier resulted from an error that can be identified.
•  If an error (in transcription, dilution, analytical procedure, etc.)
can be identified and the correct value recovered, the observation should be
replaced by its corrected value and the appropriate statistical analysis done
with the corrected value.
•  If it can be determined that the observation is in error, but the
correct value cannot be determined, then the observation should be deleted
from the data set and the appropriate statistical analysis performed. The
fact that the observation was deleted and the reason for its deletion should
be reported when reporting the results of the statistical analysis.
•  If no error in the value can be documented, then it must be assumed
that the observation is a true but extreme value. In this case it must not be
altered. It may be desirable to obtain another sample to confirm the observa-
tion. However, analysis and reporting should retain the observation and state
that no error was found in tracing the sample that led to the extreme observa-
tion.
EXAMPLE
Table 7-4 contains 19 values of total organic carbon (TOC) that were
obtained from a monitoring well. Inspection shows one value, 11,000 mg/L, that
is nearly an order of magnitude larger than most of the other observations.
It is a suspected outlier.
Step 1. Calculate the mean and standard deviation of the data:
    X̄ = 2,300 and S = 2,325.9
TABLE 7-4. EXAMPLE DATA FOR TESTING FOR AN OUTLIER

Total organic carbon (mg/L)

  1,700    1,900    1,500    1,300    11,000
  1,250    1,000    1,300    1,200     1,450
  1,000    1,300    1,000    2,200     4,900
  3,700    1,600    2,500    1,900
Step 2. Calculate the statistic T_19:
    T_19 = (11,000 - 2,300)/2,325.9 = 3.74
Step 3. Referring to Table 8 of Appendix B for the upper 5% significance
level, with n = 19, the critical value is 2.532. Since the value of the
statistic T_19 = 3.74 is greater than 2.532, there is statistical evidence that
the largest observation is an outlier.
Step 4. In this case, tracking the data revealed that the unusual value
of 11,000 resulted from a keying error and that the correct value was 1,100.
This correction was then made in the data.
INTERPRETATION
An observation that is 4 or 5 times as large as the rest of the data is
generally viewed with suspicion. An observation that is an order of magnitude
different could arise by a common error of misplacing a decimal. The test for
an outlier provides a statistical basis for determining whether an observation
is statistically different from the rest of the data. If it is, then it is a
statistical outlier. However, a statistical outlier may not be dropped or
altered just because it has been identified as an outlier. The test provides
a formal identification of an observation as an outlier, but does not identify
the cause of the difference.
Whether or not a statistical test is done, any suspect data point should
be checked. An observation may be corrected or dropped only if it can be
determined that an error has occurred. If the error can be identified and
corrected (as in transcription or keying), the correction should be made and
the corrected values used. A value that is demonstrated to be incorrect may
be deleted from the data. However, if no specific error can be documented,
the observation must be retained in the data. Identification of an observa-
tion as an outlier but with no error documented could be used to suggest
resampling to confirm the value.
APPENDIX A
GENERAL STATISTICAL CONSIDERATIONS AND
GLOSSARY OF STATISTICAL TERMS
GENERAL STATISTICAL CONSIDERATIONS
FALSE ALARMS OR TYPE I ERRORS
The statistical analysis of data from ground-water monitoring at RCRA
sites has as its goal the determination of whether the data provide evidence
of the presence of, or an increase in the level of, contamination. In the
case of detection monitoring, the goal of the statistical analysis is to
determine whether statistically significant evidence of contamination
exists. In the case of compliance monitoring, the goal is to determine
whether statistically significant evidence of concentration levels exceeding
compliance limits exists. In monitoring sites in corrective action, the goal
is to determine whether levels of the hazardous constituents are still above
compliance limits or have been reduced to at or below the compliance limit.
These questions are addressed by the use of hypothesis tests. In the
case of detection monitoring, it is hypothesized that a site is not contami-
nated; that is, the hazardous constituents are not present in the ground
water. Samples of the ground water are taken and analyzed for the constitu-
ents in question. A hypothesis test is used to decide whether the data indi-
cate the presence of the hazardous constituent. The test consists of calcu-
lating one or more statistics from the data and comparing the calculated
results to some prespecified critical levels.
In performing a statistical test, there are four possible outcomes. Two
of the possible outcomes result in the correct decision: (a) the test may
correctly indicate that no contamination is present or (b) the test may cor-
rectly indicate the presence of contamination. The other two possibilities
are errors: (c) the test may indicate that contamination is present when in
fact it is not, or (d) the test may fail to detect contamination when it is
present.
If the stated hypothesis is that no contamination is present (usually
called the null hypothesis) and the test indicates that contamination is
present when in fact it is not, this is called a Type I error. Statistical
hypothesis tests are generally set up to control the probability of a Type I
error to be no more than a specified value, called the significance level and
usually denoted by α. Thus, in detection monitoring, the null hypothesis would
be that the level of each hazardous constituent is zero (or at least below
detection). The test would reject this hypothesis if some measure of concen-
tration were too large, indicating contamination. A Type I error would be a
false alarm or a triggering event that is inappropriate.
In compliance monitoring, the null hypothesis is that the level of each
hazardous constituent is less than or equal to the appropriate compliance
limit. For the purpose of setting up the statistical procedure, the simple
null hypothesis that the level is equal to the compliance limit would be
used. As in detection monitoring, the test would indicate contamination if
some measure of concentration is too large. A false alarm or Type I error
would occur if the statistical procedure indicated that levels exceed the
appropriate compliance limits when, in fact, they do not. Such an error would
be a false alarm in that it would indicate falsely that compliance limits were
being exceeded.
PROBABILITY OF DETECTION AND TYPE II ERROR
The other type of error that can occur is called a Type II error. It
occurs if the test fails to detect contamination that is present. Thus a
Type II error is a missed detection. While the probability of a Type I error
can be specified, since it is the probability that the test will give a false
alarm, the probability of a Type II error depends on several factors, includ-
ing the statistical test, the sample size, and the significance level or prob-
ability of Type I error. In addition, it depends on the degree of contamina-
tion present. In general, the probability of a Type II error decreases as the
level of contamination increases. Thus a test may be likely to miss low lev-
els of contamination, less likely to miss moderate contamination, and very
unlikely to miss high levels of contamination.
One can discuss the probability of a Type II error as the probability of
a missed detection, or one can discuss the complement (one minus the prob-
ability of Type II error) of this probability. The complement, or probability
of detection, is also called the power of the test. It depends on the magni-
tude of the contamination so that the power or probability of detecting con-
tamination increases with the degree of contamination.
If the probability of a Type I error is specified, then for a given sta-
tistical test, the power depends on the sample size and the alternative of
interest. In order to specify a desired power or probability of detection,
one must specify the alternative that should be detected. Since generally the
power will increase as the alternative differs more and more from the null
hypothesis, one usually tries to specify the alternative that 1s closest to
the null hypothesis, yet enough different that it is important to detect.
In the detection monitoring situation, the null hypothesis is that the
concentration of the hazardous constituent is zero (or at least below detec-
tion). In this case the alternative of interest is that there is a concen-
tration of the hazardous constituent that is above the detection limit and is
large enough so that the monitoring procedure should detect it. Since it is a
very difficult problem to select a concentration of each hazardous constituent
that should be detectable with specified power, a more useful approach is to
determine the power of a test at several alternatives and decide whether the
procedure is acceptable on the basis of this power function rather than on the
power against a single alternative.
In order to increase the power, a larger sample must be taken. This
would mean sampling at more frequent intervals. There is a limit to how much
can be achieved, however. In cases with limited water flow, it may not be
possible to sample wells as frequently as desired. If samples close together
in time prove to be correlated, this correlation reduces the information
available from the different samples. The additional cost of sampling and
analysis will also impose practical limitations on the sample size that can be
used.
Additional wells could also be used to increase the performance of the
test. The additional monitoring wells would primarily be helpful in ensuring
that a plume would not escape detection by missing the monitoring wells. How-
ever, in some situations the additional wells would contribute to a larger
sample size and so improve the power.
In compliance monitoring the emphasis is on determining whether addi-
tional contamination has occurred, raising the concentration above a compli-
ance limit. If the compliance limit is determined from the background well
levels, the null hypothesis is that the difference between the background and
compliance well concentrations is zero. The alternative of interest is that
the compliance well concentration exceeds the background concentration. This
situation is essentially the same for power considerations as that of the
detection monitoring situation.
If compliance monitoring is relative to a compliance limit (MCL or ACL),
specified as a constant, then the situation is different. Here the null hypo-
thesis is that the concentration is less than or equal to the compliance
limit, with equality used to establish the test. The alternative is that the
concentration is above the compliance limit. In order to specify power, a
minimum amount above the compliance limit must be established and power speci-
fied for that alternative or the power function evaluated for several possible
alternatives.
SAMPLE DESIGNS AND ASSUMPTIONS
As discussed in Section 2, the sample design to be employed at a regu-
lated unit will primarily depend on the hydrogeologic evaluation of the
site. Wells should be sited to provide multiple background wells hydrauli-
cally upgradient from the regulated unit. The background wells allow for
determination of natural spatial variability in ground-water quality. They
also allow for estimation of background levels with greater precision than
would be possible from a single upgradient well. Compliance wells should be
sited hydraulically downgradient to each regulated unit. The location and
spacing of the wells, as well as the depth of sampling, would be determined
from the hydrology to ensure that at least one of the wells should intercept a
plume of contamination of reasonable size.
Thus the assumed sample design is for a sample of wells to include a
number of background wells for the site, together with a number of compliance
wells for each regulated unit at the site. In the event that a site has only
a single regulated unit, there would be two groups of wells, background and
compliance. If a site has multiple regulated units, there would be a set of
compliance wells for each regulated unit, allowing for detection monitoring or
compliance monitoring separately at each regulated unit.
Data from the analysis of the water at each well are initially assumed to
follow a normal distribution. This is likely to be the case for detection
monitoring of analytes in that levels should be near zero and errors would
likely represent instrument or other sampling and analysis variability. If
contamination is present, then the distribution of the data may be skewed to
the right, giving a few very large values. The assumption of normality of
errors in the detection monitoring case is quite reasonable, with deviations
from normality likely indicating some degree of contamination. Tests of nor-
mality are recommended to ensure that the data are adequately represented by
the normal distribution.
In the compliance monitoring case, the data for each analyte will again
initially be assumed to follow the normal distribution. In this case, how-
ever, since there is a nonzero concentration of the analyte in the ground
water, normality is more of an issue. Tests of normality are recommended. If
evidence of nonnormality is found, the data should be transformed or a
distribution-free test be used to determine whether statistically significant
evidence of contamination exists.
The standard situation would result in multiple samples (taken at dif-
ferent times) of water from each well. The wells would form groups of back-
ground wells and compliance wells for each regulated unit. The statistical
procedures recommended would allow for testing each compliance well group
against the background group. Further, tests among the compliance wells
within a group are recommended to determine whether a single well might be
intercepting an isolated plume. The specific procedures discussed and recom-
mended in the following sections should cover the majority of cases. They
will not cover all of the possibilities. In the event that none of the proce-
dures described and illustrated appears to apply to a particular case at a
given regulated site, consultation with a statistician should be sought to
determine an appropriate statistical procedure.
The following approach is recommended. If a regulated unit is in detec-
tion monitoring, it will remain in detection monitoring until or unless there
is statistically significant evidence of contamination, in which case it would
be placed in compliance monitoring. Likewise, if a regulated unit is in com-
pliance monitoring, it will remain in compliance monitoring unless or until
there is statistically significant evidence of further contamination, in which
case it would move into corrective action.
In monitoring a regulated unit with multiple compliance wells, two types
of significance levels are considered. One is an experimentwise significance
level and the other is a comparisonwise significance level. When a procedure
such as analysis of variance is used that considers several compliance wells
simultaneously, the significance is an experimentwise significance. If
individual well comparisons are made, each of those comparisons is done at a
comparisonwise significance level.
The fact that many comparisons will be made at a regulated unit with
multiple compliance wells can make the probability that at least one of the
comparisons will be incorrectly significant too high. To control the false
positive rate, multiple comparisons procedures are allowed that control the
experimentwise significance level to be 5%. That is, the probability that one
or more of the comparisons will falsely indicate contamination is controlled
at 5%. However, to provide some assurance of adequate power to detect real
contamination, the comparisonwise significance level for comparing each
individual well to the background is required to be no less than 1%.
Control of the experimentwise significance level via multiple comparisons
procedures is allowed for comparisons among several wells. However, use of an
experimentwise significance level for the comparisons among the different haz-
ardous constituents is not permitted. Each hazardous constituent to be moni-
tored for in the permit must be treated separately.
GLOSSARY OF STATISTICAL TERMS
(underlined terms are explained subsequently)
Alpha (α)
    A Greek letter used to denote the significance level or probability of a
    Type I error.

Alpha-error
    Sometimes used for Type I error.

Alternative hypothesis
    An alternative hypothesis specifies that the underlying distribution
    differs from the null hypothesis. The alternative hypothesis usually
    specifies the value of a parameter, for example the mean concentration,
    that one is trying to detect.

Arithmetic average
    The arithmetic average of a set of observations is their sum divided by
    the number of observations.

Confidence coefficient
    The confidence coefficient of a confidence interval for a parameter is
    the probability that the random interval constructed from the sample
    data contains the true value of the parameter. The confidence
    coefficient is related to the significance level of an associated
    hypothesis test by the fact that the significance level (in percent) is
    one hundred minus the confidence coefficient (in percent).

Confidence interval
    A confidence interval for a parameter is a random interval constructed
    from sample data in such a way that the probability that the interval
    will contain the true value of the parameter is a specified value.

Cumulative distribution function
    Distribution function.

Distribution-free
    This is sometimes used as a synonym for nonparametric. A statistic is
    distribution-free if its distribution does not depend upon which
    specific distribution function (in a large class) the observations
    follow.
Distribution function
    The distribution function for a random variable, X, is a function that
    specifies the probability that X is less than or equal to t, for all
    real values of t.

Experimentwise error rate
    This term refers to multiple comparisons. If a total of n decisions are
    made about comparisons (for example, of compliance wells to background
    wells) and x of the decisions are wrong, then the experimentwise error
    rate is x/n.

Hypothesis
    This is a formal statement about a parameter of interest and the
    distribution of a statistic. It is usually used as a null hypothesis or
    an alternative hypothesis. For example, the null hypothesis might
    specify that ground water had a zero concentration of benzene and that
    analytical errors followed a normal distribution with mean zero and
    standard deviation 1 ppm.

Independence
    A set of events are independent if the probability of the joint
    occurrence of any subset of the events factors into the product of the
    probabilities of the events. A set of observations is independent if
    the joint distribution function of the random errors associated with the
    observations factors into the product of the distribution functions.

Mean
    Arithmetic average.

Median
    This is the middle value of a sample when the observations have been
    ordered from least to greatest. If the number of observations is odd,
    it is the middle observation. If the number of observations is even, it
    is customary to take the midpoint between the two middle observations.
    For a distribution, the median is a value such that the probability is
    one-half that an observation will fall above or below the median.

Multiple comparison procedure
    This is a statistical procedure that makes a large number of decisions
    or comparisons on one set of data. For example, at a sampling period,
    several compliance well concentrations may be compared to the background
    well concentration.
Nonparametric statistical procedure
    A nonparametric statistical procedure is a statistical procedure that
    has desirable properties that hold under mild assumptions regarding the
    data. Typically the procedure is valid for a large class of
    distributions rather than for a specific distribution of the data such
    as the normal.

Normal population, normality
    The errors associated with the observations follow the normal or
    Gaussian distribution function.

Null hypothesis
    A null hypothesis specifies the underlying distribution of the data
    completely. Often the null distribution specifies that there is no
    difference between the mean concentration in background well water
    samples and compliance well water samples.

One-sided test
    A one-sided test is appropriate if concentrations higher than those
    specified by the null hypothesis are of concern. A one-sided test only
    rejects for differences that are large and in a prespecified direction.

One-sided tolerance limit
    This is an upper limit on observations from a specified distribution.

One-sided confidence limit
    This is an upper limit on a parameter of a distribution.

Order statistics
    The sample values observed after they have been arranged in increasing
    order.

Outlier
    An outlier is an observation that is found to lie an unusually long way
    from the rest of the observations in a series of replicate observations.

Parameter
    A parameter is an unknown constant associated with a population. For
    example, the mean concentration of a hazardous constituent in ground
    water is a parameter of interest.

Percentile
    A percentile of a distribution is a value below which a specified
    proportion or percent of the observations from that distribution will
    fall.
Power
    The power of a test is the probability that the test will reject under a
    specified alternative hypothesis. This is one minus the probability of
    a Type II error. The power is a measure of the test's ability to detect
    a difference of specified size from the null hypothesis.

Significance level
    Sometimes referred to as the alpha level, the significance level of a
    test is the probability of falsely rejecting a true null hypothesis.
    The probability of a Type I error.

Type I error
    A Type I error occurs when a true null hypothesis is rejected
    erroneously. In the monitoring context, a Type I error occurs when a
    test incorrectly indicates contamination or an increase in contamination
    at a regulated unit.

Type II error
    A Type II error occurs when one fails to reject a null hypothesis that
    is false. In the monitoring context, a Type II error occurs when
    monitoring fails to detect contamination or an increase in the
    concentration of a hazardous constituent.
APPENDIX B
STATISTICAL TABLES
CONTENTS

Table
1    Percentiles of the χ² Distribution With ν Degrees of Freedom, χ²(ν,p)
2    95th Percentiles of the F-Distribution With ν1 and ν2 Degrees of
     Freedom, F(ν1, ν2; 0.95)
3    95th Percentiles of the Bonferroni t-Statistics, t(ν, α/m) .............. B-5
4    Percentiles of the Standard Normal Distribution, Up ..................... B-6
5    Tolerance Factors (K) for One-Sided Normal Tolerance Intervals With
     Probability Level (Confidence Factor) γ = 0.95 and Coverage P = 95% ..... B-8
6    Percentiles of Student's t-Distribution .................................. B-9
7    Values of the Parameter λ for Cohen's Estimates Adjusting for
     Nondetected Values ..................................................... B-10
8    Critical Values for T_n (One-Sided Test) When the Standard Deviation
     Is Calculated From the Same Sample ..................................... B-11
TABLE 1. PERCENTILES OF THE χ² DISTRIBUTION WITH ν DEGREES OF FREEDOM, χ²(ν,p)

  ν \ p   0.750    0.900    0.950    0.975    0.990    0.995    0.999
    1     1.323    2.706    3.841    5.024    6.635    7.879    10.83
    2     2.773    4.605    5.991    7.378    9.210    10.60    13.82
    3     4.108    6.251    7.815    9.348    11.34    12.84    16.27
    4     5.385    7.779    9.488    11.14    13.28    14.86    18.47
    5     6.626    9.236    11.07    12.83    15.09    16.75    20.52
    6     7.841    10.64    12.59    14.45    16.81    18.55    22.46
    7     9.037    12.02    14.07    16.01    18.48    20.28    24.32
    8     10.22    13.36    15.51    17.53    20.09    21.96    26.12
    9     11.39    14.68    16.92    19.02    21.67    23.59    27.88
   10     12.55    15.99    18.31    20.48    23.21    25.19    29.59
   11     13.70    17.28    19.68    21.92    24.72    26.76    31.26
   12     14.85    18.55    21.03    23.34    26.22    28.30    32.91
   13     15.98    19.81    22.36    24.74    27.69    29.82    34.53
   14     17.12    21.06    23.68    26.12    29.14    31.32    36.12
   15     18.25    22.31    25.00    27.49    30.58    32.80    37.70
   16     19.37    23.54    26.30    28.85    32.00    34.27    39.25
   17     20.49    24.77    27.59    30.19    33.41    35.72    40.79
   18     21.60    25.99    28.87    31.53    34.81    37.16    42.31
   19     22.72    27.20    30.14    32.85    36.19    38.58    43.82
   20     23.83    28.41    31.41    34.17    37.57    40.00    45.32
   21     24.93    29.62    32.67    35.48    38.93    41.40    46.80
   22     26.04    30.81    33.92    36.78    40.29    42.80    48.27
   23     27.14    32.01    35.17    38.08    41.64    44.18    49.73
   24     28.24    33.20    36.42    39.36    42.98    45.56    51.18
   25     29.34    34.38    37.65    40.65    44.31    46.93    52.62
   26     30.43    35.56    38.89    41.92    45.64    48.29    54.05
   27     31.53    36.74    40.11    43.19    46.96    49.64    55.48
   28     32.62    37.92    41.34    44.46    48.28    50.99    56.89
   29     33.71    39.09    42.56    45.72    49.59    52.34    58.30
   30     34.80    40.26    43.77    46.98    50.89    53.67    59.70
   40     45.62    51.80    55.76    59.34    63.69    66.77    73.40
   50     56.33    63.17    67.50    71.42    76.15    79.49    86.66
   60     66.98    74.40    79.08    83.30    88.38    91.95    99.61
   70     77.58    85.53    90.53    95.02    100.4    104.2    112.3
   80     88.13    96.58    101.9    106.6    112.3    116.3    124.8
   90     98.65    107.6    113.1    118.1    124.1    128.3    137.2
  100     109.1    118.5    124.3    129.6    135.8    140.2    149.4
SOURCE: Johnson, Norman L. and F. C. Leone. 1977. Statistics and Experimental
Design in Engineering and the Physical Sciences. Vol. I. Second Edition. John
Wiley and Sons, New York.
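For readers who wish to check or extend the tabled values, the short sketch below (not part of the original table) reproduces chi-square percentiles with the SciPy library; the degrees of freedom shown are only an example.

```python
# Sketch: chi-square percentiles as tabulated in Table 1 (assumes SciPy).
from scipy.stats import chi2

df = 10                                        # example degrees of freedom
for p in (0.750, 0.900, 0.950, 0.975, 0.990, 0.995, 0.999):
    print(p, round(chi2.ppf(p, df), 2))        # e.g., 0.95 -> 18.31
```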
B-3

-------
TABLE 2. 95th PERCENTILES OF THE F-DISTRIBUTION WITH
v1 AND v2 DEGREES OF FREEDOM, F(v1, v2; 0.95)
Columns: v1 = 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 12, 15, 20, 24, 30, 40, 60, 120, ∞
Rows: v2 = 1 through 30, 40, 60, 120, ∞ (values listed column by column below)
161.4
11.51
10.13
7.71
6.61
3.99
5.59
5.32
3.12
4.96
4 84
4.73
4.67
4.60
4.54
4.49
4.45
4.41
4.31
4.35
4.32
4.30
4.2t
4.26
4.24
4.23
4.21
4.20
4.11
4.17
4.01
4.00
3.92
3.14
199.5
19.00
9.35
6.94
5.79
5.14
4.74
4.46
4.26
4.10
3.91
3.89
3.(1
3.74
3.61
3.63
3.39
3J5
3.52
3.49
3.47
3.44
3.42
3.40
3.39
3.37
3.35
3.34
3J3
3.32
3.23
J.15
3.07
3.00
215.7
19.16
9.28
6.59
5.41
4.76
4.35
4.07
3.16
3.71
3.39
3.49
3.41
3.34
3.29
3.24
3.20
3.16
3.1J
3.10
3.07
3.05
3.03
3.01
2.99
2.91
2.96
193
2.93
in
2.14
2.76
161
2.60
224.6
19.25
9.12
6.39
5.19
4.53
4 12
3.14
3.63
3.41
3.36
3.26
3.11
3.11
3.06
3.01
2.96
2.93
2.90
2.17
2.14
2.12
MO
2.71
176
2.74
2.73
171
2.70
2.69
2.61
2J3
145
137
230.2
19.30
9.01
6.26
5.05
4.39
3.97
3.69
3.41
3.33
3.20
3.11
3.03
196
2.90
115
111
177
174
171
161
166
164
162
160
159
1J7
136
155
153
145
i r
129
121
234.0
19.33
1.94
6.16
4.95
4.21
3.17
3.51
3J7
3.22
3.09
3.00
192
115
179
174
170
166
163
1(0
157
155
153
151
149
147
146
145
143
142
134
123
117
110
236.1
19.35
1.19
6.09
4.11
4.21
3.79
3.30
3.19
3.14
3.01
191
1(3
176
171
1U
2.61
III
134
131
149
146
144
142
140
119
137 .
2.36
133
13)
123
117
109
101
231.9
19.37
1.15
6.04
4.12
4.15
3.73
3.44
3.21
3.07
195
115
177
170
164
159
155
131
2.41
143
142
140
137
1M
134
2J2
131
129
121
127
II*
110
102
l.«4
240.5
19.31
1.11
6.00
4.77
4.10
3.61
3.39
3.11
3.02
190
110
171
163
139
154
149
146
142
1)9
137
1)4
1)2
130
121
127
123
124
122
121
112
104
1.9*
IH
241.9
19.40
1.79
5.96
4.74
4.06
3.64
3.35
3.14
191
115
175
167
160
134
149
145
141
131
1)3
132
1)0
127
123
124
122
120
119
111
116
1M
1.9*
1.91
1.1)
243.9
19.41
1.74
5.91
4.61
4.00
3.57
3.21
3.07
1*1
179
169
1«0
133
141
142
1)1
134
131
121
123
123
120
111
116
115
113
112
110
10*
100
I.*2
1.13
1.73
245.9
19.4)
1.70
5.16
4.62
).*4
3 51
).22
3.01
115
172
162
15)
146
140
1)5
131
127
123
2.20
111
115
113
111
109
107
106
104
103
101
l.*2
1.14
1.73
1.67
241.0
19.45
1.66
5.10
4.56
3.17
3.44
3.15
194
177
165
154
146
13*
133
121
123
II*
ll<
112
110
107
105
10)
101
l.*9
1.97
1.96
1.94
l.*3
1.14
1.73
1.66
1.37
249.1
19.45
1.64
3.77
4.3)
3.14
3.41
3.12
190
174
161
151
2.42
133
129
124
119
113
111
101
105
10)
101
I.M
1.96
I.*3
I.*3
1.91
1.90
1.1*
1.7*
1.70
1.61
1.32
230.1
19.46
1.62
5.75
4.50
3.11
3.31
3.01
116
170
137
147
131
131
125
119
115
111
107
104
101
I.M
1.96
1.94
L92
1.90
I.U
1.17
1.15
I.M
1.74
I.i3
1.55
1.46
251.1
19.47
1.59
5.72
4.44
3.77
3.34
3.04
113
166
153
143
134
127
120
115
110
106
10)
I.**
1.96
1.94
1.91
1.19
1.17
1.15
1.14
1.12
I.II
1.7*
1.69
1.3*
1.50
1.3*
2512
19.41
1.57
5.69
4.4)
3.74
3.30
3.01
179
162
149
131
130
122
116
111
106
102
I.M
1.93
1.92
1.19
I.M
I.M
1.12
1.10
1.7*
1.77
1.73
1.74
1.64
1.3)
1.4)
l.)2
253.3
19.49
1.55
3.66
4.40
3.70
3.27
197
175
151
145
134
125
111
2.11
106
101
1.97
1.9)
1.90
1.17
I.M
l.ll
1.7*
1.77
1.75
1.7)
1.71
1.70
1.61
IJI
1.47
l.)3
1.22
254.3
19.50
1.53
5.63
4.36
3.67
3.23
193
171
154
140
130
121
113
107
101
1.96
1.92
I.U
I.M
l.ll
1.71
1.76
1.73
1.71
1.69
1.67
1.65
1.64
1.62
IJI
l.)9
1.25
1.00
NOTE: v1: Degrees of freedom for numerator
v2: Degrees of freedom for denominator
SOURCE: Johnson, Norman L. and F. C. Leone. 1977. Statistics and Experimental
Design in Engineering and the Physical Sciences. Vol. I. Second Edition. John
Wiley and Sons, New York.
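As with the other tables, these F percentiles can be checked with standard statistical software. The sketch below assumes SciPy; the degrees of freedom shown are illustrative.

```python
# Sketch: 95th percentile of the F-distribution (assumes SciPy).
from scipy.stats import f

v1, v2 = 5, 10                                  # numerator, denominator df
print(round(f.ppf(0.95, dfn=v1, dfd=v2), 2))    # 3.33, as in Table 2
```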
B-4

-------
TABLE 3. 95th PERCENTILES OF THE BONFERRONI
t-STATISTICS, t(v, α/m)
where v = degrees of freedom associated with the mean
square error
m = number of comparisons
α = 0.05, the experimentwise error level

      m        1        2        3        4        5
  v  α/m    0.05    0.025   0.0167   0.0125     0.01
  4         2.13     2.78     3.20     3.51     3.75
  5         2.02     2.57     2.90     3.17     3.37
  6         1.94     2.45     2.74     2.97     3.14
  7         1.90     2.37     2.63     2.83     3.00
  8         1.86     2.31     2.55     2.74     2.90
  9         1.83     2.26     2.50     2.67     2.82
 10         1.81     2.23     2.45     2.61     2.76
 15         1.75     2.13     2.32     2.47     2.60
 20         1.73     2.09     2.27     2.40     2.53
 30         1.70     2.04     2.21     2.34     2.46
  ∞         1.65     1.96     2.13     2.24     2.33

SOURCE: For α/m = 0.05, 0.025, and 0.01, the percentiles
were extracted from the t-table (Table 6, Appendix B) for
values of F = 1-α of 0.95, 0.975, and 0.99, respectively.
For α/m = 0.05/3 and 0.05/4, the percentiles were
estimated using "A Nomograph of Student's t" by Nelson,
L. S. 1975. Journal of Quality Technology, Vol. 7,
pp. 200-201.
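The construction described in the SOURCE note can also be carried out directly: a Bonferroni t-statistic is simply a Student's t percentile evaluated at 1 - α/m. The sketch below assumes SciPy and uses v = 10 only as an example.

```python
# Sketch: Bonferroni t-statistics t(v, alpha/m) from Student's t percentiles.
from scipy.stats import t

alpha, v = 0.05, 10                      # experimentwise error level, df
for m in (1, 2, 3, 4, 5):                # number of comparisons
    print(m, round(t.ppf(1 - alpha / m, v), 2))
# compare with the v = 10 row of Table 3
```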
B-5

-------
TABLE 4. PERCENTILES OF THE STANDARD NORMAL DISTRIBUTION, Up
Up
p
0.000
0.001
0.002
0.003
0.004
0.005
0.006
0.007
0.008
0.009
0.50
0.0000
0.0025
0.0050
0.0075
0.0100
0.0125
0.0150
0.0175
0.0201
0.0226
0.51
0.0251
0.0276
0.0301
0.0326
0.0351
0.0376
0.0401
0.0426
0.0451
0.0476
0.52
0.0502
0.0527
0.0552
0.0577
0.0602
0.0627
0.0652
0.0677
0.0702
0.0728
0.53
0.0753
0.0778
0.0803
0.0828
0.0853
0.0878
0.0904
0.0929
0.0954
0.0979
0.54
0.1004
0.1030
0.1055
0.1080
0.1105
0.1130
0.1156
0.1181
0.1206
0.1231
0.55
0.1257
0.1282
0.1307
0.1332
0.1358
0.1383
0.1408
0.1434
0.1459
0.1484
0.56
0.1510
0.1535
0.1560
0.1586
0.1611
0.1637
0.1662
0.1687
0.1713
0.1738
0.57
0.1764
0.1789
0.1815
0.1840
0.1866
0.1891
0.1917
0.1942
0.1968
0.1993
0.58
0.20(9
0.2045
0.2070
0.2096
0.2121
0.2147
0.2173
0.2198
0.2224
0.2250
0.59
0.2275
0.2301
0.2327
0.2353
0.2378
0.2404
0.2430
0.2456
0.2482
0.2508
0.60
0.2533
0.2559
0.2585
0.26 M
0.2637
0.2663
0.2689
0.2715
0.2741
0.2767
0.61
0.2793
0.2819
0.2845
0.2871
0.2898
0.2924
0.2950
0.2976
0.3002
0.3029
0.62
0.3055
0.3081
0.3107
0.3134
0.3160
0.3186
0.3213
0.3239
0.3266
0.3292
0.63
0.3319
0.3345
0.3372
0.3398
0.3425
0.3451
0.3478
0.3505
0.3531
0.3558
0.64
0.3585
0.3611
0.3638
0.3665
0.3692
0.3719
0.3745
0.3772
0.3799
0.3826
0.65
0.3853
0.3880
0.3907
0.3934
0.3961
0.3989
0.4016
0.4043
0.4070
0.4097
0.66
0.4125
0.4152
0.4179
0.4207
0.4234
0.4261
0.4289
0.4316
0.4344
0.4372
0.67
0.4399
0.4427
0.4454
0.4482
0.4510
0.4538
0.4565
0.4593
0.4621
0.4649
0.68
0.4677
0.4705
0.4733
0.4761
0.4789
0.4817
0.4845
0.4874
0.4902
0.4930
0.69
0.4959
0.4987
0.5015
0.S044
0.5072
O.JIOI
0.5129
0.5158
0.5187
0.5215
0.70
0.5244
0.5273
0.5302
0.5330
0.5359
0.5388
0.5417
0.5446
0.5476
0.5505
0.71
0.5534
0.5563
0.5592
0.5622
0.5651
0.5681
0.5710
0.5740
0.5769
0.5799
0.72
0.5828
0.5858
0.5888
0.5918
0.5948
0.5978
0.6008
0.6038
0.6068
0.6098
0.73
0.6128
0.6158
0.6189
0.6219
0.6250
0.6280
0.6311
0.6341
0.6372
0.6403
0.74
0.6433
0.6464
0.6495
0.6526
0.6557
0.6588
0.6620
0.6651
0.6682
0.6713
NOTE: For values of p below 0.5, obtain the value of U(1-p) from Table 4 and
change its sign. For example, U(0.45) = -U(1-0.45) = -U(0.55) = -0.1257.
(Continued)
B-6

-------
TABLE 4 (Continued)
p
0.000
0.001
0.002
0.003
0.004
0.005
0.006
0.007
0.008
0.009
0.75
0.6745
0.6776
0.6808
0.6840
0.6871
0,6903
06935
0.6967
0,6999
0 7011
0.76
0 7063
0.7095
0,7128
0.7160
0.7192
0.7225
0.7257
0 7290
0.7323
0.7356
0.77
0 7388
0.7421
0.7454
0.7488
0.7521
0.7554
0.7588
0.7621
0.7655
0.7688
0.78
0.7722
0.7756
0.7790
0.7824
0.7858
0.7892
0.7926
0.7961
0.7995
0.8030
0.79
0.8064
0.8099
0.8134
0.8169
0.8204
0.8239
0.8274
0.8310
0.8345
0.8381
0 80
0.8416
0.8452
0.8488
0.8524
0.8560
0.8596
0.8633
0.8669
0.8705
0.8742
0 81
0.8779
0.8816
0.8853
0.8890
0.8927
0.8965
0.9002
0.9040
0.9078
0.9116
0.82
0.9154
0.9192
0.9230
0.9269
0.9307
0.9346
0.9385
0.9424
0.9463
0.9502
0.83
0.9542
0.9581
0.9621
0.9661
0.9701
0.9741
0.9782
0.9822
0.9863
0.9904
0.84
0.9945
09986
1.0027
1.0069
1.0110
1.0152
1.0194
1.0237
1.0279
1.0322
0-85
1.0364
1.0407
1.0450
1.0494
1.0537
1.0581
1.0625
1.0669
1.0714
1.0758
0 86
1.0803
1.0848
1.0893
1.0939
1.0985
1.1031
1.1077
1.1123
1.1170
1.1217
0.87
1.1264
1.1311
1.1359
1.1407
1.1455
1.1503
1.1552
I.I 601
1.1650
1.1700
0.88
1.1750
I.i800
1.1850
1.1901
1J952
1.2004
1.2055
1.2107
1.2160
1.2212
0.89
1 2265
1.2319
1.2372
1.2426
1.2481
1.2536
1.2591
1.2646
1.2702
1.2759
0,90
1.2816
1.2873
1.2930
1.2988
1J047
1.3106
1.3165
1.3225
1.3285
1.3346
0.91
1.3408
1.3469
1.3532
1.3595
1.3658
1.3722
1.3787
1,3852
1.3917
1.3984
0.92
1.4051
1.4118
1.4187
1.4255
1.4325
1.4395
1.4466
1,4538
1.4611
1.4684
0.93
1.4758
1.4833
1.4909
1.4985
1.5063
I.5I4I
1.5220
1.5301
1.5382
1.5464
0.94
1.5548
1.5632
1.5718
1.5805
1.5893
1.5982
1.6072
1.6164
1.6258
1.6352
0 95
1.6449
1.6546
1.6646
1.6747
1.6849
1.6954
1.7060
1.7169
1.7279
1.7392
0,96
1.7507
1.7624
1.7744
1.7866
1.7991
1.8119
1.8250
1.8384
1.8522
1.8663
0.97
1.8808
1.8957
1.9110
1.9268
1.9431
1.9600
1.9774
1.9954
2.0141
2.0335
0.98
2.0537
2.0749
2.0969
2.1201
2.1444
2.1701
2.1973
2.2262
2.2571
2.2904
099
2.3263
2.3656
2.4089
2.4573
2.5121
2.5758
2.6521
2.7478
2.8782
3.0902
SOURCE: Johnson, Norman L. and F. C. Leone. 1977. Statistics and Experimental
Design in Engineering and the Physical Sciences. Vol. I, Second Edition. John
Wiley and Sons, New York.
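The percentiles Up, including the sign change described in the NOTE for p below 0.5, can be reproduced with statistical software; the sketch below assumes SciPy.

```python
# Sketch: standard normal percentiles Up (assumes SciPy).
from scipy.stats import norm

print(round(norm.ppf(0.55), 4))   #  0.1257, as in Table 4
print(round(norm.ppf(0.45), 4))   # -0.1257 = -U(0.55), per the NOTE
print(round(norm.ppf(0.95), 4))   #  1.6449, the upper 5% point
```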
B-7

-------
TABLE 5. TOLERANCE FACTORS (K) FOR ONE-SIDED NORMAL TOLERANCE
INTERVALS WITH PROBABILITY LEVEL (CONFIDENCE FACTOR)
γ = 0.95 AND COVERAGE P = 95%

    n        K              n        K
    3      7.655           75      1.972
    4      5.145          100      1.924
    5      4.202          125      1.891
    6      3.707          150      1.868
    7      3.399          175      1.850
    8      3.188          200      1.836
    9      3.031          225      1.824
   10      2.911          250      1.814
   11      2.815          275      1.806
   12      2.736          300      1.799
   13      2.670          325      1.792
   14      2.614          350      1.787
   15      2.566          375      1.782
   16      2.523          400      1.777
   17      2.486          425      1.773
   18      2.453          450      1.769
   19      2.423          475      1.766
   20      2.396          500      1.763
   21      2.371          525      1.760
   22      2.350          550      1.757
   23      2.329          575      1.754
   24      2.309          600      1.752
   25      2.292          625      1.750
   30      2.220          650      1.748
   35      2.166          675      1.746
   40      2.126          700      1.744
   45      2.092          725      1.742
   50      2.065          750      1.740
                          775      1.739
                          800      1.737
                          825      1.736
                          850      1.734
                          875      1.733
                          900      1.732
                          925      1.731
                          950      1.729
                          975      1.728
                         1000      1.727

SOURCE: (a) For sample sizes ≤ 50: Lieberman, Gerald J. 1958. "Tables for
One-Sided Statistical Tolerance Limits." Industrial Quality Control. Vol. XIV,
No. 10. (b) For sample sizes > 50: K values were calculated from a large
sample approximation.
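One way to reproduce or extend these factors, shown here only as a sketch and assuming SciPy is available, uses the noncentral t distribution; for example, it returns the tabled value K = 2.911 for n = 10.

```python
# Sketch (assumes SciPy): exact one-sided normal tolerance factor K for
# coverage P and confidence level gamma, via the noncentral t distribution.
import numpy as np
from scipy.stats import norm, nct

def k_factor(n, P=0.95, gamma=0.95):
    """K such that xbar + K*s covers at least a proportion P of a normal
    population with confidence gamma, for a sample of size n."""
    delta = norm.ppf(P) * np.sqrt(n)                 # noncentrality parameter
    return nct.ppf(gamma, df=n - 1, nc=delta) / np.sqrt(n)

print(round(k_factor(10), 3))   # about 2.911, the tabled value for n = 10
```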
B-8

-------
TABLE 6. PERCENTILES OF STUDENT'S t-DISTRIBUTION
(F = 1 - α; n = degrees of freedom)
n \ F
.60
.75
.90
.95
.975
.99
.995
.9995
t
.325
1.000
3.078
6.314
12 706
31 821
03.657
636.619
1
.289
818
1 888
2.920
4 303
6.965
9 925
31 598
3
.277
715
1.838
2.353
3.182
4.541
5.841
12.941
4
271
741
1.533
2.132
2.776
3.747
4.604
8.610
i
.287
.727
1.475
2 015
2.571
3 365
4.032
6 859
6
.285
.718
1.440
1 943
2 447
3.143
3.707
5 959
7
.283
.711
1.415
1.895
2.365
2 998
3 499
5 405
8
.282
.708
2.397
1.800
2 300
2.896
3.355
5.041
9
.281
.703
1.383
1.833
2 262
2.821
3.250
4 781
10
.280
700
1.372
1.812
2.228
2 764
3.169
4.587
U
.280
.897
1.363
1.790
2 201
2.718
3.100
4 437
12
.259
.895
1.356
1.782
2.179
2.081
3.055
4.318
13
.259
.094
1.350
1.771
2.160
2.150
3.012
4.221
M
.258
.092
1.345
1,701
2.145
2.>24
2.977
4.140
IS
.258
891
1.341
1.753
2.131
2.002
2.947
4.073
16
.258
.890
1.337
1.740
2.120
2.583
2.921
4.015
17
.257
889
1.333
1.740
2.110
2.567
2.898
3 968
18
.257
.688
1.330
1.734
2.101
2.552
2.878
3.922
19
.257
688
1.328
1.729
2.093
2 539
2.801
3.883
20
.257
.887
1.325
1.725
2.080
2.528
2.845
3.850
21
.257
.888
1.323
1.721
2.080
2.518
2.831
3.819
22
.258
888
1.321
1.717
2.074
2.508
2.819
3.792
23
.258
885
1.319
1.714
2.069
2.500
2.807
3.767
24
.258
885
1.318
1.711
2.064
2.492
2.797
3.745
25
.258
884
1.316
1.708
2.060
2 485
2.787
3 725
28
.258
.884
1.31S
1.706
2.050
2.479
2.779
3.707
27
.256
.884
1.314
1.703
2.052
2.473
2.771
3.690
28
.258
.883
1.313
1.701
2 048
2.407
2 703
3.674
29
.256
.883
1.311
1 699
2.045
2.402
2.750
3.659
30
.250
.883
1.310
1.697
2.042
2.457
2.750
3.640
40
.255
881
1.303
1.684
2.021
2.423
2.704
3.551
60
.254
.879
1.290
1.671
2.000
2.390
2.660
3.460
120
.254
.077
1.289
1.658
1.980
2.358
2.617
3.373
∞
.253
.874
1 282
1.645
1.960
2.326
2.576
3.291
SOURCE: CRC Handbook of Tables for Probability and Statistics. 1966.
W. H. Beyer, Editor. Published by the Chemical Rubber Company. Cleveland,
Ohio.
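These t percentiles can likewise be reproduced with statistical software; the sketch below assumes SciPy, with the quantile F = 1 - α and the degrees of freedom chosen only as examples.

```python
# Sketch: percentiles of Student's t-distribution (assumes SciPy).
from scipy.stats import t

print(round(t.ppf(0.95, 10), 3))    # 1.812
print(round(t.ppf(0.975, 20), 3))   # 2.086
```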
B-9

-------
TABLE 7. VALUES OF THE PARAMETER λ FOR COHEN'S ESTIMATES
ADJUSTING FOR NONDETECTED VALUES

.00
.11
.oft
.:o
.is
.31
.30
.31
.40
.40
.M
.U
.40
.•ft
.70
,T|
.010100
.010111
.0109ftO
.011310
.011041
.011102
.013243
.012120
.012714
.013030
.01227*
.013113
.013730
.013900
.014171
•014371
.014079
.014773
.0149*7
.01ft104
02UO4
.022002
.022790
.023499
.024070
024*00
821211
023730
.020243
.020720
.3271*9
.027949
¦020007
020113
029127
.029230
.029723
.030107
.030902
.032220
.033390
.034400
.030443
.030377
.037349
.03007?
.030000
.030034
.040322
.041004
.041733
.042301
043012
044200
.044940
040420
.041303
.043230
.044902
.040310
.047929
.049000
.030010
.001120
.001173
.003103
.004113
.000009
.030990
.000074
.007720
.000010
.009304
.000103
.000023
.001070
.002007
.004079
.000100
.000300
.000990
.001122
.002900
.004341
,000000
.000021
.000131
.000300
.070439
.071130
.072001
.073043
.074010
.070942
.070000
.077040
.074003
.077909
.070000
.072030
.074373
.070100
.077730
.079332
.091200
.007413
000433
091311
003193
.009034
.092903
.000039
.099210
.002301
.002700
.000000
.000300
.007070
.000017
.090133
.001310
002477
.000011
.000007
.090007
. 10143
10290
.10111
10721
.10020
11121
.11300
11400
.11000
.11027
.011330 .030000 .000040 . 00413 .070471 .004730
,10710
10064
.10007
1U10
•13107
•12321
.00024
.10197
.10134
.10041
.11131
.11400
.:io<7
.*1914
.12110
.12377
.12301
.12000
.13011
.13209
.13402
.13100
.13773
.13012
.14120
.14297
.11090
.11431
.11004
.13140
.12490
.12772
.130*9
.13333
.13191
.13047
.14090
.14321
.14112
.14773
.14007
.11194
.13400
.13099
.11712
.11003
.17342
.17931
.19479
.19001
.19910
.2033*
.10741
.21139
.21317
.21002
.32231
.32171
.22910
.23234
.23110
.33*0*
.2410*
.24412
.24740
.34900
.21022
.22741
.20401
.27031
.27*3*
.20193
.30737
.29200
.30701
.30213
.30721
.31104
.31430
.32041
.32400
.32903
.33307
.33703
.34091
.00
.10
.11
.21
.30
.11
.10
.u
.71
.90
.11
.10
.90
.11700 .14400 .10170 .20033 .34471 1.00
\ a
.33
.30
.31
.40
.40
.30
.11
.00
.09
.70
.99
.90
y
.00
.31102
.4021
.<941
.1001
.7000
.0300
.9000
1.141
1.330
l.JOl
3.170
3.213
.00
.90
.32792
.4130
.1004
.4101
.7212
.1940
.9094
1.104
1.310
1.U1
2.303
3.314
.09
.10
.33002
.4223
.1104
.0234
.7400
.0703
1.017
1.101
1.379
1.000
2.239
3.340
.10
.11
.34400
.4330
.13*0
.0301
.7141
.0000
1.030
1.304
1.400
1.130
2.311
3.370
.11
.20
!ium
.4423
.1403
.9403
.7070
.9013
1.001
1.222
1*410
1.411
2.300
3.401
.20
.21
.30003
.4110
.3000
.0000
.7119
.9100
1.007
1.240
1.439
1.971
2.301
3.430
*31
.30
.30700
.4100
.1004
.0713
.7027
.0300
1,003
1.357
1.407
1.093
2.330
3.404
.30
.33
.37370
.4070
.3009
.1031
.9090
.9437
1.009
1.374
1.479
1.713
2.313
3.491
.33
.40
.30033
.471*
.1701
.0027
.0170
.0970
1.113
1.300
1.404
1.733
2.374
3.130
.40
.41
.30000
.4031
.1000
.7019
.9190
•9700
1.137
1.300
1.3U
1*711
2.300
3.147
.41
« JO
#39270
.4004
.3907
.7139
.0409
.9030
1.141
1.331
I. no
1.770
2.411
3.171
.10
.11
.30*70
.4070
.Mil
.7221
.1017
.0900
1.110
1.337
1.140
1.700
2.443
3.001
.u
.M
.40447
.1040
.•133
.7310
.0020
1*007
1.100
1.311
1.M1
1.000
2.401
3.010
.00
.11
.41000
.1114
.1313
.7413
.0729
1.010
1.103
1.300
1.377
1*014
2.404
3.404
.00
.70
.41100
•1100
.0201
•7902
.0031
1*030
1.190
1.390
1.303
1*041
2.307
3.079
.70
.71
.42000
.1140
.0397
.7190
.0922
1.042
1.307
1.304
1.000
l.OOO
2.Ill
3.701
.71
.10
.43013
.1300
.9441
.7070
.0031
1.003
1*320
1.400
1*024
1.071
2.140
3.730
.00
.*0
.43123
.1370
.•Ul
.7701
.0127
1.044
1.333
1.422
1.139
1.091
2.100
3.714
.00
.10
.43023
.3430
.4 MO
.7944
•9331
1.074
1*344
1.439
1.413
1.900
2.100
3.779
.90
.11
.44U2
•0400
.0400
.7911
•9314
1.009
1.310
1.440
*.•00 '
1.994
2.007
3.901
.90
1.00
.44191
.3140
.0734
.0001
.9400
1.090
1.307
1.401
1.003
1.940
2.030
3.137
1.00




For
lU nil
- o4
r* i.
mo ,n -
0.




SOURCE: Cohen, A. C., Jr. 1961. "Tables for Maximum Likelihood Estimates:
Singly Truncated and Singly Censored Samples." Technometrics.
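The tabled parameter λ is used in Cohen's adjustment for data sets that contain nondetects. The sketch below is illustrative only and assumes NumPy; the data, detection limit, and function name are hypothetical, and λ itself must still be read from this table using h, the fraction of nondetects, and γ = s²/(x̄ - DL)² computed from the detected values.

```python
# Illustrative sketch (assumes NumPy): applying Cohen's adjustment once the
# parameter lambda has been read from Table 7.  'detects' holds only the
# values above the detection limit; 'lam' is the tabled lambda(h, gamma).
import numpy as np

def cohen_adjusted_mean_sd(detects, detection_limit, lam):
    """Adjusted mean and standard deviation for a censored normal sample."""
    xbar_d = np.mean(detects)                 # mean of detected values only
    s2_d = np.var(detects, ddof=1)            # variance of detected values only
    shift = xbar_d - detection_limit
    mean_hat = xbar_d - lam * shift           # adjusted mean estimate
    var_hat = s2_d + lam * shift ** 2         # adjusted variance estimate
    return mean_hat, np.sqrt(var_hat)

# Example with hypothetical numbers (lam would come from Table 7):
print(cohen_adjusted_mean_sd([2.4, 3.1, 2.8, 3.6, 2.9],
                             detection_limit=1.0, lam=0.25))
```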
B-10

-------
TABLE 8. CRITICAL VALUES FOR Tn (ONE-SIDED TEST) WHEN THE
STANDARD DEVIATION IS CALCULATED FROM
THE SAME SAMPLE
Number of      Upper 0.1%     Upper 0.5%     Upper 1%       Upper 2.5%     Upper 5%       Upper 10%
Observations,  Significance   Significance   Significance   Significance   Significance   Significance
n              Level          Level          Level          Level          Level          Level
3
i.m
1.155
1.135
U35
1.153
1.14*
4
1,499
1.494
1.492
1.411
1.463
1.425
5
i.no
1.744
1.749
1.715
1.672
1.402
4
1011
1.973
1.944
1.1*7
IJ22
1.729
7
2-201
2.139
2.097
2.020
1.931
1.121
1
2J5I
2J74
1221
1126
1032
1.909
9
2.492
2jr
2J23
1215
1110
1.977
10
ztot
14*2
14 tO
1290
1174
1036
II
2.703
tsu
2.4*5
1335
12M
10U
12
2.791
1434
UK
1412
12U
1134
13
1*67
urn
1407
1441
1331
1175
IJ
113)
2.735
1*59
2J0T
2J7I
1213
15
U«
U04
1705
1549
2409
1247
I*
3 052
z.«:
1747
:.
-------
TABLE 8 (Continued)
Number of      Upper 0.1%     Upper 0.5%     Upper 1%       Upper 2.5%     Upper 5%       Upper 10%
Observations,  Significance   Significance   Significance   Significance   Significance   Significance
n              Level          Level          Level          Level          Level          Level
51
3.791
J.49I
JJ45
3.13*
2.964
1775
52
3.101
3.300
3JS3
3.143
1971
2.713
53
3.119
3.507
3J6I
3.151
2.971
2.790
M
3.113
JJI*
3J61
3.151
2.9M
1791
55
3.134
iju
3J76
3.166
2.992
1104
5*
3J42
JJJI
3JI3
J.172
3.000
nil
57
3.151
JJ3*
3-391
3.110
1.006
lilt
51
3.I5S
3.W4
3.397
3.116
3.0IJ
1124
59
3.167
1.553
3.405
3.193
3.019
nil
60
3.174
3.MO
3.411
3.199
3.025
1137
61
j.m:
l-H*
3.41S
3.205
3.032
1M2
6:
3.S5-»
3.3*3
3424
>.:i:
3 037
Ituv
u
3.194
1579
3.430
3.211
3 044
2.K34
*4
1.901
UM
1XJ7
3.224
3.049
:.uo
(5
3.910
Mf:
3.441
3.230
3.055
lit*
M
3.917
3J»
3.449
3.235
3.061
1171
*7
3.913
1.605
3.454
3J4I
3.066
2477
61
3.930
1.610
3.460
3.244
3.071
1113
*9
3.9)6
1.617
3.466
3.252
3.07*
un
TO
3.94}
3.61!
3>»71
JJS7
3.012
2.193
71
3.94*
J.C7
3.476
JJI2
3.0«7
1197
71
3.954
3.633
3.4*2
3J»7
3.092
1903
73
3.9*0
1431
1M7
JJTJ
IMS
1901
74
3.9*5
J.641
iAK
3.271
3.102
1912
75
JJ7I
3.641
1AH*
xza
3.107
1917
76
3.»n
1*54
3J02
jstr
1.111
J.9I2
77
3.9J2
3.631
1J07
3.291
1.117
l9r
7*
3.957
1.643
3JII
3J9T
3.121
1911
79
3.992
3.669
3JI6
3J0I
3.I2S
1915
SO
!.99t
3.(73
3.521
3 JOS
1.130
2.940
II
4,oo:
14T7
1325
3-309
J.134
1945
S3
4.007
3.611
3429
1JI5
1.139
1949
13
4 0i:
1*17
3J34
3 J19
3.143
1951
14
4.017
3.691
3J39
3J2J
J.147
1957
•5
4j021
1jM5
1543
jjr
3.151
19*1
It
4o:»
3.699
3J47
JJ1I
1.155
3.94*
17
4031
1.704
3.551
3J35
3.160
2.970
U
4 035
3.701
3J35
3-339
3.163
1973
19
4.039
3.712
3JJ9
3-343
3.167
1977
90
4.044
3.716
3.563
3J47
3.171
1.911
91
4.049
3.7»
3.567
3J50
3.174
:.9M
9:
4.053
3.7a
3.570
3.355
3.179
2.919
93
40JT
3.72S
JJ75
3.355
3.152
2.993
94
40M
3.732
3J79
3J42
3.116
2.996
95
4.0W
3.736
3JU
3J65
3.119
3.00(1
9*
4.069
3.739
3.516
3.3**
3193
3.003
97
4.073
3.744
JJI*
JJ72
3.19*
3.00*
91
4.076
3.747
JMJ
JJ77
3.201
34)11
•9
4.0*0
3.750
3.397
IJtO
3.204
3.014
100

3.754
3.600
3JU
3.207
3.017
(Continued)
B-12

-------
TABLE 8 (Continued)
Number of      Upper 0.1%     Upper 0.5%     Upper 1%       Upper 2.5%     Upper 5%       Upper 10%
Observations,  Significance   Significance   Significance   Significance   Significance   Significance
n              Level          Level          Level          Level          Level          Level
101
4JMH
3.737
3.10}
3JW>
3.210
3.021
102
404 :
3.760
3.401
3.3*0
3-214
3024
10)
4.0*5
3.7*3
3*10
3 J*3
3.217
3427
164
4.0»l
j:m
3.414
3 J*'
3.120
3 030
105
4.10:
3771
3*17
3.400
3.124
3033
10b
4.10?
3.774
3.*»
3.403
3.227
3.037
IIT»
4 1(1*
37T»
3.423
3.40*
3-230
3 040
iO>
4.112
37M
)*:t
3.40*
3.233
3.043
l(N
4.1 II
37*4
3.42*
3.4i:
3.23*
3.04*
LiU
4 lit
3.717
3^3:
3.41J
3.23*
3.04*
III
4.122
3.7*0
J.*3*
3.411
3-24!
3052
u:
4.I2J
J.*3
3*3*
3.4U
3.245
3035
in
4.12*
3.7*4
3.m:
3.424
3J4I
3034
IU
4.UJ
3.7**
3.*43
3.427
3.231
30*1
!!3
4 133
3.102
3*47
3.430
3 2V
3044
III
4.IJ*
3103
3*30
3.433
3.237
30*7
117
4 141
3.W
3*53
3.433
3.23*
3070
IU
4.144
3.»l 1
J.4S4
3.431
3.242
3 07 J
II*
4.14*
J.1I4
3*3*
3.441
3.243
3.073
!20
4IJO
3JI7
3442
3444
3.247
3.071
i:i
4 133
3.11*
J.A45
3.417
3.270
3.0*1
i::
4.134
3ir
3.1*7
3.4 JO
3.274
3CJ3
JM
*IJ*
3.114
V470
3.452
3.274
3.0M
\U
4.1*1
3.127
3.*7J
3.435
}jn
3.0W
*25
4.1*4
3.(31
3.47J
3.437
3.31
3.W2
12ft
4 1**
3.133
3.*77
3.4*#
3.244
30*3
i:*
4.IM
J.»3*
3.MO
3.4*2
J.2M
30*7
i;?
4.173
i.Ut
JM)
3.4*;
3.2t*
3.103

4.I7J
3»40
1.4(4
3.4*7
3J*t
3 102
IX'
4.ITX
3.143
3.MI
3 470
3-2*4
3 104

4.1*0
3M>
3.**0
3473
3-2*4
3.107
; tj
4.113
3.14*
IJM!
3.473
3.2**
MO*
iw
4 115
3.150
It!
34**
3.302
3 112
IJ4
4.1tk
345.1
J.***
3.4*0
3J0*
3 114
1J5
4.1*0
3.»J*
J.'»)
3.4(2
3.30*
3li*
IJfr
4.1*3
Mtt
3.702
3.4S4
3.30*
3 11*

4.1*4
3 MO
3 TIM
3.417
3.311
3.122
!H
4 l*»
3.443
J.707
3.4**
3.313
3 124
;.*«
4.2(»i
}.**«
1 "10
3.4*1
3.315
3 ::t
140
4.203
3U7
3.712
3.4*3
3.311
3 12*
141
4.3)5
316*
3.714
3 4*7
3.32b
3 !.'!
t4:
4.20?
3.171
1.714
3 4**
3.322
3 . i J!
143
4 :o*
3.1*4
3*1*
3 501
3.324
3.135
144
4.212
3*74
3.721
3.303
3j:»
3 Ol
U<
4.:m
J.I*
3.7U
J.SK
3.321
3 !4C
14*
4.::*
3.tll
3.723
3 JO?
3J3I
3.142 '
14?
421*
3 >43
3.727
3.50*
3.3J4
3 144
SOURCE: ASTM Designation E178-75, 1975. "Standard
Recommended Practice for Dealing With Outlying
Observations."
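The statistic to which these critical values apply is the standardized difference between the largest observation and the sample mean. A minimal sketch is given below (assuming NumPy; the data shown are hypothetical); the computed Tn is compared with the critical value read from Table 8 for the chosen sample size and significance level.

```python
# Sketch (assumes NumPy): the one-sided outlier statistic for the largest
# value, Tn = (x_max - xbar) / s, computed from a single sample.
import numpy as np

def t_n_statistic(x):
    x = np.asarray(x, dtype=float)
    return (x.max() - x.mean()) / x.std(ddof=1)

values = [1.2, 1.4, 1.1, 1.3, 1.2, 5.0]    # hypothetical concentrations
print(round(t_n_statistic(values), 3))      # compare with the n = 6 entry of Table 8
```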
B-13

-------
APPENDIX C
GENERAL BIBLIOGRAPHY
C-1

-------
The following list provides the reader with those references directly
mentioned in the text. It also includes, for those readers desiring further
information, references to literature dealing with selected subject matters in
a broader sense. This list is in alphabetical order.
ASTM Designation: E178-75. 1975. "Standard Recommended Practice for Dealing
with Outlying Observations."
ASTM Manual on Presentation of Data and Control Chart Analysis. 1976. ASTM
Special Technical Publication 15D.
Barari, A., and L. S. Hedges. 1985. "Movement of Water in Glacial Till."
Proceedings of the 17th International Congress of the International Association of
Hydrogeologists. pp. 129-134.
Barcelona, M. J., J. P. Gibb, J. A. Helfrich, and E. E. Garske. 1985. "Prac-
tical Guide for Ground-Water Sampling." Report by Illinois State Water Sur-
vey, Department of Energy and Natural Resources for USEPA. EPA/600/2-85/104.
Bartlett, M. S. 1937. "Properties of Sufficiency and Statistical Tests."
Journal of the Royal Statistical Society, Series A. 160:268-282.
Box, G. E. P., and G. M. Jenkins. 1970. Time Series Analysis. Holden-Day, San
Francisco, California.
Brown, K. W., and D. C. Andersen. 1981. "Effects of Organic Solvents on the
Permeability of Clay Soils." EPA 600/2-83-016, Publication No. 83179978, U.S.
EPA, Cincinnati, Ohio.
Cohen, A. C., Jr. 1959. "Simplified Estimators for the Normal Distribution
When Samples Are Singly Censored or Truncated." Technometrics. 1:217-237.
Cohen, A. C., Jr. 1961. "Tables for Maximum Likelihood Estimates: Singly
Truncated and Singly Censored Samples." Technometrics. 3:535-541-
Conover, W. J. 1980. Practical Nonparametric Statistics. Second Edition, John
Wiley and Sons, New York, New York.
CRC Handbook of Tables for Probability and Statistics. 1966. William H. Beyer
(ed.). The Chemical Rubber Company.
Current Index to Statistics: Applications, Methods and Theory. Sponsored by
American Statistical Association and Institute of Mathematical Statistics.
Annual series providing indexing coverage for the broad field of statistics.
David, H. A. 1956. "The Ranking of Variances in Normal Populations." Jour-
nal of the American Statistical Association. Vol. 51, pp. 621-626.
Davis, J. C. 1986. Statistics and Data Analysis in Geology. Second Edition.
John Wiley and Sons, New York, New York.
C-2

-------
Dixon, W. J., and F. J. Massey, Jr. 1983. Introduction to Statistical Analysis,
Fourth Edition. McGraw-Hill, New York, New York.
Gibbons, R. D. 1987. "Statistical Prediction Intervals for the Evaluation of
Ground-Water Quality." Groundwater. Vol. 25, pp. 455-465.
Gilbert, R. 1987. Statistical Methods for Environmental Pollution Monitoring.
Professional Books Series, Van Nostrand Reinhold.
Heath, R. C. 1983. Basic Ground-Water Hydrology. U.S. Geological Survey
Water Supply Paper 2220, 84 p.
Hirsch, R. M., J. R. Slack, and R. A. Smith. 1982. "Techniques of Trend
Analysis for Monthly Water Quality Data." Water Resources Research. Vol. 18,
No. 1, pp. 107-121.
Hockman, K. K., and J. M. Lucas. 1987. "Variability Reduction Through Sub-
vessel CUSUM Control." Journal of Quality Technology. Vol. 19, pp. 113-121.
Hollander, M., and D. A. Wolfe. 1973. Nonparametric Statistical Methods. John
Wiley and Sons, New York, New York.
Huntsberger, D. V., and P. Billingsley. 1981. Elements of Statistical Infer-
ence. Fifth Edition. Allyn and Bacon, Inc., Boston, Massachusetts.
Johnson, N. L., and F. C. Leone. 1977. Statistics and Experimental Design in
Engineering and the Physical Sciences. 2 Vol., Second Edition. John Wiley and
Sons, New York, New York.
Kendall, M. G., and A. Stuart. 1966. The Advanced Theory of Statistics.
3 Vol. Hafner Publishing Company, Inc., New York, New York.
Kendall, M. G., and W. R. Buckland. 1971. A Dictionary of Statistical Terms.
Third Edition. Hafner Publishing Company, Inc., New York, New York.
Kendall, M. G. 1975. Rank Correlation Methods. Charles Griffin, London.
Langley, R. A. 1971. Practical Statistics Simply Explained. Second Edition.
Dover Publications, Inc., New York, New York.
Lehmann, E. L. 1975. Nonparametric Statistical Methods Based on Ranks. Holden-
Day, San Francisco, California.
Lieberman, G. J. 1958. "Tables for One-Sided Statistical Tolerance
Limits." Industrial Quality Control. Vol. XIV, No. 10.
Lilliefors, H. W. 1967. "On the Kolmogorov-Smirnov Test for Normality with
Mean and Variance Unknown." Journal of the American Statistical Association.
64:399-402.
Lindgren, B. W. 1976. Statistical Theory. Third Edition. Macmillan.
C-3

-------
Lucas, J. M. 1982. "Combined Shewhart-CUSUM Quality Control Schemes." Jour-
nal of Quality Technology. Vol. 14, pp. 51-59.
Mann, H. B. 1945. "Non-parametric Tests Against Trend." Econometrica.
Vol. 13, pp. 245-259.
Miller, R. G., Jr. 1981. Simultaneous Statistical Inference. Second Edition.
Springer-Verlag, New York, New York.
Nelson, L. S. 1987. "Upper 10%, 5%, and 1% Points of the Maximum F-
Ratio." Journal of Quality Technology. Vol. 19, p. 165.
Nelson, L. S. 1987. "A Gap Test for Variances." Journal of Quality Technol-
ogy. Vol. 19, pp. 107-109.
Noether, G. E. 1967. Elements of Nonparametric Statistics. Wiley, New York.
Pearson, E. S., and H. O. Hartley. 1976. Biometrika Tables for Statisticians.
Vol. 1, Biometrika Trust, University College, London.
Quade, D. 1966. "On Analysis of Variance for the K-Sample Problem." Annals
of Mathematical Statistics. 37:1747-1748.
Remington, R. D., and M. A. Schork. 1970. Statistics with Applications to the Bio-
logical and Health Sciences. Prentice-Hall, pp. 235-236.
Shapiro, S. S., and M. B. Wilk. 1965. "An Analysis of Variance Test for Nor-
mality (Complete Samples)." Biometrika. Vol. 52, pp. 591-611.
Snedecor, G. W., and W. G. Cochran. 1980. Statistical Methods. Seventh Edi-
tion. The Iowa State University Press, Ames, Iowa.
Starks, T. H. 1988 (Draft). "Evaluation of Control Chart Methodologies for
RCRA Waste Sites." Report by Environmental Research Center, University of
Nevada, Las Vegas, for Exposure Assessment Research Division, Environmental
Monitoring Systems Laboratory-Las Vegas, Nevada. CR814342-01-3.
Steel, R. G. D., and J. H. Torrie. 1980. Principles and Procedures of Statistics,
A Biometrical Approach. Second Edition. McGraw-Hill Book Company, New York,
New York.
Todd, D. K. 1980. Ground Water Hydrology. John Wiley and Sons, New York,
534 p.
Tukey, J. W. 1949. "Comparing Individual Means in the Analysis of Vari-
ance." Biometrics. Vol. 5, pp. 99-114.
C-4

-------
Statistical Software Packages:
BMDP Statistical Software. 1983. 1985 Printing. University of California
Press, Berkeley.
SAS: Statistical Analysis System, SAS Institute, Inc.
SAS® User's Guide: Basics, Version 5 Edition, 1985.
SAS® User's Guide: Statistics, Version 5 Edition, 1985.
SPSS: Statistical Package for the Social Sciences. 1982. McGraw-Hill.
SYSTAT: Statistical Software Package for the PC. Systat, Inc., 1800 Sherman
Avenue, Evanston, Illinois 60201.
C-5

-------