EPA
United States
Environmental Protection
Agency
Office of Acid Deposition,
Environmental Monitoring and
Quality Assurance
Washington DC 20460
EPA/600/4-86/008
April 1985
Research and Development
Eastern Lake Survey
Phase I
Quality Assurance Plan
-------
EPA 600/4-86/008
April 1985
Eastern Lake Survey
Phase I
Quality Assurance Plan
A Contribution to the
National Acid Precipitation Assessment Program
U.S. Environmental Protection Agency
Office of Research and Development
Washington, DC 20460
Environmental Monitoring Systems Laboratory - Las Vegas, NV 89114
Environmental Research Laboratory - Corvallis, OR 97333
-------
Notice
The information in this document has been funded wholly or in part by the U.S. Environmental Protection
Agency (EPA) under Contract Nos. 68-03-3249 and 68-03-3050 to Lockheed Engineering and Management Services
Company, Inc., No. 68-02-3889 to Northrop Services, Inc., and Interagency Agreement No. 40-1441-84 with the
U.S. Department of Energy. It has been subject to the Agency's peer and administrative review, and it has been
approved for publication as an EPA document.
Mention of corporation names, trade names or commercial products does not constitute endorsement or rec-
ommendation for use.
This document has been published previously. As part of the AERP Technical Information Program, this docu-
ment has been repackaged and retitled to clearly identify its relationship to other documents produced for the
Eastern Lake Survey. The document contents and reference number have not changed. Proper citation of this
document remains:
S. K. Drouse, D. C. Hillman, L. W. Creelman, J. F. Potter, and S. J. Simon, National Surface Water Survey, Eastern
Lake Survey (Phase I - Synoptic Chemistry) Quality Assurance Plan. EPA-600/4-86-008, U. S. Environmental Pro-
tection Agency, Las Vegas, NV 1986.
-------
Abstract
The National Surface Water Survey of the National Acid Precipitation Assessment Program is a three-phase
project to evaluate the current water chemistry of lakes and streams, determine the status of fisheries and other
biotic resources, and select regionally representative surface waters for a long-term monitoring program to
study changes in aquatic resources. A synoptic survey of eastern lakes is part of the first phase of the National
Surface Water Survey lake study. This manual delineates the quality assurance plan for the Eastern Lake Survey-
Phase I.
To ensure that procedures are performed consistently and that the quality of the data generated can be deter-
mined, the Quality Assurance Project Plan for the Eastern Lake Survey-Phase I specifies the following mea-
sures:
• Provide detailed, written sampling methodology.
• Simultaneously train all personnel participating in field activities.
• Conduct site visits to each field operations base throughout the sampling period to ensure that all methods
are being performed properly.
• Perform extensive evaluation of analytical laboratories both before selection and throughout their partici-
pation.
• Assess variability introduced at each level of activity in both field and analytical laboratories by processing
audit samples (both synthetic and natural lake samples), duplicates, and blanks along with routine sam-
ples.
• Provide detailed, written analytical methodology.
• Use internal quality control procedures at the field and analytical laboratories to detect potential contami-
nation and to verify established detection limits.
• Enforce holding time requirements.
• Use protocols in the field and in the analytical laboratory to confirm that reported data are correct.
• Enter data into the data base twice, and scan for outlying values to eliminate effects of transcription errors.
• Verify data by means of range checks, internal consistency checks, and quality assurance evaluations.
• Validate verified data by analysis of the reasonableness of data; base the analysis on the values expected
for the particular region or subregion involved.
-------
Contents
Page
Abstract iii
Figures vi
Tables vii
Acknowledgment viii
1.0 INTRODUCTION 1
2.0 PROJECT DESCRIPTION 3
3.0 PROJECT ORGANIZATION 5
4.0 QUALITY ASSURANCE OBJECTIVES FOR PRECISION, ACCURACY, COMPLETENESS,
REPRESENTATIVENESS, AND COMPARABILITY 8
4.1 Precision and Accuracy 8
4.2 Completeness 8
4.3 Representativeness 8
4.4 Comparability 8
5.0 SAMPLING STRATEGY 11
5.1 Selection of Sampling Design 11
5.2 Allocation of Lakes Within Strata 12
5.3 Number of Samples per Lake 12
6.0 FIELD OPERATIONS 14
6.1 Helicopter Crew Activities 14
6.2 Field Station Operations 19
6.3 Training 26
7.0 FIELD MEASUREMENT QUALITY CONTROL CHECKS 29
7.1 Lake Site Measurements 29
7.2 Field Laboratory Measurements 29
8.0 ANALYTICAL PROCEDURES 32
9.0 ANALYTICAL INTERNAL QUALITY CONTROL 33
9.1 Sample Receipt 33
9.2 Sample Analysis 33
9.3 Analytical Laboratory Documentation for Quality Control 33
9.4 Internal Quality Control Within Each Method 33
9.5 Overall Internal Quality Control 38
9.6 Instrumental Detection Limits 39
9.7 Data Reporting 39
9.8 Daily Evaluation of Quality Control Data 40
10.0 PERFORMANCE AND SYSTEM AUDITS 43
10.1 Performance Audit Samples 43
10.2 Quality Assurance System Audits (On-Site Evaluations) 45
11.0 ACCEPTANCE CRITERIA 46
11.1 Audit Sample Results 46
11.2 Duplicate Analysis Results 46
11.3 Corrective Action 47
12.0 DATA MANAGEMENT SYSTEM 48
12.1 Raw Data Set 48
12.2 Verified Data Set 48
12.3 Validated Data Set 48
13.0 DATA EVALUATION AND VERIFICATION 53
13.1 Field Data Review 53
13.2 Analytical Data Review 53
14.0 DATA VALIDATION 57
14.1 Overview 57
14.2 Detection of Outliers 57
14.3 Detection of Systematic Error 58
14.4 Treatment of Outliers and Systematic Differences 61
15.0 REFERENCES 63
APPENDICES
Appendix A - Data Forms for Reporting Analytical Results A-1
Appendix B - Telephone Record Log B-1
Appendix C - Field On-Site Evaluation Questionnaire C-1
Appendix D - Analytical Laboratory On-Site Evaluation Questionnaire D-1
Appendix E - Eastern Lake Survey Preaward Scoring Sheet E-1
Appendix F - Eastern Lake Survey Data Package Completeness Checklist F-1
Appendix G - Eastern Lake Survey Verification Report G-1
-------
Figures
Number Page
2.1 Organizational diagram of the National Surface Water Survey and Approximate
Years When the Field Activities are to be Initiated 4
3.1 National Surface Water Survey Internal Management Structure 6
5.1 Regions and Subregions of the Eastern U.S. Potentially Susceptible to Acidic
Deposition 13
6.1 Flowchart of Helicopter Crew Activities 15
6.2 National Surface Water Survey Form 1 - Lake Data 16
6.3 Field Sample Label 18
6.4 Flow Scheme of Daily Field Station Activities 20
6.5 Aliquot and Audit Sample Labels 21
6.6 National Surface Water Survey Form 2 - Batch QC Field Data 23
6.7 National Surface Water Survey Form 3 - Shipping 27
6.8 Eastern Lake Survey Data Flow Scheme 28
12.1 Data Management for the Eastern Lake Survey 49
14.1 Flowchart for Data Validation Process 59
-------
Tables
Number Page
1.1 Sections in This Report and in the Analytical Methods Manual that Address Quality
Assurance Subjects 2
4.1 Quality Assurance Objectives for Precision, Accuracy, and Detectability 9
5.1 Variables Selected for Lake Classification, Eastern Lake Survey 12
6.1 List of Data Forms and Labels Used in the Field 17
6.2 List of Sample Codes 22
6.3 List of Aliquots, Containers, Preservatives, and Corresponding Sample
Parameters 24
6.4 Split Sample Descriptions 25
8.1 List of Parameters and Corresponding Measurement Methods 32
9.1 List of Maximum Holding Times 34
9.2 Maximum Control Limits for Quality Control Check Samples (QCCS) 35
9.3 Summary of Internal Quality Control Checks for Analysis Methods 36
9.4 List of Data Forms Used by the Analytical Laboratory 37
9.5 Chemical Reanalysis Criteria 39
9.6 Conductance Factors of Ions 40
9.7 List of Decimal Place Reporting Requirements 41
9.8 National Surface Water Survey Lab/Field Data Qualifiers (Tags) 42
10.1 Desired Composition of Synthetic Field and Laboratory Audit Samples for the
Eastern Lake Survey 44
12.1 Data Qualifiers (Flags) for Raw Data Set 50
12.2 Missing Value Codes 52
13.1 Exception-Generating and Data Review Programs 54
14.1 Physical Variables Subject to Validation 58
-------
Acknowledgment
Contributions provided by the following individuals were essential to the completion of this quality assurance
document and are gratefully acknowledged: R. Linthurst, R. Schonbrod (U.S. Environmental Protection Agency);
J. Baker, J. Engels, M. Faber, J. Fountain, Jr., D. Hoff, F. Morris, D. Peck, J. Potter, M. Stepanian (Lockheed Engi-
neering and Management Services Company, Inc.); J. Kramer (McMaster University); J. Eilers (Northrop Serv-
ices, Inc.); D. Landers (State University of New York); D. Brakke (Western Washington University); and the word
processing staff at Computer Sciences Corporation.
-------
Section 1
Introduction
Data published in earlier studies are consistent with
the hypothesis that certain surface waters within the
United States have decreased in pH and/or alkalinity
over time. Acidic deposition has been suggested as a
contributor to such decreases. Also, numerous stud-
ies have led to the conclusion that the effects of
acidic deposition on surface water chemistry are
influenced by variations among lake, stream, and
associated watershed characteristics (U.S. EPA,
1984(a) and (b)).
Attempts have been made to extrapolate local stud-
ies to a regional/national scale to estimate quantita-
tively the risk to aquatic resources (especially fish)
from acidic deposition. These assessments have had
only limited success because of problems associ-
ated with (1) the comparability of the sampling and
analytical methodologies used, (2) the possibility of
biased or non-representative sampling sites, and (3) a
small and incomplete data base.
The National Surface Water Survey (NSWS or "the
program") of the National Acid Precipitation Assess-
ment Program (NAPAP), Task Group E (Aquatic
Effects) is designed to overcome some of these defi-
ciencies. NSWS is a three-phase program to evaluate
the present water chemistry of lakes and streams,
determine the status of fisheries and other biotic
resources, and select regionally representative sur-
face waters for a long-term monitoring program to
study changes in aquatic resources.
A synoptic survey of eastern lakes is part of the first
phase of the NSWS lake study. This manual deline-
ates the quality assurance (QA) plan for the Eastern
Lake Survey-Phase I (ELS-I) or "the project". A
description of the project and its organization is given
in the following sections. The planning, thoughts, and
decisions that produced the objectives and design of
the ELS-I are discussed in detail in U.S. EPA, 1984a
and 1984b.
The QA policy of the Environmental Protection
Agency (EPA) requires every monitoring and measure-
ment project to have a written and approved QA pro-
ject plan (Costle, 1979a and 1979b). This requirement
applies to all environmental monitoring and measure-
ment efforts authorized or supported by EPA through
regulations, grants, contracts, or other formal means.
The QA project plan should specify the policies, orga-
nization, objectives, functional activities, QA, and
quality control (QC) activities designed to achieve the
data quality goals of the project. All project personnel
should be familiar with the policies and objectives
outlined in the QA project plan to assure proper inter-
actions among the field operations, laboratory opera-
tions, and data management.
EPA guidance states that the 16 items shown in Table
1.1 should be addressed in the QA project plan (U.S.
EPA, 1980). Some of these items are extensively
addressed in the methods manual (Hillman et al.,
1986) for this project; therefore, as allowed by the
guidelines, method-specific discussions are not
repeated in this document.
-------
Subject                                                This Report
Title Page
Table of Contents                                      Table of Contents
Project Description                                    2
Project Organization and Responsibility                3
QA Objectives                                          4
Sampling Procedures                                    6
Sample Custody                                         6
Calibration Procedures                                 6
Analytical Procedures                                  8
Data Analysis, Validation, and Reporting               6, 9, 12, 13, 14
Internal QC Checks                                     7, 9
Performance and System Audits                          10
Preventive Maintenance                                 6
Assessment of Precision, Accuracy, and Completeness    4, 11
Corrective Actions                                     9, 11
QA Reports to Management                               9, 10
Methods Manual (sections): 1; 2; 2, 3; 2; 4-13; 3; 3; 2, 3; 3
a Addressing these 16 QA subjects is required by U.S. EPA, 1980.
Table 1.1. Sections in This Report and in the Analytical Methods Manual that Address Quality
Assurance Subjects(a).
-------
Section 2
Project Description
Figure 2.1 shows the structure of NSWS as presently
planned.
The ELS-I portion of NSWS Phase I is a synoptic sur-
vey of approximately 1,800 lakes in the Eastern
United States. The ELS-I has the following specific
objectives:
• Determine the percentage (by number and area)
and location of lakes in regions of the Eastern
United States that are potentially susceptible to
change as a result of acidic deposition and that
have low acid neutralizing capacity (ANC).
• Investigate the relationships among water chem-
istry, regional acidic deposition patterns, land
use, physiographic features, lake morphologic
characteristics, and basin geometry, within and
among regions of the Eastern United States.
• Identify smaller subsets of representative lakes
for more intensive sampling in Phases II (tempo-
ral variability and biological resources) and III
(long-term monitoring).
ELS-I includes a pilot study that has the following pri-
mary objectives:
• Test the statistical sampling design.
• Test all methods that will be used during the sur-
vey.
• Finalize the data quality objectives (DQO's) for
Phase I.
• Finalize QA/QC guidelines for Phase I.
• Train personnel for the field operations.
• Test the data analysis plan.
To ensure that procedures are performed consistently
and that the quality of the data generated can be
determined, the QA project plan for ELS-I specifies
the following measures:
• Provide detailed, written sampling methodology
(Morris et al., 1986).
• Simultaneously train all personnel participating
in field activities.
• Conduct site visits to each field operations base
throughout the sampling period to ensure that all
methods are being performed properly.
• Perform extensive evaluation of analytical labo-
ratories both before selection and throughout
their participation.
• Assess variability introduced at each level of
activity in both field and analytical laboratories
by processing audit samples (both synthetic and
natural lake samples), duplicates, and blanks
along with routine samples. (Field laboratory
refers to the on-site mobile laboratory that per-
forms preliminary analyses; analytical labora-
tory refers to the off-site contract laboratory that
performs the more sophisticated analyses.)
• Provide detailed, written analytical methodology
(Hillman et al., 1986).
• Use internal QC procedures at the analytical lab-
oratory to detect potential contamination and to
verify established detection limits.
• Enforce holding time requirements.
• Use protocols in the field and in the analytical
laboratory to confirm that reported data are cor-
rect.
• Enter data into the data base twice, and scan for
outlying values to eliminate effects of transcrip-
tion errors.
• Verify data by means of range checks, internal
consistency checks, and QA evaluations.
• Validate verified data by analyzing the reason-
ableness of data; base the analysis on the values
expected for the particular region or subregion
involved.
A quality assurance report describing the findings of
the ELS Phase I and the effectiveness of this QA plan
will be issued after the completion of Phase I.
-------
[Figure 2.1: organizational diagram. The National Surface Water Survey (NSWS) comprises the National Lake Survey (NLS) and the National Stream Survey (NSS). The NLS proceeds through a Phase I pilot survey (1984) and synoptic survey of Eastern Lakes (1984) and Western Lakes (1985), Phase II temporal variability (1986) and biological resources (1986) studies, and Phase III long-term monitoring (1987). The NSS proceeds through a Phase I pilot survey (1985) and synoptic survey (1986), Phase II temporal variability (1987) and biological resources (1987) studies, and Phase III long-term monitoring (1987).]
Figure 2.1. Organizational Diagram of the National Surface Water Survey and Approximate Years
When the Field Activities are to be Initiated.
-------
Section 3
Project Organization
Figure 3.1 illustrates the NSWS management struc-
ture. The program director is the EPA official who has
overall responsibility for the program. The responsi-
bilities of the program manager, technical director,
and administrative coordinator are as follows:
• Program Manager. The program manager is the
EPA Headquarters representative for the NSWS
and serves as the liaison between the headquar-
ters staff, the laboratory directors, and NAPAP.
Questions regarding general management and
resources should be forwarded to the program
manager through the technical director.
• Technical Director. The technical director per-
forms his responsibilities at the discretion of the
program manager. The primary role is to see that
the program objectives are satisfied, that the
components of the program are well-integrated,
and that deadlines are met. The technical direc-
tor coordinates and integrates the activities of
the Environmental Research Laboratory (ERL) at
Corvallis, Oregon, the Environmental Monitoring
Systems Laboratory (EMSL) at Las Vegas,
Nevada, and the Oak Ridge National Laboratory
(ORNL) at Oak Ridge, Tennessee. The technical
director also coordinates peer review and
resolves issues of responsibility. His office is the
focal point of general public inquiry and distribu-
tion of report information. The technical director
represents the program manager as necessary
and keeps the program manager informed of
EPA laboratory activities, progress, and perform-
ance.
• Administrative Coordinator. The administrative
coordinator reports directly to the program man-
ager. The primary role of this position is to moni-
tor the budget and personnel needs of the survey
staff. The administrative coordinator also makes
contractual arrangements at the headquarters
level and provides special services as needed.
The roles of the various laboratories are as follows:
ERL Corvallis
In view of the role the Corvallis laboratory plays in the
Agency's acidic deposition research program, and
the major roles it must perform during the survey pro-
gram, it is appropriate that ERL Corvallis becomes a
primary focal point for ELS-I. Its responsibilities for
all phases of NSWS include:
• Developing sampling design
• Selecting sampling sites
• Preparing sampling protocols (jointly with EMSL-
LV)
• Collecting supplemental historical and other
available data on each sample site (aquatic and
terrestrial components)
• Analyzing data (jointly with EMSL-LV)
• Interpreting and mapping data
• Preparing reports (final and progress reports,
with contributions from the other laboratories
relative to their responsibilities)
• Assessing and resolving all science-related
issues other than QA/QC and data management
(jointly with other laboratories as necessary)
• Coordinating survey activities with NAPAP man-
agement staff.
EMSL Las Vegas
The Las Vegas laboratory has particular expertise in
matters relating to QA/QC, logistics, analytical serv-
ices, and sampling protocols. The responsibilities of
this laboratory, for all phases of NSWS, include:
• Developing QA/QC procedures for all compo-
nents of the program, except data management
(a responsibility of ORNL and ERL Corvallis)
• Preparing all sampling protocols (jointly with
ERL Corvallis)
• Preparing an analytical methods manual
• Preparing a field training and operations manual
• Preparing and implementing the QA project plan
• Coordinating logistical support and equipment
needs for all field operations
-------
[Figure 3.1: organization chart showing the program director, the program manager (supported by the NAPAP Acid Deposition Assessment Staff and NAPAP Task Group E), the technical director (with peer review), and the administrative coordinator, together with ERL Corvallis (sampling design, site selection, site description, data validation, data interpretation, reporting), EMSL Las Vegas (field operations and logistics, analytical methods, data verification, and QA/QC including the QA manager), and ERL Corvallis and ORNL (data management).]
Figure 3.1. National Surface Water Survey Internal Management Structure.
-------
• Training sampling personnel
• Distributing all samples to analytical laborato-
ries
• Developing and implementing QA/QC proce-
dures for verifying all field measurements and
analytical laboratory data
• Independently assessing field measurements
and laboratory data quality (bias and variability)
• Assessing and resolving all problems pertaining
to QA/QC, logistics, and analytical services.
ORNL
ORNL has considerable expertise in managing large
data bases, manipulating data, and restructuring
data bases to satisfy data analysis needs. ERL Cor-
vallis oversees the activities of ORNL, which has
NSWS responsibilities for:
• Developing and maintaining a data management
system
• Entering all field, laboratory, and support data
into the data base, and simultaneously assuring
data quality
• Preparing computer-generated summary tables,
statistics, and graphics for reports.
-------
Section 4
Quality Assurance Objectives for
Precision, Accuracy, Completeness,
Representativeness, and Comparability
4.1 Precision and Accuracy
The QA objectives for precision and accuracy of the
parameters being measured are given in Table 4.1.
Precision and accuracy are determined by analyzing
data from QA/QC samples. External QA/QC samples
include the following:
• Field blank - A field blank is a deionized water
sample meeting specifications for ASTM Type 1
reagent water (ASTM, 1984) that is carried to the
lake on the helicopter and processed through the
Van Dorn sampler as though it were a routine
sample. One field blank is collected by each heli-
copter on each operating day. Field blank data
are used to establish the estimated system deci-
sion limit, quantitation limit, or background
value that can be expected for each type of analy-
sis. For data interpretation, a data point above
this value is considered a positive response.
• Field duplicate - A field duplicate is a second
sample collected at the lake site by the same
team immediately after the routine sample is col-
lected. Field duplicate data are used to estimate
the overall within-batch precision for the sam-
pling and analysis process. One field duplicate is
collected per field station per operating day.
• Audit samples - Audit samples are materials with
known characteristics, used to determine the
accuracy of the measurement system. Two types
of audit samples serve as QA checks for the ELS-
I: field audit samples (both natural and synthetic)
help to check the overall field and laboratory per-
formance; laboratory audit samples help to
check the performance of the analytical labora-
tory. Audit samples are discussed in Section
10.0.
Internal field QA/QC samples are used primarily by
the sampling teams and field laboratory staff to
check the accuracy of the measurement system in
the field. Internal field QA/QC samples include the
following: Hydrolab QCCS, field laboratory QCCS (for
pH, DIC, and turbidity), trailer duplicate, and trailer
(calibration) blank. These are described in Sections 6
and 7.
Internal laboratory QA/QC samples include the fol-
lowing: calibration blank, reagent blank, quality con-
trol check sample (QCCS), detection limit QCCS,
matrix spike, and analytical laboratory duplicate.
These are described in Section 9.0.
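The calculations behind these objectives can be illustrated with a short sketch (illustrative Python, not the survey's software; the blank-based limit shown uses a common mean-plus-three-standard-deviations convention, which is an assumption here rather than a formula stated in this plan):

    # Illustrative calculations for the QA objectives in Table 4.1.
    from statistics import mean, stdev

    def percent_rsd(routine, duplicate):
        """Within-batch precision (%RSD) from a routine/duplicate pair."""
        pair_mean = (routine + duplicate) / 2.0
        pair_sd = abs(routine - duplicate) / 2.0 ** 0.5   # SD of two values
        return 100.0 * pair_sd / pair_mean

    def percent_bias(measured, target):
        """Accuracy (% bias) from repeated measurements of an audit sample
        whose target concentration is known."""
        return 100.0 * (mean(measured) - target) / target

    def blank_based_limit(blank_values):
        """Assumed convention only: background value estimated from field
        blanks as mean + 3 standard deviations."""
        return mean(blank_values) + 3.0 * stdev(blank_values)

    # Example with hypothetical calcium results (mg/L):
    print(percent_rsd(2.10, 2.16))                   # about 2 percent RSD
    print(percent_bias([1.02, 0.98, 1.05], 1.00))    # about 1.7 percent bias
    print(blank_based_limit([0.003, 0.005, 0.002, 0.004]))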
4.2 Completeness
The objective for completeness of data (the amount
of valid data obtained from a measurement system
compared to the amount expected to be obtained
under correct normal conditions) is 90 percent or bet-
ter for all parameters. This figure is based on experi-
ence gained during previous studies and is subject to
change as experience is gained during the survey.
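For illustration (a sketch with hypothetical counts, not survey data), the completeness figure is simply the ratio of valid to expected measurements:

    def percent_completeness(valid_count, expected_count):
        """Percentage of valid measurements obtained (objective: 90 percent or better)."""
        return 100.0 * valid_count / expected_count

    print(percent_completeness(1740, 1800))   # about 96.7 for a hypothetical parameter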
4.3 Representativeness
The question of representativeness is an important
one. ELS-I is designed to achieve the objectives out-
lined in Section 2. It is not intended to determine the
chemistry of any given lake in detail. Therefore,
achieving ELS-I objectives does not require that the
only sample taken from a lake be completely repre-
sentative of those waters. In most cases, only one
sample per lake will be taken during Phase I. However,
a determination of whether one sample per lake is
sufficient to achieve the objectives of Phase I can
only be made when estimates of "within lake" and
"among lakes" variances are obtained. Estimates of
these variances will be made using currently availa-
ble data and the statistical sampling design for Phase
I, which will be modified if necessary. (See Section 5.)
In Phases II and III, more intensive studies of individ-
ual lakes will be performed.
4.4 Comparability
Comparability is assured by having a uniform set of
procedures for all sampling teams and a uniform set
of units for reporting the data. Furthermore, the QA
procedures described in succeeding sections allow
for the determination of bias for each sampling team
and analytical laboratory so that their results can be
compared.
-------
Parameter(a)            Units            Expected      Precision: Relative        Accuracy: Maximum
                                         Range(b)      Standard Deviation (RSD)   Absolute Bias (%)
                                                       Upper Limit (%)(c)
Al, total extractable   mg/L             0.005-1.0     10 (0.01)                  10/20
Al, total               mg/L             0.005-1.0     10 (0.01)                  10/20
Acidity (BNC)(d)        µeq/L            10-150        10                         10
Alkalinity (ANC)(e)     µeq/L            -100-1,000    10                         10
Ca                      mg/L             0.5-20        5                          10
Cl-                     mg/L             0.2-10        5                          10
True Color              color units(f)   0-200         ±5(g)                      --
DIC                     mg/L             0.05-15       10                         10
DOC                     mg/L             0.1-50        5 (5)                      10
F-, total               mg/L             0.01-0.2      5                          10
Fe                      mg/L             0.01-5        10                         10
K                       mg/L             0.1-1         5                          10
Mg                      mg/L             0.1-7         5                          10
Mn                      mg/L             0.01-5        10                         10

Required Detection Limits: 0.005; 20 (0.01); 0.005; 20 (0.01); 5; 5; 0.01; 0.01; 0; 0.05; 0.1; 10 (5); 0.005; 0.01; 0.01; 0.01; 0.01

a Dissolved ions and metals are being determined, except where noted.
b Ranges are for lake waters.
c Unless otherwise noted, this is the RSD at concentrations above (about 10 times) instrumental detection limits.
d BNC = base-neutralizing capacity (approximately equal to acidity)
e ANC = acid-neutralizing capacity (approximately equal to alkalinity)
f APHA platinum-cobalt units
g Absolute precision goal in terms of applicable units.
h Blank must be <0.9 µS/cm.
Table 4.1. Quality Assurance Objectives for Precision, Accuracy, and Detectability.
-------
Parameter(a)            Units      Expected       Required Detection   Accuracy: Maximum
                                   Range(b)       Limits               Absolute Bias (%)
Na                      mg/L       0.5-7          0.01                 10
NH4+                    mg/L       0.01-2         0.01                 10
NO3-                    mg/L       0.01-5         0.005                10
pH, field               pH units   3-8            --                   (±0.1)
pH, analytical lab      pH units   3-8            --                   (±0.1)
P, total                mg/L       0.005-0.07     0.002                10/20
SiO2                    mg/L       2-25           0.05                 10
SO42-                   mg/L       1-20           0.05                 10
Specific conductance    µS/cm      5-1,000        (h)                  5
Turbidity               NTU        2-15           2                    10

Precision, Relative Standard Deviation (RSD) Upper Limit (%)(c): 5; 5; 10; ±0.1 pH(g); ±0.05 pH(g); 10 (0.01); 20 (0.01); 5; 5; 1; 10

a Dissolved ions and metals are being determined, except where noted.
b Ranges are for lake waters.
c Unless otherwise noted, this is the RSD at concentrations above (about 10 times) instrumental detection limits.
d BNC = base-neutralizing capacity (approximately equal to acidity)
e ANC = acid-neutralizing capacity (approximately equal to alkalinity)
f APHA platinum-cobalt units
g Absolute precision goal in terms of applicable units.
h Blank must be <0.9 µS/cm.
Table 4.1. Quality Assurance Objectives for Precision, Accuracy, and Detectability (Continued).
-------
Section 5
Sampling Strategy
The sampling strategy for selecting lakes is outlined
here in some detail because the usefulness of the
results obtained in this program depends critically on
the sampling design. A more complete treatment is
given elsewhere (U.S. EPA, 1984c).
The overall strategy is to use a systematic, stratified
design to select the lakes. There are three stratifica-
tion factors: regions, subregions, and alkalinity map
classes (Table 5.1). Each stratum is an alkalinity map
class within a subregion within a region. All three
alkalinity map classes are found within each of the
eleven subregions, so the total number of strata in the
ELS-I is 33. Within strata, lakes are selected system-
atically on the basis of geographic location; ordering
factors (elevation or lake type) were also used in some
strata.
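As a small illustration of how the strata combine (subregion codes are those listed in Table 5.1; the class labels here are generic stand-ins for the three alkalinity map classes):

    # Sketch: one stratum = an alkalinity map class within a subregion within a region.
    subregions = {
        "Northeast":     ["1A", "1B", "1C", "1D", "1E"],
        "Upper Midwest": ["2A", "2B", "2C", "2D"],
        "Southeast":     ["3A", "3B"],
    }
    alkalinity_classes = ["class 1", "class 2", "class 3"]   # the three map classes

    strata = [(region, subregion, alk_class)
              for region, subs in subregions.items()
              for subregion in subs
              for alk_class in alkalinity_classes]
    print(len(strata))   # 33 strata, as stated above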
The division of a heterogeneous population into more
homogeneous subdivisions can result in a gain in pre-
cision over simple random sampling. A systematic
approach was chosen over proportional sampling
since it is easier to implement and provides similar
results.
The stratified systematic sampling design has the fol-
lowing important attributes and advantages:
• It provides an unbiased estimate of the distribu-
tion of population attributes of interest, such as
the number and area of acidic lakes and low-
alkalinity lakes in regions potentially suscepti-
ble to acidic deposition.
• It permits comparisons of variable distributions
among major strata.
• It can be completed and implemented expedi-
tiously.
• It provides a sound basis for the selection of rep-
resentative lakes for Phase II and Phase III stud-
ies.
5.1 Selection of Sampling Design
The rationale for the choice of the particular stratifi-
cation and ordering factors are discussed in Sections
5.1.1 and 5.1.2, respectively.
5.7.1 Stratification Factors
The stratification factors are regions, sub regions,
and alkalinity map classes.
• Regions - It is known a priori that certain regions
of the United States contain an abundance of
low-alkalinity waters and are, therefore, consid-
ered potentially susceptible to acidic deposition.
Only lakes greater than four hectares in area
within those regions were selected for sampling.
Three regions were identified and are listed in
Table 5.1.
• Subregions - Within each region, sufficient data
are available to suggest that because of physiog-
raphy or known water quality information, some
areas of each region include lakes that are more
similar to each other than to lakes in surrounding
areas. To ensure that a representative sample is
drawn from each apparently discontinuous por-
tion of the region, subregion was chosen as the
second stratification factor. Each subregion is
shown in Figure 5.1.
• Alkalinity Map Classes - Within each subregion
there are lakes with differing alkalinities.
Because it is important in this national assess-
ment to refine estimates of the number of lakes
in different alkalinity/ potential susceptibility
classes, the third stratification factor, alkalinity,
was incorporated. This choice ensures that sam-
ples are collected from a wide range of lakes
presently expected to have different chemical
compositions. The three alkalinity classes
selected are given in Table 5.1. Figures 6.3
through 6.5 in U.S. EPA, 1984(c) show the alkalin-
ity patterns for various regions and subregions,
based on currently available data. Alkalinity
classes can be obtained from maps prepared by
staff at ERL Corvallis (Omernik and Powers,
1985; Omernik and Griffith, 1985; Omernik and
Kinney, 1985; Omernik et al., in preparation).
5.1.2 Ordering Factors
The four ordering factors evaluated were elevation,
lake type, watershed size, and lake size. Only the first
two were used for ordering, and only one per stratum.
(See Table 5.1.)
-------
Level of Criterion                      Divisions within Levels or Criteria
Region(a)                               1. Northeast; 2. Upper Midwest; 3. Southeast
Subregion                               See Figure 5.1(b,c,d)
Alkalinity Map Class                    <100; 100 to 200; >200
Elevation (used only in 1A and 1C)      1,200 m (300-m increments); 1,200 m (600-m increments)
Lake Type (used only in 1E and 2A)      Seepage; Non-Seepage; Reservoir
a Regions of interest are defined primarily from alkalinity maps using existing data.
b Northeastern subregions: 1A, Adirondack Mountains; 1B, Pocono and Catskill Mountains; 1C, Central New England; 1D,
Southern New England; 1E, Maine.
c Upper Midwestern subregions: 2A, Northeastern Minnesota; 2B, Upper Peninsula of Michigan; 2C, North-Central Wisconsin;
2D, Upper Great Lakes Area.
d Southeastern subregions: 3A, Southern Blue Ridge Mountains; 3B, Florida.
Table 5.1. Variables Selected for Lake Classification, Eastern Lake Survey.
• Elevation - Studies in the mountainous parts of
the eastern and western United States have dem-
onstrated that water chemistry is correlated with
elevation; some of the lowest-alkalinity waters,
for example, are found at the highest elevations.
Therefore, elevation was selected as an ordering
factor.
• Lake Type - There are two chemically significant
divisions of lake type: seepage (no inlets or out-
lets) and non-seepage systems. The two types
have generally different chemistries and thus
could have different responses to potential
acidic deposition inputs. Reservoirs have been
chosen as a lake-type class to ensure that the
data can be used to address regional concerns
over drinking water supplies.
Sources for determining lake type and elevation are
topographic maps. In general, the 1:250,000 scale
U.S. Geological Survey topographic maps are the
most practical.
5.2 Allocation of Lakes within Strata
Generally, 50 lakes are sampled in each stratum. How-
ever, up to 90 lakes per stratum may be chosen in par-
ticularly heterogeneous areas or in areas highly sus-
ceptible to acidic deposition. In strata represented by
fewer than 50 lakes, all lakes are sampled.
All the lakes in some strata are ordered according to
one of the ordering factors. The selection of these var-
iables depends on anticipated features of the subre-
gion in question. After the appropriate ordering factor
has been applied, the list of lakes is divided into n
intervals, where n is the number of lakes to be
selected. The first sample lake is selected at random
between lakes one and n. Thereafter, every nth lake is
selected. This procedure ensures that each lake of a
stratum in the frame list has an equal probability of
being included.
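A minimal sketch of systematic selection with a random start (illustrative Python, not the survey's implementation; the spacing used here is the frame size divided by the number of lakes to be selected, and the frame is assumed to be already ordered by the chosen ordering factor):

    import random

    def select_lakes(ordered_frame, n_to_select):
        """Systematic selection: random start in the first interval, then equal spacing."""
        if n_to_select >= len(ordered_frame):
            return list(ordered_frame)               # small stratum: sample every lake
        interval = len(ordered_frame) // n_to_select
        start = random.randrange(interval)           # random lake within the first interval
        return [ordered_frame[start + i * interval] for i in range(n_to_select)]

    # Example: select 50 lakes from an ordered stratum frame of 412 hypothetical lake IDs.
    frame = ["lake_%04d" % i for i in range(1, 413)]
    print(len(select_lakes(frame, 50)))              # 50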
5.3 Number of Samples Per Lake
The present design involves taking one sample per
lake. Phase I sampling takes place during the autumn
mixing period, since at that time the lakes should be
well-mixed, and the one sample should be representa-
tive of the lake water chemistry as a whole. The
design is optimized for spatial coverage and inter-
comparison of lakes because within-lake variability is
expected to be less than among-lake variability.
However, an attempt will be made to verify that one
sample per lake is adequate. The within-lake variabil-
ity will be estimated for a number of lakes using previ-
ously obtained data for selected parameters. For the
pilot study, if the within-lake variability is determined
to be large enough to influence the reliability of the
predictor, three samples will be collected from differ-
ent areas of the lake.
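A sketch of how such a comparison might look, assuming several measurements per lake are available for a parameter (illustrative only; this is not the survey's statistical procedure):

    from statistics import mean, pvariance

    def variance_components(measurements_by_lake):
        """measurements_by_lake: dict mapping lake ID to a list of values.
        Returns simple (within-lake, among-lake) variance estimates."""
        within = mean(pvariance(vals) for vals in measurements_by_lake.values()
                      if len(vals) > 1)
        among = pvariance([mean(vals) for vals in measurements_by_lake.values()])
        return within, among

    # Hypothetical ANC values (ueq/L) for three lakes, three measurements each:
    data = {"A": [52.0, 55.0, 53.0], "B": [120.0, 118.0, 123.0], "C": [15.0, 14.0, 16.0]}
    within, among = variance_components(data)
    print(within < among)   # True here: among-lake variability dominates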
-------
[Figure 5.1: map of the eastern United States showing the Northeast, Upper Midwest, and Southeast regions and their numbered subregions.]
Figure 5.1. Regions and Subregions of the Eastern United States Potentially Susceptible to Acidic
Deposition.
-------
Section 6
Field Operations
Field operations are based at field stations under the
supervision of a field base coordinator. Each station
contains a mobile laboratory whose activities are
conducted by a five-person team. Two helicopter
crews operate from each station and make one or
more excursions to lake sites each day. On each
excursion, 5 to 10 lakes are sampled. The total sample
load is restricted to 20 lake samples per station per
day.
In the following two sections, the activities of the heli-
copter crews and field station crews are discussed. A
more detailed description of field operations (includ-
ing safety) is given in Morris et al., 1986. Section 6.3 of
this QA project plan addresses the field personnel
training.
6.1 Helicopter Crew Activities
Each helicopter crew consists of a pilot, an observer,
and a sampler. The observer's responsibility is to
ensure that all measurements and sampling opera-
tions are performed correctly and that data are accu-
rately recorded. The sampler must be qualified to
operate all equipment and follow prescribed proce-
dures. For each excursion, the activities of each heli-
copter crew are divided into (1) activities at the field
station and (2) activities enroute to or at the lake site.
They are described in the following two subsections.
A flow chart of field crew activities is shown in Figure
6.1.
6.7.7 Station Activities
Prior to leaving the field station, helicopter crews per-
form the following tasks:
• Prepare a detailed navigation sheet giving
courses and distances for the excursion as well
as the navigational coordinates of each lake to
be sampled. A flight plan is filed with the field
base coordinator in order to predict when sam-
ples will arrive and for safety considerations.
Preliminary information about a lake, such as
latitude, longitude, name, and status (regular,
alternative, or special), is supplied by ERL Cor-
vallis and local authorities. If possible, the loca-
tion of the deepest part of each lake is predeter-
mined and indicated on a map as the preferred
sampling site.
• Ensure that they have all the required supplies.
• Calibrate the Hydrolab unit, which is used to
obtain pH, conductance, and temperature pro-
files of each lake. The procedures are described
in detail in Section 7 of this QA project plan.
Hydrolab calibration data are recorded on the
Hydrolab calibration form.
Once these steps are performed, the helicopter pro-
ceeds to the lake to take samples.
6.7.2 Lake Site Activities
As a lake is approached, photographs of the area are
taken. Preceding each set of lake photographs, a
photo of a card showing the date and the lake identifi-
cation (ID) number is taken. After the photographs are
taken, the photograph numbers are written on Form 1,
Lake Data, shown in Figure 6.2. (Table 6.1 lists the
data forms and labels used in the field.) The observer
also notes watershed characteristics which can be
determined from the air. Form 1 contains an outline of
the lake, sketched previously from topographic maps.
The sampling location is marked on the sketch with
an x. The pilot then lands at or near the deepest part
of the lake. This point is determined by using a combi-
nation of visual observations from the air and the
depth recorder.
NOTE: Procedures have been developed to handle
problem situations that may develop. These situa-
tions include finding at the coordinates given: (1) no
lake, (2) more than one lake, (3) a stream or flowing
water, (4) a lake too shallow to obtain a debris-free
water sample (0.5 m), (5) an inaccessible lake, (6) a
lake with high conductance (>1500 µS/cm), (7) a
small lake, and (8) a multilobed or dendritic lake for
which the location of the best sampling site is uncer-
tain. These procedures are described in detail in Mor-
ris et al., 1986.
The following operations are performed in the order
given. The pilot maintains position by visual refer-
ence to landmarks or an anchored buoy, as conditions
dictate.
1. Depth. The lake depth at the sampling site is
determined with the electronic depth finder or
calibrated sounding line and the result is
recorded on Form 1.
-------
[Figure 6.1: flowchart of helicopter crew activities. At the field station before each excursion: (1) calibrate the Hydrolab; (2) check the list of equipment and supplies for the day's sampling; (3) load the helicopter; (4) check the list of lakes to be sampled and file a flight plan with the field base coordinator. En route: (1) prepare the site description; (2) take aerial photographs. At the lake site: (1) measure site depth; (2) profile conductance and temperature and determine stratification status; (3) make the Secchi disk transparency determination; (4) prepare a blank (at the first lake only); (5) collect the sample; (6) obtain DIC and pH syringe samples; (7) transfer the remaining sample to a 4-L container; (8) if necessary, obtain a duplicate sample; (9) verify that forms and labels are correctly filled out; (10) depart from the lake site. The same procedure is followed at the next lake. Back at the field station: (1) unload samples; (2) file lake data forms with the field base coordinator; (3) check the Hydrolab calibration and perform required equipment maintenance; (4) plan and prepare for the next day's sampling, then begin the next excursion.]
Figure 6.1. Flowchart of Helicopter Crew Activities.
-------
[Figure 6.2: facsimile of National Surface Water Survey Form 1 (Lake Data). The form provides spaces for the state, lake ID, lake name, LORAN readings, date, sampling time, Hydrolab ID, latitude and longitude, photograph frame IDs and azimuths, disturbances within 100 meters of shore (roads, dwellings, livestock, industry, mines/quarries, logging, other), air temperature, site depth (recorded in feet and converted to meters), Secchi disk disappearance and reappearance depths, lake stratification data (depth, temperature, conductance, and pH, with additional profile entries when the temperature difference exceeds 4°C), a lake diagram with elevation, inlets, and outlets, data qualifiers (instrument unstable; redone, first reading not acceptable; instruments or sampling gear not vertical in the water column; slow stabilization; Hydrolab cable too short; did not meet QCC; other), reasons a lake was not sampled (flowing water, high conductance >1500 µS, non-lake, too shallow, inaccessible, no access permit, urban/industrial, other), field laboratory entries (trailer ID, batch ID, sample ID), and field crew data (crew ID, observer, sampler, signatures). Copies are distributed to ORNL, EMSL-LV, and the field station.]
Figure 6.2. National Surface Water Survey Form 1 - Lake Data.
-------
Data Form
Description
Lake Data
Batch QC Field Data
Shipping
Hydrolab Calibration Form
Field Sample Label
Field Audit Sample Label
Lab Audit Sample Label
Aliquot Label
Table 6.1. List of Data Forms and Labels Used in the Field.
2. Parameter Profile. pH, specific conductance,
and temperature are measured at 1.5 m below
the surface and at 1.5 m above the lake bottom,
using the Hydrolab. If the temperature differ-
ence exceeds 4°C, a third measurement is
obtained at 60 percent of the lake depth. If the
temperature difference between the top and the
60 percent depth does not exceed 4°C, the lake
is classified as weakly stratified. If that tempera-
ture difference exceeds 4°C, the lake is consid-
ered to be strongly stratified. In a strongly strati-
fied lake, a temperature and conductance
profile is obtained at 5-m intervals for lakes
greater than 20 m deep and at 2-m intervals for
lakes less than 20 m deep. The results are
recorded on Form 1. (These rules are summa-
rized in the sketch following step 10.)
3. Secchi Disk Transparency. A Secchi disk
secured on a calibrated line is lowered on the
shady side of the helicopter until the disk disap-
pears from view. The disk is then raised until it
reappears. Both of these depths are recorded on
Form 1. Their average is the Secchi disk trans-
parency. The observer must not wear sun-
glasses.
4. Sample Collection. The 6.2-L Van Dorn sampler
is rinsed with surface water, lowered to 1.5 m
below the lake surface, triggered to collect a
sample, raised to the surface, then set on the
pontoon platform in a vertical position. This is
done to prevent the sample from leaking. It is
imperative that air is not introduced before
steps 5 and 6 are performed.
NOTE: Sample collection is performed on the
upwind side of the helicopter to minimize con-
tamination potential.
5. DIC and pH Syringe Samples. From the Luer-Lok
syringe port on the Van Dorn sampler, a 20-mL
aliquot is withdrawn using a 60-mL syringe
(equipped with valve). The syringe is rinsed with
this aliquot and the rinse is discarded. A 60-mL
aliquot is withdrawn and the syringe is sealed. A
second syringe sample is obtained similarly. A
field sample label, shown in Figure 6.3, is com-
pleted and attached. "Batch ID" and "Sample
ID" are assigned at the field station. After label-
ing, the syringes are placed in a plastic bag and
are stored in the cooler (maintained at 4°C).
6. Sample Transfer. A clean 4-L Cubitainer is rinsed
with three 500-mL portions of sample. The Cubi-
tainer is filled with sample, compressed to
remove all headspace, then capped securely.
The field sample label shown in Figure 6.3 is
attached and completed. After labeling, the
Cubitainer is stored in the cooler (maintained at
4°C).
7. Duplicate Sample Collection. At the first lake
sampled each day, one or both helicopter crews
at each station (as assigned by the field base
coordinator) collect a duplicate sample by
repeating steps 4 through 6. On the label (Figure
6.3), sample type "Duplicate" is checked.
8. Field Blank Sample Collection. One field blank
is prepared on the first excursion of each day. In
place of step 4, the Van Dorn sampler is rinsed
with three 200-mL portions of deionized water,
then is filled with deionized water. Step 6 is per-
formed as for a lake sample. The sample type
"Blank" is checked on the label (Figure 6.3).
9. Upon completion of steps 1 through 8, the
Hydrolab sonde and the Van Dorn sampler are
rinsed with deionized water and are stored
securely in the helicopter. The observer verifies
-------
[Figure 6.3: facsimile of the field sample label, with spaces for lake ID, date sampled, crew ID, time sampled, sample type (routine, duplicate, or blank; check one), batch ID, and sample ID.]
Figure 6.3. Field Sample Label.
-------
that the Lake Data Form is properly completed
and that all containers are correctly labeled.
10. The helicopter proceeds to the next lake, where
the same lake site activities are performed.
When time necessitates, the helicopter returns
to the field station. The helicopter is refueled,
when necessary, at remote airports or by fuel
truck. The Hydrolab calibration is checked at the
end of each day. Any drift is noted on the calibra-
tion form. The manufacturer's instructions for
care and maintenance of the pH meter and elec-
trode are followed. The rechargeable batteries
are charged overnight and the electrodes are
stored in tap water or 3 M KCl.
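The classification rules in steps 2 and 3 can be summarized in a short sketch (illustrative Python using the thresholds stated above; the "not stratified" label below simply covers the case in which no third measurement is required):

    def classify_stratification(delta_t_top_bottom, delta_t_top_60pct=None):
        """Temperature-difference rules from step 2 (all values in degrees C).
        delta_t_top_bottom: difference between 1.5 m below the surface and
        1.5 m above the bottom; delta_t_top_60pct: difference between the
        1.5-m depth and 60 percent of the lake depth."""
        if delta_t_top_bottom <= 4.0:
            return "not stratified"          # no third measurement required
        if delta_t_top_60pct is not None and delta_t_top_60pct <= 4.0:
            return "weakly stratified"
        return "strongly stratified"         # profile at 5-m (>20 m deep) or 2-m intervals

    def secchi_transparency_m(disappear_depth_m, reappear_depth_m):
        """Secchi disk transparency is the average of the two depths (step 3)."""
        return (disappear_depth_m + reappear_depth_m) / 2.0

    print(classify_stratification(3.0))            # not stratified
    print(classify_stratification(6.0, 5.0))       # strongly stratified
    print(secchi_transparency_m(4.2, 3.8))         # 4.0 m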
6.2 Field Station Operations
The field station includes a fully equipped mobile lab-
oratory, staffed by a field lab coordinator, a field lab
supervisor, and three analysts. The field lab coordina-
tor is responsible for the overall operation of the labo-
ratory (sample tracking and logistics, data, forms,
safety, etc.). The field lab supervisor, with the aid of
the analysts, is responsible for field station measure-
ments and sample processing. When possible, the
field lab coordinator also assists with sample proc-
essing. This section describes the field station activi-
ties that are outlined in Figure 6.4. Subsection 6.2.1
describes reagent preparation and 6.2.2 describes
sample processing.
6.2.1 Reagent Preparation
Reagents for aluminum extraction, DIC determina-
tion, and pH determination must be prepared before
helicopter crews return with samples. Detailed proce-
dures for reagent preparations are given in Morris et
al., 1986.
6.2.2 Sample Processing
The following steps describe sample processing
operations. They are performed in the order given.
6.2.2.1 Sample Description and Identification
1. Samples are organized into batches that are
processed together. A batch consists of all sam-
ples (approximately 20 to 30 including routine,
duplicate, blank, split, and audit samples) proc-
essed at a field station on a given day. Each
batch is assigned a unique batch ID number,
which is recorded on the labels of all samples
(and of corresponding aliquots). Each sample is
then randomly assigned a sample ID number as
follows:
Routine Samples. Three sample containers are
filled at each lake, namely two syringes (for DIC
and pH determinations) and a Cubitainer. One
sample ID number is assigned to all three con-
tainers and is recorded on each container label.
Duplicate and Blank Samples. Sample ID num-
bers are assigned in the same manner as for the
routine samples. (Note: There are no syringe
samples for the blanks.)
Field Audit Samples. One 2-L field audit sample
(received each day from a central source) is
included in each day's batch of samples. The
label on the field audit container is shown in Fig-
ure 6.5a. The code (Table 6.2) indicates what kind
of a sample it is and the concentrate lot number.
A field audit sample is assigned a sample ID
number in the same manner as a routine sample.
The ID number is recorded on the label.
Lab Audit Samples. One lab audit sample
(received each day from a central source) is
included in each day's batch. A single lab audit
sample consists of a set of seven aliquots. Each
aliquot has a temporary label like that in Figure
6.5b, giving the aliquot number, audit sample
code, amount of preservative, and shipping
date. An aliquot label (Figure 6.5c) is attached to
each aliquot. The lab audit sample is then
assigned a sample ID and batch ID number in
the same manner as a routine sample. The batch
and sample ID numbers are recorded on each of
the seven aliquot labels. The date and amount of
preservative are also recorded on the label.
After the batch and sample ID numbers are
assigned and recorded on each sample label,
the same information is entered on Form 2,
Batch QC Field Data (Figure 6.6). Also, the lake
ID and the appropriate code for each sample
(from Table 6.2) are entered on Form 2.
Note 1: The sample ID numbers are assigned at
random to all samples in a batch. Furthermore,
sample ID numbers run consecutively from 1 to
the number of samples in the batch. Audit sam-
ples must not always be assigned the same
sample ID number. (A sketch of this randomized
assignment appears after Note 4.)
Note 2: Field audit samples are processed
exactly like routine lake samples (steps 1
through 6). Lab audit samples receive no field
treatment other than labeling and shipping.
Note 3: Seven different aliquots (numbered as in
Table 6.3) are prepared from each field sample
(routine, duplicate, and blank samples). Each
aliquot is assigned the same batch and sample
ID numbers as the sample from which it is pre-
pared. Additional aliquots are taken for split
samples.
Note 4: After the lake and sample ID numbers
and the sample codes are entered on Form 2, the
temporary label on the lab audit sample is
-------
[Figure 6.4: flow scheme. Before sample arrival: (1) prepare reagents for the total extractable Al, DIC, and pH analyses; (2) warm up and calibrate the turbidimeter, carbon analyzer, and pH meter. Following sample arrival: (1) insert required audit samples, assign batch and sample ID numbers, and start the batch form; (2) determine DIC; (3) measure pH; (4) measure turbidity; (5) determine true color; (6) complete the batch and shipping forms; (7) distribute data. The batch then goes to sample shipment.]
Figure 6.4. Flow Scheme of Daily Field Station Activities.
-------
[Figure 6.5: facsimiles of the audit sample and aliquot labels. (a) Field audit sample label: Radian ID number, code, date shipped, date received, and batch ID. (b) Lab audit sample label: aliquot number, code, date shipped, date received, and preservative amount. (c) Aliquot label: aliquot number, batch ID, sample ID, date sampled, preservative amount, and parameters; the aliquot number, preservative, and parameters are preprinted on the seven aliquot labels.]
Figure 6.5. Aliquot and Audit Sample Labels.
-------
Sample Type   Code       Description
Normal        R          Routine lake sample
              D          Duplicate lake sample
              B          Field blank sample
              TD         Field station (trailer) duplicate
Audit         F L 1-1    Code made up of the type of audit sample (F = field audit sample;
                         L = lab audit sample; P = performance evaluation sample), the
                         concentration level (L = low, H = high, N = natural), the
                         concentrate lot number, and the Radian ID number.
Split         E, C, N    Shipping destination of a split sample (N = Norwegian lab,
                         C = Canadian lab, E = EPA Corvallis lab). A split sample is an
                         additional aliquot from a routine, duplicate, or blank sample and
                         has the same ID number as the original sample; however, the
                         aliquot has an additional sample code. For example, if the original
                         sample is assigned the ID number 4, the split sample also receives
                         the ID number 4, as well as the letter E recorded under the Split
                         Code column on Form 2 (or N or C, depending on where the split
                         sample is shipped).
Table 6.2. List of Sample Codes.
removed (seven aliquots) and placed in the lab
audit logbook.
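A minimal sketch of the randomized ID assignment described in Note 1 (illustrative Python with hypothetical sample names, not the survey's data system):

    import random

    def assign_sample_ids(batch_samples):
        """Assign sample IDs 1..n at random within a batch so that audit samples
        do not always receive the same ID."""
        ids = list(range(1, len(batch_samples) + 1))
        random.shuffle(ids)
        return dict(zip(batch_samples, ids))

    batch = ["routine_lake_0412", "routine_lake_0413", "duplicate_0412",
             "field_blank", "field_audit", "lab_audit"]
    print(assign_sample_ids(batch))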
6.2.2.2 DIC Determination. Immediately after assign-
ment of batch and sample ID numbers, one crew
member begins the DIC analyses. DIC is determined
in routine, lake duplicate, and field audit samples.
The routine and lake duplicate samples are contained
in sealed syringes (filled at the lake site). For field
audit samples, a syringe sample is taken from the 2-L
sample prior to analysis. The results are recorded on
Form 2, Batch QC Field Data (Figure 6.6). The meas-
urement procedures are discussed in Section 7.2.1.
Copies of all raw data must be sent to the QA man-
ager when requested.
6.2.2.3 Sample Filtration, Preservation, and Aliquot
Preparation. All manipulations are performed in the
laminar flow hood by gloved personnel. Sample filtra-
tion and preservation are described in Morris et al.,
1986. Seven aliquots are prepared from each sample
as specified in Table 6.3.
Split Samples. Both Norway and Canada have per-
formed extensive acidic deposition monitoring stud-
ies, but each group has used different analytical
methodologies. By analyzing the same samples
using different methodologies and comparing
results, a conclusion can be drawn about the equiva-
lence of Norwegian, Canadian, and ELS-I methods.
There are three types of split samples. Descriptions
of each are given below and are summarized in Table
6.4. When a sample is split, the appropriate split code
is recorded on Form 2, Batch QC Field Data.
NOTE: These splits are prepared for the ELS-I and
may or may not be required for other surveys.
• EPA Corvallis Splits. An EPA Corvallis split con-
sists of one aliquot which is prepared like aliquot
1, except the sample volume is 125 mL and a 125-
mL container is used. Volume permitting, all
samples are split for Corvallis. When volume lim-
itations exist, sample splits for other laborato-
ries take precedence. Corvallis analyzes the
samples by inductively coupled plasma spec-
-------
[Figure 6.6: facsimile of National Surface Water Survey Form 2 (Batch/QC Field Data), with header entries for batch ID, station ID, crew ID, number of samples in the batch, dates sampled and shipped, air-bill number, and field station manager, and columns for sample ID (01 through 30), lake ID, sample code, DIC (mg/L) with QCCS control limits, field station pH with QCCS control limits, turbidity (NTU) with QCCS control limits, true color (APHA units), and split codes (E, C, N). Copies are distributed to ORNL, the field station, and EMSL-LV.]
Figure 6.6. National Surface Water Survey Form 2 - Batch QC Field Data.
-------
Aliquot 1 (250 mL): filtered; preserved to pH <2 with HNO3; parameters: Ca, Mg, K, Na, Mn, Fe.
Aliquot 2 (10 mL): filtered; MIBK-HQ extract; parameter: total extractable Al.
Aliquot 3 (250 mL): filtered; no preservative; parameters: Cl-, F-, SO42-, NO3-, SiO2.
Aliquot 4 (125 mL): filtered; preserved to pH <2 with H2SO4; parameters: DOC, NH4+.
Aliquot 5 (500 mL): unfiltered; no preservative; parameters: pH, BNC, ANC, specific conductance, DIC.
Aliquot 6 (125 mL): unfiltered; preserved to pH <2 with H2SO4; parameter: total P.
Aliquot 7 (125 mL): unfiltered; preserved to pH <2 with HNO3; parameter: total Al.
Note: Aliquots 2, 3, 4, 5, and 6 must be stored at 4°C in the dark.
Table 6.3. List of Aliquots, Containers, Preservatives, and Corresponding Sample Parameters.
-------
Split            Quantity       Number            Description
EPA-Corvallis    125 mL         All samples(a)    Filtered sample acidified with HNO3 to pH <2
Norwegian(b)     500 mL         90                Raw unfiltered sample
                 2 x 500 mL     25                Raw unfiltered sample
Canadian         250 mL(c)      115               Raw unfiltered sample
                 250 mL(c)      115               Raw unfiltered sample acidified with HNO3 to pH <2
                 250 mL(d)      115               Raw unfiltered sample
                 250 mL(d)      115               Raw unfiltered sample
a Except when there is insufficient sample due to other splits.
b Samples to be taken in the Northeast.
c Sent to Ontario Ministry of the Environment, Rexdale, Ontario, Canada.
d Sent to Canada Centre for Inland Waters, Burlington, Ontario, Canada.
Table 6.4. Split Sample Descriptions.
providing checks on the perform-
ance of the analytical laboratories, as well as
additional data that are potentially useful in
understanding lake chemistry.
• Norwegian Splits. A Norwegian split consists of
one or two aliquots which are identical to aliquot
5. A total of 115 samples from the Northeast will
be split to Norway. Ninety of the splits will con-
sist of one 500-mL aliquot, while twenty-five of
the splits will consist of two separate 500-mL ali-
quots (upon receipt, Norway will return one of the
two aliquots to the QA manager).
• Canadian Splits. A Canadian split consists of
four aliquots, designated as C1, C2, C3, and C4.
The aliquots are described below.
C1: Aliquot C1 is identical to aliquot 5 except it
consists of 250 mL.
C2: Aliquot C2 is identical to aliquot 7 except it
consists of 250 mL.
C3: Aliquot C3 is identical to aliquot 5 except it
consists of 250 mL.
C4: Aliquot C4 is identical to aliquot 5 except it
consists of 125 mL.
6.2.2.4 Total Extractable Aluminum. A third crew
member begins this procedure at the same time the
DIC measurements are begun. The aluminum is
extracted into an MIBK layer, which is transferred to a
10-mL centrifuge tube and is capped tightly. An ali-
quot label (Figure 6.5c) is attached. This is aliquot 2 in
Table 6.3. The aliquot is stored at 4°C in the dark until
shipment.
6.2.2.5 pH Determination (Field Station). After DIC
determination, the pH of the sample remaining in the
sealed syringes is determined. (One syringe should
contain about 60 mL of sample and the other about 40
mL.) The QC procedures are discussed in Section
7.2.2. The results are recorded on Form 2. Copies of all
raw data are sent to the QA manager when requested.
NOTE: Two pH measurements are performed during
field operations: (1) in situ at the lake with the Hydro-
lab and (2) at the field station in a sample chamber.
6.2.2.6 Turbidity. A Monitek Model 21 laboratory
nephelometer is used to determine turbidity of rou-
tine, duplicate, and blank samples. The nephelometer
is calibrated directly in Nephelometric Turbidity Units
(NTU). The QC procedures are discussed in Section
7.2.3.
Results are recorded on Form 2. Copies of all raw data
are sent to the QA manager when requested.
6.2.2.7 True Color Determination. After centrifuga-
tion to remove turbidity, the color is determined using
the Hach Model CO-1 Color Test Kit, following the
manufacturer's instructions. Results are recorded on
Form 2. The QC procedures are discussed in Section
7.2.4.
6.2.2.8 Sample Shipment. When a batch is com-
pletely processed and ready for shipment, the sam-
ples are assembled into groups according to their
shipping destination. Split samples are shipped to
ERL Corvallis, to a Canadian lab, or to a Norwegian
lab. All other samples are shipped to an analytical
lab.
Prior to shipping, each aliquot is sealed in a plastic
bag and is placed in a Styrofoam-lined shipping car-
ton, along with 8 to 12 frozen freeze-gel packs to main-
tain the aliquots at 4°C. One set of Form 3 (Shipping
Form, Figure 6.7) is completed for each carton, and
two copies (sealed in a plastic bag) are enclosed in
the carton. The carton is then sealed and shipped by
overnight delivery to its destination.
Upon receiving the shipment, the laboratory checks
the temperature and condition of the containers and
verifies that all the containers listed on the shipping
form are included in the shipment. If there is any dis-
crepancy, the field laboratory coordinator should be
notified immediately.
6.2.2.9 Data Distribution. Copies of all forms (except
labels) are kept at the field station. Copies of Forms 1,
2, and 3 are sent to the locations indicated in Figure
6.8.
One copy of Form 3 is sent to the Sample Manage-
ment Office (SMO) and two copies are sent with the
samples to the analytical lab. Upon receipt by the lab,
the condition of the samples is noted on the two
forms, one of which is forwarded immediately to
SMO.
After the laboratory analysis is completed, the results
are entered into the data base by laboratory person-
nel from a remote terminal, or forms (Appendix A) are
sent to the data base manager for entry into the data
base. In either case, a copy of the results is sent to the
QA manager. Quality assurance of the data transfer
into the data base is discussed in Section 12.0.
6.3 Training
Prior to the pilot study and the Phase I survey, all per-
sonnel must be trained. Safety procedures and regu-
lations, field base operations, and field laboratory
analytical procedures are the chief subjects of
instruction. Training includes classes and realistic
simulations of actual activities to be performed.
Where possible, these simulations include helicopter
sampling from a lake and sample preparation in a
field station. If these opportunities are not available,
sampling and field measurements are made from a
boat, and the field station activities are simulated in a
laboratory.
[Form 3 is reproduced here as a figure. It records the originating field station, the destination laboratory, the batch ID, the air-bill number, the dates sampled, shipped, and received, the aliquots shipped for each sample (including splits), and the sample condition upon laboratory receipt; incomplete shipments are to be reported immediately to the Sample Management Office, P.O. Box 818, Alexandria, VA 22314.]
Figure 6.7. National Surface Water Survey Form 3 - Shipping.
[Figure 6.8 is a flow diagram: the field station keeps one copy of Forms 1, 2, and 3; two copies of Form 3 accompany the samples to the analytical laboratory and one copy goes to the Sample Management Office, which also receives Form 3 from the laboratory; Forms 1 and 2 go to the QA manager and to the data base.]
Figure 6.8. Eastern Lake Survey Data Flow Scheme.
Section 7
Field Measurement Quality Control Checks
Field measurements are those made by the field
crews at the lake sites and by the field laboratory per-
sonnel at the field laboratory. These measurements
are described in Sections 7.1 and 7.2, respectively.
7.1 Lake Site Measurements
The lake site measurements consist of three Hydro-
lab determinations (lake water temperature, pH, and
specific conductance), Secchi disk transparency, air
temperature, and site depth. All measurements are
recorded on Form 1 - Lake Data.
7.1.1 Hydrolab
The Hydrolab instrument is used to measure lake-
water temperature, pH, and specific conductance in
situ.
The QC procedures consist of calibrating the instru-
ment before and after each excursion and monitoring
the drift between calibrations. These procedures are
described in detail in Morris et al., 1986. The following
is a summary of QC procedures for the Hydrolab.
• Lake-water temperature. There is no calibration
control for temperature. The instrument is cali-
brated at the factory and should be accurate to
0.2°C. The accuracy is checked each day against
a National Bureau of Standards (NBS)-traceable
thermometer and if an error of 1°C or more is
found, the manufacturer should be consulted. An
error of this size usually indicates a malfunction
of the instrument.
• pH. Following the daily calibration, a quality con-
trol check sample standard (QCCS) of CO2-
saturated deionized water is measured. The
QCCS has a theoretical pH value of 3.91 at stand-
ard temperature and pressure. The instrument
drift is determined by remeasuring the QCCS
after completion of all sample analyses. If either
QCCS reading deviates from the theoretical pH
by more than ±0.1 pH unit, the data qualifier "N"
is recorded on Form 1. The procedure for Hydro-
lab pH measurement is described in Morris et al.,
1986.
• Specific Conductance. A procedure for the spe-
cific conductance determination is described in
Morris et al., 1986. A KCl calibration standard of
147 µS/cm is used. The error allowed on the cali-
bration standards is 1 percent of the highest
reading on the scale (2,000 µS/cm) used. The
allowed error on the QCCS (CO2-saturated deion-
ized water having a standard specific conduc-
tance value of 50 µS/cm) is ± 20 µS/cm.
7.1.2 Secchi Disk Transparency
The Secchi disk transparency should be taken on the
shady side of the helicopter, and the observer must
not wear sunglasses.
7.1.3 Air Temperature, Site Depth, and Elevation
Air temperature is measured by one of the helicopter
instruments. An electronic depth recorder mounted in
the helicopter is used to locate the sampling site,
which ideally should be the deepest part of the lake.
Once daily the depth recorder is checked against a
calibrated sounding line. The helicopter pilot must
calibrate the altimeter periodically. Site elevation is
checked by helicopter instrumentation to confirm the
map reading; if a discrepancy occurs, it should be
recorded on Form 1.
7.2 Field Laboratory Measurements
The field laboratory measurements consist of DIC,
pH, turbidity, and true color. The data are recorded on
Form 2. The QC procedures are described in detail in
Morris et al., 1986. This section contains a summary
of the QC procedures.
7.2.1 DIC
DIC is measured in routine, duplicate, and field audit
samples using the Dohrman Model DC-80 carbon ana-
lyzer. The measurement procedure follows:
1. Initial calibration is performed using the work-
ing standard (10.00 mg/L C).
2. Two QC standards (2.00 mg/L C and 20.00 mg/L
C) are measured to verify the initial calibration.
3. If both QC standards are within 10 percent of the
theoretical concentration, the values are
entered in the DIC logbook and analysis pro-
ceeds. If the standards are not within 10 percent,
then steps 1 and 2 are repeated.
4. A calibration blank is measured.
5. If the calibration blank is less than 0.1 mg/L C,
the value is recorded and sample analysis con-
tinues. If the calibration blank is 0.1 mg/L C or
greater, then (1) the laboratory supervisor should
be informed, (2) corrective action must be taken,
and (3) steps 1 through 5 are repeated. Normally,
one calibration blank is analyzed per batch.
6. DIC is measured for eight samples.
7. A 2.0 mg/L C QCCS is analyzed to check the cali-
bration.
8. If the QCCS is within 10 percent of theoretical
concentration, the value is recorded in the log-
book and sample analysis proceeds. If the
QCCS is not within 10 percent, it should be
determined whether there is enough left of the
samples associated with the unacceptable
QCCS to reanalyze them. If enough sample is
left, steps 1 through 7 are repeated, including
analysis of all samples since the last acceptable
QCCS. If not enough sample remains, the unac-
ceptable QCCS value is recorded in the DIC log-
book and the sample ID numbers associated
with the unacceptable QCCS are noted. Sample
analysis must not continue until acceptable
QCCS values are obtained.
9. One sample is measured in duplicate per batch.
These duplicates are called trailer duplicates. If
the difference between the two measurements
is greater than 10 percent, another sample is
measured in duplicate. If the difference is still
greater than 10 percent, the laboratory supervi-
sor should be notified, and the problem should
be noted on Form 2 with a data qualifier.
10. When sample analysis is complete, a final QC
check is required, and the relevant QC informa-
tion is recorded on Form 2, Batch QC Field Data.
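The numeric acceptance checks in the procedure above (steps 3, 5, 8, and 9) lend themselves to a simple script. The following Python sketch is illustrative only; the survey prescribes no software, and the function names and example values are not part of the protocol. The trailer-duplicate check here takes 10 percent relative to the larger of the two results, which is one reasonable reading of step 9.

    # Minimal sketch of the DIC batch QC acceptance checks summarized above.
    def within_percent(measured, theoretical, tolerance_pct=10.0):
        """True if the measured value is within tolerance_pct of the theoretical value."""
        return abs(measured - theoretical) <= (tolerance_pct / 100.0) * theoretical

    def dic_batch_checks(qc_low, qc_high, calibration_blank, dup_a, dup_b):
        """Evaluate the QC values recorded for one DIC batch (all values in mg/L C)."""
        return {
            "qc_standards": within_percent(qc_low, 2.00) and within_percent(qc_high, 20.00),
            "calibration_blank": calibration_blank < 0.1,
            "trailer_duplicate": abs(dup_a - dup_b) <= 0.10 * max(dup_a, dup_b),
        }

    # Illustrative batch: both QC standards, the blank, and the duplicate pair pass.
    print(dic_batch_checks(2.05, 19.6, 0.03, 4.10, 4.02))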
7.2.2 pH
pH is determined in routine, duplicate, and field audit
samples using the Orion Model 611 pH meter with
Orion Ross epoxy body combination pH electrode.
The measurement procedure is as follows:
1. The instrument is standardized according to the
manufacturer's instructions.
2. The pH of the pH 4 and the pH 7 buffers is mea-
sured and the results are recorded in the log-
book. If either differs from the certified value by
more than 0.02 pH units, steps 1 and 2 are
repeated. If acceptable results cannot be
obtained, the electrode is replaced and the
above procedure is repeated. (The old elec-
trodes should be shipped to the QA manager at
EMSL-LV where they will be tested further.)
3. When satisfactory results are obtained for the
buffers, the pH of a pH 4.00 QC sample is mea-
sured and the result is recorded in the logbook.
If the reading differs from 4.0 by more than ± 0.1
pH unit, steps 1 and 2 are repeated, and the pH
of a fresh QCCS is measured. If acceptable
results are still not obtained, the laboratory
manager should be consulted. Lake samples are
not to be analyzed until an acceptable value for
the QCCS has been obtained.
4. Samples are measured for pH. After every five
samples, a pH 4.00 QC sample is measured and
the result is recorded in the logbook. If the mea-
sured pH of the QC sample is 4.0 ± 0.1 pH units,
measurement of samples proceeds.
5. If the QCCS is not acceptable, it should be deter-
mined whether there is a sufficient amount of
sample remaining to repeat the analysis. If so,
steps 1 through 3 are repeated and all samples
analyzed since the last acceptable QCCS are
reanalyzed. If not enough sample remains, the
sample ID numbers associated with the unac-
ceptable QCCS are recorded in the logbook.
Steps 1 through 3 are repeated before proceed-
ing with sample analysis.
6. One sample per batch is measured in duplicate.
If the difference between the two measurements
is greater than 0.1 pH unit, another sample is
measured in duplicate. If the difference is still
greater than 0.1 pH unit, the laboratory supervi-
sor should be notified, and the problem should
be noted on Form 2 with a data qualifier.
7. After the last sample in a batch has been ana-
lyzed, a final QCCS is analyzed and the value is
recorded in the logbook.
8. When this analysis is completed, the relevant
QC information is recorded on Form 2.
7.2.3 Turbidity
Turbidity is determined in routine, duplicate, and
blank samples using the Monitek Model 21 laboratory
nephelometer. The measurement procedure is as fol-
lows:
1. The nephelometer, set on range 2, is zeroed and
then is calibrated on range 20 with a 10.0 NTU
standard, following the manufacturer's recom-
mendations.
2. Calibration linearity is verified by analyzing 1.7,
5.0, and 20.0 NTU QC samples. (The 20.0 NTU QC
sample is measured on range 200.) The mea-
sured values must be 1.7 ± 0.3, 5.0 ± 0.5, and
20.0 ± 1.0. If the measured values are unaccept-
able, step 1 is repeated. Accept able results
must be obtained prior to sample analysis.
Acceptable results for the 5.0 NTU QC sample
are recorded on Form 2.
3. After every eight samples, a 5.0 NTU QC sample
is measured. If the measured value is 5.0 ± 0.5
NTU, QC and sample results are recorded on
Form 2.
4. If the QC measurement is unacceptable, the
instrument must be recalibrated and the pre-
vious eight samples must be reanalyzed.
Acceptable QC values are recorded on Form 2
along with associated sample results.
7.2.4 True Color
The only QC check on true color is that one sample
per batch is measured in duplicate. If the two meas-
urements differ by more than 10 percent, another
sample is measured in duplicate. If acceptable
results are still not obtained, the laboratory supervi-
sor must be notified and a data qualifier must be
recorded on Form 2 with the results. Acceptable
results are also recorded on Form 2.
Section 8
Analytical Procedures
Table 8.1 lists the analytical procedures that are used
to determine each parameter. A detailed description
of these procedures is provided in the methods man-
ual (Hillman et al., 1986). Internal QC checks on the
analytical procedures are discussed in the next sec-
tion.
Parameter                           Method(a)
1. Acidity (BNC)                    Titration with Gran plot
2. Alkalinity (ANC)                 Titration with Gran plot
3. Aluminum, total                  EPA Method 202.2 - AAS (furnace)
4. Aluminum, total extractable      Extraction with 8-hydroxyquinoline into MIBK followed by AAS (furnace)
5. Ammonium, dissolved              EPA Method 350.1
6. Calcium, dissolved               EPA Method 215.1 - AAS (flame)
7. Chloride, dissolved              Ion chromatography
8. Fluoride, total                  Ion selective electrode
9. Inorganic carbon, dissolved      Instrumental (similar to DOC)
10. Iron, dissolved                 EPA Method 236.1 - AAS (furnace)
11. Magnesium, dissolved            EPA Method 242.1 - AAS (flame)
12. Manganese, dissolved            EPA Method 243.1 - AAS (flame)
13. Nitrate, dissolved              Ion chromatography
14. Organic carbon, dissolved       EPA Method 415.2
15. pH                              pH electrode and meter
16. Phosphorus, total               USGS Method I-4600-78 or Modified USGS Method
17. Potassium, dissolved            EPA Method 258.1 - AAS (flame)
18. Silica, dissolved               USGS Method I-2700-78
19. Sodium, dissolved               EPA Method 273.1 - AAS (flame)
20. Sulfate, dissolved              Ion chromatography
21. Specific conductance            EPA Method 120.1

a AAS methods are taken from U.S. EPA, 1983. Laboratories that have ICP instrumentation may use EPA Method 200.7,
reproduced in Appendix A of Hillman et al., 1986, for determining Ca, Fe, Mg, and Mn, provided they can demonstrate the
detection limits specified in Table 4.1. If the ICP instrumentation is not able to meet the required detection limits, it may still
be used to analyze samples which contain the analytes at concentrations greater than 10 times the ICP detection limit. Other
samples must be analyzed by furnace or flame AAS.
Table 8.1. List of Parameters and Corresponding Measurement Methods.
Section 9
Analytical Internal Quality Control
9.1 Sample Receipt
All samples received by the analytical laboratory
should be checked in by a receiving clerk who (1)
records the date received on the shipping form, (2)
checks the samples to identify discrepancies with the
shipping form, (3) fills out the "sample condition" por-
tion of the shipping form, and (4) mails a copy of the
completed shipping form to the Sample Management
Office (SMO). The "sample condition" column should
note such information as leakage in shipping, insuffi-
cient sample, noticeable suspended particulates,
partially frozen samples, and the temperature of the
sample containers. If there are any discrepancies, the
field base coordinator must be notified immediately.
These data are kept on a computer file by SMO and
are available to interested parties. The laboratory
retains a copy of the completed shipping form for the
laboratory records. The samples are refrigerated as
soon as possible.
Samples are received already preserved and ready for
analysis. Sample aliquots 2, 3, 4, 5, and 6 must be kept
refrigerated and in the dark while not in use. When an
analysis is to be performed, the analyst should remove
an aliquot from the sample and return the sam-
ple to the refrigerator as soon as possible.
Even after all analyses have been completed and the
results have been checked, samples should remain
stored in a refrigerator at 4°C, in case reanalysis is
necessary.
9.2 Sample Analysis
Procedures given in the methods manual (Hillman et
al., 1986) are to be followed exactly for each parame-
ter whose value must be measured. Table 8.1 is a list
of all required parameter measurements and the
associated methods. Table 4.1 lists the required preci-
sion, expected ranges, and detection limits for each
parameter. All analyses for each parameter must be
performed within the specified holding times given in
Table 9.1.
9.3 Analytical Laboratory Documentation
for Quality Control
The following documents and information must be
updated constantly and must be available to the ana-
lyst and the supervisor involved in the project:
• Laboratory Standard Operational Procedures
(SOP's). Detailed instructions about the labora-
tory and instrument operations.
• Laboratory QA Plan. Clearly defined laboratory
protocol, including personnel responsibilities
and use of QC samples.
• List of In-House Samples. Including projected
dates for completion of analyses; allows analyst
to schedule further analyses.
• Instrument Performance Study Information.
Information on baseline noise, calibration stand-
ard response, precision as a function of concen-
tration, and detection limits; used by analyst and
supervisor to evaluate daily instrument perform-
ance.
• QC Charts. The most recent QC charts with 99
percent and 95 percent control limits for all
QCCS and detection limit QCCS, generated and
updated for each batch. The same QCCS must
be used for all QC charts to ensure the continuity
of the charts. (Note: The purpose of preparing
QCCS charts is to ensure that the actual control
limits do not exceed the limits given in Table 9.2.)
• Data Sheet QC Report. Report by the laboratory
manager reviewing QC results for each parame-
ter and flagging all results outside statistically
established QC limits for reanalysis before data
are submitted to recipients.
9.4 Internal Quality Control Within Each
Method
Internal QC must be an integral part of any measure-
ment procedure to ensure that results are reliable. A
summary of internal QC procedures for each method
is given in Table 9.3. QC procedures for certain meas-
urements (pH, BNC, ANC, and specific conductance)
are detailed in the appropriate method description in
Hillman et al., 1986. Details on internal QC proce-
dures for automated colorimetric analyses (total P,
ammonium, and silica), instrumental carbon analyses
(dissolved organic carbon and dissolved inorganic
carbon), ion chromatography analyses (nitrate,
chloride, fluoride, and sulfate), and atomic absorption
or emission analyses (Ca, Mg, K, Na, Mn, Fe, total Al,
and total extractable Al) are described below.

Holding Time    Parameters
7 Days          pH(c)
14 Days         Total extractable Al
28 Days         Alkalinity (ANC), Acidity (BNC), Specific Conductance, DIC, DOC,
                Total P, NH4+, NO3-(b), Cl-, SO4 2-, F-, SiO2
28 Days(a)      Ca, Mg, K, Na, Mn, Fe, Total Al

a Although the EPA (U.S. EPA, 1983) recommends a maximum holding time of 6 months for these metals, this study requires
that all of the metals be determined within 28 days. This is to ensure that significant changes do not occur and to obtain data
in a timely manner.
b Although the EPA (U.S. EPA, 1983) recommends that nitrate in unpreserved (unacidified) samples be determined within 48
hours of collection, evidence exists (Peden, 1981, and APHA et al., 1985) that nitrate is stable for 2 to 4 weeks if stored in
the dark at 4°C.
c Although the EPA (U.S. EPA, 1983) recommends that pH be measured immediately after sample collection, evidence exists
(McQuaker et al., 1983) that it is stable for up to 15 days if stored at 4°C and sealed from the atmosphere. Seven days is
specified here for added precaution. The pH is also measured in a sealed sample at the field station within 12 hours of
sample collection.
Table 9.1. List of Maximum Holding Times.
1. Initial Calibration. An initial calibration is per-
formed as required for each analytical method.
Next, the linear dynamic range (LDR) is deter-
mined for the initial calibration. The concentra-
tions of the calibration standards must bracket
the expected sample concentrations. (Occa-
sionally the standards suggested by a method
must be adjusted to meet this requirement.) The
low standard should not be greater than 10
times the detection limit. If during the analysis
the concentration of a sample is above the LDR,
two options are available. One option is to dilute
and reanalyze the sample. In this case, the
diluent should have a matrix similar to the sam-
ple matrix with respect to all preservatives (acid
type and concentration) used. Alternatively, two
concentration ranges may be calibrated. Sam-
ples are first analyzed on the lower concentra-
tion range. Each sample whose concentration
exceeds the upper end of the LDR is then reana-
lyzed on the higher concentration range. If this
option is taken, separate QC samples must be
analyzed and reported for each range.
2. QCCS. Immediately after standardization of the
instruments, a QCCS containing the analyte of
interest at a concentration in the mid calibration
range must be analyzed. QCCS may be obtained
commercially or may be prepared by the analyst
from a source which is independent from the cal-
ibration standards. The calibration QC sample
must be analyzed to verify the calibration curve
prior to any sample analysis, after every 10 sam-
ples, and after the last sample. The observed
value for the QC sample must not differ from the
theoretical value by more than the limits given in
Table 9.2. When an unacceptable value for the
calibration QC sample is obtained, the instru-
ment must be recalibrated and all samples that
were analyzed after the last acceptable QC sam-
ple must be reanalyzed. Furthermore, the
observed concentrations for the QC sample
must be plotted on a QC chart and 99 percent
and 95 percent confidence intervals must be
developed. To ensure the continuity of QC
charts, a QCCS sample of the same theoretical
concentration must be used throughout the
plotting process. The 99 percent control limit
must not differ from the theoretical concentra-
tion by more than the limits given in Table 9.2. If
it does, the QA manager must be consulted.
After each day of analysis, QC charts should be
updated, cumulative means should be calcu-
lated, and new warning and control limits (95
percent and 99 percent, respectively) should be
determined. To indicate bias for a given analy-
sis, there must be at least seven successive
points on one side of the theoretical mean. If
bias is indicated, analysis must be stopped and
an explanation must be sought. Current control
charts must be available to the analyst and
available during on-site evaluations.

Parameter                  Maximum Control Limit for QCCS
                           (% Difference from Theoretical Concentration of QCCS)
Al, total extractable      ± 20%
Al, total                  ± 20%
Ca                         ± 5%
Cl-                        ± 5%
DIC                        ± 10%
DOC                        ± 10%
F-, total                  ± 5%
Fe                         ± 10%
K                          ± 5%
Mg                         ± 5%
Mn                         ± 10%
Na                         ± 5%
NH4+                       ± 10%
NO3-                       ± 10%
P, total                   ± 20%
SiO2                       ± 5%
SO4 2-                     ± 5%
Specific Conductance       ± 2%
Table 9.2. Maximum Control Limits for Quality Control Check Samples (QCCS).
3. Detection Limit QCCS. This is a low-level QC
sample containing the analyte of interest at a
concentration two to three times the required
detection limit. This QC sample must be ana-
lyzed once per batch for the parameters total
extractable Al, total Al, dissolved metals (Ca, Fe,
K, Mg, Mn, Na), and total P. The results are
reported on Form 20, Blanks and QCCS Results.
(See Table 9.4.) The purpose of the detection
limit QC sample is to eliminate the necessity of
formally determining the detection limit on a
daily basis. The measured value must be within
20 percent of the theoretical concentration. If it
is not, the problem must be identified and cor-
rected, and an acceptable result must be
obtained prior to sample analysis.
4. Calibration Blank. A calibration blank must be
analyzed once per batch, immediately after the
initial calibration, to check for base line drift.
Rezero if necessary. The calibration blank is
defined as a "0" mg/L standard and contains
only the matrix of the calibration standards. The
observed concentration of the calibration blank
must be less than or equal to twice the required
detection limit. If it is not, the instrument must
be rezeroed and the calibration must be
rechecked.
5. Reagent Blank. A reagent blank must be pre-
pared and analyzed for each batch of samples
for methods which require sample preparation
(dissolved SiO2 and total Al). A reagent blank is
defined as a sample composed of all the
reagents (in the same quantities) used in prepar-
ing a real sample for analysis. It is also carried
through the same digestion/extraction proce-
dure as a real sample. The concentration of the
reagent blank must be less than or equal to
twice the required detection limit. If the concen-
tration exceeds this limit, the source of contami-
nation must be investigated and eliminated. A
new reagent blank must then be prepared and
analyzed for any sample in which the high
reagent blank value contributed significantly
(>10 percent) to the value of the parameter in
question. If a high reagent blank problem cannot
be corrected, then the QA manager must be con-
tacted. Reagent blank results are reported on
Form 20 but are not subtracted from sample
results.

ANC, BNC, and pH:
1. Titrant standardization crosscheck. Control limit: relative difference within 5%. Corrective action(a): restandardize titrants.
2. Electrode calibration (Nernstian response check). Control limit: slope = 1.00 ± 0.05. Corrective action: recalibrate or replace the electrode.
3. pH QCCS (pH 4 and 10) analysis. Control limits: pH 4 = 4.00 ± 0.05; pH 10 = 10.00 ± 0.05. Corrective action: recalibrate the electrode.
4. Blank analysis (salt spike). Control limit: blank < 10 µeq/L. Corrective action: prepare fresh KCl spike solution.
5. Duplicate analysis. Control limits: RSD < 10% (ANC and BNC); ± 0.05 pH units (pH). Corrective action: refine analytical technique; analyze another duplicate.
6. Protolyte comparison. Control limits and corrective action: see Hillman et al., 1986.

Ions (Cl-, total F-, NH4+, NO3-, SO4 2-); metals (total Al, total extractable Al, Ca, Fe, K, Mg, Mn, Na); SiO2, total P, DIC, DOC, and specific conductance:
1a. Initial QCCS analysis (calibration verification). Control limit: the lesser of the 99% CI or the value given in Table 9.2. Corrective action: prepare new standards and recalibrate.
1b. Continuing QCCS analysis (every 10 samples). Control limit: the lesser of the 99% CI or the value given in Table 9.2. Corrective action: recalibrate; reanalyze the associated samples.
2a. Detection limit determination (weekly). Control limit: detection limits given in Table 4.1. Corrective action: optimize instrumentation and technique.
2b. DL QCCS analysis (daily; metals and total P only). Control limit: % recovery = 100 ± 20%. Corrective action: optimize instrumentation and technique.
3. Blank analysis. Control limits: blank < 2 x DL (except specific conductance); blank < 0.9 µS/cm (specific conductance only). Corrective action: determine and eliminate the contamination source; prepare fresh blank solution; reanalyze the associated samples.
4. Duplicate analysis. Control limit: duplicate precision (%RSD) limits given in Table 4.1. Corrective action: investigate and eliminate the source of imprecision; analyze another duplicate.
5. Matrix spike (except extractable Al, DIC, and specific conductance). Control limit: % recovery = 100 ± 15%. Corrective action: analyze 2 additional spikes; if one or both are outside control limits, analyze all samples in that batch by the method of standard additions.
6. Resolution test (IC only). Control limit: resolution > 60%. Corrective action: clean or replace the separator column; recalibrate.

a Corrective action is to be followed when a QC check is outside its control limits.
Table 9.3. Summary of Internal Quality Control Checks for Analysis Methods.

Data Form(a)    Description
11              Summary of Sample Results
13              Alkalinity and Acidity Analyses Results
14(b)           QC Data for Alkalinity and Acidity Analyses
15(b)           Specific Conductance (Measured and Calculated)
16(b)           Anion-Cation Balance Calculations
17              Ion Chromatography Resolution Test
18              Detection Limits
19              Sample Holding Time Summary
20              Blanks and QCCS Results
21              Matrix Spikes Results
22              Duplicates Results
23              Standard Additions Results

a These forms are shown in Appendix A.
b Form not required to be submitted with the data package but recommended for internal QC requirements.
Table 9.4. List of Data Forms Used by the Analytical Laboratory(a).
6. Preliminary Sample Analysis. Approximately
seven samples and a reagent blank must be
analyzed prior to matrix spike and duplicate
analyses to determine approximate endoge-
nous sample concentrations.
7. Matrix Spike Analysis. One matrix spike must be
prepared for each batch. It is prepared by spik-
ing an aliquot of a sample* with a known quan-
tity of analyte prior to analysis. The spike con-
centration must be twice the endogenous level
or ten times the required detection limit, which-
ever is larger. Also, the volume of the spike
added must be negligible (less than or equal to
one percent of the sample aliquot volume). The
spike recovery must be 100 ± 15 percent to be
acceptable. If the recovery is not acceptable for
all parameters, two additional, different sam-
ples must be spiked with the analyte in question,
must then be analyzed, and recoveries must be
calculated. If one or both recoveries are not 100
± 15 percent, the entire batch must be analyzed
by standard additions for the parameter in ques-
tion. The standard addition is performed by ana-
lyzing the sample, the sample plus a spike at
about the endogenous level, and the sample
plus a spike at about twice the endogenous
level. (*QA analysis on a full sample is recommended. If sufficient
sample volume is not available, QA analysis may be performed on a
per-aliquot basis.) The concentration of the matrix spike must not
exceed the linear range of the method.
For this reason, the matrix spike for furnace
analyses, which determine low levels of analyte,
must be chosen judiciously and may be different
than suggested above. The samples may be
diluted or the spike levels may be adjusted so
that the linear range is not exceeded when per-
forming standard additions for furnace AA anal-
yses. A matrix spike is not required for total
extractable Al analyses. The percent recovery of
spikes is calculated as described below:
% spike recovery = [(value of sample plus spike - value of unspiked sample)
                    / value of spike added] x 100
8. Duplicate Sample Analysis. One sample per
batch must be prepared and analyzed in dupli-
cate for each parameter. The relative standard
deviation is plotted on a QC chart and 99 percent
and 95 percent confidence intervals are estab-
lished. Initial control limits are set at the preci-
sion levels given in Table 4.1. The control limits
should not exceed these values. If they do, the
QA manager must be notified immediately. If
duplicate values fall outside the control limits,
an explanation must be sought (such as instru-
ment malfunction, calibration drift, etc.). A sec-
ond, different sample must then be analyzed in
duplicate. No further samples may be analyzed
until duplicate sample results are within the
control limits, unless approval is given from the
QA manager. The percent relative standard devi-
ation (%RSD) is calculated as follows:

% RSD = (standard deviation of the duplicate results / mean of the duplicate results) x 100

After every 10 samples and after the last sample, a QC sample must be
analyzed to continually verify the calibration curve. If the measured value
differs from the theoretical value by more than the limits given in Table 9.2,
the instrument is recalibrated and the previous 10 samples are reanalyzed.
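The spike recovery and duplicate precision calculations in items 7 and 8 above reduce to two short formulas. The following Python sketch is illustrative only (the laboratories are not required to use any particular software), and the numeric values are invented examples; recoveries are judged against 100 ± 15 percent and the %RSD against the precision limits of Table 4.1.

    import statistics

    def percent_spike_recovery(spiked_result, unspiked_result, spike_added):
        """Item 7: (sample-plus-spike value - unspiked value) / value of spike added x 100."""
        return (spiked_result - unspiked_result) / spike_added * 100.0

    def percent_rsd(duplicate_a, duplicate_b):
        """Item 8: percent relative standard deviation of a duplicate pair."""
        pair = [duplicate_a, duplicate_b]
        return statistics.stdev(pair) / statistics.fmean(pair) * 100.0

    # Illustrative values (mg/L).
    recovery = percent_spike_recovery(spiked_result=6.1, unspiked_result=4.0, spike_added=2.0)
    rsd = percent_rsd(4.02, 4.10)
    print(f"spike recovery = {recovery:.1f}%, duplicate %RSD = {rsd:.2f}%")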
9.5 Overall Internal Quality Control
Once the value of each parameter in a sample is deter-
mined, there are several procedures for checking the
correctness of analyses. These procedures are out-
lined in the following subsections.
9.5.1 Anion-Cation Balance
Theoretically, the acid neutralizing capacity (ANC) of
a sample equals the difference, expressed as
microequivalents per liter, between the sum of the
base cations and the sum of the mineral acid anions.
For each sample, the measured ion concentrations
are converted to microequivalents per liter and the
percent ion balance difference between the cation
and anion sums is calculated. Samples whose percent
difference exceeds the limits given in Table 9.5 are
reanalyzed; an unacceptable value indicates either
the presence of unmeasured ions or an analytical
error. When the DOC of the sample is greater than
about 3 mg/L, unmeasured organic anions are most
likely responsible; thus, reanalysis is unnecessary.
The QA manager must be contacted when questions
arise regarding reanalysis.
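The page carrying the full ion balance calculation is only partly legible in this copy, so the sketch below uses the conventional charge-balance form: each measured ion is converted to microequivalents per liter and the cation and anion sums are compared. The equivalent weights, the ion list, and the simple percent-difference expression are assumptions made for the example; the exact ELS-I formula, including its treatment of ANC and protolytes, is the one applied during verification.

    # Conventional charge-balance sketch (assumed form; see the caveat above).
    # ueq/L = (mg/L) / equivalent weight x 1000; equivalent weights are standard values.
    EQUIV_WEIGHT = {
        "Ca": 20.04, "Mg": 12.15, "Na": 22.99, "K": 39.10, "NH4": 18.04,   # cations
        "Cl": 35.45, "SO4": 48.03, "NO3": 62.00, "F": 19.00,               # anions
    }
    CATIONS = {"Ca", "Mg", "Na", "K", "NH4"}

    def ueq_per_liter(ion, mg_per_liter):
        return mg_per_liter / EQUIV_WEIGHT[ion] * 1000.0

    def percent_ion_balance_difference(results_mg_per_liter):
        """Percent difference between summed cations and summed anions (ueq/L)."""
        cations = sum(ueq_per_liter(i, v) for i, v in results_mg_per_liter.items() if i in CATIONS)
        anions = sum(ueq_per_liter(i, v) for i, v in results_mg_per_liter.items() if i not in CATIONS)
        return (cations - anions) / (cations + anions) * 100.0

    # Illustrative sample (mg/L); the result is compared with the criteria in Table 9.5.
    sample = {"Ca": 2.0, "Mg": 0.5, "Na": 1.0, "K": 0.4, "NH4": 0.05,
              "Cl": 1.2, "SO4": 6.0, "NO3": 0.3, "F": 0.05}
    print(round(percent_ion_balance_difference(sample), 1))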
A. Anion-Cation Balance
   Total Ion Strength           Maximum % Ion Balance Difference(a)
   < 50                         60
   >= 50 and < 100              30
   >= 100                       15

B. Specific Conductance
   Measured Specific            Maximum % Specific
   Conductance (µS/cm)          Conductance Difference(a)
   < 5                          50
   >= 5 and < 30                30
   >= 30                        20

a If the absolute value of the percent difference exceeds these values, the sample is reanalyzed. When reanalysis is indicated,
the data for each parameter are examined for possible analytical error. Suspect results are then redetermined and the above
percent differences are recalculated (Peden, 1981). If the percent differences for reanalyzed samples are still unacceptable,
or no suspect data are identified, the QA manager must be contacted for guidance.
Table 9.5. Chemical Reanalysis Criteria.
9.5.2 Conductance Balance
An approximation of the conductance of a sample
can be calculated by adding together the equivalent
conductances for each measured ion at infinite dilu-
tion. The calculated conductances are determined by
multiplying the concentration of each ion by the
appropriate factor given in Table 9.6. The percent con-
ductance difference (%CD) is calculated as follows:
% CD = [(calculated conductance - measured conductance) / measured conductance] x 100
Samples which have %CD's exceeding the limits
listed in Table 9.5 are reanalyzed. As with the percent
ion difference calculation, an unacceptable value for
%CD indicates either the presence of unmeasured
ions or an analytical error in the measurement. For
the surface waters sampled, the ions included in the
%CD calculation are expected to account for 90 to
100 percent of the ions in a sample. However, in con-
trast to the percent ion difference calculation, there is
no term in the %CD calculation to account for proto-
lytes that are not specifically determined. The QA
manager must be contacted when questions arise
regarding reanalysis.
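The conductance balance can be sketched directly from the factors and carbonate expressions of Table 9.6. The Python script below is illustrative only; Kw = 1.0 x 10^-14 at 25°C is a standard value assumed here, and the sample concentrations are invented.

    # Sketch of the conductance balance check using the Table 9.6 factors.
    K1, K2, KW = 4.4463e-7, 4.6881e-11, 1.0e-14   # KW is an assumed standard value

    FACTOR_PER_MG = {"Ca": 2.60, "Mg": 3.82, "Na": 2.13, "K": 1.84, "NH4": 4.13,
                     "Cl": 2.14, "SO4": 1.54, "NO3": 1.15, "HCO3": 0.715, "CO3": 2.82}
    FACTOR_PER_MOLE = {"H": 3.5e5, "OH": 1.92e5}

    def calculated_conductance(ions_mg_per_liter, ph, dic_mg_per_liter):
        """Sum of the ionic contributions to specific conductance (uS/cm at 25 C)."""
        h = 10.0 ** (-ph)                                      # moles/L
        denom = h * h + h * K1 + K1 * K2
        hco3 = 5.080 * dic_mg_per_liter * h * K1 / denom       # mg/L HCO3-
        co3 = 4.996 * dic_mg_per_liter * K1 * K2 / denom       # mg/L CO3 2-
        cond = sum(FACTOR_PER_MG[ion] * mg for ion, mg in ions_mg_per_liter.items())
        cond += FACTOR_PER_MG["HCO3"] * hco3 + FACTOR_PER_MG["CO3"] * co3
        cond += FACTOR_PER_MOLE["H"] * h + FACTOR_PER_MOLE["OH"] * (KW / h)
        return cond

    def percent_conductance_difference(calculated, measured):
        return (calculated - measured) / measured * 100.0

    # Illustrative values only; the result is compared with the limits in Table 9.5.
    ions = {"Ca": 2.0, "Mg": 0.5, "Na": 1.0, "K": 0.4, "NH4": 0.05,
            "Cl": 1.2, "SO4": 6.0, "NO3": 0.3}
    calc = calculated_conductance(ions, ph=6.2, dic_mg_per_liter=1.5)
    print(round(percent_conductance_difference(calc, measured=25.0), 1))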
9.6 Instrumental Detection Limits
Instrumental Detection Limits (IDL's) must be deter-
mined and reported weekly for each parameter
except pH, specific conductance, ANC, and BNC. For
this study, the detection limit is defined as three times
the standard deviation of 10 nonconsecutive repli-
cate reagent or calibration blank analyses. Calibra-
tion blanks are analyzed when a method does not
require a reagent blank. In some analyses, such as
those using ion chromatography and Technicon
AutoAnalyzers, a signal may or may not be obtained
for a blank analysis. If a signal is not obtained for a
blank analysis, the instrumental detection limit is
defined as three times the standard deviation of 10
nonconsecutive replicate analyses of a standard
whose concentration is three to four times the
required detection limit. Detection limits must not
exceed the limits listed in Table 4.1.
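The detection limit definition above is a direct computation. The following sketch, with invented blank readings, is illustrative only:

    import statistics

    def instrumental_detection_limit(replicate_blanks):
        """IDL = 3 x standard deviation of 10 nonconsecutive replicate blank analyses."""
        if len(replicate_blanks) != 10:
            raise ValueError("the definition above calls for 10 replicate analyses")
        return 3.0 * statistics.stdev(replicate_blanks)

    # Illustrative calibration-blank readings (mg/L); the IDL must not exceed Table 4.1.
    blanks = [0.002, 0.001, 0.003, 0.002, 0.000, 0.001, 0.002, 0.004, 0.001, 0.002]
    print(f"IDL = {instrumental_detection_limit(blanks):.4f} mg/L")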
9.7 Data Reporting
Results from each method are recorded on the appro-
priate data forms (Table 9.4). After a sample (aliquots
1 through 7) is completely analyzed, the results are
summarized on the Data Form 11 and are reported to
the number of decimal places listed in Table 9.7.
Results are annotated by the data qualifiers (tags)
listed in Table 9.8, where applicable. After a form is
completed, the analytical laboratory supervisor must
sign it, indicating he has reviewed the data and that
the samples were analyzed exactly as described in
the methods manual (Hillman et al., 1986). All devia-
tions from the manual require the authorization of the
QA manager prior to sample analysis.

Ion       Specific Conductance               Ion       Specific Conductance
          (µS/cm at 25°C) per mg/L                     (µS/cm at 25°C) per mg/L
Ca2+      2.60                               Na+       2.13
Cl-       2.14                               NH4+      4.13
CO3 2-    2.82                               SO4 2-    1.54
H+        3.5 x 10^5 (per mole/L)            NO3-      1.15
HCO3-     0.715                              K+        1.84
Mg2+      3.82                               OH-       1.92 x 10^5 (per mole/L)

[H+] (moles/L) = 10^(-pH), where pH is the initial pH measured before the acidity titration
[OH-] = Kw / [H+]
HCO3- (mg/L) = 5.080 [DIC (mg/L)] [H+] K1 / ([H+]^2 + [H+] K1 + K1 K2)
CO3 2- (mg/L) = 4.996 [DIC (mg/L)] K1 K2 / ([H+]^2 + [H+] K1 + K1 K2)
K1 = 4.4463 x 10^-7     K2 = 4.6881 x 10^-11

a Taken from American Public Health Association et al., 1985, and from Weast, 1972.
Table 9.6. Conductance Factors of Ions(a).
Copies of raw data must be submitted as requested
by the QA manager. All original raw data must be
retained by the lab until notified otherwise by the QA
manager. Raw data include data system printouts,
chromatograms, notebooks, QC charts, standard
preparation data, and all information pertinent to
sample analysis.
9.8 Daily Evaluation of Quality Control
Data
Each laboratory should make a Daily Sample Status
Report by telephone to the QA manager (or his autho-
rized representative) as directed. The objective of
these reports is to keep the QA manager informed of
the status of the internal and external QC checks in
the laboratory in order to identify and solve problems
that may arise. The reports also allow the QA man-
ager to obtain preliminary results for the blanks,
duplicates, and laboratory and field audit samples
that are double-blind to the laboratories. (A discus-
sion of blind and double-blind samples is presented
in Section 10.) Otherwise, these data would not be
available for QA/QC data evaluation until they were
reported by the laboratories, which may be up to 35
days after the samples are received. During the daily
telephone contact, the QA manager (or designee)
completes the Telephone Record Log shown in
Appendix B.
After each day of analysis, QC charts are updated and
new control and warning limits are determined. The
QA chemist then performs a QC audit in which all the
data are reviewed. Any values that lie outside the con-
trol or warning limits are checked to verify that they
are not the result of a transcription error. If bias is indi-
cated (seven successive points on one side of the
cumulative mean), analyses are stopped and an
explanation is sought. Copies of the plots are given to
the analytical laboratory supervisor, the QA chemist,
and each analyst.

Parameter                    Recommended Number of Decimal Places in Reported Results(a)
Al, total extractable        3
Al, total                    3
Acidity (BNC)                1
Alkalinity (ANC)             1
Ca                           2
Cl-                          2
DIC                          2
DOC                          1
F-, total                    3
Fe                           2
K                            2
Mg                           2
Mn                           2
Na                           2
NH4+                         2
NO3-                         3
pH                           2
P, total                     3
SiO2                         2
SO4 2-                       2
Specific Conductance         1

a Report to the number of decimal places in the actual instrument detection limit (IDL). Where applicable, no more than three
significant figures should be reported.
Table 9.7. List of Decimal Place Reporting Requirements.
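The bias rule used in Sections 9.4 and 9.8 (at least seven successive QCCS points on one side of the center line) can be checked mechanically. The Python sketch below is illustrative only; it uses the cumulative mean of the plotted points as the center line, per Section 9.8, and substituting the theoretical QCCS value (Section 9.4) is a one-line change. The example values are invented.

    def bias_indicated(qccs_values, run_length=7):
        """True if at least run_length successive points fall on one side of the center line."""
        center = sum(qccs_values) / len(qccs_values)   # cumulative mean of plotted points
        run, previous_side = 0, 0
        for value in qccs_values:
            side = (value > center) - (value < center)
            if side != 0 and side == previous_side:
                run += 1
            else:
                run = 1 if side != 0 else 0
            previous_side = side
            if run >= run_length:
                return True
        return False

    # Illustrative QCCS results that drift above the cumulative mean.
    print(bias_indicated([4.8, 4.9, 4.7, 5.2, 5.2, 5.3, 5.2, 5.3, 5.4, 5.3]))   # True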
Qualifier
Indicates
A Instrument unstable
B Redone, first reading not acceptable
C Instruments, sampling gear not vertical in water
column
D Slow stabilization
E Hydrolab cable too short
F Result outside QA criteria (with consent of QA
manager)
G Result obtained from method of standard
additions
H Holding time exceeded criteria (Form 19 only)
J Result not available; insufficient sample volume
shipped to laboratory from the field
K Result not available, entire aliquot not shipped
L Not analyzed due to interference
M Result not available, sample lost or destroyed by
lab
N Not required
P Result outside QA criteria, but insufficient volume
for reanalysis
Q Result outside QA criteria
R Result from reanalysis
S Contamination suspected
T Leaking container
U Result not required by procedure; unnecessary
V Anion-cation balance outside criteria due to high
DOC
W % difference (%D) calculation (Form 14) outside
criteria due to high DOC
X Available for miscellaneous comments in the field
only
Y Available for miscellaneous comments in the field
only
Z Available for miscellaneous comments in the field
only
Measurements taken at < 1.5 m
Table 9.8. National Surface Water Survey Lab/Field Data Qualifiers (Tags).
Section 10
Performance and System Audits
10.1 Performance Audit Samples
Field and laboratory audit samples are used as part
of the QA activities of the ELS-I. The audit samples
are shipped to the analytical laboratories from the
field stations as though they were routine lake sam-
ples. Every attempt is made to assure that the analyti-
cal laboratory does not recognize the audit samples
as different from the routine lake samples. As a result,
the audit samples are double blind to the analytical
laboratory. That is, the laboratory neither recognizes
them as audit samples nor knows their compositions.
10.1.1 Field Audit Samples
The purpose of field audit samples is to identify prob-
lems affecting data quality that may occur during
sample processing, shipment, or analysis. These
problems could include sample contamination, sam-
ple degradation, solvent evaporation, and improper or
inaccurate sample analysis. When used in conjunc-
tion with laboratory audit samples, the analysis of
these samples provides data that can be used to dis-
tinguish field station problems from analytical labo-
ratory problems. There are two types of field audit
samples: synthetic field audit samples and natural
field audit samples.
The synthetic field audit samples are prepared at a
central laboratory and are sent to the field laboratory
to be processed through all the filtration and preser-
vation steps and labeled as though they were authen-
tic lake samples. Thus, they are single-blind samples
to the field laboratory (i.e., recognized as audit sam-
ples but of unknown composition) and, concurrently,
double-blind samples to the analytical laboratory.
The desired composition of the synthetic field audit
samples is shown in Table 10.1. Two formulations of
synthetic samples are used for the ELS-I to test the
sampling and analysis systems at both high and low
concentrations of analytes.
Waters collected from Big Moose Lake in the
Adirondack Mountains and from Lake Superior at
Duluth, Minnesota, are utilized as natural audit sam-
ples for the survey. The waters of Big Moose Lake are
low in alkalinity and thus are susceptible to acidic
deposition; the Lake Superior waters represent a buf-
fered system. These natural samples are passed
through a 0.45 µm filter and are maintained at 4°C to
minimize changes in composition. Aliquots (2 L) are
included as part of a batch.
10.1.2 Laboratory Audit Samples
The purpose of these samples is to identify problems
affecting data quality that may occur during the ana-
lytical process. Thus, lab audit samples help verify
the accuracy of analytical procedures and assure
that the laboratory is maintaining the capability to
properly analyze the samples.
The synthetic laboratory audit samples are sent to
the field station from a central laboratory, already
split into seven aliquots. The audit samples are
labeled by the field laboratory personnel, are inclu-
ded in a batch with routine lake samples processed
on the same day, and are shipped to the analytical lab-
oratory for analysis.
The desired composition of the synthetic laboratory
audit samples is given in Table 10.1. Two formulations
of synthetic samples are used to test the analytical
system at both high and low concentrations of analy-
tes for the ELS-I.
10.1.3 Application of Audit Sample Data
Data are obtained from the analyses of the audit sam-
ples for the following purposes:
• To judge the performance of the field stations in
the processing and shipment of samples
• To judge the continued capability of the analyti-
cal laboratories to properly analyze the samples
• To establish a statistically valid estimate of the
overall bias and precision of the analyses
• To establish a statistically valid estimate of the
stability of a typical lake sample when stored at
4°C by evaluating the natural lake sample over
the period of the study.
Acceptance windows are established for the meas-
urement of each parameter in the audit samples. The
size of the windows is based upon the information
available for each analytical method at the time the
study is initiated. If the analytical results for a meas-
urement fall outside the acceptance window, the
EMSL-LV QA staff reviews the data to determine the
cause of the problem and immediately contacts the
analytical laboratory or field station, whichever is
appropriate, to seek corrective action. Data for rou-
tine samples analyzed with the audit samples are
also checked to determine if they were also affected
by the problem. If they were affected, reanalysis of
the samples in question is requested. The establish-
ment of the acceptance windows is described in Sec-
tion 11.

Parameter                           Low Concentration    High Concentration    Units
Acidity (BNC)*                      10-50                100-200               µeq/L
Al (total and total extractable)    0.01-0.10            0.2-1.0               mg/L
Alkalinity (ANC)*                   10-50                100-200               µeq/L
Ca                                  0.1-1                1-10                  mg/L
Cl-                                 0.1-1                1-5                   mg/L
DIC                                 0.1-1.0              3-10                  mg/L
DOC                                 0.1-1.0              10-20                 mg/L
F-, total                           0.01-0.05            0.05-0.2              mg/L
Fe                                  0.02-1.0             1-10                  mg/L
K                                   0.1-1                1-5                   mg/L
Mg                                  0.1-1                1-5                   mg/L
Mn                                  0.02-1.0             1-10                  mg/L
Na                                  0.5-3                5-12                  mg/L
NH4+                                0.01-0.50            0.5-2.0               mg/L
NO3-                                0.01-0.50            0.5-2.0               mg/L
P, total                            0.005-0.03           0.01-0.1              mg/L
pH*                                 4-5                  7-8                   pH units
SiO2                                1-5                  10-20                 mg/L
SO4 2-                              1-5                  10-20                 mg/L
Specific Conductance**              1-50                 100-200               µS/cm

*These parameters are related and affect the analytical results of one another.
**To be determined by the concentrations of the other parameters.
NOTE: Mass balance (anions vs. cations) must be maintained. The nitrogen/phosphorus ratio must be reasonable (10/20).
Table 10.1. Desired Composition of Synthetic Field and Laboratory Audit Samples for the Eastern Lake Survey.
Approximately 150 synthetic audit samples and 60
natural-water audit samples are scheduled to be proc-
essed during the ELS-I. A statistical evaluation of the
audit data should provide a good estimate of the bias
and precision of the analytical methods for each
required measurement. Furthermore, any changes
over time in analytical results for the natural-water
audit samples without corresponding change in the
other audit samples can be attributed to lack of ana-
lyte stability. The findings of a comparative study
between audit sample types will provide a statisti-
cally valid estimate of the true maximum holding
times allowable for each type of analysis.
The audit samples are a key factor in the QA program
for the ELS-I. It is intended that every effort be made
to provide high-quality audit samples.
10.2 Quality Assurance System Audits
(On-Site Evaluations)
The systems audit consists of qualitative evaluation
of field and analytical laboratory facilities, equip-
ment, and operations such as record keeping, data
reporting, and QC procedures.
10.2.1 Field Operations On-Site Evaluation
Each field station and sampling team in the ELS-I can
expect at least one on-site evaluation during the
course of the sampling effort. This is an on-site
inspection to review the sampling procedures, field
station operations, and QA efforts. For each field sta-
tion and the corresponding sampling teams, the on-
site evaluation should be conducted as soon as pos-
sible after the start of operations. The questionnaire
given in Appendix C is used to assist in the evalua-
tion.
The field auditor conducts an in-depth review of all
field operations. This includes but is not limited to (a)
interviewing the field base coordinator and the field
laboratory supervisor, (b) observing the field station
operations, (c) interviewing each sampling team, (d)
accompanying one or more of the sampling teams
during a sample excursion, and (e) writing a summary
report that includes results, observations, and recom-
mendations. If there are any problems, the evaluator
must either attempt to correct them or must bring
them to the attention of the field base coordinator.
10.2.2 Analytical Laboratory On-Site Evaluation
Each analytical laboratory participating in the ELS-I
can expect a minimum of two in-depth on-site evalua-
tions conducted by the EPA QA manager or his autho-
rized representative. The questionnaire in Appendix D
is used to assist in the on-site laboratory evaluation.
The first on-site laboratory evaluation is performed
after the laboratory has successfully analyzed a set
of Pre-Award Performance Evaluation (PE) samples
for the contract-required parameters and before the
actual survey analytical work begins. The PE samples
contain up to the maximum number of analytes for
which determination is required, in the expected con-
centration ranges. The PE sample results are scored
using the ELS-I Pre-Award Score Sheet given in
Appendix E. Grading emphasizes analytical accu-
racy, but a substantial portion of the grade depends
on meeting the QA, internal QC, reporting, and deliv-
erable requirements.
The auditor summarizes all observations in an On-
Site Laboratory Evaluation Report and brings all
problems that occur to the attention of the laboratory
manager for corrective action.
The second on-site laboratory evaluation is con-
ducted after approximately one-third of the ELS-I
analyses have been completed. During the second
on-site evaluation, QA sample (audit, duplicate, and
blank) data and QC data received to date are
reviewed. The laboratory questionnaire is updated, if
necessary, noting all changes since the first on-site
evaluation. An On-Site Laboratory Evaluation Report
is written for this and for each additional on-site labo-
ratory evaluation.
Section 11
Acceptance Criteria
11.1 Audit Sample Results
Acceptance windows for single values from audit
samples are based on previous interlaboratory analy-
ses of the same sample material. The objective of cre-
ating windows is to predict intervals for acceptable
single future values based on a sample mean (X) and
sample standard deviation (s) computed from n previ-
ously observed values. The limits of the windows are
determined by using a t-statistic (t).
V -
t =
V
y -
(n-1)s2
(n-1)2
where:
is a Student's t
Z is the standard normal variate, having a normal
distribution with a mean of 0 and a variance of 1
M is a variable with a chi-square distribution with r
degrees of freedom, and Z and /u are independent
The observed values X1; X2, X3, Xn are indepen-
dent and have a normal distribution (~N) with a popu-
lation mean (/x) and variance(ff 2). A(1 - a)prediction
interval for a single future value y is needed. Let T =
sample mean and s = samplestandarddeviation.lt is
known that
y -v, N (M, a2) and x •
Io^
j, n
Therefore,
y-x-v-N 0, a2
Z=
y - x
and r = n-1
The upper and lower limits of the window can be for-
malized as follows:
x + (t) (s)
T- (t) (S)
upper limit of the window
— = lower limit of the window
Substituting,
The Student's t-value (t) has n-1 degrees of freedom.
The t-value is for a 2-tailed test with a cumulative
probability of 0.975 (i.e., 2.5% probability on either
side).
For predicting future values, wider windows than the
standard 95% confidence interval about the mean are
desirable. As the number of observed values
increases, more variance occurs due to chance alone.
Initially, there may not be sufficient data (n < 10)
available to provide good interval estimates. Arbitrary
criteria may be used until 10 or more values are availa-
ble. The windows should be periodically updated as
more data are accumulated.
Grubbs' test (Grubbs, 1969) is applied to the data
before interval estimation to detect outliers. The out-
liers are excluded from the computation of the win-
dows.
Windows for matrix spike analysis results are compu-
tationally identical to those for audit sample results.
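The window computation above amounts to a prediction interval for a single future value. The following Python sketch is illustrative only (the plan does not prescribe software); it uses scipy solely for the two-tailed Student's t quantile at a cumulative probability of 0.975, the audit results shown are invented, and Grubbs' outlier screening is omitted for brevity.

    import statistics
    from scipy import stats

    def acceptance_window(observed_values):
        """Prediction window for one future value: x-bar +/- t * s * sqrt(1 + 1/n)."""
        n = len(observed_values)
        x_bar = statistics.fmean(observed_values)
        s = statistics.stdev(observed_values)
        t = stats.t.ppf(0.975, df=n - 1)        # 2.5 percent probability in each tail
        half_width = t * s * (1.0 + 1.0 / n) ** 0.5
        return x_bar - half_width, x_bar + half_width

    # Illustrative prior interlaboratory results for one audit-sample parameter (mg/L).
    previous = [2.05, 1.98, 2.10, 2.02, 1.95, 2.07, 2.00, 2.04, 1.99, 2.03]
    low, high = acceptance_window(previous)
    print(f"acceptance window: {low:.2f} to {high:.2f}")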
11.2 Duplicate Analysis Results
Acceptance criteria for the RSD are based on the
upper 95th percentile of observed values of RSD.
Because the RSD is affected by concentration, these
criteria are applied only when the mean of the dupli-
cate analyses exceeds the contract-required detec-
tion limit (CRDL) by a factor of 10.
Arbitrary acceptance criteria may be used until suffic-
ient (at least 10) RSD values have been observed.
The distribution of the RSD values cannot be esti-
mated accurately until sufficient RSD values have
been observed. It is recommended that no outlier test
be applied until the distribution has been estimated.
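A sketch of the duplicate-analysis acceptance check described above follows; the nearest-rank percentile method, the CRDL value, and the data are illustrative assumptions (percentile conventions vary, and the plan does not specify one here).

    import math
    import statistics

    def rsd_acceptance_limit(observed_rsds):
        """Upper 95th percentile (nearest rank) of previously observed %RSD values."""
        if len(observed_rsds) < 10:
            raise ValueError("use arbitrary criteria until at least 10 RSD values exist")
        ranked = sorted(observed_rsds)
        return ranked[math.ceil(0.95 * len(ranked)) - 1]

    def duplicate_acceptable(dup_a, dup_b, crdl, limit_pct):
        mean = (dup_a + dup_b) / 2.0
        if mean <= 10.0 * crdl:
            return True        # criterion is applied only well above the detection limit
        rsd = statistics.stdev([dup_a, dup_b]) / mean * 100.0
        return rsd <= limit_pct

    # Illustrative %RSD history and one duplicate pair (mg/L).
    history = [1.2, 0.8, 2.5, 1.9, 3.1, 0.6, 1.4, 2.2, 1.0, 2.8, 1.7, 0.9]
    limit = rsd_acceptance_limit(history)
    print(limit, duplicate_acceptable(4.02, 4.10, crdl=0.01, limit_pct=limit))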
11.3 Corrective Action
Laboratories which fail to meet the acceptance crite-
ria for analysis of audit samples, matrix spikes, or
duplicates are required to repeat the analysis that
produced the erroneous results. If results from the
second analysis are still unacceptable, further cor-
rective action must be initiated.
Section 12
Data Management System
The purpose of the data base management system is
to assemble and store data generated as part of the
ELS-I, to provide basic reports of the survey results, to
perform simple statistical analyses, and to provide
data security. A detailed description of the system is
given in the Data Management Proposal (ORNL,
1984). The relationship of data base management to
the overall ELS-I is shown in Figure 12.1.
All data sets are protected from unauthorized or acci-
dental access by individual, system, and file
password protection.
The data are stored in three major data sets, namely
(1) a raw data set, (2) a verified data set, and (3) a vali-
dated data set. These three data sets make up the
ELS data base.
12.1 Raw Data Set
At Oak Ridge National Laboratory (ORNL), the field and
analytical laboratory data (analytical results and data
qualifiers; see Table 9.8) reported on Data Forms 1, 2, 11,
13, 18, 19, 20, 21, 22, and 23 are directly entered into the
raw data base using the Statistical Analysis System
(SAS). A copy of the data package consisting of these
forms is also sent to the EPA EMSL-LV QA staff for con-
current data analysis. Data receipt is acknowledged, and
field stations and the analytical laboratory verify that all
forms are received by the data base management person-
nel.
The SAS full-screen editor procedure is used to provide
gross error checking as data are entered. All data are
entered into two separate data sets by two different oper-
ators. For the ELS-I data base, a custom program (COM-
PARE) has been developed in SAS to compare the two
data sets and identify any inconsistencies. The
advantage of this double entry and comparison process
is that entry errors are removed from the system.
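The double-entry comparison itself is performed with the custom SAS program (COMPARE); the Python sketch below merely illustrates the idea of flagging every field on which the two independently keyed records disagree. The record layout, field names, and values are invented for the example.

    def compare_double_entry(entry_a, entry_b, key_field="SAMPLE_ID"):
        """List every field whose two independently keyed values disagree."""
        discrepancies = []
        keyed_b = {record[key_field]: record for record in entry_b}
        for record_a in entry_a:
            record_b = keyed_b.get(record_a[key_field])
            if record_b is None:
                discrepancies.append((record_a[key_field], "<missing record>", None, None))
                continue
            for field, value_a in record_a.items():
                if record_b.get(field) != value_a:
                    discrepancies.append((record_a[key_field], field, value_a, record_b.get(field)))
        return discrepancies

    # Two operators key the same Form 2 line; one of them mistyped the pH value.
    first = [{"SAMPLE_ID": "801001", "DIC": "1.92", "PH": "6.41"}]
    second = [{"SAMPLE_ID": "801001", "DIC": "1.92", "PH": "6.14"}]
    print(compare_double_entry(first, second))   # [('801001', 'PH', '6.41', '6.14')]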
12.2 Verified Data Set
As the field and analytical laboratory data are trans-
mitted and magnetic tapes of the raw data are made
available to the EMSL-LV QA group, all data are evalu-
ated and verified as described in Section 13.0. The
data are processed by "Automated Quality Assur-
ance Review, Interactive Users System" (AQUARIUS),
an online QA system developed by the EMSL-LV QA
staff. Reports generated by AQUARIUS range in sub-
ject from complex protolyte analysis to simple exter-
nal/internal blank checks for QA purposes. AQUAR-
IUS generates "exception tuples" that direct ORNL in
flagging problem data, as deemed necessary by the
EMSL-LV QA staff. Flags that are generated by
AQUARIUS are given in Table 12.1. Exception tuples
are defined as a grouped set of values that specify the
batch, sample, and parameter to be flagged, modi-
fied, or verified. These exception tuples are then sent
to ORNL via magnetic tape and are entered into the
verified data set. The process might have to be
repeated several times until all the data problems are
resolved. The old data values are maintained in the
raw data set as historical backup.
In addition to the standard QA analyses, AQUARIUS
is used to generate various printouts supplied to the
QA manager to point out intralab, interlab, and inter-
field bias, as well as discrepancies in blanks, audits,
or other QA samples. The overall outcome is a verified
data base in which all values either are qualified or
replaced with missing value codes (listed in Table
12.2). The QA personnel coordinate with the field sta-
tions and the analytical laboratories to make all
appropriate corrections in the data.
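A minimal sketch of an exception tuple and its application to the verified data set is given below; the field names and record layout are illustrative assumptions, while the flag codes refer to Table 12.1.

```python
# Hedged sketch of an "exception tuple" as defined above: a grouped set of
# values naming the batch, sample, and parameter to be flagged, modified,
# or verified.  Field names are illustrative; the qualifiers themselves are
# those listed in Table 12.1.
from dataclasses import dataclass
from typing import Optional

@dataclass
class ExceptionTuple:
    batch_id: str
    sample_id: str
    parameter: str
    action: str                  # "flag", "modify", or "verify"
    flag: Optional[str] = None   # e.g. "B2" for a field-blank exceedance
    new_value: Optional[float] = None

def apply_exception(verified, exc):
    """verified: dict keyed by (batch, sample, parameter) -> {"value": ..., "flags": [...]}."""
    rec = verified[(exc.batch_id, exc.sample_id, exc.parameter)]
    if exc.action == "flag":
        rec["flags"].append(exc.flag)
    elif exc.action == "modify":
        rec["value"] = exc.new_value      # old value remains in the raw data set
    elif exc.action == "verify":
        rec["flags"].append("V")          # "V" marks data verified (Table 12.1)
    return rec
```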
12.3 Validated Data Set
A computer printout of the verified data set is pro-
vided to the ERL-Corvallis staff who initiate the valida-
tion process. The validation process increases the
overall integrity of the data base by evaluating all
data for internal and regional consistency using all
the QA/QC information available.
The validation process compares data for a set of var-
iables against a much narrower range utilizing knowl-
edge of relationships in aquatic chemistry and lim-
nology to identify intrasite sample inconsistencies.
Intersite validation consists of comparing single site
values with adjacent sites within a region. Data for
groups of sites are compared and mapped to check
for consistency. The validation process is discussed
further in Section 14.0. After undergoing this review-
ing process, the data, site by site, are transferred to
the validated data base. The data will then be
released by the EPA and will be made available to all
data users.
48
-------
[Figure 12.1 (flowchart): data from site selection, the field stations, and the analytical laboratories are entered by ORNL (batch reports, range checks) into the raw data set; verification by the EPA EMSL-LV QA staff, with data editing and flagging, yields the verified data set; validation by ERL-Corvallis and EMSL-LV QA, with flagging of questionable data, site reports, and maps, yields the validated data set, which supports access, distribution, and analysis (reports, maps, statistics, preliminary analysis). An asterisk in the figure denotes the data tracking system.]
Figure 12.1. Data Management for the Eastern Lake Survey.
49
-------
A0 Anion/Cation % Ion Balance Difference (%IBD) is outside criteria due to unknown cause.
A1 Anion/Cation % Ion Balance Difference (%IBD) is outside criteria due to nitrate contamination
A2 Anion/Cation % Ion Balance Difference (%IBD) is outside criteria due to anion (other than nitrate) contamination.
A3 Anion/Cation % Ion Balance Difference (%IBD) is outside criteria due to cation contamination.
A4 Anion/Cation % Ion Balance Difference (%IBD) is outside criteria due to unmeasured organic protolytes (fits Oliver Model).
A5 Anion/Cation % Ion Balance Difference (%IBD) is outside criteria due to possible analytical error - anion concentration too
high (list suspect anion).
A6 Anion/Cation % Ion Balance Difference (%IBD) is outside criteria due to possible analytical error - cation concentration too
low (list suspect cation).
A7 Anion/Cation % Ion Balance Difference (%IBD) is outside criteria due to possible analytical error - anion concentration too
low (list suspect anion).
A8 Anion/Cation % Ion Balance Difference (%IBD) is outside criteria due to possible analytical error - cation concentration too
high (list suspect cation).
B0 External (field) blank is above expected criteria of pH, DIC, DOC, specific conductance, ANC, and BNC determinations.
B1 Internal (lab) blank is >2 x CRDL for pH, DIC, DOC, specific conductance, ANC, and BNC determinations.
B2 External (field) blank is above expected criteria and contributed >20% to sample values (This flag is not used for pH, DIC,
DOC, ANC, or BNC determinations.)
B3 Internal (lab) blank is >2 x CRDL and contributes >10% to the sample concentrations. (This flag is not used for pH, DIC,
DOC, ANC, or BNC determinations.)
B4 Potential negative sample bias based on internal (lab) blank data.
B5 Potential negative sample bias based on external (field) blank data.
C0 % Conductance Difference (%CD) is outside criteria due to unknown cause (possible analytical error - ion concentration too
high).
C1 % Conductance Difference (%CD) is outside criteria due to possible analytical error - anion concentration too high (list
suspect anion).
C2 % Conductance Difference (%CD) is outside criteria due to anion contamination.
C3 % Conductance Difference (%CD) is outside criteria due to cation contamination.
C4 % Conductance Difference (%CD) is outside criteria due to unmeasured organic ions (fits Oliver Model).
C5 % Conductance Difference (%CD) is outside criteria due to possible analytical error in conductivity measurement.
C6 % Conductance Difference (%CD) is outside criteria due to possible analytical error - anion concentration too low (list
suspect anion).
C7 % Conductance Difference (%CD) is outside criteria due to unmeasured protolyte ions (does not fit Oliver Model).
C8 % Conductance Difference (%CD) is outside criteria due to possible analytical error - cation concentration too low (list
suspect cation).
C9 % Conductance Difference (%CD) is outside criteria due to possible analytical error - cation concentration too high (list
suspect cation).
D0 External (field) duplicate precision exceeded the maximum expected % relative standard deviation (%RSD), but either the
routine or the duplicate concentrations were < 10 x CRDL.
D2 External (field) duplicate precision exceeded the maximum expected % relative standard deviation (%RSD), and both the
routine and duplicate sample concentrations were > 10 x CRDL.
D3 Internal (lab) duplicate precision exceeded the maximum contract-required % relative standard deviation (%RSD), and both
the routine and duplicate sample concentrations were ≥10 x CRDL.
Table 12.1. Data Qualifiers (Flags) for Raw Data Set.
50
-------
F0 % Conductance Difference (%CD) exceeded criteria when Hydrolab conductance value was substituted.
F1 Hillman/Kramer protolyte analysis program indicated field laboratory pH problem when Hydrolab pH value was substituted.
F2 Hillman/Kramer protolyte analysis program indicated unexplained field pH/DIC problem when Hydrolab pH value was
substituted.
H0 The maximum holding time criteria were not met.
H1 No "Date Analyzed" data were submitted for reanalysis data.
L1 Instrumental Detection Limit (IDL) exceeded CRDL and sample concentration was <10 x IDL
M0 Value obtained using a method which is unacceptable by the IFB contract.
N0 Audit sample value exceeded upper control limit.
N1 Audit sample value was below control limit.
N2 Audit sample value exceeded control limits due to suspect audit sample preparation.
N5 NO3- data obtained from analysis of aliquot 5.
P0 Field problem - station pH.
P1 Field problem - station DIC.
P2 Field problem - unexplained (pH/DIC).
P3 Lab problem - initial pH (ANC).
P4 Lab problem - initial pH (BNC).
P5 Lab problem - unexplained - initial pH (ANC or BNC).
P6 Lab problem - initial DIC.
P7 Lab problem - air-equilibrated pH/DIC.
P8 Lab problem - unexplained - initial pH/DIC.
P9 Lab problem - ANC determination.
Q1 QCCS was above contractual criteria.
Q2 QCCS was below contractual criteria.
Q3 Insufficient number of QCCS were measured.
Q4 No QCCS analysis was performed.
S0 Matrix spike % recovery (%REC) was above contractual criteria.
S1 Matrix spike % recovery (%REC) was below contractual criteria.
V Data verified.
Table 12.1. Data Qualifiers (Flags) for Raw Data Set (Continued).
51
-------
A - Carbonate alkalinity, CO2-Acidity and mineral acidity are eliminated from data base due to method inconsistencies
C - Temporary flag indicating raw data incomplete pending confirmation by analytical laboratory
N - Eliminate from data set pending review of aliquot 5 nitrate data.
R - Temporary flag indicating raw data incomplete pending reanalysis.
X - Permanent flag indicating invalid data based on QA review.
- Value never reported.
(Note. These codes appear in numeric fields only.)
Table 12.2. Missing Value Codes.
52
-------
Section 13
Data Evaluation and Verification
As the field and analytical laboratory data are
received by the EMSL-LV QA staff, all data are evalu-
ated based on the available QA/QC information,
using the established and organized review process
described here. The objective of the data verification
process is to identify and correct, flag, or eliminate
data of unacceptable quality. Computer programs
have been developed to automate this process as
much as possible. Each batch of data is evaluated on
a sample-by-sample basis, as described in the follow-
ing subsections.
13.1 Field Data Review
Each field data form is reviewed to check for the fol-
lowing items:
1. Lake ID. Form 1 is compared to Form 2 to iden-
tify and correct transcription errors.
2. Trailer Duplicate. The duplicate lake ID recorded
on Form 2 should match a routine lake
sample ID.
3. Hydrolab Calibration Data. The pH and specific
conductance calibration data on Form 1 are
compared to data on the Hydrolab calibration
forms to ensure that initial calibration criteria
are met; if the criteria are not met, correct data
qualifiers are noted.
4. Hydrolab pH. The pH reading at 1.5 m, recorded
on Form 1, is compared to the field laboratory
pH reading on Form 2.
5. Field Laboratory (Trailer) pH and DIC. Form 2
measurements for field audit samples are evalu-
ated in accordance with the associated accept-
ance criteria. Routine duplicate pairs and trailer
duplicate pairs are evaluated for precision.
6. pH and DIC QCCS Data. Form 2 QCCS data are
evaluated to ensure that QCCS criteria are met.
Data anomalies are reported to the field laboratory
coordinator for review, and data reporting errors are
reported to ORNL to be corrected before entry into
the raw data set. All telephone communications are
recorded in bound notebooks and data corrections
are annotated on the appropriate forms.
13.2 Analytical Data Review
13.2.1 Daily QA Communications
Daily calls are made to each field station and analyti-
cal laboratory (1) to ensure that QA/QC guidelines are
being followed, (2) to ensure that samples are being
handled and analyzed properly, (3) to obtain current
sample data, and (4) to discuss problems that may
occur during analyses.
The primary objective of these calls is to identify and
resolve issues quickly, before they affect data quality
or interfere with the completion of the survey.
Preliminary sample data are obtained verbally, by
computer, or by TELEFAX, depending on the analyti-
cal laboratory. The preliminary data are evaluated by
comparing the QA sample data against acceptance
criteria.
Responsible parties are notified of problems and all
interactions are recorded in bound notebooks.
13.2.2 Preliminary Review of Sample Data Package
The sample data packages are reviewed for complete-
ness, internal QC compliance, and appropriate use of
data qualifiers. The Data Package Completeness
Checklist (given in Appendix F) is used to assure con-
sistency in the review of all data packages. Any dis-
crepancies related to analytical data are reported to
the appropriate analytical laboratory manager for
corrective action. If discrepancies affect billing or
data entry, then SMO or ORNL is notified. Comments
provided in the cover letter are also reviewed to deter-
mine their impact on data quality and the need for any
follow-up action by the laboratory. This data review
process is also important in verifying that the con-
tractual requirements are met for the purpose of pay-
ment.
13.2.3 Review of QA/QC Data
The analytical data reported on data forms are
entered into the raw data set by ORNL as the data
packages are received. A magnetic tape containing
raw data is sent to the EPA IBM 3081 at the National
Computer Center (NCC), Research Triangle Park,
53
-------
North Carolina. Each tape received by the NCC tape
library is given a volume serial number and a BIN
number that indicates the physical location of the
tape. The tape is loaded remotely by the EMSL-LV QA
staff, and exception programs, listed in Table 13.1, are
generated.
The ELS-I Verification Report listed in Appendix G is
completed with the use of outputs from exception
reports (along with the original data and field note-
books). The verification report is a worksheet
designed to systematically guide the auditor through
the verification process by explaining how to flag
data, tracking data resubmissions, tracking reanaly-
sis and confirmation requests, listing the steps to
help explain the QA exceptions, summarizing all mod-
ifications to the raw data base, and listing all verified
sample data.
One hundred percent of the analytical data are veri-
fied, sample by sample. A routine lake sample has to
meet both the anion/cation ion balance (%IBD) and %CD criteria
in order to be verified, unless the discrepancy can be
explained by either the presence of organic species
(as indicated by the protolyte analysis program) or an
obvious correctable reporting error.
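The sketch below illustrates these two sample-level checks, using the conversion and conductance factors reproduced on Forms 15 and 16 of Appendix A; the %ID expression follows the (partly reconstructed) Form 16 footnote, and the value of Kw used for the hydroxide term is an assumed 25°C constant, so the sketch is illustrative rather than definitive.

```python
# Hedged sketch of the ion balance (%ID) and conductance difference (%CD)
# checks.  Conversion factors (ueq/L per mg/L) and conductance factors
# (uS/cm per mg/L) are those listed on Forms 16 and 15; Kw = 1e-14 (25 C)
# is an assumption for the OH- contribution.
TO_UEQ = {"Ca": 49.9, "Mg": 82.3, "K": 25.6, "Na": 43.5, "NH4": 55.4,   # cations
          "Cl": 28.2, "NO3": 16.1, "SO4": 20.8, "F": 52.6}               # anions
COND_PER_MGL = {"Ca": 2.60, "Mg": 3.82, "K": 1.84, "Na": 2.13, "NH4": 4.13,
                "HCO3": 0.715, "CO3": 2.82, "Cl": 2.14, "NO3": 1.15, "SO4": 1.54}

def percent_ion_difference(mgl, anc_ueq, ph):
    h_ueq = 10 ** (-ph) * 1e6                       # [H+] in ueq/L, per Form 16
    cations = sum(mgl.get(i, 0.0) * TO_UEQ[i] for i in ("Ca", "Mg", "K", "Na", "NH4"))
    anions = sum(mgl.get(i, 0.0) * TO_UEQ[i] for i in ("Cl", "NO3", "SO4", "F"))
    num = anc_ueq + anions - cations
    den = anions + cations + anc_ueq + 2.0 * h_ueq
    return 100.0 * num / den

def percent_conductance_difference(mgl, ph, measured_uscm):
    calc = sum(mgl.get(i, 0.0) * f for i, f in COND_PER_MGL.items())
    h = 10.0 ** -ph
    calc += 3.5e5 * h + 1.92e5 * (1e-14 / h)        # H+ and OH- terms, per mole/L
    return 100.0 * (calc - measured_uscm) / measured_uscm
```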
Additional flags are applied to a given parameter,
even though the verification is on a "per sample"
basis, when the batch QA sample data do not meet
the acceptance criteria for QA samples such as field
blanks, field duplicates, or audit samples. Each
parameter is also flagged if internal QC checks such
as matrix spike recovery, calibration and reagent
blank analytical results, internal duplicate precision,
instrumental detection limits, QCCS analytical
results, and required holding times do not meet speci-
fications. The final source of flags is the protolyte
analysis program. A detailed description of the evalu-
ation of DIC, DOC, pH, ANC, and BNC data by the pro-
tolyte analysis program is given in Section 13.2.4. In
all cases, the flags that are generated by the com-
puter programs are reviewed by the auditor for rea-
sonableness and consistency before being entered
into the data base.
13.2.4 Computer Evaluation of DIC, DOC, pH, ANC,
and BNC Data
An evaluative computer program performs data
checks and uses carbonate equilibria and DOC data
to identify analytical error and the source of protoly-
tes (acidic or basic species) in the sample. Thus, the
DIC, DOC, pH, ANC, and BNC data are rigorously eval-
uated in light of known characteristics of carbonate
equilibria. The overall process of data evaluation
based on carbonate equilibria is summarized below.
13.2.4.1 Redundant Alkalinity Checks for pH and DIC.
Evaluations of carbonate equilibria indicate that alka-
linity is not affected by changes in dissolved CO2 con-
centration. Furthermore, alkalinity can be calculated
from carbonate equilibria if the DIC and pH are
Program
Type
Exception-Generating Programs.
1 = Audit Sample Summary
2 = Lab/Field Blank Summary
3 = Field Duplicate Precision Summary
4 = Instrumental Detection Limit Summary
5 = Holding Time Summary
6 = % Conductance difference Calculations
7 = Anion/Cation Balance Calculations
8 = Internal Lab Duplicates
9 = Matrix Spike Summary
10 = Protolyte Analysis (DIC, DOC, pH, ANC, and BNC Data Evaluation)
11 = Reagent/Calibration Blanks and QCCS
Data Review Programs
1 = Raw Data Listing - Format for QA Manager
2 = Complete Raw Data Listing - Format for Audit Staff
3 = Comparison of Form 1 and Form 2
4 = Comparison of Form 2 and Form 11
5 = QA/QC Flag Summary
6 = Modified Gran Analysis Program
(LH, LL, FH, FL, FN)
(B, LB, FB)
(R/D Pairs)
(All Species)
(All Species)
(All Species)
(All Species)
(All Species)
(pH and DIC)
(pH and DIC)
Table 13.1. Exception-Generating and Data Review Programs.
54
-------
known. A theoretical alkalinity, C, is calculated from each of the three pH/DIC pairs:
C1 = pH/DIC of "closed system" syringe samples (field laboratory)
C2 = pH/DIC of "open system" samples (analytical laboratory)
C3 = pH/DIC of "air-equilibrated system" samples (analytical laboratory)
The third data pair is obtained on an aliquot that has been equilibrated with standard air (300 ppm CO2). If there is no analytical error, the three calculated alkalinities should agree within experimental error. The precision for calculated alkalinity values of less than or equal to 100 µeq/L should be within ±10 µeq/L and within ±10 percent for calculated alkalinity values greater than 100 µeq/L. The precision windows are based on the estimated precision of the pH and DIC measurements used in the calculations. If this comparison indicates a potential analytical error (i.e., the precision limit is exceeded), the redundant pH and DIC values are compared to identify the source of error. Further evaluation of the QA/QC information for the individual data pairs usually identifies one of the pH or DIC measurements within the outlier pair as the source of error. Because of the redundancy in measurement, for every sample that is analyzed an acceptable pH or DIC value from one of the data pairs should be available to the data user.
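A minimal sketch of the alkalinity calculation for a single pH/DIC pair is shown below; the carbonate equilibrium constants are standard 25°C values assumed for illustration, since the plan does not list the constants used by the protolyte analysis program.

```python
# Hedged sketch: calculating a theoretical alkalinity from one pH/DIC pair
# through carbonate equilibria, as each of the C1, C2, C3 checks requires.
# K1, K2, and Kw are assumed standard 25 C values.
K1, K2, KW = 10**-6.35, 10**-10.33, 10**-14.0   # carbonic acid, bicarbonate, water

def calculated_alkalinity_ueq(ph, dic_mg_per_l):
    h = 10.0 ** -ph                              # [H+] in mol/L
    dic = dic_mg_per_l / 12.011 / 1000.0         # mg C/L -> mol C/L
    # ionization fractions of DIC at this pH
    denom = 1.0 + K1 / h + K1 * K2 / h**2
    a1 = (K1 / h) / denom                        # fraction present as HCO3-
    a2 = (K1 * K2 / h**2) / denom                # fraction present as CO3 2-
    alk_eq = dic * (a1 + 2.0 * a2) + KW / h - h  # alkalinity in eq/L
    return alk_eq * 1e6                          # ueq/L

# Example: a closed-system field pair, pH 6.8 and DIC 1.2 mg C/L (hypothetical values)
c1 = calculated_alkalinity_ueq(6.8, 1.2)
```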
13.2.4.2 Verification of Measured ANC. The mea-
sured ANC is evaluated by comparing it to the aver-
age of the acceptable calculated values for alkalinity
determined during the evaluation of pH and DIC.
• Carbonate Systems. For a true carbonate sys-
tem, the measured ANC should equal (within
experimental error) the calculated alkalinity. The
difference between measured ANC and the cal-
culated alkalinity should be within ±15 µeq/L
for calculated alkalinities less than or equal to
100 µeq/L, and within ±10 percent for larger val-
ues. If the measured ANC differs from the calcu-
lated alkalinity, an analytical error is indicated in
either the titration or in the pH or DIC measure-
ments.
• Mixed Systems. Mixed systems are those repre-
sented by samples that have significant concen-
trations of other protolytes in addition to the car-
bonate species. In natural waters, organic bases
derived from humic and fulvic acids are often
present and can have a significant contribution
to the ANC. Two empirical relationships among
DOC, pH, and organic protolytes have been pro-
posed by Oliver et al. (1983). The first relates the
total organic protolyte to DOC, and the second
relates the mass action quotient (pKo) of the
organics present to the sample pH.
DOC and pH are measured in each sample. The empir-
ical relationships (defined by the Oliver model) and
the measured pH and DOC values are used to esti-
mate the contribution of organic protolytes to the
measured ANC. The measured ANC should equal,
within experimental error, the sum of the calculated
alkalinity and the estimated organic protolyte contri-
bution, assuming that significant concentrations of
other non-organic protolytes are not present and
there is no analytical error. The precision should be
within ±15 µeq/L for calculated ANC less than or
equal to 100 µeq/L and within ±10 percent for larger
values.
13.2.4.3 Verification of Measured BNC. BNC, unlike
ANC, is affected by changes in dissolved CO2 concen-
tration. Therefore, evaluation and verification of
those data cannot utilize as much redundancy as that
of ANC data. Only the initial pH and DIC values deter-
mined in the analytical laboratory (data pair C2) can
be used to calculate BNC for comparison with the
measured value. As with ANC, other protolytes can
contribute to the measured BNC. An estimate of CO2-
acidity is calculated from data pairs and carbonate
equilibria. The calculated acidity should equal, within
experimental error, the measured BNC, if no other pro-
tolytes are present. Precision for calculated acidity
values less than or equal to 100 µeq/L should be
within ±10 µeq/L and within ±10 percent for larger val-
ues. If the calculated acidity is greater than the mea-
sured BNC, an analytical error in the pH, DIC, or BNC
determination is indicated.
The pH and DIC measurements are verified by the pre-
vious tests (QA/QC redundancy and alkalinity
checks). If the calculated acidity is less than the mea-
sured BNC, the difference may be due to the presence
of other protolytes or to an analytical measurement
error. The Oliver model is used to evaluate the contri-
bution from organic acids.
13.2.4.4 System Check for Total Carbonate. For a car-
bonate system, it can be shown that the sum of alka-
linity and acidity equals total carbonate concentra-
tion in the sample. For a mixed system, it can be
shown that the sum of ANC and BNC equals the total
protolyte concentration in the sample. Thus, the cal-
culated values of alkalinity and acidity can be com-
bined and compared to the sum of the measured ANC
and BNC, as an additional check of the data. For a
carbonate system, the sum of ANC and BNC should
equal, within experimental error, the total carbonate
concentration or the sum of calculated acidity and
alkalinity. If this sum is less than the calculated total
carbonate, an analytical error is indicated because
the two titrations must account for all carbonate spe-
cies present in the sample. Other protolytes or analyt-
ical error is indicated if the sum of ANC and BNC
exceeds the calculated total carbonate. Again, the
Oliver model is used to evaluate the data.
55
-------
The precision for this evaluation should be within ±
15 µmole/L for total carbonate concentrations less
than or equal to 100 µmole/L, and ±10 percent for
higher concentrations.
The protolyte analysis program generates flags (Table
12.1) based on the data checks described above to
indicate the source of problems. Flowcharts demon-
strating these data checks are available from EMSL-
LV.
13.2.5 Follow-up with Analytical Laboratories
After the review of all data is completed, the analyti-
cal laboratories are requested to resubmit data
reporting forms that are incomplete, submit correc-
tions of previously reported data, confirm previous
results, and reanalyze certain samples that do not
meet QA/QC criteria. The analytical laboratories are
directed to respond within a reasonable time so that
the results are evaluated in time for them to be useful
to the survey.
13.2.6 Preparation and Delivery of Verification
Tapes
The steps identified in sections 13.2.2 through 13.2.5
are followed to identify suspect data and to correct
erroneous data. The information obtained by this
process is accumulated by the EMSL-LV QA staff and
is placed on magnetic tapes, which are sent to ORNL.
There, the new data are entered into the raw data set,
to correct and flag the original data. These steps may
have to be repeated several times before all the data
are verified.
56
-------
Section 14
Data Validation
14.1 Overview
Validation, used in the context of data bases, is the
process by which data are evaluated for quality con-
sistent with the intended use of the information.
Because validation is a process linked to the goals
and methods of a project, the process must be
defined for each data base. Consequently, no single
set of criteria can be applied to all data bases to judge
their validity. Validation is, therefore, a functional
term for describing the continuing process of defin-
ing the quality of the data with each step resulting in
increased knowledge of, and presumably confidence
in, the data. This is accomplished by reviewing the
data for errors; data known to be erroneous are identi-
fied so that correct data can be substituted, and pos-
sible errors are flagged to alert the user to their ques-
tionable status.
In the verification step, which precedes validation,
the quality of the analytical chemical data is deter-
mined through a rigorous protocol based on known
principles of chemistry. However, not all potential
errors in the data are evaluated in the verification
process. Therefore, the purpose of the validation
process for the ELS-I is to investigate errors in the
chemical analyses not detected in verification, and to
provide a review of the quality of the nonchemical var-
iables. The list of physical variables subject to valida-
tion is shown in Table 14.1.
Two aspects of the data validation process are the
identification of outliers and the evaluation of possi-
ble systematic error in the measurement process.
Both of these aspects are exploratory, as opposed to
test-oriented, and as such, the methods stress visual
presentations and subjective, though conservative,
selection procedures. The objective is to attract
attention to certain data values or sets of values in
order that special thought and caution be given them
when considering analysis or model-building based
upon them. The methods selected for detection of
outliers and systematic errors were chosen for their
simplicity of implementation from a computational
standpoint, using pre-existing software whenever
possible. The process of validation is summarized in
Figure 14.1.
14.2 Detection of Outliers
Outliers, observations that are not typical of the pop-
ulation from which the sample is drawn, are identified
using univariate, bivariate, and multivariate analyses.
These procedures assist in identifying outliers that
require further scrutiny. However, observations that
are atypical with respect to the population may result
from analytical error or heterogeneity in chemistry
among lakes. It is essential to separate analytical
errors from abnormal lake chemistry to avoid the
undesirable effect of purging analytically correct val-
ues from the data base (discussed in Section 14.4).
14.2.1 Univariate Analyses
An initial approach to outlier detection is to consider
each variable individually, searching for values that
are extreme with respect to the sample. The method
used here is the box plot (Tukey, 1977) as implemented
in MINITAB (Ryan et al., 1981) and SAS (SAS Institute,
1982). The box plot summarizes the data for one varia-
ble based on the median and upper (Fu) and lower (Fl)
fourths or quartiles. The difference between the
upper and lower fourths is known as the inter-quar-
tile range (Fu - Fl = dF); any value lying more than
3 dF beyond the upper or lower fourth is identified as an outlier.
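A minimal sketch of this univariate screen is shown below; the placement of the cutoff at 3 dF beyond the fourths follows the description above.

```python
# Hedged sketch of the univariate outlier screen: values lying more than
# 3 dF beyond the lower or upper fourth (quartile) are flagged.
import numpy as np

def fourth_spread_outliers(x, k=3.0):
    x = np.asarray(x, dtype=float)
    fl, fu = np.percentile(x, [25, 75])      # lower and upper fourths
    df = fu - fl                             # inter-quartile range (Fu - Fl = dF)
    low, high = fl - k * df, fu + k * df
    return x[(x < low) | (x > high)]

# Example: one lake with an extreme sulfate value (hypothetical, mg/L)
print(fourth_spread_outliers([5.1, 5.3, 4.9, 5.6, 5.0, 5.2, 4.8, 5.4, 19.7]))
```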
14.2.2 Bivariate Analyses
Values of two variables may not be extreme individu-
ally yet may be extreme relative to some expected or typical
relationship. Scatter plots are useful for examining
expected theoretical or empirical relationships
between variables. The bivariate relationships exam-
ined in this process are shown in Table 14.2. Outliers
are identified by visual inspection of the plots and by
listing of residuals based on a least squares regres-
sion analysis where a linear relationship exists.
Observations are identified as outliers if the absolute
value of the standardized residual [(actual-predicted)/
residual standard deviation] is generally greater than
3. Because the least squares analysis can be strongly
biased by certain types of outliers, the residuals from
resistant line fits, lines fit through the medians of par-
titions of the data (Velleman and Hoaglin, 1981), are
examined for DOC, true color, turbidity, and Secchi
disk transparency. Other variables are treated by use
of an iterative process of linear regression, identifica-
tion and removal of outliers, and repeated linear
57
-------
Variable
General Description of Validation Checks
1. Latitude
3. Lake Elevation
4. Lake Area
5. Watershed Area
6. Site Depth
7. Stream Inlets and Outlets
8. Lake Hydrologic Type
9. Shoreline Land Use
10. Water Temperature
11. Secchi Disk Transparency
12. True Color
13. Turbidity
Lake location, as measured by LORAN, is checked against maps
Lake characteristics are checked against state records, where
available, to confirm lake identification
Data are compared to aerial photographs
Recorded temperature is checked to see if it falls in appropriate
range.
Data are checked for internal consistency
Table 14.1. Physical Variables Subject to Validation.
regression to identify additional outliers that would
not have necessarily been identified had major outli-
ers not been removed first.
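The iterative procedure can be sketched as follows; the cutoff of 3 on the absolute standardized residual is the value stated above, while the stopping rule is an illustrative assumption.

```python
# Hedged sketch of the iterative procedure described above: fit a least
# squares line, flag points whose standardized residual exceeds 3 in
# absolute value, remove them, and refit until no new outliers appear.
import numpy as np

def iterative_regression_outliers(x, y, cutoff=3.0):
    x, y = np.asarray(x, float), np.asarray(y, float)
    keep = np.ones(len(x), dtype=bool)
    outliers = []
    while keep.sum() > 3:
        slope, intercept = np.polyfit(x[keep], y[keep], 1)
        resid = y - (slope * x + intercept)
        s = resid[keep].std(ddof=2)               # residual standard deviation of the fit
        std_resid = np.abs(resid) / s
        new = keep & (std_resid > cutoff)
        if not new.any():
            break
        outliers.extend(int(i) for i in np.flatnonzero(new))
        keep &= ~new                              # remove outliers, then refit
    return sorted(outliers)
```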
14.2.3 Multivariate Analyses
Although examination of scatter plots is an important
and necessary step for evaluating possible errors in
the data, bivariate analyses must be limited to those
variables that have obvious associations. The magni-
tude of the data set precludes examination of all pos-
sible distributions and bivariate plots. For example,
the number of bivariate plots required for all combina-
tions of the analytical variables exceeds 4,600.
Although many of these combinations of variables
are of no interest, many combinations remain.
Clearly, this is not a practical or efficient method for
examining all the data.
An alternative method of examining data for system-
atic and random errors is through multivariate analy-
sis in which several variables are examined simulta-
neously. Because theoretical relationships are
expected to exist among certain chemical variables,
it is useful to examine these sets of variables as
groups (Table 14.3).
Two primary multivariate techniques are used to iden-
tify outliers: cluster analysis and principal compo-
nent analysis (PCA). Cluster analysis is a classifica-
tion technique for identifying similarities (or,
conversely, dissimilarities) among observations.
Each observation is compared to other observations
in the set and is assigned to a group or cluster using a
measure of similarity.
The primary clustering technique used in the valida-
tion process is the FASTCLUS procedure in SAS (SAS
Institute, 1982). This method is a nonhierarchical divi-
sive method that is sensitive to outliers. A less formal
clustering technique also used for selected samples
is the Trilinear Diagram (Hem, 1970). The Trilinear Dia-
gram is useful for examination of possible errors
associated with the major cations and anions. The
other clustering techniques are used for the related
sets of variables shown in Table 14.3.
Principal component analysis is a technique that also
is commonly used to reduce large data matrices into
manageable dimensions. New variables called princi-
pal components are formed from linear combinations
of the original variables such that the first principal
component reflects most of the variance or disper-
sion in the data. Each successive principal compo-
nent explains less variance, and examination of the
first several components is generally sufficient to
describe the data. If the original data are approxi-
mately normally distributed, the resulting principal
components are also approximately normal. Thus, a
plot of any two components typically results in an
elliptical cluster with outliers displaced from the
ellipse.
Where appropriate, least squares multiple linear
regression techniques also are used to identify obser-
vations with high absolute values of the standardized
residual.
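A minimal sketch of the principal component screen is given below; the use of two components and a 3-standard-deviation displacement cutoff are illustrative assumptions.

```python
# Hedged sketch of the PCA screen described above: standardize the variables
# in one related group, project onto the leading principal components, and
# look for observations displaced from the roughly elliptical cloud formed
# by the component scores.
import numpy as np

def pca_score_outliers(data, n_components=2, cutoff=3.0):
    """data: (n_lakes, n_variables) array for one related variable group."""
    z = (data - data.mean(axis=0)) / data.std(axis=0, ddof=1)   # standardize each variable
    cov = np.cov(z, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)
    order = np.argsort(eigvals)[::-1][:n_components]            # leading components
    scores = z @ eigvecs[:, order]
    dist = np.sqrt(((scores / scores.std(axis=0, ddof=1)) ** 2).sum(axis=1))
    return np.flatnonzero(dist > cutoff)                        # indices of outlying lakes
```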
14.3 Detection of Systematic Error
Methods for evaluating systematic error are less
exploratory because they require a source of external
comparison. Here the tests are similar to comparison
with standards (such as audits or split samples), with
one major difference. The external references, con-
58
-------
[Figure 14.1 (flowchart): the verified data set (Data Set No. 2) is screened by univariate methods (box plots, probability plots), bivariate methods (scatter plots, regression, relational and comparative checks), and multivariate methods (PCA, cluster analysis, trilinear plots, MLR); outliers are modified or deleted and systematic differences are flagged or modified before the data pass to Data Set No. 3 (validated).]
Figure 14.1. Flowchart for Data Validation Process.
59
-------
Specific Conductance
(analytical laboratory)
ANC
pH (field laboratory)
DOC
True Color
Turbidity
Calcium
Sodium
DIC (equilibrated)
Aluminum (total)
vs. Specific conductance (field)
Calculated specific conductance
Sum of cations
Sum of anions
Calcium
Calcium and sodium
pH (field laboratory)
ANC
DIC
vs. Calcium
pH (all measures)
Sum of cations
Sum of anions
Calculated ANC (sum of base cations - SO4 2- - Cl-)
vs. pH (all measurements)
Sum of cations
Sum of anions
Aluminum (total extractable)
Sulfate (expressed as % of total anions)
vs. True color
Anion deficit
Secchi disk transparency
Turbidity
vs. Anion deficit
Secchi disk transparency
Turbidity
vs. Secchi disk transparency
vs. Magnesium
Silica
vs. Chloride
vs. DIC (unequilibrated)
vs. Aluminum (total extractable)
a Bivariate plots and linear regressions are also prepared for variables analyzed on split samples using ICPAES (Inductively
Coupled Plasma Atomic Emission Spectroscopy). Ca, Mg, Na, K, S, Si, P (total), Al (total), Fe, and Mn
Table 14.2. Pairs of Variables Used to Check for Random and Systematic Errors
sisting of data sets obtained from other investigators,
cannot be viewed as "standards." Hence, a difference
between data from the ELS-I and another data source
does not necessarily imply that the ELS-I data are in
error. However, comparisons with external data
sources serve as aids for evaluating the quality of the
data by bringing attention to data that may require
additional scrutiny. Clearly, existence of systematic
differences between the ELS-I data and several exter-
nal data sources would be cause for careful reevalua-
tion of the data in question.
Two types of systematic errors are investigated in the
ELS-I data base: a constant additive effect (resulting
in a nonzero intercept), or an effect that is dependent
on the magnitude of the variable being measured
(causing a slope ≠ 1 or nonlinearity in the relation-
ship). The "reference" data sets selected for system-
atic error are shown in Table 14.4. The external data
sets used were selected based on the need to com-
pare data from all three alkalinity classes, accessibil-
ity of the data, and documentation of QA procedures.
60
-------
Group / Variables
1. Major anions and cations
2. pH, ANC, DOC, true color
3. Turbidity, Secchi disk transparency, true color
4. Nitrate, phosphorus, ammonium, turbidity, Secchi disk transparency
5. Anion deficit, DOC, true color
6. pH, total extractable aluminum, fluoride, DOC
7. Silica, major cations
8. Iron, manganese, total extractable aluminum, DOC
9. ANC, DIC, pH
10. pH, sulfate, DOC
Table 14.3. Related Groups of Variables Used in Multivariate Analyses.
14.4 Treatment of Outliers and Systematic
Differences
Data identified as outliers through the above proce-
dures may be acceptable when evaluated in the con-
text of other variables or when considering limita-
tions of the methods used in the ELS-I. Therefore,
before the original data sources are rechecked, the
outliers and systematic differences identified in the
validation process are reviewed for plausibility by the
staff at ERL Corvallis. Data that remain suspect fol-
lowing screening by staff scientists are sent to the
appropriate organization for reexamination.
Outliers and systematic differences for all chemical
variables are checked against reported values by
staff at EMSL-LV; site location and watershed related
variables are reviewed by the Geographic Research
Team at ERL Corvallis; and remaining variables are
checked by staff at ORNL. The data rechecking step
for suspect values identified in validation of the
chemical variables may reveal the following possible
conditions and may require the associated response
listed:
CONDITION:
(1) Suspect value in data set number 2 (verified data
set) is found to be a transcription or transposition
error.
RESPONSE:
Correct value is placed in data set number 3 (validated
data set).
CONDITION:
(2) Suspect value in data set number 2 agrees with
reported value, and value was flagged in verifica-
tion.
RESPONSE:
Value is flagged in data set number 3.
CONDITION:
(3) Suspect value in data set number 2 agrees with
reported value, but value was not flagged in verifi-
cation.
RESPONSE:
Value may be flagged in data set number 3 depending
on evidence for possible error.
Values flagged in data set number 2, but not identified
as aberrant in data validation, remain unchanged and
flagged except in cases where the flag is not required
for interpretation of the data; in these cases, the flag
is removed. The protocol for resolution of outliers for
the nonchemical variables is similar, with the excep-
tion that response (2) is omitted.
Resolution of systematic differences between the
ELS-I and external reference data involves reexami-
nation of the methods used to collect the ELS-I data.
The effort involved in evaluating systematic differ-
ences depends on the evidence available to suggest
that a bias may exist in the ELS-I data and the variable
under consideration.
In most cases, sufficient information to perform an
appropriate correction for bias in the ELS-I data is not
likely to be available. However, the identification of a
possible bias is provided to assist the user in inter-
preting such data.
61
-------
Region/
Subregion
1A
1C
1D
1E
2
2A,B,C
2C,D
3A
3B
Political
Boundary
New York
Vermont
New Hampshire
Massachusetts
Maine
Upper Midwest
Minnesota,
Michigan,
Wisconsin
Wisconsin
Southern
Blue Ridge
Mountains
Florida
Data Set
NY Dept. Environ. Conservation
ADDNET
Vermont Dept. of Environ.
Control
Long Term Monitoring
N.H. Water Supply and
Pollution Control Comm.
Dept of Environ. Qual. Engr.
Maine Dept of Fish and Game
Long Term Monitoring
ADDNET
ERL-D/UMD
Long Term Monitoring
Wisconsin County Surface
Water Resources
ADDNET
Long Term Monitoring
U.S.G.S. Hydrologic
Almanac
Aquatic Weed Survey
Number of
Lakes in
Common
with NSWS
17
157
58
32
52
24
235
6
209
75
38
287
48
10
36
Variable Type columns (Physical, Chemical, Not Resolved): an "X" in the original table marks the variable types covered by each data set.
Table 14.4. External Reference Data Sets Used for Identifying Systematic Differences.
62
-------
Section 15
References
American Public Health Association, American Water Works Association, and Water Pollution Control Federa-
tion, 1985. Standard Methods for the Examination of Water and Wastewater, 16th Ed. APHA, Washington,
D.C.
American Society for Testing and Materials, 1984. Annual Book of ASTM Standards, Vol. 11.01, Standard Specifi-
cation for Reagent Water, D1193-77 (reapproved 1983). ASTM, Philadelphia, Pennsylvania.
Costle, D. M., May 30, 1979(a). Administrator's Memorandum, EPA Quality Assurance Policy Statement. U.S.
Environmental Protection Agency, Washington, D.C.
Costle, D. M., June 14,1979(b). Administrator's Policy Statement, Quality Assurance Requirements for All EPA
Extramural Projects Involving Environmental Measurements. U.S. Environmental Protection Agency,
Washington, D.C.
Grubbs, F. E., 1969. Procedures for Detecting Outlying Observations in Samples. Technometrics, TCMTA, v. 11, n.
4, pp. 1-21.
Hem, J. D., 1970. Study and Interpretation of the Chemical Characteristics of Natural Water, 2nd Ed. U.S. Geolog-
ical Survey Water Supply Paper 1473. U.S.G.S., Washington, D.C.
Hillman, D. C., J. F. Potter, and S. J. Simon, 1986. National Surface Water Survey Eastern Lake Survey (Phase I --
Synoptic Chemistry) Analytical Methods Manual. U.S. Environmental Protection Agency, Las Vegas,
Nevada.
Kramer, J. R., 1982. Alkalinity and Acidity. In: R. A. Minear and L. H. Keith (eds.), Water Analysis, Vol. 1. Inorganic
Species, Part 1. Academic Press, Orlando, Florida.
McQuaker, N. R., P. D. Kluckner, and D. K. Sandberg, 1983. Chemical Analysis of Acid Precipitation: pH and
Acidity Determinations. Environ. Sci. Technol., v. 17, n. 7, pp. 431-435.
Morris, F. A., D. V. Peck, M. B. Bonoff, and K. J. Cabbie, 1986. National Surface Water Survey Eastern Lake Survey
(Phase I -Synoptic Chemistry). Field Operations Report. U.S. Environmental Protection Agency, Las
Vegas, Nevada.
Oak Ridge National Laboratory, 1984. National Surface Water Survey Project - Data Management Proposal.
Environmental Sciences Division and Computer Sciences, UCC-ND, ORNL, Oak Ridge, Tennessee.
Oliver, B. G., E. M. Thurman, and R. K. Malcolm, 1983. The Contribution of Humic Substances to the Acidity of
Colored Natural Waters. Geochim. Cosmochim. Acta, v. 47, pp. 2031-2035.
Omernik, J. M., and C. F. Powers, 1983. Total Alkalinity of Surface Waters--a National Map. Annals of the Associ-
ation of American Geographers, v. 73, pp. 133-136.
Omernik, J. M., A. J. Kinney, G. E. Griffith, and A. L. Yeaple. Total alkalinity of surface waters: a map of the
Southeast and Southcentral region. Corvallis Environmental Research Lab, U.S. Environmental Protec-
tion Agency, Corvallis, Oregon (in preparation).
Omernik, J. M. and G. E. Griffith, 1985. Total Alkalinity of Surface Waters: a Map of the Upper Midwest Region.
EPA-600/D-85-043, U.S. Environmental Protection Agency, Corvallis, Oregon.
Omernik, J. M. and A. J. Kinney, 1985. Total Alkalinity of Surface Waters: a Map of the New England and New York
Region. EPA-600/D-84-216, U.S. Environmental Protection Agency, Corvallis, Oregon.
63
-------
Peden, M. E., 1981. Sampling Analytical and Quality Assurance Protocols for the National Atmospheric Deposi-
tion Program. ASTM D-22 Symposium and Workshop on Sampling and Analysis of Rain, American Society
for Testing and Materials, Philadelphia, Pennsylvania.
Ryan, T. A., B. L. Joiner, and B. F. Ryan, 1981. Minitab: Reference Manual. Pennsylvania State University, Univer-
sity Park, Pennsylvania.
SAS Institute, Inc., 1982. SAS User's Guide: Statistics. SAS Institute, Cary, North Carolina.
Tukey, J. W., 1977. Exploratory Data Analysis. Addison-Wesley Publishing, Reading, Massachusetts.
U.S. Environmental Protection Agency, 1980. Interim Guidelines and Specifications for Preparing Quality Assur-
ance Project Plans. QAMS 005/80. U.S. Environmental Protection Agency, Washington, D.C.
U.S. Environmental Protection Agency, 1983 (revised). Methods for Chemical Analysis of Water and Wastes. EPA-
600/4-79-020. U.S. EPA, Cincinnati, Ohio.
U.S. Environmental Protection Agency, 1984a. National Surface Water Survey, Phase I. U.S. EPA, Office of
Research and Development, Washington, D.C.
U.S. Environmental Protection Agency, 1984b. National Surface Water Survey, Phase I. Research Plan, A Sum-
mary of Contents. U.S. EPA, Corvallis, Oregon.
U.S. Environmental Protection Agency, 1984c. National Surface Water Survey, National Lake Survey - Phase I.
NAPAP Project Reference No. E-23. U.S. EPA, Washington, D.C.
Velleman, P. F., and D. C. Hoaglin, 1981. Applications, Basics, and Computing of Exploratory Data Analysis.
Duxbury Press, Boston, Massachusetts.
Weast, R. C. (ed.), 1972. CRC Handbook of Chemistry and Physics, 53rd Ed., CRC Press, Cleveland, Ohio.
64
-------
Appendix A
Data Forms for Reporting Analytical Results
The daily communications with the analytical laboratory are entered into the Telephone Record Log or the labo-
ratory notebook to track and resolve all problems encountered during analyses.
A-l
-------
NATIONAL SURFACE WATER SURVEY
FORM 11
Page 1 of 2
LAB NAME
BATCH ID
SUMMARY OF SAMPLE RESULTS
LAB MANAGER'S SIGNATURE
SAM-
PLE
ID:
01
02
03
04
05
06
07
08
09
10
11
12
13
14
15
16
17
18
19
20
21
22
23
24
25
26
27
28
29
30
ALIQUOT ID
1
Ca
mg/L
Mg
mg/L
K
mg/L
Na
mg/L
Mn
mg/L
Fe
mg/L
2
Total
Extr. Al
mg/L
3
Cl-
mg/L
SO4 2-
mg/L
NO3-
mg/L
SiO2
mg/L
ISE
Total F-
mg/L
NOTE: Approved data qualifiers and instructions for their use are listed in Table 9.7 of the QA Plan.
-------
NATIONAL SURFACE WATER SURVEY
FORM 11
Page 2 of 2
LAB NAME
BATCH ID
SUMMARY OF SAMPLE RESULTS
LAB MANAGER'S SIGNATURE
SAM-
PLE
ID:
01
02
03
04
05
06
07
08
09
10
11
12
13
14
15
16
17
18
19
20
21
22
23
24
25
26
27
28
29
30
ALIQUOT ID
4
DOC
mg/L
NH4+
mg/L
5
Measured
Eq.
pH
Alk
Init. pH
Acy
Init. pH
Acy
µeq/L
Alk
µeq/L
Spec.
Cond.
µS/cm
Eq.
DIC
mg/L
Init.
DIC
mg/L
6
Total
P
mg/L
7
Total
Al
mg/L
Note: Approved data qualifiers and instructions for their use are listed in Table 9.8 of the QA plan.
-------
Lab Name
Lab Manager's Signature
RESULTS
[Alk] =
DATA
NATIONAL SURFACE WATER SURVEY
Form 13
ALKALINITY AND ACIDITY RESULTS
Batch ID
Page 1 of 1
Analyst
Sample ID
µeq/L
eq/L
eq/L
INITIAL SAMPLE VOLUME
BLANK ALKALINITY
mL
µeq/L
DATE STANDARDIZED
DATE STANDARDIZED
ACID TITRATION
BASE TITRATION
VOLUME HCl
(mL)
0.00
0.00 (with KCl)
MEASURED
pH
CALCULATED
pH
VOLUME NaOH
(mL)
0.00
0.00 (with KCl)
MEASURED
pH
CALCULATED
pH
A-4
-------
NATIONAL SURFACE WATER SURVEY
Form 14*
Page 1 of 1
QC DATA FOR ALKALINITY
AND ACIDITY ANALYSES
LAB NAME
BATCH ID
LAB MANAGER'S SIGNATURE
SAMPLE
ID
01
02
03
04
05
06
07
08
09
10
11
12
13
14
15
16
17
18
19
20
21
22
23
24
25
26
27
28
29
30
Alk
µeq/L
CO2-Acy
µeq/L
CALCULATED Alk
RESULT
DIFFERENCE(a)
≤5%
*Form not required in data package but recommended for internal QC requirements.
(a) Difference = Calculated Alk - Measured Alk
{DIC (in µmoles/L) - ([Alk] + [CO2-Acy])} / DIC x 100
A-5
-------
LAB NAME
NATIONAL SURFACE WATER SURVEY
Form 15* Page 1 of 1
SPECIFIC CONDUCTANCE
BATCH ID LAB MANAGER'S SIGNATURE
Sample
ID
01
02
03
04
05
06
07
08
09
10
11
12
13
14
15
16
17
18
19
20
21
22
23
24
25
26
27
28
29
30
SPECIFIC CONDUCTANCE (µS/cm): Calculated, Measured, %CD**
Specific Conductance Factors of Ions [(µS/cm at 25°C) per mg/L] and Calculated Specific Conductance for Each Ion (µS/cm):
HCO3-   0.715
Ca+2    2.60
CO3-2   2.82
Cl-     2.14
Mg+2    3.82
NO3-    1.15
K+      1.84
Na+     2.13
SO4-2   1.54
NH4+    4.13
H+      3.5x10^5 (per mole/L)
OH-     1.92x10^5 (per mole/L)
*Form not required in data package but recommended for internal QC requirements.
**%CD = (Calculated Specific Conductance - Measured Specific Conductance) / Measured Specific Conductance x 100
Note: Reanalysis criteria are given in Table 9.5 of the QA plan.
-------
LAB NAME
NATIONAL SURFACE WATER SURVEY
Form 16*
ANION-CATION BALANCE CALCULATION
BATCH ID LAB MANAGER'S SIGNATURE
Page 1 of 1
Sample
ID
01
02
03
04
05
06
07
08
09
10
11
12
13
14
15
16
17
18
19
20
21
22
23
24
25
26
27
28
29
30
% Ion Difference**
Factors to Convert mg/L to µeq/L - Ions (µeq/L):
Ca+2    49.9
Cl-     28.2
Mg+2    82.3
NO3-    16.1
K+      25.6
Na+     43.5
SO4-2   20.8
F-      52.6
NH4+    55.4
ALK
H+***
*Form not required in data package but recommended for internal QC requirements.
**% Ion Difference (%ID) = [ANC + Σ Anions - Σ Cations (except H+)] / [Σ Anions + Σ Cations + ANC + 2[H+]] x 100   (Alk ≈ ANC)
***[H+] = (10^-pH) x 10^6
A-7
-------
NATIONAL SURFACE WATER SURVEY
Form 17
IC RESOLUTION TEST
Page 1 of 1
LAB NAME
BATCH ID
LAB MANAGER'S SIGNATURE
IC Resolution Test
IC Make and Model:
Date:
Concentration:
SO4 2- _____ µg/mL; NO3- _____ µg/mL
Column Back Pressure (at max. of stroke):
Flow Rate: mL/min
Column Model:
psi
Date of Purchase:
Column Manufacturer:
Column Serial No:
Is precolumn in system? _____ Yes _____ No
(a) _____ cm   (b) _____ cm
Percentage Resolution: 100 x (1-a/b)
The resolution must be greater than 60%
Test Chromatogram:
A-8
-------
NATIONAL SURFACE WATER SURVEY
Form 18
Page 1 of 1
LAB NAME
DETECTION LIMITS
BATCH ID
LAB MANAGER'S SIGNATURE
Parameter
Ca
Mg
K
Na
Mn
Fe
Al , Total
Extractable
Cl-
SO4 2-
NO3-
SiO2
F-, Total
NH4+
DOC
Specific
Conductance
DIC
P, Total
Al , Total
Units
mg/L
mg/L
mg/L
mg/L
mg/L
mg/L
mg/L
mg/L
mg/L
mg/L
mg/L
mg/L
mg/L
mg/L
µS/cm
mg/L
mg/L
mg/L
Required
Detection
Limit
0.01
0.01
0.01
0.01
0.01
0.01
0.005
0.01
0.05
0.005
0.05
0.005
0.01
0.1
**
0.05
0.002
0.005
Instrumental
Detection
Limit*
Date Determined
(DD MMM YY)
*As carried out by the analytical lab.
**Report the mean, which must not exceed 0.9 µS/cm, of six (6) nonconsecutive blanks.
Note: Report with four significant figures or down to IDL.
A-9
-------
NATIONAL SURFACE WATER SURVEY
FORM 19
Page 1 of 2
LAB NAME
BATCH 10
SAMPLE HOLDING TIME SUMMARY
LAB MANAGER'S SIGNATURE
DATE SAMPLED*
DATE RECEIVED*
Parameter
Holding
Time
Holding Time
Plus
Date Sampled
Sample ID:
01
02
03
04
05
06
07
08
09
10
11
12
13
14
15
16
17
18
19
20
21
22
23
24
25
26
27
28
29
30
Ca
28
Mg
28
K
28
Na
28
Mn
28
Fe
28
Total
Extr. Al
7
Cl-
28
SO4 2-
28
NO3-
7
SiO2
28
ISE
Total F-
28
Date* Analyzed**
*Report these dates as Julian dates (i.e., March 26, 1984 = 4086).
**If parameter was reanalyzed due to QA problems, report the last date analyzed.
-------
NATIONAL SURFACE WATER SURVEY
FORM 19
Page 2 of 2
LAB NAME
BATCH ID
SAMPLE HOLDING TIME SUMMARY
LAB MANAGER'S SIGNATURE
DATE SAMPLED*
DATE RECEIVED*
Parameter
Holding
Time
Holding Time
Plus
Date Sampled
Sample ID:
01
02
03
04
05
06
07
08
09
10
11
12
13
14
15
16
17
18
19
20
21
22
23
24
25
26
27
28
29
30
DOC
14
NH4+
28
Eq. pH
7
Acidity
14
Alkalinity
14
Specific
Conductance
14
Eq. DIC
14
Init. DIC
14
Total P
28
Total Al
28
Date* Analyzed**
*Report these dates as Julian dates (i.e., March 26, 1984 = 4086).
**If parameter was reanalyzed due to QA problems, report the last date analyzed.
-------
NATIONAL SURFACE WATER SURVEY
FORM 20
1 of 2
LAB NAME
BLANKS AND QCCS
BATCH ID LAB MANAGER'S SIGNATURE
Parameter
Calibration
Blank
Reagent Blank
DL (Theoretical
QCCS Measured
Low QCCS
True Value
Low QCCS Upper
Control Limit
Low QCCS Lower
Control Limit
Initial
Continuing
Continuing
Continuing
Continuing
Continuing
Final
High QCCS
True Value
High QCCS Upper
Control Limit
High QCCS Lower
Control Limit
Initial
Continuing
Final
ALIQUOT ID
1
Ca
mg/L
N
Mg
mg/L
N
K
mg/L
N
Na
mg/L
N
Mn
mg/L
N
Fe
mg/L
N
2
Total
Extr.Al
mg/L
N
3
Cl-
mg/L
N
N
N
SO4 2-
mg/L
N
N
N
NO3-
mg/L
N
M
N
SiO2
mg/L
N
N
ISE
Total F-
mg/L
"N
N
N
Note: Approved Data Qualifiers and instructions for their use are listed in Table 9.8 of the QA plan.
-------
NATIONAL SURFACE WATER SURVEY
FORM 20
LAB NAME
BATCH ID
BLANKS AND QCCS
LAB MANAGER'S SIGNATURE
2 of 2
Parameter
Calibration
Blank
Reagent Blank
DL theoretical
QCCS measured
Low QCCS
True Value
Low QCCS Upper
Control Limit
Low QCCS Lower
Control Limit
Initial
Continuing
Continuing
Continuing
Continuing
Continuing
Final
High QCCS
True Value
High QCCS Upper
Control Limit
High QCCS Lower
Control Limit
Initial
Continuing
Final
4
DOC
mg/L
N
N
N
NH4+
mg/L
N
N
N
ALIQUOT ID
Measured
Eq.
pH
N
N
N
N
Alk
pH
N
N
N
N
Acy
pH
N
N
N
N
5
Spec.
Cond.
µS/cm
N
N
N
Eq.
DIC
mg/L
N
N
N
Init.
DIC
mg/L
N
N
N
6
Total
P
mg/L
N
7
Total
Al
mg/L
Note: Approved Data Qualifiers and instructions for their use are listed in
Table 9.8 of the QA plan.
A-13
-------
NATIONAL SURFACE WATER SURVEY
FORM 21
Page 1 of 1
LAB NAME
BATCH ID
MATRIX SPIKES
LAB MANAGER'S SIGNATURE
Parameter
MS First
(orig.)
Sample ID
Sample Result
Spike Result
Spike Added
Percent
Recovery
MS Second
Sample ID
Sample Result
Spike Result
Spike Added
Percent
Recovery
MS Third
Sample ID
Sample Result
Spike Result
Spike Added
Percent
Recovery
ALIQUOT ID
1
Ca
mg/L
Mg
mg/L
K
mg/L
Na
mg/L
Mn
mg/L
Fe
mg/L
3
Cl-
mg/L
SO4 2-
mg/L
NO3-
mg/L
SiO2
mg/L
ISE
Total F"
mg/L
4
DOC
mg/L
NH4+
mg/L
6
Total
P
mg/L
7
Total
Al
mg/L
Note: Matrix spikes not required for Aliquots 2 and 5.
-------
LAB NAME
NATIONAL SURFACE WATER SURVEY
Form 22
Page 1 of 2
DUPLICATES
BATCH ID
LAB MANAGER'S SIGNATURE
Parameter
Duplicate
Sample ID
Sample Result
Duplicate
Result
% RSD
Second Duplicate
Sample ID
Sample Result
Duplicate
Result
% RSD
Third Duplicate
Sample ID
Sample Result
Duplicate
Result
% RSD
ALIQUOT ID
1
Ca
mg/L
Mg
mg/L
K
mg/L
Na
mg/L
Mn
mg/L
Fe
mg/L
2
Total
Extr.Al
mg/L
3
Cl-
mg/L
SO4 2-
mg/L
NO3-
mg/L
SiO2
mg/L
ISE
Total F"
mg/L
Note: Approved Data Qualifiers and instructions for their use are listed in Table 9.8 of the QA plan.
-------
NATIONAL SURFACE WATER SURVEY
Form 22
DUPLICATES
Page 2 of 2
LAB NAME
BATCH ID
LAB MANAGER'S SIGNATURE
Parameter
Duplicate
Sample ID
Sample Result
Duplicate
Result
% RSD*
Second
Duplicate
Sample ID
Sample Result
Duplicate
Result
% RSD*
Third Duplicate
Sample ID
Sample Result
Duplicate
Result
% RSD*
ALIQUOT ID
4
DOC
mg/L
NH4+
mg/L
Measured
Eq.
pH
Alk
Initial
pH
Acy
Initial
pH
5
Acy
µeq/L
Alk
µeq/L
Spec.
Cond.
µS/cm
Eq.
DIC
mg/L
Init.
DIC
mg/L
6
Total
P
mg/L
7
Total
Al
mg/L
*Report the absolute difference instead of %RSD for pH determinations.
Note: Approved Data Qualifiers and instructions for their use are listed in Table 9.8 of the
QA plan.
A-16
-------
NATIONAL SURFACE WATER SURVEY
Form 23
Page 1 of 1
LAB NAME
BATCH ID
STANDARD ADDITIONS
SAMPLE ID LAB MANAGER'S SIGNATURE
Parameter
MS First
(orig.)
Sample ID
Single
Response
Spike added
(Conc.)
Sample
Spike 1
Response
Spike 2
Conc.
Sample
Spike 2
Response
Calc. Sample
Conc. for
Orig. Sample
(Summarized
on Form 11)
ALIQUOT ID
1
Ca
mg/L
Mg
mg/L
K
mg/L
Na
mg/L
Mn
mg/L
Fe
mg/L
3
Cl-
mg/L
SO4 2-
mg/L
NO3-
mg/L
SiO2
mg/L
ISE
Total F-
mg/L
4
DOC
mg/L
NH4+
mg/L
6
Total
P
mg/L
7
Total
Al
mg/L
Note 1: Approved Data Qualifiers and instructions for their use are listed in Table 9.8 of the QA plan.
Note 2: Standard additions not required for Aliquots 2 and 5.
-------
Appendix B
Telephone Record Log
The daily communications with the analytical laboratory are entered into the Telephone Record Log or the
laboratory notebook to track and resolve all problems encountered during analyses.
B-l
-------
FACSIMILE
NATIONAL SURFACE WATER SURVEY PROGRAM
EMSL-LV-QA MANAGER/LABORATORY COMMUNICATION SYSTEM
QA/QC Telephone Record Log
Date of Call:
Laboratory Name:
Lab Contact:
EMSL-LV QA Contact
or Designee:
Batch Number(s):
I. Initial and Continuing Calibration
A. Calibration blank is
-------
FACSIMILE
I. (Continued)
D. QCCS is plotted on a control
chart (95 and 99% control limits
are developed, acceptable range
is +10%)
Comments:
E. Calibration linearity is ML99
Comments:
F. Instrument detection limits
-------
FACSIMILE
II. Matrix Spike Analysis:
A. Matrix spike recoveries are 85-115%
Comments:
B. Samples are spiked at 2x the
endogenous level or at 10 x IDL.
Comments:
III. Duplicate Sample Analysis or (RSD)
A. RSD values are within the contract
required control limits
Comments:
B. RSD values are plotted on QC chart
and 95% and 99% control limits are
established
Comments:
IV. Ion Chromatography Resolution Test
Resolution of the anion separator is
≥60%
Comments:
Yes
No
If No:
Recommended
Corrective Action
Spike 2 additional
different samples.
If one or both are
still out, then
analyze by stan-
dard additions
Analyze a second
different sample
in duplicate
No further samples
may be analyzed
until duplicate
sample results
are within the
control limits
1. Replace anion
separator column
2. Repeat calibration
B-4
-------
FACSIMILE
V. Anion-Cation Balance Check
Calculated percent Ion Difference
(%ID) is less than the contract-
recommended criteria.
Comments:
VI. Conductance Balance Check
Calculated percent conductance
difference (%CD) meets the contract-
recommended criteria.
Comments:
Yes
No
If No:
Recommended
Corrective Action
Reanalyze samples
Consult QA manager
Reanalyze samples
Consult QA manager
Distribution: (1) Lab Copy, (2) EMSL-LV-QA Copy.
B-5
-------
Appendix C
Field On-Site Evaluation Questionnaire
c-i
-------
GENERAL (Page 1 of 1)
Questionnaire Completion Date
Field Station
Location
Field Laboratory Supervisor
Questionnaire Completed By (If more than one, indicate sections completed
by each auditor.)
-------
FIELD LABORATORY PERSONNEL (Page 1 of 1)
Position
Field Laboratory Coordinator
Field Laboratory Supervisor
Analyst
Analyst
Analyst
Name
Academic
Training*
Special
Training*
Years
Experience**
*List highest degree obtained and specialty. Also list years toward a degree.
**List only experience directly relevant to task to be performed.
-------
FIELD LABORATORY - STANDARD OPERATING PROCEDURES (Page 1 of 1)
Item
Is the training manual followed in detail?
Are copies available in the field station?
Are analysis logbooks kept up to date?
Are all on-site changes in procedures clearly documented and
justified, in field laboratory supervisor's logbook and approved
by appropriate personnel?
Yes
No
Comments:
C-4
-------
FIELD LABORATORY FACILITIES (Page 1 of 1)
Item
Is the field station kept clean and organized?
Are the refrigerator and freezer temperatures
monitored on a daily basis and recorded in a
logbook?
Are waste disposal containers available and
clearly labeled, and is waste disposed of
properly?
Are chemicals stored properly?
Is balance calibration checked daily and
recorded in a logbook?
Is water supply purity monitored daily and
recorded in a logbook?
Yes
No
Comment
General Comments:
C-5
-------
FIELD LABORATORY EQUIPMENT (Page 1 of 6)
Color Test Kits
Item
Is manufacturer's operating manual readily
available?
Is kit cleaned and stored properly?
Are viewing tubes kept clean?
Is logbook kept up to date and signed daily?
Is centrifuge maintained and kept clean?
Yes
No
Comment
Comments:
C-6
-------
FIELD LABORATORY EQUIPMENT (Page 2 of 6)
Nephelometer
Item
Is manufacturer's operating manual available?
Is instrument kept clean?
Are cuvettes kept clean and scratch-free?
Is logbook kept up to date and signed daily?
Is calibration checked before and after every
8 samples?
Are standards kept refrigerated when not in use?
Yes
No
Comment
Comments:
C-7
-------
FIELD LABORATORY EQUIPMENT (Page 3 of 6)
Carbon Analyzer
Item
Is manufacturer's operating manual available?
Is instrument kept clean?
Is the injection valve flushed daily after use
with deionized water?
Is logbook kept up to date and signed daily?
Is IR analyzer power left on at all times?
Is standard stock solution prepared biweekly,
and is QC stock solution prepared weekly;
are they stored at 4°C?
Are working standards prepared daily?
Is exposure of samples and standards to the
atmosphere minimized?
Is required QC followed?
Are pump tubes checked for wear and
replaced on a regular basis (about every 2 weeks)?
Are syringes and glassware cleaned
properly after use?
Are CO2 and moisture scrubbers on
standard bottles replaced when exhausted?
Is tin scrubber in IR analyzer checked
daily and refilled when necessary?
Comments:
C-8
-------
FIELD LABORATORY EQUIPMENT (Page 4 of 6)
pH Apparatus
Item                                                              Yes    No    Comment
Are meter and electrode operating manuals
available?
Is logbook kept up to date and signed daily?
Is pH QC sample prepared daily?
Is electrode stored in 3M KCl?
Is required QC followed?
Are electrodes checked and filled (if
necessary) prior to use?
Are sample chambers cleaned after use?
Are buffers capped tightly after use?
Comments:
C-9
-------
FIELD LABORATORY EQUIPMENT (Page 5 of 6)
Filtration and Preservation Apparatus
Item                                                              Yes    No    Comment
Is hood kept neat and clean?
Is contamination evident?
Is hood sealed when not in use?
Is filtration apparatus kept ultraclean as
specified?
Are precautions taken to prevent contamination
of filtrators, filter funnels, filters, sample
bottles, and reagents?
Is a water trap used with the vacuum pump?
Are micropipets kept in an upright position
at all times?
Is the calibration of micropipets checked
weekly?
Are sample aliquots properly labeled?
Is vacuum maintained at 10 to 12 inches Hg while filtering?
Comments:
C-10
-------
FIELD LABORATORY EQUIPMENT (Page 6 of 6)
MIBK Extraction
Item                                                              Yes    No    Comment
Is the centrifuge operating manual available?
Is the extraction logbook kept up to date and signed daily?
Is leakage of sample volume (>8.5 mL) noted in the logbook?
Are reagents (NaOAc and hydroxyquinoline) made fresh daily?
Is NH4OH made fresh weekly and the preparation recorded in the logbook?
Are pipets calibrated weekly?
Is the 25 mL of standard measured accurately?
Is the sample buffered to pH 8?
Is the buffer/MIBK solution shaken vigorously for 10 seconds?
Is disposal of solid and liquid wastes conducted
properly?
Comments:
C-11
-------
SAMPLE PROCESSING (Page 1 of 1)
Item                                                              Yes    No    Comment
Are all station documents kept in an orderly
fashion?
Are all forms completed as required and signed
by supervisor?
Is lab audit notebook kept up to date (labels
inserted, etc.)?
Are audit samples assigned different ID numbers
from day to day?
Are samples kept at 4°C when not being used?
Are coolers containing samples kept in the
shade and shut?
Are freeze-gel packs kept frozen?
Are samples properly packed for shipping
(sealed, cooled to 4°C, individually
wrapped, etc.)?
Are 2 copies of the completed shipping form
included with batch of samples?
Comments:
C-12
-------
FIELD SAMPLING CREW PERSONNEL (ELS-I) (Page 1 of 1)
Position                              Name        Agency        Academic Training*        Special Training        Years Experience**
A. Field Duty Officer
B. Team A
1. Pilot
2. Observer
3. Sampler
C. Team B
1. Pilot
2. Observer
3. Sampler
D. Team C
1. Pilot
2. Observer
3. Sampler
t". Team D
1. Pilot
2. Observer
3. Sampler
*List highest degree obtained and specialty. Also list years toward a degree.
**List only experience directly relevant to task to be performed.
-------
FIELD SAMPLING (Page 1 of 1)
Item                                                              Yes    No    Comment
Has adequate space been provided for
predeparture activities?
Are facilities clean and organized?
Is equipment clean and organized?
C-14
-------
HYDROLAB CALIBRATION (Page 1 of 1)
Item                                                              Yes    No    Comment
Are copies of the Hydrolab Manual available?
Is the logbook kept up to date and signed daily?
Is the instrument cleaned and stored properly?
Is someone at the field station capable of
performing the maintenance procedures
described in the manual?
Has any other maintenance been necessary?
Are adequate spare parts (batteries, etc.)
available?
Are backup units available?
Is the instrument performing well in the field?
Are there any problems with the instrument?
Is the person performing the calibration
familiar with the procedures and control limits?
Have there been any deviations from the standard
procedures?
Are the QC samples prepared each day?
Is the instrument calibrated before and rechecked
after each excursion?
C-15
-------