United States
Environmental Protection
Agency
Environmental Monitoring
Systems Laboratory
Las Vegas NV 89193-3478
Research and Development
EPA/600/S4-89/031 Nov. 1989
Project Summary
Direct/Delayed Response
Project: Quality Assurance
Plan for Preparation and
Analysis of Soils from the
Mid-Appalachian Region of the
United States
M. L. Papp, R. D. Van Remortel, C. J. Palmer, G. E. Byers, B. A.
Schumacher, R. L. Slagle, J. E. Teberg, and M. J. Miah
The quality assurance (QA) policy
of the U.S. Environmental Protection
Agency (EPA) requires every
monitoring and measurement project
to have a written and approved
quality assurance project plan. The
purpose of this quality assurance
plan is to specify the policies,
organization, objectives, and the
quality evaluation and quality control
activities needed to achieve the data
quality requirements of the
Direct/Delayed Response Project
(DDRP), Mid-Appalachian Soil Survey
(MASS). These specifications are
used to assess and control
measurement errors that may enter
the system at various phases of the
project, e.g., during soil sampling,
preparation, and analysis.
This Project Summary was
developed by EPA's Environmental
Monitoring Systems Laboratory, Las
Vegas, NV, to announce key findings
of the research project that is fully
documented in a separate report of
the same title (see Project Report
ordering information at back).
Project Description
The DDRP is conducted as part of the
interagency, federally mandated National
Acid Precipitation Assessment Program
which addresses the concern over
potential acidification of surface waters
by atmospheric deposition within the
United States. The overall purpose of the
project is to characterize geographic
regions of the United States by predicting
the long-term response of watersheds
and surface waters to acidic deposition.
The EPA is assessing the role that
atmospheric deposition of sulfur plays in
controlling long-term acidification of
surface waters. Recent trend analyses
have indicated that the rate of sulfur
deposition is either unchanging or slowly
declining in the Northeastern United
States, but is increasing in the
Southeastern United States. If a "direct"
response exists between sulfur
deposition and surface water alkalinity,
then the extent of current effects on
surface water probably would not change
much at current levels of deposition, and
conditions would improve as the levels of
deposition decline. If surface water
chemistry changes in a "delayed"
manner, e.g., due to chemical changes in
the watershed, then future changes in
water chemistry (even with level or
declining rates of deposition) become
difficult to predict. This range of potential
effects has clear and significant
implications for public policy decisions on
sulfur emissions control strategies.
The DDRP focuses on regions of the
United States that have been identified as
potentially sensitive to surface water
acidification. The MASS is the third of
three DDRP regional surveys and
includes portions of Pennsylvania,
Virginia, and West Virginia. Surface water
acidification in the Mid-Appalachian
region was studied during the Eastern
Lakes Survey in 1984 and in the National
Stream Survey, Phase I, in 1985.
Information gained during these and
other surveys has been used to optimize
the quality assurance program in the
MASS.
Field and laboratory data collected in
the DDRP are included in the system
description modeling and are then used
to assess the key processes that regulate
the dynamics of base cation supply and
sulfur retention within watersheds.
Integrated watershed data are used to
calibrate three dynamic simulation
models that predict future regional
responses to acidic deposition.
Uncertainty and error propagation
estimates are an important part of all
levels of analysis.
The DDRP staff at the EPA
Environmental Research Laboratory in
Corvallis, Oregon, is involved in all
aspects of the MASS. Responsibilities of
this laboratory include the development
of an experimental design, QA oversight
for soil mapping and sampling, and data
analysis and interpretation. The DDRP
staff at the EPA Environmental Monitoring
Systems Laboratory in Las Vegas,
Nevada, is involved in matters relating to
QA implementation, logistics, and
analytical services. The responsibilities of
this laboratory include the development
and implementation of operational
procedures and quality evaluation/control
criteria for soil sample preparation and
analysis, and the development of data
verification procedures for field and
laboratory data.
The use of interagency agreements by
EPA has allowed the DDRP to engage
support from several groups that have
expertise in areas of importance to the
project. The DDRP staff at Oak Ridge
National Laboratory is involved in
structuring and managing large data
bases to satisfy DDRP data analysis
requirements. The soil survey staff of the
United States Department of Agriculture,
Soil Conservation Service is involved in
DDRP soil mapping and sampling.
Laboratory analysis support is solicited
through competitive bid by independent
contract laboratories.
Field and Laboratory
Operations
Field mapping and sampling operations
in the MASS are conducted by Soil
Conservation Service soil scientists under
the supervision of EPA representatives.
Approximately five sampling crews, each
composed of three to four qualified
persons, are selected. These crews are
responsible for the description of soil and
site characteristics, sample collection,
and shipment of samples to the
preparation laboratory. Members of each
crew are soil scientists who are
experienced with the universally
recognized National Cooperative Soil
Survey characterization procedures. The
crew leader is responsible for selecting
sampling sites and documenting all field
data.
The sampling protocols specify the
procedures for ensuring the integrity of
the samples. Specific instructions on
excavating and draining (if necessary)
soil pits are given. Crews are provided
with standard containers for sample
collection. Samples are kept as cool as
possible in the field and during transport
to the preparation laboratory. In order to
maintain their integrity and to reduce the
possibility of biological degradation,
samples are stored at a temperature of
4°C within 24 hours of sampling. It is
anticipated that 150 pedons are to be
sampled during the MASS, with each
pedon yielding an average of six routine
samples from its component horizons.
To allow the determination of bulk
density, natural soil clods are collected in
triplicate, where possible, from each soil
horizon. Where clods cannot be
collected, known volumes of soil are
collected in duplicate. Using a volume
replacement method, the volume of soil
is estimated by replacing a small
excavated soil cavity with a known
volume of foam beads. If this method
does not yield satisfactory samples, a
volume filling method is attempted by
filling a container of known volume with
soil and attempting to pack it to the same
density normally encountered in the
horizon being sampled.
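Whichever method supplies the volume, the bulk density computation itself is the oven-dry soil mass divided by the in-place sample volume. The following sketch is purely illustrative (the function name and numbers are not part of the survey protocols):

```python
# Illustrative sketch: bulk density = oven-dry mass / in-place volume,
# whether the volume comes from a clod, bead replacement, or a filled
# container of known volume.

def bulk_density(dry_mass_g: float, volume_cm3: float) -> float:
    """Return bulk density in g/cm^3."""
    if volume_cm3 <= 0:
        raise ValueError("sample volume must be positive")
    return dry_mass_g / volume_cm3

# Volume replacement method: the excavated cavity volume is taken as
# the volume of foam beads needed to refill it (hypothetical values).
cavity_volume = 185.0        # cm^3 of beads used to refill the cavity
excavated_dry_mass = 241.5   # g, oven-dry mass of the excavated soil
print(round(bulk_density(excavated_dry_mass, cavity_volume), 2))  # prints 1.31
```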
The preparation laboratory is the
intermediate link between the sampling
crews and the analytical laboratories. The
laboratory is located at the EPA
Environmental Monitoring Systems
Laboratory in Las Vegas, Nevada,
although previous DDRP surveys utilized
the services of multiple university
laboratories for sample preparation. It
was thought that consolidating the
preparation activities at a single
laboratory might result in better sample
handling, improved sample integrity, and
higher data quality. The laboratory serves
as a central location for soil processing
and for introducing double-blind
measurement quality samples into the
sample flow. The soil samples collected
by the sampling crews are sent by
overnight courier to the preparation
laboratory, where the laboratory staff
processes the samples and prepares
homogeneous, anonymous subsamples
that are shipped to the analytical
laboratories. To accomplish these tasks
successfully, the preparation laboratory
must uniformly track, process, and store
all samples.
After a bulk soil sample is air dried, the
soil is carefully disaggregated and
sieved. The less than 2-mm soil is
retained in a labeled plastic bottle and
placed in cold storage when not
undergoing processing. The rock
fragments are retained for gravimetric
analysis. In order to obtain representative
volumes of soil, it is necessary to prepare
homogeneous subsamples from the less
than 2-mm soil fraction using a riffle
splitter. The subsamples are placed in
labeled plastic bottles and are organized
in batches that are shipped to separate
contract analytical laboratories for general
and elemental analyses. Each batch
normally contains 40 samples but may
have as many as 42 samples. With the
exception of a quality control audit
sample, all samples are randomly placed
within a batch and cannot be
distinguished by the recipient laboratory.
The unused portions of the bulk soil
samples are archived in cold storage.
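The batch-assembly constraints just described (at most 42 samples, one non-blind quality control audit sample, random placement of everything else) could be sketched as follows; the helper and the identifiers are hypothetical, not part of the plan:

```python
import random

def build_batch(routine_ids, qe_ids, qc_audit_id, max_size=42):
    """Assemble one analytical batch: routine samples and double-blind
    quality evaluation samples are shuffled together so the recipient
    laboratory cannot distinguish them; only the quality control audit
    sample is identifiable (non-blind)."""
    anonymous = list(routine_ids) + list(qe_ids)
    if len(anonymous) + 1 > max_size:
        raise ValueError("batch would exceed the maximum of 42 samples")
    random.shuffle(anonymous)        # random placement within the batch
    return anonymous + [qc_audit_id]
```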
All raw data obtained are recorded on a
series of preparation laboratory raw data
forms and are immediately entered into a
personal computer to ensure proper
tracking, processing, and evaluation of
the samples. The computer is directly
linked to the QA staff and allows real-time
data evaluation and tracking of samples
by both parties. The precision and
accuracy criteria for measurement quality
samples are checked while the samples
are being processed, permitting the
prompt identification of any discrepancies.
During shipment, the samples received
by the analytical laboratories can
undergo segregation both by particle size
and by density; therefore, each sample
must be re-homogenized by thorough
mixing prior to the removal of aliquots for
analysis. The raw data generated during
sample analysis are entered on a
personal computer using a specially
designed entry and verification system.
Data reports are then generated to assist
in the evaluation of data quality. Before a
batch of analytical data is accepted, all
completeness and quality control
requirements must be met. In addition,
the following documents must be
updated constantly at the analytical
laboratory and must be available, upon
request, to the analysts and the
supervisor involved in the project:
standard operating procedures,
laboratory QA plan, list of in-house
samples, instrument performance data,
control charts for all check samples and
detection limit check samples, and
quality control data reports.
Frequent communications, i.e., two or
three contacts each week, are maintained
with each analytical laboratory in order to
obtain current sample status and to
discuss any problems that may occur
during analyses. These discussions are
to be recorded in a logbook by the QA
staff and the laboratory staff. Preliminary
and final data are available for access via
electronic transfer. The preliminary data
are reviewed for anomalies and if a
problem is identified, the laboratory is
notified. Corrective action or reanalysis
may be suggested.
Legal chain-of-custody procedures are
not required for this study; however,
sample custody must be documented.
The sampling crew leader is responsible
for all samples in the field as well as the
sample shipments. As soil shipments are
received in the preparation laboratory,
they are immediately logged in and
checked against the packing slip and the
pedon description forms for proper
labeling and quantity. If any discrep-
ancies are found, the soil sampling task
leader and the field crew leader are
notified. Following the receipt of samples,
the appropriate laboratory managers are
responsible for sample integrity until the
samples are archived.
Quality Assurance Program
The data collection criteria provide a
balance between constraints of time and
cost and the quality of data necessary to
achieve the DDRP research objectives.
The MASS QA Plan is designed to
accomplish the following general
objectives:
• establish the QA criteria used to control
and assess data collected in the
survey,
• provide standardized sampling,
preparation, and analytical methods
and procedures,
• utilize assessment samples and
procedures to verify the quality of the
data,
• perform field and laboratory on-site
audits to ensure that all activities are
properly performed and that
discrepancies are identified and
resolved,
• evaluate the data and document the
results in QA reports to EPA manage-
ment.
The raw soil characterization data for
the DDRP surveys are collected during
four major operational phases consisting
of mapping, sampling, preparation, and
analysis. A certain amount of data
measurement uncertainty is expected to
enter the system at each phase.
Grouping of the data, e.g., by sampling
class/horizon configurations, also
increases uncertainty. The sampling
population itself is a source of
confounded uncertainty that is extremely
difficult to quantify.
Generally, the data quality objectives
for the MASS encompass the overall
allowable uncertainty from sample
measurement and from the sampling
population that the data users are willing
to accept in the analytical results.
Because of the many possible
confounded sources of uncertainty and
the data collection focus of this QA plan,
overall data quality objectives for the
survey are not described herein. Rather,
the plan focuses on the definition,
implementation, and assessment of
measurement quality objectives (MQOs)
that are specified for the entire sample
preparation and analysis phases of data
collection and portions of the field
sampling phase. The MQOs are more or
less specific goals defined by the data
users that clearly describe the data
quality that is sought for each of the
measurement phases. The MQOs are
defined according to the following six
attributes:
• Detectability—the lowest concentration
of an analyte that a specific analytical
procedure can reliably detect,
• Precision—the level of agreement
among multiple measurements of the
same characteristic,
• Accuracy—the difference between an
observed value and the true value of
the parameter being estimated,
• Representativeness—the degree to
which the data collected accurately
represent the population of interest,
• Completeness—the quantity of data
that is successfully collected with
respect to the amount intended in the
experimental design,
• Comparability—the similarity of data
from different sources included within
individual or multiple data sets; the
similarity of analytical methods and
data from related projects across
regions of concern.
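Two of these attributes, precision and accuracy, reduce to simple formulas. The sketch below shows one common way to compute them (the function names and values are illustrative, not MASS data or prescribed formulas):

```python
import statistics

def precision_rsd(measurements):
    """Precision as percent relative standard deviation among
    repeated measurements of the same characteristic."""
    return 100.0 * statistics.stdev(measurements) / statistics.mean(measurements)

def accuracy_bias(observed, true_value):
    """Accuracy as the percent difference between an observed value
    and the accepted (true) value of the parameter."""
    return 100.0 * (observed - true_value) / true_value

print(round(precision_rsd([5.1, 4.9, 5.0]), 1))  # prints 2.0
print(round(accuracy_bias(4.8, 5.0), 1))         # prints -4.0
```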
An attempt has been made to define
each quality attribute individually for each
measurement phase of the program
(sampling, preparation, analysis) or
collectively for overall measurement
system uncertainty on a parameter-by-
parameter basis. The term "uncertainty"
is used as a generic term to describe the
sum of all sources of error associated
with a given portion of the measurement
system. In order to reduce measurement
uncertainty in the final data values, error
occurring in each phase must be
identified and rectified during that phase
of the survey. Measurement uncertainty
for the MASS is controlled through well-
defined protocols, system audits, and the
introduction of measurement quality
samples at each measurement phase.
Initial MQOs were established on the
basis of the requirements of EPA data
users and on the selection of appropriate
methods to obtain the data. The MQOs
were reviewed by persons familiar with
analytical methods and soil
characterization techniques, including soil
chemists and laboratory personnel.
Modifications to the MQOs and protocols
were implemented based on information
gained from the DDRP Northeastern
region and Southern Blue Ridge Province
surveys, from peer review comments, or
according to a particular analytical
procedure or instrument.
Measurement quality samples are
placed at a rate of about one sample in
four in each batch of analytical samples
to determine measurement uncertainty.
The measurement quality samples are
used during various phases of the survey
and allow the QA staff to control and
assess the quality of the data collection
process. Included in each batch of
samples are: audit samples that have
known ranges of analyte concentration
(detectability, precision, and accuracy),
duplicates of routine samples introduced
at each measurement phase (precision
and representativeness), and an audit
sample for use in laboratory quality
control (accuracy). There are separate
analytical within-batch precision and
accuracy MQOs for organic and mineral
soils because of differences in
methodology and the expected wide
variability in analyte concentrations. For
this reason, mineral and organic samples
are organized and analyzed in separate
batches (see below).
The quality evaluation samples are
those samples which are known to the
QA staff but are either blind or double
blind to the sampling crews, preparation
laboratory, or analytical laboratory. A
blind sample has a concentration range
that is unknown to the analysts, whereas
a double blind sample cannot be
distinguished from a routine sample and
has a concentration range that is
unknown to the analyst. These samples
provide an independent check on the
analytical process and can be used to
evaluate whether the MQOs have been
met for any given run or batch, or for all
batches, i.e., overall measurement
uncertainty. Important characteristics of
the audit samples include their similarity
to routine samples in matrix type and
concentration level, their homogeneity
and stability, and their defined accuracy
windows. Every quality evaluation sample
has a specific purpose in the data
assessment scheme.
In order to produce data of consistently
high quality, the contract laboratories are
required to analyze certain types of
quality control samples that are known to
the laboratory staff and that can be used
by the analysts to identify and control
analytical measurement uncertainty. Each
of these samples has certain
specifications that must be met before
data for the parameter or batch are
accepted. The control samples are non-
blind samples procured under contract to
assist the laboratories in meeting
laboratory MQOs and include soil
samples, e.g., analytical duplicates, and
non-soil samples, e.g., reagent blanks.
The samples allow the laboratory
manager and the QA staff to assess
whether the physical and chemical
analysis is under control.
Quality Assurance
Implementation
The quality assurance program is
implemented through training, on-site
systems audits, independent assess-
ments, and other procedures used to
control and assure the quality of the data
being collected. Verification of these data
is accomplished through a series of
computer and manual checks of data
quality.
Corrective action for errors made at the
preparation and analytical laboratories is
accomplished primarily through the
application of a QA reanalysis template
for each analysis of interest. The
templates provide a precision and
accuracy checklist of the measurement
quality samples for each batch and are
useful in deciding whether to reanalyze a
particular parameter.
The analytical data verification is a
multi-faceted, computerized approach that
provides a concise and consistent
assessment of the data. The overall
process is highlighted by the Laboratory
Entry and Verification Information System
(LEVIS). The LEVIS programs are
implemented on personal computers and
facilitate the data entry and quality
control sample evaluation at the analytical
laboratories as well as the evaluation of
laboratory performance by the QA staff.
The system is a menu-driven product
that is designed as a two-phase
operation, where phase one is an
analytical laboratory system and phase
two is a quality assessment system. The
LEVIS initiative was pursued because
previous surveys had shown that manual
verification of the data was both labor-
and time-intensive, and allowed only a
limited amount of real-time corrective
action on the part of the laboratories.
An internal consistency program is
used to generate routine data outliers in
each sample batch. Analytical data from
each parameter are correlated against
corresponding data from all other
analytical parameters measured in the
MASS. For each parameter, the
parameter pair with the strongest linear
relationship is identified and evaluated.
Soil chemistry relationships are another
tool used to examine the internal
consistency of the routine sample data. It
is expected that a small percentage of the
data will not comply with the relationships;
these anomalous data are examined by a
staff soil chemist who either qualifies them
or assigns appropriate flags signifying the
discrepancy.

Status and Assessment of Measurement Quality Samples

                                   Status of Sample(a)                        Assessment Purpose(b)
Sample Type               # Per  QA     Sampling  Preparation  Analytical   System  Sampling  Preparation  Analyst
                          Batch  Staff  Crew      Laboratory   Laboratory
Low-range field audit(c)    1    K      B         B            DB           D,A     D,A       D,A          A
Field audit pair            2    K      B         B            DB           P,A     P,A       P,A          P,A
Field duplicate             1    --     B         B            DB           P       P,R       P            P
Low-range lab audit(c)      1    K      --        --           DB           --      --        --           D
Lab audit pair(d)           2    K      --        --           DB           --      --        --           P,A
Preparation duplicate       2    --     --        B            DB           --      --        P,R          P
Quality control audit       1    K      --        --           B            --      --        --           A
Manager's sample            --   K      --        B            --           --      --        A            --

(a) K = known concentration, B = blind, DB = double blind.
(b) D = detectability/contamination, P = precision, A = accuracy, R = representativeness.
(c) Not placed in organic soil batches.
(d) Triplicate in organic soil batches.
Data Quality Assessment and
Reporting
The assessment of detectability is
accomplished on a parameter basis at
four levels: (1) compliance with contract-
required detection limits; (2) calculation
of actual instrument detection limits; (3)
calculation of estimated system detection
limits; and (4) identification of routine
samples having concentrations below the
system detection limits. The results can
be grouped in tabular form to allow
comparisons among the values for any
parameter of interest.
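One common convention for step (2), assumed here for illustration rather than taken from the plan, estimates an instrument detection limit as three times the standard deviation of replicate blank or low-level measurements; routine results below that limit can then be flagged as in step (4):

```python
import statistics

def instrument_detection_limit(blank_replicates, k=3.0):
    """Estimate a detection limit as k times the standard deviation
    of replicate blank or low-level measurements (k = 3 is common)."""
    return k * statistics.stdev(blank_replicates)

blanks = [0.02, 0.03, 0.01, 0.02, 0.03, 0.02, 0.01]  # hypothetical readings
idl = instrument_detection_limit(blanks)
below = [c for c in (0.015, 0.40, 0.02) if c < idl]  # samples below the limit
```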
A statistical evaluation procedure that
has been developed by the QA staff and
data users is applied to the data in order
to assess precision as a function of
confounded data collection uncertainty.
An additive step-function model is used,
where an observed value of any soil
characteristic is considered as the sum of
the "true" or accepted value and an error
term. Precision is evaluated for each
variance segment of the range of
concentration for a given analyte.
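Under stated assumptions (duplicate-pair data and a single illustrative breakpoint, not the model's actual segmentation), the step-function idea can be sketched as estimating precision separately within each concentration segment:

```python
import statistics

def precision_by_segment(duplicate_pairs, breakpoint):
    """Each observed value is modeled as the accepted value plus an
    error term; precision (here, the standard deviation of duplicate
    differences) is estimated separately per concentration segment."""
    segments = {"low": [], "high": []}
    for a, b in duplicate_pairs:
        seg = "low" if (a + b) / 2.0 < breakpoint else "high"
        segments[seg].append(a - b)
    return {seg: statistics.stdev(d) if len(d) > 1 else None
            for seg, d in segments.items()}
```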
The accuracy windows for the
laboratory audit samples are based on
previous interlaboratory analyses of the
same sample material by the same
protocols. The objective of setting
windows is to specify a range of
acceptable single values based on a
mean and standard deviation computed
from a number of previously observed
values. The prediction intervals used for
the accuracy windows are generally
determined with confidence intervals
constructed around the mean of these
values, using appropriate weighting
factors.
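A minimal sketch of such a window, assuming a t-based prediction interval for a single new value (the critical value here is a placeholder constant, and no weighting scheme from the plan is reproduced):

```python
import math
import statistics

def accuracy_window(prior_values, t_crit=2.0):
    """Prediction interval for one new measurement, built around the
    mean and standard deviation of previously observed values.
    t_crit would normally come from a t-table for the chosen level."""
    n = len(prior_values)
    mean = statistics.mean(prior_values)
    s = statistics.stdev(prior_values)
    half = t_crit * s * math.sqrt(1.0 + 1.0 / n)
    return mean - half, mean + half

low, high = accuracy_window([10.0, 10.2, 9.8, 10.1, 9.9])
```

A single audit result falling outside (low, high) would then trigger review of that batch.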
The sampling aspect of repre-
sentativeness is assessed by comparing
the individual site and pedon
classifications with the component
sampling classes to which the soils are
assigned. Representativeness of the
measurement quality samples is
assessed by comparing the concentration
ranges of data from the duplicate
samples to the overall concentration
range of the routine sample data.
Representativeness of the analytical
samples is identified by assessing the
homogenization and subsampling
procedures at the preparation and
analytical laboratories using precision
estimates from the duplicate samples.
Sampling completeness is assessed by
comparing the actual number of soil
pedons and associated horizons sampled
to that number specified in the MASS
design. Completeness of the sample
collection, preparation, and analysis is
calculated using data from the verified
data base, while completeness of the
data from a data user's perspective, i.e.,
amount of usable data, is determined
using the validated data base.
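Completeness itself reduces to a simple ratio; the counts below are illustrative, not MASS results:

```python
def completeness_pct(collected, intended):
    """Percentage of the intended number of samples actually obtained."""
    return 100.0 * collected / intended

# e.g., 150 pedons planned, averaging six routine samples each
print(round(completeness_pct(862, 150 * 6), 1))  # prints 95.8
```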
Following completion of the MASS, a
comparison is made across the
Northeastern, Southern Blue Ridge, and
Mid-Appalachian regions that focuses on
method differences, audit sample results,
laboratory effects, and other QA features
of the surveys. Comparison of the DDRP
data bases to other similar data bases
may also be undertaken. Summary
statistics are used to collate individual
values into groups that enable the data
users to discern trends of interest among
the surveys.
Task leaders for the various stages of
the MASS provide a written summary of
operations to the project managers on a
quarterly basis. These reports describe
the kinds of data collected as well as
summarize the QA activities associated
with the data. The summary of QA
activities includes the following:
• Overview of QA activities,
• List of changes to the QA program,
• Results of system and performance
audits,
• Assessment of data quality based on
the verified data bases,
• Documentation of unfavorable incidents
and corrective actions,
• Distribution of updated control charts,
• Results of special studies.
Reports relating to data quality
assessment are also prepared by QA
staff. These reports include interpretation
of performance audits and replicate
sample data, estimates of measurement
error, and identification of any major
discrepancies found during the
assessment. In addition to this QA Plan,
the QA staff produces laboratory
operations manuals for use in the
preparation and analysis of soil samples.
Upon completion of the data verification
activities, summary QA reports of the
sample preparation and sample analysis
data are produced and distributed to all
cooperating DDRP staff and data users.
Data Management System
The purpose of data base management
for the MASS is to facilitate the collection,
entry, review, modification, and distri-
bution of all data associated with the
survey. Additional tasks include the data
analysis for report generation, statistical
analysis, verification, and data file
security. The MASS final data base
contains three progressive versions of the
data gathered: (1) raw data base, (2)
verified data base, and (3) validated data
base.
M. L. Papp, R. D. Van Remortel, C. J. Palmer, G. E. Byers, B. A. Schumacher, R.
L. Slagle, J. E. Teberg, and M. J. Miah are with Lockheed Engineering and
Sciences Company, Las Vegas, NV 89119.
L. J. Blume is the EPA Project Officer (see below).
The complete report, entitled "Direct/Delayed Response Project: Quality
Assurance Plan for Preparation and Analysis of Soils from the Mid-Appalachian
Region of the United States," (Order No. PB 90-116 971/AS; Cost $31.00, subject
to change) will be available only from:
National Technical Information Service
5285 Port Royal Road
Springfield, VA 22161
Telephone: 703-487-4650
The EPA Project Officer can be contacted at:
Environmental Monitoring Systems Laboratory
U.S. Environmental Protection Agency
Las Vegas, NV 89193-3478