United States
Environmental Protection
Agency
Office of Solid Waste and
Emergency Response
(5102G)
EPA 542-R-01-015
October 2001
www.epa.gov
www.clu-in.org
EPA Current Perspectives in Site Remediation and Monitoring
THE RELATIONSHIP BETWEEN SW-846, PBMS, AND INNOVATIVE
ANALYTICAL TECHNOLOGIES
D. M. Crumbling1 and Barry Lesnik2
Introduction
This summary explains EPA's position
regarding testing methods used within waste
programs, documentation of EPA's position,
the reasoning behind EPA's position, and the
relationship between analytical method
regulatory flexibility and the use of on-site
measurements (also termed "field analytical
methods") to improve the cost-effectiveness
of contaminated site cleanups.
Although the flow of site cleanup work can be
accelerated and site cleanup can be more
economical when on-site analytical methods
are used, the adoption of field methods has
been hindered by misunderstandings about
regulatory requirements for data quality and a
traditional reliance on fixed laboratory
methods to provide nearly all of the data upon
which site decisions are based. Contrary to
widespread opinion [see Reference 1, Note 4],
EPA policy does NOT "approve" (in a restric-
tive sense) which specific analytical methods
may be used to generate most of the analytical
chemistry data used within the "waste
programs" (such as the RCRA, Superfund, or
other contaminated site cleanup programs).
However, to support the analytical needs of
the RCRA program (and by extension, other
waste/contaminated site management prog-
rams), EPA has created and maintains a
methods compendium, entitled Test Methods
for Evaluating Solid Waste, Physical/
Chemical Methods (also known as "SW-
846"). [EPA's SW-846 Manual and suppor-
ting information are available on-line at:
http://www.epa.gov/SW-846/sw846.htm.]
SW-846 is currently in its Third Edition, and
Draft Update IVB has just been issued [see
Reference 9].
SW-846 is a guidance document meant to
assist analytical chemists and other users by
suggesting sampling and analytical proce-
dures that have undergone thorough evalua-
tion to identify the strengths and weaknesses
of the methods, and the expected analytical
performance for the range of sample types
evaluated [see Reference 1, Notes 1 and 3]. It
is EPA's position that for the majority of
methods in SW-846 (which are not method-
defined parameters, as discussed further
below):
• SW-846 is NOT the ONLY source of
methods that can be used.
• Methods in SW-846 do NOT need to be
implemented exactly as written in SW-
846.
• Performance data presented in SW-846
methods should NOT be used as
regulatory default or absolute "QC
requirements" [see Reference 2, pages
TWO-1 and TWO-2].
1 EPA, Technology Innovation Office
2 EPA, Office of Solid Waste, Economics, Methods, and Risk Analysis Division
Causes of Confusion about "EPA-Approved
Methods"
Policies in other programs
One source of confusion about requirements for "EPA-
approved methods" within the waste programs stems
from the fact that EPA's Water programs have regula-
tory requirements for "EPA-approved reference
methods" which are specifically written into their regu-
lations. If these reference methods are followed exactly,
by definition in these programs, the data generated using
these reference methods are automatically considered to
be appropriate for regulatory compliance. These
methods are mandatory and prescriptive. Because so
many U.S. commercial laboratories are set up to comply
with Water program requirements, and since discussions
about "EPA methods" are rarely explicit about the
specific regulatory program under discussion, many
people assume that the same policy of prescriptive
methods applies to all EPA programs. However, EPA's
Office of Solid Waste and Emergency Response
(OSWER), which includes the programs responsible for
cleaning up contaminated sites, operates under a very
different analytical paradigm than the Water programs.
There are a number of reasons for this. The most
important reason is that the variety of matrices
encountered in the waste management and cleanup
programs, and the variety of decisions involving those
matrices, are much too diverse to expect prescriptive,
one-size-fits-all sampling and analytical methods to
produce scientifically defensible data across this entire
range of variables [see Reference 2, PREFACE-1].
Therefore, there are no "reference methods" (as defined
by EPA's Water program) included in SW-846. Within
the waste programs, all SW-846 methods or other
appropriate methods must be demonstrated by the
analyst to generate scientifically reliable data (i.e., data
of known quality) for the analytes of concern, in the
matrices of concern, at the concentrations of concern,
within the context of the intended application. EPA
policy in the waste programs is that analyses are
required to "get the right answer" as demonstrated by
quality assurance mechanisms. If an accepted method
cannot "get the right answer" due to analytical difficul-
ties with the matrix, etc., selection of a different method,
or modification of a method is required. Having run a
method "as written" is no excuse for reporting faulty
data. These issues will be discussed in additional detail
below.
The CLP in the Superfund program
Another cause for confusion is that the Superfund
program maintains the Contract Laboratory Program
(CLP), which has had a history of highly prescribed
methodologies, reporting limits, and QA/QC criteria.
The prescriptiveness of the CLP results, however,
not from regulatory requirements, but from
contract requirements. The Contract Laboratory
Program is maintained by the Superfund program as a service
to provide Regional offices with ready access to contract
laboratory services. The CLP strives for consistency in
data reporting formats and expected analytical data
quality [see information on http://www.epa.gov/
superfund/programs/clp/aboutsrv.htm], since CLP
results are often relied upon for enforcement and other
sensitive situations. This is done by having laboratories
that participate in this program agree to the exact terms
of the contract in order to win a place on the list of
available CLP labs. Although many of the terms of past
contract mechanisms were planned with the expectation
that compliance with those terms would ensure consis-
tency in analytical quality, the realities of waste sample
types and difficult analytes have the potential to
generate inaccurate or non-informative data if needed
method modifications, such as changing extraction
procedures or adding cleanup steps, were not permitted
according to the terms of the contract. Fortunately, this
situation has changed substantially in recent years as the
CLP explores new ways to permit the analytical
flexibility needed to ensure data quality on a sample-by-
sample basis, while still accommodating the require-
ments of a government contract mechanism (for
additional information, see www.epa.gov/superfund/
programs/clp/methflex.htm).
It should also be noted that use of CLP services by EPA
Regions is not required, but is one option open to site
managers to simplify their workloads. Obviously then,
compliance with CLP contract requirements is not
incumbent upon entities not governed by those EPA
contracts. Nonetheless, because of the discomfort of
most environmental managers with selecting or evalua-
ting analytical chemistry methods, any mechanism that
seems to offer the ability to simply "check a box" when
choosing laboratory services and abdicate responsibility
for ensuring data quality to someone else (i.e., the
"government") possesses an irresistible attraction. This
has led many labs to market themselves to the private
sector as CLP labs or equivalent in order to ride on the
coat tails of the CLP mystique, has fueled the miscon-
ception that prescriptiveness in analytical methodologies
can ensure consistently accurate data, and has furthered
a misconception that CLP laboratories are certified or
accredited by EPA (which they are not). On the other
hand, where this has tended to promote consistency in
reporting formats and enforcement expectations, there
has no doubt been benefit.
Interpretation of the term "approved method"
Another reason for confusion has to do with the use of
the word "approved," and understanding what that
means. SW-846 methods are said to be "approved" for
use under the RCRA regulatory program to "comply
with the requirements of subtitle C of the Resource
Conservation and Recovery Act (RCRA)" [see Refer-
ence 1, page 3089 and Reference 2, PREFACE-1]. To
support application to RCRA programs, "SW-846
analytical methods are written [in their most rigorous
form] as quantitative trace [<1000 ppm] analytical
methods to demonstrate that a waste does not contain
analytes of concern that cause it to be managed as a
hazardous waste" [see Reference 2, page TWO-1 and
section 2.1.1]. This form of "EPA-approval," however,
is not the same as that granted for methods in the Water
programs, where methods must be used and followed as
written. Approved methods for the RCRA program (i.e.,
SW-846 methods) are methods that have been validated
for, and should be able to be used for, most RCRA
applications, but they are not required to be used (except
for the method-defined parameters discussed below).
The application of "EPA-approved" methods is thus
neither exclusive nor restrictive [see Reference 2,
page DISCLAIMER-1]. SW-846 generally offers several
alternative methods for the same class of analytes, any
of which might be selected to measure target analytes in
the context of a particular project or permit. Methods
that are not in SW-846 may also be selected. No matter
whether a waste generator uses an SW-846 method or
alternative method, the user must still demonstrate that
the method is applicable for its intended purpose [see
Reference 1, Note 2]. If a method not published in SW-
846 is proposed, and there is also little or no perfor-
mance data published in any peer-reviewed forum, then
the amount of analytical documentation that will need to
be submitted to the regulatory body for scientific
evaluation will be greater than when an established
method is used. But the regulatory body should not
reject any scientifically valid method that is proposed by
a regulated entity simply because it does not appear on
the "SW-846 list." As long as a method can be
demonstrated to achieve the needed sensitivity and
accuracy for the target analytes in the matrix in question,
then that method should be considered as a viable
analytical option.
On the other hand, simply because a method from SW-
846 is selected does NOT mean that it can be assumed
that implementing the method as written will auto-
matically produce reliable data for a particular applica-
tion. Especially when unusual or complex matrices are
involved, SW-846 methods must still undergo a "demon-
stration of applicability" to establish adequate analytical
performance in the context of that application [see
Reference 2, Section 2.1]. Modification of generalized
methods is often required to improve method performance
for certain target analytes in certain matrices. It
should also be noted that SW-846 methods are NOT
equivalent to Standard Operating Procedures (SOPs),
and cannot be substituted for project-specific or labora-
tory-specific SOPs [see Reference 2, PREFACE-1].
Failure to distinguish the impact of sampling
considerations
One-size-fits-all approaches assign accountability based
on whether a certain procedure was followed, not on
whether work was performed correctly and accurately.
While comforting in the short-run because of their
simplicity, one-size-fits-all approaches are truly useful
only as stop gap measures until more reliable informa-
tion or understanding becomes available. Reliance on
them after that becomes counter-productive, error-prone,
and wasteful in the long-run as evidence of decision
errors (due to faulty underlying assumptions) accumu-
late. The expectation that simply regulating how
analytical methods are used can guarantee sufficient data
quality is seductive to regulators and practitioners alike
because it avoids the much more difficult issues of
project planning, sample representativeness, and the
integration of professional/technical competence and
scientific advancement into all levels of project imple-
mentation [see Reference 2, PREFACE-1]. It is also
convenient to think that if project decisions are later
shown to be "wrong," the blame can be assigned to the
laboratory for generating the "wrong" results.
But the unavoidable truth is that even if the most highly
accurate laboratory methods were used on each
individual sample, the data will be meaningless or
misleading if the sample collection procedures (proce-
dures implemented by field personnel over which
laboratories have no control) do not ensure the represen-
tativeness of the samples in the context of the project
decisions. In other words, does the sample selection
process ensure that data from those samples will
represent the parameter of interest? A sampling design
that is supposed to determine whether spills or leaks
could have occurred at a site will be very different from
a sampling design that is supposed to determine the
average concentration of contaminants across some risk-
based exposure unit. Sampling design must consider
where specimens are collected, when samples are
collected, and how samples are collected. "Where" is
frequently determined by statistically-based sample
designs when quantitative estimates of decision certainty
are desired. "When" may also be governed by statistical
considerations, as well as by seasonal effects or other
time-sensitive factors. "How" involves a consideration
of representative "sample support" (the dimensions and
orientation of specimens as they are extracted from the
parent material being tested), and the selection of the
sampling tools that will be used to extract the specimen
(spatulas, soil corers, drum samplers, etc.). [See EPA
guidances for statistics and sampling design available on
http://cluin.org/chartext_edu.htm#stats. A more
thorough discussion of the issue of representativeness in
regard to environmental data can be found in EPA
QA/G-5, Appendix H. See Reference 8.]
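To make the "where" consideration concrete, the following sketch illustrates one statistically based design: simple random selection of sampling locations from a gridded site. The site dimensions, cell size, and number of samples are hypothetical values chosen only for illustration, not values drawn from EPA guidance.

```python
import random

random.seed(7)  # fixed seed so the illustration is reproducible

# Hypothetical site: 100 m x 50 m, gridded into 10 m x 10 m cells.
# Each tuple is the southwest corner of one candidate cell.
cells = [(x, y) for x in range(0, 100, 10) for y in range(0, 50, 10)]

# Simple random sampling: pick 8 cells without replacement, so every
# cell has an equal chance of contributing a specimen.
chosen = random.sample(cells, k=8)
print(sorted(chosen))
```

Other designs (systematic grids, stratified sampling) would replace the `random.sample` step, but the principle is the same: the selection rule, not convenience, determines where specimens are collected.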
Failure to distinguish between determinative and
sample preparative methods
Non-chemists also show a strong tendency to focus
solely on determinative analytical methods (the
instrumentation used to actually generate the analytical
result) to the exclusion of other very important aspects
of sample analysis, such as sample preservation,
subsampling in the laboratory, sample preparative,
extraction, or digestion methods, and extract cleanup
methods. Yet, as with representative sample collection,
appropriate sample preparative methods can mean the
difference between data that are effective for defensible
decision-making and data that are completely unreliable,
irrespective of how much quality control is imposed on
the determinative method. Sadly, a great deal of time
and money is spent to micromanage laboratory
determinative methodologies, while these other factors
are completely ignored. Huge gains in the reliability of
analytical results could be attained by refocusing
resources to ensure the representativeness of sample
collection, and by supporting the "mixing and matching"
of sample preparative, cleanup, and determinative
methods for the purpose of generating the appropriate
data needed to address specific project decisions [see
Reference 2, PREFACE-1, and Reference 3, page 3].
Prescriptive SW-846 methods for method-defined
parameters
Yet another reason for confusion is that there are a few
specific requirements in regulations to use SW-846
methods exactly as written. EPA regulations state [see
Reference 1, Note 2] that "Several of the hazardous
waste regulations under Subtitle C of RCRA require that
specific testing methods in SW-846 be employed for
certain applications." These requirements relate to
testing used to determine a specific kind of property that
is termed a "method-defined parameter." The regulation
goes on to say that "Any reliable method may be used
to meet other requirements in 40 CFR parts 260 through
270" [emphasis added].
"Method-defined parameters" are characteristics or
properties of waste materials that are defined by the
outcome of a particular testing procedure. The test must
be performed exactly as written because the way the
method is performed determines the results, and
interpretation of the results has been standardized based
on implementing the testing procedure in the same way
every time. Where RCRA regulations are involved, a
method-defined parameter is a method that defines the
related regulation, and so it must be followed exactly as
written. Examples of these method-defined parameters
are the Toxicity Characteristic Leaching Procedure
(TCLP, SW-846 Method 1311), and tests to determine
the free liquid component of a waste (SW-846 Method
9095) or the corrosivity of a waste material (SW-846
Method 1110). If the method is not performed in the
exact manner as written (for example, if the TCLP is
performed with a different leaching solution or for a
different time period), the result for the measured
parameter cannot be used to interpret compliance with
the corresponding regulation [see Reference 2, page
TWO-1; and Reference 3, page 4].
There are only a few method-defined parameter methods
in SW-846. The vast majority of SW-846 methods (and
much, if not all, of the testing done during hazardous
waste site characterization) are not method-defined
parameters. That means the analytical method is
measuring a parameter that is real physical matter, such
as the total amount of arsenic (As) in a kilogram of soil.
The amount of total arsenic is an independent, verifiable
quantity that it is theoretically and conceptually possible
to measure exactly, even if that measurement is a
technological challenge. Such methods are proper candi-
dates for method modifications or for selection of
alternative methods that permit improvement of
analytical performance for specific sample types, or to
improve the cost-effectiveness of environmental
monitoring programs while still ensuring that the
"correct" regulatory decisions are being made. The goal
of the RCRA methods program is not to arbitrarily
require specific procedures (which is a command-and-
control mechanism), but to allow members of the
environmental community freedom to achieve their
regulatory objectives in ways that make both scientific
and economic sense within their particular context. This
is called a performance-based, or results-oriented,
approach.
The development of regulation-defined testing proce-
dures should be considered carefully. Where there are
regulatory requirements for certain testing procedures to
be implemented as written in the regulation, the
responsibility for the accuracy or realism of the results
is assumed by the regulating entity. For example, the
purpose of the TCLP is to estimate the likelihood that
waste deposited in a landfill will leach toxic constituents
into groundwater. If the TCLP test predicts that no
leaching will occur, and the waste is placed in a landfill,
but then leaching does occur in the real world, the blame
for faulty real-world predictability does not lie with the
regulated entity, but with the regulator/regulation for
requiring a test that does not always yield reliable
results. Yet developing a prescriptive test that will be
reliable across multiple impacting variables is extremely
difficult, if not impossible.
Writing regulations that require prescriptive methods for
actual physical quantities (such as the amount of
benzene in a liter of groundwater) turns these measure-
ments into method-defined parameters from a regulatory
perspective, and ceases to recognize benzene concentra-
tions as real molecules whose accurate quantification is
continually improved with experience and technology
advancement. If a regulating entity relies on prescriptive
methods, it is then under continual pressure to update its
regulations to keep pace with improvements in analytical
chemistry technology. This has proven to be an impos-
sible task. A more technologically, economically, and
scientifically feasible approach is the Performance-
Based Measurement System approach (discussed
below).
Performance-Based Measurement Systems (PBMS)
Why a performance-based approach to analytical
methods is advantageous
As discussed above, prescriptive regulation of analytical
methods is not wise for several reasons:
1. Eliminating analytical flexibility forces some testing
to be done inappropriately because site- or sample-
specific issues (such as matrix complexities, recovery
issues, or interferences) cannot be addressed to ensure
accurate analytical results.
2. For some site decisions, rigorous quantitative data
may not be needed; only a semi-quantitative or "go or
no-go" result is required to make the correct decision. It
is wasteful to pay for high levels of analytical data
quality that are not relevant to project needs; yet
regulatory programs that prescribe specific methods
seldom permit a graded approach to selecting methods
so that analytical performance can be tailored to
match specific project needs. However, it is well
established that as long as adequate planning and
QA/QC protocols ensure that the data quality will be
known and appropriate to the intended data use, it is
frequently possible to use less expensive analytical
methods for some or all of the data collection efforts,
while achieving a higher level of overall decision
certainty if a more representative number of samples can
be tested [see Reference 7].
3. Regulatory analytical rigidity damages the ability of
the environmental laboratory community to grow in
expertise and advance technologically. Prescriptive
methods prohibit the use of professional analytical
chemistry skills that could otherwise select the most
appropriate methods or modify and troubleshoot
methods to ensure that the "right answer" is obtained.
Discussions with experienced environmental chemists
and skilled laboratory auditors reveal a common
consensus that years of prescriptive methods have
atrophied the competence of U.S. commercial environmental
laboratories and have contributed to
laboratory fraud. These observers have seen highly
trained and experienced professional chemists replaced
by technicians who mindlessly operate equipment while
lacking the technical understanding and critical thinking
abilities needed to guarantee the analytical quality
needed to support environmental decisions (see WTQA
references regarding laboratory and data quality issues).
4. In addition, prescriptive requirements inhibit the
development of new and better analytical methods for
the environmental laboratory because of the great time
lag between the introduction of an innovative, improved
technology and the regulatory acceptance that would
allow it to be freely used in the marketplace. While
analytical science is making great strides in other
industries, application of improved, cost-effective
analytical technologies in the environmental arena lags
behind. Statements from instrument makers and vendors
reveal their reluctance to develop and market new
equipment for environmental applications because a
near-term return on their investment appears unlikely no
matter how much promise the technology may offer in
lower analytical costs or improved analytical quality.
PBMS within EPA's waste programs
Although the Methods Team responsible for maintaining
SW-846 has worked under a performance-based
approach since the inception of SW-846, this fact is
seldom recognized. Despite the many efforts of the
Methods Team to counteract disturbing trends toward
analytical micromanagement, SW-846 has been
misapplied in a prescriptive manner in the implementation of
many federal and state programs [see Reference 3, page
2]. In another attempt to combat analytical prescriptive-
ness, EPA has formally adopted an agency-wide policy
called the Performance-Based Measurement System
(PBMS) approach, as announced in an October 6, 1997
Federal Register Notice [see Reference 4]. Although the
policy addresses all agency programs, this discussion
will only involve the application of PBMS to RCRA-
related programs (and by extension, to all programs
relying on SW-846 methods), as announced in a May 8,
1998 Federal Register Notice [see Reference 5].
As discussed in Section II.A, page 25431 of the May 8,
1998 Federal Register Notice, a PBMS approach
"conveys 'what' needs to be accomplished, but not
prescriptively 'how' to do it... [T]he regulating entity
will specify questions to be answered by the monitoring
process, the decisions to be supported by the data, the
level of uncertainty acceptable for making the decisions,
and the documentation to be generated to support the
PBMS approach...Data producers will demonstrate that
a proposed sampling and analytical approach meets the
monitoring criteria specified in the Quality Assurance
Project Plans or Sampling and Analysis Plans for the
individual projects or applications."
This means that any analytical method may be used to
generate data (whether or not it is currently published in
SW-846) as long as it can be demonstrated to:
• measure the constituent of concern,
• in the matrix of concern,
• at the concentration level of concern,
• at the degree of accuracy identified as necessary
to address the site decision.
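The four criteria above amount to a demonstration of applicability against project-specific limits. The sketch below shows that logic in the simplest possible form; the function name, the parameters, and every numeric limit are illustrative assumptions, not criteria taken from SW-846 or any EPA regulation.

```python
# Hypothetical demonstration-of-applicability check: does a candidate
# method's measured performance satisfy project-specific criteria?
def method_acceptable(quant_limit, mean_recovery_pct, rsd_pct,
                      action_level, recovery_range=(70, 130), max_rsd=25):
    """All limits here are illustrative, not regulatory values."""
    lo, hi = recovery_range
    return (quant_limit <= action_level / 2       # sensitive enough to decide
            and lo <= mean_recovery_pct <= hi     # acceptable bias
            and rsd_pct <= max_rsd)               # acceptable precision

# e.g., a field immunoassay evaluated for the matrix of concern:
print(method_acceptable(quant_limit=0.5, mean_recovery_pct=92,
                        rsd_pct=12, action_level=5.0))
```

In practice the demonstration is documented in the Quality Assurance Project Plan or Sampling and Analysis Plan, as the Federal Register language quoted above describes; the point of the sketch is only that the test is against project needs, not against a fixed method list.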
This also means that sampling considerations (the
number and placement of specimen collections) interact
with the analytical considerations and the decisions to be
made in an integrated fashion to generate a specified
level of overall certainty in a decision. For example,
contrast two scenarios wherein a given waste stream
contains constituents subject to regulatory monitoring to
ensure to some specified degree of statistical certainty
that the true constituent concentration is indeed less than
a given regulatory threshold. In Scenario 1, the regulated
analytes tend to occur at levels very close to the
regulatory threshold. Statistical calculations determine
that a certain number of samples and a certain level of
analytical method accuracy (i.e., precision and bias) will
be required to establish regulatory compliance. In
contrast, for Scenario 2, the same regulated analytes in
the same waste stream tend to occur at concentrations
significantly less than the regulatory threshold, but all
other conditions (such as the desired degree of statistical
confidence) are the same. For Scenario 2, demonstrating
that regulatory compliance is achieved would require
fewer samples and less stringent analytical accuracy than
those required in Scenario 1.
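The two scenarios can be sketched with a standard one-sided sample-size formula from the normal approximation, n ≥ (zσ/(T − μ))², where T is the regulatory threshold, μ the true mean, and σ the measurement-plus-sampling standard deviation. The concentrations and variability below are invented purely to illustrate the contrast.

```python
import math

def samples_needed(true_mean, threshold, sigma, z=1.645):
    """Normal-approximation estimate of the samples needed to show the
    mean lies below a threshold at the one-sided confidence implied by
    z (1.645 ~ 95%). Inputs are hypothetical illustration values."""
    margin = threshold - true_mean  # distance to the regulatory limit
    if margin <= 0:
        raise ValueError("true mean must lie below the threshold")
    return math.ceil((z * sigma / margin) ** 2)

# Scenario 1: analyte hovers just below the threshold -> many samples
n1 = samples_needed(true_mean=95.0, threshold=100.0, sigma=10.0)
# Scenario 2: same analyte and variability, but far below the limit
n2 = samples_needed(true_mean=40.0, threshold=100.0, sigma=10.0)
print(n1, n2)  # Scenario 1 demands far more sampling than Scenario 2
```

With these assumed values, Scenario 1 requires roughly an order of magnitude more samples than Scenario 2, which is the economic incentive the text describes: keeping discharges well below the limit buys cheaper monitoring.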
For this reason, regulations should serve only to set a bar
for overall statistical certainty in environmental
decisions, but should not attempt to prescribe sample
numbers or to limit analytical technologies. When
regulated entities are given clearly defined (and
consistently enforced) compliance goals and the freedom
to customize their processes and monitoring programs to
match their particular circumstances, industry quickly
discovers the most cost-effective means to achieve the
goals of environmental regulation, including the creation
of innovative technologies. Flexibility can also encour-
age regulated entities to exceed the environmental
protection goals desired by regulatory requirements, if
the cost of lowering the absolute concentration of
discharged regulated analytes in the waste stream can be
offset by savings in the monitoring costs (as exemplified
in Scenario 2). If appropriate technical expertise is
incorporated by the regulatory body, it is not difficult to
develop oversight programs that discourage "cheating"
while permitting this kind of flexibility.
When sampling and analytical considerations are
allowed to be co-variables in an equation whose output
is the overall confidence (statistical certainty) desired in
a regulatory decision, a much more cost-effective and
protective monitoring program can be developed than is
possible under programs built on a foundation of
prescriptive, one-size-fits-all assumptions.
A PBMS is consistent with Agency-wide EPA policies
regarding quality management and the implementation
of quality systems. EPA quality policies do not require
that specific procedures or analytical technologies be
designated. Rather, EPA's Agency-wide
Quality Manual requires the use of "a systematic
planning process based on the scientific method [and
based on] a common-sense graded approach to ensure
that the level of detail in planning is commensurate with
the importance and intended use of the work and the
available resources" [see section 3.3.8 of Reference 10].
Quality policies require that whatever methods are used,
they must be adequately documented in order to
demonstrate that the data quality will be known and be
adequate to defensibly support achievement of the stated
project objectives [see sections 5.3.1 and 5.3.3 of
Reference 10].
Implications of PBMS for contaminated site cleanup
Integration of new analytical technologies for characteri-
zation and monitoring, and new remediation technolo-
gies for cleaning up sites, offers "smarter solutions" for
managing the environmental issues related to hazardous
waste. When hazardous waste practice clings tenacious-
ly to the familiar habits developed during its infancy,
everyone loses.
Performance-based approaches are the foundation of the
paradigm shift away from command-and-control
regulatory structures (which are very expensive and
unsatisfactory in other ways) toward more results-
driven, economical, market-based approaches to
environmental protection. Adopting a PBMS policy is a
first step toward accepting newly available analytical
tools and the work strategies they support so that
management of contaminated sites can be made more
affordable and more defensible. The triad approach to
site cleanup (i.e., the integration of systematic planning,
dynamic work plans, and on-site analysis) is based on
PBMS principles. It has consistently demonstrated
savings of up to 50% over the life of a project when
compared with the costs of more traditional cleanups,
while maintaining or improving confidence in protective
site decisions. The dynamic work plan approach using
on-site (i.e., field) analytical methods has been described
in the informative video entitled, "Field Analytics: The
Key to Cost Effective Site Cleanup" [Reference 6].
More information about the triad approach and its
implications for the management of hazardous waste
sites can be found in Reference 7.
References
Reference 1: Federal Register, Vol. 60, No. 9, Friday, January 13, 1995, Rules and Regulations, pages 3089-3095.
Item is retrievable at http://www.access.gpo.gov/su_docs/aces/aces140.html. Search using the following entries:
1995 Federal Register; Final Rules and Regulations; On 01/13/1995; Search Term = "Hazardous Waste
Management System"
Reference 2: Pages from the body of SW-846. These pages can be viewed or downloaded from the following
websites:
page "DISCLAIMER-1" from http://www.epa.gov/SW-846/disclaim.pdf
pages "TWO-1 and TWO-2" from http://www.epa.gov/SW-846/chap2.pdf
pages PREFACE-1 and PREFACE-2 from http://www.epa.gov/SW-846/preface.pdf
Reference 3: Article entitled "An Update of the Current Status of the RCRA Methods Development Program,"
available from http://www.epa.gov/SW-846/rcra.pdf
Reference 4: Federal Register, Vol. 62, No. 193, Monday, October 6, 1997, Notices, pages 52098-52100. Item is
retrievable at http://www.access.gpo.gov/su_docs/aces/aces140.html. Search using the following entries: 1997
Federal Register; Notices; On 10/06/1997; Search Term = "Performance Based Measurement System"
Reference 5: Federal Register, Vol. 63, No. 89, Friday, May 8, 1998, Notices, pages 25430-25438. Item is retrievable
at http://www.access.gpo.gov/su_docs/aces/aces140.html. Search using the following entries: 1998 Federal
Register; Proposed Rules; On 05/08/1998; Search Term = "RCRA-Related Methods"
Reference 6: Tufts University video, Field Analytics: The Key to Cost Effective Site Cleanup. 18 minutes in length.
The video is available for viewing or ordering through the following website: http://cluin.org/video/Hanscom.htm
Reference 7: Access the issue paper, Current Perspectives in Site Remediation and Monitoring: Using the Triad
Approach to Improve the Cost-Effectiveness of Hazardous Waste Cleanups (EPA 542-R-01-016) from
http://cluin.org/tiopersp/
Reference 8: Access the document, EPA Guidance for Quality Assurance Project Plans (EPA QA/G-5), at
http://www.epa.gov/quality/qs-docs/g5-final.pdf
Reference 9: Federal Register, Vol. 65, No. 228, Monday, November 27, 2000, Proposed Rules, pages 70679-70681.
Item is retrievable at http://www.access.gpo.gov/su_docs/aces/aces140.html. Search using the following entries:
2000 Federal Register; "Proposed Rules"; On 11/27/2000; Search Term = "IVB"
Reference 10: Access EPA Quality Manual for Environmental Programs (5360 A1) at http://www.epa.gov/quality1/
qs-docs/5360.pdf
Additional information regarding laboratory and data quality issues can be found in the following Waste Testing and
Quality Assurance Symposium papers, available at http://cluin.org/products/dataquality
Selected papers from WTQA '97 - 13th Annual Waste Testing and Quality Assurance Symposium Proceedings:
• "Options in Data Validation: Principle for Checking Analytical Data Quality" by Shawna Kennedy (pp. 169-172)
• "Laboratory Analyst Training in the 1990's and Beyond" by Roy-Keith Smith (pp. 172-182)
• "Investigation versus Remediation: Perception and Reality" by Emma P. Popek (pp. 183-188)
• "Performance-Based Evaluation of Laboratory Quality Systems: An Objective Tool to Identify QA Program
Elements that Actually Impact Data Quality" by Sevda K. Aleckson and Garabet H. Kassakhian (pp. 195-199)
• "The Method Detection Limit: Fact or Fantasy?" by Richard Burrows (pp. 200-203)
Selected papers from WTQA '98 - 14th Annual Waste Testing and Quality Assurance Symposium Proceedings:
• "Techniques for Improving the Accuracy of Calibration in the Environmental Laboratory" by Dennis A. Edgerley
(pp.181-187)
• "Interpretation of Ground Water Chemical Quality Data" by G. M. Zemansky (pp. 192-201)
Selected papers from WTQA '99 - 15th Annual Waste Testing and Quality Assurance Symposium Proceedings:
• "Lessons Learned from Performance Evaluation Studies" by Ruth L. Forman (pp. 38-46)
• "Questionable Practices in the Organic Laboratory: Part II" by Joseph Solsky (pp. 121-125)
• "The Role of a Compliance Program and Data Quality Review Procedure under PBMS" by Ann Rosecrance (pp.
231-235)