United States Environmental Protection Agency
Office of Solid Waste and Emergency Response (5102G)
EPA 542-R-01-016
October 2001
www.epa.gov
www.clu-in.org

Current Perspectives in Site Remediation and Monitoring

USING THE TRIAD APPROACH TO IMPROVE THE COST-EFFECTIVENESS OF HAZARDOUS WASTE SITE CLEANUPS

D. M. Crumbling, EPA, Technology Innovation Office

Executive Summary

U.S. EPA's Office of Solid Waste and Emergency Response is promoting more effective strategies for characterizing, monitoring, and cleaning up hazardous waste sites. In particular, the adoption of a new paradigm holds the promise of better decision-making at waste sites. This paradigm is based on using an integrated triad of systematic planning, dynamic work plans, and real-time measurement technologies to plan and implement data collection and technical decision-making at hazardous waste sites. A central theme of the triad approach is a clear focus on overall decision quality as the overarching goal of project quality assurance, requiring careful identification and management of potential causes of errors in decision-making (i.e., sources of uncertainty).

Perspective

EPA's Office of Solid Waste and Emergency Response (OSWER) manages the Superfund, RCRA Corrective Action, Federal Facilities, Underground Storage Tank, and Brownfields programs. "Smarter solutions" for the technical evaluation and cleanup of such contaminated sites can take two major forms. One is the adoption of new technologies and tools; the other is modernization of the strategy by which tools are deployed. The two are connected in a feedback loop, since strategy shifts are both fueled by and fuel the evolution of innovative technology. In the area of hazardous waste site monitoring and measurement, new technologies have become available with documented performance showing them capable of substantially improving the cost-effectiveness of site characterization.
The traditional phased engineering approach to site investigation (mobilize staff and equipment to a site, take samples to send off to a lab, wait for results to come back and be interpreted, then re-mobilize to collect additional samples, and repeat one or more times) can be incrementally improved by the occasional use of on-site analysis to screen samples so that expensive off-site analysis is reserved for the more critical samples. Yet, as discussed elsewhere, integration of new tools into site cleanup practices faces an array of obstacles [1]. If the cost savings promised by new technologies are to be realized, a fundamental change in thinking is needed. Faster acceptance of cost-effective characterization and monitoring tools among practitioners is even more important now that Brownfields and Voluntary Cleanup Programs are gaining in importance. For these programs, which focus on site redevelopment and reuse, factors such as time, cost, and quality are of prime concern. Modernization of the fundamental precepts underlying characterization and cleanup practices offers cost savings of about 50% while simultaneously improving the quality of site decision-making.
The idealized model for an innovation-friendly system that produces defensible site decisions at an affordable cost would have the following characteristics:

• it would be driven by achieving performance, rather than by complying with checklists that do not add value;
• it would use transparent, logical reasoning to articulate project goals, state assumptions, plan site activities, derive conclusions, and make defensible decisions;
• it would value the need for a team of technical experts in the scientific, mathematical, and engineering disciplines required to competently manage the complex issues of hazardous waste sites;
• it would require regular continuing education of its practitioners, especially in rapidly evolving areas of practice;
• its practitioners would be able to logically evaluate the appropriateness of an innovative technology with respect to project-specific conditions and prior technology performance, with residual areas of uncertainty being identified and addressed; and
• it would reward responsible risk-taking by practitioners who would not fear to ask, "why don't we look into...?" or "what if we tried...?"

What form might such an idealized model take? A major step toward this goal would involve institutionalizing the triad of systematic planning, dynamic work plans, and real-time analysis as the foundation upon which cost-effective, defensible site decisions and actions are built. None of the concepts in the triad is new, but the boost given by computerization to technology advancement in recent years is now providing strategy options that did not exist before. Pockets of forward-thinking practitioners are already successfully using this triad; the concept is proven.

The Triad's First Component: Systematic Planning

Most organizational mission statements pledge a commitment to quality. EPA is no different.
EPA Order 5360.1 CHG 2 requires that work performed by, or on behalf of, EPA be governed by a mandatory quality system to ensure the technical validity of products or services [2]. A fundamental aspect of the mandatory quality system is thoughtful, advance planning. The EPA Quality Manual for Environmental Programs explains that "environmental data operations shall be planned using a systematic planning process that is based on the scientific method. The planning process shall be based on a common sense, graded approach to ensure that the level of detail in planning is commensurate with the importance and intended use of the work and the available resources" [3]. Systematic planning is the scaffold around which defensible site decisions are constructed. The essence of systematic planning is asking the right questions and strategizing how best to answer them. It requires that for every planned action the responsible individual can clearly answer the question, "Why am I doing this?" First and foremost, planning requires that key decision-makers collaborate with stakeholders to resolve clear goals for a project. A team of multi-disciplinary, experienced technical staff then works to translate those goals into realistic technical objectives. The availability of appropriately educated, knowledgeable practitioners from all disciplines relevant to the site's needs is vital to cost-effective project success.

Multi-disciplinary Technical Team

During the planning phase, the most resource-effective characterization tools for collecting data are identified by technically qualified staff who are familiar with both the established and innovative technology tools of their discipline. For example, the hydrogeologist will be conversant not only with the performance and cost issues of well drilling techniques, but also with the more innovative and (generally) less costly direct push technologies entering common use.
The sampling design expert will understand how uncertainties due to sampling considerations (where, when, and how samples are collected) impact the representativeness of data generated from those samples, and thus the ability of those samples to provide accurate site information [4]. The team's analytical chemist will know not only the relative merits of various traditional sample preservation, preparation, and analysis methods, but also the strengths and limitations of innovative techniques, including on-site analytical options. The chemist's responsibilities include designing the quality control (QC) protocols that reconcile project-specific data needs with the abilities of the selected analytical tools. When risk assessment is part of a project, involvement of the risk assessor at the beginning of project planning is vital to ensure that meaningful data will be available for risk assessment purposes. Other technical experts might include (depending on the nature of the project) regulatory experts, soil scientists, geochemists, statisticians, wildlife biologists, ecologists, and others.

When project planners wish to express the desired decision confidence objectively and rigorously in terms of a statistical certainty level, statistical expertise is required to translate that overall decision goal into data generation strategies. Demonstrating overall statistical confidence in decisions based on environmental data sets will require the cost-effective blending of

• the number of samples,
• the expected variability in the matrix (i.e., matrix heterogeneity),
• the analytical data quality (e.g., precision, quantitation limits, and other attributes of analytical quality) [5],
• the expected contaminant concentrations (i.e., how close they are expected to be to regulatory limits),
• the sampling strategy (e.g., grab samples vs. composites; a random sampling design vs. a systematic design), and
• the costs.
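To make the interplay of these factors concrete, the classical sample-size formula used in DQO-style statistical planning shows how matrix variability and the margin between expected concentrations and a regulatory limit drive the number of samples. The sketch below is illustrative only; the standard deviation, decision gap, and error rates are invented, not drawn from any particular project or EPA guidance value.

```python
# Hedged sketch of the classical sample-size calculation behind
# DQO-style planning: n >= ((z_alpha + z_beta) * sigma / delta)^2,
# for testing a site mean against an action level (one-sided test).
from math import ceil
from statistics import NormalDist

def samples_needed(sigma, delta, alpha=0.05, beta=0.20):
    """sigma: expected matrix standard deviation;
    delta: gap between the expected mean and the action level;
    alpha: false-positive rate; beta: false-negative rate."""
    z_a = NormalDist().inv_cdf(1 - alpha)
    z_b = NormalDist().inv_cdf(1 - beta)
    return ceil(((z_a + z_b) * sigma / delta) ** 2)

# A heterogeneous matrix (large sigma) or concentrations hovering near
# the regulatory limit (small delta) both inflate the sampling effort.
print(samples_needed(sigma=20.0, delta=10.0))   # 25 samples
print(samples_needed(sigma=20.0, delta=5.0))    # gap halved -> ~4x more
```

The formula makes the planning trade-off explicit: halving the decision gap roughly quadruples the required sampling effort, which is why expected concentrations near regulatory limits demand denser sampling.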
Since sampling design and analytical strategy interact to influence the statistical confidence in final decisions, collaboration among an analytical chemist, a sampling expert, and a statistician is key to selecting a final strategy that can achieve project goals accurately, yet cost-effectively. Software tools are now available to assist technical experts in developing sampling and analysis designs. Although they can be powerful tools, neither statistics nor software programs can be used as "black boxes." A knowledgeable user must be able to verify that key assumptions hold true in order to draw sound conclusions from statistical analyses and software outputs. The statistician is concerned with controlling the overall (or summed) variability (i.e., uncertainty) in the final data set, and with the interpretability of that final data set with respect to the decisions to be made. The statistician does this during project planning by addressing issues related to "sample support" (a concept that involves ensuring that the physical dimensions of samples are representative of the original matrix in the context of the investigation), by selecting a statistically valid sampling design, and by estimating how analytical variability could impact the overall variability. The field sampling expert is responsible for implementing the sampling design while controlling contributions to the sampling variability as actual sample locations are selected and as specimens are actually collected, preserved, and transported to the analyst. The analytical chemist is responsible for controlling components of variability and uncertainty that stem from the analytical side (such as analyte extraction, concentration, and instrumental determinative analysis), but also for overseeing aspects of sample preservation, storage, homogenization, and possibly subsampling (if done by the analyst).
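A toy calculation (with invented standard deviations) illustrates the division of labor described above: sampling and analytical variances add, so when matrix heterogeneity dominates, taking more samples shrinks the uncertainty in the estimated site mean far more than upgrading analytical precision does.

```python
# Toy illustration: variances from sampling (matrix heterogeneity) and
# from analysis add together; the standard error of the site mean then
# shrinks with the square root of the number of samples.
from math import sqrt

def std_error_of_mean(n, sampling_sd, analytical_sd):
    total_var = sampling_sd**2 + analytical_sd**2
    return sqrt(total_var / n)

base = std_error_of_mean(n=8, sampling_sd=30.0, analytical_sd=5.0)
more_samples = std_error_of_mean(n=16, sampling_sd=30.0, analytical_sd=5.0)
better_lab = std_error_of_mean(n=8, sampling_sd=30.0, analytical_sd=1.0)

# Doubling the sample count cuts the standard error by ~30%, while a
# five-fold improvement in analytical precision barely moves it.
print(round(base, 2), round(more_samples, 2), round(better_lab, 2))
```

This is the quantitative intuition behind the triad's preference for denser sampling with field methods over sparse sampling with the most precise laboratory methods.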
The analytical chemist should select analytical methods that can meet the analytical variability (precision) limits estimated by the statistician. The chemist must be able to evaluate the relative merits of methods for their detection capacity (detection or quantitation limits), specificity (freedom from interferences), and selectivity (uniqueness of the analytes detected), and match those properties to the data type and quality needed by all the data users involved with the project. Finally, the chemist is responsible for designing an analytical QC program that will establish that the analytical data sets are of known and documented quality. Controlling the various sources of analytical and sampling uncertainties (assuming no clerical or data management errors) ensures that data of known overall quality are generated. Since the single largest source of uncertainty in contaminated site decisions generally stems from matrix heterogeneity, increasing the sampling density is critical to improving decision confidence.

Managing Uncertainty as a Central Theme

Project planning documents should be organized around the theme of managing the overall decision uncertainty. The purpose of systematic planning, such as EPA's Data Quality Objectives (DQO) process used for the systematic planning of environmental data collection, is first to articulate clear goals for the anticipated project, and then to devise cost-effective strategies that can achieve those goals. Project planning documents [such as work management plans, quality assurance project plans (QAPPs), sampling and analysis plans (SAPs), etc.] should be written so that the reader can explicitly identify what those decisions are and what sources of uncertainty could potentially cause those decisions to be made in error.
The balance of project planning documents should discuss the rationale and procedures for managing each major source of uncertainty to the degree necessary to achieve the overall decision quality (i.e., decision confidence and defensibility) desired by project managers and stakeholders. After completion of the project, summary reports should clearly discuss the project goals that were actually achieved, the decisions that were made, the uncertainties that actually impacted project decision-making, the strategies used to manage those uncertainties, and the overall confidence in the project outcome (which is a function of what uncertainties remain).

Conceptual Site Model

Using all available information, the technical team develops a conceptual site model (CSM) that crystallizes what is already known about the site and identifies what more must be known in order to achieve the project's goals. A single project may have more than one CSM. Different CSM formulations are used to depict exposure pathways for risk assessment; the site's geology or hydrogeology; contaminant concentrations in surface or subsurface soils; or other conceptual models of contaminant deposition, transport, and fate. Depending on the specifics of the project, CSMs may take the form of graphical representations, cross-sectional maps, plan-view maps, complex representations of contaminant source terms, migration pathways, and receptors, or simple diagrams or verbal descriptions. The team uses the CSM(s) to direct field work that gathers the information needed to close the information gaps that stand in the way of making site decisions. Data not needed to inform site decisions will not be collected. (Although this sounds elementary, the one-size-fits-all approach used by many practitioners routinely leads to the collection of costly data that are ultimately irrelevant to the project's outcome.) The CSM will evolve as site work progresses and data gaps are filled.
The CSM thus serves several purposes: as a planning and organizing instrument, as a modeling and data interpretation tool, and as a communication device among the team, the decision-makers, the stakeholders, and the field personnel. Systematic planning provides the structure through which foresight and multi-disciplinary technical expertise improve the scientific quality of the work and avoid blunders that sacrifice time, money, and the public trust. It guides careful, precise communication among participants and compels them to move beyond the ambiguities of vague, error-prone generalizations [5]. Systematic planning requires unspoken assumptions to be openly acknowledged and tested in the context of site-specific constraints and goals, anticipating problems and preparing contingencies. It should be required for all projects requiring the generation or use of environmental data [6].

The Second Component of the Triad: Dynamic Work Plans

When experienced practitioners combine systematic planning with an informed understanding of the likely fate of pollutants in the subsurface and with advanced technology, an extremely powerful strategy emerges for the effective execution of field activities. Terms associated with this strategy include expedited, accelerated, rapid, adaptive, or streamlined site characterization. Its cornerstone is the use of dynamic work plans. Formulated as a decision tree during the planning phase, the dynamic work plan adapts site activities to track the maturing conceptual site model, usually on a daily basis. Contingency plans are developed to accommodate eventualities that are considered reasonably likely to occur during the course of site work, such as equipment malfunction, the unanticipated (but possible) discovery of additional contamination, etc. Dynamic work plans have been championed and successfully demonstrated for over 10 years by a number of parties [7, 8].
Success hinges on the presence of experienced practitioners in the field to "call the shots" based on the decision logic developed during the planning stage and to cope with any unanticipated issues. For small uncomplicated sites, or for discrete tasks within complex sites, project management can be streamlined so smoothly that characterization activities blend seamlessly into cleanup activities. Just as the design of a dynamic work plan requires the first component of the triad (systematic planning) to choreograph activities and build contingencies, implementation of a dynamic work plan generally requires the third member of the triad (real-time generation and interpretation of site data) so that data results are available fast enough to support the rapidly evolving on-site decision-making inherent to dynamic work plans.

The Third Component: Real-Time Analysis

Real-time decision-making requires real-time information. There are a variety of ways real-time data can be generated, ranging from very short turnaround at a conventional laboratory (off-site analysis), to on-site mobile laboratories using conventional analytical instrumentation, to "hand-held" instrumentation set up in the back of a van or under a tent in the field. For many projects, on-site analysis in some form will be the most cost-effective option, although this will always depend on many factors, including the target analyte list and the nature of the decisions to be made at a particular project. On-site analysis can be performed within the standard phased engineering approach; however, it does not achieve its full potential for cost- and time-savings except in the context of dynamic work plans.
All sampling and analysis designs should be developed with thoughtful technical input through systematic planning, but the nature of field analytical methods and the critical role they play in the context of dynamic work plans make systematic planning vital so that the most appropriate sampling and measurement tools are selected and suitably operated. Data collection is not an end in itself: its purpose is to supply information. There has been a counter-productive tendency to fixate solely upon the quality of individual data points, without asking whether the information quality and representativeness of the data set were sufficient and matched to the planned uses of the data. On-site analysis can never eliminate the need for traditional laboratory services; but the judicious blending of intelligent sampling design, dynamic work plans, and on-site analysis, supplemented by traditional laboratory testing as necessary, can assemble information-rich data sets much more effectively than total reliance on fixed-lab analyses. The lower costs and real-time information value of field analysis permit much greater confidence in the representativeness of data sets, thanks to greater sampling density and the ability to delineate a hot spot or "chase a plume" in real time [4]. When the gathering of reliable information to guide defensible site decisions is a clear priority, field analytical technologies offer a much more valuable contribution than is implied when the concept is downplayed as "field screening." The cost advantages of on-site analysis extend well beyond possible "per sample" savings, since the use of the integrated triad approach maximizes the chances that the project will be done right the first time over the shortest possible time frame. Informative data sets that accurately represent true site conditions across the project's lifetime (from assessment to characterization through remediation and close-out) never happen by accident.
Whether the data generated on-site are expected to be used for "screening" purposes or for "definitive" decision-making, good analytical chemistry practice must be followed and QC protocols must be designed carefully. Analytical chemists are the trained professionals best able to construct valid QC protocols that integrate 1) the site-specific data needs and uses, 2) any site-specific matrix issues, and 3) the strengths and limitations of a particular analytical technology. Ignoring these considerations risks a chain of errors that wastes effort and money: faulty data sets lead to erroneous conclusions that, in turn, lead to flawed site decisions and/or ineffectual remedial actions. Good decisions rely on representative data sets that are of known quality. Therefore, the expertise of an analytical chemist must go along when analytical methods are taken to the field, whether in absentia as a written site-specific Standard Operating Procedure (SOP) that a technician will follow, or in person as an instrument operator or supervising field chemist.

Field analytical chemistry has made significant advances in scientific rigor and credibility. Computerization, miniaturization, photonics (e.g., lasers and fiber optics), materials research, immunochemistry, microwave technologies, and a host of other chemical, biological, and physical science disciplines are contributing to a multiplicity of technology improvements and innovations for analytical chemistry in general, and for the specialized practice of on-site analytical chemistry in particular. Compared to the convenience and control offered by fixed laboratory analysis, field analysis poses unique challenges to its practitioners, leading to the blossoming of a recognized subdiscipline. Field analysis now has its own dedicated international conferences, a peer-reviewed journal (Field Analytical Chemistry and Technology, published by Wiley InterScience), and university-based research centers.
There is a small but growing number of companies offering specialized on-site analytical services and consulting expertise to the environmental community, and their professional standards and practices will be addressed by the newly formalized Field Activities Committee within the National Environmental Laboratory Accreditation Council (NELAC). Environmental chemists are not alone in recognizing the potential of field analysis. Even the pharmaceutical industry is taking its analytical methods to the field to screen for new drugs in marine and terrestrial ecosystems. "Who would have thought we could do this much in situ now? When we first started, people said we were crazy," marveled a University of Illinois chemistry professor. While acknowledging that "on-site analysis may seem the stuff of science fiction," he predicted that the pace of technological advances will make it commonplace for the pharmaceutical industry within five years [9]. Will the same be true for the environmental remediation industry?

On-site interpretation of data is greatly facilitated by decision support software tools using classical statistical analysis and geostatistical mapping algorithms. Laptop PCs may be used to manage data and produce 2- or 3-dimensional images representing contaminant distributions, including an assessment of the statistical reliability of the projections. Cost-benefit and risk-management analyses produced within minutes can allow decision-makers to weigh options at branch points of the dynamic work plan, or to select optimum sampling locations that can give the "most bang for the characterization buck" by minimizing decision uncertainty. The graphical output of the software greatly facilitates meaningful communication of site issues and decisions with regulators and the public. As with all tools, users need to understand possible pitfalls and consult with experts as necessary to avoid misapplications that could lead to faulty outputs.
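As a flavor of what such mapping tools do, the following is a bare-bones inverse-distance-weighting interpolator, one of the simplest algorithms in this family and only a stand-in for the richer geostatistical methods (e.g., kriging) that production software applies. The boring coordinates and concentrations are fabricated for the example.

```python
# Minimal inverse-distance-weighting (IDW) interpolation sketch: nearby
# borings dominate the estimate; distant ones contribute little.
def idw(x, y, samples, power=2.0):
    """Estimate concentration at (x, y) from (xi, yi, conc) samples."""
    num = den = 0.0
    for xi, yi, conc in samples:
        d2 = (x - xi) ** 2 + (y - yi) ** 2
        if d2 == 0.0:
            return conc          # exactly at a sampled location
        weight = 1.0 / d2 ** (power / 2.0)
        num += weight * conc
        den += weight
    return num / den

# Four fabricated soil borings (x, y, mg/kg) around a suspected source.
borings = [(0, 0, 120.0), (10, 0, 80.0), (0, 10, 40.0), (10, 10, 5.0)]
print(round(idw(5, 5, borings), 1))   # center estimate blends all four
print(round(idw(1, 1, borings), 1))   # near the hottest boring -> higher
```

Evaluating such an interpolator over a grid is what produces the 2-dimensional contaminant-distribution images described above; unlike kriging, however, plain IDW carries no built-in estimate of the statistical reliability of the projection.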
Experience with the Triad Approach

In the early 1990s, the Department of Energy (DOE) articulated the concepts of the triad approach as Expedited Site Characterization (ESC) [10]. In addition, DOE linked dynamic work plans with systematic planning with the intent of speeding up Superfund site investigations and feasibility studies at DOE sites in an approach called SAFER (Streamlined Approach for Environmental Restoration). Showing the acceptance of this paradigm among remediation experts, ASTM has issued three guides describing various applications of expedited or accelerated approaches [11, 12, 13].

In 1996-1997, EPA Region 1 and Tufts University coordinated with the U.S. Air Force to conduct a demonstration of a dynamic site investigation using real-time results generated by a mobile laboratory to delineate residual soil contamination at Hanscom Air Force Base. The project showed that innovative technologies combined with an adaptive sampling and analysis program could drastically reduce the time and cost, while increasing the confidence, of site decisions [14].

Argonne National Laboratory's Environmental Assessment Division (EAD) uses Adaptive Sampling and Analysis Programs (ASAPs) to expedite data collection in support of hazardous waste site characterization and remediation. ASAPs rely on "real-time" data collection and field-based decision-making, using dynamic work plans to specify the way sampling decisions are to be made, instead of determining the exact number and location of samples before field work begins. EAD focuses on the decision support aspects of ASAP data collection, including the management and visualization of data to answer questions such as: What is the current extent of contamination? What is the uncertainty associated with this extent? Where should sampling take place next? When can sampling stop?
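The "where next / when to stop" logic behind such adaptive programs can be caricatured with a one-cell Bayesian update. The sensitivity, false-positive rate, and prior below are invented for illustration; real decision support tools couple updates like this to spatial models across many locations.

```python
# Hedged sketch of Bayesian updating as used conceptually in adaptive
# sampling: revise the probability that a location is contaminated after
# each field-analysis result, and stop once the answer is clear enough.
def update(prior, positive, sensitivity=0.90, false_positive=0.10):
    """Posterior P(contaminated) after one on-site measurement."""
    if positive:
        like_dirty, like_clean = sensitivity, false_positive
    else:
        like_dirty, like_clean = 1.0 - sensitivity, 1.0 - false_positive
    numerator = like_dirty * prior
    return numerator / (numerator + like_clean * (1.0 - prior))

p = 0.5                      # agnostic prior for this grid cell
for hit in [True, True]:     # two consecutive on-site detections
    p = update(p, hit)
    print(round(p, 3))
# Sampling here could stop once p exits a preset band (say 0.05-0.95);
# otherwise the dynamic work plan directs another sample to this cell.
```

This is one way to make "when can sampling stop?" an explicit, quantitative branch point in a dynamic work plan rather than a judgment call made ad hoc in the field.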
A variety of software tools are used to facilitate real-time data collection and interpretation, including commercial databases, standard geographic information system (GIS) packages, customized data visualization and decision support software based on Bayesian statistics, and Internet applications to foster real-time communication and data dissemination. The EAD has documented that ASAP-style programs consistently yield cost savings of more than 50% as compared to more traditional sampling programs [15].

The U.S. Army Corps of Engineers began institutionalizing an integrated approach to systematic planning under the name "Technical Project Planning (TPP) Process." Although it does not address dynamic work plans and on-site analysis directly, the TPP engineering manual stresses the importance of a multi-disciplinary team that performs "comprehensive and systematic planning that will accelerate progress to site closeout within all project constraints" [16]. A 1997 review of 11 initial projects performed under the TPP approach demonstrated the following successes:

• Met all schedules (and "train-wreck" and "break-neck" milestones);
• Improved project focus and communications;
• Improved defensibility and implementability of technical plans;
• Eliminated "excessive" data needs and identified "basic" data needs;
• Increased satisfaction of USACE's customers;
• Improved relations and communication with regulators; and
• Documented cost savings of at least $4,430,000 (total savings for all 11 projects) [17].

In addition, a well-documented USACE project using the triad approach in combination with Performance-Based Measurement System (PBMS) principles (for both the field analytical and fixed laboratory methods) achieved site closure while demonstrating an overall project savings of 50% ($589K actual project cost vs. $1.2M projected cost) [18].
The Florida Department of Environmental Protection created the Drycleaning Solvent Cleanup Program (DSCP) to address contamination from small dry cleaner shops. Under the DSCP, rapid site characterizations are performed using on-site mobile laboratories and direct push technologies to characterize soil and ground water contamination, assess cleanup options, and install permanent monitoring wells, all in an average of 10 days per site. Site characterization costs have been lowered by an estimated 30 to 50 percent when compared to conventional assessments [19].

Whether the focus of a site investigation is ground water, surface water, sediment, soil, or waste characterization, or a combination thereof, the triad approach has been shown to achieve site closeout faster and cheaper than traditional phased approaches. The question becomes: What are the barriers that hinder wider utilization of this approach? Past reasons no doubt included the limited selection of the rapid turnaround field analytical and software tools so vital for implementing dynamic work plans efficiently. As described earlier, however, recent years have seen a growing array of analytical options able to meet many types of data quality needs. Technology advancement would be even more brisk if a paradigm of logical evaluation, acceptance, and use by practitioners and regulators were the norm. To benefit from the tools we currently have and boost our available options, we must modernize habits that were established during the infancy of the environmental remediation industry. Other papers in this series address the limitations of prescriptive requirements for analytical methods and analytical data quality [4, 20].

References

[1] National Research Council. 1997. Committee on Innovative Remediation Technologies. Innovations in Ground Water and Soil Cleanup: From Concept to Commercialization, National Academy Press, Washington, DC, pp. 40-75. http://www.nap.edu

[2] U.S. EPA. 2000.
EPA Order 5360.1 CHG 2: Policy and Program Requirements for the Mandatory Agency-wide Quality System, U.S. Environmental Protection Agency, Washington, DC, May. http://www.epa.gov/quality1/qs-docs/5360-1.pdf

[3] U.S. EPA. 2000. EPA 5360 A1: EPA Quality Manual for Environmental Programs, U.S. Environmental Protection Agency, Washington, DC, May. http://www.epa.gov/quality1/qs-docs/5360.pdf

[4] Crumbling, D.M. 2001. Current Perspectives in Site Remediation and Monitoring: Applying the Concept of Effective Data to Environmental Analyses for Contaminated Sites. EPA 542-R-01-013. September. http://cluin.org/tiopersp/

[5] Crumbling, D.M. 2001. Current Perspectives in Site Remediation and Monitoring: Clarifying DQO Terminology Usage to Support Modernization of Site Cleanup Practices. EPA 542-R-01-014. October. http://cluin.org/tiopersp/

[6] U.S. EPA. 1998. EPA Office of Inspector General Audit Report: EPA Had Not Effectively Implemented Its Superfund Quality Assurance Program, Report No. E1SKF7-08-0011-8100240, September 30, 1998. http://www.epa.gov:80/oigearth/audit/list998/8100240.pdf

[7] Burton, J.C. 1993. Expedited Site Characterization: A Rapid, Cost-Effective Process for Preremedial Site Characterization, Superfund XIV, Vol. II, Hazardous Materials Research and Control Institute, Greenbelt, MD, pp. 809-826.

[8] Robbat, A. 1997. A Guideline for Dynamic Workplans and Field Analytics: The Keys to Cost-Effective Site Characterization and Cleanup, sponsored by President Clinton's Environmental Technology Initiative, through the U.S. Environmental Protection Agency, Washington, DC. http://clu-in.org/download/char/dynwkpln.pdf

[9] Drollette, D. 1999. Adventures in drug discovery. In Photonics Spectra, September 1999, pp. 86-95.

[10] DOE. 1998. Expedited Site Characterization. Innovative Technology Summary Report, OST Reference #77. Office of Environmental Management, U.S. Department of Energy, December 1998. http://ost.em.doe.gov/ifd/ost/pubs/cmstitsr.htm.
See also http://www.etd.ameslab.gov/etd/technologies/projects/esc/index.html

[11] ASTM. 1998. Standard Practice for Expedited Site Characterization of Vadose Zone and Ground Water Contamination at Hazardous Waste Contaminated Sites, D6235-98. Conshohocken, PA. www.astm.org

[12] ASTM. 1998b. Standard Guide for Accelerated Site Characterization for Confirmed or Suspected Petroleum Releases, E1912-98. Conshohocken, PA. www.astm.org

[13] ASTM. 1996. Standard Provisional Guide for Expedited Site Characterization of Hazardous Waste Contaminated Sites, PS85-96. Conshohocken, PA. www.astm.org

[14] U.S. EPA. 1998. Innovations in Site Characterization Case Study: Hanscom Air Force Base, Operable Unit 1 (Sites 1, 2, and 3). Washington, DC. EPA 542-R-98-006. See also http://clu-in.org/char1_edu.cfm#site_char

[15] U.S. Department of Energy Environmental Assessment Division (EAD) Adaptive Sampling and Analysis Program (ASAP) webpage: http://www.ead.anl.gov/project/dsp_topicdetail.cfm?topicid=23

[16] USACE. 1998. Environmental Quality: Technical Project Planning (TPP) Process (Engineering Manual 200-1-2), Washington, DC. August 1998. http://www.usace.army.mil/inet/usace-docs/eng-manuals/em200-1-2/toc.htm

[17] USACE. 1999. Personal communication with Heidi Novotny, P.E., Technical Liaison Manager, U.S. Army Corps of Engineers HTRW Center of Expertise, July 22, 1999.

[18] U.S. EPA. 2000. Innovations in Site Characterization Case Study: Site Cleanup of the Wenatchee Tree Fruit Test Plot Site Using a Dynamic Work Plan. Washington, DC. EPA 542-R-00-009. August. See also http://clu-in.org/char1_edu.cfm#site_char

[19] Applegate, J.L. and D.M. Fitton. 1998. Rapid Site Assessment Applied to the Florida Department of Environmental Protection's Drycleaning Solvent Cleanup Program, in Proceedings Volume for the First International Symposium on Integrated Technical Approaches to Site Characterization, Argonne National Laboratory, pp. 77-92.
Paper available at http://cluin.org/char1_edu.cfm#mode_expe

[20] Crumbling, D.M. 2001. Current Perspectives in Site Remediation and Monitoring: The Relationship Between SW-846, PBMS, and Innovative Analytical Technologies. EPA 542-R-01-015. August. http://cluin.org/tiopersp/