United States Environmental Protection Agency
Office of Solid Waste and Emergency Response
Washington, DC 20460

EPA/542/B-00/002
April 2000

Conference Proceedings

Subsurface Remediation: Improving Long-Term Monitoring and Remedial Systems Performance

June 8-11, 1999, St. Louis, Missouri

Federal Remediation Technologies Roundtable

Acknowledgments

The Technology Innovation Office would like to acknowledge and thank the planning committee, who provided guidance and material for this report and provided comments on draft documents. The planning committee consisted of representatives from the U.S. Army Corps of Engineers, U.S. Navy, U.S. Air Force, U.S. Department of Energy, and the U.S. Environmental Protection Agency.

Notice

This document has been funded by the United States Environmental Protection Agency (U.S. EPA) under Contract No. 68-C7-0011, Work Assignment 2-59, by Science Applications International Corporation. Mention of trade names or commercial products does not constitute endorsement or recommendation for use. For more information about this project, please contact Kathy Yager, Technology Innovation Office, U.S. Environmental Protection Agency, 2890 Woodbridge Avenue, Building 18, Edison, New Jersey 08837. E-mail: yager.kathleen@epa.gov.
Table of Contents

Section / Page

1 Introduction 1
2 Plenary and Case Studies 3
  2.1 Plenary Session 3
  2.2 Case Studies 4
3 Systems Performance Assessment and Optimization 7
  3.1 Soil Vapor Extraction and Air Sparging 7
  3.2 Permeable Reactive Barriers 9
  3.3 LTM of Monitored Natural Attenuation: Guidance and Case Studies 11
  3.4 Bioremediation and Phytoremediation 13
  3.5 Above Ground Treatment Systems 14
4 Long-Term Monitoring Optimization 16
  4.1 Data Quality Objectives and LTM Guidance 16
  4.2 Monitoring Networks and Sampling 17
  4.3 Data Assessment and Management 20
  4.4 Analytical Issues 21
  4.5 Data Presentation/Visualization 22
5 In Situ/Remote Sensors and Networks and Emerging Monitoring Techniques 24

Abstracts

Better Remediation Results: The Path Forward, Walter W. Kovalick, Jr., Ph.D. 29
Getting to Site Closeout, Karla L. Perri 29
Remedial Systems Optimization Within the Site Closeout Process, Mario E. Ierardi 29
RAO/LTM Optimization at the Naval Industrial Reserve Ordnance Plant, Fridley, Minnesota - A Case History, Richard C. Cronce, Ph.D. 29
US Army Corps of Engineers Remediation System Evaluation Case Studies, Dave Becker 30
US EPA, Technology Innovation Office Demonstration Project of Hydraulic Optimization Modeling at Existing Pump and Treat Systems, Rob Greenwald 31
Long-Term Monitoring and Optimization of an Air Force Pump and Treat Facility, Philip Hunter, P.G. 32
The Benefits of Remediation System Audits: Several Case Studies for the Private Sector, Rusty Norris 33
Long-Term Monitoring Optimization at Naval Air Station, Brunswick, ME, Emil Klawitter 34
Development of USEPA's SVE Optimization Guide, Ralph S. Baker, Ph.D. 35
Assessment of Soil Venting Performance and Closure, Dominic C. DiGiulio 36
In Situ Air Sparging Design Paradigm: Development and Evaluation, Richard Johnson, Ph.D. 37
SVE Optimization Using PneuLog™ at Air Force Installations, Lloyd D. Stewart, Ph.D. 38
Modeling Performance of Soil Vapor Extraction Systems, Zhenhua Jiang, Ph.D. 38
Long-Term Performance Monitoring of a Permeable Reactive Barrier to Remediate Contaminated Groundwater, Robert W. Puls, Ph.D. 39
In Situ Remediation of VOC-Contaminated Groundwater Using Zero-Valent Iron: Long-Term Performance, R.W. Gillham, Ph.D. 40
DOE/DOD/EPA Collaborative Research on the Long-Term Performance of Permeable Reactive Barriers (PRBs), Nic Korte 40
Performance Monitoring of the Permeable Reactive Barrier at the Somersworth, NH, Landfill Superfund Site, Timothy Sivavec, Ph.D. 41
Performance Evaluations at the Moffett Field and Department of Defense Permeable Barrier Sites, Charles Reeter 42
Air Force LTM Methodology and Approach for Monitored Natural Attenuation, Patrick Haas 43
Evaluating the Natural Attenuation of Transient-Source Compounds in Groundwater at the KL Avenue Landfill Site, Varadhan Ravi, Ph.D. 43
Monitored Natural Attenuation of Explosives at Louisiana Army Ammunition Plant, Judith Pennington, Ph.D. 44
EPA Guidance for Long-term Monitoring of Natural Attenuation, Herb Levine 45
Phytoremediation Performance Monitoring and Optimization: Hydrological and Geochemical Assessments, Scott W. Beckman, Ph.D. 45
Optimization, Uncertainty, and Risk Analysis of Design and Operation of In Situ Bioremediation, Christine A. Shoemaker, Ph.D. 46
Monitoring Performance of an Enhanced In Situ Bioremediation Field Evaluation, Kent S. Sorenson, Jr. 47
Process Optimization of Remedial Systems, Sarabjit Singh, P.E. 48
Optimization at the Milan Army Ammunition Plant Operable Unit One Treatment of Explosives-Contaminated Groundwater, Lindsey K. Lien, P.E. 49
Facility Performance Audits: Bang for the Buck$, Ted H. Streckfuss, P.E. 50
Flow and Transport Optimization for Pump and Treat Systems, David Ahlfeld, Ph.D. 50
Data Quality Objectives: Implementing the Process, John Warren 51
Remedial Systems Optimization and Long-Term Monitoring Guidance Documents, Daniel Welch, Maj. 52
Statistical Methods Useful in Assessment Monitoring and Corrective Action Programs, Robert D. Gibbons, Ph.D. 52
Using the Data Quality Objective Process to Revise a Groundwater Monitoring Program: The Experience at Pantex, Nancy Hassig, Ph.D. 53
Optimization of LTM Networks: Statistical Approaches to Spatial and Temporal Redundancy, Kirk Cameron, Ph.D. 54
Optimizing a Ground-Water Monitoring Network for Assessing Air Sparging Effectiveness on BTEX and MTBE, Steven J. Naber 55
Decision Support Software for Designing Long-term Monitoring Plans (LTMPs), Charles J. Newell, Ph.D., P.E. 56
Simple, Inexpensive Diffusion Samplers for Investigating VOCs in Groundwater, Don A. Vroblesky, Ph.D. 57
Optimization of Long-Term Monitoring Costs via Statistical and Geo-Statistical Thinking, Maureen Ridley 58
Rapid Data Access: Key to Integrated Use of Environmental Characterization and Monitoring Information, Maureen Ridley 59
Data Requirements for Long-Term Monitoring and Data Comparability, Joseph D. Evans 60
How the Badger Army Ammunition Plant Saved $400,000 in Long-Term Monitoring Costs, John P. Hansen 61
Sample Size Determination and Computation of the 95% Upper Confidence Limit of the Mean in Environmental Applications, Anita Singh, Ph.D. 62
Some Alternative Statistical Procedures for Environmental Data Analysis, A.K. Singh, Ph.D. 63
Environmental Resources Program Information Management Systems (ERPIMS) and GIS, Robin Lovell 63
Sampling and Analysis Plan Under PBMS, Barry Lesnik 63
Sample Collection and Handling Alternatives for VOC Soil Characterization: Method 5035, Alan D. Hewitt 64
Environmental Applications of NRL's Continuous Flow Immunosensor, Lisa C. Shriver-Lake 64
Groundwater Modeling System, Version 2.1, Earl Edris 65
Practical Internet-Based Applications of Geographic Information Systems (GIS) in Support of Long-term Monitoring and Remedial Program Optimization, Francis E. Slavich 66
Advanced Chemical Sensors for Monitoring of Organic Solvents in Groundwater, Radislav A. Potyrailo, Ph.D. 67
Remote Sensing Assessment Usage in Long-Term Monitoring of Phytoremediation Field Sites, Suzette R. Burckhard 67
Long-Term Monitoring of Subsurface Barrier Integrity - Current Technology Capabilities and Limitations, David E. Daniel, Ph.D. 68
Water Balance Monitoring of the Alternative Landfill Cover Demonstration (ALCD), Stephen F. Dwyer 69
Long-term Monitoring of Remediation Approaches in the Vadose Zone, Lorne Everett, Ph.D. 70
Short-Term and Long-Term Vadose Zone Monitoring: Current Technologies, Development, and Applications, Boris Faybishenko, Ph.D. 70
The E-SMART® Base-wide Demonstration at Tinker Air Force Base: A Networked Array of Environmental Sensors, Steve Leffler, Ph.D. 71
Direct Push Technologies: Recent Demonstrations, Bruce J. Nielsen 72
Results From a One-Year Field Trial of an Automated, Down-Hole Radiation Monitoring System, Garry W. Roman 73

Section 1
Introduction

This document summarizes the presentations and workshops of a conference on improving long-term monitoring (LTM) and remedial systems performance held in St. Louis, Missouri, June 8-11, 1999. The conference was sponsored and developed by the Federal Remediation Technologies Roundtable (www.frtr.gov), an interagency consortium dedicated to improving the availability of innovative technologies through technology evaluations, to developing (with industry) technical solutions to common contamination problems, and to working with states to overcome permitting barriers to the use of new technologies.
The conference was designed to provide up-to-date information on LTM and systems optimization through presentations and topical workshops. It was convened to address the need to evaluate and assess the processes and practices related to monitoring and optimizing subsurface remedial performance and associated contaminant changes. These processes and practices are especially critical to in situ processes and natural attenuation strategies that require lengthy time frames to accomplish remediation. Implementation of these processes and strategies has resulted in high operation, maintenance, and monitoring costs, which can impact the effectiveness, timing, and cost of remedial solutions. The objectives of the conference were to: (1) highlight successes and issues related to improving the performance of subsurface remediation technologies, (2) showcase practical approaches to cost-effective monitoring of remedial performance, and (3) identify research gaps and needs from current practice.

This document summarizes the presentations and workshops delivered at the conference and analyzes their significant elements from the perspective of how they contribute to our understanding of LTM and optimization issues. In addition, future needs, when appropriate, are assessed as a mechanism to drive future studies and conferences in this area. This document generally follows the conference agenda to retain its structure and flow. The conference plenary session and case studies of the first full day (June 9th) were developed to present an overview of the issues and work to date in LTM and optimization successes and lessons learned. Three concurrent tracks were developed for June 10th and 11th. The LTM sessions addressed methods to reduce the costs associated with long-term groundwater and vadose zone monitoring in remedial operations.
The systems performance assessment and optimization sessions focused on LTM and operation strategies for remedial system optimization. The in situ/remote sensors and networks session presented current information on the development and use of sensor technology for monitoring groundwater and vadose zone environments. In addition to the presentations, the conference offered six workshops. Summaries of these workshops can be found with the formal presentations in the appropriate sections. This document contains embedded links to websites cited in the presentations and workshops and to other sections of this document. In electronic copies of this document, these links enable readers to connect directly to the referenced site or section.

Section 2
Plenary and Case Studies

2.1 Plenary Session

The plenary session served as a springboard for defining the goals and objectives of the conference. The overview summarized our current state of knowledge in the areas of LTM and system optimization and identified future needs and challenges in these areas.

Walter Kovalick (1) provided opening remarks and stated the general goals of the conference as being: (1) to highlight LTM successes and issues, (2) to showcase practical LTM strategies, (3) to identify applied research gaps, and (4) to identify needs from current practice. He also suggested there is a great deal that can be learned from "gray panther" technologies such as pump-and-treat, as their use has been widespread and they have been in operation for many years. Lessons learned from optimizing these systems can be applied to the next generation of in situ technologies, which are inherently more difficult to optimize and are associated with high monitoring costs. Dr. Kovalick discussed several current EPA projects addressing LTM and optimization issues.
These included: (1) Superfund 5-year review guidance, (2) operation and maintenance strategies, (3) pump-and-treat optimization, (4) SVE optimization, and (5) in situ process monitoring and measurement assessment.

Karla Perri (2) addressed new directions in the DOD environmental cleanup programs, which emphasize effective and permanent solutions. As DOD has been cleaning some sites for 25 years and anticipates they will require an additional 25 years of cleanup, there are opportunities for optimization to accelerate cleanup and reduce costs. In addition, site closeout will require efficient LTM and maintenance technologies. Evolving issues for closing out DOD sites are unexploded ordnance (UXO) detection and cleanup, land use controls, voluntary cleanup, and public-private partnerships. Additional information can be acquired at www.afbca.hq.af.mil/closeout.

Mario Ierardi (3) presented an additional perspective on reaching site closeout. Mr. Ierardi stressed that remedial systems must be continually evaluated and optimized in place after installation. Program requirements should also be flexible to allow modifications to the system as new information is acquired during the ongoing treatment. Key challenges in the closeout process include: (1) improving awareness of requirements beyond the record of decision (ROD), (2) optimizing existing cleanup and monitoring system performance, (3) applying new and evolving technologies, and (4) using data better by way of technologies such as data visualization. Mr. Ierardi also identified specific needs in sensor technology, diffusion samplers, and remote control monitoring.

Future conferences/workshops would benefit from a discussion of the definition and scope of LTM and system optimization. For example, the Air Force defines LTM as monitoring that begins after cleanup goals are met, while other agencies and jurisdictions define LTM as monitoring that begins at the remedial investigation stage.
Semantics aside, these definitions are important in defining the types of technology and systems development needs, as well as in providing a foundation for interagency cooperative efforts and regulatory negotiations. Similarly, further refinement and possible subclassification of LTM and optimization could be discussed, including issues such as what time frame constitutes LTM and what the differences are between process optimization and process modification. Discussing the private-sector and regulatory perspectives within a plenary framework would benefit future conferences. The private-sector perspective on LTM and systems optimization may be very different from that of federal facilities in terms of infrastructure, technology needs, and cost. The regulatory perspective would enable the audience to assess the requirements and mechanisms for optimization and process modification.

2.2 Case Studies

Case studies of remedial systems optimization and LTM were presented to demonstrate the current state of practice in applying strategies to accelerate cleanup and reduce costs. Six case studies were presented, covering a range of strategies from performing system analysis audits to using data visualization and geostatistical techniques. Several general themes emerged: (1) experienced staff and good engineering practices are needed to optimize processes, (2) geostatistical, mathematical, and data visualization techniques can be used to more efficiently assess data and optimize systems, (3) negotiation and communication with the regulatory community are important in implementing optimization results, and (4) significant system improvements can often be realized with modest upfront cost.

Dick Cronce and Scott Glass (4) presented results from a Remedial Actions Operations/Long-Term Monitoring (RAO/LTM) optimization at the Naval Industrial Reserve Ordnance Plant in Fridley, Minnesota.
The 80-acre site was placed on the NPL in 1989 and is contaminated with chlorinated solvents. The pump-and-treat system consists of six groundwater recovery wells and air stripping. Initial optimization activities used a numerical model to negotiate regulatory requirements for sampling frequency and the number of monitoring wells sampled. Further optimization included application of the SmartSite™ system, a data management and systems analysis approach. Optimization recommendations included the use of sensors to monitor pumping and extraction rate, remote control of systems operation, information management optimization, analysis of operation and maintenance services, and empowerment of operators to institute recommendations and changes. The cost for the assessment was estimated to be approximately $40,000, representing around 10% of the predicted cost savings.

Dave Becker (5) presented results from several Remediation System Evaluations (RSEs) performed to identify cost savings and optimize remedial system performance. Test sites evaluated included a Superfund landfill site in New Jersey, an Army installation in Utah, and an Army installation in Washington State. The goals of the RSE process are to identify easily attainable savings modifications, verify definable and realistic objectives, ensure adequate maintenance of Government-owned equipment, and verify protectiveness consistent with CERCLA 5-year reviews. Detailed checklists for evaluating these aspects of the systems have been developed and are available on the Internet at http://www.environmental.usace.army.mil/library/guide/rsechk/rsechk.html. Each evaluation cost approximately $20,000 and identified potential savings of $80,000 to over $300,000 per year in operating and maintenance costs.

Rob Greenwald (6) discussed the use of hydraulic optimization modeling to optimize pump-and-treat systems at three facilities.
The hydraulic optimization uses linear programming to determine the "best" pumping rates and well placements subject to specified constraints. Multiple optimal solutions are evaluated relative to site-specific technical and nontechnical considerations. An initial screening approach, using a simple spreadsheet, basic costs for system operation, and very rough "educated guesses" of the potential savings, allows a quick determination of whether the potential cost savings justify the detailed hydraulic optimization model. Of the three test sites evaluated under this EPA-funded effort, hydraulic optimization modeling yielded significant savings at two sites. Total savings were projected to be in the millions of dollars over the life of the projects.

Philip Hunter (7) presented strategies for optimizing LTM and pump-and-treat. The Air Force is preparing a draft Remedial Process Optimization Handbook as well as decision support software and algorithms that aid in optimization of monitoring and remedial costs. Statistical models, data visualization, and field-portable analytical techniques are important tools in the optimization process. A Remedial Process Optimization exercise was performed at Wurtsmith Air Force Base, Michigan. The Base operates pump-and-treat systems for the treatment of chlorinated solvents. Preliminary optimizations considered over 200 possible extraction well locations simultaneously. The results of the optimization exercise indicated that the existing extraction wells were not located or screened appropriately. The existing extraction wells are scheduled to be abandoned, and new extraction wells from the optimized design will be installed.

Rusty Norris (8) provided information on Remediation System Audits (RSAs) for industrial clients. The RSA process is a holistic approach that identifies ways to reduce cost and duration.
The process considers such factors as risk-based cleanup levels, monitoring frequency, remedial performance, and the possibility of converting to natural attenuation processes. RSAs typically cost less than $10,000, take several weeks to complete, and realize savings in the hundreds of thousands of dollars.

Emil Klawitter and Michael Barry (9) discussed an RAO/LTM optimization at the Naval Air Station at Brunswick, Maine. The site was placed on the NPL in 1987 and uses pump-and-treat with UV oxidation for the solvent-contaminated groundwater. The initial LTM plan used the existing remedial investigation network of 36 wells and required quarterly sampling, resulting in costs of approximately $550,000 per year. A geostatistical approach was used in conjunction with a data quality objective (DQO) assessment to evaluate the current LTM plan. The revised LTM plan reduced the number of sampled wells from 36 to 22 and reduced the frequency of sampling. In addition, new monitoring wells were proposed to address data gaps identified during the geostatistical analysis. Based on these modifications, the monitoring costs were reduced from $550,000 to $250,000 per year. The treatment system is currently being evaluated.

Section 3
Systems Performance Assessment and Optimization

Operation of remedial systems over the last several decades has provided a substantial database of information regarding their performance and cost. With this information, assessments of system performance and recommendations for optimization have been performed on many operations. This information, and the approaches used to gather, analyze, and perform optimization analyses, are applicable to existing remedial systems as aids to reducing costs and shortening remedial time frames. Through presentations and workshops, the conference sessions addressed current approaches and tools for assessing the performance and optimizing the design of existing and planned remedial systems.
Technology groups addressed include soil vapor extraction (SVE) and air sparging, permeable reactive barriers, bioremediation and phytoremediation, above ground treatment systems, and natural attenuation.

3.1 Soil Vapor Extraction and Air Sparging

Vapor extraction and sparging technologies for in situ treatment of organics are cost-effective, proven technologies that have been successfully implemented at many hazardous waste sites. These technologies are now considered conventional, even though innovative applications of the technologies continue to be developed. The five presentations provided valuable insights and lessons learned in site screening for application, characterization methods for design, LTM and performance assessment, and system optimization guidelines.

Ralph Baker (10) summarized a USEPA SVE optimization guide that is planned for release in the Fall of 1999. When published, information on the optimization guide will be found at http://www.clu-in.org. Relevant findings the guide will address include: (1) site characterization limitations may impact SVE effectiveness; (2) SVE system performance is limited by subsurface gas flow and preferential gas flow; (3) SVE should be designed based on site-specific performance data; (4) specific discharge estimates rather than vacuum influence are preferred for design; (5) full-scale implementation should be phased in (iterative design); (6) proper monitoring technologies are needed; (7) performance should be optimized through periodic comprehensive site reviews; (8) diffusion limitations dictate SVE remedial progress; (9) SVE is not applicable to the smear zone, capillary fringe, or groundwater; and (10) rebound testing should be part of any closure.

Dominic DiGiulio (11) discussed the need to provide consistent approaches to performance assessment of soil venting technologies.
Even though 27% of Superfund sites use or have used soil venting, there is a lack of efficient and consistent approaches to performance assessment and a general inability to obtain closure. Conventional radius-of-influence testing using vacuum measurements over distance is inadequate. Three-dimensional modeling using superposition is the preferred method to determine the volume of soil amenable to SVE. The use of in situ pressure transducers and transient testing allows for assessments of gas permeability on relatively small scales. This is analogous to the use of slug tests for hydrogeologic characterizations.

Rick Johnson (12) presented approaches and techniques for applying in situ air sparging (IAS) to contaminated groundwater. Three steps are recommended for screening and application of an IAS project: (1) risk characterization, (2) conceptual model development, and (3) system design. Initial screening should use a single sparge point and several vadose zone monitoring points; if results are acceptable, a standard IAS system can be installed. Ongoing monitoring requires assessment of air distribution, off-gas concentration, and groundwater quality. Air distribution can be assessed using water pressure, neutron probes, electrical resistance, and chemical tracers. Groundwater characterization for IAS is best accomplished through direct-push technologies, as opposed to conventional monitoring wells, which act as short circuits for air.

Bo Stewart (13) discussed a new well logging technique known as PneuLog™, specifically designed to characterize a site for SVE applicability as well as to optimize SVE once installed. The technology uses an airflow indicator that is moved along the length of a well screen during vacuum extraction while monitoring gas concentrations. The output from the PneuLog™ is a mass-removal-per-foot profile of a well. The process can identify specific strata from which contaminants are being removed.
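As a rough illustration of the arithmetic behind such a profile (a hypothetical sketch based only on the description above, not the vendor's actual algorithm; the function name, inputs, and units are assumptions):

```python
# Hypothetical sketch of a PneuLog-style "mass removal per foot" profile.
# Not the vendor's algorithm: inputs, names, and units here are assumptions.
#
# During vacuum extraction, gas enters the screen along its length and flows
# upward, so the flow logged at a given depth is the total flow entering the
# screen below that depth. The difference in contaminant mass flux between
# two logging depths is therefore the mass contributed by that interval.

CFM_TO_M3_PER_MIN = 0.0283168  # 1 cubic foot = 0.0283168 m^3

def mass_removal_per_foot(depth_ft, flow_cfm, conc_mg_per_m3):
    """Return a list of (interval midpoint in ft, mg/min removed per ft).

    depth_ft       -- logging depths, shallow to deep
    flow_cfm       -- gas flow past the tool at each depth (decreases with depth)
    conc_mg_per_m3 -- mixed vapor concentration measured at each depth
    """
    profile = []
    for i in range(1, len(depth_ft)):
        dz = depth_ft[i] - depth_ft[i - 1]  # interval thickness, ft
        # Mass flux (mg/min) passing each logging depth:
        flux_upper = flow_cfm[i - 1] * CFM_TO_M3_PER_MIN * conc_mg_per_m3[i - 1]
        flux_lower = flow_cfm[i] * CFM_TO_M3_PER_MIN * conc_mg_per_m3[i]
        midpoint = 0.5 * (depth_ft[i] + depth_ft[i - 1])
        profile.append((midpoint, (flux_upper - flux_lower) / dz))
    return profile

# Example: in this synthetic log, most of the mass flux is gained between
# 80 and 85 ft, flagging that stratum as the dominant contaminant source.
profile = mass_removal_per_foot([80, 85, 90], [100, 40, 0], [50, 20, 0])
```

A profile of this kind is what makes the optimization moves described next (throttling, shutting down, or adding wells in productive strata) straightforward to justify.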
The results from a PneuLog™ survey of extraction wells can be used to optimize SVE performance by increasing or decreasing flow to specific wells, shutting down wells, or installing new wells. The technique also enables better estimates of operation time and attainment of cleanup goals.

Zhenhua Jiang (14) discussed an SVE systems performance model, T2VOC. A major issue in using SVE revolves around the exit strategy, or "when to turn it off." The T2VOC model assists in the three-dimensional evaluation of SVE performance. The model evaluates multi-component, multiphase flow paths. Future efforts with the model involve evaluating different NAPL compounds as well as system performance as individual wells are turned on and off.

Overall, the SVE and air sparging presentations demonstrated the ability to optimize performance and provided approaches and tools for assessing the applicability of the technology for site-specific conditions. The widespread use of SVE and air sparging technologies is in many ways analogous to the historical use of pump-and-treat. Pump-and-treat at many sites was reclassified as a containment technology after evaluations revealed lengthy remedial time frames. SVE and air sparging technologies can fall prey to similar problems if the systems do not effect significant source reduction. Optimization assessments may expedite cleanup by identifying source materials and the best way to achieve maximum removal rates. Future conferences and workshops could continue to update assessment and optimization tools that are being developed or demonstrated.

3.2 Permeable Reactive Barriers

Permeable reactive barriers (PRBs) are a low-cost, low-maintenance, passive technology for the treatment of relatively shallow contaminated groundwater. Presentations at the conference focused on the reductive dechlorination of chlorinated solvents using iron.
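As background (a textbook summary of iron-mediated reductive dechlorination, not drawn from any single presentation), the treatment couples anaerobic corrosion of zero-valent iron with stepwise replacement of chlorine by hydrogen on the solvent molecule:

    Fe(0) --> Fe(2+) + 2e-            (anaerobic iron corrosion supplies electrons)
    R-Cl + H(+) + 2e- --> R-H + Cl(-) (reductive dechlorination of the solvent)

The consumption of H+ and the strongly reducing conditions generated within the wall are why pH and Eh are routinely used as performance indicators for these barriers.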
Common issues addressed include the long-term effectiveness of the emplaced iron, fouling and reduction of permeability, and downgradient monitoring technologies.

Robert Puls (15) discussed long-term performance issues and solutions for PRBs. The two main issues are long-term effectiveness and potential decreases in permeability. Performance monitoring can be accomplished by measuring pH increases in the wall and Eh decreases downgradient, as well as downgradient reductions in the contaminants of concern. In addition, water quality measurements such as dissolved oxygen, sulfides, and alkalinity are also important. Long-term fouling assessments can be accomplished through analysis of the type and nature of surface precipitates. Optimization areas being investigated include decreasing the cost of granular iron, identifying better catalysts, and using deeper and less expensive emplacement technologies.

Robert Gillham (16) presented findings from LTM performance investigations. Performance issues investigated include longevity of the iron, biofouling, and loss of activity and permeability by precipitate formation. Iron precipitate formation studies showed porosity losses of only 0.004% per day, with a layer thickness increase of only 10^-10 cm/day. Field tests in upstate New York revealed that reducing the area per unit flow of water minimizes precipitation and scaling, and that none of the 30 wells installed showed problems with plugging.

Nic Korte (17) discussed methods to assess hydraulic measurements through PRBs, with special emphasis on issues facing DOE sites. DOE sites are concerned with the transport of radionuclides through passive remediation systems such as PRBs. Colloidal release is a mechanism observed to account for the non-fouling behavior of PRBs. This mechanism is beneficial for non-rad applications, but is of concern at radioactive sites.
A downhole well tool, the Colloidal Borescope, is used to assess the potential colloidal release of contaminants from the barrier. The tool can also be used to assess flow and transport through the barrier.

Timothy Sivavec (18) presented results of a pilot-scale PRB application at the Somersworth, New Hampshire, Landfill Superfund site. The study provided useful monitoring and performance information. Based on a quantification of mineral precipitates, there was a less than 3% loss in porosity over the 18-month study. In addition, biofouling due to microbial growth was not observed. Monitoring protocols are in development that would integrate commercially available groundwater characterization sensors with Internet-based real-time data acquisition and visualization technologies, allowing early detection of possible contaminant breakthrough.

Charles Reeter (19) discussed the results of a pilot-scale PRB treatment at the former Naval Air Station Moffett Field site in Mountain View, California, as well as results from four other sites. Significant findings indicate that in some cases chlorinated solvents were reduced but not eliminated, necessitating polishing technologies such as biosparging. General concerns in the application of PRBs include: (1) multiple designs, (2) the need for design and construction guidance, (3) the need for consistent sampling and analysis protocols, (4) methods to predict PRB failure, and (5) information on PRB life expectancies.

Overall, PRBs were depicted as effective technologies for the containment of migrating plumes of chlorinated solvents. The studies presented did not indicate serious concerns involving the life expectancy of the iron or fouling and loss of permeability. However, monitoring for breakthrough and measures to reduce overall costs (treatment and monitoring) are general concerns. Future conferences could update information on PRB effectiveness and monitoring technologies.
Specifically, sensor technology integrated with real-time data visualization systems would be particularly beneficial for monitoring breakthrough.

3.3 LTM of Monitored Natural Attenuation: Guidance and Case Studies

Although natural attenuation is not considered an active remediation technology, its monitoring needs require tools to assess its applicability and progress. In addition, coupling source reduction with natural attenuation may significantly reduce remedial time frames and, in some cases, make natural attenuation a viable option where it otherwise would not be. Conference presentations and workshops addressed performance and monitoring issues. Four presentations discussed the Air Force and EPA approaches to LTM of natural attenuation, as well as case studies evaluating natural attenuation. Two workshops, one on the BIOCHLOR and BIOSCREEN models and one on source control coupled with natural attenuation, provided an in-depth review of new tools and emerging issues.

Patrick Haas (20) presented the Air Force's perspective on LTM. The Air Force approach is to use site-specific criteria for determining the applicability of and monitoring needs for natural attenuation. A contingency plan and a specific LTM plan are required. A protocol is being prepared for effective placement, construction, and sampling of well points.

Varadhan Ravi (21) discussed natural attenuation at the KL Avenue Landfill site. Monitoring natural attenuation at a landfill site is difficult due to the transient nature of the source, which may falsely indicate that the plume is attenuating. The use of numerical and analytical models to assess temporal and spatial variability was discussed. Geostatistical techniques were used for visualizing spatial heterogeneity; time-series techniques were used to characterize temporal variability.

Judith Pennington (22) presented results from a natural attenuation study of explosives at the Louisiana Army Ammunition Plant.
Natural attenuation of TNT and RDX is being investigated at a 26-acre site. The monitoring issues involve demonstrating the rate and magnitude of bacterial decomposition of the energetic materials. The study demonstrated that certain "biomarkers" are evidence of the natural breakdown of the contaminants. Stable isotope and microbial PLFA analyses were also beneficial. Herb Levine (23) discussed the development of an EPA guidance document on the natural attenuation of BTEX compounds and chlorinated solvents in groundwater. The document will focus on several important objectives of natural attenuation such as the determination of whether natural attenuation is 11 ------- occurring, potential toxic transformations, plume expansions, new releases that may impact effectiveness, when and how to implement a contingency remedy (if necessary), and frequency of monitoring. Charles Newell ran a workshop on the application of the BIOCHLOR and BIOSCREEN models for screening chlorinated or hydrocarbon sites for natural attenuation potential. Both models can be downloaded free of charge from the EPA Center for Subsurface Modeling Support (CSMoS) web site. The models can be run from Microsoft Excel 97 and are used to simulate contaminant degradation based on user input. The models assist in the determination of natural attenuation applicability and should be used in conjunction with additional evidence for final determination. Patrick Haas (AFCEE) led a panel discussion workshop on source control coupled with monitored natural attenuation. Other panel members included Paul Hadley (CaDTSC), Chuck Newell (Groundwater Services, Inc.), Dominic DiGiulio (EPA), Ken Lovelace (EPA), and Mark Bershold (CaDTSC). The panel discussed the issues involved in and benefits of performing source control and removal as an aid to accelerating and/or enabling natural attenuation.
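BIOSCREEN and BIOCHLOR solve fuller analytical transport equations (with dispersion, retardation, and sequential decay), but the core screening idea — first-order contaminant decay along the groundwater flow path — can be sketched in a few lines. All parameter values below are hypothetical illustrations, not site data.

```python
import math

def centerline_concentration(c0, x, v, k):
    """Steady-state centerline concentration for 1-D advection with
    first-order decay: C(x) = C0 * exp(-k * x / v).
    c0: source concentration (mg/L), x: distance downgradient (ft),
    v: seepage velocity (ft/yr), k: first-order decay rate (1/yr)."""
    return c0 * math.exp(-k * x / v)

# Hypothetical BTEX plume: 10 mg/L source, 100 ft/yr velocity, k = 0.5/yr
for x in (0, 100, 200, 400):
    print(x, round(centerline_concentration(10.0, x, 100.0, 0.5), 3))
```

A screening run like this indicates roughly how far a plume can migrate before decay brings concentrations below a target level; the full models add lateral/vertical dispersion and source decay on top of this term.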
There is a need for additional research and development of simulation models to evaluate and predict the effects of aggressive source removal on natural attenuation processes. Although theoretical approaches have evaluated the effect of source control on natural attenuation, case studies and actual site data are needed to assess the usefulness of this approach. Monitoring and modeling techniques for natural attenuation are similar to techniques used for active remediation. However, there are specific concerns relevant to natural attenuation such as off-site migration and long-term effectiveness. Natural attenuation is generally accepted as applicable to hydrocarbon sites and is becoming more acceptable as an alternative for chlorinated sites. Future conferences and workshops could focus on emerging evidence for natural attenuation of other organic contaminants and metals. In addition, more information is needed to determine the impacts of coupling source removal with natural attenuation. Applications where source removal negatively and positively impacted natural attenuation should be discussed. There may be a need for more sophisticated models and decision support software to provide not only screening, but also data reduction and visualization during the monitoring phase. The BIOCHLOR and BIOSCREEN models are useful for general screening, but more evidence may be required to predict the degradation pathways of natural attenuation, as well as long-term confirmation. Expert systems and other integrated decision support techniques could be developed to meet this need. 12 ------- 3.4 Bioremediation and Phytoremediation Bioremediation is an accepted in situ technology for the treatment of organic contaminants and some metals. Phytoremediation is an emerging technology for organics and metals that is receiving attention due to its low cost and environmental acceptance. Both technologies offer effective and economical solutions to contaminant problems.
However, biological systems are inherently difficult to predict and control, and in the case of phytoremediation, may require lengthy remedial time frames. The three presentations in this session addressed monitoring and optimization tools and related issues. Scott Beckman (24) presented results from a joint EPA, ESTCP, and USGS study of phytoremediation of chlorinated ethenes at Carswell Air Force Base in Fort Worth, Texas. Due to the long-term remediation requirements inherent in phytoremediation, LTM is important in determining the rate and effectiveness of the system. In particular, hydrogeological monitoring to determine effectiveness of tree transpiration processes and geochemical groundwater characterizations should be performed. Selection of the appropriate monitoring stations is critical to perceiving changes in water uptake. Geochemical and biological groundwater indicators such as dissolved oxygen, hydrogen ion concentration, iron speciation, and microbial counts are useful for assessing the reductive dechlorination process. Christine Shoemaker (25) discussed an approach to designing and operating in situ bioremediation by using optimization algorithms coupled with groundwater fate and transport models. This approach can be used to determine the optimal parameters such as pumping rates and nutrient additions. A sensitivity analysis (i.e., an analysis that determines which parameters are responsible for the largest effects) aids in the identification of important factors to control during in situ bioremediation. Sensitivity results were presented for a simple aerobic bioremediation system as well as a more complex anaerobic dechlorination process. Kent Sorenson (26) presented results from a large-scale in situ field test of a bioremediation treatment process at the Test Area North (TAN) Facility, Idaho National Engineering and Environmental Laboratory. An integrated monitoring strategy was implemented to assess the performance of the system. 
Elements of the monitoring program consisted of dedicated low-flow pumps, flow-through cell measurements, field analyses, in situ monitoring equipment, and near real-time data analysis. The integrated monitoring program revealed significant information, allowing the bioremediation process to be optimized. 13 ------- Bioremediation effectiveness is dependent on factors that may impact the ability of the microorganisms to perform optimally. These factors may be complex and site-specific. Further development of sensitivity analysis and process optimization through modeling can be an important design and monitoring tool. The use of field-based characterization analyses and sensors should also be further investigated. One specific technology need is the application of in situ sensors that can directly detect specific microbial assemblages responsible for contaminant transformations. Current sensors and in situ analyses rely on secondary indices such as dissolved oxygen and ORP to predict microbial conditions. Furthermore, microbial analyses are expensive and require acquisition of soil and/or groundwater. In situ microbial sensors would offer real-time data representing the type and size of microbial populations. Future phytoremediation needs involve the development of hydrological models and sensors that can ascertain small temporal hydraulic changes associated with transpirative processes. System performance monitoring may require development of water balance measurement protocols which evaluate both water input and plant uptake. Development of quantitative and rapid methods to measure plant metabolic contaminant degradation byproducts will enable assessments of the plant's ability to transform the contaminants of concern. 3.5 Above Ground Treatment Systems It is commonly assumed that above ground treatment systems are easier to optimize and monitor than in situ processes. 
However, the increasing use of treatment trains is requiring optimization of all system components. This multi-component optimization will require advanced monitoring tools and optimization programs to achieve maximum system efficiency. The session addressed optimization and monitoring tools and needs for above ground treatment systems. Sarabjit Singh (27) discussed process optimization of an above ground treatment system at McClellan Air Force Base. The optimization consisted of four major approaches: system reliability improvement, optimization of process parameters, minimization of process generated waste, and optimization of sampling type and frequency. System reliability improvements resulted in uptimes exceeding 90%. Waste minimization optimizations reduced chemical feed rates and scrubber blowdown. Lindsey Lien (28) presented results from optimization of a groundwater treatment system at the Milan Army Ammunition Plant in Milan, Tennessee. The treatment system consists of electrochemical precipitation, ultraviolet oxidation, and carbon adsorption. The challenge was to evaluate and optimize 14 ------- each component so the total system operates efficiently and cost-effectively. The analysis was performed at two different influent concentration ranges to determine effectiveness over a range of operating conditions. Ted Streckfuss (29) discussed the US Army Corps of Engineers approach to performance audits at treatment facilities. Critical elements of a performance audit include: enhancement of system operation, optimization of equipment components, and minimization of utility expense. David Ahlfeld (30) (University of Massachusetts) and Richard Peralta (Utah State University) led a workshop on flow and transport optimization for pump-and-treat systems. They discussed mathematical models for designing and optimizing groundwater pump-and-treat systems.
The goal of the optimizations is to save costs by collecting less data while retaining high quality information for decision-making. Many mathematical models were discussed but two specific models were highlighted: the Simplex Method, which automatically finds the optimum design from millions of combinations, and REMAX, a non-linear transport optimization model. The workshop presented several guidelines for optimization that included: (1) application of a phased approach, (2) the need for a good simulator, (3) the importance of defining the optimization problem, (4) experiences from other sites, and (5) comprehension of available codes and programs. The use of treatment trains in above ground systems will require greater coordination and optimization between system components. Areas for future development and conference discussion include use of sensors (to predict changes in concentration) and computerized systems to provide "on-the-fly" system analysis and process modification/optimization. In addition, optimization software using advanced algorithms can be integrated into the control software. 15 ------- Section 4 Long-Term Monitoring Optimization 4.1 Data Quality Objectives and LTM Guidance Two speakers, representing EPA and the Defense Logistics Agency, presented known and established processes that can be used for optimization of LTM. They discussed the data quality objective (DQO) process and the remedial process optimization (RPO) process. The RPO process was developed specifically for long-term optimization; the DQO process is a general approach used for many scientific procedures. John Warren (31) discussed the DQO process and provided new information on software available from the EPA. This software allows an investigator to alter false positives, false negatives, and the number of samples needed in order to calculate the total sample number based on "decision error risk." This software is available to the public on request.
Elements of the presentation included: (1) decision making and acceptable false positives and false negatives (Type I and Type II errors), (2) estimation of the number of samples needed and budgeting for the appropriate sample number, and (3) total error variance. The information presented discussed statistical error and acceptable risk. Major Dan Welch (32) presented information on the RPO approach developed by the Air Force. He noted the existence of an interagency RPO workgroup. The RPO process: (1) prepares managers for the "5-year review," (2) optimizes LTM, and (3) determines data requirements. It is designed to answer the question, "can we reduce and/or streamline the information?" A primary theme of the presentation was the use of common sense in reducing and streamlining information while maintaining overall data quality. Future conferences could focus on the issue of acceptable Type I and Type II errors for the DQO process. "Acceptable error" is project dependent, and experimental designers are often unable to determine acceptable Type I and Type II errors. Additionally, information on the use of the RPO process in terms of sampling, analysis, well reduction, etc. could provide the audience with details for implementation. Additional information from sites where the RPO process has been successfully implemented to optimize remedial processes or achieve site closure would be a good follow-on to this presentation. Site closure was the emphasis of many talks, and there seems to be a lack of information available on what is needed or how to eventually achieve "site closure." 16 ------- 4.2 Monitoring Networks and Sampling Presentations and two workshops illustrated the many different tools that are available for optimizing the LTM process. They include statistical tools, support software, and low cost sampling equipment and procedures.
Each presentation discussed examples of how these tools are used and the subsequent cost saving associated with each. Emphasis was placed on the use of software and statistics, with minimal discussions of the cost savings associated with improved analytical and sampling strategies. Additional information in these areas would interest future audiences. The presentation on low cost diffusion samplers was well received and initiated an audience discussion. This was an informative presentation, and there are probably many more methods in practice that would interest future conference attendees. A replacement speaker for Robert Gibbons (33) discussed statistical methods for detection monitoring. These methods are used in assessment monitoring and corrective action programs. Significant elements of the presentation included: (1) a false belief that analytical methods give true concentrations, (2) the use of statistics after data collection in order to construct intervals representing true concentrations, (3) the methods to determine true contaminant concentrations and sample size requirements, and (4) hypothesis testing: "innocent until proven guilty." In addition, investigators could demonstrate compliance by showing the Upper Confidence Limit (UCL) is less than the clean-up standard, in other words "reject the null only if data give evidence that we are not meeting compliance." Nancy Hassig and Mickey Brown (34) presented an example of the use of DQOs to revise a groundwater monitoring program at Pantex. This was an example of the DQO process presented earlier by John Warren. The presentation emphasized: (1) the DQO process as a means to encourage the data user to better define goals, (2) the management of data uncertainty and acknowledgment that there is uncertainty of collected data, (3) the optimization steps that are needed, and (4) ways to actually revise the monitoring program. 
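The compliance demonstration described above — showing that the 95% upper confidence limit (UCL) on the mean falls below the cleanup standard — can be sketched in a few lines. The concentration data below are hypothetical, and the Student's t critical value is taken from a standard table; the 5 ug/L comparison value corresponds to the federal drinking water standard for TCE.

```python
import math

def ucl95(data, t_crit):
    """One-sided 95% upper confidence limit on the mean:
    UCL = mean + t * s / sqrt(n), where t_crit is the Student's t
    quantile for (n - 1) degrees of freedom (from a table)."""
    n = len(data)
    mean = sum(data) / n
    s = math.sqrt(sum((x - mean) ** 2 for x in data) / (n - 1))
    return mean + t_crit * s / math.sqrt(n)

# Hypothetical TCE results (ug/L) from 9 quarterly samples; t(0.95, df=8) = 1.860
conc = [3.1, 2.7, 3.4, 2.9, 3.0, 3.3, 2.8, 3.2, 3.0]
ucl = ucl95(conc, 1.860)
print(round(ucl, 2), "compliant" if ucl < 5.0 else "not demonstrated")
```

This mirrors the "innocent until proven guilty" framing: compliance is claimed only when the interval estimate, not just the point estimate, sits below the standard.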
Kirk Cameron (35) presented specific examples of the use of statistics (kriging) to revise LTM programs. The presentation included: (1) a discussion of data problems, (2) the removal of redundant data through kriging, and (3) the running of scenarios to achieve optimization. Most of the presentation was an actual exercise in statistics. The kriging example demonstrated that reducing the number of wells by as much as 50% resulted in minimal loss of information. 17 ------- Steve Naber's (36) presentation was similar to Kirk Cameron's, focusing on the use of statistics (kriging and trend analysis) to revise an LTM program. The presentation also used a case study to demonstrate the application of statistical methods for optimization. Air sparging and natural attenuation were the remediation processes in use. The goal of the statistical exercise was the elimination of sampling points as a means to minimize superfluous data. Charles Newell (37) discussed the use of decision support software for designing LTM plans. The presentation discussed the types of tools available and "what can be used for a Decision Support System." Often, too many wells are analyzed too frequently. Trend analysis, to determine plume stability, was demonstrated on several examples from different types of plumes. The software package discussed during the presentation will be available around October 2000. Once available, use of this software by others would be an interesting follow-on for this presentation. Don Vroblesky (38) introduced a low cost diffusion sampling device for VOCs in groundwater. The advantages of this device are that it: (1) is an innovative approach to sampling volatiles, (2) is a very low tech collection method, (3) gives a good representation of discharge zones for underground contaminated plumes, and (4) provides temporal information and can eliminate well purging.
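The presentations above do not name a specific trend statistic, but the Mann-Kendall test is a common nonparametric choice for judging plume stability from quarterly monitoring data. A minimal sketch, with hypothetical concentrations, is:

```python
def mann_kendall_s(series):
    """Mann-Kendall S statistic: count of upward minus downward pairs
    over all (i, j) with j > i. S < 0 suggests a decreasing trend;
    significance would be judged against Var(S) = n(n-1)(2n+5)/18
    (ties ignored here for simplicity)."""
    s = 0
    for i in range(len(series)):
        for j in range(i + 1, len(series)):
            diff = series[j] - series[i]
            s += (diff > 0) - (diff < 0)
    return s

# Hypothetical quarterly benzene results (ug/L) at one monitoring well
print(mann_kendall_s([12.0, 10.5, 9.8, 10.1, 8.2, 7.5, 6.9, 7.1]))
```

A strongly negative S at a well is the kind of evidence used to argue that sampling frequency can be reduced or the well dropped from the network.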
Gary Tuckfield and Maureen Ridley (39) led a workshop on the optimization of LTM costs via statistical and geostatistical thinking. The workshop discussed the use of statistics to eliminate redundant well sampling at pump-and-treat remediation sites. The first example presented used kriging to reduce the number of wells sampled without compromising the quality of the monitoring programs. This process involves removing wells from the sampling plan and comparing the effect on the results as each well is removed. Kriging provides a linear combination of the point contamination sources that have influence in areas where samples are not obtained. Well elimination is based on the following guidelines: (1) Relevancy, (2) Redundancy, (3) Reliability, and (4) Regulatory. An example of applying these guidelines at an actual site was presented. An integral element of the study was a presentation to the regulators and their part in the decision making process based upon the information presented. The cost of the study was approximately $85,000, with cost savings estimated at approximately $200,000 per year. The second half of the workshop addressed the Cost Effective Sampling (CES) program designed to perform a sensitivity analysis. The model, developed and deployed at the Lawrence Livermore National Laboratory (LLNL), has also been used at the DOE Savannah River Site and Riverbank Army Ammunition Plant to reduce monitoring costs. CES uses simple linear 18 ------- programming to evaluate a sampling strategy for a monitoring network by evaluating temporal changes and variability in contaminants sampled from well points. Sampling frequency is then based on an analysis of the long-term stability and trends in the data. The technology has reduced monitoring costs by 25-40%, and reduced annual monitoring costs at LLNL by $400,000. Future presentations and workshops could focus on the use and application of statistical software packages from an end-user perspective.
It would be useful to track and update the results of optimization studies and build a database of case studies as a means of evaluating the best available strategies for specific applications. Conferences, workshops, or Internet-based technology transfer vehicles could be used to disseminate the evolving knowledge base. Future conferences could address the use of new and innovative sampling devices available as well as standard operating procedures (SOPs) for their use. More information on specific limitations of samplers and the limitations of the collected data would be useful. More examples of the applications for these and other types of samplers, as well as data comparability to traditional methods, are needed before these approaches can be used on a more widespread basis. 4.3 Data Assessment and Management Common themes for this session included the management and evaluation of data from the perspective of LTM. In addition, procedures to manage and assess data for optimizing the monitoring process and producing higher quality data were also covered. Several examples and methods were presented. Maureen Ridley (40) discussed web-based rapid data access and interpretation tools that facilitate the processing of the large quantities of data typically associated with environmental investigations. These tools have been used at LLNL for site characterization and monitoring as part of a ground water restoration project. These tools are available as part of the enABL Data Management System (EDMS) and allow multiple organizations to share information electronically. The free software can be downloaded at http://www.arsenaultlegg.com/_download/dl_list.idc. Joe Evans (41) presented methods to help ensure data comparability between sampling and analytical events that are separated by long time periods. Changes in laboratory instrumentation, standards, and personnel can be sources of variability.
The presentation included a discussion of: (1) shortcomings in traditional analysis, sampling, and interpretation methods used for interpreting bias trends, (2) methods to 19 ------- help ensure data comparability, and (3) examples of how these methods were applied and suggestions to ensure data comparability. John Hansen (42) presented an example of processes used to save $400,000 per year in LTM costs at the Badger Army Ammunition Plant in Wisconsin. The presentation emphasized: (1) an example of a data management system used for groundwater monitoring optimization, (2) benefits of data visualization during data collection, and (3) common sense procedures for data analysis. An example was presented which resulted in a reduction of the number of wells from 220 to 170, reduced the number of analytes, eliminated sampling points, and reduced frequency of sampling. The meetings with regulators to modify the sampling plan were also discussed. Anita Singh (43) presented statistical methods that are useful in the data review process. The emphasis of the presentation was the application of statistics to reaching the correct conclusion. The presentation focused on the computation of the 95% UCL to establish background level and verify cleanup. The burden of proof is on the alternate hypothesis, "to show that the site is clean." The lognormal distribution commonly used to represent environmental data may be overused since it "makes it easier not to reject the null hypothesis" and therefore may not be appropriate. The presentation also noted examples of conclusions that changed in response to use of the lognormal or normal distribution. The importance of data visualization techniques was also stressed. A.K. Singh (44) presented several beneficial alternative statistical procedures for environmental data analysis. 
Key elements of the presentation included: (1) the "normal" way of comparing data, (2) avoiding the application of log transformations, (3) using "Jackknife" and "Bootstrap" methods (incrementally take out one datum at a time) to interpret environmental data, and (4) approximating distributions for real data. A workshop was conducted by Robin Lovell and Sharon Shaw (45) on the use of Environmental Resources Program Information Management System (ERPIMS) data and the Geographic Information System (GIS). The electronic database system allows for the integration of site information to include site location, hydrogeology, analytical, and remediation status. The system allows the data to be accessed through the Internet. The software can be obtained from the following website: http://www.afcee.brooks.af.mil/ms/msc_irp.htm 20 ------- Future conferences could address, in greater detail, long-term data comparability issues as well as methods and procedures to detect and rectify potential problems. Specifically, case studies and examples of QA processes in place that ensure data comparability would be a useful element in a conference. Conferences and workshops addressing statistical procedures for determining sample size prior to plan development, as well as methods to perform post-sampling data reduction and interpretation, would be useful. The conferences could provide hands-on instruction and provide tools to assist in data reduction and interpretation. New statistical procedures and approaches can also be presented at these conferences. 4.4 Analytical Issues Common themes from this session focused on analytical methods that are appropriate for the evaluation of samples from sites that require LTM. Analytical issues relevant to LTM include data quality, representativeness, and alternative methods available for analysis. Analytical methodologies are increasingly built around the DQO process.
In addition, flexible and project-specific analytical schemes may be required for implementation of LTM programs. Analyses could be geared towards the specific project. Standard methods may not be appropriate for all situations, requiring the development and application of new methodologies. Barry Lesnik (46) presented information on Performance Based Measurement Systems (PBMS). This topic is controversial, with many investigators feeling that it will allow laboratories to develop their own analytical methods. In fact, the goal of PBMS is to choose appropriate methods in order to meet data quality needs. With PBMS, laboratories or principal investigators can choose methods appropriate for the matrix, the analyte, and the ultimate need of the data. EPA does not expect implementation of PBMS to spawn a large number of new methods. It is not the intention of EPA to replace previous methods currently being used. It is now more important that project managers and laboratories choose appropriate specifications and data quality requirements rather than choosing a specific method for analysis. A certain measure of data quality is required for regulatory decision-making, and PBMS focuses upon data quality. Alan Hewitt's (47) presentation focused on the collection and analysis of soil VOCs. Data and information were presented on a new method that seals soil samples in the field and then ships these samples "frozen" to the laboratory. Specific points of the presentation included: (1) the method is field friendly, and (2) it replaces old methods where volatiles are often lost. Most of the comparison was between this method and field methanol extraction. Field freezing is used when field methanol extraction is not feasible and therefore is not intended to replace the field methanol extraction procedure. The loss of 21 ------- volatiles from soil samples occurs so rapidly (within 30 minutes) that old methods for volatile collection are almost useless.
Lisa Shriver (48) introduced the use of a continuous flow immunosensor - a field instrument for immunoassay. The presentation addressed: (1) data from assays for explosives and comparability with conventional analyses, and (2) other assays under development for polychlorinated biphenyls (PCBs) and trichloroethylene (TCE). The method is in an early stage of development for many compounds. Additional information is needed to compare the immunoassay procedure to conventional laboratory analyses for the compounds of interest. Presentations for future conferences could discuss successes, failures, and concerns with the PBMS approach. Examples of case studies and actual methods could be presented so that the audience understands the rationale behind the PBMS approach. Additional methods that have been modified or developed under PBMS or other requirements could be discussed. Alternative sampling and analytical approaches (e.g., volatiles and immunoassays) could be discussed in a format that addresses comparability to accepted methodologies. 4.5 Data Presentation/Visualization This session emphasized the role of data presentation as the key to understanding site data. Real-time data presentation from several sites can be very useful in presenting findings prior to development of reports. One drawback to this approach is that instant availability of data may circumvent necessary QA/QC procedures. Earl Edris (49) discussed the Groundwater Modeling System (GMS), version 2.1. He discussed: (1) changes in this version compared with previous models, (2) its status as a living model that provides characterization assessment, cleanup alternatives, and cleanup optimization, and (3) its ability to visualize contamination on site. The system has broad applicability for many different organizations. Francis Slavich (50) presented a GIS application for visualizing real-time data on the Web. The presentation stressed that visualization is the key to data presentation.
The GIS was demonstrated "on-line" during the presentation, with real-time data acquired through a web connection and presented to the audience. The system 22 ------- has a simple, user-friendly menu interface and requires no real training. A demonstration of the system can be accessed at http://www.amc-cc.org. Future conferences could investigate additional data presentation and visualization systems similar to GIS or GMS. Topics to be addressed include the advantages and disadvantages of data presentation and visualization systems from a data user perspective and the benefits with respect to LTM. 23 ------- Section 5 In situ / Remote Sensors and Networks and Emerging Monitoring Techniques The use of sensors, networks, and advanced sampling technologies is particularly beneficial to LTM and optimization of remedial systems. The cost of acquiring and analyzing soil and groundwater matrices over an extended time period can be a major factor in the economics of a project. In some cases, sampling and analysis may exceed system operation and maintenance costs. In-place sensors and innovative sampling methods can be cost-effective alternatives to "hard" sampling and analysis. Furthermore, integration of sensor networks with telecommunication technologies can provide continuous and real-time information that can be used for remote system control and optimization. The conference addressed advances and future development needs in the application of sensor technology for monitoring contaminants and subsurface conditions. A wide range of applications and technologies were discussed, including chemical sensors, data integration, remote sensing, and remote monitoring/control. The material was presented in a conference session as well as a workshop.
5.1 In situ/Remote Sensors and Networks Radislav Potyrailo (51) presented results of field test demonstrations on an in situ instrument for monitoring volatile organics in groundwater at part-per-billion (ppb) levels. The device is an acoustic wave thickness-shear mode sensor based on a quartz resonator coated with a nonpolar polymer film. Detection limits of 8 and 12 ppb were demonstrated for trichloroethylene and toluene, respectively. Advantages of the novel sensor include reduction of surface contamination, resistance to corrosion, and freedom from interference due to variations in pH, ionic strength, viscosity, and water density. Preliminary field evaluations demonstrated a good correlation with conventional laboratory purge-and-trap/GC analysis. Suzette Burckhard and Vernon Schaefer (52) discussed the use of remote sensing techniques to monitor the progress of phytoremediation applications. Long-term monitoring of large field sites usually requires numerous measurements to assess vegetative cover and condition. Remote sensing can be used to reduce the cost of sending personnel to the field to perform measurements. Aircraft equipped with systems such as the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) can be used to assess plant coverage. This remote sensing platform can provide up to 224 spectral bands of imagery from the visible through the 24 ------- mid-infrared and detects unique absorption features in the spectral region which correlate with the vegetation's vigor and/or stress. David Daniel (53) discussed monitoring techniques for vertical barriers designed to restrict the movement of contaminants or gases out of contaminated subsurfaces while restricting the inward migration of groundwater. Innovative techniques for physical integrity verification include geophysical (electrical resistance tomography) and fiberoptic sensors.
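The presenters did not specify which spectral index was used to gauge vegetation vigor, but the widely used Normalized Difference Vegetation Index (NDVI) illustrates how two reflectance bands can summarize plant condition from airborne imagery. The reflectance values below are hypothetical.

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index from near-infrared and
    red reflectance: (NIR - Red) / (NIR + Red). Healthy vegetation
    reflects strongly in the NIR and absorbs red light, so values
    approaching 1 indicate vigorous cover."""
    return (nir - red) / (nir + red)

# Hypothetical reflectances: dense healthy canopy vs stressed/sparse cover
print(round(ndvi(0.50, 0.08), 2))
print(round(ndvi(0.30, 0.15), 2))
```

Mapping an index like this across repeated AVIRIS flights is one way the vegetative-cover measurements described above could be made without sending field crews.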
Stephen Dwyer (54) discussed monitoring processes associated with the Alternative Landfill Cover Demonstration (ALCD), a project investigating the use of alternative caps for landfills in arid and semi-arid environments. Automated monitoring collects data on infiltration (lysimeters), soil moisture content (time domain reflectometry and neutron probes), runoff, and weather. The system is centralized through a series of data loggers, cable testers, computers, and power systems for continual data collection.

Lorne Everett (55) presented information on a new tool for monitoring performance of vadose zone remediation technologies. High-energy laser technologies provide a characterization tool that can "zip-up" its own 2-4 inch diameter hole for decommissioning. The subsurface materials can also be analyzed with laser spectroscopy.

Boris Faybishenko (56) discussed issues relating to characterizing preferential flow in the vadose zone. Preferential flow is a subsurface mechanism that allows accelerated migration of contaminants through fissures, fractures, burrows, and other geologic heterogeneities. The presentation identified gaps in current characterization and monitoring technologies and recommended actions for the development of advanced vadose zone characterization and monitoring methods using a combination of hydrologic, geochemical, and geophysical techniques.

Steve Leffler (57) presented information on the Environmental Systems Management, Analysis and Reporting network (E-SMART), a system that uses sensors to detect and measure contaminants in groundwater and soil gas, as well as physical parameters such as barometric pressure, pH, and temperature. Collected data are immediately available via the Internet for analysis and visualization. Smart functions of the system include self-calibration, standardized communication protocols with other sensors on the network, onboard data logging and storage, and plug-and-play installation.
Bruce Nielsen (58) discussed the use of direct push technology (DPT) platforms for making continuous measurements of contaminants and subsurface characteristics during drilling activities (soil sampling, well installation, and soil gas surveys). Sensor technologies associated with DPT include soil strength gages, resistivity, soil moisture, pore pressure, gas chromatography/mass spectrometry, and laser-induced fluorescence spectroscopy (organics).

Gary Roman (59) presented information on a nuclear radiation monitor with the ability to continuously monitor a large number of locations and automatically sound an alarm when measurements are outside established limits. The system (based on gamma detection) is capable of monitoring to depths of 50 meters below ground without the need to drill wells. A radiation monitoring probe is installed in the subsurface using cone penetrometer technology. The data are transferable to a host PC, which can be remotely located and can monitor hundreds to thousands of locations.

5.2 Emerging Monitoring Techniques and Future Research Needs

The workshop was designed to present new technologies and technology needs as well as to stimulate a facilitated discussion between the audience and panelists. The first half of the workshop was therefore dedicated to presentations, while the second half focused on audience discussions with the panelists. The audience consisted of a broad cross-section of interest groups such as regulators, technology consumers, and technology developers.

Marty Faile, Air Force Center for Environmental Excellence (AFCEE), discussed the Air Force's current monitoring initiatives and needs. The Air Force has recognized that remote operation of remedial systems is not simple and will require further work. Needs exist for chlorinated solvent sensors as well as low-cost sampling devices such as diffusion samplers. George Shaw, W.L.
Gore & Associates, discussed the Gore-Sorber screening module, a passive collection and analysis system for soil vapor testing. The system can save up to 75% of the cost of conventional sampling and laboratory analysis. Dan Powell, EPA Technology Innovation Office, discussed monitoring needs and programs from EPA's perspective. Sensor needs include groundwater, DNAPLs, MTBE (especially for USTs), and fence-line monitoring. Don Vroblesky, US Geological Survey, presented work on the use of tree sampling to detect organic contaminants in groundwater. Core sampling of cypress was able to determine the presence of TCE in groundwater.

The second half of the workshop involved a discussion of a number of issues, including:
• Processes for approval of new technologies by regulators and barriers to implementation;
• Challenges for technology developers in bringing new technologies to market given uncertainties in funding, market size, regulatory acceptance, etc.;
• Need for technologies to integrate and disseminate information obtained with monitoring technologies for project decision makers; and
• Identification of future monitoring needs.

The use of novel monitoring technologies may enable capture of high-quality environmental data at a significantly reduced cost. Future conferences and workshops could continue to provide updates on new technologies and applications. In addition, specialty conferences and workshops could focus on specific contaminants, matrices, or cleanup scenarios. Specific areas include sensors for monitoring organics such as TCE, novel groundwater monitoring techniques, and technologies applicable to monitoring natural attenuation. An issue that arises with the introduction of new technologies and approaches is regulatory acceptance. A conference and/or workshop could focus on the path to regulatory acceptance for specific new monitoring technologies.
Regulators could be informed about these new advances while technology developers could gain insight into the approval pathway.

ABSTRACTS

(1) Better Remediation Results: The Path Forward
By Walter W. Kovalick, Jr., Ph.D., Director, US EPA, Technology Innovation Office
ABSTRACT NOT AVAILABLE

(2) Getting to Site Closeout
By Karla L. Perri, Assistant Deputy Under Secretary of Defense, Department of Defense, Office of the Deputy Under Secretary of Defense (Environmental Cleanup)
ABSTRACT NOT AVAILABLE

(3) Remedial Systems Optimization Within the Site Closeout Process
By Mario E. Ierardi, US Air Force
ABSTRACT NOT AVAILABLE

(4) RAO/LTM Optimization At The Naval Industrial Reserve Ordnance Plant, Fridley, Minnesota - A Case History
By Richard C. Cronce, Ph.D., Scott A. Glass, P.E., and Nicholas P. Trentacoste, Ph.D.

Historical chemical releases resulted in elevated concentrations of chlorinated solvents, primarily trichloroethene (TCE), in soils and groundwater at the Naval Industrial Reserve Ordnance Plant (NIROP) in Fridley, Minnesota. Impacted groundwater flows off-site, discharging to the Mississippi River. The United States Navy (Navy) instituted a groundwater remediation program in accordance with a 1990 Record of Decision. The remedial action involves hydraulic containment and recovery of contaminated groundwater to prevent future off-site migration. The 750-gpm system consists of six extraction wells, treatment by aeration with four tray aerators, and discharge to surface water. Polymer injection and acid treatment components address pronounced inorganic and organic fouling of the system. Intensive treatment system and environmental monitoring and reporting to local, state, and federal agencies are required. Yearly remedial action operations and LTM (RAO/LTM) costs were approximately $750,000.
The Navy retained Science Applications International Corporation (SAIC) to identify potential opportunities for further optimization of the RAO/LTM program with respect to performance and costs by conducting a "Needs Assessment," the first phase of SAIC's SmartSite™ system. Optimization included: review of background information; field evaluation of program elements; identification of program inefficiencies and potential cost-saving alternatives; financial analyses and ranking of alternatives; and development of a plan for implementing the SmartSite™ technologies. The remediation technology and strategy, groundwater remediation system design, and system operations and maintenance (O&M) program were evaluated with respect to performance and cost. Modifications of the remediation program and upgrades of system hardware and software components were recommended where alternative approaches or technologies were shown to offer financial or non-monetary benefits.

The optimization assessment indicated no deficiencies in treatment technology or design. Optimization of well field management would likely reduce system fouling and costs. Replacing treatment system motor starters with variable speed drives and reconfiguring system piping for gravity rather than forced discharge will reduce power costs. Improved predictive and preventative maintenance will likely reduce maintenance costs. Reducing the number and frequency of samples collected and changing analytical methods will reduce environmental monitoring costs. Labor costs from manual collection of wellhead data can be reduced by automation of wellhead controls. Upgrade of the existing supervisory control and data acquisition (SCADA) system and development of an automated data collection, storage, and reporting program will provide a positive return on investment. Providing for future expansion of the SCADA system will likely provide additional future cost savings.
Cost for implementing the identified cost-saving opportunities is estimated at $375,000. Implementation of these opportunities is predicted to reduce yearly O&M costs by approximately 20%, or $160,000 per year. This equates to potential savings of up to $1,500,000 over the life of the program and a payback period of around 2.5 years. A plan for implementing these SmartSite™ technologies will involve preparation of engineering designs followed by final financial analyses, negotiations with regulatory agencies, and implementation of alternatives.

Richard C. Cronce, Ph.D., SmartSite Program Manager, Science Applications International Corporation, 3240 Schoolhouse Rd, Middletown, PA 17057-3595; 717-944-5501 (4044 fax). Scott A. Glass, P.E., Base Realignment and Closure Environmental Coordinator, Southern Division, Naval Facilities Engineering Command, P.O. Box 190010, North Charleston, SC 29419-9010; 843-820-5587 (5563 fax); glasssa@efdsouth.navfac.navy.mil. Nicholas P. Trentacoste, Ph.D., Science Applications International Corporation, 11251 Roger Bacon Drive, Reston, VA 20190; nicholas.trentacoste@cpmx.saic.com

(5) US Army Corps of Engineers Remediation System Evaluation Case Studies
By Dave Becker, US Army Corps of Engineers

The US Army Corps of Engineers (USACE) Hazardous, Toxic, and Radioactive Waste Center of Expertise (HTRW CX) has developed the Remediation System Evaluation (RSE) process. RSEs are intended to assure the protectiveness of the remedy, recommend cost-saving changes in the system operation or technologies applied at the site, verify a reasonable closure strategy, and assess the maintenance of Government-owned equipment. To assist USACE personnel and their contractors in performing these RSEs, a suite of checklists has been developed. These checklists address the overall system goals, subsurface performance, and above-ground treatment effectiveness, and offer possible cost-saving alternatives.
The checklists are meant to be mental "prompts" for experienced scientists and engineers conducting RSEs at a wide variety of long-term remedies. The RSE process has been applied at three different sites by the HTRW CX staff with assistance from USACE district staff and other agency personnel: a Superfund site in New Jersey, an Army installation in Utah, and an Army installation in Washington State.

The Superfund site is a capped landfill with a gas collection and treatment system, a slurry wall, and a contaminated leachate collection and treatment system. The RSE, performed as part of a National Contingency Plan (NCP) five-year review, offered suggestions for changes in operations potentially resulting in savings that exceed $300,000 annually. These changes include increasing the degree of automation and eliminating some of the multiple treatment processes. The operators working under contract to the PRPs had already identified many of the changes.

The Army project in Utah is a large 7,500-gpm pump-and-treat system involving groundwater extraction and injection wells, air stripping for TCE removal, and use of the anti-scale additive sodium hexametaphosphate (SHMP). The RSE identified changes to the monitoring program and extraction system, and the replacement of SHMP with CO2 addition. Total potential savings from these recommendations exceed $200,000 annually.

An RSE was conducted at a 2,500-gpm pump-and-treat system at an Army site in Washington. Recommended changes to the monitoring program for this TCE plume can potentially save over $80,000 per year. Suggestions were also made to improve the air stripping treatment of the TCE-contaminated groundwater. Possible issues regarding subsurface performance of several of the remedies were also identified and recommended for further study. The RSE checklists, a sample report, and an instruction guide will be made publicly available via a USACE web site in the near future.
Dave Becker, US Army Corps of Engineers, HTRW CX, 12565 W. Center Road, Omaha, NE 68144-3869; (402) 697-2655; (402) 697-2673 FAX; davej.becker@usace.army.mil

(6) US EPA, Technology Innovation Office Demonstration Project of Hydraulic Optimization Modeling at Existing Pump and Treat Systems
By Rob Greenwald, HSI GeoTrans

HSI GeoTrans conducted a demonstration project for USEPA to illustrate the application of hydraulic optimization techniques for improving the design of pump-and-treat systems. Hydraulic optimization combines groundwater flow simulation (e.g., MODFLOW) with linear and/or mixed-integer programming to determine mathematically optimal solutions for well locations and well rates, subject to site-specific constraints. Three sites with existing pump-and-treat systems were selected for this demonstration project. Two sites are Department of Defense (DOD) facilities, and one is an industrial facility. The project consisted of two major components: (1) development of a screening methodology to determine if a site is likely to benefit from more detailed hydraulic optimization analysis; and (2) formulation and solution of hydraulic optimization problems with the MODMAN code.

The spreadsheet-based screening method can be used to quickly determine if significant cost savings may be achieved by altering key aspects of an existing or planned pump-and-treat system (e.g., by reducing the total pumping rate). For each alternative considered, the method accounts for up-front costs, annual costs, estimated time horizon, and discount rate (to evaluate total costs in present-day dollars). Site-specific values input into the spreadsheet can be based on very detailed engineering calculations and modeling results, or may be based on "ballpark estimates." For alternatives that offer the potential for significant cost reduction, more detailed design effort (e.g., hydraulic optimization) is a high priority.
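The present-value comparison underlying such a screening spreadsheet can be sketched in a few lines. This is a minimal illustration, not the demonstration project's actual spreadsheet; the cost figures and discount rate below are hypothetical.

```python
def present_value_cost(upfront, annual, years, rate):
    """Total cost in present-day dollars: up-front cost plus
    annual O&M discounted over the time horizon."""
    pv_annual = sum(annual / (1 + rate) ** t for t in range(1, years + 1))
    return upfront + pv_annual

# Hypothetical comparison: keep the existing system versus modify it
# (e.g., reduce total pumping) at an up-front cost but lower annual cost.
existing = present_value_cost(upfront=0, annual=750_000, years=20, rate=0.05)
modified = present_value_cost(upfront=375_000, annual=590_000, years=20, rate=0.05)
savings = existing - modified
print(f"PV existing: ${existing:,.0f}")
print(f"PV modified: ${modified:,.0f}")
print(f"PV savings:  ${savings:,.0f}")
```

An alternative whose present-value savings are large relative to the uncertainty in these "ballpark" inputs would then be flagged for detailed hydraulic optimization.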
For each of the three sites, many hydraulic optimization problems were formulated and solved. Hydraulic optimization techniques that were demonstrated include: (1) representing plume containment with flow-based constraints; (2) evaluating sensitivity of minimum pumping rate to the number of wells allowed; (3) comparing solutions for "containment only" with solutions incorporating more aggressive core-zone pumping; (4) incorporating reinjection of treated water; (5) representing multi-aquifer wells; and (6) comparing solutions for different target containment zones. The mathematically optimal solutions determined for each alternative were evaluated to suggest a preferred management strategy for each site. For two of the three sites, solutions were suggested that could yield savings of millions of dollars (net present value) over 20 years, relative to the existing systems.

Rob Greenwald, HSI GeoTrans, Two Paragon Way, Freehold, NJ 07728; (732) 409-0344; (732) 409-3020 FAX; rgreenwald@hsigeotrans.com

(7) Long-Term Monitoring and Optimization of an Air Force Pump and Treat Facility
By Philip Hunter, P.G., Air Force Center for Environmental Excellence (AFCEE)

Wurtsmith Air Force Base is located near the town of Oscoda along the west shore of Lake Huron, Iosco County, Michigan. The installation is underlain by homogeneous glacial outwash sands and gravels and is bounded by Lake Van Etten and the Au Sable River. TCE contamination was first detected in water supply wells at the base in 1977, which prompted the design and construction of one of the first air-stripping pump and treat (P&T) facilities in the Air Force. The installation currently operates three groundwater P&T facilities, including the Mission Drive system. The principal contaminants of concern at the Mission facility include TCE and trans-1,2-DCE. The original design of the Mission Drive system consisted of four extraction wells pumping at a total rate of 250 gallons per minute (gpm).
This original design has remained in place for over 10 years. A basewide groundwater flow model was developed using input from various modeling investigations spanning 13 years. Parsons Engineering Science and Utah State University were contracted by AFCEE to use simulation/optimization (S/O) modeling to evaluate pumping strategies for capture and cleanup of contamination. Areas contaminated at concentrations greater than 94 and 230 µg/L for TCE and trans-1,2-DCE, respectively, were the targeted zones of interest for cleanup. These remediation goals were agreed upon with the regulatory community and were specific constraints that guided the investigation. Several optimization scenarios were considered that evaluated the number of extraction wells, well locations, extraction rates, and screened intervals. Preliminary optimizations considered over 200 possible extraction well locations simultaneously, including the existing wells. The optimization investigation concluded that the existing extraction wells were neither located nor screened appropriately to achieve hydraulic containment or to effectively remediate groundwater contamination. The existing extraction wells are scheduled to be abandoned, and new extraction wells with optimized designs will be installed in their place.

AFCEE investigated sampling frequencies for influent, effluent, and intermediate treatment flows for long-term plant operations at all three of the P&T facilities. Time series analysis and a sampling-frequency reduction algorithm developed by Lawrence Livermore National Laboratory (Johnson et al., 1995) indicated that there was technical justification to reduce sampling while still obtaining sufficient information to support the remedial decision process. Analytical and sampling fees were reduced approximately 75% based on this analysis. Other opportunities exist to optimize and, in most cases, reduce groundwater-sampling networks (i.e.,
number and location of sampling points). These networks are separate from the treatment-plant operation but are designed to monitor levels and trends in contamination over the plume areas to verify remedial effectiveness. Geostatistical techniques exist to cost-efficiently reduce spatial redundancy in these monitoring networks so that only the data needed are captured. Finally, investigations from other Air Force installations support the use of cheaper analytical methods [i.e., portable gas chromatographic (GC) techniques in lieu of more expensive fixed-lab analytical procedures] to obtain data of acceptable quality for monitoring and interpreting contaminant levels in plant operations.

Although it is difficult to give specific "cookbook" guidance for each hazardous waste site across a variety of hydrogeologic settings, the Air Force has developed general optimization guidance (AFCEE's Long-term Monitoring Optimization Guidance, 1998) to assist these important and necessary undertakings. AFCEE is also currently developing algorithms and decision support software that will be helpful in the design and analysis of these LTM efforts.

Philip Hunter, P.G., Air Force Center for Environmental Excellence (AFCEE), Consultant Operations Division, Brooks AFB, TX; (210) 536-5281; (210) 536-6003 FAX; philip.hunter@HQAFCEE.brooks.af.mil

(8) The Benefits of Remediation System Audits: Several Case Studies for the Private Sector
By Rusty Norris, ENSR Corporation

Industry is wasting billions of dollars on outdated remediation systems. Many of these systems were installed under more conservative regulatory requirements than exist today and have not been adequately re-evaluated since they were originally installed. Trends in recent years include wide acceptance by regulators of risk-based cleanup goals, natural attenuation as a legitimate remediation technology, and technical impracticability of remediation as a valid argument in certain circumstances.
In addition, the arsenal of proven technologies available for site remediation has expanded. ENSR's remediation system audit (RSA) protocol is a flexible approach that identifies ways to reduce remediation cost and duration. The RSA is a holistic evaluation that considers client objectives; current system performance; trends in groundwater concentrations; cleanup goals; capital and operations and maintenance costs; engineering and consultant costs; monitoring programs; alternative remediation technologies and engineering controls; and potential remediation technology enhancements. Every RSA that ENSR has conducted for the private sector, without exception, has paid for itself many times over in cost savings to our clients. This presentation outlines the RSA protocol and describes the results of several recent RSAs. In a number of cases, it has been possible to turn off pump-and-treat systems and rely on natural attenuation to remediate plumes. Other cost-saving measures have involved developing risk-based cleanup levels, reducing monitoring frequency and parameters, shutting down selected recovery wells, and converting pilot treatment systems to formal corrective action systems. RSAs are inexpensive and quick, generally costing less than $10K and taking several weeks. Savings are typically in the $100Ks.

Rusty Norris, ENSR Corporation, 35 Nagog Park, Acton, MA 01720; (978) 635-9500; (978) 635-9180 FAX

(9) Long-Term Monitoring Optimization at Naval Air Station, Brunswick, ME
By Emil Klawitter and Michael Barry, Naval Air Station, Brunswick, ME, Restoration Team

Naval Air Station (NAS) Brunswick, Maine was placed on the NPL in 1987. Groundwater contamination has been attributed to past solvent disposal activities at an acid/caustic pit, a former fire training area, and a Defense Reutilization and Marketing Office storage area. This contamination is referred to as the "Eastern Plume."
The geology of the eastern portion of NAS Brunswick consists of overburden sand, silt (transition), and clay units overlying a moderately sloping bedrock surface. While the shallow overburden was the unit directly contaminated by solvents, the greatest impact is found in a lower sand zone of the overburden. In 1992 an interim Record of Decision was signed for the installation of the groundwater extraction and treatment system and associated monitoring. The treatment system utilized UV/oxidation as the primary treatment, with subsequent discharge to the sanitary sewer system. Five extraction wells were installed along the eastern perimeter of the Eastern Plume site to provide hydraulic control of the VOC plume and remove dissolved-phase VOCs from groundwater.

The Department of the Navy, Environmental Protection Agency (EPA) Region I, and Maine Department of Environmental Protection (MEDEP), along with Restoration Advisory Board (RAB) members, are integrally involved in optimizing remediation of the Eastern Plume. This effort was broken down into four portions:
• Long-term Monitoring Plan (LTMP)
• Treatment Plant
• Effluent Discharge Options
• Extraction System

The LTMP is almost complete, and the team is currently midway through evaluating the Treatment Plant and Effluent Discharge Options. Review of LTM annual reports on the Eastern Plume showed redundant data. The Navy performed a geostatistical analysis of the monitoring program for the Eastern Plume, which identified a number of data surplus areas and some data gaps. The Navy, EPA, and MEDEP met for three days and reviewed each sampling location. Trends at each well were analyzed and discussed using the Data Quality Objective (DQO) process. Questions regarding the necessity and purpose of the data, as well as what decisions the data would support, were asked for each well. If no reasonable answers could be given for a well, it was eliminated from the LTM program. The same process was applied to proposed additional wells.
If a new well was deemed appropriate using this process, it would be installed and added to the program. Whether meeting attendees agreed with the formal DQO process or considered it to be simply common sense, the result was an improved LTM program at a reduced cost. EA Engineering was tasked with review of the existing treatment plant. Conceptual plant changes in conjunction with different effluent discharge options have identified cost savings that may be realized in less than one year after implementation. A more detailed plan is currently being developed for implementation. To date, the success of the optimization has been attributable to a team approach with a common goal. EPA and MEDEP have been willing to discuss ideas in an open forum and provide creative suggestions prior to the Navy formalizing changes.

Emil E. Klawitter, Naval Facilities Engineering Command, Northern Division, 10 Industrial Hwy, MS 82, Lester, PA 19113; Michael S. Barry, USEPA - Region I, 1 Congress Street, Suite 1100 (HBT), Boston, MA 02114-2023; Claudia B. Sait, Maine Department of Environmental Protection, 17 State House Station, Augusta, ME 04333-0017; Anthony Williams, Naval Air Station Public Works Environmental, 400 Orion Street North, Brunswick, ME 04011-5000

(10) Development of USEPA's SVE Optimization Guide
By Ralph S. Baker, Ph.D., and Daniel M. Groher, P.E., ENSR Corporation, Acton, Massachusetts; James Cummings, USEPA-Technology Innovation Office, Washington, D.C.

Soil vapor extraction (SVE) is among the most commonly utilized in situ technologies for remediation of volatile organic compounds (VOCs) in soils. USEPA-TIO has recognized that there are operating SVE systems that are not performing up to initial expectations and/or have reached an asymptotic level of contaminant removal that does not meet cleanup objectives. These systems have underperformed for a variety of reasons.
SVE may have been applied in marginally suited settings, such as soil having insufficient permeability, soil in which air flow is dominated by preferential flow, and soil that is too moist and/or in which upwelling and excess moisture are not being properly managed. SVE systems have been implemented without considering all appropriate design factors. For example, SVE designs are often based on a radius of pressure influence rather than maintenance of adequate specific discharge (i.e., air velocity) throughout the treatment zone. Thus, vent well spacings may be wider than needed to accomplish efficient mass transfer. Too often, understanding of a given SVE system's performance - and therefore the ability to optimize it - is limited by current approaches to monitoring of system parameters, particularly those relating to subsurface conditions. Guidance is needed regarding best practices now available. Adherence to such practices will increase regulator willingness to approve requests for SVE system closure.

The SVE Optimization Guide will include core "precepts," a succinct presentation of salient principles regarding siting, design, operation, and optimization of SVE systems. A supporting technical document provides background information explaining the precepts. Illustrative case studies are provided, and reference is made to other useful guidance where appropriate. A panel of SVE experts from government and industry, some involved in parallel USAF and USACE initiatives to improve SVE system performance, is providing valuable input. The SVE Optimization Guide is scheduled for release in Fall 1999.

Ralph S. Baker, ENSR Corp., 35 Nagog Park, Acton, MA 01720; (978) 635-9500, (978) 635-9180 FAX; rbaker@ensr.com. Daniel M. Groher, ENSR Corp., 35 Nagog Park, Acton, MA 01720; (978) 635-9500, (978) 635-9180 FAX; dgroher@ensr.com. James Cummings, USEPA-TIO, 1235 Jefferson Davis Hwy., 13th floor, Arlington, VA 22302.
(703) 603-7197, (703) 603-9135 FAX, Cummings.James@epa.gov

(11) Assessment of Soil Venting Performance and Closure
By Dominic C. DiGiulio, US EPA; Varadhan Ravi, Dynamac Corporation

Despite the common use of venting, there is little consistency in approach to the assessment of performance and closure. Assessment strategies vary widely among and within U.S. EPA's ten regional offices and their corresponding states. Assessment of the technology's performance and eventual decisions on closure are typically based on site-specific discussions and negotiations between responsible parties and regulators, which are contingent upon the education and experience of those involved in the process and the availability of information necessary for informed data analysis and decision making. Problems associated with education and experience are evidenced by the widespread use of, reliance on, and emphasis on empirical methods, while data collection is often a contentious process due to costs associated with site characterization and system monitoring. These problems reflect a recent Inspector General's (IG) audit of EPA submitted to Congress, which points out that the agency does not consistently use scientifically based, systematic planning processes to evaluate actions at Superfund hazardous waste sites and that it completes Superfund actions using data of unknown quality and without sufficiently documenting important decision criteria. It is clear that an environmentally protective, flexible, technically achievable, and consistently applied approach to the assessment of venting performance and closure is needed. We believe that any approach to this problem should link groundwater remediation to soil remediation, since the two are interrelated, and should encourage good site characterization, design, and monitoring practices, since mass removal can be limited by poor execution of any of these components.
In response to this need, we have developed a systematic process for assessment of venting performance and closure based on regulatory (U.S. EPA and state) evaluation and approval of five components we believe to be essential: (1) site characterization, (2) design, (3) performance monitoring, (4) rate-limited vapor transport, and (5) mass flux to and from groundwater. These five components form converging lines of evidence. Evaluation is on a pass/fail basis, since this type of critique provides the greatest flexibility in decision making. Failure in evaluation of one or more components results in overall failure. Such a "weight of evidence" approach greatly increases the likelihood of correctly assessing performance and closure. Since the components are interrelated and require continuous evaluation during the life of the project, approval of individual components occurs concurrently, and not until the perceived end of the project; premature approval of a component limits later corrective action. Notice of deficiencies in individual components, however, could occur at any time, since in many cases it will be apparent early on that deficiencies exist. Thus, this approach encourages, and in some cases forces, good site characterization, design, and monitoring practices. If evaluation of all five components individually supports closure, then it is likely that closure is indeed appropriate. If one or more of the five components does not support closure, then it is likely either that closure is inappropriate or that a conscious decision must be made to accept a limiting condition. Regardless of the situation, assessment of these five components enables flexible, organized, and informed decision making. We recently published recommendations for mass flux assessment and are presently attempting to develop guidelines for the other components of our strategy.
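The pass/fail logic described above amounts to a conjunction over the five lines of evidence: closure is supported only if every component passes. A minimal sketch (component names from the abstract; the evaluation results are hypothetical):

```python
# The five components that must each pass before closure is supported,
# per the assessment process described in this abstract.
COMPONENTS = (
    "site characterization",
    "design",
    "performance monitoring",
    "rate-limited vapor transport",
    "mass flux to and from groundwater",
)

def closure_supported(results):
    """Weight-of-evidence check: failure of any one component
    results in overall failure."""
    return all(results[name] for name in COMPONENTS)

# Hypothetical evaluation in which one failing component blocks closure,
# forcing either corrective action or a conscious decision to accept
# a limiting condition.
results = {name: True for name in COMPONENTS}
results["mass flux to and from groundwater"] = False
print(closure_supported(results))
```

The conjunction captures why premature approval of a single component is risky: no one line of evidence can compensate for a deficiency in another.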
Dominic DiGiulio, Ph.D., US EPA, National Risk Management Research Laboratory (NRMRL), Subsurface Protection and Remediation Division (SPRD), Ada, OK 74821-1198; (580) 436-8605; digiulio.dominic@epa.gov (12) In Situ Air Sparging Design Paradigm: Development and Evaluation By Richard Johnson, Ph.D., Oregon Graduate Institute An in situ air sparging (IAS) design paradigm is being developed jointly by ESTCP, ARL, and NFESC and is being tested at a number of DoD sites. There are three basic stages in development of the IAS system design for a site: risk assessment, conceptual model development, and IAS system design. During the risk assessment stage, the potential pathways to receptors are identified and used to determine which portions of the site will be targeted for remediation. During conceptual model development, the areal and vertical extent of the treatment zone is determined; this may include a groundwater plume and/or a source zone. Design of an IAS system initially involves conducting a feasibility test for the site. This is accomplished by installing a single IAS well and several groundwater and vadose zone monitoring points and by conducting a short-duration air injection test. During that test, a number of parameters are monitored, and if there are no "red flags," a standard IAS system can be installed. The standard system includes sparge wells placed on ~15-foot centers throughout the treatment zone and injection of ~20 scfm per well. Under most circumstances, sparging will be pulsed in order to minimize blower requirements, maximize air/water contact, and promote good water movement in the treatment zone. For large treatment zones (>5,000 ft2) or deep water tables (>40 feet), it may be necessary to modify the design to make it practical to implement. To accomplish this, additional site-specific tests can be undertaken to determine if more widely spaced wells are justified.
These tests involve measurement of the air distribution using tracers and, potentially, geophysics. Currently, field activities are being used to refine the paradigm and to develop a robust protocol for implementation of IAS at a wide variety of sites. Richard Johnson, Ph.D., Oregon Graduate Institute, Dept. of Environmental Science & Engineering, P.O. Box 9100, Portland, OR 97291; (503) 690-1193; rjohnson@ese.ogi.edu (13) SVE Optimization Using PneuLog™ at Air Force Installations By Lloyd D. Stewart, Ph.D., PE, PRAXIS Environmental Technologies A new well logging technique known as PneuLog™ has been developed, field-tested, and demonstrated to evaluate and optimize SVE systems. During PneuLog™ logging, an airflow indicator is moved along the length of a well screen during vacuum extraction, and gas concentrations are monitored continuously by sampling through a tube terminated just behind the flow indicator. The flow measurement yields a vertical air permeability profile and identifies preferential flow paths. The concentration measurement identifies the strata from which contaminated air is extracted and yields a profile of the soil contamination. The combination of air flow and contaminant concentration versus depth effectively characterizes each distinct soil interval. The standard approach to SVE operation is to design and build a system based on site characterization data and then run the system at full capacity until clean-up criteria have been met. This approach is often inefficient because the SVE design relies on data collected under static conditions, with little or no data representative of SVE operating conditions. Also, data collected during traditional site investigations are insufficient to estimate the optimal long-term operating parameters of an SVE system. Mass transfer constraints (i.e., diffusion from low-permeability soil to advective zones, evaporation of a nonaqueous phase liquid, adsorption to solids, dissolution in pore water, etc.)
are known to limit the effectiveness of SVE and are not known a priori. As a result, systems are usually overbuilt, inefficient, and expensive to operate. Current SVE optimization procedures are typically empirical because field mass transfer constraints are not quantified. This presentation describes a process for assessing field-level mass transfer constraints through a combination of PneuLog™ and rebound testing. Quantified mass transfer limitations are then used to develop an optimized extraction schedule. The new procedure to optimize SVE systems uses PneuLog™ data from a representative number of wells, coupled with the extracted concentration and flow histories, to produce a scientific basis for SVE optimization. System improvements can include increasing or decreasing flow from wells, shutdown of some extraction wells, and/or installation of new extraction wells. The primary goals for optimization are reduced long-term operational costs and accelerated cleanup. A secondary benefit is a more accurate forecast of the operating time needed for the system to reach clean-up criteria. Lloyd D. Stewart, PhD, PE, PRAXIS Environmental Technologies, Inc., 1440 Rollins Road, Burlingame, California 94010; (650) 548-9288, (650) 548-9287 FAX; Bo@Praxis-Enviro.com (14) Modeling Performance of Soil Vapor Extraction Systems By Zhenhua Jiang, Ph.D., Argonne National Laboratory ABSTRACT NOT AVAILABLE (15) Long-Term Performance Monitoring of a Permeable Reactive Barrier to Remediate Contaminated Groundwater By Robert W. Puls, Ph.D., National Risk Management Research Laboratory, USEPA Permeable reactive barriers (PRBs) are an emerging, alternative in situ approach for remediating contaminated groundwater that combines subsurface fluid flow management with a passive chemical treatment zone.
In the last few years, extensive research has been conducted to improve our understanding of the mechanisms and kinetics of the transformation reactions responsible for the removal of contaminants from the aqueous phase in such in situ treatment systems. The few pilot and commercial installations that have been implemented have shown that passive permeable reactive barriers can be a cost-effective and efficient approach to remediating a variety of different compounds. However, in all of the pilot and commercial installations to date, very little data collection or research has focused on the long-term performance of these in situ systems, particularly with respect to the build-up of surface precipitates or biofouling. A detailed analysis of the rate of surface precipitate buildup in these types of passive in situ systems is critical to understanding how long they will remain effective. Different types of minerals and surface coatings have been observed to form under different geochemical conditions, which are dictated by the composition of the permeable reaction zone and the aquifer chemistry. The impacts of microbiological activity are also important to understand in order to better predict how long these systems will remain effective in the subsurface. The presence of a large reservoir of iron, together with favorable pH and substrate availability, may favor the activity of iron- and sulfate-reducing bacteria and methanogens. This enhanced activity may favorably influence zero-valent iron reductive dehalogenation reactions through favorable impacts on the iron surface or through direct microbial transformation of the target compounds. However, this enhancement may come at the expense of faster corrosion, leading to faster precipitate buildup and potential biofouling of the permeable treatment zone.
Using advanced surface analytical techniques together with collection of detailed water sampling data from a site near Elizabeth City, North Carolina, the specific objectives of this research project are to: 1) characterize the type and nature of surface precipitates forming over time at the upgradient aquifer/iron interface, within the iron zone, and downgradient; 2) identify the type and extent of microbiological activity upgradient, within, and downgradient of at least one of the chosen sites to evaluate the microbiological response to, or effects of, iron emplaced in an aquifer system; 3) develop a priori testing requirements that predict the longevity of a PRB; and 4) develop practical and cost-effective LTM protocols that minimize O&M costs. Work to date has focused on analysis of trends in groundwater quality parameters, differences in observed contaminant concentrations using different levels of sampling intensity, surface analysis of iron recovered from the wall over a period of two years, and assessment of microbiological activity within the wall and at the upgradient and downgradient interfaces. The latter utilizes phospholipid fatty acid (PLFA) methods to determine cell counts and fatty acid profiles. Surface analyses include total organic carbon, total inorganic carbon, scanning electron microscopy with energy-dispersive x-ray analysis, and x-ray photoelectron analysis. Preliminary data on these subjects will be shared in the presentation. This is an abstract of a proposed presentation and does not necessarily reflect EPA policy. Robert W. Puls, Ph.D., National Risk Management Research Laboratory, USEPA, P.O. Box 1198, Ada, Oklahoma, 74820; (580) 436-8543, FAX (580) 436-8703; puls.robert@epa.gov (16) In Situ Remediation of VOC-Contaminated Groundwater Using Zero-Valent Iron: Long-Term Performance By R.W.
Gillham, Department of Earth Sciences, University of Waterloo Since its introduction in 1992, the use of granular iron for in situ cleanup of groundwater containing chlorinated organic chemicals has gained substantial recognition as a cost-effective remediation alternative. Implementation generally involves the emplacement of a permeable granular iron wall across the path of a contaminant plume. Because capital costs can be high, the advantage relies on the expectation of low operating and maintenance costs over long periods of time. It can readily be shown that consumption of the iron by chemical processes is not a significant factor. However, long-term performance may be influenced by precipitates that form as a consequence of the dechlorination reactions or of pH changes in the reactive material. This paper reviews the processes that have been identified as potential sources of precipitates and the evidence concerning the effect these may have on long-term performance. R.W. Gillham, Department of Earth Sciences, University of Waterloo, Waterloo, Ontario, Canada N2L 3G1; rwgillha@sciborg.uwaterloo.ca (17) DOE/DOD/EPA Collaborative Research on the Long-Term Performance of Permeable Reactive Barriers (PRBs) By Nic Korte, Oak Ridge National Laboratory Liyuan Liang, Oak Ridge National Laboratory Through collaboration among the Department of Energy (DOE), Department of Defense (DOD), the Environmental Protection Agency (EPA), academia, and industry, this project will evaluate and maximize the effectiveness of PRBs. At present, there is no consensus regarding the appropriate amount and type of monitoring; consequently, data collected at one site are not readily comparable to those from another. One example concerns hydraulic measurements: a heat-pulse flow meter was evaluated at the Navy's Moffett Field site, but the measured velocities were lower than expected.
In contrast, the DOE-developed colloidal borescope tends to yield high results, although the instrument provides both relative groundwater flow velocities and direct observations of colloid behavior. This project, therefore, will provide selection criteria and operational procedures for downhole flow meters. Relevant data have already been obtained at DOE's Y-12 and Kansas City Plants. ORNL is also investigating whether single-well tests are applicable at locations as permeable as PRBs. While hydraulic measurements are relevant to all locations, the various agencies also have different interests. For example, at Moffett Field it was suggested that colloid loss from the barrier was beneficial because retention in the barrier could lead to clogging. From a DOE perspective, however, colloid release would be a concern because of the potential facilitated transport of uranium, technetium, or other radionuclides. Thus, colloid behavior in and around barriers must be assessed both to predict longevity and to determine the applicability of PRBs to problems specific to DOE. The expected final results of this collaborative effort are (1) development of a priori testing requirements that predict the longevity of a PRB, (2) development of monitoring methods that will provide an early warning of incipient barrier failure, and (3) development of LTM protocols that minimize operation and maintenance costs. Nic Korte, Group Leader, Oak Ridge National Laboratory, 2597 B 3/4 Road, Grand Junction, CO 81503; (970) 248-6210, FAX: (970) 248-6147; kortene@ornl.gov (18) Performance Monitoring of the Permeable Reactive Barrier at the Somersworth, NH, Landfill Superfund Site By Timothy Sivavec, Ph.D., General Electric Performance monitoring of a pilot-scale permeable reactive barrier (PRB) at the Somersworth, NH site was conducted over an 18-month period to provide data that could improve full-scale PRB design at the landfill site.
Assessment of PRB performance was based on the results of (1) pumping tests to provide information on the distribution of hydraulic conductivity near the iron zone, (2) VOC monitoring upgradient, within, and downgradient of the PRB, (3) monitoring of groundwater parameters (pH, DO, ORP, SC) and inorganic parameters (including metals, major ions, and nutrients), (4) microbial characterization of soil and iron, and (5) surface characterization of cored iron material. A funnel-and-gate barrier system was installed at this site in December 1996 using an eight-foot-diameter caisson to form a conical-shaped reactive iron zone. Pea gravel was added upgradient and downgradient of the iron, providing an iron flow-through thickness of 4 feet. Low-permeability funnels were installed on each side of the gate using a four-foot-diameter auger and a soil-bentonite mixture. The groundwater contaminants at this site include trichloroethene (TCE), tetrachloroethene (PCE), dichloroethene (DCE) isomers, and vinyl chloride (VC). Over the 18-month study, VOC reductions of 50% were observed between the upgradient aquifer and the entrance to the PRB; a number of factors point to extensive biodegradation occurring in the upgradient aquifer. VOCs were reduced to non-detect levels at the first monitoring point in the PRB, approximately 14 inches downgradient from the entrance to the barrier. Reductions in bicarbonate, calcium, magnesium, iron, manganese, and sulfate were also observed within the iron zone. As groundwater exited the iron zone, the concentrations of these constituents increased to near their natural levels due to mixing with unaffected groundwater and the natural buffering effects of the aquifer materials. One concern for long-term effectiveness is loss of porosity from mineral precipitation within the iron zone, which can be quantified from groundwater inorganic profiles.
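The porosity-loss estimate from inorganic profiles amounts to a simple mass balance, which can be sketched as follows. The molar volume is a literature value for calcite; every other number in the example is a hypothetical placeholder, not Somersworth data:

```python
# Back-of-the-envelope mass balance for porosity loss from mineral
# precipitation, of the kind derivable from upgradient/downgradient
# inorganic profiles. All example inputs are hypothetical.

CALCITE_MOLAR_VOLUME = 36.9e-6   # m^3/mol for CaCO3 (literature value)

def porosity_loss(darcy_flux, c_in, c_out, years, zone_thickness,
                  molar_volume=CALCITE_MOLAR_VOLUME):
    """Fraction of bulk volume filled by precipitate after `years`.

    darcy_flux     -- specific discharge through the wall (m/yr)
    c_in, c_out    -- dissolved constituent up- and downgradient (mol/m^3)
    zone_thickness -- thickness of iron over which precipitation concentrates (m)
    """
    moles_per_area = darcy_flux * years * (c_in - c_out)  # mol deposited per m^2
    return moles_per_area * molar_volume / zone_thickness
```

For instance, a flux of 30 m/yr removing 2 mol/m3 of calcium over two years, concentrated in the first 0.15 m of iron, gives `porosity_loss(30.0, 2.0, 0.0, 2.0, 0.15)` of about 0.03, i.e. a few percent, the same order of magnitude as interface measurements of this kind.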
Bulk digestion analysis and X-ray photoelectron spectroscopy (XPS) of granular iron from cores were used to quantify calcium carbonate and ferrous carbonate precipitates. As observed in other 100% iron PRBs, mineral precipitation was highest at the upgradient iron/pea gravel interface and decreased to background levels within the first 6 inches of iron. A porosity loss of approximately 3% was measured at the interface, which is less than predicted by laboratory and pilot-scale column studies. Another concern for long-term performance is biofouling. During the pilot study at this site, microbial growth within the barrier was no greater than that observed in the surrounding aquifer. Sulfate-reducing bacteria were also identified within the iron zone. Future efforts at this site will focus on further developing cost-effective, long-term monitoring techniques and protocols to minimize operation and maintenance costs and to provide early warning of barrier failure. In particular, our proposed approach uses multi-parameter groundwater monitoring probes, dedicated to individual wells, to correlate groundwater parameters such as dissolved oxygen, pH, groundwater elevation, and conductivity with changes in barrier performance. Timothy Sivavec, Ph.D., Research Chemist, GE Corporate Research & Development, Building K1, Rm. 5A45, One Research Circle, Niskayuna, NY 12309; (518) 387-7677, FAX (518) 387-5592; sivavec@crd.ge.com (19) Performance Evaluations at the Moffett Field and Department of Defense Permeable Barrier Sites By Charles Reeter, Naval Facilities Engineering Service Center Arun Gavaskar, Neeraj Gupta, Bruce Sass, Battelle Memorial Institute A pilot-scale permeable reactive barrier (PRB), or treatment wall, demonstration project was initiated by the US Navy EFA West at the former Naval Air Station Moffett Field site in Mountain View, California about 3 years ago.
Performance evaluations and cost-benefit analyses were performed by the US Naval Facilities Engineering Service Center (NFESC) and were sponsored by the Department of Defense (DoD) Environmental Security Technology Certification Program (ESTCP). The Moffett Field PRB uses a funnel-and-gate design, in which the funnel is made of interlocking steel sheet piles and the gate consists of a reactive cell filled with zero-valent granular iron. Since its construction in April 1996, groundwater monitoring has been conducted on a quarterly basis to demonstrate the effectiveness of the barrier technology in capturing and remediating groundwater containing dissolved chlorinated hydrocarbon compounds. The primary contaminants of concern at Moffett Field in the vicinity of the PRB are trichloroethene (TCE), cis-1,2-dichloroethene (cDCE), and perchloroethene (PCE), at upgradient concentrations of about 2900 micrograms per liter (ug/L), 280 ug/L, and 26 ug/L, respectively. Quarterly monitoring events included water level measurements, field parameter testing, and groundwater sampling at about 75 monitoring points. Two tracer tests using bromide solutions and flow meter testing were also completed in April and August 1997 at the site. Iron cell coring samples were collected and analyzed in December 1997 for use as indicators of reactivity and longevity. Data from the quarterly monitoring, tracer testing, and iron cell coring have been used to determine overall barrier performance. Since the first sampling event in June 1996, concentrations of all chlorinated compounds have been reduced either to non-detect (ND) or to below the drinking water maximum contaminant levels (MCLs) within the first 2-3 feet of the permeable iron cell (gate). The iron cell coring analyses and geochemical modeling from Moffett Field indicated that changes in the inorganic chemistry were caused by precipitation of calcite, carbonates, iron-sulfide, and hydroxide compounds.
Chemical precipitates are a concern because of the potential loss of reactivity and permeability in the iron cell. In general, long-term performance and life expectancies at PRB sites are unknown. The DoD ESTCP, Environmental Protection Agency, and Department of Energy are sponsoring additional performance and longevity evaluations at multiple PRB sites across the country. This is being accomplished in partnership with the RTDF PRB Action Team in an effort to gain widespread regulatory acceptance and remedial project manager confidence in using the reactive barrier technology. Charles Reeter, Naval Facilities Engineering Service Center, 1100 23rd Avenue, Code 411, Port Hueneme, California 93043; Tel: (805) 982-4991; reetercv@nfesc.navy.mil (20) Air Force LTM Methodology and Approach for Monitored Natural Attenuation By Patrick Haas, US Air Force Center for Environmental Excellence ABSTRACT NOT AVAILABLE (21) Evaluating the Natural Attenuation of Transient-Source Compounds in Groundwater at the KL Avenue Landfill Site By Varadhan Ravi, Ph.D., Dynamac Corporation Jin-Song Chen, Ph.D., Dynamac Corporation John T. Wilson, U.S. Environmental Protection Agency Natural attenuation, when adequately monitored and properly evaluated, can be both environmentally and fiscally responsible. It has, in recent years, gained acceptance as a viable remedy for groundwater that has been contaminated with fuel hydrocarbons and industrial solvents. While these are the most commonly encountered pollutants in groundwater, leachate pollution plumes from old landfills or landfills with inadequate leachate collection systems are also prevalent. However, significant attention has yet to be given to studying the potential of natural attenuation as an effective remedy for landfill leachate plumes.
Concentrations of waste mixtures emanating from landfills (municipal, commercial, and mixed industrial waste) are generally more variable in space and time than concentrations of contaminants from sources such as fuel tanks or industrial disposal facilities. Consequently, evaluating the natural attenuation of contaminants from landfill leachates is more difficult. Evaluations of natural attenuation are typically based on a deterministic framework. They involve the use of numerical or analytical models of flow and transport to estimate the rates of various attenuation processes based on calibration to observed field data. These modeling efforts almost always neglect the spatial and temporal variability of contaminant concentrations caused by factors other than the processes explicitly considered by the model. Such variability can be captured naturally in a stochastic framework. More importantly, the stochastic approach yields a quantitative measure of the strength of evidence provided by the observed data for different measures of the effectiveness of natural attenuation. A hybrid framework, combining deterministic fate and transport modeling with stochastic methods, will be presented to examine the impact of temporal variability of the contaminant source on the rate of natural attenuation. This approach will be illustrated using data from the KL Avenue Landfill site, Kalamazoo, Michigan. A one-dimensional analytical solute transport model based on the superposition principle was used to describe the migration of contaminants downgradient of a time-varying source. The stochastic component of the contaminant source was determined after removing systematic trends from the historic time series data. Statistical tests were employed to verify the assumptions made regarding the nature of the trend and stochastic components. Monte Carlo techniques were used to generate synthetic time-varying sources based on the properties of the stochastic component.
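The structure of this hybrid approach, a synthetic time-varying source (trend plus Monte Carlo residuals) propagated downgradient by superposition, can be sketched as follows. Dispersion is neglected and all values (trend, residual spread, velocity, rate constant) are hypothetical, so this illustrates the machinery, not the KL Avenue analysis itself:

```python
# Structural sketch of the hybrid deterministic/stochastic framework:
# generate a synthetic source and superpose its time-lagged, attenuated
# arrivals at a downgradient well. Every number here is hypothetical.
import math
import random

def synthetic_source(trend, resid_std, rng):
    """One Monte Carlo realization: trend component plus random residuals."""
    return [max(0.0, c + rng.gauss(0.0, resid_std)) for c in trend]

def downgradient(series, x, velocity, k, dt=1.0):
    """Each source value arrives at distance x after a travel time x/velocity,
    attenuated by a first-order factor exp(-k * x / velocity)."""
    lag = int(round(x / velocity / dt))
    atten = math.exp(-k * x / velocity)
    return [0.0] * lag + [c * atten for c in series[:len(series) - lag]]

rng = random.Random(0)
trend = [100.0 - 2.0 * t for t in range(40)]             # declining source, mg/L
source = synthetic_source(trend, resid_std=5.0, rng=rng)
observed = downgradient(source, x=50.0, velocity=10.0, k=0.1)
```

Repeating this over many realizations gives the distribution of downgradient concentrations implied by the stochastic source, against which field observations can be compared.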
Basic transport properties such as groundwater velocity and the hydrodynamic dispersion coefficient were estimated by calibrating the transport model to the measured chloride data. The results of the simulations indicated that the measured decrease of contaminant concentrations at downgradient wells cannot be explained by physical processes alone. First-order attenuation rate constants were estimated by nonlinear regression. Numerical experiments indicated that the rate constant estimate depends on the nature of the source, the location of the observation wells, and the time of observation. It was also observed that the impact of stochastic variability of the source diminishes with travel distance from the source. DISCLAIMER This is an abstract of a proposed presentation and does not necessarily reflect EPA policy. Varadhan Ravi, Ph.D., Dynamac Corporation, Ada, OK 74820; (580) 436-6405; ravi@chickasaw.com (22) Monitored Natural Attenuation of Explosives at Louisiana Army Ammunition Plant By Judith Pennington, Ph.D., US Army Engineer Waterways Experiment Station Waste disposal practices associated with the manufacturing and the loading, assembling, and packaging of explosives during and before World War II and the Korean Conflict have resulted in contamination of soil and groundwater with explosives, predominantly 2,4,6-trinitrotoluene (TNT) and hexahydro-1,3,5-trinitro-1,3,5-triazine (RDX). Several processes with the potential to attenuate TNT and RDX have been identified, including biotic and abiotic transformation, microbial degradation, and immobilization by chemical reactions between contaminants and organic matter or clays. The objective of this project was to demonstrate natural attenuation of explosives at an Army site. Groundwater monitoring procedures were optimized to generate reliable trends in explosives concentrations over time. Batch and column partitioning studies were used to evaluate the significance of site capacity on the ultimate fate and transport of the explosives.
Both biomarker and stable isotope techniques were investigated for use as monitoring tools. The Department of Defense groundwater modeling system was used for contaminant plume definition and prediction of future contaminant extent. The field demonstration was conducted at the Louisiana Army Ammunition Plant. The demonstration included groundwater monitoring, modeling, and a cone penetrometry sampling event to characterize site lithology and to obtain sample material for other parts of the study. Results demonstrated declining concentrations of explosives in groundwater over the two-year monitoring period. Contaminant mass declined, and the groundwater model predicted a shrinking plume in a 20-year simulation. Biomarkers demonstrated the microbial degradation potential of RDX and TNT in aquifer soils and provided an estimate of degradation rates. Stable isotope ratios of nitrogen in TNT extracted from groundwater were a promising indicator of attenuation. Results demonstrated that natural attenuation is a viable option that should be among those considered for remediation of explosives-contaminated sites. Judith C. Pennington, PhD, U.S. Army Engineer Waterways Experiment Station, 3909 Halls Ferry Road, Vicksburg, MS 39180-6199; phone (601) 634-2802; e-mail penninj@exl.wes.army.mil (23) EPA Guidance for Long-Term Monitoring of Natural Attenuation By Herb Levine, US EPA EPA is currently developing guidance for designing an LTM program for natural attenuation remedies. The scope of the guidance is limited to chlorinated solvents and BTEX in groundwater. The fundamental objectives of performance monitoring for natural attenuation are to determine whether natural attenuation is performing as expected and whether anything is happening that would change the outlook for natural attenuation or change the approach to monitoring it.
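As a generic illustration of the kind of effectiveness check such a monitoring program supports (a common textbook method, not the specific procedure prescribed by the guidance under development), a first-order attenuation rate can be estimated by regressing log-concentration against time:

```python
# Estimate a first-order attenuation rate from a monitoring-well time series
# by least-squares regression of ln(C) on t. For first-order decay,
# ln C = ln C0 - k t, so the rate k is the negative of the fitted slope.
import math

def first_order_rate(times, concs):
    """Least-squares slope of ln(C) versus t, returned as the rate k > 0
    for a declining series (negative for an increasing one)."""
    logs = [math.log(c) for c in concs]
    n = len(times)
    t_mean = sum(times) / n
    y_mean = sum(logs) / n
    s_ty = sum((t - t_mean) * (y - y_mean) for t, y in zip(times, logs))
    s_tt = sum((t - t_mean) ** 2 for t in times)
    return -s_ty / s_tt
```

A series decaying exactly as C0·exp(-0.5·t) returns k = 0.5; with real data, the scatter of the residuals indicates how firmly the trend supports the expected attenuation.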
The document will provide guidance on developing an LTM plan, methods for evaluating the effectiveness of natural attenuation processes, and data interpretation and presentation. Herb Levine, US EPA, 75 Hawthorne Street (MC SFD-8B), San Francisco, CA 94105; (415) 774-2312; levine.herb@epamail.epa.gov (24) Phytoremediation Performance Monitoring and Optimization: Hydrological and Geochemical Assessments By Scott W. Beckman, Ph.D., Science Applications International Corporation Steven Rock, US EPA NRMRL-Cin Sandra Eberts, US Geological Survey Greg Harvey, Air Force Phytoremediation is a low-cost, low-maintenance remedial technology applicable to the in situ treatment of organics and metals. Phytoremediation of organics in groundwater usually involves the use of phreatophytic tree species that can penetrate the saturated zone, extract groundwater, and geochemically alter the sub-surface environment to enhance degradative processes. The establishment of a tree-based phytoremediation system may take several years before maximum performance is achieved. Therefore, it is important to collect site-specific information in order to optimize the system design as well as track the progress of the evolving remediation system. A multi-agency, multidisciplinary phytoremediation study was performed at Carswell Air Force Base in Fort Worth, Texas to evaluate the performance of purposefully planted poplars in reducing the mass of trichloroethylene (TCE) migrating through the site. Approximately 660 trees were planted in April 1996 and studied for three growing seasons to investigate processes that contribute to the reduction of contaminants. Two mechanisms were identified as contributing to the reduction of contaminants by the planted system: (1) hydraulic influence and its associated uptake and metabolism of contaminants, and (2) sub-surface modification of the geochemical environment to promote anaerobic conditions and reductive dechlorination.
This presentation will discuss critical data collection and monitoring requirements that are essential for implementing and evaluating the long-term performance of the evolving remediation system. Optimization requirements include: (1) a thorough understanding of the aquifer characteristics (physical and chemical) to determine the number of trees required, their placement at the site, and the configuration of the plantation(s); (2) an understanding of the climatic conditions of the site (rainfall, growing season); and (3) knowledge of the soil characteristics to determine fertilization and irrigation requirements. Long-term monitoring requirements include: (1) evaluating hydrological changes over time, (2) monitoring tree growth and estimating water consumption by the trees, and (3) monitoring sub-surface geochemical and microbiological changes to determine the onset and magnitude of reductive processes. Results from the Carswell study will be used to illustrate these requirements. Scott W. Beckman, Ph.D., Science Applications International Corporation, 411 Hackensack Ave., 3rd Floor, Hackensack, NJ 07601; (201) 498-7340, FAX (201) 489-1592; Scott.W.Beckman@cpmx.saic.com (25) Optimization, Uncertainty, and Risk Analysis of Design and Operation of In Situ Bioremediation By Christine A. Shoemaker, Cornell University Designing and operating a remediation system is made more difficult because data limitations result in uncertainty in the correct values of aquifer characteristics, initial concentrations of the contaminant, and other information that determines the values of model parameters. This paper will describe a computationally efficient method for sensitivity analysis that gives the effects on model predictions of the effectiveness of bioremediation for either individual or combined errors in parameter values.
This sensitivity analysis can be used to quantify the uncertainty associated with possible errors in parameters, thereby giving reasonable estimates of the range of time that a cleanup may require. It will be shown that consideration of combinations of errors leads to considerably larger estimates of possible delays in remediation time and other model predictions. The procedure proposed for improving the computational time of combined sensitivity analysis is essential to completing a thorough combined sensitivity analysis in a feasible amount of computer time. These sensitivity results will be presented for a simple aerobic bioremediation system and for a more complex anaerobic system for dechlorination of PCE that includes competition for hydrogen among competing bacteria. Optimization algorithms can be coupled to groundwater fate and transport models to identify the most cost-effective design for remediation by selecting the locations of wells, the rates of pumping, and (for in situ bioremediation) the rates of nutrients to be injected. Sensitivity results for optimization of bioremediation will also be presented. The use of optimization methods can reduce remediation costs substantially below the costs that would be incurred using sensible designs based on good engineering judgment or on trial-and-error simulation, because there are literally millions of possible combinations of well locations and pumping rates. A systematic search by an optimization method can find significantly less expensive policies. The cost of remediation is so high that the cost of extra computing for optimization and sensitivity analysis is insignificant given the potential savings in construction and operating costs. Christine A.
Shoemaker, Ph.D., Professor of Civil and Environmental Engineering, Cornell University, Ithaca, NY 14850; (607) 255-9233; (607) 255-9004 FAX; cas12@cornell.edu (26) Monitoring Performance of an Enhanced In Situ Bioremediation Field Evaluation By Kent S. Sorenson, Jr., Lockheed Martin Idaho Technologies Company, Idaho Falls, ID, USA Heidi Bullock, Parsons Infrastructure and Technology Group, Inc., Idaho Falls, ID, USA Jennifer P. Martin, Lockheed Martin Idaho Technologies Company, Idaho Falls, ID, USA Lance N. Peterson, Lockheed Martin Idaho Technologies Company, Idaho Falls, ID, USA The Test Area North (TAN) Facility of the Idaho National Engineering and Environmental Laboratory (INEEL) is the site of a nearly 3 km long trichloroethene (TCE) plume resulting from the injection of low-level radioactive waste, industrial wastewater (including dissolved and possibly pure-phase organic liquids), and sanitary sewage during the 1960s. The presence of TCE degradation products such as cis-dichloroethene, vinyl chloride, ethene, and ethane downgradient of the hotspot suggests that reductive dechlorination is occurring naturally within the plume. The primary goal of the present study is to determine whether this natural process can be enhanced by the addition of a carbon substrate, sodium lactate, to the hotspot area. Monitoring the performance of a large-scale in situ field test in a deep aquifer must meet a variety of technical objectives while considering certain constraints. The technical requirements for the enhanced bioremediation field evaluation at TAN include monitoring aquifer hydraulic properties, geochemical trends, contaminant and degradation product distributions, and electron donor distributions. Among the constraints are a limited sampling and analysis budget, limited ability to handle large volumes of hazardous purge water and other wastes, and a tight schedule driven by regulatory milestones.
In order to satisfy the technical objectives subject to the constraints mentioned, an integrated monitoring strategy has been employed at TAN utilizing: 1) dedicated low-flow pumps, 2) flow-through cell measurements, 3) a combination of field and fixed laboratory analysis, 4) in situ monitoring equipment, 5) near real-time data analysis for optimization of sampling frequencies, and 6) waste minimization. The system design consists of a 500-foot long treatment cell created by an injection well (10 gpm), an extraction well (50 gpm), and 11 monitoring wells. The extracted water is treated by a portable air stripper and reinjected (50 gpm) downgradient of the treatment cell. Dedicated low-flow pumps are placed in each monitoring well, minimizing the volume of purge water collected and the time required to sample each well. All well purge water is collected and discharged to the portable air stripper for treatment and reinjection. Mobile sampling trailers, equipped with a generator, control box, sample manifold, purge tank, and cooler, are used to collect samples from each monitoring well. A sample manifold equipped with a flow-through cell, sample port, and discharge line allows for the collection of the following purge parameters: pH, dissolved oxygen, oxidation-reduction potential, conductivity, and temperature. A laboratory trailer is located on site for near real-time data collection and analysis. The field evaluation is being conducted in two phases. Phase I consisted of a bromide tracer test during which background geochemical data were also collected. The tracer test was conducted to evaluate the pumping and reinjection scheme for the field evaluation. The Phase I sampling frequency was based on the expected breakthrough time for each well. In situ bromide probes were used in two of the monitoring wells. The performance of the probes was compared with results obtained from groundwater samples analyzed by ion-specific electrode.
Phase II consists of the weekly injection of sodium lactate and the biweekly monitoring of the treatment cell to determine whether bioremediation is enhanced. Chloroethenes, ethene, ethane, methane, lactate, acetate, propionate, and butyrate samples are collected and analyzed off-site. Alkalinity, carbon dioxide, chloride, ferrous iron, ammonia, nitrate, phosphate, and sulfate samples are measured in the laboratory trailer using titrimetric and colorimetric methods. Waste generated by the field test kits is segregated and disposed of based on Resource Conservation and Recovery Act requirements. Kent S. Sorenson, Jr., Geomicrobiology Group, INEEL, Lockheed Martin Idaho Technologies Co., MS 3953, PO Box 1625, Idaho Falls, ID 83415-3953; (208) 526-9597; (208) 526-9473 FAX; sorenks@inel.gov (27) Process Optimization of Remedial Systems By Sarabjit Singh, P.E., URS Greiner Woodward Clyde Adam T. Harvey, P.E., URS Greiner Woodward Clyde OBJECTIVES The primary objective of remedial system optimization is to accomplish the following: • Improve system reliability and maximize equipment uptime • Optimize system and process parameters to maximize remedial effectiveness • Minimize generation of process-generated waste, and • Optimize sampling type and frequency These objectives are intended to facilitate accomplishment of remedial goals in the shortest timeframe possible, with maximum equipment utilization, thus minimizing costs. These objectives are achieved within the framework of maintaining regulatory compliance and ensuring all health and safety requirements are met. The following optimization strategies have been successfully employed at McClellan AFB. System Reliability Improvement At McClellan AFB, process parameter optimization and design modifications were implemented, including process operating range monitoring, heat and material balance monitoring, control logic evaluation, chemical dosage evaluation, and periodic vibration analysis.
Results were used to ensure safe operation and refine operating ranges, allowing increased process efficiency. Additionally, site-specific process log sheets, shutdown logs, preventative maintenance logs, critical spare parts lists, and O&M manuals were prepared and implemented. By using experienced and well-trained operators and developing and implementing stringent preventative maintenance schedules, equipment uptime in excess of 90 percent has been maintained at all SVE sites, resulting in timely cleanup and O&M cost savings. Optimization of Process Parameters During the shake-down phase (the approximately one-month period of initial operations following installation), URSGWC optimized system and process parameters to maximize extraction and treatment efficiency. Specific parameters included extraction rates, on-line extraction wells, process setpoints, and warning and shutdown setpoints. Minimization of Process Generated Waste Through engineering calculations, URSGWC estimated the de minimis chemical feed rates. These techniques reduced chemical use and provided better pH control of scrubber blowdown. Further, evaluation of optimal fluidization velocities within the catalytic oxidizers minimized catalyst/particulate carryover from the oxidizer, thereby reducing risks to human health and the environment. Material selection can play a large role in minimizing generated waste and disposal costs. Optimization of Sampling Type and Frequency Analytical methods were continually refined and compared to reduce the need for analyses without loss of required data. An onsite laboratory was utilized for routine soil gas analyses to allow rapid analysis of key samples, minimizing hold times, re-sampling costs, and sample shipment costs.
Efforts were made with the client to optimize the frequency of process, monitoring, and waste characterization sampling requirements. Sarabjit Singh, P.E., URS Greiner Woodward Clyde Federal Services, 2520 Venture Oaks Way, Suite 250, Sacramento, California 95833; (916) 929-2346; FAX (916) 929-7263 (28) Optimization at the Milan Army Ammunition Plant Operable Unit One Treatment of Explosives-Contaminated Groundwater By Lindsey K. Lien, P.E., US Army Corps of Engineers Chris J. Riley, P.E., ICF Kaiser Engineers The groundwater treatment system for Operable Unit One at Milan Army Ammunition Plant, Milan, Tennessee consists of extraction wells, electrochemical precipitation, ultraviolet-oxidation (UV-oxidation) treatment, carbon adsorption, and reinjection wells. During the initial operating period from November 1995 to October 1997, performance of all units was assessed, and improvements were made as appropriate. Operation of the extraction system has been largely reliable. Experience with the electrochemical cells has shown that the unit requires continuing attention, and that effluent standards for metals are presently being attained without the need for the unit, which was eventually taken offline. The remainder of the metals removal unit has functioned reliably after initial problems with the solids separation system were solved. The UV-oxidation unit has functioned reliably, and attains design removal efficiency at lower than design ozone diffusion rates; this is because influent explosives levels are presently lower than the design basis concentrations. The carbon adsorption unit has consistently attained effluent standards for explosives, but has exhibited high bed pressure drops. This was due to reduction in carbon particle size caused by the frequent backwash cycles employed during initial startup, and solids carryover from the metals removal unit. The solids carryover also resulted in reduced capacity of the reinjection wells.
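Isotherm-based carbon usage estimates of the kind used in this economic evaluation can be sketched at a screening level. The Freundlich parameters, flow rate, and concentrations below are illustrative placeholders, not Milan design values:

```python
def gac_usage_kg_per_day(flow_m3_d, c_in_mg_l, c_out_mg_l, k_f, inv_n):
    """Estimate granular activated carbon consumption from a Freundlich
    isotherm, assuming the bed is exhausted at equilibrium with the
    influent concentration (a common screening-level assumption)."""
    q_e = k_f * c_in_mg_l ** inv_n                    # loading, mg contaminant/g carbon
    mass_g_d = flow_m3_d * (c_in_mg_l - c_out_mg_l)   # mass removed, g/day
    return mass_g_d / q_e                             # kg carbon/day (mg/g == g/kg)

# Illustrative comparison: usage at 8 mg/L vs 5 mg/L influent explosives
high = gac_usage_kg_per_day(500.0, 8.0, 0.05, k_f=50.0, inv_n=0.5)
low = gac_usage_kg_per_day(500.0, 5.0, 0.05, k_f=50.0, inv_n=0.5)
```

Because the Freundlich loading grows sublinearly with concentration while the removed mass grows linearly, carbon usage per day rises with influent concentration, which is why GAC-alone economics improve as the plume concentration declines.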
The economics of the treatment train were evaluated, and the results indicated that at an explosives influent concentration of > 8 mg/L, the operating cost to attain effluent standards is similar regardless of the operating combination of the two units. This conclusion is based on estimated carbon usage from isotherms, and the assumption that carbon cannot be reactivated and re-used. As the explosives influent concentration has decreased to less than 5 mg/L, further evaluation of the treatment train has revealed that the GAC system operating alone results in the lowest-cost treatment system. Lindsey Lien, P.E., US Army Corps of Engineers, 12565 W. Center Road, Omaha, NE 68144; (402) 697-2580; lindsey.k.lien@usace.army.mil (29) Facility Performance Audits: Bang for the Buck$ By Ted H. Streckfuss, P.E., U.S. Army Corps of Engineers Over the past decade, the Army Corps of Engineers has aggressively addressed environmental contamination problems for many customers across the country. Significant capital has been invested to construct ground-water and surface water treatment facilities to address the residual contamination at these sites. Many of these facilities require a long-term commitment in order to ensure successful site remediation. Oftentimes, a disconnect occurs within these projects during the transition from the construction/startup phase to the long-term operation, maintenance, and facility monitoring phase. This disconnect can lead to inefficient operation of these facilities. Another crucial timeframe within a treatment facility's life occurs at the "five-year review," typically required by EPA's CERCLA program. Maintaining efficient operation is a significant issue when recognizing that system operating costs can exceed millions of dollars per year. It is imperative that the operation of these treatment facilities be closely monitored to ensure that the process is optimized to minimize the cost per thousand gallons of treated water.
One way to accomplish this goal is through the periodic performance of a treatment facility performance audit. The Omaha District has extensive experience in providing support to enable existing groundwater treatment facilities to operate in their most efficient configuration. Many times these facilities experience changes in influent conditions which may warrant a change in system operation. This paper will deal with the process behind a performance audit, enhancement of system operation, optimization of equipment components and minimization of utility expense, as well as other pertinent issues and examples. Typical problems encountered during system startup and the measures taken to counter these problems will be discussed. Ted Streckfuss, P.E., US Army Corps of Engineers, Omaha District, MS CENWO-ED-DK, 215 N. 17th St., Omaha, NE 68102; (402) 221-3826; FAX (402) 221-3842; Ted.H.Streckfuss@nwo02.usace.army.mil (30) Flow and Transport Optimization for Pump and Treat Systems By David Ahlfeld, Ph.D., University of Massachusetts Richard Peralta, Ph.D., Utah State University This workshop will introduce participants to the basic concepts of combining numerical simulation models and optimization methods for remedial systems design and performance analysis. Optimization methods can be used to identify least-cost strategies for construction and operation of remedial systems. The optimization approach consists of placing the statement of design objectives and constraints into a formal optimization framework and solving the resulting problem. Constraints are imposed which describe the remedial design criteria in terms of the output of groundwater simulation models (heads, velocities, and concentrations). Objectives are stated in terms of the cost of implementation of the remedial strategy. The resulting optimization problem is solved to yield the best combination of well locations, recharge locations, and pump rates for achieving remedial criteria.
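The formal framework described here, with design criteria as constraints on model outputs and cost as the objective, can be sketched as a tiny linear program over a hypothetical response matrix. All coefficients below are invented for illustration; a real application would derive them from a calibrated groundwater flow model:

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical linear response matrix: drawdown (m) produced at 3 head
# control points per unit pumping rate (m3/d) at 2 candidate wells.
A = np.array([[0.004, 0.001],
              [0.003, 0.003],
              [0.001, 0.004]])
required = np.array([0.5, 0.6, 0.5])   # m of drawdown needed for capture

# Minimize total pumping subject to A @ q >= required, 0 <= q <= 400.
res = linprog(c=[1.0, 1.0], A_ub=-A, b_ub=-required,
              bounds=[(0.0, 400.0)] * 2, method="highs")
optimal_rates = res.x                   # least-cost pumping rates, m3/d
```

This is the flow-control flavor of the problem: the constraints keep the hydraulic criteria satisfied while the solver searches the combinations of rates no trial-and-error run would enumerate.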
Two basic approaches will be highlighted in this workshop: methods involving groundwater flow control and methods involving concentration control. Flow control optimization uses constraints to control the direction and rate of groundwater flow and to control fluid fluxes in and out of the groundwater system. Concentration control optimization directly controls future concentrations within the groundwater system by optimal selection of pumping and recharge locations and rates. This workshop will be relevant to individuals involved in management, engineering, modeling, or assessment of groundwater contamination sites. The workshop will consist of a brief introduction to basic concepts followed by several case studies of application of these methods to field sites. The workshop will conclude with an open-ended discussion of how these methods can be applied to other sites. Participants are encouraged to bring information on their own sites. David Ahlfeld, Ph.D., University of Massachusetts, 139 Marston Hall, Amherst, MA 01003; (413) 545-2681; (413) 545-2202 FAX; ahlfeld@ecs.umass.edu Richard C. Peralta, Ph.D., Biological and Irrigation Engineering, Utah State University, Building EC-216, Logan, UT 84322-4105; (435) 797-2786; (435) 797-1248 FAX; peralta@cc.usu.edu (31) Data Quality Objectives: Implementing the Process By John Warren, US EPA EPA Order 5360.1 CHG 1 (July 1998) requires all EPA organizations to use a systematic planning process to develop acceptance or performance criteria for the collection, evaluation, or use of environmental data. The Data Quality Objectives (DQO) Process is the Agency's recommended planning process when data are being used to select between two opposing conditions, such as decision-making or determining compliance with a standard. The outputs of this planning process (the data quality objectives themselves) define the performance criteria.
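One concrete output of such planning is the number of samples implied by the tolerable decision error rates. A sketch for the simplest normal-theory case, a one-sided test on a mean, with illustrative variability and effect size (this is a textbook simplification, not an EPA-prescribed formula):

```python
import math
from statistics import NormalDist

def samples_needed(sigma, delta, alpha=0.05, beta=0.20):
    """Number of samples so a one-sided test of the mean has false
    positive rate alpha and false negative rate beta when the true
    mean differs from the action level by delta (same units as sigma)."""
    z_alpha = NormalDist().inv_cdf(1.0 - alpha)
    z_beta = NormalDist().inv_cdf(1.0 - beta)
    return math.ceil(((z_alpha + z_beta) * sigma / delta) ** 2)

# E.g., sd of 2 mg/kg, detect a 1 mg/kg exceedance of the action level
n = samples_needed(sigma=2.0, delta=1.0)
```

Doubling the variability quadruples the required sample count, which is why minimizing total error variability is flagged below as central to making the Process work.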
The DQO Process is a seven-step planning approach based on the scientific method that is used to prepare for data collection activities such as environmental monitoring efforts and research. It provides the criteria that a data collection design should satisfy, including when, where, and how to collect samples; tolerable decision error rates; and the number of samples to collect. Making the Process work is not easy and requires effort in translating the data quality objectives into precise instructions that result in data meeting the planned criteria. Central to the successful implementation of the Process are the minimization of Total Error Variability, definition of sample support for inference, and the estimation of appropriate Type I and Type II (false positive and false negative) decision error rates. The talk will concentrate on these areas of concern and discuss potential research areas in the Quality Requirements field. John Warren, Ph.D., US EPA, MC 8724R, 401 M Street, SW, Washington, DC 20460; (202) 564-6876; warren.john@epa.gov (32) Remedial Systems Optimization and Long-Term Monitoring Guidance Documents By Daniel Welch, Maj., Defense Logistics Agency Many DoD Installation Restoration Program (IRP) projects have entered the remedial design/remedial action (RD/RA) phase. The DLA, Air Force Center for Environmental Excellence (AFCEE), and AF Base Closure and Realignment Agency (AFBCA) have determined that the cost of RD/RA will be equal to or greater than that of the investigation phase of the IRP. To reduce costs, RD/RA must be based on attainable cleanup goals and follow appropriate data quality objectives (DQOs). The majority of RA projects require compliance [Remedial Action-Operation (RAO)] monitoring of their active remedial systems. "Post-closure" sites where the remedial action is complete, and/or where ground-water contamination is still present, require long-term monitoring (LTM).
RAO/LTM requirements are dictated by the Resource Conservation and Recovery Act (RCRA), Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA), and Underground Storage Tank (UST) programs. RAO/LTM will become a costly necessity at most military installations. Consequently, improving the efficiency of these RD/RA and associated monitoring programs through Remedial Process Optimization (RPO) has the potential for substantial cost savings. As DoD restoration projects transition from the remedial investigation/feasibility study (RI/FS) phase to the remedial action operations and LTM phase, environmental managers will be required to consider a different set of project goals and DQOs. Data collection should focus on data that are necessary to measure protection of human health and the environment, system performance, and efficient operations. A significant cost avoidance can be derived by selecting appropriate analytical methods, reducing sampling locations, reducing sampling frequency, and analyzing only for known contaminants. The status of current and future guidance documents and tools will be discussed. Major Daniel Welch, Environment & Safety, Defense Logistics Agency, 8725 John J. Kingman Road, Ste. 2533, Ft. Belvoir, VA 22060-6221; (703) 767-6255; (703) 767-6093 FAX; welch@hq.dla.mil (33) Statistical Methods Useful in Assessment Monitoring and Corrective Action Programs By Robert D. Gibbons, Ph.D., University of Illinois Statistical methods for detection monitoring have been well studied in recent years (see Gibbons, 1994a, 1996; Davis and McNichols, 1994; USEPA 1992; and ASTM Standard D6312-98, formerly PS64-96). Although equally important, statistical methods for assessment sampling, on-going monitoring, and corrective action sampling and monitoring have received less attention. One may ask why statistical analysis is necessary in assessment and corrective action monitoring. Why not simply compare each measurement to the corresponding criterion?
There are several reasons why statistical methods are essential in assessment and corrective action sampling programs. First, a single measurement indicates very little about the true concentration at the sampling location of interest, and with only one sample there is no way of knowing whether the measured concentration is a typical or an extreme value. The objective is to compare the true concentration (or some interval that contains it) to the relevant criterion or standard. Second, in many cases the constituents of interest are naturally occurring (e.g., metals) and the naturally existing concentrations may exceed the relevant criteria. In this case, the relevant comparison is to background (e.g., off-site soil or upgradient groundwater) and not to a fixed criterion. As such, background data must be statistically characterized to obtain a statistical estimate of an upper bound for the naturally occurring concentrations so that it can be confidently determined whether on-site concentrations are above background levels. Third, there is often a need to compare numerous potential constituents of concern to criteria or background, at numerous sampling locations. By chance alone there will be exceedances as the number of comparisons becomes large. The statistical approach to this problem can ensure that false positive results are minimized. In this presentation, statistical aspects of assessment and corrective action monitoring will be presented and then illustrated using the new CARStat (Compliance Assessment Remediation Statistics) computer program, the first completely integrated statistical analysis system that can be used to evaluate data collected as part of assessment monitoring and corrective action programs for environmental sampling of soil, groundwater, air, surface water, and waste streams.
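The simplest such upper bound on background is nonparametric: use the maximum of n background measurements as an upper prediction limit. The data values below are invented for illustration, and CARStat itself supports far more refined interval types than this sketch:

```python
def nonparametric_upl(background):
    """Maximum of the background data set, used as a nonparametric upper
    prediction limit: a single future measurement from the same population
    exceeds it with probability only 1/(n+1)."""
    n = len(background)
    return max(background), n / (n + 1)   # (limit, associated confidence)

arsenic_bg = [3.1, 4.7, 2.8, 5.2, 3.9, 4.1, 3.3, 4.8]   # mg/kg, illustrative
limit, confidence = nonparametric_upl(arsenic_bg)
# On-site results above `limit` would be flagged for further evaluation.
```

With only 8 background samples the confidence is 8/9 (about 89%), which makes the abstract's point concrete: small background data sets yield weak bounds, and comparing many constituents at many locations against them inflates the chance of false exceedances.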
The CARStat program and its statistical foundations grew out of collaborations with industry (General Motors and Waste Management), environmental consultants (Conestoga-Rovers & Associates), and US EPA. Robert D. Gibbons, Professor of Biostatistics, University of Illinois at Chicago, 912 S. Wood Street, Chicago, IL 60612; (312) 413-7755; (312) 996-2113 FAX (34) Using the Data Quality Objective Process to Revise a Groundwater Monitoring Program: the Experience at Pantex By N. Hassig, L. Vail, and D. Bates (PNNL); M. Brown and C. Moke (Pantex); D. Michael (Neptune and Company) Historically, DOE operated the Pantex Plant near Amarillo, Texas to assemble and disassemble nuclear weapons. The current Plant missions are fabrication of chemical explosives for nuclear weapons, assembly of nuclear weapons for the nation's stockpile, maintenance and evaluation of nuclear weapons in the stockpile, disassembly of nuclear weapons being retired from the stockpile, and interim storage of plutonium components from retired weapons. The Ogallala Aquifer, a major source of water for the region, lies beneath the Pantex Plant. Perched groundwater also occurs beneath parts of the Pantex Plant and overlies the Ogallala Aquifer. The perched aquifers underlying the Pantex Plant are believed to be a result of operational and industrial discharges from the Plant. Chemicals discharged from Plant operations have been detected in parts of the perched aquifer, but not in the Ogallala. Pantex has an extensive groundwater monitoring program consisting of over 70 wells, which are monitored quarterly or semi-annually for an extensive list of contaminants. Pacific Northwest National Laboratory (PNNL) statisticians have been working with the Environmental Protection Department (EPD) staff at Pantex to critically evaluate their groundwater monitoring program - asking whether all of the data that are being collected are required to monitor groundwater quality and meet regulatory requirements.
The review team is using the Data Quality Objectives (DQO) Process to achieve consensus among stakeholders; lay out the drivers for the groundwater monitoring in a logical format; identify the data required to support decisions linked to the drivers; establish a sound scientific basis for the type, quality, and quantity of data collected; and compare the existing data collection program to a revised, optimized program. PNNL developed an Excel workbook of existing well descriptions and locations, groundwater monitoring drivers, analyte lists, monitoring schedules, and cost sheets - all linked by the decision logic established in working through the seven steps of the DQO Process. Using this workbook, it becomes quite clear when data are collected at significant cost but never used in decision making. The workbook also allows the EPD staff to quickly change assumptions, requirements, and drivers to generate alternative monitoring plans. Total costs for alternative plans are generated, and revised monitoring schedules and analyte lists are provided as new sheets in the workbook. The workbook is linked to a database of recent well history since the ability to monitor and respond to changes over time is one of the requirements for the program. Pantex projects savings of greater than $400K/yr, along with a quantified confidence of achieving the goals of the monitoring program, if the revised monitoring plan is implemented. Pantex has presented the revised program, with the supporting logic and workbook data, to internal Pantex staff, DOE, and Texas regulators. The Pantex staff and DOE have approved the monitoring plan, and acceptance of the revised plan by the Texas regulators is anticipated. Dr. Nancy L. Hassig, Staff Scientist, Battelle Pacific Northwest National Laboratory, Richland, WA 99352; (650) 969-3969, FAX (650) 969-3978.
nancy.hassig@pnl.gov (35) Optimization of LTM Networks: Statistical Approaches to Spatial and Temporal Redundancy By Kirk Cameron, Ph.D., MacStat Consulting, Ltd. Summary: The Air Force Center for Environmental Excellence (AFCEE) requested a spatial and temporal optimization algorithm for use at LTM networks on Cape Cod. Goal: optimize a long-term monitoring (LTM) network so that resources for sampling, analysis, and/or well construction are not wasted. Ensure 1) not too many wells are being monitored, and 2) certain wells are not being sampled too frequently. Two-stage algorithm: 1) Identify and eliminate spatially redundant wells. 2) Identify temporal redundancies in the wells that remain. 1) Spatial redundancy * Since estimates at unsampled locations involve an interpolation of known ground-water concentrations, a well is spatially redundant when it is assigned a small weight in the interpolation compared to other wells. * Kriging can be used for linear interpolation over a spatial area and can account for the statistical redundancy at nearby sample locations through its estimate of the spatial correlation function. * Two intermediate computations from the kriging exercise are useful: 1) the local kriging weights assigned to sampled locations near each block can be accumulated and averaged to generate a "global" interpolation weight for each; 2) at each block, the kriging estimation variance indicates the relative uncertainty of the local block estimate compared to other blocks. * The global interpolation weights offer a relative ranking of the well locations in terms of the amount of independent spatial information provided. Those wells that are spatially redundant will tend to have the lowest global weights. * Since the kriging variance depends on the spatial correlation model, the number and configuration of the sampled locations, and the position of the estimated block relative to nearby samples, the kriging variance also provides a measure of relative spatial redundancy.
By averaging the kriging variance across blocks, the uncertainty using one configuration of well locations can be compared to the uncertainty derived from alternate configurations. * If all wells with a global interpolation weight smaller than a fixed threshold are eliminated from the mix, and the site is re-kriged on the same blocks, the new average kriging variance can be checked against the original measure to make sure that too much spatial information has not been lost. 2) Temporal redundancy * A monitoring well is sampled too frequently if either 1) the random component of variation in concentration levels over time is very low, and/or 2) there is little to no temporal autocorrelation between closely spaced sampling events. * To account for random variation: 1) remove any simple structural component of variation; 2) compute the standard deviation of the residual concentrations; 3) divide the standard deviation by the mean well concentration to form a modified coefficient of variation (c.v.). * A high modified c.v. indicates the need for more frequent sampling, since then the random component of variation is large, while a low modified c.v. indicates the need for less frequent sampling. * Temporal autocorrelation can be used to estimate the minimum sampling interval so that two consecutive sampling events provide uncorrelated information. * Basic approach: 1) compute a one-dimensional empirical temporal variogram for each well and average across wells to build a composite temporal variogram; 2) locate the smallest time interval at which the approximate sill of the composite variogram is reached; 3) designate this time interval as the minimum sampling interval providing independent temporal data.
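The modified c.v. step above can be sketched directly, taking a fitted linear trend as the "simple structural component of variation" (the choice of a linear trend and the data values are illustrative assumptions):

```python
from statistics import mean, stdev

def modified_cv(times, concs):
    """Standard deviation of residuals from a fitted linear trend, divided
    by the mean concentration: high values suggest sampling more often,
    low values suggest sampling less often."""
    t_bar, c_bar = mean(times), mean(concs)
    slope = (sum((t - t_bar) * (c - c_bar) for t, c in zip(times, concs))
             / sum((t - t_bar) ** 2 for t in times))
    residuals = [c - (c_bar + slope * (t - t_bar)) for t, c in zip(times, concs)]
    return stdev(residuals) / c_bar

# A steadily declining well has almost no residual variation (c.v. near zero),
# so its sampling frequency is a candidate for reduction.
cv_trend = modified_cv([0, 1, 2, 3, 4], [10.0, 8.0, 6.0, 4.0, 2.0])
cv_noisy = modified_cv([0, 1, 2, 3, 4], [10.0, 2.0, 9.0, 3.0, 8.0])
```

Dividing by the mean makes wells at very different concentration levels comparable on one scale, which is what lets a single threshold drive frequency decisions network-wide.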
Kirk Cameron, Ph.D., MacStat Consulting, Ltd., 4735 Holister Ct, Colorado Springs, CO 80919; (719) 532-0453, FAX (719) 532-0453 (36) Optimizing a Ground-Water Monitoring Network for Assessing Air Sparging Effectiveness on BTEX and MTBE By Steven J. Naber, Battelle Columbus Operations Bruce E. Buxton, Battelle Columbus Operations Ann M. Herberholt, Battelle Columbus Operations The initial site characterization of a former military installation, based on data from nine wells, showed three areas of higher BTEX and MTBE concentrations within a plume. A risk-based approach was used in selecting a remediation program that includes air sparging at the three hotspots to reduce the amount of BTEX and MTBE in their vicinities, followed by intrinsic remediation. A network of 45 new wells was installed at the site to perform the air sparging and monitor BTEX and MTBE levels throughout the contaminant plume. This network was designed primarily for air sparging, so the wells were generally concentrated near the three hotspots. The proximity of some wells to others resulted in the collection of redundant information about BTEX and MTBE levels in the plume. In order to reduce future sampling costs, an analysis of data from the 54 wells prior to sparging was undertaken to determine whether any wells were providing redundant information and could, as a result, be eliminated from the sampling. Results of the redundancy analysis showed that there was little difference in the precision of the estimates of BTEX and MTBE levels in the plume after eliminating five wells. Exclusion of these wells from future sampling for contaminant levels resulted in a 10% reduction in the costs of sampling and laboratory analysis of the samples. Steven J. Naber, Battelle Columbus Operations, 505 King Ave., Columbus, OH 43201; (614) 424-3536; FAX (614) 424-4250 (37) Decision Support Software for Designing Long-term Monitoring Plans (LTMPs) By Charles J.
Newell, Ph.D., P.E., Stacey Lee Ita, Ph.D., Ric L. Bowers, P.E., Groundwater Services, Inc., Marty Faile, Air Force Center for Environmental Excellence Data from the Air Force site restoration program indicate that fewer sites with affected groundwater are undergoing active remediation; rather, more sites are being characterized prior to implementation of LTM plans. Although the annual cost of LTM at an individual site may appear relatively small, groundwater monitoring at a large number of sites for a long time creates the potential for a tremendous cost liability of billions of dollars (Air Force Modeling and Monitoring Workshop, August 1997). To provide a strategy for developing appropriate long-term groundwater monitoring programs that can be implemented at lower costs, the Air Force Center for Environmental Excellence (AFCEE) and Groundwater Services, Inc. (GSI) are now developing decision support software for site managers. Using three lines of evidence, the software will classify both groundwater source zones and groundwater plumes as expanding, stable, shrinking, exhausted, or no trend. Information on both the direction of the trend and the strength of the trend will be provided to the user. The software will be designed to address a variety of groundwater contaminant plumes (e.g., fuels, solvents, metals) and can serve as a database for all LTM plan data. The three lines of evidence used in the software will be: i) statistical tools based on time-series data regarding groundwater constituent concentrations (e.g., Mann-Kendall test, regressions); ii) simple fate and transport models that take site-specific data and predict the ultimate extent of contaminant migration (such as BIOSCREEN and BIOCHLOR); and iii) empirical rules developed on the basis of data from previous "plume-a-thon" studies such as the Lawrence Livermore study, the Texas BEG studies, and the AFCEE chlorinated database.
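The first line of evidence can be illustrated with the core of the Mann-Kendall test, the S statistic. This is a bare-bones sketch with invented monitoring data; production use would add the variance term and tie correction to obtain a significance test:

```python
def mann_kendall_s(series):
    """Mann-Kendall S statistic: the number of later-minus-earlier pairs
    that increase, minus the number that decrease. Positive S suggests an
    upward concentration trend; negative S a downward (shrinking) trend."""
    s = 0
    n = len(series)
    for i in range(n - 1):
        for j in range(i + 1, n):
            diff = series[j] - series[i]
            s += (diff > 0) - (diff < 0)
    return s

# Quarterly benzene results (mg/L, illustrative) from a shrinking plume:
s_stat = mann_kendall_s([1.20, 0.95, 0.90, 0.70, 0.55, 0.40])
```

Because the test uses only the signs of pairwise differences, it needs no distributional assumptions and tolerates the erratic, non-normal concentration records typical of LTM wells.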
The three lines of evidence will be compiled and weighed, and the direction and strength of concentration trends in each source zone and the downgradient groundwater plume (the "tail") will be determined. With this information, the user has the option to request a generic LTM plan based on i) the trend data, ii) hydrogeologic factors (e.g., seepage velocity), iii) the type of contaminant, and iv) the location of potential receptors (e.g., wells, discharge points, or property boundaries). For example, a generic plan for a shrinking petroleum hydrocarbon plume (BTEX) in a slow hydrogeologic environment (silt) with no nearby receptors would entail minimal, low-frequency sampling of just a few indicators. On the other hand, the generic plan for a chlorinated solvent plume in a fast hydrogeologic environment that is showing expanding but very erratic concentrations over time would entail more extensive, higher-frequency sampling. It is envisioned that 15-30 generic plans for different conditions (trends, hydrogeology, contaminant type, receptor location) will be developed for use in the software. The software is designed to be "evergreen" so that LTM plans can be modified as the site changes over time (e.g., reducing monitoring efforts when a plume changes from stable to shrinking). The software will be available in Fall 2000 and will be distributed free of charge over the Internet. Charles Newell, Groundwater Services, Inc.; (713) 522-6300; cjnewell@gsi-net.com

(38) Simple, Inexpensive Diffusion Samplers for Investigating VOCs in Groundwater By Don A. Vroblesky, Ph.D., US Geological Survey

A simple, inexpensive diffusion sampler can be used to sample a variety of environments for VOCs of environmental interest. The diffusion sampler consists of a vapor or water phase inside a polyethylene membrane. In its simplest form, the sampler can be a sealable sandwich bag available at local grocery stores. A box of these can be obtained for a few dollars.
The principle of operation is based on the ability of polyethylene to readily allow diffusion of volatile organic compounds (VOCs), such as aromatic petroleum hydrocarbons and chlorinated solvents, while preventing the movement of water across the membrane. Thus, after sufficient equilibration time, the VOC concentrations of the air or water in the sampler achieve equilibrium with the VOC concentrations in the ambient water outside the sampler. Recovery of the samplers and analysis of the contained vapor or water can be used to determine VOC concentrations in the ambient water. Analyses of the vapor-based samplers give relative concentrations of VOCs and can be done rapidly and inexpensively on field or laboratory gas chromatographs. Analyses of the water-based samplers have the advantage of providing dissolved concentrations of VOCs measurable by standard laboratory methods. Diffusion samplers, consisting of air-filled 40-milliliter glass vials enclosed in sealable polyethylene bags, were successfully used in streambed sediment to determine the locations of fractures that were discharging contaminated groundwater to surface water. Because the samplers are diffusion-based, VOC concentrations within the samplers change as ambient VOC concentrations change. Thus, the periodic recovery and analysis of samplers from a single location of discharging ground-water contamination in a stream was useful in observing the rapid increase in contaminant discharge resulting from air-rotary drilling of a nearby well in a fractured-rock aquifer. Water-filled diffusion samplers can be used to quantify the concentrations in discharging groundwater. A test of this approach in South Carolina showed that trichloroethene concentrations obtained from water-filled diffusion samplers beneath a stream (142-148 µg/L) were similar to those obtained from a small well point (131-147 µg/L) screened adjacent to the diffusion samplers.
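The equilibration behavior described above can be pictured with a generic first-order mass-transfer model; the rate constant below is a hypothetical illustration and is not a value given in the abstract:

```python
import math

def sampler_concentration(c_ambient, k_per_day, t_days):
    """Concentration inside a diffusion sampler after t_days, assuming
    generic first-order mass transfer across the membrane:
    C(t) = C_amb * (1 - exp(-k * t)).
    The rate constant k is a hypothetical illustration, not a value
    from the abstract."""
    return c_ambient * (1.0 - math.exp(-k_per_day * t_days))

# With an assumed k = 0.3/day, the sampler reaches ~95% of the ambient
# concentration after about 10 days (1 - exp(-3) ~= 0.95).
for t in (1, 5, 10, 14):
    print(t, round(sampler_concentration(100.0, 0.3, t), 1))
```

The practical point is that the sampler tracks ambient concentrations only after the equilibration time has elapsed, which is why deployment periods matter and why sampler readings lag rapid ambient changes.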
Diffusion samplers installed in observation wells can be used to obtain representative water samples for chlorinated volatile organic compounds. The samplers consist of polyethylene bags containing deionized water placed adjacent to the water-bearing fracture or screened interval in the well. In saprolite and fractured-rock wells at a study area in South Carolina, the volatile organic compound concentrations in water samples obtained using the samplers without prior purging were similar to concentrations in water samples obtained from the respective wells using traditional purging and sampling approaches, such as a submersible electric pump, a bladder pump, and a bailer. The low cost associated with this approach makes it a viable option for monitoring large observation-well networks for volatile organic compounds. Don Vroblesky, Ph.D., US Geological Survey, WRD, 720 Gracern Road, Columbia, SC 29210-7651; (803) 750-6115; vrobleskrgusgs.gov

(39) Optimization of Long-Term Monitoring Costs via Statistical and Geo-Statistical Thinking By Maureen Ridley, Lawrence Livermore National Laboratory; Gary Tuckfield, Westinghouse Savannah River Co.

How many wells are enough?
• Purpose of a ground water monitoring network?
• Typical network sampling plans?
• Problem: How can we minimize cost without compromising our knowledge of the extent and magnitude of the contamination?
• Solution: The 4Rs (Relevancy, Reliability, Redundancy, and Regulatory Assessments).
• Method: Geo-hydrological and geo-statistical analysis of historical contamination data.
• Example deployments of the technology at SRS and LLNL.

How often should wells be sampled?
• Purpose of frequent ground water well sampling?
• Typical sampling plans.
• Problem: How can we reduce the sampling frequency and the associated costs without loss of crucial information regarding a contaminant plume?
• Solution: Cost-Effective Sampling (CES) program.
• Method: Historical trend analysis via linear statistical modeling.
• Example deployments of this technology at LLNL, Paducah, etc.

Abstracts

4R's

Statisticians and hydro-geologists at SRS have developed a method for well redundancy assessment based on what is called a 4R's technology, viz., a review of monitoring well relevancy, reliability, redundancy, and regulatory mandates. Well relevancy is determined by a hydro-geological comparison of the screened zone to modeled subsurface geology and surficial proximity to the major axis of the contaminant plume. Well reliability is assessed by an analysis of water quality measurements such as pH, turbidity, and specific conductance. Well redundancy employs geostatistics to identify those wells that contribute little additional information regarding contaminant concentration due to their spatial correlation with other monitoring wells. Regulatory assessment is achieved by a historical review of groundwater constituents in each well. Considerable reductions in the number of constituents can be proposed, without compromising the regulatory purpose of the well under RCRA or CERCLA requirements, when lab analyses consistently return concentration estimates at the level of detection. Our technology has been deployed in several groundwater monitoring networks at SRS, including the A/M Area and the Burial Ground Complex. The South Carolina state regulator has recently approved an approximately 25% reduction in the number of wells and constituents sampled per well in the A/M Area network. The estimated cost savings are in excess of $200K per year.

CES

The Cost Effective Sampling (CES) technology was developed at the Lawrence Livermore National Laboratory (LLNL), where it has been demonstrated and deployed. It has also been demonstrated at the DOE Savannah River Site and the Riverbank Army Ammunition Plant.
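The historical trend analysis underlying CES can be sketched as a linear fit to each well's concentration history, with the trend magnitude and scatter mapped to a sampling frequency. The thresholds below are illustrative assumptions, not the published CES values:

```python
import statistics

def recommend_frequency(dates_yr, concs):
    """Sketch of the CES idea: fit a straight line to a well's
    concentration history and map trend magnitude and residual scatter
    to a sampling frequency. Thresholds are illustrative assumptions,
    not the published CES values."""
    mean_t = statistics.fmean(dates_yr)
    mean_c = statistics.fmean(concs)
    # Ordinary least-squares slope (concentration change per year)
    num = sum((t - mean_t) * (c - mean_c) for t, c in zip(dates_yr, concs))
    den = sum((t - mean_t) ** 2 for t in dates_yr)
    slope = num / den
    # Residual scatter around the fitted line
    resid = [c - (mean_c + slope * (t - mean_t)) for t, c in zip(dates_yr, concs)]
    scatter = statistics.pstdev(resid)
    rel_trend = abs(slope) / max(mean_c, 1e-9)   # fractional change per year
    rel_scatter = scatter / max(mean_c, 1e-9)
    if rel_trend > 0.25 or rel_scatter > 0.5:
        return "quarterly"       # rapid change or high uncertainty
    if rel_trend > 0.05:
        return "semiannual"
    return "annual"              # stable, well-characterized well

# Hypothetical TCE history (years, ug/L): rapid decline -> sample often
print(recommend_frequency([0, 0.25, 0.5, 0.75, 1.0], [100, 85, 72, 60, 50]))
```

A rapidly declining or erratic well comes back "quarterly," while a flat, low-scatter well drops to "annual," which is exactly the cost lever CES exploits.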
This technology has a track record of recommended reductions in monitoring costs of approximately 25-40% (about $400K annually at LLNL). CES uses simple linear statistical models and concepts to select an appropriate sampling frequency for each well in the monitoring network. If contaminant concentration is increasing or decreasing rapidly over time, sampling should occur more frequently (e.g., quarterly) than in wells that show very little change in concentration over time. In the latter instance, semi-annual, annual, or biennial sampling will be adequate. Also, wells whose contaminant concentrations are more uncertain, that is, vary substantially with time, should be sampled more frequently than wells with less statistical uncertainty. The intent is to provide all the data necessary for a project leader to make the appropriate decisions concerning a site's remediation strategies, but to minimize or eliminate data that add no additional value. When monitoring wells are sampled less frequently, savings can result from: (1) a reduced sampling effort; (2) decreased costs associated with contaminated ground water disposal; (3) fewer analyses, in number of analytes as well as number of samples; and (4) reduced time for QA/QC evaluation and data management. Maureen Ridley, Lawrence Livermore National Laboratory (LLNL), (L-528), P.O. Box 808, Livermore, CA 94551-0808; (925) 422-3593; (925) 422-2095 FAX; ridlcvl@llnl.gov Gary Tuckfield, Westinghouse Savannah River Company (WSRC), Savannah Technology Center, Bldg. 773-42A, Aiken, SC 29808; (803) 725-8215; C

(40) Rapid Data Access: Key to Integrated Use of Environmental Characterization and Monitoring Information By Maureen Ridley, Patricia Ottesen, Darrel Lager, Gary Laguna, Francesca Colombini, Lawrence Livermore National Laboratory; Marilyn Arsenault and Michael Legg, Arsenault Legg, Inc.

Environmental investigations result in large quantities of data.
The value of these data lies in their interpretation and use by project staff, management, and the regulatory community. Traditional modes of data access can be frustrating and time-consuming. Software tools that join the networking technology of the World Wide Web (WWW) with database access have decreased labor-intensive overhead in site characterization and monitoring, thus increasing the efficiency of the groundwater restoration project at Lawrence Livermore National Laboratory (LLNL). DOE and project personnel have dynamic access to statistical processing, database retrieval, and cost estimating tools. By adding mouse-sensitive site maps and post-processing capabilities, we have extended the utility of standard web browsers, such as Netscape or Internet Explorer. Users can retrieve chemical compound concentrations, groundwater elevations, and descriptive information about monitoring locations. Data may be viewed as time-series graphs, contour maps, or simply displayed as text. Platform independence and easy retrieval make more comprehensive review of data possible. Cost savings are realized; efficiency is increased; planning and decision-making are facilitated. Another goal of LLNL is to make the Rapid Data Access Tools freely available to as many users as possible. This has been achieved by adding many of the LLNL tools to the enABL Data Management System (EDMS), a free environmental data management system with an already large user base, designed to allow multiple organizations to share information electronically. The EDMS captures information related to sites, locations, soil borings, lithology, well installation and monitoring, soil geotechnical laboratory data, field sampling and field tests, and chain-of-custody data, and provides a link to the laboratory General Electronic Data Deliverable (GEDD) format. Work performed under the auspices of the U.S.
Department of Energy by Lawrence Livermore National Laboratory under Contract W-7405-Eng-48. Maureen Ridley, Lawrence Livermore National Laboratory (L-528), P.O. Box 808, Livermore, CA 94551-0808; (925) 422-3593; ridlcvl.@llnl.gov

(41) Data Requirements for Long-Term Monitoring and Data Comparability By Joseph D. Evans, Science Applications International Corporation

Long-term field demonstrations can present particular quality concerns associated with data received from the laboratories performing the analyses. Over the period of the field demonstration, which can often last several years, laboratories can change ownership, be bought or sold by another laboratory, experience changes in key personnel, or even go out of business. Because the field demonstration requires data comparisons from beginning to end, additional or special QC measures should be planned prior to starting any long-term demonstration so that meaningful data comparisons can be performed even when laboratory conditions change. As part of the EPA Superfund Innovative Technology Evaluation (SITE) Program, Science Applications International Corporation (SAIC) has implemented QA/QC measures to ensure data comparability when performing long-term demonstrations. These include experimental design, field procedural requirements, and laboratory QC measures. This paper addresses the problems of comparability and ways to help ensure comparable data. In addition, another complicating factor associated with comparability over several years is the need to measure potentially small reductions for the successful evaluation of biological remediation processes, such as phytoremediation or bioventing. Because these reductions are often expected to be only small percentage changes for a given demonstration period, with remedial efforts expected to last several years beyond the demonstration, normal analytical and sampling variability may be as great as the expected reduction.
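A rough way to see why small reductions are hard to detect is a minimum detectable difference calculation. This is a generic two-sample power sketch, not a procedure from the abstract; the variability and sample numbers are hypothetical:

```python
import math

def erfinv(x, tol=1e-12):
    """Inverse error function by bisection (stdlib-only helper)."""
    lo, hi = -6.0, 6.0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if math.erf(mid) < x:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def min_detectable_difference(sigma, n, alpha=0.05, power=0.80):
    """Approximate minimum detectable difference between two mean
    concentrations (two-sample z approximation), given per-sample
    standard deviation sigma and n samples per sampling event.
    Generic statistics, not a method from the abstract."""
    def z(p):  # standard normal quantile
        return math.sqrt(2) * erfinv(2 * p - 1)
    return (z(1 - alpha / 2) + z(power)) * sigma * math.sqrt(2.0 / n)

# If combined sampling + analytical sd is 20% of the mean and n = 4
# samples per event, only reductions of roughly 40% of the mean are
# reliably detectable at 80% power.
print(round(min_detectable_difference(sigma=0.20, n=4), 2))
```

With realistic field variability, a demonstration-period reduction of a few percent falls well inside the noise, which is the comparability problem the abstract describes.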
This factor is compounded by possible inter-laboratory variability. Procedures needed to determine these small measurement differences and the associated field and laboratory variability will be addressed. Joseph D. Evans, SAIC, 545 Shoup Ave, Idaho Falls, ID 83406; (208) 528-2168; ioscph.d.cvans@cpmx.saic.com

(42) How the Badger Army Ammunition Plant Saved $400,000 in Long-Term Monitoring Costs By John P. Hansen, P.E., Olin Corporation

Badger Army Ammunition Plant (Badger) is a 7,350-acre propellant manufacturing facility located in south-central Wisconsin, approximately 35 miles north of the capital, Madison. Badger was constructed in 1942 and produced single- and double-base propellants for the Army during World War II, Korea, and Vietnam. Since 1977, the plant has been idle, and it is currently being dismantled. As with any such facility, there are environmental problems that ultimately require groundwater monitoring. Olin Corporation has been the operating contractor for Badger since the 1950s and has provided expertise not only for operations, but for environmental compliance and remediation as well. Environmental studies began in the late 1970s and continue today. Many of the contaminated sites have been cleaned up; however, many remain. When Badger started a quarterly groundwater monitoring program in 1987, there were only 66 wells on the facility. Today, there are over 270 wells that are monitored both on and off the plant. Since cleanup continues, the groundwater monitoring program is still classified as being in the "investigative" phase, which means that instead of targeting specific monitoring wells for sampling and compounds for analysis, all wells and suspected contaminants would normally be retained in the program. Beginning in 1994, however, Olin staff began assessing the program in order to streamline it and cut costs, which had grown to over 1.4 million dollars on an annual basis.
Although there are many programs available for rigorous statistical analysis, Olin staff decided to take a much simpler approach. The groundwater data were compiled into a single database that still allowed individual sites within the facility, along with their respective monitoring wells, to be assessed. The data were evaluated for detects versus non-detects; the Wisconsin groundwater standards were incorporated to facilitate evaluation against regulatory levels of concern; trends were analyzed using straightforward techniques; and water level data were used to develop water table maps to verify which wells were downgradient, sidegradient, and upgradient. In August 1996, Olin submitted a report to the state and federal regulators titled "Optimization Report for the Badger Army Ammunition Plant Groundwater Monitoring Program," which contained recommendations to eliminate those chemicals that were being analyzed for and were either not being detected or were detected at levels well below regulatory concern. In addition, wells that were not assigned as upgradient and were not showing levels of contaminants of concern were recommended for discontinuation from the program. Both the state of Wisconsin and the EPA agreed that the data were presented well and clearly provided the basis and rationale for the recommended program changes. In the end, the regulators agreed with over 95% of the recommendations, and the groundwater monitoring program was officially revised in late 1997, resulting in savings of over $400,000 on an annual basis. The success of our efforts implies that a simple, straightforward approach to evaluating a monitoring program may be all that is necessary in lieu of a complex statistical analysis that is likely to require far greater resources. John P.
Hansen, P.E., Chief Environmental Engineer, Olin Corporation - Winchester Division, Badger Army Ammunition Plant, 1 Badger Road, Baraboo, WI 53913-5000; (608) 643-3361 x275; FAX: (608) 643-2674; jphanscn@jvlnct.com

(43) Sample Size Determination and Computation of the 95% Upper Confidence Limit of the Mean in Environmental Applications By Anita Singh, Ph.D., Lockheed Martin

In USEPA Superfund applications, cleanup decisions are often made based upon the mean concentrations of the contaminants of potential concern (COPCs) at a polluted site. The objective may be 1) to compute the exposure point concentration (EPC) term used as one of several parameters to estimate the contaminant intake for an individual, or 2) to verify the attainment of cleanup goals (CUGs) as agreed upon by all concerned parties, such as the USEPA and the party responsible for introducing contamination at the site. Contaminant concentration data from Superfund sites quite often appear to follow a skewed probability distribution. The lognormal distribution is frequently used to model positively skewed contaminant concentration distributions. The population mean, μ, is one of the most commonly used measures of central tendency of a distribution, and is often used to verify the attainment of cleanup standards at a site. The unknown mean, μ, is typically estimated by the sample mean and some Upper Confidence Limit (UCL) of the mean, which in turn are computed using the sampled data. The question then arises: how many samples should one collect to be able to reliably draw inference about the population mean with prespecified power and confidence coefficient? The H-statistic based upper confidence limit (H-UCL) for the arithmetic mean of a lognormal population is widely used to make remediation decisions at Superfund sites. However, recent work in the environmental statistics literature has cast some doubt on the performance of the H-UCL of the mean of a lognormal population.
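To make the comparison concrete, here is a minimal sketch of two of the common alternative UCL formulas against which the H-UCL is typically compared: the Student's-t UCL and the distribution-free Chebyshev UCL. The H-UCL itself requires Land's tabulated H values and is not reproduced here; the sample data are hypothetical:

```python
import math
import statistics

# 95th percentile of Student's t for n-1 df, small lookup (n <= 11);
# illustrative only, values from standard t tables.
T95 = {2: 6.314, 3: 2.920, 4: 2.353, 5: 2.132, 6: 2.015, 7: 1.943,
       8: 1.895, 9: 1.860, 10: 1.833, 11: 1.812}

def t_ucl_95(data):
    """95% one-sided Student's-t UCL of the mean: xbar + t * s / sqrt(n).
    Assumes approximate normality, which skewed data violate."""
    n = len(data)
    return statistics.fmean(data) + T95[n] * statistics.stdev(data) / math.sqrt(n)

def chebyshev_ucl_95(data):
    """Distribution-free 95% Chebyshev UCL:
    xbar + sqrt(1/0.05 - 1) * s / sqrt(n)."""
    n = len(data)
    return statistics.fmean(data) + math.sqrt(1 / 0.05 - 1) * statistics.stdev(data) / math.sqrt(n)

# Positively skewed (lognormal-like) sample, hypothetical ug/kg values
sample = [12, 15, 18, 22, 30, 41, 55, 90, 160, 480]
print(round(t_ucl_95(sample), 1))
print(round(chebyshev_ucl_95(sample), 1))
```

Even these two defensible formulas differ substantially on skewed data; the abstract's point is that the H-UCL can exceed both by orders of magnitude once the log-scale standard deviation grows past about 1.0.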
Even though, for a lognormal distribution, the H-UCL is theoretically sound and possesses optimal properties, the practical merit of the H-UCL in environmental applications is questionable, as it becomes too high when the standard deviation of the log-transformed variable starts exceeding 1.0. The use of decision criteria based on the mean of a lognormal distribution can have undesirable consequences, especially for small sample sizes. It is observed that in many cases the H-UCL of the mean contaminant concentration differs unrealistically, by orders of magnitude, from the UCL values obtained using other procedures, and the H-UCL even exceeds the corresponding percentile of the contaminant data distribution under consideration. When the UCL of the mean is obtained using a lognormal distribution, one may end up spending more time than is necessary on a Superfund cleanup project in one case, or leaving contamination behind in the other. The latter situation can arise when reference or background data based UCLs are obtained using a lognormal distribution. The problems of taking an adequate number of samples for the verification of attainment of cleanup standards satisfying pre-specified performance parameters, such as the Type I (α) and Type II (β) error rates, and of obtaining a reliable 95% UCL of the mean for positively skewed datasets will be discussed. Real datasets from Superfund applications will be considered. Anita Singh, Ph.D., Lockheed Martin Environmental Systems & Technologies Company, 980 Kelly Johnson Drive, Las Vegas, NV 89119; (702) 897-3234; (702) 897-6640 FAX; asingh.'olimcpo. com

(44) Some Alternative Statistical Procedures for Environmental Data Analysis By A.K.
Singh, Ph.D., University of Nevada, Las Vegas

ABSTRACT NOT AVAILABLE

(45) Environmental Resources Program Information Management Systems (ERPIMS) and GIS By Robin Lovell and Sharon Shaw, US Air Force Center for Environmental Excellence

The Air Force Center for Environmental Excellence (AFCEE) has developed a web site designed to disseminate geographic information linked to attribute data over the Internet. AFCEE's GIS on the Web allows users to browse data and maps using a standard web browser. These maps are dynamic and can be manipulated by a user to look at areas of interest and to query information in the Air Force Environmental Resources Program Information Management System (ERPIMS) database. The site contains technical information that may require some degree of expertise for interpretation. The purpose of this effort is to allow easy access to both the graphic data and the ERPIMS attribute data available for installation remediation efforts. Robin Lovell, US Air Force Center for Environmental Excellence, 3207 North Road, Brooks AFB, TX 78235-5363; (210) 536-5399; (210) 536-5921 FAX; Robin.Lovell@hqafcee.brooks.af.mil Sharon Shaw, US Air Force Center for Environmental Excellence, 3207 North Road, Brooks AFB, TX 78235-5363; (210) 536-6502; (210) 536-5921 FAX; Sharon.Shaw@hqafcee.brooks.af.mil

(46) Sampling and Analysis Plan Under PBMS By Barry Lesnik, USEPA

The purpose of this presentation is to correct several misconceptions that abound in both the regulatory and regulated communities about what a performance based measurement system (PBMS) is and how it should be applied to address regulatory requirements regarding the selection and use of appropriate methods for RCRA applications.
Topics to be covered include: 1) what is PBMS?; 2) what are the responsibilities of the regulators and the regulated community under PBMS?; 3) how will the analytical paradigm change under PBMS?; 4) driving reasons for performing RCRA analyses; 5) flexibility of RCRA methods; 6) when the use of SW-846 methods is mandatory and when it is not; 7) factors determining the appropriate choice of analytical methods; and 8) what should be included in the analytical component of a RCRA Sampling and Analysis Plan. Barry Lesnik, US EPA/OSW (MC 5307W), 401 M Street, SW, Washington, DC 20460; (703) 308-0476; (703) 308-0500 FAX; lesnik.baOT@epamail.epa.gov

(47) Sample Collection and Handling Alternatives for VOC Soil Characterization: Method 5035 By Alan D. Hewitt, US Army Corps of Engineers

Within the last couple of years, the U.S. EPA and ASTM have published new guidance on how soil samples acquired for VOC characterization should be collected and handled in preparation for analysis. The features of this new guidance that will have the greatest impact on improving data quality are the use of less disruptive and fewer transfer steps, and the use of vessels with hermetically sealable closures for transportation and storage. The new measures for sample preservation will also help improve data quality. To assist with the implementation of this new guidance, two very different protocols have been developed. In one case, all of the steps leading up to those associated with the analysis process are performed in the field, while the other, more traditional approach has all of the steps associated with sample preparation and analysis occur in a laboratory. This presentation will cover methods of in-field sample preparation and methods for secure transportation and storage so that sample preparation can occur in a laboratory. In particular, information will be given with regard to transportation and storage of samples in the En Core sampler or a VOA vial.
For example, the performance of these two vessels during an initial two-day storage period at 4±2°C, which corresponds to the length of time currently recommended before samples need to be preserved, and while being preserved by freezer (-12±3°C) storage for up to 12 additional days, will be discussed. Freezing as an alternative method of sample preservation appears to be better suited for VOCs in soil matrices than acidification. For instance, acidification is incompatible with carbonates, causes the decomposition of styrene and perhaps other target analytes, and has the potential to cause the formation of acetone. Efforts are ongoing to include freezer storage as a method of sample preservation and the use of a VOA vial as a transportation and storage vessel in future revisions of these guidance documents. Alan D. Hewitt, U.S. Army Cold Regions Research and Engineering Laboratory, 72 Lyme Road, Hanover, NH 03755-1290; (603) 646-4388, FAX (603) 646-4287; ahewittfgicrrel.usace.arnw.mil

(48) Environmental Applications of NRL's Continuous Flow Immunosensor By Lisa C. Shriver-Lake, Paul T. Charles, Charles H. Patterson, David B. Holt and Anne W. Kusterbeck, Naval Research Laboratory; Paul R. Gauger, GeoCenters

The environmental community has an increasing need for rapid, quantitative detection of hazardous pollutants. Current methods require off-site laboratory analyses, which increase costs and time. Immunoassay-based methods (i.e., test kits, biosensors) provide a sensitive, specific, rapid, and portable means to fulfill that need. Recent advances in instrumentation at NRL have led to a method for measuring small-molecular-weight pollutants such as TNT and RDX. The FAST 2000 is a continuous flow immunosensor based on a displacement immunoassay. The key components are antibodies specific for the analyte, fluorescent signal molecules similar to the analyte, and a fluorescence detector.
The FAST 2000 quantitates water samples (150 µL) with minimal sample preparation and reagent addition. Analysis is complete within five minutes, with the fluorescent signal being proportional to the analyte concentration in the sample. The biosensor is portable and easily set up within 30 minutes on a small table. Extensive field trials, designed to demonstrate the performance of this method for on-site analysis, were conducted at several geologically diverse sites during the last several years. Results to be discussed include detection limits (5-10 ppb), matrix effects, cross-reactivity, false positive/negative rates, and cost. In addition to the validation studies, other applications of these technologies to drug interdiction, explosives detection for non-environmental applications, and monitoring of other environmental pollutants will be presented. Lisa C. Shriver-Lake, Center for Bio/Molecular Science and Engineering, US Naval Research Laboratory, 4555 Overlook Ave., Washington, DC 20375; (202) 404-6045; lshriver-lake@gromit.nrl.navy.mil

(49) Groundwater Modeling System, Version 2.1 By Earl Eldris, US Army Engineer Waterways Experiment Station

The Department of Defense Groundwater Modeling System (GMS) is a comprehensive graphical user environment for performing groundwater simulations, site characterization, model conceptualization, mesh and grid generation, geostatistical interpretation, and post-processing. GMS is developed through the collaborative efforts of 15 different government research labs and offices within the DoD, DoE, and EPA, as well as participation from 20 universities and private industry. GMS integrates and simplifies the process of groundwater flow and transport modeling by bringing together all of the tools needed to complete a successful study. What's more, all this is available for both PC and UNIX based operating systems. Several types of models are supported by GMS.
The current version of GMS provides a complete interface for the codes FEMWATER/LEWASTE, MODFLOW, MODPATH, MT3D, RT3D, and SEEP2D. Many other models will be supported in the near future, such as UTCHEM, NUFT3D, ParFlow, and ADH.

System features:
• Pre- and post-processing support for: MODFLOW, MODPATH, MT3D, RT3D, FEMWATER
• Alpha and beta support for: PARFLOW, NUFT3D, SEAM3D, ADH, UTCHEM
• Remedial alternative modeling and evaluation
• Site characterization tools
• GIS/CADD links
• SCAPS & CPT data import
• Finite difference/finite element grid generation
• Automated calibration tools
• 2D & 3D data interpolation/visualization
• Geostatistical library, including: Kriging (ordinary, universal), inverse distance weighting, natural neighbor, Clough-Tocher
• AVI video file animation
• Conceptual modeling approach

US Army Groundwater Modeling Technical Support Program: distribution, technical support, and training for the DoD Groundwater Modeling System, with up to one week of on-site technical modeling assistance for Army users. There are currently over 750 GMS users in DoD, DoE, and US EPA, with over 1,300 commercial users worldwide. The presentation will provide an overview of the GMS's technical capabilities and will present the GMS's development direction over the next two to three years. Reference: http://chl.wes.army.mil/software/gms/ Earl V. Eldris, US Army Engineer Waterways Experiment Station, CEWES-CV-H, 3909 Halls Ferry Road, Vicksburg, MS 39180; (601) 634-3378; FAX (601) 634-3453; cdrisc.@mail.wcs.amiv.mil

(50) Practical Internet-Based Applications of Geographic Information Systems (GIS) in Support of Long-term Monitoring and Remedial Program Optimization By Francis E. Slavich, PE, Radian International; Ken Hill, Radian International

The data produced from long-term and remedial system performance monitoring programs are the backbone and foundation for demonstrating adequate progress toward contaminated site cleanup and eventual close-out.
An Installation Restoration Program (IRP) stakeholder's ability to quickly and efficiently evaluate these data and gain an understanding of past, current, and future trends is greatly enhanced through the spatial analysis and display afforded by geographic information systems (GIS). To date, many GIS systems (both client-server and desktop varieties) have been implemented at various DoD facilities; however, widespread and routine use by IRP stakeholders has generally not occurred due to the high level of technical expertise and training required to maintain and operate these systems at the installation level. The purpose of this presentation is to reemphasize the value and importance of spatial data evaluation and display in support of long-term and system performance monitoring programs, and to illustrate several practical applications addressing common problems facing IRP stakeholders today. Examples and case studies to be presented include items such as:
• Plume contours and definition
• Zone of capture analysis
• Time series concentration plots
• Monitoring well elimination analysis
• System cost effectiveness and performance graphs
Moreover, we will also illustrate the exciting new opportunities for making GIS available to all IRP stakeholders through a common shared data set, in a web-enabled Internet environment, with the use of today's browser technology. Toward this end, we will emphasize the approach of assigning primary decision-making value to the products resulting from a GIS system (e.g., contaminated plume contours, time series plots for individual monitoring wells, topographic maps, facility infrastructure layers, system cost and performance curves), not the system hardware and software itself. In other words, a central server, web-based GIS application provides IRP stakeholders with easy access to the products and outputs of GIS through a simple browser interface, without any formal training in GIS hardware and software packages.
This web-based approach facilitates streamlined and effective remedial decision-making at greatly reduced cost.

Francis E. Slavich, PE, Program Manager, Radian International, P.O. Box 13000, Research Triangle Park, NC 27709; (919) 461-1443; (919) 462-1415 FAX; francis_slavich@radian.com

(51) Advanced Chemical Sensors for Monitoring of Organic Solvents in Groundwater

By Radislav A. Potyrailo and Timothy M. Sivavec, General Electric Corporate R&D

Acoustic wave devices are increasingly being studied as detectors for chemical species. These devices have the potential to achieve parts-per-billion detection limits in real time, at low cost, and in a portable configuration. With the goal of applying these sensors to environmental monitoring needs, we have developed a prototype instrument for the real-time detection of low concentrations of organic solvents in groundwater.

Radislav A. Potyrailo, Ph.D., Characterization and Environmental Technology Laboratory, General Electric Corporate R&D, Building K-1, Room 3B34, P.O. Box 8, Schenectady, NY 12301; (518) 387-7370; (518) 387-5604 Fax; potyrailo@crd.ge.com

(52) Remote Sensing Assessment Usage in Long-Term Monitoring of Phytoremediation Field Sites

By Suzette R. Burckhard and Vernon R. Schaefer, Civil and Environmental Engineering, South Dakota State University

The application of vegetation-based strategies for the cleanup and/or stabilization of contaminated soils has grown in recent years. As with any contaminated site, an LTM plan aimed at assessing the fate and transport of the contaminants is necessary. As part of this plan, the vegetation on a field site needs to be monitored for signs of stress, uptake of contaminant, and overall coverage on the site. The LTM of large field sites may require numerous measurements to fully assess the vegetation's condition and extent. One method to reduce the cost involved in sending individuals to the field for measurements is to use remote sensing.
This presentation will give a brief overview of remote sensing and its uses and applications to LTM. In particular, the various types of remotely sensed data, including data collected by satellite, high-altitude aerial, low-altitude aerial, and hand-held units, will be covered, with emphasis on the resolution of the data set produced and the spectral ranges possible with each technique. Several examples of remotely sensed data sets, including a series of AVIRIS (Airborne Visible/Infrared Imaging Spectrometer), TM (Thematic Mapper), and SPOT images, and their applications will be presented.

Suzette R. Burckhard and Vernon R. Schaefer, Civil and Environmental Engineering, South Dakota State University, Brookings, SD 57007; (605) 688-5316; FAX (605) 688-5878

(53) Long-Term Monitoring of Subsurface Barrier Integrity - Current Technology Capabilities and Limitations

By David E. Daniel, Ph.D., University of Illinois, Department of Civil and Environmental Engineering

This paper addresses issues related to LTM of the integrity of subsurface barriers, with emphasis on vertical barriers, which are frequently used to contain contaminated groundwater and vapors around old landfills and contaminated sites. Barriers are intended to impede the movement of fluids and, therefore, must be and remain relatively impermeable. Monitoring techniques are of three general types: (1) monitoring designed to verify the physical integrity of the barrier; (2) monitoring designed to verify the barrier's low hydraulic conductivity or gas permeability; and (3) monitoring designed to verify that contaminants or tracers are not being transported across the barrier at rates that exceed expectations. The first type of monitoring verifies the physical integrity of the barrier. At the time of construction, quality assurance techniques (which are often not given adequate emphasis) should be employed to verify that integrity.
Quality assurance includes verification of adequate tie-in or key-in to a low-permeability stratum, confirmation that the barrier has the desired dimensions, and verification that components (e.g., sheet-pile panels) are properly joined and sealed. Innovative techniques, such as seismic methods, have been used and are undergoing further development. The second type of monitoring involves determining the hydraulic properties of the barrier. Three methods are available for evaluating the hydraulic conductivity of vertical barriers such as soil-bentonite or deep-soil-mixed walls: (1) laboratory tests on reconstituted samples; (2) laboratory tests on "undisturbed" samples; and (3) in situ tests. Of the three methods, only in situ tests provide an opportunity for LTM, and yet several key problems must be overcome before the long-term hydraulic conductivity can be monitored accurately. There is far greater opportunity to monitor changes in hydraulic conductivity (rather than absolute hydraulic conductivity) over the long term, which can be very important in situations where the barrier may degrade over time. The third type of monitoring involves observing the transport of tracers or chemical contaminants across the barrier. In its simplest form, water or gas can be used as a tracer, e.g., by pumping on one side of a barrier and observing fluid pressures on the other side to determine whether there is a response (for a highly effective barrier, there should be little or no pressure response). Tracer liquids or gases can be injected on one side of the barrier, and samples collected on the other side can be analyzed for the presence of the tracers. Finally, gas or groundwater monitoring wells have historically provided the primary LTM tool for verifying the integrity of a barrier.
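Why changes in hydraulic conductivity matter so much can be illustrated with a back-of-envelope Darcy calculation; the wall geometry and conductivity values below are illustrative only, not taken from any site discussed here:

```python
# Back-of-envelope Darcy flux through a vertical barrier:
# q = K * (dh / L), where K is hydraulic conductivity (m/s),
# dh the head difference across the wall (m), and L the wall
# thickness (m). All values below are illustrative only.

def darcy_flux(k_m_per_s, head_diff_m, thickness_m):
    """Specific discharge (m/s) through the barrier."""
    return k_m_per_s * head_diff_m / thickness_m

SECONDS_PER_YEAR = 365.25 * 24 * 3600

# A 1 m thick soil-bentonite wall with a 2 m head difference,
# comparing an intact wall to one degraded by two orders of
# magnitude in conductivity.
for k in (1e-9, 1e-7):
    q = darcy_flux(k, 2.0, 1.0)
    print(f"K = {k:.0e} m/s -> {q * SECONDS_PER_YEAR:.3f} m/yr")
```

The two-order-of-magnitude change in K translates directly into a two-order-of-magnitude change in seepage, which is why monitoring relative changes in conductivity over time can be more informative than a single absolute measurement.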
In summary, a system of monitoring techniques, rather than reliance on a single method, is recommended for monitoring that is intended to provide the highest level of confidence available today.

David E. Daniel, Ph.D., University of Illinois, Department of Civil and Environmental Engineering, 205 North Mathews, MC-250, Urbana, IL 61801; (217) 333-1497; Fax (217) 265-0318; dedaniel@uiuc.edu

(54) Water Balance Monitoring of the Alternative Landfill Cover Demonstration (ALCD)

By Stephen F. Dwyer, Sandia National Laboratories

The ALCD is testing innovative landfill covers using currently accepted EPA cover designs as baselines. These covers are installed and instrumented in a side-by-side demonstration. Each test plot is 300 feet long and peaked in the middle, with one 150-foot half sloping at 5% toward the west and the other sloping at 5% toward the east. The eastern half of each test plot will be evaluated under ambient conditions and the western half under "stressed" conditions controlled by a rain simulation system. The covers are evaluated and compared based on construction, cost, and performance criteria. Some of the alternative designs emphasize features such as unsaturated hydraulic conductivity, increased water storage potential to allow for eventual evaporation, and increased transpiration through engineered vegetative covers. The alternative covers were designed to take advantage of local materials to allow for easier construction at substantial cost savings. The key to gaining general acceptance of any new environmental technology is obtaining regulatory acceptance. The ALCD is addressing this issue by involving the EPA and environmental divisions from the western states in the project, which is key to obtaining acceptance of the new technologies and is encouraging interstate cooperation.
The Western Governors' Association and its Committee to Develop On-Site Innovative Technologies (DOIT) have worked with Sandia to promote this interstate cooperation. An Environmental Protection Agency (EPA) study of 163 randomly selected landfills determined that current landfill technologies need improvement. Problems were discovered at 146 of these sites, ranging from elevated chemical concentrations in on-site groundwater to severe contamination of groundwater at water supply well fields, surface water contamination, ecological impacts to local flora and fauna, and forced changes in the water supply for impacted communities where federal/state drinking water contamination standards were exceeded. All areas of the country have experienced some form of water contamination due to leachate leaking from landfills. Current cover design criteria emphasize barrier layers that block infiltration of water through the cover into the waste. Saturated hydraulic conductivity is the measure chosen by the EPA to define the effectiveness of the barrier layer (i.e., the lower the hydraulic conductivity, the better the layer). This is not a practical approach in arid and semi-arid regions because saturation of cover soil layers is rarely, if ever, achieved. The ALCD is developing technology to improve upon current landfill cover systems. The project will provide alternatives to the EPA's landfill cover designs that will work more effectively and be easier and less expensive to install in arid and semi-arid climates. It is also working to improve regulatory acceptance of alternative landfill cover designs across the DOE complex.

Stephen F. Dwyer, P.E., Environmental Restoration Technologies Dept., Sandia National Laboratories, MS 0719, P.O. Box 5800, Albuquerque, NM 87185-0719; (505) 844-0595; sfdwyer@sandia.gov

(55) Long-term Monitoring of Remediation Approaches in the Vadose Zone

By Lorne Everett, Ph.D.
ARCADIS Geraghty and Miller

Remediation technologies applicable in the vadose zone vary substantially in their time frame requirements for monitoring. In the case of barrier applications, which are now recognized by EPA as a remediation technology, the monitoring requirements may extend into geologic time. With respect to barrier applications at radioactive waste disposal sites, natural materials have been selected as barrier construction components to withstand the rigors of long-term remediation requirements. Monitoring of three-phase flow in the vadose zone will be discussed. Indirect measurements of pore liquids related to vertical, horizontal, and automated applications of neutron moderation technologies will be discussed. Applicability of time domain reflectometry and frequency domain capacitance techniques to measure soil moisture will be presented. Recent adaptations to soil and gas sampling techniques to allow long-term approaches will be introduced. Direct monitoring techniques related to pore liquid investigations will be evaluated for long-term applications. Long-term monitoring requirements at passive funnel-and-gate systems and at slowly decomposing contamination sites, such as radioactive waste sites, may mean that monitoring systems must be completely decommissioned before the end of the useful life of the remediation/barrier program. Issues related to decommissioning of the LTM devices may be of more relevance than the value of some of the information gathered during the active life of the monitoring program.

Lorne Everett, Ph.D., ARCADIS Geraghty and Miller, 3700 State Street, Ste. 350, Santa Barbara, CA 93105; (805) 687-7559, ext. 236; leverett@gmgw.com

(56) Short-Term and Long-Term Vadose Zone Monitoring: Current Technologies, Development, and Applications

By Boris Faybishenko, Ph.D.
Lawrence Berkeley National Laboratory

Development of cost-effective remediation plans and post-closure monitoring of contaminated sites require an improved understanding of the inventory, distribution, and movement of contaminants in the vadose (unsaturated) and saturated zones. This information can be obtained using both short-term (expedited) and long-term characterization and monitoring of water flow and reactive chemical transport. One of the challenging problems of vadose zone characterization and monitoring is determining how episodic infiltration enhances preferential and fast water seepage and contaminant fate and transport through the vadose zone toward the underlying aquifer. These effects are enhanced by organic contaminant and nuclear waste leaks from tanks, cribs, and other surface sources at DOE sites in heterogeneous soils and sediments, like those at Hanford, Savannah River, and Oak Ridge, and in fractured rocks, like those at the INEEL. Field observations using current technologies have shown complex water seepage and mass transport behavior in a very heterogeneous, thick vadose zone on a variety of scales, leading to severe contamination of the vadose zone and groundwater. However, current methods are not able to monitor how episodic infiltration enhances rapid water seepage and contaminant transport along localized preferential pathways within the heterogeneous vadose zone. Localized and persistent preferential pathways may be associated with specific heterogeneous geologic features, such as clastic dikes or caliche layers, and may also be present around the infiltration shadows of nuclear storage tanks. Changes in the chemical composition of moving and indigenous solutes, particularly sodium concentration, redox conditions, biological transformation of organic materials, and high temperature may significantly alter hysteretic properties of water retention and unsaturated hydraulic conductivity of unsaturated-saturated soils.
These processes may, in turn, modify the exchange of water, chemicals, and biological transformations between the zones of fast flow and the rest of the medium. The development of improved characterization and monitoring methods for contaminated sites with heterogeneous soils and sediments should focus on the following problems: What are the important measurable vadose zone parameters affecting the initiation of preferential flow? How can one determine the spatial extent of soils and sediments affected by preferential flow? How can one measure the water velocity and concentration flux within the zone of preferential flow? The paper will present several examples of applications of field monitoring methods and controlled field infiltration experiments with tracers used to characterize water seepage and contaminant transport in soils and fractured rocks. The examples suggest that field studies should be supplemented by laboratory investigations of single cores and by mathematical modeling in order to provide an accurate representation of infiltration dynamics.

Boris Faybishenko, Ph.D., Lawrence Berkeley National Laboratory, Berkeley, CA 94720; (510) 486-4852; FAX (510) 486-5686; bfayb@lbl.gov

(57) The E-SMART Base-wide Demonstration at Tinker Air Force Base: A Networked Array of Environmental Sensors

By Steve Leffler, Ph.D., General Atomics, and John Mills, Tinker Air Force Base

Currently, characterization and monitoring of subsurface contaminants at most environmental sites are performed using traditional field collection and laboratory analyses. In these processes, field personnel collect samples from monitoring wells, package these samples for shipment to analytical laboratories, and receive analytical results from the laboratories, oftentimes several weeks later.
It is expected that significant savings in labor and laboratory costs may be achieved if these data are collected using an in situ, real-time monitoring system such as the Environmental Systems Management, Analysis and Reporting neTwork (E-SMART™), which is based upon networks of smart sensors. In addition, the data collected will be immediately available via the Internet for subsequent analysis and visualization, and the timeliness of data analysis will allow for rapid response should problems occur at the site. A prototype E-SMART system initially containing 23 sensors was installed and tested at the Tinker AFB Southwest Tank Area in August-October 1996 under Tinker AFB and Defense Advanced Research Projects Agency (DARPA)/Industry-funded Technology Reinvestment Project (TRP) sponsorship. Progress made by General Atomics and its partners in developing E-SMART technology has laid the groundwork for a larger-scale demonstration of the E-SMART system. In support of this goal, a large-scale demonstration is underway to evaluate the applicability of the E-SMART system for eventual widespread use at USAF Air Logistics Centers and other military bases by installing 150 E-SMART sensors at a variety of environmentally impacted sites at Tinker AFB. The proposed E-SMART installation includes the application of sensors that detect and measure contaminants in groundwater and soil gas as well as physical parameters such as barometric pressure and temperature. This demonstration will also include the field installation of sensors that detect and measure airborne contaminants from such possible locations as painting and plating facilities, engine test cells, and other industrial facilities. Remote sites will also be included in the demonstration, where innovative remote sensing and data transmission using radio frequency communications will be tested. The eight test sites at the base were chosen to provide a range of applications for the E-SMART system.
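The general pattern of such a smart-sensor network (each reading checked against an alarm limit as it arrives, rather than weeks later) can be sketched as follows; the sensor names and limits below are hypothetical and do not represent the actual E-SMART interface:

```python
# Sketch of threshold alarming for a networked sensor array.
# Sensor names and limits are hypothetical -- this is not the
# actual E-SMART interface, only the general pattern: each
# reading is checked against its limit on arrival, so an
# exceedance is flagged in real time.

LIMITS = {  # hypothetical alarm limits
    "tce_ug_per_l": 5.0,          # groundwater TCE
    "benzene_soil_gas_ppmv": 1.0,
}

def check_readings(readings):
    """Return (sensor, value, limit) tuples for exceedances."""
    alarms = []
    for sensor, param, value in readings:
        limit = LIMITS.get(param)
        if limit is not None and value > limit:
            alarms.append((sensor, value, limit))
    return alarms

batch = [
    ("MW-03", "tce_ug_per_l", 3.2),
    ("MW-07", "tce_ug_per_l", 11.5),   # exceeds its limit
    ("SG-12", "benzene_soil_gas_ppmv", 0.4),
]
for sensor, value, limit in check_readings(batch):
    print(f"ALARM {sensor}: {value} > {limit}")
```

In a deployed system the batch would arrive continuously over the network and alarms would be pushed to site managers, but the check itself is this simple.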
Steve Leffler, Ph.D., General Atomics, P.O. Box 85608, San Diego, CA 92186; (619) 455-2509; leffler@ga.com

John Mills, Tinker Air Force Base, OC-ALC/EMR, Oklahoma City, OK 73145; (405) 734-3058; john.mills@tinker.af.mil

(58) DIRECT PUSH TECHNOLOGIES: Recent Demonstrations

By Bruce J. Nielsen, US Air Force Research Laboratory

Direct push technologies such as the Site Characterization and Analysis Penetrometer System (SCAPS) have proven to be efficient and effective site characterization tools. AFRL/MLQE recently completed four direct push technology demonstrations at customer sites. To answer the need for less expensive and more easily deployed site characterization tools, a Geoprobe® has been enhanced. It has been integrated with a laser-induced fluorescence spectrometer similar to the commercial Rapid Optical Screening Tool (ROST) and with other sensors as well. The system has been demonstrated at a number of fuel- and solvent-contaminated sites. To demonstrate deployability, the system was air-transported to Misawa Air Base in northern Japan. This cooperative effort with the Air Force Center for Environmental Excellence (AFCEE) characterized fuel contamination at two fuel tank farms and an F-16 crash site. Cone Penetrometer Technology (CPT) meets refusal in many geologies before being advanced to the desired depth, so enhancements were needed. A Sonic CPT system is under development with current funding from DoE. This effort combines the speed and high penetration capabilities of sonic drilling with the economy and continuous data logging of CPT to provide superior site characterization technology. Kelly AFB has an off-base chlorinated solvent groundwater plume requiring characterization and needed a minimally intrusive technology that would not disrupt the off-base civilian housing area. The Kelly AFB demonstration, with AFCEE oversight, proved Sonic CPT's ability to penetrate very difficult geologies.
This critical enhancement allows access through difficult strata to deeper DNAPL source zones. An important application of CPT was the installation of low-cost monitoring points. Hanscom AFB is assessing CPT-installed monitoring points near existing drilled monitoring wells for groundwater monitoring. The assessment involved a rigorous sampling effort to establish a database of analytical results comparing samples from each well type. The analytical results from the conventional and CPT-installed wells were compared and demonstrated good correlation. The goal is to initiate validation of CPT-installed monitoring points for users and regulators. Successful integration of real-time DNAPL sensing instrumentation with horizontal directional drilling technology will allow characterization of DNAPL-contaminated strata without introducing a vertical conduit to underlying formations. The technology could investigate sites where the source of contamination is located beneath a building, road, runway, pipeline, lagoon, or landfill, which may be inaccessible using vertical characterization methods. The approach utilizes a ROST-like system and geophysical measurements to identify the presence of DNAPLs in a horizontal borehole after a pilot hole has been completed. For the demonstration project at Kirtland AFB, the subsurface beneath an active petroleum service station was characterized, and slotted pipe was then emplaced in contaminated zones for SVE/bioventing. Continued development of direct push technologies will provide enhancements for expedited site characterization, emplacement of monitoring wells, and direct implementation of remediation systems.

Bruce J. Nielsen, Air Force Research Laboratory, Airbase and Environmental Technology Division (AFRL/MLQE), 139 Barnes Dr., Suite 2, Tyndall AFB, FL 32403-5323; (850) 283-6227, DSN 523-6227

(59) Results From a One-Year Field Trial of an Automated, Down-Hole Radiation Monitoring System

By Garry W.
Roman, McDermott Technology, Inc.

Monitoring of radionuclides in the vadose zone at DOE waste sites is necessary to determine whether there may be a potential impact to human health or the environment based on the characteristics and movement of the radionuclides through the soil. Continuous monitoring is highly desirable during active operations at the site and is likely to be necessary long after site cleanup has been completed. To assure that radionuclide movement through the soil is detected early, frequent monitoring of many locations will be required. Conventional manual monitoring and laboratory analysis systems are very labor intensive, can be very costly and time-consuming, and have the potential for exposing workers to unnecessary risks. Under contract with the Department of Energy's Federal Energy Technology Center in Morgantown, West Virginia, McDermott Technology, Inc. has developed a Radiation Monitoring System that can operate unattended. This system provides continuous monitoring and automatic alarming of events outside normal parameters. The system is based on gamma detection and is capable of monitoring to depths of fifty meters below ground level without the need to drill wells. The radiation probe portion of the system consists of a sealed assembly containing a butt-coupled scintillator/photomultiplier tube (PMT) and a multi-channel analyzer (MCA). The probe is lowered into PVC casings that have been pushed into the soil using cone penetrometer technology (CPT). At the surface, solar-powered remote stations at each measurement location incorporate the system power supply and a cell phone modem for communication to an off-site host computer, which could be located hundreds or thousands of miles away. A large number of remote stations can each operate independently and, without human intervention, send their daily or weekly results to the host computer for analysis, trending, and alarming.
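The trending-and-alarming step on the host computer could be as simple as a control chart on each probe's daily counts; a minimal sketch, using hypothetical count data rather than any Fernald results, is:

```python
# Sketch of a control-chart style "trending and alarming" check
# for daily gamma counts from one probe: flag any reading above
# the baseline mean + 3 standard deviations. All counts below
# are hypothetical illustration only.
import statistics

def alarm_threshold(baseline_counts):
    """Upper control limit: baseline mean + 3 sigma."""
    mean = statistics.mean(baseline_counts)
    sigma = statistics.stdev(baseline_counts)
    return mean + 3.0 * sigma

baseline = [1020, 980, 1005, 995, 1010, 990, 1000]  # counts/day
threshold = alarm_threshold(baseline)

today = 1180  # hypothetical new daily result
if today > threshold:
    print(f"ALARM: {today} counts exceeds limit of {threshold:.0f}")
```

The baseline would be re-estimated periodically so that seasonal effects (one motivation for the four-season field trial) are folded into the control limits rather than triggering false alarms.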
If required, the in-ground probes are easily serviceable, since they can be readily retrieved from the PVC casing for repair or replacement. The system has been configured using commercially available components assembled into a reliable, low-cost, multi-point system for real-time, long-term, unattended monitoring of active or closed waste sites. For the past year, a five-unit system has been undergoing a field trial at the Fernald, Ohio facility. The purpose of the field test was to subject the system to real site conditions over four seasons of operation. This presentation will discuss the system design, operating characteristics, and the results from the one-year field trial.

Garry W. Roman, McDermott Technology, Inc., 1562 Beeson St., Alliance, Ohio 44601; (330) 829-7484; (330) 829-7832 (FAX); garry.w.roman@mcdermott.com