Instructions for the Preparation of Quality Assurance Project Plans for EPA Brownfields Projects in the Southeast

November 2017

U.S. Environmental Protection Agency
Region 4
Resource Conservation and Recovery Act Cleanup and Brownfields Branch
Brownfields Section
61 Forsyth St. S.W.
Atlanta, GA 30303

FOREWORD

These Instructions were prepared by the Environmental Protection Agency (EPA) Region 4 Brownfields Program to provide Brownfields Cooperative Agreement Recipients (CARs), their consultants, and EPA contractors with the instructions needed to develop a Generic Quality Assurance Project Plan (QAPP) and Site-Specific QAPP Addendum for environmental projects involving the collection and use of environmental data. This version of the Instructions replaces any previous versions.

When environmental data are collected using federal funds, the CAR shall comply with 2 Code of Federal Regulations (CFR) 1500.11 requirements to develop and implement quality assurance practices sufficient to produce data adequate to meet project objectives and to minimize data loss. State law may impose additional quality assurance requirements. Once the Generic QAPP is approved by EPA, a Site-Specific QAPP Addendum can be prepared and approved by EPA Region 4 before sampling begins. This document should not be considered a substitute for EPA policy, regulations, and guidance.

Table of Contents

1.0 Introduction
  1.1 What is a QAPP?
  1.2 What is the purpose of this document?
  1.3 General Overview
  1.4 QAPP Contents and Elements
2.0 QAPP - Section A. Project Management
2.1 QAPP - Section B. Measurement/Data Acquisition
2.2 QAPP - Section C. Assessment and Oversight
2.3 QAPP - Section D. Data Evaluation
3.0 Review, Approval and Distribution
4.0 QAPP Modifications, Revisions and Updates
5.0 Generic QAPP Updates, Revisions and Resubmittals
6.0 References
7.0 EPA Region 4 State Web Links

Figures
Figure 1 - Generic QAPP and Site-Specific Addendum Process for Multiple Sites
Figure 2 - Typical Brownfields Project Organization Layout
Figure 3 - Hypothetical Corrective Action Flow Chart
Figure 4 - QAPP Review, Approval and Distribution

Tables
Table 1 - Components and Subsections of a QAPP
Table 2 - Examples of Data Validation Activities
Table 3 - Examples of Considerations for Usability Assessment

Appendices
Appendix A - USEPA Region 4 Brownfields QAPP Review Checklist
Appendix B - USEPA Region 4 Brownfields Generic QAPP Template
Appendix C - USEPA Region 4 Brownfields Site-Specific QAPP Addendum Template

1.0 Introduction

The Environmental Protection Agency (EPA) requires that a Quality Assurance Project Plan (QAPP) be prepared to support all federally funded environmental projects involving the collection of environmental data. For EPA's Brownfields Program, this requirement means that whenever environmental samples are being collected as part of a site assessment or cleanup project, a QAPP must be prepared and approved by EPA before samples are collected. The EPA Region 4 Brownfields Program is providing these Instructions to give anyone planning to collect samples with EPA Brownfields funding in Region 4 an overview of the information that should be in a Generic QAPP and Site-Specific QAPP Addendum. These Instructions briefly explain each of the elements that are required in the QAPP. The Checklist and QAPP example template files contained in Appendices A through C help ensure the plans contain all the required information.
This document is consistent with the following documents: "EPA Requirements for Quality Assurance Project Plans" (EPA QA/R-5), EPA/240/B-01/003 (March 2001), Reissue Notice May 2006; and "EPA Guidance for Quality Assurance Project Plans" (EPA QA/G-5), EPA/240/R-02/009 (December 2002). EPA has additional tools that can be found online; for example, the Quality Assurance Project Plan Development Tool contains information designed to assist in developing a QAPP that meets EPA requirements for projects that involve surface or groundwater monitoring and/or the collection and analysis of water samples. The online structure of the tool is intended to guide one through the thought process of planning a project, as well as to provide a framework for documenting the plan. The tool is divided into modules and can be found at https://www.epa.gov/quality/quality-assurance-project-plan-development-tool. Additional reference documents are listed in Section 6, References.

1.1 What is a QAPP?

A QAPP is a formal document describing, in detail, the necessary quality assurance (QA), quality control (QC), and other technical activities needed to ensure that the results of the work performed will satisfy the stated performance criteria. Furthermore, a QAPP outlines how and why a project involving the collection of analytical data will be conducted, and it assures the quality of the data for making environmental decisions. When developing a QAPP, consideration is to be given to what potential contamination will be encountered, to what extent, and what site decisions need to be supported by the data. When planning environmental assessments, consideration is also to be given to potential cleanup alternatives that could be compatible with redevelopment of the site. The QAPP also describes the site or project background, the environmental problem (or the Recognized Environmental Conditions (RECs) as defined in ASTM E1527-13), the objectives for the project, the tasks to be performed, and the sampling design.

The use of a Generic QAPP and Site-Specific QAPP Addendum will eliminate duplication of tasks that are consistently conducted for all site assessment projects during the grant project period. Once EPA has approved the Generic QAPP, Site-Specific QAPP Addenda can be prepared and approved for each site assessment.

1.2 What is the purpose of this document?

The purpose of these Instructions is to reduce the time required for QAPP development, review, and approval. By providing a Checklist (see Appendix A), EPA anticipates the review and approval process will be streamlined. Two templates, one for a Generic QAPP and one for a Site-Specific QAPP, are included in Appendices B and C to aid in the development process. The November 2017 Instructions for the Preparation of Quality Assurance Project Plans for EPA Brownfields Projects in the Southeast replace the previous 2006 version.

The purpose of a QAPP is to communicate the specifications for implementation of the project design and to help ensure that the data quality objectives are achieved for the project. QA planning and implementation increase efficiency and provide for early detection of problems, either in the field or in the laboratory. This can help ensure that federal funds are not wasted on sampling that is not necessary and that the data collected can be used to make meaningful site decisions.

1.3 General Overview

EPA Region 4 requires that a Generic QAPP be developed by all community-wide assessment CARs.
The Generic QAPP contains the information and processes that will be the same for each site sampling event. These activities can include field sampling methods and field measurement procedures, laboratory analysis and methodology, and the comparison levels that will be used for results, such as EPA screening levels or State criteria. If a CAR has only one site to assess, for example a CAR with a Site-Specific Assessment Grant or Cleanup Grant, only one QAPP need be prepared for the project.

The Checklist contained in Appendix A should be completed and submitted with the Generic QAPP to the EPA Region 4 Project Officer. Provide the page number and paragraph where each of the elements can be found in the second column of the Checklist. This process helps the EPA Region 4 Project Officer and Designated Approving Official (DAO) conduct the QAPP review. It also helps the QAPP developer/writer ensure accuracy and that the necessary content is in the Generic QAPP. The Generic QAPP should be organized similarly to the contents of these Instructions and the Checklist (e.g., A1. Title and Approval Page, A2. Table of Contents, etc.). The EPA-approved Generic QAPP has a 5-year shelf life, and it must be reviewed and updated annually.

The Site-Specific QAPP Addenda can be developed after EPA approves the Generic QAPP, and after the Project Officer has signed a Site Eligibility Form indicating the property is indeed a brownfield property. The same Checklist in Appendix A can be used for the Site-Specific QAPP. The Site-Specific QAPP Addendum is the project sampling plan prepared for each site assessment. When a site is identified and approved by EPA for assessment, a Site-Specific QAPP is developed as an addendum to the approved Generic QAPP. When additional sampling is necessary after the first sampling event, another Site-Specific QAPP or an update can be prepared for the additional sampling. The initial Site-Specific QAPP Addendum for a site will be 1.A, and subsequent Site-Specific QAPP Addenda for the same site would continue the sequence with 1.B, 1.C, etc. As additional sites are proposed for sampling, the second site would start with 2.A, and so forth (see Figure 1).

Figure 1 - Generic QAPP and Site-Specific Addendum Process for Multiple Sites

A Checklist should be completed and submitted with the Site-Specific QAPP to the EPA Region 4 Project Officer. Complete the Checklist by providing the page number and paragraph where each of the elements can be found in the Plan. This process helps the EPA Region 4 DAO review and approve the Site-Specific QAPP. It also helps the QAPP developer ensure the necessary content is in the QAPP. The Site-Specific QAPP should be organized similarly to the contents of these Instructions and the Checklist (e.g., A1. Title and Approval Page, A2. Table of Contents, etc.). EPA has up to 30 calendar days to review a QAPP. In time-sensitive situations, contact the Project Officer/Designated Approving Official as soon as possible to inform him/her of the expedited need.
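To illustrate how the Checklist cross-references described above might be tracked while a plan is being assembled, the following is a minimal Python sketch. It is not an EPA tool; the element list, the locations dictionary, and the missing_elements helper are illustrative assumptions.

```python
# Minimal sketch (hypothetical, not an EPA tool): recording where each Checklist
# element is addressed in a QAPP so the page/paragraph column can be completed
# and gaps spotted before submittal. Element IDs follow the Appendix A checklist.

REQUIRED_ELEMENTS = [
    "A1", "A2", "A3", "A4", "A5", "A6", "A7", "A8", "A9",
    "B1", "B2", "B3", "B4", "B5", "B6", "B7", "B8", "B9", "B10",
    "C1", "C2", "D1", "D2", "D3",
]

def missing_elements(locations: dict) -> list:
    """Return Checklist elements that have no page/paragraph reference recorded."""
    return [element for element in REQUIRED_ELEMENTS if element not in locations]

# Hypothetical usage: the preparer records a (page, paragraph) tuple per element.
locations = {"A1": (1, 1), "A2": (2, 1), "A3": (3, 2)}
print(missing_elements(locations))  # elements still to be cross-referenced
```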
1.4 QAPP Contents and Elements

The Generic QAPP and Site-Specific QAPP Addendum have four parts: A) Project Management - how you will organize and run the project; B) Measurement and Data Acquisition - how you will collect and report data; C) Assessment and Oversight - how you will check that all activities are completed correctly; and D) Data Evaluation - how you will review and interpret the data. Table 1 shows the four components and a breakdown of all the QAPP elements that need to be covered in a Generic QAPP and Site-Specific QAPP Addendum.

Table 1 - Components and Subsections of a QAPP

PROJECT MANAGEMENT
A1, 2, 3  Title and Approval Page / Table of Contents / Distribution List - In Generic and Site-Specific
A4  Project/Task Organization - In Generic
A5  Problem Definition/Background - In Site-Specific
A6  Project/Task Description - In Site-Specific
A7  Quality Objectives and Criteria for Measurement Data - In Generic
A8  Special Training/Certification - In Generic
A9  Documentation and Records - In Generic

MEASUREMENT/DATA ACQUISITION
B1  Sampling Design and Site Figures - In Site-Specific
B2  Sampling and Analytical Procedures - In Site-Specific
B3  Sample Handling and Custody - In Generic
B4  Analytical Methods and Requirements - In Generic
B5  Field Quality Control Requirements - In Generic
B6  Laboratory Quality Control Requirements - In Generic
B7  Field Equipment Calibration and Corrective Action - In Generic
B8  Laboratory Equipment Calibration and Corrective Action - In Generic
B9  Analytical Sensitivity and Project Criteria - In Generic
B10 Data Management and Documentation - In Generic

ASSESSMENT/OVERSIGHT
C1  Assessments and Corrective Actions - In Generic
C2  Project Reports - In Generic

DATA EVALUATION
D1  Field Data Evaluation - In Generic
D2  Laboratory Data Evaluation - In Generic
D3  Evaluating Data in Terms of User Needs - In Generic

2.0 QAPP - Section A. Project Management

This section of the QAPP answers who will be involved with the project; what the environmental problem is, including the background and history of the problem; how the data will be used; and what decisions will be made with the data. The information that goes into the elements of this section includes:

A1. Title and Approval Page: (In Generic and Site-Specific)

A Title and Approval Page is required for every Generic QAPP and every Site-Specific QAPP Addendum. Essential items for the title page include the title of the project, the Brownfields Cooperative Agreement number, the date prepared, the CAR's full name, the name of the person or organization that prepared the QAPP, and the effective date of the QAPP and revision number. The approval page is required to include the names, titles, and dated signatures of all approving officials. These include your organization's project manager, your organization's QA/QC manager/officer, the EPA Project Officer, the EPA DAO, and the Brownfields state coordinators/managers, if applicable. Add additional approving officials as appropriate to your organization or circumstance. In many cases, the EPA Project Officer and DAO will be the same person. If the Project Officer and DAO are the same individual, one signature line is sufficient if both titles are specified for the EPA Project Officer/DAO. Your Project Officer will be able to let you know the situation for your project.
EPA's definition of a QA/QC manager/officer is the individual designated as the principal manager within the organization having management oversight and responsibilities for planning, documenting, coordinating, and assessing the effectiveness of the quality system for the organization.

A2. Table of Contents and Page Header: (In Generic and Site-Specific)

The Table of Contents lists the sections, figures, photographs, maps, tables, and appendices contained in the QAPP with corresponding page numbers.

A3. Distribution List: (In Generic and Site-Specific)

The purpose of the distribution list is to ensure that all key project personnel, people mentioned in the document, and/or those with an interest in the project get a copy of the approved QAPP, including subsequent QAPP revisions, addenda, and amendments. If it is easier, create a bulleted line item or a table entry for each recipient that includes the name of the recipient, their title, organization, and contact information including telephone number, address, and email address. Include a sentence in this section that indicates all the personnel will receive and follow applicable sections of the QAPP and subsequent revisions. The key personnel may include: EPA Project Officer/DAO, contractors, laboratory director, field team leader, QA/QC officer, data reviewers, subcontractors, and other major individuals mentioned in the document, including those that need to review and approve the Generic QAPP and Site-Specific Addendum.

A4. Project/Task Organization: (In Generic)

List key project personnel (managers, laboratory personnel, QA personnel) and describe their roles and responsibilities for the project; identify who is the lead manager for the entire project; and identify who is in charge of each activity (sampling, laboratory analysis, data reporting, etc.). Describe critical instructions that will need to be communicated and the person responsible (e.g., who will initiate soil sampling?). Attach an organizational chart (see Figure 2) and indicate lines of communication and authority for all of the above. The organizational chart shows the group hierarchy and reporting structure within the organization. Everyone involved in this project needs to know who reports to whom.

Figure 2 - Typical Brownfields Project Organization Layout (*QA/QC reports to the highest official in their office, not the project manager)

The chart should indicate a separation of responsibilities wherever combining duties could create real or perceived conflicts of interest. For example, the person who is responsible for checking the quality of the data should be separate from those who generate the data. Therefore, the QA/QC officer should report to the highest official in their office and not to the project manager. In other words, the QA/QC officer should be independent, or must have an "arm's length" distance from other groups, to ensure that they can make objective determinations on quality assurance issues.

Use the following language when describing the EPA Project Officer: The EPA Project Officer has the responsibility to oversee and monitor the grant. As part of that responsibility, he/she must ensure the activities conducted by the CAR are consistent with those described in the work plan. When the Project Officer and the DAO are the same person, this person need only be listed once with dual titles.
Use the following language when describing the EPA Region 4 Brownfields Designated Approving Official (DAO): The Brownfields Region 4 Designated Approving Official (DAO) provides technical assistance to some Region 4 Project Officers. The DAO's role is to provide technical reviews of the QAPPs.

The state environmental regulatory agency or its delegate has the lead regulatory role in any Brownfields site assessment or cleanup. The site assessment and cleanup of a Brownfields property must be in compliance with state procedures, rules, and regulations. Be sure to have early involvement by the state.

A5. Problem Definition/Background: (In Site-Specific)

The purpose of this section is to describe the reason for the data collection activity. The environmental problem can be the RECs as determined from a current, ASTM E1527-13-compliant Phase I Environmental Site Assessment (ESA). Appropriate items to include are a description of the environmental problem to be studied; background information from a historic, scientific, and/or regulatory perspective; a summary of the known information/data, including environmental parameters of concern and the magnitude of contamination; and a clearly stated definition of the objectives of the project. Describe how the data will be used (exploratory, delineation of contamination, compliance with regulatory criteria, etc.) and who will use the data. Identify what decisions will be made with the data. Cite any regulatory standards or criteria that data will be compared against. Identify information/data that are needed, and questions that must be answered during the study.

Include (or reference) historic maps, diagrams, and summary data that comprise secondary data. List the types of secondary data (data generated for another purpose) that will be used for your project, such as historical data or studies, compliance data, information from public databases, photographs, Sanborn Fire Insurance Maps, and literature files. Describe the intended use of the secondary data and discuss any limitations on the use of the data. Minimally, discuss whether there are any QC data associated with the secondary data to characterize its quality. Previously collected data should be used in planning subsequent sampling events. Sources of previously collected data that may be used should be identified and discussed, along with available sampling maps and results.

A6. Project/Task Description: (In Site-Specific)

This section summarizes the tasks that will be performed, the data that will be collected, and the reports that will be generated. Provide maps and/or tables to identify geographic locations of field tasks and provide a timeline, either in a graphical or tabular format, for scheduled activities and deliverables.

A7. Quality Objectives and Criteria for Measurement Data (Site-Specific)

This section states the project objectives and limits, both qualitatively and quantitatively. It also states and characterizes measurement quality objectives relative to applicable action levels or criteria. The data quality objectives (DQO) process will be used to establish performance or acceptance criteria, and it helps set the minimum quality/quantity of data needed to address the previously stated problem in A6. The DQO process entails seven steps:

Step 1. State the problem(s)
Step 2. Identify the goals of the study
Step 3. Identify information inputs
Step 4. Define the boundaries of the study
Step 5. Develop the analytic approach
Step 6. Specify performance or acceptance criteria
Step 7. Develop the plan for obtaining data
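To make the later DQO steps concrete, the short Python sketch below expresses one possible decision rule of the kind the DQO process produces: compare a measured result to a project action level. The analyte, the example action level, and the function name are illustrative assumptions, not values or procedures from EPA guidance; real action levels are project- and state-specific.

```python
# Minimal sketch (hypothetical): a DQO-style decision rule encoded as code.
# The action level below is an example value only; project criteria are set in the QAPP.
ACTION_LEVELS_MG_KG = {"arsenic": 0.68}  # example soil screening value for illustration

def exceeds_action_level(analyte: str, result_mg_kg: float) -> bool:
    """Decision rule: flag a sample result at or above its project action level."""
    return result_mg_kg >= ACTION_LEVELS_MG_KG[analyte]

print(exceeds_action_level("arsenic", 1.2))   # True  -> triggers a cleanup/assessment decision
print(exceeds_action_level("arsenic", 0.40))  # False -> below the example criterion
```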
For a detailed explanation of the DQO process, additional information can be found in the following document: "Guidance on Systematic Planning Using the Data Quality Objectives Process" (QA/G-4), February 2006. Website: https://www.epa.gov/sites/production/files/2015-06/documents/g4-final.pdf

A8. Special Training/Certification: (In Generic)

List training of personnel, including any special or non-routine training or certifications needed by personnel to conduct project activities. For each, include: the project activity, the specialized training course title (or description), from whom the training was provided, the date of training, and the expiration date of training credentials. Also, describe how and by whom this training will be provided and documented, and state where the records will be maintained.

A9. Documents and Records: (In Generic)

The purpose of this element is to describe the documents and records that will be generated for the project and how they will be retained. For some projects, where results and conclusions may be challenged, raw data records and documents should be retained for a specified length of time. List all the records and documents that will be created during the project and describe which ones will be kept, how long they will be kept, and where they will be kept. Itemize the contents of the laboratory report package. It should include: field sample results; QC sample results (blanks, duplicates, etc.); a description of the data qualifiers that the laboratory applies to sample results; a narrative describing any issues or problems and their resolution during analysis; chain-of-custody records; raw data; and copies of logbook information. Include additional information as appropriate. A table may be used to list the documents and records. Records maintenance requirements are described in the Cooperative Agreement Programmatic Terms and Conditions.

2.1 QAPP - Section B. Measurement/Data Acquisition

The elements in this group answer the following questions: What are the sampling design and the rationale behind it? What sample collection methods will be used, and what quality control will be used to assure that representative samples are collected? What measurement procedures, field techniques, and laboratories will be used, and what quality control will be used to assure accurate, precise, and sensitive data are collected? How will sample data be managed?

Field measurement, sampling procedures, quality system procedures, and guidance documents that can be used as a reference for this section can be found in the EPA Region 4 Science and Ecosystem Support Division's (SESD) "Field Branches Quality System and Technical Procedures." Website: http://www.epa.gov/region4/sesd/fbqstp/index.html. These are also called Standard Operating Procedures (SOPs) and are often referenced in Sections B2 through B9 of the QAPP. The EPA R4 Brownfields Program requires the use of SOPs because they present procedures that have been reviewed and approved by EPA, the State environmental agency, or a qualified environmental professional and ensure the reproducibility and quality of data. Any SOPs that are referenced or used must comply with applicable state or federal requirements for the activity to be performed.
Referencing SOPs, QC criteria, state standards/criteria, or other documents in the Generic QAPP and/or Site-Specific QAPP Addendum is acceptable, but the portion of the document that is referenced must be included within the QAPP or as an appendix to the QAPP. SOP references must include the organization that wrote the SOP, the version number, and the date. When an SOP provides an option, the option must be specified in the Site-Specific QAPP.

B1. Sampling Design and Site Figures: (In Site-Specific)

This element should describe how you plan to collect representative samples. For guidance on selecting the appropriate design refer to: "Guidance for Choosing a Sampling Design for Environmental Data Collection" (QA/G-5S), December 2002. Website: https://www.epa.gov/sites/production/files/2015-06/documents/g5s-final.pdf

Explain why the sampling locations, environmental parameters, and matrices were chosen. Describe the sampling design for the project, including: sampling locations and directions to them, frequency of sampling at each location, matrices to be sampled, environmental parameters of interest in each matrix, and any design assumptions (e.g., a storm event defined as "X" inches of rain after "Y" number of dry days). Include maps that detail sample locations. It is recommended to create a table that includes:
- sample matrix
- environmental parameters
- sample collection method
- analytical method reference
- number of field samples
- type and number of field QC samples for each matrix and parameter

Some things to consider in developing your sampling design: What are the standards or action levels against which the data will be compared (e.g., do you need to determine whether water quality criteria are exceeded)? Do you care about average contamination levels, hot spots, or the proportion of the site contaminated? Will composite samples or grab samples be collected? Are you looking for trends over time? Are you using a statistical or judgmental sampling design? Will there be a reference site? Are you collecting background samples?

Judgmental sampling is a non-statistical approach for selecting sampling locations. Non-statistical approaches are useful in characterizing a relatively small population and in finding average values over time. Although there is a large degree of bias associated with judgmental sampling, judgmental samples can provide useful information when sample locations are chosen based on prior history, visual assessment, and/or technical judgment.

Statistical sampling/grid sampling is a probability-based approach to selecting sample locations. It is useful in locating areas of contamination at a study site. The approach provides uniform coverage of the area and is the best design for locating "hot spots" when limited data are available. Generally, a larger number of samples is collected and the data are more representative of the sampling area than when a judgmental sampling approach is used. However, this approach may be more expensive than judgmental sampling, and it may be more conducive to field screening approaches.
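As a simple illustration of the grid-sampling idea, the Python sketch below lays out a uniform grid of candidate sample locations over a rectangular parcel. The spacing, coordinate origin, and function name are illustrative assumptions, not an EPA or SESD procedure; actual grid design (spacing, orientation, random start points) should follow QA/G-5S and the project DQOs.

```python
# Minimal sketch (illustrative only): generating a uniform grid of candidate
# sample locations over a rectangular site, one simple form of statistical/grid
# sampling. Coordinates are local site coordinates in feet; spacing is arbitrary.

def grid_locations(width_ft: float, length_ft: float, spacing_ft: float):
    """Yield (x, y) node coordinates on a regular grid covering the site."""
    x = spacing_ft / 2  # offset the first node half a cell from the site boundary
    while x < width_ft:
        y = spacing_ft / 2
        while y < length_ft:
            yield (x, y)
            y += spacing_ft
        x += spacing_ft

# Hypothetical usage: a 200 ft x 300 ft parcel gridded at 50 ft spacing.
points = list(grid_locations(200, 300, 50))
print(len(points), points[:3])  # 24 candidate nodes; first few shown
```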
B2. Sampling and Analytical Procedures: (In Site-Specific)

Describe in detail each step of the sampling procedure, the equipment, materials, supplies, sample preservation techniques, sample holding times, decontamination procedures, disposal of decontamination by-products, sample containers and volumes, quality control acceptance limits, and corrective actions that will be used. Describe how problems (lost samples, broken equipment, inaccessible sampling locations, etc.) will be resolved and documented. EPA SOPs are commonly used for this section. If multiple SOPs are referenced, include a table listing all field sampling SOPs that will be used. Provide another table that includes: parameters, sample container, sample volume, preservation method, and holding times for each parameter and matrix. You must describe any modifications to the SOPs that are necessary for your project. Also, if there are any method or equipment options within the SOP, indicate which ones will be used. Indicate how these modifications and option choices will be relayed to the samplers. Describe how Investigation-Derived Waste (IDW) will be handled and managed.

B3. Sample Handling and Custody: (In Generic)

This element describes how you will maintain sample integrity (how samples will not get corrupted or mixed up). Indicate how samples will be handled in the field, during transport, and once received and held in the laboratory, and identify the responsible persons. Specify chain-of-custody (COC) procedures that will be used to ensure samples do not get lost, mixed up, or compromised by tampering. Provide examples of a sample label, COC forms, and other documentation. Describe the sample numbering/identification system for field samples and the laboratory.

B4. Analytical Methods and Requirements: (In Generic)

This element identifies the analytical methods that will be used in the field and in the laboratory. These methods need to be sensitive enough to characterize the environmental conditions. Identify the extraction, digestion, and analytical methodologies (provide the actual method numbers) to be followed. This information is often in the laboratory SOPs or Laboratory Quality Assurance Manual (QAM). Describe sample preparation and analytical procedures for field techniques and laboratory methods. Detail any project-specific modifications to analytical methods and SOPs. List laboratory quantitation limits (reporting limits) to ensure project sensitivity requirements will be met. Describe how problems (lost samples, quantitation limits, holding time exceedances, etc.) will be resolved and documented.

B5. & B6. Field and Lab Quality Control Requirements: (In Generic)

This element should list all the QC checks you are going to perform to characterize the quality of the data; as above, these procedures are often in SOPs.

Quality Control - Field: Provide a table listing the QC samples for each sampling matrix (e.g., water, soil) and environmental parameter (pH, phosphorus, etc.). The table should include: the type of QC sample (field duplicates, split samples, performance evaluation samples, and trip, equipment, and cooler temperature blanks), frequency, and acceptance criteria (controls and formulas for calculating QC data).

Quality Control - Laboratory: Provide a table listing the QC samples for each analytical method, for each matrix, and for each measurement parameter. The table should include:
- types of QC samples (lab duplicates, matrix spikes, method blanks, etc.)
- frequency of QC samples
- acceptance criteria (control limits)
- corrective actions that will be taken when acceptance criteria are exceeded
- procedures and formulas for calculating QC data

Data should also be considered in terms of accuracy, precision, and completeness. Accuracy is assessed through quality control samples and is expressed as the percent recovery of a known concentration of an analyte. Precision is determined through the use of field duplicates, matrix spikes, and duplicate quality control samples. The relative percent difference between the two results is an indication of the precision of the analysis performed. Completeness is a measure of the amount of valid data obtained compared to what was expected to be obtained under normal conditions. Completeness is expressed as a percentage of valid data obtained from the measurement system. For data to be valid, they must meet acceptability criteria for accuracy, precision, and any other criteria required by the prescribed analytical method.
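The short Python sketch below shows the common formulas behind these three data quality indicators. The function names and the acceptance limits mentioned in the comments are illustrative assumptions; actual limits come from the analytical method and the project QAPP.

```python
# Minimal sketch (illustrative): common data quality indicator calculations.
# Acceptance limits vary by method and project and are specified in the QAPP.

def percent_recovery(measured: float, spiked_true_value: float) -> float:
    """Accuracy: recovery of a known (spiked) concentration, in percent."""
    return 100.0 * measured / spiked_true_value

def relative_percent_difference(result_1: float, result_2: float) -> float:
    """Precision: RPD between a sample and its field or laboratory duplicate."""
    return 100.0 * abs(result_1 - result_2) / ((result_1 + result_2) / 2.0)

def completeness(valid_results: int, planned_results: int) -> float:
    """Completeness: percentage of valid data obtained versus data planned."""
    return 100.0 * valid_results / planned_results

# Hypothetical usage with made-up numbers:
print(percent_recovery(9.2, 10.0))            # 92.0  (e.g., within an assumed 80-120% limit)
print(relative_percent_difference(4.8, 5.4))  # ~11.8 (e.g., within an assumed 30% RPD limit)
print(completeness(57, 60))                   # 95.0 percent complete
```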
When reviewing QC sample results, consider the following: What will you do if contaminants are found in a "blank" sample? What will you do with sample results that do not compare with previously collected data and appear to be incorrect? What will you do with sample results if the instrument was not calibrated correctly? What will you do when duplicate sample results are not comparable? What will you consider the range of comparable results to be? What will you do with sample results when a spiked compound is not recovered, or when the results of a performance evaluation sample are inaccurate?

The answers to these questions should be addressed in the discussion of how your QA/QC officer or laboratory will review and qualify your data. For example, if the calibration check was not acceptable, will you estimate the sample results associated with the calibration? Will you apply a flag to the data, such as a J flag? Remember, QC data are only worth generating if they are used to review and evaluate your data.

B7. & B8. Field Equipment Calibration and Corrective Action & Laboratory Equipment Calibration and Corrective Action: (In Generic)

The information in this element should describe how you will keep instruments and equipment properly operational during the project. Identify equipment and instrumentation (both field and laboratory) requiring calibration and periodic maintenance, inspection, and testing. Describe how often each instrument needs to be calibrated and maintained, and who will perform and document these tasks. Describe the calibration and testing procedures and acceptance criteria (control limits) for operation. List the spare parts needed to be kept on hand to keep the instrument operational. Describe how problems (e.g., an instrument does not hold calibration) will be resolved and documented.

B9. Analytical Sensitivity & Project Criteria: (In Generic and Site-Specific)

Provide an analytical method sensitivity and project criteria table for the analytical methods that will be routinely performed in Brownfields projects. If data from multiple laboratories are presented, the Site-Specific QAPP Addendum needs to clearly identify the laboratory being used on the project. As new methods and/or new laboratories are added, this table is to be updated accordingly. This is an important table for both planning the project and evaluating the resulting data. Initially, the table helps evaluate potential concerns with the sensitivity of an analytical method in relation to the project criteria, particularly for primary contaminants of concern. Additionally, the table is critical for understanding the usability of a data point when a sample result is near the project criteria, which in turn is near the quantitation limits and/or detection limits of the method (i.e., is the data point usable, or is more data needed to support a decision or trend in site contamination). The table should include:
- laboratory providing the data
- analytical method reference (e.g., VOCs 8260B)
- matrix (soil, groundwater, air, etc.)
- analyte/compound list
- method detection limit (MDL)
- quantitation/reporting limit (QL/RL)
- relevant state/federal criteria or standards associated with each analyte/compound and each matrix

EPA expects the laboratory reporting limit to be based on the low calibration standard and to be less than the appropriate action limit. The QAPP should include how data will be handled if the routine reporting limit is greater than the associated action limit. If the analyte is a primary contaminant of concern, the environmental professional should request an alternate method with a lower limit of detection to verify the absence of the analyte. Ensure the appropriate units are specified and that the analytical results and the project criteria are expressed in the same units.
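A short sketch of the comparison this table supports: checking each analyte's reporting limit against its project action limit and flagging cases where the routine reporting limit is not low enough. The data structure, analytes, and numeric values below are hypothetical examples, not project criteria.

```python
# Minimal sketch (hypothetical values): flagging analytes whose routine
# quantitation/reporting limit (QL/RL) is not below the project action limit,
# the situation Section B9 says the QAPP must address.

analytes = [
    # (analyte, matrix, reporting_limit, action_limit, units) - example values only
    ("benzene", "groundwater", 1.0, 5.0, "ug/L"),
    ("arsenic", "soil", 2.5, 0.68, "mg/kg"),
]

for name, matrix, rl, action, units in analytes:
    if rl >= action:
        print(f"FLAG: {name} in {matrix}: RL {rl} {units} >= action limit {action} {units}; "
              "consider a lower-detection-limit method or document how the data will be handled")
    else:
        print(f"OK: {name} in {matrix}: RL {rl} {units} < action limit {action} {units}")
```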
B10. Data Management and Documentation: (In Generic)

The information contained in this element describes how project data will be managed. It also describes procedures for maintaining data so that they will not be lost or corrupted. Data can be lost during data reduction, reporting, and entry into forms, reports, and databases, and even in storage. Indicate how computerized information will be maintained and stored. Any forms or checklists to be used can be attached. Describe how data (both hard copy and electronic) will be managed from the time they are generated in the field to the final report and archival. Discuss methods and equipment for detecting/correcting errors and preventing data loss. Describe how computer outputs will be checked. Identify who is responsible for these tasks.

2.2 QAPP - Section C. Assessment and Oversight

The elements in this group answer the following questions: How will you check to make sure that the project is being conducted as described in the QAPP? For example: Are field personnel collecting samples at the correct locations? Is the laboratory generating accurate data? What interim and final reports will be generated? Assessment findings should be documented in a report with recommendations for corrective actions, if applicable.

C1. Assessments and Corrective Actions: (In Generic)

Discuss how you plan to ensure that the project will be conducted as described in the QAPP. Describe any oversight activities and/or assessments that will be performed, the approximate timeframe, and the person responsible. Identify who will receive a report of the findings and who will be responsible for corrective actions and follow-up. Minimally, the project manager should schedule one review of field activities at the beginning of the project to ensure that all personnel are trained and that the appropriate equipment is in place. A Corrective Action Flow Chart is extremely useful for illustrating the decision-making process, particularly as state roles vary within Region 4 (see Figure 3).
Figure 3 - Hypothetical Corrective Action Flow Chart

For additional information refer to "Guidance on Technical Audits and Related Assessments for Environmental Data Operations" (QA/G-7), January 2000. Website: https://www.epa.gov/sites/production/files/2015-07/documents/g7-final.pdf

C2. Project Reports: (In Generic)

The QAPP should clearly state the type of information that will be included in the site assessment or sampling report. Describe the type and number of reports that will be generated for the project. Identify who is responsible for preparing the reports and the recipients of each report. The report should analyze and interpret data, present observations, compare results to the appropriate criteria or cleanup levels, draw conclusions, recommend next actions, identify data gaps, and describe any limitations on the way the data should be used.

2.3 QAPP - Section D. Data Evaluation

D1. Field Data Evaluation, D2. Laboratory Data Evaluation, and D3. Evaluating Data in Terms of User Needs: (In Generic)

This section describes how the data will be evaluated to determine if they meet the requirements necessary to address the environmental issue in question. This group of elements answers the following questions: How will you check that individual data collection tasks were completed correctly? How will you determine that individual sample results are acceptable or unacceptable based on QC data? How will you assess the entire set of project data to determine whether the data are "good" enough to use in making project decisions and conclusions? These elements are simplified into three steps for the purpose of this section.

Step 1: Verification

Describe how data will be checked to ensure that they are complete and were generated according to the methods and procedures specified in the QAPP. Describe the steps taken by the laboratory to qualify sample results. Include the qualifiers that the laboratory will apply when data do not meet laboratory QC acceptance limits. Discuss how issues will be resolved, documented, and reported, and the personnel responsible for these tasks. Attach any forms and checklists used to perform completeness checks. For example, is there a procedure to check that the samples were correctly preserved? Is there a procedure to check that laboratory data packages are complete (contain all required information)? Laboratories generally apply flags to reported sample results that do not meet laboratory QC limits. For example, if a lab contaminant is present in a method blank, that contaminant would be flagged on the sample report form. Other items to check include incubation temperatures, media preparation, and sample holding times.

Step 2: Validation Procedures

Describe how sample results will be accepted, rejected, or estimated based on quality control acceptance criteria. Define the data qualifiers that will be applied to the data (e.g., U = not detected, J = estimated, R = rejected). Attach or refer to written data validation procedures, if used. Identify the individuals who will review data, and resolve and document data quality problems. Validation is a review of sample results by an individual who is independent of the generation of the data. That means that the laboratory reporting the data would not perform data validation.
Instead, the laboratory generally performs an internal verification (QC check) of the data that it generates. Data validators use mathematical and/or statistical procedures to review measurement data and generate data validation reports that describe the acceptability of the data. See Table 2 for examples of common data validation activities.

Table 2 - Examples of Data Validation Activities

Data Deliverables and QAPP: Ensure that all required information on sampling and analysis was provided (including planning documents).
Analytes: Ensure that required lists of analytes were reported as specified.
Chain-of-Custody: Examine the traceability of the data from the time of sample collection until reporting of data. Examine chain-of-custody records against contract, method, or procedural requirements.
Holding Times: Identify holding time criteria, and either confirm that they were met or document any deviations. Ensure that samples were analyzed within the holding times specified in method, procedure, or contract requirements. If holding times were not met, confirm that deviations were documented, that appropriate notifications were made (consistent with procedural requirements), and that approval to proceed was received prior to analysis.
Sample Handling: Ensure that required sample handling, receipt, and storage procedures were followed, and that any deviations were documented.
Sampling Methods and Procedures: Establish that required sampling methods were used and that any deviations were noted. Ensure that the sampling procedures and field measurements met performance criteria and that any deviations were documented.
Analytical Methods and Procedures: Establish that required analytical methods were used and that any deviations were noted. Ensure that the QC samples met performance criteria and that any deviations were documented.
Data Qualifiers: Determine that the laboratory data qualifiers were defined and applied as specified in methods, procedures, or contracts.
Deviations: Determine the impacts of any deviations from sampling or analytical methods and SOPs. Consider the effectiveness and appropriateness of any corrective action.
Sampling Plan: Determine whether the sampling plan was executed as specified (i.e., the number, location, and type of field samples were collected and analyzed as specified in the QAPP).
Sampling Procedures: Evaluate whether sampling procedures were followed with respect to equipment and proper sampling support (e.g., techniques, equipment, decontamination, volume, temperature, preservatives, etc.).
Co-located Field Duplicates: Compare results of co-located field duplicates with criteria established in the QAPP.
Project Quantitation Limits: Determine that quantitation limits were achieved, as outlined in the QAPP, and that the laboratory successfully analyzed a standard at the QL.
Confirmatory Analyses: Evaluate agreement of laboratory results.
Performance Criteria: Evaluate QC data against project-specific performance criteria in the QAPP (i.e., evaluate quality parameters beyond those outlined in the methods).
Data Qualifiers: Determine that the data qualifiers applied were those specified in the QAPP and that any deviations from specifications were justified.
Validation Report: Summarize deviations from methods, procedures, or contracts. Include qualified data and an explanation of all data qualifiers.
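As an illustration of two of the checks in Table 2, the Python sketch below performs a holding-time check and assigns example qualifiers of the kind defined in Step 2 (U, J). The holding-time values, data structure, and function names are hypothetical; actual holding times, qualifier definitions, and validation logic come from the analytical method and the QAPP.

```python
# Minimal sketch (illustrative, not an EPA validation procedure): a holding-time
# check and example qualifier assignment. Values are method- and project-specific.
from datetime import datetime

HOLDING_TIME_DAYS = {"VOCs": 14, "metals": 180}  # example holding times only

def holding_time_ok(parameter: str, collected: str, analyzed: str) -> bool:
    """Check whether analysis occurred within the example holding time."""
    elapsed = datetime.fromisoformat(analyzed) - datetime.fromisoformat(collected)
    return elapsed.days <= HOLDING_TIME_DAYS[parameter]

def qualify(result: dict) -> str:
    """Return an example qualifier: J (estimated) if the holding time was missed,
    U (not detected) if the result is below the reporting limit, else no qualifier."""
    if not holding_time_ok(result["parameter"], result["collected"], result["analyzed"]):
        return "J"
    if result["value"] < result["reporting_limit"]:
        return "U"
    return ""

sample = {"parameter": "VOCs", "collected": "2017-11-01", "analyzed": "2017-11-20",
          "value": 3.2, "reporting_limit": 1.0}
print(qualify(sample))  # "J" - analyzed 19 days after collection, past the 14-day example limit
```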
Step 3: Evaluating Data in Terms of User Needs

This step describes how you intend to objectively decide whether the data you collected are good enough for the data user to use. Describe how the results of the study will be analyzed and evaluated to determine whether the needs of your project were met, and how they will then be reported. Include the mathematical and statistical formulae that will be used to calculate the precision, accuracy/bias, completeness, and comparability of the project data. Describe what will happen if data are unusable. Remember, anything that compromises data representativeness ultimately impacts data quality. Table 3 details common items that can affect data usability and should be reviewed to ensure the data can be used for their intended purpose.

Table 3 - Examples of Considerations for Usability Assessment

Data Deliverables and QAPP: Ensure that all necessary information was provided, including but not limited to validation results.
Deviations: Determine the impact of deviations on the usability of data.
Sampling Locations, Deviation: Determine if alterations to sample locations continue to satisfy the project objectives.
Chain-of-Custody, Deviation: Establish that any problems with documentation or custody procedures do not prevent the data from being used for the intended purpose.
Holding Times, Deviation: Determine the acceptability of data where holding times were exceeded.
Damaged Samples, Deviation: Determine whether the data from damaged samples are usable. If the data cannot be used, determine whether resampling is necessary.
PT Sample Results, Deviation: Determine the implications of any unacceptable analytes (as identified by the PT sample results) on the usability of the analytical results. Describe any limitations on the data.
SOPs and Methods, Deviation: Evaluate the impact of deviations from SOPs and specified methods on data quality.
QC Samples: Evaluate the implications of unacceptable QC sample results on the data usability for the associated samples. For example, consider the effects of observed blank contamination.
Matrix: Evaluate matrix effects (interference or bias).
Meteorological Data and Site Conditions: Evaluate the possible effects of meteorological conditions (e.g., wind, rain, temperature) and site conditions on sample results. Review field reports to identify whether any unusual conditions were present and how the sampling plan was executed.
Comparability: Ensure that results from different data collection activities achieve an acceptable level of agreement.
Completeness: Evaluate the impact of missing information. Ensure that enough information was obtained for the data to be usable (completeness as defined in the PQOs documented in the QAPP).
Background: Determine if background levels have been adequately established (if appropriate).
Critical Samples: Establish that critical samples and critical target analytes/COCs, as defined in the QAPP, were collected and analyzed. Determine if the results meet criteria specified in the QAPP.
Data Restrictions: Describe the exact process for handling data that do not meet PQOs (i.e., when measurement performance criteria are not met). Depending on how those data will be used, specify the restrictions on use of those data for environmental decision-making.
Usability Decision: Determine if the data can be used to make a specific decision, considering the implications of all deviations and corrective actions.
Usability Report: Discuss and compare overall precision, accuracy/bias, representativeness, comparability, completeness, and sensitivity for each matrix, analytical group, and concentration level. Describe limitations on the use of project data if criteria for data quality indicators are not met.

Note: For additional guidance on data usability assessment refer to these EPA documents: "Data Quality Assessment: A Reviewer's Guide" (QA/G-9R) and the companion document "Data Quality Assessment: Statistical Tools for Practitioners" (QA/G-9S). Website: https://www.epa.gov/quality/guidance-data-quality-assessment

3.0 Review, Approval and Distribution

Once you are ready to submit your QAPP to EPA for review, include the completed appropriate Checklist and send it to your EPA R4 Brownfields Project Officer. The Project Officer will specify whether a hard copy or an electronic copy is required. EPA review time is approximately 30 days. However, if a faster turnaround time is necessary, as is the case with some Site-Specific QAPP Addenda, contact the EPA Project Officer to discuss. The EPA Region 4 Brownfields Program will work with the CAR and its consultant to try to meet a faster turnaround when necessary. If EPA has comments, the plan must be revised and resubmitted. Make the revisions and, if necessary, any revisions to the Checklist, and resubmit to EPA. Once the comments have been addressed successfully, the EPA Project Officer/DAO will sign the appropriate place on the signature page. EPA's signature on the project-specific QAPP provides approval to begin the sampling. Once all signatures have been obtained on the signature page, the completed final document should be provided to all those on the distribution list. Figure 4 shows a flowchart diagram of the review, approval, and distribution process.

Figure 4 - QAPP Review, Approval and Distribution

4.0 QAPP Modifications, Revisions and Updates

Modifications can be made to QAPPs if necessary. Discuss the modifications that are needed to either the Generic QAPP or the Site-Specific QAPP Addendum with your EPA R4 Brownfields Project Officer. Then submit the revision for review and approval prior to implementing the modifications (e.g., a change in the scope of the sampling design or in the analytical methods). Revisions for minor changes may not be required, but your Project Officer must be made aware of such changes.

5.0 Generic QAPP Updates, Revisions and Resubmittals

Generic QAPPs are valid for five years or for as long as the cooperative agreement is active, whichever is shorter, starting from the signed approval date. The consultant that prepared the QAPP should review the plan annually for updates. The annual update can capture the accumulated changes over the past year. If a cooperative agreement project period extends beyond five years, another Generic QAPP should be resubmitted to the EPA Project Officer for review and approval. It is important to keep the Generic QAPP updated to account for any relevant information that became known after the start of the cooperative agreement. At a minimum, personnel and SESD SOP changes are likely to have occurred.
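The validity rule above ("five years or the end of the cooperative agreement, whichever is shorter") can be expressed as a small calculation; the Python sketch below does so with hypothetical dates. The function name and dates are illustrative only.

```python
# Minimal sketch (illustrative): Generic QAPP expiration per the rule above -
# the earlier of (approval date + 5 years) and the cooperative agreement end date.
from datetime import date

def generic_qapp_expiration(approval: date, agreement_end: date) -> date:
    """Return the earlier of the five-year mark and the agreement end date."""
    five_years_out = approval.replace(year=approval.year + 5)
    return min(five_years_out, agreement_end)

print(generic_qapp_expiration(date(2017, 11, 15), date(2021, 9, 30)))  # 2021-09-30
print(generic_qapp_expiration(date(2017, 11, 15), date(2024, 9, 30)))  # 2022-11-15
```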
6.0 References

US EPA Region 4, SESD, Field Branches Quality System and Technical Procedures, February 2008. These procedures supersede the US EPA Region 4 "Environmental Investigations Standard Operating Procedures and Quality Assurance Manual" (EISOPQAM), November 2001, and the "Ecological Assessment Standard Operating Procedures and Quality Assurance Manual" (EASOPQAM), January 2002. https://www.epa.gov/quality/quality-system-and-technical-procedures-sesd-field-branches

U.S. Environmental Protection Agency, 2002. Guidance for Quality Assurance Project Plans (EPA QA/G-5). Final. December 2002. EPA/240/R-02/009, Office of Environmental Information. https://www.epa.gov/sites/production/files/2015-06/documents/g5-final.pdf

U.S. Environmental Protection Agency, 2001. EPA Requirements for Quality Assurance Project Plans (EPA QA/R-5). March 2001, Reissue Notice May 2006. EPA/240/B-01/003, Office of Environmental Information. https://www.epa.gov/sites/production/files/2016-06/documents/r5-final_0.pdf

The updated EPA Region 9 Preliminary Remediation Goals (PRGs) are now called Regional Screening Levels (RSLs). Several states have adopted these values for use in screening analytical results. QAPP preparers should confirm that their state is using these values or find out what their state recommends. https://www.epa.gov/risk/regional-screening-levels-rsls-generic-tables-may-2016

U.S. Environmental Protection Agency, "Guidance for Choosing a Sampling Design for Environmental Data Collection" (QA/G-5S). http://www.water.ca.gov/environmentalservices/docs/qaqc/choosing_a_sampling_design_for_environmental_data_collection.pdf

U.S. Environmental Protection Agency, "Guidance for Preparing Standard Operating Procedures (SOPs)" (EPA QA/G-6). http://www.epa.gov/quality/qa_docs.html

U.S. Environmental Protection Agency, "Guidance on Technical Audits and Related Assessments for Environmental Data Operations" (QA/G-7). https://www.epa.gov/sites/production/files/2015-07/documents/g7-final.pdf

U.S. Environmental Protection Agency, "Data Quality Assessment: A Reviewer's Guide" (QA/G-9R) and the companion document "Data Quality Assessment: Statistical Tools for Practitioners" (QA/G-9S). https://www.epa.gov/quality/guidance-data-quality-assessment

Quality Assurance Project Plan Development Tool: https://www.epa.gov/quality/quality-assurance-project-plan-development-tool

EPA's Quality Assurance Website: https://www.epa.gov/quality

7.0 EPA Region 4 State Web Links

Alabama: http://www.adem.state.al.us/
Georgia: http://www.gaepd.org/
Florida: http://www.dep.state.fl.us/
Kentucky: http://dep.ky.gov/Pages/default.aspx
Mississippi: http://www.deq.state.ms.us/
South Carolina: http://www.scdhec.net/
North Carolina: https://deq.nc.gov/
Tennessee: http://www.tennessee.gov/environment/

APPENDIX A
USEPA REGION 4 BROWNFIELDS QAPP REVIEW CHECKLIST

QAPP Title:
Cooperative Agreement Recipient:
Grant Number:
QAPP Preparer:
QAPP Date:
Transmittal Date:
DAO Reviewer:

*This is not an exhaustive list of requirements and is not intended as guidance for developing a QAPP. Refer to the Instructions for the Preparation of Quality Assurance Project Plans for EPA Brownfields Projects in the Southeast for comprehensive requirements.
**For DAOs, mark each element in the right-hand column with one of the following abbreviations: P = Present & Acceptable; NP = Not Present; I = Incomplete; NA = Not Applicable

Checklist columns: ELEMENT | Page Number & Paragraph | EPA Use
A1. Title and Approval Sheet
- Title (including CAR's name and revision #)
- Grant number
- Name of organization that prepared the QAPP
- Dated signatures of approving officials: printed names, titles, organizations, dates, and signatures
- Other signatures, as needed

A2. Table of Contents

A3. Distribution List

A4. Project/Task Organization
- Key individuals, technical disciplines, and responsibilities
- Organizational chart/table depicting lines of authority and reporting responsibilities

A5. Problem Definition/Background
- Clearly state the problem or decision to be resolved
- Provide historical and background information

A6. Project/Task Description
- List measurements to be made
- Cite applicable technical, regulatory, or program-specific quality standards, criteria, and/or objectives
- Note special personnel or equipment requirements
- Provide work schedule
- Note required project and QA records/reports

A7. Quality Objectives and Criteria for Measurement Data
- State project objectives and limits, both qualitatively and quantitatively
- State and characterize measurement quality objectives to applicable action levels or criteria

A8. Special Training/Certification
- State trainings, date of trainings, expirations, and where applicable records are maintained

A9. Documentation and Records
- List information and records to be included for this project
- State requested lab turnaround time
- Give retention time and location for records and reports

B1. Sampling Process Design and Site Figures
- Type and number of samples required
- Sampling design and rationale
- Sampling locations and frequency
- Sample matrices
- Classification of each measurement parameter as either critical or needed for information only
- Describe/list SOPs used to characterize and dispose of IDW

B2. Sampling and Analytical Procedures
- Describe the sampling methods and procedures or cite the specific SOPs to be used to guide the sample collection
- Describe how problems (lost samples, broken equipment, etc.) will be resolved and documented
- If SOPs are referenced, include a table listing all field sampling SOPs that will be used; include the title of the SOP, date, revision number, and organization that wrote the SOP
- Describe any modifications to the SOPs that are necessary for your project

B3. Sample Handling and Custody
- Sample handling requirements
- Chain-of-custody procedures

B4. Analytical Methods and Requirements
- Identify the extraction, digestion, and analytical methodologies to be followed
- Specify the turnaround time for hardcopy/electronic laboratory data deliverables
- Provide the laboratory SOPs as appropriate
- Identify the individual(s) responsible for overseeing the analysis and implementing corrective actions

B5. Field Quality Control Requirements
- Design the field QC program that will be routinely performed, and provide a corresponding field sampling QC table in the QAPP
- Include field duplicate samples for each matrix and parameter, trip blanks for VOC samples, temperature blanks, and QA/QC samples as necessary

B6. Laboratory Quality Control Requirements
- Determine the laboratory QC data to be routinely included with the laboratory's data package, and provide a corresponding laboratory analytical QC table

B7. Field Equipment Calibration and Corrective Action
- If contained in SOPs, reference that appendix in this section of the QAPP.
Otherwise, provide a field equipment calibration table for the types of field equipment routinely used
Discuss the corrective actions taken in the field when the control limits are not met
26
-------
ELEMENT Page Number & Paragraph EPA Use
B8. Laboratory Equipment Calibration and Corrective Action
If contained in laboratory SOPs, reference that appendix in this section. Otherwise, provide a laboratory equipment calibration table for each analytical method
Note responsible individuals
B9. Analytical Sensitivity and Project Criteria
Provide an analytical method sensitivity and project criteria table for the analytical methods that will be routinely performed
If the laboratory provides only one analytical method limit, note in the table whether it is the MDL or the QL/RL that is being reported
B10. Data Management and Documentation
Describe standard record-keeping, data storage, and retrieval requirements for digital and hard copies of field data, laboratory data, and manipulated data; include any checklists used for data management
Describe the control mechanism for detecting and correcting errors, and ensuring accuracy
Include the name, title, and organization of the person(s) responsible for these activities
C1. Assessments and Corrective Actions
Assessments/oversight that will be performed and frequency
The person(s) responsible for performing the assessments/oversight, and where the results will be documented
Identify who will receive the assessment/oversight report; who will be responsible for dealing with corrective actions; and follow up on assessments/oversight
C2. Project Reports
Identify the types of reports that will be routinely generated
Provide a detailed description of the contents of project final reports to establish expectations between report preparer and client
D1. Field Data Evaluation
Describe the final data evaluation process that will be routinely performed on the field data
Indicate how the results of the evaluation will be documented, and what will be presented in the final report(s)
Indicate the position(s) of the person(s) who will be performing the field data evaluation
D2. Laboratory Data Evaluation
Describe the final data evaluation process that will be routinely performed on the laboratory data
Perform a completeness check of the laboratory data package to ensure it is compliant with the requirements in the QAPP
Document the presence or absence of any problems with the data, and note any relevant sample data that may be impacted
Evaluate the field QC sample results including data qualifiers for sample results
D3. Evaluating Data in Terms of User Needs
Describe the overall project evaluation process that will be routinely performed to determine the usability of the data, update the conceptual site model, and determine if the objectives of the project have been met
27
-------
ELEMENT Page Number & Paragraph EPA Use
Tabulate the field sample data together with the state/federal standards for presentation in the final report
Using the summary tables and graphical presentations, evaluate the usability of the individual field sample results at the parameter level. Document any limitations
Document observations, trends, anomalies, or data gaps that may exist
Evaluate how the results have impacted the conceptual site model, and if the objectives of the project have been met
Draw conclusions and recommendations from all the information

Final QAPP disposition:
Approved, no comments
Approved with comments, resubmittal not required
Conditionally approved, comments must be addressed, resubmittal required
Not approved, comments must be addressed, resubmittal required

References: EPA Requirements for Quality Assurance Project Plans, EPA QA/R-5, March 2001, EPA/240/B-01/003; Guidance for Quality Assurance Project Plans, EPA QA/G-5, December 2002, EPA/240/R-02/009 (Available from EPA's Website: http://www.epa.gov/quality)

28
-------
Appendix B
USEPA REGION 4 BROWNFIELDS GENERIC QAPP TEMPLATE

Brownfields Generic QAPP Elements and general information/template for writers Notes

A1. Title and Approval Page
Include the following items on the Title and Approval Page:
Title (including brownfields cooperative agreement recipient name and revision #)
Brownfields cooperative agreement grant number
Date
Organization's name: the name of the organization preparing the QAPP
Dated signature of approving officials: printed names, titles, organizations, date, and signatures

A2. Table of Contents
The table of contents must include tables, figures and appendices.

A3. Distribution List
Name, title/position, organization, and contact information (telephone & email) of all entities requiring copies of the QAPP. Should include all individuals mentioned in the QAPP. If portions are unknown, indicate the information will be in the site-specific QAPP, such as the Field Team Leader.

A4. Project/Task Organization
Identify key project personnel, specify technical disciplines, and detail each individual's roles/responsibilities. Include an organizational chart or table depicting lines of authority and reporting responsibilities. Include all agencies, contractors and individuals responsible for performing QAPP preparation, sample collection, laboratory analysis, data verification, review and validation, data quality assessment, and project oversight responsibilities.

A5. Problem Definition/Background
Indicate in the Generic QAPP that a project's Problem Definition will be provided in a Site-Specific QAPP Addendum. (See Appendix C, Brownfields Site-Specific QAPP Elements, for information on concepts to be covered in this section.)
29
-------
A6. Project/Task Description/Timeline
Indicate in the Generic QAPP that a project's Project Description/Timeline will be provided in a Site-Specific QAPP Addendum. (See Appendix C, Brownfields Site-Specific QAPP Elements, for information on concepts to be covered in this section.)

A7. Quality Objectives and Criteria for Measurement Data
State the project objectives and limits, both qualitatively and quantitatively. Also, state and characterize measurement quality objectives to applicable action levels or criteria.

A8. Special Training Requirements and Special Certifications
List training of personnel, including any special or non-routine training or certifications needed by personnel to conduct project activities (e.g., asbestos certification). For each non-routine training or certification include the following items within its description:
Succinctly state the project activity;
Specialized training course title (or description);
By whom the training was provided;
The date of training and expiration date of training credentials;
Describe how this training will be documented, and indicate where the records will be kept.
Be sure to discuss the importance of QA training and discuss how this training is provided.
A9. Documentation and Records
Provide a comprehensive list of the documents and records required for this project (including raw data, field logs, audit reports, QA reports, quarterly reports, analytical data reports, data validation reports/data quality assessment reports, etc.). Specify the turnaround time for laboratory data deliverables (both hardcopy and electronic formats). Provide hardcopy data package content requirements and electronic data requirements. Indicate the retention time and repository of study records, reports and formal documents.

B1. Sampling Process Design & Site Figures
Indicate in the Generic QAPP that a project's sampling process design and site figures will be provided in a Site-Specific QAPP Addendum. Describe SOPs that will be used to characterize and dispose of all IDW. See Appendix C, Brownfields Site-Specific QAPP Elements, for information on concepts to be covered in this section.
30
-------
B2. Sampling & Analytical Method Requirements
In the Generic QAPP, please provide an example of the Sampling and Analytical Methods Requirements table that will be used in all site-specific QAPP Addenda. The table should be completed with all pre-established analytical information (i.e., matrix, parameter, container, preservation and holding time information). This table may be used in the future as a template that can be edited for the individual site-specific QAPP Addendum.
Provide the required field sample collection procedures, protocols, and methods.
Provide a list of sampling/collection equipment (including make and model of equipment).
Identify on-site support facilities that are available to field staff.
Identify key personnel in charge of or overseeing sampling/collection activities.
Describe equipment decontamination procedures and requirements.
Discuss whether sampling equipment is dedicated or non-dedicated.

B3. Sample Handling & Custody Requirements
Provide a detailed description of the procedures for post sample handling (once the sample has been collected). Provide a detailed description of the chain-of-custody (COC) procedures that will be followed in preparing the field samples for transport to the laboratory. If a SOP is available, simply reference and include the SOP in an appendix. Provide a copy of a COC form, sample label, and custody seal.
31
-------
B4. Analytical Methods & Requirements
Clearly identify the extraction, digestion, and analytical methodologies to be followed (provide the actual method numbers and include all relevant options or modifications required), and the required instrumentation. Specify the turnaround time for hardcopy and electronic laboratory data deliverables. Provide the laboratory SOPs as appropriate. Identify the individual(s) responsible for overseeing the analysis and for implementing corrective actions if deemed necessary.

B5. Field Quality Control Requirements
Design the field QC program that will be routinely performed on Brownfields projects, and provide a corresponding field sampling QC table in the QAPP. Break the QC program down by parameter and matrix to identify the appropriate criteria that will be used for the evaluation. The information presented in this table is what will be used in the data evaluation process. Include the following at a minimum:
Each type of field QC sample included in the project;
Frequency it will be included (an illustrative tally of QC sample counts follows this list);
Acceptance criteria (control limits) that the data will be compared against;
The actions the data evaluator performs when control limits are exceeded.
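As a purely illustrative aid (not a project requirement), the short sketch below tallies how many field QC samples a hypothetical sampling event would need if a QAPP specified, for example, one field duplicate per ten field samples per matrix and one trip blank and one temperature blank per cooler of VOC samples; the sample counts and frequencies shown are assumptions for illustration only and must be replaced with the frequencies actually stated in the approved QAPP.

import math

# Illustrative only: hypothetical sample counts and QC frequencies.
# Actual frequencies come from the field QC table in the approved QAPP.
soil_samples = 24            # hypothetical number of soil samples
groundwater_voc_samples = 8  # hypothetical number of groundwater VOC samples
voc_coolers = 2              # hypothetical number of coolers shipping VOC samples

duplicates_per = 10          # assumed frequency: 1 field duplicate per 10 samples

soil_duplicates = math.ceil(soil_samples / duplicates_per)
groundwater_duplicates = math.ceil(groundwater_voc_samples / duplicates_per)
trip_blanks = voc_coolers            # assumed: 1 trip blank per VOC cooler
temperature_blanks = voc_coolers     # assumed: 1 temperature blank per cooler

print("Soil field duplicates:", soil_duplicates)
print("Groundwater field duplicates:", groundwater_duplicates)
print("Trip blanks (VOCs):", trip_blanks)
print("Temperature blanks:", temperature_blanks)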
Typical Brownfields projects will include field duplicate samples for each matrix and parameter, trip blanks for VOC samples, and temperature blanks for the shipping coolers. Other types of field QC samples should be considered for inclusion in the project, if warranted.

B6. Laboratory Quality Control Requirements
Determine the laboratory QC data to be routinely included with the laboratory's data package, and provide a corresponding laboratory analytical QC table in the QAPP. Break down by parameter and matrix, as appropriate, based on the information provided by the laboratory. The information presented in this table is what will be used in the data evaluation process described in Section D2. Include the following at a minimum:
Each type of laboratory QC sample and frequency;
Laboratory acceptance criteria (control limits);
The actions the data evaluator performs when control limits are exceeded.
Typical Brownfields projects will include the following laboratory QC results:
Organic analyses: method blanks, surrogate data, and lab control samples/lab control sample duplicates (LCS/LCSD).
32
-------
Inorganic analyses: method blanks, lab control samples (LCS).

B7. Field Equipment & Corrective Action
Below is the general field equipment calibration QA/QC information that needs to be provided in a QAPP. If this information is clearly contained in SOPs attached to the QAPP, simply reference that appendix in this section of the QAPP. Otherwise, provide a field equipment calibration table for the various types of field equipment routinely used on Brownfields projects (e.g., PID, individual low flow water quality parameters, etc.).
Document the initial calibration (including standards and concentrations used);
Any continuing calibration checks used throughout operation to check for drift (standards, blanks, etc.) and the frequency at which such actions are performed;
Indicate the acceptance criteria (control limits) that need to be met to proceed; and
Discuss the corrective actions taken in the field when the control limits are not met.

B8. Lab Equipment & Corrective Action
Below is an outline of the laboratory equipment calibration QA/QC information that needs to be provided in a QAPP. If this information is clearly contained in the laboratory SOPs attached to the QAPP, simply reference that appendix in this section of the QAPP. Otherwise, please provide a laboratory equipment calibration table for each analytical method routinely used on Brownfields projects. If this information is unknown, specify it will be in the Site-Specific Addendum.
Initial calibration (include the number of initial calibration standards and calibration range);
Independent calibration check standard (include relevant concentrations); and
Continuing calibration checks (calibration blanks and concentration of continuing calibration check standard).
For each calibration step include:
Frequency that each is performed;
Acceptance criteria (control limits); and
Laboratory corrective actions to be taken when control limits are not met.
33
-------
B9. Analytical Sensitivity & Project Criteria
Provide an analytical method sensitivity and project criteria table for the analytical methods that will be routinely performed on Brownfields projects. If data from multiple laboratories are presented, the site-specific QAPP will need to clarify which laboratory is being used on the project. As new methods and/or new laboratories are added on, this table is to be updated accordingly.
If this information is unknown, specify it will be in the Site-Specific Addendum.
The table helps evaluate potential concerns with the sensitivity of an analytical method in relation to the project criteria, particularly for primary contaminants of concern. Also, the table is critical in understanding the usability of a data point when a sample result is near the project criteria, which is in turn near the quantitation limits and/or detection limits of the method (i.e., is the data point usable, or is more data needed to support a decision or trend in site contamination). The information presented in this table can be used as a reference in the data evaluation process. The table is to include:
Laboratory providing the data;
Analytical method reference (e.g., VOCs 8260B);
Matrix (soil, groundwater, air, etc.);
Analyte/compound list;
Method Detection Limit (MDL);
Quantitation/Reporting Limit (QL/RL);
Relevant state/federal criteria or standard that is associated with each analyte/compound and each matrix.
If the laboratory provides only one analytical method limit, note in the table whether it is the MDL or the QL/RL that is being reported. When project criteria are near the MDL, special care should be taken in reviewing this data, particularly if it is a primary contaminant of concern. Depending on the situation, the environmental professional may choose to seek an alternate method with a lower limit of detection.
34
-------
B10. Data Management & Documents
Describe the documentation that will be generated for the project, and the data management procedures that will be used in handling both hard copy and electronic media. The three basic areas to cover include the field data, laboratory data, and manipulated data. Clearly specify what documentation goes into the project file and what documentation will be provided in the final report. Include a description of the project data management process and reference the office's record keeping procedures, document control, data storage, retrieval, and security systems. Also include a description of the control mechanism for detecting and correcting errors, and ensuring accuracy. Include the name, title, and organization of the person(s) responsible for these activities. Attach any forms or checklists to be used for data management purposes.

C1. Assessments & Response Actions
Develop and describe the assessment/oversight plan that will be followed with each project to ensure adherence to the generic QAPP and site-specific QAPP, including:
Types of assessments and oversight that will be performed and frequency (when during the project);
Identify the person responsible for performing the assessments/oversight (e.g., field leader, QA officer, etc.), and describe where the results will be documented;
Identify who will receive the assessment/oversight report;
Identify who will be responsible for dealing with corrective actions, and follow up on assessments/oversight.
Since Brownfields projects are relatively short-term projects, a typical assessment plan would include 1) oversight of the field team and field subcontractors (early on in the project) by an experienced field leader knowledgeable in the project objectives, and 2) peer review of the final report. Oversight, in this case, essentially means checking whether the project is going according to the plan and procedures in place, helping with problems and questions, and providing a set of eyes keeping the total project in perspective.
Please indicate in this section of the QAPP when additional assessment/oversight is planned for a project. The scope and purpose behind the assessment should be described in the site-specific work plan (and include the identified information listed above).
35
-------
C2. Project Reports
Identify the types of reports that will be routinely generated during the Brownfields project (e.g., Phase I/II ESA, final reports, etc.). Include:
Type of report;
Frequency of reporting;
The position(s) of the person(s) who will be responsible for preparing the reports;
The organizations that will be receiving the reports.
For the final project report, a fairly detailed description of its contents should be provided to establish appropriate expectations between report preparer and client. Please describe primary components of the main body of the document, and specify any routine tables and graphics being provided. Also list the various appendices routinely included in the report. Identify reports and items that will be routinely provided in electronic format.

D1. Field Data Evaluation
Describe the final data evaluation process that will be routinely performed on the field data (field notes, boring logs, field screening results, field analytical data, etc.). This evaluation is intended to gather and document important information from the field data that may impact the project, or assist in the interpretation of the laboratory data and the conceptual site model. It is important that any observations, trends, conclusions and limitations discovered in reviewing the field data be interpreted and documented in the final report. For each component of the field data evaluation, indicate how the results of the evaluation will be documented, and what will be presented in the final report. Indicate the position(s) of the person(s) who will be performing the field data evaluation.
36
-------
D2. Laboratory Data Evaluation
Describe the final data evaluation process that will be routinely performed on the laboratory data.
Perform a completeness check of the laboratory data package to ensure it is compliant with the requirements in the QAPP. Missing information or questions concerning the data package are to be addressed with the laboratory and any pertinent information should be documented and/or provided in the final report.
Review the chain-of-custody, sample preservation and holding time results. Document the presence or absence of any problems with the data, and note any relevant sample data that may be impacted.
Evaluate the field QC sample results including data qualifiers for sample results. For the field duplicate sample results, tabulate the relative percent differences (include these results in the final report; see the illustrative sketch below). If other field QC samples were submitted, such as performance evaluation samples or matrix spike samples, this data should also be tabulated with appropriate recoveries and reported accordingly. Document the presence or absence of any problems or issues and note any relevant sample data that may be impacted, as appropriate.
Evaluate the laboratory QC results. Document the presence or absence of any problems or issues and note any relevant sample data that may be impacted.
For each of the components of the laboratory data evaluation, indicate how the results of the evaluation will be documented, and what will be presented in the final report.
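To illustrate the relative percent difference tabulation described above (and the comparison against screening criteria discussed under D3), the following minimal sketch computes RPDs for hypothetical field duplicate pairs and flags parent-sample results that exceed an assumed screening criterion; the analytes, concentrations, criteria, and the 30 percent advisory limit are placeholders, not values prescribed by this document.

# Illustrative only: hypothetical duplicate pairs, criteria, and control limit.
# Project-specific acceptance criteria come from the approved QAPP.
duplicate_pairs = {
    # analyte: (parent result, field duplicate result) in mg/kg -- hypothetical
    "arsenic": (12.0, 10.5),
    "lead": (310.0, 355.0),
}
screening_criteria = {"arsenic": 40.0, "lead": 400.0}  # assumed criteria, mg/kg
rpd_limit = 30.0  # assumed advisory control limit, percent

for analyte, (parent, duplicate) in duplicate_pairs.items():
    mean = (parent + duplicate) / 2.0
    rpd = abs(parent - duplicate) / mean * 100.0 if mean else 0.0
    rpd_flag = "exceeds limit - review" if rpd > rpd_limit else "within limit"
    criterion_flag = ("exceeds criterion" if parent > screening_criteria[analyte]
                      else "below criterion")
    print(f"{analyte}: RPD = {rpd:.1f}% ({rpd_flag}); parent result {criterion_flag}")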
Again, it is important that any observations, trends, and limitations discovered in the field and/or laboratory QC data be interpreted and documented in the final report. Indicate the position(s) of the person(s) who will be performing the laboratory data evaluation.
37
-------
D3. Data Usability & Project Evaluation
Describe the overall project evaluation process that will be routinely performed to determine the usability of the data, update the conceptual site model, and determine whether the objectives of the project have been met.
Tabulate the field sample data together with the state/federal standards for presentation in the final report. Highlight any sample results exceeding criteria. Check the table for correctness and appropriate units. Prepare site figures/maps and other graphical representations, as appropriate, and check for correctness and accuracy.
Using the summary tables and graphical presentations, evaluate the usability of the individual field sample results at the parameter level. Document any limitations on how the data should be used and/or interpreted. Draw on the sensitivity criteria, the results of the field data evaluation, and/or the results of the laboratory data evaluation. (As sample concentrations approach the reporting limit, and on down to the MDL, the precision and accuracy of the data can be expected to worsen, which can impact how you judge the usability of this data.)
Based on the results of the data usability study, use the summary tables and site maps to perform the overall project evaluation. Document any observations, trends, anomalies, or data gaps that may exist. Evaluate how the sample results have impacted the conceptual site model for the property, and whether the objectives of the project have been met. Draw conclusions and recommendations from all the information obtained, and document appropriately in the final report.
For each of the components of the data usability and project evaluation, indicate how the results of the evaluation will be documented, and what will be presented in the final report. Indicate the position(s) of the person(s) who will be performing the data usability and project evaluation.
38
-------
Appendix C
USEPA REGION 4 BROWNFIELDS SITE-SPECIFIC QAPP ADDENDUM TEMPLATE

Brownfields Site-Specific QAPP Elements and general information/template for writers Notes

A1. Title and Approval Page
Title (including brownfields cooperative agreement recipient name and revision #)
Addendum Number (see guidance)
Brownfields cooperative agreement grant #
Date
Organization's name: the name of the organization preparing the QAPP
Dated signature of approving officials: printed names, titles, organizations, and dates and signatures
Indicate the Addendum is prepared in accordance with EPA's Region 4 Brownfields Program;
Identify the Addendum's association with the approved Generic QAPP (including a complete reference to the Generic QAPP);
Provide a statement that the work described will be performed in accordance with the process described in the Generic QAPP.

A2. Table of Contents
A table of contents must include tables, figures and appendices.

A3. Distribution List
Name, title/position, organization, and contact information (telephone and email) of all entities requiring copies of the Site-Specific QAPP. Should include all individuals mentioned in the document.

A4. Project/Task Organization
List key project personnel and describe their roles and responsibilities for the project.
See Appendix B, Brownfields Generic QAPP Elements, for information on concepts to be covered in this section.
39
-------
A5. Problem Definition/Background
Provide historic, scientific, and/or regulatory background of the site or the project. Identify the current property owner and the proposed future reuse/development plans for the property; describe pertinent historical and current uses of the property, as well as any uses of adjacent properties, that may be impacting the site; describe the Recognized Environmental Conditions from the Phase I Environmental Site Assessment. Discuss known or likely contaminants of concern and where they may be located. Describe the findings of previous investigations and how/whether data will be used in this assessment or cleanup (this is particularly relevant for a cleanup QAPP). Provide a topographic map of the surrounding site area and a site map showing significant structures, terrain, previous sampling locations and relevant summary data, as appropriate, to illustrate the problem. Provide regulatory standards or criteria that data will be compared against.

A6. Project/Task Description/Timeline
Summarize the tasks that will be performed, the data that will be collected, the decisions to be made, and the timeline for the data and reports. Identify the media that will be sampled. Provide the projected timeline for key tasks in the project, including QAPP review and approval, field activities and sampling, laboratory results turnaround, and reporting activities to be completed. Allow 30 days for EPA QAPP review.

A7. Quality Objectives and Criteria for Measurement Data
Identify the seven steps in the DQO process for the project.

A8. Special Training Requirements and Special Certifications
Indicate in the Site-Specific QAPP that the project's special training requirements and special certifications are in the Generic QAPP. See Appendix B, Brownfields Generic QAPP Elements, for information on concepts to be covered in this section.
Indicate it is in the Generic QAPP.

A9. Documentation and Records
Indicate in the Site-Specific QAPP that the project's documentation and records requirements are in the Generic QAPP. See Appendix B, Brownfields Generic QAPP Elements, for information on concepts to be covered in this section.
Indicate it is in the Generic QAPP.
40
-------
B1. Sampling Design and Site Figures
Describe all the samples to be collected. Provide the logic and rationale of the sampling. Specify the locations, numbers of samples, and analytical parameters for all media.
Provide 1) the purpose behind a set or series of samples in a particular area or location, and 2) how the sampling design addresses the problem identified in Section A5;
Discuss any unusual communication/instructions that need to take place between the field contractor and the laboratory to address special methods, matrices, particular samples, etc.;
When the sampling locations, sampling depths and/or choice of analytical parameters cannot be predetermined, document the decision logic or input that will be used in the field to make those determinations (i.e., dynamic sampling strategies) and explain how the process will be documented and reported.
Provide maps showing sample locations and a table that includes:
- sample matrix
- environmental parameters
- sampling collection method
- analytical method reference
- number of field samples
- type and number of field QC samples for each matrix and parameter
- samples used for background or control comparison
Specify the site-specific concerns about all IDW and the methodology for characterizing the material for disposal.

B2. Sampling and Analytical Procedures
Describe the sampling methods and procedures or cite the specific SOPs to be used to guide the sample collection (include SOPs as attachments to the QAPP), i.e., preparation of sample containers, sample volumes, preservation and holding times; sample packaging, labeling and shipping; equipment preparation; decontamination and disposal of waste by-products; describe how problems (lost samples, broken equipment, inaccessible sampling locations, etc.) will be resolved and documented.
If SOPs are referenced, include a table listing all field sampling SOPs that will be used. Include the title of the SOP, date, revision number, and organization that wrote the SOP. Describe any modifications to the SOPs that are necessary for your project.
41
-------
B3. Sample Handling and Custody Requirements
Indicate it is in the Generic QAPP.
B4. Analytical Methods and Requirements
Indicate it is in the Generic QAPP unless site-specific methods are unique to a project.
B5. Field Quality Control Requirements
Indicate it is in the Generic QAPP.
B6. Laboratory Quality Control Requirements
Indicate it is in the Generic QAPP.
B7. Field Equipment & Corrective Action
Indicate it is in the Generic QAPP.
B8. Lab Equipment & Corrective Action
Indicate it is in the Generic QAPP.
B9. Analytical Sensitivity & Project Criteria
Indicate it is in the Generic QAPP unless site-specific methods are unique to a project.
B10. Data Management and Documents
Indicate it is in the Generic QAPP.
C1. Assessments and Response Actions
Indicate it is in the Generic QAPP.
C2. Project Reports
Indicate it is in the Generic QAPP.
D1. Field Data Evaluation
Indicate it is in the Generic QAPP.
D2. Laboratory Data Evaluation
Indicate it is in the Generic QAPP.
D3. Data Usability and Project Evaluation
Indicate it is in the Generic QAPP.
42
-------