REGION  3:  THE MID-ATLANTIC  STATES
          SERVING THE DISTRICT OF COLUMBIA, DELAWARE, MARYLAND, PENNSYLVANIA, VIRGINIA AND WEST VIRGINIA
         QUALITY  ASSURANCE
     PHILADELPHIA, PA
     FORT MEADE, MD
      ANNAPOLIS, MD
      WHEELING, WV
    Regional QA Manager
       Terry Simpson
    Phone: 410.305.2739
     Fax: 410.305.3095
 E-mail: simpson.terry@epa.gov
      Region 3 QA Staff
       Fort Meade, MD
Mike Mahoney: 410.305.2631
Ed Messer: 410.305.2744
Web Links:

EPA Quality System
http://www.epa.gov/quality/
Region 3 Quality System
http://intranet.epa.gov/r3intran/
eaid/quality/

Region 3 Quality Staff (OASQA)
http://www.epa.gov/region3/esc/QA/qateam.htm
       The primary goal of the Region's Quality Management System (a.k.a. quality
       assurance program) is to ensure that environmental data activities performed by or for
       the Region will result in the production of data that is of adequate quality to support
       the Agency's scientific decisions or actions. In order for this data to be used with a
       high degree of certainty, its quality must be known and documented. This goal will be
       achieved by ensuring that appropriate resources are made available and proper
       procedures followed throughout the planning, implementation and evaluation phases
       of each environmental project. This brochure highlights some of the important
       aspects of Region 3's Quality Management System.
         Region 3 Quality Policy

It is Region 3 policy that all environmental data
and information collected and/or used in the
process of decision-making are of known quality,
suitable for their intended use, with all aspects of
collection and analysis thoroughly documented;
such documentation being verifiable and
defensible. This policy applies to all data
collected for environmental operations, including
environmental technology activities, performed
directly by or for the Region. This includes all
Federal, State, Tribal and local partners under
interagency and financial assistance agreements;
contractors funded by EPA; regulated entities
and potentially responsible parties.

                                                                      "Hi, I'm QA Guy!"
                                               REGION 3 QA COORDINATORS

Division/Office                                     Representative/Alternate Rep   Phone          Email
Air Protection Division                             Chris Pilla                     215.814.3438   pilla.chris@epa.gov
Chesapeake Bay Program Office                       Rich Batiuk /                   410.267.5731   batiuk.richard@epa.gov
                                                    Mary Ellen Ley                  410.267.5750   ley.mary@epa.gov
Environmental Assessment and
Innovation Division                                 Charlie Jones                   215.814.2734   jones.charlie@epa.gov
Hazardous Site Cleanup Division                     Jeff Tuttle                     215.814.3236   tuttle.jeffrey@epa.gov
Land and Chemicals Division                         David Friedman                  215.814.3395   friedman.davidm@epa.gov
Office of Enforcement, Compliance
and Environmental Justice                           Jose Jimenez                    215.814.2148   jimenez.jose@epa.gov
Office of Policy Management                         Norman Rodriguez                215.814.5274   rodriguez.norman@epa.gov
Water Protection Division                           Larry Merrill                   215.814.5452   merrill.larry@epa.gov
                                                                                February 2009

-------
            REGION  3:  THE  MID-ATLANTIC STATES
          SERVING THE DISTRICT OF COLUMBIA, DELAWARE, MARYLAND, PENNSYLVANIA, VIRGINIA AND WEST VIRGINIA
      QUALITY  ASSURANCE
    PHILADELPHIA, PA
    FORT MEADE, MD
     ANNAPOLIS, MD
     WHEELING, WV
   If your project involves environmental
data, as defined here, you need QA.
Web Links:

EPA Quality System
http://www.epa.gov/quality/

Region 3 Quality System
http://intranet.epa.gov/r3intran/
eaid/quality/

Region 3 Quality Staff (OASQA)
http://www.epa.gov/region3/esc/
QA/qateam.htm

Planning — Development of sampling
network design, generation of appropriate
data quality indicators, selection of
measurement and analytical methodologies,
standard operating procedures

Implementation — Just do what you
planned!

Evaluation — Use of review, audit and
assessment tools to determine whether
environmental data operations comply with
EPA Quality Assurance Policy
                                                                 The Quality System PIE
                             Environmental information or data is defined as any measurements
                             or information that describe environmental processes, location, or
                             conditions; ecological or health effects and consequences; or the
                             performance of environmental technology. For EPA Region 3,
                             environmental information also includes data compiled from other
                             sources such as data bases or literature or produced from models.
                Quality Assurance Staff Expertise

   •  QA document preparation and review (e.g., Quality Management Plans,
      Quality Assurance Project Plans, Sampling Plans, etc.)
   •  Expert assistance on QA/QC policies, procedures, and requirements
   •  Review/advice on quality requirements for contracts, grants, and interagency
      agreements
   •  Assistance with environmental project planning
   •  Selection of laboratory and field analytical methods
   •  Data validation and usability
   •  Data interpretation and assessment
   •  Quality system and technical assessments of laboratory and field activities
      (e.g., audits)
   •  Training in all of the above
                                                                         February 2009

-------
                    UNITED STATES ENVIRONMENTAL PROTECTION AGENCY
                                      REGION III
                    Office of Analytical Services and Quality Assurance

                             Environmental Sciences Center
                                   701 Mapes Road
                            Fort Meade, Maryland 20755-5350
Points of Contact
410-305-XXXX (see extension below)
www.epa.gov/region3/esc


Region III Quality Assurance Manager: Terry Simpson:  ext. 2739   simpson.terry@epa.gov


Technical Services Branch Chief:  Fred Foreman: ext. 2629  foreman.fred@epa.gov

Quality Assurance Project Plan contacts:  Group email: r3 esc-qa
Mike Mahoney: ext. 2631    mahoney.mike@epa.gov
Ed Messer:     ext. 2744    messer.ed@epa.gov

Analytical Request contacts: Group email: r3 clients
John Kwedar:  ext. 3021     kwedar.john@epa.gov
Dan Slizys:    ext. 2734     slizys.dan@epa.gov
Carroll Harris:  ext. 2625     harris.carroll@epa.gov
    Printed on 100% recycled/recyclable paper with 100% post-consumer fiber and process chlorine free.
                        Customer Service Hotline: 1-800-438-2474

-------
ALASKA STATE DEPARTMENT OF ENVIRONMENTAL CONSERVATION
Division Air & Water Quality                           Air & Water Data & Monitoring Program
Revision 1.0                                                           April 24, 2003
              Elements of a Good Quality Assurance Project Plan (QAPP)

A.  Project Management Elements

1.   Title and Approval Sheet/s - Includes the title of the plan, the name of the organization/s
implementing the project, and the effective date of the plan. It should have signature and date
lines for the organization/s Project Manager/s, the organization/s QA Officer, the DEC Project
Officer and the DEC Air QA Officer.

2.  Table of Contents - Use the same numbering system as the  EPA Quality Assurance
Requirements document (EPA QA/R-5); i.e., A1, A2, etc. (See the end of this document for the EPA
QA/R-5 website.) Whenever a section is not relevant to a specific project QAPP, Not Applicable
or NA can be typed in.  Each page following the Title and Approval pages should show the
name of the project, date and revision number at the top or bottom of the page, along with the
page number and total number of pages. (See the upper right-hand corner above for an example.)

3.  Distribution List - Includes a list of the names, title, organization, phone number and
addresses (email and mail) of all who receive the approved QAPP and any subsequent revisions.

4.  Project/Task Organization - This narrative description identifies the  individuals or
organizations participating in the project and discusses their specific roles and responsibilities.  It
should include the principal  data users, the decision  makers, the project QA officer and all  those
responsible for project implementation.  A concise organization chart will be included showing
the relationships and lines of communication among project participants. This org. chart should
include other data users outside of the organization generating data, but for whom the data is
intended. It should also identify any subcontractor relevant to environmental data operations,
including laboratories providing analytical services.

5. Problem Definition/Background and Project Objective/s - Here state  the specific problem to
be solved, decision to be made or outcome to be achieved.  There should be sufficient
background information to provide a historical, scientific and regulatory perspective. State the
reason (the project objective) for the work to be done.

6. Project/Task Description - This section provides a summary of all work to be performed,
products to be produced, and the schedule for implementation.  Maps showing the geographic
locations of field tasks should be included. This section should be short. Save the total picture
for 10. Sampling Process Design.

7.  Quality Objectives and Criteria for Measurement of Data - These are the Measurement
Quality Objectives (MQOs) for each parameter to be measured. Measurement Quality
Objectives are designed to evaluate and control various phases (sampling, preparation, and
analysis) of the measurement process to ensure that total measurement uncertainty is within the
range prescribed by the data quality objectives (a calculation sketch follows the list below).
MQOs can be defined in terms of the following data quality indicators:

-------
•   Precision
•   Accuracy
•   Representativeness
•   Detectability
•   Completeness
•   Comparability
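
Of these indicators, precision (for field or laboratory duplicate pairs) and completeness are the ones
most often checked with a simple calculation. The sketch below shows one common way to compute
them (the function names and the example values are illustrative assumptions, not requirements of
this guidance; acceptance limits come from the project's MQOs):

    def relative_percent_difference(primary, duplicate):
        """Precision for a duplicate pair, expressed as relative percent difference (RPD, %)."""
        mean = (primary + duplicate) / 2.0
        if mean == 0:
            return 0.0
        return abs(primary - duplicate) / mean * 100.0

    def completeness(valid_results, planned_results):
        """Percent of planned measurements that yielded valid, usable data."""
        if planned_results == 0:
            return 0.0
        return 100.0 * valid_results / planned_results

    # Hypothetical duplicate pair and sample counts, for illustration only.
    print(f"RPD: {relative_percent_difference(10.2, 9.6):.1f}%")                        # about 6%
    print(f"Completeness: {completeness(valid_results=47, planned_results=50):.0f}%")   # 94%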

MQOs for each of the ambient air criteria pollutants can be found in EPA Quality Assurance
Handbook for Air Pollution Measurement Systems, Volume II: Part I, Ambient Air Quality
Monitoring Program Quality System Development, Appendix 3, EPA-454/R-98-004, August
1998.

For each parameter to be sampled, list the measurement analytical methods to be used and the
MQOs to meet the overall data quality objectives.

Please summarize this section as much as possible in table format. In addition a good narrative
is always necessary.

8.   Special Training/Certifications - This section describes any specialized training or
certifications needed by personnel in order to successfully complete the project or task.  It should
discuss how such training is to be provided and how the necessary skills are assured and
documented.  If the project is a research one, it is sufficient to include the resumes of
consultants/staff in an appendix.

9.  Documents and Records - This section itemizes all the documents and records that will be
produced, such  as interim progress reports, final reports, audits, and Quality Assurance Project
Plan revisions, etc. It also lists field logs, sample preparation and analysis logs, laboratory
analysis, instrument printouts, model inputs and outputs, data from other sources such as
databases or literature, the results of calibration and QC checks.  Copies of example data sheets
should be included in the appendix.

In addition to any written report, data collected for Prevention of Significant Deterioration (PSD)
or compliance related air monitoring projects will be provided electronically to ADEC via a 3.5"
diskette or CD-ROM.

Finally this section needs to specify or reference all applicable requirements for the final
disposition of records and documents, including location and length of retention period.

Please summarize this section as much as possible in table format.

B.   Measurement and Data Acquisition

10.  Sampling Process Design - This section  includes four major activities:
•   Developing and understanding the monitoring objective(s) and appropriate data quality
    objectives.
•   Identifying the spatial scale most appropriate for the site(s)'s monitoring objective.

-------
 •  Identifying the general site location(s) where monitoring site(s) should be placed.
 •  Identifying specific monitoring sites.

This section needs to define the key parameters, the types and numbers of samples, the design
assumptions, the where, when and how samples are to be taken, and the rationale for the design.
Unlike section 6. Project/Task Description above, the level  of detail here should be sufficient that
a person knowledgeable in this area could understand why and how the samples are to be taken.

 11. Sampling Methods - This section should describe the procedures for collecting the samples
and identify the sampling  methods, equipment calibration and maintenance, and specific
performance requirements.  To establish the basic validity of such air monitoring data, it must be
shown that:
•  The proposed sampling method complies with the appropriate testing regulations.
•  The equipment is accurately sited.
•  The equipment was accurately calibrated using correct and established calibration methods.
•  The organization implementing the data collection operation is qualified and competent.

Please summarize this section as much as possible in table format.

Some of this information can be provided by specific reference to existing equipment, methods,
and laboratory Standard Operating Procedures (SOPs) and Quality Assurance/Quality Control
(QA/QC)  Manuals in the appendices.

 12. Sample Handling and Custody - This section describes the requirements for sample
handling and custody.
Sample handling - The various phases of sample handling are sample labeling, sample
collection and sample transportation.
Chain of Custody - If the results of a sampling program are to be used as evidence, a written
record must be available tracking location and possession of the sample/data at all times.

Describe sample handling in the field and laboratory, taking into account the nature of the samples, holding
times before extraction and analysis, and shipping options and schedules. Sample handling forms and
SOPs, sample custody forms and SOPs, etc. can be included in the appendices.

 13. Analytical Methods - This section identifies the analytical methods and equipment required
for the analysis of ambient air samples. Generally these methods are manual sample collection
methods for lead and particulates with subsequent laboratory analyses.

 14. Quality Control (QC) - QC is the overall system of technical activities that measures the
attributes and performance of a process, item, or service against defined standards to verify that
they meet the stated requirements defined by the customer.  This section describes the quality
control activities that will  be used to control the monitoring process to validate sample data.
This section must state the frequency, control limits, standards traceability and describe the
corrective action to be taken when control limits are exceeded. Data QC/QA requirements will
be summarized  in table format (Data Validation Table) for each parameter to be measured.
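
As a minimal sketch of the kind of check this section should specify, the example below screens a single
QC result against a control limit and flags when corrective action is triggered (the 25% default limit,
function name and example values are hypothetical placeholders; the actual frequencies and limits must
come from the QAPP):

    def check_qc_result(measured, expected, control_limit_pct=25.0):
        """Compare a QC check result to its expected value against a control limit (%)."""
        percent_difference = abs(measured - expected) / expected * 100.0
        return {
            "percent_difference": round(percent_difference, 1),
            "within_limits": percent_difference <= control_limit_pct,
            "corrective_action_required": percent_difference > control_limit_pct,
        }

    # Example: a one-point check of an analyzer response against a known standard (hypothetical values).
    print(check_qc_result(measured=0.082, expected=0.090))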

-------
Example Data Validation Tables for some parameters are available upon request. These
validation tables define criteria for accepting/rejecting pollutant and meteorological data.

Please summarize this section as much as possible in table format.

15. Instrument/Equipment Testing, Inspection and Maintenance - This section discusses the
procedures used to verify that all instruments and equipment are maintained in sound operating
condition and are capable of operating at acceptable performance levels. Documentation for
Instrument/Equipment Testing, Inspection and Maintenance should include:
•   Equipment lists - by monitoring group or station
•   Spare equipment/parts lists - by equipment, including suppliers
•   Inspection/maintenance frequency - by equipment
•   Equipment replacement schedules
•   Sources of repair - by equipment
•   Service agreements that are in place
•   Monthly check sheets and entry forms for documenting testing, inspection, maintenance
    performed.

Please summarize this section as much as possible in table format.

Testing, inspection and maintenance procedures should be available at each monitoring station.
Appending or  referencing Standard Operating Procedures is an acceptable way to discuss
equipment.

16. Instrument/Equipment Calibration and Frequency - This section identifies all the tools,
gauges, instruments, and other sampling, measuring and test equipment used  for data collection
activities affecting quality that must be controlled, and, at specified periods, calibrated to
maintain performance within specified limits. It identifies the certified equipment and/or
standards used for calibration. It identifies the standards (primary, secondary, etc.), their
traceability to known master standards, their certification and expiration dates. For standards
where certification extends over a measurement range (e.g., thermometers, flow meters, etc.),
this section also specifies the range these respective standards are traceable over. Please ensure
that standards used are appropriate for the measurement range the equipment will be calibrated to
and that the calibration range is representative of the ambient environment to be measured. This
section also specifies how records of calibration are to be maintained.  Documentation should be
readily available for review and should include calibration data, calibration equations, analyzer
identification, calibration date, analyzer location, shelter temperature, calibration standards used
and their traceabilities and the person conducting the calibration.

Please summarize this section as much as possible in table format.
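
One way to keep calibration records consistent is to capture the documentation items listed above as a
fixed set of fields. The sketch below is illustrative only; the field names are assumptions drawn from the
list in this section, not a prescribed format:

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class CalibrationRecord:
        """One calibration event, mirroring the documentation items listed above."""
        analyzer_id: str
        analyzer_location: str
        calibration_date: str              # e.g. "2003-04-24"
        shelter_temperature_c: float
        standards_used: List[str]          # standard IDs; traceability documented separately
        calibration_equation: str          # e.g. "y = 0.998x + 0.002"
        performed_by: str
        calibration_data: List[float] = field(default_factory=list)

    # Hypothetical example record, for illustration only.
    record = CalibrationRecord(
        analyzer_id="O3-123", analyzer_location="Site 7 shelter",
        calibration_date="2003-04-24", shelter_temperature_c=24.5,
        standards_used=["Transfer standard TS-45"],
        calibration_equation="y = 0.998x + 0.002", performed_by="J. Smith",
        calibration_data=[0.0, 0.090, 0.180, 0.400],
    )
    print(record.analyzer_id, record.calibration_date)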

17. Inspection/Acceptance of Supplies and Consumables - This section describes how and by
whom supplies and consumables (e.g., standard materials and solutions, filters, tubing, volumetric
glassware, gas manifolds, sample bottles, water purity, calibration gases, reagents, electronic
data storage media, etc.) are inspected and accepted for use in the project. The acceptance
criteria should be stated.

-------

18. Non-direct Measurements - This section identifies the type of data needed for project
implementation or decision making that are obtained from non-measurement sources such as
maps, charts, GPS latitude/longitude measurements, computer data bases, programs, literature
files and historical data bases. It describes the acceptance criteria for the use of such data and
specifies any limitations to the use of the data.

19. Data Management - This section describes the project data management process, tracing the
path of the data from their generation to their final use or storage. It discusses the control
mechanism for detecting and correcting errors as well as performance audits of the data
management system.

Please summarize this section as much as possible in table format.

C. Assessments and Oversight

20. Assessments and Response Actions - This section describes the evaluation processes and
criteria used to measure the performance or effectiveness of a quality system, the monitoring
network and sites and various measurement phases of the data operation. These assessments are:
•  Management Systems Reviews
•  Network Reviews
•  Technical Systems Audits
•  Performance Audits (Accuracy)

This section will specify the frequency, numbers, acceptance criteria and type of project
assessments. It describes how and to whom the results of the assessment are reported and it
discusses how response actions to assessment findings, including corrective actions for
deficiencies and non-conforming conditions, are to be addressed and by whom. It discusses the
process for revising an approved QAPP, if necessary.

Please summarize this section as much as possible in table format.

21. Reports to Management - This section describes the frequency, content and distribution of
    assessment reports to management and other recipients. Assessment Reports are:
 •   QA Annual Report
 •   Network Reviews
 •   Quarterly  Reports (includes precision and accuracy)
 •   Technical Systems Audit Reports
 •   Response/Corrective Action Reports

D. Data Validation and Usability

22. Data Review, Validation, & Verification Requirements - The purpose of this section is to
state the criteria used to review and validate—that is, accept, reject or qualify—data in an
objective and consistent manner. It is a way to decide the degree to which each data item has
met its quality specifications as described in B above.

-------
Validating data means determining if data satisfy QAPP-defined user requirements; that is, that
the data refer back to the overall objectives.
Verifying data means ensuring that the conclusions can be correctly drawn.
Ambient Air and Meteorological data will be validated via Data Validation Tables that
summarize all criteria to be considered in validating, qualifying or invalidating data. Some
example data validation  tables have been developed for specific criteria pollutants and
meteorological parameters.

These validation templates should be maintained on-site, with the respective laboratory
conducting the air monitoring analysis, and with the QA auditor as well as with project management. These
tables are intended as a quick reference guide for all involved in the air monitoring system. Please
summarize the data validation criteria in table format.

23. Validation and Verification Methods - This section describes the process for validating and
verifying data. It discusses how issues are resolved and identifies the authorities for resolving
such issues. It describes how the results are  to be conveyed to the data users. This is the section
in which to reference examples of QAPP forms and checklists (which could be provided in the
appendices). Any project-specific calculations are identified in this section.

24. Reconciliation with User Requirements - The purpose of this section is to outline and specify
the acceptable methods for evaluating the results obtained from the project. It includes scientific
and statistical evaluations to determine if the data are of the right type, quantity, and quality to
support the intended use.

For additional assistance in developing a QAPP, refer to:
1)  EPA QA/R-5 at: www.epa.gov/r10earth/offices/oea/epaqar5.pdf;
2)  EPA QA/G-5 at: www.epa.gov/r10earth/offices/oea/epaqap.pdf; and
3)  EPA Quality Assurance Handbook for Air Pollution Measurement Systems, Volume II: Part I,
    Ambient Air Quality Monitoring Program Quality System Development, EPA-454/R-98-004,
    August 1998.

-------
Quality Assurance Planning Document for Oversight Sampling Events
Site Name:
Sampling Date(s):
Site Background and Project Objectives: Reference the PRP's approved QAPP and attach as an
appendix to this QA planning document.


Purpose: Samples will be taken at the direction of the EPA as a subset of the overall sampling
event.
Analyte Type: (ex: VOA, Metals) & Purpose: The PRP QAPP can be referenced for this information
Analytical Information: Should be as in the PRP QAPP. Complete the following table or attach a table
with the same information:
Analyte List    CAS Number    Analytical Method (include preparation and    Quantitation Limits    Matrix
                              extraction methods as applicable)             Needed for Project
Ex: Benzene     123-1234      SOM01.1                                       1 ug/L                 water

Number of samples to be split with the PRP:

Analyte Type         Matrix         Number of Samples         Number of QC Samples + Type

Assessment Criteria: Include a discussion of what is the acceptable deviation between the PRP's analytical
results and those generated by the EPA laboratory system.
Name of PRP's Laboratory:

Data Validation Levels in PRP QAPP:
Sample Handling and Shipping Procedures; and Health and Safety Plan:  Include reference to Contractor's
Generic QAPP.
QAPP_oversight2.rtf

-------
                               Planning Checklist
Reminders: Use this process for each site, operable unit, and CERCLA phase.

All Quality Assurance Project Plans must be approved before sampling and analysis begins.
Therefore, ensure that QAPPs are sent to OASQA QA at least 30 days before sampling begins.

[ ]   Identify all stakeholders (i.e., scientific experts, risk assessors, OASQA QA,
      OASQA Analytical Team, contractors, State personnel, Quality Assurance Project
      Plan preparer, etc.)

[ ]   Use a systematic planning process and include all stakeholders. Things to define
      during planning:

      ✓   Project goals, objectives and issues to be addressed
      ✓   Project schedule, resources, milestones
      ✓   Regulatory and contractual requirements
      ✓   Type of data needed and the ways in which data will be used to support the
          project objectives and decisions
      ✓   How much data is needed
      ✓   Performance criteria for measuring the quality of the data (i.e.,
          evaluation/assessment criteria for splits, matrix spikes and duplicates)
      ✓   How, when and where the data will be obtained (including existing data)
      ✓   Constraints on data collection
      ✓   Conditions that would cause the decision maker to choose between
          alternative actions (i.e., if the mean level of arsenic is less than or equal to
          1.0 ppb, then the soil will be left in situ, otherwise the soil shall be removed
          to an approved site; see the sketch following this checklist)

[ ]   Document the results of the planning meeting in the Quality Assurance Project Plan
      (QAPP) and associated documents (i.e., Work Plan, Sampling and Analysis Plan).

[ ]   Send the Quality Assurance Project Plan (QAPP) and associated documents (i.e.,
      Work Plan, Sampling and Analysis Plan) to OASQA QA at least 30 days BEFORE
      sampling begins.  In case of emergency, contact QA to negotiate an alternate
      review schedule.

[ ]   Review OASQA QA comments and forward comments to the QAPP preparer.

[ ]   If necessary, schedule conference calls or meetings to resolve outstanding issues.

[ ]   Upon resolution of outstanding issues, approve the project's QAPP.
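
To illustrate the kind of decision rule called for in the last planning item above, here is a minimal
sketch using the arsenic example from this checklist (the 1.0 ppb action level comes from the example;
the function name and sample values are hypothetical):

    def soil_decision(arsenic_results_ppb, action_level_ppb=1.0):
        """Apply the example rule: leave soil in situ if the mean arsenic level
        is at or below the action level, otherwise remove it to an approved site."""
        mean_level = sum(arsenic_results_ppb) / len(arsenic_results_ppb)
        if mean_level <= action_level_ppb:
            return mean_level, "leave soil in situ"
        return mean_level, "remove soil to an approved site"

    # Hypothetical sample results (ppb), for illustration only.
    mean, action = soil_decision([0.8, 1.1, 0.9, 1.3])
    print(f"mean = {mean:.2f} ppb -> {action}")

Writing the decision rule out this way during planning makes the action level, the statistic (here the
mean) and the alternative actions explicit before any samples are collected.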
                                                                          Page 1 of 2

-------
                                Planning References

HSCD Quality Management Plan: Contact: Jeff Tuttle, HSCD QA Coordinator

Region III Quality Assurance Project Plan Checklist:
http://www.epa.gov/region3/esc/QA/docs_qapp.htm

EPA Requirements for Quality Assurance Project Plans, EPA QA/R-5, March 2001:
http://www.epa.gov/QUALITY/qa_docs.html

Uniform Federal Policy for Quality Assurance Project Plans:
http://www.epa.gov/swerffrr/documents/qualityassurance.htm

Guidance for the DQO Process for Hazardous Waste Sites, EPA QA/G-4HW, EPA/600/R-00/007,
January 2000: http://www.epa.gov/QUALITY/qa_docs.html

Decision Process Tool: Visual Sampling Plan, ERC VSP "Managing Uncertainty for
Environmental Decision Making":
http://www.hanford.gov/dqo/VSP_Training/ERC_VSP_Course40.html

Contract Laboratory Program (CLP) (Analytical methods):
http://www.epa.gov/superfund/programs/clp/index.htm

SW846 Methods: http://www.epa.gov/epawaste/hazard/testmethods/index.htm

Clean Water Act Methods (40 CFR):
http://www.epa.gov/epawaste/hazard/testmethods/index.htm
                                                                            Page 2 of 2

-------
                                                                        Quality Assurance Staff (3EA22)
                  US EPA Region 3                                       Environmental Science Center
                                                                        701 Mapes Road
                  QA Document Review Request                            Ft. Meade, MD 20755-5350
                                                                        410-305-2629
                                                                        R3_ESC-QA@epa.gov
Please complete this form and FEDEX, along with the QA document, or email the electronic files to the applicable address above.
 Requestor Name |	                           |    Date


 Organization            	
Mailcode                  Phone #
 Site Name / Document Title
 Superfund Account # (if applicable) I
 If Grant/Contract, Type & Number  I
Type of Document (Please check the appropriate box.)

    [ ] QAPP             [ ] SAP               [ ] FSP              [ ] Work Plan        [ ] Data

    [ ] Other    (please specify)      |


 Document Author (Organization only)  	
Document Information (Please check which applies & include requested information.)

[ ] New
[ ] Revised    Revision # |          |   Name of Original Document |

[ ] Response to comments   Name of Original QA Staff Reviewer |
 Is the document associated with a QMP?                        [ ] Yes             [ ] No


 Title & Date of QMP |
Is there an associated Master/Generic QAPP to this document?    [ ] Yes             [ ] No


Title & Date of Master/Generic QAPP |
 Date Comments Requested    	      Project/Event Start Date
Document Disposal (Please indicate what you want done with the document after review is complete.)

                                [ ] Return                    [ ] Dispose/Recycle
Notes/Comments
 For QA Staff Use Only   QA Tracking # |                    |   Consolidated Database # |                    |
                                                                                                 rev. March 2009

-------
              U.S. EPA Region III Analytical Request Form
                                               Revision 10.06
ASQAB USE ONLY
RAS#
DAS#
NSF#



Analytical TAT

Date:
            Site Activity:
Site Name:
                                              Street Address:
City:
                     State:
                Latitude:
 Longitude:
Program:
                     Acct. #
                                     CERCLIS #:
Site ID:
                     Spill ID:
                                     Operable Unit:
Site Specific QA Plan Submitted:   [ ] No   [ ] Yes    Title:
                                                                                  [   Date Approved:
EPA Project Leader:
                         Phone#:
                   Cell Phone #:
E-mail:
Request Preparer:
                         Phone#:
                   Cell Phone #:
E-mail:
Site Leader:
                         Phone#:
                   Cell Phone #:
E-mail:
Contractor:
                              EPACO/PO:
#Samples:
Matrix:
Parameter:
Method:
#Samples:
Matrix:
Parameter:
Method:
#Samples:
Matrix:
Parameter:
Method:
#Samples:
Matrix:
Parameter:
Method:
#Samples:
Matrix:
Parameter:
Method:
#Samples:
Matrix:
Parameter:
Method:
#Samples:
Matrix:
Parameter:
Method:
#Samples:
Matrix:
Parameter:
Method:
#Samples:
Matrix:
Parameter:
Method:
Ship Date From:
              Ship Date To:
                Org. Validation Level
  Inorg. Validation Level
Unvalidated Data Requested: [ ] No  [ ] Yes   If Yes, TAT Needed: [ ] 14 days  [ ] 7 days  [ ] 72 hrs  [ ] 48 hrs  [ ] 24 hrs  [ ] Other (Specify)
Validated Data Package Due:  [ ] 42 days  [ ] 30 days  [ ] 21 days  [ ] __ days  [ ] Other (Specify)
Electronic Data Deliverables Required:  [ ] No  [ ] Yes   (EDDs will be provided in Region 3 EDD Format)
Special Instructions:
FORM ARF-10/06
                                                                                               Revision 1.1

-------
                                       Region III Fact Sheet
                                   Quality Control Tools: Blanks
The primary purpose of blanks is to trace sources of artificially introduced contamination. The diagram below shows how
comparison of different blank sample results can be used to identify and isolate the source of contamination introduced in
the field or the laboratory. See page 2 for a definition of each blank, its purpose and collection frequency.
                  [Diagram: each type of blank is paired with the step in the process it monitors]

                      Equipment Blank  -  Equipment Decontamination
                      Field Blank      -  Field Sampling
                      Trip Blank       -  Sample Transport
                      Method Blank     -  Sample Prep
                      Instrument Blank -  Sample Analysis
Equipment Blank results include total field and
laboratory sources of contamination.
Field Blank results include total ambient
conditions during sampling and laboratory
sources of contamination.
Trip Blank results include shipping and
laboratory sources of contamination.
Volatiles only.
Method Blank results show only laboratory
sources of contamination.
Instrument Blank results show only laboratory
sources of contamination.
                                                                         Revision 0
                                                                         Date: November 15, 2001
                                                                         Page 1 of 2

-------
                                       FIELD BLANKS
Rinsate/Equipment Blank - A sample of analyte-free water poured over or through decontaminated field
sampling equipment prior to the collection of environmental samples.
       Purpose: Assess the adequacy of the decontamination process. Assesses contamination from the
       total sampling, sample preparation and measurement process when decontaminated sampling
       equipment is used to collect samples.
       Frequency: 1 blank/day/matrix or 1 blank/20 samples/matrix, whichever is more frequent.

Field Blank - A sample of analyte-free water poured into the container in the field, preserved and shipped
to the laboratory with field samples.
       Purpose: Assess contamination from field conditions during sampling.
       Frequency: 1 blank/day/matrix or 1 blank/20 samples/matrix, whichever is more frequent.

Trip Blank - A clean sample of a matrix that is taken from the laboratory to the sampling site and
transported back to the laboratory without having been exposed to sampling procedures. Typically
analyzed only for volatile compounds.
       Purpose: Assess contamination introduced during shipping and field handling procedures.
       Frequency: 1 blank/cooler containing volatiles.

                                   LABORATORY BLANKS
Method Blank - A blank prepared to represent the matrix as closely as possible. The method blank is
prepared/extracted/digested and analyzed exactly like the field samples.
       Purpose: Assess contamination introduced during sample preparation activities.
       Frequency: 1 blank/batch (samples prepared at one time).

Instrument Blank - A blank analyzed with field samples.
       Purpose: Assess the presence or absence of instrument contamination.
       Frequency: defined by the analytical method or at the analyst's discretion (e.g., after high
       concentration samples).

COMPARING BLANKS: The source of contamination introduced in the field or laboratory can be
deduced by comparing blank results. An equipment blank could potentially be contaminated in the field,
during transport to the lab or in the lab. The method blank, on the other hand, could only be
contaminated in the lab. Using all blanks (appropriate for the project) described in this fact sheet will
facilitate the identification of contamination sources.
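
As a minimal sketch of the comparison logic described above (the blank names follow this fact sheet,
but the simple screening function below is an illustrative assumption, not a validation procedure):

    def likely_contamination_source(contaminated_blanks):
        """Suggest where contamination was most likely introduced based on which
        blanks show the analyte. Method and instrument blanks see only the
        laboratory; equipment, field and trip blanks also see field handling or
        shipping, per the descriptions above."""
        blanks = set(contaminated_blanks)
        if {"method blank", "instrument blank"} & blanks:
            return "laboratory (method/instrument blank contaminated)"
        if "trip blank" in blanks:
            return "shipping/transport or the laboratory"
        if {"equipment blank", "field blank"} & blanks:
            return "field sampling/decontamination, transport, or the laboratory"
        return "no blank contamination reported"

    print(likely_contamination_source(["equipment blank"]))                  # field, transport or lab
    print(likely_contamination_source(["equipment blank", "method blank"]))  # laboratory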

Temperature Indicator (often called a Temperature Blank, but is not a blank) - A VOA vial or other
small sample bottle filled with distilled water that is placed in each cooler. Upon arrival at the laboratory,
the temperature of this vial is measured.  The temperature indicator or blank is not analyzed and does not
measure introduced contamination and, therefore, is not a blank.
       Purpose: To evaluate if samples were adequately cooled during sample shipment.
       Frequency: 1 blank/cooler.

Contact Mary  Ellen Schultz (410) 305-2746
                                                                         Revision 0
                                                                         Date: November 15, 2001
                                                                         Page 2 of 2

-------
                            Implementation Checklist


Ensure field procedures in the approved QAPP are followed.
   •   Ensure deviations from the QAPP are documented

   •   Review Field Audit Reports/Corrective Action Reports

   •   Ensure corrective actions are implemented, documented and assessed

   •   Ensure split & confirmation samples are evaluated
       ✓ Assessment criteria match criteria in the QAPP

Ensure analytical procedures in the approved QAPP are followed
   •   Review Analytical requests

   •   Request Proficiency Testing (PT) samples be sent to the lab. Send QAT:
       ✓ Site name and SF account number
       ✓ PRP/FEDFAC contact person authorizing the laboratory to analyze the PES
       ✓ Analytes of interest and their concentration and matrix
       ✓ Date needed
       ✓ Laboratory name, address, contact person, phone and facsimile numbers and email address.

   •   Schedule laboratory audits and review audit  reports
                            Implementation Reference
Acquisition of Analytical Services, EPA Region III User's Guide for Acquiring Analytical
Services and Analytical Request Forms:
http://www.epa.gov/region3/esc/index.htm

-------
                                      REGION III
                              DATA VALIDATION USES

Region III data validation procedures consist of five levels of data validation, three (3) organic
review procedures (M-1, M-2, M-3) and two (2) inorganic review procedures (IM-1, IM-2). The
term "level" is perhaps misleading because it implies succession, which is not the case. The
"levels" are independent of each other.  M-3 and IM-2 review procedures represent full validation
as described in the National Functional Guidelines for Data Review.  Should the intended data
use dictate review by the protocols presented in the National Functional Guidelines for Data
Review (IM-2 and M-3), refer to the Region III Modifications to the National Functional
Guidelines for Data Review (Organic, 1994 and Inorganic, 1993).  The remaining data review
levels (IM-1, M-1 and M-2) are described in the Innovative Approaches to Data Validation,
1995. All procedures require full CLP or CLP-equivalent data package deliverables.

In general, for organic Level M-1 emphasis is on reviewing positive (detected) data. The primary
question asked is whether or not a compound is present.  If it is, the next question addressed is
whether the compound is potentially from field or laboratory-induced contamination.  If the
answer is negative, then the presence of the compound is considered confirmed, and the reported
concentrations are considered usable for some predefined uses (Table 1). If information
regarding data quality and usability is required, then the emphasis is shifted accordingly to an
evaluation of data quality parameters, false negatives, and detection limits (Level M-2 or IM-1).
If legally defensible data are necessary, then a full, CLP-equivalent data validation is performed
(Level M-3 or IM-2). There is a definite focus at every step of the process. This allows for a
clear differentiation in the levels of data validation.

For guidance purposes, general data use categories and suggested levels of review are  provided in
Table 1. However, it is important to note that the selected level of review will be specific to the
intended data use and specific project objectives.

Since full CLP or CLP-equivalent data package deliverables are required, the use of Levels M-1,
M-2 and IM-1 does not preclude the data from being validated by M-3/IM-2 at a later date, if the
intended data use changes or if data is questionable for its intended purpose.
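
As a minimal sketch of the selection logic in the narrative above (the function and argument names are
hypothetical; Table 1 and the project-specific objectives govern the actual choice):

    def suggested_review_level(need_legal_defensibility, need_quality_and_usability_info, inorganic=False):
        """Map the questions posed in the narrative above to a suggested review level."""
        if need_legal_defensibility:
            # Full National Functional Guidelines review.
            return "IM-2" if inorganic else "M-3"
        if need_quality_and_usability_info:
            return "IM-1" if inorganic else "M-2"
        # Otherwise the emphasis is on confirming detected compounds only.
        return "IM-1" if inorganic else "M-1"

    print(suggested_review_level(False, False))                  # M-1
    print(suggested_review_level(False, True, inorganic=True))   # IM-1
    print(suggested_review_level(True, False))                   # M-3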
                                                                             Page 1  of 2

-------
Table 1
DATA VALIDATION / DATA USES MATRIX

                                                    ORGANIC REVIEW LEVELS    INORGANIC REVIEW LEVELS
DATA USES                                           M-1    M-2    M-3        IM-1    IM-2

SUPERFUND SPECIFIC DATA USES
Oversight                                            X                        X
Action Level comparison                              X                        X
Initial Investigation                                       X                 X
Nature & Extent                                                    X                   X
Preliminary Risk Assessment                                 X                 X
Risk Assessment with Known High Level Toxins                X                 X
Feasibility Study                                           X                 X
Treatability Study                                          X                 X
Preliminary Design                                          X                 X
Initial Cleanup Verification                                X                 X
Risk Assessment with Marginal Risk                                 X                   X
Low Level Contamination, Nature & Extent                           X                   X
Cleanup Near Detection or Action Levels                            X                   X
Use in Courts                                                      X                   X
Controversial Site                                                 X                   X
Residential Wells                                                  X                   X

RCRA SPECIFIC DATA USES
Verification Release                                        X      X*                  X*
Corrective Measures                                         X                 X
Permits/Regulatory Compliance                                      X*                  X*
RCRA Closure                                                       X*                  X*

* Can be Compound Specific
                                                                              Page 2 of 2

-------
               GLOSSARY OF DATA QUALIFIER CODES

CODES RELATED TO IDENTIFICATION
(confidence concerning presence or absence of analytes):

U =  Not detected. The associated number indicates approximate sample concentration
      necessary to be detected.

(NO CODE) = Confirmed identification.

B =  Not detected substantially above the level reported in laboratory or field blanks.

R =  Unusable result. Analyte may or may not be present in the sample. Supporting
      data necessary to confirm result.

CODES RELATED TO QUANTITATION
(can be used for both positive results and sample quantitation limits):

J  =  Analyte Present. Reported value may not be accurate or precise.

 K =  Analyte present. Reported value may be biased high.  Actual value is expected to
      be lower.

L =  Analyte present. Reported value may be biased low.
      Actual value is expected to be higher.

[ ] = Analyte present. As values approach the IDL the quantitation may not be
      accurate. (For use with Inorganic Data Validations only.)

UJ = Not detected, quantitation limit may be inaccurate or imprecise.

UL = Not detected, quantitation limit is probably higher.

OTHER CODES

Q =  No analytical result.
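
As a small illustrative sketch (the dictionary and helper name are assumptions, and the meanings are
abbreviated from the glossary above), the codes can be carried alongside results and grouped by what
they say about detection:

    # Abbreviated meanings; see the full glossary above for the complete definitions.
    QUALIFIER_CODES = {
        "U": "not detected at approximately the reported concentration",
        "B": "not detected substantially above blank levels",
        "R": "unusable result",
        "J": "analyte present; value may not be accurate or precise",
        "K": "analyte present; value may be biased high",
        "L": "analyte present; value may be biased low",
        "UJ": "not detected; quantitation limit may be inaccurate or imprecise",
        "UL": "not detected; quantitation limit is probably higher",
        "Q": "no analytical result",
    }

    def is_detected(code):
        """True when the qualifier indicates the analyte is considered present.
        An unqualified result (empty code) is a confirmed identification."""
        if not code:
            return True
        return code in {"J", "K", "L", "[ ]"}

    print(is_detected(""), is_detected("U"), is_detected("K"))   # True False True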

-------
                                 Evaluation Checklist

Data Validation
   •   Ensure analytical results are validated as documented in the approved QAPP

   •   Review Data Validation Report Narratives

Data Quality Assessment
   •   Review Project Reports using information in Data Quality Assessment: A Reviewer's
       Guide (QA/G-9R): http://www.epa.gov/quality/qs-docs/g9r-final.pdf

                                Assessment References

Region III Data Validation Policy, 1995: http://www.epa.gov/region3/esc/QA/docs_dataval.htm

CLP (data validation guidelines): http://www.epa.gov/superfund/programs/clp/guidance.htm

Region III Modifications to the National Functional Guidelines for Inorganic Data Review, April
1993.  Can be requested from http://www.epa.gov/region3/esc/QA/docs_dataval.htm

Region III Modifications to the National Functional Guidelines for Organic Data Review,
November 1994.  Can be requested from http://www.epa.gov/region3/esc/QA/docs_dataval.htm

Region III Innovative Approaches to Data Validation, June 1995.  Can be requested from
http://www.epa.gov/region3/esc/QA/docs_dataval.htm

Region III Dioxin/Furan Data Validation Guidance, March 1999.  Can be downloaded from the
Internet at http://www.epa.gov/region3/esc/QA/docs_dataval.htm

EPA Region III Interim Guidelines for the Validation of Data Generated Using Method 1668
PCB Congener Data, April 2004.  http://www.epa.gov/region3/esc/QA/docs_dataval.htm

-------