EPA-810/B-92-001

United States
Environmental Protection
Agency

Office of Water (WH-550)
Office of Pesticides and
Toxic Substances (H-7501C)

EPA 810-B-92-001
February 1992
                    QUALITY ASSURANCE PROJECT PLAN
                                   FOR THE
         NATIONAL PESTICIDE SURVEY OF DRINKING WATER WELLS
                                   Prepared for:
                          U.S. Environmental Protection Agency
                              Technical Support Division
                               Office of Drinking Water
                             26 W. Martin Luther King Drive
                               Cincinnati, Ohio 45268

-------
                                                   Section No. 1
                                                   Date: February 1992
                                                   Page 2 of 2
               APPROVAL PAGE
Director, NFS                                           Date
QA Manager-OPP                                       Date
QA Manager-ODW                                      Date

-------
                                                                              Section No. 2
                                                                              Date:  February 1992
                                                                              Page 1 of 2
                                     NATIONAL PESTICIDE SURVEY
                                  QUALITY ASSURANCE PROJECT PLAN

       2.   TABLE OF CONTENTS
         Section                                                       Pages            Date

            1.   TITLE AND APPROVAL PAGE                               2              2/92

            2.   TABLE OF CONTENTS                                     2              2/92

            3.   INTRODUCTION                                          1              2/92

            4.   PROJECT DESCRIPTION                                   9              2/92
                4.1   Introduction
                4.2  Purpose of Program Plan
                4.3  Data Quality Objectives (DQOs)
                4.4  Measurement Quality Objectives (MQOs)
                4.5  Key Features
                4.6  Summary

            5.   QUALITY ASSURANCE ORGANIZATION AND
                RESPONSIBILITIES                                        4              2/92
                5.1   QA Policy Statement
                5.2  QA Management Structure

            6.   QUALITY ASSURANCE PROJECT PLANS                      5              2/92
                6.1   Scope
                6.2  Philosophical Approach
                6.3  Minimum Criteria
                6.4  Requirements for Laboratory Plans
                6.5  Requirements for Other Plans
                6.6  Amendments to Plans

            7.   PROCESS CONTROL                                      5              2/92
                7.1   NPS Pilot
                7.2  Well Selection
                7.3  Sampling Controls
                7.4  Questionnaire Administration and Processing Controls
                7.5  Laboratory Controls
                7.6  Database Management Controls

            8.   AUDITS                                                  4              2/92
                8.1   Philosophical Approach
                8.2  Technical Systems Audits
                8.3  Audits of Data Quality
                8.4  Performance Evaluation Studies
                8.5  Corrective Action Verification

            9.   QA COMMUNICATION STRATEGIES                          3              2/92
                9.1   Status Report
                9.2  Operations Communications
                9.3  Final Report

-------
                                                                      Section No. 2
                                                                      Date:  February 1992
                                                                      Page 2 of 2
2.    TABLE OF CONTENTS (continued)

 Section                                                      Pages            Date

   10.   CLOSE-OUT ACTIVITIES                                    2             2/92
         10.1  Final Update to QA Project Plans
         10.2  Close-Out Activities

   11.   REFERENCES CITED                                                      2/92

Appendices

    A.   INFORMATION PACKET FOR NPS LABORATORIES                63             2/92
         A.1   QAPjP Guidance: Section 5
         A.2   QAPjP Guidance: Section 6
         A.3   QAPjP Guidance: Section 7
         A.4   QAPjP Guidance: Section 10
         A.5   QAPjP Guidance: Section 11
         A.6   Guidance for Revisions to QAPjPs

    B.   LABORATORY AUDIT CHECKLIST                             12             2/92

    C.   FIELD SAMPLING AUDIT CHECKLIST                         16             2/92

    D.   GENERAL NPS AUDIT CHECKLIST                             5             2/92
-------
                                                                             Section No. 3
                                                                             Date: February 1992
                                                                             Page 1 of 1
3.    INTRODUCTION
      This document was revised at the conclusion of the National Pesticide Survey (NPS) in order to
have an accurate record of all the quality assurance/quality control features that were a part of the
Survey quality effort.  Although a draft QA program plan was available at the beginning of the Survey,
a number of issues were raised during the course of the Survey that have been incorporated only into
this, the final version of the program plan.  The evolutionary nature of certain aspects of the NPS
quality program will be evident to the reader, since the document relies very heavily on original
memoranda and unpublished narratives to document the requirements and execution of the QA
program.  The use of these memos and narratives resulted in the plan being organized into two rather
distinct sections: the narrative, dealing with broad aspects of the QA program, and the appendices,
which contain many of the more detailed QC requirements, particularly with respect to analytical
methods requirements.  Readers who wish even more detailed QA information about a specific
aspect of the Survey are referred to the individual project plans, which will be available through
the National Technical Information Service (NTIS), Springfield, Virginia.
      The reader is also advised that the Environmental Protection Agency reorganized the Office of
Water during the Spring of 1991.  As a result, the Office of Drinking Water (ODW), a major sponsor of
the NPS, no longer exists but has been incorporated into the Office of Ground Water and Drinking
Water (OGWDW).

-------
                                                                             Section No. 4
                                                                             Date:  February 1992
                                                                             Page 1 of 9
4.    PROJECT DESCRIPTION
      In 1981, the United States Environmental Protection Agency (USEPA) issued a policy requiring
all environmental measurement data be collected under the auspices of a centrally managed quality
assurance (QA) program. In response to this policy, EPA formed the Quality Assurance Management
Staff (QAMS) and tasked them with developing QA guidance for all Agency environmental data
collection efforts. Guidance developed by QAMS suggests that Quality Assurance Program Plans
(QAPPs) be written to describe "the overall policies, organization, objectives, and functional
responsibilities designed to achieve data quality goals...".  The National Pesticide Survey (NPS), as a
collector of environmental data, has developed the following QAPP to address these and other quality
issues specific to the Survey.

4.1   Introduction
      The National Pesticide Survey (NPS) is a jointly sponsored effort of the USEPA Offices of
Drinking Water (ODW) and Pesticide Programs (OPP).  The Survey has two primary objectives:
      1.    To determine the frequency and concentration of pesticide contamination in the
           Nation's drinking water supplies obtained from groundwater sources.
      2.    To examine the relationships of pesticide contamination to patterns of pesticide use
           and groundwater vulnerability.
To meet these objectives, substantial resources have already been committed to planning the Survey.
In particular, a pilot study of sixteen wells was conducted during 1987 to field test essential
components of the Survey design, logistics, and QA/QC procedures.  Based on results of the pilot, a
number of modifications were recommended by Mason et al. (1988).  Because the pilot was covered
by a separate QA project plan (Kulkarni et al., 1987), only activities conducted in support of the full
Survey, from 1988 to 1991, will be covered under the QA program plan described in this document.

4.2   Purpose of Program Plan
      The NPS will be conducted with assistance from  several organizational groups within EPA and
over 10 contract organizations, all operating from locations dispersed across the country.  Given this
level of complexity, the purpose of the QA program plan (QAPP) will be to communicate minimum
standards for assuring quality to the primary organizations responsible for conducting the Survey.
Each primary organization, whether EPA or contract, will be required to write a quality assurance
project plan (QAPjP)  describing to NPS management the procedures by which their organization plans
to meet the requirements of the program plan. It will be left to the discretion of each organization,
based on their own internal operations, to describe the exact procedures which will be used to meet
Survey quality standards. Through the project plan review process, NPS management will examine
the plans for completeness and conformity to the program plan, and if necessary, provide assistance
in understanding the standards and implementing procedures to achieve them.

-------
                                                                              Section No. 4
                                                                              Date:  February 1992
                                                                              Page 2 of 9
4.3   Data Quality Objectives (DQOs)
      To ensure the usefulness of NPS data, management led an intensive planning effort prior to
starting the full Survey, with the result that precision requirements for each of the Survey's domains of
interest have been clearly defined and will be used to design and implement the Survey.  To generate
the requirements, Survey management, in discussions with their technical staff, considered the
resources necessary for achieving different levels of confidence in the national estimates for pesticide
occurrence.  Separate sets of objectives were developed for the rural domestic wells and the
community water systems and are presented in Exhibit 4-1.  Following the NPS planning effort, QAMS
institutionalized the planning process coincidentally used by the NPS as the Data Quality Objective
(DQO) process.  A report describing the Survey's planning process as it relates to the DQO process
was then written (Nees, 1988).
      Other measures of data quality which the Survey addressed were representativeness and
completeness.  For representativeness, the goal of the Survey will be to select and sample relatively
few wells, which can then be used to characterize the status of pesticide occurrence in drinking water
wells across the nation.  This goal will be addressed through the Survey's statistical design, which will
promote the selection of wells from all types of geohydrologic and pesticide use areas using
stratification.  For completeness, a minimum number of wells must be successfully sampled and
analyzed in order to meet the precision requirements as stated for each of the domains of interest.
This goal will be addressed in two ways: first, additional wells will be selected for sampling to account
for anticipated losses of data in the range of 5-10%; and second, data losses will be tracked as the
Survey progresses, in order to select additional wells if losses exceed the estimated 5-10%.
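The oversampling arithmetic behind the completeness goal can be sketched as follows.  This is an illustrative calculation only: the 750-well figure is taken from the Survey totals in Section 4.6, and the function name is an assumption, not Survey terminology.

```python
import math

def wells_to_select(required: int, loss_rate: float) -> int:
    """Number of wells to select so that, after an anticipated fraction
    `loss_rate` of the data is lost, at least `required` wells are still
    successfully sampled and analyzed."""
    return math.ceil(required / (1.0 - loss_rate))

# With a 10% anticipated loss, meeting a 750-well requirement
# means selecting roughly 834 wells at the outset.
print(wells_to_select(750, 0.10))  # -> 834
```

Tracking actual losses against the 5-10% estimate, as the text describes, then indicates whether further selections are needed.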

4.4   Measurement  Quality Objectives (MQOs)
      MQOs for the full Survey have been established for detection limits, accuracy, and precision.
Each laboratory will be required to demonstrate a limit of detection within a factor of two of that
achieved during methods development, as reported in the method description. Detection limits that
exceed this will be evaluated individually taking into consideration any known health  effects levels.
      Other MQOs established for the Survey include  accuracy and precision. Accuracy  will be
evaluated using quarterly performance evaluation samples while precision will be evaluated using
standard control chart plots of laboratory  control samples. In addition to these overall MQOs, each
analytical method has a number of quality control criteria that must be addressed in  individual QAPjPs.
All QC criteria are discussed in detail in Section 6.4 and Appendix A.
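The two MQO checks above lend themselves to simple computations.  The sketch below is illustrative only: the recovery values and function names are assumptions, not Survey data, and the control limits shown are the conventional three-standard-deviation (Shewhart-style) limits commonly used in standard control chart plots.

```python
import statistics

def detection_limit_acceptable(lab_mdl, method_mdl):
    """A laboratory's demonstrated detection limit must fall within a
    factor of two of that achieved during methods development."""
    return lab_mdl <= 2.0 * method_mdl

def control_limits(recoveries):
    """Control limits (mean +/- 3 standard deviations) for a series of
    laboratory control sample recoveries."""
    mean = statistics.mean(recoveries)
    sd = statistics.stdev(recoveries)
    return mean - 3 * sd, mean + 3 * sd

def out_of_control(recoveries, new_value):
    """Flag a new recovery that falls outside the control limits."""
    lo, hi = control_limits(recoveries)
    return not (lo <= new_value <= hi)

# Hypothetical percent recoveries for one analyte's control samples.
lcs_recoveries = [98, 101, 99, 102, 100, 97, 103, 100]
print(control_limits(lcs_recoveries))  # -> (94.0, 106.0)
```

In practice, a recovery falling outside such limits would prompt investigation under the laboratory's own corrective action procedures.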

4.5   Key Features
      The NPS QA program can only be understood within the context of the Survey's design,
implementation, and analysis plans; therefore, the key features of the Survey and their relationship to
each other, as shown in Exhibit 4-2, will be described in this section.

-------
                                                                              Section No. 4
                                                                              Date:  February 1992
                                                                              Page 3 of 9
                                          Exhibit 4-1

                                 NPS Data Quality Objectives

                                    Rural, Domestic Wells

                                                                    Domain      Probability of
 Domain Description                                                 Size, %     Detection, %

 Wells nationally                                                     1.0            63
 Wells in counties with highest average pesticide use                 0.14           75
 Wells in counties with highest average ground-water
   vulnerability                                                      0.25           75
 Wells in cropped and vulnerable parts of counties                    0.25           97
 Wells in counties with highest average pesticide use and
   ground-water vulnerability                                         0.3            47

                                   Community Water Systems

                                                                    Domain      Probability of
 Domain Description                                                 Size, %     Detection, %

 Systems nationally                                                   0.5            90
 Systems in counties having the highest average ground-water
   vulnerability                                                      0.1            60

-------
                                                       Section No. 4
                                                       Date:  February 1992
                                                        Page 4 of 9
                    Exhibit 4-2
               Survey Key Features

Step 1     Stratify           Usage
                              Vulnerability

Step 2     Select Sites       Rural, Domestic Wells
                              Community Water Systems

Step 3
  Task A   Sample             ICF
                              State Personnel
  Task B   Interview          Westat
                              State Personnel

Step 4     Analyze            Contract Labs (5)
                              Referee Labs (2)

Step 5     Report             EPA
                              Public

-------
                                                                              Section No. 4
                                                                              Date: February 1992
                                                                              Page 5 of 9
      Statistical Design - By 1987, (and therefore not subject to this QA program plan), Research
Triangle Institute (Research Triangle Park, North Carolina) had placed all counties in the US into one
of 12 strata based on pesticide usage data (obtained from the 1982 Census of Agriculture and Doanes
Marketing Research, Inc.) and groundwater vulnerability (as defined by a modified DRASTIC method
(Alexander et al., 1985)).  For the current effort, essentially two surveys will be conducted: one for
community water systems (CWS) and one for rural domestic wells.
      For the CWS, a complete listing of all CWSs can be found in the Federal Reporting Data System
(FRDS).  Using this listing, CWSs will be randomly selected from within county level strata.  With the
assistance of the well system operator, all system wells will be listed at the time of sampling and a
specific well will be selected using a random selection  process.
      For rural domestic wells, the design calls for a second stage stratification, meaning counties that
are selected for sampling will be further stratified into two sub-county strata: cropped/vulnerable and
non-cropped/non-vulnerable. Cropping data will be obtained from field interviews of county agents
about local pesticide use and cropping  practices; vulnerability assessments will be based on DRASTIC
scores developed using local hydrogeologic data, whenever possible. Strata will then be identified
from maps on which information about cropping and vulnerability has been combined and specific
wells will be randomly selected from within strata using telephone survey techniques. ICF,
Incorporated, as the prime contractor for conducting all non-analytical phases of the Survey, and
Westat, one of ICF's principal subcontractors, will be responsible for implementing the design begun
by RTI.
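The within-stratum random selection described above can be sketched as a simple stratified draw.  Everything here is a hypothetical illustration: the well identifiers and stratum contents are invented, the real sampling frames were far larger, and the actual Survey used county-level strata, maps, and telephone survey techniques rather than a prepared list.

```python
import random

# Hypothetical well identifiers grouped into the two sub-county strata
# defined for rural domestic wells.
strata = {
    "cropped/vulnerable": ["well-A", "well-B", "well-C", "well-D"],
    "non-cropped/non-vulnerable": ["well-E", "well-F", "well-G"],
}

def select_wells(strata, n_per_stratum, seed=0):
    """Randomly select n_per_stratum wells from within each stratum."""
    rng = random.Random(seed)
    return {name: rng.sample(wells, n_per_stratum)
            for name, wells in strata.items()}

selection = select_wells(strata, 2)
```

Stratifying before selection is what lets the design guarantee coverage of every combination of pesticide use and ground-water vulnerability, rather than leaving that coverage to chance.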
      Training - Implementation of the Survey design will require a significant commitment to training
because field activities will be conducted by a large group of individuals from a number of diverse
organizations including the States, EPA regional offices, ICF, and Westat. Therefore, to achieve
consistency in field operations, ICF and Westat will share responsibility for  developing and conducting
a hands-on training course on NPS field protocols.
      Sampling - The CWS samples will be collected by State sampling crews who will also administer
questionnaires to the well operator and  a local expert on pesticide usage.  The rural, domestic well
samples will be collected by ICF staff and questionnaires will  be administered by  professional
interviewers employed by Westat. All sampling logistics including preparation of the sampling
schedule, sample bottles, sample kits, and other supplies will be the responsibility of ICF.
      Chemical Analyses - Samples will be analyzed using eight  different analytical methods, several
of which were developed specifically for NPS use.  Exhibit 4-3 is a listing of the analytes associated
with each of the methods. As discussed in Section 6, five contract laboratories and three EPA referee
laboratories will perform the analyses.
      Questionnaire Processing - All questionnaires completed at the time wells are sampled will be
returned to Westat, where they will be coded and entered into a database.

-------
                                                                           Section No. 4
                                                                           Date:  February 1992
                                                                           Page 6 of 9
                                        Exhibit 4-3

                           Survey Analytes and Analytic Methods
NPS Method 1:  Gas Chromatography with a Nitrogen-Phosphorus Detector
(46 Analytes)
Alachlor
Ametryn
Atraton
Atrazine
Bromacil
Butachlor
Butylate
Carboxin
Chlorpropham
Cycloate
Diazinon*
Dichlorvos
Diphenamid
Disulfoton*
Disulfoton sulfone*
Disulfoton sulfoxide*
EPTC
Ethoprop
Fenamiphos
Fenarimol
Fluridone
Hexazinone
MGK264
Merphos*
Methyl paraoxon
Metolachlor
Metribuzin
Mevinphos
Molinate
Napropamide
Norflurazon
Pebulate
Prometon
Prometryn
Pronamide*
Propazine
Simazine
Simetryn
Stirofos
Tebuthiuron
Terbacil
Terbufos*
Terbutryn
Triadimefon
Tricyclazole
Vernolate


NPS Method 2:  Gas Chromatography with an Electron Capture Detector
(29 Analytes)
4,4-DDD
4,4-DDE
4,4-DDT
Aldrin
Chlorobenzilate*
Chloroneb
Chlorothalonil
DCPA
Dieldrin
Endosulfan I
Endosulfan II
Endosulfan sulfate
Endrin
Endrin aldehyde
Etridiazole
Heptachlor
Heptachlor epoxide
Hexachlorobenzene
Methoxychlor
Propachlor
Trifluralin
a-HCH
b-HCH
d-HCH*
g-HCH
a-Chlordane
g-Chlordane
c-Permethrin
t-Permethrin
NPS Method 3:  Gas Chromatography with an Electron Capture Detector
(17 Analytes)
2,4-D
2,4-DB
2,4,5-TP
2,4,5-T
3,5-Dichlorobenzoic acid
4-Nitrophenol*
Acifluorfen*
Bentazon
Chloramben*
DCPA acid metabolites
Dalapon*
Dicamba
Dicamba, 5-hydroxy-
Dichlorprop
Dinoseb
PCP
Picloram
NPS Method 4:  High Performance Liquid Chromatography with an Ultraviolet Detector
(18 Analytes)
Atrazine, deethylated
Barban
Carbofuran, phenol-3-keto
Carbofuran, phenol
Cyanazine
Diuron
Fenamiphos sulfone
Fenamiphos sulfoxide
Fluometuron
Linuron
Metribuzin, DA
Metribuzin, DADK*
Metribuzin, DK*
Neburon
Pronamide metabolite
Propanil
Propham
Swep

-------
                                                                           Section No. 4
                                                                           Date: February 1992
                                                                           Page 7 of 9
                                   Exhibit 4-3 (continued)
NPS Method 5: Direct Aqueous Injection HPLC with Post-Column Derivatization
(10 Analytes)

Aldicarb                        Baygon                  Carbofuran, 3-hydroxy   Oxamyl
Aldicarb sulfone                Carbaryl                Methiocarb
Aldicarb sulfoxide              Carbofuran              Methomyl

NPS Method 6: Gas Chromatography with a Nitrogen-Phosphorus Detector
(1 Analyte)

Ethylene thiourea (ETU)

NPS Method 7: Microextraction and Gas Chromatography
(5 Analytes)

Ethylene dibromide (EDB)        c-1,3-dichloropropene**
Dibromochloropropane (DBCP)     t-1,3-dichloropropene**
1,2-dichloropropane**

NPS Method 9: Automated Cadmium Reduction and Colorimetric Detector

Nitrate and nitrite measured as nitrogen (N)
*   Qualitative only.
**  Analytes previously detected by Method 8, which was dropped.

-------
                                                                              Section No. 4
                                                                              Date: February 1992
                                                                              Page 8 of 9
      Data Synthesis - Given the complexity of the NPS, data synthesis will occur in several steps.
Initially, all data will be entered into databases at the location of the individual responsible for technical
oversight of a specific area of the Survey, as follows:
      Analytical data               EPA-Cincinnati (Dave Munch, Chris Frebis)
      Questionnaire data           Westat (David Marker)
      Field measurement data      ICF (Cindy Jengleski)
      Drastic data                 ICF (Bruce Rappaport)
      The next step will be to forward the databases to ICF, who will be responsible, along with
Westat, for developing strategies to impute values for any missing data points.  The last step will be to
present the data to the public, both in interpretive reports prepared by ICF under the direction of the
NPS Director and OPP and ODW management, and in a public use database available on magnetic
media.
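The imputation strategies themselves are left to ICF and Westat and are not specified in this plan.  As a minimal placeholder sketch only, the simplest such strategy (mean imputation) looks like this; it is not the method the Survey used.

```python
def impute_mean(values):
    """Replace missing entries (None) with the mean of the observed
    values.  Purely illustrative: the actual NPS imputation strategies
    were developed by ICF and Westat and are not reproduced here."""
    observed = [v for v in values if v is not None]
    mean = sum(observed) / len(observed)
    return [mean if v is None else v for v in values]

# A hypothetical measurement series with one missing value.
print(impute_mean([1.0, None, 3.0]))  # -> [1.0, 2.0, 3.0]
```

Whatever strategy is adopted, imputing rather than discarding incomplete records preserves the sample sizes on which the Survey's precision requirements depend.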

4.6 Summary
      Well water samples and related information will be collected from approximately 750 rural
domestic wells and 600 community wells.  These wells will be used to represent the approximately 10
million rural domestic wells and 96,000  community wells in the nation which depend on groundwater
for drinking water purposes. Analysis of the well water samples will be for 126 pesticides and
associated products.  The presence and concentrations of nitrites/nitrates will also be determined. An
alphabetical listing of all Survey analytes is provided in Exhibit 4-4. In addition to collecting well water
samples, the Survey will administer questionnaires about pesticide usage, spillage, cropping  patterns,
well construction information, etc. for use in interpreting the results of the chemical analyses.

-------
                                                                        Section No. 4
                                                                        Date: February 1992
                                                                        Page 9 of 9
                                     Exhibit 4-4

                                     NPS Analytes
Acifluorfen
Alachlor
Aldicarb
Aldicarb sulfone
Aldicarb sulfoxide
Aldrin
Ametryn
Atraton
Atrazine
Atrazine, deethylated
Barban
Baygon
Bentazon
Bromacil
Butachlor
Butylate
Carbaryl
Carbofuran
Carbofuran-3-hydroxy
Carbofuran phenol
Carbofuran phenol-3-keto
Carboxin
Chloramben
a-Chlordane
g-Chlordane
Chloroneb
Chlorobenzilate
Chlorothalonil
Chlorpropham
Cyanazine
Cycloate
Dalapon
DCPA
DCPA diacid*
Diazinon
Dibromochloropropane
Dicamba
Dichlorprop
Dichlorvos
Dieldrin
Dinoseb
Diphenamid
Disulfoton
Disulfoton sulfone
Disulfoton sulfoxide
Diuron
Endosulfan I
Endosulfan II
Endosulfan sulfate
Endrin
Endrin aldehyde
EPTC
Ethoprop
Ethylene dibromide
Ethylene thiourea
Etridiazole
Fenamiphos
Fenamiphos sulfone
Fenamiphos sulfoxide
Fenarimol
Fluometuron
Fluridone
a-HCH
b-HCH
d-HCH
g-HCH
Heptachlor
Heptachlor epoxide
Hexachlorobenzene
Hexazinone
Linuron
Merphos
Methiocarb
Methomyl
Methoxychlor
Methyl paraoxon
Metolachlor
Metribuzin
Metribuzin DA
Metribuzin DADK
Metribuzin DK
Mevinphos
MGK-264
Molinate
Napropamide
 Neburon
 Nitrate/nitrite
 Norflurazon
 Oxamyl
 PCP
 Pebulate
 c-Permethrin
 t-Permethrin
 Picloram
 Prometon
 Prometryn
 Pronamide
 Pronamide metabolite*
 Propachlor
 Propanil
 Propazine
 Propham
 Simazine
 Simetryn
 Stirofos
 Swep
Tebuthiuron
Terbacil
Terbufos
Terbutryn
Triademefon
Tricyclazole
Trifluralin
Vernolate
 1,2-DCP
c-1,3-DCP
t-1,3-DCP
2,4-DB
2,4-D
2,4,5-TP
2,4,5-T
3,5-Dichlorobenzoic
acid
4-Nitrophenol
4,4-DDD
4,4-DDE
4,4-DDT
5-Hydroxy Dicamba
    Metabolite

-------
                                                                            Section No. 5
                                                                            Date: February 1992
                                                                            Page 1 of 4
5.   QUALITY ASSURANCE ORGANIZATION AND RESPONSIBILITIES
     NPS management made a strong commitment to quality during the planning of the Survey, a
commitment which will continue throughout the data collection and interpretation phases of the
Survey.  This chapter describes the formal QA organization which will be created to support
management's commitment to quality and the responsibilities which will be delegated to individuals
within the QA structure.

5.1  QA Policy Statement
     The policy of the National Pesticide Survey will be to participate in the Agency-wide quality
assurance program,  with the goal of collecting data of known and documented quality. All applicable
guidance developed by the Quality Assurance Management staff will be followed and any additional
requirements of the Office of Drinking Water and Office of Pesticide Programs will be addressed.  The
QA program will be operated with the philosophy that responsibility for quality belongs to everyone.
To institutionalize this philosophy will require an active QA program within each participating
organization, with visible support provided by the Survey both through frequent informal
communication between the laboratory project managers and the technical monitors and  through
biannual on-site technical system audits.

5.2  QA Management Structure
     The NPS QA management structure will be a refinement of the QA organization that was used
during the pilot study and will be modified to account for those changes recommended by QAMS
during their management system review (MSR) of the pilot. Recommendations by QAMS  which were
incorporated into the QA program included the hiring of a full-time QA specialist, development of a QA
Management Plan, development of QA Project Plans for field and lab activities, development of
procedures for data  review and analysis, and development of clear, concise DQOs.
     Using  Exhibit 5-1 to illustrate the structure of the QA organization, five distinct levels of
responsibility encompassing 26 individuals are identifiable.  At Level  I, leadership for quality rests with
the Director  of the Survey, who has  assigned certain responsibilities  for quality to the NPS Quality
Assurance Manager (QA Manager),  at Level II.  The QA Managers for ODW and OPP, who are also
included in Level II, will guide and assist the NPS QA Manager.  At Level III are three coordinators for
the analytical and implementation activities, while Level IV is composed of Technical Monitors who have
responsibility for oversight of day-to-day data gathering operations.  The Quality Assurance
Coordinators, at Level V, are staff employed by each organization participating in the NPS.  As
depicted, responsibility for QA policy and leadership will be the greatest at Level I and responsibility
for QA of day-to-day operations will be the greatest at Level V.

-------
                                                                Section No. 5
                                                                Date: February 1992
                                                                Page 2 of 4
                     Exhibit 5-1    NPS QA Organization

PROGRAM                                                OPERATIONS
RESPONSIBILITY                                         RESPONSIBILITY

  I    DIRECTOR

  II   QAMs

  III  COORDINATORS

  IV   TECHNICAL MONITORS

  V    QACs

-------
                                                                       Section No. 5
                                                                       Date: February 1992
                                                                       Page 3 of 4
Roles and responsibility for QA have been assigned as described below.

NPS Director - The Director has overall responsibility for Survey quality including:

      Allocating resources for QA.

      Prioritizing tasks, including QA/QC activities.

      Requesting adjustments to work processes to improve quality.

      Serving as the liaison with all parties interested in the quality of the Survey, such as
      the States, public interest groups, and groups within EPA.

OPP/ODW QA Manager - The OPP and ODW program QA Manager will be responsible for:

      Directing the QA/QC program.

      Formulation of NPS QA policy.

      Providing guidance to the NPS Director, NPS QA Manager, and others about
      program office QA requirements.

      Oversight of the activities of the NPS QA Manager.

      Resolving QA/QC issues requiring programmatic input.

      Approving QA plans and  amendments.

      Auditing, as needed.

NPS QA Manager - The QA Manager will be responsible for:

      Advising the director on quality issues.

      Leading audits of the implementation contractor and analytical laboratories, (both
      contract and referee) to assure that facilities, equipment, personnel, methods,
      records and quality control are in compliance with the QA Project Plans.

      Reporting audit findings in written reports to the Survey Director.

      Developing QA guidance documents and protocols, as needed.

      Consulting and advising the QA Managers for ODW and OPP on QA issues.

      Facilitating resolution of problems identified by Technical Monitors or Analytical
      Coordinators.

      Building consensus on compliance and corrective action issues.

      Forwarding QA documentation to ICF for archival.

      Reviewing planning documents, draft reports and communications materials.

      Presenting information on the QA Program at professional meetings.

     Analytical/Implementation Coordinators - The coordinators will be responsible for:

            Elevating issues of concern to the NPS Director.

            Informing the Technical Monitors about the progress of the Survey and quality
            issues.

            Approving payment for work that meets NPS quality criteria.

            Coordinating the data review activities of the Technical Monitors.

     Technical Monitors - The Technical Monitors will be responsible for:

            Oversight of day-to-day analytical method performance at the laboratories.

            Informing the analytical/implementation coordinators about problems.

            Facilitating resolution of problems in the work process.

            Final review of data for acceptability.

            Participation in on-site technical system and data audits.

            Resolving discrepancies identified in performance evaluation studies.

            Forwarding monthly progress reports to the NPS QA Manager.

     Quality Assurance Coordinators - The QACs will report independently of project
management and will be responsible for:

            Performing internal technical system and data audits.

            Overseeing corrective action.
     Although the QA organization in Exhibit 5-1 is presented as a hierarchical structure, no individual
in any one level is responsible administratively for an individual in the next level. Rather, because of
the importance and visibility of the Survey, management has consented to allow individuals to
participate in the QA program according to their expertise and on an as-needed basis.

6.    QUALITY ASSURANCE PROJECT PLANS

      Operational details of the NPS QA program will be described in individual quality assurance
project plans (QAPjPs). Project plans will be written for each major element of the Survey, using
QAMS guidance 005/80 for the analytical elements of the Survey. For non-analytical elements of the
Survey, each task associated with a given element will be viewed as part of a process. The overall
process will then be described in a logical order along with the QA/QC used to monitor and adjust it.
6.1 Scope

      The criteria for QAPjPs contained in this chapter will apply to each of the elements listed below;
each plan will be written by the organization indicated.

      Contract Analyses:

           Methods 1 and 3              Montgomery Labs, Pasadena, California
           Method 2                     Clean Harbors, Boston, Massachusetts
           Method 4                     Radian Corporation, Austin, Texas
           Method 5                     ES&E, Gainesville, Florida
           Method 6                     Battelle, Columbus, Ohio
           Method 7                     ES&E, Gainesville, Florida

      Referee Analyses:

           Methods 1, 3, and 6          OPP-Environmental Chemistry Section, Bay St. Louis,
                                        Mississippi
           Methods 2, 4, 5, 7, and 9    ODW-Technical Support Division, Cincinnati, Ohio, and
                                        Office of Research and Development (ORD) Risk Reduction
                                        Engineering Laboratory, Cincinnati, Ohio

      Non-Analytical Task Areas:

           Sampling                     ICF Incorporated, Fairfax, Virginia
           Second Stage Stratification  ICF Incorporated, Fairfax, Virginia
           Statistics                   Westat, Rockville, Maryland

6.2   Philosophical Approach

      The QAPjP will be viewed as each organization's agreement to implement specific procedures
for assuring the quality of its data. As long as NPS issues are addressed, the manner in which they
are addressed will be left to the discretion of the individual organization. The project plan will serve as
the reference for discussions between the contractors and the analytical coordinators and technical
monitors. During on-site technical system audits, the QAPjP will serve as the standard against which
the organization's procedures can be judged. At this time the organization's implementation of the
project plan can be verified and possible areas for improvement can be identified.

6.3   Minimum Criteria

      Minimum criteria for approval of the project plans are:

      1.    The organization's commitment to QA must be expressed.

      2.    An individual unattached to the project must be identified who can serve in a QA
            oversight capacity, with clearly identified responsibilities that will include the audit
            function.

      3.    A system for records management must be described.

      4.    A system for supervisory or peer review of data must be described.


6.4   Requirements for Laboratory Plans

      In addition to describing general aspects of the laboratories' operations, the more specific
information found in Appendix A, "Information Packet for NPS Laboratories," must be incorporated into
the laboratory QA project plans, using the format recommended by QAMS (005/80), briefly described
below.

      Section 1:  Title and Approval Page

           Labs should list the Technical Monitor, Analytical Coordinator, and Project Officer as
           the EPA personnel responsible  for approving the plan.

            Also, a distribution list should be developed to include everyone working on the
            NPS. The list should be initialed and dated to indicate that all personnel have
            received and read a copy of the plan. The list will be made available to the
            Technical Monitor.

      Section 2:  Table of Contents

           A table of contents will be included with the plan.

      Section 3:  Project Description

            A brief description of the Survey and the role of the lab in relation to it must be
            provided.

      Section 4: Project Organization and Responsibilities

            All individuals and their responsibilities with respect to NPS analyses must be
            identified. At a minimum, the program manager, quality assurance coordinator,
            sample receipt clerk, sample preparation personnel, analysts, and data clerks must
            be identified.

      Section  5: QA Objectives for Measurement Data

            Include the method for determining estimated detection limits (EDLs) and method
            reporting limits (MRLs) based on the guidance in Appendix A. Note that results
            must be forwarded to the Technical Monitor for approval.

      Describe the procedure that will be used for constructing control charts, including
      the use of Dixon's outlier test as described in Appendix A.

      Note that EPA will provide field samples for spiking to be used for a study of
      recoveries from different matrices and to study analyte stability using NPS
      preservation and analysis schemes. Spiking levels and holding times for these
      studies can be found in Appendix A.
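
      The control-chart construction described above can be illustrated with a short sketch. The
critical values shown are commonly tabulated 95%-confidence Dixon Q values (published tables
vary slightly), and the screen-then-set-limits flow is an illustration of the idea, not the exact
procedure given in Appendix A.

```python
import statistics

# Commonly tabulated critical values for Dixon's Q test at 95% confidence
# (n = 3..10); published tables differ slightly at the upper end.
Q_CRIT_95 = {3: 0.970, 4: 0.829, 5: 0.710, 6: 0.625, 7: 0.568,
             8: 0.526, 9: 0.493, 10: 0.466}

def dixon_q(values):
    """Dixon's Q statistic for the most extreme value in a small data set."""
    v = sorted(values)
    spread = v[-1] - v[0]
    return max((v[1] - v[0]) / spread, (v[-1] - v[-2]) / spread)

def control_limits(recoveries):
    """Screen one extreme value with Dixon's test, then set 3-sigma control
    chart limits from the remaining replicate recoveries."""
    data = sorted(recoveries)
    if dixon_q(data) > Q_CRIT_95[len(data)]:
        # drop whichever end of the distribution is more extreme
        data = data[1:] if data[1] - data[0] > data[-1] - data[-2] else data[:-1]
    mean = statistics.mean(data)
    sd = statistics.stdev(data)
    return mean - 3 * sd, mean, mean + 3 * sd
```

For example, in the replicate recoveries 90, 91, 92, 93, 110 the high value gives Q = 0.85,
which exceeds the tabulated value for n = 5 (0.710), so it would be excluded before the limits
are set.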

Section 6: Sampling Procedure

      The implementation contractor, ICF, has provided information in Appendix A that the
      laboratories require on sampling procedures, such as sample containers and
      preservation schemes, sample labels and IDs, and sample tracking procedures.
      This information will be included by the lab in its plan.

Section 7: Sample Custody

      ICF has provided information that needs to be included in this section, such as the
      communications system for notifying the labs about the sampling schedule and the
      labs notifying ICF about sample receipt (see Appendix A). ICF will also supply the
      labs with the procedure for returning sample kits, which should be included in this
      chapter.

      The laboratory will describe its procedures for sample receipt, sample storage for
      NPS samples and extracts, holding times and the manner in which they are tracked,
      and how it monitors environmental conditions of sample and extract storage areas.
      Policies for disposal of samples and extracts should also be provided. All
      procedures must meet the criteria presented in Appendix A.
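
      Holding-time tracking of the kind described above amounts to a simple date comparison. In
this sketch the 14-day extraction and 28-day analysis windows are placeholders only; the actual
NPS limits for each method are specified in Appendix A.

```python
from datetime import date, timedelta

def holding_times_ok(collected, extracted, analyzed,
                     max_extract_days=14, max_analyze_days=28):
    """Check that a sample was extracted and analyzed within its holding
    times. The default windows are placeholders, not the NPS limits."""
    return (extracted - collected <= timedelta(days=max_extract_days)
            and analyzed - extracted <= timedelta(days=max_analyze_days))
```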

Section 8: Calibration Procedures and Frequency

      This section should acknowledge that EPA will supply all calibration standards.
      Based on the requirements of the methods, the lab should then describe its
      calibration procedures, frequency, and QC checks. Associated procedures should
      also be described, such as those for standards preparation and retention of
      chromatograms. A statement should be included that any deviations from the
      validated method must be discussed with the Technical Monitor.

Section 9: Analytical Procedures

      A brief summary of the method should be described in this section, with the full
      method included in an  appendix. Major pieces of instrumentation  should be
      described. Batch sizes including all required QC samples and their run order
      should also be given.

Section 10: Data Reduction, Validation, Reporting

      The laboratory's system for data reduction must be described. If an automated
      system is used, its use and algorithms must be described. Peer review/supervisory
      review of the data must be described here. Data reporting must conform to the
      standard that has been provided by the NPS in Appendix A. All batch data,
      including QC and confirmation data, must be reported within 60 days of sample
      collection.

           Based on guidance in Appendix A, a fast track reporting system must be described
           in this section for confirmed positives with a known health effects level and for
           situations where results from confirmation columns do not agree with results from
           primary columns within 25%.
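
            The 25% agreement criterion above can be screened automatically. In this sketch,
            relative percent difference is used as the agreement measure, and the function
            interface is hypothetical; only the 25% threshold comes from the plan.

```python
def needs_fast_track(primary, confirmation, health_effects_level=None):
    """Flag a result for fast-track reporting: either the primary- and
    confirmation-column results disagree by more than 25% (relative percent
    difference), or the confirmed positive is at or above a known health
    effects level. The interface and RPD formula are illustrative assumptions."""
    rpd = abs(primary - confirmation) / ((primary + confirmation) / 2.0) * 100.0
    columns_disagree = rpd > 25.0
    above_hel = (health_effects_level is not None
                 and min(primary, confirmation) >= health_effects_level)
    return columns_disagree or above_hel
```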

           A system for retaining records in a retrievable manner must also be described in
           this section.

     Section 11:  Internal Quality Control Checks

            A summary of all QC checks for analyses of NPS samples must be provided,
            including the frequency of use, acceptance criteria, and corrective action.
            Confirmation procedures for positives identified on the primary column should be
            included. All these procedures should be based on guidance found in Appendix A.

     Section 12:  Performance and Systems Audits

            The laboratory will be expected to participate in NPS-sponsored quarterly
            performance evaluation studies and semiannual technical system audits. Participation
            in both should be acknowledged in this section. Also, audits conducted by the
            laboratory of its own activities should be described, including the content,
            frequency, and reporting of the audits.

     Section 13:  Preventive Maintenance

           A schedule of routine maintenance and replacement of parts should be described.

     Section 14:  Specific Procedures for Assessing  Measurement System Data

            Calculations should be described for assessing the results of QC samples, such as
            the instrument control standard, for which resolution, peak symmetry factor, and
            peak geometry factor must be calculated, and the laboratory control standard, for
            which the standard deviation and relative standard deviation must be calculated.
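
            The relative standard deviation calculation for the laboratory control standard is
            a one-line statistic; this minimal sketch assumes only that replicate results are
            summarized as sample standard deviation over the mean.

```python
import statistics

def rsd_percent(results):
    """Percent relative standard deviation of replicate laboratory control
    standard results: sample standard deviation divided by the mean, times 100."""
    return statistics.stdev(results) / statistics.mean(results) * 100.0
```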

     Section 15:  Corrective Action

            Describe the organization's procedures for taking corrective action, clearly
            identifying those individuals responsible for ensuring that problems have been
            resolved.

     Section 16:  QA Reports to Management

           Describe both the internal reports generated for management and the monthly
           reports generated for the Technical Monitor. Reference the format as found in
           Exhibit 9.1.
6.5   Requirements for Other Plans

       QAMS guidance 005/80 was written with a focus on the laboratory analysis steps of
environmental data generating processes. Because the NPS wants to cover all steps in the data
generating process, QA plans will be written for nonlaboratory aspects of the Survey as well.
Implementing this goal will require a broad and creative interpretation of the current guidance on QAPjPs,

especially with regard to discipline areas such as statistics, hydrology, and mapping. For these plans,
the QAPjP format will be modified to account for their different orientation, with the requirement that
the minimum criteria listed in Section 6.3 be addressed and that the first four sections be as
described in 005/80 (i.e., Title and Approval Page, Table of Contents, Project Description, and
Project Organization and Responsibilities). The format for the rest of the plan will be flexible,
allowing separate chapters on significant components of the work and the accompanying QC. All
standard operating procedures are to be included as appendices. Approval of the plans will be the
responsibility of the appropriate Technical Monitor, the QA Managers for OPP and ODW, and the
NPS Director.

6.6   Amendments to Plans
      Changes to the plans are anticipated and NPS management has developed a procedure for
approving amendments to the QAPjPs, which can be found in Appendix A-6.

7.    PROCESS CONTROL
      During the Survey, to achieve the levels of precision expressed in the NPS DQOs, all aspects of
the data generation process must be operated in a state of control. This will be accomplished
through monitoring of intermediate and final NPS data, using statistical process control whenever
possible. Each unique work element in the Survey and its associated QA/QC will be described in
individual QA project plans, as discussed in Section 6. The project plan will represent an agreement
for a given level of quality between the organization performing the work and Survey management.
This chapter will briefly summarize the controls that are explained in more detail in the project plans.

7.1    NPS Pilot
      Prior to collecting data for inclusion in NPS final summary reports, a pilot Survey was conducted
to field test major elements of Survey design, logistics, and QA/QC. The pilot proved to be an
extremely effective tool for improving Survey quality because it led to overall process evaluation and
improvement. Following a review by the Federal Insecticide, Fungicide, and Rodenticide Act (FIFRA)
Scientific Advisory Panel Subpanel (1987), changes were made in several areas, including the well
selection process, the questionnaires, the sampling schedule, and the analytical methods.

7.2   Well Selection
      The guiding philosophy behind all steps in the well selection process will be to meet
requirements necessary to allow for subsequent statistical analysis. Given this philosophy, the
capability will be present to associate error bounds with estimates derived during the data
interpretation phase of the Survey.  At the present stage of development of the NPS, no further
process control  can be applied to first stage stratification,  since these activities were completed prior
to the pilot. All remaining  activities performed in the sample selection process  will differ between the
rural domestic survey and the CWS survey.
      Controls for rural domestic well selection will begin with second stage stratification activities,
which first involve the collection of hydrogeologic and pesticide information, using interviews and other
means, and then involve mapping the collected information as DRASTIC vulnerability and cropping
categories.  Interviewers will be trained in proper interviewing techniques and questionnaires will be
processed using standard controls for data coding, data entry, and range and  logic checking.  To
provide consistency during mapping of DRASTIC scores, personnel will first be certified using
reference counties.
      During the Random Digit Dialing (RDD) effort, which will be used to locate qualified wells  and
well owners willing to participate in the Survey, several controls will be used.  To control the actual
telephone interviews,  interviewers will be trained and then silently monitored by supervisory personnel
for the accuracy and appropriateness of their conversations.  To track progress within each county,
control charts will  be used to monitor the number of calls required for meeting  the target number of

wells in each county. Out-of-control conditions will be investigated. RDD constitutes the third and last
stage of the well selection process for rural domestic wells.
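
      The control charts for calls per county described above might be implemented as an
individuals (X) chart with limits estimated from the average moving range; the chart type and the
calls-per-recruited-well metric are illustrative assumptions, since the plan does not specify them.

```python
def individuals_limits(calls_per_well):
    """3-sigma limits for an individuals (X) control chart, estimated from
    the average moving range. 2.66 = 3/d2 for moving ranges of size 2, a
    standard SPC constant."""
    mean = sum(calls_per_well) / len(calls_per_well)
    moving_ranges = [abs(a - b) for a, b in zip(calls_per_well, calls_per_well[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    return mean - 2.66 * mr_bar, mean + 2.66 * mr_bar

def out_of_control(history, new_value):
    """Flag a county whose latest calls-per-recruited-well falls outside limits."""
    lcl, ucl = individuals_limits(history)
    return not (lcl <= new_value <= ucl)
```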
      Controls for CWS well selection will be focused in two areas. The first will be to verify the
information in the second stage sampling frame, i.e., the FRDS list. Attempts will be made to contact
CWS owners for a large number of the entries and check the accuracy of the information in the FRDS
database, particularly for the types of errors identified during the pilot (e.g., systems listed both
individually and under parent companies, and incorrect counts of the number of wells in a system).
The second major control used in the CWS selection process will be to have a well-defined
standard operating procedure for systems with more than one well, which can be used on-site by the
sampling team leader to randomly select an individual well for sampling.

7.3   Sampling Controls
      The following process control features will be instituted to both  correct and improve on systems
used during the pilot and will apply to samples from both rural domestic wells and CWSs. Two
distinct phases occur in the sampling process: those activities performed prior to sample collection
and those activities performed during or after sample collection.  Process controls prior to sampling
will include:
           A thorough hands-on training program conducted jointly by ICF and Westat
           covering the NPS protocol for collecting samples and conducting interviews.  All
           State and contract personnel will be required to attend a  training session prior to
           collecting samples for the Survey and will be issued training manuals for later
           reference.
           Centralized computerized management of all sampling activities at ICF, including
           final scheduling of sample collection dates. ICF can therefore control the rate of
           sampling to be both compatible with laboratory capacity and with the deadline for
           sampling completion.
           ICF will contract for sample containers constructed of appropriate materials and
           precleaned and preserved to NPS specifications so that breakage and losses due to
           contamination or lack of preservative will be minimized or eliminated.
           ICF will prepare and ship to the sample teams all sample kits (sample bottles and
           coolers) and supplies  (including questionnaires and field manuals) so that the
           correct bottles, necessary equipment, and calibrated probes will always be used.
           ICF and Westat will maintain a computerized inventory control system for both
           sampling supplies and questionnaires.
Process controls that will  be  used  during and after sampling include:
           Use of standardized and documented procedures by trained sampling teams.
           Purging of wells  prior to sampling.
           Checking for  any treatment upstream of sample collection.

           Checking well water for chlorine.

           Use of the NPS Hotline (1-800-451-7896) for assistance with problem sites.

           Use of a computerized sample tracking system, to assure that scheduled sampling
           events occurred as planned and that samples reached the laboratories in
           satisfactory condition.

ICF will also maintain a problem file to document any problems affecting quality that occurred during
the sampling process. Information contained in the problem file will include the identity of the
individual who discovered the problem, the nature of the problem, who was contacted to discuss the
problem, and how the problem was resolved.


7.4   Questionnaire Administration and Processing Controls

      A large part of the data collection effort for the NPS will be through the use of questionnaires.

Process control features associated with this effort include:

           Expert evaluation of questionnaire wording and construction.

            Use of trained State personnel to administer questionnaires for the CWS survey
            and use of trained, professional Westat interviewers for the rural domestic well
            survey.

           Processing questionnaires in a controlled environment where data retrieval is
           performed as necessary, where coding decisions are checked for consistency
           between coders, and where data entry is performed twice and checked for
           accuracy.

           Development of range and logic  checks to evaluate data for reasonableness.

            Developing imputation strategies after reviewing Survey results, so that the best
            possible estimates for missing data points can be made.

            Choosing imputation classes carefully, so that donor records will be reasonable
            approximations of the missing values.

            Checking the frequency of use for each donor record, so that no single record
            contributes an inordinate number of values.

            Flagging all imputed values in the database.
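
      The range and logic checks above can be sketched as simple record-level screens. The field
names and bounds in this sketch are hypothetical examples, not NPS questionnaire definitions.

```python
def range_check(record, limits):
    """Return the fields whose values fall outside plausible bounds.
    Field names and bounds are hypothetical, supplied by the caller."""
    flagged = []
    for field, (low, high) in limits.items():
        value = record.get(field)
        if value is not None and not (low <= value <= high):
            flagged.append(field)
    return flagged

def logic_check(record):
    """Cross-field consistency check: depth to water cannot exceed well depth."""
    depth = record.get("well_depth_ft")
    water = record.get("depth_to_water_ft")
    if depth is not None and water is not None and water > depth:
        return ["depth_to_water_ft exceeds well_depth_ft"]
    return []
```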


7.5   Laboratory Controls

      The analytical portion of the Survey will be tightly  controlled in order to guarantee both the

correct identity and the correct concentration for any analytes that  are reported.  Process control

features associated with this effort include:

             Use of standardized methods.

             Use of a centralized source of standards.

           An initial demonstration of capabilities.

           Demonstrated control of the analytical measurement system through calibration
           requirements, method blanks, surrogate recoveries, internal standard responses,
           use of laboratory control standards with control charts, and instrument control
           standards.

           A study to examine potential interferences from sample matrices.

           A study to examine false negative and false positive rates through the use of a
           referee lab.

            A study to determine analyte stability for NPS sample preservation and analysis.

            Use of Method 7 for checking trihalomethane concentrations, which might indicate
            the sample was chlorinated and therefore not a valid NPS sample.

           Participation  in quarterly performance evaluation studies.


7.6   Database Management Controls

      Data will initially be stored in three databases before being consolidated into the complete NPS
database, which will then be prepared for use by the public. Process control features associated with
this effort can be placed into three categories: analytical, questionnaire, and merged database, as
described below.

      Analytical

      All laboratory data will be subjected to an automated data audit before being entered into
      the database.  Any deviations to requirements in the QAPjPs will be reviewed by the
      appropriate Technical Monitor, who will have final authority for accepting or rejecting the
      data that will be included in the database.

      The problem file maintained by ICF concerning problems encountered in the field during
      sample collection will be consulted at the end of the Survey. Any samples with problems
      that would invalidate the analytical results will be removed from the database. A
      typical problem of this type might be that the sample was not collected according to
      protocol, such as being collected following treatment by chlorination.

      Questionnaire

      Data retrieval will be performed for all 'key' data items identified by EPA as critical and for
      which additional resources should be committed in order to retrieve missing data points.

      The nonresponse rate for each data item will be  calculated in order to identify any
      problem questions  and to screen for questions which might not be amenable to
      imputation.

      The professional interviewers hired by Westat will participate in a debriefing session
      during which they will subjectively evaluate the effectiveness of the questionnaire
      instruments.

Merged Database

The merged database will clearly identify any imputed values with flags.

Stringent security measures will be implemented so that the merged databases cannot be
altered after a final QC check has been performed.

8.   AUDITS
     A schedule of frequent audits will be a major component of the QA program administered by the
NPS QA Manager. The audits will encompass analytical activities, as well as 'implementation'
activities, such as sample selection, sampling, and questionnaire administration. Audits will be
conducted to independently verify that processes are in place that are capable of achieving
agreed-upon levels of quality. The different types of audits which will be employed include technical
system audits (TSAs), audits of data quality (ADQs), and performance evaluation studies (PEs). These
are described in detail below. A fourth type of audit, a close-out audit, is described in Section 9.

8.1  Philosophical Approach
     All audits will be conducted with the philosophy that the results will be used for continuous
improvement, rather than to assign blame for deficiencies. Audits will focus on the activities of each
individual, giving equal importance to work accomplished at every level of the organizational hierarchy.
The emphasis will therefore be to hold individuals accountable for the quality of their own work.

8.2  Technical  Systems Audits
     Technical system audits (TSAs) will be used to make an on-site assessment of both the
resources and processes used to accomplish NPS work. The QAPjP will serve as the standard
against which the organization will be evaluated. As a general rule, these audits will be conducted by
a team led by the NPS QA Manager with assistance from the Technical Monitor(s) and, at times, the
QA Managers for ODW and OPP.  Depending on the organization being audited, one of three
checklists will be used, as follows:
           Laboratory Audit Checklist (Appendix B) will be used for organizations supplying
           analytical services.
           Field Sampling Audit Checklist (Appendix C) will be  used at both CWS and rural,
           domestic well sites during the collection of water samples.
           General NPS Audit Checklist (Appendix D) will be used for all office activities,  such
           as sample kit preparation, DRASTIC mapping, and questionnaire processing.
     Typically, the audit will be tentatively scheduled by the Technical Monitor at a date and time
mutually agreeable to both the organization being audited and the audit team.  The NPS QA Manager
will then provide formal notification of the audit in a letter to the project manager at the organization
being audited.
     The audit will commence with an introduction by the audit team leader describing the scope and
agenda for the audit, in addition to any relevant updates on the  progress of the Survey. At that time,
project management will be requested to provide an  update on personnel and their responsibilities.  A
tour of the facilities and/or observation of the work will occur next. The responsible party for each item
on the checklist will be interviewed, at their work station if possible, to verify that operations conform

to the QA project plan. In conjunction with the interviews, all supporting documentation will be
reviewed.
      During audits of the analytical laboratories, a data audit will be conducted, during which a
member of the audit team will manually verify the analytical result and all accompanying QC. Any
questions arising during the data audit will be directed to the analyst who performed the work.
      During audits of field operations, the auditor(s) will not interrupt the field crew, especially in the
presence of the occupants of the home where sampling occurs.  Rather the auditor(s) will act as
impartial observers and discuss any problems with the field crew in private.
      At the conclusion of the audit, a meeting will be held with the project manager during which the
audit findings will be discussed and an agreement will be reached on corrective action measures.  A
report describing the audit will be written by the team leader and circulated for review to all
participants, including the project manager, before being forwarded to the Survey Director.
      TSAs will be held semiannually for laboratory and office activities.  The goal for auditing field
activities will be 1% of all wells sampled.

8.3   Audits of Data Quality
      Audits of data quality (ADQs) will be a  major part of the laboratory technical system audit (see
Section VIII of the laboratory audit checklist) and will be conducted in order to verify the accuracy of
analytical identifications and quantifications.  The procedure will be to track a sample from the time it
was taken, through analysis, and finally to the reporting of the analytical result to the Survey database
manager.  In addition to sample results, the preparation and analysis of all QC samples and standard
solutions will be  checked and calculations will be verified.

8.4   Performance Evaluation Studies
      Performance evaluation (PE)  studies will be used to challenge the capabilities of the analytical
laboratories and will be conducted quarterly. The contents of the standards will be determined each
quarter in discussions between the NPS QA Manager and the Technical Monitors. The Technical
Monitors will consider several factors when recommending analytes for inclusion in the PE samples,
such as whether an analyte has been detected in well water samples, whether it has been difficult
for the laboratories to analyze, and whether it has been included in a previous PE sample.  The
Technical Monitors will be advised to include the analytes at three to five times the limit of
quantification, unless conditions specifically warrant a different level.
      Bionetics,  a contractor with EMSL-Cincinnati, will be responsible for preparing standard solutions
and packaging them in sealed glass ampules.  The referee laboratories will be responsible for
verifying their contents.  The QA Manager will forward the PE samples to the labs with instructions for
preparation, as shown in Exhibit 8-1. The labs will be given approximately four weeks to analyze the

                                      Exhibit 8-1

                        Example of Instructions for PE Samples
                  INSTRUCTIONS FOR NATIONAL PESTICIDE SURVEY
                        PERFORMANCE EVALUATION SAMPLES
                                   February 7, 1990
                             Methods 1, 2, 3, 4, 5, 6, and 7
As part of the Survey quality assurance program, performance evaluation samples will be analyzed
by every laboratory once each calendar quarter.  The samples will be prepared from concentrates
contained in sealed glass ampules.  Each lab will receive two identical vials per analytical method,
so that a back-up vial will be available. To prepare the samples, follow the directions given below.
The analyte concentrations will be within the normal working ranges of the method. Analyze once
and report the value as you would any other NPS samples, using the sample ID from the
concentrate vial.  In addition, please forward your result(s) in memo form to your technical monitor
no later than March 16, 1990.

DIRECTIONS: Partially fill a 1000 ml Class A volumetric flask with laboratory pure water.  Add 1
ml of concentrate. Fill the volumetric to the mark with water and invert several times to mix.  Use
only that portion of the sample that is required by the analytical method.

Please contact your technical monitor if you have any questions.
                            DUE DATE IS MARCH 16, 1990
cc:      Battelle
        Clean Harbors
        ECS
        Hunter/ESE
        Montgomery Labs
        Radian
        TSD

samples and report back to their Technical Monitor. The results will also be reported via floppy disc
along with sample batch data.
      Criteria for passing the PE samples will be developed on an analyte-by-analyte basis.  A 99%
confidence interval will be computed for the known true value of each analyte in the PE sample
based on results of seven separate analyses by the referee lab and compared to a 99% confidence
interval for the reported result.  (The latter will be computed from precision estimates from past
laboratory  performance for the given analyte).  If the two confidence intervals overlap, the criteria will
have been met.  If the intervals do not overlap, the criteria will not have been met and corrective
action must be initiated in consultation with the Technical Monitor. Each laboratory and Technical
Monitor will receive the  results for their methods. A summary report of all analyses will be prepared
and forwarded to the Survey Director and the QA Managers for OPP and ODW.
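      The overlap test described above can be sketched in Python. This is a minimal illustration,
not the Survey's actual procedure: the function names are hypothetical, and the t multipliers
(3.707 for a two-sided 99% interval with 6 degrees of freedom from seven referee analyses, and a
default multiplier of 2.576 for the reported-result interval) are assumptions, since the plan does
not give the formulas.

```python
import math
import statistics

# Student's t for a two-sided 99% confidence interval, 6 degrees of freedom
# (seven referee analyses); an assumption, not stated in the plan.
T_99_DF6 = 3.707

def confidence_interval(mean, s, n, t):
    """Two-sided confidence interval for a mean: mean +/- t * s / sqrt(n)."""
    half_width = t * s / math.sqrt(n)
    return (mean - half_width, mean + half_width)

def pe_criteria_met(referee_results, reported_value, reported_sd, reported_t=2.576):
    """Return True when the referee lab's 99% CI and the CI around the
    reported result (built from past-performance precision) overlap."""
    ref_mean = statistics.mean(referee_results)
    ref_sd = statistics.stdev(referee_results)
    ref_lo, ref_hi = confidence_interval(ref_mean, ref_sd, len(referee_results), T_99_DF6)
    rep_lo = reported_value - reported_t * reported_sd
    rep_hi = reported_value + reported_t * reported_sd
    # Two intervals overlap when each lower bound is below the other's upper bound.
    return ref_lo <= rep_hi and rep_lo <= ref_hi
```

A passing result is any reported value whose interval touches the referee interval; corrective action
would follow a False result.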

8.5   Corrective Action Verification
      Any deficiencies identified during the course of an audit will require that the audited organization
develop a plan immediately for correcting the problem. The plan must be discussed with the
Technical Monitor, and, as appropriate, with the NPS QA Manager.  At a minimum, verification of
problem resolution will occur at the next on-site audit.

-------
                                                                             Section No. 9
                                                                             Date:  February 1992
                                                                             Page 1 of 3
9.    QA COMMUNICATION STRATEGIES
      Since its inception, NPS has committed significant resources towards developing effective
communication strategies which will greatly facilitate the information exchange processes necessary
for the QA program to operate successfully.

9.1    Status Reports
      NPS management has developed a routine schedule of teleconference calls which will include
QA as an agenda item. Weekly calls will be hosted by NPS staff for key EPA and contract personnel
to discuss progress, problems, and QA issues that have developed over the past week. The calls will
serve an important function in coordinating real-time activities that affect more than one organization,
such as the impact of the sampling schedule on the analytical labs.  The calls will be used to keep NPS
management informed of the audit schedule and to report informally on audit findings.  Monthly
teleconference calls will be hosted by NPS  management for EPA staff located in the ten regional
offices who have been designated as official points of contact on Survey issues.  The overall progress
of the Survey will be described during the call, as will any associated QA issues.
      Quality assurance issues will also be  communicated in several other formats.  Specific QA
accomplishments will be described in monthly and quarterly reports by the QA  Manager to the EPA
project officer for quality assurance support, who resides at the Environmental Monitoring Systems
Laboratory (EMSL) in Cincinnati.  An annual report on QA accomplishments and plans for the
following fiscal year will be prepared by the NPS QA Manager for the OPP QA Manager for inclusion in
the OPP annual quality assurance report and workplan to QAMS.  Lastly, as described in Section 8, all
audit results will be documented in free-standing reports and forwarded to the Survey Director, the
QA Managers for the Offices of Pesticide Programs and Drinking Water, and the NPS archives operated by
ICF.

9.2   Operations Communications
      Separate communications networks will be developed for tracking the progress of operations
level activities on a day-to-day basis.  ICF will develop a computerized tracking system accessible by
both field crews and  lab personnel that will contain information on the planned dates of sample
collection and actual progress in the field. A system of frequent phone contacts will also be
encouraged between the project managers at the laboratories and the Technical Monitors, especially
in cases where problems arise with an analytical method or the laboratory experiences any type of
difficulty which would preclude its ability to analyze samples. The laboratories will also provide written
monthly progress reports to the Technical Monitors according to the format presented in Exhibit 9-1.
This level of communication should benefit the quality of Survey data because managers will have
access to information that will allow them to make the necessary adjustments for meeting Survey
goals on completeness.

                                         Exhibit 9-1
                           Format for Monthly Progress - QA Report
                                  EPA Contract Laboratories
                                    Progress - QA Report
Method #
Report Period
Analyst 	
Date  	
1.     Progress:
      # of samples received  	
      # of samples analyzed  	
      # of samples invalidated  	
      Set ID numbers forwarded to data manager
2.     Bench Level Corrective Action(s):
      Date   	
      Problem	
      Action Taken
      Verification of Correction
      Sample set analyzed prior to problem
                           (Use back of page if additional room is required.)
3.    Problems (Project Related):

4.    Information requested by Technical Monitor:
5.    Changes in Personnel:
6.    Comments:

9.3   Final Report
      Information on the QA program and its impact on the Survey will be included in the Survey's
summary interpretive reports, following all data collection and processing activities. In addition to
providing specific information about the QA program in the final reports,  each report will be reviewed
by the NPS QA Manager from a QA perspective and comments will be forwarded to the Director.

-------
                                                                              Section No. 10
                                                                              Date: February 1992
                                                                              Page 1 of 2
10.   CLOSE-OUT ACTIVITIES
      As each organization completes its work for the Survey, a final update to each of the project
plans will be required, as will participation in a close-out audit, as described below.

10.1  Final Update to QA Project Plans
      All QA/QC conducted in support of the Survey will be described in one of the QA project plans
previously listed in Section 6.  The QA project plans are designed to be "living" documents; therefore, if
planned procedures are not effective or implementable, the organization performing them will be free
to develop alternative approaches, provided the revision procedure in Appendix A-6 is followed.
These types of changes are anticipated to occur with relative frequency, and it will be the responsibility
of the organization performing the work to accurately record changes in the QA project plan.
      A final review of the project plans will  be performed by the Technical Monitors and the QA
Manager, who will request that missing updates be included before final payment is made  for the
project plans at the end of the Survey. An electronic version of the plan will also be requested after
final changes are made.  The final updates to the project plans will form an appendix in the final report
and will be available individually from the National Technical Information Service (NTIS), Springfield,
Virginia.

10.2 Data Archival and Close-Out Audit
      As requested in the Survey policy on  data archival, shown in Exhibit 10-1, every organization
participating in the Survey will be responsible for archiving  data they have generated in support of
Survey results. Basically the policy states that the data must be stored at a secure location in an
organized and retrievable fashion for a minimum of one year after release of Survey results. A close-
out audit will be conducted by the NPS QA Manager and Technical Monitors to evaluate the archival
procedures adopted by each organization and a written report will be made to the Survey Director on
their suitability. During the audit, the files will be checked for completeness and the storage space will
be inspected. The close-out audit will be the last on-site review performed under the auspices of the
NPS QA program; therefore, each organization's QAC will be responsible for providing written
verification that any necessary corrective action was implemented.

                                      Exhibit 10-1

                             Example of NPS Archival Policy
                                    ATTACHMENT 1

       NATIONAL PESTICIDE SURVEY (NPS) POLICY FOR ARCHIVAL OF RAW DATA
                                    JANUARY, 1990

POLICY STATEMENT

   It is the policy of the National Pesticide Survey that all raw data that have been collected in
support of Survey activities will be stored and managed in a systematic manner such that data may
be retrieved in a timely fashion for reference purposes.

BACKGROUND

   The NPS has conducted sampling, interviewing, and chemical analyses for approximately 1,350
drinking water well sites across the U.S. As a result, a considerable amount of "raw" data has been
generated.  To assure the continued availability of these data for reference purposes,  each
organization contributing to the Survey is requested to abide by the following guidelines.

GUIDELINES

1) Scope:

   This policy applies to all data and other materials that have been generated either through the
   analytical  effort or the implementation/data synthesis effort; both hard copy and/or electronic
   media are covered.  Also  covered are any formal documents used for  the Survey,  such as
   training manuals.

2) Accessibility:

   Materials must be managed so that they can be accessed within 5-10 working days. Access to
   materials should be limited to internal staff and representatives of the EPA.

3) Length of Storage:

   At the minimum,  materials must be stored until October of 1992.   Even after this date, the
   Survey Director  (or  designee)   must  be  contacted  for permission to  purge  Survey
   data/documents.  Survey management also reserves the right to have all files surrendered upon
   request at any time.

4) Privacy Act (5 U.S. Code 552a):

   Organizations that have recorded information on specific individuals must manage their records
   so as to guarantee confidentiality in accordance with the requirements of the Privacy Act.

APPROVAL

   This policy has been reviewed and approved by  the Director, National Pesticide Survey, 401 M
   Street, S.W., WH550, Washington, D.C. 20460.

-------
                                                                             Section No. 11
                                                                             Date: February 1992
                                                                             Page 1 of 1
11.   REFERENCES CITED

Alexander, W. J., J. H. Lehr, and L. Moller, 1985. Training Manual for Using DRASTIC Hydrogeologic
      Factors in Conducting a National Ground Water Vulnerability Assessment. Research Triangle
      Institute, unnumbered report, 168 pp.

Kulkarni, S.,  F. Smith, C. Salmons, and S. Coffey, 1987.  National Survey of Pesticides in Drinking
      Water Wells. Quality Assurance Project Plan for the Pilot Study. Research Triangle Institute,
      RTI/7801-08-01

Mason, R.E., L.L. Piper, W.J. Alexander, R.W. Pratt, S.K. Liddle, J.T. Lessler, and M.C. Ganley, 1988.
      National Pesticide Survey Pilot Evaluation Technical Report. Research Triangle Institute,
      RTI/7801/06-02F

Nees, M. and C. Salmons, 1987.  National Survey of Pesticides in Drinking Water Wells. A Review of
      the Planning Process and the Data Quality Objectives. RTI/7801/08/01F.

United States Environmental Protection Agency, A Set of Scientific Issues Being Considered by the
      Agency in Connection with the National Pesticide Survey Pilot Study. October 9, 1987, Federal
      Insecticide, Fungicide, and Rodenticide Act Scientific Advisory Panel Subpanel

United States Environmental Protection Agency, Interim Guidelines and Specifications for Preparing
      Quality Assurance Project Plans. December 29, 1980, QAMS-500/80

-------
                                               Appendix A
                                               Date: February 1992
                                               Page 1 of 63
                APPENDIX A
INFORMATION PACKET FOR NPS LABORATORIES

                        Appendix A-l

        Guidance for Quality Assurance Project Plans:

                          Section 5


Initial Demonstration of  Capabilities:   Determining Reporting Limits

Initial Demonstration of  Capabilities:   Use of Control Charts

Additional Guidance on Establishing Control Charts for NPS Methods

Dixon's Test

                     INITIAL DEMONSTRATION OF CAPABILITIES
                      DETERMINING REPORTING LIMITS 2/3/88


1.    Determine concentration of standard necessary to produce an instrument
     detector response with a 5/1 signal to noise ratio.

2.    Spike eight reagent water samples at the concentration determined above,
     and analyze in a single day.

3.    Compute the Minimum Detectable Level (MDL) by multiplying the standard
     deviation by the Student's t value appropriate for a 99% confidence
     level and a standard deviation estimate with n-1 degrees of freedom.
     (Note that for n = 8, t = 2.998 at the 99% confidence level and 7
     degrees of freedom).

4.    The Estimated Detection Level (EDL) equals either the concentration of
     analyte yielding a detector response with a 5/1 signal to noise ratio, or
     the calculated MDL,  whichever is greater.

5.    Determined EDLs must be no greater than twice those determined during
     methods development, with the following exceptions:

     a.    Method 5  target values  will  be supplied  by  EPA,  since the  EDLs
          included  in the method  were  determined using a less  sensitive
          detector  than currently available.

     b.    Methods 7  and 9 will  be evaluated by the technical monitors,
          since  target EDLs  are not  included  in the methods.

6.    The acceptability of EDLs exceeding the above limits will be determined
     by  the technical monitor,  based on health effects values.

7.    Reanalyze the standards using the confirmation column.  The EDLs
     determined on the confirmation column must equal those determined on the
     primary column.  Again, EDLs exceeding this requirement will be approved
     on a case-by-case basis by the technical monitors.

8.    The laboratories will be required to perform up  to  six  analyses per
     analyte mix by GC/MS, for the appropriate methods.    These analyses  will
     be  performed by Multiple Ion Detection (MID), using the three ions
     specified by EPA.  The purpose of these analyses is to determine the
     concentration at which a 5/1 signal to noise ratio, for the least intense
     of the three ions, is obtained.
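     Steps 2 through 4 above can be sketched in Python. The function names are
hypothetical; the t value is the one quoted in step 3 (t = 2.998, 99% confidence,
7 degrees of freedom for n = 8 replicate spikes).

```python
import statistics

# Student's t from step 3: 99% confidence level, n - 1 = 7 degrees of freedom.
T_99_DF7 = 2.998

def compute_mdl(replicates):
    """Step 3: MDL = (standard deviation of the replicate spike results) * t."""
    return statistics.stdev(replicates) * T_99_DF7

def compute_edl(replicates, conc_at_5_to_1_sn):
    """Step 4: EDL is the greater of the calculated MDL and the concentration
    yielding a 5/1 signal-to-noise ratio."""
    return max(compute_mdl(replicates), conc_at_5_to_1_sn)
```

For eight replicates spiked near the 5/1 S/N concentration, the EDL simply takes
whichever of the two values is larger.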

9.    The Minimum Reporting Levels (MRLs) are defined as the following multiple
     of the EDL:

               Method         Multiple
                  1           4 x EDL
                  2           5 x EDL
                  3           5 x EDL
                  4           5 x EDL
                  5           3 x EDL
                  6           3 x EDL
                  7           3 x EDL
                  9           3 x EDL

10.   Any chromatographic peak occurring at the proper retention time of an NPS
     analyte, at a concentration level between 0.5 x MRL and MRL, will be
     confirmed and reported as an occurrence of that analyte.  Exact
     quantification will not be required.  Any frequent occurrence of a peak
     which is not an NPS analyte, or any occurrence of a non-NPS analyte at
     what appears to be a high concentration, should be noted.

11.   The lower concentration calibration standard must be prepared at a
     concentration equal to the MRL.
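     Steps 9 and 10 amount to a lookup table plus a decision rule, sketched below.
The function names and return strings are illustrative only, not part of the Survey
procedure.

```python
# Step 9: method number -> MRL as a multiple of the EDL.
MRL_MULTIPLE = {1: 4, 2: 5, 3: 5, 4: 5, 5: 3, 6: 3, 7: 3, 9: 3}

def minimum_reporting_level(method, edl):
    """MRL for a method, given the laboratory's determined EDL."""
    return MRL_MULTIPLE[method] * edl

def reporting_action(concentration, mrl):
    """Step 10: peaks between 0.5 x MRL and MRL are confirmed and reported
    as an occurrence without exact quantification."""
    if concentration >= mrl:
        return "quantify and report"
    if concentration >= 0.5 * mrl:
        return "confirm and report occurrence (no exact quantification)"
    return "not reported"
```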

                     INITIAL DEMONSTRATION OF CAPABILITIES
                         USE OF CONTROL CHARTS  2/3/88


A.   Contractors for Methods 1-4 and 6 will be required to demonstrate control
     of the measurement system via use of control charts.  Control must be
     demonstrated for each analyte for which quantitation is required and for
     the surrogate at a concentration equal to that spiked into samples.

B.   To establish the control charts, following initial demonstration of
     capability, 5 reagent water samples will be spiked at 10 times the
     Minimum Reporting Limit (MRL) for the method and carried through
     extraction and analysis.  An additional 15 samples will be spiked and
     analyzed,  5 on each of 3 days.  The data from these 20 spiked samples
     will be used to construct control charts.

C.   Criteria for Accuracy and Precision

     1.   The RSDs  for  any  analyte must be  less  than or  equal  to  20%,
          except where  data  generated by Battelle  at the  corresponding
          level  indicated  poorer  precision.   RSDs exceeding  20% will
          be evaluated  on a case-by-case basis by the technical monitors for
          each method.

     2.   The mean  recovery (x) of each analyte  must lie between
          Battelle's mean recovery for each analyte (at the corresponding
          level)  +/-  3  times  the relative standard  deviation (RSD)  for
          that analyte  as determined  by Battelle during methods
          development,  but  no greater than  Battelle's mean recovery +/-
          30%.

          Example:

          For an analyte "A"

              i.   Battelle demonstrated recovery (x) of 80% for Analyte
                   "A" with RSD of 5%.  Acceptable recoveries will be 80%
                   +/- 3(5%) = 80% +/- 15%, i.e., 65% - 95%.

              ii.  Or, Battelle demonstrated recovery (x) of 80% with RSD
                   of 15% for analyte "A".  The acceptable recovery would
                   then be limited to 80% +/- 30%, i.e., 50% - 110%.

     3.   Surrogate

          In establishing the control chart for  the surrogate, criteria
          C(1) and C(2) above apply; it follows that one of the spike
          mixes  must  contain  the surrogate  at  the concentration as  spiked
          into actual samples.

          Surrogate recoveries  from samples  (Methods 1-4,  6-7) will be
          required  to be within +/- 30% of  the mean recovery determined
          for that  surrogate during the initial demonstration of
          capabilities.

-------
                                                                 Appendix A
                                                                 Date: February 1992
                                                                 Page 6 of 63
     4.   Warning  Limits/Control Charts

          The control charts will be drawn up so as to depict both warning
          limits (+/- 2 standard deviations (s)) and control limits
          (+/- 3s) about the mean.
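     Sections B and C(4) can be illustrated with a short sketch. The function name
is hypothetical, and the use of the sample standard deviation over the 20 spiked
controls is an assumption.

```python
import statistics

def control_chart_limits(recoveries):
    """Compute the center line, warning limits (mean +/- 2s), and control
    limits (mean +/- 3s) from the 20 spiked-control recoveries of section B."""
    mean = statistics.mean(recoveries)
    s = statistics.stdev(recoveries)  # sample standard deviation
    return {
        "center": mean,
        "warning": (mean - 2 * s, mean + 2 * s),
        "control": (mean - 3 * s, mean + 3 * s),
    }
```

Each analyte, including the surrogate, would get its own chart built this way.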

D.   Outliers

     Dixon's test will be used to determine outliers.  There can be no more
     than 3 outliers per analyte from the 20 spiked controls.
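     Dixon's test comes in several forms depending on sample size; the sketch below
implements only the simplest gap-over-range ratio, and the critical value must be
taken from a published Dixon table for the actual n. The function names and the
critical value in the usage note are illustrative assumptions.

```python
def dixon_q(values, suspect="low"):
    """Dixon's simple ratio Q = gap / range for the most extreme value.
    Compare the result against the tabulated critical value for the
    sample size before rejecting a point as an outlier."""
    x = sorted(values)
    rng = x[-1] - x[0]
    if rng == 0:
        return 0.0
    if suspect == "low":
        return (x[1] - x[0]) / rng   # gap between lowest and second lowest
    return (x[-1] - x[-2]) / rng     # gap between highest and second highest

def is_outlier(values, suspect, critical_value):
    """True when the Dixon ratio exceeds the table's critical value."""
    return dixon_q(values, suspect) > critical_value
```

For example, with a critical value of 0.642 (illustrative), a low point producing
Q = 0.9 would be flagged; at most 3 such points may be discarded from the 20 controls.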

E.   Out-of-Control Situations

     1.   In the following instances,  analytical work must be stopped
         until an "in-control"  situation is established.

         a.   More than 15%  of  the  analytes of  a particular method are
              outside +/- 3s.

         b.   The same analyte  is outside +/- 3s twice in a row, even
              though >85% of the  total analytes are in control.

     2.   An "alert"  situation  arises when one  of the  following occurs:

         a.   Three or more  consecutive points  for an analyte  are outside
              +/- 2s but  inside  the +/- 3s.

         b.   A run of 7  consecutive points above or below the mean.

         c.   A run of 7  points  for an analyte  in increasing or
              decreasing  order.

              The "alert" situation implies a trend toward an  "out-of-
              control" situation.   The contractor is required  to evaluate
              his analytical system before proceeding.  If "alert" or
              "out-of-control"  situations occur frequently, re-
              establishing control  charts may be required by the
              technical monitor before analytical work can proceed.

     3.   Other Factors

         a.   Method blank

              If the "method blank" exhibits a peak within the retention
              window of any analyte that is greater than or equal to one-
              half the MRL for that analyte, an "out-of-control"
              situation has developed.

         b.   Performance Evaluation Samples

              If  the contractor fails  on one of these samples, an "out-
              of-control" situation is present.
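     The stop-work and alert rules in E(1) and E(2) can be expressed as simple
checks. This is a sketch with hypothetical names and data structures: per-analyte
flag histories for the 3s rule, and a list of recent points for the alert rules.

```python
def stop_work(analyte_flags):
    """E(1): analyte_flags maps analyte -> list of booleans, one per sample
    set, True when that point fell outside +/- 3s.  Work stops when more
    than 15% of analytes are currently outside 3s (a), or any analyte is
    outside 3s twice in a row (b)."""
    outside_now = sum(1 for flags in analyte_flags.values() if flags and flags[-1])
    if outside_now / len(analyte_flags) > 0.15:
        return True
    for flags in analyte_flags.values():
        if len(flags) >= 2 and flags[-1] and flags[-2]:
            return True
    return False

def alert(points, mean, s):
    """E(2): alert when the last 3 points sit between 2s and 3s from the
    mean (a), the last 7 points are all on one side of the mean (b), or the
    last 7 points are strictly increasing or decreasing (c)."""
    if len(points) >= 3 and all(2 * s < abs(p - mean) < 3 * s for p in points[-3:]):
        return True
    last7 = points[-7:]
    if len(last7) == 7:
        if all(p > mean for p in last7) or all(p < mean for p in last7):
            return True
        if all(a < b for a, b in zip(last7, last7[1:])) or \
           all(a > b for a, b in zip(last7, last7[1:])):
            return True
    return False
```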

4.   Updating Control Charts

     Following establishment of the control chart, a spiked control
     is  part of each analytical or "sample set".  When 5 such
     controls have been run, the recoveries of these analytes will be
      incorporated into the control charts by adding these 5 most
      recent recoveries to the 20 original points and then deleting
      the first 5 of the original points.  Accuracy and precision are
     recalculated and the chart is redrawn.  The newly drawn chart
     will then apply to all data in sample sets subsequent to the
     last one used to update the chart.
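     The moving-window update described above can be sketched as follows; the
function name is hypothetical.

```python
def update_control_points(points, new_recoveries):
    """After 5 new spiked controls accumulate, append them and drop the
    5 oldest points, keeping the window at 20 before the chart limits are
    recalculated and the chart redrawn."""
    if len(new_recoveries) != 5:
        raise ValueError("update is performed after 5 new spiked controls")
    return points[5:] + list(new_recoveries)
```

The returned 20-point window would be fed back into the limit calculation, and the
redrawn chart applied to all subsequent sample sets.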

-------
        UNITED STATES ENVIRONMENTAL PROTECTION AGENCY
             ENVIRONMENTAL CHEMISTRY LABORATORY, NSTL STATION, MS
                              April 25,  1988
MEMORANDUM

SUBJECT:  Additional Guidance on Establishing Control Charts for NPS Methods

FROM:     Bob Maxey, NPS Analytical Coordinator
          Environmental Chemistry Laboratory
          NSTL, MS

TO:       Addressees

     In reviewing the following guidance, please refer to the attached
instructions, "INITIAL DEMONSTRATION OF CAPABILITIES, CONTROL CHARTS", which
were disseminated at the NPS meeting in Cincinnati in February.
     When you or your laboratory have completed analyses of the 4
sets of 5 samples (see Attachment, part B) and when these 20 values have
been tested for and found to contain 3 or fewer outliers, the sample mean
(x) of these 17-20 values is determined in ppb units along with the
standard deviation, also in ppb units.  Refer to your formulas in Section
14 of your QAPP.

     Divide the standard deviation (in ppb units) by the sample mean (x) in
ppb units and multiply the result by 100 to obtain the Relative Standard
Deviation (RSD).  RSD is the standard deviation expressed as a percentage of
the sample mean (x).

     The guidance in the Attachment, Section C-2, calls for x as a mean
recovery, and I should have more clearly specified R as the mean recovery.
Divide the value x of an analyte (in ppb, from the second paragraph above)
by the spiking level, in ppb, for that analyte and multiply by 100 to
obtain R, the mean recovery in percent of the 17-20 values.  Repeat for each
analyte in the Method.

-------
     R becomes the central line on the control chart about which upper
(UWL, UCL) and lower (LWL, LCL) warning and control limits are depicted (see
Attachment, Section C-4).  The warning limits become R +/- 2(RSD), and the
control limits become R +/- 3(RSD).  There is a separate but similarly
constructed control chart for each analyte, including the surrogate.
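     The computation the memo describes (RSD from the ppb values, R from the
spiking level, limits in percent) can be sketched as below; the function name is
hypothetical.

```python
import statistics

def recovery_chart_parameters(measured_ppb, spike_level_ppb):
    """Per the memo: RSD = 100 * s / mean (both in ppb); R = 100 * mean /
    spike level; warning limits R +/- 2(RSD); control limits R +/- 3(RSD)."""
    mean = statistics.mean(measured_ppb)
    s = statistics.stdev(measured_ppb)
    rsd = 100.0 * s / mean
    r = 100.0 * mean / spike_level_ppb
    return {
        "R": r,
        "RSD": rsd,
        "warning": (r - 2 * rsd, r + 2 * rsd),
        "control": (r - 3 * rsd, r + 3 * rsd),
    }
```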

     I apologize to those of you confused  by the control chart instructions.
If you have additional questions, call me at FTS 494-1225 or commercial (601)
688-1225.

Addressees:
Dave Munch
Dr. Robert Clark
Dr. A. Dupuy, Jr.
Dr. C. Byrne
W. Dreher
J. Watkins

cc:
Dr. A. Yonan

-------
                   INITIAL DEMONSTRATION OF CAPABILITIES
                              CONTROL CHARTS
A.   Contractors for Methods 1-4, and 6, will be required to deaonstrate
     control of the measurement system via use of control charts.  Control
     •ust be demonstrated for each analyte for which quantisation is
     required and for the surrogate at a concentration equal to that
     spiked into samples.

B.   To establish the control charts, following initial demonstration of
     capability, 5 reagent water samples will be spiked at 10 times the
     Minimal Reporting Level (MRL) for the method and carried through
     extraction and analysis.  An additional 15 samples will be spiked and
     analyzed, 5 on each of 3 days.  The data froa these 20 spiked samples
     will be used to construct control charts.

C.   Criteria for Accuracy and Precision

     1.   The RSDs for any analyte must be s20%, except where data,
          generated by Battelle  at the corresponding level, indicated
          poorer precision.  The RSDs exceeding 20% will be evaluated on  a
          case-by-case basis by  Technical Monitors for each method.

     2.   The mean recovery (x) of each analyte must lie between
          Battelle's mean recovery for each analyte (at the corresponding
          level) ± 3 times the RSD for that analyte as determined by
          Battelle during methods development, but no greater than
          Battelle's mean recovery ± 30%.

          Example:

          For an analyte "A"

               o   Battelle demonstrated recovery (x) of 80% for Analyte
                   "A" with RSD of 5%.  Acceptable recoveries will be 80%
                   ± 3 (5%) = 80% ± 15% = 65% - 95%;

               o   or, Battelle demonstrated x of 80% with RSD of 15% for
                   analyte "A".  The acceptable recovery would be limited
                   to 80% ± 30% = 50% - 110%.
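
The rule in C(2), including both worked cases above, reduces to a half-width of min(3 × RSD, 30) percentage points. A sketch, with an illustrative function name:

```python
def recovery_window(battelle_mean, battelle_rsd):
    """Acceptance window per C(2): Battelle's mean recovery +/- 3 times
    Battelle's RSD, capped at +/- 30 percentage points.  All values in
    percent; a sketch, not the controlling text."""
    half_width = min(3.0 * battelle_rsd, 30.0)
    return (battelle_mean - half_width, battelle_mean + half_width)

# The two cases for analyte "A" above:
assert recovery_window(80, 5) == (65.0, 95.0)     # 80 +/- 15
assert recovery_window(80, 15) == (50.0, 110.0)   # capped at +/- 30
```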

     3.   Surrogate

          In establishing the control chart for the surrogate, criteria in
          C(1) and (2) above apply; it follows that one of the spike
          mixes must contain the surrogate at the concentration as spiked
          into actual samples.

          Surrogate recoveries from samples (Methods 1-4, 6-7) will be
          required to be within ± 30% of the mean recovery determined for
          that surrogate during the initial demonstration of capabilities.

-------
     4.   Warning Limits/Control Limits

          The control charts will be drawn up so as to depict both warning
          limits (± 2σ) and control limits (± 3σ) about the mean.

D.   Outliers

     Dixon's test will be used to determine outliers.  There can be no
     more than 3 outliers per analyte from the 20 spiked controls.

E.   Out-of-Control Situations

     1.   In the following instances, analytical work must be stopped
          until an "in-control" situation is established.

          a.  More than 15% of the analytes of a particular method are
              outside ± 3σ.

          b.  The same analyte is outside ± 3σ twice in a row, even
              though >85% of the total analytes are in control.

     2.   An "alert" situation arises when one of the following occurs:

          a.   Three or more consecutive points for an analyte are outside
               ± 2σ but inside the ± 3σ limits.

          b.   A run of 7 consecutive points above or below the mean.

          c.   A run of 7 points for an analyte in increasing or
               decreasing order.

               The "alert" situation implies a trend toward an "out-of-
               control" situation.  The contractor is required to evaluate
               his analytical system before proceeding.  If "alert" or
               "out-of-control" situations occur frequently, re-establishing
               control charts may be required by the Technical Monitor
               before analytical work can proceed.
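
The "alert" conditions above can be checked mechanically. The sketch below uses the ±2σ/±3σ chart convention from the cover memo, names of my own choosing, and treats a 7-point monotone run as 6 consecutive increases or decreases:

```python
def alert_flags(points, mean, rsd):
    """Check the "alert" conditions for one analyte's spiked-control
    recoveries, in run order.  Warning limits are mean +/- 2*RSD and
    control limits mean +/- 3*RSD.  Illustrative sketch only."""
    warn_lo, warn_hi = mean - 2 * rsd, mean + 2 * rsd
    ctrl_lo, ctrl_hi = mean - 3 * rsd, mean + 3 * rsd

    def has_run(flags, n):
        # True if `flags` contains n consecutive True values.
        streak = 0
        for f in flags:
            streak = streak + 1 if f else 0
            if streak >= n:
                return True
        return False

    between_2_and_3 = [ctrl_lo <= p <= ctrl_hi and not (warn_lo <= p <= warn_hi)
                       for p in points]
    above = [p > mean for p in points]
    below = [p < mean for p in points]
    rising = [b > a for a, b in zip(points, points[1:])]
    falling = [b < a for a, b in zip(points, points[1:])]
    return {
        "outside_2sigma_x3": has_run(between_2_and_3, 3),             # condition a
        "run_of_7_one_side": has_run(above, 7) or has_run(below, 7),  # condition b
        "monotone_run_of_7": has_run(rising, 6) or has_run(falling, 6),  # condition c
    }
```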

     3.   Other Factors

          a.   Method blank

               If the "method blank" exhibits a peak within the retention
               window of any analyte and is greater than or equal to
               one-half the MRL for that analyte, an "out-of-control"
               situation has developed.

          b.   Performance-Evaluation Samples

               If the contractor fails on one of these samples, an
               "out-of-control" situation is present.

-------
F.   Updating Control Charts

     Following establishment of the control chart, a spiked control(s) is
     part of each analytical or "sample set".  When 5 such controls have
     been run, the recoveries of these analytes will be incorporated into
     the control charts by adding these 5 most recent recoveries to the 20
     original points and then deleting the first 5 of the original points.
     Accuracy and precision are recalculated and the chart re-drawn.  The
     newly drawn chart will then apply to all data in sample sets
     subsequent to the last one used to update the chart.
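
A sketch of the rolling update just described, with illustrative names (the QAPP text governs the actual procedure):

```python
import statistics

def update_chart(points, new_five):
    """Add the 5 most recent spiked-control recoveries, drop the 5 oldest
    of the existing 20 points, and recompute accuracy (mean) and
    precision (RSD).  Sketch only."""
    if len(points) != 20 or len(new_five) != 5:
        raise ValueError("expected 20 existing points and 5 new recoveries")
    updated = list(points[5:]) + list(new_five)     # still 20 points
    mean = statistics.mean(updated)
    rsd = 100.0 * statistics.stdev(updated) / mean
    return updated, mean, rsd
```

The recomputed mean and RSD then redraw the chart that applies to all subsequent sample sets.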

-------
                                                                 Appendix A
                                                                 Date: February 1992
                                                                 Page 13 of 63
DIXON'S TEST

Dixon's test is used to confirm  the  suspicion  of outliers of a set of data
(for example, control chart data points).   It  is based on ranking the data
points and testing the extreme values  for credibility.   Dixon's test is based
on the ratios of differences between observations and does not involve the
calculation of standard deviations.

The procedure for Dixon's test is as follows (from Taylor, 1987):

     1)   The data is ranked in order of increasing numerical value.  For
         example:

               X1 < X2 < X3 < ... < Xn-1 < Xn

     2)   Decide whether the smallest, X1, or the largest, Xn, is
          suspected to be an outlier.

     3)   Select  the risk you are willing to take for false rejection.
         For use  in this QAPP we will  be using a 5% risk of false
         rejection.

     4)   Compute one of the ratios in Table 1.  For use in this QAPP we
          will be using ratio r22, since we will be using between 17 and
          20 points for the control charts.

     5)   Compare the ratio calculated in Step 4 with the appropriate
          values in Table 2.  If the calculated ratio is greater than the
          tabulated value, rejection may be made with the tabulated risk.
          For this QAPP we will be using the 5% risk values (bolded).

Example (from Taylor)

     Given the  following set  of ranked data:

         10.45,  10.47, 10.47, 10.48, 10.49,  10.50,  10.50,  10.53,  10.58

     The value  10.58  is  suspected of being  an outlier.

     1)   Calculate r11

               r11  =  (10.58 - 10.53)/(10.58 - 10.47)  =  0.05/0.11  =  0.454

     2)   At a 5% risk of false rejection (Table 2), r11 = 0.477

     3)   Therefore there is no reason to reject the value 10.58.

     4)   Note that at a 10% risk of false rejection r11 = 0.409, and the value
          10.58 would be rejected.
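
The arithmetic above can be sketched directly. The function names are my own; the ratio definitions follow Table 1 and the critical values are read from Table 2 at the 5% risk this QAPP specifies:

```python
def dixon_r11_high(data):
    """Dixon ratio r11, largest value suspect (8 <= n <= 10):
    (Xn - Xn-1) / (Xn - X2).  Sketch after Taylor (1987)."""
    x = sorted(data)
    return (x[-1] - x[-2]) / (x[-1] - x[1])

def dixon_r22_high(data):
    """Dixon ratio r22, largest value suspect (14 <= n <= 25); this QAPP
    uses r22 for its 17-20 control-chart points:
    (Xn - Xn-2) / (Xn - X3)."""
    x = sorted(data)
    return (x[-1] - x[-3]) / (x[-1] - x[2])

# Taylor's example: nine ranked observations, 10.58 suspect.
data = [10.45, 10.47, 10.47, 10.48, 10.49, 10.50, 10.50, 10.53, 10.58]
r = dixon_r11_high(data)          # (0.05 / 0.11), about 0.454
assert r < 0.477                  # below the 5% critical value: retain 10.58
```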

-------
                                                                      Appendix A
                                                                      Date:  February 1992
                                                                      Page 14 of 63
                                      TABLE  1

                              CALCULATION OF RATIOS
     Ratio     For use if       if Xn is                  if X1 is
               n is between     suspect                   suspect

     r10          3 - 7         (Xn - Xn-1)/(Xn - X1)     (X2 - X1)/(Xn - X1)

     r11          8 - 10        (Xn - Xn-1)/(Xn - X2)     (X2 - X1)/(Xn-1 - X1)

     r21         11 - 13        (Xn - Xn-2)/(Xn - X2)     (X3 - X1)/(Xn-1 - X1)

     r22         14 - 25        (Xn - Xn-2)/(Xn - X3)     (X3 - X1)/(Xn-2 - X1)

Note that for use in this QAPjP ratio r22 will be used.

-------
                                                                    Appendix A
                                                                    Date: February 1992
                                                                    Page 15 of 63
                                     TABLE 2

               VALUES FOR USE WITH THE DIXON TEST FOR OUTLIERS



                                          Risk of False Rejection

     Ratio      n          0.5%          1%            5%            10%

     r10        3         0.994        0.988         0.941         0.806
                4         0.926        0.889         0.765         0.679
                5         0.821        0.780         0.642         0.557
                6         0.740        0.698         0.560         0.482
                7         0.680        0.637         0.507         0.434

     r11        8         0.725        0.683         0.554         0.479
                9         0.677        0.635         0.512         0.441
               10         0.639        0.597         0.477         0.409

     r21       11         0.713        0.679         0.576         0.517
               12         0.675        0.642         0.546         0.490
               13         0.649        0.615         0.521         0.467

     r22       14         0.674        0.641         0.546         0.492
               15         0.647        0.616         0.525         0.472
               16         0.624        0.595         0.507         0.454
               17         0.605        0.577         0.490         0.438
               18         0.589        0.561         0.475         0.424
               19         0.575        0.547         0.462         0.412
               20         0.562        0.535         0.450         0.401
               21                      0.524         0.440         0.391
               22                      0.514         0.430         0.382
               23                      0.505         0.421         0.374
               24                      0.497         0.413         0.367
               25                      0.489         0.406         0.360

Note that for this QAPjP the 5% risk level will be used for ratio r22.

-------
                                                                    Appendix A
                                                                    Date: February 1992
                                                                    Page 16 of 63
Reference:
     John K. Taylor, Quality Assurance of Chemical Measurements, Lewis
     Publishers, Chelsea, MI, 1987.

-------
                         Appendix A-2

       Guidance for Quality Assurance Project Plans:

                          Section  6


Sample Container  Types,  Sizes, and Preservatives

Sample Label and  ID Information

Sample Tracking Form
                                                         Appendix A
                                                         Date: February 1992
                                                         Page 17 of 63

-------
                                                                    Appendix A
                                                                    Date: February 1992
                                                                    Page 18 of 63
             Sample Container Preservation Requirements

NPS METHOD                       BOTTLE, CAP, AND PRESERVATIVE

NPS-1                        Clear 1-liter borosilicate glass bottles with Teflon-lined caps
                             with 10 ml of mercuric chloride.

NPS-2                        Clear 1-liter borosilicate glass bottles with Teflon-lined caps
                             with 10 ml of mercuric chloride.

NPS-3                        Clear 1-liter borosilicate glass bottles with Teflon-lined caps
                             with 10 ml of mercuric chloride.

NPS-4                        Clear 1-liter borosilicate glass bottles with Teflon-lined caps
                             with 10 ml of mercuric chloride.

NPS-5                        250 ml amber screw-cap glass bottles with Teflon-
                             faced septa with 7.5 ml of pH 3 buffer.

NPS-6                        60 ml screw-cap glass bottles with Teflon-faced septa with
                             0.6 ml of mercuric chloride.

NPS-7                        60 ml screw-cap glass bottles with Teflon-faced septa with
                             0.6 ml of mercuric chloride.

NPS-9                        125 ml polyethylene bottles with 0.25 ml of sulfuric acid.

Volume of preservative should be ≤ 1% of the final sample volume for all methods.
Concentration of the preservative should be 10 mg/L for all methods.

-------
                                                                        Appendix A
                                                                        Date: February 1992
                                                                        Page 19 of 63
               Description of Sample Code Numbers for Sampling
                      Scenario and Lab Spike Assignments

Sample number format:  Well Type - ID Number - Lab - Method Number - Sample Type
                       (examples of ID Number:  0001, 1500)

Well Type:
     PC = Community Well
     PD = Domestic Well
     PR = Resampled Well
     PB = PE Sample

Lab performing the analyses for the NPS:
     1 = JMM (Montgomery Laboratories)
     2 = ATI (Alliance Technologies Inc./Clean Harbors Analytical Services)
     3 = RAD (Radian, Inc.)
     4 = ESE (ES&E)
     5 = BCL (Battelle, Columbus Division)
     6 = BSL (Bay St. Louis (EPA/Environmental Chemistry Lab))
     7 = TSD (EPA/Technical Support Division Lab)

Method Number:  1, 2, 3, 4, 5, 6, 7, or 9

Sample Type:                                             FS = Field Sample
     01 = Field Sample                                   LS = Lab Spike
     02 = Shipping Blank                                 TS = Time Storage
     03 = Backup sample
     04 = Lab spike (mix A, level 1) = A0
     05 = Lab spike (mix A, level 2) = A1
     06 = Lab spike or time storage - Day 0 (mix A, level 3) = A2
     07 = Lab spike (mix B, level 1) = B0
     08 = Lab spike (mix B, level 2) = B1
     09 = Lab spike or time storage - Day 0 (mix B, level 3) = B2
     10 = Lab spike (mix C, level 1) = C0
     11 = Lab spike (mix C, level 2) = C1
     12 = Lab spike or time storage - Day 0 (mix C, level 3) = C2
     13 = Time storage (Day 0 duplicate)
     14 = Time storage - Day fourteen
     15 = Time storage - Day fourteen duplicate
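
The code layout above lends itself to a small parser. The dictionaries below restate the tables as read from this copy (lab abbreviations RAD, TSD, etc.), and the function name is my own; a sketch, not part of the Survey's software:

```python
WELL_TYPES = {"PC": "Community Well", "PD": "Domestic Well",
              "PR": "Resampled Well", "PB": "PE Sample"}
LABS = {"1": "JMM", "2": "ATI", "3": "RAD", "4": "ESE",
        "5": "BCL", "6": "BSL", "7": "TSD"}

def parse_sample_number(sample_no):
    """Split an NPS sample number such as 'PD-9999-7-7-01' into
    well type, well ID, lab, method, and sample-type code."""
    well_type, well_id, lab, method, stype = sample_no.split("-")
    return {"well_type": WELL_TYPES[well_type], "well_id": well_id,
            "lab": LABS[lab], "method": method, "sample_type": stype}

info = parse_sample_number("PD-9999-7-7-01")   # the example bottle label below
assert info["lab"] == "TSD" and info["method"] == "7"
```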

-------
                                      Appendix A
                                      Date: February 1992
                                      Page 20 of 63
NATIONAL PESTICIDE SURVEY
SAMPLE #:

      METHOD#          KIT:
PRESERVATIVE:
 DATE   !   TIME   !  SAMPLER
    Blank Sample Bottle Label

NATIONAL PESTICIDE SURVEY

SAMPLE #: PD-9999-7-7-01

TSD - METHOD# 7   KIT# 711
FIELD SAMPLE
PRESERVATIVE:  HgCl2
 DATE   !  TIME  !  SAMPLER
    Example Sample Bottle Label

-------
                                                                      Appendix A
                                                                      Date:  February 1992
                                                                      Page 21 of 63

                NPS Well Water Pesticide Survey -- Sample Tracking Form

     [Facsimile of the two-part Sample Tracking Form; legible fields include
     the well ID number and the sample collection date.]

-------
                                                         Appendix A
                                                         Date:  February 1992
                                                         Page 22 of 63
                         Appendix A-3

        Guidance for Quality Assurance Project Plans:

                          Section 7


NPS Sample Tracking Program

Minimum Requirements for Sample Management and Spiking

Spiking Update

-------
                       THE NPSIS SAMPLE RECEIPT PROGRAM


     NPSIS is designed to keep track of the day-to-day operations of the
National Pesticide Survey.  You play an important role in NPS and your timely
notification of receiving a kit of samples is essential to the success of NPS.
We have designed the Sample Receipt Program with your busy schedule in mind.
NPSIS will obtain the minimum amount of information necessary while still
maintaining a secure system.  You will be entering data into the NPSIS
personal computer via your own computer, modem, and Carbon Copy software.

1.1  Hardware and Software Requirements.


     The NPSIS Sample Receipt Program has a minimum hardware and software
requirement.  Here is a list of items you will need:
          Hardware:
                    One (1) IBM PC, XT, AT, or Personal System model with at
                    least 640K memory.

                    One (1) 2400 or 1200 baud Hayes or Hayes-compatible modem
                    with cables.  (See Carbon Copy guide for cabling
                    requirements and a description of usable modems.)

                    One (1) data transmission phone line.
          Software:
               •    NPSIS Sample Receipt Program access provided for you by
                    ICF.

               •    One (1) copy of Carbon Copy software, which is provided to
                    you by ICF for the duration of NPS.
1.2  Initial Installation Steps.

     Before you can access and use NPSIS, you must first load the Carbon Copy
software onto your PC.  The directions are provided in the Carbon Copy manual.
One item you will want to include is an entry into the "Call Table".  This
entry will include a name, telephone number, and password for the NPSIS
computer.  To enter these items into the Call Table, press "2" from the Carbon
Copy Parameters Screen.  The information you must enter consists of the
following:

-------
                    Name:  NPS

                    Telephone Number:  703-$*t-0629

                    Password:  NPS
1.3  Parameters for Communication.

     NPSIS will maintain a set configuration throughout operation.  Any
changes due to updates in equipment or the system which will affect your
ability to communicate through Carbon Copy will be forwarded to you.  The
parameters which will be maintained at this time are:

               •    2400 baud modem speed.

               •    Answer ring count equal to one.

               •    Re-boot on exit after 5 minutes.  (If there is a power
                    failure or some other type of interruption, you can log
                    back on to NPSIS and resume your session.)

               •    Five minute inactivity time constraint.

               •    Two password attempts.


                          2.  LOGGING ON TO NPSIS.

2.1  Establishing A Communications Link

     Once you have installed Carbon Copy and have all of the necessary
hardware, you are ready to "log on" to the NPSIS computer at ICF.  To do this:

          Type: C:> CCHELP NPS    in your directory containing Carbon Copy.

This command will automatically dial the NPSIS computer, send your password
for verification, and establish a data link between the two computers.  You
will be able to discern what is taking place by messages to your screen.

2.2  Entering A Sample Receipt Into NPSIS.

     Once you have established a data link (e.g., are "logged on"), you will
see on the screen exactly what is on the screen of the NPSIS computer.  This
screen you are viewing is the main menu for the Sample Receipt Program.
Remember that you are controlling the NPSIS computer via a 2400 baud phone
line and your typing will appear on the screen at a much slower rate than you
are accustomed to.  A few tips on how to use the system are outlined in the
next section.

-------
       Useful Tips on How to Use NPSIS.

     Before you start, a few things to remember are:

     •    Pressing the "Esc" key will cancel all changes for the screen you
          are currently in and return you to the previous screen.  Pressing
          "Esc" at the Searching Screen returns you to the main menu.

     •    Pressing "PgDn" or "PgUp" will save the items you have entered in
          the current screen and place you in the next or previous screen,
          respectively.  This feature is handy to use when you only have a few
          items to enter in a screen which prompts for several items.

     •    Pressing "Enter", "arrow up", or "arrow down" will move the cursor
          from field to field in each screen.  Remember that using the
          sideways arrows will not work.

     •    Pressing the "Alt" and "Right Shift" keys together will place the
          Carbon Copy Control Screen over the NPSIS Sample Receipt Program.
          You can then use the communications features in Carbon Copy.
          Pressing "F10" again when you are through will replace the NPSIS
          Sample Receipt Program screen you were currently in back on your
          screen,  and

     •    Because you are most likely to be entering information
          regarding a number of kits at one time, after you save or cancel
          your entries for one kit, you will be placed at the initial Sample
          Searching Screen for a new kit.  If you are finished with your data
          entry, simply press "Esc" to exit the Sample Searching screen and be
          placed in the main menu.
2.3  A Basic Outline of the Sample Receipt Program

     The NPSIS Sample Receipt Program has three basic features:

               •    Initial reporting of a NPS sample kit of sample bottles.

               •    Ability to edit or re-edit an existing report of a kit
                    receipt, and

               •    Access to ICF's computerized mail system which provides the
                    ability to send memoranda to ICF staff.

     The information obtained in an entry for a kit of bottles is:

               •    The kit identification number, the FedEx airbill number,
                    and the last name of the person making the entry.

               •    Any damage to the kit as a whole such as melted ice or any
                    breakage of the cooler.

-------
               •    Verification of which bottles belong in a kit or cooler,
                    notification of any missing bottles or any additional
                    bottles, and

               •    Any damage to each sample bottle which renders it unusable
                    for analysis and testing.
2.4  NPSIS Sample Receipt Program Screens

     When you have completed the logon procedure, you will see the following
main menu on your computer screen:

                NATIONAL PESTICIDE SURVEY INFORMATION SYSTEM

                SELECTION MENU FOR REPORTING SAMPLE RECEIPTS    04/05/88

                         Report \ Edit a Sample Receipt
                         Send a Memo

                         Press <Esc> to Logoff
                        use ↑ and ↓ to select option.

     The screens provided in this memo will show all of the screens available
and thus represent the maximum number of screens you will encounter with
NPSIS.  It is most likely that you will not have the need to enter information
reporting damaged kits or samples.  Therefore, not all of the screens depicted
below will appear in your normal session.

     If you choose the first item on the menu, "Report \ Edit a Sample
Receipt", you will then be prompted for the kit identification number and the
FedEx airbill number associated with the specified kit.  The screen will
appear like this:

-------
               NPS Sample Receipt Searching Screen

         ** Enter the following items to access kit information **

         To find the Kit information in NPSIS in the most complete
         and accurate fashion, please enter the Kit number and the
                         FedEx airbill number.

          Enter kit number:
         -----> PD-0001-151

          Enter FedEx airbill #:
         -----> 1111111111

          Enter your last name:
         -----> CHIANG

                  * Press ESC to exit the searching *

     If the kit number you have entered is incorrect, or if the kit number and
FedEx airbill number combination is incorrect, NPSIS will prompt you to try to
enter these numbers again, as illustrated on the next page.  It is possible
that the FedEx airbill number on the kit is not the same as the FedEx airbill
number which was entered into the NPSIS system.  This could happen if the
field team loses or damages the airbill.

-------
       ERROR!!   The kit you entered cannot be found . . .

                 Kit number:  PD-0001-151
                                    AND
        FedEx airbill number:  1111111111

             Please check these numbers and try again!

        *****************************************************************
         NPSIS is designed to track Kits and FedEx airbill numbers.
         The Kit and FedEx airbill number combination you have entered
         does not match what is currently in the system.  Please enter
         the correct combination.  If you still have problems, try
         leaving the FedEx airbill # BLANK.  Only enter the Kit number.
        *****************************************************************
Press any key to continue...

     Then, you will encounter this screen ensuring that you have entered the
FedEx airbill number:

    Kit No.:   PD-0001-151

        Did you enter the correct Kit number and FedEx airbill number?
        NPSIS is designed to store and track all FedEx airbill numbers.
        This Kit may have a different FedEx airbill number than the
        system; please enter the new FedEx airbill number:

     Note: if the correct airbill number was entered before, hit ENTER.
            PgDn (Next page), PgUp (Previous page), Esc (Exit)

-------
     Once you have correctly identified the sample kit, NPSIS will ask you if
there is any damage to the kit as a whole:

   Kit No.:    PD-0001-151


      Was there any damage to the sample kit?  (Y/N)

               PgDn (Next page),  PgUp (Previous page), Esc (Exit)

-------
     If you press "Y", NPSIS will then prompt you for the apparent cause of
damage:

   Kit No.:    PD-0001-151


      Was there any damage to the sample kit?   (Y/N)


      Please indicate the cause for damage:

           Kit is broken  (Y/N)  Y

           Ice is melted  (Y/N)

           Other Reason   (Y/N)

       Please enter any comments about the sample kit.

     Comments:   Broken upon arrival.
     Comments:
     PgDn (Next page),  PgUp (Previous page), use ↑ ↓ or <Enter> to select field.

     There may already be comments regarding the kit in the comment field
shown in the above screen.  In this case, please enter your comments after any
which already appear.  This ensures that no information is destroyed.

-------
     Next, NPSIS will ask you to survey the contents of the kit and check
which bottles are contained within the kit.  You should then look at the
bottle labels and determine if any are missing.  Don't forget to check and
determine if any bottles have been included in the kit which do not belong.
-------
     If you have pressed "N", indicating that you did not receive exactly what
NPSIS assumes you have received, you will be prompted to enter the appropriate
information.  This information includes pressing a "Y" or a "N" beside each
bottle, and entering the bottle number found on the labels of any additional
bottles you have received:

   Kit No.:    PD-0001-151

         Please indicate which bottles you received:

            Bottle No:            Received (Y/N)
          PD-0001-1-1-01               N
          PD-0001-1-1-03               N
          PD-0001-1-3-01               Y
          PD-0001-1-3-03               Y
          PD-0001-1-9-01               Y
          PD-0001-1-9-03               Y

    Please indicate any additional bottles you received:
    1. Bottle No.:  PD-0002-1-1-05   2. Bottle No.:  PD-0002-2-2-01
    3. Bottle No.:  PD-0004-4-4-01   4. Bottle No.:    -    - - -
    5. Bottle No.:    -    - - -     6. Bottle No.:    -    - - -
    7. Bottle No.:    -    - - -     8. Bottle No.:    -    - - -

     PgDn (Next page),  PgUp (Previous page), use ↑ ↓ or <Enter> to select field.

     Notice that the user has indicated that he did not receive the first two
bottles on the list.  Also note that the user has indicated additional bottles
which have come in the sample kit, but which were not on the list.

-------
     Next, NPSIS prompts you to indicate if any of the individual bottles have
been damaged and rendered unusable for analysis:

   Kit No.:    PD-0001-151


      Was there any damage to the sample bottles?  (Y/N) Y

               PgDn (Next page),  PgUp (Previous page), Esc (Exit)

-------
     In order to complete the appropriate information on damaged samples, you
must first press a "Y" or a "N" in the field labeled "Damaged Y/N".  If you
have entered a "Y" in this field, you must then identify what the cause of the
damage is, to the best of your abilities.  As noted on the computer screen
below, the "Other" category should be used if the sample is unusable but is
not broken.  Please try to comment whenever possible.

   Kit No.:    PD-0001-151

           Please indicate which bottles are damaged by entering Y or N,
           and for those which are damaged, indicate the cause of damage
                                   --- CAUSE ---
    Bottle No:      Damaged        Broken      Other         Comment
                     (Y/N)          (Y/N)      (Y/N)

   PD-0001-1-3-01     N
   PD-0001-1-3-03     N
   PD-0001-1-9-01     N
   PD-0001-1-9-03     N
   PD-0002-1-1-05     N
   PD-0002-2-2-01     Y              Y
   PD-0004-4-4-01     N

   The "Other" cause category is for reporting contamination of a sample,
   e.g. contamination noted on the Sample Tracking Form, air bubbles,
   or other reasons a sample is unusable.

     PgDn (Next page), PgUp (Previous page), use ↑ ↓ or <Enter> to select field.

-------
     Now you have completed all of the necessary information needed to verify
that the proper samples have reached their final destination in usable
condition.  You may save your kit entry by pressing "Enter".  If you wish to
cancel your kit entry and try again, press "N" and "Enter".  If you wish to
view or edit the current kit entry, press "R" and "Enter" and NPSIS will place
you back at the beginning of your entry.

     You have completed all of the data entry screens for this Kit.

     You may save your entry by pressing 'Enter'.

     You may cancel your entry by pressing 'N' and 'Enter'.

     You may verify or edit this entry by pressing 'R' and 'Enter'.


                    * * *  Accept entries?  * * *
                * Press <Enter>        to Save           *
                * Press N and <Enter>  to Cancel         *
                * Press R and <Enter>  to Verify or Edit *

     By pressing "Enter", you have saved all of the information necessary for
a particular sample kit.  NPSIS assumes that you will enter more than one kit
entry per session.  Therefore, you will be placed at the initial "Searching
Screen".  If you are finished, press "Esc" and you will be returned to the
main menu.  You can then log off of NPSIS by pressing "Alt" and "Right Shift"
at the same time.  You may also send a memo through the ICF computerized mail
system.  To do this, cursor down to the second menu choice and press "Enter".

     The next two pages of this memo describe how to use the ICF electronic
mail system.  Note that the password for you is NPS.  The mail system software
program will prompt you for this password before it will allow access to the
system.  Also, when you are selecting the recipients of your memo, please
press the space bar beside the initials "NPS".  This will send your memo to
all ICF staff involved in the NPS project.  If you wish to send memos to a
particular ICF staff member, please call Beth Estrada for the identification
number of the desired ICF employee.

-------
                                  ELECTRONIC MAIL
Function
Augment   office   communications
electronic transfer of notes and files.
with
Summary
Electronic Mail (E-Mail)  allows you  to send,
receive,  read,  and  subsequently  save  or
discard notes and attached files.

When you  power up  your workstation you
will  automatically enter  E-Mail if you have
received  any mail.  Enter your password to
check your  mail, or  press   twice  to
avoid E-Mail  and  continue  to  the Assist
main menu.
Instructions
Operation of E-Mail is similar to Lotus
1-2-3.  Press the F1 key to receive help at
any time during operation.  If any more
help is needed, contact workstation support
to receive a manual.

For more information on any feature of
electronic mail, use Network Courier's on-
line help or refer to the User's Manual.
Passwords

Your password will be "password" until you
change it yourself.  Once you have given
your password and entered E-Mail, you can
change  your  password by selecting Options,
then Password.
Reading Mail

 1.    Select "Read" from your menu:
       highlight Read, then press
       .
 2.    Select the note to read:
       a.    Highlight the note (using
             the arrow keys) and press
             .
       b.    To save the note, select
             "Storage", then "Save". Enter
             the name of the file to which
             the note should be saved.
 3.    Press  to select another note.
        Writing Mail

         1.    Select "Compose", then "Edit".
         2.    Press  when the highlight
               moves to "TO".
         3.    Select the recipient(s):
               a.   Move the highlight to the
                    first recipient's initials.
               b.   Press the space bar. A
                    small mark will appear.
               c.   Repeat steps a and b for all
                    recipients. Press the space bar
                    twice to "de-select" recipients.
                    The small mark will disappear.
               d.   Press  to cancel the
                    entire list.
         4.    Select the initials of those who will
               receive copies:
               a.   Press the down arrow to move
                    to "CC".
               b.   Select recipients as instructed
                    above (step 3, a-d).

-------
Writing Mail, continued

  5.   Enter a subject and priority.
       (optional)
  6.   Select attachments (optional):
       a.   Press  and type the
            path for the document(s).
       b.   Press  and select the
            document(s) to be attached.
       c.   Repeat steps a and b for
            documents in another directory.
  7.   Enter the text of your message.
  8.   Press  when  finished.
  9.   Select "Transmit" to post the note
       and attachments.

Quitting  the Mail Program

  1.   Press  from the menu.
  2.   Select 'YES'.

-------
                    SAMPLE MANAGEMENT  2/3/88
       (Sample preservation requirements  and holding  times)

1.    Samples must arrive at the laboratory with ice still remaining
     in  the shipping  box.    If  a  sample  box arrives  at  the
     laboratory without any  ice remaining,  the laboratory should
     contact the technical monitor immediately.

2.    The pH  of  all samples  collected  for analyses  using either
     Method 5 or 9 must be tested  prior to analyses.  If the pH is
     found  to  be  >4  for Method  5,  or >3  for  Method  9,  the
     laboratory  should contact the  technical  monitor prior  to
     analysis.

3.    Except for Method 9, strict  adherence  to sample and extract
     maximum holding times (14 days)  is required for both primary
     and  secondary column   analyses.    All  analyses  should  be
     completed  as  soon  as  possible,   but   under  extenuating
     circumstances,  the   maximum  extract holding   time  may  be
     extended to 28 days  for GC/MS analyses  only,  if approved by
     the technical  monitor.    Method 9 samples must be  analyzed
     within 28 days.

4.    Water samples are to be disposed  of after the 14  day sample
     holding time has been exceeded,  except for Method 9,  which is
     28 days. Sample extracts must be maintained until disposal is
     approved by the TSD or OPP Laboratory Coordinator.

5.    Additional  samples  will be  collected  at 10%  of the sample
     sites   for   spiking   at  the  laboratory  to   test   matrix
     interferences.   These  samples are  to  be  spiked at  analyte
     concentrations  equal to 2,   10,  or 20  times  the  minimum
     reportable  level for  each  analyte,  except  for  Method  9.
     Method 9 spiked samples  are to be  spiked  at 2 or 10 times the
     reporting limit.   Samples collected  for the analyte stability
     studies are to be spiked at  10  times the minimum reportable
     level for each analyte.   ICF  will  have the spiking level (and
     mix for those methods with more than one mix)  printed on the
     sample label.
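The holding-time limits in items 3 and 4 can be expressed as a small date check. This is an illustrative sketch only; the function and dictionary names are our own, not part of the Survey procedures.

```python
from datetime import date

# Maximum sample holding times stated above: 14 days generally,
# 28 days for Method 9.  Keys here are our own labels.
HOLDING_DAYS = {"default": 14, "method9": 28}

def within_holding_time(sampled: date, analyzed: date,
                        method: str = "default") -> bool:
    """Return True if the analysis date falls inside the holding window."""
    limit = HOLDING_DAYS.get(method, HOLDING_DAYS["default"])
    return (analyzed - sampled).days <= limit

# A Method 9 sample analyzed 21 days after collection is still in hold:
print(within_holding_time(date(1988, 7, 1), date(1988, 7, 22), "method9"))  # True
```

A sample past its window would fail this check and, per item 4, be disposed of rather than analyzed.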

-------
          UNITED STATES ENVIRONMENTAL PROTECTION AGENCY
                              CINCINNATI, OHIO  45268
DATE:    July 14, 1988

SUBJECT: Changes in NPS Laboratory Procedures

FROM:    David J. Munch, TSD Project Manager
         National Pesticide Survey

TO:      NPS Technical Monitors (See below)
    The following minor changes in laboratory operations are being made:

    1.   Spiking Levels (Methods 1-7)

         Currently, selected NPS samples are being spiked at either Level
    1 (2 times MRL), Level 2 (10 times MRL), or Level 3 (20 times MRL).
    In many cases, spiking at Level 3 has created analyte concentrations
    in samples which exceed the linear range of the instrumentation.  Any
    Level 3 spiked samples currently on hand should be analyzed; however,
    no further requests will be made to spike samples at Level 3.

         In order to maintain three spiking levels, a Level 0 (2 times
    MRL) is being added.  Laboratory Control Standards and Time Storage
    Samples are to continue to be spiked at Level 2 (10 times MRL).

    2.   Spiking Levels (Method 9)

         Currently, sample spiking levels used for Method 9 are Level 1
    (2 times MRL), Level 2 (10 times MRL), and Level 3 (10,000 ug/L).  The
    spiking levels are to remain the same; however, Level 0 will now be 2
    times MRL, Level 1 10 times MRL, and Level 3 10,000 ug/L.


    3.   Data Reporting Format

         In order for the data reporting format to match the requirements
    for reporting suspected NPS analytes observed on the primary column,
    at a concentration between 1/2 MRL and MRL (see memorandum entitled
    "Determining and Reporting the Presence of NPS Analytes below the
    Minimum Reporting Levels and Identifying Unknown Peaks," by Bob Maxey,
    6/1/88), further clarification is required.  In those cases where the
    presence of an NPS analyte at a concentration between 1/2 MRL and the
    MRL is successfully confirmed, the primary and confirmational column
    data for that analyte should be reported as "-111".  In those cases
    where confirmational analyses are either not required, or the
    confirmational analyses did not confirm the presence of the analyte,
    the primary column data for that analyte should be reported as "-222".
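The below-MRL reporting rule in item 3 amounts to a two-way decision. The following is an illustrative sketch only; the function name is our own, not part of the Survey data format.

```python
# Sketch of the reporting rule above: a below-MRL detection that is
# successfully confirmed on the confirmational column is reported as
# -111; when confirmation is not required, or did not confirm the
# analyte, the primary column result is reported as -222.
def below_mrl_code(confirmation_required: bool, confirmed: bool) -> int:
    if confirmation_required and confirmed:
        return -111
    return -222
```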

-------
                                     -2-
    Please transmit this information to both your contract and referee
laboratories as soon as possible.  If you have any questions concerning
these items, please let me know.
Addressees
    A. Dupuy
    L. Kamphake  (TSD)
    C. Madding  (TSD)
    R. Maxey  (OPP)
    K. Sorrell  (TSD)
    R. Thomas  (TSD)
cc:
    H. Brass  (TSD)
    C. Frebis  (CSC)
    A. Kroner  (TSD)

-------
                                                          Appendix A
                                                          Date: February 1992
                                                          Page 41 of 63
                         Appendix A-4

        Guidance for Quality Assurance Project Plans:

                          Section 10


Data Reporting Format

Data Reporting Format Changes

Data Reporting Codes

Rapid Reporting System

-------
                                                                 Appendix A
                                                                 Date: February 1992
                                                                 Page 42 of 63
     There are two ways the data is received at the EPA in Cincinnati.  The
first is on a floppy disk, which covers the majority of the data received.  The
other way is on hard copy.  Data received on hard copy includes Method 9
referee data; Methods 2, 4, and 5 referee data for the instrument control
standards; Methods 2, 4, 5, and 7 referee data for the confirmation column;
all GC/MS data run at the referee laboratories; and miscellaneous samples
from all the laboratories.

     Due to the different nature of the data transmissions, there are
different methods of entering them into the data files.  For data received
on a floppy disk, the disk is copied to a master diskette and a backup
diskette for each laboratory/method combination, as well as by analysis data
and instrument control standard data.  Its unique name comes from the disk
number that is sequentially assigned as the data arrives.  These diskettes are
checked to guarantee the copy was complete.  The data is then uploaded from the
PC to the NCC mainframe in North Carolina.  It is stored in a partitioned data
file by the same scheme used for storing it on diskette (the member name being
the sequential disk number).  These data files are checked for completeness and
accuracy of transfer.  The data files are then transferred to the mainframe at
Cincinnati, since this is where the processing of the data will be done.  They
are transferred to partitioned data sets for processing (the member name being
the laboratory/method combination).  For data received on hard copy, the data
is entered directly onto the Cincinnati mainframe in the processing data
sets.  The hard copies are then stored in a file folder by laboratory/method
combination.

     Once the data is in a processing data set, it is edited to comply
with the format set up at the beginning of the survey.  This includes deleting
extra blank lines, deleting the method number in the set number, deleting lab pH
when not required, adding dots in blank fields, moving headers and data to the
correct columns, etc.  It also includes changing the spelling of nine analytes
(eight were misspelled in the original format given to the laboratories and
one was misspelled by a laboratory in creating the format).  Also, a line
of asterisks is added between samples to help the technical monitors in
perusing the data.  Once the data is believed to be in an acceptable format, a
program is run to determine whether the computer will read the right data for the
right piece of information.  If the data reading check program proves the data
to be in the acceptable format, the remaining QA/QC check programs are run on
the data.  If the data is not in the acceptable format, the data is edited to
comply with the correct format.

     A hard copy of the data is computer generated; this copy is used to
highlight problem data and is given to the technical monitor.  The output from
the QA/QC check programs is used to detect data that did not meet QA/QC criteria,
as well as to check for positives, positives above the rapid reporting limit, and
proper mix, sample ID, and reporting codes.  The hard copy is marked to show
any data discrepancies or special notes.  The hard copy is dated and either
hand delivered or mailed to the technical monitor.

     When the technical monitor finishes reviewing the data, the hard copy is
returned with comments as to the disposition of the data.  The data is
edited to meet the technical monitor's critique, as well as to add
any necessary comments and to eliminate the line of asterisks.  The data is

-------
                                                                 Appendix A
                                                                 Date: February 1992
                                                                 Page 43 of 63
then transferred to a finished data  set  on the  Cincinnati mainframe.  It
resides there until a "batch" of  data  is defined.

     A "batch" of data is an arbitrary group of sample sites broken down by
the sampling date. The cut-off date  for  the end of a "batch"  is chosen in such
a way as to minimize the amount of work  and to  report data in complete sets.
Once a "batch" is determined, data is  transferred  to its  final resting place.
When all of the data from all of  the labs resides  in this final data  set,
various check programs are run to ensure the data's quality.  These programs
include checks on the following:  date  sampled,  date shipped,  date received,
time sampled, type codes, site numbers,  positives,  reporting  codes, field pH,
stabilizing temperature, and conductivity.  If there is any data of suspect
quality, it is investigated and appropriately edited.  A report is then
generated and sent to the respective analytical coordinators  for their review
and approval.
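One of the batch-level checks listed above, consistency of the sampling, shipping, and receipt dates, can be sketched as a simple ordering test. This is illustrative only; the function name is our own.

```python
from datetime import date

# A field sample's dates must be in chronological order:
# sampled on or before shipped, shipped on or before received.
def dates_in_order(sampled: date, shipped: date, received: date) -> bool:
    return sampled <= shipped <= received
```

A record failing this check would be flagged as suspect-quality data, investigated, and edited as described above.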

     Once the data is returned from the analytical coordinators, it is edited
per their comments.  It is then transferred to a data file which contains only
finalized field sample data.  (This copy is for backup purposes only, in case
the ICF file gets lost.)  This data file is then copied to the ICF account on
the Cincinnati mainframe for their use.

-------
                                                               Appendix A
                                                               Date: February 1992
                                                               Page 44 of 63
             FORMAT FOR NATIONAL PESTICIDE SURVEY (NPS) DATA

LINE   COLUMNS   DESCRIPTION
 1      1-6      I_Temp
        9-14     S_Temp
        17-24    Date_Sam
        27-34    Date_Shp
        37-44    Date_Rec
        47-54    Time_Sam
        57-64    Time_Ice
        68-69    pH       [FOR METHODS 5 AND 9 ONLY]
 2      1-6      enter INITIAL TEMPERATURE OF WATER
        9-14     enter STABILIZED TEMPERATURE OF WATER
        17-24    enter DATE SAMPLED
        27-34    enter DATE SHIPPED
        37-44    enter DATE RECEIVED
        47-54    enter TIME SAMPLED
        57-64    enter TIME ICED
        67-70    enter pH [FOR METHODS 5 AND 9 ONLY]
 3      BLANK
 4      1-7      Receipt Condition
 5      1-80     enter CONDITION OF SAMPLE UPON RECEIPT AT LABORATORY
 6      BLANK
 7      1-6      Samp #
        16-18    Lab
        21-25    Set #
        28-35    Date_Spk
        38-45    Date_Ext
        48-55    Date_Ana
        58-63    Column
 8      1-13     enter SAMPLE IDENTIFICATION NUMBER
        16-18    enter LAB ABBREVIATION
        21-25    enter SET NUMBER
        28-35    enter DATE SPIKED
        38-45    enter DATE EXTRACTED
        48-55    enter DATE ANALYZED
        58-63    enter ANALYSIS COLUMN

       FORMAT FOR NATIONAL PESTICIDE SURVEY (NPS) DATA (continued)

-------
                                                                 Appendix A
                                                                 Date: February 1992
                                                                 Page 45 of 63
LINE   COLUMNS   DESCRIPTION
 9      BLANK
 10     1-4      Type
        8-13     Spiker
        16-22    Extract
        25-31    Analyst
        34-40    Sam Vol
        43-49    Ext_Vol
        52-60    Int. Std.
        65-70    % Surr
 11     1-5      enter SAMPLE TYPE
        8-13     enter SPIKER'S INITIALS
        16-22    enter EXTRACTOR'S INITIALS
        25-31    enter ANALYST'S INITIALS
        34-40    enter VOLUME OF SAMPLE
        43-49    enter VOLUME OF EXTRACT
        52-62    enter INTERNAL STANDARD
        65-70    enter PERCENT RECOVERY OF SURROGATE
 12     BLANK
 13     1-8      Comments
 14     1-80     enter ANY PERTINENT COMMENTS ON SAMPLE AND ANALYSIS
 15     BLANK
 16     1-7      Analyte
        29-33    Conc.
        39-45    Analyte
        67-71    Conc.
 17+    1-25     enter ANALYTE'S NAME
        28-34    enter CONCENTRATION OR PERCENT RECOVERY
        39-63    enter ANALYTE'S NAME
        66-72    enter CONCENTRATION OR PERCENT RECOVERY
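Because the record layout above uses fixed, 1-indexed, inclusive column spans, each field can be unpacked with simple string slices. The sketch below uses the line 8 spans (sample ID, lab, set number, dates, analysis column); the field names and function name are our own, for illustration only.

```python
# Fixed-column spans for record line 8, as given in the layout above.
# Spans are 1-indexed and inclusive, matching the table's convention.
LINE8_FIELDS = {
    "samp_no":  (1, 13),
    "lab":      (16, 18),
    "set_no":   (21, 25),
    "date_spk": (28, 35),
    "date_ext": (38, 45),
    "date_ana": (48, 55),
    "column":   (58, 63),
}

def parse_line8(line: str) -> dict:
    """Extract each field by its column span, trimming blank padding."""
    line = line.ljust(80)  # pad short lines so every slice is safe
    return {name: line[start - 1:end].strip()
            for name, (start, end) in LINE8_FIELDS.items()}
```

For example, a record built to these column positions would yield its lab abbreviation from columns 16-18 and its sample ID from columns 1-13.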

-------
                                                               Appendix A
                                                               Date: February 1992
                                                               Page 46 of 63
             FORMAT FOR NATIONAL PESTICIDE SURVEY (NPS) DATA

LINE   COLUMNS   DESCRIPTION
 1      1-6      Fld_pH
        9-14     S_Temp
        17-24    Date_Sam
        27-34    Date_Shp
        37-44    Date_Rec
        47-54    Time_Sam
        57-64    Time_Ice
        68-69    pH       [Note: Method 9 only]
 2      1-6      enter FIELD pH
        9-14     enter STABILIZED TEMPERATURE OF WATER
        17-24    enter DATE SAMPLED
        27-34    enter DATE SHIPPED
        37-44    enter DATE RECEIVED
        47-54    enter TIME SAMPLED
        57-64    enter TIME ICED
        68-69    enter pH [Note: Method 9 only]
 3      BLANK
 4      1-17     Receipt Condition
 5      1-80     enter CONDITION OF SAMPLE UPON RECEIPT AT LABORATORY
 6      BLANK
 7      1-6      Samp #
        16-18    Lab
        21-25    Set #
        28-35    Date_Spk
        38-45    Date_Ext
        48-55    Date_Ana
        58-63    Column
 8      1-13     enter SAMPLE IDENTIFICATION NUMBER
        16-18    enter LAB ABBREVIATION (JMM)
        21-25    enter SET NUMBER
        28-35    enter DATE SPIKED
        38-45    enter DATE EXTRACTED
        48-55    enter DATE ANALYZED
        58-63    enter ANALYSIS COLUMN
 9      BLANK

-------
   DATE: September 9, 1988

SUBJECT: Data Reporting Codes

   FROM: Christopher Frebis, CSC Statistician

     TO: Distribution


     The purpose of this memorandum is to discuss the reporting codes used in
the National Pesticide Survey.  There has been some confusion over these codes
as to when and where to use them and their exact meaning.

     Table 1 identifies the unique sample types (SAMP - field sample, MBLK -
method blank, SBLK - shipping blank, LCS - lab control standard, and LSS, DIS,
HIE, and HTS - spiked field samples; these last three are each a type of time
storage sample).  Under each unique sample type are the only possible codes
that can appear for that sample type.  (Note: -555 has been added for the
situation where the contract lab sends the extract to the referee lab for GC/MS
analysis, and the code -222 has been deleted.)  There is also a type of
decision tree for field samples since they are a little more complicated with
three analyses for confirmation and qualitative-only analytes.

     I hope this memorandum helps to put everyone on similar terms as well as
clearing the muddy water.  If there are any questions of different scenarios
you wish to discuss, please call me at (513) 569-7498.


Distribution:  Herb Brass, Technical Support Division
               Aubry Dupuy, Environmental Chemistry Laboratory
               Carol Madding, Technical Support Division
               Bob Maxey, Environmental Chemistry Laboratory
               Dave Munch, Technical Support Division
               Kent Sorrell, Technical Support Division
               Bob Thomas, Technical Support Division

-------
           UNITED STATES ENVIRONMENTAL PROTECTION AGENCY
-------
                                    -2-
    If you have any questions concerning these procedures, please let Bob
Maxey or me know.  Also, please pass on this information to your contract
and referee laboratories.  They will need to have this information in hand
prior to their conducting the dry run.

Attachment

Addressees:

    A. Dupuy
    L. Kamphake
    C. Madding
    R. Maxey
    K. Sorrell
    R. Thomas
cc:
    J.  Iotas
    H.  Brass
    A.  Kroner
    J.  Orme

-------
                 METHOD #1
  ANALYTE                  RAPID REPORTING LEVEL

Alachlor                          44 ug/L
Ametryn                          300 ug/L
Atrazine                          35 ug/L
Bromacil                       2,500 ug/L
Butylate                         700 ug/L
Carboxin                       1,000 ug/L
Diphenamid                       300 ug/L
Fenamiphos                       5.0 ug/L
Hexazinone                     1,050 ug/L
Metolachlor                      300 ug/L
Metribuzin                       250 ug/L
Propazine                        500 ug/L
Simazine                          50 ug/L
Tebuthiuron                      125 ug/L
Terbacil                         250 ug/L

-------
                 METHOD #2
   ANALYTE                    RAPID REPORTING LEVEL

alpha-Chlordane                     0.5 ug/L
gamma-Chlordane                     0.5 ug/L
Chlorothalonil                      150 ug/L
Dacthal (DCPA)                    5,000 ug/L
Dieldrin                            0.5 ug/L
Propachlor                          130 ug/L
Trifluralin                          25 ug/L

-------
                 METHOD #3
   ANALYTE                RAPID REPORTING LEVEL

Acifluorfen                       130  ug/L
Bentazon                         87.5  ug/L
2,4-D                             100  ug/L
Dalapon                           800  ug/L
Dicamba                            13  ug/L
Dinoseb                           3.5  ug/L
Pentachlorophenol                 300  ug/L
Picloram                          700  ug/L
2,4,5-T                           105  ug/L
2,4,5-TP                           70  ug/L

-------
              METHOD #4
  ANALYTE              RAPID REPORTING LEVEL

Cyanazine                      13 ug/L
Diuron                         70 ug/L
Fluometuron                   438 ug/L
Propham                       595 ug/L

-------
              METHOD #5
 ANALYTE                RAPID REPORTING LEVEL

Aldicarb                       10 ug/L
Baygon                         40 ug/L
Carbaryl                    1,000 ug/L
Carbofuran                     50 ug/L
Methomyl                      250 ug/L
                              175 ug/L

-------
                 METHOD #6
    ANALYTE                RAPID REPORTING LEVEL

ethylene thiourea               1.05 ug/L

-------
                           METHOD #7
    ANALYTE                           RAPID REPORTING LEVEL

dibromochloropropane                          2.5 ug/L
1,2-dichloropropane                            56 ug/L
cis/trans 1,3-dichloropropene                  11 ug/L
ethylene dibromide                           0.04 ug/L

-------
            METHOD #9
    ANALYTE
Nitrate/Nitrite

-------
                                                          Appendix A
                                                          Date: February 1992
                                                          Page 57 of 63
                         Appendix A-5

        Guidance for Quality Assurance Project Plans:

                          Section 11


Laboratory  QC Requirements  (including time-storage)

-------
         LABORATORY QC REQUIREMENTS FOR PRIMARY ANALYSES
                   GENERAL  INFORMATION  2/3/88

1.   Laboratory control standard mixes,  which together contain all
     method analytes, will  be analyzed  with each set of samples
     (except Methods 5, 7, and 9).

2.   A set of  samples  is defined as  all  samples,  blanks,  spiked
     samples, etc., which are extracted at the same time,  or for
     methods 5, 7, and 9,  which are  analyzed  by the same person
     within a 12 hour period.

3.   The internal  standard area checks detailed in Methods 1-6 will
     be  used as  stated.    However,  the  control limits will  be
     reassessed following completion of the initial demonstration
     of capabilities.

4.   The measurement  system  is to be evaluated whenever any analyte
     is observed in a method blank, at a  concentration greater than
     or equal to  1/2 the minimum reportable level  (MRL).   Method
     blanks are to be analyzed with each set of samples.

5.   The criteria  for monitoring instrument control standards will
     be utilized as stated in the methods, for Methods 1-6.

6.   Surrogate recoveries (Methods 1-4, 6-7) will be required to be
     within  +/- 30% of the  mean recovery determined for  that
     surrogate during the initial demonstration of capabilities.

7.   Time storage samples must  be extracted within  +/-  4  days of
     the proper date,  and analyzed within  4 days  of extraction.
     For example,  non-stored time storage samples must  be  spiked
     within the 14-day sample holding time and must be analyzed
     within 4 days of extraction.  A stored time storage sample
     must be extracted no sooner than 10 days and no later than
     16 days after being spiked, and must be analyzed within 4
     days of being extracted.  Samples will be spiked at 10x MRL.

8.   The requirements for monitoring calibration standard responses
     will be followed as written in the methods.

9.   Samples failing any QC criteria must be  reanalyzed  at the
     contractor's expense.

10.  Only qualitative  analyses  will be required for Chloramben,
     Diazinon,    Disulfoton,   Disulfoton   sulfone,    Disulfoton
     sulfoxide, Endosulfan  I,  Endosulfan  II,   Metribuzin  DADK,
     Metribuzin DK, Prometon, Pronamide, and Terbufos.  While these
     analytes  are  to  be  analyzed  in  at  least  one  of  the
     concentration levels of the calibration standards,  they are
     not subject to any of the QC requirements.
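The surrogate criterion in item 6 can be read as a tolerance check on the observed recovery relative to the mean recovery from the initial demonstration of capabilities. This sketch reads "+/- 30% of the mean" as a relative tolerance (our interpretation); the function name is our own.

```python
# Item 6 check: observed surrogate recovery must fall within
# +/- 30% of the demonstration-of-capabilities mean recovery
# for that surrogate (interpreted here as 30% of the mean).
def surrogate_in_control(observed: float, demo_mean: float,
                         tol: float = 0.30) -> bool:
    return abs(observed - demo_mean) <= tol * demo_mean
```

A recovery outside this window would trigger reanalysis under item 9.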

-------
                       GENERAL INFORMATION
                             (CON'T)

11.   If any analyte is observed at a concentration greater than the
     minimum reportable level, during sample analyses using Method
     7, the corresponding shipping blank must also be analyzed, and
     if necessary, confirmed.

12.   Each time that new calibration standard dilutions are prepared
     they must be compared to the existing calibration curve, and
     the observed concentration must agree within  +/-  20% of the
     expected concentration.

13.   Any   deviation   from  the   analytical   procedures   or  QC
     requirements, must be  approved by  the appropriate technical
     monitor, and documented in writing.

-------
                    LABORATORY  QC  REQUIREMENTS
                             (CON'T)

                    SECOND  COLUMN  CONFIRMATION
                           (Methods 1-7)

1.    Quantitate by comparison to a calibration standard, which is
     within +/- 20% of the concentration of the sample determined
     using the primary column.

2.    The concentrations determined  on  the  secondary column, must
     agree within +/- 25% of the result determined on the primary
     column.

3.    If the concentration determined on the secondary column does
     not agree within the limits stated above, the contractor must
     confer with the technical monitor concerning resolution  of the
     discrepancy.

                       GC/MS CONFIRMATION
                        (Methods 1-3,6-7)

1.    The sample  is  to be compared to  a  standard  prepared  at the
     concentration determined for the sample, on either the primary
     or secondary column, whichever concentration is the lower.

2.    If additional sample treatment is performed for GC/MS analysis
     (blowdown, etc.), the standard and  sample must both undergo
     the same treatment.

3.    Results  of  the  GC/MS analysis are simply reported as  the
     presence or absence of the analyte.

4.    The sample  extract  is to be  shipped  to one of  the referee
      laboratories  (Methods 1, 3, or 6 to BSL;  Methods 2 or 7 to
      TSD) for high resolution GC/MS analysis if confirmation of the
      analyte is not possible using quadrupole GC/MS due to the
      concentration of the analyte, if the concentration is equal
      to or greater than 1/2 the lowest adverse health effect level
      for that analyte, or if requested by the technical monitor.

-------
                                                           Appendix A
                                                           Date: February 1992
                                                           Page 61 of 63
                         Appendix A-6

Guidance for Revisions to Quality Assurance Project  Plans

-------
             UNITED STATES ENVIRONMENTAL PROTECTION AGENCY

                          WASHINGTON, D.C. 20460

                                                          OFFICE OF
                                              PESTICIDES AND TOXIC SUBSTANCES
MEMORANDUM

SUBJECT:  Addenda to Quality Assurance Plans
FROM:     James Boland, Acting
          National Pesticide Survey of Drinking Water Wells

TO:       NPS Technical Monitors and Quality Assurance Coordinators


     This memorandum is in response to questions on the procedures
for revision of Quality Assurance Project Plans.  It is
anticipated that QA Project Plans will be approved by the QAOs
for the Office of Drinking Water and Office of Pesticide Programs
and myself during July.  The documents should not be revised;
instead, revisions should be in the form of addenda.

     The addenda can be a memorandum from the analytical contractors,
coordinators or technical monitors.  The memorandum should be
specific with regard to the page or section and sentences being
revised and the wording of any changes.  For instance,

       Change Section 12, page 3, second paragraph, third sentence:

              "The QAC will ..... ..... basis."

                           to

              "The QAU will audit each analysis set."

or    Delete Section 12, page 3, second paragraph, third sentence:

              "The QAC will ............. basis."

-------
                               -2-


The memorandum should have an approval space at the end such as

              Approved	
                            QAO-ODW
              Approved
                            QAO-OPP
              Approved
                           NPS Director


Addenda will be approved by the Quality Assurance Officers and
myself and there should be a prior consensus from all involved
on the contents and wording of any addendum.  Copies of an addendum
once approved will be distributed to the analytical coordinators
who will distribute it to the technical monitors.  Technical
monitors will be responsible for distributing addenda to their
contractors.  Copies will be retained by the NPS Director and
QAOs.

     These addenda will be incorporated into the final edited
document for publication.  This final document incorporating the
addenda and ICF editorial and typographical comments will be
needed in September for publication by the National Technical
Information Service.  This procedure should reduce the number of
complete documents which need to be generated and the amount of
effort required to finalize the laboratory QAPjPs for publication.
If further addenda are needed after publication, the form and
procedure will be the same unless otherwise notified.

     Further revision after September will also take the form of
addenda as previously described.  Once the Survey QAO is retained,
he/she will coordinate the process for approving addenda.  If you
have any questions, contact one of the QAOs.


cc:  M. Gomez-Taylor
     E. Leovey

-------
                                            Appendix B
                                            Date:  February 1992
                                            Page 1 of 12
         APPENDIX B

LABORATORY AUDIT CHECKLIST

-------
                         NATIONAL PESTICIDE SURVEY
                         LABORATORY  AUDIT  CHECKLIST
    DATE  	   LABORATORY
    AUDITOR(S) 	   ANALYTICAL METHOD
                                        PERSONS CONTACTED/TITLE
    CONTRACT NO.

    REPORT NO. _
SECTION I:  QA MANAGEMENT SYSTEMS FOR NPS ANALYSES

	QUESTION	     Yes  No   N/A       Comments

1.  Is the latest copy of the QA Plan
    available?                             	  	  	

2.  Does the QAPjP contain all the
    applicable signatures?                 	  	  	

3.  Are personnel familiar with the
    QAPjP?                                 	  	  	

4.  Is the QA function being implemented
    as described in the QAPjP?             	  	  	

5.  Do internal organization charts show
    QA function which operates outside
    of the technical unit which generates
    the measurement data?                  	  	  	

6.  Does the QA function located externally
    to the project review data?            	  	  	

7.  Is a record maintained of internal
    laboratory audits?                     	  	  	

8.  Does the audit record show that the
    system and performance audits are
    conducted as described in the QAPjP?   	  	  	
                                                             Page 	 of __

-------
	QUESTION	     Yes  No   N/A       Comments

9.  Is a system in place for determining
    that method QC criteria have been met? 	  	  	

10. Are failures in method QC documented?  	  	  	

11. If failures have occurred, has
    corrective action been documented?     	  	  	

12. Have long-term problems been
    encountered?                           	  	  	

13. If yes, has the problem and
    subsequent corrective action
    been documented?                       	  	  	

14. Are control charts being prepared
    according to the QAPjP?                	  	  	

15. Are personnel at all levels aware
    of the recourse within their
    organization for correcting problems?  	  	  	

16. Does management show visible support
    for quality assurance?                 	  	  	

17. Have questions been openly and
    honestly answered?                     	  	  	
                                                            Page 	 of

-------
SECTION II:  PROJECT MANAGEMENT SYSTEM

                 QUESTION                    Yes   No   N/A       Comments

1.  Are the individuals currently
    performing the work the same
    individuals who were originally
    assigned to perform the work as
    described in the QAPjP?                	  	  	

2.  If deviations have occurred, have
    they been documented?                  	  	  	

3.  Does the line manager allow for
    quick resolution of problems?          	  	  	

4.  Is the phone number and address of
    the project manager current?           	  	  	

5.  Have new analysts been adequately
    prepared for NPS work?                 	  	  	

6.  Does a supervisor review and initial
    daily logs for content and
    completeness?                          	  	  	

7.  Have monthly reports of laboratory
    activity been submitted to the
    technical monitor?                     	  	  	

8.  Are current staffing levels sufficient
    to meet the needs of the NPS in a
    timely and efficient manner?           	  	  	

9.  Do laboratory personnel have a copy
    of the analytical method at their
    workstation?                           	  	  	

10. Are SOPs listed in the QAPjP being
    followed?                              	  	  	

11. Has sufficient communication occurred
    between the lab and ICF so that the
    goals of the project can be met?       	  	  	
                                                            page 	 of

-------
SECTION III.  SAMPLE TRACKING SYSTEM

                 QUESTION                    Yes   No   N/A       Comments

1.  Have samples been assigned a unique
    control number?                        	  	  	

2.  Can samples be cross-referenced to
    NPS control numbers?                   ___  ___  ___

3.  Are sample tracking forms properly
    completed?                             	  	  	

4.  Are sample tracking records filed?     	  	  	

5.  Is the storage facility for samples
    adequate?                              	  	  	

6.  Are refrigerator/freezer logs for
    samples, extracts, and standards
    available and current?                 	  	  	

7.  Are samples, extracts, and standards
    stored in a manner that prevents
    contamination?                         	  	  	

8.  Are procedures developed to alert
    analysts to the sample receipt
    schedule?                              	  	  	

9.  Is the movement of samples and
    extracts within the lab documented?    	  	  	

10. Are procedures developed for tracking
    holding times for samples and extracts?	  	  	

11. Are time storage samples being analyzed
    according to the QAPjP?                	  	  	

12. Are procedures available for shipping
    extracts to the referee labs?          	  	  	

13. Are procedures available for sample
    disposal?                              	  	  	

14. Have sample supplies  (i.e. cooler,
    shipping box, and bottles) been
    returned to ICF?                       	      	
                                                            Page 	 of

-------
SECTION IV.  SYSTEMS FOR MANAGING AND DOCUMENTING ANALYTICAL OPERATIONS

                 QUESTION                    Yes   No   N/A       Comments

1.  Is the following information
    documented for all reagents used?

    a.  Manufacturer                       	  	  	

    b.   Date of receipt                   	  	  	

    c.   Date opened                       	  	  	

    d.   Purity                            	  	  	

    e.   Lot number                        	  	  	

2.  Does documentation exist for standards
    preparation that uniquely identifies
    the reagents/solvents used and the
    method of preparation?                 	  	  	

3.  Does documentation exist for
    identification of standard preparer
    and date of standard preparation?      	  	  	

4.  Are new standards being prepared
    at the proper intervals?               	  	  	

5.  Are calibration standards validated
    prior to use?                          	  	  	

6.  Are calibration procedures being
    followed according to the QAPjP?       	  	  	

7.  Do balances have calibration stickers
    showing date of last certified
    calibration and date of next scheduled
    calibration?                           	  	  	

8.  Do balances have logs indicating
    calibration checks performed in-house? 	  	  	

9.  Are maintenance logs kept for lab
    equipment?                             	  	  	

10. Is the analytical method being
    performed as described in the QAPjP?   ___  	  	
 11. Are deviations from the method
     documented?                           ___  ___  ___
                                                            Page 	 of

-------
SECTION V:  DATA MANAGEMENT SYSTEMS

                 QUESTION                    Yes   No   N/A

1.  Are entries to logbooks signed,
    dated, and legible?                    ___  ___  ___

2.  Are changes to logs dated and
    initialed by the person who made
    them?                                  	  	  	

3.  Are the required calculations being
    performed as described in the QAPjP?   	  	  	

4.  If not, have the calculations being
    used been documented?                  	  	  	

5.  Are hard copies of sample preparation
    records and chromatograms stored in
    the project files?                     	  	  	

6.  Have lab data management systems
    been validated prior to use?           	  	  	

7.  Does the data validation staff
    periodically duplicate the
    calculations performed by the LIMS?    	  	  	

8.  Can the instrument on which the
    analysis was performed be identified
    from the project files?                	  	  	

9.  Is data stored in an accessible yet
    securable area?                        	  	  	

10. Do the procedures for reporting data
    follow NPSIS guidelines?               	  	  	

11. Are the project files checked for
    completeness?                          	  	  	

12. Does the lab have a document archival
    system in place?                       	  	  	
                                                            Page 	 of

-------
SECTION VI:  LABORATORY MANAGEMENT SYSTEMS

                 QUESTION                    Yes   No   N/A

1.  Is service on instruments readily
    available?                             	  	  	

2.  Are replacement parts for instruments
    available?                             	  	  	

3.  Has a contamination free area been
    provided for trace level work?         	  	  	

4.  Is the analytical balance located
    in an area free from drafts and rapid
    temperature changes?                   	  	  	

5.  Are reagent grade or higher purity
    chemicals used to prepare standards?   	  	  	

6.  Is the manufacturer's maintenance
    manual available?                      	  	  	

7.  Has sufficient laboratory space been
    allocated to perform all phases of the
    analytical method?                     	  	  	

8.  Are glassware cleaning procedures
    adequate?                              	  	  	
                                                            Page 	 of _

-------
SECTION VII:  FOLLOW-UP ON PREVIOUSLY-IDENTIFIED PROBLEMS

                 QUESTION                    Yes   No   N/A       Comments

1.  Has a person been designated to
    follow up on previously-identified
    problems?                              ___  ___  ___

2.  Has a timeframe been stipulated for
    resolving problems?                    	  	  	

3.  Does documentation of the resolution
    of problems exist?                     	  	  	
                                                            Page 	 of

-------
SECTION VIII:  DATA AUDIT
The following information should be confirmed using laboratory records.  If
more than one sample is tracked, make additional copies of Section VIII.
A.  SAMPLE RECEIPT INFORMATION
1.  NPS ID Number:                        ________
2.  Laboratory ID Number:                 	
3.  Date Sampled:                         	
4.  Date Received:                        	
5.  Other Comments:                       	
B.  EXTRACTION
1.  Sample Set Number:                    	
2.  Analyst:                              	
3.  Date of Extraction:                   	
4.  Calculated days from date
     of sampling:
5.  Surrogate solution ID:
6.  Can surrogate solution
     preparation be validated?
7.  Method Blank ID:
C.  PRIMARY ANALYSIS
1.  Sample Set Number:
2.  Analyst:
3.  Date of Analysis:
4.  Calculated days from date
     of extraction:
5.  Internal Standard ID:
                                                            Page

-------
                                          Sample ID:
6.  Can internal standard
     preparation be validated?            	

7.  Instrument ID:                        	

8.  Do records show the instrument
     calibration was validated per
     QAPjP, Section 8?                    	

9.  Do records show all required QC
     checks for the sample's set were
     evaluated per QAPjP, Section 11?     	

10. Do records show the course of action
     taken if any QC checks did not meet
     criteria per QAPjP, Section 11?      	

11. Were data reduced as described in
     the QAPjP, Sections 10 and 14?        ________

12. If an analyte hit was observed,
     could the qualitative and
     quantitative results reported
     for the sample be reproduced using
     the laboratory data?                 	

D.  SECONDARY ANALYSIS

1.  Sample Set Number:                    	

2.  Analyst:                              	

3.  Date of Analysis:                     	

4.  Calculated days from date
     of extraction:                       	

5.  Internal Standard ID:                 	

6.  Can internal standard
     preparation be validated?            	

7.  Instrument ID:                        	

8.  Do records show the instrument
     calibration was validated per
     QAPjP, Section 8?                    	

9.  Do records show all required QC
     checks for the sample's set were
     evaluated per QAPjP, Section 11?     	
                                                            Page 	 of

-------
                                          Sample ID:
10.  Do records show the course of action
     taken if any QC checks did not meet
     criteria per QAPjP, Section 11?

11.  Were data reduced as described in
     the QAPjP, Sections 10 and 14?

12.  If an analyte hit was observed,
     could the qualitative and
     quantitative results reported
     for the sample be reproduced
     using the laboratory data?

E.  CONFIRMATION ANALYSIS BY GC/MS

1.  Sample Set Number:

2.  Analyst:

3.  Date of Analysis:

4.  Calculated days from date
     of extraction:

5.  Internal Standard ID:

6.  Can internal standard
     preparation be validated?

7.  Instrument ID:

8.  Was the instrument tuned to
     manufacturer's specifications?

9.  Are instrument operating
     conditions recorded?

10. If an analyte hit was observed,
     could the qualitative results
     reported  for the sample be
     reproduced using the laboratory
     data?
                                                            Page 	 of
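The "calculated days" entries in Parts B through E above are simple holding-time checks: the whole-day difference between two recorded dates. As an illustrative sketch only (this helper is an assumption for the example, not part of the Survey's data system or LIMS), the day counts could be verified as:

```python
# Illustrative holding-time check for the "calculated days" entries in
# Section VIII.  The function and its use are assumptions for this
# example; the actual holding-time limits are set in the QAPjP.
from datetime import date

def days_between(start: date, end: date) -> int:
    """Whole days elapsed between two recorded dates, e.g. sampling
    and extraction, or extraction and analysis."""
    return (end - start).days

# Example: a sample collected June 1 and extracted June 8 has been
# held for 7 days.
```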

-------
                                              Appendix C
                                              Date: February 1992
                                              Page 1 of 16
           APPENDIX C

FIELD SAMPLING AUDIT  CHECKLIST

-------
                         NATIONAL PESTICIDE SURVEY
                           FIELD AUDIT CHECKLIST
Auditor: 	 Sampling Organization:

Date:    	 Field Team Leader: 	

Site:    	 Sampler:  	
Well ID No.: 	 Interviewer:

Other Team Members: 	
                           PRE-FIELD PREPARATIONS
     Question                                        Yes    No    Comments

1.   Have the correct date and sampling locations
     been recorded for DWS sampling on the Well
     Sampling Information Sheet?                     	    	    	
2.   a.  Have all contacts with the head of the
         sample household been recorded in the Interview
         Record in the Team Leader Introduction and Well
         Observation Form?                           	

     b.  Have correct result codes been assigned to
         each contact with the head of the sample
         household?                                  	

3.   Was an inventory check conducted of the
     sample container kit numbers for each well
     using the Well Sampling Information Sheet?      	
ADDITIONAL COMMENTS
                                                Report No.:
                                                Date: 	
                                                Page: 	  of

-------
                     PRE-FIELD PREPARATIONS (Continued)
     Question                                         Yes    No    Comments

4.   Was an equipment inventory conducted  against
     the Supply Kit Equipment Checklist?              	

5.   Was the local Federal Express Office  contacted
     to ensure that samples could be picked up by or
     delivered to the office after sampling was
     completed?                                       	

6.   a.  Were old Federal Express bar code labels
         and airbills removed from the sample container
         kit and supply coolers when first received?  ___

     b.  Was the Federal Express three character
         code crossed out on the box?                 	

7.   Were all the necessary sampling and inter-
     viewing materials taken to the DWS site?         	
                               DWS SITE VISIT
I.  MORNING

     Question                                        Yes    No    Comments

1.   Was sufficient ice (at least one bag per kit)
     obtained?                                       	

2.   Did the sampling team plan for and arrive at the
     appointed time?                                 	

ADDITIONAL COMMENTS
                                                Report No.:
                                                Date: 	
                                                Page: ___  of

-------
II.  ARRIVAL AT SITE

     Question                                        Yes    No    Comments

1.   If the head of the sample household was not at the
     site, were attempts made to contact the person
     or to solicit an applicable substitute?        	    	    	
2.   Did the Field Team Leader consult the head of
     the sample household and introduce the
     sampling team?

3.   Did the selected sampling location represent
     freshly pumped water, and not from a holding
     tank?

4.   Did the Field Team Leader ask if there was a
     treatment system?

5.   After selecting the well, did the Team
     Leader complete the well and area sketches
     using the list on Page 7?

6.   Were all applicable portions of the Team
     Leader Introduction and Well Observation
     Record completely filled out?

ADDITIONAL COMMENTS
                                              Report No.:
                                                Date: 	
                                                Page: 	  of

-------
III.  DWS WELL SAMPLING PROCEDURES

     A.  WELL PURGING

     Question                                         Yes     No     Comments

1.   a.  Was a port or tap available for  sampling
         ahead of any pre-treatment and specific to
         the well selected?
     b.  If not, was the  ICF Hotline called before
         samples were collected?                     _

2.   a.  Was the air temperature measured and re-
         corded on the DWS Well Purging Parameters
         Record provided  in the field logbook?       _

     b.  Was a dry temperature probe used for the
         measurement?                                _

3.   a.  Were the measurements for pH, temperature,
         and conductivity of the T0 sample taken
         within 25 seconds after submersion into
         the sample?                                 _

     b.  Was the water gently stirred, but readings
         taken with the water and meters at rest?    _

     c.  Were the measurements recorded on the Well
         Purging Parameters Record as T0?            _

     d.  Was the reading obtained on the conductivity
         meter multiplied by 10 and reported as ppm? _

4.   a.  Was the water allowed to run an additional
         five minutes before extracting another sample
         for the measurements of pH, temperature and
         conductivity?                               _
     b.  Were the results recorded on the DWS Well
          Purging Parameters Record as T5?           _

ADDITIONAL COMMENTS
                                                Report No.:
                                                Date: 	
                                                Page: 	  of
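Question 3.d above describes a fixed scale conversion: the raw conductivity meter reading is multiplied by 10 before being reported as ppm. A minimal sketch of that conversion (the factor of 10 comes from the checklist; the meter model and its native units are not specified here):

```python
def conductivity_ppm(meter_reading: float) -> float:
    """Convert a raw conductivity meter reading to the reported ppm
    value using the factor of 10 given in question 3.d."""
    return meter_reading * 10.0
```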

-------
III.  DWS WELL SAMPLING PROCEDURES (Continued)

     A.  WELL PURGING

     Question                                        Yes    No    Comments

5.   a.  Was the water allowed to run an additional
         five minutes before extracting another sample
         for the measurements of pH, temperature and
         conductivity, and the results recorded on the
         DWS Well Purging Parameters Record as T10?  _

     b.  Were the readings checked for stability?    _

     c.  If readings were not stable, was the well
         checked at five-minute intervals until
         two consecutive samples produced stable
         readings or 30 minutes had elapsed?         _

     d.  Were the criteria for stability properly
         applied?                                    _
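The purging sequence above (readings at T0, T5, T10, then five-minute intervals until two consecutive readings are stable or 30 minutes have elapsed) can be sketched as follows. The numeric tolerances are illustrative assumptions; the actual stability criteria are defined in the QAPjP, not in this checklist.

```python
# Hypothetical sketch of the well-purging stability loop described above.
# Readings are taken every five minutes (T0, T5, T10, ...); purging stops
# when two consecutive readings agree, or after 30 minutes.  The
# tolerances below are assumed for illustration only.
TOLERANCES = {"ph": 0.2, "temp_c": 0.5, "conductivity": 10.0}  # assumed values

def is_stable(prev: dict, curr: dict) -> bool:
    """Two successive readings agree within every assumed tolerance."""
    return all(abs(curr[k] - prev[k]) <= tol for k, tol in TOLERANCES.items())

def minutes_to_stability(readings: list) -> int:
    """Return elapsed minutes at which purging may stop: the time of the
    first reading that matches its predecessor, or 30 minutes."""
    for i in range(1, len(readings)):
        minutes = i * 5
        if is_stable(readings[i - 1], readings[i]) or minutes >= 30:
            return minutes
    return 30
```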

     B.  SAMPLE COLLECTION

     Question                                        Yes    No    Comments

1.   Was only one sample kit worked with at a time?  ___    ___

2.   Were the kits checked for broken bottles,
     bottle labels, and were labels checked against
     the sample numbers shown on the Sample Tracking
     Form?                                           ___    ___

3.   Were samples collected from plastic or rubber
     hoses?                                          ___    ___
ADDITIONAL COMMENTS
                                                Report No.:
                                                Date: 	
                                                Page: ___  of

-------
 III.  DWS WELL SAMPLING PROCEDURES  (Continued)

     B.  SAMPLE COLLECTION

     Question                                        Yes    No    Comments

4.   Applicable to kits designated  ESE.

     a.  Was the shipping blank bottle label
         initialed, dated, and the  time recorded
         with a waterproof pen?                      	    	    	
     b.  Was the shipping blank placed back
         into the kit without opening it?            	    	

5.   For collection of samples using the 1-liter
     bottles, were the following conditions
     observed:

     a.  Initialing, dating and recording of time
         on the bottle label?                        	    	

     b.  Keeping the bottle cap in hand to avoid
         contamination?                              	    	

     c.  Sampling close to the tap without touch-
         ing the spout?                              	    	

     d.  Preservative not lost because of rinsing,
         overfilling or spillage?                    	    	

     e.  Filling the bottle slightly above the
         1000 mL mark?                               	    	

     f.  Replacing of cap to seal the bottle?        	    	

     g.  Recording of the date and time of sampling,
         and the sampler's initials on the Sample
         Tracking Form?                              	    	
ADDITIONAL COMMENTS
                                                Report No.:
                                                Date: 	
                                                Page: 	  of

-------
III.  DWS WELL SAMPLING PROCEDURES (Continued)

     B.  SAMPLE COLLECTION

     Question                                        Yes    No    Comments

5.    h.  Placing each filled bottle back into the
         sample container kit before taking the next
         bottle out?
6.   For collection of samples using the 250 mL
     bottles, were the following conditions observed:

     a.  Initialing, dating, and recording of time on
         the bottle label?

     b.  Keeping the bottle cap in hand to avoid
         contamination?

     c.  Adding water to the bottle until almost
         full?

     d.  Gentle tapping of bottles with fingers to
          bring air bubbles to the surface?

     e.  Use of bottle cap to complete filling;
         creation of convex meniscus prior to
         sealing the bottle?

     f.  Sealing bottle so that the teflon side of
         the septum is in contact with the sample?

     g.  Inversion of the bottles and tapping firmly
         to check for air bubbles?

     h.  Refilling of bottle, if necessary, to
         eliminate air bubbles?
ADDITIONAL COMMENTS
                                                Report No.:
                                                Date: 	
                                                Page: 	  of

-------
 III.  DWS WELL SAMPLING PROCEDURES  (Continued)

     B.  SAMPLE COLLECTION

     Question                                        Yes    No    Comments

6.   i.  Recording of the date and  time of sampling,
         and the sampler's initials on the Sample
         Tracking Form?                              	    	    	

     j.  Placing each filled bottle back into the
         sample container kit before taking out
         the next bottle?                            	    _    	

7.   a.  Initialing, dating and recording of time
         on the bottle label?                        	    _    	

     b.  Keeping the bottle cap in hand to avoid
         contamination?                              	    	    	

     c.  Filling the bottle to the start of the
         neck?                                       	    	    	

     d.  Replacing the cap to seal the bottle?       	    	    	

     e.  Recording of the date and time of sampling
         and the sampler's initials on the Sample
         Tracking Form?                              	    	    	

     f.  Placing each filled bottle back into the
         sample container kit before taking out
         the next bottle?                            	    	    	

8.   For collection of samples using the 60 mL
     bottles, were the following conditions
     observed:

     a.  Initialing, dating and recording of time
         on the bottle label?                        	    _    	

     b.  Keeping the bottle cap in hand to avoid
         contamination?                              	    	    	

ADDITIONAL COMMENTS
                                                Report No.:
                                                Date: 	
                                                Page: 	  of

-------
 III.  DWS WELL SAMPLING PROCEDURES  (Continued)

     B.  SAMPLE COLLECTION

     Question                                        Yes    No    Comments

8.   c.  Adding water to the bottle until almost
         full?
     d.  Gentle tapping of bottles with fingers to
         bring air bubbles to the surface?           	    	

     e.  Use of bottle cap to complete filling;
         creation of convex meniscus prior to
         sealing of bottle?                          	    	

     f.  Sealing bottle so that the teflon side of
         the septum is in contact with the sample?   	    	

     g.  Inversion of the bottles and tapping firmly
         to check for air bubbles?                   	    	

     h.  Refilling of bottle, if necessary, to
         eliminate air bubbles?                      	    	

     i.  Recording of the date and time of sampling,
         and the sampler's initials on the Sample
         Tracking Form?                              	    	

     j.  Placing each filled bottle back into the
         sample container kit before taking out
         the next bottle?                            	    	

9.   Were any bottles overfilled during sampling?    	    	

10.  If any bottles were overfilled, were they
     noted on the Sample Tracking Form?              	    	

11.  Were any bottle caps or septa dropped during
     the course of sampling?                         	    	

ADDITIONAL COMMENTS
                                                Report No.:
                                                Date: 	
                                                Page:  	  of

-------
 III.  DWS WELL SAMPLING  PROCEDURES  (Continued)

     B.  SAMPLE COLLECTION


     Question                                         Yes     No     Comments

 12.  If item #11 was checked yes, were the caps and
      septa rinsed with well water prior to placement
      on the sample bottle?                           ___   ___   ___
 13.  If item #11 was checked yes, was the event noted
      on the Sample Tracking Form?                    ___   ___

 14.   After all bottles  in the  kit were  filled, did  the
      sample team members check the Sample Tracking
      Form to ensure all samples had been collected?  	    _

 15.   Did the team member who completed  the Sample
      Tracking Form sign their  name in the space
      provided on the form?                           	    _

 16.   Was the white copy of the Sample Tracking Form
      placed in its original ziplock bag, which is
      taped to the sample kit lid?                    	    _

 17.   Was the pink copy of the  Form placed in the
      pocket of the Field Logbook?                    	    _

 18.   Was ice placed around the sample bottles for
      each kit by first placing the small plastic
      bag enclosed in each sample container into the
      open spaces of the kit, then placing the ice
      into the bag to fill the  sample kit?            	    _

 19.   a.  Was the sample kit sealed at this time?     _    _

      b.  If yes, will the kit  be delivered to
         Federal Express the same day?               	    _

20.  Was a final pH, temperature and conductivity
      reading taken after all sample container
     kits had been completed?                        	    _

ADDITIONAL COMMENTS
                                                Report No.:
                                                Date: 	
                                                Page: 	  of

-------
III.  DWS WELL SAMPLING PROCEDURES  (Continued)

     B.  SAMPLE COLLECTION

     Question                                        Yes    No    Comments

21.  Was the water run continuously during
     purging and sampling?                           ___    ___

22.  Was excess water collected and/or directed
     to a drain to prevent making a mess?            	    	

     C.  AFTER SAMPLE COLLECTION

     Question                                        Yes    No    Comments

1.   a.  Was additional ice added as needed to
         the kits after the sampling activities?     	    	
     b.  Was the neck of the inner bag twisted and
         taped so that leaks would not occur?

2.   Was the Federal Express airbill removed from
     inside the ziplock bag and kept separate to
     avoid confusion with other airbills?            	    	

3.   Did the sampler properly replace the styrofoam
     inserts and the sample container kit lid?       	    	

4.   Was there a plastic bag without holes or tears
     between the kit and the cardboard box?          	    	

5.   Were the ends of the plastic bag twisted
     together to form a neck which was taped
     at the base?                                    	    	

6.   Was the plastic bag end folded and taped again,
     or taped down onto the kit?                     	    	

ADDITIONAL COMMENTS
                                                Report No.:
                                                Date: 	
                                                Page: ___  of

-------
 III.  DWS WELL SAMPLING PROCEDURES  (Continued)

     C.  AFTER SAMPLE COLLECTION

     Question                                         Yes     No     Comments

 7.   Was the box taped shut,  being  careful  not  to
     cover the labels located on the two opposite
     sides of the box?                                	     	     	
 8.   Was the pre-addressed Federal  Express  airbill
     placed in the clear adhesive pouch on  the
     appropriate box?                                 	     	     	

 9.   Was Part A of the Final Community Water System
     Checklist completed?                             	     	     	

 10.  Was the Community Water System Well Observation
     Record completed and checked?                    	     	     	

 11.  Was the site policed for trash prior to
     leaving?                                         	     	     	

     D.  SHIPMENT OF SAMPLES TO LABORATORIES

     Question                                         Yes     No     Comments

 1.   a.  Did Federal Express pick up the samples?     	    	    	

     b.  Were the samples taken to a Federal
         Express office?                              	    	    	

2.   Were samples relinquished to a Federal
     Express agent?                                  	    	   	

3.   Was a copy of each airbill collected from the
     Federal Express agent and placed in the Field
     Logbook?                                        	    	    	

ADDITIONAL COMMENTS
                                                Report No.:
                                                Date: 	
                                                Page: 	  of

-------
III.  DWS WELL SAMPLING PROCEDURES (Continued)

     D.  SHIPMENT OF SAMPLES TO LABORATORIES

     Question                                        Yes    No    Comments

4.   Was unused material shipped back to ICF by
     Federal Express "Standard Air" two-day
     shipment?                                       	    	

5.   Was the Final DWS Checklist, Part C, completed? 	    _

6.   Did the sampling team call the NPS Hotline to
     inform the Sample Tracking System that kits
     had been sent to the laboratory?                	    	

7.   Were logbooks/questionnaires either sent to
     Westat or plans made to send them after the
     Local Area Interview?                           	
                      COMPLETION OF DWS QUESTIONNAIRE
     Question                                        Yes    No    Comments

1.   a.  Was the DWS Questionnaire administered to
         the head of the sample household?           _

     b.  If the answer to item #1 is no, then list
         to whom it was administered.                _

2.   a.  Was the DWS Questionnaire administered during
         the time that samples were collected?       _

     b.  If no, then list when the interview
         occurred.                                   _

     c.  Did the interview take place in an
         appropriate setting?                        _
ADDITIONAL COMMENTS
                                                Report No.:
                                                Date: 	
                                                Page: ___  of

-------
COMPLETION OF DWS QUESTIONNAIRE  (Continued)
     Question                                        Yes    No    Comments
3.   Are the name(s) of all respondents  supplied on
     the Interview Record of Contacts?               	    	    	
4.   Was the respondent's name correctly printed on
     Page 1 of the questionnaire?                    	    	    	
5.   Was the respondent's telephone number
     recorded in appropriate boxes?                  	    	    	
6.   Were all applicable questions answered in
     Section A of the questionnaire?                 	    	    	
7.   Were all applicable questions answered in
     Section B of the questionnaire?                 	    	    	
8.   Were all applicable questions answered in
     Section C of the questionnaire?                 	    	    	
9.   Were all applicable questions answered in Section
     D of the questionnaire?                         	    	    	
10.  Did the interviewer check the appropriate slot
     if drilling records or inspection documents
     were used to obtain information in Section D?   	    	    	
11.  Were all written comments and responses
     legible?                                        	    	    	
12.  a.  Were all responses correctly coded?         	    	    	
     b.  Were prompted responses marked with an (X)? 	    	    	
     c.  Were questions re-read or clarified so
         that bias was not introduced?               	    	    	
ADDITIONAL COMMENTS
                                                Report No.:
                                                Date:  	
                                                Page:  	  of

-------
COMPLETION OF DWS QUESTIONNAIRE (Continued)

     Question                                        Yes    No    Comments

12.  d.  Was the interviewer always in control of
         the situation?
     e.  Did the interviewer act in a professional
         but courteous manner?

     f.  Did the respondent become enraged, upset,
         or annoyed during the interview?

     g.  Did the interviewer use exact NPS
         definitions and not introduce their own
         definitions or opinions?

13.  Was evidence of erasures present in the
     completed questionnaire?

14.  Did the interviewer use the following criteria
     for making corrections to responses:

     a.  Draw a line in blue pencil through the
         original answer.

     b.  Write the corrected answer in blue pencil
         above or beside the original response.

15.  Did the interviewer examine the completed
     forms and questionnaires prior to leaving the
     site to ensure no questions were accidentally
     skipped?

16.  Did the Field Team Leader review all completed
     questionnaires and forms prior to leaving the
     site?
ADDITIONAL COMMENTS
                                                Report No.:
                                                Date: 	
                                                Page: 	  of

-------
                                            Appendix D
                                            Date: February 1992
                                            Page 1 of 5
          APPENDIX D

GENERAL NPS AUDIT CHECKLIST

-------
                          NATIONAL PESTICIDE SURVEY
                           GENERAL AUDIT CHECKLIST
DATE  	     QAPjP
AUDITOR(S)  	     ORGANIZATION
CONTRACT NO.

REPORT NO.
                                        PERSONS CONTACTED/TITLE
SECTION I:  QA MANAGEMENT SYSTEMS FOR NPS ACTIVITIES
                  Question	     Yes  No  N/A   	Comments
1.    Is the latest copy of the QA Plan
      available?

2.    Does the QAPjP contain all the
     applicable signatures?

3.    Are personnel familiar with the
     QAPjP?

4.    Is the QA function being implemented
      as described in the QAPjP?

5.    Do internal organization charts show
      a QA function that operates outside
      of the technical unit that generates
      the measurement data?

6.    Does the QA function located externally
      to the project review the data?

7.    Is a record maintained of internal
     audits?

8.    Does the audit record show that the
     system and performance audits are
     conducted as described in the QAPjP?

-------
                             Question	     Yes  No  N/A   	Comments
 9.    Is a system in place for determining
       that protocol has been followed?

 10.   Are failures in protocol documented?

 11.   If  failures have occurred,  has
      corrective action been documented?

 12.   Have long-term problems been
      encountered?

 13.   If  yes,  has the problem and
      subsequent corrective action
      been documented?

 14.  Are  personnel at all levels  aware
     of the recourse within their
     organization for correcting  problems?

 15.   Does management show visible support
       for quality assurance?

 16.   Have questions been openly and
       honestly answered?

-------
SECTION II:  PROJECT MANAGEMENT  SYSTEM

	Question	     Yes  No  N/A   	Comments

1.    Are the individuals  currently
     performing the work  the  same
     individuals who were originally
     assigned to perform  the  work as
     described in the QAPjP?                  	  	  	

2.    If deviations have occurred, have
     they been documented?                    	  	  	

3.    Does the line manager  allow for
     quick resolution of  problems?            	  	  	

4.    Are the phone number and address of
      the project manager current?             	  	  	

5.    Have new employees been adequately
      prepared for NPS work?                  	  	  	

6.    Does a supervisor review and initial
     daily logs for content and
     completeness?                            	  	  	

7.    Have monthly reports been submitted
     to the technical monitor?               	  	  	

8.    Are current staffing levels sufficient
     to meet the needs of the NPS in a
     timely and efficient manner?             	  	  	

9.    Do personnel have a  copy of the
     appropriate SOP at their
     workstation?                             	  	  	

10.  Are SOPs listed in the QAPjP being
     followed?                                	  	  	

-------
 SECTION  III.   DATA MANAGEMENT  SYSTEMS

 	Question	Yes  No  N/A   	Comments

 1.    Has  a  document  filing  system
      been established?                        	  	  	

 3.    Are  written  procedures available
      for  storage  and retrieval of files?     	  	  	

 4.    Are files stored in an accessible
       yet securable area?                      	  	  	

 5.    Is  the storage  facility for  files
      adequate?                               	  	  	

 6.    Have standards  been  established  for
      contents  of  the file?                    	  	  	

 7.    Are  the files checked  for
      completeness against the  standard?       	  	  	

 8.    Have data management systems been
       validated prior to use?                 	  	  	

 9.    Are computer system checks performed
       and documented?                         	  	  	

 10.   Are software packages documented?       	  	  	

 11.   Is  routine maintenance being
      performed on the computer systems?       	  	  	

 12.   Is  service readily available for
      computer hardware and  software?          	  	  	

-------