                         United States
                         Environmental Protection Agency
                         Office of Research and Development
                         Washington, DC 20460

EPA/600/R-96/027
November 1995

Guidance for the Preparation of
Standard Operating Procedures (SOPs)
for Quality-Related Documents

           EPA QA/G-6

-------
                               TABLE OF CONTENTS

1.     Purpose of the Standard Operating Procedures (SOPs)
2.     Applicability
3.     SOP Logistics
4.     General Format
5.     Checklists
6.     Types of SOPs
7.     Suggested Format for a Technical SOP
8.     Suggested Format for an Administrative SOP
9.     Suggested References
-------
                       GUIDANCE FOR THE PREPARATION
                    OF STANDARD OPERATING PROCEDURES
                      FOR QUALITY-RELATED DOCUMENTS

1.0    Purpose of the Standard Operating Procedure

       A Standard Operating Procedure (SOP) documents routine or repetitive administrative
       and technical activities to facilitate consistency in the quality and integrity of the product.
       The development and use of SOPs for both technical (e.g., measurements) and
       administrative (e.g., document review/tracking) functions are an integral part of a successful
       quality system.  SOPs facilitate activities that would be managed under a work plan, a
       Quality Assurance Project Plan (EPA QA/R-5), or Chapter 5 of the EPA Quality Manual.

       The development and use of an SOP promote quality through consistency within the
       organization, even if there are personnel changes.  Therefore, SOPs could be used as a
       part of a personnel training program. When reviewing historical data, SOPs are valuable
       for reconstructing project activities when no references are available. Additional benefits
       of an SOP are reduced work effort, along with improved data comparability, credibility,
       and defensibility.

       This guidance document is designed to assist in the preparation and review of an SOP.  To
       clarify terms for the purpose of this guidance, "protocol" is used to describe the actions of
       a program or group of activities and should not be confused with an SOP. The terms
       "shall" and "must" are used when the element mentioned is required and deviation from
       the specification will constitute nonconformance with the standard. The term "should"
       indicates that the element mentioned is recommended. The term "may" indicates when the
       element is optional or discretionary. The terms (shall, must, should, and may) are used as
       noted in ANSI/ASQC E4-1994.

2.0    Applicability

       An SOP is intended to be specific to the organization or facility whose activities are
       described. For example, if the SOP is written for a standard analytical method, the SOP
       specifies analytical procedures in greater detail than appears in the published method to
       ensure that the procedure is conducted in a standard, reliable, and reproducible fashion
       within the organization.  An SOP delineates the specific procedures used to carry out a
       method and how, if at all, the SOP differs from the standard method.

       As noted in ASTM D 5172-91, "A significant part of the variability of results generated by
       different laboratories analyzing the same samples and citing the same general reference is
       due to differences in the way the analytical test methods and procedures are actually
       performed in each laboratory. These differences are often caused by the slight changes or
       adjustments allowed by the general reference, but that can affect the final results."

-------
       Any activities that are classified as "Inherently Governmental Functions" and, as such,
       must be performed only by EPA employees (or other designated Federal employees),
       should be so indicated.

3.0    SOP Logistics

       The SOP needs to be written by individuals knowledgeable about the activity, the
       equipment, and the organization's internal structure. The SOP should be written with
       sufficient detail so that someone with limited experience with or knowledge of the
       procedure can successfully reproduce the activity or procedure.  The experience
       requirement can be noted in the section on personnel qualifications.

       The SOP should be reviewed (that is, validated) by one or more individuals with
       appropriate training and experience with the process.  The final  SOP should be approved
       at least by the immediate supervisor (section/branch chief) and the quality assurance
       officer, or as described in the organization's Quality Management Plan (QMP).

       Whenever procedures are changed, the SOP should be modified and approved as stated
       above. The revision number and date should be indicated after each modification.

       The SOP should be reviewed periodically as defined by the QMP to ensure that the
       policies and procedures remain current and appropriate. Any SOP that is current does not
       need to be revised. However, the review date should be added to document that the SOP
       has been reviewed.

       Current copies of the SOPs should be readily available for reference in the work areas of
       those individuals actually performing the activity. Individuals following these SOPs
       should note the revision number being used. This is critical when the need for evidentiary
       records is involved and when the activity is being reviewed.

       Each organization should maintain in its files a master  list and file of all SOPs, including
       the date of the current version, in accordance with its approved  QMP.  Outdated versions
       need to be maintained in a manner to prevent their continued use and to be available for
       historical data review. The Quality Assurance Manager (or designee) is responsible for
       maintaining a file listing all current quality-related SOPs used within the organization.
       This list may be used when audits are being considered and questions are raised as to
       practices being followed within the organization.
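
       As an illustration only, the master list can be kept as a simple electronic table. The short
       Python sketch below shows one possible record layout, using the two example SOPs attached
       to this guidance; the field names and the CSV file format are assumptions made for the
       illustration and are not prescribed by this document or by any QMP.

           # Illustrative master SOP list (field names and file format are assumptions,
           # not Agency requirements). The two entries are the example SOPs in the Appendix.
           import csv
           from datetime import date

           master_list = [
               {"sop_id": "C-47", "title": "Determination of Color",
                "revision": 2, "revision_date": date(1992, 7, 1), "status": "current"},
               {"sop_id": "MSR-Interviews", "title": "Management Systems Review Interview",
                "revision": 0, "revision_date": date(1995, 7, 1), "status": "current"},
           ]

           # Write the list to a file that can be produced when audits are being considered.
           with open("sop_master_list.csv", "w", newline="") as f:
               writer = csv.DictWriter(f, fieldnames=master_list[0].keys())
               writer.writeheader()
               writer.writerows(master_list)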

-------
4.0    General Format

       An SOP should be organized to ensure ease and efficiency in use. Development of short
       and simple SOPs and citation of other available SOPs or documents are highly
       recommended practices.

       4.1    Title Page
             The first page of each SOP should be a title page having the following information:
             a title that clearly identifies the activity or procedure, the name of the applicable
             agency/group, and the date and signatures of those individuals who prepared and
             approved the SOP.

       4.2    Table of Contents
             A Table of Contents is needed only if the SOP is longer than ten pages.

       4.3    Control Documentation
              Each page of the SOP should have control documentation, as illustrated below,
              generally in the upper right-hand corner. A short title can identify the activity
              covered by the SOP and serve as a reference designation.  The revision number
              and date are useful in identifying the SOP in use when reviewing historical data.
              The user can also quickly check if the SOP is complete when the number of pages
              is indicated. Suggested control documentation format:

                                                                   Short Title
                                                                   Rev. #: 0
                                                                   Date: July 1995
                                                                   Page 1 of 6

       4.4    Text
              A well-written SOP has three sections: procedural, QA/QC, and reference.  The
              text of an SOP should be clearly worded so as to be readily understandable by a
              person knowledgeable about the general concept of the procedure.  The procedures
              should be written in a step-by-step (cookbook) format that clearly describes the
              steps in chronological order. Use the active voice and present verb tense. The
              term "you" should not be used, but implied.

             An SOP can reference other SOPs. In such a case, cite the other SOP or attach a
             copy.

5.0    Checklists

       Many activities use checklists to ensure that steps are followed in order.  Checklists also
       document completed actions. Any checklists or forms that are included as part of an
       activity should be referenced at the points in the procedure where they are used; blank and
       completed copies of the checklists should be attached to the SOP.

-------
       In some cases, detailed checklists are prepared specifically for a given activity, as for an
       inspection.  In those cases, the SOP should describe, at least generally, how the checklist
       is to be prepared, or on what it is to be based.  Copies of specific checklists are then
       maintained in the file with the activity results and/or with the SOP. Remember that the
       checklist is not an SOP, but a part of one.

6.0    Types of SOPs

       An SOP may be written for a repetitive administrative procedure as well as for a technical
       activity. Examples are:  QA procedures for conducting assessments; equipment use;
       maintenance and calibration; and collection of samples.  General guidance for preparing
       both Technical and Administrative SOPs follows.

       The Agency has prescribed a format for documenting environmental monitoring methods,
       entitled EMMC Methods Format (attached). This methods format is sometimes confused
       with an SOP, perhaps because methods also include step-wise procedures that are to be
       followed by an analyst. However, monitoring methods contain information that is not
       essential to performing a repetitive technical activity, e.g., sections on method sensitivity,
       method performance, validation data, and pollution prevention.

7.0    Suggested Format for a Technical SOP

       7.1   Procedural Section:
             After the title page, the following are topics that may be appropriate for inclusion
             in a technical SOP:
             a.     Scope & Applicability,
             b.     Summary of Method,
             c.     Definitions (acronyms, abbreviations and specialized forms used in the
                    SOP),
             d.     Health & Safety Warnings (indicating operations that could result in
                    personal injury or loss of life and explaining what will happen if the
                    procedure is not followed or is followed incorrectly; listed here and at the
                    critical steps in the procedure),
             e.     Cautions (indicating activities that could result in equipment damage,
                    degradation of sample or possible invalidation of results; listed here and at
                    the critical steps in the procedure),
             f.     Interferences,
             g.     Personnel Qualifications,
             h.     Apparatus & Materials (list or specify; note also designated locations
                    where found),
             i.      Instrument or Method Calibration,
             j.      Sample Collection,
             k.     Handling & Preservation,
              l.      Sample Preparation and Analysis,
             m.     Troubleshooting,
             n.     Data Acquisition, Calculations & Data Reduction,
             o.     Computer Hardware & Software (used to manipulate analytical results and
                    report data), and
             p.     Data Management & Records Management

      7.2    Quality Control and Quality Assurance Section
              QC activities are designed to allow self-verification of the quality and consistency
             of the work. Describe the preparation of appropriate QC procedures (self-checks,
             such as calibration) and QC material (such as blanks - rinsate/trip/field/method,
             replicates, splits and spiked samples, and performance evaluation samples) that are
             required to successfully demonstrate performance of the method.  Specific criteria
             for each should be included. Describe the frequency of required calibration and
             QC checks and discuss the rationale for decisions. Describe the limits/criteria for
             QC data/results and actions required when QC data exceed QC limits or appear in
             the warning zone. Describe the procedures for reporting QC data/results.

             Specify and describe any QA procedures that are integral parts of the activity,
             including performance audits, outside audits or reviews, or other activities. These
             can be referenced from the organization's QMP. Specify who or what
             organization is responsible for each QA activity, where or how QA materials are to
             be procured and/or verified. Assign responsibility for taking corrective action,
             based on the results of the QA activities.

      7.3    Reference Section
             Documents or procedures that interface with the SOP should be fully referenced
             (including version), such as related SOPs and published literature or methods
             manuals. Citations cannot substitute for the description of the method being
             followed in the organization. Fully cite all references noted in the body of the SOP
             and attach any that are not readily available.

8.0   Suggested Format for an Administrative SOP

      When auditing, reviewing, and/or inspecting the work of others, the SOP needs to include
      a number of specific steps aimed at making initial contact with the subject of the activity,
      coordinating the activity, and reporting.  An SOP for a general activity (e.g., a
      management systems review or laboratory audit) should be covered by SOP guidance
      tailored to that activity. The SOP guidance should fit within the framework presented
      here, but can be modified, reduced, or expanded.

-------
       8.1    Procedural Section
             The following are topics that may be appropriate for inclusion in an administrative
             SOP:
             a.     Title,
             b.     Purpose,
             c.     Applicability,
             d.     Summary of Procedure,
             e.     Definitions,
             f.     Personnel Qualifications, and
             g.     Procedure.

             Audits or assessments SOPs should specify the authority for the assessment, how
             auditees are to be selected, what will be done with the results, and who is
             responsible for corrective action.

       8.2    Quality Control and Quality Assurance Section
             Describe any control steps and provisions for review or oversight prior to
             acceptance of the product or deliverable.  This can include test plans such as
             verification and  validation plans for software.

       8.3    Reference Section
             Cite all references noted in the body of the SOP. A copy of any cited references
             not readily available should be attached to the SOP.

9.0    Suggested References

   (1)  ANSI/ASQC E4-1994, Specification and Guidelines for Quality Systems for
        Environmental Data Collection and Environmental Technology Programs, American
        National Standard, ASQC,  Milwaukee, WI (January 1995).

   (2)  EPA Manual for the Certification of Laboratories Analyzing Drinking Water: Criteria
        and Procedures/Quality Assurance, U.S. EPA, Washington, DC (Draft 1995).

   (3)  Garner, Willa Y. and Maureen S. Barge, editors, "Good Laboratory Practices: An
        Agrochemical Perspective," ACS Symposium Series 369, American Chemical Society
        (1988).

   (4)  Herron, Nelson R., "Standard Operating Procedures: Developing Useful Procedures,
        Part 1," Environmental Testing and Analysis (1994), pp. 41-44.

   (5)  Standard Guide for Documenting the Standard Operating Procedures Used for the
        Analysis of Water, ASTM D 5172-91, American Society for Testing and Materials,
        Philadelphia, PA (1991).

-------
                                     APPENDIX

These SOPs are not purported to be perfect or complete in content, but are provided merely to
illustrate application of the SOP format for both technical and administrative subjects.

Like the G-6 Guidance, the SOPs both take advantage of the "Header" format in Word Perfect.
You must use the "VIEW" or "SCREEN" function to see the headers when using electronic
formats.

-------
                                      C-47 Color
                                      Rev. #: 2
                                      Date: July 1992
                                      Page 1 of 5
        STANDARD OPERATING PROCEDURE (SOP)
          FOR THE DETERMINATION OF COLOR

       DRAFT EXAMPLE - DO NOT QUOTE OR CITE

 Prepared by: _________________________   Date: ____________
                        Chemist

 Reviewed by: _________________________   Date: ____________
                     Section Chief

 Approved by: _________________________   Date: ____________
                     Branch Chief

        U.S. ENVIRONMENTAL PROTECTION AGENCY
                      REGION XI

-------
                      •" j                                        .   C-47 Color
                            '                          "       .       Rev.#:2
                                                                    Date: July 1992
                                                                    Page 2 of5

Procedural Section

1.0    Scope & Application

1.1    The Platinum-Cobalt method is useful for measuring color of water derived from naturally
       occurring material, i.e., vegetable residues such as leaves, barks, roots, humus and peat
       materials. The method is not suitable for color measurement on waters containing highly
       colored industrial wastes.

1.2    Detection limit is 5 color units.

1.3    The range is from 5 to 70 units. Higher values may be measured by dilution of the
       samples.

1.4    Note: The spectrophotometric and Tristimulus methods are useful for detecting specific
       color problems. The use of these methods, however, is laborious and, unless determination
       of the hue, purity, and luminance is desired, they are of limited value.

2.0    Summary of Method

2.1    Color is measured by visual comparison of the sample with platinum-cobalt standards.
       One unit of color is that produced by 1 mg/L platinum in the form of the chloroplatinate
       ion.

3.0    Health and Safety Warnings

3.1    Standard laboratory protective clothing and eye covering are required.

4.0    Cautions

4.1    Reagent standards must be prepared fresh on the day of analysis.

4.2    Determination must be made within 48 hours of collection, and the sample stored at 4°C.

5.0    Interferences

5.1    Since very slight amounts of turbidity interfere with the determination, samples showing
       visible turbidity should be clarified by centrifugation. Alternatively, samples may be filtered.
       If turbidity is removed, the results are reported as "true color"; otherwise, the results are
       reported as "apparent color."

-------
5.2    The color value of water may be extremely pH-dependent and may increase as the pH of
       the water is raised. When reporting a color value, specify the pH at which color is
       determined.

5.3    Absorption of ammonia by the standards will cause an increase in color.

6.0    Personnel Qualifications

6.1    Technicians should be trained in the method for at least one week before performing the
       procedure alone.

7.0    Apparatus & Materials

7.1    Nessler tubes: Matched, tall form, 50 ml capacity.

7.2    Racks for Nessler tubes.

7.3    Miscellaneous lab glassware.

8.0    Method Calibration

8.1    Chloroplatinate Stock Standard, 500 units: Add 100 ml concentrated HCl to 500 ml
       reagent grade deionized water. Dissolve 1.246 g Potassium Chloroplatinate and 1.0 g
       Cobaltous Chloride Monohydrate in this mixture and dilute to 1000 ml. This may be
       purchased from Fisher Scientific as Platinum Cobalt Standard and is equivalent to 500
       color units.

8.2    Prepare the following series of standards, fresh on the day of the analysis.

              ml of standard solution diluted to 50 ml        Color in
              with Reagent Grade Deionized Water              Chloroplatinate Units

                            0.0                                     0
                            0.5                                     5
                            1.0                                    10
                            1.5                                    15
                            2.0                                    20
                            2.5                                    25
                            3.0                                    30
                            3.5                                    35
                            4.0                                    40
                            4.5                                    45
                            5.0                                    50
                            6.0                                    60
                            7.0                                    70
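
       Because one color unit is produced by 1 mg/L platinum (Section 2.1), each milliliter of the
       500-unit stock diluted to a final volume of 50 ml contributes 500/50 = 10 color units. The
       short Python sketch below simply reproduces the table above as an arithmetic check; it is
       illustrative only and is not part of the procedure.

           # Arithmetic check of the standard series: units = ml of stock x (500 / 50).
           STOCK_UNITS = 500      # chloroplatinate stock standard (Section 8.1)
           FINAL_VOLUME_ML = 50   # final volume of each standard

           for ml_stock in [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0, 4.5, 5.0, 6.0, 7.0]:
               units = ml_stock * STOCK_UNITS / FINAL_VOLUME_ML
               print(f"{ml_stock:4.1f} ml diluted to 50 ml -> {units:3.0f} color units")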
9.0    Sample Collection, Preservation and Storage
9.1    Representative samples shall be taken in scrupulously clean containers. Both glass and
       plastic containers are acceptable.

10.0   Preservation

10.1   Since biological activity may change the sample color characteristics, the determination
       must be made within 48 hours of collection.  Samples should be stored at 4° C.

11.0   Sample Analysis Procedure

11.1   Apparent Color: Observe the color of the sample by filling a matched Nessler tube to the
       50 ml mark with the sample and comparing it with the standards. This comparison is made by
       looking vertically downward through the tubes toward a white or specular surface placed
       at such an angle that light is reflected upward through the columns of liquid. If turbidity
       has not been removed by the procedure given in 7.2, report color as "apparent color."

11.2   True Color: Remove turbidity by centrifuging until supernatant is clear; up to one hour
       may be required.  Samples can also be filtered through a Whatman #541 filter paper.
       Results are reported as "true color" if steps are taken to remove turbidity.

11.3   Measure and record pH of each sample (see SOP C-24).

11.4   Dilute any sample with more than 70 units of color and reanalyze.

12.0   Data Analysis & Calculations

12.1   Calculate the color units by means of the following equation:

              Color units = (A x 50) / V

       where:
              A = estimated color of the diluted sample, and
              V = ml of sample taken for dilution.
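
       For a sample that was diluted before comparison, the equation scales the color estimated for
       the dilution back to the original sample. The brief Python sketch below applies the equation
       with hypothetical values; it is illustrative only and is not part of the method.

           # Illustrative use of the equation in Section 12.1 (values are hypothetical).
           def color_units(estimated_color, ml_sample_diluted_to_50):
               # A = estimated color of the diluted sample; V = ml of sample taken for dilution.
               return estimated_color * 50 / ml_sample_diluted_to_50

           # Example: 25 ml of sample diluted to 50 ml reads 40 units against the standards,
           # so the undiluted sample works out to 40 x 50 / 25 = 80 color units.
           print(color_units(40, 25))   # 80.0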

-------
12.2  Report the results in whole numbers as follows:

                    Color Units       Record to Nearest
                      1 - 50                   1
                     51 - 100                  5
                    101 - 250                 10
                    251 - 500                 20
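
       The reporting increments above can be applied mechanically once a value is calculated. The
       Python sketch below shows one illustrative way to do so; it is not part of the method.

           # Illustrative rounding to the reporting increments in Section 12.2.
           def report_color(value):
               if value <= 50:
                   step = 1
               elif value <= 100:
                   step = 5
               elif value <= 250:
                   step = 10
               else:
                   step = 20
               return int(round(value / step) * step)

           print(report_color(80))    # 80  (reported to the nearest 5)
           print(report_color(137))   # 140 (reported to the nearest 10)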
13.0  Data Management and Records Management

      All laboratory records must be maintained in the bound record book designated for the
      method.

Quality Control and Quality Assurance Section

1.0   There are no QC samples for color at this time.

2.0   Choose one sample per set of analyses and run in triplicate.  RSD% should not be greater
      than 20%.
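
      The percent relative standard deviation (RSD%) of the triplicate is the standard deviation
      divided by the mean, expressed as a percentage. A minimal Python sketch with hypothetical
      results, for illustration only:

          # Illustrative RSD% check for one sample run in triplicate (hypothetical values).
          from statistics import mean, stdev

          triplicate = [28, 30, 31]   # color units from the same sample analyzed three times
          rsd_percent = stdev(triplicate) / mean(triplicate) * 100
          print(f"RSD = {rsd_percent:.1f}%  (acceptance limit: 20%)")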

3.0   Spikes are not applicable to color determination.

References

1.    Standard Methods for the Examination of Water and Wastewater, 18th Edition,
      Supplement

2.    Methods for Chemical Analysis of Water and Wastewater Method #110.2

-------
                                            MSR Interviews
                                            Rev. #: 0
                                            Date: July 1995
                                            Page 1 of 10
       STANDARD OPERATING PROCEDURE (SOP)
 FOR THE MANAGEMENT SYSTEMS REVIEW INTERVIEW

     DRAFT EXAMPLE - DO NOT QUOTE OR CITE

 Prepared by: _________________________   Date: ____________
                Environmental Scientist

 Reviewed by: _________________________   Date: ____________

 Approved by: _________________________   Date: ____________
                      Director

     U.S. ENVIRONMENTAL PROTECTION AGENCY
           QUALITY ASSURANCE DIVISION

-------

1.0    Purpose and Applicability of the Management Systems Review

       Beginning in 1992, EPA declared the documentation of its QA program to be a material
weakness under the reporting requirements of the Federal Managers' Financial Integrity Act
(Integrity Act). Furthermore, recent audits by the General Accounting Office (GAO) and the
Inspector General (IG) indicate that there is substantial uncertainty regarding the quality of
environmental data being used for key regulatory and programmatic decisions.

       EPA Order 5360.1, Policy and Program Requirements to Implement the Mandatory
Quality Assurance Program (April 1984), directs the Office of Research and Development (ORD)
to review and approve the implementation of QA programs across the Agency. Under the speci-
fications of EPA Order 5360.1, all EPA organizations conducting "environmentally-related
measurements" are required to develop and implement a QA program for their work. In response
to the Agency's Integrity Act weakness, all Agency organizations were again directed to complete
documentation of their QA programs by submitting Quality Management Plans (QMP) to ORD
for review and approval. The QMP describes the policies and procedures, roles and
responsibilities, and quality assurance/quality control (QA/QC) activities used to plan, implement,
and assess the effectiveness of the QA program applied to the collection and use of environmental
data for decision making. In addition, the Assistant Administrator for (the Office of) Research
and Development (AA/ORD) is delegated review and approval authority for QMPs as the Senior
Quality Management Official for the Agency.

       EPA Order 5360.1 also directs the AA/ORD to periodically assess the effectiveness of the
QA programs being implemented.  In FY 94, the AA/ORD directed the Quality Assurance
Division (QAD) to initiate a management assessment program in which all Agency organizations
collecting and using environmental data for decision making would be reviewed at least once
every three years.

       In response to this directive, QAD is employing the management systems review (MSR)
process to examine the effectiveness of quality systems applied to Agency environmental
programs. The focus of the MSR is on systems. The assessment seeks to determine that a quality
system is established and is operating within the organization in a manner such that potential
vulnerabilities in environmental data beyond that of inherent error may be detected, prevented,
and resolved. This is accomplished by examining the processes used to plan,  implement, and
assess the effectiveness of the QA/QC applied to environmental data collection activities.

       The MSR process may be applied to both organizations and to specific data collection
programs involving multiple organizations. The MSR process uses background documentation,
file reviews, case studies, and interviews of managers and staff involved in environmental data
operations to assess the effectiveness of the quality system relative to its stated objectives in the
QMP.  The MSR process is not an audit in the traditional sense in that it seeks to recognize
noteworthy accomplishments and to identify needed improvements. Moreover, the MSR process
does not judge the quality of any data or the performance of any environmental data collection
activities. The results of the MSR are provided only to the principal client, in this case, the
AA/ORD, and to the program reviewed. QAD will not distribute copies of the final report or
discuss the findings with any other organization.

1.1    Summary of Procedure for MSR Interviews

       As noted above, the interview is one component of the MSR process. Within a one-hour
session, the interviewer introduces him/herself, checks the name and position of the interviewee,
and then asks a series of questions relating to the planning, implementing, and assessment of
environmental measurement activities in the organization. The interviewer allows the interviewee
to suggest any way that the quality program could be better served by QAD or the Agency, and
then thanks him/her for his/her time.  Related documentation is examined and relevant copies of
documents are collected or arranged to be collected.

1.2    Qualifications for MSR Interviewers

       The (EPA employee) interviewer must have good communications skills and complete the
QAD two-day MSR course. Some familiarity with Agency QA regulations and current guidance
is necessary. In addition, there will be a briefing on the planned MSR by the team leader before
beginning a particular review. The organization's QMP and QA Annual Report and Work Plan
should be studied and discussed.

1.3    MSR Interview Procedure

1.3.1   Deportment for Interviews

       Conduct the interview with a partner.  This enables verification of the content of the notes
by the report writer.  Conduct the interview in a congenial manner to put the interviewee at ease.
You may need to remind the interviewee that the information is for internal use, and that this is
not an audit. Remember that you are there to learn about the QA/QC practices in the
organization, not to judge their data or to criticize them. You may, however, impart helpful
information about current QA practices, guidance, requirements and training, if it does not
interfere with the purpose of the interview.

-------
1.3.2  Opening the MSR Interview

       Introduce yourself and your partner and give your organization(s). Verify (write in your
notes) the name (check spelling) and organizational unit/position of the interviewee and the
date/time of the interview. Ask the interviewee if he/she understands the purpose of the
interview. If the response is negative, quickly summarize the material in 1.0.

1.3.3   MSR Interview Questions from QAD Director Memorandum

       The following are ten groups of questions referred to in the memorandum from the QAD
Director to the organization, which are useful as a guide to the interview process:

1.     MANAGEMENT COMMITMENT AND ORGANIZATION

              Is the organizational structure of your quality system implemented as documented
              in the QMP?

              Are the duties and responsibilities of the QA Manager (and QA Coordinators, if
              present), as documented in the QMP, being consistently performed?

             Are the duties and responsibilities of managers and staff members relative to
             QA/QC understood within the organization? How do managers assure that
             assigned QA responsibilities are performed?

             Are sufficient resources provided for effective QA/QC, including planning,
             implementation, and oversight (e.g., FTEs, intramural and extramural funding,
             travel)?

             Is oversight of extramural or delegated programs conducted relative to quality?
             Are those responsible for oversight of extramural and delegated programs
             performing as needed?

2.     QUALITY SYSTEM DESCRIPTION

             Describe the preparation, review, and internal approval process for your Quality
             Management Plan (QMP).

             How are managers and staff informed of the requirements in the QMP? Do they
             understand their roles and responsibilities for QA/QC?


       What QA/QC "tools" (e.g., the DQO process, QA Project Plans, audits) are used
       routinely to plan, implement, and assess environmental data collection activities?

        How are the QA/QC activities described in the QMP implemented? How is senior
       management assured that the QMP is being implemented as prescribed?

        Are requirements and guidance documents readily available, understood, and used
       by staff members?

       How have changes to or replacements for Agency-wide guidance been made in
       order to "tailor" QA/QC requirements to your mission?

       How do you know that the quality system is working? How do you measure the
       effectiveness of quality systems for external organizations (e.g., contractors,
       assistance agreement holders)?

       What is the role of non-EPA personnel (e.g., contractors, assistance agreement
       holders, other Federal Government employees) in implementing your quality
        system?

3.     PERSONNEL QUALIFICATIONS AND TRAINING

       How do you know that your personnel are qualified to perform the environmental
       data collection activities needed?

       How does management assure that only qualified personnel perform work
       affecting the quality of the results?

       What is the process for determining QA-related training needs, providing the
       training, and measuring its effectiveness? Who is responsible for this?

       What QA-related training is provided to  managers and staff, and how often?

       What training do you currently need?

4.     PROCUREMENTS AND ACQUISITIONS

       What is your process for specifying QA and QC requirements in procurements,
       acquisitions, assistance agreements, etc.? What is the role of the QA Manager in
       this process?

-------

      How do you incorporate QA/QC requirements into work assignments, technical
       directives, etc.?

       How do you assure that environmental data operations performed by external
      groups (such as, contractors providing analytical services) satisfy all QA/QC speci-
       fications and requirements?

       How do you assure that items and services procured (e.g., consumables, reagent-
      grade chemicals, analytical services) conform to technical and quality specifica-
      tions?

5.     DOCUMENTS AND RECORDS

      What is your process for identifying and keeping necessary documents and records
      to support your decisions based on environmental data collection activities?

      How are specific QA and QC records and documents identified?  What happens to
      these records and documents?

6.     USE OF COMPUTER HARDWARE AND SOFTWARE

       How do you assure that computer hardware and software configurations perform
      as required for environmental data operations?

      How do you assure that specialized computer software is developed in accordance
       with specifications and performs as required for environmental data operations?

       How is the quality of environmental data in computerized data bases and
      information systems identified and documented?

7.     ENVIRONMENTAL DATA COLLECTION - PLANNING

       What is the process used for planning environmental data operations? How is
       technical expertise in sampling, statistics, analytical services, and QA/QC
       provided?

      Is the Data Quality Objectives process used in planning environmental data
      operations? What has been the effect of using the DQO process? What other
      systematic planning processes are used?

      How is the effectiveness of the planning process for QA/QC determined?

-------

      -      How are QA Project Plans prepared, reviewed, and approved for environmental
             data collection performed intramurally?

              How are QA Project Plans prepared, reviewed, and approved for environmental
             data collection performed by contractors or assistance agreement holders?

8.     ENVIRONMENTAL DATA COLLECTION - IMPLEMENTATION

       -      What is the process used for implementing QAPPs or other planning
              documentation for environmental data operations as prescribed? How do
              managers assure that such implementation is accomplished?

       -      How are revisions to QAPPs (and other planning documents) made and
              maintained? How does management assure that project personnel have access to
              current documentation?

       -      How do you know that data compiled from computerized data bases and
             information systems is of adequate quality for use as intended? How do you
             develop the criteria for accepting these data?

9.     ENVIRONMENTAL DATA COLLECTION - ASSESSMENT

             What assessment methods (such as audits, peer reviews, surveillances, readiness
             reviews, performance evaluations, etc.) are used to examine the effectiveness of
             the technical and QA/QC activities in a project?

             What is the process for planning, conducting, and reporting the results of
             assessment activities?  Who is responsible for conducting assessments?

             Who assures that corrective actions are implemented in a timely manner? How is
             the effectiveness of corrective actions measured?
10.    QUALITY IMPROVEMENT

             What needs to be done to improve QA/QC in your environmental data collection
             activities?          ,

      -      What has or has not worked in the past to improve quality?

-------
1.3.4  QA Staff Interviews

       The above questions are especially appropriate for QA staff, and they may prepare
answers ahead of your visit to facilitate the interview. In addition, listen for innovative strategies
employed. Ask about interaction with and support from management.  Is there a strategy for
ensuring that all data collection activities include QA? Is there consistent participation throughout
the organization?  Discuss the current status of the QMP. Discuss attendance at
national/programmatic QA meetings and courses.  Are there resource limitations?

       Although not strictly an interview, during the MSR a time is scheduled to meet with the
QA staff to examine files.  Inspect guidance, training materials, files of QAPPs, logbooks,
printouts of tracking programs, reports from audits, assessments, and MSRs.  Look for timeliness
of reviews, frequency of audits, response to comments, signature blocks. Scan guidance/course
material for conformance with QAD guidance (new or old), age, program-specificity.

1.3.5  Intramural Data Collector Interviews

       Many of the questions above will not apply to non-QA staffers, and they sometimes
assume that you do not need to interview them. Furthermore, they may have limited familiarity
with "QA-speak."  Repeat questions about QA practices in "plain English" as a check for a lack of
understanding of QA jargon. For example, ask the interviewee to describe how projects are
planned or begun, even if he/she does not use the DQO process. Ask about his/her awareness of
the existence of the QMP and interaction with QA staff.

       Planning:

       Ask to discuss a current or recently completed data collection activity.  Ask who the
       clients are for the data, and how they participate in planning the objectives. Is a formal
       (DQO) process used? What decision is being based upon the data?  Is it regulatory or
        enforcement-related? Are QA staffers involved in planning?

       Implementation:

       Ask what steps are taken to ensure the  adequacy/quality of the work.  Be aware that many
       staffers will confuse QA and QC. Ask about their backgrounds relative to QA and their
       experience with defending their data against legal challenges. What (QA) training have
        they had?  What technical expertise is available to them in QA, experimental design,
        statistics, and automated data processing?  If they are modelers, how do they ensure the
       models are of good quality? How do they ensure the quality of the data they are using?
        Are QA plans written, reviewed, followed, and revised? How often, and who signs off?
        Are SOPs or checklists used? Ask for QA plan documentation; check signatures and
        topics addressed. Ask for an example SOP/checklist; check signoff, revision, dates.

       Assessment:                                                           •

        Ask what oversight is exercised. What part do others play in review? What are their
        qualifications? If there is peer review, how exactly does it work? Who selects the
        reviewers? What are their areas of expertise?  Who sees that comments are addressed?
       Are resources available to permit adequate training, attendance at professional meetings,
       oversight/audit of laboratory and field work? Is a final assessment made of whether set
       objectives were reached?  Is the QA plan used in that assessment? Are improvements
       made based on lessons learned? Ask to see audit/oversight guidance, checklists, and
       reports; and reviewed papers with comment/signoff memos.

1.3.6  Interviewing Project Officers/Work Assignment Managers

        The major difference from intramural data collection activities is that there are legal
requirements for contractors and assistance agreement holders with respect to QA (40 CFR Parts
30 and 31). Also, because the work is done by others, the project officer (PO) or work
assignment manager (WAM) is necessarily responsible for oversight. Try to determine if the PO
and/or WAM understands his/her role and responsibilities.  Use the
same questions as above, with additional resource questions.

       Planning:

        Ask the same questions as in Section 1.3.5. However, be aware that some funding
       instruments may be for portions of data collection activities, e.g., contracts or grants for
       analyses, sampling, or audit gas cylinders only.  Therefore, ask about the role of their
       project in a larger context.

       Implementation:

       In addition to questions in Section 1.3.5, ask about the award of funds relative to
       complying with QA requirements. Who writes, reviews and approves the QA plan? How
       are review comments addressed?  When are funds awarded (before or after QA plan
       approval)? When are plans revised? Ask to see the QA plan; check signatures and
       dates; scan topics addressed for coverage of all data collection activities from
       planning to sampling, analysis, and reporting, etc., as in Section 1.3.5.

       Assessment:
       In addition to questions in Section 1.3.5, ask about management and resource support for
       oversight responsibilities.  Is training/guidance provided? Do technical experts perform
       auditing/data validation and review/interpretation? Are site visits adequate to assess
       implementation of sampling and analysis plans? Are data bases adequate? Ask to see
       audit reports and guidance, etc. as in Section 1.3.5.

1.3.7  Closing the MSR Interview

       Ask the interviewee about any particular topics he/she would like to mention relative to
QA. Does he/she have specific needs QAD could address? Close the interview by thanking the
interviewee for his/her time. If there are documents to be collected for review, arrange delivery.
Do not detain anyone beyond one hour. If there are topics that must be discussed further,  ask
permission to call the person at a later time. Feel free to release staffers who are inappropriately
scheduled for interviews, or to release them early if there is little to be discussed.

2.0    Quality Control and Quality Assurance of the MSR Interview

       The primary quality assurance measure for the interview is the pairing of interviewers.
The report writer uses the two sets of notes of the interview as assurance that the information is
accurate. The report writer retains a file of all interviewer notes. Questions about notes can be
asked of the interviewers as the draft Findings Report is written.  All review team members must
review the draft and sign it before it is sent to the organization for its accuracy review. The
specific project and/or staff-level organization discussed in the interview is usually mentioned in
the report. Because the MSR concerns the quality system rather than individuals, do not quote
staffers or mention their names in the text of the report. However, include the complete list of all
interviewed staffers with their organizations in the appendix.

        When the reviewed organization has returned the draft, the final duty of interviewers is to
read the comments and recommend for incorporation any comments that correct statements in the
report, to reject comments that cannot be verified by notes, and to discuss rejected comments with
the writer/leader, who will work with the comment coordinator for the reviewed organization. Be
aware that clarifications submitted by the organization after the interview may make the report
more complete, but may obscure the lack of understanding of QA issues by the interviewees.

3.0    References

        Contact QAD in headquarters (202/260-5763) for current lists of requirements,
guidance, and course offerings.

-------
                                                                  ATTACHMENT
                        Environmental Monitoring Management
                          Council (EMMC) Methods Format
1.0    Scope and Application

        Use a tabular format whenever possible for:
        •      Analyte list(s)
        •      Chemical Abstract Service (CAS) numbers
        •      Matrices
        •      Method Sensitivity (expressed as mass and as concentration with a specific
               sample size)
       Include a list of analytes (by common name) and their CAS registry numbers, the
matrices to which the method applies, a generic description of method sensitivity (expressed
both as the mass of analyte that can be quantified and as the concentration for a specific
sample volume or size), and the data quality objectives which the method is designed to meet.
Much of this material may be presented in a tabular format.

2.0    Summary of Method

        •      Sample volume requirements
        •      Extraction
        •      Digestion
        •      Concentration, and other preparation steps employed
        •      Analytical instrumentation and detector system(s), and
        •      Techniques used for quantitative determinations

       Summarize the method in a few paragraphs. The purpose of the summary is to provide
a succinct overview of the technique to aid the reviewer or data user in evaluating the method
and the data. List sample volume, extraction, digestion, concentration, other preparation steps
employed, the analytical instrumentation and detector system(s), and the techniques used for
quantitative determinations.

3.0    Definitions

        Include the definitions of all method-specific terms here.  For extensive lists of
definitions, this section may simply refer to a glossary attached at the end of the method
document.

4.0    Interferences

       This section should discuss any known interferences, especially those that are specific
to the performance-based method. If known interferences in the reference method are not
interferences in the performance-based method,  this should be clearly  stated.

-------
5.0    Safety

        •      Above and beyond good laboratory practices
        •      Disclaimer statement (look at ASTM disclaimer)
        •      Special precautions
        •      Specific toxicity of target analytes or reagents
        •      Not appropriate for general safety statements

       This section should discuss only those safety issues specific to the method and beyond
the scope of routine laboratory practices. Target analytes or reagents that pose specific toxicity
or safety issues should be addressed in this section.

6.0    Equipment and Supplies

       Use generic language wherever possible. However, for specific equipment such as GC
(gas chromatograph) columns, do not assume equivalency of equipment that was not
specifically evaluated,  and clearly state what equipment and supplies were tested.

7.0    Reagents and Standards

       Provide sufficient details on the concentration and preparation of reagents and
standards to allow the work to be duplicated, but avoid lengthy discussions of common
procedures.

8.0    Sample Collection, Preservation and Storage

               Provide information on sample collection, preservation, shipment, and storage
               conditions.

               Holding times, if evaluated

        If effects of holding time were specifically evaluated, provide reference to relevant
data; otherwise, do not establish specific holding times.

9.0   Quality Control

       Describe specific quality control steps, including such procedures as method blanks,
laboratory control samples, QC check samples, instrument checks, etc., defining all terms in
Section 3.0. Include frequencies for each such QC operation.

10.0  Calibration and Standardization

        Discuss initial calibration procedures here. Indicate frequency of such calibrations,
refer to performance specifications, and indicate corrective actions that must be taken when
performance specifications are not met. This section may also include procedures for
calibration verification or continuing calibration, or these steps may be included in Section 11.0.

-------
11.0   Procedure

       Provide a general description of the sample processing and instrumental analysis steps.
Discuss those steps that are essential to the process, and avoid unnecessarily restrictive
instructions.

12.0   Data Analysis and Calculations

       Describe qualitative and quantitative aspects of the method.  List identification criteria
used. Provide equations used to derive final sample results from typical instrument data.
Provide discussion of estimating detection limits, if appropriate.

13.0   Method Performance

       A precision/bias statement should be incorporated in the section, including:

        •      detection limits
        •      source/limitations of data

       Provide detailed description of method performance, including data on precision, bias,
detection limits (including the method by which they were determined and matrices to which
they apply), statistical procedures used to develop performance specifications, etc. Where
performance is  tested relative to the reference method, provide a side-by-side comparison of
performance versus reference method specifications.

14.0   Pollution Prevention

        Describe aspects of this method that minimize or prevent pollution that may be
attributable to the reference method.

15.0   Waste Management

       Cite how waste and samples are minimized and properly disposed.

16.0   References

        •      Source documents
        •      Publications

17.0   Tables,  Diagrams, Flowcharts and Validation Data

       Additional information may be presented at the end of the method. Lengthy tables may
be included here and referred to elsewhere in the text by number. Diagrams should only
include new or  unusual equipment or aspects of the method.
