You
and
Quality Assurance
In Region 10
EPA910/R-88-100
                   March 1988
             U.S. Environmental Protection Agency
           Region 10 Quality Assurance Management Office
                 1200 6th Ave. ES-095
                 Seattle, WA 98101

-------
                                      BACKGROUND INFORMATION
        The Environmental Protection Agency (EPA) was charged in 1970 to protect this country's land, air, and water
systems.  To do this, the agency initiates, implements, and oversees actions designed to balance man's activities with
nature's ability to maintain all forms of life. EPA and the state and local environmental protection agencies formulate
policies and then implement them through various forms of regulation, guidance, and assistance.

        All of these environmental protection activities must be based on an understanding of the state of the
environment and of the effects man is having. This understanding, in turn, is based upon environmental data collected for
this purpose. Presently EPA spends approximately $500 million/year on data collection activities.

        All of this expense would be wasted if EPA and all of its grantees and contractors did not have procedures in
place to ensure that the data are appropriate. In EPA Order 5360.1, EPA formally adopted a Quality Assurance
Program designed to do just that.  Region 10 implements these concepts with some additional protocols and
interpretation. The rest of this booklet presents the major ideas of quality assurance (QA) and describes the QA
Program as it has evolved.
                                           ACKNOWLEDGMENT

        The Region 10 Quality Assurance Management Office wishes to express its appreciation to Margo
E. Hunt, Ph.D., Environmental Scientist, Air and Water Section, EPA Region II, Edison, New Jersey. Her excellent
booklet "QA and You" provides the basic structure of this document. The RQAMO has changed it only where regional
emphasis or policy dictates.

-------
                                            QA/QC, WHAT IS IT?
        Quality control or QC is the system of checks used to generate excellence or quality in a program or object that
is being produced.  For manufacturing facilities and research laboratories, QC has historically consisted of an aggregate
of system checks on the manufacturing process or the research experiment. It may mean running known negative
(blank) samples, positive (reference) samples, and duplicate samples; the testing of 1-10% of all products that come
off the assembly line; or a system of annual or semi-annual inspections and equipment maintenance.
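
        To make the bench-level checks above concrete, here is a minimal sketch (added for illustration, not part of the
original guidance) of how duplicate and blank results might be screened. The 20% relative-percent-difference limit and
the reporting limit shown are hypothetical, not EPA requirements.

# Minimal QC screening sketch.  Acceptance limits are hypothetical illustrations.

def relative_percent_difference(x1, x2):
    """RPD between a sample result and its duplicate."""
    return abs(x1 - x2) / ((x1 + x2) / 2.0) * 100.0

def duplicate_ok(x1, x2, rpd_limit=20.0):
    """A duplicate pair passes if its RPD is within the stated limit."""
    return relative_percent_difference(x1, x2) <= rpd_limit

def blank_ok(blank_result, reporting_limit=0.1):
    """A negative (blank) sample should show no analyte above the reporting limit."""
    return blank_result < reporting_limit

print(duplicate_ok(5.0, 5.6))   # True  (RPD is about 11%)
print(blank_ok(0.02))           # True  (blank below the 0.1 reporting limit)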

        Quality Control is not the same as Quality Assurance (QA). While QC may ask if we are doing things right,
QA asks if we are doing the right things.  QA is the larger system designed to see that QC is maintained.

        QA integrates Data Quality Objectives (DQOs), Standard Operating Procedures (SOPs), and approved
methodology with written descriptions detailing these features and delineating responsibilities (the QA Plan). QA starts
at the conception of a project or process and extends through to the final product. In other words, QA is management's
effort to maximize the effectiveness of the dollars spent, so what we get counts and is accountable.

        The QA Plan may be only a single sheet filled in prior to collecting the data under an on-going "umbrella" plan,
as in NPDES or TSCA PCB Inspections, or it may be a master document of considerable size for a major CERCLA site.
Regardless of physical size or format, it is the game plan for all of the involved parties to a data generation effort.

-------
                                           EPA'S QA PROGRAM


        The goal of EPA's QA program is to ensure that all data generated and used to make decisions are scientifically
sound, of known quality, thoroughly documented and legally defensible. An additional purpose is to obtain acceptable
data at a reasonable cost. Those groups that receive or wish to receive EPA funding must adhere to quality assurance
standards.

         Within EPA there are three basic divisions: the Program Offices, the Offices of Research and Development,
and the Regions. QA is an ongoing commitment in each.  That commitment is embodied in the organization's Quality
Assurance Program Plan, QAPP (sounds like "qwap").  This stands as the written and signed commitment by
management to QA. The QAPP describes the organization's QA policies, structure, procedures, and work plan.

        The Program Offices implement legislation through the writing of regulations.  They set policy and give
guidance on implementing monitoring actions.

        The Offices of Research and Development (ORD) develop sampling and analytical methods for monitoring the
environment and fund environmental and health effects research and innovative technology.

        The Regions, of which there are ten, oversee the monitoring strategies of the states and territories and develop
their own monitoring strategies.

        In addition, each project started within each office and performed for or funded by EPA must have a written
and approved quality assurance project plan, QAPjP (sounds like "qwap gyp"). In Region 10, this plan must be
reviewed by Regional Quality Assurance Management Office (RQAMO) staff, assisted by ESD experts, and
approved before any allocation of analytical resources can be made in the regional laboratory or through the Contract
Laboratory Program's Sample Management Office.

        Although "quality" is built into the system, proper planning, audits or inspections, and final report reviews are
essential components of determining if the goals of EPA's QA program are being satisfied.

-------
                                                     QAPP


         Each Region's Quality Assurance Program Plan (QAPP) should detail the Region's overall policies, objectives,
organization, and responsibilities.  It is unfortunate that at the end of fiscal year 87 only five Regions, including Region
10, had QAPPs approved.

         As QAPPs serve as written contracts, they must be acknowledged by the Program and Laboratory Directors.
Their signatures indicate their knowledge and approval of the program. This can serve to unify all personnel and,
hopefully, stimulate effective action.  In Region 10, they are subject to critical review and approval by the RQAMO.

         Each QAPP starts with the management's statement of its quality assurance goals. The region's environmental
data collection activities are identified and described along with the data quality objectives.

         QAPPs also should contain the policies and procedures assuring the development of SOPs (standard operating
procedures) and establish an audit program to assess QA implementation.

         Assignment of responsibilities outlined in the QAPP delineates the organization and delegation of authority
within the Region. QA training for all personnel involved in data collection and analysis is the responsibility of the
individual program offices, with the Regional Quality  Assurance Office providing guidance, technical assistance and
oversight.

         The annual review of the QAPP is entitled the QA Annual Report and Work Plan (QAARW). As an integral
part of the QAPP, the QAARW reports on the previous fiscal year's activities, presents the present status of the QA
program, and serves as the basis for the next fiscal year's QA activities through a detailed work plan. All EPA
employees should read their Region's QAPP, which indicates the region's interests, goals, and attention to quality
assurance.

-------
                                       DATA QUALITY OBJECTIVES

        Data Quality Objectives (DQOs) are quantitative and qualitative statements describing the quality of data
needed to support a specific environmental decision or action. These descriptors must be considered by a hierarchy of
decision makers in order to determine whether data are appropriate for a particular application. Because higher level
decision makers may have many other controlling or modifying considerations to interrelate, they must be involved "up
front" in the development of DQOs, and be committed to a reiterative process with the decision makers and technical
staff below them in the hierarchy.  Some of the basic questions that need to be asked are: "Why do we want to do this
project? What information is needed? What is the use of the data? What resources are available?"

        Administratively, a person or stated organizational policy controlling the data usage must define what quality
of data is needed for the specific application intended.  These definitions may already exist in the requirements of an act,
regulation, permit, or order and require only administrative interpretation to be made clear. They are the portion of
the DQOs we are all familiar with but have never formally called "DQOs".

         What we are familiar with as DQOs are the "PARCC" factors: Precision, Accuracy, Representativeness,
Comparability, and Completeness, the standard operating checks and balances that any scientific discipline uses to
evaluate and validate data. These are usually, and correctly, identified as Quality Control.  Correctly documented, they
are the attributes of the data we produce that make the data suitable for feeding back into the decision-making and
planning process, and they provide for defensibility if the data are challenged, either in litigation or as to professional
credibility.

        Additionally, the decision makers (managers and project officers) need to specify the risks and ask: "What is
the cost of poor quality?" The  program and technical staff must ask: "What data are needed? What results are
anticipated? What is the  desired performance  level? What data do we need to complete this project so that enforcement
actions can be taken? Is there a real need for these new data?"  All of these questions, rightly asked, must be revisited
until they are answered properly. DQOs are a starting point for time- and cost-effective project design. Projects may be
accomplished without stated DQOs; however, the resultant data may exceed or fall short of the goals of the project.  This
results in the familiar syndrome of "doing it over"; or we may, from experience, accomplish the goals of the immediate
project.  It has been suggested that we developed this experience at the expense of "doing it over," or by being kept from
that uneconomical process by mentors who have already done so.

        To accomplish this, communication is necessary.  The need and desire for information is not always coupled
with existing technology. For example, what happens if the anticipated toxic level of compound X is 0.002 ug but the
presently available equipment has a limit of detection of 0.2 ug? The decision might be made to postpone the project,
since the resulting data would  most likely not prove useful.  If sufficiently sensitive equipment is available, but at a high
price, the decision might be made to purchase it. In any case,  the technical staff must be aware of anticipated toxic
levels and the management staff must know the equipment potential. The lines of communication between both groups
must be open.
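
        The compound X example above reduces to a simple feasibility check. The sketch below is illustrative only; the
requirement that the detection limit sit a factor of ten below the level of concern is an assumption made for the example,
not an Agency rule.

# Hypothetical feasibility check: can the available method "see" the level of
# concern?  The factor-of-ten margin is an illustrative assumption.

def method_is_adequate(level_of_concern, detection_limit, margin=10.0):
    """Require the detection limit to sit well below the level of concern."""
    return detection_limit * margin <= level_of_concern

# Compound X from the text: 0.002 ug of concern vs. a 0.2 ug detection limit
print(method_is_adequate(0.002, 0.2))   # False -- postpone the project or buy better equipment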

        Data Quality Objectives are analogous to the first steps in the "Scientific Method." The problem must be
identified and possible solutions suggested. Scientifically  valid experiments (non-biased, statistically analyzed, and
thorough) are then proposed. As the " Scientific Method" continues, these experiments are run, analyzed, and reported in
the literature for peer review.

-------
                                  QUALITY ASSURANCE PROJECT PLAN

         Within EPA, we have been charged with maintaining high level quality assurance on any data we produce or
cause to be produced at our expense or in response to our regulations.  We have evolved (and are continuing to evolve)
guidance and formats to help all parties respond to the Agency's requirements.  One of our strongest tools is the Quality
Assurance Project Plan (QAPjP), a formatted document and supporting guidance packages to help project officers,
technical field and laboratory personnel, and the regional legal counsel in planning and executing specific environmental
data generation projects. The completed format assures all participants in the project that they have responded to the
required elements, know what the schedule is, and have, at completion, a summary of how the work was
accomplished.

        Let's take a closer look at the QAPjP document and procedures:

        While each Region has only one Quality Assurance Program Plan, all EPA monitoring and measurement
projects are required to have written descriptions (QAPjPs) of their QA procedures.

        A QAPjP is the written result of up-front reiterative planning by the individuals responsible for the project and
the project officer.  It must be approved by a Quality Assurance Officer (QAO) before any work is begun. A QAPjP is
valuable before the project begins (to allow all interested parties an opportunity to review the plans from a QA
viewpoint), during the project (as a guide for real time QA reviews and audits), and after the project (as a basis for
judging whether the project attained its goals).

        It should be noted that certain operations, i.e., Emergency Response, Spill OSCs, and Criminal Investigation
Operations, are action oriented and controlled. Such operations should have a generic QAPP, but may provide QAPjP
documentation after the fact of field operations.

        A well written QAPjP need not be long (a short, 4 page format is available), but should be precise and
complete. A QAPjP has sixteen sections and, for convenience in writing a QAPjP for a small project, certain of the
sections can be combined, e.g., 6, 8, and 9.

         1.       Title Page

        2.       Table of Contents

        3.       Project Description

        4.       Project Organization and Responsibility

        5.       QA Objectives for Measurement Data

        6.       Sampling Procedures

        7.       Sample Custody

        8.       Calibration Procedures and Frequency

        9.       Analytical Procedures

         10.      Data Reduction, Validation and Reporting

        11.      Internal Quality Control Checks

        12.      Performance and System Audits

        13.      Preventative Maintenance

        14.      Routine Procedures Used to Assess Data

        15.      Corrective Action

        16.      QA Reports to Management

The Title Page cites the project title and the names of the individual or individuals requesting the project,
including the project officer and the quality assurance officer. Their signatures and dates of signing on the title
page denote project approval.  For an extramural project, both the performing organization's and the funding
organization's project officers and quality assurance officers need to sign.

A Table of Contents is necessary for long QAPjPs but not for a "short form."

 The Project Description should provide a general description of the project: a map of the location, a list of the
administrative objectives, data usage, experimental design, monitoring parameters, frequency of collection,
sample preservation, holding times, and anticipated start and completion dates. All analytical methods should
be referenced at this point.  Project fiscal information may be included if desired.

Project Organization  lists the key personnel involved with the project and their corresponding responsibilities.
An organizational chart should be supplied with lines of authority duly noted.

In the section QA Objectives for Measurement Data, the historic "PARCC" factors, the data quality objectives
of Precision (P), Accuracy (A), Representativeness (R), Completeness (C), and Comparability (C), should be
addressed for each parameter.

Precision is the measure of agreement among repetitive measurements of the same sample.  Accuracy is the
degree of agreement of an experimental measurement with an accepted standard reference.  For example, in a
game of darts, accuracy would be hitting the bull's eye, while precision would be how tightly the separate
darts cluster together, whether or not they land near the bull's eye.

Representativeness is the degree to which data accurately represent a particular characteristic of a population
or environmental parameter; e.g., lake sediment studies may not aid research on ocean sediment problems.

Completeness is the measure of how the amount of valid data obtained from a measurement system compares to
the expected amount. At some point a project may be canceled if not enough valid data are collected.

Comparability is the measure of confidence with which one set of results can be compared to another, whether
daily results within one experiment or results from different experiments.  You cannot compare apples to
oranges, or Mallard ducks to Canada geese.
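
        Three of the "PARCC" factors reduce to simple arithmetic. The short sketch below is added for illustration; the
statistics are standard, but the example values are hypothetical.

# Illustrative calculations for precision, accuracy, and completeness.
from statistics import mean, stdev

def precision_rsd(replicates):
    """Precision as relative standard deviation (%) of repeated measurements."""
    return stdev(replicates) / mean(replicates) * 100.0

def accuracy_recovery(measured, reference_value):
    """Accuracy as percent recovery against an accepted reference value."""
    return measured / reference_value * 100.0

def completeness_pct(valid_results, planned_results):
    """Completeness as the percentage of planned results that proved valid."""
    return valid_results / planned_results * 100.0

print(round(precision_rsd([9.8, 10.1, 10.0, 9.9]), 1))         # 1.3 (% RSD)
print(accuracy_recovery(measured=9.5, reference_value=10.0))   # 95.0 (% recovery)
print(completeness_pct(valid_results=45, planned_results=50))  # 90.0 (%)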

The Sampling Procedures section describes the details of sampling based on the predetermined understanding
of the site and the administrative requirements of the data sought.  Decontamination and preparation of
sampling containers and sampling equipment must be defined. The appropriate field Quality Control (QC)
samples must be identified, and type and frequency specified.

In the Sample Custody and Documentation section, the samples must be identified or tagged and, for legal
purposes, a written log of physical sample possession must be maintained and sample transfer documentation
must be addressed.  A few of the items to be noted are: identity of the sampler, date, time, type of container,
method of collection, and preservation method.
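
        As an illustration of the custody items just listed, the sketch below expresses them as a simple record; the field
names and example values are hypothetical, and actual custody forms are program-specific.

# Sketch of a chain-of-custody record carrying the items noted above.
from dataclasses import dataclass, field
from typing import List

@dataclass
class CustodyTransfer:
    released_by: str
    received_by: str
    date_time: str

@dataclass
class CustodyRecord:
    sample_id: str
    sampler: str
    date_time_collected: str
    container_type: str
    collection_method: str
    preservation: str
    transfers: List[CustodyTransfer] = field(default_factory=list)

record = CustodyRecord("R10-88-0042", "J. Doe", "1988-03-15 08:10",
                       "1-L amber glass", "grab", "HCl to pH < 2, cool to 4 C")
record.transfers.append(CustodyTransfer("J. Doe", "Regional Laboratory", "1988-03-15 16:30"))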

        Calibration Procedures and Frequency must be duly noted in the QAPjP. They may be included by citation of
        manufacturers' instruction and operation manuals or of established standard operating procedures (SOPs).
        Unless written down, these are easily dismissed as unnecessary but, in truth, are critical. The frequency of
        cleaning and calibration should be dictated and followed rigorously.  A sampling device that opens prematurely
        must be repaired. Considering the sensitivity of modern analytical methods and the extremely low levels of
        detection sought for some contaminants, the analysis actually starts with the rigorous cleaning of sample
        containers and sampling equipment.

        The Analytical Procedures section references the standard operating procedures (SOPs) or provides a written
        description. As mentioned previously, this section, as well as Calibration Procedures, can be combined with
        Sampling Procedures. Whether separate or combined, all of the sampling and analytical procedures must be
        documented in detail (or cited in an "umbrella plan" filed by the program).

        The Data Reduction, Validation and Reporting section should detail for each measurement parameter the
        calculation methods used, the means used to validate the data during collection and reporting, and the treatment
        of data outliers when reporting the data.  Consistent data reporting is necessary. The criteria used to accept,
        qualify or reject data should be documented as well.
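
        One way to make the acceptance criteria explicit is to write them down as decision rules. The sketch below is
illustrative only; the recovery window and holding-time rule are assumed for the example, and the flags ("A", "J", "R")
are placeholders rather than prescribed codes.

# Illustrative accept/qualify/reject logic against documented criteria.
# The limits and flag letters are assumptions for the sketch.

def qualify(result, spike_recovery_pct, holding_time_ok):
    if not holding_time_ok:
        return (result, "R")        # reject: holding time exceeded
    if not 75.0 <= spike_recovery_pct <= 125.0:
        return (result, "J")        # qualify: associated QC outside limits
    return (result, "A")            # accept

print(qualify(4.2, spike_recovery_pct=98.0, holding_time_ok=True))   # accepted
print(qualify(4.2, spike_recovery_pct=60.0, holding_time_ok=True))   # qualified
print(qualify(4.2, spike_recovery_pct=98.0, holding_time_ok=False))  # rejected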

        Internal Quality Control Checks addresses that part of data validation that includes the internal system of
        checks and balances such as replicate samples, spiked samples, split samples, blanks and control samples in
        the analytical process (see Sampling Procedures, above).  These are the acceptance criteria for the "PARCC"
        factors.

        Performance and Systems Audits are inspections performed by the QAO to monitor the ongoing project. These
        are the real-time checks that the QAPjP is being followed.  An audit may be requested for reasonable cause by
        any party involved in the data generation, whether administrative, field, or laboratory staff.

        Preventative Maintenance should list the routine and documented care of equipment and materials used in the
        sampling effort or in the analyses of the samples.  Both may be covered by  SOPs, including manufacturer
        supplied manuals and procedures.

        Routine Procedures Used To Assess Data should address the specific procedures used to assess data precision
        and accuracy,  e.g., the t-test, Chi-square test.
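
        As one illustration of the t-test named above, the sketch below compares replicate measurements of a reference
standard to its accepted value; the data and the 95% critical value (for three degrees of freedom) are given only as an
example.

# One-sample t statistic: do replicate measurements of a 10.0 ug/L standard
# show a significant bias?  Example data; the critical value 3.182 is the
# two-tailed 95% point for n - 1 = 3 degrees of freedom.
from statistics import mean, stdev
from math import sqrt

def t_statistic(replicates, reference_value):
    n = len(replicates)
    return (mean(replicates) - reference_value) / (stdev(replicates) / sqrt(n))

replicates = [9.6, 9.8, 9.5, 9.7]
t = t_statistic(replicates, 10.0)
print(round(t, 2), "bias is significant" if abs(t) > 3.182 else "no significant bias")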

        Corrective Action identifies the procedures, methods, and/or actions taken to correct identified deviations
        from the QA Objectives for Measurement Data, designates the individual to contact if something goes wrong
        (e.g., the acceptance limits are not met or it rains on sampling day), and documents the procedures that
        should be followed.  It may include the planned measures that will be implemented in the event that
        equipment failure, weather conditions, or personnel illnesses should interfere with the project.

         QA Reports to Management, the final section, represents the written commitment to inform all interested
        individuals of either the progress or the final summation of the project. The last step in any scientific endeavor
        is publication. Since this is a project plan emphasizing QA, the results of audits, along with any significant QA
        problems and what was done to address these problems, should be mentioned.

         The QAPjP acts as an historical document denoting the details of the project that may not be noted in the final
report. However, those details are essential for the defensibility of data in matters of litigation, and the QAPjP provides
the base tracking document for the case file.

-------
                                                   AUDITS
         The greatest value of audits is that they allow knowledgeable "outsiders" to review actual performance of an
organization or performance of a project while it is taking place, throughout the life of a project and across all levels of
project management. Independent audits bring a fresh point of view to a program or project in an effort to ensure that
the documented QAPP, QAPjP, SOPs, etc., are being implemented appropriately. Audits should be an integral part of
every EPA-performed, -funded, or -required project, from the concept of or regulatory requirement for data generation
through the final report.

         QA should ensure that the quality of the final product meets the standards set and that there is confidence in the
product and data to back it. A good QA program creates a "quality" image for the organization and enhances the
professional reputation of those who work for that company or agency.

         Finally, as we are a regulatory agency, the audit process  is vital to the concept of defensible evidence for
presentation in court.

         Audits are performed by either internal or external quality assurance officers to review the implementation of
either a QAPP or QAPjP. Four specific kinds of audits will be used at appropriate times by Region 10 to determine the
status of the measurement systems, the adequacy of the data collection systems, the completeness of documentation of
data collection activities and the abilities of the program management to meet the mandated data collection and data
quality objectives.  These four audit types are respectively, Performance Audits, Technical System Audits, Data Quality
Audits and Management System Audits.


                 Performance Audits (PA) are generally based on Performance Evaluation (PE) samples.  Samples
                 having known concentrations may be tested as unknowns in the service laboratory, or a sample may be
                 analyzed for the presence of certain compounds.  Performance audits are used to determine
                 objectively whether an analytical measurement system is operating within established control limits at
                 the time of the audit. The performance of personnel and instrumentation is tested by the degree of
                 accuracy obtained (a simple scoring sketch follows these audit descriptions).

                 Technical System Audits (TSA) are qualitative onsite audits that evaluate the technical aspects of field
                 operations against the requirements of the approved QAPjP. TSA reports will note any problems,
                 allowing corrective action to be taken to protect the validity of future data.

                 Data Quality Audits (DQA) are evaluations of the documentation associated with data quality
                 indicators of measurement data to verify that the generated data are of known and documented quality.
                 This is an important part of the validation of data packages showing that the methods and SOPs
                 designated in the QAPjP were followed, and that the resulting data set is a functional part of satisfying
                 the established DQOs.  The results are vital to decisions regarding the legal defensibility of the data
                 should it be challenged in litigation.

                 A Management System Audit (MSA) is a formal review of an entire program, e.g., a review of a
                 state's QA program or a state-contracted laboratory. In an MSA, key elements in the program, e.g., the
                 lab certification program, QC in field operations, and QC in the certified lab, are evaluated to  see if
                 QA is being implemented. If deficiencies are detected, corrective actions are suggested and
                 implementation monitored.  MSAs should be performed annually.
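
        The scoring sketch referred to under Performance Audits is given below. It is illustrative only: in practice the
acceptance window comes from the PE sample provider, and the 80-120 percent recovery limits used here are
assumptions.

# Sketch of scoring a performance-evaluation (PE) sample result against an
# acceptance window.  The window is a hypothetical example.

def score_pe_result(reported, true_value, lower_pct=80.0, upper_pct=120.0):
    recovery = reported / true_value * 100.0
    verdict = "within control limits" if lower_pct <= recovery <= upper_pct else "outside control limits"
    return recovery, verdict

print(score_pe_result(reported=47.0, true_value=50.0))   # (94.0, 'within control limits')
print(score_pe_result(reported=31.0, true_value=50.0))   # (62.0, 'outside control limits')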

-------
                                    QA ROLES AND RESPONSIBILITIES

        For quality assurance to work, responsibility must be established for the continued successful maintenance of
the QA/QC program. The following is an abbreviated list of QA responsibilities for major EPA personnel.

Regional Administrator (RA): EPA Order 5360.1 places overall regional QA responsibility in the hands of the RA. QA
programs and needs should be recognized and funds must be made available by the RA.

Division Directors (DD): Division Directors follow QA guidelines to enforce data quality criteria and QAPjP write-ups
for all in-house and regional projects.

Regional QA Officer (QAO): The QAO manages the Regional Quality Assurance Management Office (RQAMO) and is
the QA contact person for the Quality Assurance Management System (QAMS), the Environmental Monitoring System
Laboratories (EMSLs), and the other Regions.  QA guidance and training is provided by the QAO.  This individual is
the one person responsible for maintaining the program.  In Region 10, the QAO has developed a strong professional
staff to implement an active advise and consent QA Program, including the Regional Sample Control Center to
coordinate activities with the CLP/SMO.  The QAO works closely with the ESD Training Program to promote
awareness of QA methods and requirements.

Branch Chiefs: Branch Chiefs must inform the QAO of all monitoring projects in the region and arrange for employee
training in QA.  Branch Chiefs involved with data collection also should ensure that all data collection procedures are
documented, QC policies are followed, SOPs are written, and unacceptable data are rejected.

Project Managers (PM) or Project Officers (PO): PMs/POs are to insist that QA requirements are incorporated into all
cooperative or inter-agency agreements and that QAPPs, QAPjPs, SOPs, QC results and DQ reports are prepared and
provided to the RQAMO at the appropriate time, i.e., either before, during, or after the project is completed.

Grants Officers: These individuals may not award grants without a preexisting approved QAPP or QAPjP.

Contracts Officers: Approval of contracts by these officers is dependent upon completion of the appropriate QA form
signed by the PO and QAO.

Quality Assurance Officers (QAO): A QAO has authority to sign a proposed QAPjP before the start of the project (or
later in certain specified operations involving emergency response or criminal investigation) after determining whether
the principles of QA have been met.  The QAO does not prepare a QAPjP. Audits are performed by the QAO.

        Administratively, QA/QC activities are the sole and direct responsibility of the Program generating and
collecting the data. Such projects can be more easily and efficiently accomplished with the comprehensive oversight and
technical assistance of the RQAMO.

-------
                                                IN REVIEW

        QA is a management tool that brings together management, technical, and QA personnel to ensure that data
collected by the organization and/or project will be useful for the decision-making process.

        A good QA program ensures that all procedures, data, and decisions are well documented  and that the
documented procedures are followed.  A good program contains well defined organizational responsibilities and entrusts
those responsible with the appropriate authority. It contains provision for up-front planning (QAPjPs), in-process
reviews (audits), and final summaries (report reviews).  Finally, a good QA program encourages all involved parties,
from senior management throughout the staff, to understand and learn from any mistakes to ensure that future projects
will produce better data more efficiently.

        The success of investigative and monitoring programs invariably stems from one of two sources: sheer luck, or
a careful integration of the actual data collection function with a complementary quality assurance component.  When
one considers the resource and legal jeopardies inherent in the former approach, it becomes evident that the latter
mode is the preferred one.
        [Diagram: Program Plan, Project, Audit, and Report, with RQAMO lines of authority and oversight.]
        In the final analysis, data generated without attention to QA and its requirements are good for only one thing:
an excuse to go back and do the job right, if you still have the time and resources!

-------