United States
Environmental Protection Agency
Office of Air Quality Planning and Standards
Research Triangle Park, NC 27711

EPA-454/R-98-005
April 1998

Air

Quality Assurance Guidance Document

Model Quality Assurance Project Plan
for the PM2.5 Ambient Air
Monitoring Program at State and
Local Air Monitoring Stations
(SLAMS)

                                            Foreword
      EPA policy requires that all projects involving the generation, acquisition, and use of environmental data be
  planned and documented and have an Agency-approved quality assurance project plan or QAPP prior to the start of
  data collection. The primary purpose of the QAPP is to provide an overview of the project, describe the need for the
  measurements, and define QA/QC activities to be applied to the project, all within a single document. The QAPP
  should be detailed enough to provide a clear description of every aspect of the project and include information for
  every member of the project staff, including samplers, lab staff, and data reviewers. The QAPP facilitates
  communication among clients, data users, project staff, management, and external reviewers. Effective
  implementation of the QAPP assists project managers in keeping projects on schedule and within the resource budget.
  Agency QA policy is described in the Quality Manual and EPA QA/R-1, EPA Quality System Requirements for
  Environmental Programs.
 The following document represents a model QAPP for the environmental data operations involved in
 monitoring for PM2.5 under the Ambient Air Quality Monitoring Network. Due to the accelerated time frame
 for implementation of this program, OAQPS, in cooperation with the EPA Regions and State and local
 organizations, developed this Model QAPP to serve as an example of the type of information and detail
 necessary for the QAPPs submitted by State and local organizations to EPA Regions.

 This model QAPP was generated using the EPA QA regulations and guidance as described in EPA QA/R-5,
 EPA Requirements for Quality Assurance Project Plans, and the accompanying document EPA QA/G-5,
 Guidance for Quality Assurance Project Plans. All pertinent elements of the QAPP regulations and
 guidance are addressed in this model. The model also contains background information and a rationale
 for each element, which are excerpts from EPA QA/G-5 and are included in text brackets (as seen
 above), usually found at the beginning of a section or subsection.

 The Model QAPP must not, and cannot, be used verbatim. Although the PM2.5 regulations (40 CFR
 Parts 50, 53 and 58) and guidance (Network Guidance and Guidance Document 2.12) were used in the
 development of the document, many elements are unique to each organization and must be addressed at that
 level. Other elements may meet the organization's needs and can be used as such. Also, there are other ways
 for organizations to meet the data quality needs of the PM2.5 Monitoring Program. Therefore, State and local
 organizations have the flexibility to develop their own QAPPs that meet their needs and are considered
 acceptable by their Regional QA Manager.

 Due to the tight time frame required to generate this document, standard operating procedures (SOPs) for
 various data collection activities could not be developed. The QAPP elements or sections allude to these
 SOPs as if they would be included as part of the QAPP, or referenced to internal Department documents for
 which this Model QAPP was developed. Appendix E provides a listing of the SOPs that would be included or
 available for the PM2.5 QAPP. SOPs would be developed based upon the document titled Guidance for the
 Preparation of Standard Operating Procedures for Quality Related Operations, EPA QA/G-6. This
 document, as well as the other EPA QA/G and QA/R documents, is available on the EPA QA Division
 Homepage (http://es.epa.gov/ncerqa/qa).

This document has been reviewed by EPA Regional QA Managers and/or QA Officers and was found to
provide enough detail of the PM2.5 program to be considered acceptable (see following approval page).
Therefore, QAPPs of similar detail developed by State and local organizations should meet approval.

The document mentions trade names and brand names that are both real and fictitious. Usually these names
will be associated with the "®" symbol. Mention of corporation names, trade names, or commercial products
does not constitute endorsement or recommendation for use.

                                  Acknowledgments
This Model QAPP is the product of the combined efforts of the EPA Office of Air Quality Planning and
Standards, the EPA National Exposure Research Laboratory, the EPA Regional Offices, and the State and
local organizations. The development and review of the material found in this document was accomplished
through the activities of the PM2.5 QA Workgroup. The following individuals are acknowledged for their
contributions.

State and Local Organizations

George Apgar, State of Vermont, Waterbury, VT
Randy Dillard, Jefferson County Department of Health, Birmingham, AL
Dennis Mikel, Ventura County APCD, Ventura, CA
Gordon Pierce and Kevin Goohs,  Colorado Department of Public Health & Environment, Denver, CO
Alice Westerinen, Russell Grace and Tom Pomales, California Air Resources Board,  Sacramento, CA
Jeff Miller, Pennsylvania Department of Environmental Protection,  Harrisburg, PA
Richard Heffern, State of Alaska Department of Environmental Conservation, Juneau, AK
Dan Harman, North Dakota Department of Health, Bismarck, ND

EPA Regions

Region
    1   Don Porteous, Norman Beloin, Mary Jane Cuzzupe
    2   Clinton Cusick
    3   Victor Guide, Theodore Erdman
    4   Jerry Burger, Herb Barden
    5   Mary Ann Suero, Gordon Jones, Mike Rizzo, Basim Dihu,  Ruth Tatom
    6   Mary Kemp, Mark Sather, Kuenja Chung, Timothy Dawson
    7   Leland Grooms,  Mike Davis, Shane Munsch
    8   Ron Heavner, Gordan MacRae, Joe Delwiche
    9   Manny Aquitania, Bob Pallarino
    10  Barry Towns, Bill Puckett

National Exposure Research Laboratory

Frank McElroy, David Gemmill

Research Triangle Institute

Jim Flanagan, Cynthia Salmons, Gary Eaton, Bob Wright

Office of Air Quality Planning and Standards

Shelly Eberly, Tim Hanley, David Musick, Mark Shanis

                             Acronyms and Abbreviations
AIRS          Aerometric Information Retrieval System
ANSI          American National Standards Institute
APTI          Air Pollution Training Institute
ASTM         American Society for Testing and Materials
AWMA        Air and Waste Management Association
CAA          Clean Air Act
CFR           Code of Federal Regulations
CMD          Contracts Management Division
CMZ          community monitoring zone
CO            Contracting Officer
COC          chain of custody
DAS           data acquisition system
DCO          Document Control Officer
DQA          data quality assessment
DQOs         data quality objectives
EDO          environmental data operation
EMAD         Emissions, Monitoring, and Analysis Division
EPA           Environmental Protection Agency
FAR           Federal Acquisition Regulations
FEM          Federal equivalent method
FIPS           Federal Information Processing Standards
FRM          Federal reference method
GIS            geographical information systems
GLP           good laboratory practice
LAN           local area network
MPA          monitoring planning area
MQOs         measurement quality objectives
MSA          metropolitan statistical area
MSR          management system review
NAAQS        National Ambient Air Quality Standards
NAMS         national air monitoring  station
NIST          National Institute of Standards and Technology
OAQPS        Office of Air Quality Planning and Standards
OARM         Office of Administration and Resources Management
ORD           Office of Research and Development
PC            personal computer
POC           pollutant occurrence code
PD            percent difference
PE            performance evaluation
PM2.5          particulate matter ≤ 2.5 micrometers
PTFE          polytetrafluoroethylene
Qa             sampler flow rate at ambient (actual) conditions of temperature and pressure.
QA/QC         quality assurance/quality control
QA            quality assurance
QAAR         quality assurance annual report
QAD          quality assurance division director
QAM          quality assurance manager
QAO          quality assurance officer

QAPP          quality assurance project plan
QMP          quality management plan
SIPs           State Implementation Plans
SLAMS        State and Local Air Monitoring Stations
SOP           standard operating procedure
SOW          statement or scope of work
SPMS          special purpose monitoring stations
SYSOP        system operator
Ta             temperature, ambient or actual
TSA           technical system audit
TSP           total suspended particulate
Va             air volume, at ambient or actual conditions
VOC           volatile organic compound
WAM          Work Assignment Manager

                                                 Tables

 3-1    Distribution List
 6-1    Design/Performance Specifications
 6-2    Field Measurement Requirements
 6-3    Additional Field Measurements
 6-4    Laboratory Performance Specifications
 6-5    Laboratory Measurements
 6-6    Assessment Schedule
 6-7    Schedule of Critical PM2.5 Activities
 6-8    Critical Documents and Records
 7-1    Summary of Case 1 and 2 Parameters
 7-2    Measurement System Decision
 7-3    Measurement System Decision
 7-4    Measurement Quality Objectives - Parameter PM2.5
 8-1    Department of Health Training Requirements
 8-2    Core Ambient Air Training Courses
 9-1    Reporting Package Information
 9-2    PM2.5 Summary Report Ranges
 10-1   Schedule of PM2.5 Sampling Related Activities
 10-2   Sample Set Up, Run and Recovery Dates
 10-3   Identifying Information for Palookaville PM2.5 Sampler Locations
 11-1   Sample Set-up, Run and Recovery Dates
 11-2   Supplies at Storage Trailers
 11-3   Site Dependent Equipment and Consumables
 11-4   Field Corrective Action
 11-5   Filter Temperature Requirements
 11-6   Holding Times
 12-1   Parameter List
 12-2   Sample Custodians
 12-3   Federal Express Locations
 13-1   Potential Problems/Corrective Action
 13-2   Filter Preparation and Analysis Checks
 13-3   Temperature Requirements
 14-1   Field QC Checks
 14-2   Laboratory QC
 14-3   Sample Batch
 14-4   Batch Sample Distribution
 14-5   Control Charts
 15-1   Inspections in the Weigh Room Laboratory
 15-2   Inspections of Field Items
 15-3   Preventive Maintenance in Weigh Room Laboratories
 15-4   Preventive Maintenance of Field Items
 16-1   Standard Materials and/or Apparatus for PM2.5 Calibration
 17-1   Critical Supplies and Consumables
 17-2   Acceptance Criteria for Supplies and Consumables
 19-1   List of the Palookaville Dept. of Health SOPs for PM2.5 Data Processing Operations
 19-2   Validation Check Summaries
 19-3   Raw Data Calculations
 19-4   Data Transfer Operations
 19-5   Data Reporting Schedule
 19-6   Report Equations
 19-7   Sample Batch Quality Control Flags
 19-8   Data Archive Policies
 20-1   Assessment Summary
 21-1   Quarterly Reporting Schedule
 23-1   Single Flag Validation Criteria for Single Samples
 23-2   Single Sample Validation Template
 23-3   Validation Template
 24-1   Summary of Violation of DQO Assumptions
 24-2   Weights for Estimating Three-Year Bias and Precision
 24-3   Summary of Bias and Precision

                                                 Figures
 4.1    Organizational Structure of Palookaville Department of Health for PM2.5 Monitoring
 7.1    Annual arithmetic mean and 24-hour 98th percentile associated with selected data sets
 7.2    Comparison of normal and lognormal density functions at low measurement errors
 7.3    Comparison of normal and lognormal density functions at higher measurement errors
 10.1   Geography, population, and PM10 sampler locations for Palookaville
 10.2   PM2.5 sites for Palookaville
 12.1   Example filter chain of custody record
 12.2   Filter archive form
 12.3   Chain of custody phases
 12.4   Filter ID
 14.1   Quality control and quality assessment activities
 14.2   PM2.5 quality control scheme
 19.1   PM2.5 data flow diagram
 20.1   Audit activities
 20.2   Audit finding form
 20.3   Audit response form

                                Region Approval
This Model QAPP has been reviewed by EPA Regional QA Managers and/or QA Officers and
was found to provide enough detail of a program-specific QAPP for a PM2.5 monitoring program
to be considered acceptable.
 [Signature and date blocks for the reviewing EPA Regional QA Managers appear here.]

                                                                        Project: Model QAPP
                                                                        Element No: 1
                                                                        Revision No: 1
                                                                        Date: 4/17/98
              1.0 QA Project Plan Identification and Approval
       The purpose of the approval sheet is to enable officials to document their approval of the QAPP. The title
 page (along with the organization chart) also identifies the key project officials for the work. The title and approval
 sheet should also indicate the date of the revision and a document number, if appropriate.
Title:  Palookaville Department of Health QA Project Plan for the PM2.5 Ambient Air
        Monitoring Program.

The attached QAPP for the PM2.5 Ambient Air Quality Monitoring Program is hereby
recommended for approval and commits the Department to follow the elements described within.
Palookaville Department of Health

1) Signature:	 Date:
Jeff Samuelson - Technical Manager - Monitoring Division
2) Signature:	 Date:
Linda Toughy - QA Manager - QA Branch
EPA Region Y

1) Signature:	 Date:
Bill Smiley - Technical Project Officer - Air Monitoring Branch
2) Signature:	 Date:
George Benson - QA Officer - QA Branch

                                                                        Project: Model QAPP
                                                                        Element No: 2
                                                                        Revision No: 1
                                                                        Date: 4/18/98
                                    2.0 Table of Contents
    The table of contents lists all the elements, references, and appendices contained in a QAPP, including a list of
tables and a list of figures that are used in the text.  The major headings for most QAPPs should closely follow the
list of required elements.  While the exact format of the QAPP does not have to follow the sequence given here, it is
generally more convenient to do so, and it provides a standard format to the QAPP reviewer. Moreover, consistency
in the format makes the document more familiar to users, who can expect to find a specific item in the same place in
every QAPP. The table of contents of the QAPP may include a document control component. This information
should appear in the upper right-hand corner of each page of the QAPP when document control format is desired.
 Section

 Foreword
 Acknowledgments
 Acronyms and Abbreviations
 Tables
 Figures
 Region Approval

 A. PROJECT MANAGEMENT

 1. Title and Approval Page

 2. Table of Contents

 3. Distribution List

 4.  Project/Task Organization
     4.1  Roles and Responsibilities

 5.  Problem Definition/Background
     5.1  Problem Statement and Background

 6.  Project/Task Description
     6.1  Description of Work to be Performed
     6.2  Field Activities
     6.3  Laboratory Activities
     6.4  Project Assessment Techniques
     6.5  Schedule of Activities
     6.6  Project Records

 7.  Quality Objectives and Criteria for Measurement Data
     7.1  Data Quality Objectives
     7.2  Measurement Quality Objectives

 8.  Special Training Requirements/Certification
     8.1  Training
     8.2  Certification

9.  Documentation and Records
    9.1 Information Included in the Reporting Package
    9.2 Data Reporting Package and Documentation Control
    9.3 Data Reporting Package Archiving and Retrieval

B. MEASUREMENT/DATA ACQUISITION

10.  Sampling Design
    10.1 Scheduled Project Activities, Including Measurement
        Activities
    10.2 Rationale for the Design
    10.3 Design Assumptions
    10.4 Procedure for Locating and Selecting
        Environmental Samples
    10.5 Classification of Measurements as Critical/Noncritical
    10.6 Validation of Any non-standard Measurements

11.  Sampling Methods Requirements
    11.1 Purpose/Background
    11.2 Sample Collection, Preparation, Decontamination
        Procedures
    11.3 Support Facilities for Sampling Methods
    11.4 Sampling/Measurement System Corrective Action
    11.5 Sampling Equipment, Preservation, and Holding Time
        Requirements

12.  Sample Custody
    12.1 Sample Custody Procedure

13.  Analytical Methods Requirements
    13.1 Purpose/Background
    13.2 Preparation of Samples
    13.3 Analysis Methods
    13.4 Internal QC and Corrective Action for Measurement System
    13.5 Sampling Equipment, Preservation, and Holding Time
        Requirements

14.  Quality Control Requirements
    14.1 QA Procedures
    14.2 Sample Batching-QC Sample Distribution
    14.3 Control Charts

15.  Instrument/Equipment Testing, Inspection, and Maintenance
    Requirements
    15.1 Purpose/Background
    15.2 Testing
    15.3 Inspection
    15.4  Maintenance

16. Instrument Calibration and Frequency
     16.1 Instrumentation Requiring Calibration
     16.2 Calibration Methods
     16.3 Calibration Standard Materials and Apparatus
     16.4 Calibration Standards
     16.5 Calibration Frequency

 17. Inspection/Acceptance for Supplies and Consumables
     17.1 Purpose
     17.2 Critical Supplies and Consumables
     17.3 Acceptance Criteria
     17.4 Tracking and Quality Verification of Supplies and
         Consumables

 18.  Data Acquisition Requirements (non-direct measurements)
     18.1 Acquisition of Non-Direct Measurement Data

 19. Data Management
     19.1 Background and Overview
     19.2 Data Recording
     19.3 Data Validation
     19.4 Data Transformation
     19.5 Data Transmittal
     19.6 Data Reduction
     19.7 Data Analysis
     19.8 Data Flagging-Sample Qualifiers
     19.9 Data Tracking
     19.10 Data Storage and Retrieval

C. ASSESSMENT/OVERSIGHT

 20. Assessments and Response Actions
    20.1 Assessment Activities and Project Planning
    20.2 Documentation of Assessment

21.  Reports to Management
    21.1 Frequency, Content, and Distribution of Reports
    21.2 Responsible Organizations

D. DATA VALIDATION AND USABILITY

22. Data Review, Validation and Verification Requirements
    22.1 Sampling Design
    22.2 Sample Collection Procedures
    22.3 Sample Handling
    22.4 Analytical Procedures
    22.5 Quality Control
    22.6 Calibration
    22.7 Data Reduction and Processing

 23. Validation and Verification Methods
    23.1 Process for Validating and Verifying Data
24. Reconciliation with Data Quality Objectives
    24.1 Reconciling Results with DQOs
Appendices

A. Glossary
B. Training Certification Evaluation Forms
C. Analytical and Calibration Procedures (SOPs)
D. Data Qualifiers/Flags
E. Standard Operating Procedures
F. Reference Material and Guidance Documents

                                                                        Project: Model QAPP
                                                                        Element No: 3
                                                                        Revision No: 1
                                                                        Date: 4/17/98
                                       3.0 Distribution
    All the persons and document files designated to receive copies of the QAPP, and any planned future revisions,
need to be listed in the QAPP. This list, together with the document control information, will help the project
manager ensure that all key personnel in the implementation of the QAPP have up-to-date copies of the plan. A
typical distribution list appears in Table 3-1.
A hardcopy of this QAPP has been distributed to the individuals in Table 3-1. The document is
also available on the Department's local area network (LAN) for anyone interested.
Table 3-1 Distribution List

Name                  Position                         Division/Branch

Palookaville Department of Health
Linda Toughy          QA Division Director             QA Division
Philip Magart         Air QA Branch                    QA/Air QA
John Dinsmore         QA Officer (auditing)            QA/Air QA
Jeff Samuelson        Technical Division Director      Technical
Joe Manard            Ambient Air Monitoring Branch    Technical/Ambient Air Monitoring
Bill Macky            Field Technician                 Technical/Ambient Air Monitoring
Karin Porter          Field Technician                 Technical/Ambient Air Monitoring
Beverly Deston        Field Technician                 Technical/Ambient Air Monitoring
Angelista Medron      Data Manager                     Technical/Ambient Air Monitoring
Delbert Boyle         Program Support Division         Program Support
Jason Chang           Shipping/Receiving Branch        Program Support/Shipping & Rec.
Sonny Marony          Clerk                            Program Support/Shipping & Rec.
Mike Smather          Laboratory Branch                Technical/Laboratory
Fred Nottingham       Lab Technician                   Technical/Laboratory

EPA Region Y
Bill Smiley           Technical Project Officer        Air/Air Quality Monitoring
George Benson         QA Officer                       Air/QA

                                                                        Project: Model QAPP
                                                                        Element No: 4
                                                                        Revision No: 1
                                                                        Date: 4/17/98
                           4.0 Project/Task Organization
     The purpose of the project organization is to provide EPA and other involved parties with a clear understanding
 of the role that each party plays in the investigation or study and to provide the lines of authority and reporting for the
 project.
4.1 Roles and Responsibilities
    The specific roles, activities, and responsibilities of participants, as well as the internal lines of authority and
 communication within and between organizations, should be detailed.  The position of the QA Manager or QA Officer
 should be described. Include the principal data users, the decision-maker, project manager, QA manager, and all
 persons responsible for implementation of the QAPP.  Also included should be the person responsible for maintaining
 the QAPP and any individual approving deliverables other than the project manager. A concise chart showing the
 project organization, the lines of responsibility, and the lines of communication should be presented.  For complex
 projects, it may be useful to include more than one chart—one for the overall project (with at least the primary contact)
 and others for each organization.
Federal, State, Tribal and local agencies all have important roles in developing and implementing
satisfactory air monitoring programs.  As part of the planning effort, EPA is responsible for developing
National Ambient Air Quality Standards (NAAQS), defining the quality of the data necessary to make
comparisons to the NAAQS, and identifying a minimum set of QC samples from which to judge data quality.
The State and local organizations  are responsible for taking this information and developing and
implementing a quality system that will meet the data quality requirements. Then, it is the responsibility of
both EPA and the  State and local organizations to assess the quality of the data and take corrective action
when appropriate.  The responsibilities of each organization follow.

4.1.1 Office of Air Quality Planning and Standards (OAQPS)

OAQPS is the organization charged under the authority of the Clean Air Act (CAA) to protect
and enhance the quality of the nation's air resources. OAQPS sets standards for  pollutants
considered harmful to public health or welfare and, in cooperation with EPA's Regional Offices
and the States, enforces compliance with the standards through state implementation plans (SIPs)
and regulations controlling emissions from stationary sources. The OAQPS evaluates the need to
regulate potential air pollutants and develops  national standards; works with  State and local
agencies to develop plans for meeting these standards; monitors national air quality trends and
maintains a database of information on air pollution and controls; provides technical guidance and
training on air pollution control strategies; and monitors compliance with air  pollution standards.

Within the OAQPS Emissions Monitoring and Analysis Division, the Monitoring and Quality
Assurance Group (MQAG) is responsible for the oversight of the Ambient Air Quality Monitoring
Network. MQAG has the following responsibilities:

     •   ensuring that the methods and procedures used in making air pollution measurements are
         adequate to meet the program's objectives and that the resulting data are of satisfactory
         quality
     •   operating the National Performance Audit Program (NPAP) and the FRM Performance
         Evaluation
     •   evaluating the performance, through technical systems audits and management systems
         reviews, of organizations making air pollution measurements of importance to the
         regulatory process
     •   implementing satisfactory quality assurance programs over EPA's Ambient Air Quality
         Monitoring Network
     •   ensuring that national regional laboratories are available to support chemical speciation
         and QA programs
     •   ensuring that guidance pertaining to the quality assurance aspects of the Ambient Air
         Program is written and revised as necessary
     •   rendering technical assistance to the EPA Regional Offices and the air pollution monitoring
         community

4.1.2 EPA Region Y Office

EPA Regional Offices have been developed to address environmental issues related to the states
within their jurisdiction and to administer and oversee regulatory and congressionally mandated
programs. The major quality assurance responsibilities of EPA's Region Y Office, with regard to
the Ambient Air Quality Program, are the coordination of quality assurance matters at the
Regional level with the State and local agencies. This is accomplished by the designation of EPA
Regional Project Officers who are responsible for the technical aspects of the program including:

     •   reviewing QAPPs; Regional QA Officers are delegated the authority by the Regional
         Administrator to review and approve QAPPs for the Agency
     •   supporting the FRM Performance Evaluation Program
     •   evaluating quality system performance, through technical systems audits and network
         reviews, whose frequency is addressed in the Code of Federal Regulations and Section 20
     •   acting as a liaison by making available the technical and quality assurance information
         developed by EPA Headquarters and the Region to the State and local agencies, and
         making EPA Headquarters aware of the unmet quality assurance needs of the State and
         local agencies

 Palookaville will direct all technical and QA questions to Region Y.

4.1.3 Palookaville Department of Health

40 CFR Part 58 defines a State Agency as "the air pollution control agency primarily responsible
for the development and implementation of a plan (SIP) under the Act (CAA)".  Section 302 of
the CAA provides a more detailed description of the air pollution control agency.

40 CFR Part 58 defines the Local Agency as "any local government agency, other than the state
agency, which is charged with the responsibility for carrying out a portion of the plan (SIP)".

The major responsibility of State and local agencies is the implementation of a satisfactory
monitoring program, which would naturally include the implementation of an appropriate quality
assurance program.  It is the responsibility of State and local agencies to implement quality
assurance programs in all phases of the environmental data operation (EDO), including the field,
their own laboratories,  and in any consulting and contractor laboratories which they may use to
obtain data. An EDO is defined as work performed to obtain,  use, or report information
pertaining to environmental processes or conditions.

Figure 4.1 represents the organizational structure of the areas of the Department of Health that
are responsible for the activities of the PM2.5 Ambient Air Quality Monitoring Program. The
following information lists the specific responsibilities of each individual, grouped by the
functions of the Director's Office and the divisions related to Quality Assurance, Technical
Support, and Program Support.
[Figure 4.1 is an organization chart. Director James Calhoon (515-331-2709) heads the Department.
The Air Division (Jeff Samuelson, 515-331-5454) comprises the Air Monitoring Branch (Joe Manard,
515-331-6789, with field technicians Bill Macky, ext. 6678; Karin Porter, ext. 5514; and Beverly
Deston, ext. 7616), the Laboratory Branch (Mike Smather, 515-331-9845, with laboratory technician
Fred Nottingham, 515-331-4278), and the Information Manager (Angelista Medron, 515-331-2279). The
Program Support Division (Delbert Boyle, 551-331-5698) includes the Shipping/Receiving Branch
(Jason Chang, 551-331-7677) and the Clerk (Sonny Marony, 551-331-7834). The chart also shows lines
of communication, marked as official or unofficial, to the EPA Region Y Project Officer (Bill Smiley,
872-669-2378) and the EPA QA Officer (George Benson, 872-669-2299).]

Figure 4.1 Organizational Structure of Palookaville Department of Health for PM2.5 air monitoring.

4.1.3.1 Director's Office

Program Director - James Calhoon

The Director has overall responsibility for managing the Department of Health according to
Department policy.  The direct responsibility for assuring data quality rests with line management.
Ultimately, the Director is responsible for establishing QA policy and for resolving QA issues
identified through the QA program.  Major QA related responsibilities of the Director include:

   •   approving the budget and planning processes
   •   assuring that the Department develops and maintains a current and germane quality system
    •   assuring that the Department develops and maintains a current PM2.5 QAPP and ensures
       adherence to the document by staff, and where appropriate, other extramural cooperators
   •   establishing policies to ensure that QA requirements are incorporated in all environmental
       data operations
   •   maintaining an active line of communication with the QA and technical managers
   •   conducting management systems reviews

The Director delegates the responsibility of QA development and implementation in accordance
with Department policy to the Division Directors. Oversight of the Department's QA program is
delegated to the QA Division Director.

4.1.3.2 QA Division

QA Division Director (QAD) - Linda  Toughy

The QA Division Director is the delegated manager of the Department's QA Program. She has
direct access to the Director on all matters pertaining to quality assurance.  The main
responsibility of the QAD is QA oversight, and ensuring that all personnel understand the
Department's QA policy and all pertinent EPA QA policies and regulations specific to the
Ambient Air Quality Monitoring Program.  The QAD provides technical support and reviews and
approves QA products. Responsibilities include:

    •   developing and interpreting Department QA policy and revising it as necessary
    •   developing a QA Annual Report for the Director
    •   reviewing acquisition packages (contracts, grants,  cooperative agreements, inter-agency
        agreements) to determine the necessary QA requirements
    •   developing QA budgets
    •   assisting staff scientists and project managers in developing QA documentation and in
        providing answers to technical questions
    •   ensuring that all personnel involved in environmental data operations have access to any
        training or QA information needed to be knowledgeable in QA requirements, protocols,
        and technology of that activity
     •   reviewing and approving the QAPP for the PM2.5 Ambient Air Quality Monitoring
        Program
     •   ensuring that environmental data operations are covered by appropriate QA planning
        documentation (e.g., QA project plans and data quality objectives)
     •   ensuring that reviews, assessments and audits are scheduled and completed, and at times,
        conducting or participating in these QA activities
     •   tracking the QA/QC status of all programs
     •   recommending required management-level corrective actions
     •   serving as the program's QA liaison with EPA Regional QA Managers or QA Officers
        and the Regional Project Officer

The QAD has the authority to carry out these responsibilities and to bring to the attention of the
Director any issues associated with these responsibilities.  The QAD delegates the responsibility
of QA development and implementation in accordance with Department policy to the QA
Division Branch Managers. Oversight of the QA program as it relates to individual programs is
delegated to the QA Division Branch Managers.

Quality Assurance Division Branch Managers - Philip Magart

The QA Division Branch Manager is the main point of contact within each of the four QA
Branches of the Department. The QA Branch Manager's responsibilities include:

    •  implementing and overseeing the Department's QA policy within the branch
    •  acting as a conduit for QA information to branch staff
    •  assisting the QAD in developing QA policies and procedures
    •  coordinating the Branch's input to the QA Annual Report (QAAR)
    •  assisting in solving QA-related problems at the lowest possible organizational level

This branch is responsible for overseeing the QA activities of the Ambient Air Quality Monitoring
Program and is therefore responsible for:

     •  ensuring that a QAPP is in place for all environmental data operations associated with the
        PM2.5 Ambient Air Quality Monitoring Program and that it is up-to-date
     •  ensuring that technical systems audits, audits of data quality, and data quality assessments
       occur within the appropriate schedules and conducting or participating in these audits
     •  tracking and ensuring the timely implementation of corrective actions
     •  ensuring that a management system review occurs every 3 years
     •  ensuring that technical personnel follow the QAPP

Each QA Branch Manager has the authority to carry out these responsibilities and to bring to the
attention of his or her respective Division Director any issues related to these responsibilities. The
QA Branch Manager delegates the responsibility of QA development and implementation in
accordance with Department policy to the QA Officers.

Quality Assurance Officer - John Dinsmore

The QA Officer is the official staff QA contact appointed by the QA Branch Manager. John
Dinsmore is the QA Officer responsible for the QA aspects of the PM2.5 Ambient Air Quality
Monitoring Program. Mr. Dinsmore's responsibilities include:

     •    remaining current on Department QA policy and on general and specific EPA QA policies
          and regulations as they relate to the PM2.5 Ambient Air Quality Monitoring Program
     •    reviewing and approving the QAPP for the PM2.5 Ambient Air Quality Monitoring
          Program
     •    reviewing and initialing pre- and post-sampling filter weighing activities
    •    scheduling and implementing technical systems audits
    •    performing data quality assessments
    •    reviewing precision and bias data
    •    providing QA training to Air and Program Support Division technical staff
    •    ensuring timely follow-up and corrective actions resulting from auditing and evaluation
         activities.
    •    facilitating management systems reviews implemented by the QA Division Director

4.1.3.3 Technical Division

The technical divisions are responsible for all routine environmental data operations (EDOs) for
the PM2.5 monitoring program.

Air Division Director - Jeff Samuelson

The Air Division Director is the delegated manager of the routine PM2.5 Monitoring Program
which includes the QA/QC activities that are implemented as part of normal data collection
activities.  Responsibilities of the Director include:

    •   communication with EPA Project Officers and EPA QA personnel on issues related to
        routine sampling and QA activities
    •   understanding EPA  monitoring and QA regulations and guidance, and ensuring
        subordinates understand and follow these regulations and guidance
    •   understanding Department QA policy and ensuring subordinates understand and follow
        the policy
     •   understanding and ensuring adherence to the PM2.5 QAPP
    •   reviewing acquisition packages (contracts,  grants, cooperative agreements, inter-agency
        agreements) to determine the necessary QA requirements.
     •   developing budgets and providing  program costs necessary for EPA allocation activities
      •   ensuring that all personnel involved in environmental data collection have access to any
         training or QA information needed to be knowledgeable in QA requirements, protocols,
        and technology
     •   recommending required management-level corrective actions

The Air Director delegates the responsibility for the development and implementation of
individual monitoring programs, in accordance with Department policy, to the Air Division
Branch Managers.

Air Monitoring Branch Manager - Joe Manard
Laboratory Branch Manager - Mike Smather

These two branches are responsible for overseeing the routine field/lab monitoring and QA
activities of the Ambient Air Quality Monitoring Program. The Branch Managers' responsibilities
include:

     •   implementing and overseeing the Department's QA policy within the branch
     •   acting as a conduit for information to branch staff
     •   training staff in the requirements of the QA project plan and in the evaluation of QC
        measurements.
     •   assisting staff scientists and project managers in developing network designs, field/lab
        standard operating procedures and appropriate field/lab QA documentation
     •   coordinating the Branch's input to the QAAR
     •   ensuring that a QAPP is in place for all environmental data operations associated with the
         PM2.5 Ambient Air Quality Monitoring Program and that it is up-to-date
      •   ensuring that technical personnel follow the PM2.5 QAPP

Field Personnel - Bill Macky, Karin Porter, Beverly Deston

The field personnel are responsible for carrying out a required task(s) and ensuring the data
quality results of the task(s) by adhering to guidance and protocol specified by the PM2.5 QAPP
and SOPs for the field activities. Responsibilities include:

      •  participating in the development and implementation of the PM2.5 QAPP
      •  participating in training and certification activities
      •  participating in the development of data quality requirements (overall and field) with the
       appropriate QA staff
     •  writing and modifying standard operating procedures (SOPs)
     •  verifying that all required QA activities are performed and that measurement quality
       standards are met as required in the QAPP
     •  following all manufacturer's specifications
     •  performing and documenting preventative maintenance
     •  documenting deviations from established procedures and methods
     •  reporting all problems and corrective actions to the PO, and QA Officer

    •  assessing and reporting data quality
    •  preparing and delivering reports to the Branch Manager
    •  flagging suspect data
    •  preparing and delivering data to the Information Manager.

Laboratory Personnel - Fred Nottingham

Laboratory personnel are responsible for carrying out a required task(s) and ensuring the data
quality results of the task(s) by adhering to guidance and protocol specified by the PM2.5 QAPP
and SOPs for the lab activities. Responsibilities include:

    •  participating in the development and implementation of the QAPP
    •  participating in training and certification activities
    •  participating in the development of data quality requirements (overall and laboratory) with
       the appropriate QA staff
    •  writing and modifying standard operating procedures (SOPs) and good laboratory
       practices (GLPs)
    •  verifying that all required QA activities were performed and that measurement quality
       standards were met as required in  the QAPP
    •  following all manufacturer's specifications
    •  performing and documenting preventative maintenance
    •  documenting deviations from established procedures and methods
    •  reporting all problems and corrective actions to the PO, PMs, and QA Officer
    •  assessing and reporting data quality
    •  preparing and delivering reports to the branch manager
    •  flagging suspect data
    •  preparing and delivering data to the information manager
Information Manager - Angelista Medron

The Information Manager is responsible for coordinating the information management activities of
the PM2.5 Ambient Air Monitoring Program. The main responsibilities of the Information
Manager include ensuring that data and information collected for the PM2.5 Monitoring Program
are properly captured, stored, and transmitted for use by program participants. Responsibilities
include:

     •   developing local data management standard operating procedures
     •   ensuring that information management activities are developed within reasonable time
        frames for review and approval
     •   following good automated data processes
     •   coordinating the development of the information management system with data users
     •   ensuring the development of data standards for data structure, entry, transfer, and archive
     •   ensuring the adherence to the QAPP where applicable

     •  ensuring access to data for timely reporting and interpretation processes
     •  ensuring the development of data base guides (data base structures, user guidance
        documents)
     •  ensuring timely delivery of all required data to the AIRS system

 4.1.3.4 Program Support

 The Program Support Division includes the areas of human resources, facilities maintenance, and
 shipping and receiving.

 Program Support Division Director - Delbert Boyle

 Responsibilities of the Director include:

     •  communication with QA and Air Monitoring Division on specific needs.
     •  understanding EPA  monitoring and QA regulations and guidance, and ensuring
        subordinates understand and follow these regulations and guidance
     •  understanding Department QA policy and ensuring subordinates understand and follow
        the policy
     •  understanding and ensuring adherence to the PM2.5 QAPP as it relates to program
        support activities
     •  ensuring that all support personnel have access to  any training or QA information needed
        to be knowledgeable in QA requirements, protocols, and technology

Shipping/Receiving Branch Manager - Jason Chang

This branch is responsible for shipping and receiving equipment, supplies and consumables for the
routine field/lab monitoring and QA activities of the Ambient Air Quality Monitoring Program.
The Branch Manager's responsibilities include:

    •   implementing and overseeing the  Department's QA policy within the branch
    •   acting as a conduit for information to branch staff
    •   training staff in the requirements of the QA project plan as it relates to shipping/receiving
    •   assisting staff in developing standard operating procedures
    •   coordinating the Branch's input to the Quality Assurance Annual Report
    •   ensuring that technical personnel follow the QAPP
    •   reviewing and evaluating staff performance and conformance to the QAPP

Clerk - Sonny Marony

Mr. Marony has been delegated to provide support for the shipping/receiving of all equipment and
consumable supplies for the PM2.5 Ambient Air Monitoring Program. Responsibilities include:

    •   assisting in the development of standard operating procedures for shipping/receiving
    •   following  SOPs for receiving, storage, chain-of-custody and transfer of filters
    •   informing appropriate field /lab staff of arrival of consumables, equipment, and samples
    •   documenting, tracking, and archiving shipping/receiving records

                                                                        Project: Model QAPP
                                                                        Element No: 5
                                                                        Revision No: 1
                                                                        Date: 4/17/98
                        5.0 Problem Definition/Background
    The background information provided in this element will place the problem in historical perspective, giving
 readers and users of the QAPP a sense of the project's purpose and position relative to other project and program
 phases and initiatives
5.1 Problem Statement and Background
    This discussion must include enough information about the problem, the past history, any previous work or data,
and any other regulatory or legal context to allow a technically trained reader to make sense of the project objectives
and activities. This discussion should include:

    •   a description of the problem as currently understood, indicating its importance and programmatic, regulatory,
        or research context;
    •   a summary of existing information on the problem, including any conflicts or uncertainties that are to be
        resolved by the project;
    •   a discussion of initial ideas or approaches for resolving the problem that were considered before selecting the
        approach described in element A6, "Project/Task Description"; and
    •   the identification of the principal data user or decision-maker (if known).

    Note that the problem statement is the first step of the DQO Process and the decision specification is the second
step of the DQO Process.
Between 1900 and 1970, emissions of six principal ambient air pollutants increased
significantly. The principal pollutants, also called criteria pollutants, are: particulate matter (PM10,
PM2.5), sulfur dioxide, carbon monoxide, nitrogen dioxide, ozone, and lead.  In 1970 the Clean
Air Act (CAA) was signed into law.  The CAA and its amendments provide the framework for
all pertinent organizations to protect air quality.  This framework provides for the monitoring of
these criteria pollutants by State and local organizations through the Air Quality Monitoring
Program.

The criteria pollutant defined as particulate matter is a general term used to describe a broad class
of substances that exist as liquid or solid particles over a wide range of sizes. As part of the
Ambient Air Quality Monitoring Program, EPA will measure two particle size fractions: those
less than or equal to 10 micrometers (PM10), and those less than or equal to 2.5 micrometers
(PM2.5). This QAPP focuses on the QA activities associated with PM2.5.

The background and rationale for the implementation of the PM2.5 ambient air monitoring network
can be found in the Federal Register.  In general, some of the findings are listed below.

     •  The characteristics, sources, and potential health effects of larger or "coarse" particles
        (from 2.5 to 10 micrometers in diameter) and smaller or "fine" particles (smaller than 2.5
        micrometers in diameter) are very different.

    •   Coarse particles come from sources such as windblown dust from the desert or
        agricultural fields and dust kicked up on unpaved roads from vehicle traffic.

    •   Fine particles are generally  emitted from activities such as industrial and residential
        combustion and from vehicle exhaust. Fine particles are also formed in the atmosphere
        from gases such as sulfur dioxide, nitrogen oxides, and volatile organic compounds that
        are emitted from combustion activities and then become particles as a result of chemical
        transformations in the air.

    •   Coarse particles can deposit in the respiratory system and contribute to health effects
        such as aggravation of asthma. EPA's "staff paper" concludes that fine particles, which
        also  deposit deeply  in the lungs, are more likely than coarse particles to contribute to the
        health effects (e.g., premature mortality and hospital admissions) found in a number of
        recently published community epidemiological studies.

    •   These recent community studies find that adverse public health effects are associated with
        exposure to particles at levels well below the current PM standards for both short-term
        (e.g., less than 1 day to up to 5 days) and long-term (generally a year to several years)
        periods.

    •   These health effects include premature death and increased hospital admissions and
        emergency room visits (primarily among the elderly and individuals with cardiopulmonary
        disease); increased respiratory symptoms and disease (among children and individuals
        with cardiopulmonary disease such as asthma); decreased lung function (particularly in
        children and individuals with asthma); and alterations in lung tissue and structure and in
        respiratory tract defense mechanisms.

Air quality samples are  generally collected for one or more of the following purposes:

    1.  To judge compliance with and/or progress made towards meeting the National Ambient
        Air Quality Standards.
    2.  To develop, modify, or activate control strategies that prevent or alleviate air pollution
        episodes.
    3.  To observe pollution trends throughout the region, including non-urban areas.
    4.  To provide a database for research and evaluation of effects.

With the end use of the air quality samples as a prime consideration, various networks can be
designed to meet one of the six basic monitoring objectives listed below:

    •   Determine the highest concentrations to occur in the area covered by the network


     •   Determine representative concentrations in areas of high population density
     •   Determine the impact on ambient pollution levels of significant source or source
         categories
     •   Determine general background concentration levels
     •   Determine the extent of Regional pollutant transport among populated areas, and in
         support of secondary standards
     •   Determine the welfare-related impacts in more rural and remote areas

 The monitoring network consists of four major categories of monitoring stations that measure the
 criteria pollutants. These stations are described below.

 The SLAMS consist of a network of ~3,500 monitoring stations whose size and distribution are
 largely determined by the needs of State and local air pollution control agencies to meet their
 respective State implementation plan (SIP) requirements.

 The NAMS (~1,080 stations) are a subset of the SLAMS network, with emphasis being given to
 urban and multi-source areas. In effect, they are key sites under SLAMS, with emphasis on areas
 of maximum concentrations and high population density.

 The PAMS network is required to measure ozone precursors in each ozone non-attainment area
 that is designated serious, severe, or extreme.  The required networks will have from two to five
 sites, depending on the population of the area. There is a phase-in period of one site per year
 starting in 1994. The ultimate PAMS network could exceed 90 sites at the end of the 5 year
 phase-in period.

 Special Purpose Monitoring Stations provide for special studies needed by the State and local
 agencies to support their State implementation plans (SIP's) and other air program activities. The
 SPMS are not permanently  established and, thus,  can be adjusted  easily to accommodate changing
 needs and priorities. The SPMS are used to supplement the fixed monitoring network as
 circumstances require and resources permit. If the data from SPMS are used for SIP purposes,
 they must meet all QA and methodology requirements for SLAMS monitoring.

 This QAPP focuses only on the QA activities of the SLAMS and NAMS networks and the
 objectives of these networks, which include any sampler used for comparison to the NAAQS.

 Throughout this document, the term decision maker will be used. This term represents the
 individuals who are the ultimate users of ambient air data and who therefore may be responsible
 for activities such as setting and making comparisons to the NAAQS, and evaluating trends. Since
there is more than one objective for these data, and more than one decision maker, the quality of
the data (see Section 7) will be based on the highest priority objective, which was identified as the
determination of violations of the NAAQS. This QAPP describes how the Palookaville
PM2.5 Ambient Air Quality Monitoring Program intends to control and evaluate data quality to
meet the NAAQS data quality objectives.

-------
                                                                                  Project: Model QAPP
                                                                                        Element No: 6
                                                                                        Revision No: 1
                                                                                        Date: 5/15/98
                                                                                         Page 1 of 11
                              6.0 Project/Task Description
     The purpose of the project/task description element is to provide the participants with a background
 understanding of the project and the types of activities to be conducted, including the measurements that will be taken
 and the associated QA/QC goals, procedures, and timetables for collecting the measurements.
 6.1 Description of Work to be Performed
 (1) Measurements that are expected during the course of the project.  Describe the characteristic or property to
     be studied and the measurement processes and techniques that will be used to collect data.

 (2) Applicable technical quality standards or criteria. Cite any relevant regulatory standards or criteria pertinent
     to the project.  For example, if environmental data are collected to test for compliance with a permit limit
     standard, the standard should be cited and the numerical limits should be given in the QAPP. The DQO Process
     refers to these limits as "action levels," because the type of action taken by the decision-maker will depend on
     whether the measured levels exceed the limit (Step 5 of the DQO Process).

 (3) Any special personnel and equipment requirements that may indicate the complexity of the project.
     Describe any special personnel or equipment required for the specific type of work being planned or
     measurements  being taken.

 (4) The assessment techniques needed for the project. The degree of quality assessment activity for a project will
     depend on the project's complexity, duration, and objectives. A discussion of the timing of each planned
     assessment and a brief outline of the roles  of the different parties to be involved should be included.

 (5) A schedule for the work performed.  The anticipated start and completion dates for the project should be given.
     In addition, this discussion should include an approximate schedule of important project milestones, such as the
     start of environmental measurement activities.

 (6) Project and quality records required, including the types of reports needed.  An indication of the most
     important records should be given.
In general, the measurement goal of the PM2.5 Ambient Air Quality Monitoring Program is to
estimate the concentration, in units of micrograms per cubic meter (µg/m³), of particulates less
than or equal to 2.5 micrometers (µm) that have been collected on a 46.2 mm
polytetrafluoroethylene (PTFE) filter. For the SLAMS/NAMS network, which is what this
QAPP describes, the primary goal is to compare the PM2.5 concentrations to the annual and 24-
hour National Ambient Air Quality Standards (NAAQS). The national primary and secondary
ambient air quality standards for PM2.5 are 15.0 micrograms per cubic meter (µg/m³) annual
arithmetic mean concentration and 65 µg/m³ 24-hour average concentration measured in ambient
air. A description of the NAAQS and its calculation can be found in the 1997 Federal Register1
Notice.
In addition, Appendix L of Part 50 provides the following summary of the measurement
principle:

     " An electrically powered air sampler draws ambient air at a constant volumetric flow rate into a specially
     shaped inlet and through an inertial particle size separator (impactor) where the suspended particulate
     matter in the PM2 5 size range is separated for collection on a polytetrafluoroethylene (PTFE) filter over
     the specified sampling period. The air sampler and other aspects of this reference method are specified
     either explicitly in this appendix or generally with reference to other applicable regulations or quality
     assurance guidance.

     Each filter is weighed (after moisture and temperature equilibration) before and after sample collection to
     determine the net weight (mass) gain due to collected PM2 5. The total volume of air sampled is
     determined by the sampler from the measured flow rate at actual ambient temperature and pressure and the
     sampling time. The mass concentration of PM2 5 in the ambient air is computed as the total mass of
     collected particles in the PM2 5 size range divided by the actual volume of air sampled, and is expressed in
     micrograms per actual cubic meter of air (fJ-g/m  )."

The following sections will describe the measurements required for the routine field and
laboratory activities for the network.  In addition to these measurements, an initial set of
measurements will be required to fulfill the requirements of the AIRS database.
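
Consistent with the measurement principle quoted above, the concentration calculation itself is
simple arithmetic: net filter mass gain divided by the actual volume of air sampled. The short
sketch below (Python) is only an illustration of that arithmetic; the function and variable names
are ours and are not part of the reference method.

    def pm25_concentration_ug_m3(pre_mass_mg, post_mass_mg, sample_volume_m3):
        """Mass concentration = net filter mass gain / actual volume of air sampled.

        pre_mass_mg, post_mass_mg : filter weights in milligrams (PREMASS/PSTMASS in Table 6-5)
        sample_volume_m3          : total sample volume in cubic meters at actual conditions
        """
        net_mass_ug = (post_mass_mg - pre_mass_mg) * 1000.0   # milligrams to micrograms
        return net_mass_ug / sample_volume_m3

    # Example: 0.250 mg net catch over roughly 24 m3 (16.67 L/min for 24 hours)
    print(round(pm25_concentration_ug_m3(150.000, 150.250, 24.0), 1))   # about 10.4 ug/m3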


6.2  Field Activities
The performance requirements of the air sampler have been specified in Part 50, Appendix L of the
7/18/97 Federal Register Notice1. Table 6-1 summarizes some of the more critical performance
requirements.

 Table 6-1 Design/Performance Specifications

 Filter design specifications (frequency: vendor certification; references are to 40 CFR Pt. 50, App. L;
 the overall filter specification is Sec. 6.0):
    •   Size: 46.2 mm diameter ± 0.25 mm (Sec. 6.1)
    •   Medium: polytetrafluoroethylene (Sec. 6.2)
    •   Support ring: polymethylpentene, 0.38 mm thick, 46.2 mm ± 0.25 mm outer diameter,
        3.68 mm (+0.00, -0.51 mm) width (Sec. 6.3)
    •   Pore size: 2 µm (Sec. 6.4)
    •   Filter thickness: 30-50 µm (Sec. 6.5)
    •   Maximum pressure drop: 30 cm H2O @ 16.67 L/min (Sec. 6.6)
    •   Maximum moisture pickup: 10 µg increase in 24 hr (Sec. 6.7)
    •   Collection efficiency: 99.7% (Sec. 6.8)
    •   Filter weight stability: see reference (Sec. 6.9.1 and 6.9.2)
    •   Alkalinity: < 25.0 microequivalents/gram (Sec. 6.10)


 Sampler performance specifications (frequency: all instruments; references: 40 CFR Pt. 50, App. L,
 Sec. 7.4 and Guidance Document 2.12):
    •   Sample flow rate: 1.000 m³/hr
    •   Flow regulation: 1.000 ± 5% m³/hr
    •   Flow rate precision: 2% CV
    •   Flow rate accuracy: ± 2%
    •   External leakage: vendor specs
    •   Internal leakage: vendor specs
    •   Ambient temperature sensor: -30° to 45°C, 1°C resolution, ± 1.6°C accuracy
    •   Filter temperature sensor: -30° to 45°C, 0.1°C resolution, ± 1.0°C accuracy
    •   Barometric pressure: 600-800 mm Hg, 5 mm resolution, ± 10 mm accuracy
    •   Clock/timer: date/time, 1 sec. resolution, ± 1 min/month accuracy

 The air samplers will be purchased, distributed, and certified by the EPA as meeting the
 requirements specified in the Federal Register.  Therefore, the Department assumes the sampling
 instruments to be adequate for PM2.5 sampling. Other than the required federal reference
 or equivalent air sampler, there are no special personnel or equipment requirements. Section 15
 lists all the equipment requirements for the Department's PM2.5 data collection operations.

6.2.1 Field Measurements

Table 6-2 represents the field measurements that must be collected. This table is presented in the
Federal Register1 as Table L-l of Appendix L.  These measurements are made by the air sampler
and are stored in the instrument for downloading by the field operator during routine visits.

Table 6-2 Field Measurement Requirements

Each entry lists the information to be provided by the sampler, the Appendix L section reference,
and the reporting units where given. The availability (anytime or end of period), visual display,
data output, and digital format details for each item are specified in Table L-1 of Appendix L.

    •   Flow rate, 30-second maximum interval | Sec. 7.4.5.1 | L/min
    •   Flow rate, average for the sample period | Sec. 7.4.5.2 | L/min
    •   Flow rate, CV, for the sample period | Sec. 7.4.5.2 | %
    •   Flow rate, 5-min average out of spec. (FLAG) | Sec. 7.4.5.2
    •   Sample volume, total | Sec. 7.4.5.2 | m³
    •   Temperature, ambient, 30-second interval | Sec. 7.4.8 | °C
    •   Temperature, ambient, min., max., average for the sample period | Sec. 7.4.8 | °C
    •   Barometric pressure, ambient, 30-second interval | Sec. 7.4.9
    •   Barometric pressure, ambient, min., max., average for the sample period | Sec. 7.4.9
    •   Filter temperature, 30-second interval | Sec. 7.4.11
    •   Filter temperature, differential, 30-minute interval, out of spec. (FLAG) | Sec. 7.4.11
    •   Filter temperature, maximum differential from ambient, date, time of occurrence | Sec. 7.4.11
    •   Date and time | Sec. 7.4.12
    •   Sample start and stop time settings | Sec. 7.4.12
    •   Sample period start time | Sec. 7.4.12
    •   Elapsed sample time | Sec. 7.4.13
    •   Elapsed sample time out of spec. (FLAG) | Sec. 7.4.13
    •   Power interruptions > 1 min., start time of first 10 | Sec. 7.4.15.5
    •   User-entered information, such as sampler and site identification | Sec. 7.4.16

Table 6-3 Additional Field Measurements

Parameter (Code) | Frequency | Units | Comment

    •   Monitor ID (MONID) | every sample event | see AIRS | Unique AIRS Monitor ID that
        includes the combination of the STATE, COUNTY, SITE, PARAMETER, and POC fields
    •   Site Name (SITENAM) | every sample event | AAA... | Unique site name associated with the site
    •   Sampler ID (SAMPID) | every sample event | AAXXX | Sampler model number or unique bar
        code number associated with the model number
    •   QC Thermometer ID, Initial (QCTIDI) | every sample event | AAAXXX | Unique ID number of
        the QC thermometer used for the ambient air temperature check at the beginning of sampling
    •   QC Temperature Measurement, Initial (QCTEMPI) | every sample event | XX°C | QC
        temperature reading at the beginning of sampling
    •   QC Barometer ID, Initial (QCBIDI) | every sample event | AAAXXX | Unique alpha-numeric
        ID of the QC barometric pressure device used for the barometric pressure reading check
    •   QC Bar. Pressure Reading, Initial (QCBI) | every sample event | XXX mm Hg | QC pressure
        reading at the beginning of sampling
    •   QC Thermometer ID, Final (QCTIDF) | every sample event | AAAXXX | Unique ID number of
        the QC thermometer used for the ambient air temperature check at the end of sampling
    •   QC Temperature Measurement, Final (QCTEMPF) | every sample event | XX°C | QC
        temperature reading at the end of sampling
    •   QC Barometer ID, Final (QCBIDF) | every sample event | AAAXXX | Unique alpha-numeric
        ID of the QC barometric pressure device used for the barometric pressure reading check
    •   QC Bar. Pressure Reading, Final (QCBF) | every sample event | XXX mm Hg | QC pressure
        reading at the end of sampling
    •   Filter ID (FID) | every sample event | AAYYXXXX | Unique filter ID assigned by the
        weighing laboratory
    •   Filter Integrity Flag (FFIF) | every sample event | QFI/VFI/GFI | QFI = questionable filter
        integrity, VFI = void filter integrity, GFI = good filter integrity
    •   Site Operator, Initial (SOI) | every sample event | AAA | Initials of the site operator setting up
        the sampling run
    •   Site Operator, Final (SOF) | every sample event | AAA | Initials of the site operator completing
        the sampling run
    •   Free Form Notes (FFM) | as needed | AAA... | Free form notes about the sampling run
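
The parameter codes in Table 6-3 can be viewed as the schema for one field data record per sample
event. The sketch below (Python) illustrates one hypothetical way such a record could be captured
and minimally checked before upload; the example values and the checks shown are illustrative
assumptions, not a required format.

    # Hypothetical field record keyed by the Table 6-3 parameter codes.
    field_record = {
        "MONID": "37-063-0001-88101-1",        # illustrative AIRS monitor ID
        "SITENAM": "Palookaville Downtown",
        "SAMPID": "FRM001234",
        "QCTIDI": "THM012", "QCTEMPI": 21,     # QC thermometer ID / reading, start of run
        "QCBIDI": "BAR007", "QCBI": 752,       # QC barometer ID / reading (mm Hg), start of run
        "QCTIDF": "THM012", "QCTEMPF": 19,     # end-of-run checks
        "QCBIDF": "BAR007", "QCBF": 749,
        "FID": "RF980123",                     # filter ID assigned by the weighing laboratory
        "FFIF": "GFI",                         # filter integrity flag: QFI, VFI, or GFI
        "SOI": "JDC", "SOF": "JDC",
        "FFM": "No problems noted.",
    }

    required = ["MONID", "SITENAM", "SAMPID", "FID", "FFIF", "SOI", "SOF"]
    missing = [code for code in required if not field_record.get(code)]
    if missing or field_record["FFIF"] not in ("QFI", "VFI", "GFI"):
        raise ValueError(f"Field record incomplete or invalid: missing {missing}")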

6.3 Laboratory Activities
Laboratory activities for the PM2.5 program support the routine field operations and include three
general phases:

Pre-Sampling Weighing
     >   Receiving filters from the EPA
     >   Checking filter integrity
     >   Conditioning filters
     >   Weighing filters
     >   Storing prior to field use
     >   Packaging filters for field use
     >   Associated QA/QC activities
     >   Maintaining microbalance at specified environmental conditions
     >   Equipment maintenance and calibrations

Shipping/Receiving
     >   Receiving filters from the field and logging them in
     >   Storing filters
     >   Associated QA/QC activities (see Section 12)

Post-Sampling Weighing
     >   Checking filter integrity
     >   Stabilizing/weighing filters
     >   Data downloads from field data loggers
     >   Data entry/upload to AIRS
     >   Storing filters/archiving
     >   Associated QA/QC activities

The details for these activities are included in various sections of this document as well as in
Guidance Document 2.12 (Reference 2).  Table 6-4 provides the performance specifications of the
laboratory environment and equipment.

Table 6-4 Laboratory Performance Specifications

    •   Microbalance: resolution of 1 µg, repeatability of 1 µg
    •   Microbalance environment: climate-controlled, draft-free room or chamber or equivalent.
        Mean relative humidity between 30 and 40 percent, with a variability of not more than
        ± 5 percent over 24 hours. Mean temperature should be held between 20 and 23°C, with a
        variability of not more than ± 2°C over 24 hours.
    •   Mass reference standards: standards bracket the weight of the filter, individual standard's
        tolerance less than 25 µg, handle with smooth, nonmetallic forceps
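
The microbalance environment criteria in Table 6-4 lend themselves to an automated check of the
conditioning-room log. The sketch below (Python) is one possible illustration, assuming hourly
readings and interpreting "variability" as deviation from the 24-hour mean; the function name and
that interpretation are our assumptions, not requirements of the reference method.

    def check_weighing_environment(rh_readings, temp_readings):
        """Check 24 hours of laboratory readings against the Table 6-4 criteria:
        mean RH 30-40% varying no more than +/- 5%, mean temperature 20-23 deg C
        varying no more than +/- 2 deg C."""
        problems = []
        mean_rh = sum(rh_readings) / len(rh_readings)
        mean_t = sum(temp_readings) / len(temp_readings)
        if not 30.0 <= mean_rh <= 40.0:
            problems.append(f"mean RH {mean_rh:.1f}% outside 30-40%")
        if max(rh_readings) - mean_rh > 5.0 or mean_rh - min(rh_readings) > 5.0:
            problems.append("RH variability exceeds +/- 5% of the mean")
        if not 20.0 <= mean_t <= 23.0:
            problems.append(f"mean temperature {mean_t:.1f} C outside 20-23 C")
        if max(temp_readings) - mean_t > 2.0 or mean_t - min(temp_readings) > 2.0:
            problems.append("temperature variability exceeds +/- 2 C of the mean")
        return problems

    # Example: hourly readings for one day (an empty list means no problems found)
    print(check_weighing_environment([35, 36, 34, 33] * 6, [21.5, 21.8, 22.0, 21.6] * 6))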

6.3.1 Laboratory Measurements

With the exception of shipping/receiving, which is discussed in detail in Section 12, Table 6-5
lists the parameters that will be required to be recorded for pre- and post-sampling weighing
laboratory activities.

Table 6-5 Laboratory Measurements
Parameter (Code) | Frequency | Units | Comments

Filter Conditioning
    •   Start Date (CNSDATE) | every filter | YY/MM/DD | Date of start of conditioning period
    •   Start Time (CNSHOUR) | every filter | XX.XX | Start hour and minute of conditioning
    •   Filter Number (RFID, LBFID, FBID) | every filter | RFYYXXXX, LBYYXXXX, FBYYXXXX |
        Unique filter ID of routine filter (RF), Lab Blank (LB), or Field Blank (FB)
    •   Relative Humidity (CONRH) | 1/run | XX% | Average % relative humidity value for the
        conditioning session based upon readings every 10 min.
    •   Temperature (CONTEMP) | 1/run | XX°C | Average temperature value for the conditioning
        session based upon readings every 10 min.
    •   End Date (CONDATE) | every filter | YY/MM/DD | Date of end of conditioning period
    •   End Time (CNEHOUR) | every filter | XX.XX | End hour and minute of conditioning

Pre-Sampling Filter Weighing
    •   Date (PREDATE) | 1/run (note 1) | YY/MM/DD | Date of the pre-sampling weighing run of
        filters that can then be associated with each filter
    •   Filter Lot Number (FLN) | every filter | AAAXXX | Lot number associated with the filter
    •   Balance Number (BALID) | 1/run | AAAXXX | Unique balance ID for the balance used in
        pre-weighing
    •   Analyst (PREANL) | 1/run | AAA | Initials of the technician pre-weighing filters
    •   QA Officer (PREQC) | 1/run | AAA | Initials of the QA Officer overseeing pre-weighing of filters
    •   Relative Humidity (PRERH) | 1/run | XX% | Average % relative humidity value for the
        weighing session based upon readings every 10 min.
    •   Temperature (PRETEMP) | 1/run | XX°C | Average temperature value for the weighing session
        based upon readings every 10 min.

    •   Filter Number (RFID, LBFID, FBID, FCID, DFID (note 2)) | every filter | RFYYXXXX,
        LBYYXXXX, FBYYXXXX, FCYYXXXX, DFYYXXXX | Unique filter ID of routine filter (RF),
        Lab Blank (LB), Field Blank (FB), Flow Check filter (FC), or Duplicate Filter (DF)
    •   QC Sample Number (PREQC) | every QC check | C1XXX, C2XXX, C3XXX | Unique ID for
        calibration checks and/or other types of QC samples used
    •   Pre-Sampling Mass (PREMASS) | every filter | XXX.XXX mg | Mass weight in mg of the filter
    •   Transport Container ID (CONTID) | every filter | AAAXXX | Identification of the filter
        transport container
    •   Monitor ID (MONID) | every sample | see AIRS | Unique AIRS Monitor ID that includes the
        combination of the STATE, COUNTY, SITE, PARAMETER, and POC fields
    •   Free Form Notes (PREFFM) | as needed | (free text) | Pre-weighing free form notes

Post-Sampling Filter Weighing
    •   Date (PSTDATE) | 1/run | YY/MM/DD | Date of the post-sampling weighing run of filters that
        can then be associated with each filter
    •   Balance Number (BALID) | 1/run | AAAXXX | Unique balance ID for the balance used in
        post-weighing
    •   Analyst (PSTANL) | 1/run | AAA | Initials of the technician post-weighing filters
    •   QA Officer (PSTQC) | 1/run | AAA | Initials of the QA Officer overseeing post-weighing of filters
    •   Relative Humidity (PSTRH) | 1/run | XX% | Average % relative humidity value for the
        weighing period based upon readings every 10 min.
    •   Temperature (PSTEMP) | 1/run | XX°C | Average temperature value for the weighing period
        based upon readings every 10 min.
    •   Filter Number (RFID, LBID, FBID, DFID (note 2)) | every filter | RFYYXXXX, LBYYXXXX,
        FBYYXXXX, DFYYXXXX | Unique filter ID of routine filter (RF), Lab Blank (LB), Field
        Blank (FB), or Duplicate sample (DF)
    •   QC Sample Number (PSTQC) | every QC check | C1XXX, C2XXX, C3XXX | Unique ID for
        calibration checks and/or other types of QC samples used

    •   Post-Sampling Mass (PSTMASS) | every filter | XXX.XXX mg | Mass weight in mg of the filter
    •   Net Mass (NETMASS) | every filter | XX.XXX mg | Net weight (PSTMASS - PREMASS) in mg
        of the PM2.5 catch
    •   Weighing Flag (PSTFLAG) | as needed | AAA | Flags associated with the concentration
    •   Free Form Notes (PSTFFM) | as needed | AAA... | Post-weighing free form notes

Notes:
1. Information is associated with a "session"; the values will be able to be associated with individual filters.
2. Identifies a second weighing of a routine filter and not a unique filter.

6.4 Project Assessment Techniques

An assessment is an evaluation process used to measure the performance or effectiveness of a
system and its elements. As used here, assessment is an all-inclusive term used to denote any of
the following: audit, performance evaluation (PE), management systems review (MSR), peer
review, inspection, or surveillance.  Definitions for each of these activities  can be found in the
glossary (Appendix A).  Section 20 will discuss the details of the Department's assessments.

Table 6-6 provides information on the parties implementing the assessments and their
frequency.

Table 6-6 Assessment Schedule

Assessment Type | Assessment Agency | Frequency
    •   Technical Systems Audit | EPA Regional Office | 1 every 3 years
    •   Technical Systems Audit | Department's QA Office | 1 every 3 years
    •   Network Review | EPA Regional Office | every year
    •   Network Review | Department's Air Division | App. D: 1/year; App. E: 1 every 3 years
    •   FRM Performance Evaluation | EPA Regional Office | 25% of sites per year, 4 times per year
    •   Data Quality Assessment | Department | every year

6.5 Schedule of Activities

Table 6-7 contains a list of the critical activities required to plan, implement, and assess the PM2.5
program.

Table 6-7 Schedule of Critical PM2.5 Activities

Activity | Due Date | Comments
    •   Network development | January 15, 1998 | Preliminary list of sites and samplers required
    •   Sampler order | March 2, 1998 | Samplers ordered from national contract
    •   Laboratory design | February 1, 1998 | Listing of laboratory requirements
    •   Laboratory procurement | April 1, 1998 | Ordering/purchase of all laboratory and miscellaneous
        field equipment
    •   Personnel requirements | April 1, 1998 | Advertising for field and laboratory personnel
        (if required)
    •   QAPP development | May-September 1998 | Development of the QAPP
    •   Network design completion | July 1, 1998 | Final network design
    •   Samplers arrive | July 1, 1998 | Arrival of FRM samplers
    •   Sampler siting/testing | July-December 1998 | Establishment of sites and preliminary testing
        of samplers
    •   Field/laboratory training | August 1998 | Field and laboratory training activities and certification
    •   QAPP submittal | October 1, 1998 | QAPP submittal to EPA
    •   QAPP approval | November 30, 1998 | Approval by EPA
    •   Pilot testing | August-December 1998 | Pilot activities to ensure efficiency of the measurement
        system
    •   Installation of 1998 sites | December 31, 1998 | Sites must be established and ready to collect data
    •   Routine sampling | January 1, 1999 | Routine activities must start

6.6 Project Records

The Department will establish and maintain procedures for the timely preparation, review,
approval, issuance, use, control, revision and maintenance of documents and records. Table 6-8
represents the categories and types of records and documents which are applicable to document
control for PM2.5 information.  Information on key documents in each category is explained in
more detail in Section 9.

Table 6-8 Critical Documents and Records
   Categories
Record/Document Types
   Management and
   Organization
State Implementation Plan
Reporting agency information
Organizational structure
Personnel qualifications and training
Training Certification
Quality management plan
Document control plan
EPA Directives
Grant allocations
Support Contract
   Site Information
Network description
Site characterization file
Site maps
Site Pictures
   Environmental Data
   Operations
QA Project Plans
Standard operating procedures (SOPs)
Field and laboratory notebooks
Sample handling/custody records
Inspection/maintenance records
   Raw Data
Any original data (routine and QC data)
including data entry forms
   Data Reporting
Air quality index report
Annual SLAMS air quality information
Data/summary reports
Journal articles/papers/presentations
   Data Management
Data algorithms
Data management plans/flowcharts
PM2.5 data
Data Management Systems
  Quality Assurance
Good Laboratory Practice
Network reviews
Control charts
Data quality assessments
QA reports
System audits
Response/Corrective action reports
Site Audits
References
1. U.S. EPA (1997a) National Ambient Air Quality Standards for Particulate Matter - Final Rule.
    40 CFR Part 50. Federal Register, 62(138):38651-38760. July 18, 1997.

2. U.S. EPA Quality Assurance Guidance Document 2.12: Monitoring PM2.5 in Ambient Air
    Using Designated Reference or Class I Equivalent Methods. March, 1998.

-------
                                                                            Project: Model QAPP
                                                                                   Element No:7
                                                                                  Revision No: 1
                                                                                   Date: 4/17/98
                                                                           	Page 1 of7
       7.0 Quality Objectives and Criteria for Measurement Data
    The purpose of this element is to document the DQOs of the project and to establish performance criteria for the
 mandatory systematic planning process and measurement system that will be employed in generating the data.
 7.1 Data Quality Objectives (DQOs)
    This element of the QAPP should discuss the desired quality of the final results of the study to ensure that the data
user's needs are met. The Agency strongly recommends using the DQO Process, a systematic procedure for planning
data collection activities, to ensure that the right type, quality, and quantity of data are collected to satisfy the data
user's needs. DQOs are qualitative and quantitative statements that:

    •  clarify the intended use of the data,
    •  define the type of data needed to support the decision,
    •  identify the conditions under which the data should be collected, and
    •  specify tolerable limits on the probability of making a decision error due to uncertainty in the data.
DQOs are qualitative and quantitative statements derived from the DQO Process that clarify the
monitoring objectives, define the appropriate type of data, and specify the tolerable levels of
decision errors for the monitoring program1. By applying the DQO Process to the development
of a quality system for PM2.5, the EPA guards against committing resources to data collection
efforts that do not support a defensible decision. During the months from April to July of 1997,
the DQO Process was implemented for PM2.5.  The DQOs were based on the data
requirements of the decision maker(s). Regarding the quality of the PM2.5 measurement system,
the objective is to control precision and bias in order to reduce the probability of decision errors.
Assumptions necessary for the development of the DQO included:

1. The DQO is based on the annual arithmetic mean NAAQS.

The PM2.5 standards are a 15 µg/m³ annual average and a 65 µg/m³ 24-hour average.  The annual
standard is met when the 3-year average of annual arithmetic means is less than or equal to 15
µg/m³.  Due to rounding, the 3-year average does not meet the NAAQS if it equals or exceeds
15.05 µg/m³ prior to rounding.  The 24-hour average standard is met when the 3-year average 98th
percentile of daily PM2.5 concentrations is less than or equal to 65 µg/m³.
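
As a worked illustration of the two comparisons described above, the sketch below (Python) checks
three years of site summary statistics against the 15.0 and 65 µg/m³ levels. It is deliberately
simplified: the completeness, rounding-convention, and spatial-averaging details of 40 CFR Part 50,
Appendix N are not reproduced, and the function names are our own.

    def meets_annual_standard(annual_means, level=15.0):
        """Annual standard: 3-year average of the annual arithmetic means <= 15.0 ug/m3."""
        return sum(annual_means) / len(annual_means) <= level

    def meets_24hr_standard(annual_98th_percentiles, level=65.0):
        """24-hour standard: 3-year average of the annual 98th percentiles <= 65 ug/m3."""
        return sum(annual_98th_percentiles) / len(annual_98th_percentiles) <= level

    # Example: three years of summary statistics for one site
    print(meets_annual_standard([14.2, 15.1, 14.8]))   # True (3-year average 14.7)
    print(meets_24hr_standard([58.0, 66.5, 61.0]))     # True (3-year average 61.8)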

AIRS PM2.5 data were reviewed for two purposes: (a) to determine the relative "importance" of
the two standards; and (b) to suggest "reasonable" hypothetical cases for which decision makers
would wish to declare attainment and nonattainment with high probability. Twenty-four locations
were found to have at least one year of PM2.5 data in AIRS. Figure 7.1 displays the annual
averages and 98th percentiles that are associated with lognormal distributions for the 47 data sets.
Figure 7.1 does not display estimates derived according to the standard, as the data sets covered
one rather than three years, but it does indicate the relative importance of the two standards.
Points to the right of the vertical line may be viewed as exceeding the annual average standard.
Points above the horizontal line may be viewed as exceeding the 24-hour average standard. All of
those points are also to the right of the vertical line, indicating that the annual standard is the
"controlling" standard for these locations. For this reason, the DQOs discussed in the remainder
of this document focus on attainment with the annual average standard.

2. Normal distribution for measurement error.

Error in environmental measurements is often assumed to be normal or lognormal. Figures 7.2
and 7.3 attempt to illustrate what happens to the normal and lognormal distribution functions for
the same median concentration at two values for measurement error (CVs of 10 and 50%). In
the case of PM2.5, the measurement error is expected to be in the range of 5 to 10% of the mean,
as shown in Figure 7.2, where normal or lognormal errors produce close to identical results.
Therefore, due  to these comparable results and its simplicity in modeling,  the normal distribution
of error was selected.
Figure 7.1 Annual arithmetic mean and 24-hour 98th percentiles associated with selected data sets

Figure 7.2 Comparison of normal and lognormal density functions at low measurement error (10% CV)

Figure 7.3 Comparison of normal and lognormal density functions at higher measurement errors (50% CV)

3. Decision errors can occur when the estimated 3-year average differs from the actual, or true,
3-year average.

Errors in the estimate are due to population uncertainty (sampling less frequently than every day)
and measurement uncertainty (bias and imprecision). The false positive decision error occurs
whenever the estimated 3-year average exceeds the standard and the actual 3-year average is  less
than the standard.  The false negative decision error occurs whenever the estimated 3-year
average is less than the standard and the actual 3-year average is greater than the standard.

4. The limits on precision and bias are based on the smallest number of sample values in a 3-
year period.

Since the requirements allow 1  in 6 day sampling and a 75% data completeness requirement, the
minimum number of values in a 3-year period is 137. It can be demonstrated that obtaining more
data, either through more frequent sampling or the use of spatial averaging, will lower the risk of
attainment/non-attainment decision errors at the same  precision and bias acceptance levels.
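
The figure of 137 follows directly from the sampling schedule and the completeness requirement,
as the short calculation below illustrates (Python; the arithmetic only).

    # 1-in-6 day sampling over 3 years at the minimum 75% completeness
    scheduled = 3 * 365 / 6            # about 182.5 scheduled sample days
    minimum_valid = scheduled * 0.75   # about 137 valid samples
    print(round(minimum_valid))        # 137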

5. The decision error limits were set at 5%.

For the two cases that follow, the decision maker will make the correct decision 95% of the time
if precision and bias are maintained at the acceptable levels. For cases that are less "challenging"
(i.e., annual average values that are farther from the standard), the decision maker will make the
correct decision more often. This limit was based on the minimum number of samples from
assumption 4 above (137) and the present uncertainty in the measurement technology.  However,
if precision and bias prove to be lower than the DQO, the decision maker can expect to make  the
correct decision more than 95% of the time.

6. Measurement imprecision was established at 10% coefficient of variation (CV).

By reviewing available AIRS data and other PM2 5 comparison studies, it was determined that it
was reasonable to allow measurement imprecision at 10% CV. While measurement imprecision
has relatively little  impact on the ability to avoid false positive and false negative decision errors, it
is an important factor in estimating bias.  CV's greater than 10% make it difficult to detect and
correct bias problems. Two sine functions were developed (cases 1 and 2) to represent distributions
where decision makers began to be concerned about decision errors. Table 7-1 summarizes the
case 1 and 2 distributions.

Table 7-1. Summary of Case 1 and 2 parameters

    Case 1: model equation CD = 12.75 + 8.90 sin(2πD/365) + ED; mean = 12.75; correct decision:
    attainment; incorrect decision: F(+) = nonattainment; tolerable error rate: 5%

    Case 2: model equation CD = 18.4 + 12.85 sin(2πD/365) + ED; mean = 18.4; correct decision:
    nonattainment; incorrect decision: F(-) = attainment; tolerable error rate: 5%

    (D is the sampling day; ED is the random error term.)
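
The decision error rates discussed below can be explored with a simple simulation. The sketch
below (Python with NumPy) estimates how often a biased, imprecise measurement system leads to
the wrong attainment call under 1-in-6 day sampling. It is illustrative only: the function and
parameter names are ours, and the day-to-day population variability term (ED) used in the actual
DQO analysis is omitted, so it will not reproduce the exact probabilities shown in Tables 7-2
and 7-3.

    import numpy as np

    rng = np.random.default_rng(12345)

    def false_decision_rate(mean_level, amplitude, bias_pct, cv_pct,
                            standard=15.0, n_trials=5000):
        # 1-in-6 day sampling schedule over a 3-year period
        days = np.arange(0, 3 * 365, 6)
        true_conc = mean_level + amplitude * np.sin(2 * np.pi * days / 365)
        true_attains = mean_level <= standard
        errors = 0
        for _ in range(n_trials):
            measured = true_conc * (1.0 + bias_pct / 100.0)            # systematic bias
            measured = measured + measured * (cv_pct / 100.0) * rng.standard_normal(days.size)
            est_attains = measured.mean() <= standard                  # estimated 3-year average
            errors += (est_attains != true_attains)
        return errors / n_trials

    # Case 1 style input: true 3-year mean of 12.75 ug/m3 with +10% bias and 10% CV
    print(false_decision_rate(12.75, 8.90, bias_pct=10, cv_pct=10))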

Case 1: With this model (case 1), the 3-year average is 12.75 µg/m³.  The correct decision is
"attainment."  A false positive error is made when the estimated average exceeds the standard.
The probability of the false positive error for sampling every sixth day depends on the
measurement system bias and precision, as shown in Table 7-2. As stated in assumption 6 above,
the data in Table 7-2 show that precision alone has little impact on decision error but is an
important factor in estimating bias, which in turn is an important factor in decision error.

Since the decision error probability limits were set at 5% (assumption 5), acceptable precision
(CV) and bias are combinations yielding decision errors around 5%.

Table 7-2. Measurement System Decision Error: Case 1

    Precision CV (%)    Bias (%)    False Positive Probability (%)
           0              +5          0.18
           0              +10         4.4
           0              +15         26.8 (not acceptable)
          80               0          1.3
         100               0          3.0
          10              +10         4.7
          15              +10         5.1
Case 2: With this model (case 2), the 3-year average is 18.4 µg/m³.  The correct decision is
"nonattainment."  A false negative error is made when the estimated average is less than the
standard.  The probability of the false negative error for sampling every sixth day depends on the
measurement system bias and precision, as shown in Table 7-3. Similar to case 1, combinations
of precision and bias that yield decision error probabilities around 5% were considered
acceptable.

Table 7-3. Measurement System Decision Error: Case 2

    Precision CV (%)    Bias (%)    False Negative Probability (%)
           0              -5          < 0.1
           0              -10         1.6
           0              -15         18.9 (not acceptable)
          80               0          1.2
         100               0          2.8
          10              -10         1.8
          15              -10         2.1
After reviewing cases 1 and 2, based upon the acceptable decision error of 5%, the DQOs for
acceptable precision (10% CV) and bias (± 10%) were identified. These precision and bias values
will be used as goals from which to evaluate and control measurement uncertainty.

7.2 Measurement Quality Objectives (MQOs)
    While the quality objectives state what the data user's needs are, they do not provide sufficient information about
 how these needs can be satisfied.  The specialists who will participate in generating the data need to know the
 measurement performance criteria that must be satisfied to achieve the overall quality objectives. One of the most
 important features of the QAPP is that it links the data user's quality objectives to verifiable measurement
 performance criteria. Although the level of rigor with which this is done and documented will vary widely, this
 linkage represents an important advancement in the implementation of QA. Once the measurement performance
 criteria have been established, sampling and analytical methods criteria can be specified under the elements
 contained in Group B


Once a DQO is established, the quality of the data must be evaluated and controlled to ensure that
it is maintained within the established acceptance criteria. Measurement quality objectives are
designed to evaluate and control various phases (sampling, preparation, analysis) of the
measurement process  to ensure that total measurement uncertainty is within the range prescribed
by the DQOs. MQOs can be defined in terms of the following data quality indicators:

    Precision - a measure of mutual agreement among individual measurements of the same property usually under
    prescribed similar conditions. This is the random component of error. Precision is estimated by various statistical
    techniques using some derivation of the standard deviation.

    Bias - the systematic  or persistent distortion of a measurement process which causes error in one direction. Bias
    will be determined by  estimating the positive and negative deviation from the true value as a percentage of the true
    value.

    Representativeness - a measure of the degree to which data accurately and precisely represent a characteristic of a
    population, parameter variations at a sampling point, a process condition, or an environmental condition.

    Detectability- The determination of the low range critical value of a characteristic that a method specific procedure
    can reliably discern.

    Completeness - a measure of the amount of valid data obtained from a measurement system compared to the
    amount that was expected to be obtained under correct, normal conditions.  Data completeness requirements are
    included in the reference methods (40 CFR Pt. 50).

    Comparability - a measure of confidence with which one data set can be compared to another.

 Accuracy has been a term frequently used to represent closeness to "truth" and includes a
 combination of precision and bias error components.  This term has been used throughout the
 CFR and in some of the sections of this document. Where possible, the Department will attempt
 to separate measurement uncertainties into precision and bias components.

 For each of these attributes, acceptance criteria can be developed for various phases of the
 EDO.  Various parts of 40 CFR, as well as Guidance Document 2.12 (Reference 2), have
 identified acceptance criteria for some of these attributes. In theory, if these MQOs are met,
 measurement uncertainty should be controlled to the levels required by the DQO.  Table 7-4 lists
 the MQOs for the PM2.5 program.  More detailed descriptions of these MQOs and how they will
 be used to control and assess measurement uncertainty will be provided in other elements, as well
 as in the SOPs (Appendix E) of this QAPP.
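
Precision for collocated samplers (Table 7-4) is evaluated from the percent differences of the
paired 24-hour measurements. The sketch below (Python) follows that general approach as an
illustration only; the function name, the example values, and the division by the square root of 2
to express a single-sampler CV are our assumptions, not the exact statistics prescribed in 40 CFR
Part 58, Appendix A.

    import math

    def collocated_precision_cv(primary, collocated):
        """Percent-difference CV for collocated sampler pairs (illustrative sketch)."""
        d = [200.0 * (y - x) / (y + x) for x, y in zip(primary, collocated)]
        n = len(d)
        mean_d = sum(d) / n
        # sample standard deviation of the percent differences
        s = math.sqrt(sum((di - mean_d) ** 2 for di in d) / (n - 1))
        return s / math.sqrt(2)   # assumed conversion to a single-sampler CV

    # Example: three collocated pairs of 24-hour concentrations (ug/m3)
    print(round(collocated_precision_cv([14.1, 15.3, 9.8], [13.7, 15.9, 10.2]), 1))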

 References

 1. U.S. EPA, Guidance for Quality Assurance Project Plans, EPA QA/G-5, EPA/600/R-98/018, February 1998.

 2. U.S. EPA Quality Assurance Guidance Document 2.12: Monitoring PM2.5 in Ambient Air Using
    Designated Reference or Class I Equivalent Methods. April, 1998.

Table 7-4 Measurement Quality Objectives - Parameter: PM2.5

    •   Filter holding times (all filters; 40 CFR Part 50, App. L, Sec. 8.3):
        -  Pre-sampling: < 30 days before sampling
        -  Post-sampling weighing: < 10 days at 25°C from sample end date, or < 30 days at 4°C
           from sample end date
    •   Reporting units (all data): µg/m³ (40 CFR Part 50.3)
    •   Detection limit (all data):
        -  Lower detection limit: 2 µg/m³ (40 CFR Part 50, App. L, Sec. 3.1)
        -  Upper concentration limit: 200 µg/m³ (40 CFR Part 50, App. L, Sec. 3.2)
    •   Data completeness (quarterly): 75% (40 CFR Part 50, App. N, Sec. 2.1)
    •   Filter visual defect check (all filters): see reference (40 CFR Part 50, App. L, Sec. 6.0)
    •   Filter conditioning environment (all filters; 40 CFR Part 50, App. L, Sec. 8.2):
        -  Equilibration: 24 hours minimum
        -  Temperature range: 20-23°C
        -  Temperature control: ± 2°C over 24 hr
        -  Humidity range: 30%-40% RH
        -  Humidity control: ± 5% RH over 24 hr
    •   Lot blanks (3 filters per exposure lot): less than 15 µg change between weighings
        (40 CFR Part 50, App. L, Sec. 8.2)
    •   Lab QC checks (40 CFR Part 50, App. L, Sec. 8.2):
        -  Field filter blank (10% or 1 per weighing session): ± 30 µg change between weighings
        -  Lab filter blank (10% or 1 per weighing session): ± 15 µg change between weighings
        -  Balance check (beginning, every 10th sample, end): < 3 µg
        -  Duplicate filter weighing (1 per weighing session): ± 15 µg change between weighings

    Corresponding QA Guidance Document 2.12 references: Sec. 7.5, 7.6, 7.7, 7.9, 7.11, and 11.1.

(Table 7-4, continued)

    •   Calibration/verification:
        -  Flow rate (FR) calibration (if multi-point failure): ± 2% of transfer standard
        -  FR multi-point verification (1/yr): ± 2% of transfer standard
        -  One-point FR verification (1/4 weeks): ± 4% of transfer standard
        -  External leak check (every 5 sampling events): 80 mL/min
        -  Internal leak check (every 5 sampling events): 80 mL/min
        -  Temperature calibration (if multi-point failure): ± 2% of standard
        -  Temperature multi-point verification (on installation, then 1/yr): ± 2°C of standard
        -  One-point temperature verification (1/4 weeks): ± 4°C of standard
        -  Pressure calibration (on installation, then 1/yr): ± 10 mm Hg
        -  Pressure verification (1/4 weeks): ± 10 mm Hg
        -  Clock/timer verification (1/4 weeks): 1 min/mo
    •   Accuracy:
        -  FRM performance evaluation (25% of sites, 4/yr): ± 10%
        -  Flow rate audit (1/2 wk automated, 4/yr manual): ± 4% of audit standard
        -  External leak check (4/yr): < 80 mL/min
        -  Internal leak check (4/yr): < 80 mL/min
        -  Temperature audit (4/yr): ± 2°C
        -  Pressure audit (4/yr (?)): ± 10 mm Hg
        -  Balance audit (1/yr): manufacturer's specs
    •   Precision:
        -  Collocated samples (every 6 days for 25% of sites): CV < 10%
        -  Single analyzer (1/3 mo.): CV < 10%
        -  Single analyzer (1/yr): CV < 10%
        -  Reporting organization (1/3 mo.): CV < 10%
    •   Calibration and check standards:
        -  Flow rate transfer standard (1/yr): ± 2% of NIST-traceable standard
        -  Field thermometer (1/yr): ± 0.1°C resolution, ± 0.5°C accuracy
        -  Field barometer (1/yr): ± 1 mm Hg resolution, ± 5 mm Hg accuracy
        -  Working mass standards (3-6 mo.): 0.025 mg
        -  Primary mass standards (1/yr): 0.025 mg

    References for these requirements: 40 CFR Part 50, App. L, Sec. 7.4, 9.1, 9.2, 9.2.5, and 9.3;
    40 CFR Part 58, App. A, Sec. 3.5 and 5.5; QA Guidance Document 2.12, Sec. 4.2, 4.3, 6.3-6.6,
    7.3, 8.2, 8.4, and 10.2. Several items are not described in either reference.

-------
                                                                               Project: Model QAPP
                                                                                      Element No:8
                                                                                     Revision No:l
                                                                                      Date: 4/17/98
                                                                                       Page 1 of4
                8.0 Special Training Requirements/Certification

      The purpose of this element is to ensure that any specialized or unusual training requirements necessary to
  complete the projects are known and furnished and the procedures are described in sufficient detail to ensure that
  specific training skills can be verified, documented, and updated as necessary.
8.1 Training

     Requirements for specialized training for nonroutine field sampling techniques, field analyses, laboratory
 analyses, or data validation should be specified.  Depending on the nature of the environmental data operation, the
 QAPP may need to address compliance with specifically mandated training requirements.
Personnel assigned to the PM2.5 ambient air monitoring activities will meet the educational, work
experience, responsibility, personal attributes, and training requirements for their positions.
Records on personnel qualifications and training will be maintained in personnel files and will be
accessible for review during audit activities.

Adequate education and training are integral to any monitoring program that strives for reliable
and comparable data. Training is aimed at increasing the effectiveness of employees and the
Department. Table 8-1  represents the general training requirements for all employees, depending
upon their job classification.

Table 8-1 Department of Health Employee Training Requirements
Job Classification
Directors
Branch Chief and above
Project Officers and Above
All Employees
Training Title
Executive Development Program
Framework for Supervision
Keys to Managerial Excellence
EEO for Managers and Supervisors
Sexual Harassment
Contract Administration for Supervisors
40 hours of developmental activities
Contract Administration
Contract Administration Recertification
EEO for Managers and Supervisors
Grants Training
Project Officer Training (contract/grants)
Ethics in Procurement
Work statements for Negotiated Procurements
Ethics
Cultural Diversity
Time/Frequency
Requirement
As available
1 st 6 months
After comp. of above
As available
Prior to responsibility
Every three years
As available
Prior to responsibility
If filing SF450
As available

-------
                                                                        Project: Model QAPP
                                                                              Element No:8
                                                                              Revision No:l
                                                                              Date: 4/17/98
                                                                       	Page 2 of4
Job Classification
Support Staff
Field Personnel
Field Personnel
(Superfund sites)
Laboratory Personnel
Training Title
English grammar
Proofreading
Telephone Etiquette
Professionalism in the Office
Filing
Department Style of Correspondence
Travel Procedures
Procurement Request Procedures
Timekeeping
Introduction to WordPerfect
E-MAIL
24 Hour Field Safety
8 hour Field Safety Refresher
8 hour First Aid/CPR
Blood borne pathogens
40 Hour Field Safety
8 hour Field Safety Refresher
8 Hour First Aid/CPR
Blood borne pathogens
24 Hour Laboratory Safety
4 Hour Refresher
R/V Safety Video/Discussion
Chemical Spill Emergency Response
Blood borne pathogens
Time/Frequency
Requirement
As available
1 st time
Yearly
Yearly
1st time
1st time
Yearly
Yearly
1 st time
1st time
Yearly
Yearly
1st time
1st time
8.1.1 Ambient Air Monitoring Training

Appropriate training will be available to employees supporting the Ambient Air Quality Monitoring
Program, commensurate with their duties.  Such training may consist of classroom lectures,
workshops, teleconferences, and on-the-job training.

Over the years, a number of courses have been developed for personnel involved with ambient air
monitoring and quality assurance aspects.  Formal QA/QC training is offered through the
following organizations:

    •   Air Pollution Training Institute (APTI) http://www.epa.gov/oar/oaq.apti.html
    •   Air & Waste Management Association (AWMA) http://awmci.org/epr.htm
    •   American Society for Quality Control (ASQC) http://www.asqc.org/products/educat.html
    •   EPA Institute
    •   EPA Quality Assurance Division (QAD) http://es.inel.gov/ncerqa/qa/
    •   EPA Regional Offices
Table 8-2 presents a sequence of core ambient air monitoring and QA courses for ambient air
monitoring staff, and QA managers (marked by asterisk). The suggested course sequences
assume little or no experience in QA/QC or air monitoring. Persons having experience in the
subject matter described in the courses would select courses according to their appropriate
experience level.  Courses not included in the core sequence would be selected according to
individual responsibilities, preferences, and available resources.

 Table 8-2. Core Ambient Air Training Courses
Sequence | Course Title (SI = self instructional)                                                        | Department Number | Source
1*       | Air Pollution Control Orientation Course (Revised), SI:422                                   | 422               | APTI
2*       | Principles and Practices of Air Pollution Control, 452                                       | 452               | APTI
3*       | Orientation to Quality Assurance Management                                                  | QA1               | QAD
4*       | Introduction to Ambient Air Monitoring (Under Revision), SI:434                              | 434               | APTI
5*       | General Quality Assurance Considerations for Ambient Air Monitoring (Under Revision), SI:471 | 471               | APTI
6*       | Quality Assurance for Air Pollution Measurement Systems (Under Revision), 470                | 470               | APTI
7*       | Data Quality Objectives Workshop                                                             | QA2               | QAD
8*       | Quality Assurance Project Plan                                                               | QA3               | QAD
9        | Atmospheric Sampling (Under Revision), 435                                                   | 435               | APTI
10       | Analytical Methods for Air Quality Standards, 464                                            | 464               | APTI
11       | Chain-of-Custody Procedures for Samples and Data, SI:443                                     | 443               | APTI
*        | Data Quality Assessment                                                                      | QA4               | QAD
*        | Management Systems Review                                                                    | QA5               | QAD
*        | Beginning Environmental Statistical Techniques (Revised), SI:473A                            | 473               | APTI
*        | Introduction to Environmental Statistics, SI:473B                                            | 473B              | APTI
*        | Quality Audits for Improved Performance                                                      | QA6               | AWMA
*        | Statistics for Effective Decision Making                                                     | STAT1             | ASQC
         | AIRS Training                                                                                | AIRS1             | OAQPS
*        | FRM Performance evaluation Training (field/lab)                                              | QA7               | OAQPS
*        | PM2.5 Monitoring Implementation (Video)                                                      | PM1               | OAQPS
* Courses recommended for QA Managers

-------
                                                                             Project: Model QAPP
                                                                                   Element No:8
                                                                                   Revision No:l
                                                                                   Date: 4/17/98
                                                             	Page 4 of 4

Based upon the activities for the PM2.5 program, personnel in the following categories will require
the following training prior to implementing environmental data operations.

Field Personnel-      422, 434, 435, 443, PM1

Laboratory-          422, 434, 435, 464, 443, PM1

Data Management -  434, AIRS1

QA Personnel -       422, 434, 435, 443, QA1, QA3, QA4, QA6, QA7, PM1
During the month of August 1998, training will occur for all field, laboratory, sample custody and
data management personnel. Training will be based on conformance with the SOPs listed in
Appendix E. The QA Division will coordinate training activities for the Department.

8.2  Certification
     Usually, the organizations participating in the project that are responsible for conducting training and health and
  safety programs are also responsible for ensuring certification.  Various commercial training courses are available that
  meet some government regulations. Training and certification should be planned well in advance for necessary
  personnel prior to the implementation of the project. All certificates or documentation representing completion of
  specialized training should be maintained in personnel files.
For the PM2 5 program, the QA Division will issue training certifications for the successful
completion of field, laboratory, sample custody and data management training. Certification will
be based upon the qualitative and quantitative assessment of individuals' adherence to the SOPs.
Certification will require a qualitative acceptance rating of "adequate" and a quantitative rating of
80%. Appendix B contains the QA Division Certification Evaluation Forms for field and
laboratory activities.  Forms for sample custody and data management will also be developed.

-------
                                                                            Project: Model QAPP
                                                                                  Element No: 9
                                                                                  Revision No: 1
                                                                                   Date: 4/17/98
                                                                                    Page 1 of5
                          9.0 Documentation and Records
     The purpose of this element is to define which records are critical to the project and what information needs to be
 included in reports, as well as the data reporting format and the document control procedures to be used. Specification
 of the proper reporting format, compatible with data validation, will facilitate clear, direct communication of the
 investigation and its conclusions and be a resource document for the design of future studies.
 For the Ambient Air Monitoring Program, there are a number of documents and records that need
 to be retained. A document, from a records management perspective, is a volume that contains
 information which describes, defines, specifies, reports, certifies, or provides data or results
 pertaining to environmental programs. As defined in the Federal Records Act of 1950 and the
 Paperwork Reduction Act of 1995 (now 44 U.S.C. 3101-3107), records are: "...books, papers,
 maps, photographs, machine readable materials, or other documentary materials, regardless of
 physical form or characteristics, made or received by an agency of the United States Government
 under Federal Law or in connection with the transaction of public business and preserved or
 appropriate for preservation by that agency or its legitimate successor as evidence of the
 organization, functions, policies, decisions, procedures, operations, or other activities of the
 Government or because of the informational value of data in them..."

 The following information describes  the Department of Health's document and records
 procedures for the PM2.5 Program. In EPA's QAPP regulation and guidance, EPA uses the term
 reporting package. Although this is not a term currently used by the Department, it will be
 defined as all the information required to support the concentration data reported to EPA, which
 includes all data required to be collected as well as data deemed important by the Department
 under its policies and records management procedures. Table 9-1 identifies these documents and
 records.

 9.1  Information Included in the Reporting Package


     The selection of which records to include in a data reporting package must be determined based on how the data
 will be used. Different "levels of effort" require different supporting QA/QC documentation. For example,
 organizations conducting basic research have different reporting requirements from organizations collecting data in
 support of litigation or in compliance with permits. When possible, field and laboratory records should be integrated
 to provide a continuous track of reporting.
9.1.1 Routine Data Activities

The Department of Health has a structured records management retrieval system that allows for
the efficient archiving and retrieval of records.  The PM2.5 information will be included in this
system. It is organized in a similar manner to the EPA's records management system (EPA-220-
B-97-003) and follows the same coding scheme in order to facilitate easy retrieval of information
during EPA technical systems audits and network reviews. Table 9-1 includes the documents and
records that will be filed according to the statute of limitations discussed in Section 9.3.  In order
to archive the information as a cohesive unit, all the PM2.5 information will be filed under the
major code "PM25", followed by the codes in Table 9-1.

Table 9-1 PM2.5 Reporting Package Information
Categories
Management and
Organization


Site Information

Environmental Data
Operations
Raw Data
Data Reporting
Data Management

Quality Assurance



Record/Document Types
State Implementation Plan
Reporting agency information
Organizational structure
Personnel qualifications and training
Training Certification
Quality management plan
Document control plan
EPA Directives
Grant allocations
Support Contract
Network description
Site characterization file
Site maps
Site Pictures
QA Project Plans
Standard operating procedures (SOPs)
Field and laboratory notebooks
Sample handling/custody records
Inspection/Maintenance records
Any original data (routine and QC data)
including data entry forms
Air quality index report
Annual SLAMS air quality information
Data/summary reports
Journal articles/papers/presentations
Data algorithms
Data management plans/flowcharts
PM2.5 Data
Data Management Systems
Good Laboratory Practice
Network reviews
Control charts
Data quality assessments
QA reports
System audits
Response/Corrective action reports
Site Audits

File Codes
AIRP/217
AIRP/237
ADMI/106
PERS/123
AIRP/482
AIRP/216
ADMI/307
DIRE/007
BUDG/043
CONT/003
CONT/202
AIRP/237
AIRP/237
AIRP/237
AUDV/708
PROG/185
SAMP/223
SAMP/502
TRAN/643
AIRP/486
SAMP/223
AIRP/484
AIRP/484
AIRP/484
PUBL/250
INFO/304
INFO/304
INFO/160 - INFO/173
INFO/304 - INFO/170
COMP/322
OVER/255
SAMP/223
SAMP/223
OVER/203
OVER/255
PROG/082
OVER/658
OVER/203

 9.1.2 Annual Summary Reports Submitted to EPA
 As indicated in 40 CFR Part 58, the department shall submit to the EPA Administrator, through
 the Region Y Office, an annual summary report of all the ambient air quality monitoring data
 from all monitoring stations designated as SLAMS. The report will be submitted by July 1 of each
 year for the data collected from January 1 to December 31 of the previous year. The report will
 contain the following information:

 PM-fine (PM2.5)

 Site and Monitoring Information.
     •   City name (when applicable),
     •   County name and street address of site location,
     •   AIRS-AQS site code, and
     •   AIRS-AQS monitoring method code.

 Summary Data
     •  Annual arithmetic mean (µg/m3) as specified in 40 CFR Part 50, Appendix N (the annual
        arithmetic mean NAAQS is 15 µg/m3),
     •  All daily PM-fine values above the level of the 24-hour PM-fine NAAQS (65 µg/m3) and
        the dates of occurrence,
     •  Sampling schedule used (e.g., once every 6 days, every day, etc.), and
     •  Number of 24-hour average concentrations in the ranges listed in Table 9-2:

                              Table 9-2 PM2.5 Summary Report Ranges
Range (µg/m3)        | Number of Values
0 to 15              |
16 to 30             |
31 to 50             |
51 to 70             |
71 to 90             |
91 to 110            |
greater than 110     |
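
As an illustration of how the summary counts could be assembled from a year of 24-hour averages, the short Python sketch below tallies values into the Table 9-2 ranges and collects the daily values above the 65 µg/m3 level. The helper name, the example data, and the rounding convention used for binning are assumptions for illustration, not a prescribed procedure.

    # Illustrative tally of 24-hour PM2.5 averages into the Table 9-2 ranges
    # and identification of days above the 65 ug/m3 24-hour NAAQS level.
    # The breakpoints are copied from Table 9-2; binning by rounded value is assumed.

    RANGES = [(0, 15), (16, 30), (31, 50), (51, 70), (71, 90), (91, 110)]

    def summarize(daily_values):
        """daily_values: list of (date_string, concentration in ug/m3) pairs."""
        counts = {f"{lo} to {hi}": 0 for lo, hi in RANGES}
        counts["greater than 110"] = 0
        exceedances = []
        for day, value in daily_values:
            placed = False
            for lo, hi in RANGES:
                if lo <= round(value) <= hi:
                    counts[f"{lo} to {hi}"] += 1
                    placed = True
                    break
            if not placed:
                counts["greater than 110"] += 1
            if value > 65:
                exceedances.append((day, value))
        return counts, exceedances

    if __name__ == "__main__":
        data = [("1999-01-04", 12.3), ("1999-01-07", 68.1), ("1999-01-10", 31.9)]
        print(summarize(data))
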
Dr. James Calhoon, as the senior air pollution control officer for the Department, will certify that
the annual summary is accurate to the best of his knowledge. This certification will be based on
the various assessments and reports performed by the organization, in particular, the Annual QA
Report discussed in Section 21 that documents the quality of the PM2 5 data and the effectiveness
of the quality system.


9.2 Data Reporting Package Format and Documentation Control


     The format of data reporting packages, whether for field or lab data, must be consistent with the requirements
 and procedures used for data validation and data assessment. All individual records that represent actions taken to
 achieve the objective of the data operation and the performance of specific QA functions are potential components of
 the final data reporting package. This element of the QAPP should discuss how these various components will be
 assembled to represent a concise and accurate record of all activities impacting data quality. The discussion should
 detail the recording medium for the project, guidelines for hand-recorded data (e.g., using indelible ink), procedures
 for correcting data (e.g., single line drawn through errors and initialed by the responsible person), and documentation
 control. Procedures for making revisions to technical documents should be clearly specified and the lines of authority
 indicated.
Table 9-1 represents the documents and records, at a minimum, that must be filed into the
reporting package. The details of these various documents and records will be discussed in the
appropriate sections of this document.

All raw data required for the calculation of a PM2 5 concentration, the submission to the AIRS
database, and QA/QC data, are collected electronically or on data forms that are included in the
field and analytical methods sections. All hardcopy information will be filled out in indelible ink.
Corrections will be made by inserting one line through the incorrect entry, initialing this
correction, and placing the correct entry alongside the incorrect entry, if this can be accomplished
legibly, or by providing the information on a new line.

9.2.1 Notebooks

The Department will issue notebooks to each field and laboratory technician.  This  notebook will
be uniquely numbered and associated with the individual and the PM2 5 Program.  Although data
entry forms are associated with all routine environmental data operations, the  notebooks can be
used to record additional information about these operations.

Field notebooks - Notebooks will be issued  for each sampling site. These will be  3-ring binders
that will contain the appropriate data forms for routine operations as well as inspection and
maintenance forms and SOPs.

Lab Notebooks - Notebooks will also  be issued for the laboratory.  These notebooks will be
uniquely numbered and associated with the PM2 5 Program.  One notebook will be available for
general comments/notes; others will be associated with the temperature and humidity recording
instruments, the refrigerator, calibration equipment/standards, and the analytical balances used for
this program.

Sample shipping/receipt - One notebook will be issued to the shipping and receiving facility.
This notebook will be uniquely numbered and associated with the PM2 5 program. It will include
standard forms and areas for free  form notes.

9.2.2 Electronic data collection
It is anticipated that certain instruments will provide an automated means for collecting
information that would otherwise be recorded on data entry forms. Information on these systems
is detailed in Sections 18 and 19.  In order to reduce the potential for data entry errors,
automated systems will be utilized where appropriate and will record the same information that is
found on data entry forms. In order to provide a back-up, a hardcopy of automated data
collection information will be stored for the appropriate time frame in project files.

9.3 Data Reporting Package Archiving and  Retrieval
     The length of storage for the data reporting package may be governed by regulatory requirements, organizational
  policy, or contractual project requirements. This element of the QAPP should note the governing authority for
  storage of, access to, and final disposal of all records
As stated in 40 CFR part 31.42,  in general, all the information listed in Table 9-1 will be retained
for 3 years from the date the grantee submits its final expenditure report unless otherwise noted in
the funding agreement.  However, if any litigation, claim, negotiation, audit or other action
involving the records has been started before the expiration of the 3-year period, the records will
be retained until completion of the action and resolution of all issues which arise from it,  or until
the end of the regular 3-year period, whichever is later.  The Department will extend this
regulation in order to store records for three full years past the year of collection. For example,
any data collected in calendar year 1999 (1/1/99 - 12/31/99) will be retained until, at a minimum,
January 1, 2003, unless the information is used for litigation purposes.
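
A minimal sketch of the retention rule described above, assuming the rule is read simply as "retain through three full calendar years after the year of collection" (litigation, claim, or audit holds would extend the period):

    from datetime import date

    def earliest_disposal_date(collection_year: int) -> date:
        """Records from a given calendar year are kept for three full years
        past the year of collection (illustrative reading of the rule above)."""
        return date(collection_year + 4, 1, 1)

    print(earliest_disposal_date(1999))   # 2003-01-01, matching the example above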

-------
                                                                              Project: Model QAPP
                                                                                   Element No: 10
                                                                                    Revision No: 1
                                                                                    Date: 4/18/98
                                                                                     Page 1 of 14
                                 10.0 Sampling Design
      The purpose of this element is to describe all the relevant components of the experimental design; define the key
  parameters to be estimated; indicate the number and type of samples expected; and describe where, when, and how
  samples are to be taken. The level of detail should be sufficient that a person knowledgeable in this area could
  understand how and why the samples will be collected. This element provides the main opportunity for QAPP
  reviewers to ensure that the "right" samples will be taken. Strategies such as stratification, compositing, and
  clustering should be discussed, and diagrams or maps showing sampling points should be included. Most of this
  information should be available as outputs from the final steps of the planning (DQO) process.
The purpose of this Section is to describe all of the relevant components of the SLAMS
gravimetric mass PM2.5 monitoring network to be operated by Palookaville, including the network
design for evaluating the quality of the data. This entails describing the key parameters to be
estimated, the rationale for the locations of the PM2.5 monitors and the QA samplers, the frequency
of sampling at the primary and QA samplers, the types of samplers used at each site, and the
location and frequency of the FRM performance evaluations.  The network design components
comply with the regulations stipulated in 40 CFR Part 58 Section 58.13, Appendix A, and
Appendix D and further described in Guidance for Network Design and Optimum Site Exposure
for PM2.5 and PM10.

10.1  Scheduled Project Activities, Including Management Activities

    This element should give anticipated start and completion dates for the project as well as anticipated dates of
major milestones, such as the following:

       schedule of sampling events;
       schedule for analytical services by offsite laboratories;
       schedule for phases of sequential sampling (or testing), if applicable;
       schedule of test or trial runs; and
       schedule for peer review activities.

    The use of bar charts showing time frames of various QAPP activities to identify both potential bottlenecks and
the need for concurrent activities is recommended.
As explained in Section 10.4, Palookaville will be monitoring  PM2 5 concentrations at five
locations using five primary samplers and two QA samplers. The order of installation of the
primary samplers has been determined based on anticipated  PM2 5 concentrations at each of the
locations. The sites with the highest anticipated  PM2 5 concentrations will be installed first, and
the QA samplers will be installed in compliance with the requirements of 40 CFR Part 58
Appendix A. Due to the common practice of burning wood during the winter months within
Palookaville's jurisdiction, it is important to have the samplers installed and operational as early in
the fall/winter as possible.  Table 10-1 represents the activities associated with the ordering and
deployment of the primary and QA PM25 samplers.

 Table 10-1. Schedule of PM2.5 Sampling-Related Activities
Activity                                                   | Due Date                                                              | Comments
Order samplers: 5 sequentials (4 primary, 1 QA),           | March 2, 1998                                                         | Ordered from National contract.
  2 single-days (1 primary, 1 QA)                          |                                                                       |
Receive samplers                                           | July 1, 1998                                                          |
Install sequential sampler at site A1 (1)                  | September 1998                                                        | Contingent upon timely receipt of samplers under National contract.
Install collocated sampler at site A1 (1)                  | September 1998                                                        | Contingent upon timely receipt of samplers under National contract.
Install sequential sampler at site A2 (1)                  | October 1998                                                          | Contingent upon timely receipt of samplers under National contract.
Install single-day sampler at site B1 (1)                  | October 1998                                                          | Contingent upon timely receipt of samplers under National contract.
Install collocated sampler at site B1 (1)                  | October 1998                                                          | Contingent upon timely receipt of samplers under National contract.
Install sequential sampler at site A3 (1)                  | November 1998                                                         | Contingent upon timely receipt of samplers under National contract.
Begin routine sampling at sites A1, A2, A3, and B1 (1)     | January 1, 1999                                                       |
Begin routine sampling at collocated sites A1 and B1 (1)   | January 1, 1999                                                       |
Install sequential sampler at site A4 (1)                  | April 1999                                                            | Deferred to 1999 due to weather and minimal population impact.
Begin routine sampling at site A4                          | January 1, 2000                                                       |
Report routine data to AIRS-AQS                            | Ongoing - due within 90 days after end of quarterly reporting period  | Required according to 40 CFR Part 58, Section 35(c).
FRM Performance Evaluations                                | Ongoing - according to national audit time frame                      | FRM audits not the responsibility of Palookaville. Item included in schedule since some coordination will be required.
Report QA data to AIRS-AQS                                 | Ongoing - due within 90 days after end of quarterly reporting period  | Required according to 40 CFR Part 58, Section 35(c).
Review QA reports generated by AIRS                        | Ongoing                                                               | Needed to determine which, if any, monitors fail bias and/or precision limits.
Primary network review                                     | Annually                                                              | Evaluate reasonableness of siting, CMZ definitions, decommissioning of PM10 monitors, number of samplers.
Evaluate location of collocated sequential sampler         | Annually                                                              | Need to collocate sequential sampler measuring concentrations closest to PM2.5 NAAQS.
(1) Site names/numbers defined in Section 10.4.

 10.2   Rationale for the Design
     The objectives for an environmental study should be formulated in the planning stage of any investigation. The
 requirements and the rationale of the design for the collection of data are derived from the quantitative outputs of the
 DQO Process. The type of design used to collect data depends heavily on the key characteristic being investigated.
 For example, if the purpose of the study is to estimate overall average contamination at a site or location, the
 characteristic (or parameter) of interest would be the mean level of contamination.  This information is identified in
 Step 5 of the DQO Process. The relationship of this parameter to any decision that has to be made from the data
 collected is obtained from Steps 2 and 3 of the DQO Process.
 10.2.1 Primary Samplers

 The primary purpose of the gravimetric mass  PM2 5 ambient air monitoring program operated by
 Palookaville is to measure compliance with national standards for particulates less than or equal to
 2.5 micrometers. These standards are detailed in 40 CFR Part 50, are based on twenty-four hour
 average PM2 5 concentrations, and are summarized as:

    (1) The three-year average of the annual 98th percentiles of  PM25 concentrations at any
        population-oriented monitoring site is not to exceed 65 ug/m3.
    (2) The three-year average of the annual mean of PM25 concentrations is not to exceed 15
        ug/m3.  The average may be based on a single community-oriented monitoring site or may
        be based on the spatial average of community-oriented monitoring sites in a community
        monitoring zone (CMZ).

 Thus the key characteristics being measured are annual 98th percentiles and annual means of
 twenty-four hour average PM2 5 concentrations.
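
For illustration only, the following Python sketch shows the basic form of these two statistics. It deliberately omits the data completeness, rounding, and spatial averaging conventions of 40 CFR Part 50 Appendix N, and the percentile method shown (nearest rank) is an assumption rather than the Appendix N procedure.

    # Simplified calculation of the two PM2.5 design statistics described above.
    # This is only a sketch of the structure, not the regulatory computation.

    def annual_mean(values):
        return sum(values) / len(values)

    def annual_98th_percentile(values):
        # Simple nearest-rank percentile; Appendix N specifies its own method.
        ranked = sorted(values)
        rank = max(1, round(0.98 * len(ranked)))
        return ranked[rank - 1]

    def three_year_average(annual_stats):
        return sum(annual_stats) / len(annual_stats)

    # Hypothetical 24-hour averages (ug/m3) for three calendar years.
    years = {1999: [8.0, 12.5, 40.2, 9.1], 2000: [10.3, 14.8, 22.0], 2001: [7.7, 11.2, 30.5]}
    means = [annual_mean(v) for v in years.values()]
    p98s = [annual_98th_percentile(v) for v in years.values()]
    print(three_year_average(means) <= 15.0)   # annual standard test (sketch)
    print(three_year_average(p98s) <= 65.0)    # 24-hour standard test (sketch)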

 To determine whether these characteristics are quantified with sufficient confidence, Palookaville
 must address sampler type, sampling frequency, and sampler siting. By employing FRM/FEM
 samplers, Palookaville is assured of measuring the PM2.5 concentrations as well as possible
 with regard to evaluating compliance with the PM2.5 NAAQS.  By complying with the sampling
 frequency requirements of 40 CFR Part 58 Section 58.13, Palookaville assumes that the sampling
frequency is sufficient to attain the desired confidence in the annual 98th percentile and annual
mean of PM2 5 concentrations in the vicinity of each monitor.  By selecting sampler locations
using the rules in 40 CFR Part 58 Appendix D, Palookaville can be confident that the  PM2 5
concentrations within its jurisdiction are adequately characterized. Sampler type, frequency, and
siting are further described in section  10.4.

10.2.2 QA Samplers
The purpose of collocated samplers and the FRM performance evaluation is to estimate the
precision and bias of the various PM25 samplers. The DQOs developed in  Section 7.0 state that,
for a 3-year period, the concentrations measured by a sampler must be within ฑ10% of the true
concentration as measured by an FRM sampler and that the coefficient of variation of the relative
differences must be less than 10%. These levels of bias and precision need to be accomplished so
that decision makers can make decisions about attainment and/or non-attainment of the PM 2 5
NAAQS with sufficient confidence. To estimate the level of bias and precision being achieved in
the field, some of the sites will operate collocated samplers and some of the sites will be audited
using FRM samplers.  If a sampler is operating within the required bias and precision levels, then
the decision maker can proceed knowing that the decisions will be supported by  unambiguous
data.  If, however, a sampler exceeds either the bias limits or the precision limits or both, then the
decision maker cannot use the data to make decisions at the desired level of confidence and
corrective action must be implemented to ensure that future data collected by the sampler does
meet the bias and precision limits.  Thus the key characteristics being measured with the QA
samplers are bias and precision.

To determine whether these characteristics are measured with sufficient confidence, Palookaville
must address sampler  type, sampling frequency, and sampler siting for the QA network.  As with
the primary  PM2 5 network, by using FRM/FEM samplers, maintaining the  sampling frequency
specified in 40 CFR Part 58 Appendix A, and collocating the number of samplers as specified  in
40 CFR Part 58 Appendix A, Palookaville assumes its QA  network will measure bias and
precision with  sufficient confidence. These issues are described in more detail in section 10.4.
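
A simplified illustration of these two checks is sketched below in Python. The statistics actually prescribed in 40 CFR Part 58 Appendix A are more involved; here bias is shown simply as the average relative difference against the FRM performance evaluation value, and precision as the standard deviation of the percent differences between collocated pairs, used as the coefficient of variation. Function names and data are hypothetical.

    import statistics

    def relative_differences(primary, reference):
        """Percent relative difference for each paired 24-hour value."""
        return [100.0 * (p - r) / r for p, r in zip(primary, reference)]

    def bias_percent(sampler_values, frm_audit_values):
        """Average relative difference against the FRM performance evaluation."""
        return statistics.mean(relative_differences(sampler_values, frm_audit_values))

    def precision_cv(primary_values, collocated_values):
        """Standard deviation (%) of the relative differences between a
        primary sampler and its collocated sampler (used here as the CV)."""
        return statistics.stdev(relative_differences(primary_values, collocated_values))

    primary = [14.2, 22.8, 9.6, 31.0]
    collocated = [13.8, 23.5, 9.9, 30.1]
    frm = [14.0, 22.0, 10.0, 30.5]
    print(abs(bias_percent(primary, frm)) <= 10.0)   # DQO: bias within +/-10%
    print(precision_cv(primary, collocated) < 10.0)  # DQO: CV of differences < 10%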

10.3   Design Assumptions
    The planning process usually recommends a specific data collection method (Step 7 of the DQO Process), but the
 effectiveness of this methodology rests firmly on assumptions made to establish the data collection design. Typical
 assumptions include the homogeneity of the medium to be sampled (for example, sludge, fine silt, or wastewater
 effluent), the independence in the collection of individual samples (for example, four separate samples rather than four
 aliquot derived from a single sample), and the stability of the conditions during sample collection (for example, the
 effects of a rainstorm during collection of wastewater from an industrial plant). The assumptions should have been
 considered during the DQO Process and should be summarized together with a contingency plan to account for
 exceptions to the proposed sampling plan. An important part of the contingency plan is documenting the procedures
 to be adopted in reporting deviations or anomalies observed after the data collection has been completed. Examples
 include an extreme lack of homogeneity within a physical sample or the presence of analytes that were not mentioned
 in the original sampling plan. Chapter 1 of EPA QA/G-9 provides an overview of sampling plans and the assumptions
 needed for their implementation, and EPA QA/G-5S provides more detailed guidance on the construction of sampling
 plans to meet the requirements generated by the DQO Process.
 The sampling design is based on the assumption that following the rules and guidance provided in
40 CFR and Guidance for Network Design and Optimum Site Exposure for PM2.5 and PM10 will
result in data that can be used to measure compliance with the national standards. The only issue
at Palookaville's discretion is the sampler siting, and to a degree, the sampling frequency.  The siting
assumes homogeneity of PM2.5 concentrations within CMZs and heterogeneity between CMZs.
MPA and CMZ boundaries will be regularly reviewed, as part of the network reviews (Section
20).  The basis for creating and revising the boundaries is described in the following section.


10.4   Procedure for Locating and Selecting Environmental Samples


    The most appropriate plan for a particular sampling application will depend on: the practicality and feasibility
(e.g., determining specific sampling locations) of the plan, the key characteristic (the parameter established in Step 5
of the DQO Process) to be estimated, and the implementation resource requirements (e.g., the costs of sample
collection, transportation, and analysis).

    This element of the QAPP should also describe the frequency of sampling and specific sample locations (e.g.,
sample port locations and traverses for emissions source testing, well installation designs for groundwater
investigations) and sampling materials. Sometimes decisions on the number and location of samples will be made in
the field; therefore, the QAPP should describe how these decisions will be driven whether by actual observations or by
field screening data. When locational data are to be collected, stored, and transmitted, the methodology used must be
specified and described (or referenced) and include the following:

    •  procedures for finding prescribed sample locations,
    •  contingencies for cases where prescribed locations are inaccessible,
    •  location bias and its assessment, and
    •  procedures for reporting deviations from the sampling plan.

    When appropriate, a map of the sample locations should be provided and locational map coordinates supplied.
EPA QA/G-5S provides nonmandatory guidance on the practicality of constructing sampling plans and references to
alternative sampling procedures.
10.4.1 Primary Samplers

The design of the SLAMS PM2 5 network must achieve one of six basic monitoring objectives, as
described in 40 CFR Part 58, Appendix D.  These are:

  (1) To determine the highest concentrations expected to occur in the area covered by the
      network.
  (2) To determine representative concentrations in areas of high population density.
  (3) To determine the impact on ambient pollution levels of significant sources or source
      categories.
  (4) To determine general background concentration levels.
  (5) To determine the extent of Regional pollution transport among populated areas.
  (6) In support of secondary standards, to determine the welfare-related impacts in more rural
      and remote areas.

 The procedure for siting the PM2 5 samplers to achieve the six basic objectives is based on
 judgmental sampling, as is the case for most ambient air monitoring networks. Judgmental
 sampling uses data from existing monitoring networks, knowledge of source emissions and
 population distribution, and inference from analyses of meteorology to select optimal sampler
 locations.

 Palookaville is responsible for monitoring air quality for Parsley, Sage, Rosemary, and Thyme
 counties in California. The number of SLAMS sites where gravimetric mass PM2.5 monitoring
 will occur and their locations were determined based upon the information provided in 40 CFR Part
 58 Appendix D and in Guidance for Network Design and Optimum Site Exposure for PM2.5 and
 PM10. Specifically, the following steps were used to define the Monitoring Planning Areas
 (MPAs), to define the community monitoring zones (CMZs), and to site the monitors.

 10.4.2 Primary Samplers - Defining MPAs

 Two MPAs were identified within Palookaville's four counties. The boundaries were determined
 based on (1) the 1990 census data by census tract, (2) the boundaries of the existing MSAs, and
 (3) the surrounding geography. Figure 10-1 shows the geography, city centers, and current PM10
 monitoring locations for the region for which Palookaville is responsible. One of the MPAs
 corresponds to the Scarborough Metropolitan Statistical Area (MSA).  According to the 1990
 census, the Scarborough MSA, which is located entirely in Parsley County, has a population of
 357,420.  The population is evenly distributed through the MSA except in the downtown area.
 As a result, the boundaries of the MPA are taken to be those of the MSA, that is, the county
 boundary.

 Figure 10-1. Geography, population centers, and PM10 sampler locations for Palookaville

The second MPA is centered around Franklinton, a town located in Thyme County. Four census
tracts in Franklinton contain over 50,000 people. These four tracts were identified as a unique
MPA because (1) the population is dense, (2) Franklinton is home to a pulp mill, and
(3) Franklinton is located in a depression, surrounded by mountains. The boundary of the
Franklinton MPA is the boundary of the four census tracts.

Since there are no other concentrated population centers in Palookaville's jurisdiction,  no other
MPAs were identified.

10.4.3 Primary Samplers - Defining CMZs

Specific CMZ definitions are needed only when spatial averaging is to be used, according to the
Guidance for Network Design and Optimum Site Exposure for PM2.5 and PM10. Since spatial
averaging is to be used in Scarborough, the Scarborough MPA was divided into CMZs.

Within the Scarborough MPA, the major industries include transportation, commerce, and
tourism (skiing in the nearby mountains during the winter), and these industries are spread fairly
evenly through the MPA. A study of the point sources in the area confirms that the emission
sources are evenly spread, as is the transportation as indicated by traffic studies. However, a
study of the wind patterns using wind roses indicates that winds predominately blow from the
Northwest to the Southeast. This is of particular concern because of the potential transport of
particulate matter from the mill operations in Franklinton into the Northwestern portion of the
Scarborough  MPA. Review of the data from the current PM10 monitoring sites adds support to
this concern in that the daily PMIO concentrations at site PI are generally higher than those at
either P2 or P3. As a result, the Scarborough MPA was divided into two CMZs,  one area to the
Northwest of the downtown area that is likely to be influenced by transport from Franklinton, and
the other area being the remainder of the MPA. The names for the CMZs are NW Scarborough
and Scarborough. This division of the Scarborough MPA will need to be reviewed as PM2.5 data are
collected to determine if the two CMZs can be combined into one.

10.4.4 Primary Samplers - Siting Monitors

As mentioned previously, the procedure for siting the PM25 samplers is based on judgmental
sampling. Palookaville requires five PM25 sites to characterize adequately the aerosol in the four
counties for which it is responsible to monitor air quality. Three of the monitors will be located in
the Scarborough MPA, one in Franklinton, and one in Sage County. Figure 10-2 shows a map of
the locations of the SLAMS sites for PM25, where an "A" indicates a site using a sequential
sampler and "B" indicates a site using a single-day sampler.

 Figure 10-2. PM2.5 sites for Palookaville

Sage County consists predominantly of federal lands with a low population. One PM2.5
site, A4, will be established in this county to monitor regional background PM2.5 concentrations and
is therefore representative of a regional scale.  This is a new monitoring site.

One PM2.5 site, A3, will be located in Franklinton to quantify the neighborhood-scale exposures in
the area. There currently is a PM10 monitor in Franklinton, and the PM2.5 monitor will be
collocated with the PM10 monitor. The data from these two monitors will be reviewed after one
year of collection to see if the PM10 site can be de-commissioned. The decision on whether to de-
commission the PM10 site will be based on discussions with Region Y.
Three sites will be used in the Scarborough MPA.  Al, located in the northwestern part of the
MPA, will be sited to answer concerns about possible transport from Franklinton. The site
location will correspond to that for the PM10 sampler, PI. The core site, named A2 and located
just downwind of the downtown area, will be collocated on a platform that currently has both a CO
sampler and a PM10 sampler. The information gained by having the collocated samplers will be
invaluable.  The third site, named B1 and located further downwind of downtown, represents a
neighborhood scale and also will address possible questions about transport out of Scarborough.
Sites A1, A2, and B1 all represent neighborhood-scale sites.
The site in Sage county will be designated a NAMS site and the other four sites will be designated
SLAMS.

The network design as described meets all six of the basic monitoring objectives:

    (1) The highest concentrations are expected to occur at or near site Al.
    (2) The monitoring network has been designed to characterize the PM2 5 concentrations in the
       high population centers, those being Scarborough and Franklinton.
    (3) The significant sources of PM2 5 are anticipated to be the mill operations, wood burning,
       and automotive activities. Site A3 will determine the impact of the mill operations and
       sites Al, A2, A3, and Bl will determine both the impact of wood burning and automobile-
       related activities.
    (4) Site A4 is expected to experience background PM2 5 concentrations.
    (5) Site A1 will measure transport from Franklinton into Scarborough and B1 may provide
       some information about the transport out of Scarborough.
    (6) Site A3 is a rural site and A4 is a remote area.

10.4.5 Primary Samplers - Review of MPA and CMZ Definitions

The number of MPAs and the MPA boundaries will be regularly reviewed as part of the network
review (Section 20). The number and boundaries of MPAs will be reviewed and potentially
revised as new census data become available or in the event that MSA definitions change.

The CMZ definitions will also be reviewed as part of the network review (Section 20). In
particular, the division of the Scarborough MPA will be reviewed as PM2 5 data are collected to
determine if the two CMZs can be combined into one or if the Scarborough CMZ should be
further subdivided. The review will be based on actual data collected and a review of emission
sources in the area.  According to 40 CFR Part 58 Appendix D Section 2.8.1.6, the annual
average air quality is sufficiently homogenous, that is, monitors may be averaged for comparison
with the annual PM2 5 NAAQS, provided:

    (1) the average concentrations at individual sites do not exceed the spatial average by more
       than 20 percent,
    (2) the monitoring sites exhibit similar day  to day variability, and
    (3) all sites in the CMZ are affected by the same major emission sources of PM2 5.

To address these three issues, Palookaville will use the following five-step procedure, which is
based on the information in Guidance for Network Design and Optimum Site Exposure for PM2.5
and PM10.

    (1) Determine if the average concentration at sites Al, A2, and Bl are within 20 percent of
       the spatial average.  The calculations for achieving this are provided in detail in 40  CFR
       Part 50 Appendix N. Decision: if Al differs from the spatial average by more than 20
       percent, then the NW Scarborough CMZ and the Scarborough CMZ should not be
      combined into one. If Al is similar to the spatial average, the CMZs possibly should be
      combined, based on the information from steps 3, 4, and 5.

   (2) Determine if the average concentration at sites A2 and Bl are within 20 percent of the
      spatial average.  The calculations for achieving this are provided in detail in 40 CFR Part
      50 Appendix N. Decision: if A2 and Bl differ by more than 20 percent, then the
      Scarborough CMZ should be split into two CMZs.  If they do not differ, the air in the
      Scarborough CMZ is possibly homogenous, but steps 3, 4, and 5 should be taken to verify
      further.

   (3) Determine if the monitoring sites exhibit similar day to day variability.  To accomplish this,
      calculate the correlation coefficient between the concentrations measured at Al and Bl,
      between Al and A2, and between Bl and A2. In general, the correlation coefficient
      between site X and site Y is calculated using all days for which concentrations exist for
      BOTH sites, using the following equation:
                  r = \frac{\sum_{i=1}^{n} X_i Y_i \;-\; \frac{\left(\sum_{i=1}^{n} X_i\right)\left(\sum_{i=1}^{n} Y_i\right)}{n}}
                           {\sqrt{\left(\sum_{i=1}^{n} X_i^2 - \frac{\left(\sum_{i=1}^{n} X_i\right)^2}{n}\right)
                                  \left(\sum_{i=1}^{n} Y_i^2 - \frac{\left(\sum_{i=1}^{n} Y_i\right)^2}{n}\right)}}

        where all summations are for i = 1, 2, ..., n and n is the number of days for which both site
       X and site Y have data. If the correlation coefficient is greater than 0.6, then we conclude
       that the sites exhibit similar day to day variability. Decision: if the correlation coefficient
       between Al and either A2 or Bl is less than 0.6, this indicates that the NW Scarborough
       and Scarborough CMZs should remain separate.  If the correlation coefficient between A2
       and Bl is less than 0.6, this indicates that the Scarborough CMZ should be split into two
       CMZs.   If the correlation coefficients are greater than 0.6, this adds support to the idea
       that the ambient air is homogenous, but steps 4 and 5 should be taken to verify  further.

   (4) Review the location of existing and new emission sources. Decision: if an emission source
       is located close to a monitoring site, then the site does not represent a neighborhood scale,
       hence is not eligible for spatial averaging.

   (5) Review any data from speciation monitors or air quality models. Decision: if the emission
       profiles look similar near each of the monitors, then we will conclude that the sites are
       impacted by the same major sources of emissions.

The information from these five steps will be used to determine how homogenous the air is and
what the appropriate CMZ boundaries are. Preliminary assessments will be made on an annual
basis, but three years of PM2 5 air quality data are required before a final evaluation can be made.
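
The first three steps lend themselves to a simple calculation once a year of data is available. The sketch below applies the 20 percent test and the correlation test to hypothetical data for illustration; it ignores the completeness and rounding conventions of 40 CFR Part 50 Appendix N, and the helper names are not part of any Department procedure.

    import math

    def within_20_percent(site_means):
        """Steps 1 and 2: every site annual mean within 20% of the spatial average."""
        spatial_avg = sum(site_means.values()) / len(site_means)
        return all(abs(m - spatial_avg) <= 0.20 * spatial_avg for m in site_means.values())

    def correlation(x, y):
        """Step 3: correlation coefficient over days with data at BOTH sites
        (pairs with a missing value should be dropped before calling this)."""
        n = len(x)
        sx, sy = sum(x), sum(y)
        sxy = sum(a * b for a, b in zip(x, y))
        sxx, syy = sum(a * a for a in x), sum(b * b for b in y)
        return (sxy - sx * sy / n) / math.sqrt((sxx - sx**2 / n) * (syy - sy**2 / n))

    # Hypothetical annual means (ug/m3) and paired daily values for two sites.
    print(within_20_percent({"A1": 17.2, "A2": 14.0, "B1": 13.1}))
    print(correlation([12.0, 25.5, 9.8, 30.1], [11.4, 24.0, 10.5, 28.8]) > 0.6)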

 10.4.6 Primary Samplers - Sampling Frequency
According to 40 CFR Part 58 Section 58.13 and Appendix D, the required sampling frequency for
the samplers operated by Palookaville is once every three days.  Hence, all samplers will operate
on a 1-in-3 day sampling frequency, except possibly during critical times in the wood-burning
months, in which case daily sampling may be conducted at the core site and at the
transport site between Franklinton and Scarborough.

10.4.7 Primary Samplers - Types of Samplers

Of the five PM25 samplers that Palookaville will be operating, four of them will be sequential
samplers and one will be a single-day sampler.  Palookaville would prefer to purchase all
sequential samplers to accommodate potential increases in sampling frequency; however, due to
budget constraints, only four of the five will be sequential. All samplers will be FRM or FEM.

Since the single-day sampler must be visited by field personnel at least every three days to collect
the used filter and load a new one, the single-day sampler will be placed in Scarborough, for ease
of access. In particular, the location most downwind from the downtown area of Scarborough,
site Bl, will have the single-day sampler. By operating sequential samplers at the regional
background site and at the Franklinton site,  field personnel will have to visit these sites only every
six days, as shown in Table  10-2. Such a schedule will minimize field costs because the operator
must visit the site only once every six days instead of after every sample, as would be the case
with a single-day sampler. In addition,  the sample recovery date is staggered in order to
eliminate weekend activities. This schedule complies with 40 CFR Part 50 Appendix L, which
stipulates that filters remain in the sampler for no more than 96 hours after sampling. The core site
located in downtown Scarborough, site A2, and the transport site located in the northwestern part
of Scarborough, site Al, will both have sequential samplers to expedite possible increases in the
sampling frequency.
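
As an illustration of the scheduling constraint described above, the short sketch below generates one-in-three-day run dates and, for each run, the latest allowable recovery time under the 96-hour limit. The start date, the assumption that the 96-hour clock begins when the 24-hour sample period ends, and the function name are illustrative only.

    from datetime import datetime, timedelta

    def one_in_three_schedule(first_run_start: datetime, n_samples: int):
        """Run start dates on a 1-in-3 day frequency and the latest recovery
        time for each run, assuming the 96-hour clock of 40 CFR Part 50
        Appendix L starts when the 24-hour sample period ends (an assumed
        interpretation used here only for illustration)."""
        schedule = []
        for i in range(n_samples):
            run_start = first_run_start + timedelta(days=3 * i)
            run_end = run_start + timedelta(hours=24)
            latest_recovery = run_end + timedelta(hours=96)
            schedule.append((run_start.date(), latest_recovery))
        return schedule

    for run_date, recover_by in one_in_three_schedule(datetime(1999, 1, 3), 5):
        print(run_date.isoformat(), "recover by", recover_by.isoformat())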

10.4.8 Primary Sampling - Other PM2.5 Monitoring

This network design does not include any special purpose monitoring or monitoring that will
provide speciated data.  Special purpose and speciation monitoring are extremely important
components of the  PM2 5 monitoring network. However, this document will not address these
two components. They will be addressed in a separate document.

Table 10-2. Sample Set-up, Run and Recovery Dates
Sample Frequency | Sampler Type | Sunday       | Monday            | Tuesday           | Wednesday         | Thursday          | Friday            | Saturday
1 in 3, Week 1   | Multiple Day | Sample Day 1 |                   |                   | Sample Day 2      | Recovery & Set-up |                   | Sample Day 3
1 in 3, Week 2   | Multiple Day |              |                   | Sample Day 4      | Recovery & Set-up |                   | Sample Day 5      |
1 in 3, Week 3   | Multiple Day |              | Sample Day 6      | Recovery & Set-up |                   | Sample Day 7      |                   |
1 in 3, Week 4   | Multiple Day | Sample Day 8 | Recovery & Set-up |                   | Sample Day 9      | Recovery & Set-up |                   | Sample Day 10
1 in 3, Week 5   | Multiple Day |              |                   | Sample Day 11     | Recovery & Set-up |                   | Sample Day 12     |
1 in 3, Week 6   | Multiple Day |              | Sample Day 13     | Recovery & Set-up |                   | Sample Day 14     |                   |
1 in 3, Week 1   | Single Day   | Sample Day 1 | Recovery & Set-up |                   | Sample Day 2      | Recovery & Set-up |                   | Sample Day 3
1 in 3, Week 2   | Single Day   |              | Recovery & Set-up | Sample Day 4      | Recovery & Set-up |                   | Sample Day 5      | Recovery & Set-up
1 in 3, Week 3   | Single Day   |              | Sample Day 6      | Recovery & Set-up |                   | Sample Day 7      | Recovery & Set-up |

10.4.9 QA Samplers

According to the primary PM2.5 network design, Palookaville will deploy and operate one site
using a single-day sampler and four sites using sequential samplers. According to 40 CFR Part
58, Appendix A, Section 3.5.2, for each method designation, at least 25% (minimum of one) of
the samplers must be collocated. As a result, Palookaville must collocate the single-day sampler,
since a single-day sampler will have a different designation than the sequential samplers.  Also,
since there are four sequential samplers, all of the same designation, Palookaville must operate one
site (25% of 4) with a collocated sequential sampler, and that site will be the one most
likely to be in violation of the PM2.5 NAAQS. Based on the data collected by the PM10 network,
it is assumed that the site most likely to monitor concentrations at or above the PM2.5 NAAQS is
site A1 in the northwestern part of Scarborough. However, as data from the PM2.5 network
become available, the data will be reviewed on an annual basis to determine whether a different site
operating a sequential sampler is more appropriate for collocation.  The two collocated samplers
will be operated on a six-day sampling schedule, regardless of the sampling frequency of the
primary samplers, and will coincide with the sampling run time of the primary sampler so that the
primary and collocated samplers are operating on the same days.  Section 14.1.3 discusses this
precision check in more detail.
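
The collocation arithmetic above follows directly from the 25%, minimum-of-one rule; the short
sketch below is a hypothetical illustration of that calculation (not an EPA or Palookaville tool), with
the percentage rounded up to a whole sampler.

```python
import math

# Hypothetical illustration of 40 CFR Part 58, Appendix A, Section 3.5.2 as applied above:
# at least 25% (minimum of one) of the samplers of each method designation must be collocated.

def collocated_required(primary_count, fraction=0.25, minimum=1):
    return max(minimum, math.ceil(primary_count * fraction))

network = {"single-day FRM": 1, "sequential FRM": 4}   # Palookaville's two designations
for designation, count in network.items():
    print(designation, "->", collocated_required(count), "collocated sampler(s)")
```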
A complementary method for estimating bias and precision is the FRM Performance Evaluation.
Even though Palookaville is not responsible for performing these evaluations, it is important for Palookaville to

-------
                                                                          Project: Model QAPP
                                                                               Element No: 10
                                                                               Revision No: 1
                                                                                Date: 4/18/98
	Page 13 of 14

recognize that these evaluations will be performed.  First, Palookaville will need to coordinate
with the Regional QA Coordinator to provide access to the sites and offer other needed support.
Secondly, the performance evaluation data will be reviewed by Palookaville. According to 40
CFR Part 58, Appendix A, Section 3.5.3, the number of Palookaville sites to be evaluated
annually is two, at a frequency of four times during the year.  The number of sites is two
for the same reason that the number of collocated sites is two: at least 25% of the samplers
(minimum of one) for each method designation within a reporting organization must be audited each
year.  Hence, the primary single-day sampler will be evaluated each year and one of the primary
sequential samplers will be evaluated each year.

Table 10-3 provides some basic information about each of the seven gravimetric mass PM2.5
samplers to be operated by Palookaville. Latitudes and longitudes will be recorded using a global
positioning instrument that meets the EPA locational data policy goal of 25-meter accuracy.
Table 10-3. Identifying Information for Palookaville PM2.5 Samplers

Primary Samplers
  A1: AIRS ID 060021125811041; Lat 38.9; Long 120.5; Sampling Frequency: 1 in 3, daily during PM2.5 episodes;
      Scale of Representativeness: Neighborhood; Type of Monitor: Sequential; MPA: Scarborough; CMZ: NW Scarborough; Standard: Annual and 24-hour
  A2: AIRS ID 060021245811041; Lat 39.1; Long 120.0; Sampling Frequency: 1 in 3, daily during PM2.5 episodes;
      Scale of Representativeness: Neighborhood; Type of Monitor: Sequential; MPA: Scarborough; CMZ: Scarborough; Standard: Annual and 24-hour
  A3: AIRS ID 060030125811041; Lat 38.5; Long 120.9; Sampling Frequency: 1 in 3;
      Scale of Representativeness: Neighborhood; Type of Monitor: Sequential; MPA: Franklinton; CMZ: N/A; Standard: Annual and 24-hour
  A4: AIRS ID 060051625811041; Lat 39.5; Long 121.1; Sampling Frequency: 1 in 3;
      Scale of Representativeness: Regional; Type of Monitor: Sequential; MPA: N/A; CMZ: N/A; Standard: Annual and 24-hour
  B1: AIRS ID 060021126811041; Lat 38.9; Long 119.6; Sampling Frequency: 1 in 3;
      Scale of Representativeness: Neighborhood; Type of Monitor: Single Filter; MPA: Scarborough; CMZ: Scarborough; Standard: Annual and 24-hour

QA Samplers
  A1: AIRS ID 060021125811049; Lat 38.9; Long 120.5; Sampling Frequency: 1 in 6; Type of Monitor: Sequential
  B1: AIRS ID 060021126811049; Lat 38.9; Long 119.6; Sampling Frequency: 1 in 6; Type of Monitor: Single Filter




-------
                                                                               Project: Model QAPP
                                                                                    Element No: 10
                                                                                     Revision No: 1
                                                                                     Date: 4/18/98
                                                                              	Page 14 of 14
10.5  Classification of Measurements as Critical/Noncritical
    All measurements should be classified as critical (i.e., required to achieve project objectives or limits on decision
errors, Step 6 of the DQO Process) or noncritical (for informational purposes only or needed to provide background
information). Critical measurements will undergo closer scrutiny during the data gathering and review processes and
will have first claim on limited budget resources. It is also possible to include the expected number of samples to be
tested by each procedure and the acceptance criteria for QC checks (as described in element B5, "Quality Control
Requirements").
10.5.1 Primary Samplers

The critical information collected at the primary samplers is that specified in Table 6-2 that will be
provided to AIRS. Also critical is the site information such as the MPA, CMZ (if applicable), and
the NAAQS to which the data will be compared. These data are critical because they are necessary
for determining compliance with the PM2.5 standards.

10.5.2 QA Samplers

The critical information collected at collocated samplers is the same as that presented in Table 6-2
of Section 6.2.1 for primary samplers. All of the measurements in Table 6-2 are considered critical
because they form the basis for estimating bias and precision which are critical for evaluating the
ability of the decision makers to make decisions at desired levels of confidence. The
measurements described in Table 6-3 will also be collected for the collocated samplers. With the
exception of the filter integrity flags, these measurements are considered to be noncritical.

10.6   Validation of Any Non-Standard  Measurements


    For nonstandard sampling methods, sample matrices, or other unusual situations, appropriate method validation
study information may be needed to confirm the performance of the method for the particular matrix. The purpose of
this validation information is to assess the potential impact on the representativeness of the data generated. For
example, if qualitative data are needed from a modified method, rigorous validation may not be necessary. Such
validation studies may include round-robin studies performed by EPA or by other organizations. If previous validation
studies are not available, some level of single-user validation study or ruggedness study should be performed during
the project and included as part of the project's final report. This element of the QAPP should clearly reference any
available validation study information.
Since Palookaville is deploying only FRMs/FEMs and will be operating them according to
Guidance Document 2.12, there will not be any non-standard measurements from either the
primary or QA samplers.  Also, since Palookaville will be sending its filters to a certified
laboratory for weighing, there will not be any non-standard measurements from the analysis of the
filters.  Hence, all sampling and analysis measurements will be standard.

-------
                                                                                Project: Model QAPP
                                                                                     Element No: 11
                                                                                      Revision No: 1
                                                                                       Date: 4/17/98
                                                                                	Page 1 of 10
                       11.0 Sampling Methods Requirements
    Environmental samples should reflect the target population and parameters of interest. As with all other
considerations involving environmental measurements, sampling methods should be chosen with respect to the
intended application of the data. Just as methods of analysis vary in accordance with project needs, sampling
methods can also vary according to these requirements. Different sampling methods have different operational
characteristics, such as cost, difficulty, and necessary equipment. In addition, the sampling method can materially
affect the representativeness, comparability, bias, and precision of the final analytical result.

    In the area of environmental sampling, there exists a great variety of sample types. It is beyond the scope of this
document to provide detailed advice for each sampling situation and sample type. Nevertheless, it is possible to
define certain common elements that are pertinent to many sampling situations with discrete samples (see EPA
QA/G-5S).

    If a separate sampling and analysis plan is required or created for the project, it should be included as an
appendix to the QAPP. The QAPP should simply refer to the appropriate portions of the sampling and analysis plan
for the pertinent information and not reiterate that information.
11.1   Purpose/Background

This method provides for measurement of the mass concentration of fine particulate matter having
an aerodynamic diameter less than or equal to a nominal 2.5 micrometers (PM2.5) in ambient air
over a 24-hour period for purposes of determining whether the primary and secondary national
ambient air quality standards for particulate matter specified in 40 CFR Part 50.6 are met.  The
measurement process is considered to be non-destructive, and the PM2.5 sample obtained can be
subjected to subsequent physical or chemical analyses.

-------
                                                                                   Project- Model QAPP
                                                                                         Element No: 11
                                                                                          Revision No:l
                                                                                           Date: 4/17/98
                                                                                   	Page 2 of 10
11.2   Sample Collection and Preparation
 (1) Select and describe appropriate sampling methods from the appropriate compendia of methods. For each
     parameter within each sampling situation, identify appropriate sampling methods from applicable EPA
     regulations, compendia of methods, or other sources of methods that have been approved by EPA.  When EPA-
     sanctioned procedures are available, they will usually be selected. When EPA-sanctioned procedures are not
     available, standard procedures from other organizations and disciplines may be used. A complete description of
     non-EPA methods should be provided in (or attached to) the QAPP.  Procedures for sample homogenization of
     nonaqueous matrices may be described in part (2) as a technique for assuring sample representativeness. In
     addition, the QAPP should specify the type of sample to be collected (e.g., grab, composite, depth-integrated,
     flow-weighted) together with the method of sample preservation.

 (2) Discuss sampling methods' requirements. Each medium or contaminant matrix has its own characteristics that
     define the method performance and the type  of material to be sampled. Investigators should address the
     following:

     •     actual sampling locations,
     •     choice of sampling method/collection,
     •     delineation of a properly shaped sample,
     •     inclusion of all particles within the volume sampled, and
     •     correct subsampling to reduce the representative field sample into a representative laboratory aliquot.

     Having identified appropriate and applicable methods, it is necessary to include the requirements for each
     method in the QAPP. If there is more than one acceptable sampling method applicable to a particular situation,
     it may be necessary to choose one from among them.  DQOs should be considered in choosing these methods to
     ensure that: a) the sample accurately represents the portion of the environment to be characterized, b) the sample
     is of sufficient volume to support the planned chemical analysis, and c) the sample remains stable during
     shipping and handling.

 (3) Describe the decontamination procedures and materials. Decontamination is primarily applicable in situations
     of sample acquisition from solid, semi-solid, or liquid media, but it should be addressed, if applicable,  for
     continuous monitors as well. The investigator must consider the appropriateness of the decontamination
     procedures for the project at hand. For example, if contaminants are present in the environmental matrix at the
     1% level, it is probably unnecessary to clean sampling equipment to parts-per-billion (ppb) levels. Conversely,
     if ppb-level detection is required, rigorous decontamination or the use of disposable equipment is required.
     Decontamination by-products must be disposed of according to EPA policies and the applicable rules and
     regulations that would pertain to a particular situation, such as the regulations of OSHA, the Nuclear Regulatory
     Commission (NRC), and State and local governments.
FRM samplers will be used for collection of PM2.5 samples for comparison to the
NAAQS.  In the Palookaville network, two models of the XYZ sampler are employed.  The
XYZ sampler model 1000 is a single day sampler that meets FRM designation. The XYZ sampler
model 2000 is a multiple day sampler that meets FRM designation.  Each model of sampler shall be
installed with adherence to the procedures, guidance, and requirements detailed in 40 CFR Parts 50 (Ref. 1),
53 and 58 (Ref. 2); Section 2.12 of the QA Handbook (Ref. 3); the sampler manufacturer's operation
manuals (Refs. 4 and 5); Palookaville's Field SOPs (Ref. 6); and this QAPP.

-------
                                                                        Project: Model QAPP
                                                                             Element No: 11
                                                                              Revision No: 1
                                                                               Date: 4/17/98
                                                                        	Page 3 of 10
11.2.1 Sample Set-up
Sample set-up of the FRM or equivalent sampler in the Palookaville network takes place any day
after the previous sample has been recovered. For multiple day samplers, two sample days may
be set up when 1 in 3 day sampling is required.  It is important to recognize that the only holding
time that affects sample set-up is the 30-day window from the time a filter is pre-weighed to the
time it is installed in the monitor. At collocated sites, the second monitor will be set up to run at a
sample frequency of 1 in 6 days; however, sample set-up will take place on the same day as the
primary sampler. Detailed sample set-up procedures are available in the Palookaville PM2.5
sampling methods standard operating procedure.

11.2.2 Sample Recovery

Sample recovery of any individual filter from the FRM or equivalent sampler in the Palookaville
network must occur within 96 hours of the end of the sample period for that filter.  For 1 in 3 day
sampling on single day samplers, this will normally be the day after a sample is taken. The next
sample would also be set up at this time.  For 1 in 3 day sampling on multiple day samplers, this
will normally be on the day after the second sample is taken. The next sample set-up for two
samples would also take place on this day. At collocated sites, the sample from the second
monitor will be recovered on the same day as the primary sampler. Sample recovery procedures
are detailed in the Palookaville PM2.5 sampling methods standard operating procedure.  Table 11-1
illustrates sample set-up, sample run, and sample recovery dates based upon sample frequency
requirements of 1 in 3 day sampling.

Table 11-1  Sample Set-up, Run and Recovery Dates

1 in 3, Week 1, Multiple Day:  Sunday - Sample Day 1;  Wednesday - Sample Day 2;  Thursday - Recovery & Set-up;  Saturday - Sample Day 3
1 in 3, Week 2, Multiple Day:  Tuesday - Sample Day 4;  Wednesday - Recovery & Set-up;  Friday - Sample Day 5
1 in 3, Week 3, Multiple Day:  Monday - Sample Day 6;  Tuesday - Recovery & Set-up;  Thursday - Sample Day 7
1 in 3, Week 4, Multiple Day:  Sunday - Sample Day 8;  Monday - Recovery & Set-up;  Wednesday - Sample Day 9;  Thursday - Recovery & Set-up;  Saturday - Sample Day 10
1 in 3, Week 5, Multiple Day:  Tuesday - Sample Day 11;  Wednesday - Recovery & Set-up;  Friday - Sample Day 12
1 in 3, Week 6, Multiple Day:  Monday - Sample Day 13;  Tuesday - Recovery & Set-up;  Thursday - Sample Day 14
1 in 3, Week 1, Single Day:    Sunday - Sample Day 1;  Monday - Recovery & Set-up;  Wednesday - Sample Day 2;  Thursday - Recovery & Set-up;  Saturday - Sample Day 3
1 in 3, Week 2, Single Day:    Monday - Recovery & Set-up;  Tuesday - Sample Day 4;  Wednesday - Recovery & Set-up;  Friday - Sample Day 5;  Saturday - Recovery & Set-up
1 in 3, Week 3, Single Day:    Monday - Sample Day 6;  Tuesday - Recovery & Set-up;  Thursday - Sample Day 7;  Friday - Recovery & Set-up


-------
                                                                              Project: Model QAPP
                                                                                  Element No: 11
                                                                                   Revision No:l
                                                                                    Date: 4/17/98
                                                                             	Page 4 of 10
Therefore, sites that utilize multiple day samplers with the 1 in 3 day sampling frequency will
require one site visit a week, except for one out of every six weeks, when two site visits will be
required.  For sites that utilize single day samplers with the 1 in 3 day sampling frequency, a recovery
and set-up visit will be required for every sample taken.

11.3  Support Facilities for Sampling Methods
     Support facilities vary widely in their analysis capabilities, from percentage-level accuracy to ppb-level
 accuracy. The investigator must ascertain that the capabilities of the support facilities are commensurate with the
 requirements of the sampling plan established in Step 7 of the DQO Process.
The main support facility for sampling is the sample trailer.  At each sample location in the
Palookaville network there is a climate-controlled sample trailer. The trailer has limited storage
space for items used in support of PM2.5 sampling.  Table 11-2 lists the supplies that are stored at
each sample location trailer.

Table 11-2 Supplies at Storage Trailers

Item                            Minimum Quantity   Notes
Powder Free Gloves              1 box              Material must be inert and static resistant
Fuses                           2                  Of the type specified in the sampler manual
Temperature standard            1                  In the range expected for this site and NIST traceable
Flow rate standard              1                  Calibrated from at least 15.0 LPM to 18.4 LPM and NIST traceable
Sampler Operations Manual       1 per model
PM2.5 Sampling SOP              1
Flow rate verification filter   2
Non-Permeable Membrane          2                  Contained in sampling cassette
Filter Cassettes                2                  For use with flow rate check filter or non-permeable membrane
Impactor Oil                    1 bottle
Cleaning Wipes                  1 box              Dust resistant
Rain Collector                  1
Data Download Cable             1                  For use with laptop computer

-------
                                                                              Project: Model QAPP
                                                                                   Element No: 11
                                                                                    Revision No: 1
                                                                                    Date: 4/17/98
                                                                             	Page 5 of 10
Since there are other items that the field operator may need during a site visit that are not
expected to be at each site, the operator is expected to bring these items with him/her. Table 11-3
details those items.
Table 11-3 Site Dependent Equipment and Consumables

Item                        Minimum Quantity                        Notes
Tools                       1 box                                   Screwdrivers, fitted wrenches, etc.
Digital Multimeter (DMM)    1                                       For troubleshooting electrical components, if trained to do so
Lap Top Computer            1                                       Set up to receive data from monitor
Floppy Disks                1 box                                   3.5", with labels
WINS Impactor Assembly      1                                       Without impactor oil
FRM Filter Cassettes        1 for each sampler, plus field blanks   Loaded with pre-weighed filter
Transport Container         2                                       1 for pre-weighed, 1 for sampled filter
11.4  Sampling/Measurement System Corrective Action
     This section should address issues of responsibility for the quality of the data, the methods for making changes
 and corrections, the criteria for deciding on a new sample location, and how these changes will be documented. This
 section should describe what will be done if there are serious flaws with the implementation of the sampling
 methodology and how these flaws will be corrected. For example, if part of the complete set of samples is found to be
 inadmissible, how replacement samples will be obtained and how these new samples will be integrated into the total
 set of data should be described.
Corrective action measures in the PM2.5 Air Quality Monitoring Network will be taken to ensure
the data quality objectives are attained.  There is the potential for many types of sampling and
measurement system corrective actions.  Table 11-4 details the expected problems
and corrective actions needed for a well-run PM2.5 network.

-------
                                                                                               Project: Model QAPP
                                                                                                     Element No: 11
                                                                                                      Revision No:l
                                                                                                       Date: 4/17/98
                                                                                              	Page 6 of 10
Table 11-4 Field Corrective Action

Item: Filter Inspection (Pre-sample)
  Problem: Pinhole(s) or torn
  Action:  1) If additional filters have been brought, use one of them. Void the filter with the pinhole or tear.
           2) Use a new field blank filter as the sample filter.
           3) Obtain a new filter from the lab.
  Notification:  1) Document on field data sheet.  2) Document on field data sheet.  3) Notify Field Manager.

Item: Filter Inspection (Post-sample)
  Problem: Torn, or otherwise suspect particulate by-passing the 46.2 mm filter.
  Action:  1) Inspect the area downstream of where the filter rests in the sampler and determine if particulate has been by-passing the filter.
           2) Inspect the in-line filter before the sample pump and determine if excessive loading has occurred. Replace as necessary.
  Notification:  1) Document on field data sheet.  2) Document in log book.

Item: WINS Impactor
  Problem: Heavily loaded with coarse particulate. Will be obvious due to a "cone" shape on the impactor well.
  Action:  Clean downtube and WINS impactor. Load new impactor oil in the WINS impactor well.
  Notification:  Document in log book.

Item: Sample Flow Rate Verification
  Problem: Out of specification (± 4% of transfer standard)
  Action:  1) Completely remove the flow rate device, re-connect, and re-perform the flow rate check.
           2) Perform a leak test.
           3) Check the flow rate at 3 points (15.0 LPM, 16.7 LPM, and 18.3 LPM) to determine if the flow rate problem is with zero bias or slope.
           4) Re-calibrate the flow rate.
  Notification:  1) Document on data sheet.  2) Document on data sheet.  3) Document on data sheet. Notify Field Manager.  4) Document on data sheet. Notify Field Manager.

Item: Leak Test
  Problem: Leak outside acceptable tolerance (80 mL/min)
  Action:  1) Completely remove the flow rate device, re-connect, and re-perform the leak test.
           2) Inspect all seals and O-rings, replace as necessary, and re-perform the leak test.
           3) Check the sampler with a different leak test device.
  Notification:  1) Document in log book.  2) Document in log book, notify Field Manager, and flag data since the last successful leak test.  3) Document in log book and notify Field Manager.

Item: Sample Flow Rate
  Problem: Consistently low flows documented during sample
  Action:  1) Check programming of the sampler flow rate.
           2) Check flow with a flow rate verification filter and determine if actual flow is low.
           3) Inspect the in-line filter downstream of the 46.2 mm filter location; replace as necessary.
  Notification:  1) Document in log book.  2) Document in log book.  3) Document in log book.

-------
Project: Model QAPP
     Element No: 11
      Revision No:l
       Date: 4/17/98
        Page 7 of 10
Table 11-4 Field Corrective Action (continued)

Item: Ambient Temperature Verification and Filter Temperature Verification
  Problem: Out of specification (± 4°C of standard)
  Action:  1) Make certain thermocouples are immersed in the same liquid at the same point without touching the sides or bottom of the container.
           2) Use an ice bath or warm water bath to check a different temperature. If acceptable, re-perform the ambient temperature verification.
           3) Connect a new thermocouple.
           4) Check ambient temperature with another NIST traceable thermometer.
  Notification:  1) Document on data sheet.  2) Document on data sheet.  3) Document on data sheet. Notify Field Manager.  4) Document on data sheet. Notify Field Manager.

Item: Ambient Pressure Verification
  Problem: Out of specification (± 10 mm Hg)
  Action:  1) Make certain pressure sensors are each exposed to the ambient air and are not in direct sunlight.
           2) Call the local airport or another source of ambient pressure data and compare that pressure to the pressure data from the monitor's sensor. A pressure correction may be required.
           3) Connect a new pressure sensor.
  Notification:  1) Document on data sheet.  2) Document on data sheet.  3) Document on data sheet. Notify Field Manager.

Item: Elapsed Sample Time
  Problem: Out of specification (1 min/mo)
  Action:  Check programming. Verify power outages.
  Notification:  Notify Field Manager.

Item: Elapsed Sample Time
  Problem: Sample did not run
  Action:  1) Check programming.
           2) Try programming a sample run to start while the operator is at the site. Use a flow verification filter.
  Notification:  1) Document on data sheet. Notify Field Manager.  2) Document in log book. Notify Field Manager.

Item: Power
  Problem: Power interruptions
  Action:  Check line voltage.
  Notification:  Notify Field Manager.

Item: Power
  Problem: LCD panel on, but sampler not working.
  Action:  Check circuit breaker; some samplers have battery back-up for data but will not work without AC power.
  Notification:  Document in log book.

Item: Data Downloading
  Problem: Data will not transfer to laptop computer
  Action:  Document key information on the sample data sheet. Make certain the problem is resolved before data is written over in the sampler microprocessor.
  Notification:  Notify Field Manager.
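
The out-of-specification triggers in Table 11-4 reduce to simple tolerance comparisons. The fragment
below sketches three of them with the tolerances taken from the table (±4% of the flow transfer
standard, ±4°C of the temperature standard, ±10 mm Hg of the pressure standard); the function names
are hypothetical.

```python
# Sketch of the Table 11-4 acceptance checks; helper names are illustrative only.

def flow_ok(sampler_lpm, standard_lpm, tol_percent=4.0):
    """Passes if the sampler flow is within +/-4% of the transfer standard."""
    return abs(sampler_lpm - standard_lpm) / standard_lpm * 100.0 <= tol_percent

def temperature_ok(sampler_c, standard_c, tol_c=4.0):
    return abs(sampler_c - standard_c) <= tol_c

def pressure_ok(sampler_mmhg, standard_mmhg, tol_mmhg=10.0):
    return abs(sampler_mmhg - standard_mmhg) <= tol_mmhg

print(flow_ok(16.9, 16.67))        # True: about a 1.4% difference
print(temperature_ok(29.5, 25.0))  # False: 4.5 degree difference triggers corrective action
print(pressure_ok(755, 760))       # True: 5 mm Hg difference
```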


-------
                                                                          Project: Model QAPP
                                                                              Element No: 11
                                                                               Revision No:l
                                                                                Date. 4/17/98
                                                                         	Page 8 of 10
11.5   Sampling Equipment, Preservation, and Holding Time Requirements
    This section includes the requirements needed to prevent sample contamination (disposable samplers or samplers
capable of appropriate decontamination), the physical volume of the material to be collected (the size of composite
samples, core material, or the volume of water needed for analysis), the protection of physical specimens to prevent
contamination from outside sources, the temperature preservation requirements, and the permissible holding times to
ensure against degradation of sample integrity.
This section details the requirements needed to prevent sample contamination, the volume of air
to be sampled, how to protect the sample from contamination, temperature preservation
requirements, and the permissible holding times to ensure against degradation of sample integrity.

11.5.1  Sample Contamination Prevention

The PM2.5 network has rigid requirements for preventing sample contamination. Powder free
gloves are worn while handling filter cassettes. Once the filter cassette is taken outside of the
weigh room it must never be opened, as damage may result to the 46.2 mm Teflon filter. Filter
cassettes are to be stored in filter cassette storage containers as provided by the sampler
manufacturer during transport to and from the laboratory.  Once samples have been weighed,
they are to be stored with the particulate side up and individually stored in static-resistant zip lock
bags.

11.5.2  Sample Volume

The volume of air to be sampled is specified in 40 CFR Part 50. The sample flow rate of air is 16.67
L/min. The total sample of air collected will be 24 cubic meters based upon a 24-hour sample.
Samples are expected to be 24 hours; however, in some cases a shorter sample period may be
necessary, not to be less than 23 hours.  Since capture of the fine particulate is predicated upon a
design flow rate of 16.67 L/min, deviations of greater than 10% from the design flow rate will
trigger a shut-off mechanism in the sampler.  If a sample period is less than 23 hours or greater
than 25 hours, the sample will be flagged and the QA Officer notified.
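
The nominal volume follows directly from the design flow rate and the sample period: 16.67 L/min x
1,440 min is approximately 24,000 L, or about 24 cubic meters. A minimal sketch of the flagging
logic described above (illustrative only; the flag wording is hypothetical):

```python
DESIGN_FLOW_LPM = 16.67   # design flow rate from 40 CFR Part 50

def review_sample(duration_hours, mean_flow_lpm):
    """Illustrative review of a sample run against the criteria described above."""
    volume_m3 = mean_flow_lpm * duration_hours * 60 / 1000.0
    flags = []
    if duration_hours < 23 or duration_hours > 25:
        flags.append("sample period outside 23-25 hour window - notify QA Officer")
    if abs(mean_flow_lpm - DESIGN_FLOW_LPM) / DESIGN_FLOW_LPM > 0.10:
        flags.append("flow deviation > 10% of design - sampler should have shut off")
    return volume_m3, flags

volume, flags = review_sample(24.0, 16.67)
print(round(volume, 1), "m3 sampled;", flags)   # about 24.0 m3, no flags
```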

11.5.3  Temperature Preservation Requirements

The temperature requirements of the PM2.5 network are explicitly detailed in 40 CFR Part 50,
Appendix L. During transport from the weigh room to the sample location there are no specific
requirements for temperature control; however, the filters will be located in their protective
container and in the transport container. Excessive heat must be avoided (e.g., do not leave filters in
direct sunlight or in a closed-up car during summer). The filter temperature requirements are
detailed in Table 11-5.

-------
                                                                         Project: Model QAPP
                                                                             Element No: 11
                                                                              Revision No:l
                                                                               Date: 4/17/98
                                                                        	Page 9 of 10
Table 11-5 Filter Temperature Requirements

Item: Filter temperature control during sampling and until recovery.
  Requirement: No more than 5°C above ambient temperature.
  Reference: 40 CFR Part 50, Appendix L, Section 7.4.10
Item: Filter temperature control from time of recovery to start of conditioning.
  Requirement: Protected from exposure to temperatures over 25°C.
  Reference: 40 CFR Part 50, Appendix L, Section 10.13
Item: Post-sample transport so that the final weight may be determined up to 30 days after the end of the sample period.
  Requirement: 4°C or less.
  Reference: 40 CFR Part 50, Appendix L, Section 8.3.6
11.5.4 Permissible Holding Times

The permissible holding times for the PM2.5 sample are clearly detailed in both 40 CFR Part 50,
Appendix L, and Section 2.12 of the U.S. EPA QA Handbook. These holding times are provided
in Table 11-6.

Table 11-6 Holding Times

Item: Pre-weighed Filter
  Holding Time: <30 days, from date of pre-weigh to date of sample.
  Reference: 40 CFR Part 50, Appendix L, Section 8.3.5
Item: Recovery of Filter
  Holding Time: <96 hours, from completion of sample period to time of sample recovery.
  Reference: 40 CFR Part 50, Appendix L, Section 10.10
Item: Transport of Filter
  Holding Time: <24 hours (ideally), from time of recovery to time placed in conditioning room.
  Reference: 40 CFR Part 50, Appendix L, Section 10.13
Item: Post-sample filter stored at <4°C
  Holding Time: <30 days, from sample end date/time to date of post-weigh.
  Reference: 40 CFR Part 50, Appendix L, Section 8.3.6
Item: Post-sample filter continuously stored at <25°C
  Holding Time: <10 days, from sample end date/time to date of post-weigh.
  Reference: 40 CFR Part 50, Appendix L, Section 8.3.6
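
For illustration, the Table 11-6 windows can be checked for a single filter record as follows. The
dates and field names below are hypothetical; this is not the Department's tracking-system schema.

```python
from datetime import datetime

# Hypothetical holding-time check against the Table 11-6 limits.
record = {
    "pre_weigh":    datetime(1999, 1, 1),
    "sample_start": datetime(1999, 1, 2),
    "sample_end":   datetime(1999, 1, 3),
    "recovered":    datetime(1999, 1, 4, 9, 0),
    "post_weigh":   datetime(1999, 1, 5),
    "cold_storage": True,   # stored at or below 4 C after recovery
}

checks = {
    "pre-weigh to sample start <= 30 days":
        (record["sample_start"] - record["pre_weigh"]).days <= 30,
    "recovery within 96 hours of sample end":
        (record["recovered"] - record["sample_end"]).total_seconds() <= 96 * 3600,
    "post-weigh within 30 days (cold) or 10 days (ambient)":
        (record["post_weigh"] - record["sample_end"]).days
        <= (30 if record["cold_storage"] else 10),
}
for name, passed in checks.items():
    print(name, "->", "OK" if passed else "FLAG")
```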
References

The following documents were utilized in the development of this section:

1. U.S. EPA (1997a) National Ambient Air Quality Standards for Particulate Matter - Final Rule.
   40 CFR Part 50. Federal Register, 62(138):38651-38760. July 18, 1997.

2. U.S. EPA (1997b) Revised Requirements for Designation of Reference and Equivalent
   Methods for PM2.5 and Ambient Air Quality Surveillance for Particulate Matter - Final Rule.
   40 CFR Parts 53 and 58. Federal Register, 62(138):38763-38854. July 18, 1997.

-------
                                                                     Project: Model QAPP
                                                                          Element No: 11
                                                                           Revision No:l
                                                                           Date: 4/17/98
	Page 10 of 10

3. U.S. EPA Quality Assurance Guidance Document 2.12: Monitoring PM2.5 in Ambient Air
   Using Designated Reference or Class I Equivalent Methods. March 1998.

4. XYZ Company Incorporated. XYZ Sampler Model 1000 Operating Manual. April 1998.

5. XYZ Company Incorporated. XYZ Sampler Model 2000 Operating Manual. April 1998.

6. Palookaville Standard Operating Procedures for PM2.5 Sampling Methods. 1998.

-------
                                                                                    Project: Model QAPP
                                                                                         Element No: 12
                                                                                          Revision No: 1
                                                                                           Date: 4/17/98
                                                                                            Page 1 of 11
                                   12.0 Sampling Custody
     This element of the QAPP should clearly describe all procedures that are necessary for ensuring that:

         1.)  samples are collected, transferred, stored, and analyzed by authorized personnel;
         2.)  sample integrity is maintained during all phases of sample handling and analyses; and
         3.)  an accurate written record is maintained of sample handling and treatment from the time of its collection
             through laboratory procedures to disposal.

     Proper sample custody minimizes accidents by assigning responsibility for all stages of sample handling and
 ensures that problems will be detected and documented if they occur. A sample is in custody if it is in actual physical
 possession or it is in a secured area that is restricted to authorized personnel. The level of custody necessary is
 dependent upon the project's DQOs.  While enforcement actions necessitate stringent custody procedures, custody in
 other types of situations (i.e., academic research) may be primarily concerned only with the tracking of sample
 collection, handling, and analysis.

     Sample custody procedures are necessary to prove that the sample data correspond to the sample collected, if data
 are intended to be legally defensible in court as evidence.  In a number of situations, a complete, detailed, unbroken
 chain of custody will allow the documentation and data to substitute for the physical evidence of the samples (which
 are often hazardous waste) in a civil courtroom.

     An outline of the scope of sample custody (from the planning of sample collection, through field sampling and
 sample analysis, to sample disposal) should also be included.  This discussion should further stress the completion of sample
 custody procedures, which include the transfer of sample custody from field personnel to lab, sample custody within
 the analytical  lab during sample preparation and analysis, and data storage.
Due to the potential use of the PM2.5 data for comparison to the NAAQS and the requirement for
extreme care in handling the sample collection filters, sample custody procedures will be followed.
Figures 12.1 and 12.2 represent chain of custody forms that will be used to track the stages of
filter handling throughout the data collection operation. Definitions of each parameter on the
forms are explained in Table 12-1.  Although entries on these forms will be made by hand, the
information will be entered into a sample tracking system, where an electronic record will be
kept (see Section 19). This section will address sample custody procedures at the following
stages:

     •   Pre-sampling
     •   Post-sampling
     •   Filter receipt
     •   Filter archive

-------
                                                                                        Project: Model QAPP
                                                                                              Element No: 12
                                                                                               Revision No:l
                                                                                                Date: 4/17/98
                                                                                       	Page 2 of 11
                                   Filter Chain of Custody Record

  Pre-Sampling Filter Selection
    Site Operator Initial | Filter ID | Cont. ID | Receipt Date | Monitor ID      | Sampler ID | Installation Date | Comments
    BLM                   | RF990001  | MC001    | 99/01/01     | 060021125811041 | AD001      | 99/01/01          |
    BLM                   | FB990001  | MC002    | 99/01/01     | 060021125811041 | AD001      | 99/01/01          |

  Post-Sampling Filter Recovery
    Site Operator Final | Filter ID | Cont. ID | Monitor ID      | Sampler ID | Removal Date | Removal Time | Ambient Storage | 4°C | Filter Integrity Flags | Field Qualifiers
    BLM                 | RF990001  | MC001    | 060021125811041 | AD001      | 99/01/03     | 0900         |                 | X   | GFI                    |
    BLM                 | FB990001  | MC002    | 060021125811041 | AD001      | 99/01/03     | 0900         |                 | X   | GFI                    |

  Free Form Notes: ________________________________________

  Shipping Info
    Delivered by Operator: ___     Delivered by 2nd Party: X
    Date Shipped: 99/01/03    Shipping Vendor: FEDEX    # Boxes: 1    # Filters: 2
    Airbill Number: 4909283326

  Filter Receipt
    Box 1 Max Temp: ____  Min Temp: ____      Box 2 Max Temp: ____  Min Temp: ____
    Receiver ID | Filter ID | Cont. ID | Date Received | Receipt Time | Shipping Integrity Flags | Archived | Sent to Lab
    SBM         | RF990001  | MC001    | 99/01/04      | 1030         | GSI                      |          | X
    SBM         | FB990001  | MC002    | 99/01/04      | 1030         | GSI                      |          | X

  Free Form Notes: ________________________________________

  Filter Transfer
    Relinquished by: SBM   Date/Time: 99/01/04 1130        Received by: FIN   Date/Time: 99/01/04 1130

Figure 12.1 Example filter chain of custody record

-------
                                                                                        Project: Model QAPP
                                                                                             Element No: 12
                                                                                               Revision No:l
                                                                                               Date: 4/17/98
                                                                                       	Page 3 of 11
                                   Filter Archiving Tracking Form

    Filter ID | Analysis Date | Archive Date | Box ID/Box #      | Archived By | Comments
    RF990001  | 99/01/05      | 99/01/06     | 060021125811041/1 | FIN         |
    FB990001  | 99/01/05      | 99/01/06     | 060021125811041/1 | FIN         |

Figure 12.2 Filter archive form

 Table 12-1 Parameter List

Pre-Sampling
  Site Operator Initial (SOI) - Frequency: every sample; Units: AAA
      Initials of the site operator setting up the sampling run.
  Filter ID (FID) - Frequency: every sample; Units: AAYYXXXX
      Unique filter ID given to the filter by the weighing laboratory.
  Container ID (CONTID) - Frequency: every sample; Units: AAXXX
      Unique ID for the protective containers used to transport the filters. These are reusable.
  Receipt Date (SORDATE) - Frequency: every sample; Units: YY/MM/DD
      Date the filter is taken by the site operator from storage to the field.
  Monitor ID (MONID) - Frequency: every sample; Units: see AIRS
      Unique AIRS Monitor ID that includes the combination of STATE, COUNTY, SITE, PARAMETER, and POC fields.

-------
 Project: Model QAPP
      Element No: 12
       Revision No:l
        Date: 4/17/98
	Page 4 of 11
Pre-Sampling (continued)
  Sampler ID (SAMPID) - Frequency: every sample; Units: AAXXX
      Sampler model number or unique bar code number associated with the model number.
  Installation Date (SORDATE) - Frequency: every sample; Units: YY/MM/DD
      Date the filter was placed into the sampler by the site operator.
  Pre-Sampling Comments (PRESCOM) - Frequency: when required; Units: AAA....
      Free form comments from the site operator during pre-sampling filter selection.

Post-Sampling
  Site Operator Final (SOF) - Frequency: every sample; Units: AAA
      Initials of the site operator completing the sampling run.
  Removal Date (REMDATE) - Frequency: every sample; Units: YY/MM/DD
      Date the filter is taken by the site operator from the monitor for transport from the field.
  Removal Time (REMTIME) - Frequency: every sample; Units: XXXX
      Time in military units that the filter was removed from the monitor for transport from the field.
  Ambient Temp. (AMSTOR) - Frequency: see comment; Units: Y/N
      Field to determine whether the sample was maintained at ambient temperature from removal through
      transport. If this field is not entered, the next (4°C) must be.
  4°C (COSTOR) - Frequency: see comment; Units: Y/N
      Field to determine whether the sample was maintained at the 4°C temperature from removal through
      transport. If this field is not entered, the previous (Ambient Temp.) must be. Also, if shipped next
      day air, this field must be checked.
  Filter Integrity Flag (FFIF) - Frequency: every sample; Units: QFI/VFI/GFI
      QFI - questionable filter integrity; VFI - void filter integrity; GFI - good filter integrity.
  Field Qualifiers (FQUAL) - Frequency: every sample; Units: AAA
      Other field qualifier flags.
  Free Form Notes (PSTFFM) - Frequency: as needed; Units: AAA....
      Free form notes about sample recovery activity.

Shipping Information
  Delivered by Operator (DELOP) - Frequency: see comment; Units: Y/N
      Field to determine whether the samples on the C-O-C sheet were delivered to the receiving facility
      by the site operator. If this field is not entered, the following field (2nd Party) must be.

-------
 Project: Model QAPP
      Element No: 12
       Revision No:l
        Date: 4/17/98
	Page 5 of 11
Shipping Information (continued)
  Delivered by 2nd Party (SECDEL) - Frequency: see comment; Units: Y/N
      Field to determine whether the samples on the C-O-C sheet were delivered to the receiving facility
      by a next day carrier. If this field is not entered, the previous (Delivered by Operator) must be.
  Date Shipped (DASHP) - Frequency: if shipped by vendor; Units: YY/MM/DD
      Date the filters were shipped to the receiving facility.
  Shipping Vendor (SHPVEN) - Frequency: if shipped by vendor; Units: AAAA....
      Vendor used for shipment.
  # Boxes (NUMBOX) - Frequency: if shipped by vendor; Units: XX
      Total number of boxes sent under one airbill number.
  # Filters (NUMFIL) - Frequency: if shipped by vendor; Units: XX
      Total number of filters in the representative boxes sent under one airbill number.
  Airbill Number (AIRBIL) - Frequency: if shipped by vendor; Units: XXXX....
      Airbill number for shipment.

Filter Receipt
  Box 1 Min. Temp. (B1MIN) - Frequency: Box 1; Units: XX
      Minimum temperature, in Celsius, from the max/min thermometer.
  Box 1 Max. Temp. (B1MAX) - Frequency: Box 1; Units: XX
      Maximum temperature, in Celsius, from the max/min thermometer.
  Date Received (RECDATE) - Frequency: every sample; Units: YY/MM/DD
      Date the filter was received at the receiving facility.
  Time (RECTIME) - Frequency: every sample; Units: XXXX
      Time in military units that the filter was received at the receiving facility.
  Container ID (CONTID) - Frequency: every filter; Units: AAAXXX
      Identification of the filter transport container.
  Shipping Integrity Flags (RECFLAG) - Frequency: as needed; Units: AAA
      Flags associated with the integrity of the filter shipment upon receipt at the receiving facility.
  Archived (ARCH) - Frequency: see comment; Units: Y/N
      Field to determine whether the filters were placed into cold storage at the receiving facility prior
      to transport to the weighing lab (weekend delivery). If this field is not entered, the next (Sent to Lab) must be.
  Sent to Lab (SENLAB) - Frequency: see comment; Units: Y/N
      Field to determine whether the sample was delivered to the weighing laboratory the day it was received.
      If this field is not entered, the previous (Archived) must be.
  Free Form Notes (RECFFM) - Frequency: as needed; Units: AAA....
      Free form notes about sample receipt activity.

-------
                                                                                       Project: Model QAPP
                                                                                             Element No: 12
                                                                                              Revision No:l
                                                                                               Date: 4/17/98
                                                                                       	Page 6 of 11
Filter Transfer
  Relinquished By (RELBY) - Frequency: every C-O-C sheet; Units: AAA
      Initials of the receiving-facility person relinquishing filters.
  Date (RELDATE) - Frequency: every C-O-C sheet; Units: YY/MM/DD
      Date the filter was relinquished by the receiving facility.
  Time (RELTIME) - Frequency: every C-O-C sheet; Units: XXXX
      Time in military units that the filter was relinquished by the receiving facility.
  Received By (RECDBY) - Frequency: every C-O-C sheet; Units: AAA
      Initials of the laboratory technician receiving filters.
  Date (RECDATE) - Frequency: every C-O-C sheet; Units: YY/MM/DD
      Date the filter was received by the weighing laboratory.
  Time (RECTIME) - Frequency: every C-O-C sheet; Units: XXXX
      Time in military units that the filter was received by the weighing laboratory.
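
The parameters above map naturally onto the electronic record kept by the sample tracking system
(Section 19). The following is a hypothetical, abridged sketch of such a record; the field names are
illustrative and are not the tracking system's actual schema.

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Hypothetical, abridged electronic chain-of-custody record based on Table 12-1.
@dataclass
class FilterCustodyRecord:
    filter_id: str                               # e.g. "RF990001"
    container_id: str                            # protective container, e.g. "MC001"
    monitor_id: str                              # AIRS monitor ID
    site_operator_initial: str                   # operator setting up the run
    installation_date: str                       # YY/MM/DD
    removal_date: Optional[str] = None
    removal_time: Optional[str] = None
    filter_integrity_flag: Optional[str] = None  # QFI / VFI / GFI
    shipping_airbill: Optional[str] = None
    received_by: Optional[str] = None
    notes: List[str] = field(default_factory=list)

rec = FilterCustodyRecord("RF990001", "MC001", "060021125811041", "BLM", "99/01/01")
rec.removal_date, rec.filter_integrity_flag = "99/01/03", "GFI"
print(rec.filter_id, rec.filter_integrity_flag)
```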
12.1 Sample Custody Procedure
    The QAPP should discuss the sample custody procedure at a level commensurate with the intended use of the
data. This discussion should include the following:

    1.  List the names and responsibilities of all sample custodians in the field and laboratories.
    2.  Give a description and example of the sample numbering system.
    3.  Define acceptable conditions and plans  for maintaining sample integrity in the field prior to and during
        shipment to the laboratory (e.g., proper temperature and preservatives).
    4.  Give examples of forms and labels used to maintain sample custody and document sample handling in the
        field and during shipping.
    5.  Describe the method of sealing shipping containers with chain-of-custody seals.
    6.  Describe procedures that will be used to maintain the chain of custody and document sample handling during
        transfer from the field to the laboratory, within the  laboratory, and among contractors.
    7.  Provide for the archiving of all shipping documents and associated paperwork.
    8.  Discuss procedures that will ensure sample security at all times.
    9.  Describe procedures for within-laboratory chain-of-custody together with verification of the printed name,
        signature, and initials of the personnel responsible  for custody of samples, extracts, or digests during analysis
        at the laboratory.  Finally, document disposal or consumption of samples should also be described.
    The discussion should be as specific as possible about the details of sample storage, transportation, and delivery
to the receiving analytical facility.

-------
                                                                                 Project: Model QAPP
                                                                                      Element No: 12
                                                                                       Revision No: 1
                                                                                        Date: 4/17/98
                                                                                        Page 7 of 11
                  1) Pre-sampling weighing - M. Smather/F. Nottingham
                                     Initial weighing/tracking
                  2) Pre-sampling - B. Macky/K. Porter/B. Deston
                                Selecting filters from lab, recording
                                Chain of Custody
                  3) Post-sampling - B. Macky/K. Porter/B. Deston
                                 Collecting/transporting filter, recording
                                 Chain of Custody
                  4) Filter Receipt - J. Chang/S. Marony
                              Receiving/checking/storing/sending filters
                              to lab, recording Chain of Custody
                  5) Post-weighing/Archive - M. Smather/F. Nottingham
                                     Receiving filters from Receiving
                                     Facility, final weighing/archiving,
                                     and recording archive info
 Figure 12.3 Chain of custody phases
 Figure 12.3 represents the stages of sample custody for the PM2.5 samples, and Table 12-2 lists
 the personnel that will be responsible for sample custody at the various data operation stages.
 Initials will be used on the chain of custody forms (Figures 12.1 and 12.2).

 One of the most important values in the sample custody procedure is the unique filter ID number,
 illustrated in Figure 12.4. The filter ID is an alpha-numeric value. The initial two alpha
 characters identify the type of filter as being either a routine filter (RF), a field blank (FB), a lab
 blank (LB), or a flow check filter (FC) used for the flow rate check. The next two values (YY)
 represent the last two digits of the calendar year, and the next 4 digits represent a unique number.
 Each combination of filter type and year will start with the value 0001.  Therefore, for 1998 the
 first routine filter will be numbered RF980001 and the first field blank will be FB980001.  The filter
 ID will be generated by the laboratory analyst at the time of pre-weighing.
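
For illustration, the filter ID convention can be generated and parsed with a few lines of code. This
is a sketch only: IDs are actually assigned by the laboratory analyst at pre-weighing, and the
two-digit-year interpretation below assumes 19xx dates.

```python
# Illustrative helpers for the AAYYXXXX filter ID convention described above.
FILTER_TYPES = {"RF": "routine filter", "FB": "field blank",
                "LB": "lab blank", "FC": "flow check filter"}

def make_filter_id(filter_type, year, sequence):
    assert filter_type in FILTER_TYPES
    return f"{filter_type}{year % 100:02d}{sequence:04d}"

def parse_filter_id(filter_id):
    return {"type": FILTER_TYPES[filter_id[:2]],
            "year": 1900 + int(filter_id[2:4]),   # assumes 19xx calendar years
            "sequence": int(filter_id[4:])}

print(make_filter_id("RF", 1998, 1))   # RF980001, the first routine filter of 1998
print(parse_filter_id("FB980001"))     # {'type': 'field blank', 'year': 1998, 'sequence': 1}
```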

               Table 12-2 Sample Custodians

Data Operation    Sample Custodians    Branch               Initials
Pre-Sampling      Bill Macky           Air Monitoring       BLM
                  Karin Porter         Air Monitoring       KDP
                  Beverly Deston       Air Monitoring       BVD
Post Sampling     Bill Macky           Air Monitoring       BLM
                  Karin Porter         Air Monitoring       KDP
                  Beverly Deston       Air Monitoring       BVD
Filter Receipt    Jason Chang          Shipping/Receiving   JGC
                  Sonny Marony         Shipping/Receiving   SBM
Filter Archive    Mike Smather         Laboratory           MSS
                  Fred Nottingham      Laboratory           FIN

-------
                                                                           Project: Model QAPP
                                                                               Element No: 12
                                                                                Revision No:l
                                                                                 Date: 4/17/98
                                                                          	Page 8 of 11
    Filter ID format:  AA YY XXXX
        AA   - Filter Type
        YY   - Year
        XXXX - Unique number

Figure 12.4 Filter ID


12.1.1 Pre-Sampling Custody

The Department's pre-sampling weighing SOPs define how the filters will be enumerated,
conditioned, weighed, placed into the protective shipping container, sealed with tape,  and stored
on the field filter shelf for selection by the site operators.  Filters must be used within 30 days of
pre-sampling weighing.  A Filter Inventory Sheet containing the Filter ID,  Filter Type, Container
ID, and the Pre-Sampling Weighing Date will be attached to the field filter shelf for use by the site
operator. Each sampling period,  the site operators will select filters that they will use for the field.
The number of filters selected will depend on the time in the field prior to returning to  the
laboratory and the number of samplers to be serviced. The  site operator will perform the
following Pre-sampling activities:

   1.  Contact M. Smather or F. Nottingham for access to  laboratory
   2.  Put on appropriate laboratory attire.
   3.  Enter the filter storage area.
   4.  Review the Filter Inventory Sheet and select the next set of filters on the sheet. Ensure
       the seals are intact.  Since the site operator cannot check the Filter ID, he/she will have to
       use the Container ID value.
   5.  Take a Filter Chain of Custody Record for each site visited.  Fill out the first 4 columns of
       the "Pre-Sampling Filter Selection" portion of the Filter Chain of Custody Record (Fig
       12.1) for each filter.
   6.  Initial the column "Site Operator" on the Filter Inventory Sheet to signify selection of the
       filters.
   7.  Pack filters in sample coolers for travel to the field.

Upon arrival at a site:

     8. Select the appropriate filters for a sampler.
     9. Once the filters are installed at the site, complete the remainder of the columns (5-8)  of
       the "Pre-Sampling Filter Selection" portion of the Filter Chain of Custody Record (Fig
       12.1)

-------
                                                                           Project: Model QAPP
                                                                                Element No: 12
                                                                                 Revision No:l
                                                                                  Date: 4/17/98
                                                                                  Page 9 of 11
 12.1.2 Post Sampling Custody
 The field sampling SOPs specify the techniques for properly collecting and handling the sample
 filters.  Upon visiting the site:

     1.  Select the appropriate Filter Chain of Custody Record.  Ensure that the Site ID and the
        protective Container ID(s) are correct.
     2.  Remove filter cassette from the sampler. Briefly examine it to determine appropriate filter
        integrity flag and place it into the protective container per SOPs and seal with tape.
     3.  Place the protective container(s) into the shipping/transport container with the appropriate
        temperature control devices.
     4.  Record "Post Sampling Filter Recovery Information" on the  Filter Chain of Custody
        Record.

 Shipping Information --

 Depending on the number of sites to be serviced, the location of the sites, and the time period
 from the end of sample collection, the site operator will either deliver the sample to the laboratory
 or send it next day air to the laboratory. The first line of the "Shipping Info" area on the Filter
 Chain of Custody Record indicates the mode of transportation.  If the mode of transportation is
 next day air, record the appropriate information.  The Department has a contract with Federal
 Express®. The Federal Express location for each site is listed in Table 12-3.  Pre-addressed
 mailing slips will be made available for site operators. Shipping requirements include:

     1. Bring the shipping/transport containers to the next day air vendor.
     2. Fill out the remainder of the pre-addressed airbills.
     3. Fill out the "Shipping Info" on the Filter Chain of Custody Record(s).
     4. Photocopy the Filter Chain of Custody Records that pertain to the shipment.
     5. Place the photocopied records in a plastic zip lock bag and include it in one of the
       shipping/transport containers.
     6. Seal all shipping/transport containers per SOPs.
     7. The site operator will take the original Filter Chain of Custody Record(s) and attach the
        airbill to the records.
     8. The site operator will notify the receiving laboratory of a shipment on the day of the
        shipment.

NOTE:  If a site operator needs to send or deliver a shipment on Saturday, the site operator must
provide the shipping/receiving office three days' notice in order to ensure that shipping/receiving
personnel will be available.

-------
                                                                          Project: Model QAPP
                                                                               Element No: 12
                                                                                Revision No:l
                                                                                 Date: 4/17/98
                                                                          	Page 10 of 11
Table 12-3 Federal Express Locations

Monitor ID         Site Code   Site Name           Federal Express Location
060021125811041    A1          Darby Road          1615 Faming Blvd, Scarborough, CA 12679
060021125811049    A1          Darby Road          1615 Faming Blvd, Scarborough, CA 12679
060021245811041    A2          Jefferson School    1615 Faming Blvd, Scarborough, CA 12679
060030125811041    A3          Bay Bridge Street   1615 Faming Blvd, Scarborough, CA 12679
060051625811041    A4          Ridge Road          72 Ridge Road, Bakersville, CA 12677
060041125811041    B1          Donner Road         44 Fossberg Road, Dunston, CA 121634
060041125811049    B1          Donner Road         44 Fossberg Road, Dunston, CA 121634
12.1.3 Filter Receipt

The samples, whether transported by the site operator or next day air, will be received by either
Jason Chang or Sonny Marony at the Shipping/Receiving Office. The Shipping/Receiving Office
will:
     1. Receive the shipping/transport container(s).
     2. Upon receipt, open the container(s) to find the Filter Chain of Custody Record(s), or collect
        the originals from the site operator (if delivered by the operator).
     3. Fill out the "Filter Receipt" area of the Filter Chain of Custody Record(s). Check
        sample container seals.
     4. If the samples are delivered on a weekday, follow sequence 5; if the sample(s) are
        delivered on a weekend, follow sequence 6.
     5. Check the "Sent to Laboratory" column of the Filter Chain of Custody Record(s) and
        transport the filters to the PM2.5 weighing laboratory. Upon delivery to the PM2.5 weighing
        laboratory, complete the "Filter Transfer" area of the Filter Chain of Custody Record(s).
     6. Store the samples in the refrigerator and check the "archived" column of the Filter Chain
        of Custody Record(s).  On the Monday of the following week, deliver the archived filters
        to the PM2.5 weighing laboratory and complete the "Filter Transfer" area of the Filter
        Chain of Custody Record(s).

12.1.4 Filter Archive

Once the PM2.5 weighing laboratory receives the filters, laboratory staff will use their raw data entry
sheets to log the samples back in from receiving and prepare them for post-sampling weighing
activities. These activities are included in the analytical SOPs (Section 13).  The laboratory
technicians will take the filters out of the protective containers and the cassettes and examine them
for integrity, which will be noted on the data entry sheets.  During all post-sampling activities, filter
custody will be the responsibility of Mike Smather and Fred Nottingham.  The samples will be stored

-------
                                                                           Project: Model QAPP
                                                                               Element No: 12
                                                                                Revision No:l
                                                                                 Date: 4/17/98
	Page 11  of 11

within the PM2.5 weighing laboratory.  Access to this laboratory is restricted to Mr. Smather and
Mr. Nottingham.

Upon completion of post-sampling weighing activities, the Filter Archiving Form (Figure 12.2)
will be used by the laboratory technicians to archive the filter. Each filter will be packaged
according to the SOPs and stored in a box uniquely identified by Site ID and box number.
Samples will be archived in the filter storage facility for one year past the date of collection. Prior
to disposal,  EPA Region Y will be notified of the Department's intent to dispose of the filters.

-------
                                                                                      Project: Model QAPP
                                                                                           Element No: 13
                                                                                            Revision No:l
                                                                                             Date: 4/17/98
                                                                                     	Page 1 of8
                       13.0 Analytical Methods Requirements
     The choice of analytical methods will be influenced by the performance criteria, Data Quality Objectives, and
 possible regulatory criteria. Qualification requirements may range from functional group contaminant identification
 only to complete individual contaminant specification. Quantification needs may range from order-of-magnitude
 quantities only to parts-per-trillion (ppt) concentrations.  If appropriate, a citation of analytical procedures may be
 sufficient if the analytical method is a complete SOP, such as one of the Contract Lab Program Statements of Work.
 For other methods, it may suffice to reference a procedure (i.e., from  Test Methods for Evaluating Solid Waste, SW-
 846) and further supplement it with the particular options/variations being used by the lab, the detection limits actually
 achieved, the calibration standards and concentrations used, and so on.  In other situations, complete step-wise
 analytical and/or sample preparation procedures will need to be attached to the QAPP if the procedure is unique or an
 adaptation of a "standard" method.

     Specific monitoring methods and requirements to demonstrate compliance traditionally were specified in the
 applicable regulations and/or permits. However, this approach is being replaced by the Performance-Based
 Measurement System (PBMS).  PBMS is a process in which data quality needs, mandates, or limitations of a program
 or project are specified and serve as a criterion for selecting appropriate methods.  The regulated body selects the most
 cost-effective methods that meet the criteria specified in the PBMS. Under the PBMS framework, the performance of
 the method employed is emphasized rather than the specific technique or procedure used in the analysis. Equally
 stressed in this system is the requirement that the performance of the method be  documented and certified by the
 laboratory that appropriate QA/QC procedures have been conducted to verify the performance. PBMS applies to
 physical, chemical, and biological techniques of analysis performed in the field as  well  as in the laboratory. PBMS
 does not apply to the method-defined parameters.

     The QAPP should also address the issue of the quality of analytical data as indicated by the data's ability to meet
 the QC acceptance criteria. This section should describe what should be done if the calibration check samples exceed
 the control limits due to mechanical failure of the instrumentation, if a drift in the calibration curve occurs, or if a reagent
 blank indicates contamination. This section should also indicate the authorities responsible for the quality of the data,
 the protocols for making changes and implementing corrective actions, and the methods for reporting the data and its
 limitations.

     Laboratory contamination from the processing of hazardous materials such  as toxic or radioactive samples for
 analysis and their ultimate disposal should be considered during the planning stages for selection of analysis
 methods.  Safe handling requirements for project samples in the laboratory with appropriate decontamination and
 waste disposal procedures should also be described.
13.1   Purpose/Background

This method provides for gravimetric analyses of filters used in the Palookaville PM2.5 network.
The net weight gain of a sample is calculated by subtracting the initial weight from the final weight.
Once calculated, the net weight gain can be used with the total flow passed through a filter to
calculate the concentration for comparison to the daily and annual NAAQS.  Since the method is
non-destructive, and due to possible interest in sample composition, the filters will be archived
after the final gravimetric analysis has occurred.
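
As an illustration of this calculation (a minimal sketch with hypothetical weights and volume,
not values from the Palookaville network), the net weight gain and resulting concentration could
be computed as follows:

    # A minimal sketch (not part of the Department's SOPs): net weight gain and
    # PM2.5 concentration from pre- and post-sampling filter weights.
    # The weights and volume below are hypothetical.

    def pm25_concentration_ug_m3(tare_weight_ug, gross_weight_ug, total_volume_m3):
        """Return the PM2.5 concentration in ug/m^3."""
        net_gain_ug = gross_weight_ug - tare_weight_ug    # final weight minus initial weight
        return net_gain_ug / total_volume_m3              # mass gain per volume of air sampled

    # Example: 360 ug net gain over a 24 m^3 sample -> 15 ug/m^3 for comparison to the NAAQS
    print(pm25_concentration_ug_m3(145200.0, 145560.0, 24.0))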

-------
                                                                            Project: Model QAPP
                                                                                 Element No: 13
                                                                                  Revision No: 1
                                                                                   Date: 4/17/98
                                                                           	Page 2 of 8
13.2  Preparation of Samples
    Preparation procedures should be described and standard methods cited and used where possible. Step-by-step
operating procedures for the preparation of the project samples should be listed in an appendix. The sampling
containers, methods of preservation, holding times, holding conditions, number and types of all QA/QC samples to be
collected, percent recovery, and names of the laboratories that will perform the analyses need to be specifically
referenced.
The Palookaville network will consist of 5 sites, 1 with a collocated sequential sampler and 1 with
a collocated single channel sampler. The 4 primary sequential and one single channel samplers are
on a 1 in 3 day schedule. The collocated sequential and single channel samplers are on a 1 in 6 day
schedule. Therefore,  the approximate number of routine filters that have to be prepared, used,
transported, conditioned and weighed is 12 per week.  In addition, field blanks, lab blanks, and
flow check filters must also be prepared. See Appendix C for activities associated with preparing
pre-sample batches.
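
As a rough illustration of how the weekly filter count follows from these schedules (an estimate
only, not a Department procedure):

    # Rough estimate of weekly filter usage from the schedules described above:
    # 5 primary samplers on a 1-in-3 day schedule and 2 collocated samplers on a
    # 1-in-6 day schedule. Illustrative only.

    DAYS_PER_WEEK = 7

    primary_per_week = 5 * DAYS_PER_WEEK / 3       # about 11.7, i.e., roughly 12 routine filters
    collocated_per_week = 2 * DAYS_PER_WEEK / 6    # about 2.3 additional collocated filters

    print(round(primary_per_week), round(collocated_per_week))   # 12 2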

Upon delivery of approved 46.2 mm Teflon filters for use in the Palookaville network, the receipt
is documented and the filters stored in the  conditioning/weighing room/laboratory. Storing filters
in the laboratory  makes it easier to maximize the amount of time available for conditioning. Upon
receipt, cases of filters will be labeled with the date of receipt, opened one at a time and used
completely before opening another case. All filters in a lot will be used before a case containing
another lot is opened.  When more than one case is available to open, the "First In - First Out" rule
will apply. This means that the first case of filters received is the first case that will be used.

Filters will be taken out of the case when there is enough room for more samples in the pre-
sampling weighing section of the filter conditioning storage compartment. Filters will be visually
inspected according to the FRM criteria to determine compliance. See App.C, A-FIC for
inspection procedure for new shipments of filters.  Filters will then be stored in the filter
conditioning compartment in unmarked petri dishes. The minimum conditioning period is 24
hours.  Filters will not be left out for excessive periods of conditioning since some settling of dust
is possible on the filters' top sides.

-------
                                                                            Project: Model QAPP
                                                                                 Element No: 13
                                                                                  Revision No: 1
                                                                                   Date: 4/17/98
                                                                           	Page 3 of8
 13.3   Analysis Method
    The citation of an analytical method may not always be sufficient to fully characterize a method because the
 analysis of a sample may require deviation from a standard method and selection from the range of options in the
 method. The SOP for each analytical method should be cited or attached to the QAPP, and all deviations or
 alternative selections should be detailed in the QAPP.

    Often the selected analytical methods may be presented conveniently in one or several tables describing the
 matrix, the analytes to be measured, the analysis methods, the type, the precision/accuracy data, the performance
 acceptance criteria, the calibration criteria, etc.  Appendix C contains a checklist of many important components to
 consider when selecting analytical methods.
 13.3.1 Analytical Equipment and Method

 The analytical instrument used for gravimetric analysis in the FRM or equivalent PM2 5 sampler
  method (gravimetric analysis) is the microbalance.  The Palookaville microbalance is a Libra
  Model 101, which has a readability* of 1 µg and a repeatability* of 1 µg (* equipment
  performance terms used by balance vendors to characterize their equipment for purchase
  comparison purposes; see also Appendix C, A-MRS).

 The Libra Model 101 microbalance was initially set-up and run by the Libra Company.  It is
 calibrated yearly by a Libra Balance Technician under the service agreement between the
 Palookaville Department of Health and the Libra Balance Company.

  The gravimetric analysis method (Appendix C) consists of 3 main subparts (App.C, A-1, 2, and 3)
 following five preliminary sections. The information in the preliminary sections may be needed to
 establish and verify the continued acceptability of the set of primary and secondary mass reference
 standards (App.C, A-MRS) and a new lot of filters (App.C, A-FIC; App.C, A-FH) and to
 establish stable conditions in the weighing room (Filter Conditioning; Electrostatic Charge
 Neutralization). The three main subparts are entitled Pre-sampling Filter Weighing (Tare Weight);
 Postsampling Documentation and Inspection;  and Post-sampling Filter Weighing (Gross Weight).
 A detailed listing of the gravimetric analysis method can be found in the Palookaville
 microbalance standard operating procedure (App.C).

 13.3.2 Conditioning and Weighing Room

The primary support facility for the PM2.5 network is the filter conditioning and weighing
room/laboratory.  Additional facility space is dedicated to long-term archiving of the filters. This
weigh room laboratory is used for both pre-sampling and post-sampling weighing of each
PM2.5 filter sample. Specific requirements for environmental control of the conditioning/weighing
room laboratory are detailed in 40 CFR Part 50, Appendix L¹.

-------
                                                                         Project: Model QAPP
                                                                             Element No: 13
                                                                              Revision No:l
                                                                               Date: 4/17/98
                                                                        	Page 4 of8
13.3.3 Environmental Control
The Palookaville weigh room facility is an environmentally controlled room with temperature and
humidity control.  Temperature is controlled within the range of 20 - 23 °C.  Relative humidity is
controlled between 30 and 40%. Temperature and relative humidity are measured and
recorded continuously during equilibration. The balance is located on a vibration-free table and is
protected from, or located out of the path of, any sources of drafts. Filters are conditioned before
both the pre- and post-sampling weighings. Filters must be conditioned for at least 24 hours to
allow their weights to stabilize before being weighed.
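
As an illustration only, the recorded readings could be screened against these limits with a simple
check such as the sketch below (the function name, the hourly-reading assumption, and the
treatment of the ±2 °C and ±5% RH variability limits as total ranges are assumptions, not
Department SOP requirements):

    # Sketch: screen 24 hours of weigh room readings against the limits above
    # (temperature 20-23 C; relative humidity 30-40%). The hourly-reading
    # assumption and the use of a total range for the variability limits
    # (4 C and 10% RH) are interpretations, not regulatory definitions.

    def weigh_room_in_control(temps_c, rh_percent):
        mean_t = sum(temps_c) / len(temps_c)
        mean_rh = sum(rh_percent) / len(rh_percent)
        temp_ok = 20.0 <= mean_t <= 23.0 and (max(temps_c) - min(temps_c)) <= 4.0
        rh_ok = 30.0 <= mean_rh <= 40.0 and (max(rh_percent) - min(rh_percent)) <= 10.0
        return temp_ok and rh_ok

    print(weigh_room_in_control([21.4] * 24, [35.0] * 24))   # True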

13.4  Internal QC and Corrective Action for Measurement System

A QC notebook or database (with disk backups) will be maintained, which will contain QC data,
including the microbalance calibration and maintenance information, routine internal QC checks of
mass reference standards and laboratory and field filter blanks, and external QA audits.  These
data will duplicate data recorded on laboratory data forms but will consolidate them so that long-
term trends can be identified. It is recommended that QC charts be maintained on each
microbalance and included in this notebook. These charts may allow the discovery  of excess drift
that could signal an instrument malfunction.

At the beginning of each weighing day, after the analyst has completed zeroing and calibrating the
microbalance and measuring the working standard, weigh the three laboratory filter blanks
established for the current filter lot and three field filter blanks from the most recently completed
field blank study.  After approximately every tenth filter weighing, the analyst will reweigh the one
working standard and rezero the microbalance. Record the zero, working standard, and blank
measurements in the laboratory data form and the laboratory QC notebook or database. If the
working standard measurements differ from the certified values or the presampling values by more
than 3 ug, repeat the working standard measurements. If the blank measurements differ from the
presampling values by more than 15 ug, repeat the blank measurements. If the two measurements
still disagree, contact the Laboratory Manager, who may direct the analyst to (1) reweigh some or
all of the previously weighed filters, (2) recertify the working standard against the laboratory
primary standard, (3) conduct minor, non-invasive diagnostics and troubleshooting, and/or (4)
arrange to have the original vendor or an independent, authorized service technician troubleshoot
or repair the microbalance.
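
The working standard and blank acceptance logic described above can be sketched as follows
(illustrative only; the function and variable names are assumptions, while the 3 µg and 15 µg
limits are those stated in this paragraph):

    # Sketch of the session QC logic described above: the working standard must
    # agree within 3 ug of its certified value and blanks within 15 ug of their
    # presampling weights; a failed check is repeated and then escalated.
    # Function and variable names are illustrative, not Department SOP names.

    def working_standard_ok(measured_ug, certified_ug, limit_ug=3.0):
        return abs(measured_ug - certified_ug) <= limit_ug

    def blank_ok(measured_ug, presampling_ug, limit_ug=15.0):
        return abs(measured_ug - presampling_ug) <= limit_ug

    if not working_standard_ok(100001.0, 100000.0) or not blank_ok(142510.0, 142500.0):
        print("Repeat the check; if it still fails, contact the Laboratory Manager")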

Corrective action measures in the PM2.5 FRM system will be taken to ensure good quality data.
There is the potential for many types of sampling and measurement system corrective actions.
Tables 13-1 (organized by laboratory support equipment) and 13-2 (organized by laboratory
support activity) list potential problems and the corrective actions needed to support a well-run
PM2.5 network.  Filter weighing will be delayed until corrective actions are satisfactorily
implemented.

-------
                                                                              Project: Model QAPP
                                                                                   Element No: 13
                                                                                    Revision No: 1
                                                                                     Date: 4/17/98
                                                                             	Page 5 of8
Table 13-1 Potential Problems/Corrective Action for Laboratory Support Equipment

System      Item                  Problem               Action                          Notification
Weigh Room  Humidity              Out of Specification  Check HVAC system               Lab Manager
Weigh Room  Temperature           Out of Specification  Check HVAC system               Lab Manager
Balance     Internal Calibration  Unstable              Redo and check working          Lab Manager
                                                        standards
Balance     Zero                  Unstable              Redo and check for drafts;      Lab Manager
                                                        sealed draft guard
Balance     Working Standards     Out of Specification  Check balance with primary      Lab Manager
                                                        standards
Balance     Filter Weighing       Unstable              Check lab blank filters         Document in Log Book
TABLE 13-2. FILTER PREPARATION AND ANALYSIS CHECKS

Activity: Microbalance use
  Requirements: Resolution of 1 ug, repeatability of 1 ug
  Action if the requirements are not met: Obtain proper microbalance

Activity: Control of balance environment
  Requirements: Climate-controlled, draft-free room or chamber or equivalent
  Action if the requirements are not met: Modify the environment

Activity: Use of mass reference standards
  Method and frequency: Working standards checked every 3 to 6 months against laboratory
    primary standards
  Requirements: Standards bracket weight of filter, individual standard's tolerance less than
    25 ug, handle with smooth, nonmetallic forceps
  Action if the requirements are not met: Obtain proper standards or forceps

Activity: Filter handling
  Method and frequency: Observe handling procedure
  Requirements: Use powder-free gloves and smooth forceps. Replace 210Po antistatic strips
    every 6 months
  Action if the requirements are not met: Discard mishandled filter or old antistatic strip

Activity: Filter integrity check
  Method and frequency: Visually inspect each filter
  Requirements: No pinholes, separation, chaff, loose material, discoloration, or filter
    nonuniformity
  Action if the requirements are not met: Discard defective filter

Activity: Filter identification
  Method and frequency: Write filter number on filter handling container, sampler number on
    protective container, and both numbers on laboratory data form in permanent ink
  Requirements: Make sure the numbers are written legibly
  Action if the requirements are not met: Replace label or correct form

-------
                                                                                               Project: Model QAPP
                                                                                                     Element No:  13
                                                                                                       Revision No:l
                                                                                                       Date: 4/17/98
                                                                                              	Page 6 of 8
Table 13-2 (continued). FILTER PREPARATION AND ANALYSIS CHECKS

Activity: Presampling filter equilibration
  Method and frequency: Determine the correct equilibration conditions and period (at least
    24 hours) for each new lot of filters. Observe and record the equilibration chamber
    relative humidity and temperature; enter on the lab data form.
  Requirements: Check for stability of laboratory blank filter weights. Weight changes must be
    <15 ug before and after equilibration. Mean relative humidity between 30 and 40 percent,
    with a variability of not more than ±5 percent over 24 hours. Mean temperature will be
    held between 20 and 23 °C, with a variability of not more than ±2 °C over 24 hours.
  Action if the requirements are not met: Revise equilibration conditions and period. Repeat
    equilibration.

Activity: Initial filter weighing
  Method and frequency: Observe all weighing procedures. Perform all QC checks.
  Requirements: Neutralize electrostatic charge on filters. Wait long enough so that the balance
    indicates a stable reading (oscillates no more than ±2 ug, drifts no more than 3 ug, in
    5-10 sec).
  Action if the requirements are not met: Repeat weighing.

Activity: Internal QC
  Method and frequency: After approximately every tenth filter, rezero the microbalance and
    reweigh the two working standards. Weigh three laboratory filter blanks. Reweigh one
    duplicate filter with each sample batch (duplicate weighing).
  Requirements: The working standard measurements must agree to within 3 ug of the certified
    values. The blank and duplicate measurements must agree to within 15 ug.
  Action if the requirements are not met: Flag values for validation activities.

Activity: Postsampling inspection, documentation, and verification
  Method and frequency: Examine the filter and field data sheet for correct and complete
    entries. If the sample was shipped in a cooled container, verify that low temperature was
    maintained.
  Requirements: No damage to filter. Field data sheet complete. Sampler worked OK.
  Action if the requirements are not met: Notify Lab Manager. Discard filter. Void sample.

Activity: Postsampling filter equilibration
  Method and frequency: Equilibrate filters for at least 24 hours. Observe and record the
    equilibration chamber relative humidity and temperature; enter on the lab data sheet.
    Must be within ±5% RH of pre-sampling weighing conditions.
  Requirements: Mean relative humidity between 30 and 40 percent, with a variability of not
    more than ±5 percent over 24 hours. Mean temperature will be held between 20 and 23 °C,
    with a variability of not more than ±2 °C over 24 hours.
  Action if the requirements are not met: Repeat equilibration.

Activity: Postsampling filter weighing
  Method and frequency: Observe all weighing procedures. Perform all QC checks.
  Requirements: Neutralize electrostatic charge on filters. Wait 30 to 60 seconds after the
    balance indicates a stable reading before recording data.
  Action if the requirements are not met: Repeat weighing.

-------
                                                                       Project: Model QAPP
                                                                            Element No: 13
                                                                             Revision No: 1
                                                                              Date: 4/17/98
	Page 7 of8

13.5  Filter Sample Contamination Prevention, Preservation, and Holding
Time Requirements

This section details the requirements for protecting the filter sample from
contamination, the volume of air to be sampled, temperature preservation requirements, and the
permissible holding times that guard against degradation of sample integrity.

13.5.1 Sample Contamination Prevention

The analytical support component of the PM2.5 network has rigid requirements for preventing
sample contamination. Filters are equilibrated/conditioned and stored in the same room where
they are weighed. Powder-free gloves are worn while handling filters, and filters are contacted
only with smooth, nonserrated forceps.  Upon determination of its pre-sampling
weight, the filter is placed in its cassette and then placed in a protective petri dish.  The petri dish
is labeled with a unique sequential identification number assigned to all filters
originating from the Palookaville weigh room laboratory. Once the filter cassette is taken outside
of the weigh room it will not be opened, as opening it may damage the 46.2 mm Teflon filter.

13.5.2 Sample Volume

The volume of air to be sampled is specified in 40 CFR Part 50. The sample flow rate is 16.67
L/min. The total volume of air collected will be approximately 24 cubic meters, based upon a 24-hour sample.
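For reference, the nominal sample volume follows directly from the flow rate and the 24-hour
(1,440 minute) sample duration:

    16.67\ \mathrm{L/min} \times 1440\ \mathrm{min} = 24{,}004.8\ \mathrm{L} \approx 24\ \mathrm{m^3}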

13.5.3 Temperature Preservation Requirements

The temperature requirements of the PM2.5 network are explicitly detailed in 40 CFR Part 50. In
the weigh room laboratory, the filters must be conditioned for a minimum of 24 hours prior to
pre-weighing, although a longer period of conditioning may be required. The weigh room
laboratory temperature must be maintained between 20 and 23 °C, with no more than a ±2 °C
change over the 24-hour period prior to weighing the filters. During transport from the weigh room to
the sample location, there are no specific requirements for temperature control; however, the
filters will be kept in their protective containers and excessive heat avoided.  Temperature
requirements for the sampling and post-sampling periods are detailed in 40 CFR Part 50,
Appendix L, Section 7.4.10.  These requirements state that the temperature of the filter cassette
during sampler operation, and in the period from the end of sampling to the time of sample
recovery, shall not exceed the ambient temperature by more than 5 °C for more than 30
minutes.

The specifics of the temperature preservation requirements are clearly detailed in 40 CFR Part 50,
Appendix L¹.  These requirements pertain to the sample media before collection and to both the
sample media and the sample after a sample has been collected. Additionally, during sample
collection there are requirements for temperature control.  The temperature requirements are

-------
                                                                         Project: Model QAPP
                                                                              Element No: 13
                                                                               Revision No:l
                                                                                 Date: 4/17/98
                                                                         	Page 8 of8
 detailed in Table 13-3.
 Table 13-3 Temperature Requirements

Item                                        Temperature Requirement                  Reference
Weigh Room                                  20 - 23 °C                               40 CFR Part 50, Appendix L, Section 8.3.1
Pre-weighed Filter                          ± 2 °C for 24 hours prior to weighing    40 CFR Part 50, Appendix L, Section 8.3.2
Filter temperature control during           No more than 5 °C above ambient          40 CFR Part 50, Appendix L, Section 7.4.10
sampling and until recovery                 temperature
Post-sample transport so that final         4 °C or less                             40 CFR Part 50, Appendix L, Section 8.3.6
weight may be determined up to 30
days after end of sample period
 13.5.4  Permissible Holding Times

  The permissible holding times for the PM2.5 sample are clearly detailed in both 40 CFR Part 50¹
  and Guidance Document 2.12 of the U.S. EPA QA Handbook². A summary of these holding times is
  provided in Table 11-6 in subsection 11.5.4.
 References

 The following documents were utilized in the development of this section:
 1. U.S. EPA (1997a) National Ambient Air Quality Standards for Particulate Matter - Final Rule.
    40 CFR Part 50. Federal Register, 62(138):38651-38760. July 18, 1997.
 2. U.S. EPA Quality Assurance Guidance Document 2.12: Monitoring PM2.5 in Ambient Air Using
    Designated Reference or Class I Equivalent Methods. March 1998.

-------
                                                                              Project: Model QAPP
                                                                                   Element No: 14
                                                                                    Revision No: 1
                                                                                    Date: 4/17/98
                                                                             	Page 1 of 15
                        14.0 Quality Control Requirements
     QC is "the overall system of technical activities that measures the attributes and performance of a process, item,
 or service against defined standards to verify that they meet the stated requirements established by the customer." QC
 is both corrective and proactive in establishing techniques to prevent the generation of unacceptable data, and so the
 policy for corrective action should be outlined. This element will rely on information developed in section 7, "Quality
 Objectives and Criteria for Measurement Data," which establishes measurement performance criteria.
 To assure the quality of data from air monitoring measurements, two distinct and important
 interrelated functions must be performed. One function is the control of the measurement process
 through broad quality assurance activities, such as establishing policies and procedures,
 developing data quality objectives, assigning roles and responsibilities, conducting oversight and
 reviews, and implementing corrective actions. The other function is the control of the
measurement process through the implementation of specific quality control procedures, such as
audits, calibrations, checks, replicates, routine self-assessments, etc. In general, the greater the
control of a given monitoring system, the better will be the resulting quality of the monitoring data.

Figure 14.1 Quality control and quality assessment activities. (The figure is not reproduced here;
it depicts quality assurance as encompassing quality control activities such as training, technical
competence of analysts, good laboratory practices (GLP), good measurement practices (GMP),
standard operating procedures (SOPs), and proper facilities and instrumentation, together with
quality assessment activities such as internal standard reference materials, replicate measurements,
internal on-going inspections, quality control charts, interchange of analysts, interchange of
instruments, proper documentation, external standard reference materials (NPAP), technical
systems audits, interlaboratory comparisons, DQO/MQO assessment, and network reviews.)

Quality Control (QC) is the overall system of technical activities that measures the attributes and
performance of a process, item, or service against defined standards to verify that they meet the
stated requirements established by the customer.  In the case of the Ambient Air Quality Monitoring
Network, QC activities are used to ensure that measurement uncertainty, as discussed in Section 7,
is maintained




-------
                                                                                 Project: Model QAPP
                                                                                       Element No. 14
                                                                                       Revision No:l
                                                                                        Date: 4/17/98
	Page 2 of 15

within acceptance criteria for the attainment of the DQO.  Figure 14.1 represents a number of QC
activities that help to evaluate and control data quality for the PM2 5 program. Many of the
activities in this figure are implemented by the Department and are discussed in the appropriate
sections of this QAPP.


14.1 QC Procedures

    This element will need to furnish information on any QC checks not defined in other QAPP elements and should
reference other elements that contain this information where possible.

    Many of these QC checks result in measurement data that are used to compute statistical indicators of data quality.
For example, a series of dilute solutions may be measured repeatedly to produce an estimate of the instrument
detection limit. The formulas for calculating such Data Quality Indicators (DQIs) should be provided or referenced in
the text. This element should also prescribe any  limits that define acceptable data quality for these indicators (see also
Appendix D, "Data Quality Indicators"). A QC  checklist should be used to discuss the relation of QC to the overall
project objectives with respect to:

       the frequency of the check and the point in the measurement process in which the check sample is introduced,
       the traceability of the standards,
       the matrix of the check sample,
       the level or concentration of the analyte of interest,
       the actions to be taken in the event that  a QC check identifies a failed or changed measurement system,
       the formulas for estimating DQIs, and
       the procedures for documenting QC results, including control charts.

    Finally, this element should describe how the QC check data will be used to determine that measurement
performance is acceptable.  This step can be accomplished by establishing QC "warning" and "control" limits for the
statistical data generated by the QC checks (see standard QC textbooks or refer to EPA QA/G-5T for operational
details).
Day-to-day quality control is implemented through the use of various check samples or
instruments that are used for comparison. The measurement quality objectives table (Table 7-4)
in Section 7 contains a complete listing of these QC samples as well as other requirements for the
PM2.5 Program. The procedures for implementing the QC samples are included in the field and
analytical methods sections (Sections 11 and 13, respectively). As Figure 14.2 illustrates, various
types of QC samples have been inserted at various phases of the data operation to assess and control
measurement uncertainties. Tables 14-1 and 14-2 contain a summary of all the field and
laboratory QC samples. The following information provides some additional descriptions of these
QC activities,  how they will be used in the evaluation process, and what corrective actions will be
taken when they do not meet acceptance criteria.

-------
                                                                                                                          Project: Model QAPP
                                                                                                                               Element No: 14
                                                                                                                                 Revision No:
                                                                                                                                 Date: 4/17/98
                                                                                                                         	Page 3 of 15
Table 14-1 Field QC Checks
Requirement
Calibration Standards
Flow Rate Transfer Std
Field Thermometer
Field Barometer
Calibration/Verification
Flow Rate (FR) Calibration
FR multi-point verification
One point FR verification
External Leak Check
Internal Leak Check
Temperature Calibration
Temp multi-point verification
One- point temp Verification
Pressure Calibration
Pressure Verification
Clock/timer Verification
Blanks
Field Blanks
Precision Checks
Collocated samples
Accuracy
Flow rate audit
External Leak Check
Internal Leak Check
Temperature Check
Pressure Check
Audits (external assessments)
FRM Performance evaluation
Flow rate audit
External Leak Check
Internal Leak Check
Temperature Audit
Pressure Audit
Frequency
1/yr
1/yr
1/yr
If multi-point failure
1/yr
1/4 weeks
every 5 sampling events
every 5 sampling events
If multi-point failure
on installation, then 1/yr
1/4 weeks
on installation, then 1/yr
1/4 weeks
1/4 weeks
See 2.12 reference
every 6 days
1/3 mo (manual)
4/yr
4/yr
4/yr
4/yr
25% of sites 4/yr
1/yr
1/yr
1/yr
1/yr
1/yr
Acceptance Criteria
± 2% of NIST-traceable Std
± 0.1 °C resolution
± 0.5 °C accuracy
± 1 mm Hg resolution
± 5 mm Hg accuracy
± 2% of transfer standard
± 2% of transfer standard
± 4% of transfer standard
80 mL/min
80 mL/min
± 2% of standard
± 2 °C of standard
± 4 °C of standard
± 10 mm Hg
± 10 mm Hg
1 min/mo
± 30 µg
CV < 10%
± 4% of transfer standard
< 80 mL/min
< 80 mL/min
± 2 °C
± 10 mm Hg
± 10%
± 4% of audit standard
< 80 mL/min
< 80 mL/min
± 2 °C
± 10 mm Hg
CFR Reference
Part 50, App. L, Sec. 9.1, 9.2
not described
not described
not described
not described
Part 50, App. L, Sec. 9.2
Part 50, App. L, Sec. 9.2.5
Part 50, App. L, Sec. 7.4
Part 50, App. L, Sec. 9.3
Part 50, App. L, Sec. 9.3
Part 50, App. L, Sec. 7.4
Part 50, App. L, Sec. 8.2
Part 58, App. A, Sec. 3.5, 5.5
Part 58, App. A, Sec. 3.5.1
not described
not described
not described
Part 58, App. A, Sec. 3.5.3
not described
not described
not described
not described
not described
2.12 Reference
Sec. 6.3
Sec. 4.2 and 8.3
Sec. 6.3 and 6.6
Sec. 8.3
Sec. 8.3
Sec. 8.3
Sec. 8.3
Sec. 6.4
Sec. 6.4 and 8.2
Sec. 6.4 and 8.2
Sec. 6.5
Sec. 8.2
not described
Sec. 7.10
Sec. 10.3
Sec. 8.1
Sec. 10.3
Sec. 10.2
Information Provided
Certification of Traceability
Certification of Traceability
Certification of Traceability
Calibration drift and memory effects
Calibration drift and memory effects
Calibration drift and memory effects
Sampler function
Sampler function
Calibration drift and memory effects
Calibration drift and memory effects
Calibration drift and memory effects
Calibration drift and memory effects
Calibration drift and memory effects
Verification to assure proper function
Measurement system contamination
Measurement system precision
Instrument bias/accuracy
Sampler function
Sampler function
Calibration drift and memory effects
Calibration drift and memory effects
Measurement system bias
External verification bias/accuracy
Sampler function
Sampler function
Calibration drift and memory effects
Calibration drift and memory effects

-------
                                                                                                                           Project: Model QAPP
                                                                                                                                Element No: 14
                                                                                                                                  Revision No:
                                                                                                                                  Date: 4/17/98
                                                                                                                          	Page 4 of 15
Table 14-2 Laboratory QC

Requirement                      Frequency                         Acceptance Criteria
Blanks
  Lot Blanks                     3/lot                             ± 15 µg difference
  Lab Blanks                     3 per batch                       ± 15 µg difference
Calibration/Verification
  Balance Calibration            1/yr                              Manufacturer's spec
  Lab Temp Calibration           3 mo                              ± 2 °C
  Lab Humidity Calibration       3 mo                              ± 2%
Accuracy
  Balance Audit                  1/year                            ± 15 µg for unexposed filters
  Balance Check                  beginning, every 10th sample,     < 3 µg
                                 end
Calibration Standards
  Working Mass Stds              3-6 mo                            25 µg
  Primary Mass Stds              1/yr                              25 µg
Precision
  Duplicate filter weighings     1 per weighing session            ± 15 µg difference
-------
                                                                          Project: Model QAPP
                                                                               Element No: 14
                                                                                 Revision No:
                                                                                 Date: 4/17/98
                                                                                 Page 5 of 15


  Figure 14.2 PM2.5 Quality control scheme. (The figure is not reproduced here. It depicts the
  PM2.5 quality control sampling scheme across three stages: laboratory pre-field weighing, field
  sampling, and laboratory post-field weighing. Field blanks, flow checks, routine samples,
  collocated samples, FRM audits, lab blanks, and balance checks are placed at these stages to
  assess measurement system contamination, instrument accuracy, measurement system precision,
  measurement system bias, laboratory contamination, and balance precision/bias.)
14.1.1 Calibrations

Calibration is the comparison of a measurement standard or instrument with another standard or
instrument in order to report, or eliminate by adjustment, any variation (deviation) in the accuracy
of the item being compared¹. The purpose of calibration is to minimize bias.

For PM2.5, calibration activities follow a two-step process:

     1. Certifying the calibration standard and/or transfer standard against an authoritative
        standard, and
     2. Comparing the calibration standard and/or transfer standard against the routine
        sampling/analytical instruments.

Calibration requirements for the critical field and laboratory equipment are found in Tables 14-1
and 14-2, respectively; the details of the calibration methods are included in the calibration section
(Section 16) and in the field and laboratory methods sections (Sections 11 and 13, respectively).

14.1.2 Blanks

Blank samples are used to determine contamination arising from principally four sources: the
environment from which the sample was collected/analyzed, the reagents used in the analysis, the
apparatus used, and the operator/analyst performing the data operation. Three types of blanks

-------
                                                                         Project: Model QAPP
                                                                              Element No: 14
                                                                                Revision No:
                                                                                Date: 4/17/98
                                                                         	Page 6 of 15
will be implemented in the PM2.5 Program:

Lot blanks - a shipment of 46.2 mm filters will periodically be sent from EPA to Palookaville.
Each shipment must be tested to determine the length of time it takes the filters to stabilize. Upon
arrival of each shipment, 3 lot blanks will be randomly selected from the shipment and subjected
to the conditioning/pre-sampling weighing procedures.  The blanks will be measured every 24
hours for a minimum of one week to determine the length of time it takes them to reach a stable
weight reading.

Field blanks - provide an estimate of total measurement system contamination. By comparing
information from laboratory blanks against the field blanks, one can assess contamination from
field activities. Details of the use of the field blanks can be found in the field SOPs (Appendix E).

Lab blanks - provide an estimate of contamination occurring at the weighing facility. Details of
the use of the lab blanks can be found in the field SOPs (Appendix E).

Blank Evaluation —

The Department will include 3 field and 3 lab blanks in each weighing session batch. A batch is
defined in Section 14.2.  The following statistics will be generated for data evaluation purposes:

Difference for a single check (d_i) - The difference, d_i, for each check is calculated using Equation
1, where X_i represents the concentration produced from the original weight and Y_i represents the
concentration reported for the duplicate weight.

        d_i = Y_i - X_i          Equation 1

Percent Difference for a Single Check (d_i) - The percentage difference, d_i, for each check is
calculated using Equation 2, where X_i represents the original weight and Y_i represents the
concentration reported for the duplicate weight.

        d_i = \frac{Y_i - X_i}{X_i} \times 100          Equation 2

Mean difference for a batch (\bar{d}_z) - The mean difference, \bar{d}_z, for both field and lab
blanks within a weighing session batch is calculated using Equation 3, where d_1 through d_n
represent individual differences (calculated from Equation 1) and n represents the number of
blanks in the batch.

        \bar{d}_z = \frac{d_1 + d_2 + d_3 + \ldots + d_n}{n}          Equation 3

-------
                                                                          Project: Model QAPP
                                                                               Element No: 14
                                                                                Revision No:
                                                                                Date: 4/17/98
	Page 7  of 15

Corrective action - The acceptance criterion for field blanks is a 30 µg difference, while for lot and
lab blanks it is a 15 µg difference, as determined by Equation 1. However, the mean difference
based upon the number of blanks in each batch will be used for comparison against the acceptance
criteria.  If the mean difference of either the field or laboratory blanks is greater than 15 µg, all
the samples in the weighing session will be re-weighed. Prior to re-weighing, the laboratory
balance will be checked for proper operation.  If the blank means of either the field or lab blanks
are still outside the acceptance criteria, all samples within the weighing session will be flagged with
the appropriate flag (FFK or FLB), and efforts will be made to determine the source of
contamination.   In theory, field blanks should contain more contamination than laboratory blanks.
Therefore, if the field blanks are outside of the criteria while the lab blanks are acceptable,
weighing can continue on the next batch of samples while field contamination sources are
investigated.  If the mean difference of the laboratory blanks is greater than 20 µg and 2 or more
of the blanks were greater than 15 µg, the laboratory weighing will stop until the issue is
satisfactorily resolved.  The laboratory technician will alert the Laboratory Branch Manager and
QA Officer of the problem. The problem and solution will be reported and appropriately filed
under response and corrective action reports (PROG/082 OVER/658, see Section 9).

Lab and field blanks will be control charted (see Section 14.3). The percent difference calculation
(Equation 2) is used for control charting purposes and can be used to determine equilibrium status.
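
To make the blank evaluation concrete, the sketch below (illustrative only; the function names and
weights are assumptions, not Department data) applies Equations 1 and 3 and the batch mean
criterion described above:

    # Sketch of the blank evaluation described above: per-blank differences
    # (Equation 1), the batch mean difference (Equation 3), and comparison of
    # the mean against the 15 ug acceptance criterion. Illustrative only;
    # the weights below are hypothetical.

    def blank_differences(presampling_ug, postsampling_ug):
        """Equation 1 for each blank: d = Y - X."""
        return [y - x for x, y in zip(presampling_ug, postsampling_ug)]

    def mean_difference(differences):
        """Equation 3: mean of the individual differences."""
        return sum(differences) / len(differences)

    lab_blank_pre = [142500.0, 143120.0, 141980.0]     # hypothetical tare weights, ug
    lab_blank_post = [142506.0, 143131.0, 141975.0]    # hypothetical reweights, ug

    d = blank_differences(lab_blank_pre, lab_blank_post)
    if abs(mean_difference(d)) > 15.0:
        print("Re-weigh the session and check the balance before proceeding")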

14.1.3 Precision Checks

Precision is the measure of mutual agreement among individual measurements of the same
property, usually under prescribed similar conditions. In order to meet the data quality objectives
for precision, the Department must ensure the entire measurement process is within statistical
control. Two types of precision measurements will be made in the PM2 5 Program.

    >  Collocated monitoring
    *•  Filter duplicates

Collocated Monitoring - -

In order to evaluate total measurement precision, collocated monitoring will be implemented, as
referenced in CFR.  Therefore, every method designation will:

    a. have 25% of the monitors collocated (values of .5 and greater round up).
    b. have at least 1 collocated monitor (if total number less than 4). The first collocated
       monitor must be the FRM.
    c. have 50% of the collocated monitors be FRM monitors and 50% must be the same
       method designation. If there is an odd number of collocated monitors required, bias in
       favor of the FRM.

Palookaville will be implementing 5 FRM monitors, 4 being sequential types of the same method

-------
                                                                        Project: Model QAPP
                                                                             Element No: 14
                                                                               Revision No:
                                                                               Date: 4/17/98
                                                                         	Page 8 of 15

designation, and 1 being a single channel monitor. Therefore, the Department is required to have
2 collocated monitors. The location of these monitors is described in Section 10, but it is
anticipated that these sites will collect concentrations around the NAAQS, or will be sites where
higher concentrations are expected.
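
The collocation arithmetic can be illustrated with a short sketch of the rules in items a-c above
(the function name and structure are assumptions; 40 CFR Part 58, Appendix A remains the
governing text):

    # Sketch of the collocation rules in items a-c above: 25% of the monitors in
    # each method designation (0.5 and greater rounds up), with at least one
    # collocated monitor when a designation has fewer than 4 monitors.
    import math

    def required_collocated(monitor_count):
        if monitor_count <= 0:
            return 0
        needed = 0.25 * monitor_count
        rounded = math.floor(needed) + (1 if needed - math.floor(needed) >= 0.5 else 0)
        return max(1, rounded)

    # Palookaville: 4 sequential samplers plus 1 single channel sampler
    print(required_collocated(4) + required_collocated(1))   # 2 collocated monitors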

Evaluation of Collocated Data- Collocated measurement pairs are selected for use in the
precision calculations only when both measurements are above  6  ug/m3. However, all collocated
data will be reported to AIRS.

The following algorithms will be used to evaluate collocated data. These algorithms are included
in 40 CFR Part 58 Appendix A.  The equation numbers in 40 CFR will also be utilized in this
QAPP.

Percent Difference for a Single Check (d_i). The percentage difference, d_i, for each check is
calculated by using Equation 19, where X_i represents the concentration produced from the
primary sampler and Y_i represents the concentration reported for the duplicate sampler.

        d_i = \frac{Y_i - X_i}{(X_i + Y_i)/2} \times 100          Equation 19

Coefficient of Variation (CV) for a Single Check (CV_i).  The coefficient of variation, CV_i, for
each check is calculated by dividing the absolute value of the percentage difference, d_i, by the
square root of two, as shown in Equation 20.

        CV_i = \frac{|d_i|}{\sqrt{2}}          Equation 20

Precision of a Single Sampler - Quarterly Basis (CV_{j,q}).  For particulate sampler j, the
individual coefficients of variation (CV_i) during the quarter are pooled using Equation 21, where
n_{j,q} is the number of pairs of measurements from collocated samplers during the quarter.

        CV_{j,q} = \sqrt{\frac{1}{n_{j,q}} \sum_{i=1}^{n_{j,q}} CV_i^2}          Equation 21

The 90 percent confidence limits for the single sampler's CV are calculated using Equations
22 and 23, where \chi^2_{0.05,df} and \chi^2_{0.95,df} are the 0.05 and 0.95 quantiles of the chi-square (\chi^2)
-------
                                                                         Project: Model QAPP
                                                                              Element No: 14
                                                                                Revision No:
                                                                                Date: 4/17/98
                                                                         	Page 9 of 15
 distribution with n_{j,q} degrees of freedom.

        Lower Confidence Limit = CV_{j,q} \sqrt{\frac{n_{j,q}}{\chi^2_{0.95,\,n_{j,q}}}}          Equation 22

        Upper Confidence Limit = CV_{j,q} \sqrt{\frac{n_{j,q}}{\chi^2_{0.05,\,n_{j,q}}}}          Equation 23

Precision of a Single Sampler - Annual Basis. For particulate sampler j, the individual
coefficients of variation, CV_i, produced during the calendar year are pooled using Equation 21,
where n_j is the number of checks made during the calendar year. The 90 percent confidence limits
for the single sampler's CV are calculated using Equations 22 and 23, where \chi^2_{0.05,df} and \chi^2_{0.95,df}
are the 0.05 and 0.95 quantiles of the chi-square (\chi^2) distribution with n_j degrees of freedom.
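
A short sketch of Equations 19 through 23 (illustrative only; it assumes the scipy library is
available for the chi-square quantiles and uses made-up concentration pairs rather than network
data):

    # Sketch of Equations 19 through 23: percent difference, single-check CV,
    # pooled quarterly CV, and 90 percent confidence limits. The concentration
    # pairs are made up, and scipy is assumed to be available.
    import math
    from scipy.stats import chi2

    def percent_difference(primary, duplicate):                 # Equation 19
        return (duplicate - primary) / ((primary + duplicate) / 2.0) * 100.0

    def single_check_cv(d_percent):                             # Equation 20
        return abs(d_percent) / math.sqrt(2.0)

    def pooled_cv(cv_values):                                   # Equation 21
        return math.sqrt(sum(cv * cv for cv in cv_values) / len(cv_values))

    pairs = [(15.2, 15.9), (8.4, 8.1), (22.7, 23.5)]            # hypothetical ug/m3 pairs
    cvs = [single_check_cv(percent_difference(x, y)) for x, y in pairs]
    cv_q = pooled_cv(cvs)
    n = len(cvs)
    lower = cv_q * math.sqrt(n / chi2.ppf(0.95, n))             # Equation 22
    upper = cv_q * math.sqrt(n / chi2.ppf(0.05, n))             # Equation 23
    print(round(cv_q, 2), round(lower, 2), round(upper, 2))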

Corrective Action: Single Monitor - The precision data quality objective of 10% coefficient of
variation (CV) is based upon the evaluation of three years of collocated precision data. The goal
is to ensure that precision is maintained at this level. Therefore, precision estimates for a single
pair of collocated instruments, or even for a quarter, may be greater than 10% while the three-
year average is less than or equal to 10%. Therefore, single collocated pairs with values >10%
will be flagged PCS and reweighed. If the value remains between 10 and 20%, the field technician
will be alerted to the problem. If the CV is greater than 20% for both the initial weighing and the
reweighing, all the primary sampler data since the last precision check will be flagged PCS and
corrective action will be initiated. Paired CVs and percent differences will be control charted to
determine trends (Section 14.2). The laboratory technician will alert the Laboratory Branch
Manager and QA Officer of the problem. The problem and solution will be reported and
appropriately filed under response and corrective action reports (PROG/082 OVER/658, see
Section 9).

Corrective Action: Quarter - Usually, corrective action will be initiated and imprecision rectified
before a quarter's worth of data fails to meet the 10% CV criterion. However, in the case where the
quarter's CV is greater than 20%, the routine data for that monitor for that quarter will be flagged
(FPC).  The QA Office and the Lab and Air Monitoring Branch Managers will work together to
identify the problem and a solution. The EPA Regional Office will be alerted to the issue and may
be asked to help find a common solution. The problem and solution will be reported and
appropriately filed under response and corrective action reports (PROG/082 OVER/658, see Section 9).

Duplicate Laboratory Measurements —

During laboratory preweighing  and post weighing sessions, a routine filter from the sampling

-------
                                                                          Project: Model QAPP
                                                                               Element No: 14
                                                                                Revision No:
                                                                                Date: 4/17/98
	Page 10 of 15

batch will be selected for a second weighing. Equations 1 and 2 will be generated for this
information.  The difference between these two weights of the same filter must be less than 15 µg.
If this criterion is not met, the pair of values will be flagged FLD. Failure may be due to
transcription errors, microbalance malfunction, or routine samples that have not reached
equilibrium.  Other QC checks (balance standards and lab blanks) will eliminate microbalance
malfunction. If the duplicate does not meet the criterion, a second routine sample will be selected
and reweighed as a second duplicate check. If this second check fails the acceptance criterion and
the possibility of balance malfunction and transcription errors has been eliminated, all samples in
the batch will be equilibrated for another 24 hours and reweighed. Corrective actions will
continue until duplicate weights for the batch meet the acceptance criterion.
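
The duplicate-weighing decision flow described above can be sketched as follows (illustrative
pseudologic with assumed names and weights; the 15 µg criterion is the one stated in this
subsection):

    # Sketch of the duplicate-weighing decision flow described above; the names
    # and weights are illustrative, not Department data.

    def duplicate_ok(first_weight_ug, second_weight_ug, limit_ug=15.0):
        return abs(second_weight_ug - first_weight_ug) < limit_ug

    def evaluate_duplicates(first_pair, second_pair=None):
        """first_pair / second_pair: (original weight, duplicate weight) in ug."""
        if duplicate_ok(*first_pair):
            return "duplicate check passed"
        if second_pair is not None and duplicate_ok(*second_pair):
            return "second duplicate check passed"
        return "re-equilibrate the batch for 24 hours and reweigh"

    print(evaluate_duplicates((145230.0, 145252.0), (143810.0, 143818.0)))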

14.1.4 Accuracy or Bias Checks

Accuracy is defined as the degree of agreement between an observed value and an accepted
reference value and includes a combination of random error (precision) and systematic error
(bias). The following accuracy or bias checks are implemented in the PM2.5 program:

     •  Collocated monitors
     •  Flow rate audits
     •  Balance checks
     •  FRM performance evaluations

Collocated Monitors --

Although the collocated monitors are primarily used for evaluating and controlling precision, they
can be used to determine  accuracy or bias. By using equation 19 to determine percent difference,
one can track trends or bias between the two instruments without knowing which instrument is
producing the "true" value.  Use of the FRM performance evaluation information (discussed
below) in conjunction with collocation data  should help improve the quality of data.

Corrective Action - The percent difference of the paired values will be control charted to
determine trends. If it appears that there is a statistically significant bias (> 10% at the 90%
confidence level) between the pairs, corrective action will be initiated. The process will include
eliminating uncertainties that may be occurring at the filter handling, transport, and laboratory stages,
in order to determine whether the bias is truly at the instrument. Corrective actions at the instrument
will include multi-point temperature, pressure, and flow rate checks as well as complete
maintenance activities.  Additional corrective action could include a request for vendor servicing
or a request for Region Y to implement an FRM performance evaluation.
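As a point of reference only, the following sketch illustrates one way the "statistically significant bias (> 10% at the 90% confidence level)" trigger could be evaluated; the QAPP does not prescribe a particular statistical test in this section, so the t-based confidence interval below is an assumption, not the Department's procedure.

    import math
    from scipy import stats

    def significant_bias(percent_diffs, limit=10.0, confidence=0.90):
        """Return True if the mean collocated percent difference appears to
        exceed the limit at the stated confidence level (illustrative)."""
        n = len(percent_diffs)
        mean = sum(percent_diffs) / n
        sd = math.sqrt(sum((d - mean) ** 2 for d in percent_diffs) / (n - 1))
        t = stats.t.ppf(1.0 - (1.0 - confidence) / 2.0, df=n - 1)
        lower = mean - t * sd / math.sqrt(n)
        upper = mean + t * sd / math.sqrt(n)
        # Flag only when the whole confidence interval lies beyond +/- limit
        return lower > limit or upper < -limit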

Flow Rate Audits --

Since the Department will be implementing manual rather than continuous sampling devices, a
flow rate audit will be implemented every quarter. Details of the implementation aspects of the audit
are included in Section 11. The audit is made by measuring the sampler's normal operating flow
rate using a certified flow rate transfer standard. The flow rate standard used for auditing will not
be the same flow rate standard used to calibrate the sampler. However, both the calibration
 standard and the audit standard may be referenced to the same primary flow rate or volume
 standard. Report the audit (actual) flow rate and the corresponding flow rate indicated or
 assumed by the sampler. The procedures used to calculate measurement uncertainty are described
 below.

Accuracy of a Single Sampler - Single Check (Quarterly) Basis (d_i). The percentage
difference (d_i) for a single flow rate audit i is calculated using Equation 13, where X_i represents
the audit standard flow rate (known) and Y_i represents the indicated flow rate.

                    d_i = [(Y_i - X_i) / X_i] x 100          Equation 13

Bias of a Single Sampler - Annual Basis (D_j). For an individual particulate sampler j, the
average (D_j) of the individual percentage differences (d_i) during the calendar year is calculated
using Equation 14, where n_j is the number of individual percentage differences produced for
sampler j during the calendar year.

                    D_j = (1 / n_j) x Σ d_i          Equation 14

Bias for Each EPA Federal Reference and Equivalent Method Designation Employed by
the Department - Quarterly Basis (D_k,q). For method designation k used by the reporting
organization, quarter q's single sampler percentage differences (d_i) are averaged using Equation
16, where n_k,q is the number of individual percentage differences produced for method designation
k in quarter q.

                    D_k,q = (1 / n_k,q) x Σ d_i          Equation 16
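A short sketch of how these statistics could be computed is given below; it simply restates Equations 13, 14, and 16 in code (the function and variable names are illustrative and are not taken from the Department's data system).

    def percent_difference(audit_flow, indicated_flow):
        """Equation 13: percent difference d_i for a single flow rate audit
        (audit_flow = X_i, the audit standard; indicated_flow = Y_i)."""
        return 100.0 * (indicated_flow - audit_flow) / audit_flow

    def annual_sampler_bias(percent_diffs):
        """Equation 14: D_j, the average of sampler j's individual percent
        differences for the calendar year."""
        return sum(percent_diffs) / len(percent_diffs)

    def quarterly_method_bias(percent_diffs):
        """Equation 16: D_k,q, the average of all individual percent
        differences for method designation k in quarter q."""
        return sum(percent_diffs) / len(percent_diffs)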
Corrective Action - The single sampler accuracy requirement is ± 4%. If the audit violates the
acceptance criteria, the sampling instrument will be checked for internal and external leaks,
temperature and pressure will be verified to be within acceptable ranges, and the audit will be run a second
time. If the audit is still unacceptable, a multi-point calibration followed by a one-point
verification is required.  Routine data, back to an acceptable audit,  will be flagged and reviewed
to determine validity (see Section 23). In addition,  one would expect that the flow rate
calibration verification checks that will be implemented every 5 sampling events (see section 16)
would indicate a drift towards unacceptable accuracy.  If a review of the flow rate calibration
verification check data does not show a problem, there is a potential that one or both of the flow
rate standards need to be recertified.

Balance Checks -

Balance checks are frequent checks of the balance working standards (100 and 200 mg standards)
on the balance to ensure that the balance remains within acceptance criteria throughout the pre- and
post-sampling weighing sessions. Palookaville will use ASTM Class 1 weights for its primary and
secondary (working) standards. Both working standards will be measured at the beginning
and end of the sample batch, and one will be measured after every 10 filters. Balance
check samples will be control charted (see Table 14-5).

Balance Check Evaluation - The following algorithm will be used to evaluate the balance checks.

Difference for a single check (d_i) - The difference, d_i, for each check is calculated using
Equation 3, where X represents the certified mass of the check weight and Y represents the reported weight.

                        d_i  =  Y - X          Equation 3
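As a purely illustrative example (hypothetical values): if the certified mass of the 100 mg working standard is X = 100.000 mg and the balance reports Y = 100.002 mg, then d = Y - X = +0.002 mg (+2 µg), which meets the < 3 µg criterion; a reported value of 100.004 mg (d = +4 µg) would trigger the corrective action described below.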
Corrective Action - The difference between the reported weight and the certified weight must be <
3 µg. Since this is the first check before any pre- or post-sampling weighings, if the acceptance
criterion is not met, corrective action will be initiated. Corrective action may be as simple as
allowing the balance to perform internal calibrations or to warm up sufficiently, which may
require checking the balance weights a number of times. If the acceptance criterion is still not met,
the laboratory technician will be required to verify the working standards against the primary
standards. Finally, if it is established that the balance does not meet acceptance criteria for both
the working and primary standards, and other troubleshooting techniques fail, the Libra Balance
Company service technician (see Section 15) will be called to perform corrective action.

If the balance check fails acceptance criteria during a run, the 10 filters weighed prior to the
failure will be rerun. If the balance check continues to fail, troubleshooting, as discussed above,
will be initiated. The values of the 10 samples weighed prior to the failure will be
recorded and flagged, but the filters will remain with the unweighed samples in the batch to be reweighed
when the balance meets the acceptance criteria. The data acquisition system will flag any balance
check outside the acceptance criteria as FIS.

 FRM Performance Evaluation --
The Federal Reference Method (FRM) Performance Evaluation is a quality assurance activity
which will be used to evaluate measurement system bias of the PM2.5 monitoring network. The
pertinent regulations for this performance evaluation are found in 40 CFR Part 58, Appendix A,
section 3.5.3. The strategy is to collocate a portable FRM PM2.5 air sampling instrument with an
 established routine air monitoring site, operate both monitors in exactly the same manner, and
 then compare the results of this instrument against the routine sampler at the site.  The EPA will
 be implementing this program and will inform the Department when an evaluation will be
 conducted. The evaluation will be conducted on a regularly scheduled sampling day and the
 filters from the evaluation instrument will be sent to a national laboratory in Region 10 for
 measurement. The comparison of data will be accomplished by EPA personnel using the
Aerometric Information Retrieval System (AIRS) data base. It must be noted that the
performance evaluation is an estimate of the uncertainty of the measurement system and not of the
instrument alone. Therefore, biases may be attributed to sample handling, transportation, and
laboratory activities as well as to the instrument. The statistics used in the assessment are included
in 40 CFR Part 58.

 Corrective Action - The U.S. EPA will notify the Department of the evaluation results within 10
days of sampling. The bias acceptance criterion for the data comparison is ± 10%. If it appears
 that there is a bias, corrective action will be  initiated. The process will include an attempt to
 determine at what data collection phase(s) the majority of the measurement errors are occurring.
 This may require that Region Y conduct additional FRM performance  evaluations to trouble
 shoot the process.

 14.2 Sample Batching - QC Sample Distribution

In order to ensure that the Department can review all types of QC samples within a weighing
session, the Department will use the concept of sample batches. A batch of samples will consist of
all routine and QC samples collected in a two-week sample period and will consist of the samples
indicated in Table 14-3.

Table 14-3 Sample Batch
Sample | Number
5 sites, 1/3 day sampling | 20
Collocated monitors (2) | 4
Duplicate filter weighings | 1
Lab blanks | 3
Field blanks | 3
Balance checks | 7
Total | 38

Sample Distribution —

QC samples need to be interspersed within the batch in order to provide data quality  information
throughout the batch weighing session. Table 14-4 represents the sample batch arrangement the
laboratory will use during post-sampling weighing activities.

Table 14-4 Batch Sample Distribution
1) Balance Check 1
2) Balance Check 2
3) Lab Blank
4) Field Blank
5) Site A1 wk 1 sam 1
6) Site A1 wk 1 sam 2
7) Site A2 wk 1 sam 1
8) Site A2 wk 1 sam 2
9) Site A3 wk 1 sam 1
10) Site A3 wk 1 sam 2
11) Site A4 wk 1 sam 1
12) Site A4 wk 1 sam 2
13) Balance Check
14) Site B1 wk 1 sam 1
15) Site B1 wk 1 sam 2
16) Coll Sam Site A1 wk 1
17) Coll Sam Site B1 wk 1
18) Site A1 wk 2 sam 1
19) Site A1 wk 2 sam 2
20) Site A2 wk 2 sam 1
21) Site A2 wk 2 sam 2
22) Site A3 wk 2 sam 1
23) Site A3 wk 2 sam 2
24) Balance Check
25) Lab Blank
26) Field Blank
27) Site A4 wk 2 sam 1
28) Site A4 wk 2 sam 2
29) Site B1 wk 2 sam 1
30) Site B1 wk 2 sam 2
31) Coll Sam Site A1 wk 2
32) Coll Sam Site B1 wk 2
33) Balance Check
34) Lab Dup Site A1 wk 1 sam 1
35) Lab Blank
36) Field Blank
37) Balance Check 1
38) Balance Check 2


14.3 Control Charts

Control charts will be used extensively by the Department. They provide a graphical means of
determining whether various phases of the measurement process are in statistical control. The
Department will utilize property charts, which graph single measurements of a standard or a mean
of several measurements. The Department will also develop precision charts, which utilize the
standard deviation of the measurement process. Table 14-5 indicates which QC samples will be
control charted. The control charts will be utilized as an "early warning system" to evaluate
trends in precision and bias. They will be discussed in the Annual QA Report (Section 21).
They will be appropriately filed (SAMP/223) and archived.
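As an illustration of how such charts might be generated, the sketch below computes conventional Shewhart-style limits (center line with two- and three-standard-deviation bands) from historical QC results; the Department's charting software and its exact warning and control limits are not specified in this section, so the limits shown are an assumption.

    import statistics

    def chart_limits(history):
        """Center line, warning (2 SD) and control (3 SD) limits from past QC results."""
        center = statistics.mean(history)
        sd = statistics.stdev(history)
        return {
            "center": center,
            "warning": (center - 2 * sd, center + 2 * sd),
            "control": (center - 3 * sd, center + 3 * sd),
        }

    def out_of_control(value, limits):
        """True if a new QC result falls outside the control limits."""
        low, high = limits["control"]
        return value < low or value > high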

Table 14-5 Control Charts

QC Check | Plotting Technique
Lot blanks | Mean value of 3 blanks for each measurement
Flow rate calibration verification check | Single values plotted
Lab/field blanks | Mean value of each batch
Flow rate audit | Single values plotted
Balance check | Mean value of each batch
Collocated monitoring pairs | Percent difference of each pair charted by site; coefficient of variation of each pair; coefficient of variation of all sites per quarter
Duplicate filter weighings | Percent difference of each pair

 References
1. Taylor, J.K. 1987. Quality Assurance of Chemical Measurements. Lewis Publishers, Chelsea,
   Michigan. 328 pp.

2. U.S. EPA (1997b) Revised Requirements for Designation of Reference and Equivalent Methods
   for PM2.5 and Ambient Air Quality Surveillance for Particulate Matter - Final Rule. 40 CFR
   Parts 53 and 58. Federal Register, 62(138):38763-38854. July 18, 1997.

-------
                                                                                Project: Model QAPP
                                                                                     Element No: 15
                                                                                      Revision No: 1
                                                                                       Date: 4/17/98
                                                                               	Page 1 of5
   15.0 Instrument/Equipment Testing, Inspection, and Maintenance
                                        Requirements
    The purpose of this element of the QAPP is to discuss the procedures used to verify that all instruments and
equipment are maintained in sound operating condition and are capable of operating at acceptable performance
levels. This section describes how inspections and acceptance testing of environmental sampling and measurement
systems and their components will be performed and documented.
 15.1   Purpose/Background

 The purpose of this element in the Palookaville QAPP is to discuss the procedures used to verify
 that all instruments and equipment are maintained in sound operating condition and are capable of
 operating at acceptable performance levels. All instrument inspection and maintenance activities
 are documented and filed under AIRP/486. See Section 9 for document and record details.

 15.2   Testing
     The procedures described should (1) reflect consideration of the possible effect of equipment failure on overall
 data quality, including timely delivery of project results; (2) address any relevant site-specific effects (e.g.,
 environmental conditions); and (3) include procedures for assessing the equipment status. This element should
 address the scheduling of routine calibration and maintenance activities, the steps that will be taken to minimize
 instrument downtime, and the prescribed corrective action procedures for addressing unacceptable inspection or
 assessment results. This element should also include periodic maintenance procedures and describe the availability of
 spare parts and how an inventory of these parts is monitored and maintained.  The reader should be supplied with
 sufficient information to review the adequacy of the instrument/equipment management program. Appending SOPs
 containing this information to the QAPP and  referencing the SOPs in the text are acceptable.

     Inspection and testing procedures may employ reference materials, such as the National Institute of Standards
 and Technology's (NIST's) Standard Reference Materials (SRMs), as  well as QC standards or an equipment
 certification program. The accuracy of calibration standards is important because all data will be measured in
 reference to the standard used. The types of standards or special programs should be noted in this element, including
 the inspection and acceptance testing criteria  for all components. The acceptance limits for verifying the accuracy of
 all working standards against primary grade standards should also be provided.
All PM2.5 samplers used in the Palookaville PM2.5 Ambient Air Quality Monitoring Network will
be designated federal reference methods (FRM) that have been certified as such by EPA.
Therefore, they are assumed to be of sufficient quality for the data collection operation. Testing
of such equipment is accomplished by EPA through the procedures described in 40 CFR Part 50.
Prior to field installation, Palookaville will assemble and run the samplers at the laboratory. The
field operators will perform external and internal leak checks and temperature, pressure and flow
rate multi-point verification checks. If any of these checks are out of specification (see Table 14-1),
the Department will contact the vendor for initial corrective action. Once installed at the site,
the field operators will run the tests mentioned above. If the sampling instrument meets the
acceptance criteria,  it will be assumed to be operating properly.  These tests will be properly
documented and filed (AIRP/486) as indicated in Section 9.

15.3  Inspection

Inspection requirements for various equipment and components are provided here. Inspections are subdivided
into two sections: one pertaining to weigh room laboratory issues and one associated with field
activities.

15.3.1 Inspection in Weigh Room Laboratory

There are  several items that need routine inspection in the weigh room laboratory.  Table 15-1
details the items to inspect and how to appropriately document the inspection.

Table 15-1 Inspections in the Weigh Room Laboratory
Item | Inspection Frequency | Inspection Parameter | Action if Item Fails Inspection | Documentation Requirement
Weigh room temperature | Daily | 20 - 23 °C | 1) Check HVAC system; 2) Call service provider that holds maintenance agreement | 1) Document in weigh room log book; 2) Notify Lab Manager
Weigh room humidity | Daily | 30 - 40% RH | 1) Check HVAC system; 2) Call service provider that holds maintenance agreement | 1) Document in weigh room log book; 2) Notify Lab Manager
Dust in weigh room | Monthly | Use glove and visually inspect | Clean weigh room | Document in weigh room log book
15.3.2 Inspection of Field Items
There are several items to inspect in the field before and after a PM2.5 sample has been taken.
Table 15-2 details the inspections performed in the field before and after samples are taken.

Table 15-2 Inspection of Field Items
Item | Inspection Frequency | Inspection Parameter | Action if Item Fails Inspection | Documentation Requirement
Sample downtube | Every site visit | Visible particulate | Clean with a clean dry cloth | Document in log book
WINS impactor well | Every site visit | "Cone" shape of particulate on impactor well | Replace impactor well (including new impactor oil) | Document in log book
Rain collector | Every site visit | > 1/3 full | Empty | Document in log book
O-rings | Every site visit | Any damage | Replace | Document in log book
Filter cassettes | After each sample run | Visible particulate | Check downtube and WINS impactor | Document in log book
Cassette seals | Each sample | Clean and smooth | Clean with a clean dry cloth, or replace as needed | Document when replaced
In-line filter | Every 6 months | Loaded particulate | Replace | Document in log book
Battery | Every 6 months | Decrease in voltage | Replace | Document in log book
15.4  Maintenance

There are many items that need maintenance attention in the PM2.5 network. This section
describes those items according to whether they are weigh room items or field items.

15.4.1 Weigh Room Maintenance Items

The successful execution of a preventive maintenance program for the weigh room laboratory will
go a long way towards the success of the entire PM2.5 program. In the Palookaville PM2.5
network, weigh room laboratory preventive maintenance is handled through the use of two
contractors. The Bert and Ernie HVAC Company has a contract to take care of all preventive
maintenance associated with the heating, ventilation, and air conditioning (HVAC) system.
Additionally, the Bert and Ernie HVAC Company can be paged for all emergencies pertaining to
the weigh room laboratory HVAC system. Preventive maintenance for the micro-balance is
performed by the Libra Balance Company service technician and is scheduled to occur at initial
set-up and every 6 months thereafter. In the event that there is a problem with the micro-balance
that cannot be resolved within the Palookaville organization, the Libra Balance Company service
technician can be paged. The Department's service agreement with Libra Balance Company calls
for service within 24 hours. The service technician will also have a working micro-balance in
his/her possession that will be loaned to Palookaville in the event that the Department's
micro-balance cannot be repaired on-site.

Service agreements with both the Bert and Ernie HVAC Company and the Libra Balance
Company are expected to be renewed each year. In the event either company's service agreement
is not renewed, a new service provider will be selected and a contract put in place.

The following table details the weigh room maintenance items, how frequently they will be
replaced, and who will be responsible for performing the maintenance.

Table 15-3  Preventive Maintenance in Weigh Room Laboratories
Item | Maintenance Frequency | Responsible Party
Micro-balance maintenance and multi-point calibration | 6 Months (maintenance); Yearly (calibration) | Libra Balance Company
Polonium strip replacement | 6 Months | Libra Balance Company
Comparison of NIST standards to laboratory working and primary standards | 6 Months | Libra Balance Company
Cleaning weigh room | Monthly | Balance Analyst
HEPA filter replacement | Monthly | Balance Analyst
Sticky floor mat (just outside weigh room) | 6 Months | Balance Analyst
HVAC system preventive maintenance | Yearly | Bert and Ernie HVAC Company
Computer back-up | Weekly | Balance Analyst
Computer virus check | Weekly | Balance Analyst
Computer system preventive maintenance (clean out old files, compress hard drive, inspect) | Yearly | PC support personnel
15.4.2 Field Maintenance Items

There are many items associated with the preventive maintenance of a successful field
program. Table 15-4 details the maintenance checks of the PM2.5 samplers and their
frequency.

 Table 15-4 Preventive Maintenance of Field Items
Item | Maintenance Frequency | Location Maintenance Performed
Clean WINS PM2.5 impactor | Every 5 sample episodes | At lab
PM10 inlet | Monthly | At site
Filter cassettes | Each run | At lab
In-line filter | 6 Months | At site
Air screens (under sampler's rain hood) | 6 Months | At site
Clean filter holding area, internal and external | Monthly | At site
Sample pump rebuild | Every 10,000 hours of operation | At lab
 References

 The following documents were utilized in the development of this section:

1. U.S. EPA (1997a) National Ambient Air Quality Standards for Particulate Matter - Final Rule.
   40 CFR Part 50. Federal Register, 62(138):38651-38760. July 18, 1997.

-------
                                                                             Project: Model QAPP
                                                                                  Element No: 16
                                                                                  Revision No: 1
                                                                                   Date: 4/17/98
                                                                            	Page 1 of 10
                  16.0 Instrument Calibration and Frequency
     This element of the QAPP concerns the calibration procedures that will be used for instrumental analytical
 methods and other measurement methods that are used in environmental measurements. It is necessary to distinguish
 between defining calibration as the checking of physical measurements against accepted standards and as determining
 the relationship (function) of the response versus the concentration. The American Chemical Society (ACS) limits the
 definition of the term calibration to the checking of physical measurements against accepted standards, and uses the
 term standardization to describe the determination of the response function.
 16.1 Instrumentation Requiring Calibration
    The QAPP should identify any equipment or instrumentation that requires calibration to maintain acceptable
 performance. While the primary focus of this element is on instruments of the measurement system (sampling and
 measurement equipment), all methods require standardization to determine the relationship between response and
 concentration
 16.1.1  Mass Analysis by Gravimetry-Laboratory Microbalance

 The laboratory support for  Palookaville includes calibration of the Libra Model 101
 microbalance. As indicated in Section 13, the balance is calibrated (and mass standard check
 weights recertified) once a year under a service agreement. The service technician performs
 routine maintenance and makes any balance response adjustments that the calibration shows to be
necessary. During the visit by the service technician, both the in-house primary and secondary
(working) standards are checked against the service technician's standards to ensure acceptability.
All of these actions are documented in the service technician's report, a copy of which is provided
to the laboratory manager and, after review, is appropriately filed (see Section 9).

 16.1.2 Flow Rate -Laboratory

 Laboratory support performs the comparison of the flow rate transfer standard to a NIST-
 traceable primary flow rate standard and once every three years sends the primary standard to
NIST for recertification. The laboratory and field personnel chose an automatic dry-piston flow
meter for field calibrations and flow rate verifications of the flow rates of the network samplers.
This type  of device has the advantage of providing volumetric flow rate values directly,  without
requiring conversion from mass flow measurements, temperature, pressure, or water vapor
corrections. In addition, the manual bubble flowmeter will be used in the lab as a primary standard
and as a backup to the dry-piston flowmeter, where the absence of wind and relatively low
humidity will have less negative effect on flowmeter performance.

Upon initial receipt of any new, repaired, or replaced PM 2 5 sampler, lab support will perform a

-------
                                                                         Project: Model QAPP
                                                                              Element No: 16
                                                                               Revision No:l
                                                                               Date: 4/17/98
	Page 2 of 10

multipoint flow rate calibration verification on the sampler flow rate to determine if initial
performance is acceptable. Once sampler flow rates are accepted, the lab performs the calibration
and verifications at the frequency specified in Section 14, as well as directly performing or
arranging to have another party perform the tests needed to recertify the organization's standards.

16.1.3 Sampler Temperature, Pressure, Time Sensors- Laboratory

The lab arranges support for the field calibration of temperature and pressure sensors by
acquiring the necessary equipment and consumables and by preparing and lab-testing the
temperature comparison apparatus.

A stationary mercury manometer in the laboratory is used as a primary standard to calibrate the
two electronic aneroid barometers that go out in the field as transfer standards.

The lab has also arranged with the NIST time calibration service in Boulder, Colorado, to verify
the time on a central lab time device (a specified computer), to which other lab and field devices,
including the volumetric flow meter and FRM samplers, are compared.

16.1.4 Field

As indicated in 16.1.3, the following calibrations are performed in the field:

     •  calibration of the volumetric flow rate meter in FRM samplers against the working standard
     •  calibration of sampler temperature and pressure sensors against the working temperature
        standard
     •  calibration of the 5 nonmercury min/max thermometers, normally located in the coolers in
        which filters are transported to and from the sampler in the field, against the laboratory-
        checked working standard thermometer

 16.2 Calibration Method that Will Be Used for Each Instrument
     The QAPP must describe the calibration method for each instrument in enough detail for another researcher to
 duplicate the calibration method. It may reference external documents such as EPA-designated calibration procedures
 or SOPs providing that these documents can be easily obtained. Nonstandard calibration methods or modified
 standard calibration methods should be fully documented and justified.

     Some instrumentation may be calibrated against other instrumentation or apparatus (e.g., NIST thermometer),
 while other instruments are calibrated using standard materials traceable to national reference standards.

     Calibrations normally involve challenging the measurement system or a component of the measurement system at
 a number of different levels over its operating range. The calibration may cover a narrower range if accuracy in that
 range is critical, given  the end use of the data. Single-point calibrations are of limited use, and two-point calibrations
 do not provide information on nonlinearity. If single- or two-point calibrations are used for critical measurements, the
 potential shortcomings should be carefully considered and discussed in the QAPP. Most EPA-approved analytical
 methods require multipoint (three or more) calibrations that include zeros, or blanks, and higher levels so that
 unknowns fall within the calibration range and are bracketed by calibration points. The number of calibration points,
 the calibration range, and any replication (repeated measures at each level) should be given in the QAPP.

     The QAPP should describe how calibration data will be analyzed. The use of statistical QC techniques to process
 data across multiple calibrations to detect gradual degradations in the measurement system should be described. The
 QAPP should describe any corrective action that will be taken if calibration (or calibration check) data fail to meet the
 acceptance criteria, including recalibration. References to appended SOPs containing the calibration procedures are
 an acceptable alternative to describing the calibration procedures within the text of the QAPP.
16.2.1 Laboratory- Gravimetric (Mass) Calibration

The calibration and QC (verification) checks of the microbalance are addressed in Sections 16.1.1
and 13.3 and Appendix C of this QAPP. For the following 3 reasons, the multipoint calibration
for this method will be zero, 100, and 200 mg: 1) the required sample collection filters weigh
between 100 and 200 mg; 2) the anticipated range of sample loadings for the 24-hour sample
period is rarely going to be more than a few hundred µg; and 3) the lowest commercially available
check weights that are certified according to nationally accepted standards are only in the single
milligram range. Since the critical weight is not the absolute unloaded or loaded filter weight, but
the difference between the two, the lack of microgram standard check weights is not considered
cause for concern about data quality, as long as proper weighing procedure precautions are taken
for controlling contamination or other sources of mass variation in the procedure (see SOP in
Appendix C).
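For illustration (hypothetical values): an unloaded filter weighing 146.250 mg that weighs 146.395 mg after sampling yields a net PM2.5 mass of 0.145 mg (145 µg), consistent with the expectation above that sample loadings will rarely exceed a few hundred micrograms.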

16.2.2 Laboratory (and Field) -Flow Calibration.
The Air Monitoring and Laboratory Branch Managers conduct spot checks of lab and field
notebooks to ensure that the lab and field personnel are following the SOPs, including the QA/QC
checks, acceptance criteria and frequencies listed in Tables 6-4 and 7-4 in Sections 6 and 7.

Method Summary: After equilibrating the calibration device to the ambient conditions of the
sampler, install a filter cassette containing an unused 46.2 mm filter in the sampler. After
removing the inlet from the sampler, connect the flow calibration device on the sampler down
tube. If the sampler has not been calibrated before, or if the previous calibration was not
acceptable, perform a leak check according to the manufacturer's operational instruction manual,
which is incorporated into Palookaville SOP A-4 in Appendix C.

Otherwise, place the sampler in calibration mode (SOP A-4 in Appendix C) and perform a three-
point calibration/verification or a one-point flow rate verification. The field staff will only perform
a leak check after calibration or verification if the results are outside of the acceptance criteria.

Following the calibration or verification, turn off the sampler pump, remove the filter cassette
from the filter cassette holder, remove the flow calibration device, (and flow adaptor device if
applicable), and replace the sampler inlet. If the flow rate is determined to be outside of the
required target flow rate, attempt to determine possible causes by minor diagnostic and trouble
shooting techniques (e.g., leak checks), including  those listed in the manufacturer's operating
instruction manual. Do not attempt field repairs or flow rate adjustments.
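The sketch below is an illustrative way to evaluate such a calibration or verification; it compares the sampler-indicated flow rate with the flow measured by the transfer standard at each point and applies a ±4% screen (the ±4% figure mirrors the single-sampler audit requirement in Section 14; the governing acceptance criteria are those in Tables 6-4 and 7-4 cited above).

    def flow_check(points, tolerance_percent=4.0):
        """Evaluate (standard_flow, sampler_indicated_flow) pairs in L/min.
        Returns a list of (percent_difference, pass/fail) tuples (illustrative)."""
        results = []
        for standard_flow, indicated_flow in points:
            pct_diff = 100.0 * (indicated_flow - standard_flow) / standard_flow
            results.append((pct_diff, abs(pct_diff) <= tolerance_percent))
        return results

    # Example with hypothetical readings around the 16.67 L/min design flow:
    # flow_check([(15.0, 15.2), (16.67, 16.5), (18.3, 18.4)])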

16.2.3 Sampler (and Laboratory Weighing Room Environmental Control) Temperature
Calibration Procedure.

Both the ambient air and filter temperature sensors will be calibrated once per year.

The ambient air sensor is located inside the shielded fixture on the outside of the PM2.5 sampler
and is easy to  unfasten and remove for comparison to a transfer standard for temperature. The
three-point verification/calibration will be conducted at the field site.

The filter temperature sensor is located in the (open)  space just below the filter cassette. It is
threaded through the walls of the filter cassette holding assembly section of the sampler and
removal of plastic or metal fittings is required to remove the sensor and its associated wiring. It
may be difficult to calibrate this sensor in the field. Be careful when removing the filter
temperature sensor- do not gall the fittings since this  could start an internal leak after the
installation. A sampler leak check must be performed after reinstallation of the filter temperature
sensor.

Several steps to follow in calibrating ambient air temperature are given in SOP A-5 in Appendix C
and in the following summary. Refer to the operator's instruction manual for sampler-specific
procedures and instructions.
 Remove the ambient temperature sensor from the radiation shield.  Prepare a convenient container
 (an insulated vacuum/wide mouth thermos bottle) for the ambient temperature water bath and the
ice slurry bath. Wrap the sensor(s) and a thermometer together with a rubber band, ensuring that all
the probes are at the same level. Prepare the ambient or ice slurry solution according to SOP
A-3 in Appendix C. Immerse the sensor(s) and the attached thermometer in the ambient
 temperature bath. Wait at least 5 minutes for the ambient thermal mass and the
 sensor/thermometer to equilibrate. Wait at least 15 minutes for equilibration with the ice slurry
 before taking comparative readings.

 For each thermal mass, in the order:  Ambient, Cold, Ambient, Hot, Ambient,  make a series of 5
 measurements, taken about 1 minute apart.  If the measurements indicate equilibrium, average the
 5 readings and record the result as the sensor temperature relative to the thermometer.

 A similar process will be used to verify the calibration of continuously-reading temperature
 sensors used in the laboratory weighing room.

16.2.4  Sampler Pressure Calibration Procedure. Summarized here; a detailed version is
attached as SOP A-6 in Appendix C.

 General: According to ASTM Standard D 3631 (ASTM 1977), a barometer can be calibrated by
 comparing it with a secondary standard traceable to a NIST primary standard.

 Precautionary Note: Protect all barometers from violent mechanical shock and sudden changes
 in pressure.  A barometer subjected to either of these events must be recalibrated. Maintain the
 vertical and horizontal temperature gradients across the instruments at less than 0.1 ฐC/m.  Locate
 the instrument so as to avoid direct sunlight, drafts, and vibration.

A Fortin mercury type of barometer is used in the laboratory to calibrate and verify the
aneroid barometer used in the field to verify the barometric sensors of PM2.5 samplers. Details are
provided in 16.4.1, below, and in SOP A-6.

 16.2.5 Sampler and Standard Volumetric Flow Rate Sensors with Built-in Clocks

Time can be verified over phone lines from NIST (in Boulder, Colorado, directly or through the
NIST calibration service in Gaithersburg, MD). See SOP A-7 in Appendix C for details (or in
NIST standardization handbooks and catalogues cited in A-7).

Procedure for Verifying Relative Humidity Control/Monitoring data for the Filter
Conditioning / Weighing Room - Laboratory Only

A sling psychrometer is used by laboratory personnel to verify the humidity generated and
controlled by the environmental control system.  For details, see SOP A-8 in Appendix C.

16.3 Calibration Standard Materials and Apparatus

    Some instruments are calibrated using calibration apparatus rather than calibration standards.  For example, an
ozone generator is part of a system used to calibrate continuous ozone monitors. Commercially available calibration
apparatus should be listed together with the make (the manufacturer's name), the model number, and the specific
variable control settings that will be used during the calibrations.  A calibration apparatus that is not commercially
available should be described in enough detail for another researcher to duplicate the apparatus and follow the
calibration procedure.
Table 16-1 presents a summary of the specific standard materials and apparatus used in calibrating
measurement systems for the parameters necessary to generate the PM2.5 data required in 40 CFR
Part 50, Appendix L, and Part 58.

Table 16-1 Standard Materials and/or Apparatus for PM2.5 Calibration

Parameter (M = Material, A = Apparatus) | Std. Material | Std. Apparatus | Mfr. Name | Model # | Variable Control Settings
Mass (M) | Standard check weight | NA | Best Bet | 111 | NA
Temperature (M+A) | Hg | Thermometer | Lotsa Choices | 5500 | *
Temperature (M+A) | H2O | Thermal mass (Thermos) | Cup a Joe | Big Mouth | NA
Temperature (M+A) | NA | Thermistor | Best Bet | 8910 | *
Pressure (M+A) | Hg | Fortin | You Better Believe It | 22 | *
Pressure (A) | NA | Aneroid | Aviator's Choice | 7-11 | *
Flow Rate (A) | NA | Piston meter | Jensen | F199 | *
Flow Rate (A) | NA | Bubble meter | Hasty | LG88 | NA
Flow Rate (A) | NA | Adapter | Sampler Mfr. | F100 | NA
Relative Humidity (A) | NA | Sling psychrometer | Whosits | 99 | NA

* See manufacturer's operating manual and/or instruction sheet.

 16.4 Calibration Standards
     Most measurement systems are calibrated by processing materials that are of known and stable composition.
 References describing these calibration standards should be included in the QAPP. Calibration standards are
 normally traceable to national reference standards, and the traceability protocol should be discussed. If the standards
 are not traceable, the QAPP must include a detailed description of how the standards will be prepared. Any method
 used to verify the certified value of the standard independently should be described.
 Flow Rate --

The flow rate standard apparatus used for flow-rate calibration (field: NIST-traceable, piston-type
volumetric flow rate meter; laboratory: NIST-traceable manual soap bubble flow meter and time
monitor) has its own certification and is traceable to other standards for volume or flow rate
 which are themselves NIST-traceable. A calibration relationship for the flow-rate standard, such
 as an equation, curve, or family of curves, is established by the manufacturer (and verified if
 needed) that is accurate to within 2% over the expected range of ambient temperatures and
 pressures at which the flow-rate standard is used. The flow rate standard will be recalibrated and
 recertified at least annually.

 The actual frequency with which this recertification process must be completed depends on the
 type of flow rate standard- some are much more likely to be stable than others.  The Department
 will maintain a control  chart (a running plot of the difference or % difference between the flow-
 rate standard and the NIST-traceable primary flow-rate or volume standard) for all comparisons.
 In addition to providing excellent documentation of the certification of the standard, a control
 chart also gives a good indication of the stability of the standard. If the two standard-deviation
 control limits are close together, the chart  indicates that the standard is very stable and could be
 certified less frequently. The minimum recertification frequency  is 1 year. On the other hand, if
 the limits are wide, the  chart would indicate a less stable standard that will be recertified more
 often.
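For example (hypothetical numbers), if the control chart of percent differences between the transfer standard and the primary standard shows two-standard-deviation limits of roughly ±0.5%, the standard is behaving very stably and annual recertification is clearly adequate; if the limits approach ±3%, recertification more often than once a year would be warranted.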

 Temperature -

 The operations manuals associated with the 2 single and 5 sequential Palookaville samplers
 identify types of temperature standards recommended for calibration and provide a detailed
calibration procedure for each type that is specifically  designed for the particular sampler.

The EPA Quality Assurance Handbook, Volume IV ( EPA 1995), Section 4.3.5.1, gives
information on calibration equipment and methods for assessing response characteristics of
temperature sensors.

The temperature standard used for temperature calibration will have its own certification and be
traceable to a NIST primary standard. A calibration relationship to the temperature standard (an
equation or a curve) will be established that is accurate to within 2% over the expected range of
ambient temperatures at which the temperature standard is to be used.  The temperature standard
must be reverified and recertified at least annually.  The actual frequency of recertification
depends on the type of temperature standard; some are much more stable than others. The best
way to determine recertification requirements is to keep a control chart. The Department will use
an ASTM- or NIST-traceable mercury-in-glass thermometer for laboratory calibration.

Palookaville Standards

The temperature sensor standards chosen by  the lab and field staff and  managers are both based
on standard materials contained in standardized apparatus;  each has been standardized (compared
in a strictly controlled procedure) against temperature standards the manufacturers obtained from
NIST.

The Palookaville laboratory standards are 2 NIST-traceable glass mercury thermometers from the
Lotsa Choices Distributor Company®, each with its own certificate summarizing the company's
NIST traceability protocol and documenting the technician's signature, comparison date,
identification of the NIST standard used, and the mean and standard deviation of the comparison
results. There are 2 thermometers with overlapping ranges  that span the complete range of
typically measured summer to winter lab and field temperature values.

The Palookaville field temperature standards are two Best Bet Model 8910® thermistor probes
and one digital readout module with RS232C jack and cable connector available for linkage to a
data logger or portable computer. The two probes have different optimum ranges, one including
the full range of temperatures ever recorded in the summer and the other including the full range
of temperatures ever recorded in the winter by the National Weather Service at the Palookaville
sites. Each probe came with a certificate of NIST-traceability with the same kind of information as
the thermometer certificates contained.

Pressure

The Fortin mercurial type of barometer works on fundamental principles of length and mass and is
therefore more accurate but more difficult to read and correct than other types. By comparison,
the precision aneroid barometer is an evacuated capsule with a flexible bellows coupled through
mechanical, electrical, or optical linkage to an indicator.  It  is potentially less accurate than the
Fortin type but can be transported with less risk to the reliability of its measurements and poses
no risk of mercury spills. The Fortin type of barometer is best employed as a higher quality
laboratory standard which is used to adjust and certify an aneroid barometer in the laboratory.

  16.4.1 Lab
The Palookaville pressure standard is a You Better Believe It® Model 22 Fortin-type mercury
  barometer.

  16.4.2 Field

The field working standard is an Aviator's Choice® 7-11 aneroid barometer with digital readout.

  16.5 Document Calibration Frequency


    The QAPP must describe how often each measurement method will be calibrated.  It is desirable that the
 calibration frequency be related to any known temporal variability (i.e., drift) of the measurement system. The
 calibration procedure may involve less-frequent comprehensive calibrations and more-frequent simple drift checks.
 The location of the record of calibration frequency and maintenance should be referenced.
 See Table 14-1 for a summary of field QC checks that includes frequency and acceptance criteria
 and references for calibration and verification tests of single and sequential sampler flow rate,
 temperature, pressure, and time. See Table 14-2 for a similar summary of laboratory QC,
 including frequency of primary and working mass standards and conditioning/weighing room
 temperature and relative humidity.

 The field sampler flow rate, temperature and pressure sensor verification checks include 1-point
 checks at least monthly and multipoint checks (calibration without adjustment unless needed as
 determined independently and then performed by the vendor's authorized service representative)
 at least annually, as proven by tracking on control charts.

 All of these events, as well as sampler and calibration equipment maintenance will be documented
 in field data records and notebooks and annotated with the flags required in Appendix L of 40
 CFR Part 50, the manufacturer's operating instruction manual and any others indicated  in section
 22.7.2 of this document. Laboratory and field activities associated with equipment used by the
 respective technical staff will be kept in record notebooks as well. The records will normally be
 controlled by the Branch Managers, and located in the labs or field sites when in use or  at the
 manager's offices when being reviewed or used for data validation.

 References

ASTM. 1977. Standard test methods for measuring surface atmospheric pressure. American
    Society for Testing and Materials. Philadelphia, PA. Standard D 3631-84.

ASTM. 1995. Standard test methods for measuring surface atmospheric pressure. American
   Society for Testing and Materials. Publication number ASTM D3631-95.

EPA (1997a) National Ambient Air Quality Standards for Particulate Matter - Final Rule. 40 CFR
   Part 50. Federal Register, 62(138):38651-38760. July 18,1997.

EPA. 1997b. Ambient air monitoring reference and equivalent methods. U.S. Environmental
   Protection Agency. 40 CFR Part 53, as amended July 18, 1997.
EPA. 1997. Reference method for the determination of fine particulate matter as PM2.5 in the
   atmosphere. U.S. Environmental Protection Agency. 40 CFR Part 50, Appendix L, as
   amended July 18, 1997.

EPA. 1995. Quality Assurance Handbook for Air Pollution Measurement Systems Volume IV:
   Meteorological Measurements. U.S. Environmental  Protection Agency. Document No.
   EPA/600/R-94/038d. Revised March.

NIST. 1976. Liquid-in-glass thermometry. National Institute of Standards and Technology. NBS
   Monograph 150. January.
NIST. 1986. Thermometer calibration: a model for state calibration laboratories. National
   Institute of Standards and Technology. NBS Monograph 174. January.

NIST. 1988. Liquid-in-glass thermometer calibration service. National Institute of Standards and
   Technology. Special publication 250-23. September.

NIST. 1989. The calibration of thermocouples and thermocouple materials. National Institute of
   Standards and Technology. Special publication 250-35. April.

-------
                                                                             Project: Model QAPP
                                                                                  Element No: 17
                                                                                   Revision No: 1
                                                                                    Date: 4/17/98
                                                                            	Page 1 of4
        17.0 Inspection/Acceptance for Supplies and Consumables
    Describe how and by whom supplies and consumables shall be inspected and accepted for use in the project.
State acceptance criteria for such supplies and consumables.
 17.1   Purpose

The purpose of this element is to establish and document a system for inspecting and accepting all
supplies and consumables that may directly or indirectly affect the quality of the PM2.5 Program.
The Palookaville PM2.5 monitoring network relies on various supplies and consumables that are
 critical to its operation.  By having documented inspection and acceptance criteria, consistency of
 the supplies can be assured.  This section details the supplies/consumables, their acceptance
 criteria, and the required documentation for tracking this process.

 17.2   Critical Supplies and Consumables


     Clearly identify and document all supplies and consumables that may directly or indirectly affect the quality of
  the project or task. See Figures 10 and  11 for example documentation of inspection/acceptance testing requirements.
  Typical examples include sample bottles, calibration gases, reagents, hoses, materials for decontamination activities,
  deionized water, and potable water.

     For each item identified, document the inspection or acceptance testing requirements or specifications (e.g.,
  concentration, purity, cell viability, activity, or source of procurement) in addition to any requirements for certificates
  of purity or analysis.
There are many components to the PM2.5 monitoring network. This section describes
the needed supplies for the PM2.5 monitoring network and includes items for the weigh room
laboratory and the field. Table 17-1 details the various components:

Table 17-1 Critical Supplies and Consumables
Area | Item | Description | Vendor | Model Number
Sampler | Impactor Oil | Tetramethyltetraphenyl-trisiloxane (30 ml) | Dow Corning® | 704 Oil
Sampler | 37 mm Glass Fiber Filter | For use in impactor well | XYZ Company | xxxx
Sampler | Rain Collector | Glass | XYZ Company | xxxx
Sampler | O-Rings | The O-rings that seal in the filter cassette when it is placed in the sampler | XYZ Company | xxxx

-------
                                                                                Project: Model QAPP
                                                                                     Element No: 17
                                                                                      Revision No:l
                                                                                       Date: 4/17/98
                                                                               	Page 2 of4
Table 17-1 Critical Supplies and Consumables (continued)

Area | Item | Description | Vendor | Model Number
Sampler | In-line Filter | Downstream of sample collection and upstream of sample pump | XYZ Company | xxxx
Sampler | Battery | Internal sampler battery | XYZ Company | xxxx
Sampler | Fuses | In sampler | XYZ Company | xxxx
Sampler | Floppy Disks | 3.5" pre-formatted | Purchase local |
Filter | Filters | 46.2 mm Teflon | Whatman® |
Filter | Petri-dish | 47 mm with securing ring | Gelman® | 7231
Filter | Filter Cassettes (single) | As per CFR design | XYZ Company | xxxx
Filter | Filter Cassette Holder, Protective Containers | For securing cassette | XYZ Company | xxxx
Filter | Sequential Sampler Cassette Holder | For use with XYZ Model 2000 | XYZ Company | xxxx
Filter | Filter Handling Containers | For transport to and from the field | XYZ Company | xxxx
Weigh Room | Staticide | Anti-static solution | Cole-Parmer® | E-3 3672-00
Weigh Room | Static Control Strips | Polonium, 500 μCi | Mettler-Toledo® | 110653
Weigh Room | Air Filters | High efficiency | Purchase local |
All | Powder Free Antistatic Gloves | Vinyl, Class M4.5, 4.5" x 8.5" | Fisher Scientific® | Small 11-393-85A; Medium 11-393-85A; Large 11-393-85A; X-Large 11-393-85A
All | Low-lint wipes | Cleaning wipes | Kimwipes® | 34155
17.3  Acceptance Criteria
    Acceptance criteria must be consistent with overall project technical and quality criteria (e.g., concentration must
be within ± 2.5%, cell viability must be >90%). If special requirements are needed for particular supplies or
consumables, a clear agreement should be established with the supplier, including the methods used for evaluation
and the provisions for settling disparities.
Acceptance criteria must be consistent with overall project technical and quality criteria.  Some of
the acceptance criteria are specified in 40 CFR Part 50.  Other acceptance criteria, such as
inspection for damage due to shipping, can only be applied once the equipment has arrived
on site.

-------
                                                                              Project: Model QAPP
                                                                                  Element No: 17
                                                                                   Revision No: 1
                                                                                    Date: 4/17/98
                                                                             	Page 3 of4
Table 17-2 details the acceptance tests and limits for procurement of supplies and consumables to
be used in the Palookaville PM2 5 network:

Table 17-2 Acceptance Criteria for Supplies and Consumables
Equipment | Acceptance Criteria | Action if Requirements Not Met
Impactor Oil | Is the oil identified as Tetramethyltetraphenyl-trisiloxane | Return
37 mm Glass Fiber Filter | Filters of the correct size and quality | Return
Rain Collector | Not broken | Call vendor; will likely not return
O-Rings | Of the correct size | Return
In-line Filter | Of the correct size | Return
Battery | Correct size and voltage | Return
Fuses | Correct size and specification | Return
Floppy Disks | Undamaged and pre-formatted | Return
Filters, 46.2 mm Teflon | Tested and accepted by the U.S. EPA, with documentation of acceptance in the package; should meet visual inspection and pre-weight (110-160 mg) criteria | Call David Lutz, U.S. EPA, (919) 541-5476
Petri-dish | Clean and appropriately sized for 46.2 mm filters | Return
Filter Cassettes (single) | Of the correct type and make | Return
Filter Cassette Holder, Protective Containers | Of the correct size so that filter cassettes will not move around, which could potentially dislodge particulate | Return
Sequential Sampler Cassette Holder | Of the correct type for use with the sequential sampler model | Return
Filter Handling Containers | Clean | Clean
Anti-Static Solution | Of the correct type | Return
Static Control Strips | Manufactured within the past 3 months and between 400 and 500 μCi of polonium | Call vendor
Air Filters | Of the size and quality specified | Return
Powder Free Antistatic Gloves | Of the size and quality specified | Return
Cleaning Wipes | Of the quality specified | Return

-------
                                                                          Project: Model QAPP
                                                                               Element No: 17
                                                                                Revision No:l
                                                                                 Date: 4/17/98
	Page 4 of4

17.4  Tracking and Quality Verification of Supplies and Consumables


     Procedures should be established to ensure that inspections or acceptance testing of supplies and consumables
 are adequately documented by permanent, dated, and signed records or logs that uniquely identify the critical supplies
 or consumables, the date received, the date tested, the date to be retested (if applicable), and the expiration date.
 These records should be kept by the responsible individual(s) (see Figure 13 for an example log).
Tracking and quality verification of supplies and consumables serve two main needs.  The
first is the end user's need for an item of the required quality. The second is the purchasing
department's need to accurately track goods received so that payment or credit of invoices can be
approved.  To address both needs, the following steps outline the tracking and documentation
procedures to follow (an illustrative log record sketch appears after the list):

    1.   Receiving personnel will perform a rudimentary inspection of the packages as they are
         received from the courier or shipping company, noting any obvious problems with the
         shipment such as a crushed box or wet cardboard.

    2.   The package will be opened and inspected, and the contents compared against the packing slip.

    3.   The supply/consumable will be compared to the acceptance criteria in Table 17-2.

    4.   If there is a problem with the equipment/supply, note it on the packing list, notify the
        supervisor of the receiving area and immediately call the vendor.

    5.   If the equipment/supplies appear to be complete and in good condition, sign and date the
        packing list and send to accounts payable so that payment can be made in a timely
        manner.

    6.   Notify  appropriate personnel that equipment/supplies are available.  For items such as the
        46.2 mm Teflon filters, it is critical to notify the laboratory manager of the weigh room so
        sufficient time for de-gassing of the filters can be allowed.

    7.   Stock equipment/supplies in appropriate pre-determined area.

    8.   For supplies, consumables, and equipment used throughout the PM2 5 program, document
        when these items  are changed out. If available, include all relevant information such as:
        model number, lot number, and serial number.
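
The receiving and change-out documentation described in the steps above lends itself to a simple
tabular log. The sketch below is illustrative only; the record fields and the example entry are
hypothetical and would be adapted to the Department's own purchasing and tracking system
(compare the example log referenced as Figure 13 in the guidance excerpt above).

    # Illustrative sketch only: a minimal record for documenting receipt,
    # inspection, and change-out of supplies/consumables. Field names and
    # the example entry are hypothetical, not part of the actual DAS.
    from dataclasses import dataclass
    from datetime import date
    from typing import Optional

    @dataclass
    class SupplyReceiptRecord:
        item: str                    # e.g., "46.2 mm Teflon filters"
        vendor: str
        model_number: str
        lot_number: Optional[str]
        date_received: date
        received_by: str             # receiving person (initials or signature)
        passed_inspection: bool      # checked against the Table 17-2 criteria
        action_taken: str            # "stocked", "returned", "vendor called", etc.
        notes: str = ""

    # Hypothetical example entry
    record = SupplyReceiptRecord(
        item="46.2 mm Teflon filters",
        vendor="Whatman",
        model_number="xxxx",
        lot_number="LOT-0001",
        date_received=date(1998, 4, 17),
        received_by="J. Smith",
        passed_inspection=True,
        action_taken="stocked; weigh room laboratory manager notified",
    )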

-------
                                                                                   Project: Model QAPP
                                                                                        Element No: 18
                                                                                         Revision No:l
                                                                                          Date: 4/17/98
                                                                                  	Page 1 of4
                         18.0 Data Acquisition Requirements
      This element of the QAPP should clearly identify the intended sources of previously collected data and other
  information that will be used in this project. Information that is non-representative and possibly biased and is used
  uncritically may lead to decision errors. The care and skepticism applied to the generation of new data are also
  appropriate to the use of previously compiled data (for example, data sources such as handbooks and computerized
  databases).


This section addresses data not obtained by direct measurement from the PM2 5 Ambient Air
Quality Monitoring Program.  This includes both outside data and historical monitoring data.
Non-monitoring data and historical monitoring data are used by the Program in a variety of ways.
Use of information that fails to meet the necessary Data Quality Objectives (DQOs) for the PM2 5
Ambient Air Quality Monitoring Program can lead to erroneous trend reports and regulatory
decision errors. The policies and procedures described in this section apply both to data previously
acquired through the Palookaville Department of Health monitoring program and to information
acquired from outside sources.


18.1 Acquisition of Non-Direct Measurement Data


    This element's criteria should be developed to support the objectives of element A7.  Acceptance criteria for each
collection of data being considered for use in this project should be explicitly stated, especially with respect to:

Representativeness.  Were the data collected from a population that is sufficiently similar to the population of
interest and the population boundaries? How will potentially confounding effects (for example, season, time of day,
and cell type) be addressed so that these effects do not unduly alter the summary information?

Bias. Are there characteristics of the data set that would shift the conclusions? For example, has bias in analysis
results been documented? Is there sufficient information to estimate and correct bias?

Precision.  How is the spread in the results estimated? Does the estimate of variability indicate that it is sufficiently
small to meet the objectives of this project as stated in element A7?  See also Appendix D.

Qualifiers. Are the data evaluated in a manner that permits logical decisions on whether or not the data are
applicable to the current project? Is the system of qualifying or flagging data adequately documented to allow the
combination of data sets?

Summarization. Is the data summarization process clear and sufficiently consistent with the goals of this project?
(See element D2 for further discussion.) Ideally, observations and transformation equations are available so that their
assumptions can be evaluated against the objectives of the current project.

This element should also include a discussion on limitations on the use of the data and the nature of the uncertainty of
the data.

-------
                                                                         Project: Model QAPP
                                                                              Element No: 18
                                                                               Revision No:l
                                                                               Date: 4/17/98
	Page 2 of4

The PM2 5 Ambient Air Quality Monitoring Program relies on data that are generated through field
and laboratory operations; however, other significant data are obtained from sources outside the
Palookaville Department of Health or from historical records. This section lists these data and
addresses related quality issues for the PM2 5 Ambient Air Quality Monitoring Program.

Chemical and Physical Properties Data

Physical and chemical properties data and conversion constants are often required in the
processing of raw data into reporting units. Such information, when not already specified in the
monitoring regulations, will be obtained from nationally and internationally recognized sources.
Other data sources may be used with approval of the Air Division QA Officer. The following
sources may be used in the PM2 5 Ambient Air Quality Monitoring Program without prior
approval:

    •   National Institute of Standards and Technology (NIST)
    •   ISO, IUPAC, ANSI, and other widely-recognized national and international standards
        organizations
    •   U.S. EPA
    •   The current edition of certain standard handbooks may be used without prior approval of
        the Palookaville Air Division QA Officer. Two that are relevant to the fine particulate
        monitoring program are CRC Press' Handbook of Chemistry and Physics, and Lange's
        Handbook.

Sampler Operation and Manufacturers' Literature

Another important source of information needed for sampler operation is manufacturers'
literature. Operations manuals and users' manuals frequently provide numerical information and
equations pertaining to  specific equipment. Palookaville Department of Health personnel are
cautioned that such information is sometimes in error, and appropriate cross-checks will be made
to verify the reasonableness of information contained in manuals. Whenever possible, the field
operators will compare physical and chemical constants in the operators' manuals to those given in
the sources listed above.  If discrepancies are found, the correct value will be determined by
contacting the manufacturer. The field operators will correct all operators' manuals and ask the
vendor to issue an errata sheet describing the changes. The Department will also contact the
Region Y Office to inform them of these errors. The following types of errors are commonly found
in such manuals:
    •   insufficient precision
    •   outdated values for physical constants
    •   typographical errors
    •   incorrectly specified units
    •   inconsistent values within a manual
    •   use of different reference conditions than those called for in EPA regulations

-------
                                                                          Project: Model QAPP
                                                                              Element No: 18
                                                                               Revision No:l
                                                                                Date: 4/17/98
                                                                         	Page 3 of4
 Geographic Location
 Another type of data that will commonly be used in conjunction with the PM2 5 Ambient Air
 Quality Monitoring Program is geographic information. The Department will locate the current
 sites using global positioning systems (GPS) that meet the EPA Locational Data Policy
 requirement of 25-meter accuracy. USGS maps were used as the primary means for locating and siting
 stations in the existing network. Geographic locations of Palookaville monitoring sites that are no
 longer in operation will not be re-determined.

 Historical Monitoring Information of the Palookaville Department of Health

 The Palookaville Department of Health has operated a network of ambient air monitoring stations
 since the 1970's. Historical monitoring data and summary information derived from that data may
 be used in conjunction with current monitoring results to calculate and report trends in pollutant
 concentrations. In calculating historical trends, it is important to verify that historical data are
 fully comparable to current monitoring data.  If different methodologies were used to gather the
 historical data, the biases and other inaccuracies must be described in trends reports based on those
 data. Direct comparisons of PM2 5 with historical TSP or PM10 data will not be reported or used
 to estimate trends.  Dichot sampler data (fine portion) may be used to establish trends in PM2 5
 concentration; however, evidence must be presented to demonstrate that results of the two
 methods are comparable. Trends reports comparing PM2 5 data with historical particulate data
 must be approved by the Air Division QA Officer prior to release.

 External Monitoring Data Bases

 It is the policy of the Palookaville Department of Health that no data obtained from the Internet,
 computer bulletin boards, or data bases from outside organizations shall be used in creating
 reportable data or published reports without approval of the Air Division QA Officer.  This
 policy is intended to ensure the use of high quality data in Palookaville publications.

 Data from the EPA AIRS data base may be used in published reports with appropriate caution.
 Care must be taken in reviewing and using any data that contain flags or data qualifiers. Flagged
 data shall not be used unless it is clear that the data still meet critical QA/QC
 requirements. It is impossible to assure that a data base such as AIRS is completely free of
 errors, including outliers and biases, so caution and skepticism are called for when comparing
 Palookaville data with data from other reporting agencies as reported in AIRS.  Users should review
 available QA/QC information to assure that the external  data are comparable with Palookaville
measurements and  that the original data generator had an acceptable QA program in place.

Lead and Speciated Particulate Data

The Palookaville Department of Health has been routinely monitoring airborne lead since the
 1980s.  Early data are likely to be problematic because of different particle size cutpoints and

-------
                                                                          Project: Model QAPP
                                                                               Element No: 18
                                                                                Revision No:l
                                                                                 Date: 4/17/98
	Page 4 of4

because of significantly higher detection limits.  Lead data (PM10) acquired since 1992, and
continuing in parallel with the current program, have improved analytical sensitivity due to a
change in the analytical method. However, caution is needed in directly comparing these data with
the PM2 5 data because of the difference in size fractions.

Existing chemical speciation data for elements other than lead are very limited.  Some speciation
data from dichot samples were obtained by the Palookaville Institute of Technology in
cooperation with the Department of Health during a 1986 research study sponsored by the
U.S. EPA. These results may be used to provide a historical baseline for the speciation results to
be obtained by the PM2 5 Ambient Air Quality Monitoring Program; however, it is unclear whether
the quality of these data is sufficient to allow direct comparison with new data.

U.S. Weather Service Data

Meteorological information is gathered from the U.S. Weather  Service station at the Palookaville
International  Airport.  Parameters include: temperature, relative humidity, barometric pressure,
rainfall, wind speed, wind direction, cloud type/layers, percentage cloud cover and visibility range.
Historically, these data have not been used to calculate pollutant concentration values for any of
the Palookaville monitoring sites, which each have the required meteorological  sensors.
 However, NWS data are often included in summary reports. No changes to the way in which
 these data are collected are anticipated as a result of adding fine particulate monitoring to the
 Palookaville Department of Health ambient air monitoring program.

-------
                                                                            Project: Model QAPP
                                                                                 Element No: 19
                                                                                  Revision No: 1
                                                                                  Date:4/17/98
 	Page 1 of 14

                               19.0 Data Management


 19.1   Background and Overview


     This element should present an overview of all mathematical operations and analyses performed on raw
 ("as-collected") data to change their form of expression, location, quantity, or dimensionality. These operations
 include data recording, validation, transformation, transmittal, reduction, analysis, management, storage, and retrieval.
 A diagram that illustrates the source(s) of the data, the processing steps, the intermediate and final data files, and the
 reports produced may be helpful, particularly when there are multiple data sources and data files.  When appropriate,
 the data values should be subjected to the same chain-of-custody requirements as outlined in element B3. Appendix G
 has further details.
 This section describes the data management operations pertaining to PM2 5 measurements for the
 SLAMS/NAMS stations operated by The Palookaville Department of Health.  This includes an
 overview of the mathematical operations and analyses performed on raw ("as-collected") PM2 5
 data. These operations include data recording, validation, transformation, transmittal, reduction,
 analysis, management, storage, and retrieval.

 Data processing for PM2 5 data is summarized in Figure 19-1.  Data processing steps are
 integrated, to the extent possible, into the existing data processing system used for The
 Palookaville Department of Health's SLAMS network. Originally, all data were entered manually
 and were processed using a set of programs written in Cobol and Fortran on the Department's
 central mainframe computer. In the mid-1980s, real-time data acquisition was added and the air
 pollution data base was moved to a VAX computer run by the Air Quality Division within The
 Palookaville Department of Health.  Data were collected via dedicated and dial-up phone lines.
 More recently, the system was transferred to a network of PC-compatible computers. The PM2 5
 data base resides on a machine running the Windows NT Server operating system, which is also
 the main file server for the Air Quality Division. This machine is shown in the upper left of Figure
 19-1.

 Each Ambient Air Monitoring Station operated by The Palookaville Department of Health has an
 Acme Mark IV® data logger. These data loggers provide data collection for continuous analyzers
 at each station.  There are currently no facilities to remotely acquire the PM2 5 sampler data.
 However, The Palookaville Department of Health is examining the possibility of upgrading these
 stations in the future so that sampler status, flow rate, temperatures, etc. can be monitored
 remotely.

 Filter tracking and chain of custody information are entered into the PM2 5 Data Acquisition
 System (DAS)  at four main stages as shown in Figure 19-1.  Managers are able to obtain reports
on status of samples, location of specific filters, etc. using the DAS. All users must be authorized
by the Manager, Air Quality Division, and receive a password necessary to log on to the DAS.
Different privileges are given to each authorized user depending on that person's need. The

-------
                                                                           Project: Model QAPP
                                                                                Element No: 19
                                                                                 Revision No: 1
                                                                                 Date:4/17/98
	Page 2 of 14

following privilege levels are defined:

     •   Data Entry Privilege - The individual may see and modify only data within the PM2 5
         DAS that he or she has personally entered.  After a data set has been "committed" to the
         system by the data entry operator, all further changes will generate entries in the system
         audit trail.
     •   Reporting Privilege - This privilege permits generation of the data summary reports available
         under the PM2 5 DAS. No data changes are allowed without additional privileges.
     •   Data Administration Privilege - Data Administrators for the PM2 5 DAS are allowed to
         change data as a result of QA screening and related reasons. All operations resulting in
         changes to data values are logged to the audit trail. The Data Administrator is
         responsible for performing the following tasks on a regular basis:
         •   merging/correcting the duplicate data entry files
         •   running verification and validation routines and correcting data as necessary
         •   generating summary data reports for management
         •   uploading verified/validated data to EPA AIRS

-------
                                                                                              Project: Model QAPP
                                                                                                    Element No: 19
                                                                                                      Revision No:l
                                                                                                       Date: 4/17/98
                                                                                              	Page 3  of 14
[Figure 19-1 (PM2 5 data flow diagram) appears here.  It traces the paper, sample, and computer
flows: filter preparation (unique ID number recorded on the filter and in the data system); sample
collection and data download; sample shipment with chain-of-custody form; samples received with
chain-of-custody form; pre-analysis sample storage with temperature/humidity recording; sample
analysis (weight, temperature, humidity, calibration, QA/QC samples); data recorded on analysis
benchsheets; data reports, reviews, and acceptance; post-analysis sample storage; and sample
disposal after 1 year.]

Figure 19-1. PM2 5 data flow diagram

-------
                                                                                Project: Model QAPP
                                                                                     Element No: 19
                                                                                      Revision No:l
                                                                                       Date:4/17/98
                                                                               	Page 4 of 14
19.2   Data Recording
     Any internal checks (including verification and validation checks) that will be used to ensure data quality during
 data encoding in the data entry process should be identified together with the mechanism for detailing and correcting
 recording errors. Examples of data entry forms and checklists should be included.
Data entry, validation, and verification functions are all integrated in the PM2 5 DAS. Bench sheets
shown in Figure 19-1 are entered by laboratory personnel.  Procedures for filling out the
laboratory sheets and subsequent data entry are provided in SOPs listed in Table 19-1 and
included in Appendix E.

Table 19-1 List of The Palookaville Department of Health SOPs for PM2 5 Data Processing Operations

SOP Number | Title | Description
AIR-IS-FP1 | Data acquisition procedures for the PM2 5 monitoring program | Describes the electronic data processing operations applicable to PM2 5 data.
AIR-FLD-FP1 | Standard procedures for operation of field monitoring sites for the PM2 5 monitoring program | Describes all field operations to implement PM2 5 monitoring. Includes manual and electronic data acquisition procedures.
AIR-LAB-FP1 | Standard operating procedures for preparation, weighing, and data recording for the PM2 5 monitoring program | Describes all laboratory operations for PM2 5 filter handling, weighing, and the associated data recording.
AIR-IS-FP2 | Data processing procedures for the PM2 5 monitoring program | Describes the procedures for data entry, processing, merging, validation, reporting, and reduction.
AIR-IS-FP3 | AIRS data transmittal procedures for the PM2 5 monitoring program | Describes the procedures used to format and transmit PM2 5 data to AIRS. (Will be used in conjunction with SOP AIR-IS-SLAMS7, which describes transmittal of other ambient monitoring data to AIRS.)
19.3   Data Validation
    The details of the process of data validation and pre-specified criteria should be documented in this element of the
QAPP.  This element should address how the method, instrument, or system performs the function it is intended to
consistently, reliably, and accurately in generating the data. Part D of this document addresses the overall project data
validation, which is performed after the project has been completed.
Data validation is a combination of checking that data processing operations have been carried
out correctly and of monitoring the quality of the field operations. Data validation can identify
problems in either of these areas.  Once problems are identified, the data can be corrected or
invalidated, and corrective actions can be taken for field or laboratory operations. Numerical data
stored in the PM2 5 DAS are never internally overwritten by condition flags. Flags denoting error

-------
                                                                         Project: Model QAPP
                                                                              Element No: 19
                                                                              Revision No: 1
                                                                               Date:4/17/98
	Page 5 of 14

conditions or QA status are saved as separate fields in the data base, so that it is possible to
recover the original data.

The following validation functions are incorporated into the PM2 5 DAS to ensure the quality of data
entry and data processing operations (an illustrative sketch follows the list):

*•   Duplicate Key Entry - the following data are subjected to duplicate entry by different
    operators: filter weight reports, field data sheets, chain of custody sheets. The results of
    duplicate key entry are compared and errors are corrected at biweekly intervals. The method
    for entering the data is given in SOP AIR-LAB-FP1, Standard operating procedures for
    preparation, weighing, and data recording for the  PM2 5 monitoring program. Procedures
    for reconciling the duplicate entries are given in SOP AIR-IS-FP2, Data processing
    procedures for the  PM25 monitoring program.
*   Range Checks - almost all monitored parameters have simple range checks programmed in.
    For example, valid times must be between 00:00 and 23:59, summer temperatures must be
    between 10 and 50 degrees Celsius, etc.  The data entry operator is notified immediately when
    an entry is out of range.  The operator has the option of correcting the entry or overriding the
    range limit. The specific values used for range checks may vary depending on season and
    other factors.  The currently  used range values for data entry acceptance are provided in SOP
    AIR-IS-FP2.  Since these range limits for data input are not regulatory requirements, the Air
    Division QA Officer may adjust them from time to time to better meet quality goals.
*•   Completeness Checks - When the data are processed certain completeness criteria must be
    met.  For example, each filter must have a start time, an end time, an average flow rate, dates
    weighed, and operator and technician names. The data entry operator will be notified if an
    incomplete record has been entered before the record can be closed.
*•   Internal Consistency and Other Reasonableness  Checks - Several  other internal
    consistency checks are built into the PM2 5 DAS.  For example, the end time of a filter must be
    greater than the start time. Computed filter volume  (integrated flow) must be approximately
    equal to the exposure time multiplied by the nominal flow. Additional consistency and other
    checks will be implemented as the result of problems encountered during data screening.  See
    the most recent version of SOP AIR-IS-FP2 for the currently implemented consistency
    checks.
>•   Data Retention - Raw data sheets are retained on file in the Air  Quality Division  office for a
    minimum of five years, and are readily available for  audits and data verification activities.
    After five years, hardcopy records and computer backup media are cataloged and boxed for
    storage at the Palookaville Services Warehouse. Physical samples such as  filters shall be
    discarded with appropriate attention to proper disposal of potentially hazardous materials.
>    Statistical Data Checks - Errors found during statistical screening will be traced  back to
    original data entry files and to the raw data sheets, if necessary.  These checks shall be run on
    a monthly schedule and prior to any data submission to AIRS. Data validation is the process
    by which raw data are screened and assessed before they can be included in the main data base
    (i.e., the PM2 5 DAS).
>    Sample Batch Data Validation - discussed in Section 23, this check associates flags that are

-------
                                                                               Project: Model QAPP
                                                                                    Element No: 19
                                                                                     Revision No:l
                                                                                      Date:4/17/98
                                                                              	Page 6 of 14
    generated by QC values outside of acceptance criteria with a sample batch. Batches
    containing too many flags would be rerun and/or invalidated.

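To illustrate how range, completeness, and internal consistency checks of this kind might be
applied in software, the following is a minimal sketch. The summer temperature range (10 to 50
degrees Celsius) is taken from the example above; the field names, the 10% volume tolerance, and
the function itself are hypothetical and do not represent the actual PM2 5 DAS code.

    # Minimal, hypothetical sketch of record-level validation checks
    # (completeness, range, and internal consistency) for one sample record.
    REQUIRED_FIELDS = ["start_time", "end_time", "avg_flow_lpm",
                       "date_weighed", "operator", "technician"]

    def validate_record(rec):
        """Return a list of validation messages for one sample record (dict)."""
        problems = []

        # Completeness check: every required field must be present
        for name in REQUIRED_FIELDS:
            if rec.get(name) in (None, ""):
                problems.append("missing required field: " + name)

        # Range check (example): summer temperature between 10 and 50 deg C
        temp = rec.get("ambient_temp_c")
        if temp is not None and not (10 <= temp <= 50):
            problems.append("temperature out of range: %s deg C" % temp)

        # Internal consistency: end time must be later than start time
        start, end = rec.get("start_time"), rec.get("end_time")
        if start and end and end <= start:
            problems.append("end time is not after start time")

        # Reasonableness: reported volume should approximate flow x time
        if start and end and rec.get("volume_m3") and rec.get("avg_flow_lpm"):
            minutes = (end - start).total_seconds() / 60.0
            expected_m3 = rec["avg_flow_lpm"] * minutes / 1000.0
            if abs(rec["volume_m3"] - expected_m3) > 0.10 * expected_m3:
                problems.append("reported volume differs >10% from flow x time")

        return problems
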
Table 19-2 summarizes the validation checks applicable to the PM2 5 data.

Table 19-2 Validation Check Summaries
Type of Data Check | Electronic Transmission and Storage | Manual Checks | Automated Checks
Data Parity and Transmission Protocol Checks | • | |
Duplicate Key Entry | | • |
Date and Time Consistency | | • | •
Completeness of Required Fields | | • | •
Range Checking | | | •
Statistical Outlier Checking | | | •
Manual Inspection of Charts and Reports | | • |
Sample Batch Data Validation | | | •
Two key operational criteria for PM2 5 sampling are bias and precision. As defined in 40 CFR Part
58, Appendix A, these are based on differences between collocated sampler results and FRM
performance evaluations.  The Palookaville Department of Health Air Quality Division will
inspect the results of collocated sampling during each batch validation activity.  These data will be
evaluated as early in the process as possible, so that potential operational problems can be
addressed.  The objective of the Palookaville Department of Health will be to optimize the
performance of its PM2 5 monitoring equipment. Initially, the results of collocated operations will
be control charted (see Section 14).  From these charts, control limits will be established to flag
potential problems. Multiple collocation results must be accumulated to assess data quality with
confidence.  However, even limited data can be used for system maintenance and corrective
action.

19.4   Data Transformation
    Data transformation is the conversion of individual data point values into related values or possibly symbols using
conversion formulas (e.g., units conversion or logarithmic conversion) or a system for replacement. The
transformations can be reversible (e.g., as in the conversion of data points using a formula) or irreversible (e.g., when
a symbol replaces actual values and the value is lost). The procedures for all data transformations should be described
and recorded in this element. The procedure for converting calibration readings into an equation that will be applied
to measurement readings should be documented in the QAPP. Transformation and aberration of data for statistical
analysis should be outlined in element D3, "Reconciliation with Data Quality Objectives."

-------
                                                                                Project: Model QAPP
                                                                                    Element No: 19
                                                                                     Revision No: 1
                                                                                      Date:4/17/98
                                                                               	Page 7 of 14
 Calculations for transforming raw data from measured units to final concentrations are relatively
 straightforward, and many are carried out in the sampler data processing unit before being
 recorded.  The following relations in Table 19-3 pertain to PM2 5 monitoring:

  Table 19-3 Raw Data Calculations
Parameter | Units | Type of Conversion | Equation
Filter Volume (Va)* | m3 | Calculated from the average flow rate (Qave) in L/min and the total elapsed time (t) in min, multiplied by the unit conversion (m3/L) | Va = Qave x t x 10^-3
Mass on Filter (M2.5) | μg | Calculated from the filter post-weight (Mf) in mg and the filter pre-weight (Mi) in mg, multiplied by the unit conversion (μg/mg) | M2.5 = (Mf - Mi) x 10^3
PM2 5 Concentration (CPM2.5) | μg/m3 | Calculated from laboratory data and sampler volume | CPM2.5 = M2.5 / Va
* - most FRM instruments will provide this value from the data logger.
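
As a worked illustration of the Table 19-3 relations (the numbers are hypothetical, not actual
monitoring data), a sampler running at 16.67 L/min for 1440 minutes collects roughly 24 m3 of air;
a 0.360 mg gain in filter weight then corresponds to about 15 μg/m3:

    # Worked example of the Table 19-3 calculations (hypothetical values).
    q_ave_lpm = 16.67        # average flow rate, L/min
    t_min = 1440             # elapsed sample time, min (24 hours)
    m_pre_mg = 145.210       # filter pre-weight, mg
    m_post_mg = 145.570      # filter post-weight, mg

    v_a = q_ave_lpm * t_min * 1e-3        # sample volume, m3 (about 24.0)
    m_25 = (m_post_mg - m_pre_mg) * 1e3   # mass on filter, ug (360)
    c_pm25 = m_25 / v_a                   # concentration, ug/m3 (about 15.0)
    print(round(v_a, 2), round(m_25), round(c_pm25, 1))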
 19.5   Data Transmittal
    Data transmittal occurs when data are transferred from one person or location to another or when data are copied
from one form to another. Some examples of data transmittal are copying raw data from a notebook onto a data entry
form for keying into a computer file and electronic transfer of data over a telephone or computer network. The QAPP
should describe each data transfer step and the procedures that will be used to characterize data transmittal error rates
and to minimize information loss in the transmittal.
Data transmittal occurs when data are transferred from one person or location to another or when
data are copied from one form to another.  Some examples of data transmittal are copying raw
data from a notebook onto a data entry form for keying into a computer file and electronic
transfer of data over a telephone or computer network. Table 19-4 summarizes data transfer
operations.

Table 19-4 Data Transfer Operations
Description of Data Transfer | Originator | Recipient | QA Measures Applied
Keying weighing data into the PM2 5 DAS | Laboratory Technician (hand-written data form) | Data Processing Personnel | Double key entry
Electronic data transfer | (between computers or over network) | - | Parity checking; transmission protocols

-------
                                                                             Project: Model QAPP
                                                                                 Element No: 19
                                                                                  Revision No:l
                                                                                   Date:4/17/98
                                                                            	Page 8 of 14
Table 19-4 Data Transfer Operations (continued)

Description of Data Transfer | Originator | Recipient | QA Measures Applied
Filter Receiving and Chain-of-Custody | Shipping and Receiving Clerk | The PM2 5 DAS computer (shipping clerk enters data at a local terminal) | Filter numbers are verified automatically; reports indicate missing filters and/or incorrect data entries
Calibration, FRM/FEM, and Audit Data | Auditor or field supervisor | PM2 5 data base computer | Entries are checked by Air Quality Supervisor and QA Officer
AIRS data summaries | Air Quality Supervisor | AIRS (U.S. EPA) | Entries are checked by Air Quality Supervisor and QA Officer
The Palookaville Department of Health will report all PM2 5 ambient air quality data and
information specified by the AIRS Users Guide (Volume II, Air Quality Data Coding, and
Volume III, Air Quality Data Storage), coded in the AIRS-AQS format.  Such air quality data and
information will be fully screened and validated and will be submitted directly to the AIRS-AQS
via electronic transmission in accordance with the quarterly schedule. The specific quarterly
reporting periods and due dates are shown in Table 19-5.

                   Table 19-5 Data Reporting Schedule
Reporting Period          | Due Date
January 1 - March 31      | June 30
April 1 - June 30         | September 30
July 1 - September 30     | December 31
October 1 - December 31   | March 31
19.6  Data Reduction
    Data reduction includes all processes that change the number of data items. This process is distinct from data
transformation in that it entails an irreversible reduction in the size of the data set and an associated loss of detail. For
manual calculations, the QAPP should include an example in which typical raw data are reduced. For automated data
processing, the QAPP should clearly indicate how the raw data are to be reduced with a well-defined audit trail, and
reference to the specific software documentation should be provided.
Data reduction processes involve aggregating and summarizing results so that they can be
understood and interpreted in different ways. The PM2 5 monitoring regulations require certain
summary data to be computed and reported regularly to U.S. EPA. Other data are reduced and
reported for other purposes such as station maintenance.  Examples of data summaries include:

-------
                                                                          Project: Model QAPP
                                                                               Element No: 19
                                                                                Revision No:l
                                                                                Date:4/17/98
_ Page 9 of 14

     *•  average PM2 5 concentration for a station or set of stations for a specific time period
     >  accuracy, bias, and precision statistics based on accumulated FRM/FEM data
     >  data completeness reports based on numbers of valid samples collected during a specified
        period

The Audit Trail is another important concept associated with data transformations and reductions.
An audit trail is a data structure that provides documentation for changes made to a data set
during processing.  Typical reasons for data changes that would be recorded include the
following:

     >•  corrections of data input due to human error
     *•  application of revised calibration factors
     *•  addition of new or supplementary data
     *•  flagging of data as invalid or suspect
     >  logging of the date and times when automated data validation programs are run

The PM2 5 DAS audit trail is implemented as a separate table in the Microsoft Access® data base.
Audit trail records will include the following fields (an illustrative record sketch appears after the
lists below):
     •   operator's identity (ID code)
     •   date and time of the change
     •   table and field names for the changed data item
     •   reason for the change
     •   full identifying information for the item changed (date, time, site location, parameter,
         etc.)
     •   value of the item before and after the change

When routine data screening programs are run, the following additional data are recorded in the
audit trail:

     •   version number of the screening program
     •   values of screening limits (e.g., upper and lower acceptance limits for each parameter)
     •   numerical value of each data item flagged and the flag applied

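The audit trail fields listed above map naturally onto a single record structure. The following
sketch is one hypothetical way such a record could be laid out; it is illustrative only and is not the
schema of the actual Microsoft Access data base.

    # Hypothetical audit-trail record reflecting the fields listed above;
    # not the actual PM2.5 DAS schema.
    from dataclasses import dataclass
    from datetime import datetime
    from typing import Optional

    @dataclass
    class AuditTrailRecord:
        operator_id: str           # operator's identity (ID code)
        changed_at: datetime       # date and time of the change
        table_name: str            # table containing the changed item
        field_name: str            # field containing the changed item
        reason: str                # reason for the change
        item_key: str              # identifying info (date, time, site, parameter)
        old_value: Optional[str]   # value of the item before the change
        new_value: Optional[str]   # value of the item after the change
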
The audit trail is produced automatically and can only document changes; there is no "undo"
capability for reversing changes after they have been made.  Available reports based on the audit
trail include:

     *   log of routine data validation, screening, and reporting program runs
     *•   report of data changes by station for a specified time period
     *   report of data changes for a specified purpose
     >   report of data changes made by a specified person

-------
                                                                              Project: Model QAPP
                                                                                   Element No: 19
                                                                                    Revision No:l
                                                                                     Date:4/17/98
                                                                     	Page 10 of 14

Because of storage requirements, the System Administrator must periodically move old audit trail
records to backup media.  Audit trail information will not be moved to backup media until after
the data are reported to AIRS.  All backups will be retained so that any audit trail information can
be retrieved for at least three years.


19.7   Data Analysis


    Data analysis sometimes involves comparing suitably reduced data with a conceptual model (e.g., a dispersion
model or an infectivity model). It frequently includes computation of summary statistics, standard errors, confidence
intervals, tests of hypotheses relative to model parameters, and goodness-of-fit tests. This element should briefly
outline the proposed methodology for data analysis and a more detailed discussion should be included in the final
report.
The Palookaville Department of Health is currently implementing the data summary and analysis
requirements contained in 40 CFR Part 58, Appendix A.  It is anticipated that as the PM2 5
Monitoring Program develops, additional data analysis procedures will be developed.  The
following specific summary statistics will be tracked and reported for the PM2 5 network:

     •   Single sampler bias or accuracy (based on collocated FRM data, flow rate performance
         audits, and FRM performance evaluations)
     •   Single sampler precision (based on collocated data)
     •   Network-wide bias and precision (based on collocated FRM data, flow rate performance
         audits, and FRM performance evaluations)
     •   Data completeness

Equations used for these reports are given in Table 19-6.

Table 19-6 Report Equations	
Criterion | Equation | Reference
Accuracy of Single Sampler Flow - Single Check (d_i); X_i is the reference flow, Y_i is the measured flow | d_i = [(Y_i - X_i) / X_i] x 100 | 40 CFR 58, Appendix A, Section 5.5.1.1
Bias of a Single Sampler - Annual Basis (D_j); the average of the individual percent differences between sampler and reference values, where n_j is the number of measurements over the period | D_j = (1 / n_j) x (sum of the d_i) | 40 CFR 58, Appendix A, Section 5.5.1.2

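The two statistics summarized in Table 19-6 can be illustrated with a short sketch (the function
names and the example flow checks are hypothetical; the definitions follow 40 CFR Part 58,
Appendix A, as cited above):

    # Sketch of the single-check percent difference and single-sampler
    # annual bias statistics of Table 19-6. Names and data are illustrative.
    def percent_difference(measured, reference):
        """d_i = (Y_i - X_i) / X_i x 100 for one flow rate check."""
        return (measured - reference) / reference * 100.0

    def annual_bias(checks):
        """D_j = (1/n_j) x sum(d_i) over the checks made during the period."""
        diffs = [percent_difference(y, x) for (y, x) in checks]
        return sum(diffs) / len(diffs)

    # Hypothetical flow checks: (measured L/min, reference L/min)
    checks = [(16.80, 16.67), (16.50, 16.67), (16.70, 16.67), (16.90, 16.67)]
    print(round(annual_bias(checks), 2))   # average percent difference
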
-------
                                                                              Project: Model QAPP
                                                                                   Element No: 19
                                                                                    Revision No:l
                                                                                      Date: 4/17/98
                                                                             	Page 11 of 14
   Percent Difference for a Single Check (
-------
                                                                           Project: Model QAPP
                                                                                Element No: 19
                                                                                 Revision No:l
                                                                                  Date:4/17/98
	Page 12 of 14

Other data that are entered into the DAS will also generate the following flags:

     •   Filter holding time exceeded (HTE)
     •   24-hour laboratory temperature criteria exceeded (FLT)
     •   24-hour relative humidity criteria exceeded (FLH)
     •   Below quantitation limit (6 μg/m3) for collocated pairs (BLQ)

During the sample validation process, the flags will be used to decide on validating or invalidating
individual samples or batches of data. Section 23 discusses this process.
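
As a hedged illustration of how these qualifiers might be assigned during batch validation, the
following sketch takes the holding-time, temperature, and humidity criteria as inputs (they are
defined elsewhere in this QAPP and its SOPs rather than hard-coded here); only the 6 μg/m3
quantitation limit is taken from the list above, and all names are hypothetical.

    # Hypothetical assignment of the data qualifiers listed above (HTE, FLT,
    # FLH, BLQ). Criteria are passed in; only the 6 ug/m3 limit is from the text.
    def assign_flags(sample, max_holding_days, lab_temp_ok, lab_rh_ok,
                     quantitation_limit_ugm3=6.0):
        """Return the set of qualifier flags applicable to one sample (dict)."""
        flags = set()
        if sample["holding_days"] > max_holding_days:
            flags.add("HTE")   # filter holding time exceeded
        if not lab_temp_ok:
            flags.add("FLT")   # 24-hour laboratory temperature criteria exceeded
        if not lab_rh_ok:
            flags.add("FLH")   # 24-hour relative humidity criteria exceeded
        if sample.get("collocated") and sample["conc_ugm3"] < quantitation_limit_ugm3:
            flags.add("BLQ")   # below quantitation limit for collocated pairs
        return flags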

19.9  Data Tracking


    Data management includes tracking the status of data as they are collected, transmitted, and processed. The
QAPP should describe the established procedures for tracking the flow of data through the data processing system.
The PM2 5 DAS contains the input functions and reports necessary to track and account
for the whereabouts of filters and the status of data processing operations for specific data.
Information about filter location is updated at distributed data entry terminals at the points of
significant operations.  The following input locations are used to track filter location and status:

     •   Laboratory
          •   Filter receipt (by lot)
          •   Filter pre-sampling weighing (individual filter number first enters the system)
          •   Filter packaged for the field (filter numbers in each package are recorded)
     •   Shipping (package numbers are entered for both sending and receiving)
     •   Laboratory
          •   Package receipt (package is opened and filter numbers are logged in)
          •   Filter post-sampling weighing
          •   Filter archival

In most cases the tracking  data base and the monitoring data base are updated  simultaneously.
For example, when the filter is pre-weighed,  the weight is entered into  the monitoring data base
and the filter number and status are entered into the tracking data base.  The Palookaville
Department of Health has requested permission from the Regional EPA to use this electronic
system in lieu of the paper forms previously used for chain-of-custody  tracking. Until this request
is approved, a parallel paper chain-of-custody system will remain in place.

-------
                                                                              Project: Model QAPP
                                                                                   Element No: 19
                                                                                    Revision No: 1
                                                                                     Date:4/17/98
 	Page 13 of 14

 Tracking reports may be generated by any personnel with report privileges on the DAS.  The
 following tracking reports are available:

     •   Location of any filter (by filter number)
     •   List of all filters sent to a specified site that have not been returned
     •   List of all filters that have not been returned and are more than 30 days past the initial
          weighing date
     •   List of all filters in the filter archive
     •   List of all filters that have been received but have not been post-weighed
     •   Ad hoc reports can also be generated using Microsoft Access® queries (a simple
          illustrative query sketch appears at the end of this subsection)

 The Air Division QA Officer or designee is responsible for tracking filter status at least twice per
 week and following up on anomalies such as excessive holding time in the laboratory before
 reweighing.

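As one illustration of the kind of ad hoc tracking report mentioned above (filters more than 30 days
past their initial weighing that have not come back from the field), a query over the tracking
records might look like the sketch below; the record layout and status values are hypothetical, and
the production reports are Microsoft Access queries rather than this code.

    # Hypothetical sketch of an ad hoc tracking report: filters sent to the
    # field more than 30 days ago (by pre-weighing date) and not yet returned.
    from datetime import date, timedelta

    def overdue_filters(tracking_records, today=None, max_days=30):
        today = today or date.today()
        cutoff = today - timedelta(days=max_days)
        return [r for r in tracking_records
                if r["status"] == "sent_to_field"
                and r["pre_weigh_date"] < cutoff]
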
 19.10 Data Storage and Retrieval
    The QAPP should discuss data storage and retrieval including security and time of retention, and it should
document the complete control system. The QAPP should also discuss the performance requirements of the data
processing system, including provisions for the batch processing schedule and the data storage facilities.
Data archival policies for the PM2 5 data are shown in  Table 19-8.

Table 19-8 Data Archive Policies
Data Type | Medium | Location | Retention Time | Final Disposition
Weighing records; chain-of-custody forms | Hardcopy | Laboratory | 3 years | Discarded
Laboratory notebooks | Hardcopy | Laboratory | 3 years | N/A
Field notebooks | Hardcopy | Air Quality Division | 3 years | Discarded
PM2 5 MP data base (excluding audit trail records) | Electronic (on-line) | Air Quality Division | Indefinite (may be moved to backup media after 5 years) | Backup tapes retained indefinitely
PM2 5 MP audit trail records | Electronic (backup tapes) | Air Quality Division | 3 years | Discarded
Filters | Filters | Laboratory | 1 year | Discarded

-------
                                                                        Project: Model QAPP
                                                                             Element No: 19
                                                                              Revision No:l
                                                                               Date:4/17/98
	Page 14 of 14

The PM2 5 data reside on an IBM-PC compatible computer in the Air Quality Division. This
computer has the following specifications:

     •   Processor: Dual Pentium Pro, 180 MHz
     •   Operating System: Windows NT Server
     •   Memory: 128 MB
     •   Storage: 18 GB (SCSI RAID 0 array)
     •   Backup: DAT (3 GB per tape) - incremental backups daily; full backups biweekly
     •   Network: Windows NT, 100 Mbps Ethernet network (currently 23 Windows 95 and NT
         workstations on site; additional workstations via 28.8 kbps dial-in modem)
     •   Data Base Software: Microsoft Access, Visual Basic, Visual C++
     •   Security: Password protection on all workstations and dial-in lines; additional password
         protection applied by application software

Security of data in the PM2 5 data base is ensured by the following controls:

     •   Password protection on the data base that defines three levels of access to the data
     •   Regular password changes (quarterly for continuing personnel; passwords for personnel
         leaving the Air Division will be canceled immediately)
     •   Independent password protection on all dial-in lines
     •   Logging of all incoming communication sessions, including the originating telephone
         number, the user's ID, and connect times
     •   Storage of media, including backup tapes, in locked, restricted-access areas

-------
                                                                             Project: Model QAPP
                                                                                  Element No: 20
                                                                                  Revision No:  1
                                                                                    Date: 4/17/98
                                                                            	Page 1 of 11
                     20.0 Assessments and Response Actions
    During the planning process, many options for sampling design (ref. EPA QA/G-5S, Guidance on Sampling
 Design to Support QAPPs), sample handling, sample cleanup and analysis, and data reduction are evaluated and
 chosen for the project. In order to ensure that the data collection is conducted as planned, a process of evaluation of
 the collected data is necessary. This element of the QAPP describes the internal and external checks necessary to
 ensure that:

    •   all elements of the QAPP are correctly implemented as prescribed,
    •   the quality of the data generated by implementation of the QAPP is adequate, and
    •   corrective actions, when needed, are implemented in a timely manner and their effectiveness is confirmed.
    Although any external assessments that are planned should be described in the QAPP, the most important part of
 this element is documenting all planned internal assessments. Generally, internal assessments are initiated or
 performed by the internal QA Officer so the activities described in this element of the QAPP should be related to the
 responsibilities of the QA Officer as discussed in Section A4.
For this QAPP, an assessment is defined as an evaluation process used to measure the
performance or effectiveness of the quality system, the establishment of the monitoring network
and sites, and the various measurement phases of the data operation.

The results of quality assurance assessments indicate whether the control efforts are adequate or
need to be improved. Documentation of all  quality assurance and quality control efforts
implemented during the data collection, analysis, and reporting phases is important to data users,
who can then consider the impact of these control efforts on the data quality (see Section 21).
Both qualitative and quantitative assessments of the effectiveness of these control efforts will
identify  those areas most likely to impact the data quality and to what extent. Periodic
assessments of SLAMS data quality are required to be reported to EPA.  At the same time, the
selection and extent of the QA and QC activities used by a monitoring agency depend on a
number of local factors such as the field and laboratory conditions, the objectives for monitoring,
the level of the data quality needed, the expertise of assigned personnel, the cost of control
procedures, pollutant concentration levels, etc.

In order to ensure the adequate performance of the quality system, the Palookaville Department
of Health will perform the following assessments:

    >   Management Systems Reviews
    >   Network Reviews
    >   Technical Systems Audits
    >   Audits of Data Quality
    >   Data Quality Assessments

-------
                                                                                                Project: Model QAPP
                                                                                                      Element No: 20
                                                                                                      Revision No: 1
                                                                                                        Date: 4/17/98
                                                                                               	Page 2 of 11
  20.1     ASSESSMENT ACTIVITIES AND PROJECT PLANNING
    The following is a description of various types of assessment activities available to managers in evaluating the effectiveness of
environmental program implementation.

Assessment of the Subsidiary Organizations

A.  Management Systems Review (MSR).  A specific form of management assessment, this process is a qualitative assessment of
    a data collection operation or organization to establish whether the prevailing quality management structure, policies,
    practices, and procedures are adequate for ensuring that the type and quality of data needed are obtained. The MSR is used to
    ensure that sufficient management controls are in place and carried out by the organization to adequately plan, implement,
    and assess the results of the project. The MSR also checks the conformance of the organization's quality system with its
    approved QMP. See also Guidance for the Management Systems Review Process (EPA QA/G-3).

B.  Readiness reviews. A readiness review is a technical check to determine if all components of the project are in place so that
    work can commence on a specific phase of a project.

Assessment of Project Activities

 A.  Surveillance. Surveillance is the continual or frequent monitoring of the status of a project and the analysis of records to
    ensure that specified requirements are  being fulfilled.

B.  Technical Systems Audit (TSA). A TSA is a thorough and systematic onsite qualitative audit, where facilities, equipment,
    personnel, training, procedures, and record keeping are examined for conformance to the QAPP.

C.  Performance Evaluation (PE). A PE is a type of audit in which the quantitative data generated by the measurement system
    are obtained independently and compared with routinely obtained data to evaluate the proficiency of an analyst or laboratory.
    The QAPP should list the PEs that are planned, identifying:

         •   the constituents to be measured,
         •   the target concentration ranges,
         •   the timing/schedule for PE sample analysis, and
         •   the aspect of measurement quality to be assessed (e.g., bias, precision, and detection limit).

D.  Audit of Data Quality (ADQ). An ADQ reveals how the data were handled, what judgments were made, and whether
    uncorrected mistakes were made. Performed prior to producing a project's final report, ADQs can often identify the means to
    correct systematic data reduction errors.

E.  Peer review.  Peer review is not a TSA, nor strictly an internal QA function, as it may encompass non-QA aspects of a project
    and is primarily designed for scientific review. Reviewers are chosen who have technical expertise comparable to the
    project's performers but who are independent of the project. ADQs and peer reviews ensure that the project activities:

             •     were technically adequate,
             •     were competently performed,
             •     were properly documented,
             •     satisfied established technical requirements, and
             •     satisfied established QA requirements.

    In addition, peer reviews assess the assumptions, calculations, extrapolations, alternative interpretations, methods, acceptance
    criteria, and conclusions documented in the project's report.

F.  Data Quality Assessment (DQA). DQA involves the application of statistical tools to determine whether the data meet the
    assumptions that the DQOs and data collection design were developed under and whether the total error in the data is
    tolerable.

-------
                                                                         Project: Model QAPP
                                                                              Element No: 20
                                                                              Revision No: 1
                                                                                Date: 4/17/98
                                                                         	Page 3 of 11
 20.1.1 Management Systems Review
A management systems review (MSR) is a qualitative assessment of a data collection operation or
organization to establish whether the prevailing quality management structure, policies, practices,
and procedures are adequate for ensuring that the type and quality of data needed are obtained.
Management systems reviews of the Ambient Air Monitoring Program are conducted every three
years by the Office of the Director. The MSR will use the appropriate federal regulations and the
QAPP to determine whether the air program and its related quality system are operating adequately.
The quality assurance activities for all criteria pollutants, including PM2 5, will be part of the MSR.
Divisions to be included in the MSR are the QA, Air, and Program Support Divisions.  The
Office Director's staff will report its findings to the appropriate Divisions within 30 days of
completion of the MSR.  The report will be appropriately filed (Section 9).  Follow-up and
progress on corrective action(s) will be determined during regularly scheduled division directors'
meetings.

20.1.2 Network Reviews

Conformance with the network requirements of the Ambient Air Monitoring Network set forth in 40
CFR Part 58 Appendices D and E is determined through annual network reviews of the ambient
air quality monitoring system.  The network review is used to determine how well a particular air
monitoring network is achieving its required air monitoring objective, and how it should be
modified to continue to meet its objective.  A PM2 5 network review will be accomplished every
year.  Since the EPA Regions are also required to perform these reviews, the Department will
coordinate with the Region so that, where possible, the reviews are performed at the same time.
The Air Monitoring Branch will be responsible for conducting the network review.

The following criteria will be considered during the review:

    >   date of last review
    >   areas where attainment/nonattainment redesignations are taking place or are likely to take
        place
    >   results of special studies, saturation sampling, point source oriented ambient monitoring,
        etc.
    >   proposed network modifications since the last network review

In addition, pollutant-specific priorities may be considered (e.g., newly designated nonattainment
areas, "problem areas", etc.).

Prior to the implementation of the network review, significant data and information pertaining to
the review will be compiled and evaluated. Such information might include the following:
    >   network files (including updated site information and site photographs)
    >   AIRS reports (AMP220, 225, 380, 390, 450)

-------
                                                                         Project: Model QAPP
                                                                             Element No: 20
                                                                              Revision No: 1
                                                                               Date: 4/17/98
	Page 4 of 11

    >   air quality summaries for the past five years for the monitors in the network
    >   emissions trends reports for major metropolitan areas
    >   emission information, such as emission density maps for the region in which the monitor is
        located and emission maps showing the major sources of emissions
    >   National Weather Service summaries for the monitoring network area

Upon receipt, the information will be checked to ensure it is current. Discrepancies
will be noted on the checklist and resolved during the review.  Files and/or photographs that need
to be updated will also be identified. The following categories will be emphasized during network
reviews:

Number of Monitors - For SLAMS, the number of monitors required for PM2 5, depending upon
the measurement objectives, is discussed in 40 CFR Part 58, with additional details in the Guidance
for Network Design and Optimum Exposure for PM2 5 and PM10.  Section 10 of this QAPP
discusses the PM2 5 Network. Adequacy of the network will be determined by using the following
information:

    >   maps of historical monitoring data
    >   maps of emission densities
    >   dispersion modeling
    >   special studies/saturation sampling
    >   best professional judgement
    >   SIP requirements
    >   revised monitoring strategies (e.g., lead strategy, reengineering air monitoring network)

For NAMS, areas to be monitored must be selected based on urbanized population and pollutant
concentration levels.  To determine whether the number of NAMS is adequate, the number of
NAMS operating will be compared to the number of NAMS specified in 40 CFR 58 Appendix D.
The number of NAMS operating can be determined from the AMP220 report in AIRS.  The
number of monitors required, based on concentration levels and population, can be determined
from the AMP450 report and the latest census population data.
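
As an illustrative sketch of this comparison (the counts below are placeholders, not actual
values from the AMP220 or AMP450 reports):

    # Illustrative sketch: compare the number of NAMS operating (from AMP220)
    # with the number required (derived from AMP450 and census population data).
    nams_operating = 4   # hypothetical count taken from the AMP220 report
    nams_required = 5    # hypothetical count derived from AMP450 and census data

    if nams_operating >= nams_required:
        print("NAMS count is adequate")
    else:
        shortfall = nams_required - nams_operating
        print(f"NAMS count is inadequate: {shortfall} additional monitor(s) needed")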

Location of Monitors - For SLAMS, the location of monitors is not specified in the regulations,
but is determined by the Regional Office and State agencies on a case-by-case basis to meet the
monitoring objectives specified in 40 CFR Part 58 Appendix D. Adequacy of the location of
monitors can only be determined on the basis of stated objectives.  Maps, graphical overlays, and
GIS-based information will be helpful in visualizing or assessing the adequacy of monitor
locations.  Plots of potential emissions and/or historical monitoring data versus  monitor locations
will also be used.

During the  network review, the stated objective for each monitoring location or site (see section
10) will be "reconfirmed" and the spatial scale "reverified" and then compared to each location to
determine whether these objectives can still be attained at the present location.

-------
                                                                          Project: Model QAPP
                                                                              Element No: 20
                                                                               Revision No: 1
                                                                                Date: 4/17/98
 	Page 5 of 11

 Conformance to 40 CFR Part 58 Appendix E - Probe Siting Requirements - Applicable siting
 criteria for SLAMS and NAMS are specified in 40 CFR 58, Appendix E. The on-site visit will
 consist of the physical measurements and observations to determine compliance with the
 Appendix E requirements, such as height above ground level, distance from trees, paved or
 vegetative ground cover, etc.  Since many of the Appendix E requirements will not change within
 one year, this check at each site will be performed every 3 years.

 Prior to the  site visit, the reviewer will obtain and review the following:

     >   most recent hard copy of the site description (including any photographs)
     >   data on the seasons with the greatest potential for high concentrations for specified
         pollutants
     >   predominant wind direction by season

 A checklist similar to the checklist used by the EPA Regional offices during their scheduled
 network reviews will be used.  This checklist can be found in the SLAMS/NAMS/PAMS Network
 Review Guidance, which is intended to assist the reviewers in determining conformance with
 Appendix E. In addition to the items on the checklist, the reviewer will also perform the
 following tasks:

     >   ensure that the inlet is clean
     >   check equipment for missing parts, frayed cords, damage, etc.
     >   record findings in field notebook and/or checklist
     >   take photographs/videotape in the 8 directions
     >   document site conditions with additional photographs/videotape

 Other Discussion Topics- In addition to the items included in the checklists, other subjects for
 discussion as part of the network review and overall adequacy of the monitoring program will
 include:

     >   installation of new monitors
     >   relocation of existing monitors
     >   siting criteria problems and suggested solutions
     >   problems with data submittals and data completeness
     >   maintenance and replacement of existing monitors and related equipment
     >   quality assurance problems
     >   air quality studies and special monitoring programs
     >   other issues
         - proposed regulations
         - funding

A report of the network review will be written within two months of the review (Section 21) and
appropriately filed (Section 10).

-------
                                                                          Project: Model QAPP
                                                                               Element No: 20
                                                                               Revision No: 1
                                                                                Date: 4/17/98
                                                                         	Page 6 of 11
20.1.3 Technical Systems Audits

A TSA is a thorough and systematic onsite qualitative audit, where facilities, equipment,
personnel, training, procedures, and record keeping are examined for conformance to the QAPP.
TSAs of the PM2 5 network will be accomplished every three years and will be staggered with the
required TSA conducted by the EPA Regional Office. The QA Office will implement the TSA either
as a team or as an individual auditor. The QA Office will perform three TSA activities that can be
accomplished separately or combined:
    >   Field - handling, sampling, shipping
    >   Laboratory - pre-sampling weighing, shipping, receiving, post-sampling weighing,
        archiving, and associated QA/QC
    >   Data management - information collection, flagging, data editing, security, upload

Figure 20.1, Audit Activities, illustrates the on-site audit as a flowchart. The audit team
first interviews the Reporting Organization Director and other key personnel, then proceeds in
two groups. Audit Group 1 interviews the Laboratory Manager, visits the laboratory and
witnesses operations, reviews sample receiving and custody, and selects a portion of the data
to initiate an audit trail through laboratory operations to data management. Audit Group 2
interviews the Field Operations Manager, visits the sites and the audit and calibration
facility, and selects a portion of the data to initiate an audit trail through field operations
to data management. The groups then meet to discuss operations and findings, finalize the audit
trails and complete the data audit, prepare an audit result summary of (a) overall operations,
(b) data audit findings, (c) laboratory operations, and (d) field operations, complete the
audit finding forms and debriefing report, discuss findings with key personnel, and conclude
the on-site audit.

Key personnel to be interviewed during the audit are those individuals with responsibilities
for planning, field operations, laboratory operations, QA/QC, data management, and reporting.
The audit activities are illustrated in Figure 20.1.

To increase the uniformity of the TSA, an audit checklist will be developed and used. It will
review activities similar to the training certification forms found in Appendix B, but will
contain more detail.

The audit team will prepare a brief written summary of findings, organized into the following
areas: planning, field
-------
                                                                           Project: Model QAPP
                                                                                Element No: 20
                                                                                Revision No: 1
                                                                                 Date: 4/17/98
                                                                                  Page 7 of 11
operations, laboratory operations, quality assurance/quality control, data management, and
reporting. Problems with specific areas will be discussed and an attempt made to rank them in
order of their potential impact on data quality. For the more serious of these problems, audit
findings will be drafted (Fig. 20.2).

The audit finding form has been designed such that one is filled out for each major deficiency that
requires formal corrective action.  The finding should include items such as: pollutant(s) impacted,
estimated time period of the deficiency, site(s) affected, and reason for the action.  The finding
form will inform the Department about serious problems that may compromise the quality of the data
and therefore require specific corrective actions. Finding forms are initiated by the Audit Team and
discussed at the debriefing.  During the debriefing, if the audited group is in agreement with the
finding, the form is signed by the group's branch manager or his designee during the exit interview.
If a disagreement occurs, the QA Team will record the opinions of the group audited and set a
later date to address the finding at issue.
                                   Audit Finding

             Audit Title: _________________________   Audit #: _______   Finding #: _______

             Finding:

             Discussion:

             QA Lead Signature: _____________________________   Date: ___________

             Audited Agency's Signature: _____________________________   Date: ___________

           Figure 20.2. Audit Finding Form

-------
                                                                         Project: Model QAPP
                                                                             Element No: 20
                                                                              Revision No: 1
                                                                               Date: 4/17/98
	Page 8 of 11

Post-Audit Activities- The major post-audit activity is the preparation of the systems audit
report. The report will include:

     >   audit title and number and any other identifying information
     >   audit team leaders, audit team participants, and audited participants
     >   background information about the project, purpose of the audit, dates of the audit,
         particular measurement phase or parameters that were audited, and a brief description of
         the audit process
     >   summary and conclusions of the audit and corrective actions required
     >   attachments or appendices that include all audit evaluations and audit finding forms

To prepare the report, the audit team will meet and compare observations with collected
documents and results of interviews and discussions with key personnel. Expected QA Project
Plan implementation is compared with observed accomplishments and deficiencies and the audit
findings are reviewed in detail. Within thirty (30) calendar days of the completion of the audit, the
audit report will be prepared and submitted. The systems audit report will be submitted to the
appropriate branch managers and appropriately filed (Section 10).

If the branch has written comments or questions concerning the audit report, the Audit Team will
review and incorporate them as appropriate, and subsequently prepare and resubmit a report in
final form within thirty (30) days of receipt of the written comments. The report will include an
agreed-upon schedule for corrective action implementation.

Follow-up and Corrective Action Requirements- The QA Office and the audited organization
may work together to resolve required corrective actions. As part of corrective action and follow-
up, an audit finding response form (Fig 20.3) will be generated by the audited organization for
each finding form submitted by the QA Team.  The audit finding response form is signed by the
audited organization's Director and sent to the QA Office, which reviews and accepts the corrective
action. The audit response form will be completed by the audited organization within 30 days of
acceptance of the audit report.

20.1.4 Audit of Data Quality (ADQ)

An ADQ reveals how the data are handled, what judgments were made, and whether uncorrected
mistakes were made.  ADQs can often identify the means to correct systematic data reduction
errors. An ADQ will be performed every year and will also be part of the TSA (every 3 years).
Thus, sufficient time and effort will be devoted to this activity so that the auditor or team has a
clear understanding and complete documentation of data flow. Pertinent ADQ questions will
appear on the TSA check sheets to ensure that the data collected at each stage maintains its
integrity. The ADQ will serve as an effective framework for organizing the extensive amount of
information gathered during the audit of laboratory, field monitoring, and support functions within
the agency. The ADQ will have the same reporting/corrective action requirements as the TSA.

-------
                                                                             Project: Model QAPP
                                                                                  Element No: 20
                                                                                  Revision No: 1
                                                                                   Date: 4/17/98
                                                                            	Page 9 of 11
                           Audit Finding Response Form

               Audited Division: _________________________

               Audit Title: _________________________   Audit #: _______   Finding #: _______

               Finding:

               Cause of the problem:

               Actions taken or planned for correction:

               Responsibilities and timetable for the above actions:

               Prepared by: _________________________   Date: ___________

               Signed by: _________________________   Date: ___________

               QA Division

               Reviewed by: _________________________   Date: ___________

               Remarks:

               Is this audit finding closed? _______   When? _______

               File with official audit records. Send copy to auditee.

             Figure 20.3. Audit Response Form
20.1.5 Data Quality Assessments

A data quality assessment (DQA) is the statistical analysis of environmental data to determine
whether the quality of the data is adequate to support the decisions that are based on the DQOs.
Data are appropriate if the level of uncertainty in a decision based on the data is acceptable.  The
DQA process is described in detail in Guidance for the Data Quality Assessment Process, EPA
QA/G-9 and is summarized below.

    1.   Review the data quality objectives (DQOs) and sampling design of the program: review
         the DQOs and develop them if this has not already been done. Define the statistical
         hypotheses, tolerance limits, and/or confidence intervals.

-------
                                                                          Project: Model QAPP
                                                                              Element No: 20
                                                                               Revision No: 1
                                                                                Date: 4/17/98
                                                                         	Page 10 of 11
    2.   Conduct preliminary data review. Review Precision & Accuracy (P&A) and other
         available QA reports; calculate summary statistics and prepare plots and graphs. Look for
         patterns, relationships, or anomalies.

    3.   Select the statistical test: select the best test for analysis based on the preliminary review,
         and identify the underlying assumptions about the data for that test.

    4.   Verify test assumptions: decide whether the underlying assumptions made by the selected
         test hold true for the data, and determine the consequences if they do not.

    5.   Perform the statistical test: perform the test and document the inferences. Evaluate the
         test's performance for future use (an illustrative sketch of steps 2 and 5 follows).
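
The following is a minimal illustrative sketch of steps 2 and 5 under simplifying assumptions;
it is not the regulatory NAAQS comparison procedure, and the concentrations and the mean of
interest shown are placeholders.

    # Illustrative sketch: summary statistics (step 2) and a one-sample t statistic
    # (step 5) for a set of hypothetical 24-hour PM2.5 concentrations.
    import math
    import statistics

    pm25 = [9.8, 14.2, 11.5, 17.0, 12.3, 15.8, 10.1, 13.4]   # ug/m3, hypothetical
    mean_of_interest = 15.0                                   # illustrative value only

    n = len(pm25)
    mean = statistics.mean(pm25)
    sd = statistics.stdev(pm25)                               # step 2: summary statistics
    t_stat = (mean - mean_of_interest) / (sd / math.sqrt(n))  # step 5: one-sample t statistic

    print(f"n={n}, mean={mean:.2f} ug/m3, sd={sd:.2f} ug/m3, t={t_stat:.2f}")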

Data quality assessment will be included in the QA Annual Report. Details of these reports are
discussed in Section 21.

Measurement uncertainty will be estimated for both automated and manual methods. Terminology
associated with measurement uncertainty is found within 40 CFR Part 58 Appendix A and
includes: (a) Precision - a measurement of mutual agreement among individual measurements of
the same property, usually under prescribed similar conditions, expressed generally in terms of the
standard deviation; (b) Accuracy - the degree of agreement between an observed value and an
accepted reference value; accuracy includes a combination of random error (precision) and
systematic error (bias) components which are due to sampling and analytical operations; (c) Bias -
the systematic or persistent distortion of a measurement process which causes errors in one
direction. The individual results of these tests for each method or analyzer shall be reported to
EPA.

Estimates of the data quality will be calculated on the basis  of single monitors and aggregated to
all monitors.
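
As an illustrative sketch only, and not the exact aggregation formulas of 40 CFR Part 58
Appendix A, the relative percent differences between paired primary and collocated PM2 5
measurements can be summarized as rough indicators of precision and bias; the paired values
below are placeholders.

    # Illustrative sketch: relative percent differences between collocated samplers.
    import statistics

    primary    = [12.1, 15.4, 9.8, 20.2, 14.7]   # ug/m3, hypothetical primary sampler
    collocated = [11.8, 15.9, 9.5, 19.6, 15.1]   # ug/m3, hypothetical duplicate sampler

    pct_diffs = [
        100.0 * (dup - prim) / ((dup + prim) / 2.0)
        for prim, dup in zip(primary, collocated)
    ]

    precision_indicator = statistics.stdev(pct_diffs)  # spread of the percent differences
    bias_indicator = statistics.mean(pct_diffs)        # mean signed percent difference

    print(f"percent differences: {[round(d, 1) for d in pct_diffs]}")
    print(f"precision indicator (std. dev.): {precision_indicator:.1f}%")
    print(f"bias indicator (mean): {bias_indicator:+.1f}%")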

-------
                                                                                       Project: Model QAPP
                                                                                            Element No: 20
                                                                                             Revision No: 1
                                                                                              Date: 4/17/98
                                                                                      	Page 11  of 11
 20.2     Documentation of Assessments
    The following material describes what should be documented in a QAPP after consideration of the above issues
and types of assessments:

Number, Frequency, and Types of Assessments - Depending upon the nature of the project, there may be more than
one assessment. A schedule of the number, frequencies, and types of assessments required should be given.

Assessment Personnel - The QAPP should specify the individuals, or at least the specific organizational units, who
will perform the assessments.  Internal audits are usually performed by personnel who work for the organization
performing the project work but who are organizationally independent of the management of the project. External
audits are performed by personnel of organizations not connected with the project but who are technically qualified
and who understand the QA requirements of the project.

Schedule of Assessment Activities - A schedule of audit activities, together with relevant criteria for assessment,
should be given to the extent that it is known in advance of project activities.

Reporting and Resolution of Issues -Audits, peer reviews, and other assessments often reveal findings of practice or
procedure that do not conform to the written QAPP. Because these issues must be addressed in a timely manner, the
protocol for resolving them should be given here together with the proposed actions to ensure that the corrective
actions were performed effectively. The person to whom the concerns should be addressed, the decision-making
hierarchy, the schedule and format for oral and written reports, and the responsibility for corrective action should all
be discussed in this element. It also should explicitly define the unsatisfactory conditions upon which the assessors are
authorized to act and list the project personnel who should receive assessment reports.
Table 20-1 summarizes each of the assessments discussed above.
 Table 20-1 Assessment Summary

 Assessment Activity        Frequency    Personnel           Schedule    Report           Reporting/Resolution
                                         Responsible                     Completion
 Management Systems         1/3 years    Directors Office    1/1/2000    30 days after    Directors Office to QA, Air,
 Reviews                                                                 activity         Program Support Divisions
 Network Reviews - App D    1/year       Air Division        1/1/2000    30 days after    Air Division to Air
 Network Reviews - App E    1/3 years    Air Division        1/1/2000    activity         Monitoring Branch
 Technical Systems Audits   1/3 years    QA Office           5/1/99      30 days after    QA Division to Air
                                                                         activity         Monitoring Division
 Audits of Data Quality     1/year       QA Office           5/1/99      30 days after    QA Division to Air
                                                                         activity         Monitoring Division
 Data Quality Assessment    1/year       QA/Air Monitoring   1/1/2000    120 days after   Air Monitoring Division to
                                         Divisions                       end of           Directors Office/EPA Region
                                                                         calendar year

-------
                                                                             Project: Model QAPP
                                                                                 Element No: 21
                                                                                   Revision No: 1
                                                                                    Date: 4/17/98
                                                                                     Page 1 of 6
                            21.0 Reports to Management
    Effective communication between all personnel is an integral part of a quality system. Planned reports provide a
 structure for apprising management of the project schedule, the deviations from approved QA and test plans, the
 impact of these deviations on data quality, and the potential uncertainties in decisions based on the data. Verbal
 communication on deviations from QA plans should be noted in summary form in element D1 of the QAPP.
 This section describes the quality-related reports and communications to management necessary
 to support SLAMS/NAMS PM2 5 network operations and the associated data acquisition,
 validation, assessment, and reporting. Unless otherwise indicated, data pertaining to PM2 5 will be
 included in reports containing monitoring data for other pollutants.

 Important benefits of regular QA reports to management include the opportunity to alert
 management to data quality problems, to propose viable solutions to problems, and to procure
 necessary additional resources.  Quality assessment, including the evaluation of the technical
 systems, the measurement of performance, and the assessment of data, is conducted to help ensure
 that measurement results meet program objectives and to ensure that necessary corrective actions
 are taken early, when they will be most effective. This is particularly important in the new PM2 5
 network, as new equipment and procedures are being implemented.

 Effective communication among all personnel is  an integral part of a quality system.  Regular,
 planned quality reporting provides a means for tracking the following:
    >   adherence to scheduled delivery of data and reports,
    >   documentation of deviations from approved QA and test plans, and the impact of these
        deviations on data quality, and
    >   analysis of the potential uncertainties in decisions based on the data.

 21.1 Frequency, Content,  and Distribution of Reports

     The QAPP should indicate the frequency, content, and distribution of the reports so that management may
  anticipate events and move to ameliorate potentially adverse results. An important benefit of the status reports is the
  opportunity to alert the management of data quality problems, propose viable solutions, and procure additional
  resources. If program assessment (including the evaluation of the technical systems, the measurement of
  performance, and the assessment of data) is not conducted on a continual basis, the integrity of the data generated in
  the program may not meet the quality requirements. These audit reports, submitted in a timely manner, will provide
   an opportunity to implement corrective actions when most appropriate.
Required reports to management for PM2 5 monitoring and the SLAMS program in general are
discussed in various sections of 40 CFR Parts 50, 53, and 58. Guidance for management report
format and content is provided in guidance developed by EPA's Quality Assurance Division

-------
                                                                        Project: Model QAPP
                                                                             Element No: 21
                                                                              Revision No:l
                                                                               Date:4/17/98
                        	Page 2 of6

(QAD) and the Office of Air Quality Planning and Standards (OAQPS). These reports are
described in the following subsections.

21.1.1 QA Annual Report

Periodic assessments of SLAMS data quality are required to be reported to EPA (40 CFR 58
Appendix A, Section 1.4, revised July 18, 1997). The Palookaville Health Department Air
Division's QA Annual Report is issued to meet this requirement.  This document describes the
quality objectives for measurement data and how those objectives have been met.

The QA Annual Report also provides for the review of the SLAMS air quality surveillance system
on an annual basis to determine if the system meets the monitoring objectives defined in 40 CFR
Part 58, Appendix D. Such review will identify needed modifications to the network, such as
termination or relocation of unnecessary stations or establishment of new stations where
necessary.

The QA Annual Report will include quality information for each ambient air pollutant in the
Palookaville monitoring network.  These sections are organized by ambient air pollutant category
(e.g., gaseous criteria pollutants, PM2 5).  Each section includes the following topics:

    >   program overview and update
    >   quality objectives for measurement data
    >   data quality assessment

For reporting PM2 5 measurement uncertainties, the QA Annual Report contains the following
summary information required by 40 CFR 58  Appendix A (Section 3.5, revised July 18, 1997):

    >   Flow Rate Audits (Section 3.5.1) (an illustrative calculation follows this list)
    >   Collocated Federal Reference Method Samplers (Section 3.5.2)
    >   Collocated Equivalent Samplers of the same designation (Section 3.5.2)
    >   Assessment of Bias Using the FRM Audit Procedure (Section 3.5.3)
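
As an illustrative sketch of the flow rate audit summary noted above (the flow values are
placeholders, and the reporting statistics actually used are those specified in 40 CFR Part 58
Appendix A, Section 3.5.1):

    # Illustrative sketch: percent difference of the sampler's indicated flow
    # relative to the flow measured by the audit transfer standard.
    def flow_audit_percent_difference(indicated_lpm: float, audit_lpm: float) -> float:
        """Percent difference of indicated flow relative to the audit standard."""
        return 100.0 * (indicated_lpm - audit_lpm) / audit_lpm

    print(f"{flow_audit_percent_difference(16.5, 16.67):+.1f}%")   # hypothetical values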

21.1.2 Network Reviews

The EPA Regional office prepares annual network reviews  in accord with requirements in 40 CFR
Part 58.20(d).  The purpose of the annual network reviews  is to determine if the system meets the
monitoring objectives defined in 40 CFR Part  58 Appendix D.  The review identifies needed
modifications to the network, including termination or relocation of unnecessary stations or
establishment of new stations where necessary.  Information gathering for these reviews will
be coordinated through the Air Division Director.  Supervisors and other personnel will assist as
necessary to provide information and support.  The Director of the Palookaville Department of
Health is responsible for assuring that such changes are included in future planning. The Director
of the Air Division and the Air Branch QA Manager are jointly responsible for implementing other

-------
                                                                         Project: Model QAPP
                                                                              Element No: 21
                                                                               Revision No:l
                                                                                Date:4/17/98
                                                                         	Page 3 of6
 review findings impacting data quality.
 As required by 40 CFR Part 58 Appendix A, Section 4(a), revised July 18, 1997, the Palookaville
 Air Division Director has provided a list of all monitoring sites and their AIRS site identification
 codes and submits the list to the EPA Regional Office, with a copy to AIRS-AQS. The
 Aerometric Information Retrieval System (AIRS)-Air Quality Subsystem (AQS) is EPA's
 computerized system for storing and reporting of information relating to ambient air quality data.
 Whenever there is a change in this list of monitoring sites in a reporting organization, the Palookaville
 Air Division Director will report this change to the EPA Regional Office and to AIRS-AQS.

 21.1.3 Quarterly Reports

 Each quarter, the Palookaville Department of Health, Air Division will report to AIRS-AQS the
 results of all  precision, bias and accuracy tests it has carried out during the quarter.  The quarterly
 reports will be submitted consistent with the data reporting requirements specified for air quality
 data as set forth in 40 CFR Parts 58.26, 58.35 and 40 CFR Part 58 Appendix A, Section 4.

 The data reporting requirements of 40 CFR Part 58.35 apply to those stations designated SLAMS
 or NAMS. Required accuracy and precision data are to be reported on the same schedule as
 quarterly monitoring data submittals. The  required reporting periods and due dates are listed in
 Table 21-1.

 Table 21-1 Quarterly Reporting Schedule

 Reporting Period              Due on or Before
 January 1 - March 31          June 30
 April 1 - June 30             September 30
 July 1 - September 30         December 31
 October 1 - December 31       March 31 (following year)
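
As an illustrative sketch, the Table 21-1 due dates can be expressed as a simple lookup; the
function below is hypothetical and not part of any Department reporting system.

    # Illustrative sketch: map a reporting quarter to its Table 21-1 due date.
    from datetime import date

    def quarterly_due_date(year: int, quarter: int) -> date:
        """Return the due date for the given reporting quarter (1-4)."""
        due = {
            1: date(year, 6, 30),       # January 1 - March 31
            2: date(year, 9, 30),       # April 1 - June 30
            3: date(year, 12, 31),      # July 1 - September 30
            4: date(year + 1, 3, 31),   # October 1 - December 31, due the following year
        }
        return due[quarter]

    print(quarterly_due_date(1998, 4))  # 1999-03-31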
In accord with the Federal Register Notice of July 18, 1997, all QA/QC data collected will be
reported and will be flagged appropriately. These data include: "results from invalid tests, from
tests carried out during a time period for which ambient data immediately prior or subsequent to
the tests were invalidated for appropriate reasons, and from tests of methods or analyzers not
approved for use in SLAMS monitoring networks ..." (40 CFR Part 58 Appendix A, Section 4,
revised July 18, 1997).

Air quality data submitted for each reporting period will be edited, validated, and entered into the
AIRS-AQS using the procedures described in the AIRS Users Guide, Volume II, Air Quality Data
Coding.  The Palookaville Air Monitoring Branch Information Manager will be responsible for
preparing the data reports, which will be reviewed by the QAO and Air Branch Manager before
they  are transmitted to EPA.

-------
                                                                         Project: Model QAPP
                                                                             Element No: 21
                                                                              Revision No:l
                                                                               Date:4/17/98
                                                                        	Page 4 of6
21.1.4 Technical System Audit Reports

The Palookaville Department of Health performs Technical System Audits of the monitoring
system (Section 20).  These reports are issued by the QA Division Director and are reviewed by
the Air Division Director and the Director of the Department of Health.  These reports will be
filed (see Table 9-1) and made available to EPA personnel during their technical systems audits.

External systems audits are conducted at least every three years by the EPA Regional Office as
required by 40 CFR Part 58, Appendix A, Section 2.5. Further instructions are available from
either the EPA Regional QA Coordinator or the Systems Audit QA Coordinator, Office of Air
Quality Planning and Standards, Emissions Monitoring and Analysis  Division (MD-14), U.S.
Environmental Protection Agency, Research Triangle Park, NC 27711.

21.1.5 Response/Corrective Action Reports

The Response/Corrective Action Report procedure will be followed whenever a problem is found
such as a safety defect, an operational problem, or a failure to comply with procedures. A
separate form (see fig 20.2) will be used for each problem identified.  The Response/Corrective
Action Report is one of the most important ongoing reports to management because it documents
primary QA activities and provides valuable records of QA activities  that can be used in preparing
other summary reports.

The Response/Corrective Action Report procedure is designed as a closed-loop system. The
Response/Corrective Action Report form identifies the originator who reported the
problem, states the problem, and may suggest a solution. The form also indicates the name of the
person or persons assigned to correct the problem. The assignment of personnel to
address the problem and the schedule for completion will be filled in  by the appropriate
supervisor. The Response/Corrective Action Report procedure closes the loop by requiring that
the recipient state on the form how the problem was resolved and the effectiveness of the
solution.  Copies of the Response/Corrective Action Report will be distributed twice: first when
the problem has been identified and the action has been scheduled; and second when the
correction has been completed. The originator, the field or laboratory branch manager, and the
QA Division Director will be included in both distributions.

21.1.6 Control Charts with Summary

Control charts for laboratory instruments  are updated after every new calibration or
standardization as defined in the relevant SOP. Analysts are responsible for reviewing each
control chart immediately after it is  updated and for taking corrective actions whenever an out-of-
control condition is observed.  Control charts are to be reviewed at least quarterly by the
laboratory supervisor.  The supervisors will provide summary information to the QA Division
Director for the Annual QA Report  to Management. Control charts are also subject to inspection

-------
                                                                            Project: Model QAPP
                                                                                Element No: 21
                                                                                 Revision No: 1
                                                                                  Date:4/17/98
	Page 5 of6

during audits, and laboratory personnel are responsible for maintaining a readily-accessible file of
control charts for each instrument.
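
The following is a minimal illustrative sketch of such an out-of-control check under
simplifying assumptions (a mean plus or minus three standard deviations rule); the actual
control limits are defined in the relevant SOP, and the values shown are placeholders.

    # Illustrative sketch: flag a new calibration check that falls outside the
    # mean +/- 3 standard deviation limits of recent historical checks.
    import statistics

    historical_checks = [100.2, 99.8, 100.1, 99.9, 100.3, 100.0, 99.7]  # hypothetical
    new_check = 101.5                                                    # hypothetical

    center = statistics.mean(historical_checks)
    sigma = statistics.stdev(historical_checks)
    lower, upper = center - 3 * sigma, center + 3 * sigma

    if lower <= new_check <= upper:
        print(f"In control: {new_check} within [{lower:.2f}, {upper:.2f}]")
    else:
        print(f"OUT OF CONTROL: {new_check} outside [{lower:.2f}, {upper:.2f}]"
              " -- initiate corrective action")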
 21.2  Responsible Organizations
     It is important that the QAPP identify the personnel responsible for preparing the reports, evaluating their
 impact, and implementing follow-up actions. It is necessary to understand how any changes made in one area or
 procedure may affect another part of the project. Furthermore, the documentation for all changes should be
 maintained and included in the reports to management. At the end of a project, a report documenting the Data
 Quality Assessment findings to management should be prepared.
This section outlines the responsibilities of individuals within the monitoring organization for
preparing quality reports, evaluating their impact, and implementing follow-up actions.  Changes
made in one area or procedure may affect another part of the project.  Only by defining clear-cut
lines of communication and responsibility can all the affected elements of the monitoring network
remain current with such changes. The documentation for all changes will be maintained and
included in the reports to management.  The following paragraphs describe key personnel
involved with QA reporting.

Director of the Palookaville Department of Health - The ultimate responsibility for the quality
of the data and the technical operation of the fine particle monitoring network rests with the
Director of the Palookaville Department of Health.  The Director's responsibilities with respect to
air quality reporting are delegated to the Director of the Air Division.  These responsibilities
include defining and implementing the document management and quality assurance systems for
the PM2 5  monitoring network.

Air Division Director - The Air Division Director is responsible for operation of the air quality
network.  The Air Division Director is specifically responsible for assuring the timely submittal of
quarterly and annual data summary reports.  The Air Director works closely with the Air Branch
QA Manager in implementation of QA procedures,  arranging for audits, and reporting QA data.

QA Division Director - The QA  Division Director  is responsible  for establishing QA policies and
systems employed by the Palookaville Health Department.  The QA Division Director reviews the
QA Annual Report and provides general supervision to the QA Directors of the various  Branches.

QA Manager and QA Officer -  The QA Manager  is responsible for management and
administrative aspects of the Air QA program including coordinating audits and preparing
required reports.  The QA Officer is appointed by the Air Branch QA Manager to be responsible
for day-to-day conduct of QA activities for the Ambient Air Monitoring Program.  The  QA
Officer's responsibilities for QA reports to management include the following:

-------
                                                                         Project: Model QAPP
                                                                              Element No: 21
                                                                               Revision No:l
                                                                               Date:4/17/98
	Page 6 of6

     >   assisting the Air Branch Manager with data quality assessments and other internal audits
     >   calculating and/or reviewing precision and bias data generated by the collocated PM2 5
         monitors
     >   reviewing control charts and other laboratory QC materials
     >   monitoring Response/Corrective Action Reports

Information Manager - The Information Manager is responsible for coordinating the
information management activities for SLAMS/NAMS data. Specific responsibilities related to
management reports include:

     >   ensuring access to data for timely reporting and interpretation
     >   ensuring timely delivery of all required data to the AIRS system

Air Branch Manager - The Air Branch Manager is responsible for identifying problems and
issuing appropriate Response/Corrective Action Reports. He is also responsible for assigning
Response/Corrective Action Reports to specific personnel and assuring that the work is
completed and that the corrections are effective.  The Branch Manager is also responsible for
assuring that technicians and site operators under their supervision maintain their documentation
files as defined in the network design. Supervisors are responsible for disseminating information
appearing in audit reports and other quality-related documents to operations personnel.

Laboratory Branch Manager - The Laboratory Branch Manager is responsible for identifying
problems and issuing appropriate Response/Corrective Action Reports related to laboratory
activities. He is  also responsible for reviewing laboratory QC data such as control charts and for
assuring that repairs and preventive maintenance are completed and that the maintenance is
effective. The Branch Manager is also responsible for assuring that analysts under their
supervision maintain their documentation files as defined in the relevant SOPs. The Laboratory
Branch Manager will assist the QA Officer in preparing QA reports and summaries and is
responsible for disseminating information appearing in audit reports and other quality-related
documents to operations personnel.

Field and Laboratory Technicians - Individual technicians and analysts are not normally
responsible for authoring reports to management. However, they participate in the process by
generating control charts, identifying the need for new Response/Corrective Action Reports, and
maintaining other quality-related information used to prepare QA reports.

-------
                                                                                 Project: Model QAPP
                                                                                      Element No: 22
                                                                                       Revision No: 1
                                                                                        Date: 4/17/98
 	Page 1 of 8


       22.0 Data Review, Validation and Verification Requirements

     The purpose of this element is to state the criteria for deciding the degree to which each data item has met its
  quality specifications. Investigators should estimate the potential effect that each deviation from a QAPP may have
  on the usability of the associated data item, its contribution to the quality of the reduced and analyzed data, and its
  effect on the decision.

     The process of data verification requires confirmation by examination or provision of objective evidence that
  the requirements of these specified QC acceptance criteria are met. In design and development, verification
  concerns the process of examining the result of a given activity to determine conformance to the stated requirements
  for that activity. For example, have the data been collected according to a specified method and have the collected
  data been faithfully recorded and transmitted? Do the data fulfill specified data format and metadata requirements?
  The process of data verification effectively ensures the accuracy of data using validated methods and protocols and is
  often based on comparison with reference standards.

     The process of data validation requires confirmation by examination and provision of objective evidence that
  the particular requirements for a specific intended use have been fulfilled. In design and development, validation
  concerns the process of examining a product or result to determine conformance to user needs.  For example, have
  the data and assessment methodology passed a peer review to evaluate the adequacy of their accuracy and precision
  in assessing progress towards meeting the specific commitment articulated in the objective or subobjective. The
  method validation process effectively develops the QC acceptance criteria or specific performance criteria.

     Each of the following areas of discussion should be included in the QAPP elements.  The discussion applies to
  situations in which a sample is separated from its native environment and transported to a laboratory for analysis and
  data generation. However, these principles can be adapted to other situations (for example, in-situ analysis or
  laboratory research).
This section will describe how the Palookaville Department of Health will verify and validate the
data collection operations associated with the PM2 5 ambient air monitoring network. Verification
can be defined as confirmation by examination and provision of objective evidence that specified
requirements have been fulfilled. Validation can be defined as confirmation by examination and
provision of objective evidence that the particular requirements for a specific intended use are
fulfilled.  Although there are a number of objectives of ambient air data,  the major objective for
the Palookaville PM2 5 network is for comparison to the NAAQS standard and therefore, this will
be identified as the intended use.  This section will describe the verification and validation
activities that occur at a number of the important data collection phases.  Earlier elements of this
QAPP describe in detail how the activities in these data collection phases will  be implemented to
meet the  data quality objectives of the program. Review and approval of this QAPP by the
Department and EPA provide initial agreement that the processes described in the QAPP,  if
implemented, will provide data of adequate quality.  In order to verify and validate the phases of
the data collection operation, the Department will use various qualitative assessments (e.g.,
technical systems audits, network reviews) to verify that the QAPP is being  followed, and will rely
on the various quality control samples, inserted at various phases of the data collection operation,
to validate that the data will meet the DQOs described in Section 7.

-------
                                                                             Project: Model QAPP
                                                                                  Element No: 22
                                                                                   Revision No:l
                                                                                    Date: 4/17/98
                                                                            	Page 2 of 8
22.1 Sampling Design
     How closely a measurement represents the actual environment at a given time and location is a complex issue
 that is considered during development of element Bl. See Guidance on Sampling Designs to Support QAPPs ( EPA
 QA/G-5S).  Acceptable tolerances for each critical sample coordinate and the action to be taken if the tolerances are
 exceeded should be specified in element B1.

     Each sample should be checked for conformity to the specifications, including type and location (spatial and
 temporal). By noting the deviations in sufficient detail, subsequent data users will be able to determine the data's
 usability under scenarios different from those included in project planning. The strength of conclusions that can be
 drawn from data (see Guidance Document for Data Quality Assessment, EPA QA/G-9) has a direct connection to
 the sampling design and deviations from that design. Where auxiliary variables are included in the overall data
 collection effort (for example, microbiological nutrient characteristics or process conditions), they should be
 included in this evaluation.
Section 10 describes the sampling design for the network established by Palookaville. It covers
the number of sites required, their location, and the frequency of data collection. The objective of
the sampling design is to represent the population of interest at adequate levels of spatial and
temporal resolution.  Most of these requirements have been described in the Code of Federal
Regulations. However, it is the responsibility of Palookaville to ensure that the intent of the
regulations is properly administered and carried out.

22.1.1 Sampling Design Verification

Verification of the sampling design  will occur through three processes:

Network Design Plan Confirmation - The Network Design Plan that discusses the initial
deployment of the network must be submitted, reviewed and approved by EPA prior to
implementation. This process verifies the initial sampling design.

Internal Network Reviews - Once a year, the Air Division will perform a network review to
determine whether the network objectives, as described in the Network Design Plan, are still
being met and whether the sites are meeting the CFR siting criteria (see Section 20).

External Network Reviews - Every three years the EPA Regional Office will conduct a network
review to determine whether the network objectives, as described in the Network Design Plan, are
still being met and whether the sites are meeting the CFR siting criteria.

22.1.2 Sampling Design Validation

The ambient air data derived from the sites will be used to validate the sampling design. Through
the initial stages of implementation, the Department will use saturation monitors as well as special
purpose monitors to validate that the monitors are properly sited and that the sampling design will
meet the objectives of the network. This information will be included in network review
documentation and appropriately communicated to the EPA Regional Office. In addition, the
processes described in Section 10 will be used to confirm the network design.

22.2 Sample Collection Procedures
    Details of how a sample is separated from its native time/space location are important for properly interpreting the
 measurement results. Element B2 provides these details, which include sampling and ancillary equipment and
 procedures (including equipment decontamination). Acceptable departures (for example, alternate equipment) from
 the QAPP, and the action to be taken if the requirements cannot be satisfied, should be specified for each critical
 aspect. Validation activities should note potentially unacceptable departures from the QAPP. Comments from field
 surveillance on deviations from written sampling plans also should be noted.


 22.2.1 Sample Collection Verification

Sample collection procedures are described in detail in Section 11 and are developed to ensure
proper sampling and to maintain sample integrity. The following processes will be used to verify
the sample collection activities:

Internal Technical Systems Audits - will be required every three years as described in Section 20

External Technical Systems Audits - will be conducted by the EPA Region Y Office every
three years

 Both types of technical systems audits will be used to verify that the sample collection activity is
 being performed as described in this QAPP and the SOPs. Deviations from the sample collection
 activity will be noted in audit finding forms and corrected using the procedures described in
 Section 20.

 22.2.2 Sample Collection Validation

 The sample collection activity is just one phase of the measurement process.   The use of QC
 samples that have been placed throughout the measurement process can help validate the activities
 occurring at each phase. The review of QC data such as the collocated sampling data, field blanks,
the FRM performance evaluation, and the sampling equipment verification checks  that are
described in section 14 and 16 can be used to validate the data collection activities. Any data that
 indicates unacceptable levels of bias or precision or a tendency  (trend on a control chart) will be
flagged and investigated.  This  investigation could lead to a discovery of inappropriate sampling
activities.

22.3 Sample Handling
    Details of how a sample is physically treated and handled during relocation from its original site to the actual
measurement site are extremely important.  Correct interpretation of the subsequent measurement results requires
that deviations from element B3 of the QAPP and the actions taken to minimize or control the changes, be detailed.
Data collection activities should indicate events that occur during sample handling that may affect the integrity of the
samples.

    At a minimum, investigators should evaluate the sample containers and the preservation methods used and
ensure that they are appropriate to the nature of the sample and the type of data generated from the sample. Checks
on the identity of the sample (e.g., proper labeling and chain-of-custody records) as well as proper physical/chemical
storage conditions (e.g., chain-of-custody and storage records) should be made to ensure that the sample continues to
be representative of its native environment as it moves through the analytical process.
Sections 11, 12, and 17 detail the requirements for sample handling, including the types of
sample containers and the preservation methods used to ensure that they are appropriate to the
nature of the sample and the type of data generated from the sample. Due to the size of the filters
and the nature of the collected particles, sample handling is one of the phases where inappropriate
techniques can have a significant effect on sample integrity and data quality.

22.3.1 Verification of Sample Handling

As mentioned in the above section, both internal and external technical systems audits will be
performed to  ensure the specifications mentioned in the QAPP are being followed. The audits
would include checks on the identity of the sample (e.g., proper labeling and chain-of-custody
records), packaging in the field, and proper storage conditions (e.g., chain-of-custody and storage
records) to ensure that the sample continues to be representative of its native environment as it
moves through the data collection operation.

22.3.2 Validation of Sample Handling

Similar to the validation of sampling activities, the review of data from collocated sampling, field
blanks, and the FRM performance evaluations, which are described in Sections 14 and 16, can be
used to validate the sample handling activities. Acceptable precision and bias in these samples
would lead one to believe that the sample handling activities are adequate. Any data that indicates
unacceptable levels of bias or precision or a tendency (trend on a control chart) will be flagged
and investigated. This investigation could lead to a discovery of inappropriate sample handling
activities that require corrective action.

 22.4 Analytical Procedures
      Each sample should be verified to ensure that the procedures used to generate the data (as identified in element
 B4 of the QAPP) were implemented as specified. Acceptance criteria should be developed for important components
 of the procedures, along with suitable codes for characterizing each sample's deviation from the procedure. Data
 validation activities should determine how seriously a sample deviated beyond the acceptable limit so that the
 potential effects of the deviation can be evaluated during DQA.
Section 13 details the requirements for the analytical methods, which include the pre-sampling
weighing activity, which gives each sample a unique identification and an initial weight and
prepares the sample for the field, and the post-sampling weighing activity, which provides the net
mass weight and the final concentration calculations. The methods include acceptance criteria
(Sections 13 and 14) for important components of the procedures, along with suitable codes for
characterizing each sample's deviation from the procedure.

22.4.1 Verification of Analytical Procedures

As mentioned in the above sections, both internal and external technical systems audits will be
performed to ensure the analytical method specifications mentioned in the QAPP are being
followed.  The audits will include checks on the identity of the sample.  Deviations from the
analytical procedures will be noted in audit finding forms and corrected using the procedures
described in Section 20.

22.4.2 Validation of Analytical Procedures

Similar to the validation of sampling activities, the review of data from  lab blanks, calibration
checks, laboratory duplicates and other laboratory QC  that are described in sections 14 and 16
can be used to validate the analytical procedures.  Acceptable precision and bias in these samples
would lead one to believe that the analytical procedures are adequate. Any data that indicates
unacceptable levels of bias or precision or a tendency (trend on a control chart) will be flagged
and investigated as described in Section 14.  This investigation could lead to a discovery of
inappropriate analytical procedures, requiring corrective action.

22.5 Quality Control
    Element B5 of the QAPP specifies the QC checks that are to be performed during sample collection, handling,
and analysis. These include analyses of check standards, blanks, spikes, and replicates, which provide indications of
the quality of data being produced by specified components of the measurement process.  For each specified QC
check, the procedure, acceptance criteria, and corrective action (and changes) should be specified. Data validation
should document the corrective actions that were taken, which samples were affected, and the potential effect of the
actions on the validity  of the data.


Sections 14 and 16 of this QAPP specify the QC checks that are to be performed during sample
collection, handling, and analysis. These include analyses of check standards, blanks, spikes, and
replicates, which provide indications of the quality of data being produced by specified
components of the measurement process. For each specified QC check, the procedure,
acceptance criteria, and corrective action are specified.

22.5.1 Verification of Quality Control Procedures

As mentioned in the above sections, both internal and external technical systems audits  will be
performed to ensure the quality control method specifications mentioned in the QAPP are being
followed.

22.5.2 Validation of Quality Control Procedures

Validation activities of many of the other data collection phases mentioned in this subsection use
the quality control data to validate the proper and adequate implementation of that phase.
Therefore, validation of QC procedures will require a review of the documentation of the
corrective actions that were taken when QC samples  failed to meet the acceptance criteria, and
the potential effect of the corrective actions on the validity of the routine data. Section  14
describes the techniques used to document QC review/corrective action activities.

22.6 Calibration
      Element B7 addresses the calibration of instruments and equipment and the information that should be
  presented to ensure that the calibrations:

      •   were performed within an acceptable time prior to generation of measurement data;
      •   were performed in the proper sequence;
      •   included the proper number of calibration points;
      •   were performed using standards that "bracketed" the range of reported measurement results (otherwise,
         results falling outside the calibration range should be flagged as such); and
      •   had acceptable linearity checks and other checks to ensure that the measurement system was stable when
         the calibration was performed.

  When calibration problems are identified, any data produced between the suspect calibration event and any
  subsequent recalibration should be flagged to alert data users.
Section 16, as well as the field (Section 11) and analytical (Section 13) sections, details the
calibration activities and requirements for the critical pieces of equipment in the PM2.5 network.

 22.6.1 Verification of Calibration Procedures
 As mentioned in the above sections, both internal and external technical systems audits will be
 performed to ensure the calibration specifications and corrective actions mentioned in the QAPP
 are being followed.  Deviations from the calibration procedures will be noted in audit finding
 forms and corrected using the procedures described in Section 20.

 22.6.2 Validation of Calibration Procedures

Similar to the validation of sampling activities, the review of calibration data that are described in
Sections 14 and 16 can be used to validate calibration procedures. Calibration data within the
acceptance requirements would lead one to believe that the sample collection measurement
devices are operating properly. Any data that indicates unacceptable levels of bias or precision or
a tendency (trend on a control chart) will be flagged and investigated as described in Section 14 or
16. This investigation could lead to a discovery of inappropriate calibration procedures or
equipment problems requiring corrective action as detailed in those sections. Validation would
include the review of the documentation to ensure corrective action was taken as prescribed in the
QAPP.

 22.7 Data Reduction and  Processing


      Checks on data integrity evaluate the accuracy of "raw" data and include the comparison of important events and
  the duplicate rekeying of data to identify data entry errors.

      Data reduction is an irreversible process that involves a loss of detail in the data and may involve averaging
  across time (for example, hourly or daily averages) or space (for example, compositing results from samples thought
  to be physically equivalent). Since this summarizing process produces few values to represent a group of many data
  points, its validity should be well-documented in the QAPP. Potential data anomalies can be investigated by simple
  statistical analyses (see Guidance for Data Quality Assessment, EPA QA/G-9).

      The information generation step involves the synthesis of the results of previous operations and the construction
  of tables and charts suitable for use in reports. How information generation is checked, the requirements for the
  outcome, and how deviations from the requirements will be treated, should be addressed in this element.
22.7.1 Verification of Data Reduction and Processing Procedures

As mentioned in the above sections, both internal and external technical systems audits will be
performed to ensure the data reduction and processing activities mentioned in the QAPP are being
followed.

22.7.2 Validation of Data Reduction and Processing Procedures
As part of the audits of data quality, discussed in Section 20, a number of sample IDs, chosen at
random, will be identified. All raw data files associated with those samples, including the
following, will be selected:

    •   Pre-sampling weighing activity
    •   Pre-sampling
    •   Sampling (sampler download information)
    •   Calibration - the calibration information for that sampling period
    •   Sample handling/custody
    •   Post-sampling weighing
    •   Corrective action
    •   Data reduction

This raw data will be reviewed and final concentrations will be calculated by hand to determine
whether the final values submitted to AIRS agree with the hand calculations. The data will also be
reviewed to ensure that associated flags or any other data qualifiers have been appropriately
associated with the data and that appropriate corrective actions were taken.
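The following is a minimal sketch, in Python, of the kind of independent recalculation described
above. The record layout, field names, and agreement tolerance are hypothetical illustrations, not
the Department's actual DAS export format; the intent is only to show the hand-calculation check of
mass concentration (net filter mass divided by sampled air volume) against the value submitted to
AIRS.

    # Illustrative sketch only: recompute a PM2.5 mass concentration from raw batch
    # records and compare it with the value reported to AIRS. Field names, file
    # layout, and the agreement tolerance are hypothetical.
    import csv

    def recompute_concentration(tare_mass_mg, gross_mass_mg, avg_flow_lpm, sample_minutes):
        """Return PM2.5 concentration in ug/m3 from filter masses and sampled volume."""
        net_mass_ug = (gross_mass_mg - tare_mass_mg) * 1000.0   # mg -> ug
        volume_m3 = avg_flow_lpm * sample_minutes / 1000.0      # L -> m3
        return net_mass_ug / volume_m3

    def audit_record(rec, tolerance_ugm3=0.1):
        """Compare the hand calculation with the concentration submitted to AIRS."""
        calc = recompute_concentration(float(rec["tare_mass_mg"]),
                                       float(rec["gross_mass_mg"]),
                                       float(rec["avg_flow_lpm"]),
                                       float(rec["sample_minutes"]))
        reported = float(rec["airs_conc_ugm3"])
        return {"sample_id": rec["sample_id"],
                "recalculated_ugm3": round(calc, 1),
                "reported_ugm3": reported,
                "agrees": abs(calc - reported) <= tolerance_ugm3}

    if __name__ == "__main__":
        with open("audit_sample_ids.csv", newline="") as f:     # hypothetical DAS export
            for rec in csv.DictReader(f):
                print(audit_record(rec))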

-------
                                                                              Project: Model QAPP
                                                                                   Element No: 23
                                                                                    Revision No: 1
                                                                                     Date: 4/17/98
                                                                             	Page 1 of4
                   23.0 Validation and Verification Methods
     The purpose of this element is to describe, in detail, the process for validating (determining if data satisfy
  QAPP-defined user requirements) and verifying (ensuring that conclusions can be correctly drawn) project data.
  The amount of data validated is directly related to the DQOs developed for the project. The percentage validated for
  the specific project together with its rationale should be outlined or referenced. Diagrams should be developed
  showing the various roles and responsibilities with respect to the flow of data as the project progresses. The QAPP
  should have a clear definition of what is implied by "verification" and "validation."
                                      READERS NOTE

   The material in this section is an example. At the time of the development of this document a
   PM2 5 QA Workgroup was working on devising a consistent method for validating PM2 5 data
   based on the review of various QC information. The following material displays the concepts
   of the ongoing discussions but not the consensus validation criteria.
Many of the processes for verifying and validating the measurement phases of the PM2.5 data
collection operation have been discussed in Section 22. If these processes, as written in the
QAPP, are followed, and the sites are representative of the boundary conditions for which they
were selected, one would expect to achieve the PM2.5 DQOs. However, exceptional field events
may occur, and field and laboratory activities may negatively affect the integrity of samples. In
addition, it is expected that some of the QC checks will fail to meet the acceptance criteria.
Information on problems that affect the integrity of data is identified in the form of flags
(Appendix D). It is important to determine how these failures affect the routine data. The routine
data and their associated QC data will be reviewed, verified, and validated on a sample batch
basis. Section 14.2 discusses the concept and use of sample batching. The sample batch is the
most efficient entity for verification/validation activities. It is assumed that if measurement
uncertainty can be controlled within acceptance criteria at the batch level, then the overall
measurement uncertainty will be maintained within the precision and bias DQOs.

 23.1 Describe the Process for Validating  and Verifying Data

    Each sample should be verified to ensure that the procedures used to generate the data (as identified in element
B4 of the QAPP) were implemented as specified. Acceptance criteria should be developed for important components
of the procedures, along with suitable codes for characterizing each sample's deviation from the procedure. Data
validation activities should determine how seriously a sample deviated beyond the acceptable limit so that the potential
effects of the deviation can be evaluated during DQA.

 23.1.1 Verification of Sample Batches
After a sample batch is completed, a thorough review of the data will be conducted for
completeness and data entry accuracy. All raw data that is hand entered on data sheets will be
double-keyed into the DAS, as discussed in Section 19. The entries are compared to reduce the
possibility of entry and transcription errors. Once the data is entered into the DAS, the system
will review the data for routine data outliers and data outside of acceptance criteria. These data
will be flagged appropriately. All flagged data will be re-verified to confirm that the values were
entered correctly. Details of these activities are discussed in Section 19. The data qualifiers or
flags can be found in Appendix D.

23.1.2 Validation

Validation of measurement data will require two stages, one at the measurement value level, and
the second at the batch level. Records of all invalid samples will be filed. Information will include
a brief summary of why the sample was invalidated along with the associated flags. This record
will be available on the DAS since all filters that were pre-weighed will be recorded.  At least one
flag will be associated with an invalid sample, that being the "INV" flag signifying invalid, or the
"NAR" flag when no analysis result is reported.  Additional flags will usually be associated with
the NAR or INV flags that help describe the reason for these flags, as well  as free form notes
from the field operator or laboratory technician.

If the number of samples being invalidated is relatively small, the Department will report them on
a monthly basis to Region Y. If, however, more than 5 values in sequential order from one site
appear to require invalidation, Region Y will be notified and the issue described.

Validation of Measurement Values —

Certain criteria, based upon the CFR and on field operator and laboratory technician judgement,
have been developed that will be used to invalidate a sample or measurement. The flags listed in
Appendix D will be used to determine whether individual samples, or samples from a particular
instrument, will be invalidated. In all cases the sample will be returned to the laboratory for further
examination. When the laboratory technician reviews the field sheet and chain-of-custody forms,
he/she will look for flag values. Filters that have flags related to obvious contamination (CON),
filter damage (DAM), or field accidents (FAC) will be immediately examined. Upon concurrence
of the laboratory technician and laboratory branch manager, these samples will be invalidated. The
flag "NAR" for no analysis result will be placed in the flag area associated with this sample, along
with the other associated flags.

Other flags listed in Appendix D may be used alone or in combination to invalidate samples.
Since the possible flag combinations are numerous and cannot all be anticipated, the
Department will review these flags and determine whether single values, or values from a site for a
particular time period, will be invalidated. The Department will keep a record of the combinations
of flags that resulted in invalidating a sample or set of samples. These combinations will be
reported to Region Y and will be used to ensure that the Department evaluates and invalidates
data consistently from one batch to the next. It is anticipated that these combinations can be
programmed into the DAS in order to assist the laboratory in evaluating data. As mentioned
above, all data invalidation will be documented. Tables 23-1 and 23-2 contain criteria that can be
used to invalidate single samples based on a single flag (Table 23-1) or a combination of flags
(Table 23-2).

Table 23-1 Single Flag Invalidation Criteria for Single Samples

 Requirement           Flag   Comment
 Contamination         CON    Concurrence with lab technician and branch manager
 Filter Damage         DAM    Concurrence with lab technician and branch manager
 Event                 EVT    Exceptional, known field event expected to have affected sample
 Laboratory Accident   LAC    Concurrence with lab technician and branch manager
 Field Accident        FAC    Concurrence with lab technician and branch manager
 Flow Rate Cutoff      FVL    Concurrence with lab technician and branch manager;
                              termination of sample collection due to flow rate > 10% of
                              design flow rate for 60 seconds

Table 23-2 Single Sample Validation Template

 Requirement              Acceptance criteria                 Major(1)       Minor(2)      Flag
 Flow Rate                < ±5% of 16.67 L/min for < 5 min    > 10%          > 5%          FLR
 Flow Rate Verification   < 4% of transfer standard           > 6%           > 4%          FLV
 Filter Temp              > 5°C for < 30 min                  > 10°C         > 5°C         FLT
 Elapsed Sample Time      > 1380* or < 1500 minutes           > 1530         > 1500        EST
 Holding Times                                                                             HTE
   Pre-sampling           < 30 days                           > 32 days      > 30 days
   Sample Recovery        < 96 hours                          > 100 hours    > 96 hours
   Post-sampling (25°C)   < 10 days                           > 12 days      > 10 days
   Post-sampling (4°C)    < 30 days                           > 32 days      > 30 days

 * - sample will still be used, with the sample period calculated with a time of 1440 minutes and flagged
 1 - if 2 majors occur, data are invalidated
 2 - if 4 minors occur, data are invalidated; 2 minors equal 1 major
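As an illustration of how the single-sample criteria above might be screened automatically, the
sketch below encodes the single-flag rule of Table 23-1 and the major/minor counting rule of
Table 23-2. The record keys and helper names are hypothetical, the thresholds are transcribed from
the tables, and only a subset of the checks is shown; it is a sketch, not the Department's DAS logic.

    # Illustrative sketch of the single-sample checks in Tables 23-1 and 23-2.
    # Record keys are hypothetical; thresholds mirror the tables above.

    SINGLE_FLAG_INVALIDATION = {"CON", "DAM", "EVT", "LAC", "FAC", "FVL"}

    def count_violations(sample):
        """Return (majors, minors) for one 24-hour sample record (Table 23-2)."""
        majors = minors = 0

        def grade(value, minor_limit, major_limit):
            nonlocal majors, minors
            if value > major_limit:
                majors += 1
            elif value > minor_limit:
                minors += 1

        grade(abs(sample["flow_rate_pct_dev"]), 5.0, 10.0)      # % deviation from 16.67 L/min
        grade(abs(sample["flow_verification_pct"]), 4.0, 6.0)   # % vs. transfer standard
        grade(sample["filter_temp_excess_c"], 5.0, 10.0)        # deg C above ambient
        grade(sample["elapsed_minutes"], 1500, 1530)            # elapsed sample time
        grade(sample["presampling_hold_days"], 30, 32)          # holding times (HTE)
        grade(sample["recovery_hold_hours"], 96, 100)
        grade(sample["postsampling_hold_days"], 10, 12)         # 25 deg C storage case
        return majors, minors

    def sample_valid(sample):
        """Apply the single-flag criteria, then the 2-major / 4-minor rule."""
        if SINGLE_FLAG_INVALIDATION & set(sample.get("flags", [])):
            return False                       # Table 23-1: invalid upon concurrence
        majors, minors = count_violations(sample)
        majors += minors // 2                  # 2 minors equal 1 major
        return majors < 2                      # 2 majors invalidate the sample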



Validation of Sample Batches --
Due to the nature and holding times of the routine samples, it is critical that the Department
minimize the amount of data that is invalidated. Therefore, the Department will validate data on
sample batches that are described in Section 14.2. Based on the types of QC samples that are
included in the batch and the field and laboratory conditions that are reported along with the batch
(field/lab flags), the Department has developed a validation template that will be used to
determine when routine data will be invalidated and when major corrective actions need to be
instituted. Table 23-3 represents the validation template.

Table 23-3 Validation Template

 Requirement          # per   Acceptance        Major(1)                   Minor(2)                  Flag
                      batch   Criteria
 Blanks
   Field Blanks         3     < ±30 µg          both blanks > ±30 µg       one blank > ±30 µg        FFB
   Lab Blanks           3     < ±15 µg          both blanks > ±15 µg       one blank > ±15 µg        FLB
 Precision Checks
   Collocated pairs     2     PD < 10%          both samples > 15%         one sample > 15%          PCS
   Duplicate weight     1     < ±15 µg          duplicate > ±20 µg         duplicate > ±15 µg        FLD
 Accuracy
   Balance Checks       7     < ±3 µg           4 checks > ±3 µg           2 checks > ±3 µg          FIS
 Lab Conditions
   Temperature          1     Mean 20-23°C,     Mean > 25° or < 18°,       Mean 23-25° or 18-20°,    ISP
                              < ±2°C            > ±4°                      > ±2° and < ±4°
   Humidity             1     30-40%,           Mean > 45% or < 20%,       Mean > 45% or < 20%,      ISP
                              < ±5%             > ±7%                      > ±5% and < ±7%

 1 - if 2 majors occur, data are invalidated
 2 - if 4 minors occur, data are invalidated; 2 minors equal 1 major

Based upon the number of major and minor flags associated with the batch, the batch may be
invalidated. The DAS will evaluate the batch and generate a report based upon the results
described in the validation template. If the report indicates that the batch should be invalidated,
the batch will be reanalyzed. Prior to reanalysis, all efforts will be made to take corrective actions,
depending on the type of QC checks that were outside of acceptance criteria, to correct the
problem. If the batch remains outside the criteria, the routine samples will be flagged invalid
(INV). Each month a summary report of all data that was invalidated will be submitted to Region
Y along with explanations.
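A minimal sketch of this batch-level evaluation is shown below, assuming the DAS can supply a
per-batch QC summary. The dictionary keys and helper names are hypothetical; the thresholds and
the 2-major/4-minor rule are taken from Table 23-3 and its footnotes, and only the blank,
collocated, duplicate, and balance-check rows are illustrated.

    # Illustrative sketch of the batch evaluation described by Table 23-3.
    # QC summary keys are hypothetical; thresholds and counting rules follow the table.

    def _grade(n_exceed, minor_at, major_at):
        """Translate a count of exceedances into a 'major'/'minor'/None grade."""
        if n_exceed >= major_at:
            return "major"
        if n_exceed >= minor_at:
            return "minor"
        return None

    def batch_grades(qc):
        """Grade a subset of the Table 23-3 requirements for one batch."""
        dup = abs(qc["duplicate_weight_ug"])
        return [
            _grade(sum(abs(b) > 30 for b in qc["field_blanks_ug"]), 1, 2),      # FFB
            _grade(sum(abs(b) > 15 for b in qc["lab_blanks_ug"]), 1, 2),        # FLB
            _grade(sum(pd > 15 for pd in qc["collocated_pd_pct"]), 1, 2),       # PCS
            "major" if dup > 20 else "minor" if dup > 15 else None,             # FLD
            _grade(sum(abs(c) > 3 for c in qc["balance_checks_ug"]), 2, 4),     # FIS
        ]

    def batch_disposition(qc):
        grades = [g for g in batch_grades(qc) if g]
        majors = grades.count("major") + grades.count("minor") // 2   # 2 minors = 1 major
        return "corrective action / reanalyze" if majors >= 2 else "accept"

    example = {"field_blanks_ug": [12, -8, 25], "lab_blanks_ug": [3, 6, -4],
               "collocated_pd_pct": [7.2, 9.8], "duplicate_weight_ug": 11,
               "balance_checks_ug": [1, -2, 2, 1, 0, -1, 2]}
    print(batch_disposition(example))   # -> accept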

-------
                                                                           Project: Model QAPP
                                                                               Element No: 24
                                                                                Revision No: 1
                                                                                 Date: 4/17/98
 	Page 1 of 11

               24.0 Reconciliation with Data Quality Objectives

 24.1 Reconciling Results with DQOs

    The DQA process has been developed for cases where formal DQOs have been established.  Guidance for Data
 Quality Assessment (EPA QA/G-9) focuses on evaluating data for fitness in decision- making and also provides many
 graphical and statistical tools.

    DQA is a key part of the assessment phase of the data life cycle, as shown in Figure 9. As the part of the
 assessment phase that follows data validation and verification, DQA determines how well the validated data can
 support their intended use. If an approach other than DQA has been selected, an outline of the proposed activities
 should be included
 The DQOs for the PM25 ambient air monitoring network were developed in Section 7. The
 resulting DQOs are for precision, as measured by a coefficient of variation, to be less than 10% and
 for relative bias to be between -10% and +10%. This section of the QAPP will outline the
 procedures that Palookaville will follow to determine whether the monitors and laboratory
 analyses are producing data that comply with the DQOs and what action will be taken as a result of
 the assessment process. Such an assessment is termed a Data Quality Assessment (DQA) and is
 thoroughly described in EPA QA/G-9: Guidance for Data Quality Assessment2.  An assessment of
 the quality of the data will be made at the site level as well as at the Palookaville  level.

 24.1.1 Five Steps of DQA Process

As described in EPA QA/G-9, the DQA process consists of five steps. The steps are detailed
below.

Step 1. Review DQOs and Sampling Design. Section 7 of this QAPP contains the details for the
development of the DQOs, including defining the primary objective of the PM2.5 ambient air
monitoring network (PM2.5 NAAQS comparison), translating the objective into a statistical
hypothesis (3-year average of annual mean PM2.5 concentrations less than or equal to 15 ug/m3
and 3-year average of annual 98th percentiles of the PM2.5 concentrations less than or equal to 65
ug/m3), and developing limits on the decision errors (incorrectly concluding the area is in
non-attainment when it truly is in attainment no more than 5% of the time, and incorrectly
concluding the area is in attainment when it truly is in non-attainment no more than 5% of the time).

 Section 10 of this QAPP contains the details  for the sampling design, including the rationale for
the design, the design assumptions, and the sampling locations and frequency. If any deviations
from the sampling design have occurred, these will  be indicated and their potential effect carefully
considered throughout the entire DQA.

Step 2.  Conduct Preliminary Data Review. A preliminary data review will be performed to
uncover potential limitations to using the data, to reveal outliers, and generally to explore the basic
structure of the data. The first step is to review the quality assurance reports. The second step is
to calculate basic summary statistics, generate graphical presentations of the data, and review these
summary statistics and graphs.

Review Quality Assurance Reports.  Palookaville will review all relevant quality assurance reports
that describe the data collection and reporting process. Particular attention will be directed to
looking for anomalies in recorded data, missing values, and any deviations from standard operating
procedures. This is a qualitative review. However, any concerns will be further investigated in the
next two steps.

Calculation of Summary Statistics and Generation of Graphical Presentations. Palookaville will
generate  some summary statistics for each of its primary and QA samplers. The summary statistics
will be calculated at the quarterly, annual, and three-year levels and will include only valid
samples.  The summary statistics are:

   Number of samples, mean concentration, median concentration, standard deviation, coefficient
   of variation, maximum concentration, minimum concentration, interquartile range, skewness
   and kurtosis.

These statistics will also be calculated for the percent differences at the collocated sites.  The
results will be summarized in a table.  Particular attention will be given to the impact on the
statistics caused by the observations noted in the quality assurance review. In fact, Palookaville
may evaluate the influence of a potential outlier by evaluating the change  in the summary statistics
resulting from exclusion of the outlier.

Palookaville will generate some graphics to present the results from the summary statistics and to
show the spatial continuity over Palookaville. Maps will be created for the annual and three-year
means, maxima, and interquartile ranges for a total of 6 maps. The maps will help uncover
potential outliers and will help in the network design review.  Additionally, basic histograms will
be generated for each of the primary and QA samplers and for the percent difference at the
collocated sites. The histograms will be useful in  identifying anomalies and evaluating the
normality assumption in the measurement errors.
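A minimal sketch of these Step 2 computations is shown below, using pandas as a stand-in for
whatever statistical package Palookaville adopts. The file name and column names are
hypothetical.

    # Illustrative sketch of the Step 2 summary statistics for one sampler's valid data.
    import pandas as pd

    def summarize(conc: pd.Series) -> pd.Series:
        """Summary statistics listed in Step 2 for a series of valid PM2.5 concentrations."""
        return pd.Series({
            "n": conc.count(),
            "mean": conc.mean(),
            "median": conc.median(),
            "std_dev": conc.std(),
            "cv_pct": 100.0 * conc.std() / conc.mean(),
            "max": conc.max(),
            "min": conc.min(),
            "iqr": conc.quantile(0.75) - conc.quantile(0.25),
            "skewness": conc.skew(),
            "kurtosis": conc.kurt(),
        })

    data = pd.read_csv("site_A1_pm25.csv", parse_dates=["date"])   # hypothetical export
    valid = data.loc[data["flag"] != "INV", "conc_ugm3"]
    print(summarize(valid).round(2))
    valid.plot.hist(bins=20, title="Site A1 PM2.5")                # basic histogram (Step 2)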

Step 3. Select the Statistical Test. The primary objective for the PM2.5 mass monitoring is
determining compliance with the PM2.5 NAAQS. As a result, the null and alternative hypotheses
are:

                            H0: X ≤ 15 ug/m3  and  Y ≤ 65 ug/m3
                            HA: X > 15 ug/m3  or   Y > 65 ug/m3

where X is the three-year average PM2.5 concentration and Y is the three-year average of the annual
98th percentiles of the PM2.5 concentrations recorded for an individual monitor. The exact
calculations for X and Y are specified in 40 CFR Part 50 Appendix N. The null hypothesis is
 rejected, that is, it is concluded that the area is not in compliance with the PM2 5 NAAQS when the
 observed three-year average of the annual arithmetic mean concentration exceeds 15.05 ug/m3 or
 when the observed three-year average of the annual 98th percentiles exceeds 65.5 ug/m3.  If the
 bias of the sampler is greater than -10% and less than +10% and the precision is within 10%, then
 the error rates (Type I and Type II) associated with this statistical test are less than or equal to 5%.
 The definitions of bias and precision will be outlined in the following step.
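The decision rule can be stated compactly; the sketch below is a plain restatement of the
comparison described above, using the rounding conventions quoted (15.05 and 65.5 ug/m3). The
full design-value calculations of 40 CFR Part 50 Appendix N are not reproduced, and the values
passed in are assumed to have been computed per that appendix.

    # Illustrative sketch of the Step 3 decision rule. X is the 3-year average of the
    # annual means, Y the 3-year average of the annual 98th percentiles.

    def reject_null(x_3yr_annual_mean: float, y_3yr_98th_pct: float) -> bool:
        """True means the data indicate the area is NOT in compliance with the PM2.5 NAAQS."""
        return x_3yr_annual_mean > 15.05 or y_3yr_98th_pct > 65.5

    print(reject_null(14.2, 61.0))   # False: both design values below the thresholds
    print(reject_null(15.3, 61.0))   # True: annual standard exceeded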

 Step 4. Verify Assumptions of Statistical Test. The assumptions behind the statistical test
 include those associated with the development of the DQOs in addition to the bias and precision
 assumptions. Their method of verification will be addressed in this step. Note that when less than
 three years of data are available, this verification will be based on as much data as are available.

 The DQO is based on the annual arithmetic mean NAAQS.  For each primary sampler,
 Palookaville will determine which, if either, of the PM25 NAAQS is violated. In the DQO
 development, it was assumed that the annual standard is more restrictive than the 24-hour one. If
 there are any samplers that violate ONLY the 24-hour NAAQS, then this assumption is not correct.
 The seriousness of violating this assumption is not clear.  Conceptually, the DQOs can be
 developed based on the 24-hour NAAQS and the more restrictive bias and precision limits
 selected. However, Palookaville will assume the annual standard is more restrictive,  until proven
 otherwise.

Normal distribution for measurement error. Assuming that measurement errors are normally
distributed is common in environmental monitoring. Palookaville has not investigated the
sensitivity of the statistical test to violation of this assumption, although small departures from
normality generally do not create serious problems. Palookaville will evaluate the reasonableness
of the normality assumption by reviewing a normal probability plot, calculating the Shapiro-Wilk
W test statistic (if the sample size is less than 50), and calculating the Kolmogorov-Smirnov test
statistic (if the sample size is greater than 50). All three techniques are provided by standard
statistical packages and by the statistical tools provided in EPA QA/G-9D: Data Quality Evaluation
Statistical Tools (DataQUEST). If the plot or statistics indicate possible violations of normality,
Palookaville may need to determine the sensitivity of the DQOs to departures from normality.
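A minimal sketch of these checks is shown below; scipy is used here only as a stand-in for the
DataQUEST tools named above, and the 50-observation cutoff follows the text. The input would be
the measurement errors (for example, collocated differences) being evaluated.

    # Illustrative sketch of the normality checks; scipy is a stand-in for DataQUEST.
    import numpy as np
    from scipy import stats

    def normality_check(residuals, alpha=0.05):
        x = np.asarray(residuals, dtype=float)
        if len(x) < 50:
            stat, p = stats.shapiro(x)                         # Shapiro-Wilk W
            test = "Shapiro-Wilk"
        else:
            z = (x - x.mean()) / x.std(ddof=1)
            stat, p = stats.kstest(z, "norm")                  # Kolmogorov-Smirnov
            test = "Kolmogorov-Smirnov"
        return {"test": test, "n": len(x), "statistic": round(stat, 3),
                "p_value": round(p, 3), "normality_rejected": p < alpha}

    rng = np.random.default_rng(0)
    print(normality_check(rng.normal(0.0, 1.5, size=30)))      # small-sample branch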

 Decision error can occur when the estimated 3-year average differs from the actual,  or true, 3-
year average.  This is not really an assumption as much as a statement that the data collected by an
ambient air monitor is stochastic, meaning that there are errors in the measurement process, as
mentioned in the previous assumption.

 The limits on precision and bias are based on the smallest number of required sample values in a
3-year period. In the development of the DQOs, the smallest number of required samples was
used. The reason for this was to ensure that the confidence was sufficient in the minimal case; if
more samples are collected, then the confidence in the resulting decision will be even higher. For
each of the samplers, Palookaville will determine how many samples were collected in each
quarter. If this number meets or exceeds 12, then the data completeness requirements for the DQO
are met.
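A small sketch of this completeness check follows; the file and column names are hypothetical.

    # Illustrative sketch of the completeness check: count valid samples per calendar
    # quarter and compare against the minimum of 12 used in the DQO development.
    import pandas as pd

    data = pd.read_csv("site_A1_pm25.csv", parse_dates=["date"])    # hypothetical export
    valid = data[data["flag"] != "INV"]
    per_quarter = valid.groupby(valid["date"].dt.to_period("Q")).size()
    print(pd.DataFrame({"valid_samples": per_quarter,
                        "meets_DQO_minimum": per_quarter >= 12}))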

The decision error limits were set at 5%. Again, this is more of a statement. If the other
assumptions are met, then the decision error limits are less than or equal to 5%.

Measurement imprecision was established at 10% coefficient of variation (CV).  For each
sampler, Palookaville will review the coefficient of variation calculated in Step 2.  If any exceed
10%, Palookaville may need to determine the sensitivity of the DQOs to larger levels of
measurement imprecision.

Table 24-1 will be completed during each DQA. The table summarizes which, if any, assumptions
have been violated.  A check will be placed in each of the row/column combinations that apply.
Ideally, there will be no checks. However, if there are checks in the table, the implication is that
the decision error rates are unknown even if the bias and precision limits are achieved. As
mentioned above, if any of the DQO assumptions are violated, then Palookaville will need to
reevaluate its DQOs.
  Table 24-1. Summary of Violations of DQO Assumptions

                      Violate 24-Hour    Measurement Errors   Data Complete?               Measurement CV
 Site                 Standard ONLY?     Non-Normal?          (≥ 12 samples per quarter)   > 10%?
 Primary Samplers
   A1
   A2
   A3
   A4
   B1
 QA Samplers
   A1
   B1

Achievement of bias and precision limits. Lastly, Palookaville will check the assumption that at
the three-year level of aggregation the sampler bias is in [-10%, 10%] and precision is less than
10%. The data from the collocated samplers will be used to estimate quarterly, annual, and three-
year bias and precision estimates even though it is only the three-year estimates that are critical for
the statistical test.


Since all the initial samplers being deployed by Palookaville will be FRMs, the samplers at each of
the collocated sites will have identical method designations. As such, it is difficult to determine
which of the collocated samplers is closer to the true PM2.5 concentration. Palookaville will
calculate an estimate of precision. A bias measure will also be calculated, but it can only describe
the relative difference of one sampler to the other, not definitively indicate which sampler is more
"true." Algorithms for calculating precision and bias are described below. These are similar to, but
differ slightly from, the equations in 40 CFR Part 58 Appendix A. They have been developed
with assistance from OAQPS/EMAD.

Before describing the algorithms, some groundwork is needed. When less than three years of
collocated data are available, the three-year bias and precision estimates must be predicted.
Palookaville's strategy for accomplishing this will be to use all available quarters of data as the
basis for projecting where the bias and precision estimates will be at the end of the three-year
monitoring period. Three-year point estimates will be computed by weighting the quarterly
components, using the most applicable of the following assumptions:

    1.  The most recent quarter's precision and bias are most representative of what the future
        quarters will be.
    2.  All previous quarters' precision and bias are equally representative of what the future
        quarters will be.
    3.  Something unusual happened in the most recent quarter, so the most representative
        quarters are all the previous ones, minus the most recent.

Each of these scenarios results in weights that will be used in the following algorithms. The
weights are shown in Table 24-2 where the variable Q represents the number of quarters for which
observed bias and precision estimates are available. Note that when Q=12, that is, when there are
bias and precision values for all of the quarters in the three-year period, then  all of the following
scenarios result in the same weighting scheme.
 Table 24-2.  Weights for Estimating Three-Year Bias and Precision

 Scenario   Assumption                              Weights
    1       Latest quarter most representative      wq = 12-(Q-1) for latest quarter; wq = 1 otherwise
    2       All quarters equally representative     wq = 12/Q for each quarter
    3       Latest quarter unrepresentative         wq = 1 for latest quarter; wq = 11/(Q-1) otherwise
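The sketch below computes the Table 24-2 weights for a given number of observed quarters Q; it
is illustrative only, but it shows that each scheme distributes a total weight of 12 and that the three
schemes coincide when Q = 12.

    # Illustrative sketch of the Table 24-2 weighting schemes. The returned list is
    # ordered oldest to latest quarter, with the last entry being the latest quarter.

    def quarter_weights(Q, scenario):
        if scenario == 1:                        # latest quarter most representative
            return [1.0] * (Q - 1) + [12.0 - (Q - 1)]
        if scenario == 2:                        # all quarters equally representative
            return [12.0 / Q] * Q
        if scenario == 3:                        # latest quarter unrepresentative
            return [11.0 / (Q - 1)] * (Q - 1) + [1.0]
        raise ValueError("scenario must be 1, 2, or 3")

    for s in (1, 2, 3):
        w = quarter_weights(5, s)                # e.g., five quarters of data so far
        print(s, [round(x, 2) for x in w], "sum =", round(sum(w), 2))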
In addition to point estimates, Palookaville will develop confidence intervals for the bias and
precision estimates. This will be accomplished using a re-sampling technique. The protocol for
creating the confidence intervals is outlined in Box 24-1.

Box 24-1. Method for Estimating Confidence in Achieving Bias and Precision DQOs

Let Z be the statistic of interest (bias or precision). For a given weighting scenario, the re-
sampling will be implemented as follows:

1.     Determine M, the number of collocated pairs per quarter for the remaining 12-Q
       quarters (default is M=15, or use M = the average number observed for the previous Q
       quarters).
2.     Randomly select with replacement M collocated pairs per quarter for each of the future
       12-Q quarters in a manner consistent with the given weighting scenario.
               Scenario 1: Select pairs from latest quarter only.
               Scenario 2: Select pairs from any  quarter.
               Scenario 3: Select pairs from any  quarter except the latest one.
       Result from this step is "complete" collocated data for a three-year period, from which
       bias and precision estimates can be determined.
3.     Based on the "filled-out" three-year period from step 2, calculate three-year bias and
       precision estimate, using Equation 1 where wq = 1  for each quarter.
4.     Repeat  steps 2 and 3 numerous times, such as 1000 times.
5.     Determine P, the fraction of the 1000 simulations for which the three-year bias and
       precision criteria are met. P is interpreted as  the probability that the sampler is
       generating observations consistent with the three-year bias and precision DQOs.
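A minimal sketch of this re-sampling protocol is given below for a single collocated site. It assumes
`observed` maps quarter numbers (1 through Q) to lists of (primary, collocated) pairs already in
hand, and that `three_year_stats` and `meets_dqo` implement Equations 1 and 2 (with wq = 1) and
the DQO criteria; those callables and the argument names are hypothetical placeholders.

    # Illustrative sketch of the Box 24-1 re-sampling protocol for one collocated site.
    import random

    def resample_probability(observed, scenario, three_year_stats, meets_dqo,
                             M=15, n_sim=1000, total_quarters=12):
        quarters = sorted(observed)                     # quarters with actual data
        Q = len(quarters)
        if scenario == 1:
            pool_quarters = [quarters[-1]]              # latest quarter only
        elif scenario == 2:
            pool_quarters = quarters                    # any quarter
        else:
            pool_quarters = quarters[:-1]               # any quarter except the latest

        pool = [pair for q in pool_quarters for pair in observed[q]]
        successes = 0
        for _ in range(n_sim):
            filled = dict(observed)
            for q in range(Q + 1, total_quarters + 1):  # fill the remaining 12-Q quarters
                filled[q] = random.choices(pool, k=M)   # sample pairs with replacement
            bias, precision = three_year_stats(filled)  # Equations 1 and 2, wq = 1
            successes += meets_dqo(bias, precision)
        return successes / n_sim                        # P from step 5 of Box 24-1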
The algorithms for determining whether the bias and precision DQOs have been achieved for each
sampler follow.

Bias Algorithm

1.  For each measurement pair, use Equation 19 from Section 14 to estimate the percent relative
    bias, di. To reiterate, this equation is

                        di = [ (Yi - Xi) / ((Xi + Yi)/2) ] × 100

    where Xi represents the concentration recorded by the primary sampler, and Yi represents the
    concentration recorded by the collocated sampler.

2.  Summarize the percent relative bias to the quarterly level, Djq, according to

                        Djq = (1/njq) Σ di     (sum over the njq pairs i in quarter q)

    where njq is the number of collocated pairs in quarter q for site j.

3.  Summarize the quarterly bias estimates to the three-year level using

                        Dj = ( Σ wq Djq ) / ( Σ wq )     (sums over q = 1, ..., nq)        Equation 1

    where nq is the number of quarters with actual collocated data and wq is the weight for quarter q
    as specified by the scenario in Table 24-2.

4.  Examine Djq to determine whether one sampler is consistently measuring above or below the
    other. To formally test this, a non-parametric test will be used.  The test is called the
    Wilcoxon Signed Rank Test and is described in EPA QA/G-9.  If the null  hypothesis is
    rejected, then one of the samplers is consistently measuring above or below the other. This
    information may be helpful in directing the investigation into the cause of the bias.
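A minimal sketch of steps 1 through 4 for one collocated site follows. The per-pair statistic di and
the weighted roll-up follow the equations as reconstructed above, and scipy's Wilcoxon signed-rank
test stands in for the EPA QA/G-9 procedure; the data structures are hypothetical.

    # Illustrative sketch of the bias algorithm for one collocated site.
    import numpy as np
    from scipy.stats import wilcoxon

    def percent_relative_bias(x, y):
        """di for each pair: primary Xi vs. collocated Yi, relative to the pair mean."""
        x, y = np.asarray(x, float), np.asarray(y, float)
        return (y - x) / ((x + y) / 2.0) * 100.0

    def quarterly_bias(pairs_by_quarter):
        """Djq: average di within each quarter (step 2)."""
        return {q: float(percent_relative_bias(*zip(*pairs)).mean())
                for q, pairs in pairs_by_quarter.items()}

    def three_year_bias(d_jq, weights):
        """Dj: weighted combination of the quarterly estimates (Equation 1, step 3)."""
        w = np.array([weights[q] for q in d_jq])
        d = np.array(list(d_jq.values()))
        return float((w * d).sum() / w.sum())

    def consistently_one_sided(d_jq, alpha=0.05):
        """Step 4: Wilcoxon signed-rank test on the quarterly bias estimates."""
        stat, p = wilcoxon(list(d_jq.values()))
        return p < alpha

    pairs = {1: [(10.1, 10.8), (15.2, 14.6)], 2: [(8.4, 8.9), (22.0, 21.1)]}
    d_jq = quarterly_bias(pairs)
    print(three_year_bias(d_jq, weights={1: 6.0, 2: 6.0}))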

Precision Algorithm

1.  For each measurement pair, calculate the coefficient of variation according to Equation 20
    from Section 14, repeated below:

                        CVi = |di| / √2

2.  Summarize the coefficient of variation to the quarterly level, CVjq, according to

                        CVjq = (1/njq) Σ CVi     (sum over the njq pairs i in quarter q)
    where njq is the number of collocated pairs in quarter q for site j.

3.  Summarize the quarterly precision estimates to the three-year level using

                        CVj = ( Σ wq CVjq ) / ( Σ wq )     (sums over q = 1, ..., nq)        Equation 2

    where nq is the number of quarters with actual collocated data and wq is the weight for quarter q
    as specified by the scenario in Table 24-2.

4.  If the null hypothesis in the Wilcoxon signed rank test was not rejected, then the coefficient of
   variation can be interpreted as a measure of precision. If the null hypothesis in the Wilcoxon
   signed rank test was rejected, the coefficient of variation has both a component representing
   precision and a component representing the (squared) bias.
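A matching sketch for the precision roll-up is given below, mirroring the bias sketch above with
CVi = |di|/√2 per pair; the data structures are again hypothetical.

    # Illustrative sketch of the precision algorithm for one collocated site.
    import math
    import numpy as np

    def pair_cv(x, y):
        """CVi for each collocated pair, derived from the percent relative bias di."""
        x, y = np.asarray(x, float), np.asarray(y, float)
        d = (y - x) / ((x + y) / 2.0) * 100.0
        return np.abs(d) / math.sqrt(2.0)

    def quarterly_cv(pairs_by_quarter):
        """CVjq: average CVi within each quarter (step 2)."""
        return {q: float(pair_cv(*zip(*pairs)).mean())
                for q, pairs in pairs_by_quarter.items()}

    def three_year_cv(cv_jq, weights):
        """CVj: weighted combination of the quarterly estimates (Equation 2, step 3)."""
        w = np.array([weights[q] for q in cv_jq])
        cv = np.array(list(cv_jq.values()))
        return float((w * cv).sum() / w.sum())

    pairs = {1: [(10.1, 10.8), (15.2, 14.6)], 2: [(8.4, 8.9), (22.0, 21.1)]}
    print(three_year_cv(quarterly_cv(pairs), weights={1: 6.0, 2: 6.0}))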

Confidence in Bias and Precision Estimates

1.  Follow the method described in Box 24-1 to estimate the probability that the sampler is
   generating observations consistent with the three-year bias and precision DQOs. The re-
   sampling must be done for each collocated site.

Summary of Bias and Precision Estimation

The results from the calculations and re-sampling will be summarized in Table 24-3. There will
be one line for each site operating a collocated sampler.
  Table 24-3. Summary of Bias and Precision

 Collocated   Three-year Bias     Three-year Precision    Null Hypothesis of         P
 Site         Estimate            Estimate                Wilcoxon Test Rejected?    (Box 24-1)
              (Equation 1)        (Equation 2)
 A1
 B1

Step 5.  Draw Conclusions from the Data.

Before determining whether the monitored data indicate compliance with the PM25 NAAQS,
Palookaville must first determine if any of the assumptions upon which the statistical test is based
are violated. This can be easily checked in Step 5 because of all the work done in Step 4.  In
particular, as long as
    >   in Table 24-1, there are no checks, and
    >   in Table 24-3,
        •   the three-year bias estimate is in the interval [-10%, 10%], and
        •   the three-year precision estimate is less than or equal to 10%

then the assumptions underlying the test appear to be valid.  As a result, if the observed three-year
average PM25 concentration is less than 15  ug/m3 and the observed three-year average 98th
percentile is less than 65 ug/m3, the conclusion is that the area seems to be in compliance with the
PM2 5 NAAQS, with an error rate of 5%.

If any of the assumptions have been violated, then the level of confidence associated with the test
is suspect and will have to be further investigated.

24.1.3 Action Plan Based on Conclusions from DQA

A thorough DQA process will be completed during the summer of each year. Thorough means
that all five steps of the process will be completed. Additionally, Step 2, Table 24-1, and Step 5
will be completed on a quarterly basis as a check to determine whether something is changing with
the monitoring or laboratory work that needs to be addressed before the annual review.

For this section, Palookaville will assume that the assumptions used for developing the DQOs have
been met. If this is not the case, Palookaville must first revisit the impact of the violation on the
bias and precision limits determined by  the DQO process.

DQA indicates every monitor operated by Palookaville is collecting PM2 5 mass data that are
within the precision and bias goals determined by the PM2 5 DQOs.

If the conclusion from the DQA process is that each of the PM2.5 mass monitors is operating with
less than 10% bias and 10% precision, then Palookaville will pursue action to reduce the QA/QC
burden.  The basic idea is that once Palookaville has demonstrated that it can operate within the
precision and bias limits, it is reasonable to dedicate some of the PM2.5 QA/QC resources to other
duties/tasks, such as modifying its QA monitoring or reducing some of its QC samplers or
monitoring frequency.  Possible courses of action include the following.

•   Modifying the QA Monitoring Network. 40 CFR Part 58' requires that each QA monitor be
   the same designation as the primary  monitor, in the case that the primary monitor is an FRM.
   Since the initially deployed samplers will  all be FRMs, this means that the sites operating
   sequential samplers will have to collocate a sequential sampler.  In particular, the site
   northwest of Scarborough, Al, will have two PM2 5 sequential samplers, the primary one and
   the collocated one.  Once it is demonstrated that the data collected from the network are within
   tolerable levels of errors, Palookaville may request that it be allowed to collocate with a single-
    day sampler instead. This will allow Palookaville to establish a new site with the sequential
    sampler that had been the collocated sampler.
•  Reducing QC Requirements. QC is integral to any ambient air monitoring network and is
   particularly important to new networks.  However, once it is demonstrated that the data
   collected from the network are within tolerable levels of errors, then Palookaville may request
   a reduction in the QC checks such as those specified in Table 23-1.  However, if, during any of
   the annual DQA processes, it is determined that the errors in the data are approaching or
   exceed either the bias limits or the precision limits, then Palookaville will return to the
   prescribed levels of QC checks as indicated in Table 23-1.

DQA indicates at least one monitor operated by Palookaville is collecting PM2 5 mass data that
are not within the precision and bias goals determined by the PM25 DQOs.

If and when the data from at least one of the collocated sites violates the DQO bias and/or
precision limits, then Palookaville will conduct an investigation to uncover the cause of the
violation.  If all of the collocated sites in Palookaville violate the DQOs (across monitor
designations), the cause may be at the Palookaville level (operator training) or higher (laboratory
QC, problems with method designation). If only one site violates the DQOs, the cause is more
likely specific to the site (particular operator, problem with site). The tools for getting to the root
of the problem include: data from the collocated network (Palookaville, nearby reporting
organizations, national), data from FRM performance evaluations (Palookaville, nearby reporting
organizations, national), QC trails. Some particular courses of action include the following.

•  Determine level of aggregation at which DQOs are violated.  The DQA process can identify
   which monitors are having problems since the DQOs were developed at a monitor level.  To
   determine the level at which corrective action is to be taken, it must be determined whether the
   violation of the DQOs  is due to problems unique to one or two sites, unique to Palookaville, or
   caused by a broader problem, like a particular sampler demonstrating poor QA on a national
   level.  Palookaville understands that AIRS will generate QA reports summarizing bias and
   precision statistics at the national and reporting organization levels, and by method
   designation.  These reports will assist Palookaville in determining the appropriate level at
   which the DQOs are being violated. The procedure for determining the level of violation is:

   *   Review national reports for the method designations for which Palookaville's DQA process
       indicated a violation. If large bias or imprecision is seen at the national level, Palookaville
       will request assistance from the Regional Office and OAQPS. If no problem is seen at the
       national level, Palookaville will proceed to the QA reports specific to its neighboring
       reporting organizations.
   *   Review neighboring reporting organizations' precision and bias reports for the method
       designations for which Palookaville's DQA process indicated a violation.  If large bias or
       imprecision is seen in the neighboring organizations, Palookaville will request assistance
       from the Regional Office. If no problem is seen in the neighboring reporting organizations,
       Palookaville will proceed to the QA reports specific to Palookaville.
   *   Within Palookaville, if the violations occur across method designations, then laboratory

-------
                                                                         Project: Model QAPP
                                                                             Element No: 24
                                                                              Revision No: 1
                                                                               Date: 4/17/98
                                                                                Page 11 of 11

       QC and training will be reviewed.
   *   Within Palookaville, if the violations occur for only one method designation, the FRM
       performance evaluation data will be reviewed to corroborate the collocated data.  The FRM
       performance evaluation data may show that one of the monitors has a problem and must be
       repaired or replaced.  Palookaville will also use the national FRM performance evaluation
       summaries to see whether Palookaville is unique or resembles the national network.  If
       Palookaville is similar to the national picture, assistance will be requested from the
       Regional Office and OAQPS. The results from the neighboring reporting organizations will
       also be reviewed. If the violations seem unique to Palookaville, Palookaville will continue
       investigating all of the components of its own data collection.

•  Communication with the Regional Office.  If a violation of the bias and precision DQOs is
   found, Palookaville will remain in close contact with the Regional Office, both for assistance
   and for communication.

•  Extensive Review of Quarterly Data until DQOs Achieved.  Palookaville will continue to
   review extensively the quarterly QA reports and the  QC summaries until the  bias and  precision
   limits are attained.
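
As a compact illustration of the review sequence in the first bullet above, the sketch below encodes
the escalation logic (national level, then neighboring reporting organizations, then within
Palookaville). The function, its inputs, and the returned text are assumptions for illustration; the
actual decisions rest on the AIRS QA reports and the judgment of the QA Officer.

    # Illustrative escalation logic for locating the level of a DQO violation
    def violation_level(national_flag, neighbor_flag, designations_violating, designations_checked):
        """Classify where a DQO violation appears to originate.

        national_flag / neighbor_flag: True if the AIRS summary reports show large bias or
        imprecision for the same method designations at that level.
        designations_violating / designations_checked: counts within Palookaville.
        """
        if national_flag:
            return "national - request assistance from the Regional Office and OAQPS"
        if neighbor_flag:
            return "regional - request assistance from the Regional Office"
        if designations_checked > 1 and designations_violating == designations_checked:
            return "organization - review laboratory QC and operator training"
        return "site or method - review FRM performance evaluation data for the affected monitors"

    print(violation_level(False, False, 1, 3))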

References

1. Data Quality Evaluation Statistical Toolbox (DataQUEST), EPA QA/G-9D. U.S. Environmental
   Protection Agency, QAD, EPA/600/R-96/085, December 1997.

2. Guidance for the Data Quality Assessment Process, EPA QA/G-9. U.S. Environmental Protection
   Agency, QAD, EPA/600/R-96/084, July 1996.

3. U.S. EPA (1997b) Revised Requirements for Designation of Reference and Equivalent Methods
   for PM2.5 and Ambient Air Quality Surveillance for Particulate Matter - Final Rule. 40 CFR
   Parts 53 and 58. Federal Register, 62(138):38763-38854. July 18, 1997.

4. U.S. EPA (1997a) National Ambient Air Quality Standards for Particulate Matter - Final Rule.
   40 CFR Part 50. Federal Register, 62(138):38651-38760. July 18, 1997.

-------
Appendices

-------
                                                                  Project: Model QAPP
                                                                         Appendix A
                                                                        Revision No:
                                                                        Date:4/18/98
                                                                          Page 1 of 13
                                 Appendix A

                                   Glossary
The following glossary is taken from EPA QA/G-5, Guidance for Quality Assurance Project
Plans.

-------
                                                                                Project: Model QAPP
                                                                                        Appendix A
                                                                                       Revision No:
                                                                                       Date:4/18/98
                                                                                        Page 2 of 13
GLOSSARY OF QUALITY ASSURANCE AND RELATED TERMS


Acceptance criteria — Specified limits placed on characteristics of an item, process, or service defined in
requirements documents.  (ASQC Definitions)

Accuracy — A measure of the closeness of an individual measurement or the average of a number of
measurements to the true value. Accuracy includes a combination of random error (precision) and systematic
error (bias) components that are due to sampling and analytical operations; the EPA recommends using the
terms "precision" and  "bias", rather than "accuracy," to convey the information usually associated with
accuracy.  Refer to Appendix D, Data Quality Indicators for a more detailed definition.

Activity — An all-inclusive term describing a specific set of operations of related tasks to be performed,
either serially or in parallel (e.g., research and development, field sampling, analytical operations, equipment
fabrication), that, in total, result in a product or service.

Assessment — The evaluation process used to measure the performance or effectiveness of a system and its
elements.  As used here, assessment is an all-inclusive term used to denote any of the following: audit,
performance evaluation (PE), management systems review (MSR), peer review, inspection, or surveillance.

Audit (quality)  — A systematic and independent examination to determine whether quality activities and
related results comply with planned arrangements and whether these arrangements are implemented
effectively and are suitable to achieve objectives.

Audit of Data Quality (ADQ) — A qualitative and quantitative evaluation of the documentation and
procedures associated with environmental measurements to verify that the resulting data are of acceptable
quality.

Authenticate — The act of establishing an item as genuine, valid, or authoritative.

Bias — The systematic or persistent distortion of a measurement process, which causes errors in one
direction (i.e., the expected sample measurement is different from the sample's true value). Refer to
Appendix D, Data Quality Indicators, for a more detailed definition.

Blank — A sample subjected to the usual analytical or measurement process to establish a zero baseline or
background value. Sometimes used to adjust or correct routine analytical results.  A sample that is intended
to contain none of the analytes of interest. A blank is used to detect contamination during sample handling
preparation and/or analysis.

Calibration — A comparison of a measurement standard, instrument, or item with a standard or instrument
of higher accuracy to detect and quantify inaccuracies and to report or eliminate those inaccuracies by
adjustments.

Calibration drift — The deviation in instrument response from a reference value over a period of time
before recalibration.

Certification — The process of testing and evaluation against specifications designed to document, verify,
and recognize the competence of a person, organization, or other entity to perform a function or service,
usually for a specified time.

-------
                                                                                 Project: Model QAPP
                                                                                         Appendix A
                                                                                        Revision No:
                                                                                        Date:4/18/98
                                                                                         Page 3 of 13


Chain of custody — An unbroken trail of accountability that ensures the physical security of samples, data,
and records.

Characteristic — Any property or attribute of a datum, item, process, or service that is distinct, describable,
and/or measurable.

Check standard — A standard prepared independently of the calibration standards and analyzed exactly like
the samples.  Check standard results are used to estimate analytical precision and to indicate the presence of
bias due to the calibration of the analytical system.

Collocated samples — Two or more portions collected at the same point in time and space so as to be
considered identical.  These samples are also known as field replicates and should be identified as such.

Comparability — A measure of the confidence with which one data set or method can be compared to
another.

Completeness — A measure of the amount of valid data obtained from a measurement system compared to
the amount that was expected to be obtained under correct, normal conditions.  Refer to Appendix D, Data
Quality Indicators, for a more detailed definition.

Computer program — A sequence of instructions suitable for processing by a computer. Processing  may
include the use of an assembler, a compiler, an interpreter, or a translator to prepare the program for
execution. A computer program may be stored on magnetic media and referred to as "software," or it  may be
stored permanently on computer chips, referred to as "firmware." Computer programs covered in a QAPP
are those used for design analysis, data acquisition, data reduction, data storage (databases), operation or
control, and database or document control registers when used as the controlled source of quality information.

Confidence Interval — The numerical interval constructed around a point estimate of a population
parameter, combined with a probability statement (the confidence coefficient) linking it to the population's
true parameter value. If the same confidence interval construction technique and assumptions are used to
calculate future intervals, they will include the unknown population parameter with the same specified
probability.
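
As a hypothetical illustration of this definition, the sketch below constructs a two-sided 95%
confidence interval for a mean from five example values, using the t critical value for four degrees
of freedom; the data are assumed for illustration only.

    # Two-sided 95% confidence interval for a mean (hypothetical data)
    data = [14.1, 15.3, 14.8, 15.9, 14.4]

    n = len(data)
    mean_value = sum(data) / n
    sample_sd = (sum((x - mean_value) ** 2 for x in data) / (n - 1)) ** 0.5
    t_crit = 2.776                              # t critical value, 95% confidence, n - 1 = 4
    half_width = t_crit * sample_sd / n ** 0.5

    print(round(mean_value - half_width, 2), round(mean_value + half_width, 2))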

Confidentiality procedure — A procedure used to protect confidential business information (including
proprietary data and personnel records) from unauthorized access.

Configuration — The functional, physical, and procedural characteristics of an item, experiment, or
document.

Conformance — An affirmative indication or judgment that a product or service has met the requirements of
the relevant specification, contract, or regulation; also, the state of meeting the requirements.

Consensus standard — A standard established by a group representing a cross section of a particular
industry or trade, or a part thereof.

Contractor — Any organization or individual contracting to furnish services or items or to perform work.

Corrective action — Any measures taken to rectify conditions adverse to quality and, where possible, to
preclude their recurrence.

-------
                                                                                  Project: Model QAPP
                                                                                           Appendix A
                                                                                          Revision No:
                                                                                          Date:4/18/98
                                                                                           Page 4 of 13


Correlation coefficient — A number between -1 and 1 that indicates the degree of linearity between two
variables or sets of numbers. The closer to -1 or +1, the stronger the linear relationship between the two (i.e.,
the better the correlation). Values close to zero suggest no correlation between the two variables.  The most
common correlation coefficient is the product-moment, a measure of the degree of linear relationship between
two variables.
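
As a brief numerical illustration, the sketch below computes the product-moment (Pearson)
correlation coefficient for two small data series; the values are hypothetical.

    # Product-moment (Pearson) correlation coefficient for two hypothetical series
    from statistics import mean

    def pearson_r(x, y):
        mx, my = mean(x), mean(y)
        num = sum((a - mx) * (b - my) for a, b in zip(x, y))
        den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
        return num / den

    x = [10.0, 12.5, 15.1, 18.2, 21.0]
    y = [10.4, 12.1, 15.6, 17.9, 21.3]
    print(round(pearson_r(x, y), 3))   # close to +1: a strong linear relationship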

Data of known quality — Data that have the qualitative and quantitative components associated with their
derivation documented appropriately for their intended use, and when such documentation is verifiable and
defensible.

Data Quality Assessment (DQA) — The scientific and statistical evaluation of data to determine if data
obtained from environmental operations are of the right type, quality, and quantity to support their intended
use.  The five steps of the DQA Process include: 1) reviewing the DQOs and sampling design, 2) conducting
a preliminary data review, 3) selecting the statistical test, 4) verifying the assumptions of the statistical test,
and 5) drawing conclusions from the data.

Data Quality Indicators (DQIs) — The quantitative statistics and qualitative descriptors that are used to
interpret the degree of acceptability or utility of data to the user. The principal data quality indicators are
bias, precision, accuracy (bias is preferred), comparability, completeness, and representativeness.

Data Quality Objectives (DQOs) — The qualitative and quantitative statements derived from the DQO
Process that clarify a study's technical and quality objectives, define the appropriate type of data, and specify
tolerable levels of potential decision errors that will be used as the basis for establishing the quality and
quantity of data needed to support decisions.

Data Quality Objectives (DQO) Process — A systematic strategic planning tool based on the scientific
method that identifies and defines the type, quality, and quantity of data needed to satisfy a specified use.
The key elements of the DQO process include:

        state the problem,
        identify the decision,
        identify the inputs to the decision,
        define the boundaries of the study,
        develop a decision rule,
        specify tolerable limits on decision errors, and
        optimize  the design for obtaining data.
DQOs are the qualitative and quantitative outputs from the DQO Process.

Data reduction — The process of transforming the number of data items by arithmetic or statistical
calculations, standard curves, and concentration factors, and collating them into a more useful form. Data
reduction is irreversible and generally results in a reduced data set and an associated loss of detail.

Data usability — The process of ensuring or determining whether the quality of the data produced meets the
intended use of the data.

Deficiency — An unauthorized deviation from acceptable procedures or practices, or a defect in an item.

Demonstrated capability — The capability to meet a procurement's technical  and quality specifications
through evidence  presented by the supplier to substantiate its claims and in a manner defined by the customer.

Design — The  specifications, drawings, design criteria, and performance requirements.  Also, the result of
deliberate planning, analysis, mathematical manipulations, and design processes.

-------
                                                                                 Project: Model QAPP
                                                                                         Appendix A
                                                                                        Revision No:
                                                                                        Date:4/18/98
                                                                                         Page 5 of 13


Design change — Any revision or alteration of the technical requirements defined by approved and issued
design output documents and approved and issued changes thereto.

Design review — A documented evaluation by a team, including personnel such as the responsible designers,
the client for whom the work or product is being designed, and a quality assurance (QA) representative but
excluding the original designers, to determine if a proposed design will meet the established design criteria
and perform as expected when implemented.

Detection Limit (DL) — A measure of the capability of an analytical method to distinguish samples that do
not contain a specific analyte from samples that contain low concentrations of the analyte; the lowest
concentration or amount of the target analyte that can be determined to be different from zero by a single
measurement at a stated level of probability. DLs are analyte- and matrix-specific and may be laboratory-
dependent.

Distribution — 1) The apportionment of an environmental contaminant at a point over time, over an area, or
within a volume; 2) a probability function (density function, mass function,  or distribution function) used to
describe a set of observations (statistical sample) or a population from which the observations are generated.

Document — Any written or pictorial information describing, defining, specifying, reporting, or certifying
activities, requirements, procedures, or results.

Document control — The policies and procedures used by an organization  to ensure that its documents and
their revisions are proposed, reviewed, approved for release, inventoried,  distributed, archived, stored, and
retrieved in accordance with the organization's requirements.

Duplicate samples — Two samples taken from and representative of the same population and carried
through all steps of the sampling and analytical procedures in an identical manner. Duplicate samples are
used to assess variance of the total method, including sampling and analysis. See also collocated sample.

Environmental conditions — The description of a physical medium (e.g., air, water, soil, sediment) or a
biological system  expressed in terms of its physical, chemical, radiological, or biological characteristics.

Environmental data — Any parameters or pieces of information collected or produced from measurements,
analyses, or models of environmental processes, conditions, and effects  of pollutants on human health and the
ecology,  including results from laboratory analyses or from experimental  systems representing such processes
and conditions.

Environmental data operations — Any work performed to obtain, use,  or report information pertaining to
environmental processes and conditions.

Environmental monitoring — The process of measuring or collecting environmental data.

Environmental processes — Any manufactured or natural processes that produce discharges to, or that
impact, the ambient environment.

Environmental programs — An all-inclusive term pertaining to any work or activities involving the
environment, including but not limited to: characterization of environmental processes and conditions;
environmental monitoring; environmental research and development; the design, construction, and operation
of environmental technologies; and laboratory operations on environmental samples.

-------
                                                                                 Project: Model QAPP
                                                                                         Appendix A
                                                                                        Revision No:
                                                                                        Date:4/18/98
                                                                                         Page 6 of 13


Environmental technology — An all-inclusive term used to describe pollution control devices and systems,
waste treatment processes and storage facilities, and site remediation technologies and their components  that
may be utilized to remove pollutants or contaminants from, or to prevent them from entering, the
environment. Examples include wet scrubbers (air), soil washing (soil), granulated activated carbon unit
(water), and filtration (air, water).  Usually, this term applies to hardware-based systems; however, it can also
apply to methods or techniques used for pollution prevention, pollutant reduction, or containment of
contamination to prevent further movement of the contaminants, such as capping, solidification or
vitrification, and biological treatment.

Estimate — A  characteristic from the sample from which inferences on parameters can be made.

Evidentiary records — Any records identified as part of litigation and subject to restricted access, custody,
use, and disposal.

Expedited change — An abbreviated method of revising a document at the work location where the
document is used when the normal change process would cause unnecessary or intolerable delay in the work.

Field blank — A blank used to provide information about contaminants that may be introduced during
sample collection, storage, and transport. A clean sample, carried to the sampling site, exposed to sampling
conditions, returned to the laboratory, and treated as an environmental sample.

Field (matrix) spike — A sample prepared at the sampling  point (i.e., in the field) by adding a known mass
of the target analyte to a specified amount of the sample. Field matrix spikes are used, for example, to
determine the effect of the sample preservation, shipment, storage, and preparation on analyte recovery
efficiency (the analytical bias).

Field split samples — Two or more representative portions taken from the same sample and submitted for
analysis to different laboratories to estimate interlaboratory precision.

Financial assistance — The process by which funds are provided by one organization (usually
governmental) to another organization for the purpose of performing work or furnishing services or items.
Financial assistance mechanisms include grants, cooperative agreements, and governmental interagency
agreements.

Finding — An assessment conclusion that identifies a condition having a significant effect on  an item or
activity. An assessment finding may be positive or negative, and is normally accompanied by  specific
examples of the observed condition.

Goodness-of-fit test — The application of the chi square distribution in comparing the frequency
distribution of a statistic observed in a sample with the expected frequency distribution based on some
theoretical model.

Grade — The category or rank given to entities having the same functional use but different requirements for
quality.

Graded approach — The process of basing the level of application of managerial controls applied to an
item or work according to the intended use of the results and the degree of confidence needed in the quality of
the results. (See also Data Quality Objectives (DQO) Process.)

Guidance — A suggested practice that is not mandatory, intended as an aid or example in complying with a
standard or requirement.

-------
                                                                                  Project: Model QAPP
                                                                                          Appendix A
                                                                                         Revision No:
                                                                                         Date:4/18/98
                                                                                          Page 7 of 13


 Guideline — A suggested practice that is not mandatory in programs intended to comply with a standard.

 Hazardous waste — Any waste material that satisfies the definition of hazardous waste given in  40 CFR
 261, "Identification and Listing of Hazardous Waste."

 Holding time — The period of time a sample may be stored prior to its required analysis.  While exceeding
 the holding time does not necessarily negate the veracity of analytical results, it causes the qualifying or
 "flagging" of any data not meeting all of the specified acceptance criteria.

 Identification error — The misidentification of an analyte. In this error type, the contaminant of concern is
 unidentified and the measured concentration is incorrectly assigned to another contaminant.

 Independent assessment — An assessment performed by a qualified individual, group, or organization that
 is not a part of the organization directly performing and accountable  for the work being assessed.

 Inspection — The examination or measurement of an item or activity to verify  conformance to specific
 requirements.

 Internal standard — A standard  added to a test portion of a sample in a known amount and carried through
 the entire determination procedure as a reference for calibrating and controlling  the precision and bias of the
 applied analytical method.

 Item — An all-inclusive term used in place of the following: appurtenance, facility, sample, assembly,
 component,  equipment, material, module, part, product, structure, subassembly, subsystem, system, unit,
 documented concepts, or data.

 Laboratory split samples — Two or more representative portions taken from the same sample and
 analyzed by different laboratories  to estimate the interlaboratory precision or variability and the  data
 comparability.

 Limit of quantitation — The minimum concentration of an analyte or category of analytes in a  specific
 matrix that can be identified and quantified above the method detection limit and within specified limits of
 precision and bias during routine analytical operating conditions.

 Management — Those individuals directly responsible and accountable for planning, implementing, and
 assessing work.

 Management system — A structured, nontechnical system describing the policies, objectives, principles,
 organizational authority, responsibilities, accountability, and implementation plan of an organization for
 conducting work and producing items and services.

 Management Systems Review  (MSR)  — The qualitative assessment of a data collection operation and/or
organization(s) to establish whether the prevailing quality management structure, policies, practices, and
 procedures are adequate for ensuring that the type and quality of data needed are obtained.

Matrix spike — A sample prepared by adding a known mass of a target analyte to a specified amount of
matrix sample for which an independent estimate of the target analyte concentration is available. Spiked
samples are used, for example, to determine the effect of the matrix on a method's recovery efficiency.

May — When used in a sentence,  a term denoting permission but not a necessity.

-------
                                                                                Project: Model QAPP
                                                                                        Appendix A
                                                                                        Revision No:
                                                                                        Date:4/18/98
                                                                                        Page 8 of 13


Mean (arithmetic) — The sum of all the values of a set of measurements divided by the number of values in
the set; a measure of central tendency.

Mean squared error — A  statistical term for variance added to the square of the bias.
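
The identity stated in this definition can be checked numerically. The sketch below computes the
mean squared error of hypothetical measurements about an assumed true value and compares it
with the population variance plus the squared bias.

    # Mean squared error = variance + (bias)^2, shown with hypothetical measurements
    from statistics import mean, pvariance

    true_value = 15.0
    measurements = [14.2, 15.6, 14.8, 15.9, 14.5]

    bias = mean(measurements) - true_value
    variance = pvariance(measurements)                        # population variance
    mse = mean((m - true_value) ** 2 for m in measurements)   # direct mean squared error

    print(round(variance + bias ** 2, 6) == round(mse, 6))    # True: the identity holds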

Measurement and Testing Equipment (M&TE) — Tools, gauges, instruments, sampling devices, or
systems used to calibrate, measure, test, or inspect in order to control or acquire data to verify conformance to
specified requirements.

Memory effects error — The effect that a relatively high concentration sample has on the measurement of a
lower concentration sample  of the same analyte when the higher concentration sample precedes the lower
concentration sample in the  same analytical instrument.

Method — A body  of procedures and techniques for performing an activity (e.g., sampling, chemical
analysis, quantification), systematically presented in the order in which they are to be executed.

Method blank — A blank prepared  to represent the sample matrix as closely as possible and analyzed
exactly like the calibration standards, samples, and quality control (QC) samples. Results of method blanks
provide an estimate  of the within-batch variability of the blank response and an indication of bias introduced
by the analytical procedure.

Mid-range check — A standard used to establish whether the middle of a measurement method's calibrated
range is still within specifications.

Mixed waste — A hazardous waste  material as defined by 40 CFR 261 Resource Conservation and
Recovery Act (RCRA) and mixed with radioactive waste subject to the requirements of the Atomic  Energy
Act.

Must — When used in a sentence, a  term denoting a requirement that has  to be met.

Nonconformance — A deficiency in a characteristic, documentation, or procedure that renders the quality of
an item or activity unacceptable or indeterminate; nonfulfillment of a specified requirement.

Objective evidence — Any documented statement of fact, other information, or record, either quantitative or
qualitative, pertaining to the quality of an item or activity, based on observations, measurements, or tests that
can be verified.

Observation — An assessment conclusion that identifies a condition (either positive or negative) that does
not represent a significant impact on an item or activity. An observation may identify a condition that has not
yet caused a degradation of quality.

Organization — A company, corporation, firm, enterprise, or institution, or part thereof, whether
incorporated or not, public or private, that has its own functions and administration.

Organization structure — The responsibilities, authorities, and relationships, arranged in a pattern, through
which an organization performs its functions.

Outlier — An extreme observation that is shown to have a low probability of belonging to a specified data
population.

Parameter — A quantity, usually unknown, such as a mean or a standard deviation characterizing a
population. Commonly misused for "variable," "characteristic," or "property."

-------
                                                                                  Project: Model QAPP
                                                                                          Appendix A
                                                                                         Revision No:
                                                                                          Date: 4/18/98
                                                                                          Page 9 of 13


 Peer review — A documented critical review of work generally beyond the state of the art or characterized
 by the existence of potential uncertainty. Conducted by qualified individuals (or an organization) who are
 independent of those who performed the work but collectively equivalent in technical expertise (i.e., peers) to
 those who performed the original work.  Peer reviews are conducted to ensure that activities are technically
 adequate, competently performed, properly documented, and satisfy established technical and quality
 requirements.  An in-depth assessment of the assumptions, calculations, extrapolations, alternate
 interpretations, methodology, acceptance criteria, and conclusions pertaining to specific work and of the
 documentation that supports them. Peer reviews provide an evaluation of a subject where quantitative
 methods of analysis or measures of success are unavailable or undefined, such as in research and
 development.

 Performance Evaluation (PE) — A type of audit in which the quantitative data generated in a measurement
 system are obtained  independently and compared with routinely obtained data to evaluate the proficiency of
 an analyst or laboratory.

 Pollution prevention — An organized,  comprehensive effort to systematically reduce or eliminate pollutants
 or contaminants prior to their generation or their release or discharge into the environment.

 Population — The totality of items or units of material under consideration or study.

 Precision — A measure of mutual agreement among individual measurements of the same property, usually
 under prescribed similar conditions expressed generally in terms of the standard deviation. Refer to
 Appendix D, Data Quality Indicators, for a more detailed definition.

 Procedure — A specified way to perform an activity.

 Process — A set of interrelated resources and activities that transforms inputs into outputs. Examples of
 processes include analysis, design, data collection, operation, fabrication, and calculation.

 Project — An  organized set of activities within a program.

 Qualified data — Any data that have been modified or adjusted as part of statistical or mathematical
 evaluation, data validation, or data verification operations.

 Qualified services — An indication that suppliers providing services have been evaluated and determined to
 meet the technical and quality requirements of the client as provided by approved procurement documents
 and demonstrated by the supplier to the client's satisfaction.

 Quality — The totality of features and characteristics of a product or service that bears on its ability to meet
 the stated or implied  needs and expectations of the user.

 Quality Assurance (QA) — An integrated system of management activities involving planning,
 implementation, assessment, reporting, and quality improvement to ensure that a process, item, or service is
of the type and  quality needed and expected by the client.

Quality Assurance Program Description/Plan — See quality management plan.

Quality Assurance Project Plan (QAPP) — A formal document describing in comprehensive detail the
necessary quality assurance (QA), quality control (QC), and other technical activities that must be
implemented to ensure that the results of the work performed will satisfy the stated performance criteria.  The
QAPP components are divided into four classes:  1) Project Management, 2) Measurement/Data Acquisition,
3) Assessment/Oversight, and 4) Data Validation and Usability. Guidance and requirements on preparation

-------
                                                                                Project: Model QAPP
                                                                                        Appendix A
                                                                                       Revision No:
                                                                                        Date: 4/18/98
                                                                                       Page 10 of 13
of QAPPs can be found in EPA QA/R-5 and QA/G-5.

Quality Control (QC) — The overall system of technical activities that measures the attributes and
performance of a process, item, or service against defined standards to verify that they meet the stated
requirements established by the customer; operational techniques and activities that are used to fulfill
requirements for quality. The system of activities and checks used to ensure that measurement systems are
maintained within prescribed limits, providing protection against "out of control" conditions and ensuring the
results are of acceptable quality.

Quality control (QC) sample — An uncontaminated sample matrix  spiked with known amounts of analytes
from a source independent of the calibration standards. Generally used to establish intralaboratory or
analyst-specific precision and bias or to assess the performance of all  or a portion of the measurement
system.

Quality improvement — A management program for improving the quality of operations. Such
management programs generally entail a formal mechanism for encouraging worker recommendations with
timely management evaluation and feedback or implementation.

Quality management — That aspect of the overall management system of the organization that determines
and implements the quality policy. Quality management includes strategic planning, allocation of resources,
and other systematic activities (e.g., planning, implementation, and assessment) pertaining to the quality
system.

Quality Management Plan (QMP) — A formal document that describes the quality system in terms of the
organization's structure, the functional responsibilities of management and staff, the lines of authority, and
the required interfaces for those planning, implementing, and assessing all activities conducted.

Quality system — A structured and documented management system describing the policies, objectives,
principles, organizational authority, responsibilities, accountability, and implementation plan of an
organization for ensuring quality in its work processes, products (items), and services. The quality system
provides the framework for planning, implementing, and assessing work performed by the organization and
for carrying out required quality assurance (QA) and quality control (QC).

Radioactive waste — Waste material containing, or contaminated by, radionuclides, subject to the
requirements of the Atomic Energy Act.

Readiness review — A systematic, documented review of the readiness for the start-up or continued use of a
facility, process, or activity. Readiness reviews are typically conducted  before proceeding  beyond project
milestones and prior to initiation of a major phase of work.

Record (quality) — A document that furnishes objective evidence of the quality of items or activities and
that has been verified and authenticated as technically complete and correct. Records may  include
photographs, drawings, magnetic tape, and other data recording media.

Recovery — The act of determining whether or not the methodology measures all of the analyte contained in
a sample.  Refer to Appendix D, Data Quality Indicators, for a more detailed definition.

Remediation — The process of reducing the concentration of a contaminant (or contaminants) in air, water,
or soil media to a level that poses an acceptable risk to human health.

Repeatability — The degree of agreement between independent test results produced by the same analyst,
using the same test method and equipment on random aliquots of the  same sample within a short time period.

-------
                                                                                  Project: Model QAPP
                                                                                           Appendix A
                                                                                          Revision No:
                                                                                          Date:4/18/98
                                                                                          Page 11 of 13


 Reporting limit — The lowest concentration or amount of the target analyte required to be reported from a
 data collection project.  Reporting limits are generally greater than detection limits and are usually not
 associated with a probability level.

 Representativeness — A measure of the degree to which data accurately and precisely represent a
 characteristic of a population, a parameter variation at a sampling point, a process condition, or an
 environmental condition. See also Appendix D, Data Quality Indicators.

 Reproducibility — The precision, usually expressed as variance, that measures the variability among the
 results of measurements of the same sample at different laboratories.

 Requirement — A formal statement of a need and the expected manner in which it is to be met.

 Research (applied) — A process, the objective of which is to gain the knowledge or understanding necessary
 for determining the means by which a recognized and specific need may be met.

 Research (basic) — A process, the objective of which is to gain fuller knowledge or understanding of the
 fundamental aspects of phenomena and of observable facts without specific applications toward processes or
 products in mind.

 Research development/demonstration — The systematic use of the knowledge and understanding gained
 from research and directed toward the production of useful materials, devices, systems, or methods, including
 prototypes and processes.

 Round-robin study — A method validation study involving a predetermined number of laboratories or
 analysts, all analyzing the same sample(s) by the same method. In a round-robin study, all results are
 compared and used to develop summary statistics such as interlaboratory precision and method bias or
 recovery efficiency.

 Ruggedness study — The carefully ordered testing of an analytical method while making slight variations in
 test conditions (as might be expected in routine use) to determine how  such variations affect test results. If a
 variation affects the results significantly, the method restrictions are tightened to minimize this variability.

 Scientific method — The principles and processes regarded as necessary for scientific investigation,
 including rules for concept or hypothesis formulation, conduct of experiments, and validation of hypotheses
 by analysis of observations.

 Self-assessment — The assessments of work conducted by individuals, groups, or organizations directly
 responsible for overseeing and/or performing the work.

 Sensitivity — The capability of a method or instrument to discriminate between measurement responses
 representing different levels of a variable of interest.  Refer to Appendix D, Data Quality Indicators, for a
 more detailed definition.

Service — The result generated by activities at the interface between the supplier and the customer, and the
supplier internal activities to meet customer needs. Such activities in environmental programs include design,
inspection, laboratory and/or field analysis, repair, and installation.

Shall — A term denoting a requirement that is mandatory whenever the criterion for conformance with the
specification  permits no deviation.  This term does not prohibit the use of alternative approaches or methods
for implementing the specification so long as the requirement is fulfilled.

-------
                                                                                 Project: Model QAPP
                                                                                         Appendix A
                                                                                        Revision No:
                                                                                        Date:4/18/98
                                                                                        Page 12 of 13


Should — A term denoting a guideline or recommendation whenever noncompliance with the specification is
permissible.

Significant condition — Any state, status, incident, or situation of an environmental process or condition, or
environmental technology in which the work being performed will be adversely affected sufficiently to require
corrective action to satisfy quality objectives or specifications and safety requirements.

Software life cycle — The period of time that starts when a software product is conceived and ends when the
software product is no longer available for routine use. The software life cycle typically includes a
requirement phase, a design phase, an implementation phase, a test phase, an installation and check-out
phase, an operation and maintenance phase, and sometimes a retirement phase.

Source reduction — Any practice that reduces the quantity of hazardous substances, contaminants, or
pollutants.

Span check — A standard used to establish that a measurement method is not deviating from its calibrated
range.

Specification — A document stating requirements and referring to or including drawings or other relevant
documents.  Specifications should indicate the means and criteria for determining conformance.

Spike — A substance that is added to an environmental sample to increase the concentration of target
analytes by known amounts; used to assess measurement accuracy (spike recovery).  Spike duplicates are
used to assess measurement precision.

Split samples — Two or more representative portions taken from one sample in the field or in the laboratory
and analyzed by different analysts or laboratories.  Split samples are quality control (QC) samples that are
used to assess analytical variability and comparability.

Standard deviation — A measure of the dispersion or imprecision of a sample or population distribution,
expressed as the positive square root of the variance; it has the same unit of measurement as the mean.

Standard Operating Procedure (SOP)  — A written document that details the method for an operation,
analysis, or action with thoroughly prescribed techniques and steps and that is officially approved as the
method for performing  certain routine or  repetitive tasks.

Supplier — Any individual or organization furnishing items or services or performing work according to a
procurement document  or a financial assistance agreement. An all-inclusive term used in place of any of the
following: vendor, seller, contractor, subcontractor, fabricator, or consultant.

Surrogate spike or analyte — A pure substance with properties that mimic the analyte of interest. It is
unlikely to be found in  environmental samples and is added to them to establish that the analytical method
has been performed properly.

Surveillance (quality)  — Continual or frequent monitoring and verification of the status of an entity and the
analysis of records to ensure that specified requirements are being fulfilled.

-------
                                                                                  Project: Model QAPP
                                                                                          Appendix A
                                                                                         Revision No:
                                                                                         Date:4/18/98
                                                                                         Page 13 of 13


Technical review — A documented critical review of work that has been performed within the state of the
art. The review is accomplished by one or more qualified reviewers who are independent of those who
performed the work but are collectively equivalent in technical expertise to those who performed the original
work.  The review is an in-depth analysis and evaluation of documents, activities, material, data, or items that
require technical verification or validation for applicability, correctness, adequacy, completeness, and
assurance that established requirements have been satisfied.

Technical Systems Audit (TSA) — A thorough, systematic, on-site qualitative audit of facilities, equipment,
personnel, training, procedures, recordkeeping, data validation, data management, and reporting aspects of a
system.

Traceability — The ability to trace the history, application, or location of an entity by means of recorded
identifications. In a calibration sense, traceability relates measuring equipment to national or international
standards, primary standards, basic physical constants or properties, or reference materials. In a data
collection sense, it relates calculations and data generated throughout the project back to the requirements for
the quality of the project.

Trip blank — A clean sample of a matrix that is taken to the sampling site and transported to the laboratory
for analysis without having been exposed to sampling procedures.

Validation — Confirmation by examination and provision of objective evidence that the  particular
requirements for a specific intended use have been fulfilled. In design and development, validation concerns
the process of examining a product or result to determine conformance to user needs. See also Appendix G,
Data Management.

Variance (statistical) — A measure of dispersion of a sample or population distribution.  Population
variance is the sum of squares of deviation from the mean divided by the population size (number of
elements). Sample variance is the sum of squares of deviations from the mean divided by the degrees of
freedom (number of observations minus one).
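
The sketch below applies the sample formulas from this entry and from the standard deviation
entry above to a small hypothetical data set.

    # Sample variance (sum of squared deviations / (n - 1)) and its square root
    data = [11.8, 12.4, 13.1, 12.0, 12.7]

    n = len(data)
    mean_value = sum(data) / n
    sample_variance = sum((x - mean_value) ** 2 for x in data) / (n - 1)
    standard_deviation = sample_variance ** 0.5

    print(round(sample_variance, 4), round(standard_deviation, 4))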

Verification — Confirmation by examination and provision of objective evidence that specified requirements
have been fulfilled.  In design and development, verification concerns the process of examining a result of a
given activity to determine conformance to the stated requirements for that activity.

-------
                                                                    Project: Model QAPP
                                                                           Appendix B

                                                                         Revision No: 1

                                                                          Date: 4/18/98
                                                                           Page 1 of 9
                                  Appendix B

                  Training Certification Evaluation Forms

The following forms will be used by the Department's QA Division to certify that the PM2.5 field and
laboratory personnel have performed environmental data operations at a satisfactory level.
Quantitative scores of 80% or greater are considered satisfactory.
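
As an illustration of how the quantitative score on these forms is obtained, the sketch below totals
hypothetical section scores from the field sampling form that follows (9 + 4 + 8 + 12 = 33 possible
points) and compares the resulting percentage with the 80% criterion. The helper function and the
example scores are assumptions for illustration, not part of the forms.

    # Illustrative scoring of a certification form (assumed helper, hypothetical scores)
    def certification_percentage(section_scores):
        """section_scores maps a section name to (points earned, points possible)."""
        earned = sum(e for e, _ in section_scores.values())
        possible = sum(p for _, p in section_scores.values())
        return 100.0 * earned / possible

    scores = {
        "Prepare for Site Visit": (8, 9),
        "Fifth Day Maintenance Check": (4, 4),
        "Install Filter/Cassette and Begin Sampler Operations": (7, 8),
        "Remove Filter/Cassette; End Sampling Operations": (11, 12),
    }
    pct = certification_percentage(scores)
    print(f"{pct:.0f}% - {'satisfactory' if pct >= 80.0 else 'not satisfactory'}")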

-------
                                                                      Project: Model QAPP
                                                                              Appendix B
                                                                            Revision No: 1
                                                                             Date: 4/18/98
                                                                              Page 2 of 9
                          Training Certification Evaluation Form
                                Field Sampling Procedures

Trainee:                                                      Date:
Evaluator:                                                    Score:

Activity                                                                  Successful   Comment

Prepare for Site Visit on Scheduled Date/time
1) Preweighed sampling filter in cassette, packed in a labeled carrier. Also
   take spares.
2) Three preweighed field blank filters in cassettes, packed in labeled
   carriers, if a field blank study is scheduled
3) PM2.5 Sampler Run Data Sheet for each sampler, site notebook;
   calculator
4) Transfer standard for ambient temperature measurements
5) Transfer standard for ambient atmospheric pressure measurements
6) Transfer standard for volumetric flow-rate measurements
7) Laptop computer and connecting cables to download sampler data
8) Spare parts and tools to include O-rings, silicone grease, lab wipes,
   voltmeter, etc.
9) Operator's manual for sampler(s) to be serviced
SCORE                                                                         /9

Fifth Day Maintenance Check
1) Clean impactor well assembly or filter/lab wipes/diffusion oil to clean
   and service the one at the site
2) Sample inlet adapter and flow rate measurement transfer standard
3) Clean, unused flow check filter in its cassette
4) Sampler Flow Check Data Sheet
SCORE                                                                         /4

Install Filter/Cassette and Begin Sampler Operations
1) Remove the new filter/cassette from its protective case and visually
   inspect the filter/cassette for flaws. Verify that this is the correct filter for
   this sampler, site, and run date
2) Be sure sampler is not operating.
3) Fill in initial information on PM2.5 Run Data Sheet

-------
Project: Model QAPP
         Appendix B
      Revision No: 1
       Date: 4/18/98
        Page 3 of 9
Activity                                                                  Successful   Comment

Install Filter/Cassette and Begin Sampler Operations (continued)
4) Remove the sampler's filter holder assembly (if required by the
   manufacturer's instructions). Inspect the O-rings inside the filter holder.
5) Install the filter/cassette in the filter holder assembly, and then install the
   loaded filter holder assembly in the sampler per the manufacturer's
   instructions. If you touch or scratch the filter, void the filter and get another
   one from the set of extra filters brought to the site.
6) Program the sampler to energize at the beginning of a sampling period
   (consult the instruction manual).
7) Make independent measurements of ambient temperature (Ta) and
   ambient pressure (Pa) using transfer standards. Record these values and the
   Ta and Pa values indicated by the sampler on the data sheet.
8) Ensure that the sampler(s) begins operation at the designated time.
   Record the start time on the data sheet. 15 minutes after sampling begins,
   record the sampler's display value for the indicated flow rate, Q, in L/min
   on the data sheet.
SCORE                                                                         /8

Remove Filter/Cassette; End Sampling Operations
1) Determine Pa and Ta using transfer standards. Enter on data sheet.
2) When sampling ends, record stop time, total elapsed time, final Q, Qavg,
   QCV, total volume sampled, Ta, Pa, etc., on data sheet.
3) After each completed run, download data from the sampler data port to
   a laptop or other computer storage disk.
4) Open the filter holder assembly (consult the instruction manual);
   remove the used filter/cassette; visually inspect the filter for tears, oil,
   insects, moisture, etc.; and record observations on the data sheet.
5) Place the filter/cassette inside a properly labeled protective container.
   Verify the container's label versus the site name, date, etc.
6) Place the container inside a cooled storage chest. Do not allow the metal
   container to come into contact with ice or water. Sealed cooling blocks are
   recommended. Protect the containers from condensed water.
7) Inspect the interior of the filter housing. Note any abnormalities.
8) Inspect the interior of the impactor housing and the exterior of the
   impactor well. Remove any moisture or dust with a lint-free wipe and
   make notes on the data sheet.
9) Without opening the impactor well, inspect the well's interior. Note any
   abnormalities. Clean or replace the impactor well if necessary or if the
   recommended 5-day servicing is due. Reinstall the impactor assembly. (If
   another sampling run is to begin, insert a new filter/cassette in the filter
   holder assembly and set up the sampler for the next run.)

-------
 Project: Model QAPP
          Appendix B
       Revision No: 1
        Date: 4/18/98
         Page 4 of 9

Activity                                                                  Successful   Comment

Remove Filter/Cassette; End Sampling Operations (continued)
10) Review the recorded data for sample elapsed time, flow rate, filter
    quality, and temperature to start the process of determining if the sample is
    valid, questionable, or invalid. Scan through the sampling summary on the
    sampler display and note flags. Record observations and reasoning for
    questioning or invalidating a run on the data sheet.
11) Make a final check of the site, and observe and record the presence of
    any activity that may have affected the particulate loading of the sample.
12) Keep the container holding the filter/cassette at a temperature of less
    than 25 °C (preferably cooled to 4 °C), and promptly deliver it and the
    original of the data sheet to the sample custodian in the receiving facility.
    Keep a copy of the data sheet with the site records.
SCORE                                                                        /12
FINAL SCORE                                                                  /33
PERCENTAGE                                                                     %

-------
                                                                                              Project: Model QAPP
                                                                                                        Appendix B
                                                                                                     Revision No: 1
                                                                                                      Date: 4/18/98
                                                                                                       Page 5 of 9
                              Training Certification Evaluation Form
                                        Laboratory Procedures
Trainee:
                                                          Date
Evaluator:
                                                              Score:
 Activity
Sucessful
Comments
 Pre-sampling PROCEDURES
  1) Clean the microbalance's weighing chamber with a fine brush, if
  necessary.
 2) Zero (i.e., tare) and calibrate the microbalance according to the
 manufacturer's directions. Record the tare weight on the laboratory data
 form and in the laboratory notebook or database.
 3) Using smooth, nonserrated, nonmetallic forceps, weigh two working
 mass reference standards as a QC check. Wait until the microbalance's
 display has remained steady for 30 to 60 seconds or until the microbalance
 indicates that a stable reading has been obtained. Record the certified and
 measured values of these standards on the laboratory data form and in the
 laboratory notebook or database.
 4) Record the relative humidity and temperature of the conditioning
 chamber on the laboratory data form and in the laboratory QC notebook or
 database. Verify the filter has been conditioned for at least 24 hours.
 5) Laboratory blank filters and the current sampling interval's field blank
 filters will be weighed at least once in each weighing session. If many
 filters are weighed, you may want to weigh the set of laboratory blanks
 more than once. A new set of three laboratory blanks will be established
 for each distinct filter lot.
 6) Weigh the filters. Operate the balance according to the balance
 manufacturer's directions. Take the filter from its filter-handling container
 (petri dish or equivalent) by gently slipping the filter-handling forceps
 under the outer polyolefin support ring. Hold the filter only by the ring.
 Place the filter, reinforcing ring side up, next to a 210Po antistatic strip for
 30 to 60 seconds. The antistatic strip will be inside the weighing chamber
 or as close to the chamber door as is practical. Immediately transfer the
 filter to the microbalance's pan and close the weighing chamber door.
 After the microbalance's display has remained steady for at least 60
 seconds or until the microbalance indicates that a stable reading has been
 obtained, record the balance number, the sampler number the filter is
 intended to be used with, the filter number, the filter lot number, and the
 filter's tare weight (pre-sampling mass) on the laboratory data form.

-------
 Project: Model QAPP
          Appendix B
       Revision No: 1
        Date: 4/18/98
         Page 6 of 9
Activity
7) After every tenth filter weighing, the analyst will rezero the
microbalance and reweigh one working standard. Record the zero and
working standard measurements on the laboratory data form and the
laboratory QC notebook or database. If the zero and working standard
measurements disagree from the first measurements of the day by more
than 3 ug (i.e., three times the microbalance's reproducibility), repeat the
zeroing process and reweigh the working standards. If the two
measurements still disagree, contact the laboratory's QA Officer, who may
direct the analyst to (1) reweigh the previously weighed filters and/or (2)
troubleshoot or repair the microbalance, re-zero and reweigh the two
working standards and repeat the weighing session.
8) Any unused filter whose weight is outside the normal range (i.e., 110 to
160 mg) must be investigated. If there is a consistent negative replication
(>15 ug) for laboratory blank filters, it is usually a sign that the filters have
not equilibrated long enough. In this case, notify the QA Officer.
9) Return the filter to the filter-handling container, replace the lid, and
return it to storage.
10) Prior to filters being taken to the sites, install each filter in a filter
cassette, and put the filter/cassette assembly into a protective container for
transport to the sampler. Attach a label with the sampler number and the
unique filter number to the outside of the protective container. This label
will also be used to identify the upcoming sample run date. Record the
sampler number, sample date, and filter number on the PM2.5 Sampler Run
Data Sheet. Double-check the entries in the laboratory data form. Prepare
several extra filters in case a filter is invalidated during the installation
process.
SCORE: ___ /10
Post-sampling DOCUMENTATION/INSPECTION
PROCEDURES
1) Examine the field data sheet. Determine whether all data needed to
verify sample validity and to calculate mass concentration (e.g., average
flow rate, ambient temperature and barometric pressure, and elapsed time)
are provided. If data are missing or unobtainable from a field operator or if
a sampler malfunction is evident, flag the filter and record in the laboratory
data form that the sample has been flagged and the reason. Notify the QA
Officer.
-------
 Project: Model QAPP
          Appendix B
       Revision No: 1
        Date: 4/18/98
         Page 7 of 9
Activity
2) If the shipment was to be kept cold, verify that the temperature of the
cooler's interior was maintained at the desired point, usually less than
4 °C. If the protective container is cold, allow it to warm to the filter
conditioning environment's temperature before opening, to preclude water
condensation on a cold filter. Remove the filter from its protective
container and examine the container. If particulate matter or debris is found
in the protective container after the filter has been removed, flag the filter
and record notes on the laboratory data form that the sample has been
flagged and the reason. Save the filter for inspection by the QA Officer.
3) Match the sampler number with the correct laboratory data form on
which the original microbalance number, filter number, pre-sampling filter
weight, and other information were inscribed. Group filters according to
the microbalance used to determine their initial tare weights. Initial
separation of filters in this way will eliminate the risk of a measurement
error that could result from the use of different microbalances for pre- and
post-sampling weighings.
4) Remove the filter from both the protective container and the filter
cassette. Be careful not to touch or otherwise disturb the filter and its
contents. Transfer the filter to a filter-handling container labeled with the
corresponding filter number. Place the used filter in the container "dirty-
side" up. Keep the particles from contact with the walls of the container.
The filter must be handled with clean, smooth forceps and must not be
touched by hands. Inspect the filter for any damage that may have occurred
during sampling. If any damage is found, void the sample, and record on
the laboratory data form that the sample has been voided and why. Retain
the filter for inspection by the QA Officer.
5) Transfer the filter in its filter-handling container to the conditioning
chamber under the same conditions as pre-sampling (within ±5% RH).
6) Allow the filter to condition for not less than 24 hours.
SCORE: ___ /6
POST SAMPLING FILTER WEIGHING
1) Group filters according to the microbalance used for pre-weighing and
by their filter numbers. Reweigh each filter on the same microbalance on
which its pre-sampling weight was obtained.
2) Clean the microbalance's weighing chamber with a fine brush, if
necessary.
3) Zero (i.e., tare) and calibrate the microbalance according to the
manufacturer's directions. Record the tare weight on the laboratory data
form and in the laboratory notebook or database.
-------
 Project: Model QAPP
          Appendix B
       Revision No: 1
        Date: 4/18/98
         Page 8 of 9
Activity
Successful
Comments
4) Using smooth, nonserrated, nonmetallic forceps, weigh two working
mass reference standards as a QC check. Wait until the microbalance's
display has remained steady for 30 to 60 seconds or until the microbalance
indicates that a stable reading has been obtained. Record the certified and
measured values of these standards on  the laboratory data form and in the
laboratory notebook or database.
5) Record the relative humidity and temperature of the conditioning
chamber on the laboratory data form and in the laboratory QC notebook or
database.
6) Laboratory blank filters and the current sampling interval's field blank
filters will be weighed at least once in each weighing session. If many
filters are weighed, you may want to weigh the set of laboratory blanks
more than once. A new set of three laboratory blanks will be established
 for each distinct filter lot.
7) Weigh the filters. Operate the balance according to the balance
manufacturer's directions. Take the filter from its filter-handling container
(petri dish or equivalent) by gently slipping the filter-handling forceps
under the outer polyolefin support ring. Hold the filter only by the ring.
 Place the filter, reinforcing ring side up, on a 210Po antistatic strip for 30 to
60 seconds. The antistatic strip will be inside the weighing chamber or as
close to the chamber door as is practical. Immediately transfer the filter to
the microbalance's pan and close the weighing chamber door. After the
microbalance's display has remained steady for at least 60 seconds or until
the microbalance indicates that a stable reading has been obtained, record
the balance number, the sampler number the filter is intended to be used
with, the filter number, the filter lot number, and the filter's tare weight
(pre-sampling mass) on the laboratory data form.
8) After every tenth filter weighing, the analyst will rezero the
microbalance and reweigh the one working standard.  Record the zero and
working standard measurements on the laboratory data form and the
laboratory QC notebook or database. If the zero and working standard
measurements disagree from the first measurements of the day by more
than 3 ug, repeat the zeroing process and reweigh the  working standards. If
the two measurements still disagree, contact the laboratory's QA Officer,
who may direct the analyst to (1) reweigh the previously weighed filters
and/or (2) troubleshoot or repair the microbalance, re-zero and reweigh the
two working standards and repeat the weighing session.
 9) Any unused filter whose weight is outside the normal range (i.e., 110 to
 160 mg) must be investigated. If there is a consistent negative replication
 (>15 ug) for laboratory blank filters, it is usually a sign that the filters have
 not equilibrated long enough. In this case, notify the QA Officer.
 10) Return the filter to the filter-handling container, replace the lid, and
 return it to storage.

-------
 Project: Model QAPP
          Appendix B
       Revision No: 1
        Date: 4/18/98
         Page 9 of 9
Activity
11) If the pre- and post-sampling weights for the laboratory and field filter
blanks disagree by more than 15 ug, repeat the measurements. If the two
measurements still disagree, contact the laboratory's QA Officer, who may
direct the analyst to (1) reweigh the previously weighed filters and/or (2)
troubleshoot or repair the microbalance, then reweigh.
12) If the filter will receive further analysis, return it to the filter-handling
container and note on the container and the laboratory data form that
additional analyses are required. Transfer the filter to the laboratory
responsible for performing the additional analyses.
13) A filter's post-sampling mass minus its pre-sampling mass is the net
mass loading for that filter. Record this value on the laboratory data form.
Refer to Section 11.0 of Guidance Document 2.12 for the calculations
required to compute and report ambient PM2.5 concentrations in ug/m3.
SCORE: ___ /13
FINAL SCORE: ___ /29
PERCENTAGE: ___ %
-------
                                                                        Model QAPP
                                                                         Appendix C
                                                                        Revision No:
                                                                        Date: 4/18/98
                                                                         Page 1 of 17
                                  Appendix C

               Analytical and Calibration Procedures (SOPs)

The following SOPs will be used to prepare for and weigh QC and sample filters before and after
they are used for sampling, and to calibrate the sampling instruments and measuring devices
required for PM2 5 sampling described in Appendix L of 40 CFR Part 50. Most of the information
in these procedures comes from the EPA QA Handbook, Section 2.12.6 (Calibration) and
2.12.7 (Filter Preparation and Weighing).

-------
                                                                                Model QAPP
                                                                                 Appendix C
                                                                                Revision No:
                                                                                Date: 4/18/98
                                                           	Page 2 of 17


A-MRS   Mass Reference Standards

Mass reference standards will be in the range of 0 to 200 mg, given that the mass range of a typical
46.2-mm filter is from 110 to 160 mg. They must be certified as being traceable to NIST mass
standards (see ASTM 1993b; Harris 1993; Kupper 1990). Additionally, they must have an
individual tolerance of no more than 0.025 mg. Examples of mass reference standards that meet
these specifications are ANSI/ASTM Classes 1, 1.1, and 2. The Department will use Class 1
standards. The mass reference standards must be recalibrated on a regular basis (e.g., yearly) at a
NIST-accredited State weights and measures laboratory or at a calibration laboratory that is
accredited by the National Voluntary Laboratory Accreditation Program (NVLAP), which is
administered by NIST (Harris 1994; White 1997). The recalibration frequency will be determined
from records of previous recalibrations of these standards.

Note that the microbalance's resolution and repeatability are better than the tolerance of the most
accurate classes of mass reference standards. Accordingly, the  accuracy of the gravimetric
analysis is limited by the tolerance of the standards rather than  by the microbalance's
characteristics.

Two separate sets of mass reference standards are recommended: working and primary standards.
Working calibration standards will be used for routine filter weighing and will be kept next to the
microbalance in a protective container. Laboratory primary standards will be handled very
carefully and will be kept in a locked compartment. The working standards will be compared to
the laboratory primary  standards every 3 to  6 months to check for mass shifts associated with
handling or contamination. The current masses of the  working standards, as compared to the
laboratory primary standards, will be recorded in a laboratory  notebook and used to check the
calibration of the microbalance.

Always use smooth, nonmetallic forceps for handling  mass reference standards. The  standards are
handled only with these forceps, which are not used for any other purpose. Mark these forceps  to
distinguish them from the forceps used to handle filters. Handle the standards carefully to avoid
damage that may alter their masses.

A-FH     Filter Handling

Careful handling of the filter during sampling, equilibration, and weighing is necessary to avoid
measurement errors due to damaged filters or a gain or loss of collected particles on the filters.
Whenever filters are handled, the analyst must wear gloves that are powder-free and antistatic.
The filters must be handled carefully with smooth, nonserrated forceps that are used only for that
purpose.  Mark these forceps to distinguish them from the forceps used to handle mass reference
standards. These precautions reduce the potential effect from body moisture or oils contacting the
filters and subsequently affecting the measured weights.

In the laboratory, each  filter will be transferred from its sealed manufacturer's packaging to a
filter-handling container, such as a glass or plastic petri dish, to reduce the risk of contamination.
The filter will remain in this container, except for weighing, until it is loaded into a filter cassette
prior to sampling. Each filter must have a unique identification number. A label that lists the filter
number must be attached to the filter-handling container. It is recommended that each
microbalance be assigned a block of filter numbers to be processed and used sequentially. Assign
a filter identification number and take extreme care to avoid mistakenly assigning the same
number twice or omitting a number.

-------
                                                                                Model QAPP
                                                                                 Appendix C
                                                                                Revision No:
                                                                                Date: 4/18/98
                                                                                 Page 3 of 17
A-FIC   Filter Integrity Check
All filters must be visually inspected for defects before the initial weighing. A filter must be
discarded if any defects are found. Any lot of filters containing a high number of defects will be
returned to the supplier. Specific defects to look for are the following:

1.  Pinhole—A small hole appearing (a) as a distinct and obvious bright point of light when
    examined over a light table or screen or (b) as a dark spot when viewed over a dark surface.

2.  Separation of ring—Any separation or lack of seal between the filter and the filter border
    reinforcing ring.

3.  Chaff or flashing—Any extra material on the reinforcing, polyolefin ring or on the heat seal
    area that would prevent an airtight seal during sampling.

4.  Loose material—Any extra loose material or dirt particles on the filter.

5.  Discoloration—Any obvious discoloration that might be evidence of contamination.

6.  Filter nonuniformity—Any obvious visible nonuniformity in the appearance of the filter
    when viewed over a light table or black surface that might indicate gradations in
    porosity or density across the face of the  filter.

7.  Other—A filter with any  imperfection not described above, such as irregular surfaces or other
    results of poor workmanship.

A-FC   Filter Conditioning

Filters will be  conditioned or equilibrated immediately before both the pre- and post-sampling
weighings. Filters must be conditioned for at least 24 hours to allow their weights to stabilize
before being weighed.

Researchers in the desert western and southeastern portions of the United States have found that
some Teflonฎ filters exhibit a  loss of weight for a period of time after they are removed from their
original shipping containers. The magnitude of weight loss varies from batch to batch and may be
due to loss of volatile components from the polyolefin support ring. In the desert West, weight
loss of up to 150 ug has been  observed (Desert Research Institute 1994).  Some filters require at
least 6 weeks to equilibrate.

In the Southeast, filter weight stability experiments were done as part of EPA's research to
develop the volatility test now included in 40 CFR Part 53.66 of the revised requirements for
designation of reference and equivalent methods for PM2.5 (Eisner 1997). Small but still relatively
significant (i.e., from 0 to 45 ug) weight  losses were observed. These experiments showed that
the problem could be addressed by active conditioning (e.g., forced, HEPA-filtered air for 1-hour
duration) instead of passive conditioning. The active conditioning was conducted with each filter
sitting in the bottom of an  open petri dish. Consecutive 4-hour periods of active conditioning of
filters did not change the weight by more than ±5 ug.

Mean relative humidity will be held between 30 and 40 percent, with a variability of not more than
±5 percent over 24 hours. Mean temperature will be held between 20 and 23 °C, with a variability
of not more than ±2 °C over 24 hours. Relative humidity and temperature will be continuously
measured and recorded on a daily basis during equilibration. The Department's PM2.5 laboratory

-------
                                                                                 Model QAPP
                                                                                  Appendix C
                                                                                 Revision No:
                                                                                 Date: 4/18/98
           	Page 4 of 17

has demonstrated that it meets these criteria.  It should be noted that the relative humidity
conditions for post-sampling conditioning should be within ±5% of pre-sampling conditions.
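
The conditioning criteria above can be checked directly against the continuously recorded chamber readings. The following sketch (Python; the function name, the illustrative readings, and the interpretation of "variability" as deviation from the 24-hour mean are assumptions for illustration, not part of this QAPP) shows one way such a check could be automated.

    # Sketch: screen one 24-hour conditioning-chamber record against the criteria
    # stated above (mean RH 30-40% with variability within +/-5%, mean temperature
    # 20-23 C with variability within +/-2 C).  Illustrative only.

    def chamber_record_ok(rh_readings, temp_readings):
        """Return (ok, messages) for one 24-hour set of chamber readings."""
        messages = []

        mean_rh = sum(rh_readings) / len(rh_readings)
        if not 30.0 <= mean_rh <= 40.0:
            messages.append(f"mean RH {mean_rh:.1f}% outside 30-40%")
        if max(rh_readings) - mean_rh > 5.0 or mean_rh - min(rh_readings) > 5.0:
            messages.append("RH variability exceeds +/-5%")

        mean_t = sum(temp_readings) / len(temp_readings)
        if not 20.0 <= mean_t <= 23.0:
            messages.append(f"mean temperature {mean_t:.1f} C outside 20-23 C")
        if max(temp_readings) - mean_t > 2.0 or mean_t - min(temp_readings) > 2.0:
            messages.append("temperature variability exceeds +/-2 C")

        return (not messages), messages

    if __name__ == "__main__":
        rh = [33.5, 34.0, 35.2, 36.1, 34.8, 33.9]      # illustrative hourly readings
        temp = [21.4, 21.6, 21.9, 22.0, 21.7, 21.5]
        ok, notes = chamber_record_ok(rh, temp)
        print("PASS" if ok else "FAIL", notes)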

Within the conditioning chamber, the filters will be placed on a covered rack that will allow air
circulation over the filters while reducing the chance that airborne material inside the chamber will
settle onto the filters.

Filters will be conditioned in their filter-handling containers. Label both the container's lid and
bottom half. The lid must be removed during conditioning. Place the lid beneath the bottom half
of the container to be certain that no filter mix-up occurs. To improve filter inventory control,
care will be taken to stack the filters in the chamber in numerical order so that the analyst can
more easily weigh the filters in numerical order.

Note: Typically, filters come packed together in large groups or in a container with separators.
This package is usually contained inside another clear, reclosable plastic package, which may, in
turn, be inside a box used in shipping. The more time that each filter is exposed to the
conditioning environment, the more likely that its weight will be stable by the end of a
conditioning period.

New filters will be removed individually from their sealed packages, placed in their own filter-
handling containers (uncovered petri dish) and conditioned for a sufficient time to allow their
weight to stabilize before use. The Department will condition ~100 filters every 15 days. Each set
of 100 filters will be considered a "lot" and be conditioned as a lot.  100 filters would
support all sites and QC samples for 30 days of sampling. Analysts may need to determine the
conditions and time period needed to stabilize the weights for each new lot of filters. To
determine this,  randomly select three filters for "lot blanks" from each lot and expose each in a
separate container, similar to routine filters,  in the conditioning chamber. Weigh the filters prior
to conditioning, and every 24 hrs. If the weight change exceeds 15 ug, the conditioning period
will continue for that lot. Conduct additional tests with the lot blanks until the weight change of
each lot blank is less than 15 ug between weighings, signifying that the filters from that lot can be
processed for use in the field. Lot blank weighings are recorded in the Filter Conditioning
Notebook using the Filter Conditioning Form (Figure C.1).
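
The lot-blank stability test described above reduces to comparing successive 24-hour weighings against the 15 ug limit. A minimal sketch of that comparison follows (Python; the function name and the example weights, patterned on Figure C.1, are illustrative only).

    # Sketch: decide whether a new filter lot has stabilized.  Each lot blank must
    # change by less than 15 ug (0.015 mg) between successive 24-hour weighings.

    STABILITY_LIMIT_UG = 15.0

    def lot_stabilized(previous_mg, current_mg):
        """True if every lot blank changed by less than 15 ug since the prior weighing."""
        changes_ug = [abs(cur - prev) * 1000.0            # mg -> ug
                      for prev, cur in zip(previous_mg, current_mg)]
        return all(c < STABILITY_LIMIT_UG for c in changes_ug), changes_ug

    if __name__ == "__main__":
        day1 = [136.560, 129.999, 130.633]                # three lot blanks, mg
        day2 = [136.540, 129.980, 130.622]
        ok, changes = lot_stabilized(day1, day2)
        print("Lot ready for field use" if ok else "Continue conditioning", changes)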

A-ESC   Electrostatic Charge Neutralization

Electrostatic charge buildup can prevent a microbalance from operating properly. Static charge is
the accumulation of electrical charges on the surface of a nonconductive material. Common
symptoms of this problem include noisy readout, drift, and sudden readout shifts. Electrostatic
charge buildup  becomes greater as the air becomes drier. To reduce static charge within the
balance,  place a radioactive antistatic strip containing a very small amount (i.e., 500 picocuries)
of 210Po in the weighing chamber. 210Po antistatic strips are used to reduce electrostatic buildup in
the microbalance's weighing chamber and on individual filters by charge neutralization. They will
neutralize electrostatic charges on items brought within an inch of them. These antistatic strips are
safe, commonly available, and inexpensive. 210Po has a half-life of 138 days. Change the antistatic
strips every 6 months and dispose of the old strips according to the manufacturer's
recommendations. The technician will hold each filter about an inch from the antistatic strip for
30 seconds before it is weighed. See Engelbrecht et al. (1980), Hawley and Williams (1974), and
Weil (1991) for more information about electrostatic charge and how to minimize its effects.

-------
                                                                                Model QAPP
                                                                                 Appendix C
                                                                                Revision No:
                                                                                Date: 4/18/98
                                                                                 Page 5 of 17


Filter Conditioning Form

Filter Lot Number: C20102                     Analyst: F. Nottingham
Balance Number: A44603                        QA Officer: J. Dinsmore

Analysis         Zero (Tare)   Working Std 1   Working Std 2   Lot Blank 1   Lot Blank 2   Lot Blank 3
Date-Time        Check (mg)    Weight (mg)     Weight (mg)     Weight (mg)   Weight (mg)   Weight (mg)
6/28/98-10:54    0.000         100.001         199.999         136.560       129.999       130.633
6/29/98-11:00    0.000         99.999          200.001         136.540       129.980       130.622
6/30/98-11:00    0.000         99.999          200.000         136.535       129.978       130.620


Figure C.1. Example Filter Conditioning Form.


Do not assume that grounding eliminates all electrostatic buildup because the electrical ground
may not be perfect. Even though a filter weight might stabilize within 30 seconds and no weight
drift is observed during that period, the microbalance may still be influenced by some electrostatic
buildup.
Charge neutralization times may need to be longer than 60 seconds for sampling situations in
which (1) a high amount of charge has developed on collected particles due to their origin or (2)

-------
                                                                                Model QAPP
                                                                                 Appendix C
                                                                                Revision No:
                                                                                Date: 4/18/98
	Page 6 of 17

the particle loading on a filter is large. Examples of atmospheres that might be expected to contain
a higher quantity of charged particles include air containing particles generated by mechanical
means and air through which lightning has passed.

A-l    Pre-sampling Filter Weighing (Tare Weight)

The microbalance will be located in the same controlled environment in which the filters are
conditioned. The filters must be weighed immediately following the conditioning period. Prior to
actually weighing, select a sample batch of filters that can be weighed for the day. Since a sample
batch has been developed for post weighing (Section 14 of QAPP), a batch will be developed that
contains, at a minimum, 20 routine filters, 4 collocated filters, 3 field blanks, and 3 laboratory blanks.
Including 4 spare filters, 34 filters will be included in a pre-sampling filter batch. Start a Filter Pre-
sampling Weighing Sheet to record this information (Figure C.2).

These steps will be followed during the pre-sampling filter weighing:

1.  Record the relative humidity and temperature of the conditioning chamber on the laboratory
    data form.  Ensure that the filters have been conditioned for at least 24 hours prior to
    weighing.

2.  Clean the microbalance's weighing chamber with a fine brush, if necessary. Avoid using
    pressurized gas, which may blow damaging debris and oils into the microbalance's
    mechanism. Clean the surfaces near the microbalance with antistatic solution- or methyl
    alcohol-moistened disposable laboratory wipes. Clean the standard forceps with a lint-free
    cloth and the filter forceps with the moistened wipes. Make sure the forceps are thoroughly
    dry before  use. Even a small amount of moisture can cause a significant measurement bias.

3.  To ensure maximum stability, the microbalance will be  turned on at all times. This procedure
    enables the microbalance to be operational at any time and eliminates the need for a warmup
    period before analyses are performed.

4.  Allow the microbalance to perform an internal calibration. When this is completed, zero (i.e.,
    tare) the instrument.

5.  Using smooth, nonserrated, nonmetallic forceps, weigh two working mass reference
    standards (a 100 and a 200 mg) as a QC check. Handle the working standards carefully to
    avoid damage that may alter their masses. Recheck the  standards annually or after any
    incident of rough handling  against the laboratory's primary standard weights. The 100 or 200
    mg standard approximates the mass of a blank or a loaded filter. Wait until the micro-
    balance's display indicates that a stable reading has been obtained. Record the certified and
    measured values  of these standards on the laboratory data form.

6.  If the certified and measured values of a working standard disagree by more than 3 ug,
    reweigh the working standards. If the two values still disagree, follow corrective action
    procedures (see QAPP section 14).

7.  Weigh the filters. Operate the balance according to the  balance manufacturer's directions.
    Take the filter from its filter-handling container (petri dish or equivalent) by gently slipping
    the filter-handling forceps under the outer polyolefin support ring. Hold the filter only by the
    ring. Place the filter, reinforcing ring side up, close to the 210Po antistatic strip for 30 seconds.
    The antistatic strip will be inside the weighing chamber or as close to the chamber door as is
    practical. Immediately transfer the filter to the microbalance's pan and close the weighing

-------
                                                                                Model QAPP
                                                                                 Appendix C
                                                                                Revision No:
                                                                                Date: 4/18/98
	Page 7 of 17

    chamber door. After the microbalance's display indicates that a stable reading has been
    obtained, record the filter's tare weight (pre-sampling mass) on the laboratory data form.

8.  After approximately every tenth filter weighing, the analyst will rezero the microbalance and
    reweigh one working standard. Record the working standard measurements on the
    laboratory data form.  If the zero and working standard measurements disagree from the first
    measurements of the day by more than 3 ug (i.e., three times the microbalance's
    reproducibility), repeat the zeroing process and reweigh the working standards.  If they are
    within acceptance, reweigh the previous ten routine filters and continue. If the two
    measurements still disagree, follow corrective action procedures (see QAPP section 14). (A
    sketch of this tolerance check follows this list.)

9.  Any unused filter whose weight is outside the normal range (i.e., 110 to 160 mg) must be
    investigated. If there is a consistent negative replication (>15 ug) for laboratory blank filters,
    it is usually a sign that the filters have not equilibrated long enough. In this case, notify the
    QA Officer.

10. Return the filter to the filter-handling container, replace the lid, and return it to storage.

11. When the time comes for the filters to be taken to the sites  (must be within 30 days of the
    initial weighing), install each filter in a filter cassette, and put the filter/cassette assembly into
    a protective container for transport to the sampler. Attach a label with the sampler number
    and the unique filter number to the outside of the  protective container. This label will also be
    used to identify the upcoming sample run date. Record the  sampler number,  sample date, and
    filter number and protective container label on the PM2 5 Filter Inventory Sheet. Double-
    check the entries in the laboratory data form. The sample will have to be invalidated if it
    cannot be reconciled with the correct sampler and filter identification numbers. Prepare
    several extra filters in case a filter is invalidated during the installation process.
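
The step 8 tolerance check referenced above is a simple comparison of the periodic zero and working-standard readings against the first readings of the day. The sketch below (Python; names and example readings are illustrative; the 3 ug tolerance is the value stated in step 8) is one way to express that check.

    # Sketch of the step 8 QC comparison: the periodic zero and working-standard
    # readings must agree with the first readings of the day to within 3 ug
    # (0.003 mg); otherwise re-zero, reweigh, and if the disagreement persists,
    # follow the corrective action procedures in QAPP Section 14.

    TOLERANCE_MG = 0.003      # 3 ug, three times the microbalance's reproducibility

    def qc_check_ok(first_zero_mg, first_std_mg, zero_mg, std_mg):
        """True if both the zero and the working standard agree within 3 ug."""
        return (abs(zero_mg - first_zero_mg) <= TOLERANCE_MG
                and abs(std_mg - first_std_mg) <= TOLERANCE_MG)

    if __name__ == "__main__":
        # Illustrative readings for a 100 mg working standard.
        if qc_check_ok(0.000, 100.001, 0.000, 99.999):
            print("QC check passed - continue weighing")
        else:
            print("Re-zero and reweigh; if it still fails, apply Section 14 corrective action")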

-------
                                                                                     Model QAPP
                                                                                      Appendix C
                                                                                     Revision No:
                                                                                     Date: 4/18/98
                                                                                      Page 8 of 17


Filter Lot Number: 0001          Filter Pre-sampling Batch Number: 0001
Balance Number: A44603           Pre-sampling Filter Weighing Date: ________
Analyst: F. Nottingham           QAO: J. Dinsmore           Temp: 21 °C

Sampler    Site    Filter         Pre-sampling    Post-sampling    Net Mass Filter
ID         ID      Number(a)      Mass (mg)       Mass (mg)        Loading (mg)
                   100 mg         99.999
                   200 mg         199.999
                   LB990001       130.633
AD001      A1      FB990001       130.633
AD001      A1      RF990001       139.293
AD006      A1      RF990002       136.020
AD002      A2      RF990003       135.818
AD002      A2      FB990002       130.633
AD003      A3      RF990004       131.456
AD004      A4      RF990005       137.508
AD005      B1      RF990006       136.546
AD007      B1      RF990007       129.999
                   100 mg         99.999
                   LB990002       130.896
AD001      A1      RF990008       130.633
AD002      A2      RF990009       139.293
AD003      A3      RF990010       136.020
AD003      A3      FB990003       135.818
AD004      A4      RF990011       131.456
AD005      B1      RF990012       137.508
                   LB990003       136.546
                   FB990004       129.999

(a) Indicate zero (tare) check or working standard check here.

Figure C.2. Example pre-sampling laboratory data form.

-------
                                                                                Model QAPP
                                                                                 Appendix C
                                                                                Revision No:
                                                                                Date: 4/18/98
	__^	Page 9 of 17

A-2 Post-sampling Documentation and Inspection

Upon receipt of the sample from the field, the Shipping/Receiving Office will:

1.   Receive shipping/transport container(s)
2.   Upon receipt, open the container(s) to find  Filter Chain of Custody Record(s) or collect the
    originals from the site operator (if delivered by operator).
3.   Fill out the "Filter Receipt" area of the Filter Chain of Custody Records(s). Check sample
    container seals.
4.   If the samples are delivered on a weekday, follow step 5; if the sample(s) are delivered
     on a weekend, follow step 6.
5.   Check the  "Sent to Laboratory" column of the Filter Chain of Custody Records(s) and
     transport the filters to the PM2.5 weighing laboratory. Upon delivery to the PM2.5 weighing
     laboratory, complete the "Filter Transfer" area of the Filter Chain of Custody Records(s).
6.   Store the samples in the refrigerator and check the "archived" column of the Filter Chain of
    Custody Records(s). On the Monday of the following week,  deliver the archived filters to the
    PM2 5 weighing laboratory and complete the "Filter Transfer" area of the Filter Chain of
    Custody Records(s)

Upon filter transfer, the laboratory personnel will:

1.   Examine field data sheets and custody sheets. Determine whether all data needed to verify
    sample validity and to calculate mass concentrations are provided. If data are missing or
    unobtainable from the field operator or if a sampler malfunction is evident, flag the filter
    appropriately but continue processing. Notify QAO.

2.   If the protective shipping container is cold,  allow it to warm to the filter conditioning
    environment's temperature before opening to preclude water condensation on a cold filter.
    Remove the filter from its protective container and examine the container. If particulate matter
    or debris is found in the protective container after the filter has been removed, record notes on
    the laboratory data form and flag appropriately. Consult the branch manager if it is felt that
    the sample should be invalidated.

3.   Match the  sampler number with the correct laboratory data form on which the  original
    microbalance number, filter number, pre-sampling filter weight, and other  information were
    inscribed. Group filters into sample batches (see QAPP Section 14) according to the
    microbalance used to determine their initial  tare weights. Initial separation of filters in this way
    will eliminate the risk of a measurement error that could result from the use of different
    microbalances for pre- and post-sampling weighings.

4.   Remove the filter from both the protective container and the filter cassette. Some cassettes
    may require special tools to disassemble them. Be very careful when removing the filter from
    the cassette. Be careful not to touch or otherwise disturb the filter and its contents. Transfer
    the filter to a filter-handling container labeled with the corresponding filter number. Place the
    used filter in the container "dirty-side" up. Keep the particles  from contact with the walls of
    the container. The filter must be handled with clean, smooth forceps and must not be touched
    by hands. Inspect the filter for any damage that may  have occurred during sampling. If any
    damage is  found, flag and record this on the laboratory data form. Retain the filter for
    inspection by the branch manager. Consult the branch manager if it is felt that the sample will
    be invalidated.

5.   Transfer the filter in its filter-handling container to the conditioning chamber.

-------
                                                                              Model QAPP
                                                                               Appendix C
                                                                              Revision No:
                                                                              Date: 4/18/98
	Page 10 of 17

6. Allow the filter to condition for not less than 24 hours. It should be noted that the relative
   humidity conditions for post-sampling conditioning should be within ±5% of pre-sampling.

A-3 Post-sampling Filter Weighing (Gross Weight)

Both the pre- and post-sampling filter weighing must be carried out on the same analytical
balance, and preferably by the same analyst. Use an effective technique to neutralize static charges
on the filter. The post-sampling conditioning and weighing will be completed within 240 hours
(10 days) from the sampling end date, unless the filter is maintained at 4 °C or less during the
entire time between retrieval from the sampler and start of the conditioning, in which case the
period shall not exceed 30 days.
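
The holding-time rule above depends only on the sample end date, the date weighing is completed, and whether the filter stayed at or below 4 °C. A minimal sketch of that check follows (Python; the dates and function name are illustrative assumptions).

    # Sketch: post-sampling weighing must be completed within 10 days (240 hours)
    # of the sample end date, or within 30 days if the filter was kept at or below
    # 4 C for the entire time before conditioning began.

    from datetime import date

    def within_holding_time(sample_end, weighing_date, kept_at_or_below_4c):
        """True if the post-sampling weighing meets the holding-time requirement."""
        limit_days = 30 if kept_at_or_below_4c else 10
        return (weighing_date - sample_end).days <= limit_days

    if __name__ == "__main__":
        print(within_holding_time(date(1998, 6, 28), date(1998, 7, 6), False))   # True: 8 days
        print(within_holding_time(date(1998, 6, 28), date(1998, 7, 20), False))  # False: 22 days
        print(within_holding_time(date(1998, 6, 28), date(1998, 7, 20), True))   # True: cold-stored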

The following steps will be followed during post-sampling filter weighing.

1. Group filters according to the microbalance used for pre-weighing and by their filter numbers.
   Reweigh each filter on the same microbalance on which its pre-sampling weight was obtained.

2. Repeat Steps 1 through 10 in Section A-1.

3. If the filter will receive further analysis, return it to the filter-handling container and note on
   the container and the laboratory data form that additional analyses are required. Transfer the
   filter to the laboratory responsible for performing the additional analyses.

Calculation of Net Mass Filter Loading

A filter's post-sampling mass minus its pre-sampling mass is the net mass loading for that filter.
Record this value on the laboratory data form.

The mass of particulate matter collected on the filter during the sampling period is determined by
subtracting the initial (tare) mass of each filter from the final mass of the filter, as

                                  M2.5 = (Mf - Mi) x 10^3                             (11-2)

where

       M2.5 = total mass of PM2.5 collected during the sampling period, ug
         Mf = final mass of the equilibrated filter after sample collection, mg
         Mi = initial (tare) mass of the equilibrated filter before sample collection, mg
       10^3 = units conversion (ug/mg).

For example, a filter that weighed 139.293 mg before sampling (Mi) and 139.727 mg after
sampling (Mf) would have a PM2.5 mass (M2.5) of 434 ug.

See remaining sections below for the calculations required to compute and report ambient PM2.5
concentrations in ug/m3.

Sample Volume Calculations

Both reference and equivalent method samplers are required to provide measurements of the total
volume of air sampled (Va), in m3 at the actual ambient temperatures and pressures during
sampling (40 CFR Part 50, Appendix L, paragraph 7.4.5.2). If the sampler's flow measurement

-------
                                                                              Model QAPP
                                                                               Appendix C
                                                                              Revision No:
                                                                              Date: 4/18/98
                                                                              Page 11 of 17
system is properly calibrated, Va should be accurate, and no further sample volume calculations
are required.

Note that in the event the total sample volume measurement from the sampler is not available, the
total sample volume may be calculated by multiplying the average flow rate, in actual m3/min, by
the elapsed sample collection time in minutes. Both of these measurements are required to be
provided by reference and equivalent method samplers. Use the following formula only if Va is not
available directly from the sampler:

                                    Va = Qave x t x 10^-3                              (11-1)

where

        Va = total sample volume, actual m3
        Qave = average sample flow rate over the sample collection period, L/min
          t = total elapsed sample collection time, min
        10^-3 = units conversion (m3/L).

For example, a sampler with an average flow rate of 16.7 L/min (Qave) for a 1,410-min (23.5-
hour) sampling period (t) would have a total sample volume (Va) of 23.5 m3.

PM2.5 Concentration Calculation

Each PM2.5 mass concentration measurement is calculated by dividing the total mass of PM2.5
(Equation 11-2) collected during the sampling period (M2.5) by the total volume of air sampled
(Va) (taken directly from the sampler readout display or calculated from Equation 11-1), as

                                    PM2.5 = M2.5 / Va.                               (11-3)

For example, a sample with a mass (M2.5) of 434 ug collected from a total sample volume (Va) of
23.5 m3 calculates to be a PM2.5 concentration (PM2.5) of 18.5 ug/m3.
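
The chain of calculations in Equations 11-1 through 11-3 is short enough to express directly. The sketch below (Python; function names are illustrative) reproduces the worked example above: 139.293 mg tare, 139.727 mg gross, and 16.7 L/min for 1,410 minutes.

    # Sketch of Equations 11-1 through 11-3 using the example values in the text.

    def net_mass_ug(final_mg, tare_mg):
        """Equation 11-2: PM2.5 mass collected on the filter, ug."""
        return (final_mg - tare_mg) * 1.0e3            # mg -> ug

    def sample_volume_m3(avg_flow_lpm, elapsed_min):
        """Equation 11-1: total sample volume, m3 (used only when Va is not
        reported directly by the sampler)."""
        return avg_flow_lpm * elapsed_min * 1.0e-3     # L -> m3

    def pm25_concentration_ugm3(mass_ug, volume_m3):
        """Equation 11-3: PM2.5 concentration, ug/m3."""
        return mass_ug / volume_m3

    if __name__ == "__main__":
        m25 = net_mass_ug(139.727, 139.293)            # 434 ug, as in the example
        va = sample_volume_m3(16.7, 1410)              # 23.547 m3, reported as 23.5 m3
        conc = pm25_concentration_ugm3(m25, round(va, 1))
        print(f"M2.5 = {m25:.0f} ug, Va = {va:.1f} m3, PM2.5 = {conc:.1f} ug/m3")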

-------
                                                                              Model QAPP
                                                                               Appendix C
                                                                              Revision No:
                                                                              Date: 4/18/98
	Page 12 of 17

SOPs: CALIBRATION PROCEDURES


A-4    Flow Rate Calibration Verification Procedure

The sampler's flow rate measurement system must be verified/recalibrated after electromechanical
maintenance or transport of the sampler, and whenever there is any indication that the system is
inaccurate or operating abnormally. Be sure to check the temperature and pressure measurement
systems also.

For routinely operated samplers that are performing properly, the sampler's flow rate
measurement system will be verified/recalibrated at periodic intervals not to exceed 1 year. A
good way to determine an appropriate frequency for each sampler is to keep a control chart (a
running plot of the difference (or % difference) between the sampler's flow rate measurement
system and the flow rate measurement of the NIST-traceable flow rate standard) for all
calibrations, audits and flow checks. Such a chart alerts the operator should the performance of
the flow rate measurement system degrade to such an extent that repairs are required.

1.   Equilibrate the selected flow-rate calibration device to the ambient conditions of the air mass
    for which flow is to be measured. This equilibration can take up to an hour, depending on the
    difference from the conditions at which the instrument was stored prior to moving it to the
    point of use.  During this equilibration period, the standard must be exposed to the prevailing
    air conditions, but it must also be protected from precipitation, wind, dust, solar heating, and
    other hazards that could affect its accuracy.
2.   Install a flow check filter cassette in the sampler. This filter will meet all specifications for
    PM2 5 sampling, but it does not need to be preweighed or postweighed. Discard this filter once
    calibration is complete.
3.   Remove the inlet from the  sampler. Place the flow calibration device on the sampler down
    tube using a flow adaptor device if necessary. Ensure that any valves are open so that flow
    through the sampler is unrestricted.
4.   Place the sampler in calibration mode according to instructions in the manufacturer's
    operating manual. Calibration of the sampler's flow-rate measurement system must consist of
    at least three separate flow-rate measurements (a multipoint calibration) approximately evenly
    spaced within the range of -10% to +10% of the sampler's operational flow rate (40 CFR
    Part 50, Appendix L, Sec. 9.2.4). The sampler is required to have the capability to adjust the
    flow rate over the -10% to +10% range (40 CFR Part 50, Appendix L, Sec. 7.4.2). The
    sampler's instruction manual will provide additional guidance on this flow-rate adjustment.

Verification of the sampler's flow rate shall consist of one flow-rate measurement at the sampler's
operational flow rate (40 CFR Part 50, Appendix L, Sec. 9.2.4). This one-point verification of
the flow-rate measurement system may be substituted for a three-point calibration, provided that a
full three-point calibration is carried out upon initial installation of the sampler and at least once
per year and that the flow rate measurement system has met the ± 2% accuracy requirement in the
previous three-point calibration verification. A full three-point calibration verification must be
done whenever a one-point verification indicates that the sampler's flow-rate measurement system
differs by ± 4% or more from the flow rate measured by the flow rate standard, and the one-point
verification must be repeated after the three-point calibration (40 CFR Part 50, Appendix L, Sec.
9.2.6). (A sketch of this one-point decision check follows this procedure.)

5.  Follow the instructions in the manufacturer's manual for performing the flow calibration.

6.   Once calibration is completed successfully, turn off the sampler pump, remove the filter
    cassette from the filter cassette holder, remove the flow calibration device (and flow adaptor

-------
                                                                               Model QAPP
                                                                                Appendix C
                                                                               Revision No:
                                                                               Date: 4/18/98
                                                                 	Page 13 of 17

    device, if applicable), and replace the sampler inlet.

7.  The sampler flow rate is now verified/calibrated.
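
The one-point verification decision referenced above depends only on the percent difference between the sampler's flow-rate reading and the flow-rate standard, which is the same statistic plotted on the control chart mentioned at the start of this procedure. A minimal sketch follows (Python; names and example readings are illustrative); whether a one-point check may substitute for the three-point calibration at all also depends on the ± 2% result of the previous three-point verification, as stated above.

    # Sketch: evaluate a one-point flow-rate verification against the NIST-traceable
    # flow-rate standard.  A difference of 4% or more triggers a full three-point
    # calibration followed by a repeat of the one-point verification.

    def percent_difference(sampler_lpm, standard_lpm):
        """Percent difference of the sampler reading relative to the standard."""
        return 100.0 * (sampler_lpm - standard_lpm) / standard_lpm

    def verification_action(sampler_lpm, standard_lpm):
        d = percent_difference(sampler_lpm, standard_lpm)
        if abs(d) >= 4.0:
            return d, "Perform a three-point calibration, then repeat the one-point check"
        return d, "One-point verification acceptable"

    if __name__ == "__main__":
        for sampler_reading in (16.75, 17.60):          # versus a 16.7 L/min standard
            diff, action = verification_action(sampler_reading, 16.7)
            print(f"{diff:+.1f}%  ->  {action}")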

A-5   Temperature Calibration

Calibration frequency for the temperature and pressure sensors will also be set based on such
control charts or equivalent operational experience.

The ambient air temperature sensor is located inside the shielded fixture on the outside of the
PM2 5 sampler and is easy to unfasten and remove for comparison to a transfer standard for
temperature. The three-point calibration can be conducted at the field site, although it may prove
easier to remove the sampler to the laboratory to avoid weather problems and for convenience in
preparing the temperature standards.

On the other hand, the filter temperature sensor of Reference or Class I Equivalent PM2.5
samplers is located in the (open) space just below the filter cassette. It is threaded through the
walls of the filter cassette holding assembly section of the sampler and removal of plastic or metal
fittings is required to remove the sensor and its associated wiring. It is recommended that this
sensor be calibrated in the laboratory. The temperature sensor housing, the sampler inlet, and the
interior of the down tube can also be cleaned in the laboratory.  Be careful when removing the
filter temperature sensor; do not gall the fittings, since this could start an internal leak after the
installation. It is suggested that a sampler leak check be performed after reinstallation of the filter
temperature sensor.

Several steps to follow in calibrating ambient air temperature are given below. Make frequent
reference to the operator's instruction manual for sampler-specific procedures and instructions.

1.  Remove the ambient temperature sensor from the radiation shield so that it can be placed in  a
    constant temperature  bath while  it (the sensor) is still connected to the sampler's signal
    conditioner.
2.  Prepare a convenient container (such as an insulated vacuum bottle) for the ambient
    temperature water bath and the ice slurry bath.  See step 3 below. If complete immersion of the
    sensor is necessary, wrap  it in plastic film so that liquid cannot reach the point where the
    connecting wire(s) and the sensor interface. Use partial immersion when possible, thus
    keeping the interface  dry. To further insulate the vacuum bottle, it can be positioned inside a
    larger 2-gallon insulated container that has been modified to allow wires or cables to enter
    from the top. Refer to Figure 4.3.5.3 of Volume IV of the EPA QA handbook.

Keep the temperature changes relatively small and make comparative measurements in this order:
Ambient, Cold, Ambient, Hot, Ambient. The range of temperatures need be only as broad as
that expected to contain all the ambient temperatures that will be experienced during the
upcoming time period, generally a year.

3.  For the ambient bath, use an insulated bottle that  was filled with tap or deionized water
    several hours earlier and allowed to equilibrate to ambient temperature. For the ice slurry, the
    ice will be made with distilled water and then crushed into pea-sized pieces and mixed with
    distilled water until an easily penetrable slurry state is reached. As long as ice is present in the
    slurry and the open end of the bottle is guarded from ambient air temperature fluctuations, the
    ice slurry temperature will be 0.0 ± 0.1 °C.
4.  Wrap the sensor(s) and a thermometer together so that the thermometer bulb and the
    temperature sensor active site will be close together. Immerse the sensor and the attached
    thermometer in the ambient temperature bath. Use a cork or some other device to cover the

-------
                                                                              Model QAPP
                                                                               Appendix C
                                                                              Revision No:
                                                                              Date: 4/18/98
	    Page 14 of 17

   open end of the insulated bottle and thus keep ambient air from circulating over the top
   surface of the water (or ice slurry mass). Wait at least 5 minutes for the ambient thermal mass
   and the sensor/thermometer to equilibrate. Wait at least 15 minutes for equilibration with the
   ice slurry before taking comparative readings.

For each thermal mass, in the order indicated in Step 2 above, make a series of five
measurements, taken about a minute apart. Accurately read the meniscus of the thermometer. Use
a magnifier if necessary to see the meniscus; avoid parallax errors. If the measurements made
support the assumption of equilibrium, then average the five readings and record the result as the
sensor temperature relative to the thermometer for ambient and for 0.0 °C relative to the ice
slurry.
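
The comparative readings described above reduce to a mean per thermal mass and an offset from the reference (the thermometer reading for the ambient bath; 0.0 °C for the ice slurry). A brief sketch follows (Python; the readings and function name are illustrative only).

    # Sketch: average the five comparative readings for each thermal mass and report
    # the sensor's offset from the reference temperature.

    def sensor_offset_c(sensor_readings_c, reference_c):
        """Mean sensor reading minus the reference temperature, in degrees C."""
        mean_sensor = sum(sensor_readings_c) / len(sensor_readings_c)
        return mean_sensor - reference_c

    if __name__ == "__main__":
        ambient = [21.6, 21.5, 21.6, 21.7, 21.6]    # five readings, about a minute apart
        ice     = [0.2, 0.1, 0.2, 0.2, 0.1]
        print(f"Ambient-bath offset: {sensor_offset_c(ambient, 21.5):+.2f} C")
        print(f"Ice-slurry offset:   {sensor_offset_c(ice, 0.0):+.2f} C")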

A-6  Sampler  Pressure Calibration Procedure.

General: According to ASTM Standard D 3631 (ASTM 1977), a barometer can be calibrated by
comparing it with a secondary standard traceable to a NIST primary  standard.

Precautionary Note: Protect all barometers from violent mechanical shock and explosively
sudden changes in pressure. A barometer subjected to either of these events must be recalibrated.
Maintain the vertical and horizontal temperature gradients across the instruments at less than
0.1 °C/m. Locate the instrument so as to avoid direct sunlight, drafts, and vibration.

Laboratory: The Fortin type of barometer is best employed as a higher quality laboratory
standard which is used to adjust and certify an aneroid type of barometer in the laboratory.

Fortin Type Barometer Readings

1. Read the temperature, to the nearest 0.1 °C, from the thermometer attached to the barrel of
   the barometer.
2. Lower the mercury level in the cistern until it clears the index pointer. Raise the level slowly
   until a barely discernible dimple appears on the surface of the mercury.
3. Tap the barrel near the top of the mercury column.
4. Set the vernier so that the base just cuts off light at the highest point of the meniscus and
   carefully avoid parallax error.
5. Read the height of the mercury column from the barometer in the manner appropriate to the
   vernier scale used to the equivalent of the nearest 0.1 mm Hg. Apply appropriate
   corrections for temperature and gravity as described in the barometer instruction booklet.

Field: Aneroid Type Barometer

1. Always use and read an aneroid type barometer when it is in the same position (vertical or
   horizontal) as it was when calibrated.
2. Immediately before reading the scale of an aneroid barometer with mechanical linkages, tap its
   case lightly to overcome bearing drag.
3. Read the aneroid barometer to the nearest 1 mm Hg.

A-7    Sampler and Standard Volumetric Flow  Rate Sensors with Built-in
Clocks: Time-of-Day

Time, as in time-of-day, is not a standard even though it can be referenced to time-of-day
signals maintained at national labs. Any frequency source can be used to derive a time-of-day
signal with the same stability as the source, but the accuracy depends upon how well the source
can be synchronized to the external reference. Proper synchronization of the signal requires a

-------
                                                                              Model QAPP
                                                                               Appendix C
                                                                              Revision No:
                                                                              Date: 4/18/98
	Page 15 of 17

knowledge of equipment and signal propagation delays. The accuracy of any timekeeping system
depends upon how often the signal is synchronized to the reference, the stability of the oscillator,
and the distribution delays from the system to the user.

A number of satellite, radio, and telephone systems carry a digital time code. With the right
equipment, this code can be read and used to obtain time-of-day. In some cases (WWV and
WWVH radio and telephone services) the time is also sent by voice. A list of time-of-day signals
is given below. It is the responsibility of the lab acquiring such time signals to properly account
for all propagation and equipment delays. National laboratories, such as NIST, can only ensure
the time is accurate as it leaves the broadcast source.

A. Radio Sources of time-of-day signals.

   a.  WWV and WWVH (voice and digital code)
   b.  WWVB (digital code)
   c.  GOES Satellite (digital code)
   d.  Global Positioning Satellite System (digital code)

B. Telephone Sources of time-of-day signals

   a.  Automated Computer Time Service (digital code for computers by telephone at
       303-494-4774)
   b.  NIST Telephone Service (WWV audio at 303-499-7111)

       From Section 2.3.0, "Technical Criteria for Calibration Laboratories Engaged in
       Time and Frequency Calibrations," starting on page 94 of NIST Handbook 150-2
       (Draft; June 1996), "NVLAP Calibration Laboratories Technical Guide," C. Douglas
       Faison, Editor.

A-8    Relative Humidity Verification for Environmental
Conditioning/Weighing Room

The procedure for calibrating the thermometers in a psychrometer is essentially the same as any
thermometer calibration (See App. A-5).

Once a mercury or alcohol liquid-in-glass thermometer is calibrated, there is no need for
recalibration unless it is to be used as a reference or transfer standard. Errors in wet-bulb
temperatures are most frequently the result of an improperly installed or dirty muslin wick, the
repeated use of tap water instead of distilled water, or human error in reading. Wicking material
used on psychrometers must be washed to remove traces of sizing and fingerprints. Once
cleaned, the material is tied at the top of the thermometer bulb and a loop of thread placed around
the bottom so the thermometer bulb is tightly covered. To prevent solid materials from collecting
on the cloth and interfering with proper evaporation, the wick will be wetted with distilled water.
Of course, slinging or motor aspiration will be done in the shade, away from reflected or scattered
radiation, at a ventilation rate of about 3 to 5 m/s. Many technique-related errors are minimized by
using an Assmann-type, motor-operated psychrometer, provided that the instrument is allowed to
assume near-ambient conditions prior to use. (From Subsections 4.5.2 and 4.5.3 of the EPA QA
Handbook, Volume IV, Meteorological Measurements.)
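
Where a calculated cross-check of psychrometer readings is useful, the hypothetical Python
sketch below shows one common form of the psychrometric calculation (a Magnus-type
saturation vapor pressure with a ventilated-psychrometer constant). The coefficients are typical
published values assumed for illustration; laboratory-specific tables or the instrument manual
govern actual reporting.

    # Sketch: relative humidity from aspirated wet-bulb/dry-bulb readings.
    # Coefficients below are typical published values (assumptions, not QAPP values).
    import math

    def saturation_vapor_pressure_hpa(temp_c):
        """Saturation vapor pressure over water (Magnus approximation), in hPa."""
        return 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))

    def relative_humidity(dry_bulb_c, wet_bulb_c, pressure_hpa=1013.25):
        """Relative humidity (%) for a ventilated (about 3 to 5 m/s) psychrometer."""
        psychrometer_constant = 6.6e-4   # per deg C, ventilated wet bulb
        e = (saturation_vapor_pressure_hpa(wet_bulb_c)
             - psychrometer_constant * pressure_hpa * (dry_bulb_c - wet_bulb_c))
        return 100.0 * e / saturation_vapor_pressure_hpa(dry_bulb_c)

    # Example: 22.0 deg C dry bulb, 16.5 deg C wet bulb at station pressure 1005 hPa
    print(round(relative_humidity(22.0, 16.5, 1005.0), 1))   # roughly 57% RH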

For an additional discussion, see Section 2.8.0, "Technical Criteria for Calibration Laboratories
Engaged in Thermodynamic Calibrations," starting on page 255 of NIST Handbook 150-2
(Draft; June 1996), entitled "NVLAP Calibration Laboratories Technical Guide," edited by
C. Douglas Faison.

Both the dew cell and the cooled-mirror hygrometer can be checked for approximate calibration
accuracy with a motor-operated psychrometer. Their performance will be verified under stable
conditions at night or under cloudy conditions during the day. Several readings will be taken at
the intake of the aspirator or shield. Bench calibrations of these more sophisticated units must
be made by the manufacturer.
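
A minimal sketch of that comparison check follows; the function name and the 5 percent RH
acceptance limit are assumptions for illustration, not requirements of this QAPP.

    # Illustrative comparison of a dew cell or cooled-mirror hygrometer against the
    # average of several psychrometer readings taken at the aspirator or shield intake.
    from statistics import mean

    def rh_check(psychrometer_rh_readings, hygrometer_rh, acceptance_limit_pct=5.0):
        """Return True if the hygrometer agrees with the psychrometer average."""
        difference = abs(mean(psychrometer_rh_readings) - hygrometer_rh)
        return difference <= acceptance_limit_pct

    # Example: three psychrometer-derived RH readings versus a cooled-mirror reading
    print(rh_check([41.8, 42.5, 42.1], 44.0))   # True (difference is about 1.9% RH)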

References

ASTM. 1993a. Standard practice for maintaining constant relative humidity by means of aqueous
solutions. American Society for Testing and Materials. 1993 Annual Book of ASTM Standards,
Vol 11.03, Designation E 104-85 (reapproved 1991), pp. 570-572.

ASTM. 1993b. Standard specification for laboratory weights and precision mass standards.
American Society for Testing and Materials. 1993 Annual Book of ASTM Standards, Vol 14.02,
Designation E 617-91, pp. 280-295.

Desert Research Institute. 1994. DRI Standard Operating Procedure. Gravimetric Analysis, DRI
SOP No. 2-102.3, Reno, NV. 24 pp.

Eisner AD. 1997. Personal communication. ManTech Environmental Technology, Inc., Research
Triangle Park, NC.

Engelbrecht DR, Cahill TA, Feeney PJ.  1980. Electrostatic effects on gravimetric analysis of
membrane filters. Journal of the Air Pollution Control Association 30(4):319-392.

EPA. 1997. Reference method for the determination of fine particulate matter as PM2.5 in the
atmosphere. U.S. Environmental Protection Agency. 40 CFR Part 50, Appendix L.

EPA. 1995. Quality Assurance Handbook for Air Pollution Measurement Systems Volume IV:
Meteorological Measurements. U.S. Environmental Protection Agency. Document No. EPA/600/
R-94/038d. Revised March.

EPA. 1997. Reference method for the determination of fine particulate matter as PM2.5 in the
atmosphere. U.S. Environmental Protection Agency. 40 CFR Part 58, Appendix L, as amended
July 18, 1997.

Harris GL. 1994. State weights and measures laboratories:  State standards program description.
National Institute of Standards and Technology. Special publication 791. 130 pp.

Harris G. 1993. Ensuring accuracy and traceability of weighing instruments. ASTM
Standardization News 21(4):44-51.

Hawley RE, Williams CJ. 1974. Electrostatic effects in microbalances. I. General considerations
of the effectiveness of a radioactive ionizing source under ambient conditions. Journal of Vacuum
Science and Technology 1

Kupper WE. High accuracy mass measurements, from micrograms to tons. Instrument Society of
America Transactions 29(4):11-20.

Nelson GO. 1971. Controlled Test Atmospheres: Principles and Techniques. Ann Arbor Science
Publishers, Ann Arbor, MI, p. 43.

NIST. 1989. The calibration of thermocouples and thermocouple materials. National Institute of
Standards and Technology. Special publication 250-35. April.

NIST. 1988. NIST measurement services: liquid-in-glass thermometer calibration service.
National Institute of Standards and Technology. Special publication 250-23. September.

NIST. 1976. Liquid-in-glass thermometry. National Institute of Standards and Technology. NBS
Monograph 150. January.

NIST. 1986. Thermometer calibration: a model for state calibration laboratories. National
Institute of Standards and Technology. NBS Monograph 174. January.

Weil J. 1991. Static control for balances. Cahn Technical Note. Published by Cahn Instruments,
Inc., Madison, WI.

White VR. 1997. National Voluntary Laboratory Accreditation Program: 1997 Directory.
National Institute of Standards and Technology. Special publication 810. 225 pp.

ASTM. 1977. Standard test methods for measuring surface atmospheric pressure. American
Society for Testing and Materials. Philadelphia, PA. Standard D 3631-84.

                                     Appendix D

                              Data Qualifiers/Flags


A sample qualifier or a result qualifier  consists of 3 alphanumeric characters which act as an
indicator of the fact and the reason that the subject analysis (a) did not produce a numeric result,
(b) produced a numeric result but it is qualified in some respect relating to the type or validity of
the result or (c) produced a numeric result but for administrative reasons is not to be reported
outside the  laboratory.
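
As an illustration only, the hypothetical Python sketch below shows one way such three-character
qualifiers could be stored and validated in software; the two example entries are drawn from the
field-qualifier table that follows.

    # Minimal illustration of storing and checking 3-character data qualifiers.
    FIELD_QUALIFIERS = {
        "CON": "Contamination, including observations of insects or other debris",
        "DAM": "Filter appeared damaged",
    }

    def is_valid_qualifier(code):
        """A qualifier is exactly three alphanumeric characters."""
        return len(code) == 3 and code.isalnum()

    def describe(code):
        """Look up a qualifier's description, if one is defined."""
        if not is_valid_qualifier(code):
            raise ValueError("Not a valid 3-character qualifier: " + repr(code))
        return FIELD_QUALIFIERS.get(code, "Qualifier not defined in this QAPP")

    print(describe("CON"))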

Field Qualifiers

CON (Contamination): Contamination, including observations of insects or other debris.
DAM (Filter Damage): Filter appeared damaged.
EST^ (Elapsed Sample Time): Elapsed sample time out of specification.
EVT (Event): Exceptional event expected to have affected the sample (dust, fire, spraying, etc.).
FAC (Field Accident): There was an accident in the field that either destroyed the sample or
    rendered it not suitable for analysis.
FAT (Failed Temperature Check, Ambient): Ambient temperature check out of specification.
FIT (Failed Temperature Check, Internal): Internal temperature check out of specification.
FLR^ (Flow Rate): Flow rate 5-minute average out of specification.
FLT^ (Filter Temperature): Filter temperature differential over a 30-minute interval out of
    specification.
FMC (Failed Multi-point Calibration Verification): Failed the initial multi-point calibration
    verification.
FPC (Failed Pressure Check): Barometric pressure check out of specification.
FSC (Failed Single-point Calibration Verification): Failed the initial single-point calibration
    verification.
FVL (Flow Volume): Flow volume suspect.
GFI (Good Filter Integrity): Filter integrity looks good upon post-sampling field inspection.
LEK (Leak Suspected): Internal/external leak suspected.
SDM (Sampler Damaged): Sampler appears to be damaged, which may have affected the filter.

^ Flag generated by sampling equipment.
Laboratory Qualifiers

ALT (alternate measurement): The subject parameter was determined using an alternate
    measurement method. The value is believed to be accurate but could be suspect.
AVG (average value): Average value; used to report a range of values.
BDL (below detectable limits): There was not a sufficient concentration of the parameter in the
    sample to exceed the lower detection limit in force at the time the analysis was performed.
    The numeric results field, if present, is at best an approximate value.
BLQ (below limit of quantitation): The sample was considered above the detection limit, but
    there was not a sufficient concentration of the parameter in the sample to exceed the lower
    quantitation limit in force at the time the analysis was performed.
CAN (canceled): The analysis of this parameter was canceled and not performed.
CBC (cannot be calculated): The calculated analysis result cannot be calculated because an
    operand value is qualified.

EER (entry error): The recorded value is known to be incorrect, but the correct value cannot be
    determined to enter a correction.
FBK (found in blank): The subject parameter had a measurable value above the established QC
    limit when a blank was analyzed using the same equipment and analytical method. Therefore,
    the reported value may be erroneous.
FCS (failed collocated sample): Collocated sample exceeded acceptance criteria limits.
FFB (failed field blank): Field blank samples exceeded acceptance criteria limits.
FIS (failed internal standard): Internal standards exceeded acceptance criteria limits.
FLB (failed laboratory blank): Laboratory blank samples exceeded acceptance criteria limits.
FLD (failed laboratory duplicate): Laboratory duplicate samples exceeded acceptance criteria limits.
FLH (failed laboratory humidity): Laboratory humidity exceeded acceptance criteria limits.
FLT (failed laboratory temperature): Laboratory temperature exceeded acceptance criteria limits.
FQC (failed quality control): The analysis result is not reliable because quality control criteria
    were exceeded when the analysis was conducted. The numeric field, if present, is an
    estimated value.
GSI (Good Shipping Integrity): Integrity of the filter upon receipt by shipping/receiving looked good.
HTE (holding time exceeded): Filter holding time exceeded acceptance criteria limits.
ISP (improper sample preservation): Due to improper preservation of the sample, it was rendered
    not suitable for analysis.
INV (invalid sample): Due to a single flag or a number of flags or events, the sample was
    determined to be invalid.
LAC (laboratory accident): There was an accident in the laboratory that either destroyed the
    sample or rendered it not suitable for analysis.
LLS (less than lower standard): The analysis value is less than the lower quality control standard.
LTC (less than criteria of detection): The value reported is less than the criteria of detection.
NAR (no analysis result): There is no analysis result required for this subject parameter.
REJ (rejected): The analysis results have been rejected for an unspecified reason by the
    laboratory. For any results where a mean is being determined, these data were not used in
    the calculation of the mean.
REQ (request for re-analysis): The analysis is not approved and must be re-analyzed using a
    different method.
RET (returned for re-analysis): The analysis result is not approved by laboratory management,
    and reanalysis by the bench analyst is required with no change in the method.
RIN (re-analyzed): The indicated analysis results were generated from a re-analysis.
STD (internal standard): The subject parameter is being utilized as an internal standard for other
    subject parameters in the sample. There is no analysis result reported, although the
    theoretical and/or limit value(s) may be present.
UNO (analyzed but undetected): Indicates the material was analyzed for but not detected.

                                   Appendix E


                       Standard Operating Procedures

The following list is meant to provide an example of the types of standard operating procedures
that would be available for the PM2.5 program, either included in the QAPP or referenced. In
either case, they would need to be available to EPA Regional personnel during the review of
the QAPP.

Equipment/Consumables

AIR-EQ-IN1    Receipt, Inspection, and Acceptance Procedures for Capital Equipment
              Sections: PM2.5 Equipment

AIR-EQ-CN1    Receipt, Inspection, and Acceptance Procedures for Consumable Supplies
              Section 13, PM2.5 Consumables: Filter Handling; Filter Integrity Check; Sample
              Storage; Sample Chain-of-Custody
              Comments: Receipt/testing/inspection procedures for consumables (particularly filters)

Laboratory Activities

AIR-LAB-FP1   Standard Operating Procedures for Preparation, Weighing, and Data Recording for
              the PM2.5 Monitoring Program
              Sections: Mass Reference Standards; Filter Conditioning (pre- and post-sampling);
              Electrostatic Charge Neutralization; Pre-sampling Filter Weighing; Sample
              Chain-of-Custody; Temperature Calibration/Verification; Relative Humidity
              Verification; Laboratory Maintenance; Sample Storage/Archive

Field Activities

AIR-FLD-FP1   Standard Procedures for Operation of Field Monitoring Sites for the PM2.5
              Monitoring Program
              Sections: Monitor Set-up/Installation; Filter Selection from Laboratory; Filter
              Installation and Recovery; Filter Transportation, Packaging, and Shipping; Sample
              Chain-of-Custody; Flow Rate Calibration/Verification; Temperature
              Calibration/Verification; Sampler Pressure Calibration; Internal/External Leak
              Checks; Field Maintenance
              Comments: SOPs for field activities

Shipping/Receiving

AIR-SHP-FP1   Standard Operating Procedures for Receiving PM2.5 Filters from the Field
              Sections: Receiving and Inspection; Sample Chain-of-Custody; Sample Storage

Information Management

AIR-IS-FP1    Data Acquisition Procedures for the PM2.5 Monitoring Program
              Sections: Data Entry; Filter Conditioning; Filter Pre-weighing; Filter Post-weighing;
              Field Data Acquisition; Sample Chain-of-Custody
              Comments: Data entry SOPs for hardcopy information (field and laboratory forms)
              and entry into automated systems

AIR-IS-FP2    Data Processing Procedures for the PM2.5 Monitoring Program
              Sections: Data Review; Data Editing; Data Verification; Calculations, Algorithms,
              and Data Reduction; Back-up/Security Procedures; Data Validation

AIR-IS-FP3    AIRS Data Transmittal Procedures for the PM2.5 Monitoring Program
              Sections: Upload to AIRS; AIRS Checks/Edits; Security



                               Appendix F
          PM2.5 Reference Material Guidance Documents

The following documents provide guidance on various aspects of the PM2.5 Ambient Air Quality
Monitoring Program. It is anticipated that many of these documents will be available on the
Internet and the AMTIC Bulletin Board. Internet addresses are included in the status column.

General

PM2.5 Implementation Plan, March 1998
    Status: Presently on AMTIC (www.epa.gov/ttn/amtic)

PM2.5 Quality Assurance Program Overview, October 1997
    Status: Presently on AMTIC (www.epa.gov/ttn/amtic)

Quality Assurance Handbook for Air Pollution Measurement Systems, Volume I: A Field Guide to
Environmental Quality Assurance, U.S. Environmental Protection Agency, EPA-600/R-94-038a,
April 1994.
    Status: Current

Quality Assurance Handbook for Air Pollution Measurement Systems, Volume II: Ambient Air
Specific Methods, EPA-600/R-94-038b, April 1994.
    Status: Interim edition [replaces EPA-600/4-77-027a (revised 1990)]; final updated edition
    expected May 1998 under new EPA number EPA-454/R-98-004

Quality Assurance Handbook for Air Pollution Measurement Systems, Volume IV: Meteorological
Measurements, EPA-600/R-94/038d, Revised April 1994.

Quality Assurance Handbook for Air Pollution Measurement Systems, Volume V: Precipitation
Measurement Systems (Interim Edition), EPA-600/R-94-038e, April 1994.
    Status: Interim edition (replaces EPA-600/4-82-042a-b); final updated edition expected
    early 1996

Model Quality Assurance Project Plan for the PM2.5 Ambient Air Monitoring Program, March 1998
    Status: Presently on AMTIC (www.epa.gov/ttn/amtic/pmqa.html)

Quality Management

EPA Quality Systems Requirements for Environmental Programs, EPA QA/R-1
    Status: Available in Summer 1998

Guidance for Developing Quality Systems for Environmental Data Operations, EPA QA/G-1
    Status: Available in Fall 1998

EPA Requirements for Quality Management Plans, EPA QA/R-2, U.S. Environmental Protection
Agency, QAD, August 1994.
    Status: Draft available on the Internet (es.epa.gov/ncerqa/qa); final expected Summer 1998

Guidance for the Management Systems Review Process, EPA QA/G-3, Draft, January 1994
    Status: Available in Summer 1998

EPA Requirements for Quality Assurance Project Plans, EPA QA/R-5, current version: Draft,
November 1997
    Status: Draft available on the Internet (es.epa.gov/ncerqa/qa)

Guidance on Quality Assurance Project Plans, EPA QA/G-5, EPA/600/R-98/018
    Status: Draft available on the Internet (es.epa.gov/ncerqa/qa); final issued February 1998

Policy and Program Requirements to Implement the Mandatory Quality Assurance Program,
Order 5360.1, April 1984.
    Status: Current; basis for the EPA QA program (updated in the 1995 draft Order)

Data Quality Objectives

Guidance on Applying the Data Quality Objectives Process for Ambient Air Monitoring Around
Superfund Sites (Stages I and II), EPA-450/4-89-015, August 1989.
    Status: Basically current guidance

Guidance on Applying the Data Quality Objectives Process for Ambient Air Monitoring Around
Superfund Sites (Stage III), EPA-450/4-90-005, March 1990.
    Status: Basically current guidance

Decision Error Feasibility Trials (DEFT) Software for the Data Quality Objectives Process,
EPA QA/G-4D, EPA/600/R-96/056
    Status: Draft available on the Internet (es.epa.gov/ncerqa/qa); final issued September 1994

Guidance for the Data Quality Objectives Process, EPA QA/G-4, EPA/600/R-96/055
    Status: Draft available on the Internet (es.epa.gov/ncerqa/qa); final issued September 1994

Precision and Accuracy (P&A)

Guideline on the Meaning and Use of Precision and Accuracy Data Required by 40 CFR Part 58,
Appendices A and B, U.S. Environmental Protection Agency, EPA-600/4-83-023, June 1983.
    Status: Some items out of date (e.g., SAROAD versus AIRS, no PM-10, etc.)

Guidance for Data Quality Assessment: Practical Methods for Data Analysis, EPA QA/G-9,
EPA/600/R-96/084
    Status: Draft available on the Internet (es.epa.gov/ncerqa/qa); final issued January 1998

System Audits

National Air Audit System Guidance Manual for FY 1988-FY 1989, U.S. Environmental
Protection Agency, EPA-450/2-88-002, February 1988.
    Status: National audit report discontinued in FY89

Network Design and Siting

Guidance for Network Design and Optimum Site Exposure for PM2.5 and PM10, December 1997
    Status: Draft published 12/15/97; presently on AMTIC (www.epa.gov/ttn/amtic)

SLAMS/NAMS/PAMS Network Review Guidance, Draft, March 1998
    Status: Presently on AMTIC (www.epa.gov/ttn/amtic)

Network Design and Optimum Site Exposure Criteria for Particulate Matter, EPA-450/4-87-009,
May 1987.
    Status: Basically current; could be revised when the new PM standard is proposed

Network Design and Site Exposure Criteria for Selected Noncriteria Air Pollutants,
EPA-450/4-84-022, September 1984.
    Status: Partially out of date

Appendix E and F to Network Design and Site Exposure Criteria for Selected Noncriteria Air
Pollutants, EPA-450/4-84-022a, October 1987.
    Status: Partially out of date

Ambient Air Monitoring Methods

Filter Conditioning and Weighing Facilities and Procedures for PM2.5 Reference and Class I
Equivalent Methods, February 1998
    Status: Presently on AMTIC (www.epa.gov/ttn/amtic)

Guidance Document 2.12, Monitoring PM2.5 in Ambient Air Using Designated Reference or
Class I Equivalent Methods

EPA QA/G-6, Guidance for the Preparation of Standard Operating Procedures for Quality-Related
Operations, Final, EPA/600/R-96/027, November 1995
    Status: Draft available on the Internet (es.epa.gov/ncerqa/qa)

Static Control for Balances
    Status: Presently on AMTIC (www.epa.gov/ttn/amtic)

Ambient Air Monitoring Costs

Guidance for Estimating Ambient Air Monitoring Costs for Criteria Pollutants and Selected Air
Toxic Pollutants, EPA-454/R-93-042, October 1993.
    Status: Partially out of date; needs a longer amortization schedule

Other

Guideline on the Identification and Use of Air Quality Data Affected by Exceptional Events,
EPA-450/4-86-007, July 1986.
    Status: Currently being updated

IntraAgency Task Force Report on Air Quality Indicators, EPA-450/4-81-015, February 1981.
    Status: Not a policy or guidance document; could be updated to include more modern
    analysis and presentation techniques

Screening Procedures for Ambient Air Quality Data, EPA-450/2-78-037, July 1978.
    Status: Could be updated to include more modern computer programs and newer screening
    procedures

Validation of Air Monitoring Data, U.S. Environmental Protection Agency, EPA-600/4-80-030,
June 1980.
    Status: Partially out of date

-------