United States Environmental Protection Agency
Office of Air Quality Planning and Standards
Research Triangle Park, NC 27711
March 2009

Quality Assurance Document
Quality Assurance Project Plan
for the Federal PM2.5 Performance
Evaluation Program

-------
                                     Foreword

U.S. Environmental Protection Agency (EPA) policy per EPA Order 5360.1 A2 requires that all
projects involving the generation, acquisition, and use of environmental data be planned and
documented and have an Agency-approved Quality Assurance Project Plan (QAPP) before the
start of data collection. The primary purpose of the QAPP is to provide a project overview,
describe the need for the measurements, and plan and define the quality assurance/quality
control (QA/QC) activities to be applied to the project, all within a single document.

The following document represents the QAPP for the environmental data operations involved in
EPA's PM2.5 Monitoring Network Performance Evaluation Program. This QAPP was generated
by using the following EPA monitoring and QA regulations and guidance:
    •   40 CFR Part 50, Appendix L
    •   40 CFR Part 58, Appendices A and C
    •   EPA QA/R-5, EPA Requirements for Quality Assurance Project Plans
    •   EPA QA/G-5, Guidance for Quality Assurance Project Plans
    •   EPA QA/G-9, QA00 Update, Guidance for Data Quality Assessment: Practical Methods
        for Data Analysis.

All pertinent elements of the QAPP regulations and guidance are addressed in this QAPP.

This document and related PEP SOPs are accessible in PDF format on the Internet on the
Ambient Monitoring Technology Information Center's (AMTIC's) Bulletin Board (available at
http://www.epa.gov/ttn/amtic/amticpm.html) under the QA area of the PM2.5 Monitoring
Information. The document can be read and printed using Adobe Acrobat Reader software,
which is freeware that is available on many Internet sites, including the U.S. Environmental
Protection Agency's (EPA) Web site.  The Internet version is write-protected. Hardcopy versions
are available by writing or calling:
       Dennis Crumpler
       Office of Air Quality Planning and Standards
       MQAG (C304-06)
       Research Triangle Park, NC 27711
       Phone:(919)541-0871
       E-mail: crumpler.dennis@epa.gov
This is a living document, which means it may be revised as program objectives and
implementation procedures evolve. Comments about technical content and the presentation of
this document may be sent to Dennis Crumpler.

This document mentions trade names and brand names. Mention of corporation names,
trade names, or commercial products does not constitute endorsement or recommendation
for use.
                                          ii

-------
Acknowledgments

This QAPP is the product of the combined efforts of EPA's Office of Air Quality Planning and
Standards (OAQPS); the Office of Radiation and Indoor Air (ORIA) support laboratories in Las
Vegas, NV; ORIA's National Air and Radiation Environmental Laboratory (NAREL) in Montgomery, AL;
EPA Regional offices; and State, local, and Tribal (SLT) organizations. Dennis Crumpler of
OAQPS led and directed the 2007/2008 update; RTI International conducted the work under
EPA contracts 68-D-02-065 and EP-D-08-047, and the PM2.5 QA Workgroup reviewed the
material in this document. The following individuals are acknowledged for their contributions.

SLT Organizations
George Froehlich, New York Department of Environmental Conservation
Mark Potash, Connecticut Department of Environmental Protection

EPA Regions
Region:
  1     Mary Jane Cuzzupe
  2     Mark Winter
  3     Andrew Hass, Cathleen Kennedy, Colleen Walling
  4     Greg Noah
  5     Basim Dihu and Scott Hamilton
  6     John Lay
  7     Thien Bui and James Regehr
  8     Michael Copeland
  9     Mathew Plate
 10     Christopher Hall

RTI International (RTI)
Jennifer Lloyd, Ed Rickman, and Emaly Simone

Office of Air Quality Planning and Standards
Dennis Crumpler, Dennis Mikel, and Mark Shanis

Office of Radiation and Indoor Air
Jeff Lantz
                                         iii

-------
Acknowledgments for the February 1999 Version
The following individuals are acknowledged for their contribution to the first edition of the PEP
QAPP (February 1999 version), which served as the basis for this update.

SLT Organizations
George Apgar, State of Vermont, Waterbury, VT
Randy Dillard, Jefferson County Department of Health, Birmingham, AL
Gordon Pierce and Kevin Goohs, Colorado Department of Public Health and Environment,
      Denver, CO
Russell Grace and Tom Pomales, California Air Resources Board, Sacramento, CA
Jeff Miller, Pennsylvania Department of Environmental Protection, Harrisburg, PA
Richard Heffern, State of Alaska Department of Environmental Conservation, Juneau, AK
Dan Harman, North Dakota Department of Health, Bismarck, ND

EPA Regions

Region:
  1     Don Porteous, Norman Beloin, and Mary Jane Cuzzupe
  2     Clinton Cusick
  3     Victor Guide and Theodore Erdman
  4     Jerry Burger and Herb Barden
  5     Mary Ann Suero, Gordon Jones, Mike Rizzo, and Basim Dihu
  6     Mary Kemp, Mark Sather, Kuenja Chung, Timothy Dawson, and Ruth Tatom
  7     Leland Grooms, Mike Davis,  and Shane Munsch
  8     Ron Heavner, Gordon MacRae, and Joe Delwiche
  9     Manny Aquitania and Bob Pallarino
 10     Barry Towns, Bill Puckett, and Karen Marasigan.

National Exposure Research Laboratory
Frank McElroy and David Gemmill

RTI
Jim Flanagan, Cynthia Salmons, Gary Eaton, and Bob Wright

Office of Air Quality Planning and Standards
Joe Elkins, Shelly Eberly, Tim Hanley, David Musick, and Mark Shanis
                                          iv

-------
                           Acronyms and Abbreviations

AIRS        Aerometric Information Retrieval System
ANSI        American National Standards Institute
APTI        Air Pollution Training Institute
AQS         Air Quality System
ASTM       American Society for Testing and Materials
AWMA      Air and Waste Management Association
BP          barometric pressure
CAA        Clean Air Act
CFR         Code of Federal Regulations
CMD        Contracts Management Division
CMZ        community monitoring zone
CO          Contracting Officer
COC         chain of custody
COR        Contracting Officer's Representative
DAS         data acquisition system
DCO        Document Control Officer
DOPO       Delivery Order Project Officer
DQA        data quality assessment
DQOs       data quality objectives
EDO         environmental data operation
EMAD       Emissions, Monitoring, and Analysis Division
ESAT       Environmental Services Assistance Team
EPA         Environmental Protection Agency
FAR         Federal Acquisition Regulations
FEM         Federal equivalent method
FIPS         Federal Information Processing Standards
FR          flow rate
FRM        Federal reference method
FS          Field Scientist
GIS          geographical information systems
GLP         good laboratory practice
LA          Laboratory Analyst
LAN        local area network
MPA        monitoring planning area
MQOs       measurement quality objectives
MSA        metropolitan statistical area
MSR        management system review
NAAQS      National Ambient Air Quality Standards
NAMS       national air monitoring station
NATTS      National Air Toxics Trend Stations

-------
NCore       National Core multi-pollutant monitoring stations
NIST        National Institute of Standards and Technology
OAQPS      Office of Air Quality Planning and Standards
OARM      Office of Administration and Resources Management
ORD        Office of Research and Development
ORIA        Office of Radiation and Indoor Air
PC          personal computer
POC         pollutant occurrence code
PD          percent difference
PE          performance evaluation
PEP         Performance Evaluation Program
PM2.5        particulate matter less than 2.5 microns
PTFE        polytetrafluoroethylene
Qa           sampler flow rate at ambient (actual) conditions of temperature and pressure
QA          quality assurance
QAAR       Quality Assurance Annual Report
QAD        Quality Assurance Division Director
QAM        Quality Assurance Manager
QAO        Quality Assurance Officer
QAPP        Quality Assurance Project Plan
QC          quality control
QMP        Quality Management Plan
RH          relative humidity
R&P         Rupprecht & Patashnick
SIPs         State Implementation Plans
SLAMS      State and Local Ambient Monitoring Stations
SOP         standard operating procedure
SOW        statement of work
SPMS        special purpose monitoring stations
SYSOP      system operator
Ta           temperature, ambient or actual
TOPO        Task Order Project Officer
TSA         technical systems audit
TSP         total suspended particulate
Va           air volume, at ambient or actual conditions
VOC        volatile organic compound
WAM        Work Assignment Manager
                                          vi

-------
                                                                       Project: PEP QAPP
                                                                         Element No.: 1.0
                                                                         Revision No.: 1
                                                                          Date: 3/6/2009
                                                                      	Page 1 of 1
                        1.0  QA Project Plan Approval
Title: PM2.5 Performance Evaluation Program Quality Assurance Project Plan

The attached Quality Assurance Project Plan (QAPP) for the PM2.5 Performance Evaluation
Program (PEP) is hereby recommended for approval and commits the participants of the
program to follow the elements described within.
OAQPS       Signature:                              Name: Joe Elkins, QA Manager        Date:
Region 1    Signature:                              Name:                               Date:
Region 2    Signature:                              Name:                               Date:
Region 3    Signature:                              Name:                               Date:
Region 4    Signature:                              Name:                               Date:
Region 5    Signature:                              Name:                               Date:
Region 6    Signature:                              Name:                               Date:
Region 7    Signature:                              Name:                               Date:
Region 8    Signature:                              Name:                               Date:
Region 9    Signature:                              Name:                               Date:
Region 10   Signature:                              Name:                               Date:

-------
                                                                         Project: PEP QAPP
                                                                           Element No.: 2.0
                                                                            Revision No.: 1
                                                                             Date: 3/6/2009
                                                                        	Page 1 of 6
                              2.0 Table of Contents
                                                                         Element No.-Page
Foreword	ii
Acknowledgments	iii
Acronyms and Abbreviations	v
1.0    QA Project Plan Approval	1-1
2.0    Table of Contents	2-1
3.0    Distribution	3-1
4.0    Project/Task Organization	4-1
       4.1    The PEP Workgroup (Previously the PM2.5 QA Workgroup)	4-2
       4.2   EPA's Office of Air Quality Planning and Standards	4-2
       4.3    ESAT Organization	4-3
       4.4   EPA Regional Offices	4-6
       4.5    ESAT Contractors	4-7
       4.6   State, Local, and Tribal Agencies	4-7
       4.7   Other Participating Entities	4-10
5.0    Problem Definition/Background	5-1
       5.1    Problem Statement and Background	5-1
6.0    Project/Task Description	6-1
       6.1    Description of Work to be Performed	6-1
       6.2   Field Activities	6-2
       6.3    Laboratory Activities	6-6
       6.4   Schedule of Activities	6-8
       6.5    Project Assessment Techniques	6-14
       6.6   Project Records	6-14
7.0    Data Quality Objectives and Criteria for Measurement	7-1
       7.1    Data Quality Objectives	7-1
       7.2   Measurement Quality Objectives	7-2
8.0    Special Training Requirements/Certification	8-1
       8.1    OAQPS Training Facilities	8-1
       8.2   Training Program	8-2
       8.3    Field Training	8-2
       8.4   Laboratory Training	8-3

-------
                                                                        Project: PEP QAPP
                                                                          Element No.: 2.0
                                                                           Revision No.: 1
                                                                            Date: 3/6/2009
	Page 2 of 6

                                                                         Element No.-Page

       8.5    Certification	8-4
       8.6    Additional PEP Field and Laboratory Training	8-5
       8.7    Additional Ambient Air Monitoring Training	8-5
9.0    Documentation and Records	9-1
       9.1    Information Included in the Reporting Package	9-1
       9.2    Reports to Management	9-6
       9.3    Data Reporting Package Archiving and Retrieval	9-7
10.0   Sampling Design	10-1
       10.1   Scheduled Project Activities, Including Measurement Activities	10-1
       10.2   Rationale for the Design	10-1
       10.3   Design Assumptions	10-2
       10.4   Procedure for Locating and Selecting Environmental Samples	10-3
       10.5   Classification of Measurements as Critical/Noncritical	10-4
       10.6   Validation of Any Non-Standard Measurements	10-4
11.0   Sampling Methods Requirements	11-1
       11.1   Sample Collection and Preparation	11-1
       11.2   Support Facilities for Sampling Methods	11-3
       11.3   Sampling/Measurement System Corrective Action Process	11-3
       11.4   Sampling Equipment, Preservation, and Holding  Time Requirements	11-9
12.0   Sample Handling and Custody	12-1
13.0   Analytical Methods Requirements	13-1
       13.1   Preparation of Sample Filters	13-1
       13.2   Analysis Method	13-1
       13.3   Internal QC and Corrective Action for Measurement System	13-4
       13.4   Filter Sample Contamination Prevention, Preservation, and Holding Time
              Requirements	13-7
14.0   Quality Control Requirements	14-1
       14.1   QC Procedures	14-1
       14.2   Sample Batching—QC Sample Distribution	14-16
       14.3   Control Charts	14-18
15.0   Instrument/Equipment Testing, Inspection, and Maintenance Requirements	15-1
       15.1   Testing	15-1
       15.2   Inspection	15-1
       15.3   Maintenance	15-3

-------
                                                                        Project: PEP QAPP
                                                                          Element No.: 2.0
                                                                           Revision No.: 1
                                                                            Date: 3/6/2009
	Page 3 of 6

                                                                        Element No.-Page

 16.0   Instrument Calibration and Frequency	16-1
       16.1   Instrumentation Requiring Calibration	16-2
       16.2   Calibration Method That Will Be Used for Each Instrument	16-4
       16.3   Calibration Standard Materials and Apparatus	16-4
       16.4   Calibration Frequency	16-5
       16.5   Standards Recertifications	16-5
 17.0   Inspection/Acceptance for Supplies and Consumables	17-1
       17.1   Purpose	17-1
       17.2   Critical Supplies and Consumables	17-1
       17.3   Acceptance Criteria	17-7
       17.4   Tracking and Quality Verification of Supplies and Consumables	17-7
 18.0   Data Acquisition Requirements	18-1
       18.1   Acquisition of Non-Direct Measurement Data	18-1
 19.0   Data Management	19-1
       19.1   Background and Overview	19-1
       19.2   Data Recording	19-3
       19.3   Data Validation	19-3
       19.4   Data Transformation	19-5
       19.5   Data Transmittal	19-6
       19.6   Data Reduction and Data Integrity	19-7
       19.7   Data Analysis	19-8
       19.8   Data Flagging—Sample Qualifiers	19-9
       19.9   Data Tracking	19-10
       19.10  Data Storage and Retrieval	19-11
 20.0   Assessments and Response Actions	20-1
       20.1   Assessment Activities and Project Planning	20-1
       20.2   Documentation of Assessments	20-9
 21.0   Reports to Management	21-1
       21.1   Communication	21-1
       21.2   Reports	21-4
 22.0   Data Review, Validation, and Verification Requirements	22-1
       22.1   Sampling Design	22-1
       22.2   Sample Collection Procedures	22-2
       22.3   Sample Handling	22-2

-------
                                                                         Project: PEP QAPP
                                                                           Element No.: 2.0
                                                                            Revision No.: 1
                                                                            Date: 3/6/2009
	Page 4 of 6

                                                                         Element No.-Page

       22.4   Analytical Procedures	22-3
       22.5   Quality Control	22-4
       22.6   Calibration	22-4
       22.7   Data Reduction and Processing	22-5
23.0   Validation and Verification Methods	23-1
       23.1   Process for Validating and Verifying Data	23-1
24.0   Reconciliation with Data Quality Objectives	24-1
       24.1   Preliminary Review of Available Data	24-1
       24.2   Regional Level Evaluation of Data Collected While All PEP Samplers Are
              Collocated	24-1
       24.3   National Level Evaluation of Data Collected While All PEP Samplers Are
              Collocated	24-2
APPENDICES
A.     Glossary
B.     Documents to Support Data Quality Objectives
C.     Training Certification Evaluation Forms
D.     Data Qualifiers/Flags
E.     Technical Systems Audit Forms

-------
                                                                        Project: PEP QAPP
                                                                          Element No.: 2.0
                                                                           Revision No.: 1
                                                                            Date: 3/6/2009
                                                                       	Page 5 of 6
                                     Tables
                                                                         Element No.-Page
3-1.    Distribution List	3-1
4-1.    ESAT Oversight	4-4
6-1.    Design/Performance Specifications	6-4
6-2.    Field Measurement Requirements	6-5
6-3.    Laboratory Performance Specifications	6-8
6-4.    Implementation Summary	6-12
6-5.    Data Reporting Schedule for the AQS	6-13
6-6.    Assessment Schedule	6-14
6-7.    Critical Documents and Records	6-15
7-1.    Measurement Quality Objectives—Parameter PM2.5	7-4
8-1.    Core Ambient Air Training Courses	8-6
9-1.    PM2.5 Reporting Package Information	9-2
11-1.   Field Corrective Action	11-6
11-2.   Filter Temperature Requirements	11-10
11-3.   Holding Times	11-11
13-1.   Potential Problems/Corrective Action for Laboratory Support Equipment	13-5
13-2.   Filter Preparation and Analysis Checks	13-5
13-3.   Temperature Control Requirements	13-8
14-1.   Field QC Checks	14-3
14-2.   Laboratory QC	14-6
14-3.   Control Charts	14-18
15-1.   Inspections in the Weigh Room Laboratory	15-2
15-2.   Inspection of Field Items	15-2
15-3.   Preventive Maintenance in Weighing Laboratories	15-4
15-4.   Preventive Maintenance of Field Items	15-5
16-1.   Instrument Calibrations	16-1
16-2.   Calibration Standards and/or Apparatus for PM2.5 Calibration	16-4
17-1.   Weighing Laboratory Equipment	17-2
17-2.   Field Equipment and Supplies	17-3
19-1.   List of PEP Data Processing Operations for Critical Values	19-3
19-2.   Validation Check Summaries	19-5
19-3.   Raw Data Calculations	19-6

-------
                                                                        Project: PEP QAPP
                                                                          Element No.: 2.0
                                                                           Revision No.: 1
                                                                           Date: 3/6/2009
	Page 6 of 6

                                                                        Element No.-Page
 19-4.  Data Transfer Operations	19-6
 19-5.  Data Reporting Schedule	19-7
 19-6.  Data Assessment Equations	19-8
 19-7.  Data Archive Policies	19-11
 20-1.  Assessment Summary	20-9
 21-1.  Communications Summary	21-2
 21-2.  Quarterly SLAMS/NCore Reporting Schedule	21-7
 21-3.  Report Summary	21-8
 23-1.  Validation Template Where Failure of Any One Criteria Would Invalidate a
       Sample or a Group of Samples	23-4
 23-2.  Validation Template Where Certain Combinations of Failure May Be Used to
       Invalidate a Sample or Group of Samples	23-6
 23-3.  Sample Batch Validation Template	23-9

                                       Figures
                                                                        Element No.-Page
 4-1.    Organizational chart of the technical and contractual aspects of the PEP	4-1
 4-2.    Definition of an independent assessment	4-9
 6-1.    PEP overview	6-2
 6-2.    Critical filter-holding times	6-11
 11-1.  Quality Bulletin	11-5
 13-1.  Laboratory activities	13-2
 14-1.  PEP QC scheme	14-2
 14-2.  PEP Filter Weighing Data Entry Form	14-17
 17-1.  Field/Laboratory Inventory Form (INV-01)	17-8
 17-2.  Field/Laboratory Procurement Log Form (PRO-01)	17-8
 17-3.  Field/Laboratory Equipment/Consumable Receiving Report Form (REC-01)	17-9
 19-1.  PEP information management flow	19-2
 20-1.  Audit activities	20-3
 20-2.  Audit Finding Form	20-4
 20-3.  Audit Finding Response Form	20-6
 20-4.  Surveillance Report Form	20-7
 21-1.  Lines of communication	21-1
 23-1.  PEP validation matrix	23-2

-------
                                                                       Project: PEP QAPP
                                                                         Element No.: 3.0
                                                                          Revision No.: 1
                                                                          Date: 3/6/2009
                                                                      	Page 1 of 3
                                3.0  Distribution

A copy of this QAPP will be distributed to the individuals who are listed in Table 3-1. The
Regional Work Assignment Managers (WAMs), Task Order Project Officers (TOPOs), or
Delivery Order Project Managers (DOPOs) will be responsible for distributing the QAPP to each
Environmental Services Assistance Team (ESAT) contractor that participates in the
environmental data operations of the PEP. The Regional WAMs/TOPOs/DOPOs should also
provide a copy of this QAPP to their Regional Quality Assurance Managers (QAMs).

                               Table 3-1. Distribution List

ESAT
  Headquarters ESAT Program Manager: Colleen Walling
      U.S. Environmental Protection Agency (EPA) Headquarters, Ariel Rios Building,
      1200 Pennsylvania Ave., NW (Mail Code: 5203P), Washington, DC 20460
      Phone: (703) 603-8814; E-mail: walling.colleen@epa.gov
  Contracting Officer: Charlie Hurt
      ** Same as above ** (Mail Code: 3805R)
      Phone: (202) 564-6780; E-mail: hurt.charlie@epa.gov
  Contracting Officer: Lynette Gallion
      ** Same as above ** (Mail Code: 3805R)
      Phone: (202) 564-4463; E-mail: gallion.lynette@epa.gov
  Contracting Officer: Deborah Hoover
      U.S. EPA-Region 4, 61 Forsyth Street, S.W., Atlanta, GA 30303-8960
      Phone: (404) 562-8373; E-mail: hoover.deborah@epa.gov

OAQPS
  WAM, National PEP Project Leader: Dennis Crumpler
      U.S. EPA, Office of Air Quality Planning and Standards, MQAG (C304-06),
      Research Triangle Park, NC 27711
      Phone: (919) 541-0871; E-mail: crumpler.dennis@epa.gov
  Michael Papp
      ** Same as above **
      Phone: (919) 541-2408; E-mail: papp.michael@epa.gov
  Mark Shanis
      ** Same as above **
      Phone: (919) 541-1323; E-mail: shanis.mark@epa.gov
  Field Instrument Consultant: Jeff Lantz
      U.S. EPA, Office of Radiation and Indoor Air, Radiation & Indoor Environments
      National Laboratory, P.O. Box 98517, Las Vegas, NV 89193-8517
      Phone: (702) 784-8275; E-mail: lantz.jeff@epa.gov

-------
 Project: PEP QAPP
   Element No.: 3.0
    Revision No.: 1
     Date: 3/6/2009
	Page 2 of 3
REGIONS
  Region 1
    TOPO: Mary Jane Cuzzupe
        U.S. EPA-Region 1, New England Regional Laboratory, Office of Environmental
        Measurement and Evaluation, 11 Technology Dr. (ECA), North Chelmsford, MA 01863
        Phone: (617) 918-8383; E-mail: cuzzupe.maryjane@epa.gov
    Regional Project Officer (RPO): Pat Svetaka
        ** Same as above **
        Phone: (617) 918-8396; E-mail: svetaka.pat@epa.gov
  Region 2
    TOPO: Mark Winter
        U.S. EPA-Region 2, Raritan Depot (220MS220), 2890 Woodbridge Ave.,
        Edison, NJ 08837-3679
        Phone: (732) 321-4360; E-mail: winter.mark@epa.gov
    RPO: Yolanda Guess
        ** Same as above ** (Mail Code: 215MS215)
        Phone: (732) 906-6875; E-mail: guess.yolanda@epa.gov
  Region 3
    TOPO: Cathleen Kennedy
        U.S. EPA-Region 3, 1650 Arch St. (3AP22), Philadelphia, PA 19103-2029
        Phone: (215) 814-2746; E-mail: kennedy.cathleen@epa.gov
    RPO: Khin-Cho Thaung
        U.S. EPA-Region 3, Environmental Science Center, 701 Mapes Rd. (3ES20),
        Fort Meade, MD 20755-5350
        Phone: (410) 305-2743; E-mail: thaung.khin-cho@epa.gov
  Region 4
    TOPO: Greg Noah
        U.S. EPA-Region 4, Science and Ecosystem Support Division,
        980 College Station Rd., Athens, GA 30605-2720
        Phone: (706) 355-8635; E-mail: noah.greg@epa.gov
    RPO: Sandra Sims
        U.S. EPA-Region 4, Atlanta Federal Center, 61 Forsyth St., SW,
        Atlanta, GA 30303-8960
        Phone: (706) 355-8772; E-mail: sims.sandra@epa.gov
  Region 5
    TOPO: Basim Dihu
        U.S. EPA-Region 5, 77 West Jackson Blvd. (AT-18J), Chicago, IL 60604-3507
        Phone: (312) 886-6242; E-mail: dihu.basim@epa.gov
    RPO: Steven Peterson
        ** Same as above ** (Mail Code: SRT-4J)
        Phone: (312) 353-1422; E-mail: peterson.steven@epa.gov
  Region 6
    TOPO: John Lay
        U.S. EPA-Region 6 Laboratory, Houston Branch (6PDQ), 10625 Fallstone Rd.,
        Houston, TX 77099
        Phone: (281) 983-2155; E-mail: lay.john@epa.gov
    RPO: Marvelyn Humphrey
        ** Same as above ** (Mail Code: 6MDHL)
        Phone: (281) 983-2140; E-mail: humphrey.marvelyn@epa.gov

-------
                                                                           Project: PEP QAPP
                                                                             Element No.: 3.0
                                                                              Revision No.: 1
                                                                              Date: 3/6/2009
                                                                          	Page 3 of 3
REGIONS (continued)
  Region 7
    TOPO: Thien Bui
        U.S. EPA-Region 7, 901 North Fifth St. (ENSVEMWC), Kansas City, KS 66101
        Phone: (913) 551-7079; E-mail: bui.thien@epa.gov
    RPO: Barry Evans
        ** Same as above ** (Mail Code: ENSVRLAB)
        Phone: (913) 551-5144; E-mail: evans.barry@epa.gov
  Region 8
    TOPO: Michael Copeland
        U.S. EPA-Region 8, 999 18th Street (8P-AR), Suite 300, Denver, CO 80202-2466
        Phone: (303) 312-6010; E-mail: copeland.michael@epa.gov
    RPO: Marty McComb
        ** Same as above ** (Mail Code: 8EPR-PS)
        Phone: (303) 312-6963; E-mail: mccomb.martin@epa.gov
  Region 9
    TOPO: Mathew Plate
        U.S. EPA-Region 9, 75 Hawthorne St. (MTS-3), San Francisco, CA 94105
        Phone: (415) 972-3799; E-mail: plate.mathew@epa.gov
    RPO: Rose Fong
        ** Same as above ** (Mail Code: MTS-3)
        Phone: (415) 972-3812; E-mail: fong.rose@epa.gov
  Region 10
    TOPO: Chris Hall
        U.S. EPA-Region 10, 1200 Sixth Ave., Seattle, WA 98101
        Phone: (206) 553-0521; E-mail: hall.christopher@epa.gov
    RPO: Christopher Pace
        U.S. EPA-Region 10, Manchester Laboratory, 7411 Beach Dr. East,
        Port Orchard, WA 98366
        Phone: (360) 871-8703; E-mail: pace.christopher@epa.gov

RTI
  PEP Support Work Assignment Leader: Jennifer Lloyd
      RTI International, 3040 Cornwallis Rd., P.O. Box 12194,
      Research Triangle Park, NC 27709
      Phone: (919) 541-5942; E-mail: jml@rti.org
  Project Manager: James Flanagan
      ** Same as above **
      Phone: (919) 990-8649; E-mail: jamesf@rti.org
It is likely the individuals who are listed in Table 3-1 will not be associated with the program
indefinitely; therefore, updates to the PEP contact list will be made available on the Internet
through the Ambient Monitoring Technology Information Center's (AMTIC's) Bulletin Board
under the quality assurance (QA) area of the PM2.5 Monitoring Information (available at
http://www.epa.gov/ttn/amtic/pmpep.html).

-------
                                                                         Project: PEP QAPP
                                                                           Element No.: 4.0
                                                                            Revision No.: 1
                                                                            Date: 3/6/2009
                                                                              Page 1 of 11
                        4.0 Project/Task Organization

This element will provide the U.S. Environmental Protection Agency (EPA) and other involved
parties with a clear understanding of the role that each party plays in the PEP and will provide
the lines of authority and reporting for the project.

The complexity of the PEP and the number of agencies involved require that the flow of
information and associated communications be structured to make the best use of the
collective resources. The deployment and operation of this network is a shared responsibility
among all of the involved organizations. The purposes of the following role descriptions are to
facilitate communications and to outline basic responsibilities. Figure 4-1 provides a basic
diagram of the organization and the lines of communication. Table 3-1 in Element 3.0,
Distribution, lists the primary  personnel who are involved in the PEP.
[Figure 4-1. Organizational chart of the technical and contractual aspects of the PEP: the QA
Workgroup (OAQPS, ORD, Regions, and State/local/Tribal agencies) and ORD/NERL connect to
OAQPS (Michael Papp, Dennis Crumpler, Mark Shanis), which connects to the ESAT Program
Manager (Colleen Walling); the EPA ESAT Contracting Officers (Charlie Hurt, Regions 1, 2, 5,
6, and 10; Lynnette Gallion, Regions 3, 7, 8, and 9; Deborah Hoover, Region 4); and the
Regional TOPOs and RPOs for Regions 1 through 10.]
-------
                                                                      Project: PEP QAPP
                                                                        Element No.: 4.0
                                                                         Revision No.: 1
                                                                          Date: 3/6/2009
	Page 2 of 11


4.1    The PEP Workgroup (Previously the PM2.5 QA Workgroup)

The PM2.5 QA Workgroup was originally formed in 1998 to address the QA aspects of the PM2.5
Monitoring Program during the deployment of the PM2.5 ambient monitoring network and the
PEP. Members of this workgroup included personnel from EPA's Office of Air Quality Planning
and Standards (OAQPS), EPA Regions, EPA's Office of Research and Development (ORD), the
National Exposure Research Laboratory (NERL), and State, local, and Tribal (SLT) air
monitoring organizations. That workgroup has evolved into a more overarching "National
Ambient Monitoring QA Strategy Workgroup" for all ambient monitoring and meteorological
measurements.

The PEP has formed an ad hoc workgroup, the PEP Workgroup, which consists of the EPA
Regional WAMs, TOPOs, and DOPOs for the ESAT contract; and SLT agencies that have opted
to run at least the field operations component of the PEP in their jurisdictions. The PEP ESAT
field and laboratory personnel are invited to participate in the conference calls. The PEP
Workgroup, which is chaired by the OAQPS National PEP Project Leader, meets at least twice
per year and more often if needed. The PEP Workgroup serves in an advisory role and assists in
the review and revision of PEP guidance documents, such as the PEP field and laboratory
standard operating procedures (SOPs) and the PEP QAPP. Revisions to these documents, which
may have national implications or issues that are national in scope, are reviewed by the National
Ambient Monitoring QA Strategy Workgroup.

4.2    EPA's Office of Air Quality Planning and Standards

OAQPS, which has oversight for ensuring the quality of the nation's ambient air data, has
developed specific regulations for the development of a quality system as found in 40 Code of
Federal Regulations (CFR) Part 58, Appendix A. One specific element of this quality system is
the PEP. OAQPS has the following responsibilities to ensure the continued success of this
program:

   •   Coordinating and overseeing the PEP
   •   Providing a contractual vehicle for the acquisition and distribution of the Federal
       Reference Method (FRM) portable evaluation samplers
   •   Developing a memorandum of understanding with the ESAT office
   •   Working with the EPA Regions to determine which SLT organizations will use the
       federally implemented PEP
   •   Transferring the necessary funds through the EPA Regional offices to the EPA ESAT
       Contracts Management Division (CMD) to support the PEP and to the Region 4 office
       for laboratory equipment and consumables
   •   Procuring the majority of the field capital equipment and facilitating major repairs
   •   Distributing filters to the national  weighing laboratory


    •  Developing the PEP Implementation Plan, the statement of work for the PEP in the
       ESAT contract language, SOPs, and the PEP QAPP
    •  Developing the field and laboratory personnel requirements
    •  Developing the field and laboratory training activities, participating in training, and
       securing national experts to answer specific technical questions
    •   Developing and maintaining the Performance Evaluation Database (PED)
    •  Assessing the concentration information uploaded into EPA's Air Quality System (AQS)
       database and assisting in reconciling significant differences between site and audit data
    •  Initiating and instituting a communications network and serving as a liaison to groups
       that are working on the PEP
    •  Interacting with regional, SLT organization personnel about the setup, operation, and
       data results of the performance evaluations (PEs)
    •  Ensuring the program's success by performing various assessment activities, such as
       Regional management systems reviews (MSRs) and technical systems audits (TSAs).

OAQPS provides oversight for the program through the National PEP Project Leader. Most
budgetary and technical planning activities are coordinated through OAQPS. The Ambient Air
Monitoring Group (AAMG), within the Air Quality Assessment Division (AQAD), is ultimately
responsible for implementing the PEP and this QAPP, most technical components (with support
from ORD, Regional offices, and SLT organizations), and the resource estimates underlying
program implementation. Resource guidance necessary for the State and Tribal Assistance
Grants (STAG) distribution is coordinated through the Planning, Resources, and Regional
Management staff within OAQPS. In addition, the National Air Data Group, within the Outreach
and Information Division, is responsible for maintaining the AQS database.

4.3   ESAT Organization

Since the PEP's inception in 1999, the PEP field operators and laboratory technicians have been
secured through EPA's ESAT contractors.1 In 2006-2007, the support was distributed among 10
new contracts, one for each region. EPA's oversight of ESAT consists of Contracting Officers
(COs), Contracting Specialists (CSs), Project Officers (POs), and Regional Project Officers
(RPOs). Table 4-1 provides information  about the regions and important contacts within them.
Additional information about ESAT and these contacts is available at
http://www.epa.gov/superfund/policy/contracts/12esat.htm.
1 Currently, ESAT is providing all field operations for the federally implemented PEP. If for some reason an ESAT
contractor is unable to provide the capacity that is required for the PEP, EPA may issue contracts to other
organizations to fulfill these needs (for example, in the event of a national disaster). Such contractors would be
expected to have similar roles and responsibilities as described for the ESAT organization in this QAPP.

                              Table 4-1. ESAT Oversight

Colleen Walling - ESAT Program Manager

 Region   Contracting Officer   Regional Project Officers
   1      Charlie Hurt          Pat Svetaka
   2      Charlie Hurt          Yolanda Guess
   3      Lynette Gallion       Khin-Cho Thaung
   4      Deborah Hoover        Sandra Sims
   5      Charlie Hurt          Steven Peterson
   6      Charlie Hurt          Marvelyn Humphrey and Melvin Ritter
   7      Lynette Gallion       Barry Evans
   8      Lynette Gallion       Marty McComb
   9      Lynette Gallion       Rose Fong
  10      Charlie Hurt          Christopher Pace
Some important aspects of the ESAT contract include the following:

   •   Only the WAM/TOPO/DOPO, the RPO/PO, and the CO/CS are authorized to give
       instructions or clarification (technical direction) to the ESAT contractor on the work to
       be performed. This technical direction is provided in writing.
   •   WAM/TOPO/DOPOs and RPO/POs will prepare the work assignments/task
       orders/delivery orders, which are effective only upon approval by the CO.

The EPA Contracts Manual describes the roles and responsibilities of COs, CSs, and POs, so
they are not repeated here. The roles and responsibilities most important to the PEP are
described below.

Contracting Officers

   •   Work with OAQPS to secure, obligate, commit, and distribute funds for work performed
       under the ESAT contract (or other contract vehicle as appropriate)
   •   Ensure that contract activities fall within ESAT's scope of work
   •   Approve work assignments, task orders, and delivery orders.

Contracting Specialists

   •   Work with OAQPS or Regional ESAT WAM/TOPO/DOPOs to modify contracts or track
       the use of funds for work performed under the ESAT contract (or other contract vehicle
       as appropriate).


Headquarters Project Officers
    •   Serve as a Regional liaison between the RPO and the CO
    •   Provide contract-wide administration
    •   Develop a memorandum of understanding with OAQPS.
Regional Project Officers
    •   Provide overall management and oversee the performance of their respective Regional teams
    •   Review Region-specific invoices with input from WAMs, TOPOs, and DOPOs
    •   Prepare (with WAM/TOPO/DOPO) PEP work assignments, task orders, and delivery
       orders
    •   Assist in developing ESAT work assignments, task orders, and delivery orders
    •   Ensure that there are qualified contractual personnel available to implement the PEP
    •   Provide administrative and logistical support for the ESAT contract
    •   Oversee the performance of the required activities of the contractor
    •   Regularly  communicate with program participants (e.g., OAQPS, Region).
Work Assignment Managers, Task Order Project Officers, and Delivery Order Project
Officers
In most cases, the WAM/TOPO/DOPO will be a technical staff member from the regional air
program branch or division. He or she will be responsible for assisting in the technical
implementation of the program. Some of the WAM/TOPO/DOPO's activities may include the
activities listed in Section 4.4; however, the primary responsibilities related to the ESAT contract
are the following:
    •   Communicating with the National PEP Project Leader about the current status of funding
       for the federally implemented PEP
    •   Preparing (with RPO) PEP work assignments, task orders, and delivery orders
    •   Setting up a file system that contains all relevant documentation, including notes of
       conversations with the contractor and other items that will provide an audit trail of the
       contractor's actions under the contract, as well as all technical information related to the
       PEP
    •   Reviewing the contractor's work plan and preparing findings on proposed tasks, labor
        hours, skill mix, and materials and quantities
    •   Monitoring contract and QAPP compliance
    •   Tracking the dollars and hours,  providing technical direction (in accordance with the
       terms of the contract), and reviewing monthly technical and financial reports


    •   Verifying contractor representations of deliverables received and accepted and/or
       progress
    •   Communicating contractor performance, budgetary, and administrative/logistical issues
       to the RPO and to the National PEP Project Leader
    •   Validating and accepting data.

4.4   EPA Regional Offices

The EPA Regional offices are the major communication link with SLT organizations in terms of
communicating the needs and concerns of states to EPA Headquarters offices and in
communicating to the SLT organizations the objectives and guidance that are often developed by
OAQPS. This role is vital for the development of effective policies and programs. For the PEP,
the Regional offices have the following specific responsibilities:

All Regions:

    •   Assist, through QA workgroup activities, in the development of all pertinent PEP
       guidance documents
    •   Review and approve the work plans submitted by the ESAT contractors
    •   Provide a WAM/TOPO/DOPO to oversee the technical aspects of field activities that are
       performed by the ESAT contractors
    •   Train and certify ESAT field personnel (if the Regional trainer is certified by EPA to do
       so)
    •   Provide technical oversight of the field activities by performing TSAs of the PEP field or
       support  laboratory operations
    •   Provide oversight of PEP activities for SLT organizations that have assumed the field
       and/or laboratory operations for PEP in their jurisdictions
    •   Work with  SLT organizations to develop a yearly schedule of site evaluations
    •   Provide a yearly schedule of site evaluations for the ESAT contractors
    •   Inform SLT organizations of an upcoming PE
    •   Evaluate the PE data, forward that data to the  SLT organizations, and inform them of
       significant differences between the PEP and their FRM/Federal Equivalent Method
       (FEM) monitors
    •   Participate in training and certification activities, including multi-state conferences, EPA
       satellite broadcasts, and other training vehicles
    •   Attend conference calls and meetings about PE activities.
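When a Region evaluates PE data against the routine FRM/FEM results, the comparison is typically expressed as a relative percent difference, using the collocated PEP audit value as the reference. The sketch below is a hypothetical illustration under that assumption; the function name and example values are invented, not taken from this QAPP or any EPA software.

```python
def percent_difference(routine_ugm3: float, pep_ugm3: float) -> float:
    """Relative percent difference between a routine PM2.5 FRM/FEM
    measurement and its collocated PEP audit result, using the PEP
    value as the reference. Illustrative sketch only; not EPA code."""
    return 100.0 * (routine_ugm3 - pep_ugm3) / pep_ugm3

# Example: routine monitor reports 12.6 ug/m3, PEP audit reports 12.0 ug/m3.
print(round(percent_difference(12.6, 12.0), 1))  # 5.0
```

What counts as a "significant" difference is determined by the program's data quality objectives, not by this sketch.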



Region 4 (in addition to the items previously listed):

    •   Provide a WAM/TOPO/DOPO to oversee the technical aspects of laboratory activities
        that are performed by the ESAT contractors
    •   Develop the primary laboratories for this program with respect to logistical, technical,
        and analytical support, including providing the necessary facilities to store, condition,
        weigh, distribute, and archive filters, and distributing filters (including coolers, ice
        packs, and other supplies) to the Regions
    •   Train and certify ESAT laboratory personnel (if a Regional trainer is certified by EPA
        to do so)
    •   Provide technical oversight of the laboratory activities by performing TSAs of these
        activities
    •   Validate data before they are uploaded into the AQS.

4.5   ESAT Contractors

The ESAT contractors will  perform the specific tasks associated with the PEP. Their
responsibilities will include the following:

    •   Developing a work plan and cost estimates for each work assignment, task order, or
       delivery order
    •   Staffing appropriately to meet the contract requirements
    •   Successfully implementing the activities described in the work plan and work
       assignment/task order/delivery order
    •   Receiving training and certification(s) to perform field and laboratory PEP activities, as
       appropriate
    •   Understanding government  regulations as they relate to contracts and inherent
       government functions.

4.6   State, Local, and Tribal Agencies

EPA could not effectively plan and execute this program without SLT organization participation.
The SLT agencies bear the heaviest responsibility for developing and implementing the national
PM2.5 Monitoring Program, as well as for optimizing data quality. In turn, the PEP
provides an invaluable QA/QC check on the overall performance of the network and often
isolates unique sampler or site problems. It is imperative that SLT organizations work with the
EPA Regional offices to make every PEP audit event successful. They should identify problems
that could impede the mission of the PEP as early as possible and help find solutions. The SLT
organizations have the following specific responsibilities:



General monitoring site accommodations:

    •   Ensure that there is sufficient space for a collocated audit monitor within 1 to 4 meters of
       the primary monitor, while still meeting CFR siting requirements
    •   Ensure that the collocated audit monitor can be placed within 1 meter vertically of the
       primary monitor
    •   Ensure that each site is accessible for PEP sampling (this may be an issue for some
        continuous potential "FEM" sites)
    •   Ensure that each site meets the applicable state or federal Occupational Safety and Health
       Administration safety requirements (includes providing secured ladders and appropriate
       safety rails and/or cages)
    •   Ensure that adequate power is available for the PEP samplers.
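The spacing limits above (1 to 4 meters horizontal separation and at most 1 meter vertical offset from the primary monitor) can be sketched as a simple pre-audit check. The function below is a hypothetical illustration, not part of any EPA tool, and an actual placement must also satisfy the CFR siting requirements mentioned above.

```python
def siting_ok(horizontal_m: float, vertical_m: float) -> bool:
    """Return True if a collocated PEP audit sampler placement meets the
    spacing described in this section: 1-4 m horizontal separation from
    the primary monitor and at most 1 m vertical offset. Illustrative
    only; CFR siting requirements must still be checked separately."""
    return 1.0 <= horizontal_m <= 4.0 and abs(vertical_m) <= 1.0

print(siting_ok(2.5, -0.3))  # True  (2.5 m away, 0.3 m lower)
print(siting_ok(5.0, 0.0))   # False (beyond the 4 m horizontal limit)
```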

If not using the federal PEP:

    •   Implements a comparable or equivalent PEP  at the frequency prescribed by the federal
       regulations in 40 CFR Part 58, Appendix A
    •   Adheres to the definition of independent assessment (see Figure 4-2)
    •   Participates in similar training and certification activities
    •   Procures necessary equipment and consumables
    •   Develops the necessary SOPs and QA procedures into their respective QAPPs
    •   Participates in semi-annual collocation precision studies of the SLT and federally
       deployed PEP samplers
    •   Transmits data to the AQS according to the schedule outlined in the monitoring QA
        regulations and procedures provided by EPA
    •   Selects the sites for evaluation
    •   If using a third-party laboratory, requires that laboratory to participate in an annual
       gravimetric round-robin PE administered by EPA's Office of Radiation and Indoor Air-
       National Air and Radiation Environmental Laboratory (ORIA-NAREL), in Montgomery,
       AL. This is not required if the SLT PEP uses EPA's PEP weighing laboratory for its filter
       weighing
    •   Prepares a weighing laboratory annual report in an EPA-specified format and submits it
       to EPA. This is not required if the SLT PEP uses EPA's PEP weighing laboratory for its
       filter weighing
    •   Submits to an annual TSA of their PEP activities by the EPA Regional PEP Leader or
       QA Manager.

Independent assessment—An assessment that is performed by a qualified individual, group,
or organization that is not part of the organization that is directly performing and accountable
for the work being assessed. This auditing organization must not be involved with generating
the routine ambient air monitoring data. An independent organization could be another unit of
the same agency, which is sufficiently separated in terms of organizational reporting and can
provide for independent filter weighing and PE auditing.

An organization can conduct the PEP if it can meet the above definition and has a
management structure that, at a minimum, will allow for the separation of its routine sampling
personnel from its auditing personnel by two levels of management. In addition, the pre- and
post-sample weighing of audit filters must be performed by a separate laboratory facility using
separate laboratory equipment. Field and laboratory personnel would be required to meet the
PEP field and laboratory training and certification requirements. The SLT organizations are
also  asked to consider participating in the centralized field and laboratory standards
certification  process.
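The "two levels of management" test in the definition above can be illustrated with a small reporting-chain check. The data structure and function below are hypothetical, invented for illustration only; they are not an EPA tool or an official test of independence.

```python
def levels_of_separation(supervisor_of: dict, person_a: str, person_b: str) -> int:
    """Number of management levels below the first supervisor shared by
    two staff members. supervisor_of maps each person to his or her
    immediate supervisor (the top manager maps to None). Returns -1 if
    the two reporting chains never meet. Hypothetical sketch only."""
    def chain(person):
        supervisors = []
        boss = supervisor_of.get(person)
        while boss is not None:
            supervisors.append(boss)
            boss = supervisor_of.get(boss)
        return supervisors

    chain_a = chain(person_a)
    shared = set(chain_a) & set(chain(person_b))
    for level, boss in enumerate(chain_a):
        if boss in shared:
            return level
    return -1

# Hypothetical org mirroring Figure 4-2: QA and routine field samplers
# share only a third-level supervisor, i.e. two levels of management
# separate them, which satisfies the minimum described above.
org = {
    "qa_sampler": "qa_1st", "qa_1st": "qa_2nd", "qa_2nd": "director",
    "routine_sampler": "rt_1st", "rt_1st": "rt_2nd", "rt_2nd": "director",
    "director": None,
}
print(levels_of_separation(org, "qa_sampler", "routine_sampler") >= 2)  # True
```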
[Figure 4-2 diagram: an organizational chart in which the QA laboratory analysis and QA field
sampling personnel report through first- and second-level supervision separate from that of the
routine laboratory analysis and routine field sampling personnel, with the reporting chains
meeting only at a common third-level supervisor.]
Organizations that are planning to implement the PEP must submit a plan that demonstrates
independence to the EPA Regional office that is responsible for overseeing QA-related
activities for the Ambient Air Quality Monitoring Network.
                   Figure 4-2. Definition of an independent assessment.


If using the federal PEP:

    •   Operates the routine PM2.5 FRM/FEM monitoring network according to the established
       regulations and guidelines, including proper siting, operations, and QA procedures
    •   Creates an accurate list of State and Local Ambient Monitoring Station (SLAMS) or
       Tribal sites with addresses, AQS IDs, makes and models of routine sampling equipment,
       and sampling schedules
    •   Assists, through PM2.5 QA Workgroup activities, in the development of pertinent PEP
       guidance documents
    •   On a yearly basis, determines whether to continue using the federal implementation of
       the PEP
    •   Identifies the sites within the routine PM2.5 FRM/FEM monitoring network for PEs and
       the associated sampling schedules
    •   Ensures that an Agency representative is onsite when the PEP Field Scientist (FS) arrives
       and performs the evaluation. This includes communicating with the operator, operating
       the routine  monitor in the normal operating mode (including posting site results to the
       AQS),  and  generally supporting the PEP
    •   Ensures the program's success by performing various internal oversight activities of the
       SLT monitoring networks, such as TSAs of field and laboratory activities
    •   Participates in training activities, including multi-state conferences, EPA satellite
       broadcasts, and other training vehicles
    •   Reviews  routine and PE data and works with the EPA Region on corrective actions.

4.7   Other Participating Entities

EPA Office of Research and Development

The ORD's primary role in the implementation of the PEP will be to serve as a technical
consultant, advisor, and arbiter of technical issues. This support will be provided primarily
through NERL, which provides many of the applied research elements for the program. ORD also has the
overall responsibility for designating all air monitors as FRM/FEM. The FRM/FEM portable
audit sampler must be designated by ORD through its Federal Reference and Equivalency
Program (40 CFR Part 53). This overall responsibility includes the following:

    •   Designating PM2.5 samplers as FRM/FEM and providing technical support
    •   Providing technical support for the national monitor procurement contracts
    •   Arbitrating PEP technical issues
    •   Providing guidance for field and analytical activities (QA Handbook Guidance
        Document 2.12).


EPA Contracts Management Division Responsibilities

The CMD, within the Office of Acquisition Management (OAM), is responsible for issuing
contracts and various national procurements. These contracts are developed in concert with
OAQPS AQAD technical staff. The CMD is responsible for all communications with vendors
and extramural contract organizations. The CMD's responsibilities include the following:

    •  Developing national contracts for the sampler purchases and filter purchases and working
       with ORD and Office of Air and Radiation (OAR) contracts and technical staff to provide
       these products
    •  Providing COs and other contracting support for national procurements of contract
       support for federal implementation of the PEP, major equipment repairs, and equipment
       upgrades.

                                                                         Project: PEP QAPP
                                                                           Element No.: 5.0
                                                                            Revision No.: 1
                                                                            Date: 3/6/2009
                                                                        	Page 1 of 5
                     5.0  Problem Definition/Background

The background information provided in this element will place the problem in historical
perspective, giving readers and users of the QAPP a sense of the project's purpose and position
relative to the Ambient Air Quality Monitoring Program.

5.1    Problem Statement and Background

In 1970, the Clean Air Act (CAA) was signed into law. Under the CAA, the ambient
concentrations of six criteria pollutants (particulate matter [PM10, PM2.5], sulfur dioxide [SO2],
carbon monoxide, nitrogen dioxide [NO2], ozone [O3], and lead) are regulated. The CAA
requires SLT organizations to monitor these criteria pollutants through the Ambient Air Quality
Surveillance Program as defined in 40 CFR Part 58.

The criteria pollutant defined as particulate matter (PM) is a general term used to describe a
broad class of substances that exist as liquid or solid particles over a wide range of sizes. As a
part of the Ambient Air Monitoring Program, two particle size fractions will be measured: those
less than or equal to 10 micrometers (PM10) and those less than or equal to 2.5 micrometers
(PM2.5). This QAPP focuses on one QA activity, the PEP, which is associated with PM2.5
monitoring.

The background and rationale for the implementation of the PM2.5 FRM/FEM monitoring
network can be found in Air Quality Criteria for Particulate Matter, which is available at
http://cfpub2.epa.gov/ncea/cfm/recordisplay.cfm?deid=87903. In general, some of the findings
include the following:

   •   The characteristics, sources, and potential health effects of larger or "coarse" particles
       (from 2.5-10 micrometers in diameter) and smaller or "fine" particles (smaller than 2.5
       micrometers in diameter) are very different.
   •   Coarse particles come from sources such as windblown dust from the desert or
       agricultural fields and dust that is circulated on unpaved roads from vehicle traffic.
   •   Fine particles are generally emitted from activities such as industrial and residential
       combustion and from vehicle exhaust. Fine  particles are also formed in the atmosphere
        from gases, such as SO2, nitrogen oxides, and volatile organic compounds, that are
       emitted from combustion activities, and then become particles as a result of chemical
       transformations in the air.
    •   Coarse particles can deposit in the respiratory system and contribute to adverse health
        effects, such as aggravating asthma. EPA's Air Quality Criteria for Particulate Matter
       concluded that fine particles, which also deposit deeply in the lungs, are more likely than
       coarse particles to contribute to the adverse health effects (e.g., premature mortality and
       hospital admissions) found in many published community epidemiological studies.


    •  Community studies found that adverse public health effects are associated with exposure
       to particles at levels well below the current PM standards for both short-term (e.g., less
       than 1 day to up to 5 days) and long-term (generally 1 year to several years) periods.
    •  These adverse health effects included premature death and increased hospital admissions
       and emergency room visits (primarily among the elderly and individuals with
       cardiopulmonary disease); increased respiratory symptoms and disease (among children
       and individuals with respiratory disease, such as asthma); decreased lung function
       (particularly in children and individuals with asthma); and alterations in lung tissue and
       structure and in respiratory tract defense mechanisms.

One goal of EPA's PM2.5 program was to establish a PM2.5 monitoring network by
December 31, 1999.

Air quality samples are generally collected for one or more of the following purposes:

    •  To judge compliance with and/or progress made towards meeting the National Ambient
       Air Quality Standards (NAAQS)
    •  To develop, modify, or activate control strategies that prevent or alleviate air pollution
       episodes
    •  To observe pollution trends throughout the Region, including non-urban areas
    •  To provide a database for research and evaluation of effects.

With the end use of the air quality samples as a prime consideration, various networks can be
designed to meet one of the following six basic monitoring objectives:

    •  Determine the highest concentrations to occur in the area covered by the network
    •  Determine representative concentrations in areas of high population density
    •  Determine the impact on ambient pollution levels of significant source or source
       categories
    •  Determine general background concentration levels
    •  Determine the extent of Regional pollutant transport among populated areas and in
       support of secondary standards
    •  Determine the welfare-related impacts in more rural and remote areas.

The Ambient Air Quality Monitoring Network consists of four major categories of monitoring
stations that measure the criteria pollutants. These stations are described below.

The SLAMS network consists of approximately 3,500 monitoring stations whose size and
distribution are largely determined by the needs of SLT air pollution control agencies to meet
their respective State Implementation Plan (SIP) requirements.


The National Core (NCore) network multipollutant monitoring stations are part of an overall
strategy to integrate multiple monitoring networks and measurements. These are a subset of
SLAMS. Monitors at NCore multipollutant sites will measure particles (PM2.5, speciated PM2.5,
PM10-2.5), O3, SO2, carbon monoxide, nitrogen oxides (NO/NO2/NOy), and provide basic
meteorology. Monitors for all of the gases, except for O3, would be more sensitive than standard
FRM/FEM monitors, so they could accurately report concentrations that are well below the
respective NAAQS but that can be important in the formation of O3 and PM. EPA expects that
each state would have from one to three NCore  sites, and EPA will collaborate with states
individually and through multistate organizations on site selection. The objective is to locate
sites in broadly representative urban (approximately 55 sites) and rural (approximately 20 sites)
locations throughout the country to help characterize regional and urban patterns of air pollution.
In many cases, states will likely collocate these new stations with the Photochemical Assessment
Monitoring Station (PAMS) sites that are already measuring O3 precursors and/or National Air
Toxics Trend Station (NATTS) sites that are measuring air toxics. By combining these
monitoring programs at a single location, EPA and its partners can maximize the multipollutant
information.

The PAMS network is required to measure O3 precursors in each O3 non-attainment area that is
designated as serious, severe, or extreme. The required networks have from two to five sites,
depending on the population of the area. The current PAMS network has  approximately 80 to 90
sites and is likely to change.

The Special Purpose Monitoring Stations (SPMS) network provides for special studies needed
by the state and local agencies to support their SIPs and other air program activities. The SPMS
are not permanently established; therefore, they can easily be adjusted to  accommodate the
changing needs and priorities. The SPMS are used to supplement the fixed monitoring network
as circumstances require and as resources permit. If the data from SPMS  are used for SIP
purposes, they must meet all QA and methodology requirements for SLAMS monitoring.

Note: This QAPP only focuses on the QA activities of the SLAMS and NCore networks and
the objectives of these networks, which include any PM2.5 samplers used for comparison to the
NAAQS.

Throughout this document, the term "decision maker" will be used. This term represents the
individuals who are the ultimate users of ambient air data and therefore may be responsible for
activities such as setting and making comparisons to the NAAQS and evaluating trends. Because
there is more than one objective for these data and more than one decision maker, the quality of
the data will be based on the highest-priority objective, which was identified as determining
attainment of the NAAQS.

Because the data for the FRM/FEM monitors in the  SLAMS and NCore networks are used for
NAAQS comparisons, the quality of these data is very important. A quality system has been
developed to control and evaluate the quality of data to make NAAQS determinations within an
acceptable level of confidence. During the development of the PM2.5 NAAQS, EPA used the
Data Quality Objective (DQO) process to determine the allowable measurement system
imprecision and bias that would not significantly affect a decision maker's ability to compare
pollutant concentrations to the NAAQS. The precision requirement (10% coefficient of variation
[CV]) and bias requirement (±10%) are based on total measurement uncertainty, which
incorporates errors from  all phases (e.g., field sampling, handling, analysis) of the measurement
process. The collocated samples provide adequate estimates of precision. If properly
implemented, the FRM/FEM PE can provide an evaluation of bias.
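The two data-quality indicators above can be illustrated with a simple calculation. The sketch below (hypothetical values, and function names of our own choosing) computes a bias estimate as the mean percent difference between collocated routine (SLT) and PEP audit concentrations, plus the spread of those differences as a rough analogue of the precision goal. The regulatory statistics in 40 CFR Part 58, Appendix A are more elaborate, adding confidence limits and pooling across sites.

```python
from statistics import mean, stdev

def percent_diffs(audit, routine):
    """Percent difference of each routine (SLT) concentration relative to
    the collocated PEP audit concentration: d_i = 100 * (r_i - a_i) / a_i."""
    return [100.0 * (r - a) / a for a, r in zip(audit, routine)]

def bias_estimate(audit, routine):
    """Bias estimate: mean of the paired percent differences (goal: within +/-10%)."""
    return mean(percent_diffs(audit, routine))

# Hypothetical collocated 24-hour concentrations in ug/m3.
pep = [12.1, 8.4, 15.0, 9.7, 11.2]
slt = [12.6, 8.1, 15.8, 9.9, 11.9]

d = percent_diffs(pep, slt)
print(f"bias:   {mean(d):+.1f}%")
print(f"spread: {stdev(d):.1f}% (std dev of the percent differences)")
```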

The PEP is a QA activity that is used to independently evaluate the measurement system bias of
the PM2.5 FRM/FEM monitoring network, which includes measurement uncertainties from field
and laboratory activities. The pertinent regulations for this PE are outlined in 40 CFR Part 58,
Appendix A, Section 3.2.7. The strategy is for an independent PEP Auditor to collocate a
portable FRM/FEM PM2.5 air sampling instrument within 1-4 meters of a routine
SLAMS/NCore air monitoring instrument. Both monitors operate simultaneously as required in
the FRM/FEM and SOPs. The PEP filter is analyzed by an independent gravimetric laboratory.
The gravimetric results that are derived from the two samplers are compared.

Implementing the FRM/FEM PE is a SLT responsibility; however, due to many comments made
during the review period for the December 13, 1996 PM2.5 NAAQS Proposal, EPA made the
following revisions:

    •   Modified the system to include an independent FRM/FEM PE
    •   Reduced the burden of this program by changing the audit frequency from all sites to
       25% of the PM2.5 sites
    •   Made allowances to shift the implementation burden from the SLT organizations to the
       federal government.

During August through October 1997, EPA discussed the possibility of federal implementation
with EPA Regions  and various SLT organizations (e.g., NESCAUM, MARAMA, WESTAR,
and individual organizations). The majority of the responses from these organizations favored
federal implementation of the PEP.

EPA evaluated potential  contracting mechanisms to assist in the implementation of this activity,
and it decided to use the ESAT contract, currently in place in each Region, to provide the
necessary field and laboratory activities. Each EPA Region is responsible for implementing the
field component  of the PEP. Regions 4 and 10 operated the laboratory component from the
beginning of the  program through 2006. Region 4 assumed all responsibility for laboratory
operations in 2006.

In October 2006, 40 CFR Part 58, Appendix A, Section 3.2.7 was amended to require the
following:

    •   For primary quality assurance organizations (PQAOs) with less than or equal to five
       monitoring sites, five valid PE audits must be collected and reported each year.

    •   For PQAOs with greater than five monitoring sites, eight valid PE audits must be
       collected and reported each year.

    •   A valid PE audit means that both the primary monitor and PEP audit concentrations have
       not been invalidated and are greater than 3 micrograms per cubic meter (µg/m3).

Additionally, each year, every designated FRM or FEM sampler within a PQAO must have

    •   Each method designation evaluated each year; and

    •   All FRM or FEM samplers subjected to a PEP audit at least once every 6 years, which
        equates to approximately 15% of the monitoring sites audited each year.
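The audit-count and validity rules above can be expressed compactly. This is an illustrative sketch (the function names are hypothetical, not from the PEP SOPs):

```python
def required_pep_audits(num_sites: int) -> int:
    """Valid PE audits a PQAO must report per year under 40 CFR Part 58,
    Appendix A, Section 3.2.7: five for PQAOs with five or fewer
    monitoring sites, eight for PQAOs with more than five."""
    return 5 if num_sites <= 5 else 8

def is_valid_pe_audit(primary_conc, pep_conc,
                      primary_valid=True, pep_valid=True) -> bool:
    """A valid PE audit: neither concentration has been invalidated, and
    both are greater than 3 ug/m3."""
    return (primary_valid and pep_valid
            and primary_conc > 3.0 and pep_conc > 3.0)

print(required_pep_audits(4))         # 5
print(required_pep_audits(12))        # 8
print(is_valid_pe_audit(11.8, 12.3))  # True
print(is_valid_pe_audit(2.6, 2.9))    # False: below the 3 ug/m3 floor
```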

Prior to 2007, only the State of Illinois chose to fully implement its own PEP, which included
the  field and gravimetric laboratory support. In response to the 2006 regulatory revisions, a few
more states and some Tribal organizations opted to partially  self-implement the program in 2007.
These SLTs that have chosen to partially self-implement the PEP are essentially  providing the
same service that the ESAT contractors provide at the Regional level (i.e., they conduct and
perform all of the necessary field activities). All SLTs that have chosen to partially self-implement
the PEP have agreed that a central service laboratory is the better alternative to
individual SLTs running their own independent service laboratory or contracting with an
independent laboratory. An important  consideration is that the fully self-implementing
organization must ensure that its resulting PEP data are entered into the AQS as prescribed in 40
CFR Part 58.16, which  states, "The data and information reported for each reporting period must
contain all data and information gathered during the reporting period, and be received in the
AQS within 90 days after the end of the quarterly reporting period."

References

    1.  U.S. EPA (Environmental Protection Agency). 2004. Air Quality Criteria for Particulate
       Matter. U.S.  Environmental Protection Agency, Washington, DC, EPA 600/P-99/002aF-
       bF, October.

    2.  U.S. EPA (Environmental Protection Agency). 2006. Revisions to Ambient Air
       Monitoring Regulations. 40 CFR Parts 53 and 58. Federal Register 71(200):61235-61328.
       October 17.

    3.  U.S. EPA (Environmental Protection Agency).  2008. PEP Program Adequacy and
       Independence Criteria:  Monitoring Rule  Requirements and Implementing Instructions.
       Revised July 23, 2008.

                         6.0 Project/Task Description

The purpose of this element is to provide the participants with a background understanding of
the project and the types of activities to be conducted, including the measurements that will be
taken and the associated QA/quality control (QC) goals, procedures, and timetables for
collecting the measurements.

6.1    Description of Work to be Performed

In general, the measurement goal of the PM2.5 PEP is to estimate the bias of SLT routine PM2.5
FRM/FEM monitors as compared to PEP monitors, which represent the best measurement of
PM2.5 currently available. This is accomplished by measuring the concentration, in units of µg/m3,
of particulates less than or equal to 2.5 µm that have been collected on a 46.2-mm Teflon
(polytetrafluoroethylene [PTFE]) filter and comparing these values against the data from the
collocated SLT routine PM2.5 FRM/FEM monitor.
regulations for this activity can  be found in 40 CFR Part 58, Appendix A,  Section 3.5.3.

The following sections will describe the measurements required for the routine field and
laboratory activities for the PM2.5 PEP.

The PE can be segregated into field and laboratory components. The following information
briefly  describes these activities. Detailed SOPs have been developed for all field and laboratory
activities and have been distributed to all field and laboratory personnel and all personnel who
appear on the distribution list in Element 3.0, Distribution. Figure 6-1 provides a basic
description of the PEP in the following  steps:

   •   EPA will send filters to  the weighing laboratory, where they will be inventoried,
       inspected, equilibrated, weighed, and prepared for the field.
   •   The weighing laboratory will ship or deliver the filter cassettes and accompanying Chain-
       of-Custody (COC) forms to all Regions.
   •   The FSs will take the filter cassettes, Field Data Sheets (FDSs), and COC forms to the
       field  and operate the portable FRM monitor.
   •   The FS will send the filter cassettes,  data storage media, FDSs, and COC forms back to
       the weighing laboratory (as well as filing the data and records at the field office).
   •   The weighing laboratory will receive, equilibrate, inspect, and post-weigh the filters.
       Data will be validated and uploaded  into the AQS database.

                  [Figure 6-1: flow diagram. OAQPS sends new filters to the PEP weighing
                  laboratory; the laboratory sends unexposed filters to the field (Regions 1-10);
                  exposed filters are returned to the weighing laboratory; and validated data
                  are uploaded to AQS.]

                                Figure 6-1. PEP overview.
6.2    Field Activities

The portable audit samplers are used in a collocated manner to perform the PEs. These samplers
have been approved by EPA as FRM samplers and are designed to be durable, rugged, and
capable of frequent transport. These samplers are constructed in sections, with each section
weighing no more than 40 pounds and a total weight that does not exceed 120 pounds. To
optimize the consistency of PEP measurements nationwide, the BGI PQ200A portable sampler
will be used for PEP audits at all monitoring site locations with elevations under 7,000 feet. At
elevations higher than 7,000 feet, the Andersen RAAS 200 portable sampler or the Rupprecht &
Patashnick (R&P) 2000 portable sampler will be used. There are also some locations where
electromagnetic field interference can only be mitigated by the Andersen or R&P samplers.
Although these samplers have been specifically designed to perform these PEs, precautions must
still be taken to ensure data quality. Basic instructions are found in this PEP QAPP, and specific
instructions are detailed in the PEP Field SOPs (see http://www.epa.gov/ttn/amtic/pmpep.html).

The following steps must be observed to ensure data quality:

   •   The samplers must be operated in adherence to the vendor's instruction manual, which
       discusses the proper transport, assembly, calibration, operation, and maintenance.
   •   Samples must be taken in accordance with the guidance outlined in QA Guidance
       Document 2.12, Monitoring PM2.5 in Ambient Air Using Designated Reference or Class I
       Equivalent Methods, except that shipping procedures will adhere to those specified in this
       QAPP and in the PEP Field SOPs, which are more rigorous than the current regulations
       specify.
    •   All activities must adhere to the PEP Field SOPs.
    •   In addition to adhering to the standards, principles, and practices outlined in the PEP
       QAPP, activities and procedures must adhere to specific site QAPPs for the identified
       sites. An example would be where a sampler is not properly sited, but the SLT
       organization has an approved waiver from the EPA Regional Ambient Air Monitoring
       Program.
    •   Personnel must complete EPA's federally implemented training and certification  program
       annually.

6.2.1   Field Activity Summary

The following activities are outlined in the PEP Field SOPs:

    •   The FS will transport a portable PM2.5 FRM PE sampling device to an established PM2.5
       site as agreed upon by the SLT organization and its respective EPA Region.
    •   The FS will assemble the instrument; collocate the sampler; perform time, barometric
       pressure, temperature, and flow verifications; install a filter cassette; and operate  the
       instrument from midnight to midnight on the same scheduled sampling day as the SLT's
       primary sampler.
    •   If scheduling permits, the operator will leave this location to set up additional PEP audits
       at other routine sampling locations. If the schedule does not allow for another setup, the
       operator will perform additional activities, such as scheduling subsequent audits,
       reviewing and verifying data from previous PEP audits, and completing associated
       paperwork.
    •   The FS will return to each site within a specified time following the 24-hour  sampling
       time, review the run data, download the stored electronic monitoring  data, remove and
       properly store the filter cassette for transport, and disassemble the instrument.
    •   The FS will properly package the filter cassette(s) for shipment to the weighing
       laboratory. Samples will be shipped in coolers with ice packs to maintain filter
       temperatures at 4°C.

The performance requirements of the PEP air sampler are specified in 40 CFR Part 50,
Appendix L. Required recovery times and shipping schedule  are discussed in Section 6.4.4.
Table 6-1 summarizes some of the more critical performance requirements.

                       Table 6-1. Design/Performance Specifications

 Filter Design Specifications (Certified by Vendor)

 Equipment                   Acceptance Criteria                  Reference (40 CFR Part 50, Appendix L)
 Size                        46.2-mm diameter ± 0.25 mm           Section 6.1
 Medium                      Polytetrafluoroethylene              Section 6.2
 Support ring                Polymethylpentene; 0.38 ± 0.04 mm    Section 6.3
                             thick; 46.2 ± 0.25 mm outer
                             diameter; 3.68 mm (+0.00 mm,
                             -0.51 mm) width
 Pore size                   2 µm                                 Section 6.4
 Filter thickness            30-50 µm                             Section 6.5
 Maximum pressure drop       30 cm H2O at 16.67 Lpm               Section 6.6
 Maximum moisture pickup     10-µg increase in 24 hr              Section 6.7
 Collection efficiency       99.7%                                Section 6.8
 Filter weight stability     <20 µg                               Sections 6.9.1 and 6.9.2
 Alkalinity                  <25.0 microequivalents/g             Section 6.10

 Sampler Performance Specifications

 Equipment                   Acceptance Criteria                  Reference
 Sample flow rate            1.000 m3/hr                          Section 7.4
 Flow regulation             1.000 ± 5% m3/hr                     Section 7.4
 Flow rate precision         2% CV                                Section 7.4
 Flow rate accuracy          ±2%                                  Section 7.4
 External leakage            <80 mL/min                           Section 7.4
 Internal leakage            <80 mL/min                           Section 7.4
 Ambient temperature sensor  -30°C to 45°C; 0.1°C resolution      Section 7.4; Volume II, Section 2.12
                             and ±2°C accuracy
 Filter temperature sensor   -30°C to 45°C; 0.1°C resolution      Section 7.4; Volume II, Section 2.12
                             and ±1.0°C accuracy
 Barometric pressure         600 mm Hg to 800 mm Hg; 5-mm         Section 7.4; Volume II, Section 2.12
                             resolution and ±10-mm accuracy
 Clock/timer                 Date/time; 1-min resolution and      Section 7.4; Volume II, Section 2.12
                             ±1 min/mo accuracy
The air samplers will be purchased, distributed, and certified by EPA as meeting the
requirements specified in the Federal Register; therefore, the PEP assumes that the instruments
are adequate for sampling PM2.5. However, the PEP is responsible for certifying the performance
parameters of the PM2.5 samplers after assuming custodianship of these samplers. Routine
verifications (every sampling event) and quarterly audits of sampler performance specifications
are conducted thereafter. Element 15.0, Instrument/Equipment Testing, Inspection, and
Maintenance Requirements, lists all the primary operational equipment requirements for the PEP
PM2.5 data collection operations. Additional support equipment is listed in the PEP Field SOPs.
6.2.2  Critical Field Measurements

Table 6-2, which is based on Table L-1 of Appendix L in the Federal Register, represents the
field measurements that must be collected. These measurements are made by the air sampler and
are stored in the instrument for downloading by the FS during routine visits.

                       Table 6-2. Field Measurement Requirements

 Information to be Provided                       Appendix L    Format            Units
                                                  Section
 Flow rate, 30-second maximum interval            7.4.5.1       xx.x              L/min
 Flow rate, average for the sample period         7.4.5.2       xx.x              L/min
 Flow rate, coefficient of variation for the      7.4.5.2       xx.x              %
   sample period
 Flow rate, 5-minute average out of               7.4.5.2       On/off
   specification (flag)
 Sample volume, total                             7.4.5.2       xx.x              m3
 Temperature, ambient, 30-second interval         7.4.8         xx.x              °C
 Temperature, ambient, minimum, maximum,          7.4.8         xx.x              °C
   average for the sample period
 Barometric pressure, ambient, 30-second          7.4.9         xxx               mm Hg
   interval
 Barometric pressure, ambient, minimum,           7.4.9         xxx               mm Hg
   maximum, average for the sample period
 Filter temperature, 30-second interval           7.4.11        xx.x              °C
 Filter temperature, differential, 30-minute      7.4.11        On/off
   interval, out of specification (flag)
 Filter temperature, maximum differential         7.4.11        x.x, YY/MM/DD     °C, date/time
   from ambient, date, time of occurrence                         HH:mm
 Date and time                                    7.4.12        YY/MM/DD HH:mm    date/time
 Sample start and stop time settings              7.4.12        YY/MM/DD HH:mm    date/time
 Sample period start time                         7.4.12        YYYY/MM/DD HH:mm  date/time
 Elapsed sample time                              7.4.13        HH:mm             hr:min
 Elapsed sample time out of specification         7.4.13        On/off
   (flag)
 Power interruptions >1 min, start time of        7.4.15.5      1HH:mm, 2HH:mm,   hr:min
   first 10 power interruptions                                   etc.
 User-entered information, such as sampler        7.4.16        As entered
   and site identification

 Note: Table L-1 of 40 CFR Part 50, Appendix L, also specifies when each item must be
 available: to the operator at any time the sampler is operating, following the end of the sample
 period (until the operator manually resets the sampler or the sampler automatically resets itself
 at the start of a new sample period), on the sampler's visual display, and/or as digital data at the
 sampler's data output port; it also identifies the items that must be reported to the AQS
 database. Digital readings, both visual and data output, shall have no less than the number of
 significant digits and resolution specified in the table. Flag warnings ("out of specification"
 items) may be displayed to the operator by a single-flag indicator or individually; only a set (on)
 flag warning must be indicated, and an unset (off) flag may be indicated by the absence of a
 flag warning. Sampler users should refer to Section 10.12 of Appendix L regarding the validity
 of samples for which the sampler provided an associated flag warning.


In addition to the measurements listed in Table 6-2, supporting field data will also be
collected. These additional parameters are identified in the PEP Field SOPs and help to identify
the samples and ensure proper COC, holding times, and data quality. The values are recorded on
the COC Form and the FDS.
6.3     Laboratory Activities

The PEP also requires extensive laboratory activities, including filter handling, inspection,
equilibration, weighing, data entry/management, and archiving. Region 4 is currently responsible
for laboratory activities for this program. Detailed Laboratory SOPs have been developed. In
addition, Good Laboratory Practices must be followed. The PEP laboratory must conform to the
following:

    •  Microbalance operation and calibration must be in accordance with the vendor's
       instruction manual, with the requirements for gravimetric analyses provided in 40 CFR
       Part 50, Appendix L, and with QA Guidance Document 2.12, Monitoring PM2.5 in
       Ambient Air Using Designated Reference or Class I Equivalent Methods.
    •  Activities must adhere to the PEP Laboratory SOPs.
    •  Activities must adhere to the standards, principles, and practices outlined in the PEP
       QAPP.
    •  Personnel must complete EPA's federally implemented training and certification program
       annually.
The following information summarizes the laboratory activities, in general chronological order,
that are detailed in the laboratory SOPs.

Pre-sampling weighing will include the following:
    •  Filters will be received from EPA and examined for integrity.
    •  Filters will be enumerated for data entry.
    •  Filters will be equilibrated and weighed.
    •  Filters will be prepared for field activities or stored.
    •  The laboratory will develop and maintain shipping/receiving requirements, which would
       apply to containers, cold packs, minimum/maximum thermometers, and COC
       requirements/documentation.

Post-sampling weighing will include the following:
    •  Filters will be received in the laboratory, checked for integrity (e.g., damage,
       temperature), and logged in.
    •  Filters will be archived (cold storage) until they are ready for weighing.
    •  Filters will be brought into the weighing  facility  and equilibrated for 24 hours.
    •  Filters will be weighed, and gravimetric data will be entered  in the database to calculate a
       concentration.
    •  Field data will be entered into the database.
    •   Filters will be archived in cold storage for the remainder of the sampling calendar year
        and for the next full calendar year, and then will be kept at room temperature for 3
        additional years. As an example, a filter sampled on March 1, 2007, would be kept in
        cold storage until December 31, 2008, and not disposed of until after December 31, 2011.
    •  Required data will be submitted to the AQS database.
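The filter archive schedule above can be sketched as a simple date calculation. This is a hypothetical helper, written to match the March 1, 2007 example in the text:

```python
from datetime import date

def retention_dates(sample_date: date):
    """Filter archive milestones: cold storage through the end of the next
    full calendar year, then room-temperature storage for 3 more years."""
    cold_until = date(sample_date.year + 1, 12, 31)
    dispose_after = date(cold_until.year + 3, 12, 31)
    return cold_until, dispose_after

cold, dispose = retention_dates(date(2007, 3, 1))
print(cold)     # 2008-12-31
print(dispose)  # 2011-12-31
```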
The details for these activities are included in various  sections of this document, as well as in
laboratory SOPs. Table 6-3 provides the performance  specifications  of the laboratory
environment and equipment.

                    Table 6-3. Laboratory Performance Specifications

 Equipment                  Acceptance Criteria
 Microbalance               Resolution of 1 µg; repeatability of 1 µg.
 Microbalance environment   Climate-controlled, draft-free room, chamber, or equivalent. Mean
                            relative humidity between 30% and 40%, with a target of 35% and
                            variability of not more than ±5% over 24 hours, with minimums and
                            maximums never to fall outside the range of 25%-45%. Mean
                            temperature held between 20°C and 23°C, with a variability of not
                            more than ±2°C over 24 hours, with minimums and maximums never
                            to fall outside the range of 18°C-25°C.
 Mass reference standards   Standards will bracket the expected weight of the filter; the
                            individual (Class 1) standard's tolerance will be within ±25 µg; mass
                            certified annually.
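A 24-hour weighing-environment record could be screened against the Table 6-3 criteria roughly as follows. This is a sketch only: it interprets the ±5% RH and ±2°C variability limits as deviations from the 24-hour mean, which is one plausible reading of the criteria.

```python
def environment_ok(rh_readings, temp_readings) -> bool:
    """Screen 24-hour relative-humidity (%) and temperature (C) readings
    against the Table 6-3 microbalance-environment criteria: mean RH
    30-40% with all readings in 25-45%, mean temperature 20-23 C with
    all readings in 18-25 C, and limited variability about the mean."""
    rh_mean = sum(rh_readings) / len(rh_readings)
    t_mean = sum(temp_readings) / len(temp_readings)
    return (30.0 <= rh_mean <= 40.0
            and all(25.0 <= rh <= 45.0 for rh in rh_readings)
            and all(abs(rh - rh_mean) <= 5.0 for rh in rh_readings)
            and 20.0 <= t_mean <= 23.0
            and all(18.0 <= t <= 25.0 for t in temp_readings)
            and all(abs(t - t_mean) <= 2.0 for t in temp_readings))

print(environment_ok([34, 35, 36], [21.0, 21.5, 22.0]))  # True
print(environment_ok([34, 35, 48], [21.0, 21.5, 22.0]))  # False: RH max out of range
```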
6.3.1   Critical Laboratory Measurements

Filter pre-weights (unexposed) and post-weights (exposed) are the most critical measurements in
the laboratory. The difference between these two measurements provides the net weight of
particles in micrograms (µg), which, when divided by the air volume in cubic meters (m3) pulled
through the filter, provides the final concentration (µg/m3). In addition to these critical
measurements, supporting laboratory data will also be collected to help identify the samples and
ensure proper COC, holding times, and data quality.  These additional parameters are described
in more detail in Element 13.0, Analytical Methods Requirements, and in the PEP Laboratory
SOPs.
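The mass-to-concentration arithmetic above is straightforward. The sketch below uses hypothetical filter weights and a 24-hour sample at the FRM design flow of 16.67 L/min (about 24.0 m3 of air):

```python
def pm25_concentration(pre_weight_ug, post_weight_ug, volume_m3):
    """PM2.5 concentration (ug/m3): net particle mass (post-weight minus
    pre-weight, in ug) divided by the sampled air volume (m3)."""
    return (post_weight_ug - pre_weight_ug) / volume_m3

# 24 hours at the 16.67 L/min design flow is roughly 24.0 m3 of air.
volume = 16.67 / 1000.0 * 60.0 * 24.0
# Hypothetical pre- and post-weights in micrograms (300-ug net catch).
print(round(pm25_concentration(148_212.0, 148_512.0, volume), 1))  # 12.5
```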

6.4    Schedule of Activities

The PEP consists of laboratory and field activities, which must be coordinated and completed in
a timely, efficient manner for the program to be successful. This includes activities such as
acquiring equipment and supplies, developing sampling schedules,  shipping and receiving
prepared filter cassettes, conducting  site visits, weighing filters, and performing QC checks. The
sections below describe some of the  time-critical components of conducting PEP audits.
Additional detail  is provided in the PEP Field and Laboratory SOPs. The laboratory must also
ensure that its operating calibration standards and independent internal audit standards are
certified annually as National Institute of Standards and Technology (NIST) traceable.

6.4.1   PEP Audit Frequency

The sampling design has been codified in 40 CFR Part 58, Appendix A, Section 3.2.7, as
follows.

The PEP is an independent assessment used to estimate total measurement system bias. These
evaluations will be performed under  the PM PEP (40 CFR Part 58,  Appendix A, Section 2.4) or
a comparable program. PEs will be performed on the SLAMS monitors annually within each
PQAO. For PQAOs with more than five monitoring  sites, eight valid PE audits must be collected
and reported each year. A valid PE audit (for the purposes of calculating network bias and
precision as required by the regulations) means that both the primary monitor and PEP audit
concentrations have not been invalidated and are greater than 3 µg/m3. Additionally, each year,
every designated FRM or FEM sampler within a PQAO must have

    •   Each method designation evaluated each year
    •   All FRM or FEM samplers subjected to a PEP audit at least once every 6 years, which
        equates to approximately 15% of monitoring sites audited per year.
A limited number of "make-up" PEs are included in the annual budget by OAQPS. Scheduling is
the responsibility of the Regional WAM/TOPO/DOPO and the SLT.

NOTE: Sites that have seasonally low concentrations should be sampled during times when
concentrations are expected to be greater than 3 µg/m3. EPA recognizes that it may be difficult or
impossible to obtain valid audits at sites where the concentration rarely exceeds 3 µg/m3. EPA is
currently considering ways to evaluate such sites. Audits that are otherwise valid but do not
meet the "greater than 3 µg/m3" criterion are still useful for evaluating sampler operation, even if
such audit data may not be used in the calculations for sampler bias.
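
The validity and count rules above can be sketched in code; this is an illustrative check only, and the function and argument names are hypothetical rather than taken from the PEP data system:

```python
MIN_VALID_CONC = 3.0  # ug/m3; pairs at or below this cannot be used for bias

def is_valid_pair(primary_conc, pep_conc, primary_ok=True, pep_ok=True):
    """A PE audit pair counts toward bias/precision statistics only if
    neither measurement was invalidated and both exceed 3 ug/m3."""
    return (primary_ok and pep_ok
            and primary_conc > MIN_VALID_CONC
            and pep_conc > MIN_VALID_CONC)

def required_audits(num_sites):
    """Annual valid PE audits required for a PQAO per 40 CFR Part 58,
    Appendix A, Section 3.2.7."""
    return 8 if num_sites > 5 else 5

pairs = [(12.4, 11.9), (2.1, 2.4), (8.8, 9.1)]
valid = sum(is_valid_pair(p, q) for p, q in pairs)
print(valid, required_audits(num_sites=7))  # 2 valid pairs; 8 required
```

A pair such as (2.1, 2.4) fails the concentration criterion and therefore does not count toward the annual requirement, although it may still be useful for evaluating sampler operation.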

6.4.2   PEP Sampling Schedule

SLT organizations will work with EPA Regions to select and develop a list of sites for the
evaluations to be conducted in each calendar year on or before December 1 of the previous year.
The Regional WAM/TOPO/DOPOs will attempt to determine the most efficient site visit
schedule.  This schedule should be based upon the following:

    •   CFR requirements for audit frequency
    •   Meeting the same monitoring schedule as the routine sampler being evaluated
    •   Site proximity (the sites that are closest in proximity to each other can be visited within
       the same day or week).

PEs should be implemented on a normal sampling day so that they do not create additional work
for the SLT organizations. Thus, for sites that only sample 1 day in 3 or 1 day in 6, this schedule
must be taken into account when scheduling a PE site visit. However, if the SLT agency is
amenable to performing a PE on a day other than a routine sampling day and is willing to post the
result to AQS, then the visit can be scheduled. Accurate reporting of alternate sampling days is
critical.
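
As an illustration of the proximity criterion, the following sketch greedily groups sites whose pairwise distance falls under a threshold so one trip can cover them. The site coordinates and the 50 km cutoff are hypothetical; the PEP does not prescribe any particular grouping algorithm.

```python
import math

# Hypothetical site coordinates (latitude, longitude in degrees)
sites = {"A": (35.9, -78.9), "B": (35.8, -78.7), "C": (33.7, -84.4)}

def dist_km(p, q):
    """Equirectangular approximation; adequate for rough trip grouping."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
    x = (lon2 - lon1) * math.cos((lat1 + lat2) / 2)
    return 6371.0 * math.hypot(x, lat2 - lat1)

def group_by_proximity(sites, max_km=50.0):
    """Greedily cluster sites within max_km of a seed site."""
    groups, remaining = [], dict(sites)
    while remaining:
        name, loc = remaining.popitem()
        group = [name]
        for other, oloc in list(remaining.items()):
            if dist_km(loc, oloc) <= max_km:
                group.append(other)
                del remaining[other]
        groups.append(sorted(group))
    return groups

print(group_by_proximity(sites))
```

Sites A and B, roughly 20 km apart, fall into one trip group; site C is visited separately.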

6.4.3   General Time Line for PEP Activities

Below is a list of activities, in general chronological order, that are performed by PEP laboratory
and field personnel when conducting an FRM PE:
  1.   A  field equipment list is developed, and the equipment is acquired.
  2.   The EPA WAM/TOPO/DOPO and SLT organization determine the annual PEP sampling
       schedule.
  3.   The WAM/TOPO/DOPO, FS, and SLT organization schedule site visits.

  4.    The FS attempts to identify any issues with the site prior to audit; the
       WAM/TOPO/DOPO resolves issues with the SLT organization.
  5.    The FS and site operators confirm the scheduled PE.
  6.    The FS sends an order for filters to the weighing laboratory.
  7.    The PEP weighing laboratory activities commence.
       a.  The weighing laboratory receives filter shipments from EPA.
       b.  The weighing laboratory checks, equilibrates, and weighs filters.
       c.  The weighing laboratory loads the filters into cassettes and ships them with their
          accompanying COC forms to the EPA Regions/FS office.
  8.    The FS receives the filter cassettes, FDSs, and COC forms and completes as much of
       these sheets and forms as possible at the field office.
  9.    The FS transports the PEP audit sampler to the site and evaluates the site for setup.
  10.  The FS assembles the sampler, sets the date/time, and then performs leak checks and
       barometric pressure, temperature, and flow rate verifications.
  11.  The FS performs a field blank exercise if needed.
  12.  The FS installs the sampling filter cassette.
  13.  The FS sets the controller to run during a 24-hour sampling event (midnight to midnight).
  14.  The sampler collects PM on the filter during the scheduled event.
  15.  The FS recovers the filter cassette and downloads recorded sampling event parametric
       summary data.
  16.  The FS disassembles the sampler.
  17.  The FS packages recovered cassette(s) and ships them along with data (e.g., diskette or
       other portable media), FDSs,  and COC forms back to the weighing laboratory.
  18.  The PEP weighing laboratory will post-equilibrate and weigh filters.
  19.  The PEP weighing laboratory performs data validation activities, including FS review of
       transcribed field data.
  20.  The EPA WAM/TOPO/DOPO for the PEP weighing laboratory approves data that are to
       be loaded into the AQS.
  21.  EPA OAQPS (contractor) loads data into the AQS.

6.4.4  Implementation Time Lines
Several other important dates involving both laboratory and field activities must be met during
implementation.

One time-critical aspect of the implementation process is the filter holding time. As illustrated in
Figure 6-2 and as stipulated in the CFR, filters must be used within 30 days of pre-sampling
weighing, or they must be  reconditioned and pre-weighed again. Therefore, it is critical that the
weighing laboratory develop a schedule to provide the FSs with filters that will be used in the
appropriate time frame.

Figure 6-2 indicates that for best practice, the FS will collect the filters within 24 hours of the
end of the sample exposure period. Filters collected after 48 hours will be assigned a minor flag
by the weighing laboratory, which may contribute to an invalidation depending upon the result
of other QC checks. The critical recovery time, beyond which filters will be automatically
invalidated, is 96 hours.
                               Life Cycle of a PEP Filter

[Figure 6-2 is a time line, in days from steady tare weight (0 to 60 days), showing the life
cycle of a PEP filter and the flag assigned when each holding time is exceeded:

    •   Filter must be exposed within 30 days after steady tare weight (> 30-day filter
        expiration flag)
    •   24-hr sampling period
    •   Routine filter collection within 24 hrs of exposure for "best practice"
    •   Filter collection due to weekends and holidays within 48 hrs of exposure (> 48-hr
        filter collection flag)
    •   Emergency delayed collection within 96 hours of exposure (> 96-hr filter collection
        flag; invalidates sample*)
    •   Overnight shipping at < 4°C (> 4°C maximum shipping temperature flag)
    •   Routine post-sampling weighing within 10 days of exposure (> 10-day weighing flag)
    •   For emergency delayed collection, post-sampling weighing within 15 days of exposure
        (> 15-day weighing flag; invalidates sample*)]

* Invalidation may be overridden in special circumstances by the lab supervisor.

                          Figure 6-2. Critical filter-holding times.
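
The holding-time rules in Figure 6-2 can be summarized as a small validation sketch. The function name is illustrative, the thresholds are those shown in the figure, and supervisor overrides of invalidations are not modeled:

```python
from datetime import datetime

def holding_time_flags(tared, exposed, recovered, weighed):
    """Flags per Figure 6-2. Arguments are datetimes for: stable tare
    weight, end of the 24-hr sampling period, filter recovery, and
    post-sampling weighing."""
    flags, valid = [], True
    if (exposed - tared).days > 30:
        flags.append("> 30-day filter expiration")
    rec_hrs = (recovered - exposed).total_seconds() / 3600.0
    if rec_hrs > 96:                         # automatic invalidation
        flags.append("> 96-hr collection")
        valid = False
    elif rec_hrs > 48:                       # minor flag (weekends/holidays)
        flags.append("> 48-hr collection")
    weigh_days = (weighed - exposed).total_seconds() / 86400.0
    if weigh_days > 15:                      # invalidation unless overridden
        flags.append("> 15-day weighing")
        valid = False
    elif weigh_days > 10:
        flags.append("> 10-day weighing")
    return flags, valid

flags, valid = holding_time_flags(datetime(2009, 2, 1), datetime(2009, 2, 10),
                                  datetime(2009, 2, 11), datetime(2009, 2, 18))
print(flags, valid)  # [] True: recovered within 24 hrs, weighed within 10 days
```

A filter recovered 120 hours after exposure would instead receive a "> 96-hr collection" flag and be marked invalid.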
Ideally, samples will be sent the day of removal to the appropriate laboratory via next-day
delivery. The FS should ship the exposed filters within 8 hours of recovery on Monday through
Thursday and as soon as possible if recovery occurs on a Friday. If an issue arises in which
shipment cannot occur within these guidelines, then the FS must store the filters at less than or
equal to 4°C until the next available shipping day. The weighing laboratory must be notified of
the delayed shipment date because the post-sample weighing must occur within 10 days of
exposure to avoid a data validation flag. Data will be immediately downloaded from the portable
sampler and stored on the computer's hard drive and two portable storage media (e.g., diskette,
CD, or USB drive). One copy of these data will be shipped with the sample. Data may also be
transmitted electronically (e.g., via e-mail) if necessary to the weighing laboratory. Table 6-4
provides a  summary of the key  activities previously discussed.

                              Table 6-4. Implementation Summary

Activity                                        | Holding Time           | From                          | To
------------------------------------------------|------------------------|-------------------------------|--------------------------------------
Laboratory tares the filters                    | As needed              | Filter box                    | Stable tare weight
Laboratory ships the filters to the FS (best practice)^a | <7 days       | Stable tare weight            | Shipment
FS loads the filter into the sampler^b          | <30 days from pre-weigh| Received from the laboratory  | Mounting in sampler
Filter exposure                                 | 1 day                  | Mounting in sampler           | End of sampling period
Filter collection^c                             | 24 (48) (96) hrs       | End of sampling period        | Recovery
Shipped to laboratory (best practice)^d         | <8 hrs                 | Recovery                      | Shipment
Laboratory equilibrates and weighs the filter^e | <10 (15) (30) days     | End of sampling period        | Stable post-sampling gravimetric mass

The maximum life for a PEP audit filter is 46 days.

a  The PEP QAPP states that the filter must be loaded into the sampler or used as a blank within 30 days after the
   tare weight stabilizes. Best practice dictates that the laboratory ship tared filters as soon as possible, usually
   within 1 week.
b  Refer to the "use by" date on the PEP COC Form.
c  PEP filters should be routinely recovered within 24 hours after conclusion of exposure. Note that 48-hour
   collection is permissible due to holidays and weekends if the site is inaccessible. These filters receive a 48-hour
   collection flag. Up to 96-hour collection is permissible, but only in the case of an emergency (e.g., sickness,
   accident). If the collection time is > 96 hours, the sample will receive an invalidation flag.
d  The FS will always transport exposed filters and blanks with chilled cold packs. The PEP requires 8-hour
   packaging and shipping after filter recovery. However, if the sample is recovered on a Friday, then it should be
   stored at a temperature <4°C until the next available shipping day. The laboratory must be notified of the delay
   because the sample must be weighed within 10 days after exposure to avoid a validation flag, which in
   conjunction with another flag may invalidate the sample.
e  Filters received from the field are to be equilibrated and post-weighed within 10 days after exposure.
   Exceptional events, such as Thursday sampling events followed by a Monday holiday or collection between 48
   and 96 hours (resulting from emergencies), will permit 15-day post-sampling weighing periods. NOTE: Samples
   weighed after 15 days will be considered invalid unless additional QA evaluation is performed by the
   laboratory's QA Officer. Based on review and acceptance of the sample's consistency with historical CV data
   (comparing differences between PEP and routine site sample data), the validation flag may be overridden by the
   QA Officer. However, any PEP sample that cannot be weighed within 30 days from exposure shall not be
   overridden and should therefore not be post-weighed.

6.4.5   Assessment Time Lines
6.4.5.1   Data Availability

The PEP weighing laboratory should complete data validation within 60 days of the sample end
date. The laboratory should submit its validated data to OAQPS (or authorized contractor)
monthly for data assessment purposes. Submitting routine sampler data as soon as possible is
encouraged to ensure that data assessment occurs in a timely manner.

PEP audit results are posted to the AQS as data pairs. A data pair consists of the PEP audit and
site's measured values. SLAMS sites are required to post their site data to the AQS within 90

days after the end of the quarter. Because posting PEP data requires first obtaining the site's
measured value from the AQS, PEP data cannot normally be posted until after the due dates
listed in Table 6-5. In cases where the site data have been uploaded into the AQS and validated
on or before the due date, the PEP audit data should be available through the AQS within 30
days after the due date (to allow enough time for processing and review). Data submitted after
the due date will be available within 30 days after the end of the next reporting period.

                     Table 6-5. Data Reporting Schedule for the AQS

Reporting Period          | Due Date
--------------------------|-------------
January 1-March 31        | June 30
April 1-June 30           | September 30
July 1-September 30       | December 31
October 1-December 31     | March 31
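
The reporting schedule in Table 6-5 lends itself to a simple lookup. The sketch below maps a sample date to its due date; the function and dictionary names are illustrative:

```python
from datetime import date

# Quarter-start month -> (due month, due day) per Table 6-5; due dates for
# Q1-Q3 fall in the same year, Q4's due date falls in the following year.
DUE = {1: (6, 30), 4: (9, 30), 7: (12, 31), 10: (3, 31)}

def due_date(sample_day: date) -> date:
    """AQS reporting due date for the quarter containing sample_day."""
    q_start = 3 * ((sample_day.month - 1) // 3) + 1
    month, day = DUE[q_start]
    year = sample_day.year + (1 if q_start == 10 else 0)
    return date(year, month, day)

print(due_date(date(2009, 2, 14)))   # 2009-06-30
print(due_date(date(2009, 11, 3)))   # 2010-03-31
```

When the site data are validated on or before this due date, the corresponding PEP pair should be available in the AQS within about 30 days afterward, as described above.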
6.4.5.2  Assessments of PEP Data

The Region 4 ESAT Contractor is tasked to provide Level 0 and Level 1 assessments of the PEP
data. Refer to Element 22.0 for a discussion of PEP data review, validation, and verification
requirements, which incorporate these levels of assessment. Following the SLT agencies'
submittals of quarterly PM2.5 FRM/FEM data, OAQPS (via the support contractor) will load the
PEP data into the AQS. The PEP Laboratory Manager, Regional (Laboratory)
WAM/TOPO/DOPO, and the OAQPS contractor(s) will review the PEP data and report any
significant PEP operational issues reflected in the data to the PEP Workgroup. After
both routine data and PE data for a site are in the AQS database, OAQPS, EPA Regions, and
SLT organizations can use the AQS data evaluation programs, based on data quality assessment
techniques, to assess this information.

6.4.6  OAQPS Reporting Time Lines

6.4.6.1  QA Reports

As mentioned in Element 3.0, Distribution, OAQPS plans to develop and distribute Annual QA
Summary Reports and interpretive 3-year QA Reports per the distribution list in Element 3.0 and
to other interested parties, such as ESAT contractors. The Annual QA Summary Report will be
based on a calendar year, and it should be completed 6 months from the last valid entry of
routine data by the SLT organizations. This report will include basic statistics of the data,
including completeness; PEP results versus FRM/FEM results; results of collocation studies for
precision of PEP samplers, both aggregated and by Region; QC charts for the weighing
laboratory; PEP sampler performance versus acceptance criteria; PEP TSA findings; and a
summary of yearly standard certifications. The 3-year QA Report should be generated 9 months
after the last valid entry of routine data by the SLT organizations for the final year. This report is
a composite of the annual reports, but with a more narrative interpretation and evaluation of
longer term trends with respect to PEP sampler and operational performance.  In the year that a 3-
year QA Report is generated, the Annual QA Summary Report is not required.

6.4.6.2  Assessment Reports

Each EPA Region, ORIA, and OAQPS will perform TSAs of the PEP ESAT contractors and PEP
activities as specified in Table 6-6 below. Initial assessment findings will be documented and
reported back to the audited organization within 15 working days after the assessments. Final
assessment reports, including responses to findings and follow-up activities, will be submitted to
the National PEP Project Leader at OAQPS by the end of the first quarter of the following year
to have the results summarized in the Annual QA Summary and 3-year QA Reports.

6.5    Project Assessment Techniques

An assessment is an evaluation process used to measure the performance or effectiveness of a
system and its elements. As used here, "assessment" is an all-inclusive term used to denote any
of the following: audit, PE, MSR, peer review, inspection, or surveillance. Definitions for each
of these activities can be found in the glossary (Appendix A). Element 20.0, Assessments and
Response Actions, discusses the details of the assessments. Table 6-6 provides information on
the organizations that implement the assessment and the frequency of these assessments.

                            Table 6-6. Assessment Schedule

Assessment Type                                 | Assessment Agency                                                                   | Frequency
------------------------------------------------|-------------------------------------------------------------------------------------|---------------------------------------------------------------
TSA of FS and field operations                  | EPA Regional office                                                                 | One per year
Surveillance of FSs' operations                 | OAQPS at annual recertification of FSs, or by the EPA Regional office as needed     | One per year unless there is a need for additional Regional surveillance
TSA of the gravimetric laboratory and laboratory operations | OAQPS, or the EPA Regional office if the SLT organization runs its own PEP laboratory | One per year
PE of weighing lab(s)                           | ORIA                                                                                | Two per year, approximately every 6 months
Management systems review of Regional conduct of the PEP | OAQPS                                                                      | Two Regions per year
Data Quality Assessment                         | OAQPS                                                                               | Every year
6.6    Project Records

The field and laboratory programs will establish and maintain procedures for the timely
preparation, review, approval, issuance, use, control, revision, and maintenance of documents
and records. Table 6-7 presents the categories and types of records and documents that are
applicable to document control for PM2.5 information. Key documents in each category are
described in more detail in Element 9.0, Documentation and Records.

                          Table 6-7. Critical Documents and Records

Categories                      | Record/Document Types
--------------------------------|------------------------------------------------------------------
Management and organization     | State Implementation Plan; Reporting agency information; Organizational structure; Personnel qualifications and training; Training certification; Quality Management Plan; Document Control Plan; EPA directives; Grant allocations; Support contract
Site information                | Network description; Site characterization file; Site maps; Site pictures
Environmental data operations   | Quality Assurance Project Plans; Standard operating procedures; Field and laboratory notebooks; Sample handling/custody records; Inspection/maintenance records
Raw data                        | Any original data (routine and QC data), including data entry forms
Data reporting                  | Air Quality Index Report; Annual state and local monitoring stations' air quality information; Data/summary reports; Journal articles/papers/presentations
Data management                 | Data algorithms; Data management plans/flowcharts; PM2.5 data; Data management systems
QA                              | Good Laboratory Practices; Network reviews; Control charts; Data Quality Assessments; QA reports; System audits; Response/corrective action reports; Site audits
References

1.  U.S. EPA (Environmental Protection Agency). 2006. National Ambient Air Quality
    Standards for Particulate Matter—Final Rule.  40 CFR Part 50. Federal Register
    71(200):61144-61233. October 17.

-------
                                                                       Project: PEP QAPP
                                                                         Element No.: 7.0
                                                                          Revision No.: 1
                                                                           Date: 3/6/2009
                                                                       	Page 1 of 7
      7.0  Data Quality Objectives and Criteria for Measurement

The purpose of this element is to document the DQOs of the project and to establish performance
criteria for the environmental data operation (EDO) that will be used to generate the data.

7.1    Data Quality Objectives

DQOs are qualitative and quantitative statements derived from the DQO process that clarify the
monitoring objectives, define the appropriate type of data, and specify the tolerable levels of
decision errors for the monitoring program. By applying the DQO process to the development of
a quality system for PM2.5, EPA guards against committing resources to data collection efforts
that do not support a defensible decision. The DQO process was implemented for the PM2.5 PEP
in 1997. The DQOs were based on the ability of the decision maker(s) to make NAAQS
comparisons within an acceptable probability of decision error. Based upon the acceptable
decision error of 5%, DQOs for acceptable precision (10% CV) and bias (±10%) were
identified. These precision and bias values will be used as goals from which to evaluate and
control measurement uncertainty. The PEP provides the measurements upon which the bias
component of the DQO is evaluated and is, in essence, a network-scale QC check. In many
environmental measurements, bias can be measured and evaluated by simply introducing
standard reference material into a measurement phase and evaluating the results. Because there is
no accurate way of introducing a known concentration of particles into a PM2.5 FRM/FEM
sampler, the PEP was developed  to serve, as closely as possible, as a reference standard by
which a relative network bias can be determined (and in a gross sense, the relative accuracy of a
local monitor).

The data collected under the PEP are to be used to determine whether there is bias in the
measurement system used to measure PM2.5 for comparison to the PM2.5 NAAQS. It is important
to control the repeatability of the measurements from each PEP sampler. It is also important to
be sure there are sufficient data on which to base a decision about the
presence of bias. The more samples used in the analysis, the greater the confidence; however, it is
important not to waste resources by collecting too many samples.

The minimum number of samples needed to detect a bias of ± 10% depends on the precision
(CV) of PM2.5 measurements and the actual bias, which were not well characterized at the
beginning of the PEP. Initially, based on a statistical review, the audit frequency was set at 25%
of the national PM2.5 FRM/FEM  network each year; each selected sampler was audited four
times during the specified year. This frequency was shown to be adequate to evaluate bias for a
typical reporting organization, assuming initial estimates of sampler CV of less than 10% and
allowing for a 10% decision error.

In 2005, the minimum sampling frequencies that were needed to detect a 10% bias over 3 years
were re-evaluated using actual network data to get a better estimate of CV and typical bias
levels. A paper that provides more details on this re-evaluation is provided as Appendix B,


Support Data Quality Objectives. Using the updated estimates, it was determined that
approximately 24 audits over a 3-year period (i.e., 8 per year) would be adequate to evaluate a
± 10% bias for a reporting organization. Recent changes to the regulations (40 CFR Part 58,
Appendix A, Section 3.2.7) now require all organizations with five or fewer sites to collect at
least five valid PE audits per year and organizations with more than five sites to collect at least
eight valid PE audits per year (see Section 6.4.1 for additional discussion on audit frequency).
These sampling frequencies are consistent with the frequencies described in Appendix B,
Documents to Support Data Quality Objectives, to meet the DQOs of the PEP for the national
PM2.5 FRM network. The data will be evaluated year by year and cumulatively every third year.
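
For intuition only, a textbook one-sample z-test approximation relates the number of audits to the detectable bias and the measurement CV. This is not the Appendix B analysis, which used empirical network data, but it shows the shape of the trade-off:

```python
from math import ceil
from statistics import NormalDist

def n_for_bias(cv_pct, bias_pct, alpha=0.05, power=0.90):
    """Samples needed for a one-sample z-test on percent differences to
    detect a relative bias of bias_pct given a measurement CV of cv_pct."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)   # two-sided type I error
    z_b = NormalDist().inv_cdf(power)           # power (1 - type II error)
    return ceil(((z_a + z_b) * cv_pct / bias_pct) ** 2)

print(n_for_bias(cv_pct=10, bias_pct=10))  # 11
print(n_for_bias(cv_pct=10, bias_pct=5))   # 43
```

Halving the bias to be detected roughly quadruples the required sample size, which is why the regulatory minimums (five or eight valid audits per year, accumulated over 3 years) are evaluated cumulatively.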

7.2    Measurement Quality Objectives

After a DQO is established, the quality of the data must be evaluated and controlled to ensure that
it is maintained within the established acceptance criteria. Measurement Quality Objectives
(MQOs) are designed to evaluate and control various phases  (e.g., sampling, preparation,
analysis) of the measurement process to ensure that total measurement uncertainty is within the
range recommended by the DQOs. The MQOs can be defined in terms  of the following data
quality indicators:
   Precision—A measure of mutual agreement among individual measurements of the same
   property, usually under prescribed similar conditions. This is the random component of error.
   Bias—The systematic  or persistent distortion of a measurement process, which causes error in
   one direction. Bias will be determined by estimating the positive and negative deviations from
   the true value as a percentage of the true value.
   Representativeness—A measure of the degree in which  data accurately and precisely
   represent a characteristic of a population, parameter variations at a  sampling point, a process
   condition, or an environmental condition.
   Detectability—The determination  of the low-range critical value of a characteristic that a
   method-specific procedure can reliably discern.
   Completeness—A measure of the amount of valid data obtained from a measurement system
   compared to the amount that was expected to be obtained under correct, normal conditions.
   Data completeness requirements are included in the reference methods (40 CFR Part  50).
   Comparability—A measure of confidence with which one dataset can be compared to
   another.

"Accuracy" is a term that is frequently  used to represent closeness to "truth" and includes a
combination of precision and bias error components. The term "accuracy" has been used
throughout the CFR and in some of the elements of this document. The PEP attempts to
apportion measurement uncertainties into precision and bias components.
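
As a sketch of how these two components can be separated, the code below computes bias (mean percent difference) and precision (spread of percent differences) from PEP audit pairs, with the PEP audit value treated as the reference. The function names are illustrative, and the exact AQS statistical formulas may differ:

```python
import statistics

def percent_diffs(pairs):
    """pairs = [(routine_conc, pep_audit_conc), ...], both > 3 ug/m3."""
    return [100.0 * (meas - audit) / audit for meas, audit in pairs]

def bias_and_cv(pairs):
    """Bias = mean percent difference (systematic error component);
    CV = sample standard deviation of percent differences (random
    error component)."""
    d = percent_diffs(pairs)
    bias = statistics.mean(d)
    cv = statistics.stdev(d) if len(d) > 1 else 0.0
    return bias, cv

bias, cv = bias_and_cv([(10.5, 10.0), (9.8, 10.0), (21.0, 20.0)])
print(round(bias, 2), round(cv, 2))  # 2.67 4.04
```

In this toy dataset the routine monitors read about 2.7% high on average, with a spread of about 4% around that offset; both fall within the 10% DQO goals.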



For each of these attributes, acceptance criteria were developed for various phases of the EDO.
Acceptance criteria for some of these attributes are identified in various parts of 40 CFR, as well
as in Guidance Document 2.12. In theory, if these MQOs are met, then measurement uncertainty
should be controlled to the levels required by the DQO. It should be noted that some MQOs for
the PEP are more stringent than routine PM2.5 FRM MQOs. Table 7-1 lists the MQOs for the PEP.
More detailed descriptions of these MQOs and how they will be used to control and assess
measurement uncertainty will be described in other elements of this QAPP and in the Field and
Laboratory SOPs.
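
As one concrete example, an MQO acceptance check from Table 7-1 (single-point flow rate verification within ±4% of the working standard or of the 16.67-Lpm design flow) reduces to a relative-difference test; the function name here is illustrative:

```python
DESIGN_FLOW_LPM = 16.67  # PM2.5 FRM design flow rate

def flow_check_ok(measured_lpm, standard_lpm=DESIGN_FLOW_LPM, tol_pct=4.0):
    """True if the measured flow agrees with the standard within tol_pct."""
    return abs(measured_lpm - standard_lpm) / standard_lpm * 100.0 <= tol_pct

print(flow_check_ok(16.1))   # True: within 4% of design flow
print(flow_check_ok(15.5))   # False: off by about 7%
```

Per Table 7-1, a failed verification of this kind triggers a flow rate calibration, which must then agree within the tighter ±2% criterion.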

References

1. U.S. EPA (Environmental Protection Agency). 1998. EPA Guidance for Quality Assurance
   Project Plans. EPA QA/G-5, EPA/600/R-98/018. February.
2. U.S. EPA (Environmental Protection Agency). 1998. Quality Assurance Guidance Document
   2.12: Monitoring PM2.5 in Ambient Air Using Designated Reference or Class I Equivalent
   Methods. December.

-------
Table 7-1. Measurement Quality Objectives—Parameter PMi.s
Requirement
Frequency
Acceptance Criteria
40 CFR Reference
Lab/Field SOP
Reference
Filter Holding Times
Pre-sampling weighing
Post- sampling weighing
All filters
All filters
<30 days before sampling
<10 days stored at 4°C from sample end
date3
Part 50, Appendix L,
Section 8. 3
Part 50, Appendix L,
Section 8. 3
Lab SOP, Section 4
Lab SOP, Section 4
Reporting Units
Reporting units
All data
,«g/m3
Part 50.3

Detection Limit
Lower detection limit
Upper concentration limit
All data
All data
2 ^g/m3
200 /ng/m3
Part 50, Appendix L,
Section 3.1
Part 50, Appendix L,
Section 3. 2


Data Completeness
Data completeness
5 or 8 sites with 24-hr collocated filter
collection
100%
Part 58, Appendix A,
Section 3. 2. 7

Filter
Visual defect check
Exposure lot blanks
All filters
3 filters from each of the 3 boxes in lot
(9 filters total)
See reference
<1 5 jug change between weighings
Part 50, Appendix L,
Section 6.0
Not described
Lab SOP, Section 5
Lab SOP, Section 6
Filter Conditioning Environment
Pre-sample equilibration
Post- sample equilibration
Temperature range
Temperature control
All filters
All filters
All filters
All filters
24 hrs minimum in weighing room;
<5 /ig change between sequential
weighings of each filter
24 hrs minimum in weighing room;
<15 jug between sequential weighings for
2 of 3 filters in each filter batch
24-hr mean 20°C-23°C;
18°C minimum, 25°C maximum
± 2°C over 24 hr
Part 50, Appendix L,
Section 8.2
Part 50, Appendix L,
Section 8.2
Part 50, Appendix L,
Section 8.2.1
Part 50, Appendix L,
Section 8.2.2
Lab SOP, Section 6
Lab SOP, Section 6
Lab SOP, Section 6
Lab SOP, Section 6
                                                                                    8, 8 9

-------
Requirement
Relative humidity range
Relative humidity control
Frequency
All filters
All filters
Acceptance Criteria
24-hr mean 30%-40%;
25% minimum, 45% maximum
± 5% over 24 hr
40 CFR Reference
Part 50, Appendix L,
Section 8.2.3
Part 50, Appendix L,
Section 8.2.4
Lab/Field SOP
Reference
Lab SOP, Section 6
Lab SOP, Section 6
Laboratory Quality Control Check
Requirement | Frequency | Acceptance Criteria | 40 CFR Reference | Lab/Field SOP Reference
Field filter blank^b | 1 per audit (for programs <2 years old) or 1 per FS per trip (for all other programs) | ±30 µg change between weighings | Part 50, Appendix L, Section 8.3.7 | Lab SOP, Section 8 and Field SOP, Section 6
Laboratory filter blank | 10% or 1 per weighing session | ±15 µg change between weighings | Part 50, Appendix L, Section 8.3.7 | Lab SOP, Section 8
Trip filter blank^c | 10% of all filters | ±30 µg change between weighings | Not described | Lab SOP, Section 8 and Field SOP, Section 6
Balance check | Beginning/end of weighing session and one after approximately every 15 samples or fewer, per recommendations of balance manufacturer | <3 µg of working mass standard | Part 50, Appendix L, Section 8.3 | Lab SOP, Section 8
Duplicate filter weighing | 1 per weighing session; one carried over to next session | ±15 µg change between weighings | Part 50, Appendix L, Section 8.3 | Lab SOP, Section 8
Field Calibration/Verification
Requirement | Frequency | Acceptance Criteria | 40 CFR Reference | Lab/Field SOP Reference
Clock/timer verification | Every sampling event | ±1.0 min/mo | Part 50, Appendix L, Section 7.4.12 | Field SOP, Section 5
External leak check | Every sampling event | <80 mL/min | Part 50, Appendix L, Section 7.4.6.1 | Field SOP, Section 5
Internal leak check | Upon failure of external leak check | <80 mL/min | Part 50, Appendix L, Section 7.4.6.2 | Field SOP, Section 5
One-point barometric pressure verification | Every sampling event and following every calibration | ±10 mmHg | Part 50, Appendix L, Sections 7.4.9 and 9.3 | Field SOP, Section 5
Barometric pressure calibration^d | Upon failure of single-point verification | ±10 mmHg | Part 50, Appendix L, Sections 7.4.9 and 9.3 | Field SOP, Section 10
Single-point temperature verification | Every sampling event and following every calibration | ±2°C of working standard | Part 50, Appendix L, Sections 7.4.8 and 9.3 | Field SOP, Section 5
Temperature calibration^d | Upon failure of single-point verification | ±0.1°C of calibration standard | Part 50, Appendix L, Sections 7.4.8 and 9.3 | Field SOP, Section 10
Field Calibration/Verification (continued)
Requirement | Frequency | Acceptance Criteria | 40 CFR Reference | Lab/Field SOP Reference
Single-point flow rate verification | Every sampling event | ±4% of working standard or ±4% of design flow (16.67 Lpm) | Part 50, Appendix L, Section 9.2.5 | Field SOP, Section 5
Flow rate calibration^d | Upon failure of single-point verification | ±2% of calibration standard at design flow (16.67 Lpm) | Part 50, Appendix L, Sections 7.4.1 and 9.2.6 | Field SOP, Section 10
Post-calibration single-point flow rate verification | Following every calibration | ±2% of design flow (16.67 Lpm) | Part 50, Appendix L, Section 9.2.6 | Field SOP, Section 10
Laboratory Calibration/Verification
Requirement | Frequency | Acceptance Criteria | 40 CFR Reference | Lab/Field SOP Reference
Balance calibration | When routine QC checks indicate calibration is needed and upon approval | Manufacturer's specification | Not described | Lab SOP, Section 7
Laboratory temperature verification | 1/quarter | ±2°C | Not described | Lab SOP, Section 7
Laboratory humidity verification | 1/quarter | ±2% relative humidity | Not described | Lab SOP, Section 7
Accuracy
Requirement | Frequency | Acceptance Criteria | 40 CFR Reference | Lab/Field SOP Reference
Flow rate audit | 4/yr (manual) | ±4% of calibration standard at design flow (16.67 Lpm) | Part 58, Appendix A, Section 3.5.1 | Field SOP, Section 8
External leak check | 4/yr | <80 mL/min | Part 50, Appendix L, Section 7.4.6 | Field SOP, Section 8
Internal leak check | 4/yr (if external leak check fails) | <80 mL/min | Part 50, Appendix L, Section 7.4.6 | Field SOP, Section 8
Temperature audit | 4/yr | ±2°C of calibration standard | Part 50, Appendix L, Section 9.3 | Field SOP, Section 8
Barometric pressure audit | 4/yr | ±10 mmHg of calibration standard | Part 50, Appendix L, Section 7.4 | Field SOP, Section 8
Balance audit (PE) | 2/yr | ±20 µg of NIST-traceable standard; ±15 µg for unexposed filters | Not described | Lab SOP, Section 11
Precision (Using Collocated Samplers)^e
Requirement | Frequency | Acceptance Criteria | 40 CFR Reference | Lab/Field SOP Reference
All samplers (mandatory) | 2/year (semiannual) | CV <10% | Part 50, Appendix L, Section 5.0 | Field SOP, Section 8
Calibration and Check Standards
Requirement | Frequency | Acceptance Criteria | 40 CFR Reference | Lab/Field SOP Reference
Flow rate transfer standard | 1/yr | ±2% of NIST-traceable standard | Part 50, Appendix L, Sections 9.1 and 9.2 | Field SOP, Section 8
Calibration and Check Standards (continued)
Requirement | Frequency | Acceptance Criteria | 40 CFR Reference | Lab/Field SOP Reference
Field thermometer | 1/yr | ±0.1°C resolution; ±0.5°C accuracy | Not described | Field SOP, Section 8
Field barometer | 1/yr | ±1 mmHg resolution; ±5 mmHg accuracy | Not described | Field SOP, Section 8
Working mass standards | 3-6 mo | 0.025 mg | Not described | Lab SOP, Section 7
Primary mass standards | 1/yr | 0.025 mg | Not described | Lab SOP, Section 7
Representativeness
Requirement | Frequency | Acceptance Criteria | 40 CFR Reference | Lab/Field SOP Reference
Method designation (sampler type) in reporting organization | Each method designation audited yearly | Primary and PEP audit concentrations are valid and >3.0 µg/m³ | Part 58, Appendix A, Section 3.2.7 | —
Samplers in reporting organization | Each sampler audited at least once every 6 years | Primary and PEP audit concentrations are valid and >3.0 µg/m³ | Part 58, Appendix A, Section 3.2.7 | —


" The PEP requirement is more stringent than regulation (see Element 6.0, Project/Task Description, Table 6-4 for exceptions).
b For a new SLT program (i.e., less than 2-years old), the frequency for field blanks is one per FRM/FEM audit. For all others, one field blank should
 be performed per FS per trip. A trip may include audits for more than one FRM/FEM sampler. It is up to the FS to determine the site where the field
 blank audit will be performed, unless otherwise directed by his or her Regional WAM/TOPO/DOPO (such as when a problem is identified at a
 particular site).
cTrip blanks will be performed at a frequency of 10% of all filters, as determined by the weighing laboratory (i.e., 1 per every 10 filters shipped out,
 rounded up). So if the laboratory  sends out 1 to 10 filters, then 1 trip blank should be included in the shipment. If the laboratory ships out 11 to 20
 filters, then 2 trip blanks should be included. The FS will determine with which trip to use the trip blank filter(s), in a manner similar to the field
 blanks. However, if the FS receives more than one trip blank in a shipment, then he or she must make sure that only one trip blank is carried per trip.
d The BGIPQ200A is not capable  of performing multipoint verifications. If the BGIPQ200A fails a single-point verification, then a calibration
 should be performed next.
e Twice per year, all of the PEP samplers used by the Region (and any SLT organizations that are running their own PEP) must be collocated and run
 at the same location over the same time period. These are often referred to as "parking lot collocations." In 2007, this frequency was reduced from
 monthly and quarterly collocation scenarios because the historical performance shows that the precision does not seem to vary significantly. Semi-
 annual precision checks are justified.
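Two of the numeric rules above lend themselves to a quick illustration: the trip blank count in note c (10% of filters shipped, rounded up) and the single-point flow rate verification criterion (±4% of the working standard or ±4% of the 16.67 Lpm design flow, per the table row). The sketch below is illustrative only; the function and constant names are ours, not part of the SOPs.

```python
import math

DESIGN_FLOW_LPM = 16.67  # PM2.5 FRM design flow rate from the table above

def trip_blanks_required(filters_shipped: int) -> int:
    """Note c: trip blanks are 10% of all filters shipped, rounded up
    (1-10 filters -> 1 blank, 11-20 filters -> 2 blanks, and so on)."""
    return math.ceil(filters_shipped / 10)

def flow_verification_passes(measured_lpm: float, working_std_lpm: float) -> bool:
    """Single-point flow rate verification per the table row: the measured
    flow is acceptable if it is within +/-4% of the working standard or
    within +/-4% of the 16.67 Lpm design flow."""
    within_standard = abs(measured_lpm - working_std_lpm) <= 0.04 * working_std_lpm
    within_design = abs(measured_lpm - DESIGN_FLOW_LPM) <= 0.04 * DESIGN_FLOW_LPM
    return within_standard or within_design
```

For example, a shipment of 11 filters requires 2 trip blanks, and a measured flow of 16.1 Lpm against a 16.67 Lpm working standard passes the verification.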

-------
                                                                        Project: PEP QAPP
                                                                          Element No.: 8.0
                                                                           Revision No.: 1
                                                                           Date: 3/6/2009
                                                                       	Page 1 of 6
             8.0  Special Training Requirements/Certification

The purpose of this element is to ensure that any specialized or unusual training requirements to
conduct the PEP are implemented. Within this element, the procedures are described in sufficient
detail to ensure that specific training skills  can be verified, documented, and updated as
necessary.

OAQPS has developed a two-fold PEP training program. The first aspect of the training program
is to ensure that all monitoring personnel have a baseline level of knowledge about the Ambient
Air Quality Monitoring Network, the principles and operation of the PEP, and the associated QA
procedures. This phase of training is ongoing and includes the following:
This phase of training is ongoing and includes the following:

   •   National-level conferences and training workshops
   •   Regional training events
   •   An air training facility for hands-on experience
   •   National- and Regional-level conference calls
   •   Individual sessions upon request
   •   SOPs and current PEP training materials, which are posted on AMTIC's Bulletin
       Board at http://www.epa.gov/ttn/amtic/pmpep.html.

In the future, EPA will develop and implement the following:

   •   National broadcasts of Web-based PEP training sessions with an interactive
       component
   •   Training videos for complete courses, consisting of individual modules for each
       subject matter topic needed to attain full certification.

The second phase of training  specifically focuses on the PEP and includes the following:

   •   Specific, extensive hands-on field and laboratory training sessions, sponsored and
       developed by OAQPS, that involve the ESAT contractors, Regional personnel, and SLT
       organization personnel
   •   A certification program for the ESAT field and laboratory personnel. Certification
       involves both a written test and a performance test. Failure of either test results in
       retraining until the person achieves certification.

8.1    OAQPS Training Facilities

EPA, through its Regional laboratories, OAQPS, and ORIA (Las Vegas, NV), has multiple
training facilities, which provide the capacity to

   •   Develop internal expertise in fine PM monitoring and gravimetric analysis



    •   Provide monitoring equipment that is readily accessible to EPA staff for investigating
       operational questions and concerns
    •   Perform field and laboratory training for personnel at EPA, Regional, SLT organizations,
       and ESAT
    •   Perform special studies (study monitor performance, evaluate measurement uncertainty)
    •   Perform research studies for future monitoring activities.

8.2   Training Program

The field and laboratory PEP training program will involve the following four phases:

    •   Classroom lecture. This will include an overall review of the PM2.5 program and the
        importance of the PEP within it. Classroom lectures will also be given for each
        training module (described below). Revisions to the training modules and SOPs
        are made based on suggestions from PEP auditors and a subsequent annual evaluation
        and consensus of the EPA PEP WAM/TOPO/DOPOs and the PEP Workgroup.
    •   Hands-on activities. After a classroom lecture, personnel will be taken to the training
       area where the field/laboratory activities will be demonstrated, and then the trainees will
       perform the same activity under instruction.
    •   Certification-written exam. A written test will be administered to trainees to cover the
       information and activities of importance in each of the training modules.
    •   Certification-performance exam. This is a review of the actual field implementation
       activities by the trainer/evaluator. Appendix C contains PE forms for this review.

Trainers will include OAQPS personnel  from the AAMG QA Team, as well as Regional PEP
QA staff and contractors, who are certified by OAQPS to conduct PEP field and laboratory
training.

8.3   Field Training

All personnel, including EPA Regional WAM/TOPO/DOPOs and ESAT contractors, will
be trained before performing PEP field data collection activities. Representatives of SLT
organizations are welcome to attend this training to satisfy the training requirement for their
implementation of the PEP.

Annual field training/recertification will be conducted at a facility designated by OAQPS. One
full certification  course (if needed) and one recertification course will be conducted each year.
Additional training may be arranged at the discretion of OAQPS. This may include training
conducted by EPA-certified Regional WAM/TOPO/DOPOs within their respective Regions.
When this occurs, the WAM/TOPO/DOPO is responsible for submitting a record of training and
certification results to OAQPS.


Field training for full certification is expected to last 3 full days. Trainers may be required to be
available a fourth day for any individual trainees requiring more instruction.

Field training will include the following topics:
    •  Introduction to the PEP
    •  Planning and preparation
    •  Cassette receipt, storage, and handling
    •  Sampler transport, placement, and assembly
    •  System checks
    •  Programming the run
    •  Filter exposure and concluding the sampling event
    •  Using the COC Form
    •  Using the FDS
    •  QA/QC and information retention
    •  Troubleshooting in the field: When to perform calibrations (not typically performed in
       the field).

8.4   Laboratory Training
Annual laboratory training/recertification for the routine PEP filter preparation and weighing
activities will be conducted at an EPA PEP weighing laboratory designated by OAQPS.
Additional training may be arranged at the discretion of OAQPS.
Laboratory personnel will be trained on the following topics:

    •  General laboratory preparation
    •  Communications
    •  Filter conditioning
    •  Filter weighing
    •  Using the COC Form
    •  Using the FDS
    •  QA/QC
    •  Equipment inventory and maintenance
    •  Filter handling
    •  Calibrations
    •  Filter shipping
    •  Data entry and data transfer
    •  Storage and archiving

8.5    Certification

EPA requires certification for FSs, LAs, and the Project Leaders (including
WAM/TOPO/DOPOs) to help ensure that personnel are sufficiently trained to perform the
necessary PEP activities at a level that does not compromise data quality and also inspires
confidence in the PEP by the SLT organizations.

8.5.1   Certification of Field Scientists and Laboratory Analysts

Both a written exam and a performance review are considered part of the certification
requirements for FSs and LAs. The written exam is designed to review the more critical aspects of
the PEP and to identify where the individual requires additional training. The written test will be
generated by OAQPS. A score of 90% is required for passing the written exam. The PE is
focused on ensuring that the individual understands and follows the SOPs. The trainer(s) will
evaluate the trainees' implementation of the topics identified in the field and laboratory sections
above. Appendix C provides the qualitative check forms that will be used during the evaluation
of field and laboratory performance.

The intent of the certification activities is not to fail individuals, but to determine where
additional training is required to ensure that the PEP is implemented consistently across the
nation. By testing and evaluating each module, the trainer(s) will be able to identify the areas
where individuals will require additional training. If many individuals fail a particular
component, this may indicate that the classroom or hands-on training is not adequate. In any
case, failure of any part of either the written exam or the hands-on PE indicates that more
training is required. Trainees will be required to attend additional training on these components.
Trainers will be available for an additional day of field/laboratory training and will ensure that
personnel are certified by the end of the training session.

If the certification or recertification activities identify individuals who appear to be incapable of
properly performing the field/laboratory activities, then the ESAT WAM/TOPO/DOPOs and
RPOs will be notified to initiate remedial action.

8.5.2   Certification of Regional Project Leaders

Because Regional Project Leaders (including WAM/TOPO/DOPOs) conduct TSAs and may be
authorized to train staff on behalf of EPA, annual recertification is necessary to maintain their
knowledge of current issues and changes in equipment and procedures. They must meet the
following certification requirements:

   •   At a minimum, they must successfully complete the initial 3½-day PEP training course
   •   At least every 2 years, they must participate in the "hands-on" PEP annual certification
       event conducted by OAQPS. In alternate years, they may fulfill their annual certification
       requirement by participating in OAQPS-led Web-based training events.



8.6    Additional PEP Field and Laboratory Training

Annual certifications and recertifications will be arranged and conducted by OAQPS. Personnel
turnover is expected among PEP contractor and SLT organizations, and occasionally the PEP
contracts will be awarded to new contractors. Either situation may require that a second full
training course be conducted in the same year. The WAM/TOPO/DOPOs will contact
OAQPS as soon as possible when training is required. The following two options are available
for training in these extraordinary circumstances:

    •   EPA-certified Regional WAM/TOPO/DOPOs or Project Leaders may train additional
       ESAT personnel with authorization by the National PEP Project Leader.
    •   Individual training may be arranged at the discretion of OAQPS at its air training facility
       in Research Triangle Park, NC.

OAQPS will work with the Regional PEP leaders and the WAM/TOPO/DOPOs to determine the
need for training and what method is logistically the most efficient for all involved.

8.7    Additional Ambient Air Monitoring Training

Appropriate training will be available to personnel who support the Ambient Air Quality
Monitoring Program, commensurate with their duties. Such training may consist  of classroom
lectures, workshops, teleconferences, and on-the-job training.

Over the years, many courses have been developed for personnel involved in ambient air
monitoring and QA aspects. Formal QA/QC training is offered through the following
organizations:

    •   OAQPS, AAMG
    •   Air & Waste Management Association (AWMA) (http://www.awma.org)
    •   EPA Air Pollution Training Institute (APTI) (http://www.epa.gov/apti)
    •   EPA Office of Environmental Information (OEI)
       (http://www.epa.gov/quality/trcourse.html)
    •   EPA AQAD (http://www.epa.gov/air/oaqps/organization/aqad/io.html)
    •   EPA Regional offices.

Table 8-1 presents a sequence of core ambient air monitoring and QA courses for ambient air
monitoring staff and QAMs (marked with an asterisk). The suggested course sequences assume
little or no experience in QA/QC or  air monitoring.

                       Table 8-1. Core Ambient Air Training Courses
Sequence | Course Title (Self-Instructional [SI]) | Number | Source
1* | Air Pollution Control Orientation Course, SI-422 | 422 | APTI
2* | Principles and Practices of Air Pollution Control, 452 | 452 | APTI
3* | Introduction to EPA Quality System Requirements | — | OEI
4* | Introduction to Ambient Air Monitoring, SI-434 | 434 | APTI
5* | General Quality Assurance Considerations for Ambient Air Monitoring (under revision), SI-471 | 471 | APTI
6* | Quality Assurance for Air Pollution Measurement Systems (under revision), 470 | 470 | APTI
7* | Introduction to Data Quality Objectives | — | OEI
8* | Introduction to Quality Assurance Project Plans | — | OEI
9 | Atmospheric Sampling, 435 | 435 | APTI
10 | Analytical Methods for Air Quality Standards, 464 | 464 | APTI
11 | Chain-of-Custody Procedures for Samples and Data, SI-443 | 443 | APTI
* | Introduction to Data Quality Assessment | — | OEI
* | Introduction to Data Quality Indicators | — | OEI
* | Assessing Quality Systems | — | OEI
* | Detecting Improper Laboratory Practices | — | OEI
* | Beginning Environmental Statistical Techniques, SI-473A | 473 | APTI
* | Introduction to Environmental Statistics, SI-473B | 473B | APTI
* | Interpreting Monitoring Data | — | OEI
* | Interpreting Multivariate Analysis | — | OEI
* | Quality Audits for Improved Performance | QA6 | AWMA
  | Air Quality System (AQS) Training | —** | OAQPS
* | Federal Reference Method Performance Evaluation Program Training (field/laboratory) | QA7 | OAQPS
* | PM2.5 Monitoring Implementation (video) | PM1 | OAQPS
*  Courses recommended for QAMs
** Information about AQS training is available on EPA's Technology Transfer Network Web site for the
   AQS. Materials used in past AQS training classes are also posted on the Web site at
   http://www.epa.gov/ttn/airs/airsaqs/training/training.htm

-------
                                                                       Project: PEP QAPP
                                                                         Element No.: 9.0
                                                                          Revision No.: 1
                                                                           Date: 3/6/2009
                                                                       	Page 1 of 7
                       9.0  Documentation and Records

The purpose of this element is to define the records that are critical to the project, the
information to be included in reports, the data reporting format, and the document control
procedures to be used.

For the Ambient Air Monitoring Program, there are many documents and records that need to be
retained. A document, from a records management perspective, is a volume containing
information that describes, defines, specifies, reports, certifies, or provides data or results
pertaining to environmental programs. As defined in the Federal Records Act of 1950 and the
Paperwork Reduction Act of 1995 (now 44 U.S.C. 3101-3107), records are: "...books, papers,
maps, photographs, machine readable materials, or other documentary materials, regardless of
physical form or characteristics, made or received by an agency of the U.S. Government under
Federal  Law or in connection with the transaction of public business and preserved or
appropriate for preservation by that agency or its legitimate successor as evidence of the
organization, functions, policies, decisions, procedures, operations, or other activities of the
Government or because of the informational value of data in them..."

The following information describes the document and records procedures  for the PEP. In EPA's
QAPP regulation and guidance, EPA uses the term "reporting package" which is defined as all
of the information required to support the concentration data reported to EPA. This information
includes all data required to be collected, as well as data deemed important by the PEP.

9.1   Information Included in the Reporting Package

9.1.1  Data Reporting Package Format and Document Control

The PEP has structured its records management system according to EPA's File Plan Guide (see
http://www.epa.gov/records/tools/toolkits/filecode). A file plan lists office records and describes
how they are organized and maintained. A good file plan is one of the essential components of a
recordkeeping system and is key to a successful records management program. It can help an
organization do the following:

   •  Document your activities effectively
   •  Identify records consistently
   •  Retrieve records quickly
   •  Determine disposition of records no longer needed
   •  Meet statutory and regulatory requirements.

The PEP records management system uses the Agency File Codes (AFCs) to facilitate easy
retrieval of information during EPA TSAs and reviews. The PEP records management system
also follows EPA records schedules, which constitute EPA's official policy on how long to keep

Agency records (retention) and what to do with them afterwards (disposition). For more
information on EPA records schedules, see http://www.epa.gov/records/policy/schedule (the
Web site is searchable by AFC function code and schedule number).

Table 9-1 includes the documents and records that will be filed according to the statute of
limitations discussed in Section 9.3. To archive the information as a cohesive unit, all of the PEP
PM2.5 information will be filed under the major code "PEP," followed by the AFC function code
and schedule numbers listed in Table 9-1. For example, PEP project plans would be filed under
the heading "PEP/301-093-006.1,"  and COC forms would be filed under "PEP/301-093-006.3."
Each Field and Laboratory SOP provides instruction on the proper filing of data collected during
the particular procedure.
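The filing convention above is purely mechanical: the major code "PEP", followed by the AFC function code and the schedule number. As a minimal sketch (the helper function name is ours, for illustration only):

```python
def pep_file_heading(function_code: str, schedule_no: str) -> str:
    """Build a PEP filing heading from an AFC function code and schedule
    number, e.g. PEP/301-093-006.1 for PEP project plans."""
    return f"PEP/{function_code}-{schedule_no}"

print(pep_file_heading("301-093", "006.1"))  # PEP/301-093-006.1 (project plans)
print(pep_file_heading("301-093", "006.3"))  # PEP/301-093-006.3 (COC forms)
```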

                      Table 9-1. PM2.5 Reporting Package Information

Program Management Files (Agency File Code 301-093, Schedule 006)
  006.1  Management and organization
         • Organizational structure for EPA and how the Regions and ESAT contractors fit into running the PEP
         • Organizational structure for the support contractors
         • PEP project plans and subsequent revisions
         • Quality Management Plan
  006.2  Monitoring site information
         • Site characterization file (Site Data Sheets)
         • Site maps
         • Site pictures
         • SLT site contact information
  006.3  Field operations and data acquisition (by EPA Regional staff or contractors on behalf of EPA)
         • QAPPs
         • SOPs
         • Field logbooks and communications
         • Sample handling/COC forms
         • Documentation of instrument inspection and maintenance
         • Field testing of PEP equipment
  006.4  Communications (contractor technical project activity)
         • Telephone record and e-mail between the ESAT contractor and SLT organizations
         • Telephone record and e-mail between the ESAT contractor and the Contract Officer's Representative (COR)

Program Management Files, continued (Agency File Code 301-093, Schedule 006)
  006.5  Communications (EPA project activity)
         • Telephone record and e-mail between EPA Regional or Headquarters staff and SLT organizations and vice versa
         • Telephone record and e-mail between EPA Regional and other EPA personnel (Headquarters to Regions and vice versa)
  006.6  Equipment and instruments used by contractors in the PEP (records about time charged to the support of the program would reference AFC 405-202)
         • Procurement logs
         • Inventories of capital equipment, operating supplies, and consumables
         • Repair and maintenance (e.g., vendor service records, calibration records)
         • Retirement or scrapping

Contract Management Records (Agency File Code 405, Schedule 202)
  202.1  Contract administration
         • Work assignments, task orders, delivery orders, and work plans
         • Contractor monthly reports
         • Technical directives from the COR to the contractor
         • Invoices for consumables
         • Requisite qualifications of FSs and Laboratory Analysts (LAs) for PEP-related, contractor-implemented activities
         • Training records and certificates of ESAT contractors conducted and issued by the EPA Regional ESAT COR

Special Purpose Programs (Agency File Code 404-142-01, Schedule 179)
  179.1  Data administration and integration
         • Data management plans/flowcharts
         • Raw data: any original data (routine and QC data), including data entry forms
         • Data algorithms
         • Documentation of PEP database (FED) (national/Regional level)
         • PM2.5 FED data
         • FDSs and COC forms

Data Files Consisting of Summarized Information (Agency File Code 404-142-01, Schedule 173)
  173.1  Data summaries, special reports, and progress reports
         • Data/summary/monthly field activity reports
         • Journal articles/papers/presentations
         • Data validation summaries

State and Local Agency Air Monitoring Files (Agency File Code 108-025-01-01, Schedule 237)
  237.1  QA/QC Reports
         • 3-year PEP QA reports
         • PEP Data Quality Assessments
         • QA reports
         • Response/corrective action reports
         • Site audits

Routine Procurement (Agency File Code 405, Schedule 036)
  036.1  Acquisition of capital equipment and supplies by EPA (either Headquarters or Regional office)
         • Needs assessments and reports
         • Program copies of purchase requests
         • Requests for bids or proposals
         • Proposals, bids, or quotations
         • Bills of lading
         • Warranties and certificates of performance
         • Evaluations of proposals, bids, quotations, or trial installations

Supervisors' Personnel Files and Duplicate Official Personnel Folder Documentation (Agency File Code 403-256, Schedule 122)
  122.1  Personnel qualifications, training, and certifications
         • WAM/TOPO/DOPO training certifications
         • Certification as a PEP FS and/or LA
         • Certification as a PEP FS trainer and/or LA trainer

9.1.2   Notebooks

The following types of notebooks will be issued to field and laboratory personnel:

Field/Laboratory Notebooks. The PEP will issue notebooks to each FS and Laboratory Analyst
(LA). Each notebook will be uniquely numbered and associated with the individual and the PEP.
Although data entry forms are associated with all routine environmental data operations, the
notebooks can be used to record additional information about these operations. In the laboratory,
notebooks will also be associated with the temperature and humidity recording instruments, the
refrigerator, calibration equipment/standards, and the analytical balances used for this program.

Field/Laboratory Binders. Three-ring binders, which will be issued to each FS and LA, will
contain the appropriate data forms for routine operations, as well as inspection and maintenance
forms and SOPs.

Sample Shipping/Receipt. One notebook, which will be issued to each field and laboratory
shipping and receiving facility, will be uniquely numbered and associated with the PM2.5 PEP.  It
will include standard forms and areas for free-form notes.


Field/Laboratory Communications Notebook. One communications notebook will be issued
to each FS and LA to record communications. Element 21.0, Reports to Management, provides
more information about this activity.

9.1.3  Electronic Data Collection

All raw data required for calculating PM2.5 concentrations, including QA/QC data, are collected
electronically or on the data forms that are included in the Field and Laboratory SOPs. Field
measurements listed in Table 6-2 (found in Element 6.0, Project/Task Description) will be
collected electronically, along with the laboratory pre- and post-sampling weights. Therefore,
both the primary field and laboratory data will be collected electronically, and primary data will
be used to electronically calculate a final concentration. More details about this process can be
found in Element 18.0, Data Acquisition Requirements, and Element 19.0, Data Management.

Various hard copies are created from electronic systems, such as FED reports and spreadsheets
used by the FS and others. Hard copies that are determined to be permanent record (e.g., data
that lead to significant findings or conclusions) should be filed as a data reporting package to
ensure that all  PEP data are properly archived.

It is anticipated that other instruments will provide an automated means for collecting the
information that would otherwise be recorded on data entry forms. Information on these systems
is detailed in Element 18.0, Data Acquisition Requirements, and Element 19.0, Data
Management. To reduce the potential for data entry errors, automated systems will be used
where appropriate and will record the same information that is found on data entry forms.  To
provide a  backup, a hard copy of automated  data collection information will be stored as
specified by EPA records schedules in project files.

9.1.4  Hand-Entered Data

Many data forms will be completed by hand; these forms can be found at the end of each Field
and Laboratory SOP. All hard copy information will be completed in indelible ink. Corrections
will be made by drawing a single line through the incorrect entry, initialing and dating the
correction, and placing the correct entry alongside the incorrect entry, if this can be
accomplished legibly, or by providing the information on a new line.

9.1.5  E-mail and Attachments

As of April 2007, the EPA implemented a new record-handling system for e-mail and associated
attachments. ESAT and other contractors who use EPA's in-house e-mail will be expected to use
the record-handling system as soon as guidelines for the PEP and user training are available.
Instructions on use for PEP e-mail and attachments are currently being developed and will be
issued as a quality directive to EPA and ESAT personnel.



9.2    Reports to Management

In addition to the reporting package, various reports will be required by the PEP.

9.2.1  Laboratory Weekly Report

The LA will provide the WAM/TOPO/DOPO with a written progress report every Friday or the
last day of the scheduled work week. The LA will maintain a complete record of the laboratory
weekly progress reports (PEP Laboratory SOP, Form COM-2) in a three-ring binder and will
include an updated Filter Inventory and Tracking Form (PEP Laboratory SOP, Form COC-1).
The PEP Laboratory SOP, Section 4 contains the details of this report, which will be filed
according to the records schedule outlined in Table 9-1. The WAM/TOPO/DOPO may request
more information to be included in the weekly reports if he or she deems that it is necessary.

9.2.2  Field Monthly Report

The FS will provide a written progress report to the WAM/TOPO/DOPO each month (the
deadline is the 15th calendar day of the following month unless otherwise specified by the
WAM/TOPO/DOPO). See the PEP Field SOP, Section 2 for more details about this report. This
monthly  report will be filed according to the schedule outlined in Table 9-1.
The monthly progress report (PEP Field SOP, Form COM-2) will convey the following
information:
    •  Reporting date—The beginning and end date that the report covers
    •  Reporter—The person who is writing the reports
    •  Progress—Progress on field activities
       -   Evaluations scheduled within the reporting date

       -   Evaluations conducted within the reporting date
    •  Issues
       -   Old issues—Issues reported in earlier reports that have not been resolved

       -   New issues—Issues that arise within the reporting date
    •  Actions—Necessary to resolve issues, including the person(s) responsible for resolving
       them and the anticipated dates when they will be resolved
    •  Extra purchases.

The WAM/TOPO/DOPOs may request more information to be included in the monthly reports if
they deem that it is necessary.



9.3    Data Reporting Package Archiving and Retrieval

The information listed in Table 9-1 will be retained by the ESAT contractor for 4 years; the
retention period is based on the calendar year (e.g., all data from calendar year 1999 will be
archived until 12/31/2002). Upon reaching the 4-year archival date, the ESAT contractor will
inform OAQPS that the material has met the archive limit and will request a decision on whether
the material should be archived further or disposed of.

-------
                                                                       Project: PEP QAPP
                                                                        Element No.: 10.0
                                                                          Revision No.: 1
                                                                           Date: 3/6/2009
                                                                       	Page 1 of 4
                             10.0  Sampling Design

The purpose of this element is to describe all of the relevant components of the PEP, the key
parameters to be estimated, the number and types of samples to be expected, and how the
samples are to be collected.

10.1   Scheduled Project Activities, Including Measurement Activities

Section 6.4 (found in Element 6.0, Project/Task Description) details the critical time lines and
activities for the PEP.

10.2   Rationale for the Design

This QAPP reflects the EDOs for a QA activity, not a routine monitoring activity. The sampling
design has been codified in 40 CFR Part 58, Appendix A, Section 3.2.7, as described below.

The PEP is an independent assessment used to estimate total measurement system bias. These
evaluations will be performed under the PM2.5 PEP (40 CFR Part 58, Appendix A, Section 2.4)
or a comparable program. PEs will be performed on the SLAMS monitors annually within each
PQAO. For PQAOs with five or fewer monitoring sites, five valid PE audits must be collected
and reported each year. For PQAOs with more than five monitoring sites, eight valid PE audits
must be collected and reported each year. A valid PE audit means that both the primary monitor
and PEP audit concentrations have not been invalidated and are greater than 3 μg/m3. To
achieve this, sites that have seasonally low concentrations should be sampled during times when
concentrations are expected to be greater than 3 μg/m3. EPA recognizes that it may be difficult
or impossible to obtain valid audits at sites where the concentration rarely exceeds 3 μg/m3.
EPA is currently considering ways to evaluate such sites. Audits that are otherwise valid but do
not meet the greater-than-3-μg/m3 criterion are still useful for evaluating sampler operation,
even if such audit data may not be used in the calculations of sampler bias (see Section 6.4.1 for
additional information about audit frequency).
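The audit-count and validity rules above can be sketched as follows; the function and variable names are illustrative assumptions, not part of any PEP or AQS software.

```python
MIN_VALID_CONC = 3.0  # ug/m3; both paired concentrations must exceed this value

def required_audits(num_monitoring_sites):
    """Valid PE audits a PQAO must collect and report each year:
    5 for PQAOs with five or fewer sites, 8 for larger PQAOs."""
    return 5 if num_monitoring_sites <= 5 else 8

def is_valid_audit(primary_conc, pep_conc,
                   primary_invalidated=False, pep_invalidated=False):
    """A PE audit is valid when neither concentration has been
    invalidated and both are greater than 3 ug/m3."""
    if primary_invalidated or pep_invalidated:
        return False
    return primary_conc > MIN_VALID_CONC and pep_conc > MIN_VALID_CONC
```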

Additionally, within each PQAO

   •   Each method designation must be evaluated each year
   •   Every designated FRM or FEM sampler must be subjected to a PEP audit at least once
       every 6 years, which equates to approximately 15% of monitoring sites audited per year.

SLT organizations will be asked to select the sites that they feel meet the previously mentioned
criteria and provide a list of sites for the PEs conducted in each calendar year on or before
December 1 of the previous year. The Regional WAM/TOPO/DOPOs, with the assistance of the
ESAT contractors, will determine the most efficient site visit schedule. This schedule will be
based on


    •   CFR requirements for audit frequency
    •   Meeting the same monitoring schedule as the routine sampler being evaluated (this
        prevents the site from having to run and post an additional sample for the PE audit to
        AQS)
    •   Site proximity (the sites that are closest in proximity to each other can be visited within
       the same day or week).

 10.3  Design Assumptions

 The intent of the sampling design is to determine that the total measurement bias is within the
 DQOs described in Element 7.0, Data Quality Objectives and Criteria for Measurement. The
 sampling design will allow the PEP data to be statistically evaluated at various levels of
 aggregation to determine whether the DQOs have been attained. Data Quality Assessments
 (DQAs) will be aggregated at the following three levels:

    •   Monitor. Monitor/method designation
    •   Reporting Organization. Monitors in a method designation, all monitors
    •   National. Monitors in a method designation, all monitors.

 OAQPS believes it important to stratify monitors by method designation to assist in the
 determination of instrument-specific bias (i.e., a particular make and model).

 The statistical calculations for the assessments are found in 40 CFR Part 58, Appendix A. After
 both the routine and PE data are in the AQS database, these calculations will be performed on
 the data and will allow for the generation of reports at the levels specified above.

The DQO for the PEP is based on how the NAAQS for PM2.5 is determined, which uses 3 years
of data from individual monitors; therefore, it is important to assess the PE data against the
DQO at the same frequency and level of aggregation. Because the audit frequency of the PEP is
15%, any one monitor would receive a PEP audit at least once every 6 years. The PEP data are
suitable for assessing a particular monitor type but have limited use at the unique-monitor level
of aggregation. At the PQAO and national levels of aggregation, a sufficient amount of PEP data
will be available to evaluate bias. The uncertainty of the PEP data will be controlled and
evaluated by using various QA/QC samples described in Element 7.0, Data Quality Objectives
and Criteria for Measurement, and Element 14.0, Quality Control Requirements. For example,
the aggregation of the collocated samplers over the 3-year period will determine the precision of
the program. Use of various blanks, verification checks, and inter-laboratory comparison studies
can help to determine bias.
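The statistical calculations themselves are specified in 40 CFR Part 58, Appendix A. As an informal illustration only, a mean percent difference over routine/PE concentration pairs could be sketched as below; the names are assumptions, and the regulatory bias estimator additionally involves confidence bounds not shown here.

```python
from statistics import mean

def percent_differences(pairs):
    """For each (routine, pep_audit) concentration pair, compute
    d_i = (routine - audit) / audit * 100, with the PEP measurement
    treated as the audit (reference) value."""
    return [100.0 * (routine - audit) / audit for routine, audit in pairs]

def mean_bias(pairs):
    """Average percent difference over an aggregation level
    (monitor/method designation, PQAO, or national)."""
    return mean(percent_differences(pairs))
```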

 10.3.1 Representativeness

 Representativeness is a measure of the degree to which data accurately and precisely represent a
 characteristic of a population, parameter variations at a sampling point, a process condition, or
an environmental condition. The PEP design attempts to represent parameter variations at a
sampling point by locating the PEP sampler within 1-4 meters of the primary routine sampler
(inlets within 1 meter of the vertical height) and by operating the PEP sampler on the same
sampling schedule as the routine sampler. In addition, the PEP ensures representativeness of
sampling within the SLAMS network by evaluating all method designations within a PQAO
annually and by evaluating all samplers over a 6-year period (100% sampling).

Appendix L of 40 CFR Part 50 also provides the following summary of the measurement
principle:

    An electrically powered air sampler draws ambient air at a constant volumetric flow rate into
    a specially shaped inlet and through an inertial particle size separator (impactor) where the
    suspended particulate matter in the PM2.5 size range is separated for collection on a PTFE
    filter over the specified sampling period. The air sampler and other aspects of this reference
    method are specified either explicitly in this appendix or generally with reference to other
    applicable regulations or QA guidance.

Because all PE samplers must meet the requirements of 40 CFR Part 50 and be designated by
EPA as an FRM  sampler, it is assumed that they collect a representative sample of suspended
PM in the PM2.5  size range, similar to the primary sampler at the site.

10.3.2 Homogeneity

The PE sampler must be placed within 1-4 meters of the primary routine sampler to which it is
being compared. The assumption is that the air within this 1-4-meter area is homogeneous;
therefore, both monitors will sample the same PM2.5 load. Historical information on PM10
collocation data and preliminary PM2.5 data indicates that this assumption is correct.
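A trivial check of these collocation criteria (1-4 m horizontal separation, inlets within 1 m of each other vertically) might look like the following sketch; the function and parameter names are illustrative assumptions.

```python
def collocation_ok(horizontal_sep_m, inlet_height_diff_m):
    """True when the PE sampler sits 1-4 m from the routine sampler
    and the two inlets are within 1 m of each other vertically."""
    return 1.0 <= horizontal_sep_m <= 4.0 and abs(inlet_height_diff_m) <= 1.0
```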

10.4  Procedure for Locating  and Selecting Environmental Samples

Sections 10.2 and 10.3 explain the following:

    •  Frequency (15% of the samplers with a method designation each year).
    •  Location (1-4 meters from monitor to be evaluated; inlets within 1 meter of the vertical
       height). The physical location of the routine monitor is the responsibility of the SLT
       organizations and does not affect the intent of the PE. Site locational information is
       entered by the SLT organization into the AQS database. The critical piece of information
       is the AQS site ID (state, county, site, parameter occurrence code), which must be entered
       into AQS for primary data to be loaded into the database. The ESAT FS will have access
       to this information.

For each site, the ESAT contractor will develop a Site Data Sheet (Form SD-01) that contains
the following information:

   •   AQS Site ID
   •   Monitor Parameter Occurrence Code (POC)
   •   Method designation
   •   Monitor make and model
   •   Site coordinates*
   •   Network type (e.g., SLAMS)*
   •   Reporting organization*
   •   Reporting organization contact
   •   Street address*
   •   Directions to the site (from the Regional office)
   •   Directions to and from a major thoroughfare
   •   Safety concerns
   •   Additional equipment needed (ropes, ladders)
   •   Closest hospital (address)
   •   Closest express mail facility
   •   Closest hardware store
   •   Recommended hotel (address/phone)
   •   Important free-form notes
   •   Closest PM2.5 site
   •   Second closest PM2.5 site
* Items marked with an asterisk (*) are available in the AQS. These data are publicly available through
  EPA's Web site; in the Web browser, enter http://www.epa.gov/aqspubll/site.htm and go to Monitor
  Data Queries. The criteria pollutant code for PM2.5 is 88101.

The information previously listed will be kept in a site file (filed by AQS Site ID) and included
in a site notebook for each FS. In addition, maps for each state and city where a monitor is
located will be acquired. Sites can be placed on these maps along with the Site IDs.

Sites will not be visited and samplers will not be set up in conditions that are deemed unsafe.
Unsafe conditions may include bad weather or monitoring platforms where the FS feels that he
or she cannot transport or set up the monitor without jeopardizing his or her personal safety.
The FS will document the occurrence of any unsafe conditions so that mechanisms can be
instituted to make the platform safely accessible for a PE. This information will be conveyed to
the WAM/TOPO/DOPO.

10.5   Classification of Measurements as Critical/Noncritical

Sections 6.2.2 and 6.3.1 classify the critical field and laboratory measurements for the PEP.
Although the Field  and Laboratory SOPs contain  many additional measurements, they are
considered noncritical.

10.6   Validation of Any Non-Standard Measurements

Because the PEP is deploying only FRM samplers and will be operating these samplers
according to the established  SOPs, there will not be any non-standard measurements. Also,
because the PEP will be sending its  filters to a certified laboratory for weighing, there will not be
any non-standard measurements from the analysis of the filters;  therefore, all  sampling and
analysis measurements will be standard.

-------
                                                                       Project: PEP QAPP
                                                                        Element No.: 11.0
                                                                          Revision No.: 1
                                                                           Date: 3/6/2009
                                                                            Page 1 of 11
                   11.0 Sampling Methods Requirements

The PEP provides for measurement of the mass concentration of PM2.5 in ambient air over a
24-hour period. The measurement process is considered to be non-destructive, and the PM2.5
sample obtained can be subjected to subsequent physical or chemical analyses. A set of SOPs for
field sampling (Field Standard Operating Procedures for the Federal PM2.5 Performance
Evaluation Program) has been developed for the PEP and is to be used in all sampling
activities under this QAPP. The following section provides summaries of some of the more
detailed information in the Field SOPs. These summaries do not replace the SOPs.

11.1   Sample Collection and Preparation

Portable FRM monitors are used for collecting PM2.5 samples for the PEP. Three models are
available: the BGI™ PQ200A, the Andersen™ RAAS2.5-200, and the R&P Partisol®. Because the
goal is to provide comparable results across the nation, using one make and model of a portable
monitor to evaluate all of the routine monitors is advantageous because it reduces the chances
that bias and imprecision among the different portable  instrument models will confound the
routine monitor comparisons. Because the BGI was the only portable instrument to be granted
FRM designation before January 1999, it was selected  as the primary instrument; therefore, the
Field SOPs have been written based on this instrument. The  other two instruments have been
purchased and used as back-up instruments or used in areas where they have advantages due to
their design. It should be noted that Thermo Fisher Scientific currently owns the Andersen and
R&P sampler lines. Thermo Fisher Scientific has discontinued active production and technical
support for the Andersen RAAS line of samplers, so parts will likely become unavailable in the
future. PEP FSs may continue to use the Andersen RAAS or R&P samplers in their limited roles
as long as the samplers are serviceable and they are included in semiannual collocation
evaluations.

11.1.1  Preparation

Before conducting an evaluation excursion for the week, the sampling equipment and
consumables will be inspected to ensure proper operation and to confirm that adequate supplies
are on hand for the number of sites to be visited. At least one spare portable monitor and one set
of calibration equipment will be available. Filters will be selected and stored appropriately (per
SOPs) for transport to the sites. Filter COC forms will be started, and the filter expiration dates
will be checked to ensure they have not exceeded their 30-day pre-sampling time period. Site
Data Sheets, which contain information on site characteristics for each site, and blank FDSs,
which are used to record field information for the PE audit, should be available. For initial visits,
some of the information on the Site Data Sheets may be blank and must be completed during the
first visit. The PEP FSs will review the site schedule to be sure that they understand which tasks
will be implemented at the sites they are visiting that week.
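The 30-day pre-sampling expiration check mentioned above amounts to simple date arithmetic; the sketch below illustrates it, with names that are assumptions rather than PEP terminology.

```python
from datetime import date, timedelta

PRE_SAMPLING_WINDOW = timedelta(days=30)  # 30-day pre-sampling period per the SOPs

def filter_within_window(pre_weigh_date, planned_sample_date):
    """True when the planned sampling date falls within 30 days of the
    filter's pre-sampling weighing; expired filters must not be used."""
    return (pre_weigh_date <= planned_sample_date
            <= pre_weigh_date + PRE_SAMPLING_WINDOW)
```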


Shipping the filters back to the laboratories will require FSs to use ice substitutes, which must be
kept frozen until use. During transport to/from the sites, the ice substitutes will be placed in a
cooler to minimize heat gain.

11.1.2 Field Sample Collection

FSs will travel to the sites and meet the person (typically the Site Operator) who will allow them
access to the monitoring site. The portable FRM monitors will be transported to within 1-4
meters of the routine monitor, and then set up and calibrated per the PEP Field SOPs. Filter
cassettes will be installed and the monitor will be set to run on a midnight-to-midnight schedule.
The FS will then either perform additional tasks  as required at this site or proceed to another site
for sampling. If there are any delays in the sampling schedule, then the ESAT FS will contact the
affected SLT organizations and will  also notify the Regional WAM/TOPO/DOPO.

Upon completion of sampling,  the FS will return to the site(s), remove the sampling filter
cassette, visually inspect the filter, store it appropriately for transport to the laboratory, and
download the data per the Field SOPs. Each FS will have a portable laptop and a data logger (or
another mechanism to download sampler data) provided by the portable sampler manufacturers.
In 2006, BGI discontinued support for its DataTrans® data loggers. Currently, functioning
instruments may be used until they are no longer serviceable. BGI may decide to develop
another instrument in the future. If a new instrument is developed, it will be evaluated and placed
in service if it appears to be reliable. Laptops should be used as a first option to acquire the data
from the samples. When safety or precipitation prevents the use of a laptop, a data logger may be
used or these data may be downloaded later. A portable media device (e.g., diskette, CD, or USB
flash drive) containing the downloaded data must be sent to the laboratory along with the filters.

11.1.3 Filter Transportation

It is important that the filters be properly stored and transported to the weighing laboratory as
soon as possible. Ideally, filter cassettes will be shipped the same day that they are removed from
the monitors via next-day delivery. Filter cassettes, ice packs, maximum/minimum
thermometers, copies of the COC forms and FDSs, and a field data diskette/CD/USB flash drive
containing the monitor information will be included in the  shipment. The FS will keep a copy of
the FDS and the COC Form (to file under PEP/301-093-006.3) and will  record the number of
containers shipped and the air bill number in the field notebook. On the  day of shipping, the FS
will contact the weighing laboratory to make its  personnel  aware of the shipment and to provide
the laboratory with the number of containers shipped and the air bill number.

11.1.4 Return to Station

Upon completing a sampling excursion, the FS will return to the Regional office, ensure that all
equipment and consumables are properly stored, and determine whether supplies must be
ordered or equipment maintenance performed. A second copy of the week's field data will be
stored at the field office and provided to the EPA Regional WAM/TOPO/DOPO upon request.


Vehicles will be serviced as required. The FS will debrief the WAM/TOPO/DOPO on the field
excursion and will include information about whether the site visits remain on schedule.

11.1.5 Field Maintenance

A maintenance list will be developed by the PEP field personnel for all sensitive capital
equipment. The list will contain columns for item, maintenance schedule, and date that will be
filled in when maintenance (scheduled or unscheduled) is performed. See Element 15.0,
Instrument/Equipment Testing, Inspection, and Maintenance Requirements, for more information
about this.

11.2  Support Facilities for Sampling Methods

The analytical support facilities for the federally implemented PEP will be provided by the
Region 4 gravimetric laboratory in Athens, GA. This laboratory has been developed to meet the
measurement quality objectives described in Table 7-1. In case of emergency, several back-up
laboratories have been arranged: EPA's facility in Research Triangle Park, NC; EPA's ORIA-
NAREL in Montgomery, AL; ORIA's Office of Radiation and Indoor Environments (OR&IE)
Laboratory in Las Vegas, NV; and EPA's Region 2 Environmental Laboratory in Edison, NJ.

11.3  Sampling/Measurement System Corrective Action Process

11.3.1 Corrections to the SOPs

The ESAT contractors are responsible for implementing this QAPP and the Field  SOPs and are
responsible for the quality of the data. All methods will be reviewed and implemented by the
ESAT contractors. If changes or corrections are required to the methods or QAPP, the ESAT
contractor will notify the Regional WAM/TOPO/DOPO in writing. The Regional  WAM/TOPO/
DOPO will then convey the issue to the PEP Workgroup, which will review the change and
attempt to classify it according to the effect that the change would have on the data. The classes
follow:

    •   Class 1—The change improves the data and the new procedure replaces the current
       procedure. If the change is found to be acceptable by the PEP Workgroup, a new SOP
       will be issued that can be inserted into the compendium. The document control
       information in the heading will contain a new revision number and date. A Quality
       Bulletin will be completed to describe the change, and it will be distributed to all
       Regional WAM/TOPO/DOPOs and ESAT personnel.
    •   Class 2—The change provides for an alternate method that does not affect the quality of
       the data but may provide for efficiencies in some circumstances or be more cost effective.
       If the change is found to be acceptable by the PEP Workgroup, the original SOP will not

       be altered, but an addendum to the procedure will be initiated by EPA OAQPS that
       describes the modification and provides an alternate method.
    •  Class 3—The change is grammatical in nature and does not reflect a change in the
       procedure. The changes will be highlighted and modified during a Class 1 change (where
       appropriate) or will be corrected during the development of a full revision to the
       document.

Upon agreement by the PEP Workgroup to institute a change, hard copies of Class 1 and 2
changes will be distributed using the Quality Bulletin illustrated in Figure 11-1.

                                     Quality Bulletin

Subject: _________________________________
Number: ________    Date: ________    Page ____ of ____
Supersedes No.: ________    Dated: ________

Replace and Discard Original / Add Material to Document

Notes:

                                                            PM2.5 QA Coordinator
            Retain this bulletin until further notice                    [ ]
            Discard this bulletin after noting contents                  [ ]
            This bulletin will be invalid after (Date) ________          [ ]
            This bulletin will be incorporated into quality
            Procedure No. ________ by (Date) ________                    [ ]

                              Figure 11-1. Quality Bulletin.

11.3.2 Data Operations

Corrective action will be taken in the PEP to ensure that the DQOs are attained. Many types of
sampling and measurement system corrective actions are possible. Table 11-1 lists some of the
expected problems and the corrective actions needed for a well-run PEP.

                           Table 11-1. Field Corrective Action

Pre-Sampling Event Activities

Filter inspection
  Problem:       Pinhole(s) or tear(s)
  Action:        1. If additional filters have been brought to the site, use one of them. Void
                    filters with pinholes or tears
                 2. Use a new field blank filter as a sample filter
                 3. Obtain a new filter from the laboratory
  Notification:  1. Document on the FDS
                 2. Document on the FDS
                 3. Notify the Regional WAM/TOPO/DOPO

WINS impactor
  Problem:       Heavily loaded with coarse particulate matter. Will be obvious due to a "cone"
                 shape on the impactor well
  Action:        Clean the downtube and WINS impactor. Load new impactor oil into the WINS
                 impactor well
  Notification:  Document in a log book

Leak test
  Problem:       Leak outside acceptable tolerance (80 mL/min)
  Action:        1. Completely remove the flow rate measurement adapter, reconnect it, and
                    perform the leak test again
                 2. Inspect all seals and O-rings, replace them as necessary, and perform the
                    leak test again
                 3. Check sampler with a different leak test device
  Notification:  1. Document in a log book
                 2. Document in a log book; notify the Regional WAM/TOPO/DOPO; flag the data
                    since the last successful leak test
                 3. Document in a log book; notify the Regional WAM/TOPO/DOPO

Ambient pressure verification
  Problem:       Out of specification (±10 mm Hg)
  Action:        1. Make sure pressure sensors are each exposed to the ambient air and are not
                    in direct sunlight
                 2. Call the local airport or other source of ambient pressure data and compare
                    that pressure to pressure data from the monitor's sensor. Pressure
                    correction may be required
                 3. Connect a new pressure sensor
  Notification:  1. Document on the FDS
                 2. Document on the FDS
                 3. Document on the FDS; notify the Regional WAM/TOPO/DOPO

Ambient temperature range during sampling event
  Problem:       <-30°C or >45°C
  Action:        Reschedule another audit
  Notification:  Document on the FDS; notify the Regional WAM/TOPO/DOPO

Ambient temperature verification and filter temperature verification
  Problem:       Out of specification (±2°C of standard)
  Action:        1. Make sure that thermocouples are immersed in the same liquid at the same
                    point without touching the sides or bottom of the container
                 2. Use an ice bath or warm water bath to check a different temperature. If the
                    temperature is acceptable, perform the ambient temperature verification
                    again
                 3. Connect a new thermocouple
                 4. Check the ambient temperature with another NIST-traceable thermometer
  Notification:  1. Document on the FDS
                 2. Document on the FDS
                 3. Document on the FDS; notify the Regional WAM/TOPO/DOPO
                 4. Document on the FDS; notify the Regional WAM/TOPO/DOPO

Sample flow rate verification
  Problem:       Out of specification (indicated flow rate ±4% of transfer standard and ±4% of
                 design flow rate [16.67 Lpm])
  Action:        1. Completely remove the flow rate measurement adapter, reconnect it, and
                    perform the flow rate check again
                 2. Perform the leak test
                 3. Check the flow rate at 16.67 Lpm
                 4. Recalibrate the flow rate
                 5. Verify it again; flow rate must be within ±2% of design flow rate
                    (16.67 Lpm)
  Notification:  1. Document on the FDS
                 2. Document on the FDS
                 3. Document on the FDS; notify the Regional WAM/TOPO/DOPO
                 4. Document on the FDS; notify the Regional WAM/TOPO/DOPO
                 5. Document on the FDS

Sample flow rate
  Problem:       Consistently low flows documented during the sample run
  Action:        1. Check programming of the sampler flow rate
                 2. Check the flow with a flow rate verification filter and determine if the
                    actual flow is low
                 3. Inspect the in-line filter downstream of the 46.2-mm filter location, and
                    replace it as necessary
  Notification:  1. Document in the log book
                 2. Document in the log book
                 3. Document in the log book

Sample flow rate
  Problem:       24-hr CV >2%
  Action:        1. Inspect 5-minute CVs during sampler performance download; determine if the
                    exceedance is associated with external stimuli, such as power loss
                 2. If the exceedance is not justifiable, retest sampler in a laboratory,
                    troubleshoot, and repair as necessary
  Notification:  1. Document in the log book
                 2. Document in the log book

Post-Sampling Event Activities

Elapsed sample time
  Problem:       Out of specification (1 min/mo)
  Action:        Check programming; verify power outages
  Notification:  Notify the Regional WAM/TOPO/DOPO

Elapsed sample time
  Problem:       Sample did not run
  Action:        1. Check programming
                 2. Try programming the sample run to start while the operator is at the site;
                    ensure the transport filter is in the unit
  Notification:  1. Document on the FDS; notify the Regional WAM/TOPO/DOPO
                 2. Document in the log book; notify the Regional WAM/TOPO/DOPO

Power
  Problem:       Power interruptions
  Action:        Check line voltage
  Notification:  Notify the Regional WAM/TOPO/DOPO

Power
  Problem:       Liquid crystal display (LCD) panel is on, but the sample is not working
  Action:        Check the circuit breaker (some samplers have a battery backup for data, but it
                 will not work without AC power)
  Notification:  Document in the log book

Filter inspection
  Problem:       Torn filter or otherwise suspect particulate matter on the 46.2-mm filter
  Action:        1. Inspect the area downstream of where the filter rests in the sampler and
                    determine if particulate matter has been bypassing the filter
                 2. Inspect the in-line filter before the sample pump and determine if excessive
                    loading has occurred; replace as necessary
  Notification:  1. Document on the FDS
                 2. Document in the log book

Data downloading
  Problem:       Data will not transfer to laptop computer
  Action:        Document key information on the sample data sheet; make sure the problem is
                 resolved before data are written over in the sampler microprocessor
  Notification:  Notify the Regional WAM/TOPO/DOPO

a  Contingent on the SLT monitoring agency invalidating its FRM results for the sampling event or other PEP
   sampler performance anomalies such as flow CVs that fall outside of acceptable ranges.

11.4  Sampling Equipment, Preservation, and Holding Time Requirements

This section details the requirements for preventing sample contamination, the volume of air to
be sampled, temperature preservation requirements, and the permissible holding times needed to
ensure against degradation of sample integrity. In addition, Element 15.0, Instrument/Equipment
Testing, Inspection, and Maintenance Requirements, provides information on sampler
maintenance to reduce the potential for contamination or the collection of samples that do not
represent the population of interest.

11.4.1  Sample Contamination Prevention

The PEP has rigid requirements for preventing sample contamination. Powder-free, antistatic
gloves are worn while handling filter cassettes in the laboratory. After the filter cassette has been
removed from the weigh room, it must never be  opened because the 46.2-mm Teflon filter could
become damaged. Filter cassettes will be stored  in protective containers. After samples have
been pre-weighed, they are to be stored with the particulate collection side up, capped with metal
caps, and individually stored in static-resistant zip-top plastic bags.

11.4.2  Sample Volume

The volume of air to be sampled is specified in 40 CFR Part 50, Appendix L. The sample flow
rate is 16.67 Lpm, so the total volume of air collected will be approximately 24 m3 for a 24-hour
sample. Sampling time is expected to be 24 hours (midnight to midnight); however, a shorter
sampling period may be necessary in some cases. This shorter sampling period should not be
less than 23 hours. If a sample period is less than 23 hours or greater than 25 hours, the sample
will be flagged and the Regional WAM/TOPO/DOPO will be notified.
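
As an illustrative sketch only (not part of the PEP procedures, with invented function names),
the volume calculation and the 23- to 25-hour acceptance window above can be expressed as:

```python
# Sketch: total sampled air volume and the 23-25 hour duration check
# described in Section 11.4.2. Names are illustrative, not from the QAPP.

DESIGN_FLOW_LPM = 16.67        # design flow rate, 40 CFR Part 50, Appendix L
MIN_HOURS, MAX_HOURS = 23.0, 25.0

def sampled_volume_m3(duration_hours: float,
                      flow_lpm: float = DESIGN_FLOW_LPM) -> float:
    """Total volume of air sampled, in cubic meters (L/min x min / 1000)."""
    return flow_lpm * duration_hours * 60.0 / 1000.0

def needs_flag(duration_hours: float) -> bool:
    """True when the sample period falls outside the 23-25 hour window,
    so the sample must be flagged and the Regional WAM/TOPO/DOPO notified."""
    return duration_hours < MIN_HOURS or duration_hours > MAX_HOURS

# A nominal 24-hour run collects about 24 m3 of air:
print(round(sampled_volume_m3(24.0), 1))   # 24.0
print(needs_flag(22.5))                    # True
```
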
11.4.3  Temperature Preservation Requirements

The temperature requirements for FRM PM2.5 sample collection are explicitly detailed in 40
CFR Part 50, Appendix L. During transport from the laboratory to the sampling location, there
are no specific requirements for temperature control; however, the filters will remain in their
protective container and in the transport container. Excessive heat must be avoided (e.g., do not
leave in direct sunlight or in a closed car during summer). During the 24-hour sampling period,
the filters will be subjected to ambient temperatures and shall not exceed the ambient
temperature by more than 5°C for more than 30 minutes. Upon retrieval of the sample, the filters
will be cooled as soon as possible to 4°C or less (see PEP Field SOP, Section 6). The filter
temperature requirements are detailed in Table 11-2.

                       Table 11-2. Filter Temperature Requirements

Item                                        Temperature Requirement    Reference
Filter temperature control during           No more than 5°C above     40 CFR Part 50, Appendix L,
  sampling and until recovery                 ambient temperature        Section 7.4.10
Filter temperature control from time        4°C or less a              40 CFR Part 50, Appendix L,
  of recovery to the start of conditioning                               Section 10.13
Post-sample transport                       4°C or less a              40 CFR Part 50, Appendix L,
                                                                         Section 8.3.6

a  PEP requirement is more stringent than regulations for FRM design.

11.4.4 Permissible Holding Times

The permissible holding times for the routine FRM network PM2.5 sample are clearly detailed in
both 40 CFR Part 50, Appendix L, and Quality Assurance Guidance Document 2.12. The
holding times for the PEP are provided in Table 11-3. Note that in some steps, PEP requirements
are more stringent than the FRM network regulations.

                                Table 11-3. Holding Times

Item                    Holding Time     From                To                 Reference
Pre-sampling            <30 days         Date of pre-        Date of sampling   40 CFR Part 50, Appendix L,
  weighed filter                           weighing            event              Section 8.3.5
Recovery of filter      <24 hours a,b    Completion of       Time of sample     40 CFR Part 50, Appendix L,
                                           sampling event      recovery           Section 10.10
Shipped to laboratory   <8 hours         Time of recovery    Time of shipment   40 CFR Part 50, Appendix L,
                          (ideally) a,b                                            Section 10.13
Post-sampling filter    <10 days a,b     Sample end          Date of post-      40 CFR Part 50, Appendix L,
  stored at <4°C                           date/time           weighing           Section 8.3.6

a  PEP requirement is more stringent than regulations for FRM design.
b  See Table 6-4 (found in Element 6.0, Project/Task Description) for exceptions.
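
As a sketch only, the hard holding-time limits in Table 11-3 could be checked programmatically.
The step labels and function below are invented for illustration; the 8-hour shipping window is
omitted because it is a goal ("ideally") rather than a hard limit.

```python
from datetime import datetime, timedelta

# Holding-time limits from Table 11-3 (step labels are invented).
HOLDING_LIMITS = {
    "pre_weigh_to_sampling": timedelta(days=30),     # pre-sampling weighed filter
    "sample_end_to_recovery": timedelta(hours=24),   # recovery of filter
    "sample_end_to_post_weigh": timedelta(days=10),  # post-sampling filter stored cold
}

def within_holding_time(step: str, start: datetime, end: datetime) -> bool:
    """True when the elapsed time for the given step is within its limit."""
    return (end - start) <= HOLDING_LIMITS[step]

# Recovery 20 hours after the sampling event ends is within the 24-hour limit:
print(within_holding_time("sample_end_to_recovery",
                          datetime(2009, 3, 6, 0, 0),
                          datetime(2009, 3, 6, 20, 0)))   # True
```
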

-------
                                                                         Project: PEP QAPP
                                                                          Element No.: 12.0
                                                                            Revision No.: 1
                                                                            Date: 3/6/2009
                                                                        	Page 1 of 1
                     12.0  Sample Handling and Custody

Due to the potential use of the PM2.5 data for comparison to the NAAQS and the requirement for
extreme care in handling the sample collection filters, sample COC procedures will be followed.
The PEP Laboratory SOP (Sections 5 and 9) and the PEP Field SOP (Sections 3 and 7) provide
detailed instruction on filter-handling and COC procedures, which will not be included in this
section.

Due to the small amount of PM that is expected on these filters, improper filter handling can be a
major source of error. Care must be taken when handling both exposed and unexposed filters.
Filter cassettes should be handled in a manner to prevent the filters they contain from being
damaged or contaminated. Similarly, rough handling of exposed filters should be avoided
because this may dislodge collected PM on the filters. Care should be taken to avoid inadequate
conditioning of filters or excessive delays between sample retrieval and sample weighing
because this may lead to positive or negative weight changes and, thus, to inaccurate PM2.5
concentration measurements.

COC forms are used to ensure that

    •   Filters are processed, transferred, stored, and analyzed by authorized personnel
    •   Sample integrity is maintained during all laboratory phases of sample handling and
       analyses
    •   An accurate written record is maintained of sample handling and treatment from the time
       of receipt from EPA through laboratory procedures to disposal.

Proper sample custody minimizes accidents by assigning responsibility for all stages of sample
handling and ensures that problems will be detected  and documented if they occur. A sample is
in custody if it is in actual physical possession of authorized personnel or if it is in a secured area
that is restricted to authorized personnel. As illustrated in Figure 6.1 (found in Element 6.0,
Project/Task Description) the three-part carbonless COC Form starts at the weighing laboratory,
proceeds through field activities, and then it is sent back to the laboratory. Later, the information
is entered into the weighing laboratory's sample tracking system, where an electronic record will
be kept.

-------
                                                                       Project: PEP QAPP
                                                                        Element No.: 13.0
                                                                          Revision No.: 1
                                                                           Date: 3/6/2009
                                                                            Page 1 of 8
                  13.0  Analytical Methods Requirements

The analytical methods described below provide for gravimetric analyses of filters used in the
PEP. The net weight gain of a sample filter is calculated by subtracting the initial weight (pre-
sampling) from the final weight (post-sampling). The net weight gain is divided by the total flow
volume passed through a filter (derived from the field data) to calculate the concentration. This
PEP-derived concentration may be compared to the concentration derived in the same manner
from a primary routine monitor.
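
The concentration calculation described above is straightforward arithmetic; the sketch below
illustrates it. Function and variable names are invented for the example, and it assumes weights
in micrograms and volume in cubic meters.

```python
def pm25_concentration_ug_m3(pre_weight_ug: float,
                             post_weight_ug: float,
                             volume_m3: float) -> float:
    """PM2.5 concentration: net filter weight gain divided by sampled air volume."""
    net_gain_ug = post_weight_ug - pre_weight_ug   # final minus initial weight
    return net_gain_ug / volume_m3

# A filter gaining 360 ug over a nominal 24 m3 sample gives 15 ug/m3:
print(pm25_concentration_ug_m3(150000.0, 150360.0, 24.0))   # 15.0
```
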

All analytical methods are included in the document entitled Quality Assurance Guidance
Document Method Compendium Laboratory Standard Operating Procedure for the PM2.5
Performance Evaluation Program. The PEP weighing laboratory will be responsible for
implementing these analytical procedures. The following sections summarize the information in
the PEP Laboratory SOP; however, it is important to note that these summaries do not replace
the SOP.

13.1   Preparation of Sample Filters

Upon delivery of 46.2-mm Teflon filters to the laboratory, the receipt is documented and the
filters are stored in the conditioning/weighing room/laboratory. Storing filters in the laboratory
makes it easier to maximize the amount of time available for conditioning. Upon receipt, cases of
filters will be labeled with the date of receipt, they will be opened one at a time, and they will be
used completely before opening another case. All filters in a lot will be used before a case
containing another lot is opened. When more than one case is available to open, the "First In-
First Out" rule will apply.

Filters will be visually inspected according to the FRM criteria to determine compliance. Filters
will then be stored in the filter conditioning compartment in unmarked Petri dishes.

13.2   Analysis Method

13.2.1 Analytical Equipment and Method

A complete listing of the analytical equipment is found in the PEP Laboratory SOP and in
Element 17.0, Inspection/Acceptance for Supplies and Consumables.

The analytical instrument used for gravimetric analysis in the FRM method is the
microbalance. The PEP weighing laboratory currently uses the Sartorius® MC-5,
which has a readability of 1 µg and a repeatability of 1 µg. The microbalance is calibrated twice
yearly by a technician under a service agreement between the weighing laboratory and the
vendor.

As Figure 13-1 indicates, the method of analysis consists of pre-sampling and post-sampling
stages. Figure 13-1 also indicates the section number where detailed procedures can be found in
the PEP Laboratory SOP.
[Figure 13-1 is a flowchart of the PEP Laboratory SOP structure, spanning the pre-sampling and
post-sampling stages between OAQPS, the weighing laboratory, and AQS: weighing laboratory
(Sections 2 and 3); filter handling/inventory/inspection (Section 5); filter conditioning
(Section 6); calibration/filter weighing/QA/QC (Sections 7, 8, and 11); laboratory
shipping/receiving and chain of custody (Section 9); data entry/data transfer (Section 10); and
storage/archiving (Section 12).]

                           Figure 13-1. Laboratory activities.
Pre-sampling Stage
   •  Filters are received from EPA, logged in, and examined for integrity.
   •  A proportion of filters are conditioned for use in the field.
   •  Filters are equilibrated, weighed, and enumerated.
   •  Filters are prepared for field activities and shipped to the appropriate Regions.

Post-sampling Stage

The post-sampling stage consists of the following steps (in chronological order):

   Step 1.   Filters are received in the weighing laboratory, checked for integrity (e.g., damage,
            temperature), and logged in.
   Step 2.   Filters are archived (in cold storage) until ready for weighing.
   Step 3.   Filters are brought into the weighing laboratory and equilibrated for 24 hours.
   Step 4.   Filters are weighed, and then the gravimetric results are entered into the data entry
            system.
   Step 5.   Field data are entered into the data entry system to calculate a concentration.
   Step 6.   Data are verified and validated.
   Step 7.   Filters are archived in cold storage for the remainder of the calendar year and for
            one full calendar year afterwards. Filters are then stored at room temperature for an
            additional three calendar years. For example, a filter sampled on March 1, 2007 will
            be kept in refrigerated storage until December 31, 2008 and not disposed of until
             after December 31, 2011.
   Step 8.   Required data are submitted to the AQS database.
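
The retention schedule in Step 7 can be expressed as simple date arithmetic. The sketch below
(with invented names, not part of the SOP) reproduces the March 1, 2007 example.

```python
from datetime import date

def retention_dates(sample_date: date) -> tuple:
    """Cold-storage end date and earliest disposal date for a sampled filter."""
    # Refrigerated for the remainder of the calendar year plus one full
    # calendar year afterwards.
    cold_until = date(sample_date.year + 1, 12, 31)
    # Then stored at room temperature for three additional calendar years.
    dispose_after = date(cold_until.year + 3, 12, 31)
    return cold_until, dispose_after

# Filter sampled March 1, 2007: refrigerated until Dec 31, 2008;
# not disposed of until after Dec 31, 2011.
print(retention_dates(date(2007, 3, 1)))
```
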

13.2.2 Conditioning and Weighing Room

The primary support facility for the PM2.5 analysis is the weighing laboratory. Facility space is
dedicated for long-term archiving of the filters. This weighing room is used for both sample
conditioning and pre- and post-sampling weighings of each PM2.5 filter sample. The laboratory
facilities have been constructed to minimize contamination from dust or other potential
contaminants (using High-Efficiency Particulate Air [HEPA] filters and sticky mats), and access
will be restricted to LAs, who will wear appropriate laboratory attire at all times.

Specific requirements for environmental control of the weighing room  are detailed in 40 CFR
Part 50, Appendix L. Mean relative humidity is controlled between 30% and 40%, with a target
of 35% and variability of not more than ±  5% over 24 hours, with minimums  and maximums
never to fall out of the 25%-45% range. Mean temperature should be held between 20°C and
23°C, with a variability of not more than ± 2°C over 24 hours, with minimums and maximums
never to fall out of the 18°C-25°C range. Temperature and relative humidity are measured and
recorded continuously during equilibration. The balance is located on a vibration-free table and
is protected from or located out of the path of any sources of drafts. Filters are conditioned
before the pre- and post-sampling weighing sessions. Filters must be conditioned for at least 24
hours to allow their weights to stabilize before being weighed.
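
As a sketch of the environmental criteria above, the 24-hour weigh-room record could be checked
as follows. The function name is invented, and the code interprets the ±5% RH and ±2°C
variability limits as the maximum deviation of any reading from the 24-hour mean, which is one
reasonable reading of the requirement.

```python
def weigh_room_ok(rh_readings: list, temp_readings: list) -> bool:
    """Check 24-hour weigh-room RH (%) and temperature (C) records against
    the 40 CFR Part 50, Appendix L limits summarized in this QAPP."""
    mean_rh = sum(rh_readings) / len(rh_readings)
    mean_t = sum(temp_readings) / len(temp_readings)
    return (30.0 <= mean_rh <= 40.0                                  # mean RH 30-40%
            and all(abs(rh - mean_rh) <= 5.0 for rh in rh_readings)  # +/-5% variability
            and all(25.0 <= rh <= 45.0 for rh in rh_readings)        # never outside 25-45%
            and 20.0 <= mean_t <= 23.0                               # mean T 20-23 C
            and all(abs(t - mean_t) <= 2.0 for t in temp_readings)   # +/-2 C variability
            and all(18.0 <= t <= 25.0 for t in temp_readings))       # never outside 18-25 C

print(weigh_room_ok([35.0] * 24, [21.5] * 24))   # True
```
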



 13.3  Internal QC and Corrective Action for Measurement System

 13.3.1 Corrections to the SOP

 The ESAT contractors are responsible for implementing this QAPP and the PEP Laboratory
 SOP, and they are responsible for the quality of the data. All methods will be reviewed and
 implemented by the ESAT contractors. If changes or corrections are required to the PEP
 Laboratory SOP or QAPP, the ESAT contractor will  notify the Regional WAM/TOPO/DOPO in
 writing. The WAM/TOPO/DOPO will then convey the issue(s) to the PEP Workgroup, which
 will review the changes and attempt to classify them  according to the effect the changes would
 have on the data. The required procedure for changes to the PEP Field SOP is discussed in
 Element 11.0, Sampling Methods Requirements.

 13.3.2 Data Operations

 A QC notebook or database (with disk backups) will be maintained and will contain QC data and
 entry forms, calibration and maintenance information, routine internal QC checks of mass
 reference standards, laboratory and field filter blanks, and external QA audits. Control charts will
 be maintained for each microbalance and will be included in this notebook. These charts may
 allow for the discovery of excess drift that could signal an instrument malfunction.

 QC checks will be used to assist the LAs in controlling and evaluating the quality of data during
 a weighing session. These QC checks include the following:

    •   Mass working standards weighed at the beginning and at the end of each weighing
       session, and one after approximately every 15 samples or fewer, per the
       recommendations of the balance  manufacturer
    •   Blanks (both field and laboratory) that will be used to determine contamination
    •   Duplicate routine weights to determine instrument repeatability and filter stability
        within and between weighing sessions.

 The acceptance requirements for these QC checks can be found in Table 7-1, in the SOP, and in
 more detail in Element 14.0, Quality Control Requirements.
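
As an illustrative sketch only, a weighing session could be evaluated against the acceptance
limits for these checks (3 µg for working standards, 15 µg for blanks and duplicate reweighs, per
Table 13-2). Function and argument names are invented; all weights are in micrograms.

```python
def qc_session_ok(standard_readings: list, certified_values: list,
                  blank_deltas_ug: list,
                  first_weight_ug: float, duplicate_weight_ug: float) -> bool:
    """Evaluate one weighing session against the PEP QC acceptance limits
    (limits from Table 13-2; names invented for this sketch)."""
    standards_ok = all(abs(r - c) <= 3.0                       # within 3 ug of certified
                       for r, c in zip(standard_readings, certified_values))
    blanks_ok = all(abs(d) <= 15.0 for d in blank_deltas_ug)   # blank change within 15 ug
    duplicate_ok = abs(first_weight_ug - duplicate_weight_ug) <= 15.0
    return standards_ok and blanks_ok and duplicate_ok
```

A failed check would trigger the corrective actions in Tables 13-1 and 13-2 (e.g., stop weighing
and troubleshoot, or flag the affected values for validation).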

 Corrective action measures in the PEP will be taken to ensure data of adequate quality. Many
 types of sampling and measurement system corrective actions are possible. Tables 13-1
 (organized by laboratory support equipment) and 13-2 (organized by laboratory support
 activity) list potential problems and corrective actions needed to support the PEP. Filter
 weighing will be delayed until corrective actions are satisfactorily implemented.

Table 13-1. Potential Problems/Corrective Action for Laboratory Support Equipment

Weigh room / Relative humidity
  Problem:       Out of specification
  Action:        Check the heating, ventilation, and air conditioning (HVAC) system
  Notification:  PEP Laboratory Manager

Weigh room / Temperature
  Problem:       Out of specification
  Action:        Check the HVAC system
  Notification:  PEP Laboratory Manager

Balance / Internal calibration
  Problem:       Unstable
  Action:        Retry internal calibration
  Notification:  PEP Laboratory Manager

Balance / Zero
  Problem:       Unstable
  Action:        Retry zero and check for drafts; check that the draft guard is sealed
  Notification:  PEP Laboratory Manager

Balance / Working standards
  Problem:       Out of specification
  Action:        1. Check the temperature and relative humidity and check the working standard
                 2. Recalibrate and check the working standard
                 3. Check with primary standards
  Notification:  Document; PEP Laboratory Manager

Balance / Filter weighing
  Problem:       Unstable
  Action:        Check laboratory blank filters
  Notification:  Document in a log book
                 Table 13-2. Filter Preparation and Analysis Checks

Microbalance use
  Method and Frequency:  1 per year to establish instrument detection limit (IDL)
  Requirements:          Resolution of 1 µg, repeatability of 1 µg

Control of balance environment
  Method and Frequency:  5-minute values of temperature and relative humidity averaged for
                         24 hours

Use of mass reference standards
  Method and Frequency:  Working standards checked every 3 months against the laboratory
                         primary standards

Filter handling
  Method and Frequency:  Observe handling procedures

Filter integrity check
  Method and Frequency:  Visually inspect each filter

Filter identification
  Method and Frequency:  Write the filter number on the COC Form, the cassette number on the
                         protective container, and both numbers in the database and/or on a
                         laboratory data form in permanent ink
Filter lot stability
  Method and Frequency:  Determine the correct equilibration conditions and period (at least
                         24 hours) for each new lot of filters
  Requirements:          Check for stability of lot exposure blank filter weights; weight
                         changes must be <15 µg on successive weighings of lot exposure blanks
  Action if not met:     Revise the equilibration conditions and period; repeat the
                         equilibration

Pre-sampling filter equilibration
  Method and Frequency:  Equilibrate filters for at least 24 hours in the weighing room; observe
                         and record the equilibration chamber relative humidity and temperature;
                         enter into the database and/or on the laboratory data form
  Requirements:          Mean relative humidity between 30% and 40%, with a target of 35% and
                         variability of not more than ±5% over 24 hours, with minimums and
                         maximums never to fall out of the 25%-45% range; mean temperature
                         should be held between 20°C and 23°C, with a variability of not more
                         than ±2°C over 24 hours, with minimums and maximums never to fall out
                         of the 18°C-25°C range
  Action if not met:     Revise the equilibration conditions and period; repeat the
                         equilibration

Initial filter weighing
  Method and Frequency:  Observe all weighing procedures; perform all QC checks
  Requirements:          Neutralize electrostatic charge on filters; wait until the balance
                         indicates a stable reading
  Action if not met:     Repeat weighing

Internal QC
  Method and Frequency:  1. After approximately every 15th filter (or fewer, per
                            recommendations from the balance manufacturer), reweigh the two
                            working standards
                         2. Weigh laboratory filter blanks
                         3. Reweigh the first filter as the last routine weight with each
                            sample batch (duplicate weighing)
                         4. For post-sampling weighing sessions only, keep the filter used for
                            duplicate weighing and place it with the next batch; do not make
                            this filter one of the first three filters in the next batch
                            (previous batch duplicate)
  Requirements:          1. Working standard measurements must agree to within 3 µg of the
                            certified values
                         2. Blank measurements must agree to within 15 µg
                         3. First and last filter reweigh measurements must agree to within
                            15 µg
                         4. Filter reweigh measurements between adjacent weigh sessions must
                            agree to within ...
  Action if not met:     1. Stop weighing and troubleshoot
                         2. Flag values for validation activities
                         3. Flag; reweigh the 2nd and 3rd filters; if failure, then recondition
                            all samples in the run and reweigh

Post-sampling inspection, documentation, and verification
  Method and Frequency:  Examine the filter and FDSs for correct and complete entries; if the
                         sample was shipped in a cooled container, verify that a low temperature
                         was maintained
  Requirements:          No damage to filter; FDS complete; sampler worked OK
  Action if not met:     Notify the PEP Laboratory Manager; flag filters
                                                                       Project: PEP QAPP
                                                                        Element No.: 13.0
                                                                          Revision No.: 1
                                                                           Date: 3/6/2009
                                                                       	Page 7 of 8
Activity: Post-sampling filter equilibration
  Method and Frequency: Equilibrate filters for at least 24 hours in weighing room; observe
  and record the equilibration chamber relative humidity and temperature; enter into the
  database and/or on the laboratory data form (must be within ±5% relative humidity of
  pre-sampling weighing conditions).
  Requirements: Mean relative humidity between 30% and 40%, with a target of 35% and
  variability of not more than ±5% over 24 hours, with minimums and maximums never to fall
  out of the 25%-45% range; mean temperature should be held between 20°C and 23°C, with a
  variability of not more than ±2°C over 24 hours, with minimums and maximums never to fall
  out of the 18°C-25°C range.
  Action if the requirements are not met: Repeat equilibration.

Activity: Post-sampling filter weighing
  Method and Frequency: Observe all weighing procedures; perform all QC checks.
  Requirements: Neutralize electrostatic charge on filters; wait 30 to 60 seconds after
  balance indicates a stable reading before recording data.
  Action if the requirements are not met: Repeat weighing.
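The weighing-room equilibration criteria above (mean, variability, and excursion limits for relative humidity and temperature) can be sketched as a simple check. This is an illustrative aid, not part of the PEP SOPs; all names are hypothetical, and the ±5% RH and ±2°C variability limits are interpreted here as a maximum min-to-max spread of 10% RH and 4°C over the 24-hour record.

```python
# Sketch (not from the PEP SOPs): check a 24-hour series of weighing-room
# readings against the equilibration criteria quoted above. Function and
# variable names are illustrative.

def check_equilibration(rh_readings, temp_readings):
    """Return a list of violations for one 24-hour equilibration period.

    rh_readings   -- relative humidity values (%), sampled over 24 h
    temp_readings -- temperatures (deg C), sampled over the same period
    """
    violations = []

    mean_rh = sum(rh_readings) / len(rh_readings)
    if not 30.0 <= mean_rh <= 40.0:
        violations.append("mean RH outside 30-40%")
    if max(rh_readings) - min(rh_readings) > 10.0:   # +/-5% about the mean
        violations.append("RH variability exceeds +/-5% over 24 h")
    if min(rh_readings) < 25.0 or max(rh_readings) > 45.0:
        violations.append("RH excursion outside 25-45%")

    mean_t = sum(temp_readings) / len(temp_readings)
    if not 20.0 <= mean_t <= 23.0:
        violations.append("mean temperature outside 20-23 C")
    if max(temp_readings) - min(temp_readings) > 4.0:  # +/-2 C about the mean
        violations.append("temperature variability exceeds +/-2 C over 24 h")
    if min(temp_readings) < 18.0 or max(temp_readings) > 25.0:
        violations.append("temperature excursion outside 18-25 C")

    return violations
```

An empty returned list corresponds to an acceptable equilibration period; any entry corresponds to the "repeat equilibration" action in the table above.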
13.4   Filter Sample Contamination Prevention, Preservation, and Holding
       Time Requirements

This section details the requirements for preventing sample contamination, the temperature
requirements for sample preservation, and the permissible holding times that ensure against
degradation of sample integrity.

13.4.1  Sample Contamination Prevention

The analytical support component of the PEP has rigid requirements for preventing sample
contamination. Filters are equilibrated/conditioned and stored in the same room where they were
weighed and will be protected in Petri dishes. The weighing room is controlled for climate and
contamination (see  Section 13.2.2). Powder-free and antistatic gloves are worn while handling
filters, and filters are only contacted with smooth,  non-serrated forceps. Upon determining a pre-
sampling weight, the filter is placed in its cassette, filter caps are placed on the cassette, and then
the capped cassette is placed in a plastic, antistatic shipping bag. The shipping bag and capped
cassette are only opened when the filter is being installed in a monitor. After the filter has been
removed from the weighing room, it will never leave the cassette until it is back in the weighing
room (during post-sampling).
13.4.2  Temperature Preservation Requirements

The temperature requirements of the PM2.5 FRM network are explicitly detailed in 40 CFR Part
50, Appendix L. The PEP requirements will be more stringent. In the weighing room laboratory,
the filters must be conditioned for a minimum of 24 hours before pre-weighing, although a
longer period of conditioning may be required. The mean weighing room laboratory temperature
must be maintained between 20°C and 23°C, with no more than a ±2°C change over the 24-hour
period before weighing the filters. Minimums and maximums should never fall out of the



 18°C-25°C range. During transport from the weighing room to the sample location, there are no
 specific temperature control requirements; however, the filters will be in their protective
 container, and temperature extremes (excessive heat or cold) will be avoided. Temperature
 requirements for the sampling and post-sampling periods are detailed in 40 CFR Part 50,
 Appendix L,  Section 7.4.10. These requirements state that the temperature of the filter cassette
 during sampler operation and in the period from the end of sampling to the time of sample
 recovery shall not exceed that of the ambient temperature by more than 5°C for more than 30
 minutes.

 The specifics of temperature preservation requirements are clearly detailed in 40 CFR Part 50,
Appendix L.[1] These requirements pertain to sample media before collection, as well as the
 sample media and sample after a sample has been collected. During the sample collection, there
 are also temperature control requirements, which are detailed in Table 13-3.

                     Table 13-3. Temperature Control Requirements
Item: Weighing room
  Temperature Requirement: Mean temperature should be held between 20°C and 23°C, with a
  variability of not more than ±2°C over 24 hours, with minimums and maximums never to fall
  out of the 18°C-25°C range.(a)
  Reference: 40 CFR Part 50, Appendix L, Section 8.2

Item: Filter temperature control during sampling and until recovery
  Temperature Requirement: No more than 5°C above ambient temperature.
  Reference: 40 CFR Part 50, Appendix L, Section 7.4.10

Item: Post-sample transport
  Temperature Requirement: <4°C(a)
  Reference: 40 CFR Part 50, Appendix L, Section 8.3.6

(a) PEP requirement is more stringent than regulations for FRM design.

13.4.3 Permissible Holding Times

The permissible holding times for the PM2.5 sample are clearly detailed in both 40 CFR Part 50[1]
and Section 2.12 of the U.S. EPA QA Handbook[2]. A summary of these holding times is provided
in Table 11-3, which is found in Element 11.0, Sampling Methods Requirements.
References

The following documents were used to develop this element:

1.  U.S. EPA (Environmental Protection Agency). 2006. National Ambient Air Quality
    Standards for Particulate Matter—Final Rule. 40 CFR Part 50. Federal Register
    71(200):61144-61233. October 17.

2.  U.S. EPA (Environmental Protection Agency). 1998. U.S. EPA Quality Assurance Guidance
    Document 2.12: Monitoring PM2.5 in Ambient Air Using Designated Reference or Class I
    Equivalent Methods. March.

                                                                         Project: PEP QAPP
                                                                          Element No.: 14.0
                                                                            Revision No.: 1
                                                                            Date: 3/6/2009
                                                                              Page 1 of 18
                     14.0  Quality Control Requirements

To assure the quality of data from air monitoring measurements, two distinct but interrelated
functions must be performed. One function is the control of the measurement process
through broad QA activities, such as establishing policies and procedures, developing DQOs,
assigning roles and responsibilities, conducting oversight and reviews, and implementing
corrective actions. The other function is the control of the measurement process through the
implementation of specific QC procedures, such as audits, calibrations, checks, replicates, and
routine self-assessments. In general, the greater the control of a given monitoring system, the
better will be the resulting quality of the monitoring data.

QC is the overall system of technical activities that measures the attributes and performance of a
process, item, or service against defined standards to verify that the stated requirements
established by the customer are met. In the case of the PEP, QC activities are used to ensure that
measurement uncertainty, as discussed in Element 7.0, Quality Objectives and Criteria for
Measurement Data, is maintained within acceptance criteria for the attainment of the DQO.
Figure 14-1 represents many QC activities that help to evaluate and control data  quality for the
PM2.5 PEP. The activities in this figure are implemented by the PEP and are discussed in the
appropriate elements of this QAPP.

14.1  QC Procedures

Day-to-day QC is implemented through various check samples or instruments that are used for
comparison. The MQOs table (Table 7-1) in Element 7.0, Quality Objectives and Criteria for
Measurement Data, contains a complete listing of these QC samples, as well as other
requirements for the PM2.5 PEP. The procedures for implementing the QC samples are included
in the PEP Field and Laboratory SOPs, respectively. As Figure  14-1 illustrates, various types of
QC samples have been inserted at various phases of the data operation to assess and control
measurement uncertainties. Tables 14-1 and 14-2 contain summaries of all the field and
laboratory QC samples. The following information provides some additional descriptions of
these QC activities, how they will be used in the evaluation process, and what corrective actions
will be taken  when they do not meet acceptance criteria.

[Figure 14-1 (PM2.5 PEP Quality Control Sampling Scheme) is a flow diagram showing QC checks
applied at three stages: laboratory pre-field weighing (measurement system contamination),
field sampling (transportation- and laboratory-related contamination, instrument
precision/bias, and measurement system precision), and laboratory post-field weighing
(laboratory contamination and weighing lab precision/bias).]

                         Figure 14-1. PEP QC scheme.

Table 14-1. Field QC Checks
(Columns: Requirement; Frequency; Acceptance Criteria; CFR Reference; SOP Reference;
Information Provided)

Calibration Standards
- Flow rate transfer standard or primary standard; 1/yr; ±2% of NIST-traceable standard;
  Part 50, Appendix L, Section 9.2.2; Field SOP, Section 8; Certification of traceability
- Field thermometer; 1/yr; ±0.1°C resolution, ±0.5°C accuracy; Not described; —;
  Certification of traceability
- Field barometer; 1/yr; ±1 mm Hg resolution, ±5 mm Hg accuracy; Not described; —;
  Certification of traceability

Calibration/Verification
- Single-point flow rate verification; Every sampling event; ±4% of working standard or
  ±4% of design flow (16.67 Lpm); Part 50, Appendix L, Section 9.2.5; Field SOP, Section 5;
  Calibration drift and memory effects
- Multipoint flow rate verification(a); 1/yr or upon failure of single-point verification;
  ±2% of calibration standard; Part 50, Appendix L, Section 9.2.5; Field SOP, Section 10;
  Calibration drift and memory effects
- Flow rate calibration; Upon failure of multipoint verification; ±2% of calibration
  standard at design flow (16.67 Lpm); Part 50, Appendix L, Section 9.2.6; Field SOP,
  Section 10; Calibration drift and memory effects
- Single-point flow rate verification; Following every calibration; ±2% of design flow
  (16.67 Lpm); Part 50, Appendix L, Section 9.2.6; Field SOP, Section 10; Calibration drift
  and memory effects
- External leak check; Every sampling event; <80 mL/min; Part 50, Appendix L, Section 7.4.6;
  Field SOP, Section 5; Sampler function
- Internal leak check; Upon failure of external leak check; <80 mL/min; Part 50, Appendix L,
  Section 7.4.6; Field SOP, Section 5; Sampler function
- Single-point temperature verification; Every sampling event and following every
  calibration; ±2°C of working standard; Part 50, Appendix L, Section 9.3; Field SOP,
  Section 5; Calibration drift and memory effects
- Temperature multipoint verification; 1/yr or upon failure of single-point verification;
  ±2°C of calibration standard; Part 50, Appendix L, Section 9.3; Field SOP, Section 10;
  Calibration drift and memory effects
- Temperature calibration; Upon failure of multipoint verification; ±0.1°C of calibration
  standard; Part 50, Appendix L, Section 9.3; Field SOP, Section 10; Calibration drift and
  memory effects
- Single-point barometric pressure verification; Every sampling event and following every
  calibration; ±10 mm Hg; Part 50, Appendix L, Section 9.3; Field SOP, Section 5;
  Calibration drift and memory effects
- Multipoint barometric pressure verification; 1/yr or upon failure of single-point
  verification; ±10 mm Hg; Part 50, Appendix L, Section 9.3; Field SOP, Section 10;
  Calibration drift and memory effects
- Barometric pressure calibration; Upon failure of multipoint verification; ±10 mm Hg;
  Part 50, Appendix L, Section 9.3; Field SOP, Section 10; Calibration drift and memory
  effects
- Clock/timer verification; Every sampling event; 1 min/mo; Part 50, Appendix L,
  Section 7.4.12; Field SOP, Section 5; Verification of clock/timer to assure proper
  function

Blanks
- Field filter blank(b); One/audit (for programs <2 years old), one/FS per trip (for all
  others); ±30 μg change between weighings; Part 50, Appendix L, Section 8.2; Field SOP,
  Section 8; Measurement system contamination
- Trip filter blank(c); 10% of all filters; ±30 μg change between weighings; Not described;
  Lab SOP, Section 8 and Field SOP, Section 6; Measurement system contamination

Precision (Using Collocated Samplers)(d)
- All samplers (mandatory); 2/yr (semi-annual); Coefficient of variation <10%; Not
  described; Field SOP, Section 8; Measurement system precision

Accuracy (Using Independent Verification Devices)
- Flow rate audit; 4/yr (manual); ±4% of calibration standard at design flow (16.67 Lpm);
  Part 58, Appendix A, Section 3.5.1; Field SOP, Section 5; Instrument bias/accuracy
- External leak check; 4/yr; <80 mL/min; Part 50, Appendix L, Section 7.4.6; Field SOP,
  Section 5; Sampler function
- Internal leak check; 4/yr (if external leak check fails); <80 mL/min; Part 50, Appendix L,
  Section 7.4.6; Field SOP, Section 5; Sampler function
- Temperature audit; 4/yr; ±2°C of calibration standard; Part 50, Appendix L, Section 9.3;
  Field SOP, Section 5; Calibration drift and memory effects
- Barometric pressure audit; 4/yr; ±10 mm Hg of calibration standard; Part 50, Appendix L,
  Section 7.4; Field SOP, Section 5; Calibration drift and memory effects

Technical Systems Audits(e)
- Flow rate audit; 1/yr; ±4% of calibration standard at design flow (16.67 Lpm); Part 58,
  Appendix A, Section 3.5.1; Field SOP, Section 5; External verification bias/accuracy
- External leak check; 1/yr; <80 mL/min; Part 50, Appendix L, Section 7.4.6; Field SOP,
  Section 5; Sampler function
- Internal leak check; 1/yr (if external leak check fails); <80 mL/min; Part 50, Appendix L,
  Section 7.4.6; Field SOP, Section 5; Sampler function
- Temperature audit; 1/yr; ±2°C of transfer standard; Part 50, Appendix L, Section 9.3;
  Field SOP, Section 5; Calibration drift and memory effects
- Barometric pressure audit; 1/yr; ±10 mm Hg of transfer standard; Part 50, Appendix L,
  Section 7.4; Field SOP, Section 5; Calibration drift and memory effects

(a) The BGI PQ200A is not capable of performing a multipoint verification for flow rate. If
 the BGI PQ200A fails a single-point verification for flow, then a single-point calibration
 should be performed next.
(b) For a new SLT program (i.e., <2 years old), the frequency for field blanks is one per
 FRM/FEM audit. For all others, one field blank should be performed per FS per trip. A trip
 may include audits for more than one FRM/FEM sampler. It is up to the FS to determine the
 site where the field blank audit will be performed, unless otherwise directed by his or her
 Regional WAM/TOPO/DOPO (e.g., when a problem is identified at a particular site).
(c) Trip blanks will be performed at a frequency of 10% of all filters, as determined by the
 weighing laboratory (i.e., one per every 10 filters shipped out, rounded up). So if the
 laboratory sends out one to 10 filters, then one trip blank should be included in the
 shipment. If the laboratory ships 11 to 20 filters, then two trip blanks should be
 included. The FS will determine with which trip to use the trip blank filter(s), in a
 manner similar to the field blanks; however, if the FS receives more than one trip blank in
 a shipment, then he or she must make sure that only one trip blank is carried per trip.
(d) Twice per year, all of the PEP samplers used by the Region (and any SLT organizations
 that are running their own PEP) must be collocated and run at the same location over the
 same time period. These are often referred to as "parking lot collocations." In 2007, the
 monthly and quarterly frequency was replaced by semi-annual collocation scenarios because
 the historical performance shows that the precision does not seem to vary significantly.
(e) All of the annual technical systems audits may be performed in conjunction with one of
 the semi-annual parking lot studies. The Regional WAM/DOPO/TOPO or the National PEP Project
 Leader will observe the FS when the sampler audits are performed.
                                    Table 14-2. Laboratory QC
(Columns: Requirement; Frequency; Acceptance Criteria; SOP Reference; Information Provided)

Blanks
- Lot exposure; 3 filters from each of 3 boxes in lot (9 filters total); ±15 μg change
  between weighings; Lab SOP, Section 6; Filter stabilization/equilibrium
- Laboratory filter; 10% or 1 per weighing session; ±15 μg change between weighings;
  Lab SOP, Section 8; Laboratory contamination
- Trip filter; 10% of all filters; ±30 μg change between weighings; Lab SOP, Section 8;
  Transportation and laboratory contamination

Calibration/Verification
- Balance calibration; When routine QC checks indicate calibration is needed and upon
  approval; Manufacturer's specification; Lab SOP, Section 7; Verification of equipment
  operation
- Laboratory temperature verification; 1/quarter; ±2°C; Lab SOP, Section 7; Verification of
  equipment operation
- Laboratory humidity verification; 1/quarter; ±2% relative humidity; Lab SOP, Section 7;
  Verification of equipment operation

Accuracy
- Balance audit (PE); 2/yr; ±20 μg of NIST-traceable standard, ±15 μg for unexposed filters;
  Lab SOP, Section 11; LA operation
- Balance check; Beginning/end of the weighing session and one after approximately every 15
  samples or fewer, per the balance manufacturer's recommendations; <3 μg of working mass
  standard; Lab SOP, Section 8; Balance accuracy/stability

Calibration Standards
- Working mass standards; 3-6 months; 0.025 mg; Lab SOP, Section 7; Standards verification
- Primary mass standards; 1/yr; 0.025 mg; Lab SOP, Section 7; Primary standards verification

Precision
- Duplicate filter weighings; One per weighing session, one carried over to next session;
  ±15 μg change between weighings; Lab SOP, Section 8; Weighing repeatability/filter
  stability
- Interlaboratory comparisons(a); 1/yr; Advisory limits set by NAREL; Lab SOP, Section 11;
  Between-laboratory repeatability

(a) NAREL administers inter-laboratory comparisons. EPA reports results annually in the
Laboratory Comparison Study of Gravimetric Laboratories Performing PM2.5 Filter Weighing for
the PM2.5 Performance Evaluation Program and Tribal Air Monitoring Support (available at
http://www.epa.gov/ttn/amtic/pmpep.html). The advisory limits are three-sigma limits that
were derived from previous gravimetric PE studies administered by NAREL.


14.1.1 Calibrations

Calibration is the comparison of a measurement standard or instrument with another standard or
instrument to report, or eliminate by adjustment, any variation (deviation) in the accuracy of the
item being compared.  The PEP calibration also ensures that the bias in flow rate among the PEP
samplers is minimized.

For the PEP, calibration activities follow a two-step process:

    •   Step 1. Certifying the calibration standard and/or transfer standard against an
       authoritative standard
    •   Step 2. Comparing the calibration standard and/or transfer standard against the routine
       sampling/analytical instruments.

Calibration requirements for the critical field and laboratory equipment are found in Tables 14-1
and 14-2, respectively; the details of the calibration methods are included in Element 16.0,
Instrument Calibration and Frequency, and in the PEP Field and Laboratory SOPs.

14.1.1.1  Calibration  Evaluation

Calibration data will be compared against the acceptance criteria for the applicable
standards (Table 14-1). The accuracy of verification/calibration checks is evaluated on a
single-check (quarterly) basis as the percent difference between the sampler's indicated
value and the value of the authoritative standard.
14.1.1.2  Blanks

Lot blanks. A shipment of 46.2-mm filters will be sent from EPA to the weighing laboratory.
The shipment may contain many filter lots, which are labeled on each filter box (box of 50
filters). A representative number of filters in each lot must be tested to determine the length of
time that it takes for the lot to stabilize. Three filter boxes will be randomly selected from the lot,
and three filter lot blanks will be randomly chosen from each box (nine filters total). These lot
blanks will be subjected to the conditioning/pre-sampling weighing procedures. The blanks will
be weighed every 24 hours for a minimum of 1 week to determine the length of time that it takes
to condition filters (see PEP Laboratory SOP, Section 6).

Lot exposure blanks. Similar to lot blanks, lot exposure blanks are used to determine whether a
specific set of filters scheduled to be conditioned at one time are stable for pre-weighing (see
PEP Laboratory SOP, Section 6).

Field blanks. These provide an estimate of total measurement system contamination. By
comparing information from laboratory blanks against the field blanks, the contamination from
field activities can be assessed. For a new SLT program (i.e., <2 years old), the frequency for
field blanks is one per FRM/FEM audit. For all others, one field blank should be performed per
FS per trip. A trip may include audits for more than one FRM/FEM sampler. The FS will
determine the site where the field blank audit will be performed, unless otherwise directed by his
or her Regional WAM/TOPO/DOPO (such as when a problem is identified at a particular site).
Details about using field blanks can be found in PEP Field SOP, Section 6.

Trip blanks. These are used to measure the possible contamination to filters during
transportation to and from sampling locations. Trip blanks provide a frame of reference in case
field blanks exhibit a mass gain that is higher than the tolerance levels. Trip blanks will be
performed at a frequency of 10% of all filters, as determined by the weighing laboratory (i.e.,  1
per every 10 filters shipped out, rounded up). So if the laboratory sends out 1 to 10 filters, then
one trip blank should be included in the shipment.  If the laboratory ships out 11 to 20 filters, then
two trip blanks  should be included. The FS will determine with which trip to use the trip blank
filter(s), in a manner similar to the field blanks. However, if the FS receives more than one trip
blank in a shipment, then he or she must make sure that only one trip blank is carried per trip.
Details about using the trip blanks can be found in PEP Field SOP, Section 6.
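The 10%-rounded-up trip-blank rule described above can be expressed as a small helper; this is an illustrative sketch, not part of the SOPs, and the function name is hypothetical.

```python
# Sketch: the trip-blank frequency rule (one per every 10 filters shipped,
# rounded up), as described in the text. The function name is illustrative.
import math

def trip_blanks_required(filters_shipped):
    """Number of trip blanks to include with a shipment of filters."""
    if filters_shipped <= 0:
        return 0
    return math.ceil(filters_shipped / 10)
```

For example, a shipment of 1 to 10 filters needs one trip blank, and a shipment of 11 to 20 filters needs two, matching the text above.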

Laboratory blanks. These provide an estimate of contamination occurring at the  weighing
facility. Laboratory blanks should be performed at a frequency of one per post-sampling
weighing session. The LA must weigh an adequate amount of filters during the pre-sampling
weighing sessions to allow for the post-sampling weighing requirement. Details about using the
laboratory blanks can be found in PEP Laboratory SOP, Section 8.

14.1.1.3  Blank Evaluation

The PEP will include, at a minimum, one field and one laboratory blank in each weighing
session sample batch. When the shipment of trip blanks and audit event filters arrives at the
weighing laboratory, they will be post-weighed. A batch is defined in Section 14.2. The
following statistics will be generated for data evaluation purposes:

Difference for a single check (d). The difference, d, for each check is calculated using the
following equation, where X represents the mass of the original pristine filter (pre-sampling)
and Y represents the mass of the blank filter upon return to the laboratory (post-sampling).

                                       d = |Y − X|

Percent difference for a single check (d%). The percent difference for each check is
calculated as d% = [(Y − X)/X] × 100.

If the mean difference of the field blanks or the laboratory blanks in a weighing session is
>30 μg or >15 μg, respectively, then all of the samples in the weighing session will be
reweighed. Before reweighing, the laboratory balance will be checked for proper operation. If
the mean differences of either the field or laboratory blanks are still out of the acceptance
criteria, all samples within the weighing session will be flagged with the appropriate flag
(failed field blank [FFB] or failed laboratory blank [FLB]), and efforts will be made to


determine the source of contamination. In theory, field blanks would be expected to contain
more contamination than laboratory blanks; therefore, if the field blanks are outside of the
criteria but the laboratory blanks are acceptable, then weighing can continue on the next batch of
samples while field contamination sources are investigated. If the mean difference of the
laboratory blanks is >20 μg and two or more of the individual differences were >15 μg, then the
laboratory weighing of PEP filters will be suspended until the source of the instability is
identified and corrected. If the resolution requires more than 2  weeks, then the back-up
laboratories will be notified and operations will be temporarily shifted to the back-up laboratory
until the issue is satisfactorily resolved. The LA will alert the PEP Laboratory Manager about the
problem. The problem and solution will be reported and appropriately filed under response and
corrective action reports (PEP/108-025-01-01-237.1, see Element 9.0, Documentation and
Records).
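The laboratory-blank decision rules above (reweigh on a failed mean, suspend weighing when the mean exceeds 20 μg with two or more individual differences above 15 μg) can be sketched as follows. The thresholds come from the text; the function shape and return values are illustrative, not part of the SOPs.

```python
# Sketch of the laboratory-blank evaluation logic described above. Thresholds
# (15 ug mean acceptance; 20 ug mean plus >=2 individual differences >15 ug
# for suspension) are from the text; names and structure are illustrative.

def evaluate_lab_blanks(diffs_ug):
    """Classify laboratory blank mass differences (ug) for one weighing session.

    Returns one of: "ok", "reweigh", "suspend".
    """
    mean_diff = sum(diffs_ug) / len(diffs_ug)
    if mean_diff <= 15.0:
        return "ok"                  # within acceptance criteria
    # Mean difference out of criteria: samples in the session are reweighed
    # (after a balance check) and flagged FLB if still failing.
    n_high = sum(1 for d in diffs_ug if d > 15.0)
    if mean_diff > 20.0 and n_high >= 2:
        return "suspend"             # stop PEP weighing; find the source
    return "reweigh"
```

A "suspend" result corresponds to halting laboratory weighing of PEP filters until the source of the instability is identified and corrected.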

Contamination of trip blanks would be expected to fall between that of laboratory blanks and
field blanks. If a trip blank acquires a mass gain that is >30 μg, then it should be compared
to the mass gain of the coincident field blank to determine whether there was some unique
problem in transportation. If the field blanks are low (<30 μg), then the shipping and
transportation are suspect and should be investigated for possible invalidation of all events
associated with filters that were shipped with the trip blank. If the field blanks are high
(>30 μg), then further investigation is necessary to determine the source of the problem; a
problem may exist with sample handling. After investigation, the appropriate sample may be
flagged (failed trip blank [FTB] or FFB).

Laboratory, trip, and field blanks will be control charted (see Section 14.3). The percent
difference calculation (d%) is used for control-charting purposes and can be used to
determine equilibrium status.

14.1.2 Precision Checks

Precision is the measure of mutual agreement among individual measurements of the same
property, usually under prescribed similar conditions.  To meet the DQOs for precision, the PEP
must ensure the entire measurement process is within  statistical control. The following two types
of precision measurements, which are further discussed in the following  sections, will be made
in the PM2.5 PEP:

    •   Collocated monitoring
    •   Filter duplicates.

14.1.2.1  Collocated Monitoring

To evaluate the total measurement precision  of the PEP fleet of samplers, collocated monitoring
will be implemented. Twice per year (semi-annually), all of the PEP samplers used by a single
FS or Region must be collocated and run at the same location over the same time period. These
are often referred to as "parking lot collocations." These data will also be analyzed to identify



individual samplers that habitually operate outside of the performance parameters
demonstrated by the bulk of the Regional PEP fleet with which they are associated (see
Section 14.1.3.1 for more information). SLT agencies that conduct their own PEP field
operations will also
bring their samplers and participate in at least one of the semi-annual events. If an SLT agency
chooses to participate in only one collocation event with the EPA Region in a year,  then it must
conduct one other collocation event that involves at least four samplers and meets all other
collocation criteria such as the number of sampling days, sampling time, and sampler spacing.
These data will be reported to the PEP data support contractor for inclusion into the PEP's
QA/QC analyses and reports.

Evaluation of collocated data. Collocated measurement pairs are selected for use in the
precision calculations only when both measurements are at least 3 μg/m³. The following
algorithms will be used to evaluate collocated data.

Percent difference for a single check (dᵢ). For particulate sampler j, the
individual percent differences (dᵢ) recorded during the semi-annual collocation study are pooled
using 40 CFR Part 58, Appendix A, Equation 11 (the following equation), where n is the number
of measurement pairs from collocated samplers. The coefficient of variation (CV) is the precision
estimator for PEP regional "parking lot" collocation studies, and χ²(0.1, n−1) is the 10th percentile of
a chi-squared distribution with n−1 degrees of freedom. The factor of two in the denominator
adjusts for the fact that each dᵢ is calculated from two values with error.
                 CV = √[(n·Σdᵢ² − (Σdᵢ)²) / (2n(n−1))] × √[(n−1) / χ²(0.1, n−1)]
In a classical sense, the precision of a single sampler cannot be estimated without the ability to
introduce a known concentration in a controlled environmental testing chamber in which the
sampler could make multiple measurements. Consequently the PEP relies upon its collocation
studies to characterize the relative precision, relative accuracy, and relative bias of a single
sampler compared to the other samplers in the same studies. The latter two are discussed in
Section 14.1.3.
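The pooled CV calculation described above can be sketched in code. This is an illustrative
reading of 40 CFR Part 58, Appendix A, Equation 11 as summarized in this section, not the PEP
data system's implementation; the function name `collocation_cv` and the convention of
computing each percent difference dᵢ against the pair average are assumptions.

```python
import math

def collocation_cv(primary, collocated, chi2_10th):
    """Pooled coefficient of variation for collocated pairs, following the
    summary of 40 CFR Part 58, Appendix A, Equation 11 given above.

    primary, collocated : paired 24-h PM2.5 concentrations (ug/m3); pairs
        are used only when both measurements are at least 3 ug/m3.
    chi2_10th : 10th percentile of a chi-squared distribution with n-1
        degrees of freedom (from a statistical table or library).
    """
    # Keep only valid pairs (both measurements at least 3 ug/m3)
    pairs = [(x, y) for x, y in zip(primary, collocated) if x >= 3 and y >= 3]
    n = len(pairs)
    # Percent difference d_i for each pair, relative to the pair average
    d = [100.0 * (y - x) / ((x + y) / 2.0) for x, y in pairs]
    sum_d = sum(d)
    sum_d2 = sum(di * di for di in d)
    # Factor of two in the denominator: each d_i carries error from two values
    cv = math.sqrt((n * sum_d2 - sum_d ** 2) / (2.0 * n * (n - 1)))
    # Upper bound using the 10th percentile of the chi-squared distribution
    return cv * math.sqrt((n - 1) / chi2_10th)
```

If the primary and collocated measurements agree exactly, the pooled CV is zero; any
disagreement produces a positive CV that grows with the spread of the percent differences.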

Precision of a single sampler—semi-annual basis. For PEP sampler j, the individual CV
values (represented by CVᵢ) that are produced during a semi-annual collocation study are
aggregated using the following equation, where nᵢ is the number of CV calculations for that
particular sampler made during the collocation study.
Corrective action: Single monitor. A sampler with a CV >10% will be flagged (failed
collocated sample [PCS]) and reweighed. If the calculated CV is 10%-20%, then the FS will be
alerted to the problem. If the CV is >20%, then all the primary sampler data will be flagged
(PCS) from the last precision check and corrective action will be initiated. CVs and percent
differences will be control charted to determine trends (Section 14.2). The LA will alert the PEP
Laboratory Manager and the EPA Regional WAM/TOPO/DOPO about the problem as soon as
possible. The report will be appropriately filed under response and corrective action reports
(PEP/108-025-01-01-237.1, see Element 9.0, Documentation and Records).

14.1.2.2 Duplicate Laboratory Measurements

During laboratory pre- and post-weighing sessions, the first routine sample filter will be weighed
a second time at the end of the weighing session (see PEP Laboratory SOP, Section 8). The
difference (d) and percent difference will be calculated from these measurements. The
difference in the weights of the filter must be <15 μg. Failure may be due to transcription errors,
microbalance malfunction, or that the routine samples have not reached equilibrium. Other QC
checks (balance standards and laboratory blanks) may be used to eliminate microbalance
malfunction. If the duplicate does not meet the criteria, then the  second and third routine samples
will be selected and reweighed as second and third duplicate checks. If either of these samples
fails the acceptance criteria and the possibility of balance malfunction and transcription errors
have been eliminated, all samples in the batch will be equilibrated for another 12 hours and
reweighed. Corrective actions will continue until duplicate weights for the batch meet
acceptance criteria.
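The duplicate-weighing acceptance criterion above can be expressed as a simple check. This is
a minimal sketch, not the laboratory data system; the function name `duplicate_check` is
hypothetical, and weights are assumed to be read from the microbalance in milligrams.

```python
def duplicate_check(first_weigh_mg, second_weigh_mg, limit_ug=15.0):
    """Return True when a duplicate weighing agrees with the original
    weighing within the 15 ug acceptance criterion described above.
    Weights are in mg (microbalance readout); the limit is in ug."""
    diff_ug = abs(first_weigh_mg - second_weigh_mg) * 1000.0
    return diff_ug < limit_ug
```

A filter weighed at 150.010 mg and re-weighed at 150.020 mg differs by 10 μg and passes; a
20 μg difference fails and triggers the escalation steps described above.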

After a post-weighing session is completed, the routine sample used as the batch duplicate is
placed with the next batch. This filter should not be weighed as one of the first three filters in the
next batch. These are sometimes referred to as "previous batch duplicates" and serve as
indicators for the stability of the conditioning environments and the consistency of the
microbalances between weighing sessions. The difference between these filter weights must be
<15 μg. If the difference is >15 μg, then select two additional routine filters from the previous
batch and reweigh those. If there continues to be a problem, then review the weighing session
QC checks and consult with the PEP Laboratory Manager.

14.1.3 Relative Accuracy and Relative Bias

Accuracy is defined as the degree of agreement between an observed value and an accepted
reference value and includes a combination of random error (precision) and systematic error
(bias). The following three accuracy checks are implemented in the PEP:

    •   Collocated monitors
    •   Flow rate audits
    •   Balance checks.

14.1.3.1  Collocated Monitors

Although the collocated PEP monitors are primarily used for evaluating and controlling
precision, the practice can also be used to determine relative accuracy or relative bias among
different models of PEP samplers. Beginning in 2008, EPA mandated that each Region
collocate its entire fleet of samplers for a series of three sequential sampling events on a semi-
annual basis. In this way, each sampler's performance will be compared to that of every other
sampler. By using 40 CFR Part 58, Appendix A, Equation 10, to determine the relative percent
difference, trends or bias of a single instrument can be tracked without knowing the true value.
The PEP now uses the BGI PQ200A as the only sampler to run side-by-side with FRM/FEM
samplers, except at altitudes >7,000 feet. EPA is aware that the BGI PQ200A is incapable of
performing satisfactorily at altitudes >7,000 feet. EPA Regions 8, 9, and 10 contain monitoring
sites with elevations where this issue may arise, and a limited number of sites exist or may be up-
fitted for PM2.5 FRM/FEM sampling in the future. The portable versions of the Andersen RAAS
100 and the R&P Partisol Model 2000 PM2.5 FEM audit sampler have been successfully used at
higher altitudes throughout the PEP's history. A potentially serious issue exists because neither
manufacturer continues to support these models. To the extent that the PEP can maintain the
serviceability of these models, these samplers will be used to conduct high-altitude PEP audits
and included in the Regional collocation studies. EPA acknowledges that the collocations will
only characterize performance at lower elevations, but the  small number of high-altitude sites
does not warrant the expense of developing additional high-altitude samplers at this time.
Regions that use these samplers will include them in routine parking lot collocations for bias and
precision evaluations at lower elevations.

Relative bias of a single sampler—semi-annual basis. Several QA criteria will be used to
screen anomalous measurements, as compared to the other data obtained from each sampling
event during a collocation study (see Appendix B). Normalized paired differences (Nij,d) will
then be calculated for all samplers that participate in a Regional collocation study using the
following equation, where n is the number of monitors in the collocation study, i and j represent
different individual monitors in the study, d represents a specific day in a collocation event, Dij,d
represents the paired differences among monitors for each day during the event, and x̄d is the

daily mean. After normalization, the differences are considered comparable among individual
studies conducted under differing atmospheric conditions.
                       Nij,d = Dij,d / x̄d,   where Dij,d = xi,d − xj,d

The collocation study data will be evaluated to determine the relative bias of individual samplers
when compared to all samplers in a study. Histograms of the resulting normalized paired
differences can be used to infer the expected among-monitor precision, which is based on
historical data collected within the PEP. As discussed in Section 14.1.2.1, the "true" precision of
any given monitor is unachievable. However, the among-monitor precision of the samplers that
participate in a collocation study provides for a programmatic review of the general tendencies,
or relative bias, of the reference monitors to obtain consistent results.
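The normalized-paired-difference calculation described above (the paired difference between two
monitors on a given day, divided by that day's mean across all monitors) can be sketched as
follows. The helper `normalized_paired_differences` and its input layout are illustrative
assumptions, not the PEP data support contractor's implementation.

```python
from itertools import combinations

def normalized_paired_differences(daily_readings):
    """Normalized paired differences for one collocation event, as
    described above: N[i,j,d] = (x[i,d] - x[j,d]) / daily mean.

    daily_readings maps day -> list of concentrations, one entry per
    monitor (list index = monitor). Returns (i, j, day, N) tuples."""
    results = []
    for day, xs in daily_readings.items():
        mean_d = sum(xs) / len(xs)          # daily mean across all monitors
        for i, j in combinations(range(len(xs)), 2):
            d_ij = xs[i] - xs[j]            # paired difference D[i,j,d]
            results.append((i, j, day, d_ij / mean_d))
    return results
```

Because each difference is scaled by the daily mean, results from events run under very
different ambient concentrations can be pooled into a single histogram for review.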
Corrective action. Individual samplers that have been identified as having notable differences
will be investigated. If it appears that there is a significant problem with a particular sampler,
then corrective action will be initiated. The process will include eliminating uncertainties that
may arise during the filter handling, transport, and laboratory stages to confirm that the
instrument is truly the cause of the bias. Corrective actions taken on the instrument will include
temperature and flow rate verifications, additional trial runs in which the sampler's flow CV is
strip-charted for indications of controller problems, and complete maintenance activities.
Additional corrective action could include  a request for vendor servicing.

If the findings of the investigation reveal potential error in historical audit results, then the
sample results will be flagged in the FED, and any data posted to the AQS will be nullified. If
possible, additional PEs may be scheduled to meet PEP completeness requirements. The EPA
National PEP Project Leader and the EPA Regional WAM/TOPO/DOPOs will be notified of the
problem as soon as possible. Corrective action reports will be appropriately filed under the
heading "PEP/108-025-01-01-237.1" (see Element 9.0, Documentation and Records).

14.1.3.2  Flow Rate

The PEP FS will implement a flow rate verification with each setup. Details of the
implementation aspects of the audit are included in PEP Field SOP, Section 5. The verification is
implemented by measuring the analyzer's normal operating flow rate using a certified flow rate
transfer standard.  The audit (actual) flow rate and the corresponding flow rate indicated or
assumed by the sampler are reported. The procedures used to calculate measurement uncertainty
are described below.
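The comparison between the sampler's indicated flow rate and the audit flow rate measured with
the certified transfer standard can be sketched as a percent difference. The conventional form
(difference expressed relative to the audit value) is assumed here; the PEP Field SOP is the
authoritative procedure, and the function name is illustrative.

```python
def flow_rate_percent_difference(indicated_lpm, audit_lpm):
    """Percent difference of the sampler's indicated flow rate from the
    audit (actual) flow rate measured with the certified flow rate
    transfer standard. Both values are in liters per minute."""
    return 100.0 * (indicated_lpm - audit_lpm) / audit_lpm
```

At the PM2.5 design flow of 16.67 Lpm, an indicated reading equal to the audit value yields 0%,
and the result can be screened against the ±4% single-point verification limit in Table 16-1.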

Accuracy of a single sampler—single check (quarterly) basis
warm up and may require checking the balance weights many times. If the acceptance criteria
are still not met, the LA will be required to verify the working standards against the primary
standards. Finally, if it is established that the balance does not meet acceptance criteria for both
the working and primary standards and other troubleshooting techniques fail, the vendor service
technician (see Element 15.0, Instrument/Equipment Testing, Inspection, and Maintenance
Requirements) will be called to perform corrective action.

If the balance check fails acceptance criteria during a weighing session, then the QC check
samples will be reweighed. If the balance check continues to fail, then troubleshooting, as
previously discussed, will be initiated. The filter weights from the sample batch will be recorded
and flagged (failed internal standard [FIS]); however, the filters will remain in the conditioning
environment to be reweighed when the balance meets the acceptance criteria. The data
acquisition system will flag any balance check outside the acceptance criteria as FIS.

14.2  Sample Batching—QC Sample Distribution

To ensure that the PEP includes all types of QC samples within a weighing session, the PEP will
use the concept of sample batches, which will  consist of balance checks, field blanks, laboratory
blanks, trip blanks (if available), batch duplicates, and previous batch duplicates, as indicated in
Figure 14-2.

14.2.1 Sample Distribution

QC samples need to be interspersed within the batch to provide data quality information
throughout the batch weighing session.
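The batch layout can be sketched as follows, mirroring the row order of Figure 14-2: balance
checks open and close the session, the first routine filter is re-weighed as the batch duplicate,
and the previous batch's duplicate is carried forward. The helper `build_weighing_batch` and its
labels are illustrative assumptions, not the binding laboratory SOP.

```python
def build_weighing_batch(routine_ids, previous_batch_duplicate=None):
    """Assemble a weighing-session batch order that intersperses QC
    samples with routine filters. Labels follow Figure 14-2: RO =
    routine, BD = batch duplicate, PD = previous batch duplicate,
    QC = balance check with a certified standard."""
    batch = [("QC", "100 mg balance check"), ("QC", "200 mg balance check")]
    if previous_batch_duplicate:
        # Carried over from the prior batch; not among the first filters
        batch.append(("PD", previous_batch_duplicate))
    batch += [("RO", fid) for fid in routine_ids]        # routine filters
    if routine_ids:
        batch.append(("BD", routine_ids[0]))             # re-weigh first filter
    batch += [("QC", "100 mg balance check"), ("QC", "200 mg balance check")]
    return batch
```

Ordering the QC samples this way provides data quality information at the start, middle, and
end of the weighing session rather than at a single point.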

PEP Filter Weighing Data Entry Form (BAT-01)

Header fields: Batch Type (circle): PRE / POST;  Batch No.;  Date;  Analyst Initials;
Mean Temperature for the past 24 hours and SD;  Mean Relative Humidity for the past
24 hours and SD.

Column headings: Sample | Filter ID | Filter Type (RO/LB/FB, CO/BD/PD) | Cassette ID |
Weight 1 (xxx.xxx mg) | Weight 2 (xxx.xxx mg) | Flag

Row layout: QC1 (100 mg) and QC2 (200 mg) balance checks open the session, followed by
15 routine filter rows, Duplicate 1 (BD), Duplicates 2 and 3 (DU), and closing QC1
(100 mg) and QC2 (200 mg) balance checks.

                  Figure 14-2. PEP Filter Weighing Data Entry Form

14.3   Control Charts

Control charts will be used extensively in the PEP because they provide a graphical means of
determining whether various phases of the measurement process are within control limits. The
PEP will use property charts, which graph single measurements of a standard or a mean of
several measurements. Table 14-3 indicates which QC data will be control charted. The control
charts will be used as an "early warning system" to evaluate trends in precision and bias. These
charts will be discussed in the annual and 3-year PEP QA reports (Elements 6.0, Project/Task
Description., and 21.0, Reports to Management, respectively). They will be appropriately filed
(PEP/108-025-01-01-237.1).
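A property chart of the kind described above can be sketched with the common mean ± 3-sigma
convention for control limits. This is an illustrative sketch only; the PEP's actual charting
rules and limits are governed by its QA procedures, and the function names are hypothetical.

```python
from statistics import mean, stdev

def control_limits(history, k=3.0):
    """Center line and lower/upper control limits for a property chart
    of individual QC measurements (e.g., balance-check differences),
    using the mean +/- k * sample standard deviation convention."""
    center = mean(history)
    sigma = stdev(history)
    return center - k * sigma, center, center + k * sigma

def out_of_control(value, history, k=3.0):
    """Flag a new QC value that falls outside the control limits,
    serving as the 'early warning' trigger described above."""
    lo, _, hi = control_limits(history, k)
    return value < lo or value > hi
```

Plotting each new QC value against these limits makes drift visible long before a hard
acceptance criterion is violated, which is the purpose of the early warning system.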

                               Table 14-3. Control Charts

QC Check                                        Plotting Technique
Laboratory conditioning environment             Daily mean and standard deviation
  (temperature and relative humidity)
Lot, laboratory, field, and trip blanks         Difference of pre- and post-weighed values
Batch stability (post-sample)                   Individual weight differences from pre- and
                                                  post-weighing sessions; also, days between
                                                  weighings
Duplicate filter weighings (batch duplicates    Percent difference for each pair
  and previous batch duplicates)
Balance checks (100-mg and 200-mg standards)    Individual weight differences between balance
                                                  and certified weights
Leak check                                      Difference between ending pressure and
                                                  beginning pressure
Barometric pressure check                       Difference between standard and sampler
Ambient temperature check                       Difference between standard and sampler
Filter temperature check                        Difference between standard and sampler
Flow rate check                                 Percent difference between standard and sampler
Collocated monitoring                           CV of all samplers per semi-annual basis
                                                  (aggregated at the regional and national
                                                  levels); median of normalized paired
                                                  differences (aggregated at regional and
                                                  national levels)
References

1. Taylor, J.K. 1987. Quality Assurance of Chemical Measurements. Lewis Publishers: Chelsea,
   MI. 328 pp.

2. U.S. EPA. 2006. Revisions to Ambient Air Monitoring Regulations. 40 CFR Parts 53 and 58.
   Federal Register 71(200):61236-61327. October 17.

-------
                                                                       Project: PEP QAPP
                                                                        Element No.: 15.0
                                                                          Revision No.: 1
                                                                          Date: 3/6/2009
                                                                      	Page 1 of 5
             15.0  Instrument/Equipment Testing, Inspection,
                       and Maintenance Requirements

The purpose of this element in the PEP QAPP is to discuss the procedures used to verify that all
instruments and equipment are maintained in sound operating condition and are capable of
operating at acceptable performance levels. All instrument inspection and maintenance activities
are documented and filed under PEP/301-093-006.3. See Element 9.0, Documentation and
Records, for document filing and record details.

15.1   Testing

All PM2.5 samplers used in the PEP will be designated FRM monitors that have been certified as
such by EPA; therefore, the samplers are assumed to be of sufficient quality for the data
collection operation. Testing of such equipment is accomplished by EPA through the procedures
described in 40 CFR Part 53.¹ Annually, prior to deployment, the FSs within each Region will
assemble and run all the samplers at the Regional site (full collocation). The FSs will perform
external and  internal leak checks, as well as temperature, time, pressure, and flow rate single-
point verification checks. If any of these checks are out of specification (see Table 14-1 in
Element 14.0, Quality Control Requirements), then the FS or WAM/TOPO/DOPO will initiate
troubleshooting procedures (see Field SOP Section 5). If the problem cannot be located and the
sampler continues to fail the verification checks, then the sampler cannot be used for the PE. The
FS should use an alternate sampler, and the sampler should be returned to the laboratory for
maintenance. If the sampling instrument meets the acceptance criteria, it will be assumed to be
operating properly. If a new sampler is acquired for use in the PEP, then it should be subject to a
collocation with at least two other samplers that are believed to be performing satisfactorily. The
results should comply with acceptance criteria for a routine collocation study. If new upgraded
FRM sampler hardware is introduced for service (e.g., a very  sharp-cut cyclone replaces the
WINS  impactor), the same type of testing will be conducted. A more detailed testing protocol
will be furnished by the National  PEP Project Leader. These tests will be properly documented
and filed under PEP/301-093-006.3.

15.2   Inspection

Inspection of various equipment and components can be subdivided into the laboratory and field
activities.

15.2.1  Inspection in Weighing Room

There are several items that need  routine inspection in the weighing room. Table  15-1 details the
items to inspect and summarizes how to appropriately document the inspection.

                  Table 15-1. Inspections in the Weigh Room Laboratory

Weighing room temperature
   Inspection frequency:   Daily
   Inspection parameter:   20°C-23°C
   Action if item fails:   (1) Check heating, ventilation, and air conditioning (HVAC)
                           system; (2) call service provider that holds maintenance agreement
   Documentation:          (1) Document in the weighing room log book; (2) notify the
                           PEP Laboratory Manager

Weighing room relative humidity
   Inspection frequency:   Daily
   Inspection parameter:   30%-40%
   Action if item fails:   (1) Check HVAC system; (2) call service provider that holds
                           maintenance agreement
   Documentation:          (1) Document in the weighing room log book; (2) notify the
                           PEP Laboratory Manager

Dust in weighing room
   Inspection frequency:   Monthly
   Inspection parameter:   Use glove and visually inspect
   Action if item fails:   Clean weigh room
   Documentation:          Document in weighing room log book
15.2.2 Inspection of Field Items

There are several FRM sampler parts and filter cassette parts to inspect in the field operation's
maintenance area and in the field before and after a PM2.5 sample has been taken. Table 15-2
details these inspections.

                           Table 15-2. Inspection of Field Items

Sample downtube
   Inspection frequency:   Every site visit
   Inspection parameter:   Visible particulate
   Action if item fails:   Clean with a clean dry cloth
   Documentation:          Document in the log book

WINS impactor well
   Inspection frequency:   Every site visit
   Inspection parameter:   "Cone" shape of particulate on impactor well
   Action if item fails:   Replace impactor well filter (including new impactor oil)
   Documentation:          Document in the log book

Very sharp-cut cyclone
   Inspection frequency:   Every 10 sampling events or after a dust storm or heavy air
                           pollution episode
   Inspection parameter:   Collection reservoir laden with particulate matter >2.5 μm
   Action if item fails:   Clean reservoir
   Documentation:          Document in the log book

Rain collector
   Inspection frequency:   Every site visit
   Inspection parameter:   Condensate of sufficient volume to pour
   Action if item fails:   Empty
   Documentation:          Document in the log book

O-rings
   Inspection frequency:   Every site visit
   Inspection parameter:   Any damage
   Action if item fails:   Replace
   Documentation:          Document in the log book

Filter cassettes
   Inspection frequency:   After each sample run
   Inspection parameter:   Visible particulate matter
   Action if item fails:   Check downtube and WINS impactor
   Documentation:          Document in the log book

Cassette seals
   Inspection frequency:   Each sample
   Inspection parameter:   Clean and smooth
   Action if item fails:   Clean with a clean dry cloth or replace as needed
   Documentation:          Document when replaced

Battery
   Inspection frequency:   Every 6 months
   Inspection parameter:   Decrease in voltage
   Action if item fails:   Replace
   Documentation:          Document in the log book
15.3   Maintenance

There are many items that need maintenance attention in the PEP. This section describes those
items according to whether they are weighing room items or field items.

15.3.1  Weighing Room Maintenance Items

The successful execution of a preventive maintenance program for the weighing laboratory will
go a long way towards the success of the PEP. Weigh laboratory preventive maintenance is
handled through the use of service agreements. The weighing laboratory has entered into
maintenance agreements with the vendors who developed the heating, ventilation, and air
conditioning (HVAC) system. Similarly, preventive maintenance for the microbalances is
performed by the vendor's service technician (e.g., Sartorius) and is scheduled to occur at initial
setup and every 6 months thereafter. In the event that there is a problem with a microbalance that
cannot be resolved within the laboratory, the service technician can be paged. The laboratory
will maintain a spare microbalance in case the balance in use should fail.

Service agreements for both the HVAC system and the microbalance will be renewed each year. In the
event either company's service agreement is not renewed, a new service provider will be
selected and a contract will be put in place.

Table 15-3 details the weighing laboratory maintenance items, how frequently they will be
replaced, and who will be responsible for performing the maintenance.

               Table 15-3. Preventive Maintenance in Weighing Laboratories

Item                                       Responsibility      Frequency

General Laboratory Maintenance/Cleaning
Table cleaning                             LA                  Every day
Overall laboratory                         LA                  Once a month
Cassette ethanol wiping/washing            LA                  After each use
Adhesive-coated floor mats                 LA                  Weekly or when soiled to a point
                                                                 of non-performance
HEPA filter change                         LA                  Once a month
Polonium-210 strip change                  LA                  Every 6 months
Polonium-210 strip cleaning                LA                  Monthly or as shown by blank data

Microbalance
Cleaning                                   LA                  Every 6 months
Service cleaning/calibration               Service provider    Twice a year
Calibration/verification                   LA                  Every sample weighing

Temperature/Humidity Readers
Calibration/verification                   LA                  Once every 3 months

Laboratory Computers
Computer backup                            LA                  Weekly, at minimum; automated
                                                                 daily backup is preferred
Computer virus check                       LA                  Weekly, with automated on-access
                                                                 scans and on-delivery e-mail
                                                                 scans
PEP database compaction                    LA                  Monthly
Computer system preventive maintenance     PC support          Yearly
  (e.g., archive old files, compress         personnel
  hard drive, inspect)
Maintenance (e.g., backup) of network file shares used to store the FED is performed by EPA
contractor(s) according to policies established by EPA's Office of Administration and Resource
Management.

15.3.2  Field Maintenance Items

There are many items associated with appropriate preventive maintenance of a successful field
program. Table 15-4 details the appropriate maintenance checks of the PM2.5 samplers and their
frequency. Field SOP Section 6 provides procedures for cleaning some of the more important
pieces of field equipment.

                     Table 15-4. Preventive Maintenance of Field Items

Frequency                     Maintenance Item
Every visit                   1. Inspect and, if necessary, empty water collector bottle
                              2. Clean and/or change-out WINS impactor well
                              3. Inspect visible O-rings in the flow path
Every 10 sampling events      1. Clean very sharp-cut cyclone (this requirement may be
  or as needed                   fulfilled by a quarterly cleaning)
Quarterly (every 3 months)    1. Clean sampler inlet surfaces
                              2. Clean main (first stage) size-selective inlet (PM10 head)
                              3. Clean impactor housing (if applicable) and impactor jet
                                 surfaces
                              4. Clean interior of sampler unit
                              5. Clean very sharp-cut cyclone
                              6. Check condition of sampler transport containers
                              7. Clean sampler downtube
                              8. Inspect cooling air intake fan(s) and filter; replace if
                                 necessary
                              9. Inspect all O-rings, visible and hidden, and reapply vacuum
                                 grease as needed
                              10. Inspect vacuum tubing, tube fittings, and other connections
                                  to pump and electrical components; service if necessary
References

The following document was used in the development of this element:

1.  U.S. EPA. 1997. National Ambient Air Quality Standards for Particulate Matter—Final Rule.
    40 CFR Part 53. Federal Register 62(138):38651-38760. July 18.

-------
                                                                         Project: PEP QAPP
                                                                          Element No: 16.0
                                                                            Revision No: 1
                                                                            Date: 3/6/2009
                                                                        	Page 1 of 5
                16.0  Instrument Calibration and Frequency

This element of the PEP QAPP discusses the calibration procedures that will be used for
instruments involved in the environmental measurements. Table 16-1 lists the instruments that
require verification and calibration, the required frequencies of these activities, the acceptance
criteria for these activities, and the PEP Field and Laboratory SOPs that describe the procedures
for all calibration activities.

Calibrations that involve instrument adjustments should only be performed when it is evident
that calibration is required; therefore, the PEP uses a three-phase approach to calibration, which
involves the following:

   •   Single-point verification—These frequent verifications, which do not include instrument
       adjustments, ensure that the calibration is within acceptance limits.
   •   Calibration—This occurs when a single-point verification fails. Instrument adjustment
       occurs at this point.
   •   Post-calibration single-point verification—Each calibration is followed by a subsequent
       single-point verification to confirm the adjustment.
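The verify-then-calibrate decision flow above can be sketched as follows. This is an
illustrative sketch only: `calibrate_fn` is a stand-in for the applicable SOP calibration
procedure, and the tolerance parameter and return labels are assumptions, not PEP terminology.

```python
def verification_then_calibrate(measured, standard, verify_tol_pct, calibrate_fn):
    """Tiered approach described above: compare a single-point
    verification against its acceptance limit; only on failure perform
    a calibration (instrument adjustment), then repeat the single-point
    verification. calibrate_fn returns the post-adjustment measurement."""
    def within(m):
        # Percent difference from the standard vs. the acceptance limit
        return abs(m - standard) / standard * 100.0 <= verify_tol_pct
    if within(measured):
        return "verified"                 # no adjustment needed
    adjusted = calibrate_fn()             # adjust, then re-verify
    return "calibrated" if within(adjusted) else "failed"
```

This structure keeps adjustments rare: an instrument reading within its acceptance limit is
left alone, which avoids introducing drift through unnecessary calibration.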

                          Table 16-1. Instrument Calibrations
Type
Frequency
Acceptance Criteria
PEP SOP
Laboratory Verification
Mass standards verification
Microbalance verification
Temperature verification
Relative humidity verification
1 /quarter
Every weigh session
1 /quarter
1 /quarter
±2/j,g
Manufacturer's specifications
±2°C of standard
±2% of standard
Lab SOP, Section 7
Lab SOP, Section 7
Lab SOP, Section 7
Lab SOP, Section 7
Laboratory Calibration
Mass standards calibration
Microbalance calibration
Temperature calibration
Relative humidity calibration
1/yr
At least 2/yr
1/yr
1/yr
±2^g
Manufacturer's specifications
±2°C of standard
±2% of standard
Lab SOP, Section 7
Lab SOP, Section 7
Lab SOP, Section 7
Lab SOP, Section 7
Field Calibration/Verification
Clock/timer verification
Single-point flow rate
verification
Flow rate calibration
Post-calibration single-point
flow rate verification
Single-point barometric
pressure verification
Barometric pressure
calibration
Single-point temperature
verification
Every sampling event
Every sampling event
Upon failure of single-point
verification
Following every calibration
Every sampling event and
following every calibration
Upon failure of multipoint
verification
Every sampling event and
following every calibration
1 min/mo
± 4% of working standard or
± 4% of design flow (16.67 Lpm)
± 2% of calibration standard
at design flow (16.67 Lpm)
± 2% of design flow (16.67 Lpm)
± lOmmHg
± lOmmHg
± 2°C of working standard
Field SOP, Section 5
Field SOP, Section 5
Field SOP, Section 10
Field SOP, Section 10
Field SOP, Section 5
Field SOP, Section 10
Field SOP, Section 5

-------
                                                                       Project: PEP QAPP
                                                                         Element No: 16.0
                                                                          Revision No: 1
                                                                          Date: 3/6/2009
                                                                      	Page 2 of 5
Table 16-1 (continued)

Type | Frequency | Acceptance Criteria | PEP SOP
Temperature calibration | Upon failure of single-point verification | Adjust to within ±0.1°C of calibration standard | Field SOP, Section 10

Standards Recertifications
Flow rate transfer standard | 1/yr | ±2% of NIST-traceable standard | Field SOP, Section 8
Field thermometer | 1/yr | ±0.1°C resolution; ±0.5°C accuracy | Field SOP, Section 8
Field barometer | 1/yr | ±1 mm Hg resolution; ±5 mm Hg accuracy | Field SOP, Section 8
Working mass standards | 3-6 mo | 0.025 mg | Lab SOP, Section 7
Primary mass standards | 1/yr | 0.025 mg | Lab SOP, Section 7
16.1   Instrumentation Requiring Calibration

16.1.1  Laboratory Equipment

16.1.1.1   Laboratory Microbalance

The laboratory support for the PEP includes calibration of the Sartorius MC-5 microbalance. As
indicated in Element 13.0, Analytical Methods Requirements, the balance is calibrated (and the
mass standard check weights are recertified) twice per year under a service agreement, and
additionally when routine QC checks indicate that the microbalance may be out of calibration
and the PEP Laboratory Manager approves. The service technician performs routine
maintenance and makes any balance response adjustments that the calibration shows to be
necessary. During the visit, both the in-house primary and secondary (working) standards are
checked against the service technician's standards to ensure acceptability. All of these actions
are documented in the service technician's report, a copy of which is provided to the PEP
Laboratory Manager. After review, the report is appropriately filed
under PEP/301-093-006.6 (see Element 9.0, Documentation and Records).
16.1.1.2   Laboratory Temperature and Relative Humidity Recorders

The laboratory reference, Vaisala™ HMT330 NIST-Traceable Hygrometer/Thermometer, is
placed inside the conditioning environment and operated to the following specifications: mean
relative humidity is controlled between 30% and 40%, with a target of 35% and a variability of
not more than ±5% over 24 hours, and with minimums and maximums never falling outside the
25%-45% range; mean temperature is held between 20°C and 23°C, with a variability of not
more than ±2°C over 24 hours, and with minimums and maximums never falling outside the
18°C-25°C range. The responses of the reference instrument's combination probe are compared
with the responses of the conditioning environment control system's recording thermometer and
recording hygrometer. The daily mean and standard deviation are calculated from the recorded
responses; the mean must fall within the operating range, and the standard deviation must fall
within the control limits.
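The daily statistical checks described above can be sketched in a short script. This is an illustrative example only; the function and variable names are not from the PEP SOPs, and the limits are those stated in the text.

```python
from statistics import mean, stdev

# Operating ranges, 24-hour variability limits, and absolute bounds
# taken from the text above; names are illustrative only.
LIMITS = {
    "temp_C": {"mean_range": (20.0, 23.0), "max_sd": 2.0, "bounds": (18.0, 25.0)},
    "rh_pct": {"mean_range": (30.0, 40.0), "max_sd": 5.0, "bounds": (25.0, 45.0)},
}

def check_conditioning_day(readings, parameter):
    """Check one day of chamber readings against the operating limits."""
    lim = LIMITS[parameter]
    m, sd = mean(readings), stdev(readings)
    return {
        "mean": m,
        "sd": sd,
        "mean_ok": lim["mean_range"][0] <= m <= lim["mean_range"][1],
        "sd_ok": sd <= lim["max_sd"],
        "extremes_ok": lim["bounds"][0] <= min(readings)
                       and max(readings) <= lim["bounds"][1],
    }

# Example: temperatures recorded over one 24-hour period
temps = [21.5, 21.8, 22.0, 21.2, 20.9, 21.6, 22.1, 21.4]
result = check_conditioning_day(temps, "temp_C")
```

A day passes only when all three flags are true; any failure would be investigated per the laboratory SOP.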



16.1.2 Field Equipment—the PM2.5 Portable Sampler
Upon receipt of a new portable sampler, single-point verifications will be performed as indicated
in Table 16-1. Calibrations typically occur at the field office or laboratory.
NOTE: Experience has shown that multipoint verifications do not indicate the accuracy of the
BGI sampler at its required design flow rate. Multipoint verifications may be useful for
troubleshooting.

The following verifications are routinely performed in the field:

    •   The sampler's internal clock against a timepiece.
    •   The sampler's barometric pressure against the working pressure standard
    •   The sampler's temperature probes against the working temperature standard
    •   The sampler's volumetric flow rate meter against the working flow standard.

16.1.2.1   Time Standard

The FS will use an atomic clock reference (available on the Internet at http://www.time.gov) or
another known time standard (e.g., a cell phone) to verify that the sampler's time matches the
time standard. Times can be checked each day before heading to the field, particularly where
there is no cell phone service at the sampler location(s). Samplers should be set to the
local standard time.
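One way to operationalize the 1 min/mo clock criterion from Table 16-1 is to pro-rate the allowed drift by the days elapsed since the clock was last set. This is a sketch under that assumption; the function name and the pro-rating rule are illustrative, not from the PEP SOPs.

```python
def clock_within_tolerance(offset_seconds, days_since_set,
                           max_drift_per_month=60.0):
    """Return True if the sampler clock offset is within the pro-rated
    1 min/mo drift allowance (a 30-day month is assumed)."""
    allowed = max_drift_per_month * (days_since_set / 30.0)
    return abs(offset_seconds) <= allowed
```

For example, a 45-second offset found 30 days after the clock was last set would pass, while a 2-minute offset found at the same point would not.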

16.1.2.2   Barometric Pressure

A NIST-traceable verification device (e.g., BGI Delta-Cal or BGI Tri-Cal) will be used in the
field for single-point verifications of the portable sampler's pressure sensor during each
sampling event. If a sampler fails the single-point verification for barometric pressure, then a
different NIST-traceable verification/calibration device will later be used in the field office as a
primary standard to perform a single-point calibration for barometric pressure. Each time the
sampler is calibrated for barometric pressure,  a subsequent single-point barometric pressure
verification must follow.

16.1.2.3   Temperature Probes

The portable sampler has ambient and internal temperature probes. At every sampling event, the
FSs will perform single-point field verifications of both sensors using a digital NIST-traceable
temperature probe (e.g., BGI Delta-Cal or BGI Tri-Cal). A single-point temperature calibration
is usually performed at the laboratory after there has been a single-point temperature verification
failure. Each time the sampler is  calibrated for temperature, a subsequent single-point
temperature verification must follow.

16.1.2.4   Flow Rate
Before every sampling event, and after leak checks, temperature verifications, and barometric
pressure verifications have been performed, a single-point flow rate verification will be
performed using a NIST-traceable calibration device (e.g., BGI Delta-Cal or BGI Tri-Cal). If the
verification result is outside the acceptable tolerance, then the sampler may need to be calibrated.
A different NIST-traceable verification/calibration device will be used in the field office as a
primary standard to perform a single-point calibration after there has been a verification failure.
The single-point verification must be repeated after any calibration procedure to ensure the
sampler operates at the design flow rate of 16.67 Lpm.
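The two flow rate tolerances described above can be expressed directly. This is a minimal sketch; the function names are illustrative, and the tolerances are those given in Table 16-1.

```python
DESIGN_FLOW_LPM = 16.67  # FRM design flow rate for the PM2.5 sampler

def flow_verification_passes(sampler_lpm, standard_lpm):
    """Routine single-point verification: pass if within +/-4% of the
    working standard or +/-4% of the design flow."""
    return (abs(sampler_lpm - standard_lpm) / standard_lpm <= 0.04
            or abs(sampler_lpm - DESIGN_FLOW_LPM) / DESIGN_FLOW_LPM <= 0.04)

def post_calibration_verification_passes(sampler_lpm):
    """Post-calibration verification: pass if within +/-2% of design flow."""
    return abs(sampler_lpm - DESIGN_FLOW_LPM) / DESIGN_FLOW_LPM <= 0.02
```

A reading of 16.3 Lpm against a 16.67 Lpm standard passes the routine check (about 2.2% off), while a reading of 15.5 Lpm fails both tests and would trigger a calibration.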

16.2   Calibration Method That Will Be Used  for Each Instrument

As shown in Table 16-1, the calibration methods are described in the PEP Field and Laboratory
SOPs.
16.3   Calibration Standard Materials and Apparatus

Table 16-2 presents a summary of the specific standard materials and apparatus used in
calibrating measurement systems for parameters necessary to generate the PM2.5 data required in
40 CFR Part 50, Appendix L, and 40 CFR Part 58. Table 16-1 presents the acceptance
requirements for each of the standards used in the program, whereas Table 16-2 presents the
accuracy and resolution of each standard. All of the standards meet the acceptance requirements
service agreements with vendors from which the instruments were purchased.

        Table 16-2. Calibration Standards and/or Apparatus for PM2.5 Calibration

Parameter | Standard (S) or Apparatus (A) | Description | Accuracy or Resolution | Manufacturer's Name | Model Number
Mass (primary and working) | S | Class 1 weights | Weight tolerance 0.010 mg | Rice Lake | 100-mg, 200-mg, and 5-g weights
Temperature (calibration [laboratory] and working [field]) | A | Multi-parameter calibrator | Accuracy ±0.2°C; resolution 0.1°C | BGI Delta-Cal / BGI Tri-Cal | DC-1 / TC-12
Barometric pressure (calibration [laboratory] and working [field]) | A | Multi-parameter calibrator | Accuracy ±0.1%; resolution 0.01 psig | BGI Delta-Cal / BGI Tri-Cal | DC-1 / TC-12
Flow rate (calibration [laboratory] and working [field]) | A | Multi-parameter calibrator | Accuracy ±2%; resolution 20 mL/min | BGI Delta-Cal / BGI Tri-Cal | DC-1 / TC-12

Table 16-2 (continued)

Parameter | Standard (S) or Apparatus (A) | Description | Accuracy or Resolution | Manufacturer's Name | Model Number
Laboratory temperature/relative humidity | A | Hygrometer/thermometer | Temperature: accuracy ±0.2°C, resolution 0.01°C; Relative humidity: accuracy ±1.5%, resolution 0.01% | Vaisala | HMT330
16.4  Calibration Frequency

See Table 16-1 for a summary of calibration frequencies.

All calibration events, as well as sampler and calibration equipment maintenance, will be
documented in field data records and notebooks and annotated with the flags as required by
Appendix L of 40 CFR Part 50, the manufacturer's operating instruction manual, and any others
indicated in the PEP Field and Laboratory SOPs. The records will normally be controlled by the
ESAT FSs or LAs and located in the laboratory or field offices when in use. Eventually, all
calibration records will be appropriately filed under PEP/301-093-006.6 (see Element 9.0,
Documentation and Records).


16.5  Standards Recertifications

All primary/calibration and working standards will be certified every year as NIST-traceable.
Agreements with vendors will be set up to provide this certification activity. OAQPS will work
with the Regional offices to find an  appropriate time frame to achieve recertifications.

      17.0  Inspection/Acceptance for Supplies and Consumables

17.1  Purpose

The purpose of this element is to establish and document a system for inspecting and accepting
all supplies and consumables that may directly or indirectly affect the quality of the PEP data.
The PEP relies on various supplies and consumables that are critical to its operation. By having
documented inspection and acceptance criteria, consistency of the supplies can be assured. This
element details the supplies and consumables, their acceptance criteria, and the required
documentation for tracking this process.

Many forms will be discussed in the following sections. These forms can be found in the PEP
Field and Laboratory SOPs, but examples of them are placed at the end of this section. They are

   •  Field/Laboratory Inventory Form (INV-01) (Figure 17-1)
   •  Field/Laboratory Procurement Log Form (PRO-01) (Figure 17-2)
   •  Field/Laboratory Equipment/Consumable Receiving Report Form (REC-01)
      (Figure 17-3).

17.2  Critical Supplies and Consumables

This section describes the needed supplies for the PEP and includes items for the weighing
laboratory and the  field. Generally, critical field and laboratory equipment has been selected by
the PEP organizers based on the required performance specifications of resolution, accuracy, and
ease of use.

17.2.1 Laboratory Supplies

OAQPS has developed a list of the critical laboratory equipment, which is presented in
Table 17-1. Equipment that is not deemed critical (i.e., affecting data quality) is left to the PEP
Laboratory Manager to select. To maintain consistency in the PEP, all consumables/equipment
with a model number (as shown in Table 17-1) will be purchased using the same model number
when supplies run low. The LA is required to keep an inventory of all equipment using the
Field/Laboratory Inventory Form (INV-01), which is shown in Figure 17-1.

Table 17-1. Weighing Laboratory Equipment

Quantity | Units | Item | Vendor | Model Number
2 | Each | Microbalance | Sartorius | MC-5
2 | Sets | ASTM Class 1 weights | Rice Lake Weighing Systems | 11909
2 | Each | Balance table | Thermo Fisher Scientific | HM019945
2 | Each | Computer | Dell |
2 | Each | Barcode reader | |
1 | Each | Relative humidity/temperature monitor | Vaisala | E-37510-02
1 | Each | Relative humidity/temperature standard | Thermo Fisher Scientific | 11-661-78
1 | Each | NIST-traceable thermometer | Thermo Fisher Scientific | 15-041A
1 | Each | Tacky mat plastic frame | Thermo Fisher Scientific | 06-528A
1 | Each | Uninterruptible power supply | Cole-Parmer | E-05158-60
1 | Each | Refrigerator | |
1 | Each | Freezer | |
1 | Each | Dishwasher | |
2 | Each | Antifatigue floor mat | Richmond | 19-61-763
2 | Each | Equilibration rack | |
1 | Each | Laser printer | |
1 | Each | Dehumidifier | |
1 | Each | Light table | |
1 | Each | Microsoft Access 2000 or later | | 077-00370
2 | Each | Sarto Wedge software for Sartorius balances | Sartorius | YSW01
1 | Each | Barcode-printing software | Cole-Parmer | E-21190-10
24 | Each | HVAC filters | |
1 | Case of 1,000 | Powder-free antistatic gloves | Thermo Fisher Scientific | 11-393-85A
12 | Each | Polonium-210 strips | NRD | 2U500
7 | Pack of 100 | Petri slides | Gelman | 7231
1 | Case of 12 bottles | Staticide | Cole-Parmer | E-33672-00
1 | Case of 15 packs | Low-lint wipes (Kimwipes) | Kimberly-Clark | 34155
1 | Each | HVAC service contract | |
1 | Each | Microbalance service contract (two scheduled visits per year) | Sartorius |
6 | Sets | Chart paper and pens | |
1 | | Cleaning supplies | |
2 | Each | Worklon antistatic laboratory coats | Thermo Fisher Scientific | 01-352-69B
2 | Each | Forceps (stainless steel with plastic tips) | VWR | 25672-100
1 | Case | Antistatic 3" x 5" reclosable bags (for cassettes) | Consolidated Plastics | 90202KH
1 | Box | Barcode stickers | |
1 | Case of 1,000 | Alcohol swipes | Thermo Fisher Scientific | 14-819-2
20 | Each | Coolers (6-pack size) | |
4 | Case of 24 | Reusable U-Tek refrigerant packs (-1°C) | Thermo Fisher Scientific | 03-528B

Table 17-1 (continued)

Quantity | Units | Item | Vendor | Model Number
1 | Case | Antistatic 9" x 12" reclosable bags (for data sheet) | Consolidated Plastics | 90210KH
4 | Each | Log books | |
20 | Each | Minimum/maximum thermometers (various digital ones available) | Sentry | 4121
3 | 120 sheets | Hard surface tacky mat (moderate tack) | Thermo Fisher Scientific | 06-527-2
As consumables run low or when new equipment purchases are necessary, the LA will be
responsible for assisting in the procurement of these items following the policy and requirements
described in the ESAT scope of work. The LA should continue purchasing consumable
equipment with the same model numbers as the equipment initially procured unless the PEP
Laboratory Manager suggests a different item due to improved quality, reduction in
contamination, improved ease of use, or lower cost (without sacrificing quality). Such changes
should be coordinated with the WAM/TOPO/DOPO. The PEP Laboratory Manager will report
any equipment changes that could affect the results of sampling events to the National PEP
Project Leader. The following procedures will be performed by the LA:

    •  Develop procurement requests as per EPA requirements.
    •  Upon order, add items to the Field/Laboratory Procurement Log Form (PRO-01).
    •  Once a month, provide a copy of the PRO-01 to the PEP Laboratory Manager and the
      laboratory services ESAT WAM/TOPO/DOPO.
    •  File PRO-01 under AFC "PEP/301-093-006.6."

17.2.2 Field Equipment and Supplies

To ensure consistency and to meet the DQOs, OAQPS purchases all equipment and
consumables, as listed in Table 17-2, for the field activities. Quantities for items in Table 17-2
are not shown because they will vary with the size of the field operation (number of samplers
and auditors). The FS is required to keep an inventory of all equipment, including any warranty
information.

                       Table 17-2. Field Equipment and Supplies

PEP Field Equipment and Supplies | Vendor/Catalog Number | Make/Model Number

Monitoring Equipment and Supplies
Transport cases for loose equipment/consumables | Forestry Suppliers/31113 | Collapsible crate
Backpack frame for carrying samplers | Forestry Suppliers/35913 |
Portable FRM PM2.5 sampler(s) with carrying case | BGI | BGI PQ200A
Very sharp cut cyclone (VSCC) | BGI | VSCCB
Pre-weighed 46.2-mm diameter filters in the proper cassette | Supplied by the weighing laboratory |

Table 17-2 (continued)

PEP Field Equipment and Supplies | Vendor/Catalog Number | Make/Model Number
COC Form for each filter cassette | |
Impactor oil and dropper (NOTE: Dow 704 has been found to solidify when sustained at 4°C for long periods.) | SPI Supplies | Octoil®-S (SPI#00031)
Impactor filters (37-mm diameter glass fiber) | BGI (preferred) |
Teflon-coated tweezers (for handling impactor filters) | |
Sample shipping containers (coolers) | |
Custody seals (tape or stickers) | |
Minimum/maximum thermometers | Daigger/AX24081B | Sentry
Cold packs (ice substitutes), 36 per box | Daigger | EF2592D
Electric transport cooler with 12 volt to AC transformer | Globe Mart/5615-807 | Coleman 16 quart
Filter transport coolers (6 quart) | Rubbermaid Web site | Rubbermaid 6 pack
Bubble wrap | Consolidated Plastics | 87604
PEP FRM Sampler Operations Manual | |
Field notebook(s) | |
Clipboard (8" x 14") | Forestry Suppliers/53283 | Cruiser mate
Grip binders | Office Depot/501-627 | Presstex
Data storage media (e.g., diskette, CD, or USB card) | |
Silicone grease for O-rings (e.g., vacuum grease) | Daigger/AX23061A |
PEP Field SOP | |
Field Data Sheets, preprinted | |
Laptop computer with PQ200A job-control software | |
Datatrans™ to download data; BGI upgraded version 2006 | BGI/DC201 |
Cables for connecting the data-download device to the portable FRM sampler | |
Magnetic compass or other means of determining site orientation | Forestry Suppliers/37177 | Suunto Partner II
Tape measure (metric) | Forestry Suppliers/39651 | Lufkin/W 9210ME
Cellular phone | |
Mechanical pencils | Skilcraft | 0.9 mm
Markers (indelible) | Sharpie | Ultra-fine

Mounting Equipment and Tools
Ladder and a rope for hoisting equipment | |
Hand truck or cart with wheels and straps for transporting equipment | |
Bubble level for checking the portable FRM sampler | Mayes (torpedo) | 10198
Wooden shims or other means for leveling the portable FRM sampler | |
Tool box with basic tools, including the following: | |
Allen wrenches (metric and standard) | |
Micro screwdriver set | |
Table 17-2 (continued)

PEP Field Equipment and Supplies | Vendor/Catalog Number | Make/Model Number
Pliers (multiple sizes and types) | |
Screwdrivers (standard straight and Phillips head) | |
Wire cutters | |
Small cinch ties | |
Electrical tape | |
Soldering gun/solder | |
Hemostat (for flow rate troubleshooting) | |
Flashlight with spare batteries | |
Heavy-duty, grounded, weatherproof electrical extension cord with multiple outlets (25' length) | Unicor | Style S, Class 2, Series 2
Heavy-duty, grounded, weatherproof electrical extension cord with multiple outlets (12' length) | Unicor | Style S, Class 2, Series 2
Tie-down cables, anchors, plywood sheet, and bungee cords to anchor and stabilize the portable FRM sampler and to dampen vibration (optional) | |
Masking tape | GSA-7510-00-283-0612 |
Packaging tape | GSA-7510-00-079-7906 |
Strapping tape | GSA-7510-00-159-4450 |

Calibration/Verification Standards and Related Equipment
Downtube flow rate adapter | |
Temperature, pressure, and flow verification device with external temperature probe | BGI Delta-Cal / BGI Tri-Cal | DC-1 / TC-12
Temperature verification/calibration standard (NIST-traceable) with probe (optional) | VWR | 61220-601
Styrofoam cup and deionized ice water for temperature calibrations | |
Flow-check filter in transport cassette | |
Impermeable "filter" disk for internal leak checks | |
Accurately set timepiece (cell phone) | |
Hand calculator (scientific) | Office Depot/397-554 | Casio

Spare Parts and Optional Equipment
Spare O-rings for the portable FRM sampler | |
Spare batteries (for all battery-powered equipment) | |
Fuses, as required by all equipment used | |
Spare in-line filters (if required by the portable FRM sampler) | |
Voltmeter/ammeter/ohmmeter for troubleshooting | |
Spare impactor(s) | |
Ground fault circuit interrupter (GFCI) tester | |
Portable GFCI device | |
Camera (digital) for site pictures | |
Table 17-2 (continued)

PEP Field Equipment and Supplies | Vendor/Catalog Number | Make/Model Number

Cleaning Supplies and Equipment
Low-lint laboratory wipes for cleaning WINS and other sampling equipment (Kimwipes) | Kimberly-Clark |
Disposable paper towels | | Kay-Pees disposable paper towels
Large, locking plastic bag for cleanup of debris and wipes | |
Soft brush | |
Supply of deionized water for cleaning and rinsing equipment | |
Isopropyl alcohol to aid in removal of grease and dirt | |
Alcohol wipes for preloading hand wipe | Nearest drug store |
Penetrating oil (silicone oil or 3-in-1™) | |
Lint-free pipe cleaners | |
Safety pin/dental pick | |
Lint-free cotton-tipped swabs | |
Wooden dowel and cloth wads to clean downtube | |
Spray bottle | |
Gloves (powder-free, nitrile) | |

Initial quantities will be worked out with the WAM/TOPO/DOPO in each region. As
consumables run low or when new equipment purchases are necessary, the FS will be
responsible for assisting in the procurement of these items following the policy and requirements
described in the ESAT scope of work. The FS should continue purchasing consumable
equipment with the same model numbers as the equipment that was initially procured unless the
Regional WAM/TOPO/DOPO suggests a different item because of its improved quality,
reduction in contamination, increased ease of use, or lower cost (without sacrificing quality). The
WAM/TOPO/DOPO will report any equipment changes that could affect the results of sampling
events to the EPA National PEP Project Leader. The FS will perform the following required
procedures:

   •  Develop procurement requests as per EPA requirements.
   •  Upon order, add items to the Field/Laboratory Procurement Log Form (PRO-01).
   •  Once a month, provide a copy of the PRO-01 to the Regional WAM/TOPO/DOPO.
   •  File PRO-01 under AFC "PEP/301-093-006.6."

17.3   Acceptance Criteria

The major pieces of capital equipment are the following:

Laboratory
•  Microbalances
•  Calibration equipment (see Element 16.0, Instrument Calibration and Frequency)
•  Mass weights
•  Temperature recorder
•  Humidity recorder

Field
•  Portable samplers
•  Calibration equipment (see Element 16.0, Instrument Calibration and Frequency)

The equipment and consumables have been selected based on their advertised accuracy and
resolution specifications, and the portable sampler has been built to FRM performance
specifications and accepted as such. Upon receipt, equipment will be inspected and tested using
calibration standards (see Element 16.0, Instrument Calibration and Frequency) to ensure it
operates within the performance parameters. All equipment is under warranty, and the
equipment previously listed will undergo yearly calibration and certification as discussed in
Element 16.0, Instrument Calibration and Frequency.

Both field and laboratory personnel will use procurement logs (PRO-01) (Figure 17-2) to record
the purchase of new equipment and consumables and to indicate whether the items were
accepted or rejected. In addition, the laboratory and field personnel are required to keep a
Field/Laboratory Inventory Form (INV-01) (Figure 17-1), which lists each equipment item and
warranty dates.

17.4   Tracking  and Quality Verification of Supplies and Consumables

Tracking and quality verification of supplies and consumables have two main components. The
first is the need  of the end user of the supply or consumable to have an item of the required
quality. The second is the need for the purchasing department to accurately track goods received
so that payment or credit of invoices can be approved.  The following procedures address these
issues by outlining the proper tracking and documentation process by receiving personnel:

   1.  Perform a rudimentary inspection of the packages as they are received from the courier or
       shipping company, and note any obvious problems with the shipment, such as a crushed
       box or wet cardboard.
   2.  Pull the appropriate purchase order for the incoming items from the files.
   3.  Fill out a Field/Laboratory Equipment/Consumable Receiving Report Form (REC-01)
       (Figure 17-3), comparing the items and quantity against the purchase order and

       inspecting the condition of each item.
   4.  If the items received match the purchase order and the condition of the equipment or
       consumables is acceptable, signify this on the form and file it under AFC "PEP/301-093-
       006.6".
   5.  If the quantity, items, or condition are not acceptable, complete REC-01 with remarks
       and send a copy of the form to the Regional WAM/TOPO/DOPO.
   6.  Call the vendor to report the problem with the package and/or contents.
   7.  Add receipt information to the Field/Laboratory Procurement Log Form (PRO-01) and to
       the Field/Laboratory Inventory Form (INV-01).

In addition, any conversations that field or laboratory personnel have with vendors will be
recorded on a phone communication form, which will also be filed.
Field/Laboratory Inventory Form (INV-01)

Item | Vendor | Model Number | Quantity | Purchase Date | Warranty
     |        |              |          |               |

                Figure 17-1. Field/Laboratory Inventory Form (INV-01).
Field/Laboratory Procurement Log Form (PRO-01)

Item | Model Number | Quantity | Purchase Order Number | Vendor | Date Ordered | Received | Cost | Initials | Accept/Reject
     |              |          |                       |        |              |          |      |          |

            Figure 17-2. Field/Laboratory Procurement Log Form (PRO-01).

Field/Laboratory Equipment/Consumable Receiving Report Form (REC-01)

Date: _______________          Received From: _______________
Shipped From: _______________  Shipped Via: _______________
Shipping Charge:  Prepaid ___  Collect ___
Purchase Order Number: _______________
Freight Bill Number: _______________

Quantity | Description of Item | Condition
         |                     |

Accept Shipment: ___    Problem: ___
Remarks:
Notes:

Figure 17-3. Field/Laboratory Equipment/Consumable Receiving Report Form (REC-01).

                    18.0  Data Acquisition Requirements

This element addresses data that have not been obtained by direct measurement from the PEP.
The majority of data used in the PEP will be direct measurements acquired by the FSs and LAs
working for the PEP.

18.1  Acquisition of Non-Direct Measurement Data

The PEP relies on data that are generated through field and laboratory operations; however,
some data are obtained from sources outside the PEP. This element lists these data and addresses
quality issues related to the PEP.

18.1.1  Chemical and Physical Properties Data

Physical and chemical properties data and conversion constants are often required in the
processing of raw data into reporting units. This type of information, which has not already been
specified in the monitoring regulations, will be obtained from nationally and internationally
recognized sources. Other data sources may be used with approval from the National PEP
Project Leader. The following sources may be used in the PEP without prior approval:

   •   NIST
   •   International Organization for Standardization (ISO), International Union of Pure and
       Applied Chemistry (IUPAC), American National Standards Institute (ANSI), and other
       widely recognized national and international standards organizations
   •   EPA
    •   Certain standard handbooks (current editions). Two that are relevant to the fine
        particulate monitoring program are CRC Press' Handbook of Chemistry and Physics and
        Lange's Handbook of Chemistry.

18.1.2  Sampler Operation and Manufacturers' Literature

Manufacturers' literature, which includes operations manuals and users' manuals, is another
important source of information needed for sampler operation because it frequently provides
numerical information and equations pertaining to specific equipment. PEP personnel are
cautioned that such information is sometimes in error, so appropriate cross-checks will be made
to verify the reasonableness of information in manuals. Whenever possible, the FSs will compare
physical and chemical constants in the operator's manuals to those given in the sources
previously listed. If discrepancies are found, then the FS may raise these issues during PEP
Workgroup conference calls and during recertification training sessions. The  following types of
errors are commonly found in such manuals:

-------
                                                                          Project: PEP QAPP
                                                                           Element No: 18.0
                                                                             Revision No: 1
                                                                             Date: 3/6/2009
	Page 2 of 2

    •  Insufficient precision
    •  Outdated values for physical constants
    •  Typographical errors
    •  Incorrectly specified units
    •  Inconsistent values within a manual
    •  Use of different reference conditions than those called for in EPA regulations.

18.1.3 Site Information

To determine the site and the monitor that the PE will be compared against, the FS must rely on
the site information provided to him or her by the SLT monitoring agency and included in the
site file and on each FDS. This will include the following parameters:

    •  AQS Site ID
    •  Monitor type
    •  Method designation (routine instrument)
    •  Reporting organization.

These values should be available in the AQS database and can be double-checked for their
accuracy before proceeding to a site.

18.1.4 External Monitoring Databases

It is the policy of the PEP that no data obtained from the Internet, computer bulletin boards, or
databases from outside organizations shall be used to create reportable data or published reports
without approval from the National PEP Project Leader. Requests may be raised during the PEP
Workgroup conference calls or on an individual basis. This policy is intended to ensure the use
of high-quality data in PEP publications.

Data from EPA's AQS database may be used in published reports with appropriate caution. Care
must be taken when reviewing and using any data that contain flags or data qualifiers. If data are
flagged, such data  shall not be used unless it is clear that these data still meet critical QA/QC
requirements. It is  impossible to assure that a database, such as the AQS, is completely free from
errors, including outliers and biases, so caution and skepticism are called for when comparing
routine data from other reporting agencies as reported in the AQS. Users will review available
QA/QC information to assure that the external data are comparable with PEP measurements and
that the original data generator had an acceptable QA program in place.

-------
                                                                         Project: PEP QAPP
                                                                          Element No: 19.0
                                                                            Revision No: 1
                                                                            Date: 3/6/2009
                                                                              Page 1 of 11
                             19.0 Data Management

19.1   Background and Overview

This element describes the data management operations, including data recording,
transformation, transmittal, reduction, validation, analysis, management, storage, and retrieval,
that pertain to PM2.5 measurements for the PEP. This includes an overview of the mathematical
operations and analyses performed on raw ("as-collected") PM2.5 data.

Data processing procedures for PEP PM2.5 data are summarized in Figure 19-1. A data
management system (called the FED) has been developed to collect the critical information that
must be uploaded into the AQS database and is required to calculate PM2.5 concentrations. As
time and resources allow, system features will be added to automate and electronically store
other important information. The FED is set up so that as a default, all information can be
manually recorded. The critical data values are entered into the FED and processed using a set of
programs written in Microsoft Access. The FED user application resides on PCs running in the
weighing laboratory (the back-end to the database may reside on a network server in another
location). This local copy of the database is shown in the upper left of Figure 19-1. In essence,
data for the PEP can be seen as accumulating during the following three stages:

   •   Pre-sampling filter weighing. At this stage, the filters are assigned a unique Filter
       ID/Cassette ID combination, and a pre-sampling weight value is recorded.
   •   Field. The filter cassette is installed, and the sampler is operated, generating many
       values that are automatically downloaded from the sampler to a data logger, laptop, or
       data storage device (e.g., diskette, CD, or USB drive). In particular, the critical
       measurement value collected in the field is the air volume sampled during the filter
       exposure.
   •   Post-sampling filter weighing. At this stage, the exposed filter cassette is returned to the
       laboratory where the filter is equilibrated and weighed again. The difference between the
       initial pre- and post-sampling weights is the particulate load on the filter, which is a
       critical value.

During these stages, additional data, including COC data, calibration data, and laboratory
atmospheric data (temperature/RH), are collected, recorded in hard copy and/or electronic form,
and appropriately stored to ensure the quality of the critical values.

[Figure 19-1 shows paper, sample, and computer flows between the laboratory and the field:
filters are prepared and weighed, with a unique ID number recorded on the filter and in the data
system; samples are shipped with a chain-of-custody form and a data storage device (e.g.,
diskette, CD, or USB drive) and received in the laboratory; after analysis, samples are stored
(at ambient temperature in the second and subsequent years) and disposed of after 4 years, with
permission.]
                    Figure 19-1. PEP information management flow.

19.1.1  Information Management Security

The FED is maintained on an EPA file share, and access is restricted to authorized personnel.
Data can only be released with the express permission of the National PEP Project Leader. PE
results should not be released for events that have not been posted by the Reporting Organization
to AQS. Only validated, approved data are loaded into the AQS, where the information becomes
public  domain. In addition, hard copies of all weighing logs and routine back-up copies of the
FED are archived. A comparison of the archived FED copies with the current FED allows
unauthorized or altered entries to be detected in the current FED.
19.2   Data Recording

Each method that generates information in the PEP will have a data form available for recording
this information by hand. These forms are found at the end of the particular PEP Field or
Laboratory SOP that describes the data collection activity, as summarized in Table 19-1.

          Table 19-1. List of PEP Data Processing Operations for Critical Values

 Reference        Title                              Description (Data Related)
 Lab SOP,         Filter Weighing                    Describes the procedure for pre- and post-
 Section 8                                           sample weighings of the filter and for
                                                     recording data
 Lab SOP,         Chain of Custody (COC) and         Describes the laboratory procedure for
 Section 9        Shipping                           starting a COC Form and for processing the
                                                     same form when it returns from the field
 Field SOP,       Filter Exposure and Concluding     Describes how to program the sampler to
 Section 6        the Sampling Event                 start and end sampling for a 24-hour period,
                                                     as well as how to acquire data from the
                                                     portable sampler
 Field SOP,       COC Form and Field Data Sheet      Describes the field procedure for completing
 Section 7                                           the field portions of the COC Form
 Not applicable   Performance Evaluation Database    Describes data entry forms and procedures
                  (FED) User's Manual                for using the FED
 Not applicable   AQS Data Coding Manual (AQ2)a      Describes the coding of air quality data
                                                     transactions; describes the various
                                                     transactions used to create, update, or
                                                     delete data in the AQS
 Not applicable   AQS User Guidea                    Describes the installation of AQS software,
                                                     accounts, data input (batch and online),
                                                     maintenance, and data retrievals (standard
                                                     reports)

 a AQS reference documents can be found at http://www.epa.gov/ttn/airs/airsaqs/manuals

19.3   Data Validation

Data validation is a combination of checking that data processing operations have been correctly
performed and of monitoring the quality of the field and laboratory operations. Data validation
can identify problems in either of these areas. After problems are identified, the data can be
corrected or invalidated, and corrective actions can be taken for field or laboratory operations.


Numerical data stored in the FED are never internally overwritten by condition flags. Flags that
denote error conditions or QA status are saved as separate fields in the database, so that it is
possible to recover the original data.

The following validation functions are incorporated into the FED to ensure the quality of data
entry and data processing operations:

    •   100% data review. Filter weight reports, FDSs, and COC forms are subjected  to a 100%
       data review by the LA and random reviews once a month by the PEP Laboratory
       Manager or designated Laboratory QA Officer.
    •   Range checks. Simple range checks are performed by the FED for almost all monitored
       parameters. For example, valid times must be between 00:00 and 23:59. Reasonableness
       checks may also be performed by the LA. For example, the summer temperatures in most
       Regions should be between 10°C and 50°C. Because these range limits for data input are
       not regulatory requirements, the PEP Laboratory Manager may adjust them from time to
       time to better meet quality goals.
    •   Completeness checks. When the data are processed, certain completeness criteria must
       be met. For example, each sample event must have a start time, an end time, an average
       flow rate, filter weigh dates, and operator and technician names. At a minimum, FDSs,
       COC forms, and pre- and post-weighing data entry forms must be completely filled out.
    •   Internal consistency and other reasonableness checks. Several other internal
       consistency checks are built into the FED. For example, the end time of a filter must be
       later than the start time. Computed filter volume (integrated flow) must be approximately
       equal to the exposure time multiplied by the nominal flow. Additional consistency and
       other checks will be implemented as the result of problems encountered during data
       screening.
    •   Data retention. Raw data sheets are retained in the laboratory files for a minimum of 4
       calendar years and are readily available for audits and data verification activities. After 4
       years, the FS or LA may request instructions from OAQPS on the disposition of hard
       copy records and computer back-up media. Sample filters will be archived for 1 calendar
       year at <4°C. After the first year, the filters may be kept at ambient temperature. At the
       end of the 4th calendar year, the LA may request instructions from OAQPS on the
       disposition of archived sample filters.
       NOTE: The time frame for retention and disposition of Agency records is determined by
       EPA records schedules (see Element 9.0, Documentation and Records); however, records
       may need to be retained for longer periods (e.g., for legal discovery). Therefore, approval
       from OAQPS is required before the destruction of records.
    •   Statistical data checks. Errors found during statistical screening will be traced back to
       original data entry files and to the raw data sheets, if necessary. These checks shall be
       conducted on a monthly schedule and before any data are submitted to the AQS. Data
       validation is the process in which raw data are screened and assessed before inclusion
       into the AQS.
   •   Sample batch data validation. This is discussed in Element 23.0, Validation and
       Verification Methods. Sample batch data validation associates flags, which are generated
       by QC values outside of acceptance criteria, with a sample batch. Batches that contain
       too many flags would be rerun and/or invalidated.
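As an illustration, the automated range, completeness, and internal consistency checks above might look like the following sketch. The FED itself is implemented in Microsoft Access; the field names, flag codes, and tolerances here are illustrative assumptions, not the actual FED implementation.

```python
from datetime import datetime

def validate_record(rec, t_min=10.0, t_max=50.0):
    """Return a list of validation flags for one (illustrative) sample record."""
    flags = []

    # Range check: times must parse as HH:MM between 00:00 and 23:59.
    try:
        start = datetime.strptime(rec["start_time"], "%H:%M")
        end = datetime.strptime(rec["end_time"], "%H:%M")
    except (KeyError, ValueError):
        return ["BAD_TIME"]

    # Internal consistency: the end time must be later than the start time.
    if end <= start:
        flags.append("END_BEFORE_START")

    # Reasonableness check: mean temperature within adjustable, non-regulatory limits.
    if not (t_min <= rec.get("avg_temp_c", t_min) <= t_max):
        flags.append("TEMP_RANGE")

    # Completeness check: required fields must be present and non-empty.
    for field in ("avg_flow_lpm", "filter_id", "operator"):
        if not rec.get(field):
            flags.append("MISSING_" + field.upper())

    # Internal consistency: integrated volume ~= average flow x elapsed time.
    if rec.get("volume_m3") and rec.get("avg_flow_lpm"):
        expected = rec["avg_flow_lpm"] * rec.get("elapsed_min", 1440) / 1000.0
        if abs(rec["volume_m3"] - expected) > 0.05 * expected:
            flags.append("VOLUME_INCONSISTENT")

    return flags

record = {"start_time": "00:00", "end_time": "23:59", "avg_temp_c": 25.0,
          "avg_flow_lpm": 16.67, "elapsed_min": 1440, "volume_m3": 24.0,
          "filter_id": "F-12345", "operator": "FS"}
print(validate_record(record))  # prints [] when no flags are raised
```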

Table 19-2 summarizes the validation checks applicable to the PEP data.

                        Table 19-2. Validation Check Summaries

                                                  Electronic
                                                  Transmission   Manual    Automated
 Type of Data Check                               and Storage    Checks    Checks
 Data parity and transmission protocol checks          X
 Data review                                                       X
 Date and time consistency                                         X          X
 Completeness of required fields                                   X          X
 Range checking                                                               X
 Statistical outlier checking                                                 X
 Manual inspection of charts and reports                           X
 Sample batch data validation                                                 X
Two key operational criteria for PM2.5 sampling are bias and precision. As defined in 40 CFR
Part 58, Appendix A, these are based on differences between collocated sampler results and
FRM PEs. The PEP Laboratory Manager or a designated Laboratory QA Officer will inspect the
results of collocated sampling during each batch validation activity. These data will be evaluated
as early in the process as possible, so that potential operational problems can be addressed. An
objective of the PEP will be to optimize the performance of its PM2.5 monitoring equipment.
Initially, the results of collocated operations were control charted (see Element 14.0, Quality
Control Requirements) to establish limits to flag potential problems. As the data results
accumulate over time, EPA may reassess data quality with higher confidence and adjust the
control limits accordingly.
19.4   Data Transformation

Calculations for transforming raw data from measured units to final concentrations are relatively
straightforward, and many are performed in the sampler data processing unit before being
recorded. The following relations in Table 19-3 pertain to PM2.5 monitoring:

                           Table 19-3. Raw Data Calculations

 Parameter               Units    Type of Conversion                  Equation
 Filter volume (V)*      m3       Calculated from average flow        V = Qave x t x 10^-3
                                  rate (Qave) in L/min and total
                                  elapsed time (t) in minutes
                                  multiplied by the unit
                                  conversion (m3/L)
 Mass on filter (M2.5)   µg       Calculated from filter post-        M2.5 = (Mf - Mi) x 10^3
                                  weight (Mf) in mg and filter
                                  pre-weight (Mi) in mg
                                  multiplied by the unit
                                  conversion (µg/mg)
 PM2.5 concentration     µg/m3    Calculated from gravimetric         PM2.5 = M2.5 / V
                                  mass and sampler volume

 * FRM instruments will provide this value.
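The Table 19-3 conversions can be checked with a short worked example. The 16.67 L/min value is the nominal FRM design flow rate over a 24-hour (1,440-minute) sample; the filter weights are illustrative.

```python
def filter_volume_m3(q_ave_lpm, t_min):
    """V = Qave x t x 10^-3 (L/min x min, converted to m^3)."""
    return q_ave_lpm * t_min * 1e-3

def mass_ug(m_final_mg, m_initial_mg):
    """M2.5 = (Mf - Mi) x 10^3 (mg converted to micrograms)."""
    return (m_final_mg - m_initial_mg) * 1e3

def pm25_ug_m3(mass, volume):
    """PM2.5 concentration = M2.5 / V (micrograms per m^3)."""
    return mass / volume

# A 24-hour sample at the nominal FRM flow rate of 16.67 L/min:
v = filter_volume_m3(16.67, 1440)   # ~24.0 m^3 of air sampled
m = mass_ug(0.520, 0.300)           # 220 µg net particulate load
print(round(pm25_ug_m3(m, v), 1))   # prints 9.2 (µg/m^3)
```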

19.5   Data Transmittal

Data transmittal occurs when data are transferred from one person or location to another or when
data are copied from one form to another. Some examples of data transmittal are 1) submission
of downloaded instrument data files saved on a portable storage device for subsequent upload
into a data entry system and 2) transcription of raw data from a notebook into an electronic data
entry system. Table 19-4 summarizes data transfer operations.

                          Table 19-4. Data Transfer Operations

 Description of
 Data Transfer            Originator            Recipient    QA Measures Applied
 Keying weighing data     LA (hand-written      LA           100% review; random checks by the
 into the FED             data form)                         PEP Laboratory Manager or by a
                                                             designated Laboratory QA Officer
 Electronic data          (Between computers    -            Parity checking; transmission
 transfer                 or over network)                   protocols
 Filter receiving, COC    FS                    LA           Filter numbers are automatically
 forms, and FDSs                                             verified; reports indicate missing
                                                             filters and/or incorrect data
                                                             entries; FS checks data entry with
                                                             100% review
 Verification/            Auditor or Field      LA           Entries are checked by the LA and
 calibration and          Supervisor                         the PEP Laboratory Manager or by a
 audit data                                                  designated Laboratory QA Officer
 AQS data                 LA                    AQS (EPA)    Data transfer is checked by the
                                                             technical support contractor for
                                                             the AQS
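The "parity checking; transmission protocols" QA measure for electronic transfers can be illustrated with a generic integrity check: compute a digest of the data as sent and compare it against a digest of the data as received. This is a sketch of the general technique, not the PEP's actual transfer protocol.

```python
import hashlib

def sha256_of(data: bytes) -> str:
    """Hex digest used to confirm a transferred file arrived unchanged."""
    return hashlib.sha256(data).hexdigest()

# Illustrative payload; in practice, read the copy back from the destination.
sent = b"filter_id,volume_m3\nF-12345,24.005\n"
received = sent

# Matching digests indicate the transfer completed without corruption.
assert sha256_of(sent) == sha256_of(received), "transfer corrupted"
print("transfer verified")
```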
The PEP will report all PM2.5 ambient air quality data and information specified by the AQS
Data Coding Manual (http://www.epa.gov/ttn/airs/airsaqs/manuals), in the required format for
the AQS. Such air quality data and information will be fully screened and validated and will be
submitted directly to the AQS via electronic transmission, in the AQS format, and in accordance
with the quarterly schedule. PEP audit results are posted to the AQS as data pairs. The data pair
consists of the PEP audit measured value and the site's measured value. SLAMS and NCore sites
are required to post their site data to the AQS on the schedule shown in Table 19-5. Because
posting the  PEP data requires first obtaining the site's measured value from AQS, PEP data
cannot normally be posted until after the due dates listed in Table 19-5. In cases where the site
data have been uploaded to the AQS and validated on or before the due date, the PEP audit data
should be available within 30 days after the due date (to allow time for processing and review).
Data submitted after the due date will be available within 30 days after the end of the next
reporting period.

                           Table 19-5. Data Reporting Schedule

 Reporting Period               Due Date
 January 1-March 31             June 30
 April 1-June 30                September 30
 July 1-September 30            December 31
 October 1-December 31          March 31
19.6   Data Reduction and Data Integrity

Data-reduction processes involve aggregating and summarizing results so that they can be
understood and interpreted in different ways. The PM2.5 monitoring regulations require certain
summary data to be computed and reported regularly to EPA. Examples of data summaries
include the following:

   •   Average PM2.5 concentration
   •   Accuracy, bias, and precision statistics based on accumulated FRM/FEM data
   •   Data completeness reports based on the numbers of valid samples collected during a
       specified period.
The integrity of PEP data reduction can be verified by independent review of the data and
algorithms used. Verification of data integrity requires that PEP data be stored in a manner that
permits any data modification to be detected. Detection of data changes is facilitated by the
record-keeping requirements of the PEP Laboratory SOP, which requires archiving of hard-copy
records for important data (e.g., weighing session reports, sample COC forms, and FDSs). These
archived records enable EPA to trace raw data used in PEs to original documents, which have
been dated and signed by program personnel.
In addition, the PEP Laboratory SOP requires that regular copies of the FED data are archived
into read-only media (e.g., CD-ROM or back-up tape) and regularly  stored at an off-site location.
These archival database copies may also be used to  evaluate data integrity and to check that data
used in a particular PE matched the data on hard-copy records.
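The archive-versus-current comparison described above can be sketched as follows: records in a read-only archived snapshot are compared field-by-field against the current database, and any mismatch is flagged for investigation. The record structure shown is an illustrative assumption, not the actual FED schema.

```python
# Archived (read-only) snapshot vs. current FED weight records (illustrative).
archive = {"F-12345": {"pre_mg": 0.300, "post_mg": 0.520},
           "F-12346": {"pre_mg": 0.305, "post_mg": 0.498}}
current = {"F-12345": {"pre_mg": 0.300, "post_mg": 0.520},
           "F-12346": {"pre_mg": 0.305, "post_mg": 0.499}}  # altered value

# Any record that differs from (or is missing relative to) the archive is
# flagged as a possible unauthorized or accidental modification.
changed = sorted(fid for fid, rec in archive.items() if current.get(fid) != rec)
print(changed)  # prints ['F-12346']
```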

19.7  Data Analysis

The PEP is currently implementing the data summary and analysis requirements contained in 40
CFR Part 58, Appendix A. It is anticipated that as the PM2.5 Monitoring Program develops,
additional data analysis procedures may  evolve. The following specific summary statistics will
be tracked and reported for the PEP:

    •   Single sampler bias (when the Anderson or R&P samplers are included in collocation
       studies) or accuracy (based on internal flow rate performance audits and the collocation
       study results)
    •   Single sampler precision (based on collocated data)
    •   Network-wide bias and precision (based on collocated data and internal flow rate
       performance audits)
    •   Data completeness.

Equations used in these analyses are provided in Table 19-6.

                            Table 19-6. Data Assessment Equations

 Criterion                                       Equation                         Reference
 Percent difference (di): Single-point check     di = [(measured - audit) /       40 CFR Part 58,
 to compare audit concentration or value              audit] x 100                Appendix A,
 (flow rate) to the concentration/value                                           Section 4.1.1
 measured by the sampler; i represents a
 unique pair of audit and measured values
 for a particular audit site and sampling
 date. For determining network bias, the
 data pair will only be used when both
 concentrations are >3 µg/m3.

 Mean (D): Averages the individual biases        D = (1/n) x sum(di)              40 CFR Part 58,
 (di) between sampler and audit value for                                         Appendix A,
 various levels of aggregation; n is the                                          Section 4.3.2
 number of sampler/audit pairs in the
 aggregation.

 Standard deviation (S): An estimate of the      S = sqrt[ sum((di - D)^2) /      40 CFR Part 58,
 variability of the average bias.                     (n - 1) ]                   Appendix A,
                                                                                  Section 4.3.2

 Confidence intervals for the average bias       DU,90% = D + t(0.95, n-1)        40 CFR Part 58,
 estimates: DU,90% is the upper 90%                   x S / sqrt(n)               Appendix A,
 confidence interval; DL,90% is the lower        DL,90% = D - t(0.95, n-1)        Section 4.3.2
 90% confidence interval; t(0.95, n-1) is             x S / sqrt(n)
 the 95th quantile of a t distribution with
 n-1 degrees of freedom.

 Relative percent difference for PEP             di = (Yi - Xi) /                 40 CFR Part 58,
 collocation study data (i.e., "parking lot           [(Xi + Yi) / 2] x 100       Appendix A,
 events") (di): Xi and Yi are concentrations                                      Section 4.2.1
 from two different PEP samplers on a
 selected sampling day.

 Coefficient of variation (CV): Precision        CV = sqrt[ (n x sum(di^2) -      40 CFR Part 58,
 estimate for PEP regional "parking lot"              (sum(di))^2) /              Appendix A,
 collocation studies; chi2(0.1, n-1) is the           (2n(n-1)) ]                 Section 4.2.1
 10th percentile of a chi-squared                     x sqrt[ (n-1) /
 distribution with n-1 degrees of freedom.            chi2(0.1, n-1) ]

 Normalized paired differences (Nijd):           Nijd = abs(Dijd) / xbar_d,       Not described
 Evaluation of collocation study data to         where Dijd = xid - xjd
 determine bias of individual samplers when
 compared to all samplers in a study; n is
 the number of monitors in the collocation
 study; i and j represent different
 individual monitors in the study; d
 represents a specific day in a collocation
 event; Dijd represents the paired
 differences among monitors for each day
 during the event; xbar_d is the daily mean.

 Completeness of mandatory PEP audits: A         Completeness = (Nvalid /         Audit frequency
 comparison of the number of valid audits             Ntheoretical) x 100         described in 40
 to the number that are expected based on                                         CFR Part 58,
 the size of the PQAO; data are aggregated                                        Appendix A,
 at the PQAO level, regionally, and                                               Section 3.2.7
 nationally.
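As a sketch, the single-point bias statistics from Table 19-6 can be computed as follows. The audit/site concentration pairs are illustrative, and the t quantile is a t-table value for the chosen sample size, not computed here.

```python
import math
from statistics import mean, stdev

def percent_diff(measured, audit):
    """d_i = [(measured - audit) / audit] x 100 (40 CFR Part 58, App. A, 4.1.1)."""
    return (measured - audit) / audit * 100.0

# Hypothetical PEP audit / site sampler concentration pairs (µg/m3), all >3 µg/m3
pairs = [(10.2, 10.0), (8.1, 8.4), (12.6, 12.0), (9.5, 9.8), (11.0, 10.7)]
d = [percent_diff(m, a) for m, a in pairs]

n = len(d)
D = mean(d)           # mean bias
S = stdev(d)          # standard deviation with an (n - 1) denominator
t95 = 2.132           # t(0.95, n-1 = 4), taken from a t table
half_width = t95 * S / math.sqrt(n)
print(f"bias = {D:.2f}%, 90% CI = ({D - half_width:.2f}%, {D + half_width:.2f}%)")
```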
19.8  Data Flagging—Sample Qualifiers

A sample qualifier or a result qualifier consists of three alphanumeric characters, which
indicate that, and why, the data value

    •  Did not produce a numeric result
    •  Produced a valid numeric result, but it is qualified in some respect relating to the type or
       validity of the result
    •  Produced an invalid numeric result that is not to be reported outside the laboratory.

Qualifiers will be used in the field and the laboratory to signify data that may be suspect due to
contamination, special events, or failure of QC limits. Some flags will be generated by the
sampling instrument (see Table 6-2). Appendix D contains a complete list of the data qualifiers
and flags for the field and laboratory activities. Qualifiers will be placed on field and laboratory
data forms with additional explanations in free-form notes areas. Flags may be generated when
sample batch information is entered into the FED  and the validation process is run. During the
sample validation process, which is discussed in Element 23.0, Validation and Verification
Methods, the flags will be used to decide whether to validate or invalidate individual samples or
batches of data.
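A minimal sketch of screening a batch by its qualifiers might look like the following. The qualifier code shown is hypothetical, not one of the actual Appendix D codes, and the review threshold is an illustrative assumption.

```python
# Each sample carries zero or more three-character qualifiers.
batch = [
    {"filter_id": "F-001", "flags": []},
    {"filter_id": "F-002", "flags": ["FTL"]},  # hypothetical: flow out of limits
    {"filter_id": "F-003", "flags": []},
]

# Flagged samples are candidates for invalidation; a batch with too many
# flags would be rerun and/or invalidated as a whole.
flagged = [s["filter_id"] for s in batch if s["flags"]]
flag_rate = len(flagged) / len(batch)
print(flagged, f"{flag_rate:.0%}")
```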

19.9  Data Tracking
The FED contains the input functions and reports necessary to track and account for the
whereabouts of filters and the status of data processing operations for specific data. Information
about filter location is updated on distributed data entry terminals at the points of significant
operations. The following input data are used to track filter location and status:
    •   Laboratory filter receipt (by lot)
    •   Laboratory filter pre-sampling equilibration (individual Filter ID first enters the system)
    •   Laboratory filter pre-sampling weighing
    •   Laboratory loads filters into cassettes (Filter IDs associated with Cassette IDs are
       recorded)
    •   Filter packaged for the field (Cassette IDs in each package are recorded)
    •   Shipping (package numbers are entered for both sending and receiving)
    •   Laboratory package receipt (package is opened and Cassette IDs are logged in)
    •   Laboratory filter post-sampling equilibration
    •   Laboratory filter post-sampling weighing
    •   Laboratory filter storage/archival.
Tracking reports may be generated by any personnel who have access to the FED. The following
tracking reports are available:
    •   List of all filters in the filter archive
    •   List of all filters that have been received but have not been post-weighed
    •   Ad hoc reports (generated using Microsoft Access queries).
Although not currently in the FED, other reports could be added, if needed, such as the
following:
    •   Location of any  filter (by Filter ID)
    •   List of all filters sent to a specified site that have not been returned
    •   List of all filters that have not been returned and are more than 30 days past the initial
       weighing date.
The PEP Laboratory Manager or designee is responsible for tracking filter status at least twice
per week and for following up on anomalies such as excessive holding time in the laboratory
before reweighing.
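An ad hoc tracking query of the kind listed above, such as finding filters more than 30 days past their initial weighing date that have not been returned, can be sketched as follows. In the FED these are Microsoft Access queries; the fields and dates here are illustrative assumptions.

```python
from datetime import date, timedelta

# Illustrative filter records: a missing post-weigh date means the filter
# has not yet been returned and reweighed.
filters = [
    {"id": "F-001", "pre_weigh": date(2009, 1, 5), "post_weigh": date(2009, 1, 20)},
    {"id": "F-002", "pre_weigh": date(2009, 1, 5), "post_weigh": None},
    {"id": "F-003", "pre_weigh": date(2009, 2, 1), "post_weigh": None},
]

today = date(2009, 2, 10)
overdue = [f["id"] for f in filters
           if f["post_weigh"] is None
           and today - f["pre_weigh"] > timedelta(days=30)]
print(overdue)  # prints ['F-002']
```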

19.10 Data Storage and Retrieval

Table 19-7 shows archival policies for the PM2.5 data.
                           Table 19-7. Data Archive Policies
Data Type | Medium | Location | Retention Time | Final Disposition
Weighing records; COC forms | Hard copy | Laboratory | 4 years | Discarded, with permission from OAQPS
Laboratory notebooks | Hard copy | Laboratory | 4 years | N/A
Field notebooks | Hard copy | Air Quality Division | 4 years | Discarded, with permission from OAQPS
FED (excluding audit trail records) | Electronic (online) | Air Quality Division | Indefinite | Back-up media retained indefinitely
FED audit trail records | Electronic (back-up tapes) | Air Quality Division | 4 years | Discarded, with permission from OAQPS
Filters | Filters | Laboratory | 4 years; 1 full calendar year at 4°C, and then 3 additional calendar years at ambient temperature | Discarded, with permission from OAQPS
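To illustrate how the retention periods in Table 19-7 might be applied in practice, the sketch below encodes them in a lookup table and checks whether a record is eligible for disposal. The function and key names are hypothetical, and this is a simplification (it ignores, for example, the filters' two-stage storage conditions).

```python
from datetime import date

# Retention periods (years) from Table 19-7; None means retained indefinitely.
RETENTION_YEARS = {
    "weighing records; COC forms": 4,
    "laboratory notebooks": 4,
    "field notebooks": 4,
    "FED (online)": None,   # indefinite
    "FED audit trail records": 4,
    "filters": 4,
}

def disposal_eligible(data_type, archived_on, today):
    """Return True if the retention period has elapsed (disposal still
    requires permission from OAQPS per Table 19-7)."""
    years = RETENTION_YEARS[data_type]
    if years is None:
        return False  # retained indefinitely
    expiry = archived_on.replace(year=archived_on.year + years)
    return today >= expiry

print(disposal_eligible("filters", date(2009, 3, 6), date(2014, 1, 1)))       # True
print(disposal_eligible("FED (online)", date(2009, 3, 6), date(2099, 1, 1)))  # False
```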
The PM2.5 data reside on a Microsoft Windows-compatible computer in the PEP weighing
laboratory. The security of data in the FED is ensured by using the following controls:

   •   Network security passwords for access to the project and database files
   •   Regular password changes (as specified by EPA network security)
   •   Independent password protection on all dial-in lines
   •   Logging of all incoming communication sessions, including the originating telephone
        number, the user's ID, and connect times
   •   Storage of media, including back-up tapes, in locked, restricted-access areas.

                                                                       Project: PEP QAPP
                                                                         Element No: 20.0
                                                                           Revision No: 1
                                                                           Date: 3/6/2009
                                                                            Page 1 of 9
                  20.0 Assessments and Response Actions

For the purposes of this QAPP, an assessment is defined as an evaluation process used to
measure the performance or effectiveness of the quality system and various measurement phases
of the data operation.

The results of assessments indicate whether the QC efforts are adequate or need to be improved.
Documentation of all QA and QC efforts implemented during the data collection, analysis, and
reporting phases is important to data users and decision makers, who can then consider the
impact of these control efforts on the data quality (see Element 21.0, Reports to Management).
Both qualitative and quantitative assessments of the effectiveness of these control efforts will
identify those areas most likely to impact the data quality. Periodic assessments of PEP data
quality are required to be reported to EPA. However,  the selection and extent of the QA and QC
activities used by the PEP depend on many local factors, such as the field and laboratory
conditions, the objectives for monitoring, the level of the data quality needed, the expertise of
assigned personnel, the cost of control procedures, and pollutant concentration levels.

To ensure the adequate performance of the  quality system, the PEP will be subject to the
following assessments:
    •   Management systems reviews (MSRs)
    •   Technical system audits (TSAs)
    •   Surveillance
    •   Audits of data quality (ADQs)
    •   Data quality assessments (DQAs)
    •   Peer review.

20.1   Assessment Activities and  Project Planning

20.1.1  Management Systems Review

An MSR is a qualitative assessment of data collection operations and/or organization(s) to
establish whether the quality management structure, policies, practices, and procedures are
adequate to ensure that data of the quality needed are obtained. An MSR is used to determine
the  effectiveness of and adherence to the QA program and the adequacy of resources and
personnel provided to achieve and ensure quality in all activities. A review of the PEP is just one
part of an MSR performed on an EPA Region's monitoring program. OAQPS has a goal of
conducting two to three MSRs per year.

The MSR includes reviews of the following:
   •   Procedures for developing DQOs
   •   Procedures for developing and approving QAPPs
   •   The quality of existing QAPP guidance and QAPPs
   •   Procedures for developing and approving SOPs
   •   Procedures and criteria for designing and conducting audits
   •   Tracking systems for assuring that the QA program is operating and that corrective
       actions disclosed by audits have been taken
   •   The degree of management support
   •   Responsibilities and authorities of the various line managers and the QA Program
       Manager for carrying out the QA program.

20.1.2  Technical Systems Audit

A TSA is an evaluation of a data collection operation or organization to establish whether the
policies, practices, and procedures are adequate for ensuring that the type and quality of data
needed are obtained. TSAs are performed both for EPA Regions and SLT organizations that
implement PEP  activities. The PEP Region TSAs allow OAQPS to assess consistency of
operation among the Regions and to improve the quality system. TSAs will be performed for
field and laboratory activities.

TSAs of the PEP laboratory and data management operations will be conducted annually by
OAQPS; TSAs of the field operations will be conducted annually by the Regional
WAM/TOPO/DOPOs. This will include any SLT-run PEP. It is possible that OAQPS would
team with the Region during the TSAs of SLT-run PEPs. TSAs may be conducted coincident
with the recertification of FSs, where appropriate.

The TSA can be conducted by a team or by an individual assessor. Key personnel to be
interviewed during the audit are those who have responsibilities for  planning, field operations,
laboratory operations, QA/QC, data management, and reporting. The TSA will review the
following three activities:
   •   Field. Filter receipt, instrument setup, sampling, and shipping
   •   Laboratory. Pre-sampling weighing, shipping, receiving, post-sampling weighing,
       archiving, and associated QA/QC
   •   Data management. Information collection, flagging, data editing, security, and upload.

The audit activities are illustrated in Figure 20-1. To increase uniformity of the TSA,  an audit
form will be used (see Appendix E, Technical Systems Audit Forms).
[Figure 20-1 is a flowchart of the audit activities: the audit team interviews the reporting
organization director and key personnel, then divides into two groups. Audit Group 1
interviews the Laboratory Manager; visits the laboratory and witnesses operations; reviews
sample receiving and custody; selects a portion of data and initiates an audit trail; and
establishes the data audit trail through laboratory operations to the data management
function. Audit Group 2 interviews the Field Operations Manager; visits sites and the audit
and calibration facility; selects a portion of data and initiates an audit trail; and
establishes the trail through field operations to data management. The groups then meet to
discuss findings; finalize the audit trails and complete the data audit; prepare an audit
result summary of (a) overall operations, (b) data audit findings, (c) laboratory operations,
and (d) field operations; complete the audit finding forms and debriefing report; and discuss
findings with key personnel, after which the on-site audit is complete.]
                                           Figure 20-1. Audit activities.

The TSA team will prepare a brief written summary of findings organized into the following
areas: planning, field operations, laboratory operations, QA/QC, data management, and
reporting. Problems with specific areas will be discussed, and an attempt will be made to rank
them in order of their potential impact on data quality. For the more serious of these problems,
the TSA team will summarize audit findings on the Audit Finding Form (Figure 20-2).
                                   Audit Finding Form

             Audit Title: ____________________________  Audit Number: __________

             Finding Number: __________

             Finding:

             Discussion:

             QA Lead Signature: __________________________  Date: __________

             Audited Agency Signature: ____________________  Date: __________
                            Figure 20-2. Audit Finding Form.

By design, an Audit Finding Form should be completed for each major deficiency that requires
formal corrective action. This form should include information such as the finding impact,
estimated time period of deficiency, site(s) affected, and reason for action. The Audit Finding
Form will notify the laboratory or field office of serious problems that may compromise the
quality of the data and therefore require specific corrective actions. These forms are initiated by
the TSA team and are discussed at the debriefing. If the assessed group is in agreement with the
finding, the form is signed by the ESAT organization during the debriefing. If a disagreement
occurs, then the TSA team will record the opinions of the group assessed and set a time at some
later date to address the finding at issue. These forms are filed under the AFC heading
"PEP/108-025-01-01-237.1" (see Element 9.0, Documentation and Records).

20.1.2.1 Post-Audit Activities
The major post-audit activity is the preparation of the audit report. The report will include the
following:
   •   Audit title, number, and any other identifying information
   •   Audit team leaders, audit team participants, and audit participants
   •   Background information about the project, purpose of the audit, dates of the audit,
       particular measurement phase or parameters that were audited, and a brief description of
       the audit process
   •   Summary and conclusions of the audit and corrective action required
   •   Attachments or appendices that include all audit evaluations and audit finding forms.

To prepare the report, the TSA team will meet and compare observations with collected
documents and results of interviews and discussions with key personnel. Expected QAPP
implementation is compared with observed accomplishments and deficiencies, and the audit
findings are reviewed in detail. Within 30 calendar days of the completion of the audit, a draft
audit report will be prepared and submitted. The TSA report will be submitted to the appropriate
ESAT personnel and appropriately filed under the AFC heading "PEP/108-025-01-01-237.1."

If the ESAT organization has written comments or questions about the TSA report, then the TSA
team will review and incorporate them as appropriate and will prepare and resubmit a report in
final form within 30 days of receiving the written comments. The report will include an agreed-
upon schedule for corrective action implementation.

20.1.2.2 Follow-up and Corrective Action Requirements

The Regional office and ESAT may work together to solve required corrective actions. As part
of corrective action and follow-up, an Audit Finding Response Form (Figure 20-3) will be
generated by the assessed organization for each Audit Finding Form submitted by the TSA team.
In addition, ESAT will include corrective action in either its weekly (laboratory) or monthly
(field) progress reports. The Audit Finding Response Form will be signed by the assessed
organization, and then it will be sent to the ESAT WAM/TOPO/DOPO, who reviews and
accepts the corrective action. The Audit Finding Response Form will be completed by the
assessed organization within 30 days of acceptance of the audit report. Audit Finding Response
forms are filed under the AFC heading "PEP/108-025-01-01-237.1."

                        Audit Finding Response Form
Audited Division:
Audit Title:
Audit Number:
Finding Number:_
Finding:
Cause of the Problem:
Actions Taken or Planned for Correction:
Responsibilities and Timetable for the Above Actions:
Prepared by:

Signed by:	
QA Division
Reviewed by:
             Date:.

            . Date:




             Date:
Remarks:

Is This Audit Finding Closed?
        When?
File with Official Audit Records. Send a Copy to the Audited Organization.
             Figure 20-3. Audit Finding Response Form.

20.1.3  Surveillance

"Surveillance" is defined as continual or frequent monitoring and verification of the status of an
entity and the analysis of records to ensure that specified requirements are being fulfilled.
Surveillance is similar to a TSA except that it serves as a more frequent review of certain
important phases of the measurement system (i.e., calibrations and run setup) rather than a
review of the entire implementation process. Because the PEP has matured, surveillance is
limited to specific issues that might be identified by OAQPS, the ESAT WAM/TOPO/DOPOs,
or the PEP Laboratory Manager. A Surveillance Report Form (Figure 20-4) will be used for
documentation and filed under AFC heading "PEP/108-025-01-01-237.1."
                               Surveillance Report Form

               Reviewer: ______________________  Date of Review: __________
               Personnel Reviewed: ______________________

               Activity Monitored            Acceptable Performance (YES / NO)
               ____________________          _________________________________
               ____________________          _________________________________

               Notes:

               Signature: ______________________  Date: __________
                         Figure 20-4. Surveillance Report Form.

20.1.4  Audit of Data Quality

An ADQ reveals how the data are handled, what judgments were made, and whether uncorrected
mistakes were made.  ADQs can often identify the means to correct systematic data reduction
errors.  An ADQ will be performed annually by OAQPS as part of the TSA. Thus, sufficient time
and effort will be devoted to this activity so that the auditor or TSA team has  a clear
understanding and complete documentation of data flow. Pertinent ADQ questions will appear
on the TSA check sheets to ensure that the data collected at each stage maintain their integrity.
The ADQ will serve as an effective framework for organizing the extensive amount of
information gathered during the audit of laboratory, field monitoring, and support functions
within  the agency. The ADQ will have the same reporting/corrective action requirements as the
TSA.

20.1.5  Data Quality Assessments

A DQA is a statistical analysis of environmental data used to determine whether the quality of
data is  adequate to support a decision based on the DQOs. Data are appropriate if the level of
uncertainty is acceptable for the decision based on the data. The DQA process is described in
detail in Guidance for the Data Quality Assessment Process (EPA QA/G-9) and is summarized
below.

    •   Review the DQOs and sampling design of the program. Define the statistical
        hypotheses, tolerance limits, and/or confidence intervals
    •   Conduct a preliminary data review. Review precision and accuracy (P&A) and other
        available QA reports. Calculate summary statistics and generate plots and graphs. Look
        for patterns, relationships, and anomalies
    •   Select the statistical test. Select the most appropriate test for analysis based on the
        preliminary review, and identify the underlying assumptions that test makes about the
        data
    •   Verify test assumptions. Decide whether the underlying assumptions of the selected
        test hold true for the data, and determine the consequences if they do not
    •   Perform the statistical test. Perform the test, document the inferences drawn, and
        evaluate the test's performance for future use.
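The DQA steps above can be sketched numerically. The following Python example, using invented percent-difference data, performs a preliminary review (summary statistics) and a simple one-sample t-test of the hypothesis that the mean difference is zero. It is an illustration of the process only, not the prescribed EPA QA/G-9 procedure.

```python
import math
import statistics

# Hypothetical percent differences (%) between paired measurements for one quarter
d = [2.1, -1.4, 0.8, 3.2, -0.5, 1.9, 2.4, -0.9, 1.1, 0.3]

# Preliminary data review: summary statistics
n = len(d)
mean_d = statistics.mean(d)
sd_d = statistics.stdev(d)
print(f"n={n}, mean={mean_d:.2f}%, sd={sd_d:.2f}%")

# Statistical test: one-sample t statistic for H0: mean difference = 0.
# Compare |t| with the critical value for the chosen confidence level
# and n - 1 degrees of freedom.
t_stat = mean_d / (sd_d / math.sqrt(n))
print(f"t = {t_stat:.2f} on {n - 1} degrees of freedom")
```

In a full DQA, the normality assumption underlying the t-test would also be verified against the data before the result is used.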

A DQA will be  included in the PEP Annual QA Report. Details of these reports are discussed  in
Element 21.0, Reports to Management.

Measurement uncertainty will be estimated. Terminology associated with measurement
uncertainty is found within 40 CFR Part 58, Appendix A and includes the following:

   •   Precision. A measure of mutual agreement among individual measurements of the
        same property, usually under prescribed similar conditions, expressed generally in terms
        of the standard deviation

   •   Accuracy. The degree of agreement between an observed value and an accepted
       reference value; accuracy includes a combination of random error (precision) and
       systematic error (bias) components, which are due to sampling and analytical operations
   •   Bias. The systematic or persistent distortion of a measurement process, which causes
       errors in one direction; individual results of these tests for each method or analyzer shall
       be reported to EPA.

Estimates of the data quality will be calculated on the basis of single monitors, Regions, and
laboratories and will be aggregated to all monitors.
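As a simplified illustration (not the exact 40 CFR Part 58, Appendix A estimators), precision and bias statistics can be computed from collocated pairs, treating the PEP audit measurement as the reference value. The concentration values below are invented for the example.

```python
import statistics

# Illustrative collocated pairs: (routine site monitor, PEP audit sampler), in ug/m3.
pairs = [(12.1, 11.8), (8.4, 8.6), (15.0, 14.5), (9.9, 10.2), (22.3, 21.7)]

# Percent difference for each pair, relative to the PEP (audit) concentration
d = [100.0 * (site - pep) / pep for site, pep in pairs]

bias = statistics.mean(d)        # systematic error component (one direction)
precision = statistics.stdev(d)  # random error component (spread of d)
print(f"bias = {bias:+.2f}%, precision (sd) = {precision:.2f}%")
```

The same percent differences would be aggregated over single monitors, Regions, and laboratories, and then over all monitors, as described above.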

20.1.6  Peer Review

Peer review is a documented critical review of work products. These reviews are conducted by
qualified individuals who are independent of those performing the work but are collectively
equivalent in technical expertise. OAQPS uses the peer-review process to assess its products and
guidance. Any guidance documents or reports developed during the implementation of this
program will be reviewed by EPA's informal PM2.5 QA Strategy workgroup (facilitated by
AAMG). This workgroup will serve as a peer reviewer. OAQPS will post the resulting document
in draft on AMTIC's Web site and will announce its availability for public review through a
Monitoring List Server Notice. OAQPS will document comments and responses received as part
of the peer-review process.

20.2   Documentation of Assessments

Table 20-1 summarizes each of the assessments previously discussed.

                           Table 20-1. Assessment Summary
Assessment Activity | Frequency | Personnel Responsible | Report Completion | Resolution
MSRs | 2 to 3 per yr | OAQPS | 30 days after the activity | Regional Air Program Managers
TSAs | 1/yr | OAQPS and Regional WAM/TOPO/DOPO | 30 days after the activity | ESAT or SLT
Surveillance | As needed | OAQPS and Regional WAM/TOPO/DOPO | 30 days after the activity | ESAT or SLT
ADQs | 1/yr | OAQPS (National PEP Project Leader) | 30 days after the activity | WAM/TOPO/DOPOs
DQAs | 1/yr | OAQPS and EPA Regions | 120 days after the end of the calendar year | EPA Regions and SLT

                                                                       Project: PEP QAPP
                                                                        Element No: 21.0
                                                                          Revision No: 1
                                                                          Date: 3/6/2009
                                                                           Page 1 of 8
                        21.0  Reports to Management

This element describes the quality-related reports and communications to management necessary
to support the PEP.
Effective communication among all personnel is an integral part of a quality system. Regular,
planned quality reporting provides a means for tracking the following:
   •   Adherence to scheduled delivery of equipment, data, and reports
   •   Documentation of deviations from approved QA and SOPs and the impact of these
       deviations on data quality
   •   Analysis of the potential uncertainties in decisions based on the data.

21.1   Communication
An organized communications framework facilitates the flow of information among the
participating organizations and other users of the information produced by the PM2.5 PEP. Figure
21-1 represents the principal communication pathways.
[Figure 21-1 is a diagram of the principal communication pathways. Technical lines of
communication link OAQPS (QA Workgroup), the Regional ESAT WAM/TOPO/DOPOs, the
State/local/Tribal agencies, and ESAT. Contractual lines of communication link ESAT, the
Regional ESAT POs, and the Contracts Office.]
                         Figure 21-1. Lines of communication.

In general, ESAT contractors will be responsible for informing the PEP Laboratory Manager, the
Regional WAM/TOPO/DOPO, and the POs about technical progress, issues, and contractual
obligations. On the technical side, the Regional WAM/TOPO/DOPO(s) will be responsible for
communicating with SLT agencies and for informing OAQPS about issues that require technical
attention. Contractual issues will be conveyed from the ESAT contractor through POs to the
ESAT CMD and, if necessary, to OAQPS. Table 21-1 lists key EPA ESAT contacts.

The ESAT contractors will frequently communicate with the PEP Laboratory Manager and the
Regional WAM/TOPO/DOPO on the progress of their activities and any problems and issues
associated with them. Resolution of these issues should take place in the Regions unless the issue
could affect the implementation of the PEP at a national level. In those cases, it can be discussed
and resolved through the communications between the National PEP Program Leader, the
Regional WAM/TOPO/DOPOs, and, if needed, the ESAT Project and Contract Officers.
Communications among various participants in the PEP will be critical to the success of the
program. The PEP Field SOP (Section 2) and PEP Laboratory SOP (Section 4) contain
procedures for required communication and for documenting this information.
                        Table 21-1. Communications Summary
Person | Communicates to | Communication Function
PEP Laboratory Manager | Regional (Laboratory) WAM/TOPO/DOPO and LA | Contract performance issues; review of deliverables; review of data; corrective action; schedule changes
Regional WAM/TOPO/DOPO | OAQPS; Regional Project Officer; FS | Funding and resource needs; bulk filter shipments; contract performance issues; audit site selection and scheduling
LA | PEP Laboratory Manager and Regional (Laboratory) WAM/TOPO/DOPO; FS | Laboratory progress; problems and issues; scheduling; out-going filter/equipment shipments; filter shipment receipt from field
FS | OAQPS or approved contractor(s); LA | Field procedure issues; database management and AQS uploads; filter shipment from field; electronic mailing of field data; filter/equipment requests; schedule changes
OAQPS or approved contractor | Regional (Laboratory) WAM/TOPO/DOPO | Field data verification; requests for PEP data; data transfer to the AQS database
National PEP Project Leader | Regional WAM/TOPO/DOPOs | Data quality issues; funding and resource needs; contract performance issues
21.1.1  Field Communication
Field communications can take place by phone or by e-mail. Phone messages or conversations
will be recorded using the Phone Communication Form (COM-1) in the field communications
notebook. All PEP-related communication should be logged. Notes will include the following:


    •   Date
    •   Time
    •   Personnel involved
    •   Issue(s)
    •   Decision(s)
    •   Follow-up action(s)
    •   Follow-up action responsibility
    •   Follow-up action completed by (date).
If follow-up action is required by the FS, then these actions will be included in the monthly
progress reports (see Element 9.0, Documentation and Records, Section 9.2.2). At a
minimum, the FS will keep the original hardcopy in the field communications notebook.  The FS
may also choose to keep an electronic record of this information on a PC.
Field communication between the FS and the Regional WAM/TOPO/DOPO may be required.
Cellular phones have been provided to each FS for calls related to PEP activities. The Regional
WAM/TOPO/DOPO should also identify alternates to receive field communications when he or
she is not in the office.

21.1.1.1      Filter Shipment Receipt
Upon request from the FS, the LA will ship the filters to the field offices. On the day of receipt,
the FS will contact the LA and provide the following information:
    •   Date of receipt
    •   Number of filter cassettes in shipment
    •   Number of boxes in shipment
    •   Airbill number.

21.1.1.2      Equipment Shipment Receipt
Once a month, the laboratory will ship coolers, maximum/minimum thermometers, and gel packs
back to the field offices. On the day of receipt, the FS will contact the LA and will provide the
following information:
    •   Date of shipment
    •   Number of boxes in shipment
    •   Tracking number.

21.1.1.3 PEP Conference Calls
The FS may be asked to participate in PEP  conference calls to discuss progress or resolution of
issues. The Regional WAM/TOPO/DOPO will inform the FS of information that needs to be
prepared for the call at least 3 days before the call. During this call, the FS will use the Phone



Communication Form (COM-1) to record issues and action items that pertain to his or her
activities. These items will be included in the next monthly progress report.

21.1.1.4  Communicating with Reporting Organizations and Site Operators
Dates for the FRM PE visits should be coordinated with the site's normal operating schedule.
This coordination must be completed in advance so that the FS and the site operator have ample
advance notice and time to prepare for the on-site visit. The procedure for such communications
includes the following:
    •   The Regional WAM/TOPO/DOPO (or FS, as delegated by the Regional
       WAM/TOPO/DOPO) will contact each site operator before the site visit.  Contact must be
       made by phone if it is within 30 days of the site visit, but e-mail is sufficient otherwise.
    •   Approximately 1  week before the actual evaluation, the FS will call the site operator to
       confirm that the PE visit remains on schedule and to confirm meeting arrangements.

21.1.2 Laboratory Communications
Laboratory personnel will use the Phone Communications Form (COM-1) in the same manner as
the FS, as described in Section 21.1.1.

21.1.2.1  Filter Shipment
Twice monthly, filters will be shipped to the field offices by Federal Express (FedEx) or another
approved courier. On the day of shipment, the LA will communicate with the FS and will
provide the following information:

    •   Date of shipment
    •   Number of filter cassettes in shipment
    •   Number of boxes in shipment
    •   Airbill number.
The LA will also send the FS an e-mail that contains this information.
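The shipment information listed above amounts to a small structured record. The sketch below shows one hypothetical way such a notification might be represented and turned into an e-mail body; all class and field names are invented for illustration and are not part of the FED.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class FilterShipmentNotice:
    """Hypothetical record of the items the LA reports on the day of shipment."""
    ship_date: date
    cassette_count: int
    box_count: int
    airbill_number: str

    def email_body(self) -> str:
        """Format the notice as a short e-mail body for the FS."""
        return (
            f"Filter shipment of {self.ship_date.isoformat()}: "
            f"{self.cassette_count} cassettes in {self.box_count} box(es), "
            f"airbill {self.airbill_number}."
        )

notice = FilterShipmentNotice(date(2009, 3, 6), 24, 2, "794512345678")
print(notice.email_body())
```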

21.1.2.2  Equipment Shipment
Once a month or as needed, the laboratory will ship coolers, maximum/minimum thermometers,
and ice substitutes back to the Regional offices by FedEx. On the day of shipment, the LA will
communicate with the field contact and will provide the following information by e-mail:
    •   Date of shipment
    •   Number of boxes in shipment
    •   Tracking number.

21.2  Reports
This section discusses the various types of reports that will be generated in the PEP.
Table 21-3 summarizes this information.

21.2.1 Progress Reports

Field Progress Reports
The FS will provide a written progress report to his or her Regional WAM/TOPO/DOPO
monthly (PEP Field SOP, Section 2). The deadline is the 15th calendar day of the following
month, unless otherwise specified by the Regional WAM/TOPO/DOPO. The Progress Report
Form (COM-2) will be used to convey the following information:

    •   Reporting date. Beginning and end dates that are covered in the report
    •   Reporter. Person writing the reports
    •   Progress. Progress on field activities, including evaluations scheduled and conducted
       within a reporting date
    •   Issues. Old issues reported in earlier reports that have not been resolved and new issues
       arising within a reporting date
    •   Actions. The action necessary to resolve issues, the person(s) responsible for resolving
       them, and the anticipated dates when they will be resolved.

Laboratory Progress Report
The LA will provide a written progress report to the PEP Laboratory Manager and the Regional
(Laboratory) WAM/TOPO/DOPO every Friday or on the last day of the scheduled work week
(PEP Laboratory SOP,  Section 4). Progress Report Form (COM-2) will be used to convey the
following information:
    •   Reporting date. Beginning and end dates that are covered in the report
    •   Reporter. Person writing the reports
    •   Progress. Progress on field activities
       -   Pre-sampling processing. Filters prepared within a reporting date
       -   Post-sampling processing. Filters weighed within a reporting date and data
           submitted to the AQS
       -   Shipments. Shipments made to each Region within a reporting date
       -   Receipt. Total number of filters received within a reporting date
    •   Issues.
       -   Old issues. Issues reported in earlier reports that have not been resolved
       -   New issues. Issues arising within a reporting date
    •   Actions. Action necessary to resolve issues, including the person(s) responsible for
       resolving them and the anticipated dates when they will be resolved.
In addition, an updated Filter Inventory and Tracking Form (COC-1) will be included with the
weekly progress report. The LA will maintain a complete record of the weekly progress reports
in a three-ring binder.

21.2.2 QA Reports
Various QA reports will be developed to document the quality of data for the PEP. For more
information about reporting time lines, see Element 6.0, Project/Task Description, Section
6.4.6. The types of reports include the following:
DQA. This assessment is a scientific and statistical evaluation performed annually to determine
if data are of the right type, quality, and quantity to support their intended use. The PEP QA/QC
data can be statistically assessed at various levels of aggregation to determine its quality.
Element 24.0, Reconciliation with Data Quality Objectives, discusses the statistics to be used to
evaluate the data in relation to the DQOs. DQAs will primarily be the responsibility of the EPA
Regions (Regional assessments) and OAQPS (national assessments).
P&A reports. These reports will be generated quarterly and annually and will evaluate the
precision, accuracy, and bias data against the acceptance criteria using the statistics documented
in 40 CFR Part 58. OAQPS will be responsible for generating these reports through the AQS.
Assessment reports. TSAs will be on file at EPA's Regional offices and OAQPS.
QA reports. A QA report provides an evaluation of QA/QC data for a given time period to
determine whether the DQOs were met. QA reports will be more evaluative in nature than the
P&A reports because they will combine the various assessments and the QA data to report on the
overall quality system.  OAQPS will generate Annual QA Summary Reports and 3-year QA
Reports on the PEP and its resultant data quality.
The Annual QA Summary Reports will include the following information:
    •   Program overview and update
    •   Quality objectives for measurement data
    •   Implementation aspects
       -  Training and certifications
       -  Laboratory QA requirements (QC checks, TSAs, and data validation)
       -  Field QA requirements (QC checks, standards certifications, and TSAs)
    •   DQAs
       -  Laboratory and field controls
       -  Precision (based on collocated data)
       -  Accuracy and bias (based on collocated data, flow rate performance audits)
       -  Completeness (PEP results versus FRM/FEM results)
    •   Summary
The 3-year QA Report is a composite of the annual reports, but with a more narrative
interpretation and evaluation of longer term trends with respect to PEP sampler and operational
performance.

21.2.3 Response/Corrective Action Reports
During TSAs, the response/corrective action reporting procedure will be followed whenever
there is an assessment finding. The reporting procedure is designed as a closed-loop system. The
Response/Corrective Action Report Form identifies the originator (who reported and identified
the problem), states the problem, and may suggest a solution. The form also indicates the name
of the person(s) assigned to correct the problem. The assignment of personnel to address the
problem and the schedule for completion will be filled in by the appropriate supervisor. The
reporting procedure closes the loop by requiring that the recipient state on the form how the
problem was resolved and the effectiveness of the solution. Copies of the completed
Response/Corrective Action Report Form will be distributed twice: first when the problem has
been identified and the action has been scheduled; and second when the correction has been
completed. The originator, the Regional  (field or laboratory) WAM/TOPO/DOPO, and the
National PEP Project Leader will be included in both distributions.

21.2.4 Control Charts with Summary
Control charts for field and laboratory instruments will be updated after every new calibration or
standardization as defined in the relevant PEP Field and Laboratory SOPs. FSs and LAs are
responsible for reviewing each control chart immediately after it is updated and for taking
corrective actions whenever an out-of-control condition  is observed. Control charts are to be
reviewed at least quarterly by the PEP Laboratory Manager (laboratory instruments) and the
Regional WAM/TOPO/DOPOs. Control charts are also  subject to inspection during TSAs, and
laboratory personnel are responsible for maintaining a readily accessible file of control charts for
each instrument.
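As a minimal illustration of the review step above, an out-of-control condition can be detected by comparing new measurements against control limits derived from a baseline period. The mean ± 3-sigma limit convention and all names below are illustrative assumptions; the actual limits and corrective-action procedures are those defined in the PEP Field and Laboratory SOPs.

```python
# Hypothetical sketch: flag control-chart points outside mean +/- k*sigma
# limits computed from a baseline set of measurements. The 3-sigma default
# is an assumption for illustration, not a PEP SOP requirement.
from statistics import mean, stdev

def out_of_control(baseline, new_points, k=3.0):
    """Return the new points that fall outside mean +/- k*stdev of baseline."""
    center = mean(baseline)
    sigma = stdev(baseline)
    lower, upper = center - k * sigma, center + k * sigma
    return [x for x in new_points if x < lower or x > upper]

# Example: repeated weighings of a laboratory QC standard (g).
baseline = [3.0001, 3.0000, 2.9999, 3.0002, 2.9998, 3.0001]
print(out_of_control(baseline, [3.0000, 3.0010]))
```

Any point the check returns would trigger the corrective actions described in the relevant SOP before further measurements are accepted.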

21.2.5 Data Reporting
The data reporting requirements of 40 CFR Part 58.35 apply to those stations designated as
SLAMS or NCore.  Required accuracy and precision data are to be reported, at a minimum, on
the same  schedule as quarterly routine monitoring data submittals; however, it is anticipated that
data will be reported to the AQS within 25 days of receiving the filter from the field. The
required reporting periods and due dates for SLAMS and NCore sites are listed in Table 21-2.
                Table 21-2. Quarterly SLAMS/NCore Reporting Schedule

    Reporting period             Due on or before
    January 1-March 31           June 30
    April 1-June 30              September 30
    July 1-September 30          December 31
    October 1-December 31        March 31 (following year)
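The schedule in Table 21-2 amounts to a simple date mapping: a sample's reporting quarter determines a "due on or before" date at the end of the following quarter. The sketch below illustrates that mapping; the function name and date handling are illustrative assumptions, not part of any PEP or AQS system.

```python
# Hypothetical sketch of the Table 21-2 schedule: map a routine sample date
# to the AQS "due on or before" date for its reporting quarter.
from datetime import date

def aqs_due_date(sample_date):
    """Return the Table 21-2 due date for the quarter containing sample_date."""
    quarter = (sample_date.month - 1) // 3 + 1          # 1..4
    due_quarter_end = {1: (6, 30), 2: (9, 30), 3: (12, 31), 4: (3, 31)}
    month, day = due_quarter_end[quarter]
    # Fourth-quarter data are due March 31 of the following year.
    year = sample_date.year + (1 if quarter == 4 else 0)
    return date(year, month, day)

print(aqs_due_date(date(2009, 3, 31)))   # first-quarter sample -> due June 30
```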
PEP audit results are posted to the AQS as paired data. The data pair comprises the PEP audit
measurement and the site sampler's routine measurement. The site measurement value is taken
from the site's posted AQS data for the date of the audit at the applicable sampler (POC).
Because both measured values are needed to report PEP audits to the AQS, the PEP audit results
will not be available until approximately 30 days after the dates listed in Table 21-2 (to allow
time for processing and data approvals).
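For illustration, the paired comparison behind these postings is commonly expressed as the percent difference of the routine (site) measurement relative to the PEP audit value, the convention used in the 40 CFR Part 58, Appendix A bias statistics. The function below is a sketch of that calculation under this assumed convention; the names are illustrative.

```python
# Hypothetical sketch: percent difference of a routine (site) measurement
# relative to the paired PEP audit measurement. The sign convention
# (routine minus PEP, relative to PEP) is assumed from the Part 58,
# Appendix A bias statistics.
def percent_difference(routine_ugm3, pep_ugm3):
    """d = (routine - PEP) / PEP * 100, in percent."""
    return (routine_ugm3 - pep_ugm3) / pep_ugm3 * 100.0

# Example pair: site reports 10.5 ug/m3, PEP audit measures 10.0 ug/m3.
print(round(percent_difference(10.5, 10.0), 1))
```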
In cases where the PEP audit results are available, but the routine measurements are not available
before the deadlines in Table 21-2, the PEP audit results will not be posted until the next
quarter's posting. For example, for a routine sample collected on March 31st and posted by the
state on or before June 30th, the associated PEP audit results should be posted to the AQS by
approximately July 31st. If the same  routine sample's result were not available in the AQS until
September 1st, then the PEP audit results would not be posted until approximately January 31st.
Air quality data submitted for each reporting period will be edited, validated, and entered into the
AQS using the procedures described in the AQS User Guide and the AQS Data Coding Manual
(available at http://www.epa.gov/ttn/airs/airsaqs/manuals).
                               Table 21-3. Report Summary

    Report Type                  Frequency   Reporting Organization       Distribution
    Field progress               Monthly     ESAT contractor              Regional WAM/TOPO/DOPO
    Laboratory progress          Weekly      ESAT contractor              PEP Laboratory Manager; Regional
                                                                          (Laboratory) WAM/TOPO/DOPO
    DQA                          1/yr        OAQPS and EPA Regions        ESAT contractor; Regional
                                                                          WAM/TOPO/DOPO; AMTIC
    PEP audit results            Quarterly   OAQPS and authorized         AQS
                                             contractor
    PEP P&A (collocation         2/yr        National PEP Project         FS; Regional WAM/TOPO/DOPO;
      study results)                         Leader                       AMTIC
    TSA (of SLT agencies         1/yr        EPA Region                   ESAT contractor; assessed agency;
      or ESAT)                                                            National PEP Project Leader
    OAQPS systems audit          1/yr        OAQPS                        ESAT contractor; Regional
                                                                          WAM/TOPO/DOPO
    Response/corrective action   1/finding   ESAT contractor              ESAT contractor; Regional
                                                                          WAM/TOPO/DOPO; National PEP
                                                                          Project Leader

-------
                                                                       Project: PEP QAPP
                                                                         Element No: 22.0
                                                                           Revision No: 1
                                                                           Date: 3/6/2009
                                                                       	Page 1 of 6
    22.0  Data Review, Validation, and Verification Requirements

This element describes how the PEP will verify and validate the data collection operations
associated with the program. "Verification" can be defined as confirmation by examination and
provision of objective evidence that specified requirements have been fulfilled. "Validation" can
be defined as confirmation by examination and provision of objective evidence that the
particular requirements for a specific intended use are fulfilled. The major objective of the PEP
is to provide data of adequate quality to use in the comparison to routine data. This section will
describe the verification and validation activities that occur during many of the important data
collection phases. Earlier elements of this QAPP and the PEP Field and Laboratory SOPs
describe how the activities in these data collection phases will be implemented to meet the
DQOs of the program. Review and approval of this QAPP provide initial agreement that the
processes described in the QAPP, if implemented, will provide data of adequate quality.  To
verify and validate the phases of the data collection operation, the PEP will use various
qualitative assessments (e.g., TSAs, network reviews) to verify that the QAPP is being followed
and will rely on the various QC samples, inserted at various phases of the data collection
operation, to validate that the data will meet the DQOs described in Element 7.0, Data Quality
Objectives and Criteria for Measurement.

22.1   Sampling Design

Element 10.0, Sampling Design, describes the sampling design for the network established by
the PEP. It covers the number of PEs required for each reporting organization and method
designation, as well as the frequency of data collection. These requirements have been described
in the CFR; however, it is the responsibility of the PEP to ensure that the intent of the
regulations is properly administered and carried out.

22.1.1  Sampling Design Verification

SLT organizations will work with the EPA Regions to select and develop a list of sites for the
evaluations conducted in each calendar year on or before December 1 of the previous year. The
Regional WAM/TOPO/DOPOs, with the assistance of the ESAT contractors, will attempt to
determine the most efficient site visit schedule, which should be based upon the following:

   •   CFR requirements  for audit frequency as discussed in Element  10.0, Sampling Design
   •   Meeting the same monitoring schedule as the routine sampler being evaluated (to prevent
       the need for the site to run and post an additional sample for the PE)
   •   Site proximity (the sites that are closest in proximity to each other can be visited within
       the same day or week).

The PEP implementation plan can then be reviewed and compared to the AQS data of active
SLAMS and NCore sites aggregated by reporting organization and method designation. This can
ensure that the PEP design is being followed. The implementation plan will also be reviewed
during OAQPS and Regional TSAs.

22.2  Sample Collection Procedures

22.2.1 Sample Collection Verification

Sample collection procedures are described in Element 11.0, Sampling Methods Requirements,
and in detail in the PEP Field SOP to ensure proper sampling and to maintain sample integrity.
The following processes will be used to verify the sampling collection activities:

   •   TSAs. Will be required by OAQPS and by the EPA Regions annually, as described in
       Element 20.0, Assessments and Response Actions
   •   Surveillance. Will be conducted as required by the EPA Regions and will be used for
       frequent monitoring of specific data collection phases.

Both types of assessments will be used to verify that the sample collection activities are being
performed as described in this QAPP and in the PEP Field and Laboratory SOPs. Deviations
from the sample collection activity will be noted in  Audit Finding Forms and will be corrected
using the procedures described in Element 20.0, Assessments and Response Actions.

22.2.2 Sample Collection Validation

The sample collection activity is just one phase of the measurement process. Using QC samples
throughout the measurement process can help validate the activities occurring at each phase. The
review of QC data (e.g., collocated sampling data, field/laboratory/trip blanks, and sampling/
laboratory equipment verification checks) that are described  in Element 14.0, Quality Control
Requirements, and Element 16.0, Instrument Calibration and Frequency, can be used to validate
the data collection activities. Any data that indicate unacceptable levels  of bias or precision or a
tendency (trend on a control chart) will be flagged and investigated. This investigation could
lead to a discovery of inappropriate sampling activities.

22.3  Sample Handling

Element 11.0, Sampling Methods Requirements, and Element 12.0, Sample Handling and
Custody, detail the requirements for sample handling; however, greater detail on both field
and laboratory sample handling procedures appears in the PEP Field SOP (Section 3) and PEP
Laboratory SOP (Section 5), including the types of sample containers and the preservation
methods used to ensure that they are appropriate to the nature of the sample and the type of data
generated from the sample. Due to the size of the filters and the nature of the collected particles,
sample handling is one of the phases where inappropriate techniques can have a significant effect
on sample integrity and data quality.


22.3.1 Verification of Sample Handling

As previously mentioned, TSAs and surveillance will be performed to ensure that the
specifications mentioned in the QAPP and SOPs are being followed. The assessments would
include checks on the identity of the sample (e.g.,  proper labeling and COC records), packaging
in the field, and proper storage conditions (e.g., COC and storage records) to ensure that the
sample continues to be representative of its native environment as it moves through the data
collection operation.

22.3.2 Validation of Sample Handling

Similar to the validation of sampling activities, the review of data from the collocated sampling
and field, laboratory, trip, and lot blanks (described in Element 14.0, Quality Control
Requirements, and Element 16.0, Instrument Calibration and Frequency) and the use of control
charts can be used to validate the sample handling activities. Acceptable precision and bias in
these samples would  lead one to believe that the sample handling activities are adequate. Any
data that indicate unacceptable levels of bias or precision or a tendency (trend on a control chart)
will be flagged and investigated. This investigation could lead to a discovery of inappropriate
sample handling activities that would require corrective action.

22.4  Analytical Procedures

Element 13.0, Analytical Methods Requirements, details the requirements for the analytical
methods, which include the pre-sampling and post-sampling weighing activities. Pre-sampling
weighing activities give each sample a unique identification, establish an initial weight, and
prepare the sample for the field. The post-sampling weighing activities provide the net mass
and the final concentration calculations. The PEP Laboratory SOP, specifically Section 8,
provides the actual procedures. The methods include acceptance criteria (Element 13.0,
Analytical Methods Requirements, and Element 14.0, Quality Control Requirements) for
important components of the procedures, along with suitable codes for characterizing each
sample's deviation from the procedure.

22.4.1 Verification of Analytical Procedures

As previously mentioned, both TSAs and surveillance will be performed to ensure that the
analytical method specifications mentioned in the QAPP and SOPs are being followed. The
assessments will include checks on the identity of the sample. Deviations from the analytical
procedures will be noted in Audit Finding forms and will be corrected using the procedures
described in Element 20.0, Assessments and Response Actions.

22.4.2 Validation of Analytical Procedures

Similar to the validation of sampling activities, the following can be used to validate the
analytical procedures: reviewing data from laboratory blanks, calibration checks, laboratory
duplicates, laboratory records for temperature and relative humidity devices, the Filter Inventory
and Tracking Form (COC-1), and other laboratory QC activities described in Element 14.0
(Quality Control Requirements), Element 16.0 (Instrument Calibration and Frequency),  and in
the PEP Laboratory SOP. Acceptable precision and bias in these samples or control of the
laboratory's temperature and relative humidity conditions would lead one to believe that the
analytical procedures are adequate. Any data that indicate unacceptable levels of bias or
precision or a tendency (trend on a control chart) will be flagged and investigated as described in
Element 14.0, Quality Control Requirements. This investigation could lead to a discovery of
inappropriate analytical procedures that would require corrective action.

22.5  Quality Control

Element 14.0, Quality Control Requirements, and Element 16.0, Instrument Calibration and
Frequency of this QAPP specify the QC checks that are to be performed during sample
collection, handling, and analysis. These include analyses of check standards, blanks, and
duplicates, which indicate the quality of data being produced by specified components of the
measurement process. For each specified QC check, the procedure, acceptance criteria, and
corrective action are specified in PEP Field and Laboratory SOPs.

22.5.1 Verification  of Quality Control Procedures

As previously mentioned, TSAs and surveillance will be performed to ensure that the QC
method specifications mentioned in the QAPP are being followed.

22.5.2 Validation of Quality Control Procedures

Validation activities  of many of the other data collection phases mentioned in this  subsection use
the QC data to validate the proper and adequate implementation of that phase. Therefore,
validation of QC procedures will require a review of the documentation of the corrective actions
that were taken when QC samples failed to meet the acceptance criteria and a review of the
potential effect of the corrective actions on the validity of the routine data. Element 14.0, Quality
Control Requirements, describes the techniques that are used to document QC review/corrective
action activities.

22.6  Calibration

Element 16.0, Instrument Calibration and Frequency, as well as the field (Element 11.0,
Sampling Methods Requirements) and the analytical (Element 13.0, Analytical Methods
Requirements) sections of this QAPP detail the calibration activities and requirements for the
critical pieces of equipment for the PEP.  The PEP Field SOP (Section 10) and the PEP
Laboratory SOP (Section 7) provide detailed calibration techniques.

22.6.1  Verification of Calibration Procedures
As previously mentioned, TSAs and surveillance will be performed to ensure that the calibration
specifications and corrective actions mentioned in the QAPP are being followed. Deviations
from the calibration procedures will be noted in Audit Finding forms and will be corrected using
the procedures described in Element 20.0, Assessments and Response Actions.

22.6.2 Validation of Calibration Procedures

Similar to the validation of sampling activities, the review of the calibration data described in
Element 14.0, Quality Control Requirements, and Element 16.0, Instrument Calibration and
Frequency can be used to validate calibration procedures. Calibration data within the acceptance
requirements would lead one to believe that the sample collection measurement devices are
operating properly. Any data that indicate unacceptable levels of bias or precision or a tendency
(trend on a control chart) will be flagged and investigated as described in Element 14.0, Quality
Control Requirements, and Element 16.0, Instrument Calibration and Frequency. This
investigation could lead to a discovery of inappropriate calibration  procedures or equipment
problems that would require corrective action as detailed in the element. Validation would
include the review of the documentation to ensure that corrective action was taken as prescribed
in the QAPP.

22.7   Data Reduction and Processing

22.7.1 Verification of Data Reduction and Processing Procedures

As previously mentioned, TSAs and surveillance will be performed to ensure that the data
reduction and processing activities mentioned in the QAPP are being followed.

22.7.2 Validation of Data Reduction and Processing Procedures

As part of the ADQ discussed in Element 20.0, Assessments and Response Actions, a number of
randomly chosen Sample IDs will be identified. All raw data files associated with those samples,
including those that contain the following, will be selected:

   •   Pre-sampling weighing activity (e.g., lot testing)
   •   Pre-sampling weighing
   •   Sampling (sampler download information)
   •   Calibration (information represented from that sampling period)
   •   Sample handling/custody
   •   Post-sampling weighing
   •   Corrective action
   •   Data reduction
These raw data will be reviewed and final concentrations will be calculated independently of the
PEP database to determine if the final values submitted to the AQS are comparable to the
independent calculations. The data will also be reviewed to ensure that flags or any other data
qualifiers have been appropriately associated with the PE database reports and that appropriate
corrective actions were taken.
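The independent recalculation described above can be sketched as the basic mass-over-volume computation. The 16.67 L/min design flow comes from the sampler specification; the function and variable names below are illustrative assumptions, not the PEP database's actual routines.

```python
# Hypothetical sketch of the independent concentration check:
# PM2.5 concentration = net filter mass / volume of air sampled.
def pm25_concentration(pre_weight_mg, post_weight_mg, flow_lpm, minutes):
    """Return concentration in ug/m^3 from filter weights (mg), flow, and time."""
    net_mass_ug = (post_weight_mg - pre_weight_mg) * 1000.0   # mg -> ug
    volume_m3 = flow_lpm * minutes / 1000.0                   # L -> m^3
    return net_mass_ug / volume_m3

# 24-hour sample at the 16.67 L/min design flow (about 24 m^3 of air).
print(round(pm25_concentration(150.000, 150.360, 16.67, 1440), 1))
```

A reviewer would compare this independently computed value against the final concentration submitted to the AQS for the same Sample ID.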

-------
                                                                        Project: PEP QAPP
                                                                          Element No: 23.0
                                                                            Revision No: 1
                                                                            Date: 3/6/2009
                                                                              Page 1 of 10
                 23.0 Validation and Verification Methods

Many of the processes for verifying and validating the measurement phases of the PEP data
collection operation have been discussed in Element 22.0, Data Review, Validation, and
Verification Requirements. If these processes, as written in the QAPP, are followed, then the
PEP should obtain the necessary data quality to permit comparison of PEP with the routine
primary samplers. However, exceptional field events may occur and field and laboratory
activities may negatively affect the integrity of samples. In addition, it is expected that some of
the QC checks will fail to meet the acceptance criteria. Information about problems that affect
the integrity of data is identified in the form of flags (see Appendix D). It is important to
determine how these failures affect the routine data. The routine data and their associated QC
data will be verified and validated on a sample basis, on groups of samples, and on a sample
batch basis. Element 14.0, Quality Control Requirements, discusses the concept and use of
sample batching.

23.1  Process for Validating and Verifying Data

23.1.1 Verification of Sample Batches

After a sample batch is completed, a thorough review of these data will be conducted for
completeness and data entry accuracy. Data used in FED audit calculations or used for
evaluating critical validation criteria that are recorded on data sheets by hand will be  100%
verified. After these data are entered into the FED, the system will review the data for routine
data outliers and data outside of acceptance criteria or ranges. These data will be flagged
appropriately. All flagged data will be "re-verified" to ensure that the values are correctly
entered. Details of these activities are discussed in Element 19.0, Data Management.  The  data
qualifiers or flags can be found in Appendix D.

23.1.2 Validation

Validation of measurement data can occur at the following different levels: at the single sample
level, on a group of samples that are related (either to a single instrument,  operator, or a pre- or
post-weighing session), or at the sample batch level. Validation at these three levels is discussed
below.

The FED contains automated procedures to assist in the validation process. For  instance, the
FED performs automated QC checks for many of the criteria defined in the CFR, as well as the
more stringent QC checks required by the PEP. These checks are illustrated in the PEP
Validation Matrix (Figure 23-1). The FED produces a PE Summary Report, which details all of
the relevant data associated with a particular PE along with the pass/fail status of the  automated
checks. During validation review, the LA has the ability to override the pass/fail status (with a
note documenting reasons for the override). All overrides must be approved by the PEP
Laboratory QA Officer.

[Figure 23-1 (PEP validation matrix) is a flowchart that is not reproduced here. It groups checks
into critical criteria, operational evaluation criteria, sample batch validation with major and
minor flags, and undefined checks. Recoverable criteria include: filter used within 30 days of
pre-weighing; filter stability test (≤ 15 µg between consecutive weighings for 2 of 3 filters);
sample period of 1380-1500 min; mean flow rate within ±5% of 16.67 Lpm; flow rate CV ≤ 2%;
no flow rate excursions beyond the specified percentage for more than 5 min; no filter
temperature excursions of > 5 °C for more than 30 min; filter recovered within 24 hr of the
sample end date, kept at ≤ 4 °C, and post-weighed within 30 days (no flag override allowed);
and limits on the difference between the primary and PE measurements that depend on whether
the PM2.5 concentrations are above or below 5 µg/m³.]

                                Figure 23-1. PEP validation matrix.
At least one flag will be associated with an invalid sample: the "INV" flag signifies that a
sample is invalid, and the "NAR" flag is used when no analysis result is reported. Additional
flags will usually accompany the NAR or INV flag to help describe the reason(s) for
invalidation. In addition, free-form notes from the FS or LA are often associated with the
sample to describe those reasons further.
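The flag structure described above (a primary INV or NAR flag plus associated reason flags and
free-form notes) can be represented as a simple record. The field names and example values here
are hypothetical, not the FED's actual layout.

```python
# Illustrative record of an invalidated sample: the primary INV or NAR
# flag plus the descriptive flags and free-form notes that explain it.
# Field names and the example filter ID are hypothetical.
from dataclasses import dataclass, field

@dataclass
class SampleFlags:
    filter_id: str
    primary_flag: str                                 # "INV" or "NAR"
    reason_flags: list = field(default_factory=list)  # e.g. ["DAM", "HTE"]
    notes: list = field(default_factory=list)         # free-form FS/LA notes

damaged = SampleFlags("T2500123", "NAR",
                      reason_flags=["DAM"],
                      notes=["Filter torn during recovery (FS note)"])
print(damaged.primary_flag, damaged.reason_flags)
```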

Records of all invalid samples will be filed by the LA. Information will include a brief summary
of why the sample was invalidated, along with the associated flags. This record will be available
from the FED because all filters that were pre-weighed will be recorded.

23.1.2.1  Validation of Single Samples or Groups of Samples

The PEP validation criteria are based upon the CFR criteria and the judgment of the PEP
Workgroup. These criteria will be used to validate a sample or groups of samples. The flags
listed in Appendix D will be used to assist in the validation activities.

Samples flagged in the field will always be returned to the weighing laboratory for further
examination. When the LA reviews the FDS and COC Form, he or she will look for flag values.
Filters that have flags related to obvious contamination (CON), filter damage (DAM), or field
accidents (FAC) will be immediately examined. Upon concurrence with the PEP Laboratory QA
Officer, these samples will be invalidated. The flag for no analysis result (NAR) will be applied
to this sample, along with the other associated flags.

A single sample may be invalidated based on many criteria, such as known or suspected field or
laboratory contamination, field or laboratory accidents, or failure of CFR acceptance criteria.
Table 23-1 lists the cases where single samples or groups of samples may be invalidated based
on failure of any one acceptance criterion (i.e., a critical criterion).

Flags may be used in combination to invalidate samples. Table 23-2  identifies the operational
evaluation criteria that can be used in combination to invalidate single samples or groups of
samples. Because the possible flag combinations are too numerous to be anticipated in advance,
the PEP will review the flags associated with single values or groups of samples and determine
invalidation criteria. The PEP will keep a record of the combinations of flags that result in
invalidation. These combinations will be listed and used by the weighing laboratory to ensure
that the PEP evaluates and invalidates data consistently. The PEP anticipates the use of a scoring
system (under development) to further ensure consistency in validation decisions. As previously
mentioned, all data invalidation will be documented.

-------
Table 23-1. Validation Template Where Failure of Any One Criterion Would Invalidate a Sample
                                 or a Group of Samples

CRITERIA DEFINED IN CFR — SAMPLES OR GROUPS OF SAMPLES INVALIDATED FOR ANY FAILED CRITERION

Requirement | Type | Frequency | Acceptance Criteria | 40 CFR Reference | Flag Value

Filter Holding Times
  Sample recovery | S | All filters | <48 hours from sample end date (override permissible) | Not described | HTE
  Sample recovery | S | All filters | <96 hours from sample end date (cannot be overridden) | Part 50, Appendix L, Section 10.10 | HTE
  Post-sampling weighing | S | All filters | <15 days at 4°C from sample end date (override permissible) | Not described | HTE
  Post-sampling weighing | S | All filters | <30 days at 4°C from sample end date (cannot be overridden) | Not described | HTE

Sampling Period
  Sampling period | S | All data | 1,380-1,500 min | Part 50, Appendix L, Section 3.3 | EST

Sampling Instrument
  Flow rate | S | Every 24 hours of operation | <4% of design flow (16.67 Lpm) | Part 50, Appendix L, Section 7.4 | FLR
  Flow rate | S | Every 24 hours of operation | <2% CV | Part 50, Appendix L, Section 7.4.3.2 | FLR
  Flow rate | S | Every 24 hours of operation | No flow rate excursions > ±5% for > 5 min | Part 50, Appendix L, Section 7.4.3.1 | FVL

Filter
  Visual defect check | S | All filters | See reference | Part 50, Appendix L, Section 6.0 | DAM

Filter Conditioning Environment
  Equilibration | G | All filters | 24 hours minimum in weighing room | Part 50, Appendix L, Section 8.2 | ISP
  Temperature range | G | All filters | 24-hr mean 20°C-23°C | Part 50, Appendix L, Section 8.2 | ISP
  Temperature range | G | All filters | 18°C minimum, 25°C maximum | Not described | ISP
  Temperature control | G | All filters | ±2°C SDᵃ over 24 hr | Part 50, Appendix L, Section 8.2 | ISP
  Relative humidity range | G | All filters | 24-hr mean 30%-40% relative humidity | Part 50, Appendix L, Section 8.2 | ISP
  Relative humidity range | G | All filters | 25% relative humidity minimum, 45% relative humidity maximum | Not described | ISP
  Relative humidity control | G | All filters | ±5% SDᵃ over 24 hr | Part 50, Appendix L, Section 8.2 | ISP
  Pre-/post-sampling relative humidity | S/G | All filters | ±5% relative humidity |  | ISP

NOTE: S = single filter; G = group of filters (i.e., batch); G1 = group of filters from one instrument
ᵃ Variability estimate not defined in CFR

-------
Table 23-2. Validation Template Where Certain Combinations of Failures May Be Used
                    to Invalidate a Sample or Group of Samples

OPERATIONAL EVALUATIONS

Requirement | Type | Frequency | Acceptance Criteria | 40 CFR Reference | Flag Value

Filter Checks
  Lot exposure blanks | G | 3 filters from each of 3 boxes in lot (9 filters total) | ±15 µg change between weighings | Not described | CON
  Filter integrity (exposed) | S | Each filter | No visual defects | Part 50, Appendix L, Section 10.2 | DAM

Filter Holding Times
  Pre-samplingᵃ | S | All filters | <30 days from pre-weigh to sampling | Part 50, Appendix L, Section 8.3 | HTE
  Sample recovery | S | All filters | <24 hours from sample end date | Not described | HTE
  Post-sampling weighing | S | All filters | <10 days at 4°C from sample end date | Not described | HTE

Detection Limit
  Lower detection limit | G/G1 | All data | 2 µg/m³ | Part 50, Appendix L, Section 3.1 | BDL
  Upper concentration limit | G/G1 | All data | 200 µg/m³ | Part 50, Appendix L, Section 3.2 | NA

Laboratory QC Checks
  Field filter blankᵃ | G/G1 | 1/audit (for programs <2 yrs old); 1/FS per trip (for all others)ᵇ | ±30 µg change between weighings | Part 50, Appendix L, Section 8.3 | FFB
  Laboratory filter blankᵃ | G | 10% or 1/weighing session | ±15 µg change between weighings | Part 50, Appendix L, Section 8.3 | FLB
  Trip filter blank | G | 10% of all filtersᶜ | ±30 µg change between weighings | Not described | FTB
  Balance check | G | Beginning/end of weighing session and 1 after approximately every 15 samples or fewer, per recommendations of the balance's manufacturer | <3 µg | Part 50, Appendix L, Section 8.3 | FQC
  Duplicate filter weighing | G | 1/weighing session, 1 carried over to next session | ±15 µg change between weighings | Part 50, Appendix L, Section 8.3 | FLD

Sampling Instrument
  Filter temperature sensor | S | Every 24 hours of operation | No excursions of >5°C lasting longer than 30 min | Part 50, Appendix L, Section 7.4 | FLT

Accuracy
  Flow rate auditᵃ | G1 | 4/yr (manual) | ±4% of calibration standard at design flow (16.67 Lpm) | Part 58, Appendix A, Section 3.5.1 | FQC
  External leak checkᵃ | G1 | 4/yr | <80 mL/min | Part 50, Appendix L, Section 7.4.6 | FQC
  Internal leak checkᵃ | G1 | 4/yr (if external leak check fails) | <80 mL/min | Part 50, Appendix L, Section 7.4.6 | FQC
  Temperature auditᵃ | G1 | 4/yr | ±2°C of calibration standard | Part 50, Appendix L, Section 9.3 | FQC
  Barometric pressure auditᵃ | G1 | 4/yr | ±10 mm Hg of calibration standard | Part 50, Appendix L, Section 7.4 | FQC
  Balance audit (PE) | G | 2/yr | ±20 µg of NIST-traceable standard; ±15 µg for unexposed filters | Not described | FQC

Precision (using collocated samplers)ᵈ
  All samplers (mandatory) | G | 2/year (semi-annual) | CV <10% | Part 58, Appendix A, Sections 3.5 and 5.5 | PCS

Verification
  Single-point flow rate verification | G1 | Every sampling event | ±4% of working standard or 4% of design flow (16.67 Lpm)ᵃ | Part 50, Appendix L, Section 9.2.5 | FSC
  External leak check | G1 | Every sampling event | <80 mL/minᵃ | Part 50, Appendix L, Section 7.4 | LEK
  Internal leak check | G1 | Upon failure of external leak check | <80 mL/minᵃ | Part 50, Appendix L, Section 7.4 | LEK
  Single-point temperature verificationᵃ | G1 | Every sampling event and following every calibration | ±2°C of working standard | Part 50, Appendix L, Section 9.3 | FSC
  Single-point barometric pressure verificationᵃ | G1 | Every sampling event and following every calibration | ±10 mm Hg | Part 50, Appendix L, Section 7.4 | FSC
  Clock/timer verification | G1 | Every sampling event | 1 min/mo | Part 50, Appendix L, Section 7.4.12 | NA
  Laboratory temperature verification | G | 1/quarter | ±2°C | Not described | FLT
  Laboratory relative humidity verification | G | 1/quarter | ±2% relative humidity | Not described | FLH

NOTE: S = single filter; G = group of filters (i.e., batch); G1 = group of filters from one instrument
ᵃ Identified in the CFR
ᵇ For a new SLT program (i.e., <2 years old), the frequency for field blanks is one per FRM/FEM audit. For all others, one field blank should be performed per FS per
  trip. A trip may include audits for more than one FRM/FEM sampler. It is up to the FS to determine at which site to perform the field blank audit, unless otherwise
  directed by the Regional WAM/TOPO/DOPO (such as when a problem is identified at a particular site).
ᶜ Trip blanks will be performed at a frequency of 10% of all filters, as determined by the weighing laboratory (i.e., 1 per every 10 filters shipped out, rounded up). So if
  the laboratory sends out 1 to 10 filters, then one trip blank should be included in the shipment. If the laboratory ships out 11 to 20 filters, then two trip blanks should
  be included. The FS will determine with which trip to use the trip blank filter(s), in a manner similar to the field blanks. However, if the FS receives more than one
  trip blank in a shipment, then he or she must make sure that only one trip blank is carried per trip.
ᵈ Twice per year, all of the PEP samplers used by the Region (and any SLT organizations that are running their own PEP) must be collocated and run at the same
  location over the same time period. These are often referred to as "parking lot collocations."
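The trip-blank frequency rule in footnote c of Table 23-2 (one blank per every 10 filters
shipped, rounded up) is simply a ceiling division:

```python
# Footnote c's trip-blank frequency: one blank per 10 filters shipped,
# rounded up -- i.e., ceil(n / 10). So 1-10 filters need one trip blank,
# 11-20 need two, and so on.
import math

def trip_blanks_required(filters_shipped: int) -> int:
    return math.ceil(filters_shipped / 10)

for n in (1, 10, 11, 20, 21):
    print(n, trip_blanks_required(n))
```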

23.1.2.2  Validation of Sample Batches

Due to the nature and holding times of the routine samples, it is critical that the PEP minimize
the amount of data that is invalidated; therefore, the PEP will validate data on sample batches as
described in Element 14.0, Quality Control Requirements. Based on the types of QC samples
that are included in the batch and on the field and laboratory conditions that are reported along
with the batch (field/laboratory flags), the PEP has developed a validation template that will be
used to determine when PE data will be invalidated and when major corrective actions must be
instituted. Table 23-3 represents the sample batch validation template.

                     Table 23-3. Sample Batch Validation Template

Requirement | Number Per Batch | Audit Acceptance Criteria | Majorᵃ | Minorᵇ | Flag

Blanks
  Field blanks | 1 | ≤ ±30 µg | Blank > ±40 µg | One blank > ±30 µg | FFB
  Field blanks | >1 | Mean ≤ ±30 µg | Mean > ±30 µg |  | FFB
  Laboratory blanks | 1 | ≤ ±15 µg | Blank > ±17 µg | Blank > ±15 µg | FLB
  Laboratory blanks | >1 | Mean ≤ ±15 µg | Mean > ±15 µg |  | FLB
  Trip blanks | 1 | ≤ ±30 µg | Blank > ±40 µg | One blank > ±30 µg | FTB
  Trip blanks | >1 | Mean ≤ ±15 µg | Mean > ±30 µg |  | FTB

Precision Checks
  Filter duplicates | 1 | ≤ ±15 µg | Duplicate > ±17 µg | Duplicate > ±15 µg | FLD

Accuracy
  Balance checks | 4 | ≤ ±3 µg | Four checks > ±3 µg | Two checks > ±3 µg | FIS

ᵃ If two majors occur, data are invalidated.
ᵇ If four minors occur, data are invalidated. Two minors equal one major.
A sample batch may be invalidated based on the number of major and minor flags associated with
it. Either the FED or the LAs will evaluate the batch and generate a report based on the results
described in the validation template. If the report indicates that the batch of data should be
invalidated, then the batch will be re-analyzed. Prior to re-analysis, every effort will be made
to take corrective actions and, depending on the type of QC checks that fell outside of
acceptance criteria, to correct the problem. If the batch remains outside the criteria, then the
routine samples will be flagged invalid (INV).
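The major/minor bookkeeping in the footnotes of Table 23-3 (two minors equal one major; two
majors, or equivalently four minors, invalidate the batch) reduces to a short calculation:

```python
# Table 23-3 footnote arithmetic: two minors equal one major, and a batch
# is invalidated once it accumulates two effective majors (equivalently,
# four minors).

def batch_invalid(majors: int, minors: int) -> bool:
    effective_majors = majors + minors // 2
    return effective_majors >= 2

print(batch_invalid(2, 0))  # two majors -> invalid
print(batch_invalid(0, 4))  # four minors -> invalid
print(batch_invalid(1, 1))  # one major, one minor -> still valid
```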

23.1.3  Validation Acceptance and Reporting

Every effort will be made to produce valid results. Any data flagged as invalid, with the
exception of obvious filter damage or accidents, will be re-analyzed.
The PEP Laboratory QA Officer will be responsible for determining that data have been
validated before submittal to the AQS. A summary report of all data that were invalidated, along
with explanations for batch failures, will be submitted to the PEP Laboratory Manager and the
Regional (Laboratory) WAM/TOPO/DOPO each week.

Invalidated PEP audit events cannot be posted to the AQS because there is currently no
provision in the AQS precision data record format ("RP" transaction type), for adding null value
codes or data qualifiers.

-------
                                                                        Project: PEP QAPP
                                                                         Element No: 24.0
                                                                           Revision No: 1
                                                                           Date: 3/6/2009
                                                                       	Page 1 of 2
            24.0  Reconciliation with Data Quality Objectives

The DQOs for the PEP are prescribed in the QA provisions of the Federal Monitoring
Regulations at 40 CFR Part 58, Appendix A, most recently revised on October 17, 2006. They
are described in this QAPP in Element 7.0, Data Quality Objectives and Criteria for
Measurement. This element of the QAPP outlines the procedures that the PEP will follow to
determine whether the monitors and laboratory analyses are producing data that are  sufficiently
consistent to evaluate the bias  of the National PM2.5 FRM/FEM network (i.e., to ensure that the
PEP is meeting its DQOs).

The underlying premise for using the data from the PEP to estimate the bias associated with the
National PM2.5 FRM/FEM network is that the PEP represents the most precise and least biased
measurements of PM2.5 using the FRM. Therefore, the promulgated DQOs are not static goals
with respect to data quality, but they are actually goals that may be periodically revised to reflect
progressive scientific developments and experience. Improvements in the precision  and bias of
the PEP-generated data are constant goals.  To this end, the PEP has instituted many QC
procedures and tests, as well as internal evaluations of the samplers, operators, and LAs. The control
limits for the many MQOs that are  associated with the PEP are discussed in other elements of
this QAPP. A variety of mathematical and statistical tools are employed to analyze the QC data
that are generated. Occasionally, a new or different tool will be evaluated for new
or different insights into PEP data.

24.1  Preliminary Review of Available Data

Element 7.0, Data Quality Objectives and Criteria for Measurement, of this QAPP contains the
details for the development of the DQOs. Element 10.0, Sampling Design, of this QAPP contains
the details for the sampling design, including the rationale for the design, the design
assumptions, and the sampling locations and frequency. If changes in the DQOs or sampling
design occur, then the potential effects should be considered throughout the entire DQA.

A preliminary data review should be performed to uncover potential limitations to using the data,
to reveal outliers, and generally to explore the basic structure of the data. The first step is to
review the QA reports. The second step is to calculate basic summary statistics, generate
graphical presentations of the data, and review these summary statistics and graphs.  This review
will be completed by each Region.
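The second step of the preliminary review (summary statistics plus a look for outliers) can be
sketched as follows. The concentration values are fabricated, and the 2-sigma screen is just one
simple choice of outlier test, not the method prescribed by the PEP.

```python
# Sketch of a preliminary data review: basic summary statistics and a
# crude outlier screen. Concentrations (ug/m3) are fabricated examples.
import statistics

conc = [8.1, 7.9, 8.4, 8.0, 7.7, 14.9, 8.2]  # one suspect value

mean = statistics.mean(conc)
sd = statistics.stdev(conc)
# Crude 2-sigma screen; a real review would also use graphical checks.
outliers = [x for x in conc if abs(x - mean) > 2 * sd]
print(round(mean, 2), round(sd, 2), outliers)
```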

24.2  Regional-Level Evaluation of Data Collected While All PEP Samplers
       Are Collocated

Twice per year (semi-annually), all of the PEP samplers used by a single FS or Region must be
collocated and run at the same location over the same time period. These are often referred to as
"parking lot collocations."

The primary objective for collocating all of the samplers is to determine whether one of the
samplers is biased, relative to the performance of the other samplers (of both same and different
FRM make or model) and to estimate the repeatability (or between-monitor precision) of the
instruments. Estimates of the repeatability can be used to evaluate the certainty with which the
bias of the SLT program within each Region can be estimated. Statistical methods will be used to
determine the between-sampler precision and identify individual samplers that yield aberrant
results. Samplers that produce aberrant results will be reported to their respective Regional users.
In each case, the sampler will be quarantined from use in the PEP until  a subsequent
investigation is performed and  its issues have been resolved.
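A between-sampler precision estimate from one collocation event amounts to a coefficient of
variation across the samplers, plus a screen for any sampler that deviates from the group. The
sketch below uses fabricated values, and the greater-than-10% screen is an illustrative choice,
not the PEP's actual statistical method.

```python
# Sketch of between-sampler precision from a "parking lot collocation":
# CV across samplers, plus a simple screen for an aberrant sampler.
# Sampler IDs and 24-hr PM2.5 means (ug/m3) are fabricated.
import statistics

results = {"PEP-01": 9.1, "PEP-02": 9.3, "PEP-03": 9.0, "PEP-04": 11.8}

vals = list(results.values())
mean = statistics.mean(vals)
cv_pct = 100 * statistics.stdev(vals) / mean
aberrant = [sid for sid, v in results.items()
            if abs(v - mean) / mean > 0.10]   # >10% from the group mean
print(round(cv_pct, 1), aberrant)
```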

24.3   National Level Evaluation of Data Collected While All PEP Samplers
       Are Collocated

A major goal of the national review of the data from the collocation of all the PEP samplers is to
determine if the repeatability of the samplers varies greatly by Region or by laboratory. OAQPS
will check for equal variances across all Regions or laboratories by using standard statistical
tests,  such as the Bartlett test (an all-purpose statistical test that can be used for equal and
unequal sample sizes), the Hartley test (a statistical test that requires equal sample sizes but is
designed to find differences between the largest and smallest variances), and Levene's test (an
alternative to Bartlett's test for differences among the dispersions of several groups, with
greater power than Bartlett's for non-normal distributions of data).1,2 EPA will
apply additional methods for data evaluation as deemed appropriate by  OAQPS. New analytical
methods will be reviewed by the PEP Workgroup.  The conclusions from these tests will allow
OAQPS to determine whether corrective action must be taken to reduce the variability for any of
the Regions or laboratories. Corrective action will include a formal review of the training and
operations to see if the cause for the disparity can be uncovered and corrected. With these data,
OAQPS will also be able to evaluate with what certainty the bias of the routine program can be
estimated.
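As a concrete illustration of the equal-variance check, Bartlett's statistic can be computed
directly from its textbook definition. The Region data below are fabricated, and the hardcoded
5.991 is the standard chi-square critical value for alpha = 0.05 with k - 1 = 2 degrees of
freedom; in practice a statistics package would supply the p-value.

```python
# Bartlett's test for equal variances across groups (e.g., Regions),
# implemented from the textbook statistic. Data are fabricated; 5.991 is
# the chi-square critical value for alpha = 0.05, df = k - 1 = 2.
import math
import statistics

groups = [
    [8.0, 8.3, 7.9, 8.2, 8.1],    # Region A
    [9.1, 8.5, 9.4, 8.8, 9.0],    # Region B
    [7.5, 9.9, 6.8, 10.2, 8.1],   # Region C (visibly more variable)
]

k = len(groups)
n = [len(g) for g in groups]
N = sum(n)
s2 = [statistics.variance(g) for g in groups]        # sample variances
sp2 = sum((ni - 1) * v for ni, v in zip(n, s2)) / (N - k)   # pooled variance
B = (N - k) * math.log(sp2) - sum((ni - 1) * math.log(v) for ni, v in zip(n, s2))
C = 1 + (sum(1 / (ni - 1) for ni in n) - 1 / (N - k)) / (3 * (k - 1))
stat = B / C
equal_variances = stat < 5.991   # fail to reject H0 at alpha = 0.05
print(round(stat, 2), equal_variances)
```

Here the third group's larger spread drives the statistic above the critical value, so the
hypothesis of equal variances is rejected; that is the situation that would trigger a formal
review of training and operations for the affected Region or laboratory.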

References

1.  Neter, J., W. Wasserman, and M.H. Kutner.  1985. Applied Linear Statistical Models (2nd
    edition). Homewood, IL: Richard D. Irwin, Inc.

2.  U.S. EPA (Environmental Protection Agency). 2000. Guidance for Data Quality Assessment:
    Practical Methods for Data Analysis; EPA QA/G-9, QA00 Update. U.S. Environmental
    Protection Agency, Office of Environmental Information, Washington, DC.
    EPA/600/R-96/084. July.

-------
                                                                       Project: PEP QAPP
                                                                            Appendix A
                                                                          Revision No: 1
                                                                          Date: 3/6/2009
                                                                            Page 1 of 18
                                   Appendix A

                                     Glossary
The following glossary contains terms commonly used in the PEP. Not all of the terms listed are
necessarily used in this document.

Acceptance criteria — Specified limits that are placed on the characteristics of an item, process, or service
defined in requirements documents (American Society of Quality Control definition).

Accuracy — This term refers to a measure of the closeness of an individual measurement or the average of
a number of measurements to the true value. Accuracy includes a combination of random error (precision)
and systematic error (bias) components that are due to sampling and analytical operations; the U.S.
Environmental Protection Agency (EPA) recommends using the terms "precision" and "bias,"
rather than "accuracy," to convey the information usually associated with accuracy.

Activity — This all-inclusive term describes a specific set of operations of related tasks to be performed,
either serially or in parallel (e.g., research and development, field sampling, analytical operations,
equipment fabrication) that, in total, result in a product or service.

Aerometric Information Retrieval System (AIRS) — See the Air Quality System (AQS).

Air Quality System (AQS) — The AQS, which is EPA's repository of ambient air quality data, stores
data from more than 10,000 monitors,  5,000 of which are currently active. State, local, and Tribal
agencies collect monitoring data and submit it to the AQS periodically. The AQS was formerly the Air
Quality Subsystem of the AIRS, which also contained an Air Facility  System (AFS) that stored
information on pollution sources. After the AFS was separated  from AIRS, the terms AIRS and AQS
became frequently used as synonyms to refer to the ambient air quality database.

American National Standards Institute (ANSI) — ANSI is the administrator and coordinator of the U.S.
private-sector voluntary standardization system.

American Society for Testing and Materials (ASTM) — The ASTM is a professional organization that
develops and distributes protocols for testing and provides reference standards.

Analyst — An analyst is a staff member who weighs the new and used filters and computes the
concentration of PM2.5.

ANSI/ASTM Class 1 and 2 standards — These are the standards for weighing operations with a
microbalance that is certified by its manufacturer as being in conformance with ASTM's standard
specification for laboratory weights and precision mass standards (E 617-9), particularly the Class 1 and 2
specifications. These standards are traceable to the National Institute of Standards and Technology
(NIST).

AQS Monitor ID — This is a 10-digit combination of the AIRS Site ID and POC (see each in this
glossary) that together uniquely defines a specific air sampling monitor for a given pollutant. Some forms
and dialog boxes may refer to this as an AIRS ID or 10-digit AIRS ID.

AQS Site ID — This is a unique identifier for an AQS sampling site. The AQS Site ID is frequently
combined with the Parameter Occurrence Code (POC) (see POC in this glossary) to provide a unique 10-
digit monitor ID. The first nine digits uniquely identify each air monitoring site (two-digit state code,
three-digit county code, and four-digit site code). The tenth digit (POC) identifies the monitor at that site.

The state and county codes are Federal Information Processing Standard (FIPS) codes. The four-digit site
codes are assigned by the local agency, which may allocate them in any way it chooses, as long as there is
no duplication in the county. AQS Site IDs are associated with a specific physical location and address.
Any significant change in location will typically require a new site ID.
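The ID structure described above (2-digit state code, 3-digit county code, 4-digit site code,
and a 1-digit POC) can be split with simple string slicing. The example ID below is made up for
illustration and does not refer to a real monitor.

```python
# Splitting a 10-digit AQS Monitor ID into its documented parts:
# 2-digit state + 3-digit county + 4-digit site, plus the 1-digit POC.
# The example ID is fabricated.

def parse_monitor_id(monitor_id: str) -> dict:
    if len(monitor_id) != 10 or not monitor_id.isdigit():
        raise ValueError("expected a 10-digit AQS Monitor ID")
    return {
        "state":  monitor_id[0:2],   # FIPS state code
        "county": monitor_id[2:5],   # FIPS county code
        "site":   monitor_id[5:9],   # locally assigned site code
        "poc":    monitor_id[9],     # Parameter Occurrence Code
    }

print(parse_monitor_id("0603710021"))
```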

Assessment—This term refers to the evaluation process used to measure the performance or
effectiveness of a system and its elements. As used here, "assessment" is an all-inclusive term that is used
to denote any of the following: an audit, a Performance Evaluation (PE), a management systems review
(MSR), peer review, inspection, or surveillance.

Audit (quality)—A systematic and independent examination to determine whether quality activities and
related results comply with planned arrangements and whether these arrangements are implemented
effectively and are suitable to achieve objectives.

Audit of Data Quality (ADQ)—A qualitative and quantitative evaluation of the documentation and
procedures associated with environmental measurements to verify that the resulting data are of acceptable
quality.

Authenticate—The act of establishing an item as genuine, valid, or authoritative.

Bias—The systematic or persistent distortion of a measurement process, which causes errors in one
direction (i.e., the expected sample measurement is different from the sample's true value).

Blank—A sample that is intended to contain none of the analytes of interest and is subjected to the usual
analytical or measurement process to establish a zero baseline or background value. A blank is sometimes
used to  adjust or correct routine analytical results. A blank is used to detect contamination during sample
handling preparation and/or analysis.

Calibration drift—The deviation in instrument response from a reference value over a period of time
before recalibration.

Calibration—A comparison of a measurement standard, instrument, or item with a standard or
instrument of higher accuracy to detect and quantify inaccuracies and to report or eliminate those
inaccuracies by adjustments.

Cassette—A device that is supplied with PM2.5 samplers to allow a weighed Teflon® filter to be held in
place in the sampler and manipulated before and after sampling without touching the filter and to
minimize damage to the filter and/or sample during such activities.

Certification—The process  of testing and evaluation against specifications  designed to document, verify,
and recognize the competence of a person, organization, or other entity to perform a function or service,
usually  for a specified time.

Chain of custody—An unbroken trail of  accountability that ensures the physical security of samples,
data, and records.

Characteristic—Any property or attribute of a datum, item, process, or service that is distinct,
describable, and/or measurable.

-------
                                                                               Project: PEP QAPP
                                                                                      Appendix A
                                                                                   Revision No: 1
                                                                                   Date: 3/6/2009
	Page 5 of 18


Check standard—A standard that is prepared independently of the calibration standards and analyzed
exactly like the samples. Check standard results are used to estimate analytical precision and to indicate
the presence of bias due to the calibration of the analytical system.

Collocated samples—Two or more portions collected at the same point in time and space, so as to be
considered identical. These samples are also known as "field replicates" and should be identified as such.

Comparability—A measure of the confidence with which one data set or method can be compared to
another.

Completeness—A measure of the amount of valid data obtained from a measurement system compared
to the amount that was expected to be obtained under correct, normal conditions.
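
As a purely illustrative sketch (not part of the glossary itself), completeness is typically reported as the percentage of valid data obtained relative to the data expected; the sample counts below are hypothetical:

```python
def completeness_percent(valid_count, expected_count):
    """Completeness: valid data obtained vs. data expected, as a percentage."""
    return 100.0 * valid_count / expected_count

# Hypothetical example: 58 valid 24-hour samples out of 61 scheduled in a quarter
print(round(completeness_percent(58, 61), 1))  # about 95.1
```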

Computer program—A sequence of instructions suitable for processing by a computer. Processing may
include the use  of an assembler, a compiler, an interpreter, or a translator to prepare the program for
execution. A computer program may be stored on magnetic media and referred to as "software," or it may
be stored permanently on computer chips, referred to  as "firmware." Computer programs covered in a
Quality Assurance Project Plan (QAPP) are those used for design analysis, data acquisition, data
reduction, data  storage (databases), operation or control, and database or document control registers when
used as the controlled source of quality information.

Conditioning environment—A specific range of temperature and relative humidity values in which
unexposed and  exposed filters are to be conditioned for at least 24 hours immediately preceding their
gravimetric analysis.

Confidence interval—The numerical interval constructed around a point estimate of a population
parameter, combined with a probability statement (the confidence coefficient) linking it to the
population's true parameter value. If the same confidence interval construction technique and
assumptions are used to calculate future intervals, then they will include the unknown population
parameter with  the same specified probability.
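
For illustration only, a confidence interval for a population mean can be sketched as below. This is an assumption-laden example, not a procedure from this QAPP: it uses the normal approximation (z = 1.96 for roughly 95% confidence); small samples would call for a t-multiplier instead.

```python
import math

def mean_confidence_interval(data, z=1.96):
    """Approximate two-sided confidence interval for a population mean.

    Normal approximation; z = 1.96 corresponds to ~95% confidence.
    """
    n = len(data)
    mean = sum(data) / n
    # Sample standard deviation (n - 1 in the denominator)
    sd = math.sqrt(sum((x - mean) ** 2 for x in data) / (n - 1))
    half_width = z * sd / math.sqrt(n)
    return mean - half_width, mean + half_width

low, high = mean_confidence_interval([12.1, 11.8, 12.4, 12.0, 12.2])
print(low, high)
```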

Confidentiality procedure—A procedure that is used to protect confidential business information
(including proprietary data and personnel records) from unauthorized access.

Configuration—The functional,  physical, and procedural characteristics of an item, experiment, or
document.

Conformance—An affirmative indication or judgment that a product or service has met the requirements
of the relevant specification, contract, or regulation; also, the  state of meeting the requirements.

Consensus standard—A standard established by a group representing a cross section of a particular
industry or trade, or a part thereof.

Contract Officer's Representative (COR)—The EPA Contract Officer designates this person as the
responsible party for managing the work. Depending  on the contract, the COR could be the Delivery
Order Project Officer (DOPO), the Task Order Project Officer (TOPO), or the Work Assignment
Manager (WAM).

Contractor—Any organization or individual contracting to furnish services or items or to perform work.

Control chart—A graphical presentation of quality control (QC) information over a period of time. If a
procedure is "in control," the results usually fall within established control limits. The chart is useful in
detecting defective performance and abnormal trends or cycles, which can then be corrected promptly.

Corrective action—Any measures taken to rectify conditions adverse to quality and, where possible, to
preclude their recurrence.

Correlation coefficient—A number between -1 and 1 that indicates the degree of linearity between two
variables or sets of numbers. The closer to -1 or +1, the stronger the linear relationship between the two
(i.e., the better the correlation). Values close to zero suggest no correlation between the two variables.
The most common correlation coefficient is the product-moment, which is a measure of the degree of
linear relationship between two variables.
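
As an illustrative aside (not part of the glossary), the product-moment (Pearson) correlation coefficient described above can be computed as:

```python
import math

def pearson_r(x, y):
    """Product-moment correlation coefficient between two equal-length datasets."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# A perfectly linear relationship yields r approximately 1.0
print(pearson_r([1, 2, 3, 4], [2, 4, 6, 8]))
```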

Data of known quality—Data that have the qualitative and quantitative components associated with their
derivation documented appropriately for their intended use; documentation is verifiable and defensible.

Data Quality Assessment (DQA)—The scientific and statistical evaluation of data to determine if data
obtained from environmental operations are of the right type, quality, and quantity to support their
intended use. The five steps of the  DQA process include: 1) reviewing the Data Quality Objectives
(DQOs) and sampling design,  2) conducting a preliminary  data review, 3) selecting the statistical test, 4)
verifying the assumptions of the statistical test, and 5) drawing conclusions from the data.

Data Quality Indicators (DQIs)—The quantitative statistics and qualitative descriptors that are used to
interpret the degree of acceptability or utility of data to the user. The principal data quality indicators are
bias, precision, and accuracy (bias  is preferred); comparability; completeness; and representativeness.

Data Quality Objectives  (DQO) Process—A systematic planning tool to facilitate the planning of
environmental data collection activities. DQOs are the qualitative and quantitative outputs from the DQO
process.

Data Quality Objectives  (DQOs)—The qualitative and quantitative statements derived from the DQO
process that clarify a study's technical and quality objectives, define the appropriate type of data, and
specify tolerable levels of potential decision errors that will be used as the basis for establishing the
quality and quantity of data needed to support decisions.

Data reduction—The process of transforming the number of data items by arithmetic or statistical
calculations, standard curves, and concentration factors and collating them into a more useful form. Data
reduction is irreversible and generally results in a reduced data set and an associated loss of detail.

Data usability—The process of ensuring or determining whether the quality of the data produced meets
the intended use of the data.

Deficiency—An unauthorized deviation from acceptable procedures or practices or a defect in an item.

Demonstrated capability—The capability to meet a procurement's technical and quality specifications
through evidence presented by the supplier to substantiate its claims and in a manner defined by the
customer.

Design change—Any revision or alteration of the technical requirements defined by approved and issued
design output documents and by approved and issued changes thereto.

Design review—A documented evaluation by a team, including personnel such as the responsible
designers, the client for whom the work or product is being designed, and a quality assurance (QA)
representative, but excluding the original designers, to determine if a proposed design will meet the
established design criteria and perform as expected when implemented.

Design—The design refers to specifications,  drawings, design criteria, and performance requirements, as
well as the result of deliberate planning, analysis, mathematical manipulations, and design processes.

Detection limit (DL)—A measure of the capability of an analytical method to distinguish samples that do
not contain a specific analyte from samples that contain low concentrations of the analyte; the lowest
concentration or amount of the target analyte that can be determined to be different from zero by a single
measurement at a stated level of probability. DLs are analyte and matrix specific and may be laboratory
dependent.

Distribution—This term refers to 1) the apportionment of an environmental contaminant at a point over
time, over an area, or within a volume; and 2) a probability function (density function, mass function, or
distribution function) used to describe a set of observations (statistical sample) or a population from
which the observations are generated.

Document control—The policies and procedures used by an organization to ensure that its documents
and their revisions are proposed, reviewed, approved for release, inventoried, distributed, archived,
stored, and retrieved in accordance with the organization's requirements.

Document—Any written or pictorial information describing, defining, specifying, reporting, or certifying
activities, requirements, procedures, or results.

Dry-bulb temperature—The actual temperature of the air, which is used for comparison with the wet-
bulb temperature.

Duplicate samples—Two samples taken from and representative of the same population and carried
through all steps of the sampling and analytical procedures in an identical manner. Duplicate samples are
used to assess variance of the total method, including sampling and analysis (see also collocated
samples).

Electrostatic charge buildup—A buildup of static electrical charge on an item, such as the PM2.5 filter,
which makes it difficult to handle, attracts or repels particles, and can influence its proper weighing.

Environmental conditions—The description of a physical medium (e.g., air, water, soil, sediment) or a
biological system expressed in terms of its physical, chemical, radiological, or biological characteristics.

Environmental data operations—Any work performed to obtain, use, or report information pertaining
to environmental processes and conditions.

Environmental data—Any parameters or pieces of information collected or produced from
measurements, analyses, or models of environmental processes, conditions, and effects of pollutants on
human health and the environment, including results from laboratory analyses or from experimental
systems representing such processes and conditions.

Environmental monitoring—The process of measuring or collecting environmental data.

Environmental processes—Any manufactured or natural processes that produce discharges to, or that
impact, the ambient environment.

Environmental programs—An all-inclusive term that pertains to any work or activities involving the
environment, including but not limited to, the characterization of environmental processes and conditions;
environmental monitoring; environmental research and development; the design, construction, and
operation of environmental technologies; and laboratory operations on environmental samples.

Environmental technology—An all-inclusive term used to describe pollution control devices and
systems, waste treatment processes and storage facilities, and site remediation technologies  and their
components that may be used to remove pollutants or contaminants from, or to prevent them from
entering, the environment. Examples include wet scrubbers (air), soil washing (soil), granulated activated
carbon unit (water), and filtration (air, water). Usually, this term applies to hardware-based systems;
however, it can also apply to methods or techniques used for pollution prevention, pollutant reduction, or
containment of contamination to prevent further movement of the contaminants, such as capping,
solidification or vitrification, and biological treatment.

Equilibration chamber—A clean chamber that is usually constructed of plastic or glass, held at near
constant temperature and relative humidity, and is used to store and condition PM2.5 filters until they and
their collected particulate sample (if the filters have been exposed) have reached a steady  state of moisture
equilibration.

Estimate—A characteristic from the sample from which inferences on parameters can be made.

Evidentiary records—Any records identified as part of litigation and subject to restricted access,
custody, use, and disposal.

Expedited change—An abbreviated method of revising a document at the work location  where the
document is used when the normal change process would cause unnecessary or intolerable delay in the
work.

Field (matrix) spike—A sample prepared at the sampling point (i.e., in the field) by adding a known
mass of the target analyte to a specified amount of the sample. Field matrix spikes are used, for example,
to determine the effect of the sample preservation, shipment, storage, and preparation on analyte recovery
efficiency (the analytical bias).

Field blank filter—New, randomly selected filters that are weighed at the same time that presampling
weights are determined for a set of PM2.5 filters and used for QA purposes. These field blank filters are
transported to the sampling site in the same manner as the filter(s) intended for sampling, installed in the
sampler, removed from the sampler without sampling, stored in their protective containers inside the
sampler's case at the sampling site until the corresponding exposed filter(s) is (are) retrieved, and returned
for postsampling weighing in the laboratory, where they are handled in the same way as an actual sample
filter and reweighed as a QC check to detect weight changes due to filter handling.

Field blank—A blank that provides information about contaminants that may be introduced during
sample collection, storage, and transport. A clean sample is carried to the sampling site, exposed to
sampling conditions, returned to the  laboratory, and treated as an environmental sample.

Field split samples—Two or more representative portions taken from the same sample and submitted for
analysis to different laboratories to estimate inter-laboratory precision.

File plan—A file plan lists the records in your office and describes how they are organized and
maintained. For more information about EPA's File Plan Guide, see
http://www.epa.gov/records/tools/toolkits/filecode (see also records schedule).

Filter chamber assembly—As shown in Figures 5.6 and 5.7 in this Performance Evaluation Program
(PEP) Field Standard Operating Procedure (SOP), this term refers to the mechanism in the interior of the
BGI main unit. This assembly contains the WINS impactor assembly in the upper half and the filter
cassette or holder assembly in the lower half.

Financial assistance—The process by which funds are provided by one organization (usually
governmental) to another organization for the purpose of performing work or furnishing services or items.
Financial assistance mechanisms  include grants, cooperative agreements, and governmental interagency
agreements.

Finding—An assessment conclusion that identifies a condition having a significant effect on an item or
activity. An assessment finding may be  positive or negative, and is normally accompanied by specific
examples of the observed condition.

Goodness-of-fit test—The application of the chi square distribution  in comparing the frequency
distribution of a statistic observed in a sample with the expected frequency distribution based on some
theoretical model.

Graded  approach—The process of basing the level of application of managerial controls applied to an
item or work according to the intended use of the results and the degree of confidence needed in the
quality of the results (see  also Data Quality Objectives (DQO) Process).

Grade—The category or rank given to entities having the same functional use but different requirements
for quality.

Guidance—A suggested practice that is not mandatory; it is intended to be an aid or example in
complying with a standard or requirement.

Guideline—A suggested practice that is not mandatory in programs intended to comply with a standard.

Hazardous waste—Any waste material that satisfies the definition of hazardous waste given in 40 CFR
261, Identification and Listing of Hazardous Waste.

High-efficiency particulate air (HEPA) filter—A HEPA filter is an extended-media, dry-type filter
with a minimum collection efficiency of 99.97% when tested with an aerosol of essentially monodisperse
0.3-µm particles.

Holding time—The period of time a sample may be stored prior to its required analysis. Although
exceeding the holding time does not necessarily negate the veracity of analytical results, it causes the
qualifying or "flagging" of any data not meeting all of the specified acceptance criteria.

Hygrothermograph—An instrument that combines a thermograph and a hygrograph, furnishing on the
same chart a simultaneous time recording of ambient temperature and relative humidity.

Identification error—The misidentification of an analyte. In this error type, the contaminant of concern
is unidentified and the measured concentration is incorrectly assigned to another contaminant.

Independent assessment—An assessment that is performed by a qualified individual, group, or
organization that is not a part of the organization directly performing and accountable for the work being
assessed.

Inspection—The  examination or measurement of an item or activity to verify conformance to specific
requirements.

Internal standard—A standard added to a test portion of a sample in a known amount and carried
through the entire determination procedure as a reference for  calibrating  and controlling the precision and
bias of the applied analytical method.

Item—An all-inclusive term that is used in place of the following: appurtenance, facility, sample,
assembly, component, equipment, material, module, part, product, structure, subassembly, subsystem,
system, unit, documented concepts, or data.

Laboratory analyst—The generic term used to describe the Environmental Sampling and Assistance
Team (ESAT) contractor(s) responsible for the activities described in the PEP SOPs.

Laboratory blank filters—New filters that are weighed at the time of determination of the presampling
(tare) weight of each set of PM2.5 filters intended for field use. These laboratory blank filters remain in the
laboratory in protective containers during the field sampling and are reweighed in each weighing session
as a QC check.

Laboratory split samples—Two or more representative portions taken from the same sample and
analyzed by different laboratories to estimate the inter-laboratory precision or variability and the data
comparability.

Limit of quantitation—The minimum concentration of an analyte or category of analytes in a specific
matrix that can be identified and quantified above the method detection limit and within specified limits
of precision and bias during routine analytical operating conditions.

Local Standard Time—The time used in the geographic location of the sample site that is set to standard
time.  Standard time is used in the Federal Reference Method (FRM) program to match continuous
instruments to filter-based instruments. During the winter months, all areas of the country use standard
time; however, in the summer months, some areas may go to Daylight Saving Time (1 hour ahead of
standard time).

Management system—A structured, nontechnical system describing the policies, objectives, principles,
organizational authority, responsibilities, accountability, and implementation plan of an organization for
conducting work and producing items and services.

Management Systems Review (MSR)—The qualitative assessment of a data collection operation and/or
organization(s) to establish whether the prevailing quality management structure, policies, practices, and
procedures are adequate for ensuring that the type and quality of data needed are obtained.

Management—Those individuals who are directly responsible and accountable for planning,
implementing, and assessing work.

Mass reference standard—The NIST-traceable weighing standards, generally in the range of weights
expected for the filters.

Matrix spike—A sample that is prepared by adding a known mass of a target analyte to a specified
amount of matrix sample for which an independent estimate of the target analyte concentration is
available. Spiked samples are used, for example, to determine the effect of the matrix on a method's
recovery efficiency.

May—When used in a sentence, this term denotes permission but not a requirement.

Mean (arithmetic)—The sum of all the values of a set of measurements divided by the number of values
in the set; a measure of central tendency.

Mean squared error—A statistical term for variance added to the square of the bias.
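
The decomposition stated above (mean squared error equals the variance plus the square of the bias) can be verified with a small numeric sketch; this example is illustrative only, with made-up measurement values:

```python
def mse_decomposition(measurements, true_value):
    """Return (mse, variance, bias) for repeated measurements of a known true value.

    Illustrates the identity MSE = variance + bias**2
    (population variance, i.e., divided by n).
    """
    n = len(measurements)
    mean = sum(measurements) / n
    bias = mean - true_value
    variance = sum((m - mean) ** 2 for m in measurements) / n
    mse = sum((m - true_value) ** 2 for m in measurements) / n
    return mse, variance, bias

mse, var, bias = mse_decomposition([10.2, 10.4, 10.3, 10.5], true_value=10.0)
assert abs(mse - (var + bias ** 2)) < 1e-9
```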

Measurement and Testing Equipment (M&TE)—Tools, gauges, instruments, sampling devices,  or
systems used to calibrate, measure, test, or inspect to control or acquire data to verify conformance to
specified requirements.

Memory effects error—The effect that a relatively high concentration sample has on the measurement of
a lower concentration sample of the same analyte when the higher concentration sample precedes the
lower concentration sample in the  same analytical instrument.

Method blank—A blank that is prepared to represent the sample matrix as closely as possible and
analyzed exactly like the calibration standards, samples,  and QC samples. Results of method blanks
provide an estimate of the within-batch variability of the blank response and an indication of bias
introduced by the analytical procedure.

Method—A body of procedures and techniques for performing an  activity (e.g., sampling, chemical
analysis, quantification), systematically presented in the  order in which they are to be executed.

Microbalance—A type of analytical balance that can weigh to the nearest 0.001 mg (i.e., one microgram,
or one-millionth of a gram).

Mid-range check—A standard used to establish whether the middle of a measurement method's
calibrated range is still within specifications.

Mixed waste—A hazardous waste material as defined by 40 CFR 261 and the Resource Conservation
and Recovery Act (RCRA) and mixed with radioactive waste subject to the requirements of the Atomic
Energy Act.

Must—When used in a sentence, this term denotes a requirement that has to be met.

Nonconformance—A deficiency  in a characteristic, documentation, or procedure that renders the quality
of an item or activity unacceptable or indeterminate; nonfulfillment of a specified requirement.

Objective evidence—Any documented statement of fact, other information, or record, either quantitative
or qualitative, pertaining to the quality of an item or activity, based on observations, measurements, or
tests that can be verified.

Observation—An assessment conclusion that identifies a condition (either positive or negative) that does
not represent a  significant impact on an item or activity. An observation may identify a condition that has
not yet caused a degradation of quality.

Organization structure—The responsibilities, authorities, and relationships, arranged in a pattern,
through which an organization performs its functions.

Organization—A company, corporation, firm, enterprise, or institution, or part thereof,  whether
incorporated or not, public or private, that has its own functions and administration.

Outlier—An extreme observation that is shown to have a low probability of belonging to a specified data
population.

Parameter—A quantity, usually unknown, such as a mean or a standard deviation, characterizing a
population. Commonly misused for "variable," "characteristic," or "property."

Peer review—A documented, critical review of work generally beyond the state of the art or
characterized by the existence of potential uncertainty.  Conducted by qualified individuals (or an
organization) who are independent of those who performed the work but collectively equivalent in
technical expertise (i.e., peers) to those who performed the original work. Peer reviews are conducted to
ensure that activities are technically adequate, competently performed, properly documented, and satisfy
established technical and quality requirements. An in-depth assessment of the assumptions, calculations,
extrapolations,  alternate interpretations, methodology, acceptance criteria, and conclusions pertaining to
specific work and of the documentation that supports them. Peer reviews provide an evaluation of a
subject where quantitative methods of analysis or measures of success  are unavailable or undefined, such
as in research and development.

Performance Evaluation (PE)—A type of audit in which the quantitative data generated in a
measurement system are obtained  independently and compared with routinely obtained data to evaluate
the proficiency of an analyst or laboratory.

PM2.5—Particulate matter (suspended in the atmosphere) having an aerodynamic diameter less than or
equal to a nominal 2.5 µm, as measured by a reference method based on 40 CFR Part 50, Appendix L, and
designated in accordance with 40 CFR Part 53.

PM2.5 sampler—A sampler used for monitoring PM2.5 in the atmosphere; it collects a sample of
particulate matter from the air based on principles of inertial separation and filtration. The sampler also
maintains a constant sample flow rate and may record the actual flow rate and the total volume sampled.
PM2.5 mass concentration is calculated as the weight of the filter catch divided by the sampled volume. A
sampler cannot calculate PM2.5 concentration directly.

POC (Parameter Occurrence Code)—A one-digit identifier used in AIRS/AQS (see both defined in
this glossary) to distinguish between multiple monitors at the same site that are measuring  the same
parameter (e.g., pollutant). For example, if two different samplers both measure PM2.5, then one may be
assigned a POC of 1 and the other a POC of 2. Note that replacement samplers are typically given the
POC of the sampler that they replaced, even if the replacement is of a different model or type.

Pollution prevention—An organized, comprehensive effort to systematically reduce or eliminate
pollutants or contaminants prior to their generation or their release or discharge into the environment.

Polonium-210 (210Po) antistatic strip—A device that contains a small amount of 210Po that emits alpha
particles (He2+) that neutralize the static charge on filters, making them easier to handle and their weights
more accurate.

Polytetrafluoroethylene (PTFE)—Also known as Teflon®, this is a polymer that is used to manufacture
the 46.2-mm diameter filters for PM2.5 FRM and Federal Equivalent Method (FEM) samplers.

Population—The totality of items or  units of material under consideration or study.

Precision—A measure of mutual agreement among individual measurements of the same property,
usually under prescribed similar conditions, expressed generally in terms of the standard deviation.

Procedure—A specified way to perform an activity.

Process—A set of interrelated resources and activities that transforms inputs into outputs.  Examples of
processes include analysis, design, data collection, operation, fabrication,  and calculation.

Project—An organized set of activities within a program.

Qualified data—Any data that have been modified or adjusted as part of statistical or mathematical
evaluation, data validation, or data verification operations.

Qualified services—An indication that suppliers providing services have been evaluated and determined
to meet the technical and quality requirements of the client as provided by approved procurement
documents and demonstrated by the supplier to the client's satisfaction.

Quality Assurance (QA) Supervisor or Coordinator—A staff member  who assists in preparation of the
reporting organization's quality plan,  makes recommendations to management on quality issues
(including training), oversees the quality system's control and audit components, and reports the results.

Quality assurance (QA)—An integrated system of management activities involving planning,
implementation, assessment, reporting, and quality improvement to ensure that a process, item, or service
is of the type and quality needed and expected by the client.

Quality Assurance Program Description/Plan—See Quality Management Plan.

Quality Assurance Project Plan (QAPP)—A formal document that describes in comprehensive detail
the necessary QA, QC, and other technical activities that must be implemented to ensure that the results of
the work performed will satisfy the stated performance criteria. The QAPP components are divided into
the following four classes: 1) Project Management, 2) Measurement/Data Acquisition, 3)
Assessment/Oversight, and 4) Data Validation and Usability. Guidance and requirements on preparation
of QAPPs can be found in EPA, Requirements for Quality Assurance Project Plans, EPA QA/R-5 and
Guidance for Quality Assurance Project Plans, EPA QA/G-5.

Quality control (QC) sample—An uncontaminated sample matrix that is spiked with known amounts of
analytes from a source independent of the calibration standards. This type of sample is generally used to
establish intra-laboratory or analyst-specific precision and bias or to assess the performance of all or a
portion of the measurement system.

Quality control (QC)—The overall system of technical activities that measures the attributes and
performance of a process, item, or service against defined standards to verify that they meet the stated
requirements established by the customer; operational techniques and activities that are used to fulfill
requirements for quality. The system of activities and checks used to ensure that measurement systems are
maintained within prescribed limits, providing protection against "out of control" conditions and ensuring
the results are of acceptable quality.

Quality improvement—A management program for improving the quality of operations. Such
management programs generally entail a formal mechanism for encouraging worker recommendations
with timely management evaluation and feedback or implementation.

Quality Management Plan (QMP)—A formal document that describes the quality system in terms of
the organization's structure, the functional responsibilities of management and staff, the lines of authority,
and the required interfaces for those planning, implementing, and assessing all activities conducted.

Quality management—That aspect of the overall management system of the organization that
determines and implements the quality policy. Quality management includes strategic planning, allocation
of resources, and other systematic activities (e.g., planning, implementation, and assessment) pertaining to
the quality system.

Quality system—A structured and documented management system that describes the policies,
objectives, principles, organizational authority, responsibilities, accountability, and implementation plan
of an organization for ensuring quality in its work processes, products (items), and services. The quality
system provides the framework for planning, implementing, and assessing work performed by the
organization and for carrying out required QA and QC.

Quality—The totality of features and characteristics of a product or service that bears on its ability to
meet the stated or implied needs and expectations of the user.

Radioactive waste—This refers to waste material that contains or is contaminated by radionuclides and is
subject to the requirements of the Atomic Energy Act.

Readability—The smallest difference between two measured values that can be read on the microbalance
display. The term "resolution" is a commonly used synonym.

Readiness review—A systematic, documented review of the readiness for the startup or continued use of
a facility, process, or activity. Readiness reviews are typically conducted before proceeding beyond
project milestones and prior to initiation of a major phase of work.

Record (quality)—A document that furnishes objective evidence of the quality of items or activities and
that has been verified and authenticated as technically complete and correct. Records may include
photographs, drawings, magnetic tape, and other data recording media.

Records schedule—This schedule constitutes EPA's official policy on how long to keep Agency records
(retention) and what to do with them afterwards (disposition). For more information, refer to
http://www.epa.gov/records/policy/schedule on EPA's Web site or see file plan.

Recovery—The act of determining whether the methodology measures all of the analyte contained in a
sample.

Remediation—The process of reducing the concentration of a contaminant (or contaminants) in air,
water, or soil media to a level that poses an acceptable risk to human health.

Repeatability—This refers to a measure of the ability of a microbalance to display the same result in
repetitive weighings of the same mass under the same measurement conditions. The term "precision" is
sometimes used as a synonym. Repeatability also refers to the degree of agreement between independent
test results produced by the same analyst, using the same test method and equipment on random aliquots
of the same sample within a short time period.

Reporting limit—The lowest concentration or amount of the target analyte required to be reported from a
data collection project. Reporting limits are generally greater than detection limits and are usually not
associated with a probability level.

Representativeness—A measure of the degree to which data accurately and precisely represent a
characteristic of a population, a parameter variation at a sampling point, a process condition, or an
environmental condition.

Reproducibility—The precision, usually expressed as variance, that measures the variability among the
results of measurements of the same sample at different laboratories.

Requirement—A formal statement of a need and the expected manner in which it is to be met.

Research (applied)—A process, the objective of which is to gain the knowledge or understanding
necessary for determining the means by which a recognized and specific need may be met.

Research (basic)—A process, the objective of which is to gain fuller knowledge or understanding of the
fundamental aspects of phenomena and of observable facts without specific applications toward processes
or products in mind.

Research development/demonstration—The systematic use of the knowledge and understanding gained
from research and directed toward the production of useful materials, devices, systems, or methods,
including prototypes and processes.

Round-robin study—A method validation study involving a predetermined number of laboratories or
analysts, all analyzing the same sample(s) by the same method. In a round-robin study, all results are
compared and used to develop summary statistics such as inter-laboratory precision and method bias or
recovery efficiency.

Ruggedness study—The carefully ordered testing of an analytical method while making slight variations
in test conditions (as might be expected in routine use) to determine how such variations affect test
results. If a variation affects the results significantly, the method restrictions are tightened to minimize
this variability.

Scientific method—The principles and processes regarded as necessary for scientific investigation,
including rules for concept or hypothesis formulation, conduct of experiments, and validation of
hypotheses by analysis of observations.

Self-assessment—The assessments of work conducted by individuals, groups, or organizations directly
responsible for overseeing and/or performing the work.

Sensitivity—The capability of a method or instrument to discriminate between measurement responses
representing different levels of a variable of interest.

Service—The result generated by activities at the interface between the supplier and the customer, and
the supplier internal activities to meet customer needs. Such activities in environmental programs include
design, inspection, laboratory and/or field analysis, repair, and installation.

Shall—A term that denotes a requirement is mandatory whenever the criterion for conformance with the
specification permits no deviation. This term does not prohibit the use of alternative approaches or
methods for implementing the specification so long as the requirement is fulfilled.

Should—A term that denotes a guideline  or recommendation whenever noncompliance with the
specification is permissible.

Significant condition—Any state, status, incident, or situation of an environmental process or condition,
or environmental technology in which the work being performed will be adversely affected sufficiently to
require corrective action to satisfy quality objectives or specifications and safety requirements.

Software life cycle—The period of time that starts when a software product is conceived and ends when
the software product is no longer available for routine use. The software life cycle typically includes a
requirement phase, a design phase, an implementation phase, a test phase, an installation and check-out
phase, an operation and maintenance phase, and sometimes a retirement phase.

Source reduction—Any practice that reduces the quantity of hazardous substances, contaminants, or
pollutants.

Span check—A standard used to establish that a measurement method is not deviating from its calibrated
range.

Specification—A document that states requirements and refers to or includes drawings or other relevant
documents. Specifications should indicate the means and criteria for determining conformance.

Spike—A substance that is added to an environmental sample to increase the concentration of target
analytes by known amounts. Spikes are used to assess measurement accuracy (spike recovery), whereas
spike duplicates are used to assess measurement precision.

Split samples—Two or more representative portions taken from one sample in the field or in the
laboratory and analyzed by different analysts or laboratories. Split samples are QC samples that are used
to assess analytical variability and comparability.

Standard deviation—A measure of the dispersion or imprecision of a sample or population distribution
expressed as the positive square root of the variance and having the same unit of measurement as the
mean.

Standard Operating Procedure (SOP)—A written document that details the method for an operation,
analysis, or action with thoroughly prescribed techniques and steps and that is officially approved as the
method for performing certain routine or repetitive tasks.

Supplier—Any individual or organization furnishing items or services or performing work according to a
procurement document or a financial assistance agreement. An all-inclusive term used in place of any of
the following: vendor, seller, contractor,  subcontractor, fabricator, or consultant.

Surrogate spike or analyte—A pure substance with properties  that mimic the analyte of interest. It is
unlikely to be found in environmental samples and is added to them to establish that the analytical method
has been performed properly.

Surveillance (quality)—Continual or frequent monitoring and verification of the status of an entity and
the analysis of records to ensure that specified requirements are being fulfilled.

Technical review—A documented critical review of work that has been performed within the state of the
art. The review is accomplished by one or more qualified reviewers who are  independent of those who
performed the work, but are collectively equivalent in technical expertise to those who performed the
original work. The review is an in-depth  analysis and evaluation of documents, activities, material, data,
or items that require technical verification or validation for applicability,  correctness, adequacy,
completeness, and assurance that established requirements have  been satisfied.

Technical Systems Audit (TSA)—A thorough, systematic, on-site qualitative audit of facilities,
equipment, personnel, training, procedures, recordkeeping, data validation, data management, and
reporting  aspects of a system.

Traceability—This term refers to the ability to trace the history, application, or location of an entity by
means of recorded identifications. In a calibration sense, traceability relates measuring equipment to
national or international standards, primary standards, basic physical constants or properties, or reference
materials. In a data collection sense, it relates calculations and data generated throughout the project back
to the requirements for the quality of the project. This term also refers to the property of the result of a
measurement or the value of a standard whereby it can be related to stated references, usually national or
international standards, through an unbroken chain of comparisons, all having stated uncertainties. Many
QA programs demand traceability of standards to a national standard. In most cases this can be achieved
through a standard traceable to NIST.

Trip blank—A clean sample of a matrix that is taken to the sampling site and transported to the
laboratory for analysis without having been exposed to sampling procedures.

Validation—Confirmation by examination and provision of objective evidence that the particular
requirements for a specific intended use have been fulfilled. In design and development, validation refers
to the process of examining a product or result to determine conformance to user needs.

Variance (statistical)—A measure of dispersion of a sample or population distribution. Population
variance is the sum of squares of deviation from the mean divided by the population size (number of
elements). Sample variance is the sum of squares of deviations from the mean divided by the degrees of
freedom (number of observations minus  one).
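
For illustration (an added sketch, not part of the glossary), the two divisors in the definitions above can be compared in a few lines of Python; the helper name is ours:

```python
def variances(xs):
    """Population variance divides the sum of squared deviations by the
    population size n; sample variance divides by the degrees of freedom,
    n - 1 (illustrative helper, not from the QAPP)."""
    n = len(xs)
    m = sum(xs) / n                      # mean
    ss = sum((x - m) ** 2 for x in xs)   # sum of squared deviations
    return ss / n, ss / (n - 1)          # (population, sample)

print(variances([2, 4, 4, 4, 5, 5, 7, 9]))  # (4.0, 4.571428571428571)
```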

Verification—Confirmation by examination and provision of objective evidence that specified
requirements have been  fulfilled. In design and development, verification refers to the process of
examining a result of a given activity to determine conformance to the stated requirements for that
activity.

Wet-bulb temperature—The temperature of the wet-bulb thermometer at equilibrium with a constant
flow of ambient air at a rate of 2.5 to 10.0 meters per second.

Wet-bulb thermometer—A thermometer with a muslin-covered bulb, which is moistened and used to
measure the wet-bulb temperature.

-------
                                                               Project: PEP QAPP
                                                                   Appendix B
                                                                  Revision No: 1
                                                                  Date: 3/6/2009
                                                                   Page 1 of 40
                               Appendix B

          Documents to Support Data Quality Objectives


Document                                                              Page
1.     Review of the Potential to Reduce or Provide a More Cost Efficient Means to
      Implement the PM2.5 Performance Evaluation Program	B-3
2.     Decision Framework for PM2.5 Performance Evaluation Program
      Collocation Study Data	B-21

-------
                                                Project: PEP QAPP
                                            Appendix B (Document 1)
                                                   Revision No: 1
                                                   Date: 3/6/2009
                                            	Page 3 of 40
Review of the Potential to Reduce or Provide a More Cost
  Efficient Means to Implement the PM2.5 Performance
                  Evaluation Program
                         DRAFT
                 Louise Camalier & Mike Papp

                      December 2005

Intent of Paper
During the June 2, 2005 Ambient Air Monitoring Steering Committee Meeting, OAQPS was asked to
look at whether the costs associated with the PM2.5 Performance Evaluation Program (PEP) could be
reduced, either through a reduction in the number of audits or by providing a different implementation
scheme that would reduce implementation costs. This paper provides a description of the process OAQPS
used to evaluate the question of reducing the number of PEP audits and provides a few options and
recommendations for the steering committee to consider.


Background
Unlike the gaseous criteria pollutants, where one can use a standard of known concentration to estimate
precision and bias and perform this at every site, the particulate matter pollutants rely on a representative
sample of sites for estimates of both precision and bias. Precision is estimated using collocated sampling;
bias is estimated using the PEP. Since only a portion of the monitoring sites are represented, the precision
and bias estimates are assessed at the reporting organization level. In order to provide an adequate  level of
confidence in our estimates of precision and bias, an adequate number of collocation and PEP samples
must be collected.

The PEP is a quality assurance activity which is used to evaluate measurement system bias of the fine
particle (PM2.5) monitoring network. The pertinent regulations for this performance evaluation are found
in 40 CFR Part 58, Appendix A. The strategy is to collocate a portable FRM PM2.5 air sampling
instrument with an established primary sampler at a routine air monitoring site, operate both samplers in
the same manner, and then compare the results. In the original promulgation, the performance evaluation
was required at every site at a frequency of six times per year. EPA believed this would allow an
adequate assessment of bias at the site level. However, due to criticism of the burden of this requirement,
the PEP was revised to its current form of 25 percent of the monitors within each reporting organization
network at a frequency of four times per year. The data from the routine monitors and PEP monitors are
compared for each reporting organization in order to determine whether the bias estimate for the reporting
organization is within the data quality objective of +/- 10%.


Approach
First, the study question was  restated:

        "Can the PM2.5 PEP audits be reduced without adversely affecting the confidence in the
        3-year bias estimate at the reporting organization level?"

Since our data quality objectives are based upon assessments of precision and bias at a 3-year level of
aggregation per reporting organization, we need to have enough representative data at this level of
aggregation to make a reasonable assessment of bias.

Over the past few years, the QA Strategy Workgroup has been reviewing and revising the Ambient Air
Monitoring Program Quality System requirements found in 40 CFR Part 58 Appendix A. The planned
revisions have included the statistics used in our estimates of precision and bias and the move towards
using confidence limits rather than simple averages over various time periods (quarters/years). One
advantage of the new statistics is that they provide monitoring organizations some flexibility in choosing
how frequently the quality control checks need to be performed. In the report that was generated to
explain the new statistics,1 a matrix table was developed to demonstrate how one could determine how
many QC samples, such as the biweekly one-point QC check, were needed to ensure that the DQO would
be met. The following is an excerpt from this document.

       For ozone and other gases, the proposed precision and bias estimates are both made
       from the biweekly checks. Table 1 shows how many of those checks are needed to
       confidently (90%) establish that both the precision and bias are less than 10%. In this
       way, one knows that both the precision and the bias are controlled to at most 10%,
       provided the sample size is at least the number shown in Table 1. For Table 1, one-sided
       90% confidence limits about the precision estimate were assumed.  This statistic matches
       the current use for the PM2.5 precision estimates in CFR.
 Table 1. Conservative Number of Precision and Bias Checks Needed to Yield Both an Absolute Bias
        Upper Bound of at Most 10% and an Upper Bound of at Most 10% for the Precision.

                               Precision Point Estimate
      Bias Estimate      5%     6%     7%     8%     9%
           5%             8      8     12     24     87
           6%            12     12     12     24     87
           7%            20     20     20     24     87
           8%            43     43     43     43     87
           9%           166    166    166    166    166

 (Table entries are the minimum sample size.)

This sample size matrix approach was used to answer our study question. This was accomplished by:

    1.   Developing a matrix table with precision and bias ranges of 15% and 9.5%, respectively.
        Since the DQO for bias (provided by the PEP) is +/-10%, the bias side of the matrix table could
        not exceed 10% since it is impossible to determine how many samples are needed to control a
        bias estimate to 10% if the current estimate is over 10%.  Table 2 represents the matrix table that
        was used for this evaluation.

    2.   Data aggregation/data reduction. Precision and bias data from the calendar years 2002-2004
        were used to provide appropriate reporting organization estimates. Any precision and bias data
        were excluded if their concentrations were < 3 ug/m3. In  addition, bias outliers for each reporting
        organization were identified using a univariate outlier test and removed prior to data evaluation.

    3.   Providing 3-year precision and bias estimates at the reporting organization level. Statistics
        used in the precision and bias estimates are provided in Appendix A.

    4.   Determination of number of PEP pairs necessary for assessment purposes. The matrix table
        was used to identify the required number of PEP visits over a 3-year period needed to obtain 90%
        confidence that the bias DQO of +/-10% is being met.
              Table 2. PEP Sample Size Requirements Based on Reporting Organization
                                 Precision and Bias Estimates

Rows give the precision CV 90% upper bound (%); columns give the bias estimate (%).
A dash (-) marks cells left blank in the table.

Precision
CV Upper                                       Bias (%)
Bound (%)   2.0   2.5   3.0   3.5   4.0   4.5   5.0   5.5   6.0   6.5   7.0   7.5   8.0   8.5   9.0   9.5
   1.0        -     -     -     -     -     -     -     -     -     -     -     -     3     3     4     9
   1.5        -     -     -     -     -     -     -     -     -     -     3     3     3     4     6    17
   2.0        -     -     -     -     -     -     -     -     3     3     3     3     4     5     9    28
   2.5        -     -     -     -     -     -     3     3     3     3     3     4     5     7    12    43
   3.0        -     -     -     3     3     3     3     3     3     3     4     4     6     9    17    61
   3.5        -     3     3     3     3     3     3     3     3     4     4     5     7    11    22    82
   4.0        3     3     3     3     3     3     3     3     4     4     5     6     9    14    28   107
   4.5        3     3     3     3     3     3     3     4     4     5     6     7    10    17    35   135
   5.0        3     3     3     3     3     3     4     4     5     5     7     9    12    20    43   166
   5.5        3     3     3     3     3     4     4     5     5     6     7    10    14    24    52   201
   6.0        3     3     3     4     4     4     4     5     6     7     9    11    17    28    61   238
   6.5        3     3     4     4     4     4     5     5     6     8    10    13    19    33    71   279
   7.0        3     4     4     4     4     5     5     6     7     9    11    15    22    38    82   324
   7.5        4     4     4     4     5     5     6     7     8     9    12    17    25    43    94   371
   8.0        4     4     4     5     5     5     6     7     9    10    14    19    28    49   107   422
   8.5        4     4     4     5     5     6     7     8     9    12    15    21    32    55   120   476
   9.0        4     4     5     5     6     6     7     9    10    13    17    23    35    61   135   534
   9.5        4     5     5     6     6     7     8     9    11    14    18    26    39    68   150   595
  10.0        5     5     5     6     7     7     9    10    12    15    20    28    43    75   166   659
  10.5        5     5     6     6     7     8     9    11    13    17    22    31    47    82   183   726
  11.0        5     6     6     7     7     9    10    12    14    18    24    34    52    90   201   797
  11.5        5     6     6     7     8     9    11    13    15    20    26    37    56    98   219   871
  12.0        6     6     7     8     9    10    11    14    17    21    28    40    61   107   238   948
  12.5        6     7     7     8     9    10    12    15    18    23    30    43    66   116   258  1028
  13.0        6     7     8     9    10    11    13    16    19    25    33    46    71   125   279  1112
  13.5        7     7     8     9    10    12    14    17    21    26    35    50    77   135   301  1199
  14.0        7     8     9    10    11    13    15    18    22    28    38    53    82   145   324  1289
  14.5        7     8     9    10    11    13    16    19    23    30    40    57    88   155   347  1383
  15.0        8     9     9    11    12    14    17    20    25    32    43    61    94   166   371  1480

1 Proposal: A New Method for Estimating Precision and Bias for Gaseous Automated Methods for the Ambient Air Monitoring Program


Statistical Background

Generation of Matrix Table
For the purpose of calculating optimal sample sizes, a sample size matrix was iteratively generated to
yield a statistically calculated sample size given a specific precision and bias scenario. The matrix
indicates the smallest sample size needed to assure that the upper confidence limit on bias will be below
10% given the current estimate of precision and bias for a reporting organization.

The sample size matrix is generated using a SAS algorithm that creates various potential precision and
bias scenarios. The precision and bias scenarios begin at a minimum of 1% and 2%, respectively, and
increase to values of 15% and 9.5%. Possible sample sizes range from 3 to 1480. The algorithm used to
create the matrix iteratively increases the sample size by one through each loop and calculates upper
confidence limits for the current sample size and one sample size smaller for a specific precision and bias
scenario. For each precision and bias scenario, the sample size begins at 3 and is increased by one until
the 90% upper confidence limit calculated by sample size 'n' is below 10% and the 90% upper
confidence limit calculated by a sample size 'n-1' is above 10%. This assures that the matrix sample size
'n' is the smallest sample size that can be used where the  90% upper confidence limit is still below 10%.

Given  a specific reporting organization precision and bias estimate, one can use this matrix as a guide to
approximate sample size, assuming that the bias estimate is already less than 10%. As the reporting
organization precision and bias estimates get closer to 15% or 10% respectively, more samples are
required to ensure that 90% of the time the bias estimate is below 10%. When the bias estimate is greater
than 10%, the sample matrix cannot be used since the initial estimate is already above 10%.

The  matrix is generated using the following equations:

The 90% upper confidence limit on the bias for sample size 'n' is calculated by Equation 1a:

        bias_1UCL = m + t(0.90, n-1) * s_d / sqrt(n)                     Equation 1a

The 90% upper confidence limit on the bias for sample size 'n-1' is calculated by Equation 1b:

        bias_2UCL = m + t(0.90, n-2) * s_d / sqrt(n-1)                   Equation 1b

Both Equations 1a and 1b use m, the mean of the percent differences, and the standard deviation of the
percent differences, s_d, calculated in Equation 2 below:

        s_d = sqrt( [n * SUM(d_i^2) - (SUM(d_i))^2] / [n * (n-1)] )      Equation 2

where the percent difference (or individual bias), d_i, is described in Equation 5 in Appendix A.

When bias_1UCL is under 10% and bias_2UCL is above 10%, one can be 90% confident that the bias is at
most 10% when using a sample size of n, and n is the smallest sample size for which this holds.
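
The iteration described above was implemented in SAS; as a rough illustration (ours, not the original program), the stopping rule for a single scenario can be sketched in Python, assuming the mean bias m and the standard deviation of the percent differences s_d are supplied directly rather than derived from the reporting-organization statistics:

```python
import math

def t_pdf(x, df):
    """Student's t probability density."""
    c = math.gamma((df + 1) / 2) / (math.sqrt(df * math.pi) * math.gamma(df / 2))
    return c * (1.0 + x * x / df) ** (-(df + 1) / 2)

def t_quantile(p, df):
    """Upper-tail t quantile (p > 0.5) via bisection on a numerically
    integrated CDF; stands in for a statistics library's t.ppf."""
    def cdf(x):
        n = 2000  # trapezoid rule on [0, x]
        h = x / n
        s = 0.5 * (t_pdf(0.0, df) + t_pdf(x, df))
        for i in range(1, n):
            s += t_pdf(i * h, df)
        return 0.5 + s * h
    lo, hi = 0.0, 50.0
    for _ in range(50):
        mid = 0.5 * (lo + hi)
        if cdf(mid) < p:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def required_n(m, s_d, dqo=10.0, conf=0.90, n_max=1480):
    """Smallest sample size n (starting at 3, as in the text) whose 90%
    upper confidence limit on the bias (Equation 1a) falls below the 10%
    DQO; returns None if no n up to n_max qualifies."""
    for n in range(3, n_max + 1):
        ucl = m + t_quantile(conf, n - 1) * s_d / math.sqrt(n)
        if ucl < dqo:
            return n
    return None
```

With m = 5% and s_d = 5%, for instance, the rule returns n = 4: at n = 3 the 90% upper confidence limit is about 10.4%, while at n = 4 it drops to about 9.1%.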


Precision and Bias Estimates
The precision value that feeds into the sample size matrix above is based on the proposed precision upper
bound statistic, while the bias value is based on the mean absolute value of the individual bias  estimates.
The relevant precision and bias equations can be found in Appendix A of this document. For this study,
precision and bias sample pairs are considered valid when both paired value concentrations are greater
than 3 ug/m3. In addition, a univariate outlier test was run on the individual bias estimates for each
reporting organization. Outliers were located and filtered out if data points were a certain distance away
from the interquartile range (bulk of the data). Any outlier identified from the test was excluded from the
reporting organization bias estimate. Table 3  identifies the frequency of excluded outliers within a
reporting organization.
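
The text does not state the exact fence distance used by the univariate test; a common choice is Tukey's 1.5 x IQR rule, sketched here in Python as an assumption rather than the documented procedure:

```python
def iqr_filter(values, k=1.5):
    """Drop points lying more than k * IQR outside the quartiles.
    k = 1.5 (Tukey's fences) is our assumption; the source says only
    'a certain distance away from the interquartile range'."""
    xs = sorted(values)
    n = len(xs)

    def quantile(q):
        # linear-interpolation quantile on the sorted data
        pos = q * (n - 1)
        lo = int(pos)
        hi = min(lo + 1, n - 1)
        return xs[lo] + (pos - lo) * (xs[hi] - xs[lo])

    q1, q3 = quantile(0.25), quantile(0.75)
    lo_fence = q1 - k * (q3 - q1)
    hi_fence = q3 + k * (q3 - q1)
    kept = [v for v in values if lo_fence <= v <= hi_fence]
    dropped = [v for v in values if not (lo_fence <= v <= hi_fence)]
    return kept, dropped
```

For example, iqr_filter([1, 2, 3, 4, 5, 100]) keeps [1, 2, 3, 4, 5] and drops the 100.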
Data Evaluation
Table 3 provides the estimates of precision and bias for the CY 2002-2004 PM2.5 data. Definitions for the
columns are provided below:
 k   Variable         Comment
 1   Rep Org          Reporting Organization
 2   State            State
 3   Sites 02-04      Number of SLAMS sites active in 2002-2004
 4   Req PEP Checks   Required PEP checks in a 3-year period (25% of sites * 4/year * 3 years)
 5   PEP Checks       Valid PEP audits performed in the 3-year period
 6   Outlier          Number of individual bias estimates (percent difference > ±50) that were
                      removed from the dataset at a reporting organization level
 7   Prec Checks      Number of collocated precision checks in the 3-year period
 8   Mean Abs Bias    Mean absolute bias
 9   CV_ub            Precision coefficient of variation 90% upper confidence bound
 10  Matrix           Number of PEP audits required based on the sampling matrix
 11  Diff             Difference between the matrix value and the PEP requirement
                      (Matrix - Req PEP Checks = Diff)
 12  Matrix >         A value of 1 signifying that the matrix value was greater than the
                      required PEP number
 13  Matrix <         A value of 1 signifying that the matrix value was less than the
                      required PEP number

Since we are using confidence limits, we decided not to evaluate any reporting organization that did not
have at least 7 valid PEP/routine pairs after outliers and values < 3 µg/m3 were removed. The 23
unevaluated reporting organizations are highlighted in green in Table 3. Additionally, there were 2
reporting organizations (see Table 3) with > 7 PEP/routine pairs that did not report precision data to AQS
and therefore could not be used in the evaluation.

For each reporting organization, the CV_ub and the mean absolute bias values were used in the matrix
table to determine the number of PEP audits needed to ensure, with 90% confidence, the DQO will be
met. Example:

        For the first site with 7 valid PEP/routine pairs in Table 3 (Rep. Org. 0012), the
        intersection of the bias value of 3.09% and the precision value of 4.08% on the matrix
        yields a value of 3 audit pairs to ensure that 90% of the time the bias estimate of 3.09%
        will be less than 10%. For reporting organizations whose precision or bias estimates fell
        beyond the matrix table, the extreme value for that row or column was used.


        For example, if the reporting organization had a bias estimate of 6.5% and a precision
        estimate of 16%, the matrix estimate for that reporting organization would be 32 samples,
        which corresponds to the intersection of 6.5 (bias) and 15 (precision).
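The lookup rule in this example, including the clamping to the extreme row or column, can be sketched as follows. The bin edges and cell values below are hypothetical stand-ins, not the actual Table 2 matrix; only the cell reached by the 6.5%/16% case is set to the 32 quoted above.

```python
import bisect

# Illustrative lookup against a sample-size matrix like Table 2, with the
# clamping rule described above: estimates beyond the table edges use the
# extreme row/column. Bin edges and cell values are HYPOTHETICAL.
BIAS_BINS = [2, 4, 6, 8, 10]          # % mean absolute bias (row labels)
PREC_BINS = [3, 5, 10, 15]            # % precision CV (column labels)
MATRIX = [                            # required audit pairs per cell
    [3,  3,  5,  8],
    [3,  4,  7, 12],
    [4,  6, 12, 20],
    [6,  9, 18, 32],
    [9, 14, 26, 48],
]

def required_audits(bias, precision):
    # clamp the index to the last row/column when an estimate exceeds the table
    r = min(bisect.bisect_left(BIAS_BINS, bias), len(BIAS_BINS) - 1)
    c = min(bisect.bisect_left(PREC_BINS, precision), len(PREC_BINS) - 1)
    return MATRIX[r][c]

# a precision estimate of 16% falls beyond the table, so the 15% column is used
print(required_audits(6.5, 16.0))  # prints 32
```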

The "Diff" column in Table 3 provides the difference obtained by subtracting the number of required
PEP checks from the matrix estimate for each reporting organization. A positive value indicates that the
matrix requires more PEP audits than the current requirement (a value of "1" is placed in the "Matrix
>" column); a negative value indicates that the matrix requires fewer PEP audits than the current
requirement (a value of "1" is placed in the "Matrix <" column). In the case described above (Rep. Org.
0012), the matrix required 6 fewer samples than the current PEP requirement. The next two columns
("Matrix >" and "Matrix <") are used to summarize the number of sites where more or fewer audits than
the currently required PEP checks are needed.
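The Diff and flag bookkeeping just described is simple arithmetic. A minimal sketch, using the Rep. Org. 0012 figures quoted above (a matrix estimate of 3 against a requirement of 9):

```python
# Table 3 bookkeeping: Diff is the matrix estimate minus the required PEP
# checks; the Matrix>/Matrix< flags record the sign of that difference.
def diff_columns(matrix_estimate, required_pep):
    diff = matrix_estimate - required_pep
    return {
        "Diff": diff,
        "Matrix >": 1 if diff > 0 else 0,
        "Matrix <": 1 if diff < 0 else 0,
    }

# Rep. Org. 0012: matrix required 6 fewer samples than the PEP requirement
row = diff_columns(matrix_estimate=3, required_pep=9)
```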

Upon evaluation of the data, a number of observations can be made:

    •   For reporting organizations with greater than 7 valid PEP/routine pairs that reported both
        precision and bias values, 32 needed more audits than the current PEP requirement, 50 required
        fewer, and 2 sites had the same number of audits for the matrix and PEP requirement. If we went
        strictly by what the matrix required, in total, many more audits would be required than are
        currently implemented.
    •   We noticed that at around 20 PEP audits, there was a tendency for the matrix to require fewer
        audits than the PEP requirement. For reporting organizations with > 20 PEP audits, 11 reporting
        organizations needed more audits and 31 required fewer audits than the PEP requirement. This
        observation suggests that around 20 valid audits may be appropriate to provide 3-year bias
        estimates with satisfactory confidence.

Next Step—Finding an  Appropriate  and Consistent Sample Size
Our evaluation of the sample size matrix (Table 2) suggested that selecting a consistent sample size for
reporting organizations could ensure more statistically sound bias assessments while reducing program
costs. In answering the study question, two objectives remained critical: 1) the sample size must be
adequate to provide an appropriate level of confidence in the bias estimate, and 2) the bias estimate must
be representative of the reporting organization.

In order to select an appropriate sample size, we evaluated the 2002-2004 PM2.5 database used to
generate Table 3. To get an idea of the national bias average, we averaged the mean absolute value of the
bias estimates from the filtered data for each reporting organization, which provided a national average
bias of 7.6%. Since individual reporting organization bias estimates can change quarterly and yearly, and
our DQOs are based on national estimates, we felt using this national estimate was justified. We then
posed the question:

    •   How many samples would it take to ensure that, 90% of the time, a bias estimate of 7.6% would
        not be >10%?
In order to answer this question, we needed a variability parameter to feed into the confidence limit
width equation that varies by reporting organization. Since we had much more collocated precision data
at our disposal, we used these data to generate our confidence limits, under the assumption that the
uncertainty between collocated routine samplers is indicative of the uncertainty between the two samplers
used to assess bias (PEP/routine sampler). The widths of the confidence limits were calculated for each
bias value using this assumption and are shown in Table 4 in the column labeled "CLimit". We generated
90% confidence limits by varying sample sizes until we came to the sample size where the

national average CLimit was 2.4 or less. This sample size would ensure that 90% of the time, the national
bias estimate of 7.6% would not be >10%. A sample size of 24 samples produced the appropriate CLimit.
Considering a reporting organization with 24 samples and a national mean bias value of 7.60%, we can
be confident that the true bias value lies somewhere between 5.2% and 10%. Twenty-four samples equate
to 8 PEP audits per year per reporting organization over the 3-year period. However, in order to allow for
incomplete data, we propose 9 PEP audits a year, or 27 over a three-year period. The sample size of 27
would be allocated across the sites in the reporting organization in a manner that takes into account the
logistical costs of implementation, but must also be accomplished in a manner that provides adequate
spatial and temporal representation of the reporting organization. This paper does not address that issue,
but we believe that 27 audits could be implemented in a manner that would achieve the representativeness
objective.
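The text does not spell out the width formula, but a 90% half-width of t(0.95, n-1) x CV / sqrt(n) is one plausible form that is consistent with the CLimit24 values tabulated in Table 4 (e.g., the national CV_ub of 6.93% at n = 24 gives roughly the 2.4 target). A sketch under that assumption:

```python
import math

# One plausible form of the confidence-limit half-width discussed above:
# CLimit = t(0.95, n-1) * CV / sqrt(n). The t critical value for 23 degrees
# of freedom (about 1.714) is hardcoded from standard tables to keep the
# example dependency-free.
T_95_DF23 = 1.714

def climit(cv, n, t_crit):
    return t_crit * cv / math.sqrt(n)

# national average CV_ub of 6.93% at the proposed n = 24 samples
width = climit(6.93, 24, T_95_DF23)
```

With 24 samples the half-width comes out near 2.4, so 7.6% ± 2.4 stays below the 10% bound.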
[Figure: Expected 90% confidence limit for bias (±) vs. sample size]

reasonable risk for smaller reporting organizations in lieu of more complete sampling representation at
each site.

Allowing for one data loss event each year while requiring one more audit than actually needed gives
reporting organizations one audit credit per year in case it is needed in the future. This audit credit
acts as a "spare" to compensate for unexpected data loss events without increasing the resources
already allocated to each reporting organization. Using this "18/27" approach, we can reduce the PEP
from the currently required 3,237 audits (over 3 years) to about 2,466. This equates to a 24% audit
reduction (~250 audits a year) and a cost savings of between $400K and $450K (accounting for some
static infrastructure costs).


Conclusions
PM2.5 precision and bias are estimated at the reporting organization level. The data evaluation suggests
that we could provide better estimates of reporting organization bias with a more consistent distribution of
auditing across reporting organizations. The data evaluation revealed an anticipated pattern: large
reporting organizations can reduce their sampling and small reporting organizations need to sample more.

Our discussions of the proposed PEP sampling reform raised the issue of unequal representativeness
within a reporting organization. To perform a successful assessment, one must be confident that the data
collected are representative of the target population. By increasing our samples within a small reporting
organization, we are improving representativeness within the target population. However,
representativeness is compromised for larger reporting organizations when reductions in sampling occur.
It is also important to note that these larger reporting organizations tend to be more heterogeneous across
a larger area. An optimized sampling design for large reporting organizations may involve stratification
by design value and consideration of important spatial and geographic characteristics. Discussions
regarding the most appropriate sampling design for assessing bias across a large reporting organization
are in progress.


Recommendations (CY2007)
Revise PEP requirement to the "18/27"  audit scheme. This would allow for one extra audit to
accommodate historically-documented data incompleteness issues within the PEP and routine monitoring
programs. Every 3  years, precision and bias  data will be evaluated to determine whether adjustments in
the sampling scheme are needed.

Select appropriate sites to represent the reporting organizations. Since we do not use concentrations
< 3 µg/m3, we will select only sites that have a good chance of providing a concentration above this value.
Since we have plenty of routine concentration data from all sites within a reporting organization, we can
select the sites that provide the best opportunity to be representative of the reporting
organization.

Consolidation of reporting organizations. Some states would benefit from consolidating their networks
into fewer reporting organizations, or even a single one. Ohio, Florida, and California may be good
candidates for consolidation. Some years ago, monitoring organizations began using the term reporting
organization to identify the organization responsible for reporting data to AQS, and the term therefore
lost its original meaning. The revision in CFR to add the term primary quality assurance organization
was developed in order to restore that original meaning. This new term uses the old definition and gives
monitoring organizations another opportunity for consolidation, which would reduce the PEP audit
requirements.


Provide a better implementation scheme to reduce travel costs. OAQPS will look at ways to
implement the program more efficiently, taking into account the representativeness needs of a reporting
organization from a spatial, temporal, and concentration context. For example, for large reporting
organizations, the PEP may be able to reduce travel expenses by performing audits in a specific
geographic area one year and then moving to a different geographic area the next. This scheme is beyond
the scope of this paper, but could be presented upon further evaluation.

The proposed sampling technique for the PEP program strengthens our assessments of bias while
providing for an overall reduction in the audit requirements. By implementing the program as proposed,
PEP audits can be reduced without adversely affecting the confidence in the 3-year bias estimate  at the
reporting organization level. In strengthening our bias assessments, we are strengthening the PEP
program and its mission.

Table 3. 2002-2004 PM2.5 Reporting Organization Precision and Bias Estimates
                    for Sites with > 7 Valid PEP Audits
Rep_Org
0121
0274
0394
0779
0833
561
1124
709
1224
170
300
391
393
549
581
809
951
1226
220
595
151
458
880
12
395
403
805
820
867
544
550
682
874
986
491
0017
807
864
258
861
1150
1188
0350
392
396
481
State   Sites 02-04   Req PEP Checks   PEP Checks   Outlier   Prec Checks   Mean Abs Bias
FL
FL
FL
NC
FL
MO
VI
CA
FL
TN
AL
FL
FL
KY
TN
OH
FL
FL
OH
OH
OH
CA
OH
OH
FL
NC
OH
NC
FL
FL
AL
TN
IA
MO
FL
NM
OH
AZ
IL
PA
WV
WY
DC
FL
FL
HI
3
2
1
1
2
4
2
1
2
1
1
1
1
3
4
4
1
1
3
1
2
2
2
3
2
3
5
2
3
2
4
3
4
1
2
2
2
2
9
5
6
5
3
3
6
6
9
6
3
3
6
12
6
3
6
3
3
3
3
9
12
12
3
3
9
3
6
6
6
9
6
9
15
6
9
6
12
9
12
3
6
6
6
6
27
15
18
15
9
9
18
18
0
0
0
0
0
1
1
3
3
4
4
4
4
4
4
4
4
4
5
5
6
6
6
7
7
7
7
7
7
8
8
8
8
8
9
10
10
10
11
11
11
11
12
12
12
12
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
1
1
0
1
0
1
0
0
1
0
0
0
0
0
0
1
3
0
0
1
0
1
2
2
1
1
0
0
65
75
186
301
169
164
185
506
142
131
147
128
150
150
158
36
148
169
159
249
160
142
186
136
370
151
224
179
159

129
142
468
149
327
169

172
167
149
8.33
22.92
5.49
9.36
6.63
7.07
10.34
10.73
3.12
2.01
2.90
19.20
11.55
4.02
1.77
3.38
1.78
4.49
3.09
7.54
2.27
4.20
9.06
10.20
6.75
4.31
6.54
6.86
7.05
6.25
17.24
8.13
9.54
7.61
8.67
2.27
8.74
2.16
9.26
4.31
12.91
CV_ub
4.10
18.23
4.87
2.96
2.69
9.34
5.07
7.16
6.26
7.11
8.23
5.18
8.45
6.08
5.54
10.11
8.90
4.08
8.96
5.05
15.44
5.37
5.33
4.41
3.98
6.86
5.54
3.96
5.53

7.31
17.44
9.34
8.87
2.94
5.05

5.72
6.66
15.04
Matrix


3
23
3
12
52
201
6
4
11
7
5
5

25
1480
26
61
3
20

52
4
1480
Diff   Matrix >   Matrix <


-6
17 1
-6
-3 1
46 1
192 1
0
-8
2 1
-5
2 1
-1

19 1
1474 1
-1
46 1
-15
5 1

43 1
-14
1462 1


1

1




1

1

1



1

1



1


Rep_Org   State   Sites 02-04   Req PEP Checks   PEP Checks   Outlier   Prec Checks   Mean Abs Bias
669
812
990
1025
1138
15
53
226
634
287
523
635
992
1119
673
613
782
816
1151
730
1259
229
762
907
294
942
889
251
973
752
1175
21
513
511
700
1118
240
588
13
584
764
971
972
1113
1127
55
86
660
703
NC
OK
MO
TN
NV
AK
AZ
NV
OH
OH
IN
ME
MO
VT
TN
IA
ND
NE
WV
MT
OH
OH
NH
RI
DE
CA
PR
CT
SD
NE
WI
PA
IL
ID/WA
MN
CA
CO
MO
AL
KY
NJ
SC
CA
UT
VA
AR
CA
MA
MS
3
5
3
7
1
7
7
6
3
5
7
6
3
6
5
3
8
3
5
10
11
9
12
8
7
11
15
12
12
11
25
8
28
12
25
15
14
14
13
17
21
14
17
17
21
24
15
24
17
9
15
9
21
3
21
21
18
9
15
21
18
9
18
15
9
24
9
15
30
33
27
36
24
21
33
45
36
36
33
75
24
84
36
75
45
42
42
39
51
63
42
51
51
63
72
45
72
51
12
12
12
12
12
13
13
13
13
14
14
14
14
14
15
16
16
17
18
19
19
20
20
20
21
21
22
24
24
25
26
27
27
28
29
29
30
34
35
36
41
42
42
42
42
43
44
44
44
0
0
0
0
0
2
1
0
1
1
0
0
0
2
0
1
0
2
2
1
0
2
1
1
0
1
0
2
3
1
3
2
0
2
3
2
2
0
3
4
3
5
4
2
1
7
1
3
0
154
61
766
451
174
327
250
99
161
149
231
306
158
311
144
221
81
257
349
272
453
307
351
206
148
213
229
341
496
264
571
418
757
385
578
457
392
796
476
568
441
723
596
611
929
462
240
995
483
4.00
10.64
8.52
10.36
6.74
4.68
14.84
4.35
4.16
4.69
4.60
22.47
7.31
2.78
10.34
11.68
12.60
7.24
4.07
7.60
5.84
5.15
5.21
7.58
3.91
7.54
20.33
7.54
24.74
10.48
4.44
5.51
11.14
7.15
6.52
4.02
7.36
5.24
4.51
6.90
8.46
4.13
4.62
8.40
5.16
8.42
6.21
9.23
8.04
CV_ub
4.83
5.98
3.68
6.26
2.20
10.35
18.99
12.88
3.82
5.28
5.15
5.64
4.29
4.03
7.44
4.59
5.86
12.67
4.44
7.95
3.01
7.69
7.58
12.70
5.19
5.23
13.24
6.81
10.12
8.52
4.05
3.92
8.56
6.23
8.09
6.61
9.70
3.32
4.89
6.34
7.12
4.11
4.51
8.22
7.58
2.14
4.95
14.93
7.04
Matrix
3
238
11
279
3
8
1480
11
3
4
4
201
7
3
371
135
238
30
3
19
3
6
6
43
3
9
1112
15
659
476
3
3
476
9
10
4
26
3
3
10
38
3
3
49
6
5
5
371
22
Diff   Matrix >   Matrix <
-6
223 1
2 1
258 1
0
-13
1459 1
-7
-6
-11
-17
183 1
-2
-15
356 1
126 1
214 1
21 1
-12
-11
-30
-21
-30
19 1
-18
-24
1067 1
-21
623 1
443 1
-72
-21
392 1
-27
-65
-41
-16
-39
-36
-41
-25
-39
-48
-2
-57
-67
-40
299 1
-29
1




1

1
1
1
1

1
1




1
1
1
1
1

1
1

1


1
1

1
1
1
1
1
1
1
1
1
1
1
1
1
1

1

Rep_Org
1001
145
1136
685
437
563
776
1080
1002
520
851
821
1035
768
Summary
State
LA
CA
WA
MI
GA
KS
NC
IA
MD
IN
PA
OR
TX
NY

Sites 02-04   Req PEP Checks
25
30
22
28
23
13
23
15
20
34
25
32
56
53
1079
75
90
66
84
69
39
69
45
60
102
75
96
168
159
3237
PEP Checks   Outlier
44
45
45
48
49
49
50
54
58
64
71
81
87
99
2313
3
2
4
10
5
5
2
1
5
7
6
2
1
5
146
Prec Checks   Mean Abs Bias
645
646
603
678
444
616
815
861
437
765
772
721
1354
647
35809
12.39
8.85
5.37
6.50
3.51
8.48
7.80
9.64
7.62
5.38
4.03
7.62
7.78
9.75
7.62
CV_ub
5.92
10.53
4.48
6.27
4.88
8.73
8.30
6.55
5.51
4.26
4.66
4.09
7.97
5.62
6.93
Matrix
238
183
4
8
3
55
32
279
10
4
3
6
28
201
10969
Diff   Matrix >   Matrix <
163
93
-62
-76
-66
16
-37
234
-50
-98
-72
-90
-140
42
7882
1
1
1
1
1
1
1
1
1
1
1
1
1
1
32 50

Table 4. 2002-04 PM2.5 Summary of Potential Reduction Based on Proposed Equitable Allocation
Rep_Org
0121
0274
0394
0779
0833
561
1124
709
1224
170
300
391
393
549
581
809
951
1226
220
595
151
458
880
12
395
403
805
820
867
544
550
682
874
986
491
0017
807
864
258
861
1150
1188
0350
State
FL
FL
FL
NC
FL
MO
VI
CA
FL
TN
AL
FL
FL
KY
TN
OH
FL
FL
OH
OH
OH
CA
OH
OH
FL
NC
OH
NC
FL
FL
AL
TN
IA
MO
FL
NM
OH
AZ
IL
PA
WV
WY
DC
Sites 02-04
3
2
1
1
2
4
2
1
2
1
1
1
1
3
4
4
1
1
3
1
2
2
2
3
2
3
5
2
3
2
4
3
4
1
2
2
2
2
9
5
6
5
3
Req PEP
Checks
9
6
3
3
6
12
6
3
6
3
3
3
3
9
12
12
3
3
9
3
6
6
6
9
6
9
15
6
9
6
12
9
12
3
6
6
6
6
27
15
18
15
9
PEP
checks
0
0
0
0
0
1
1
3
3
4
4
4
4
4
4
4
4
4
5
5
6
6
6
7
7
7
7
7
7
8
8
8
8
8
9
10
10
10
11
11
11
11
12
Mean Abs
bias


8.33
22.92
5.49
9.36
6.63
7.07
10.34
10.73
3.12
2.01
2.90
19.20
11.55
4.02
1.77
3.38
1.78
4.49
3.09
7.54
2.27
4.20
9.06
10.20
6.75
4.31
6.54
6.86
7.05
6.25
17.24
8.13
9.54
7.61
8.67
2.27
8.74
2.16
CV_ub   CLimit24 (90%)


4.10

18.23
4.87
2.96
2.69
9.34
5.07
7.16
6.26
7.11
8.23
5.18
8.45
6.08
5.54
10.11
8.90
4.08
8.96
5.05
15.44
5.37
5.33
4.41
3.98
6.86
5.54
3.96
5.53

7.31
17.44
9.34
8.87
2.94
5.05



1.43

6.38
1.70
1.04
0.94
3.27
1.77
2.51
2.19
2.49
2.88
1.81
2.96
2.13
1.94
3.54
3.11
1.43
3.13
1.77
5.40
1.88
1.87
1.54
1.39
2.40
1.94
1.38
1.94

2.56
6.10
3.27
3.10
1.03
1.77


Rep_Org
392
396
481
669
812
990
1025
1138
15
53
226
634
287
523
635
992
1119
673
613
782
816
1151
730
1259
229
762
907
294
942
889
251
973
752
1175
21
513
511
700
1118
240
588
13
584
764
971
State
FL
FL
HI
NC
OK
MO
TN
NV
AK
AZ
NV
OH
OH
IN
ME
MO
VT
TN
IA
ND
NE
WV
MT
OH
OH
NH
RI
DE
CA
PR
CT
SD
NE
WI
PA
IL
ID/WA
MN
CA
CO
MO
AL
KY
NJ
SC
Sites 02-04
3
6
6
3
5
3
7
1
7
7
6
3
5
7
6
3
6
5
3
8
3
5
10
11
9
12
8
7
11
15
12
12
11
25
8
28
12
25
15
14
14
13
17
21
14
Req PEP
Checks
9
18
18
9
15
9
21
3
21
21
18
9
15
21
18
9
18
15
9
24
9
15
30
33
27
36
24
21
33
45
36
36
33
75
24
84
36
75
45
42
42
39
51
63
42
PEP
checks
12
12
12
12
12
12
12
12
13
13
13
13
14
14
14
14
14
15
16
16
17
18
19
19
20
20
20
21
21
22
24
24
25
26
27
27
28
29
29
30
34
35
36
41
42
Mean Abs
bias
9.26
4.31
12.91
4.00
10.64
8.52
10.36
6.74
4.68
14.84
4.35
4.16
4.69
4.60
22.47
7.31
2.78
10.34
11.68
12.60
7.24
4.07
7.60
5.84
5.15
5.21
7.58
3.91
7.54
20.33
7.54
24.74
10.48
4.44
5.51
11.14
7.15
6.52
4.02
7.36
5.24
4.51
6.90
8.46
4.13
CV_ub   CLimit24 (90%)
5.72
6.66
15.04
4.83
5.98
3.68
6.26
2.20
10.35
18.99
12.88
3.82
5.28
5.15
5.64
4.29
4.03
7.44
4.59
5.86
12.67
4.44
7.95
3.01
7.69
7.58
12.70
5.19
5.23
13.24
6.81
10.12
8.52
4.05
3.92
8.56
6.23
8.09
6.61
9.70
3.32
4.89
6.34
7.12
4.11
2.00
2.33
5.26
1.69
2.09
1.29
2.19
0.77
3.62
6.64
4.51
1.34
1.85
1.80
1.97
1.50
1.41
2.60
1.61
2.05
4.43
1.55
2.78
1.05
2.69
2.65
4.44
1.82
1.83
4.63
2.38
3.54
2.98
1.42
1.37
2.99
2.18
2.83
2.31
3.39
1.16
1.71
2.22
2.49
1.44

Rep_Org
972
1113
1127
55
86
660
703
1001
145
1136
685
437
563
776
1080
1002
520
851
821
1035
768
Summary
State
CA
UT
VA
AR
CA
MA
MS
LA
CA
WA
MI
GA
KS
NC
IA
MD
IN
PA
OR
TX
NY

Sites 02-04
17
17
21
24
15
24
17
25
30
22
28
23
13
23
15
20
34
25
32
56
53
1079
Req PEP
Checks
51
51
63
72
45
72
51
75
90
66
84
69
39
69
45
60
102
75
96
168
159
3237
PEP
checks
42
42
42
43
44
44
44
44
45
45
48
49
49
50
54
58
64
71
81
87
99
2313
Mean Abs
bias
4.62
8.40
5.16
8.42
6.21
9.23
8.04
12.39
8.85
5.37
6.50
3.51
8.48
7.80
9.64
7.62
5.38
4.03
7.62
7.78
9.75
7.62
CV_ub   CLimit24 (90%)
4.51
8.22
7.58
2.14
4.95
14.93
7.04
5.92
10.53
4.48
6.27
4.88
8.73
8.30
6.55
5.51
4.26
4.66
4.09
7.97
5.62
6.93
1.58
2.87
2.65
0.75
1.73
5.22
2.46
2.07
3.68
1.57
2.19
1.71
3.05
2.90
2.29
1.93
1.49
1.63
1.43
2.79
1.97
2.42

                Appendix A

Precision and Bias Statistical Calculations



Precision
Precision is estimated via duplicate measurements from collocated samplers of the same type. Precision is
aggregated at the reporting organization level quarterly, annually, and at the 3-year level. For each
collocated data pair, the relative percent difference, d_i, is calculated by Equation 3.

    d_i = (X_i - Y_i) / ((X_i + Y_i)/2) x 100                                          Equation 3

where X_i is the concentration of the primary sampler and Y_i is the concentration value from the audit
sampler.

The precision upper bound statistic, CV_ub, is a standard deviation with a 90% upper confidence limit
(Equation 4).

    CV_ub = sqrt[ (n·Σd_i² - (Σd_i)²) / (2n(n - 1)) ] x sqrt[ (n - 1) / χ²(0.1, n-1) ]      Equation 4

where n is the number of valid data pairs, the sums run over i = 1 to n, and χ²(0.1, n-1) is the 10th
percentile of a chi-squared distribution with n - 1 degrees of freedom.
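A minimal sketch of Equation 4 for a small set of collocated percent differences. The 10th-percentile chi-square value for 9 degrees of freedom (about 4.168) is hardcoded from standard tables to keep the example dependency-free, and the input differences are invented for illustration.

```python
import math

# Equation 4: CV with a 90% upper confidence bound. The chi-square 10th
# percentile for n-1 = 9 degrees of freedom (~4.168) is hardcoded from
# standard tables; the d_i values below are made up.
CHI2_10PCT_DF9 = 4.168

def cv_upper_bound(d, chi2_low):
    n = len(d)
    s = sum(d)
    ss = sum(x * x for x in d)
    cv = math.sqrt((n * ss - s * s) / (2 * n * (n - 1)))
    return cv * math.sqrt((n - 1) / chi2_low)

d = [3.1, -2.4, 5.0, 1.2, -0.8, 4.4, -3.6, 2.2, 0.5, -1.9]  # percent differences
cv_ub = cv_upper_bound(d, CHI2_10PCT_DF9)
```

Because the chi-square 10th percentile is below n - 1, the second factor inflates the plain CV, giving the conservative upper-bound statistic.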
Bias
PEP audits are performed by a PEP audit sampler to find measurement bias in the routine sampler relative
to the audit sampler. This is calculated below as a percent difference or individual bias, d_i, where i
represents a specific sampler (Equation 5).

    d_i = (Y_i - X_i) / X_i x 100                                                      Equation 5

where X_i represents the audit sampler and Y_i represents the routine sampler.

The bias value is based on the average individual bias and is calculated as m in Equation 6:

    m = (1/n) Σ_{i=1..n} d_i                                                           Equation 6
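Equations 5 and 6 can be sketched as follows; the audit/routine concentration pairs are invented for illustration.

```python
# Equation 5: individual bias as a percent difference against the PEP audit
# value; Equation 6: the average individual bias m.
def individual_bias(audit, routine):
    return (routine - audit) / audit * 100.0

# (X_i audit, Y_i routine) pairs in ug/m3 -- made-up example values
pairs = [(10.0, 10.5), (12.0, 11.4), (8.0, 8.4)]
d = [individual_bias(x, y) for x, y in pairs]
m = sum(d) / len(d)
```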

-------
                                                                               Project: PEP QAPP
                                                                         Appendix B (Document 2)
                                                                                  Revision No: 1
                                                                                  Date: 3/6/2009
                                                                        	Page 21 of 40
    Decision Framework for PM2.5 Performance Evaluation Program (PEP)
                                 Collocation Study Data

  Dennis Crumpler (USEPA OAQPS), Jennifer Lloyd (RTI International), William Warren-Hicks (EcoStat, Inc.)
                                      January 2009 (DRAFT)

Introduction

Historically, evaluating the precision and bias of PM2.5 Federal Reference Method (FRM) samplers
measuring low concentrations (i.e., less than 6 µg/m3) has been somewhat problematic. Previous methods
to evaluate collocated samplers involved comparisons of paired sampler results (X_i and Y_i) using
calculations based on relative percent difference (d_i):

    d_i = (Y_i - X_i) / ((X_i + Y_i)/2) x 100
Prior to the 2006 update to 40 CFR Part 58, Appendix A, the PEP utilized a non-traditional coefficient of
variation (CV) for flagging parking lot collocated sampler results as one of the sampler performance test
criteria. The "CV" was calculated using the relative percent difference (d_i) for each single check (i.e.,
pair of collocated sample results):

    "CV" = d_i / √2

For the majority of the historical data, this did not present a problem.  However, at low concentrations,
the percent difference could appear unacceptably large, when the true difference between values was
reasonably small. Less problematic but with similar consequence, percent differences based on higher
concentrations may have indicated that  the results were consistent when the true difference was
unacceptably large.

Figure 1 presents precision data from 2005 for the FRM network; State, local, and Tribal (SLT) samplers
are collocated with other SLT samplers. This graph shows that as the average PM2.5 concentration
decreases, the range (or scatter) of relative percent differences increases. This large dataset
convincingly illustrates the problem described above.

Figures 2 and 3 present the PEP collocation study results for 1999-2007. Figure 2 shows the difference
in concentration from each pair of collocated samplers against the mean concentration for the entire
sampling day. Figure 3 shows the "CV" (described above) plotted against the daily mean concentration.
(Note: In order to illustrate the effects on individual collocation studies, three different studies were
highlighted using different graph symbols to set them apart from the full data set.) These graphs again
illustrate that at lower concentrations, the scatter of values based on a relative percent difference
increases.

This type of analysis does not provide an adequate tool for identifying PEP samplers that do not perform
in a manner consistent with the fleet,  especially at low concentrations.  As a result, EPA is adopting a
different approach for identifying the nonconforming PEP samplers.

Data Sets
EPA utilized two different data sets in this evaluation. The first was the "parking lot" collocation results
from the Performance Evaluation Program (PEP) for years 1999 to 2007 (later narrowed to 2002-2007).
The second data set was obtained from EPA's AQS database and consisted of PM2.5 results for SLT
routine FRM samples paired with the coincident PEP audit results for years 2002-2007.

Analytical Approach

Normalized paired differences for all valid parking lot studies were calculated using the following
equations:

    N_ij,d = |D_ij,d| / x̄_d,    where D_ij,d = x_i,d - x_j,d

There are n samplers evaluated for a specific day, d, during the testing period. The absolute values of the
differences (D_ij,d) for all available pairings (i, j) among samplers are computed for each day during the
study. For each day of a testing period, these differences are then normalized and turned into a
percentage (N_ij,d x 100) by dividing each i, j pair by the daily mean, x̄_d, and multiplying by 100. After
normalization, the differences are considered comparable among individual studies conducted under
differing atmospheric conditions.
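The pairwise normalization above can be sketched as follows, with made-up concentrations for one day's collocated samplers:

```python
from itertools import combinations

# For one day's collocated results, take every sampler pairing (i, j),
# divide |x_i - x_j| by the daily mean, and express it as a percentage.
def normalized_differences(day_values):
    mean = sum(day_values) / len(day_values)
    return [abs(xi - xj) / mean * 100.0
            for xi, xj in combinations(day_values, 2)]

# three samplers on one day: 10.0, 10.5, and 9.5 ug/m3 (daily mean 10.0)
n_pct = normalized_differences([10.0, 10.5, 9.5])
```

The three pairings yield normalized differences of 5%, 5%, and 10% of the daily mean.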

Normalization also serves to dampen the effects of an increased percent difference at lower
concentrations (see Figure 4). However, the mathematical  characteristics of the dataset are still present
(i.e., the normalized percentages calculated from low concentrations still tend to be inflated relative to
those calculated from higher concentrations).  The normalized data lends itself to a more sensitive
analysis of the collocation study results and an associated tiered decision framework for identifying more
subtle deviations in sampler performance.

Histograms of the Normalized Data Sets

Histograms of the resulting normalized percent differences can be used to infer the expected among-
sampler precision based on historical data collected within the PEP. The true accuracy of any given
sampler can only be approximated; a reference gas with a known PM2.5 concentration is not readily
available for challenging samplers. However, the among-sampler precision of the PEP collocated
samplers provides a programmatic review of the general tendencies of the reference samplers to obtain
consistent results.

Figure 5 shows the distribution of normalized differences (%) for all available concentration data
generated during the parking lot studies. Figure 6 presents the same information for the 2002-2007 time
period. The histogram shown in the figures represents the empirical data. Three theoretical distributions,
calibrated from the empirical data, are shown on each plot. The Beta and lognormal distributions are
reasonable choices, with each distribution limited to values greater than zero. The normal distribution is
only displayed for reference, and is generally inappropriate for these data (not limited to normalized



values greater than zero). A few (less than 1%) of the normalized values are greater than 50%, and are not
shown in the graphic. As shown, the distributions are positively skewed with a median value of
approximately 4.5% to 4.6%.

For the 2002 - 2007 data subset (Figure 6), 95% of the observations have a normalized difference less
than 16.3%, 90% of the observations are less than 10.6%, and 75% of the observations are less than 5.8%.
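
Percentile summaries like those above can be reproduced from the raw normalized differences. A nearest-rank sketch follows (the exact percentile method EPA used is not stated in the text):

```python
import math

def empirical_percentile(values, p):
    """Return the p-th percentile (0 < p <= 100) of a sample using the
    nearest-rank method (no interpolation)."""
    s = sorted(values)
    k = math.ceil(p / 100.0 * len(s))  # 1-based nearest-rank index
    return s[max(0, k - 1)]

# Example with a synthetic sample: the 95th percentile of 1..100 is 95.
p95 = empirical_percentile(range(1, 101), 95)
```

Applied to the 2002 - 2007 normalized differences, calls at p = 95, 90, and 75 would reproduce the three values quoted above.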

Figures 7-9 present the data and distributions aggregated by EPA Region. Examination of the figures
indicates that the shapes of the distributions among Regions are similar, with the median values ranging
from 2.27% to 4.12%. The distributions provide empirical evidence that the results among EPA Regions
are reasonably consistent.

Figure 10 presents the normalized differences for the FRM data set. Differences between an SLT FRM
sampler and corresponding PEP audit sampler for all observations in the data set during the period 2002 -
2007 are calculated and plotted. The median value is 6.95%. Examination of the figure illustrates that the
FRM distribution has a larger variance than found in the parking lot studies.

For the FRM data in Figure 10, 95% of the observations have a normalized difference less than 47.4%,
90% of the observations are less than 29.4%, and 75%  of the observations are less than 15.0%.

Figures 11-13 present the FRM normalized differences by Region. Unlike the parking lot study data,
the distributions among EPA Regions are inconsistent.  In general, a larger probability mass in the tail is
seen than would be expected if the data were associated with a traditional lognormal distribution. The
medians among Regions range from 5.1% to 10.65%. Regions with larger variation in results include
Regions 2, 8, and 6.

The historical data described above  can be used to establish reasonable quality assurance criteria and
decision-frameworks for identifying samplers that may be generating  inaccurate results. A tiered decision
framework that does not rely on classic hypothesis testing has several advantages, including the
following: (1) hypothesis testing results are difficult to interpret and the available methods can easily
provide conflicting answers, (2) empirical decision-frameworks using historical information on a
program-wide basis can provide useful information reflecting among-sampler variation leading to quality
assurance criteria and decision pathways, (3) tiered decision frameworks can be designed to provide QA
officers with information that can lead to a final decision concerning sampler performance, and (4)
simplified numerical methods typically found in tiered  decision frameworks can be programmed into
most software systems.

Evaluation of Decision Methods

Several QA criteria for isolating samplers with anomalous measurements relative to other samplers were
evaluated using the historical PEP data. The number of samplers found to be inconsistent within each
collocation study was recorded. The steps in the QA decision-framework, and the approach to simulating
candidate QA criteria, are described below. Results of the  simulation follow in the next section.

Project: PEP QAPP
Appendix B (Document 2)
Revision No: 1
Date: 3/6/2009


Step 1. Reasonable measured concentrations - The measured PM2.5 concentrations are first screened
for reasonableness. The maximum detection limit stipulated by the CFR is 200 µg/m3. Therefore, on any
study day, if a sampler measures a PM2.5 concentration >200 µg/m3, all measurements associated with
that sampler are dropped from consideration (i.e., all of the sampler's measurements are discarded during
evaluation of the data collected over the multiple-day study period). The sampler cannot be used for PEP
audits without a thorough engineering evaluation. If more than one sampler measures a PM2.5
concentration >200 µg/m3 on a given day, then the entire collocation study warrants closer examination
before judgments on sampler bias can be made. Field scientists are advised that if an atmospheric
condition caused by an event (such as a nearby wildfire) occurs, then the collocation study should be
rescheduled.
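
Step 1 can be expressed as a short screening routine (a sketch; the sampler-ID-to-concentrations data layout is our assumption, not from the QAPP):

```python
MAX_VALID_CONC = 200.0  # ug/m3, the maximum detection limit per the CFR

def screen_study(measurements):
    """Step 1 screening sketch. `measurements` maps a sampler ID to its
    list of daily PM2.5 concentrations (ug/m3) for the study. Any sampler
    with a value above 200 ug/m3 is dropped entirely; if more than one
    sampler is dropped, the whole study needs closer review."""
    dropped = {sid for sid, concs in measurements.items()
               if any(c > MAX_VALID_CONC for c in concs)}
    retained = {sid: concs for sid, concs in measurements.items()
                if sid not in dropped}
    review_entire_study = len(dropped) > 1
    return retained, dropped, review_entire_study
```

A dropped sampler would then be held out of PEP audits pending the engineering evaluation described above.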

Step 2. Notable differences - Based on the examination of the histograms displaying the distribution of
normalized percent differences (Figures 5 - 13), QA criteria cutoffs of (a) >10%, (b) >15%, and (c)
>20% were evaluated.  Considerations included all the factors that might have biased the data for those
sampling events.  EPA also considered the number of samplers that might require further investigation
and possible maintenance (to avoid overburdening the program). EPA has decided that normalized
percent differences <15% will be accepted as within the expected normal range of within-sampler
precision historically observed within the PEP.  Normalized percent differences >15% will be flagged as
"notable differences."

Step 3. Relevance of notable differences - From a mathematical perspective, relatively small
inconsistencies in within-sampler precision can result in large computed percent differences if the
magnitudes of the PM2.5 concentrations are small. In such cases, the resulting percent difference is not
relevant from an engineering or human health perspective. From prior studies, EPA has determined that
the lowest ambient concentration that can be used for calculating within-sampler precision and network
bias relative to the PEP audit sampler is 3 µg/m3 (see footnote 2). Therefore, the following cases were
evaluated: (a) each sampler in a pair measured < 3 µg/m3, and (b) either sampler 1 or sampler 2
measured < 3 µg/m3.

Notwithstanding the historical data set, it is conceivable that only one sampler may capture a very small
amount of particulate. EPA has therefore selected case (a), using the AND operator, for evaluating the
relevance of notable differences based on concentrations < 3 µg/m3. If the concentrations from both
samplers in the comparison are < 3 µg/m3, then the normalized percent difference from the pair will not
be used to identify the sampler for further investigation. However, if only one of the two measurements is
< 3 µg/m3, then the normalized percent difference will remain in the data set for evaluating the collocated
samplers.
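
The AND-operator rule selected in Step 3 can be written directly (sketch; names are illustrative):

```python
MIN_RELEVANT_CONC = 3.0  # ug/m3, lowest concentration usable for precision

def difference_is_relevant(c1, c2):
    """Step 3 with the AND operator EPA selected: a notable difference
    is excluded only when BOTH samplers in the pair measured below
    3 ug/m3; if either sampler is at or above 3 ug/m3, the pair stays
    in the evaluation data set."""
    return not (c1 < MIN_RELEVANT_CONC and c2 < MIN_RELEVANT_CONC)
```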

Step 4. Sampler-specific relevant notable differences - For any collocation study (consisting of 3 or
more sampling days) the number of calculated normalized differences representing within-sampler
precision is dependent upon the number of samplers participating in the study, and can be large. For
example, ten samplers participating on each of three days results in 45 normalized differences on each
day, and 135 differences over a 3-day period. One objective of the collocation exercise is to identify
samplers that consistently generate larger differences than are normally expected. If, for example, a
sampler is involved in a single notable difference over the course of 3 days, this finding could easily be
due to random variation within the expected range  of differences. Therefore, to specify individual
samplers for further investigation, the following criteria were tested: (a) there must be more than one
notable difference computed during the collocation study, and (b) the sampler must be involved with at
least 50% of the relevant notable differences computed over the entire collocation study. EPA determined
this to be a reasonable approach for identifying inconsistently performing samplers.

2 See proposal at 71 FR 2728, January 17, 2006, and promulgation at 71 FR 61255, October 17, 2006.
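
Steps 2 - 4 together can be sketched as one routine that tallies pairwise normalized differences and applies both criteria (the data layout and function name are illustrative assumptions, not from the QAPP):

```python
from itertools import combinations

def samplers_to_investigate(daily_concs, cutoff=15.0, min_conc=3.0):
    """`daily_concs` is a list of per-day dicts mapping sampler ID to its
    PM2.5 concentration (ug/m3). A sampler is flagged when (a) the study
    produced more than one relevant notable difference and (b) the
    sampler appears in at least 50% of them."""
    notable = []  # sampler pairs whose difference was relevant and notable
    for day in daily_concs:
        for s1, s2 in combinations(sorted(day), 2):
            c1, c2 = day[s1], day[s2]
            if c1 < min_conc and c2 < min_conc:
                continue  # Step 3: excluded by the AND operator
            mean = (c1 + c2) / 2.0
            if mean > 0 and abs(c1 - c2) / mean * 100.0 > cutoff:
                notable.append((s1, s2))  # Step 2: notable difference
    if len(notable) <= 1:
        return set()  # criterion (a): more than one notable difference
    counts = {}
    for s1, s2 in notable:
        counts[s1] = counts.get(s1, 0) + 1
        counts[s2] = counts.get(s2, 0) + 1
    # criterion (b): involved in at least 50% of the notable differences
    return {s for s, n in counts.items() if n >= 0.5 * len(notable)}
```

With ten samplers per day, each day contributes C(10, 2) = 45 pairwise differences, which is how a 3-day study yields the 135 differences mentioned above.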

Results of Simulation

PEP collocation studies should include at least three days of testing; however, historical data show that
the testing period has ranged from 1 to 3 days. During a study, the number of samplers evaluated on any
specific day could vary. The largest number of samplers on a specific day was  14. Table 1 presents a
program-level summary of the simulation results.

                     Table 1. Program-Level Summary of Simulation Results

                                                      Normalized   Normalized   Normalized
                                                        Percent      Percent      Percent
                                                      Difference   Difference   Difference
                                                         >10%         >15%         >20%
  Total number of notable differences:                  1,017          515          349
  Number of notable differences involving
  concentrations < 3 µg/m3:
      using AND operator                                   72           44           30
      using OR operator                                   116           80           60
  Minimum number of samplers requiring evaluation:
      using AND operator                                  108           59           46
      using OR operator                                   108           56           42

  Note: The total number of differences from paired sample results in the PEP data set is 11,405.

Examination of the study-level results and the Table 1 program-level summary provides the following
findings:

    a.   Changing the QA criteria for the identification of notable differences from >10% to >15%
        decreased the number of paired sampler results where further evaluation would be warranted by
        49% (this is consistent with the log-normal distribution of differences presented in the histogram).
        A change from >10% to >20% results in a reduction in differences remaining for evaluation by
        65%.

    b.   The choice of the two logical operators (AND / OR) for identifying differences between samples
         with concentrations < 3 µg/m3 is relatively significant, with the number of differences identified
         using the OR operator (116) representing as much as a 62% increase over those identified using
         the AND operator (72) for the >10% scenario. However, the overall effect of changing the logical
         operator on the identification of specific samplers for evaluation is small. For example, the
         minimum number of samplers identified for the >15% scenario was 59 for the AND operator
         versus 56 for the OR operator. This evaluation reinforces the decision to use the AND operator in
         the analysis.

    c.   The cascading decision framework results in a minimum of 108 PEP sampler evaluations using
        the >10% criteria. Note that the 108 evaluations are based on samplers within a study that were
        involved in the greatest number of calculated differences meeting the decision criteria. For some
        collocation studies, EPA  may need to further evaluate more than one sampler with differences
        exceeding the decision criteria. The overall effect of excluding the differences associated with
        samplers having concentrations < 3 µg/m3, prior to selecting samplers for further review, is
        minimal.
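
The reductions cited in finding (a) follow directly from the Table 1 totals:

```python
# Totals of notable differences at each candidate cutoff (from Table 1).
total_10pct, total_15pct, total_20pct = 1017, 515, 349

reduction_15 = 1 - total_15pct / total_10pct  # ~0.494, the ~49% reduction
reduction_20 = 1 - total_20pct / total_10pct  # ~0.657, the ~65% reduction
```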



    d.  Examination of the historical data shows that specific samplers in some Regions have repeatedly
       exceeded the decision criteria.  EPA will take proactive steps to correct or replace those samplers
       which exhibit repetitive exceedances.

Conclusion

The decision framework outlined above provides a logical method for identifying individual
samplers that display inconsistency in PM2.5 measurements relative to other samplers. EPA will continue
to evaluate the selected decision criteria as more collocation studies are conducted.

To facilitate efficiency and ensure consistency, the decision framework will be implemented using
automated programming methods. Following each collocation study, EPA will review the program
output.  Any samplers identified as displaying inconsistent measurements will be further investigated. If
the overall collocation study results show a high number of notable differences, EPA will investigate not
only the samplers but also the filter handling process for all personnel involved in the collocation study.


                     Figure 1. 2005 Precision Data for FRM Network
                     (SLT samplers collocated with other SLT samplers)

[Figure omitted in this text version: scatter plot versus average collocated PM2.5
concentration, 0 - 30 µg/m3.]

   Figure 2. PEP Collocation Studies: Paired Concentration Differences vs. Daily Mean
                                     1999 - 2007

[Figure omitted in this text version: scatter plot of differences between all paired
concentrations (1999 - 2007) versus daily mean (µg/m3), with three individual studies
highlighted: 12/10/02, 12/11/02, and 12/15/02; 2/21/03 and 2/22/03; and 9/26/06 - 9/29/06.]

      Figure 3. PEP Collocation Studies: "CV" for Paired Concentrations vs. Daily Mean
                                        1999 - 2007

[Figure omitted in this text version: scatter plot of "CV" for all paired concentrations
(1999 - 2007) versus daily mean (µg/m3), with the same three studies highlighted as in Figure 2.
"CV" = relative percent difference / sqrt(2). Note: Prior to 2006, the acceptance criterion for
collocation study data was "CV" <= 10%.]

     Figure 4. PEP Collocation Studies: Normalized Paired Differences vs. Daily Mean
                                      1999 - 2007

[Figure omitted in this text version: scatter plot of percent differences normalized to the daily
mean (1999 - 2007) versus daily mean (µg/m3), with the same three studies highlighted as in
Figure 2.]

        Figure 5. PEP Collocation Studies: Distribution of Paired Differences
                                    1999 - 2007

[Figure omitted in this text version: histogram of normalized paired differences (%) with fitted
Normal, Lognormal, and Beta curves. Differences greater than 50% are not shown (<1% of total
observations). Summary statistics: N = 11,219; Mean = 4.50; Median = 2.78; Std Dev = 5.67;
Minimum = 0.00; Maximum = 49.94.]

        Figure 6. PEP Collocation Studies: Distribution of Paired Differences
                                    2002 - 2007

[Figure omitted in this text version: histogram of normalized paired differences (%) with fitted
Normal (Mu = 4.6326, Sigma = 5.7426), Lognormal, and Beta curves. Differences greater than 50%
are not shown (<1% of total observations).]

        Figure 7. PEP Collocation Studies: Distribution of Paired Differences
                            Regions 1 - 4: 2002 - 2007

[Figure omitted in this text version: histograms of normalized paired differences (%) for each
Region, with fitted Normal, Lognormal, and Beta curves. Differences greater than 50% are not
shown (<1% of total observations). Summary statistics recovered from the source:

  Region      N     Mean   Median   Std Dev   Minimum   Maximum
     1       423    5.50     3.75     5.72      0.00     32.44
     2       330    3.69     2.74     3.40      0.02     18.28
     3       555    3.85     2.69     4.04      0.00     30.17
     4      3519    3.98     2.27     5.60      0.00     49.94
]

        Figure 8. PEP Collocation Studies: Distribution of Paired Differences
                            Regions 5 - 8: 2002 - 2007

[Figure omitted in this text version: histograms of normalized paired differences (%) for each
Region, with fitted Normal, Lognormal, and Beta curves. Differences greater than 50% are not
shown (<1% of total observations). Summary statistics recovered from the source ("--" = not
legible in the source):

  Region      N     Mean   Median   Std Dev   Minimum   Maximum
     5       402    6.59     4.12     7.46      0.00     39.20
     6      1810    4.53     3.20     4.87        --        --
     7       680    4.00     2.65     4.88      0.00     38.43
     8       404    5.68     4.11     5.70        --     44.08
]

        Figure 9. PEP Collocation Studies: Distribution of Paired Differences
                           Regions 9 - 10: 2002 - 2007

[Figure omitted in this text version: histograms of normalized paired differences (%) for each
Region, with fitted Normal, Lognormal, and Beta curves. Differences greater than 50% are not
shown (<1% of total observations). Summary statistics recovered from the source:

  Region      N     Mean   Median   Std Dev   Minimum   Maximum
     9       236    6.96     4.10     8.24      0.04     40.53
    10       797    6.59     3.92     7.55      0.00     46.99
]

        Figure 10. PEP Audits of SLT Samplers: Distribution of Paired Differences
                                      2002 - 2007

[Figure omitted in this text version: histogram of normalized paired differences (%) between SLT
FRM samplers and PEP audit samplers, with fitted Normal, Lognormal, and Beta curves. Differences
greater than 50% are not shown (<3% of total observations). Summary statistics: N = 4,516;
Mean = 9.98; Median = 6.95; Std Dev = 9.59; Minimum = 0.00; Maximum = 50.00.]

   Figure 11. PEP Audits of SLT Samplers: Distribution of All Paired Differences
                            Regions 1 - 4: 2002 - 2007

[Figure omitted in this text version: histograms of normalized paired differences (%) between SLT
FRM samplers and PEP audit samplers for each Region, with fitted Normal, Lognormal, and Beta
curves. Differences greater than 50% are not shown (<3% of total observations). Summary
statistics recovered from the source:

  Region      N     Mean   Median   Std Dev   Minimum   Maximum
     1       321   10.29     6.92    10.01      0.00     45.45
     2       272   12.35     9.20    11.44      0.00     50.00
     3       499    6.78     5.10     6.81      0.00     45.75
     4       733    8.55     6.21     8.10      0.00     49.14
]

   Figure 12. PEP Audits of SLT Samplers: Distribution of All Paired Differences
                            Regions 5 - 8: 2002 - 2007

[Figure omitted in this text version: histograms of normalized paired differences (%) between SLT
FRM samplers and PEP audit samplers for each Region, with fitted Normal, Lognormal, and Beta
curves. Differences greater than 50% are not shown (<3% of total observations). Summary
statistics recovered from the source:

  Region      N     Mean   Median   Std Dev   Minimum   Maximum
     5       563    9.19     5.76    10.01      0.00     49.54
     6       561   10.55     7.69     9.47      0.00     48.89
     7       422   10.79     8.71     8.74      0.00     48.82
     8       316   13.84    10.65    12.02      0.00     50.00
]

-------
                                                                     Project: PEP QAPP
                                                               Appendix B (Document 2)
                                                                         Revision No: 1
                                                                         Date: 3/6/2009
                                                               	Page 39 of 40
  Figure 13. PEP Audits of SLT Samplers: Distribution of All Paired Differences
                          Regions 9 - 10: 2002 - 2007
Normalized Paired Differences Greater than 50% Not Shown (<3% of total observations)

[Figure 13 panels (histogram residue removed): Normalized Paired Difference (%) with fitted Normal, Lognormal, and Beta curves. Recoverable panel information:
Region 9: N=505, Mean=10.21, Median=7.30, Std Dev=9.53, Minimum=0.00, Maximum=48.77; Normal(Mu=10.214, Sigma=9.5328), Lognormal(Theta=-0, Shape=1.5, Scale=1 [remainder illegible in source]), Beta(Theta=-0.01, Scale=51, a=0.77, b=2.9 [possibly truncated in source]).
Region 10: N=324, Mean=11.00, Median=8.00, Std Dev=10.09, Minimum=0.00, Maximum=48.65; Normal(Mu=10.999, Sigma=10.086), Lognormal(Theta=-0, Shape=1.9, Scale=1.7), Beta(Theta=-0.01, Scale=51, a=0.66, b=2.33).]

-------
                                                 Project: PEP QAPP
                                           Appendix B (Document 2)
                                                     Revision No: 1
                                                     Date: 3/6/2009
                                          	Page 40 of 40
[This page intentionally left blank.]

-------
                               Appendix C


              Training Certification Evaluation Forms

The following forms will be used by the PEP to certify that PM2.5 field and laboratory personnel have
performed environmental data operations at a satisfactory level.

-------
[This page intentionally left blank.]

-------
Trainee/Operator's Name
Date
                     Field Performance Examination Checklist
STANDARD OPERATING PROCEDURE
Section 2.1 Equipment Inventory and Storage
1. Field Scientist has a general understanding of the requirements for
inventorying and procuring equipment
ACCEPT

RETEST

Notes:
Section 2.2 Communications
1. Field Scientist demonstrates general knowledge of the communication
requirements
2. Field Scientist understands how to document communications (e.g., e-mails,
phone calls)
3. Field Scientist understands when and how often to talk with the State, local, or
Tribal entity
4. Field Scientist understands requirements for the monthly progress report and
the expected information needed for it
ACCEPT




RETEST




Notes:
Section 2.3 Preparation for PEP Sampling Events
1. Field Scientist understands the requirements and uses for the Site Data Sheet
2. Field Scientist understands the purpose for a site visit
3. Field Scientist knows the procedure for audit event equipment preparation
4. Field Scientist understands the notification procedures for scheduling an audit
event
5. Field Scientist knows the appropriate days to sample and when it is possible to
sample on a different schedule
ACCEPT





RETEST





Notes:
Section 3.1 Cassette Receipt, Storage, and Handling
1. Field Scientist knows the procedure for receiving filters from the laboratory
2. Field Scientist understands the critical filter holding time requirements
3. Field Scientist knows the procedure for storing filters at the field office,
during transport to the field, and if samples must come back to the field office
4. Field Scientist knows the procedure for handling unexposed and exposed
filters
ACCEPT




RETEST




Notes:

-------
STANDARD OPERATING PROCEDURE
Section 4.1 Sampler Transport and Placement
Field Scientist understands the procedure for safe transport of the main sampler
unit and travel cases to the sampling location
ACCEPT

RETEST

Notes:
Section 5.1 Sampler Assembly, Section 6.4 Sampler Disassembly
Field Scientist properly assembles the unit (overall)

Legs
AC power supply
Weather shroud (back plate)
Gill screen
Inlet assembly and downtube
WINS impactor assembly or VSCC inspection
Removal of transport filter cassette
Field Scientist properly powers the unit
Field Scientist properly sets date and time
Field Scientist properly disassembles the unit by storing components in correct
travel cases
ACCEPT











RETEST











Notes:
Section 6.5 Sampler Maintenance and Cleaning
Field Scientist properly identifies and performs maintenance on areas to be checked
at each visit (overall)

Water collector
Impactor well or VSCC
O-rings of impactor assembly or VSCC assembly
Downtube
Field Scientist properly identifies and performs maintenance on the O-rings of
the inlet
ACCEPT






RETEST






Notes:

-------
STANDARD OPERATING PROCEDURE
Section 5.2 Leak Check Procedures
1. Field Scientist properly configures sampler for leak check
2. Field Scientist navigates to the correct "screen"
3. Field Scientist slowly releases vacuum
4. Field Scientist has an awareness of the internal leak check procedure
5. Field Scientist records data on the Field Data Sheet correctly
6. Field Scientist has an awareness of troubleshooting procedures
ACCEPT






RETEST






Notes:
Section 5.3 Barometric Pressure Verification
1. Field Scientist installs the barometric pressure transfer standard correctly
and allows time for the equipment to equilibrate to ambient conditions
2. Field Scientist navigates to the correct sampler "screen"
3. Field Scientist records data on the Field Data Sheet correctly
4. Field Scientist has an awareness of troubleshooting procedures
ACCEPT




RETEST




Notes:
Section 5.4 Temperature Verification
1. Field Scientist installs the temperature transfer standard correctly and
allows time for equipment to equilibrate to ambient conditions
2. Field Scientist navigates to the correct sampler "screen"
3. Field Scientist performs the ambient temperature verification properly
4. Field Scientist performs the filter temperature verification properly
5. Field Scientist records data on the Field Data Sheet correctly
6. Field Scientist has an awareness of troubleshooting procedures
7. Field Scientist has an awareness of the filter overheat ("F") flag.
ACCEPT







RETEST







Notes:

-------
STANDARD OPERATING PROCEDURE
Section 5.5 Flow Rate Verification
1. Field Scientist correctly installs and zeroes the flow transfer standard
2. Field Scientist correctly installs the test/transport filter cassette
3. Field Scientist navigates to the correct sampler "screen"
4. Field Scientist performs the flow rate verification properly
5. Field Scientist calculates percent difference for comparison to flow rate acceptance
criteria (if flow transfer standard is the primary or back-up flow standard)
6. Field Scientist compares the flow transfer standard with the sampler flow rate
7. Field Scientist compares the flow transfer standard with the design flow rate
8. Field Scientist records data on the Field Data Sheet correctly
9. Field Scientist returns the sampler to normal operation
10. Field Scientist has an awareness of troubleshooting procedures
ACCEPT










RETEST










Notes:
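Step 5's percent-difference arithmetic can be sketched as follows. A minimal sketch, assuming the 16.67 L/min FRM design flow and the +/-4% (versus the transfer standard) and +/-5% (versus design flow) limits of 40 CFR Part 50, Appendix L; the acceptance criteria actually applied are those in the PEP SOP, not this sketch:

```python
# Hedged sketch of the Section 5.5 flow rate verification arithmetic.
# Assumptions: FRM design flow of 16.67 L/min; acceptance limits of +/-4%
# against the flow transfer standard and +/-5% against the design flow
# (the 40 CFR Part 50, Appendix L values; confirm against the PEP SOP).
DESIGN_FLOW_LPM = 16.67

def percent_difference(measured: float, reference: float) -> float:
    """Signed percent difference of the sampler flow from a reference flow."""
    return (measured - reference) / reference * 100.0

def flow_check(sampler_flow: float, transfer_std_flow: float):
    """Return the two comparisons recorded on the Field Data Sheet."""
    vs_standard = percent_difference(sampler_flow, transfer_std_flow)
    vs_design = percent_difference(sampler_flow, DESIGN_FLOW_LPM)
    return {
        "vs_standard_pct": round(vs_standard, 2),
        "vs_design_pct": round(vs_design, 2),
        "pass": abs(vs_standard) <= 4.0 and abs(vs_design) <= 5.0,
    }
```

Both comparisons are computed so the Field Scientist can record them separately, as steps 6 and 7 require.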
Section 5.6 Preparing to Sample
1. Field Scientist attaches the inlet assembly
2. Field Scientist installs the WINS impactor assembly or VSCC
3. Field Scientist completes the installation (checks that the sampler is secure and that
the inlet is level, puts away installation tools and shipping materials, covers
electrical connections)
ACCEPT



RETEST



Notes:
Section 6.1 Conducting the Filter Exposure
1. Field Scientist installs the field blank filter cassette into the sampler (performs
additional steps, including inspection, documentation of Cassette ID, and
placement into a 3" x 5" bag)
2. Field Scientist has an awareness of trip blank procedures
3. Field Scientist installs the Routine PE filter cassette into the sampler
4. Field Scientist programs the Cassette ID and the AQS Site Code into the sampler
5. Field Scientist programs the sampler to run for the next day
6. Field Scientist programs the sampler to run the day after next
ACCEPT






RETEST






Notes:

-------
STANDARD OPERATING PROCEDURE
Section 6.2 Sample Recovery and Data Download
1. Field Scientist records information on the Field Data Sheet from the sampling run
2. Field Scientist removes the filter cassette from the sampler (performs additional
steps, including inspection, documentation, and placement into a 3" x 5" bag)
3. Field Scientist performs data download onto a laptop computer and a 3.5" disk (or
other portable storage media)
ACCEPT



RETEST



Notes:
Section 6.3 Filter Packing and Shipment
1. Field Scientist performs packing procedure correctly
2. Field Scientist includes all necessary items in the shipping cooler (filter cassettes,
ice packs, min/max thermometer, documentation, data storage media)
3. Field Scientist demonstrates knowledge of the time requirements for shipment
ACCEPT



RETEST



Notes:
Section 7.1 Chain-of-Custody Form and Field Data Sheet
1. Field Data Sheet(s) have been appropriately and completely filled out
2. Chain-of-Custody Form(s) have been appropriately and completely filled out
ACCEPT


RETEST


Notes:
Section 8.1 Quality Assurance/Quality Control
1. Field Scientist demonstrates general knowledge of the required QA activities for
the PEP
2. Field Scientist has awareness of how frequently the QA/QC activities should be
conducted
3. Field Scientist knows procedures for scheduling, ordering filters, sampler set-up,
and conduct of "parking lot" collocation studies. Field Scientist understands how
to interpret results with respect to their Region's samplers.
ACCEPT



RETEST



Notes:
Section 8.2 Field Data Verification and Validation
1. Field Scientist has awareness of field data verification procedures and how
frequently they should be performed
ACCEPT

RETEST

Notes:

-------
                                                      Project: PEP QAPP
                                                            Appendix C
                                                          Revision No: 1
                                                          Date: 3/6/2009
                                                            Page 8 of 12
STANDARD OPERATING PROCEDURE
Section 9.1 Information Retention
1. Field Scientist demonstrates general knowledge of the information retention
requirements
ACCEPT

RETEST

Notes:
Instructor's/Auditor's Name

-------
                                                                   Project: PEP QAPP
                                                                         Appendix C
                                                                       Revision No: 1
                                                                       Date: 3/6/2009
                                                                         Page 9 of 12
   Performance Examination Checklist for Weighing Laboratory Training

Trainee:                                            Date:
Evaluator:
Fully Successful:
WEIGHING LABORATORY ACTIVITY
Success
(Yes/No)
COMMENTS
Section 6.1 FILTER CONDITIONING (Pre-Sampling)
1. Determine how many filters need to be conditioned for
the next shipment
2. Select filter boxes for conditioning after checking the
appropriate form
3. Determine the filter conditioning period for the lot
based on earlier measurements
4. Check whether temperature and relative humidity
(RH) values in the conditioning environment are
within the acceptance criteria
5. Put on gloves and a lab coat
6. Use forceps to handle filters only by their rings
7. Inspect filters for defects
8. Transfer acceptable filters to the Petri dish, place the
cover three-quarters of the way across it, put the dish
on the tray, put the tray in the rack, and transfer
rejected filters to the envelope
9. Record data on the Filter Inventory Form
10. Conduct pre-sampling filter conditioning test with
three filters from the batch, weigh them periodically
until the weights stabilize, and keep filters in the
conditioning environment until the conditioning
period is complete




















SCORE OF 10 POSSIBLE

-------
Project: PEP QAPP
      Appendix C
    Revision No: 1
    Date: 3/6/2009
     Page 10 of 12
WEIGHING LABORATORY ACTIVITY
Success
(Yes/No)
COMMENTS
Section 8.1 MANUAL FILTER WEIGHING (Pre-sampling and Post-sampling)
1. Record the temperature and RH of the conditioning period
on the appropriate data form, and check whether they
meet the acceptance criteria
2. Put on gloves and a lab coat
3. Clean the microbalance's weighing chamber with the
appropriate brush, and then clean the balance table
surface and two forceps
4. Exercise the microbalance draft shield to equilibrate
the air in the weighing chamber
5. Zero (i.e., tare) and calibrate the microbalance
6. Use appropriate forceps to handle the working
standards
7. Weigh the first working mass reference standard,
record the value on the appropriate form, and compare
this value against verified value
8. Weigh the second working mass reference standard,
record the value on the appropriate form, and compare
this value against the verified value
9. Close the chamber door and check zero
10. Select the filter, the Record ID, and indicate the filter
type on the appropriate data form
11. Use the appropriate forceps to handle filters only by
their outside ring, and then move filters from the Petri
dishes to the antistatic strip and wait for 30 to 60
seconds
12. Move the filters from the antistatic strip to the center
of microbalance weighing pan and close the draft
shield
13. Weigh the filters and return them to the Petri dishes;
record weighing data on the appropriate form
14. At the end of the batch, reweigh one of the filters;
decide if more filters need duplicate weighings; record
weighing data on the laboratory data form; and then
check for agreement with previous values





























-------
Project: PEP QAPP
      Appendix C
    Revision No: 1
    Date: 3/6/2009
     Page 11 of 12
WEIGHING LABORATORY ACTIVITY
Success
(Yes/No)
COMMENTS
15. At the end of the batch, reweigh the two working
standards; record the working standard measurements
on the appropriate form; and then check for agreement
with verified values
16. Weigh laboratory blanks; record, check for agreement
with previous values, and return them to the Petri
dishes that are labeled as laboratory blanks
17. Save the appropriate filter for reweighing with the
next batch (only in post-sampling)

SCORE OF 17 POSSIBLE
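The end-of-batch duplicate-weighing check in step 14 can be sketched as below. The 15-microgram agreement tolerance is an assumption borrowed from the 40 CFR Part 50, Appendix L repeatability figure; the QAPP's own acceptance criterion governs in practice:

```python
# Hedged sketch of the step-14 duplicate weighing agreement check.
# Assumption: original and duplicate weighings agree if they differ by no
# more than 15 micrograms (the Appendix L repeatability value); the PEP
# QAPP's stated acceptance criterion is the binding limit.
TOLERANCE_UG = 15.0

def duplicate_agrees(original_mg: float, reweigh_mg: float) -> bool:
    """Compare a filter's duplicate weighing against its original value."""
    return abs(original_mg - reweigh_mg) * 1000.0 <= TOLERANCE_UG  # mg -> ug
```

If the duplicate fails to agree, the checklist directs the analyst to decide whether additional filters need duplicate weighings.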
Section 8.1 FILTER WEIGHING and Section 9.1 SHIPPING (Filter Shipping to Field)
1. Put on gloves and a lab coat
2. Select the weighed filter and a clean cassette; record
the Cassette ID on the appropriate form
3. Use forceps to handle the filters; hold the filter only by
the outside ring
4. Move filters from Petri dishes to the bottom section of
the filter cassette that has a backing screen, and secure
it with the cassette top
5. Record the Cassette ID on a new 3" x 5" antistatic
self-sealing bag
6. Put caps on the filter/cassette assemblies
7. Put capped filter/cassette assemblies into a labeled 3"
x 5" bag
8. Add the Cassette ID and pre-sampling weighing date
to the appropriate form
9. Select the filter cassette assemblies that are still
contained in the 3" x 5" bag from the appropriate form
10. Completely fill in the appropriate section of the Chain-
of-Custody Form (COC-2)
11. Place multiple filter cassette assemblies, each still in
3" x 5" bags, with appropriate COC forms in a larger
9" x 12" bag
12. Wrap in bubble wrap, pack them, fill out FedEx
shipping papers, and notify the Regional Office Field
Scientist of the shipment
























SCORE OF 12 POSSIBLE

-------
Project: PEP QAPP
      Appendix C
    Revision No: 1
    Date: 3/6/2009
     Page 12 of 12
WEIGHING LABORATORY ACTIVITY
Success
(Yes/No)
COMMENTS
Section 9.1 FILTER CHAIN OF CUSTODY (Filter Receipt)
1. Open the shipping container; find cassette assemblies,
Chain-of-Custody Form (COC-2), Field Data Sheet,
and the sampler data diskette; and check these over to
ensure that the shipment is complete and that data
sheets are appropriately filled out
2. Store the diskette in folder by Region
3. Completely fill out Part V of COC-2; record
temperature data on the Chain-of-Custody Form; move
the sealable bags to the refrigerator or weigh room,
depending on when post-sampling weighings will be
performed
4. Describe how long filter cassette assemblies in the
3" x 5" bag should be thermally equilibrated in the
weigh room before opening








SCORE OF 4 POSSIBLE
Section 6.1 FILTER CONDITIONING (Post-Sampling) and Section 9.1 FILTER CHAIN OF
CUSTODY (Filter Receipt)
1. Match the Cassette ID/Filter Type on the bag with the
information listed on COC-2
2. Remove the filter cassette assembly from the 3" x 5"
sealable bags
3. Remove the caps from filter/cassette assemblies
4. Put on gloves and remove the filter from the cassette
5. Use forceps to handle filters; hold the filter only by the
rings
6. Inspect the filters for defects
7. Move the filters from the cassettes to the Petri dishes,
label the Petri slide with the Filter ID and Filter Type,
place the cover three-quarters over the dish, put the
dish on the tray, and place the tray in the rack
8. Allow the filter to condition for no less than 24 hours;
conduct the post-sampling filter conditioning test with
three filters before the remainder of the batch is
weighed
















SCORE OF 8 POSSIBLE
Trainee 100% successful:

-------
                                                                           Project: PEP QAPP
                                                                                 Appendix D
                                                                               Revision No: 1
                                                                               Date: 3/6/2009
                                                                          	Page 1 of 6
                                     Appendix D

                              Data Qualifiers/Flags
A sample qualifier or a result qualifier consists of three alphanumeric characters that indicate the fact of,
and the reason that, the subject analysis (1) did not produce a numeric result; (2) produced a numeric
result that is qualified in some respect relating to the type or validity of the result; or (3) produced a
numeric result that, for administrative reasons, is not to be reported outside the laboratory.

-------
                                                  Project: PEP QAPP
                                                        Appendix D
                                                      Revision No: 1
                                                      Date: 3/6/2009
                                                 	Page 2 of 6
[This page intentionally left blank.]

-------
                                                                           Project: PEP QAPP
                                                                                 Appendix D
                                                                               Revision No: 1
                                                                               Date: 3/6/2009
                                                                          	Page 3 of 6
Field Qualifiers
Code    Definition: Description
CON     Contamination: Contamination, including observations of insects or other debris
DAM     Filter damage: Filter appeared damaged
EST     Elapsed sample time: Elapsed sample time out of specification
EVT     Event: Exceptional event expected to have affected sample (e.g., dust, fire, spraying)
FAC     Field accident: An accident in the field occurred that either destroyed the sample or rendered it not suitable for analysis
FAT     Failed temperature check, ambient: Ambient temperature check out of specification
FIT     Failed temperature check, internal: Internal temperature check out of specification
FLR 1/  Flow rate: Flow rate, 5-minute average out of specification
FLT     Filter temperature: Filter temperature differential, 30-minute interval out of specification
FMC     Failed multipoint calibration verification: Failed the initial multipoint calibration verification
FPC     Failed pressure check: Barometric pressure check out of specification
FSC     Failed single-point calibration verification: Failed the initial single-point calibration verification
FVL     Flow volume: Flow volume suspect
GFI     Good filter integrity: Filter integrity, upon post-sampling field inspection, looks good
LEK     Leak suspected: Internal/external leak suspected
SDM     Sampler damaged: Sampler appears to be damaged, which may have affected the filter

1/ Flag generated by sampling equipment
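In a data-processing script, qualifiers like these are typically carried as a code-to-definition lookup. A minimal sketch (the dictionary reproduces only a few rows of the table above and is illustrative, not an official machine-readable list):

```python
# Hedged sketch: carrying PEP field qualifiers alongside a sample record.
# The mapping reproduces a few of the table's rows for illustration only.
FIELD_QUALIFIERS = {
    "CON": "Contamination",
    "DAM": "Filter damage",
    "EST": "Elapsed sample time",
    "LEK": "Leak suspected",
    "SDM": "Sampler damaged",
}

def describe(code: str) -> str:
    """Return the definition for a qualifier code, flagging unknown codes."""
    return FIELD_QUALIFIERS.get(code.upper(), f"unknown qualifier: {code}")
```

Flagging unknown codes, rather than silently accepting them, mirrors the tables' role as the controlled vocabulary for qualifiers.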

-------
                                                                      Project: PEP QAPP
                                                                            Appendix D
                                                                          Revision No: 1
                                                                          Date: 3/6/2009
                                                                      	Page 4 of 6
Laboratory Qualifiers
Code    Definition: Description
ALT     Alternate measurement: Subject parameter determined by using an alternate measurement method; value believed to be accurate but could be suspect
AVG     Average value: Average value (used to report a range of values)
BDL     Below detectable limits: There was not a sufficient concentration of the parameter in the sample to exceed the lower detection limit in force at the time the analysis was performed. The numeric results field, if present, is at best an approximate value.
BLQ     Below limit of quantitation: The sample was considered above the detection limit, but there was not a sufficient concentration of the parameter in the sample to exceed the lower quantitation limit in force at the time the analysis was performed
CAN     Canceled: Analysis of this parameter was canceled and not performed
CBC     Cannot be calculated: Calculated analysis result cannot be calculated because an operand value is qualified
EER     Entry error: The recorded value is known to be incorrect, but the correct value cannot be determined to enter a correction
FBK     Found in blank: The subject parameter had a measurable value above the established QC limit when a blank was analyzed using the same equipment and analytical method. Therefore, the reported value may be erroneous.
FCS     Failed collocated sample: Collocated sample exceeded acceptance criteria limits
FFB     Failed field blank: Field blank samples exceeded acceptance criteria limits
FIS     Failed internal standard: Internal standards exceeded acceptance criteria limits
FLB     Failed laboratory blank: Laboratory blank samples exceeded acceptance criteria limits
FLD     Failed laboratory duplicate: Laboratory duplicate samples exceeded acceptance criteria limits
FLH     Failed laboratory humidity: Laboratory humidity exceeded acceptance criteria limits
FLT     Failed laboratory temperature: Laboratory temperature exceeded acceptance criteria limits
FQC     Failed quality control: The analysis result is not reliable because quality control criteria were exceeded when the analysis was conducted; the numeric field, if present, is an estimated value
FTB     Failed trip blank: Trip blank sample exceeded acceptance criteria limits
GSI     Good shipping integrity: Integrity of the filter upon receipt by shipping/receiving looked good
HTE     Holding time exceeded: Filter holding time exceeded acceptance criteria limits
ISP     Improper sample preservation: Due to improper preservation of the sample, it was rendered not suitable for analysis

-------
 Project: PEP QAPP
       Appendix D
     Revision No: 1
     Date: 3/6/2009
	Page 5 of 6
Code    Definition: Description
INV     Invalid sample: Due to a single flag or a number of flags or events, the sample was determined to be invalid
LAC     Laboratory accident: There was an accident in the laboratory that either destroyed the sample or rendered it not suitable for analysis
LLS     Less than lower standard: The analysis value is less than the lower quality control standard
LTC     Less than criteria of detection: Value reported is less than the criteria of detection
NAR     No analysis result: There is no analysis result required for this subject parameter
REJ     Rejected: The analysis results have been rejected for an unspecified reason by the laboratory. For any results where a mean is being determined, these data were not used to calculate the mean.
REQ     Re-queue for re-analysis: The analysis is not approved and must be re-analyzed using a different method
RET     Return(ed) for re-analysis: The analysis result is not approved by laboratory management, and re-analysis is required by the bench analyst with no change in the method
RIN     Re-analyzed: The indicated analysis results were generated from a re-analysis
STD     Internal standard: The subject parameter is being used as an internal standard for other subject parameters in the sample. There is no analysis result reported, although the theoretical and/or limit value(s) may be present.
UNO     Analyzed but undetected: Indicates material was analyzed for but not detected

-------
                                                  Project: PEP QAPP
                                                        Appendix D
                                                      Revision No: 1
                                                      Date: 3/6/2009
                                                 	Page 6 of 6
[This page intentionally left blank.]

-------
                                                               Project: PEP QAPP
                                                                    Appendix E
                                                                  Revision No: 1
                                                                  Date: 3/6/2009
                                                                   Page 1 of 20
                               Appendix E


                   Technical Systems Audit Forms

              (These forms are available from OAQPS in electronic format.)

Document                                                              Page
1.     PEP Field Technical Systems Audit Form	E-3
2.     PEP Sampler Audit Worksheet	E-11
3.     PEP Laboratory Technical Systems Audit Form	E-15

-------
                                                 Project: PEP QAPP
                                                        Appendix E
                                                     Revision No: 1
                                                     Date: 3/6/2009
                                                        Page 2 of 20
[This page intentionally left blank.]

-------
                           PEP Field Technical Systems Audit Form
Project: PEP QAPP
      Appendix E
    Revision No: 1
    Date: 3/6/2009
      Page 3 of 20
Part 1 - Quality System Documentation
and Facility Operations
Agency Being Evaluated:
Office or Lab Location:

AQS Site ID:
Assessor Name:
Observer(s) Name:
Assessment Date:

Section 1. Organization and Responsibilities
1. Field Operations Manager
Name:
Phone:
Address:
Affiliation:
Affiliation:

Affiliation:


Phone:
E-mail:
2. PEP Field Operators(s)
Name:
Phone:
Address:
Affiliation:


Phone:
E-mail:
Name:
Phone:
Address:
Affiliation:


Phone:
E-mail:

(Revised: March 4, 2009)

-------
                                   PEP Field Technical Systems Audit Form
Project: PEP QAPP
        Appendix E
     Revision No: 1
     Date: 3/6/2009
       Page 4  of 20
         Section 1.  Organization and Responsibilities (Cont'd)

Audit Questions (Block for the correct answer is highlighted yellow. If answer other than correct
answer, enter response in Comments Section.) (O = Other)  RESPONSE: Y  N  O
1   Does the SLT PEP operate under an approved quality assurance project plan (QAPP)?
    Date of QAPP approval?
2   Are there significant differences between the Federal PEP QAPP and the SLT PEP QAPP?
    If yes, list or briefly describe the differences in the comment section.
3   If yes, does the approved QAPP contain the field operations SOP(s)?
4   Is a copy of the approved QAPP and SOP available for review by field operators?
    If no, briefly describe how and where QA and QC requirements and SOPs are documented.
5   Have all appropriate personnel reviewed the QAPP?
6   Are there any deviations from the field SOP(s) at your site?
    If yes, briefly describe why.
7   Have the PEP Field operators attended PEP training? When?
    Lab Technicians, if applicable
Section 1 Comments (Place question number and comment)

-------
                           PEP Field Technical Systems Audit Form
Project: PEP QAPP
      Appendix E
    Revision No: 1
    Date: 3/6/2009
      Page 5 of 20
Section 2. Safety
Audit Questions (Block for the correct answer is highlighted yellow. If answer other than correct
answer, enter response in Comments Section.) (O = Other)  RESPONSE: Y  N  O
1   Is the field operator authorized to suspend a PEP audit in the event of a health or
    safety hazard? If not, then who?
2   Has the operator been trained in the particular hazards of the instrument/materials
    with which they are operating?
3   Are personnel outfitted with any required safety equipment (e.g., extreme weather
    clothing, harnesses, head gear, repellants)?
4   Are personnel trained regarding OSHA limits for manually lifting and carrying loads?
5   Are personnel trained regarding other safety issues and procedures?
Section 2 Comments (Place question number and comment)
Section 3. Sampler Siting
Use 40 CFR Part 58, Appendices A and E for siting requirements
Audit Questions (Block for the correct answer is highlighted yellow. If answer other than correct
answer, enter response in Comments Section.) (O = Other)  RESPONSE: Y  N  O
1   Has the auditor evaluated the site of the FRM sampler and PEP sampler used in this TSA to
    determine if it conforms to the siting requirements of 40 CFR 58, Appendices A and E?
2   Has permission been given for not complying with the siting criteria? If yes, please explain.
3   Are there any noticeable problems at the site that would affect sample integrity?
4   Are there any visible sources that might influence or impact the monitoring instrument?
    If present, list the influencing sources in the comment section.
Section 3 Comments (Place question number and comment)
(Revised: March 4, 2009)

-------
                                PEP Field Technical Systems Audit Form
Project: PEP QAPP
       Appendix E
    Revision No: 1
    Date: 3/6/2009
       Page 6 of 20
         Drawing of Site SLT Is Auditing During TSA

         Briefly draw the monitoring location and illustrate all obstructions, including distances to the nearest roadways
         and/or obstructions. Use the table below to indicate objects, distance from object to sampler (m), height of object
         (m), and orientation from sampler (degrees). Place the PM sampler in the middle of the drawing.

              Object(s)        Distance from sampler        Height of        Orientation from sampler
                               to object (m)                object (m)       to object (degrees)
          1
          2
          3
          4
          5
          6
          7
          8
          9
         10

         After your sketch, please photograph the sampler from the 8 cardinal directions, and then take photographs
         looking from the sampler in the 8 directions.
(Revised: March 4, 2009)

-------
                                      PEP Field Technical Systems Audit Form
Project: PEP QAPP
        Appendix E
     Revision No: 1
     Date: 3/6/2009
        Page 7 of 20
          Basic siting criteria from 40 CFR Appendix A and E

             1   The height of the inlet to the sampler should be between 2 and 15 meters above ground surface.

             2   For samplers located on roofs  or other structures, the minimum separation distance between the inlet and any structure should be
                greater than 2 meters.

             3   The sampler should be located away from obstacles so that the monitor is at a distance of at least twice the height of the obstacle.
                For example, if a tree 10 meters tall stands east of the sampler, the sampler would need to be placed at least 20 meters from the tree.

             4   An unrestricted air flow of 270° must exist around the inlet.

             5   If the sampler is located on the side of a building, a  180° air flow clearance is required.

             6   Sampler inlet should be placed at least 20 meters from the drip line of any tree.

             7   Minimum distance to any roadway is 10 meters, but this value is determined by the average daily number of vehicles (refer to
                40 CFR Part 58, Appendix E, for the exact table).

             8   The inlet for a co-location sampler and audit sampler should agree vertically within 1 meter.

             9   The closest horizontal distance to place a co-location sampler to a Lo-Vol sampler or Hi-Vol sampler is 1 and 2 meters, respectively.
                The maximum horizontal distance a co-location sampler can be from any sampler is 4 meters.

                Comments:
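As a pre-screening aid, a few of the numbered distance criteria above can be encoded directly. This is a simplified illustration only, not part of the official checklist; the function name, inputs, and the reduced rules are assumptions, and 40 CFR Part 58, Appendix E remains the authoritative source.

```python
# Illustrative pre-screen of basic siting criteria (simplified).

def siting_issues(inlet_height_m, obstacles, roadway_distance_m):
    """Return a list of likely siting problems.

    obstacles: list of (distance_to_sampler_m, obstacle_height_m) pairs.
    """
    issues = []
    # Criterion 1: inlet between 2 and 15 m above the ground surface.
    if not 2.0 <= inlet_height_m <= 15.0:
        issues.append("inlet height outside 2-15 m")
    # Criterion 3: obstacle distance at least twice the obstacle height.
    for dist, height in obstacles:
        if dist < 2.0 * height:
            issues.append(f"obstacle {height} m tall only {dist} m away")
    # Criterion 7: minimum 10 m from any roadway (the actual minimum is
    # traffic-dependent; see 40 CFR Part 58, Appendix E).
    if roadway_distance_m < 10.0:
        issues.append("roadway closer than 10 m")
    return issues

# Criterion 3's worked example: a 10 m tree only 15 m away is flagged.
assert siting_issues(4.0, [(15.0, 10.0)], 25.0) == [
    "obstacle 10.0 m tall only 15.0 m away"
]
```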
          Section 4.  Monitoring Site Information & Audit Event Planning

                 Block for the correct answer is highlighted yellow. If answer other than the correct answer,
                 enter response in Comments Section.  Responses: Y / N / O (O = Other)

             1    Does the SLT auditor have a Site Data Sheet for the site being audited?

             2    Does the SLT maintain a database to permanently store information?

             3    Is the sampling platform or set-up area clean and in good repair?

             4    Is there adequate room to perform the needed operations?

             5    Does the SLT PEP have a tentative audit plan for the current calendar year?

             6    Does the field operator have a working knowledge of the correct sampling days?
                  Explain in the comment section the contingency measures if planned audits do not take place.

          Section 4 Comments (Place question number and comment)
(Revised: March 4, 2009)

-------
                                     PEP Field Technical Systems Audit Form
     Project: PEP QAPP
             Appendix E
          Revision No: 1
          Date: 3/6/2009
             Page 8 of 20
          Section 5.  Sample Handling

                Audit Questions (Block for the correct answer is highlighted yellow. If answer other than the
                correct answer, enter response in Comments Section.)  Responses: Y / N / O (O = Other)

                Receiving:
            1   Does the Field Operator log information on the COC and initiate the FDS?

            2   Does Field Operations have adequate clean and conditioned temporary storage space?

            3   Are all samples handled to avoid contamination and/or loss of material during field operation?

            4   Does the operator know how to perform trip and field blank events?
                Have the operator show the steps. Document any discrepancy from the SOP.

                Ratings: Satisfactory = S, Unsatisfactory = U, Needs Review = R, Not Assessed = NA
            5   Observe the following handling steps for PEP samples, verifying that the auditor
                follows the sample handling SOPs correctly:

             a.  Receipt and temporary storage of sampling filters at the auditor's office facility
             b.  Documents receipt of sampling filter on chain-of-custody form
             c.  Inspection of the sampling filter prior to sampling
             d.  Installation of the sampling filter in the sampler
             e.  Retrieval of the exposed filter from the sampler after a sampling event
             f.  Completion of chain-of-custody and field data forms, and shipping package included
             g.  Filters transferred to the local operator's facility, following temporary transport procedures

            6   How are sample handling problems communicated and to whom?

          Section 5 Comments (Place question number and comment)
(Revised: March 4, 2009)

-------
                                 PEP Field Technical Systems Audit Form
Project: PEP QAPP
       Appendix E
    Revision No: 1
    Date: 3/6/2009
       Page 9 of 20
Section 6. Demonstration of Properly Setting Up and Running a PEP Audit
Ratings: Satisfactory = S, Unsatisfactory = U, Needs Review = R, Not Assessed = NA

Set-up of the Sampler:
   1   Auditor properly sets up the PEP sampler
   2   Auditor properly powers the unit
   3   Field Scientist properly sets the date/time
   4   Field Scientist properly conducts sampler performance verifications in the correct order:
        a.  Leak check
        b.  Ambient temperature measurement
        c.  Barometric pressure measurement
        d.  Flow rate setting and calibration
        e.  Filter temperature measurement
   5   Field Scientist properly programs the audit sampler for the subsequent sampling event
   6   Field Scientist properly recovers the exposed filter and downloads or records run data
   7   Field Scientist properly disassembles the unit and stores components in the correct transport cases

Section 6 Comments (Place question number and comment)

         Section 7.  Shipping

               Audit Questions (Block for the correct answer is highlighted yellow. If answer other than the
               correct answer, enter response in Comments Section.)  Responses: Y / N / O (O = Other)

           1   Is there adequate freezer space for blue ice on site or in the office?

           2   Does the site operator have knowledge of filter holding/use/shipping times?

           3   Are there weekend storage procedures in place?

           4   Are the coolers and samples being packed according to the SOPs? Have the site operator
               demonstrate the procedure and document any discrepancies.

         Section 7 Comments (Place question number and comment)
(Revised: March 4, 2009)

-------
                           PEP Field Technical Systems Audit Form
Project: PEP QAPP
      Appendix E
    Revision No: 1
    Date: 3/6/2009
     Page 10 of 20
Part 2 - MQOs for Audit Samplers, Calibrations, and Audit Devices
(The blue highlighted cells below will be filled by values entered on page 1.)
Monitoring Site Location:
AQS Site ID:

Assessment Date:

The following activities and acceptance criteria are in the PEP QAPP and PEP Field SOP (http://www.epa.gov/ttn/amtic/pmpep.html). They should
be consistent with the regulations at 40 CFR Part 50 Appendix L and Part 58 Appendix A.
Checks/Maintenance                                     Frequency       Requirement                          Last Date   Performed Correctly?
                                                                                                                        (Y / N / Other)
Clock check                                            Every run       Current date, time ± 5 minutes
Leak check                                             Every run       < 80 mL/min
Flow rate check                                        Every run       ±4% of sampler design FR & Ref Std
Filter temperature check                               Every run       Current temp ± 2 °C*
External temperature check                             Every run       Current temp ± 2 °C*
Ambient pressure check                                 Every run       Current pressure ± 10 mm Hg*
Inspect and, if necessary, empty water collector jar   Every run       Per service manual
Inspect/clean impactor or cyclone                      Every run       Per service manual
Inspect visible O-rings in the flow path               Every run       Per service manual
Clean sampler's inlet surfaces                         Quarterly       Per service manual
Clean main (first-stage) size-selective inlet          Quarterly       Per service manual
  (PM10 head)
Clean impactor housing and jet surfaces                Quarterly       Per service manual
Clean cyclones                                         Every 10 runs   Per service manual
                                                       and quarterly
Clean interior of sampler unit                         Quarterly       Per service manual
Check condition of sample transport containers         Quarterly       Per service manual
Clean sampler downtube                                 Quarterly       Per service manual
Inspect cooling air intake fan(s) and filter           Quarterly       Per service manual
Inspect all O-rings and reapply vacuum grease          Quarterly       Per service manual
  as needed
Inspect vacuum tubing, tube fittings, and other        Quarterly       Per service manual
  connections to the pump and electrical components
Clock check w/independent std                          Quarterly       Current date, time ± 5 minutes
Flow rate audit w/independent std                      Quarterly       ±4% of sampler design FR & Ref Std
Filter temperature check w/independent std             Quarterly       Current temp ± 2 °C*
External temperature audit w/independent std           Quarterly       Current temp ± 2 °C*
Ambient pressure audit w/independent std               Quarterly       Current pressure ± 10 mm Hg*
Flow rate calibration device                           Annually        Certify as NIST-traceable**
Temperature calibration device                         Annually        Certify as NIST-traceable**
Pressure calibration device                            Annually        Certify as NIST-traceable**
Flow rate audit device                                 Annually        Certify as NIST-traceable**
Temperature audit device                               Annually        Certify as NIST-traceable**
Pressure audit device                                  Annually        Certify as NIST-traceable**

* Comparison should be made to a NIST-traceable audit device.
** Certifications may be performed by EPA, its contractor, or by the manufacturer.

Are corrective actions in place when Measurement Quality Objectives (MQOs) are not met
(e.g., out-of-control calibration data)?   Y / N / Other

Part 2 MQO Comments
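The quantitative MQOs in the table above (flow within ±4% of the reference standard, temperatures within ±2 °C, pressure within ±10 mm Hg) reduce to absolute- and percent-difference checks. A minimal sketch follows; it is not part of the form, and the function names and sample readings are illustrative assumptions.

```python
# Illustrative MQO arithmetic; tolerances are copied from the table above,
# while function names and sample readings are assumptions.

def within(measured, reference, tol):
    """Absolute-difference MQO: pass if |measured - reference| <= tol."""
    return abs(measured - reference) <= tol

def pct_diff(measured, reference):
    """Percent difference of a sampler reading vs. the reference standard."""
    return 100.0 * (measured - reference) / reference

def flow_ok(sampler_lpm, ref_lpm):
    """Flow-rate MQO: within +/-4% of the reference standard reading."""
    return abs(pct_diff(sampler_lpm, ref_lpm)) <= 4.0

# Example checks using the tolerances in the table:
assert within(21.3, 20.0, 2.0)         # temperature check, +/-2 deg C
assert not within(748.0, 760.0, 10.0)  # pressure check, +/-10 mm Hg
assert flow_ok(16.2, 16.7)             # flow within 4% (about -3.0%)
```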

(Revised: March 4, 2009)

-------
                                  PEP Sampler Audit Worksheet
                                  Project: PEP QAPP
                                        Appendix E
                                     Revision No: 1
                                     Date: 3/6/2009
                                      Page 11 of 20
 PM2.5 Performance Evaluation Program
 U.S. Environmental Protection Agency

 Location:                                              Date:
 AQS Site ID:
 Latitude (if known):                                   Longitude (if known):

 Audit Information
 Auditor(s):                                            Affiliation:
 Site Operator:                                         Affiliation:
 Phone No.:
 Sampler Model:                                         Sampler S/N:
 Last Calibration Date:                                 Collocated?   Yes (X):        No (X):
 Reference Std Model:                                   Reference Standard S/N:
 Calibration Date:

 Significant Findings:
(Revised: March 4, 2009)

-------
                              PEP Sampler Audit Worksheet
Project: PEP QAPP
      Appendix E
   Revision No: 1
   Date: 3/6/2009
    Page 12 of 20
Location:                                               Date:
AQS Site ID:

Clock Test:
If local time is under daylight saving time, convert the reference standard to local standard time.
Daylight saving time begins for most of the United States at 2:00 a.m. on the second Sunday of
March. Time reverts to standard time at 2:00 a.m. on the first Sunday of November.

                    Date       Time (hh:mm)                Difference      5 minutes or less?
                               Ref Std       PQ200         (minutes)       Pass        Fail
Audit
Recalibrated

Leak Test (criterion: change < 5 cm H2O over a 2-minute interval):

                    Start (cm H2O)    Stop (cm H2O)    Difference         Pass        Fail
Initial audit
After correction

Flow Test:
For the reference standard, enter "UR" for under-range and "OR" for over-range flow readings.

Calibration:
                    Ref Std (L/min)   PQ200 (L/min)    % Difference       Less than 4%?
                                                                          Pass        Fail

Retest after calibration:
                    Ref Std (L/min)   PQ200 (L/min)    % Difference       Less than 4%?
                                                                          Pass        Fail

(Revised: March 4, 2009)
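The clock-test conversion described on the worksheet above (daylight saving wall time to local standard time) and the 5-minutes-or-less criterion can be sketched as follows. The function names and example times are illustrative assumptions; the sketch simply subtracts one hour when daylight saving time is in effect.

```python
from datetime import datetime, timedelta

# Illustrative clock-test arithmetic; not part of the worksheet.

def to_local_standard(wall_time, is_dst):
    """Subtract one hour when the wall clock is on daylight saving time."""
    return wall_time - timedelta(hours=1) if is_dst else wall_time

def clock_check(ref_std, sampler):
    """Pass if the sampler clock agrees with the reference within 5 minutes."""
    return abs((sampler - ref_std).total_seconds()) <= 5 * 60

ref = to_local_standard(datetime(2009, 6, 1, 14, 3), is_dst=True)
assert ref == datetime(2009, 6, 1, 13, 3)                 # 14:03 DST -> 13:03 standard
assert clock_check(ref, datetime(2009, 6, 1, 13, 7))      # 4 minutes: pass
assert not clock_check(ref, datetime(2009, 6, 1, 13, 9))  # 6 minutes: fail
```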

-------
                              PEP Sampler Audit Worksheet
Project: PEP QAPP
      Appendix E
   Revision No: 1
   Date: 3/6/2009
    Page 13 of 20
Location:                                               Date:
AQS Site ID:

Reference Standard vs. Design Flow:

Channel 1:
                    Ref Std (L/min)   PQ200 (L/min)    % Difference       Less than 4%?
                                      16.70                               Pass        Fail

Retest after calibration:
                    Ref Std (L/min)   PQ200 (L/min)    % Difference       Less than 4%?
                                      16.70                               Pass        Fail

Ambient Temperature Test:
                    Ref Std (°C)      PQ200 (°C)       Difference         Less than 2 degrees?
                                                                          Pass        Fail
Audit
Retest after recalibration

Filter Temperature Test:
                    Ref Std (°C)      PQ200 (°C)       Difference         Less than 2 degrees?
                                                                          Pass        Fail
Audit
Retest after recalibration

Pressure Test:
                    Ref Std (mm Hg)   PQ200 (mm Hg)    Difference         Less than 10 mm Hg?
                                                                          Pass        Fail
Audit
Retest after recalibration

(Revised: March 4, 2009)

-------
                                                 Project: PEP QAPP
                                                       Appendix E
                                                     Revision No: 1
                                                     Date: 3/6/2009
                                                      Page 14 of 20
[This page intentionally left blank.]

-------
                                                    PEP Laboratory Technical Systems Audit Form
Audit Location and Attendees
(include names, affiliations, and addresses)

Laboratory:
Date(s) of Audit:
Audit Report Date:

Summary of Findings:

Auditor(s):                          Phone No.:           E-mail:
Observer(s):                         Phone No.:           E-mail:
EPA Lab Oversight Personnel:         Phone No.:           E-mail:
Laboratory Supervisor:               Phone No.:           E-mail:
Laboratory Analyst(s):               Phone No.:           E-mail:

(Revised: March 4, 2009)                                                 1 of 6

-------
                                                    PEP Laboratory Technical Systems Audit Form
Project: PEP QAPP
Appendix E
Revision No: 1
Date: 3/6/2009
Page 16 of 20

Finding Level: 1 = exemplary; 2 = satisfactory; 3 = needs small improvement; 4 = unsatisfactory and
    needs significant attention; 5 = critical or catastrophic condition that needs immediate attention

(For each question, record Yes / No / N/A, a response or comment, and a finding level.)

Section 1   Laboratory Management and Quality System Requirements
1-1.   Does the Laboratory operate under a Quality Management Plan?
1-2.   Are the EPA organizational structure and responsibilities of oversight personnel well documented
       and understood?
1-3.   Does the Laboratory operate under a unique Quality Assurance Project Plan?
1-4.   When was it last updated?
1-5.   Does the PEP Laboratory operation follow an up-to-date SOP?
1-6.   When was it last updated?
1-7.   How are QA documents controlled?
1-8.   How often are QA documents reviewed for accuracy?
1-9.   Are obsolete documents, such as the old version of an SOP, retained?
1-10.  How long are technical records maintained before they are disposed of?
1-11.  Who is authorized to halt program activities due to inadequate quality?
1-12.  What kind of internal audit procedures exist?
1-12a. Existence of Class 1 or 0 ASTM-certified weights
1-12b. Last certification date(s)
1-12c. Existence of NIST-traceable temperature and humidity standard and data logger
1-12d. Accuracy
1-12e. Last certification date(s)
1-13.  Are reports available from the most recent internal and previous external audits?
1-14.  Were there any significant findings and, if so, what were they?
1-15.  Are reports available for recent preventive or corrective actions associated with the last internal
       and external audits?
1-16.  Is there a primary and a back-up PM2.5 PEP Laboratory Analyst?
1-17.  Have PM2.5 PEP Laboratory Analysts been thoroughly trained on the PEP field and laboratory
       operations and specifically on lab support functions?
1-18.  Does the EPA Laboratory Manager conduct an annual review or audit of procedures with the
       analysts and the laboratory supervisor?
1-19.  In the event of a catastrophic failure or suspension of the Laboratory, is there a back-up
       laboratory and contingency plans for continued support?

(Revised: March 4, 2009)                                                 2 of 6

-------
PEP Laboratory Technical Systems Audit Form
Project: PEP QAPP
Appendix E
Revision No: 1
Date: 3/6/2009
Page 17 of 20

(For each question, record Yes / No / N/A, a response or comment, and a finding level.)

Section 2   General Facilities
2-1.   Are aisles and hallways to and from the filter processing and weighing lab free of obstructions?
2-2.   Is access to the weighing lab limited and controlled?
2-3.   Is there a place in the logbook for recording entry and use of the weighing lab?
2-4.   Are samples maintained in a secure area at all times after being delivered to the facility?
2-5.   Does adequate refrigerated storage capacity exist?
2-6.   Are samples that are stored in the refrigerator logged in, maintained on an inventory list, and
       logged out at disposal? Provide an example page.
2-7.   Does the operator keep the filter-handling area neat and clean?
2-8.   Is the weighing lab clear of extraneous papers, trash, and especially dust?
2-9.   Is there a sticky mat at the entrance of the lab, and has the top sheet been removed recently,
       revealing a sticky surface to collect dirt and dust from shoes?
2-10.  Is the analytical balance positioned on a weighted table?
2-11.  Are supplies and instruments such as alcohol wipes and tweezers stored away neatly when not
       in use?
2-12.  Is there a climate control system for the weighing lab, and is it engaged and working properly?
2-13.  Is there a monitor or strip-chart recorder that is easily read by the lab attendants and in plain
       view?
2-14.  Does the climate control system have an alarm to indicate when conditions inside the weighing
       lab are not suitable for weighing samples?
2-15.  Please provide strip charts or graphs of the last two weeks' recordings.
2-16.  Has the climate control system satisfactorily maintained critical parameters for the last two
       months? If "no," explain.
2-17.  Is there a routine maintenance or service contract in place for the climate control system?
2-18.  Are important routine maintenance procedures for the climate control system conducted?
       Produce records or data.

(Revised: March 4, 2009)                                                 3 of 6

-------
PEP Laboratory Technical Systems Audit Form
Project: PEP QAPP
Appendix E
Revision No: 1
Date: 3/6/2009
Page 18 of 20

(For each question, record Yes / No / N/A, a response or comment, and a finding level.)

Section 3   General PM2.5 PEP Laboratory Procedures
3-1.   Are logbooks kept up to date and properly filled in, clearly and completely?
3-2.   How are records of critical consumables (such as filter lot numbers) maintained?
3-3.   Do analysts have a supply of disposable laboratory jackets and shoe covers that attract and
       retain dust particles?
3-4.   Are filters handled with the necessary care and finesse to avoid contamination and/or loss of
       material?
3-5.   Does the analyst thoroughly understand filter conditioning requirements and procedures for
       pre-exposed and post-exposed filters?
3-6.   Are new filters placed in the conditioning environment immediately upon arrival and stored there
       not less than 24 hours prior to weighing?
3-7.   Is the analytical balance located in the same controlled environment in which the filters are
       conditioned?
3-8.   Are filters conditioned at the same environmental conditions before both the pre- and
       post-sampling weighings?
3-9.   Do analysts have a thorough understanding of the timing requirements for the life cycle of a PEP
       filter?

Section 4   Balance Maintenance and Weighing Procedures
4-1.   Is the balance serviced under a calibration and maintenance plan? Specify the date of last
       service.
4-2.   Is there an up-to-date, formal logbook or file for balance maintenance?
4-3.   Does the analytical balance used to weigh filters have a readability of ±1 µg?
4-4.   Does a back-up balance exist that has a current calibration certificate?
4-5.   Does a unique set of Class 0 reference weight standards exist for routine calibration checks of
       the balance?
4-6.   Does a certification exist? Last certification date? Certifying laboratory?
4-7.   Are regular (e.g., daily, when in use) calibration checks made and recorded?
4-8.   The analyst should demonstrate weighing of calibration weights. Please produce results for the
       last two weighing sessions.
4-9.   Does a device exist for removal of static charges from the filter prior to and during the weighing
       process?
4-10.  The analyst should demonstrate the use of the antistatic device(s) during a weighing exercise.

(Revised: March 4, 2009)                                                 4 of 6

-------
PEP Laboratory Technical Systems Audit Form
Project: PEP QAPP
Appendix E
Revision No: 1
Date: 3/6/2009
Page 19 of 20

(For each question, record Yes / No / N/A, a response or comment, and a finding level.)

4-11.  Are the filters weighed immediately following the conditioning period, without intermediate or
       transient exposure to other conditions or environments?
4-12.  Are both the pre- and post-sampling weighings performed on the same analytical balance?
4-13.  The analyst should demonstrate weighing of conditioned filters. Data should be provided for the
       last two sets of filters.
4-14.  Does the analyst understand the function and importance of lab blanks?
4-15.  The analyst should describe what a lab blank is and how it is incorporated into the weighing
       routine.
4-16.  Does the analyst understand the function and importance of field and trip blanks?
4-17.  Are replicate weighings performed by the same analyst? If not, are round-robin weighings
       performed between all analysts to determine biases?
4-18.  The analyst should demonstrate proper loading of filters into cassettes.

Section 5   Data Handling and Data Validation
5-1.   Does the lab analyst have a thorough understanding of, and high proficiency in, entering data
       into the PEP database (PED)?
5-2.   Does the analyst record PEP filters in the database according to the PEP protocol, which
       assigns cassette ID numbers and identifies trip blanks? The lab analyst should demonstrate.
5-3.   Is there a procedure for the lab analyst to QA the data? Have them describe it and provide an
       example of the result (e.g., graph, statistics).
5-4.   Is there a procedure for a lab manager or supervisor to QA the data generated by the lab
       analyst? Have them describe it.
5-5.   Does the analyst understand the significance of the Chain-of-Custody Form?
5-6.   Does the analyst know how to enter all information on the Field Data Sheet into the PEP
       database?
5-7.   Do the analysts and supervisor know the schedule for validating PEP audit data and posting it
       on the Region 4 website, and its importance?
5-8.   Does documentation exist that shows when, why, and by whom any invalid data is overridden
       and re-validated?
5-9.   Do the analysts and supervisor know the schedule for validating PEP audit data and posting it
       in AQS, and its importance?
5-10.  Do the analysts and supervisor know the contact at EPA or within the PEP support contractor's
       organization if there is an issue with the PED?
5-11.  Do the analysts and supervisor know the archiving schedule for filters?

(Revised: March 4, 2009)                                                 5 of 6

-------
                                                    PEP Laboratory Technical Systems Audit Form
(For each question, record Yes / No / N/A, a response or comment, and a finding level.)

5-12.  Do the analysts and supervisor know the file storage numbering system for the PEP?
5-13.  Describe system back-ups.

(Revised: March 4, 2009)                                                 6 of 6

-------