United States
Environmental Protection
Agency
Office of Air Quality
Planning and Standards
Research Triangle Park, NC 27711
EPA-454/R-01-007
June 2001
Air
Quality Assurance Guidance
Document
Quality Assurance Project Plan
for the Air Toxics
Monitoring Program
-------
Foreword
EPA policy requires that all projects involving the generation, acquisition, and use of environmental data be
planned and documented and have an Agency-approved quality assurance project plan or QAPP prior to the start
of data collection. The primary purpose of the QAPP is to provide an overview of the project, describe the need for
the measurements, and define QA/QC activities to be applied to the project, all within a single document. The
QAPP should be detailed enough to provide a clear description of every aspect of the project and include
information for every member of the project staff, including site operators, lab staff, and data reviewers. The QAPP
facilitates communication among clients, data users, project staff, management, and external reviewers. Effective
implementation of the QAPP assists project managers in keeping projects on schedule and within the resource
budget. Agency QA policy is described in the Quality Manual and EPA QA/R-1, EPA Quality System
Requirements for Environmental Programs.
The following document represents a draft model Quality Assurance Project Plan (QAPP) for the
environmental data operations for Air Toxics Monitoring Program (ATMP). The Office of Air Quality
Planning and Standards (OAQPS) staff developed this Model QAPP to serve as an example of the type
of information and detail necessary for the documents that will be submitted by state and local organizations
for their ATMP. Please review this document and forward your comments and suggestions to the
persons listed in the Acknowledgment Section.
This draft model QAPP was generated using the EPA QA regulations and guidance as described in EPA
QA/R-5, EPA Requirements for Quality Assurance Project Plans and the accompanying document,
EPA QA/G-5, Guidance for Quality Assurance Project Plans. All pertinent elements of the QAPP
regulations and guidance are addressed in this model. The model also contains background
information and a rationale for each element which are excerpts from EPA QA/G-5 and are
included in text brackets (as seen above), usually found at the beginning of a section or
subsection.
The Model QAPP must not be adopted verbatim. Data in the tables should not be
used by organizations to meet the data quality needs of their ATMP; they are provided as examples
only. State and local organizations should therefore develop their own QAPPs that meet their
own needs.
The Standard Operating Procedures (SOPs) listed in the Table of Contents refer to a guidance document
developed by OAQPS for the Air Toxics Pilot Project. It is the outcome of work by the Air Toxics Pilot
Laboratory Sub-committee, headed by JoAnn Rice and Sharon Nizich. The guidance outlines the
preferred approach and direction for air toxics monitoring and should be used by the air toxics community
as much as possible. The appendices of the guidance document are the EPA's Toxic Organic (TO)
Compendia, which were written earlier. OAQPS has not developed SOPs for this project because it
would be difficult to write SOPs for all of the different field and laboratory instruments that are available.
The TO Compendia are useful as guidance only. SOPs must be developed by the State and local
agencies for their individual programs.
-------
Acknowledgments
This Model QAPP is the product of the EPA's Office of Air Quality Planning and Standards. The
development and review of the material found in this document was accomplished through the activities of
the air toxics QA and Data Analysis Workgroup. The following individuals are acknowledged for their
contributions.
Principal Authors
Dennis Mikel, OAQPS-EMAD-MQAG, Research Triangle Park, North Carolina
Michael Papp, OAQPS-EMAD-MQAG, Research Triangle Park, North Carolina
Reviewers
EPA Regions
Region 1: Peter Kahn
Region 4: Van Shrieves
Region 10: Keith Rose, Ginna Grepo-Grove
Office of Air Quality Planning and Standards
Sharon Nizich, JoAnn Rice
State and Local Agencies
MaryAnn Heindorf, State of Michigan
Alain Watson, Pinellas County Department of Environmental Management
Other Organizations
Donna Kenski, Lake Michigan Air Directors Consortium
Comments and questions can be directed to:
Dennis Mikel, OAQPS, RTP,NC mikel.dennisk@epa.gov
iii
-------
Acronyms and Abbreviations
AIRS Aerometric Information Retrieval System
ATMP Air Toxics Monitoring Program
ANSI American National Standards Institute
APTI Air Pollution Training Institute
ASTM American Society for Testing and Materials
CAA Clean Air Act
CFR Code of Federal Regulations
COC chain of custody
DAS data acquisition system
DQA data quality assessment
DQOs data quality objectives
EDO environmental data operation
EMAD Emissions, Monitoring, and Analysis Division
EPA Environmental Protection Agency
FIPS Federal Information Processing Standards
GIS geographical information systems
GLP good laboratory practice
HVAC heating and ventilating air conditioning unit
IO inorganic
LAN local area network
LIMS Laboratory Information Management System
MPA monitoring planning area
MQOs measurement quality objectives
MSA metropolitan statistical area
MSR management system review
NAAQS National Ambient Air Quality Standards
NAMS national air monitoring station
NIST National Institute of Standards and Technology
OAQPS Office of Air Quality Planning and Standards
ORD Office of Research and Development
PC personal computer
PD percent difference
PTFE polytetrafluoroethylene
PUF polyurethane foam
QA/QC quality assurance/quality control
QA quality assurance
QAAR quality assurance annual report
QAD quality assurance division director
QAM quality assurance manager
QAO quality assurance officer
QAPP quality assurance project plan
QMP quality management plan
SIPs State Implementation Plans
SLAMS state and local air monitoring stations
SOP standard operating procedure
-------
SPMS special purpose monitoring stations
SVOC Semi-Volatile Organic Compounds
SYSOP system operator
TCAPCD Toxa City Air Pollution Control District
TO Toxic Organic
TSA technical system audit
TSP total suspended particulate
UATS Urban Air Toxics Strategy
VOC volatile organic compound
WAM Work Assignment Manager
vi
-------
Tables

Distribution List
List of HAPs
Design/Performance Specifications - Total Suspended Particulates - Toxic Metals
Design/Performance Specifications - Air Canister Sampler - Volatile Organic Compounds
Design/Performance Specifications - Poly-Urethane Foam Sampler - Semi-Volatile Organic Compounds
Design/Performance Specifications - Carbonyl Sampler - Aldehyde and Ketone Compounds
Assessment Schedule
Schedule of Critical Air Toxics Activities
Critical Documents and Records
Principal Study Questions and Alternate Actions
List of Top Ten HAPs in Toxa City
False Acceptance and False Rejection Decisions
Measurement Quality Objectives - Air Toxics Metals
Measurement Quality Objectives - Air Toxics Carbonyls
Measurement Quality Objectives - Air Toxics Volatile Organics
Measurement Quality Objectives - Air Toxics Semi-Volatile Organics
TCAPCD Training Requirements
Core Ambient Air Training Courses
Air Toxics Reporting Package Information
Schedule of Air Toxics Sampling Related Activities
List of Collocated Samplers and Coordinates
Sample Set-up, Run and Recovery Dates
Supplies at Storage Shelters
Field Corrective Action
Temperature Requirements
Holding Times
Instruments Used in the Toxa City Laboratory
Precision Acceptance Criteria
Inspections in the Laboratory
Preventive Maintenance in Weight Room
Preventive Maintenance in VOC Laboratory
Preventive Maintenance in Liquid Chromatography Laboratory
Preventive Maintenance in Inductively Coupled Plasma Laboratory
Preventive Maintenance on Field Instruments
Lab Instrument Standards
Standard Materials and/or Apparatus for Air Toxics Calibrations
Critical Field Supplies and Consumables
Critical Laboratory Supplies and Consumables
Validation Check Summaries
Data Transfer Operations
Data Reporting Schedule
Reporting Equations
Data Archive Policies
Assessment Summary
Single Flag Invalidation Criteria for Single Sampler
vii
-------
Figures
Number Section Page
4.1 Organizational Structure of Toxa City Air Pollution Control District for air toxics monitoring 4.1.3.1 3
7.1 An example of a Decision Performance Goal Diagram 7.1.2 5
10.1 Population Distribution of Toxa City 10.4.2 6
10.2 Metals data and Population 10.4.2 7
12.1 Example DNPH Cartridge chain of custody form 12.0 2
12.2 Example PUF Cartridge chain of custody form 12.0 3
12.3 Example VOC Canister chain of custody form 12.0 4
12.4 Example TSP/Metals chain of custody form 12.0 5
12.5 General archive form 12.0 7
14.1 Quality Control and Quality Assessment Activities 14.1 1
19.1 Data Management and Sample Flow Diagrams 19.1 2
viii
-------
Project: Model QAPP
Element No: 1
Revision No: 1
Date:7/5/01
Page 1 of 1
1.0 QA Project Plan Identification and Approval
The purpose of the approval sheet is to enable officials to document their approval of the QAPP. The title
page (along with the organization chart) also identifies the key project officials for the work. The title and approval
sheet should also indicate the date of the revision and a document number, if appropriate.
Title: Toxa City Air Pollution Control District Project Plan for the air toxics ambient air
monitoring program.
The attached QAPP for the ATMP is hereby recommended for approval and commits the Department
to follow the elements described within.
Toxa City Air Pollution Control District
1) Signature: Date:
Dr. Melvin Thomas - Air Pollution Control Officer
2) Signature: Date:
Russell Kuntz - QA Division Director
EPA Region 11
1) Signature: Date:
Dennis Mickelson - Technical Project Officer - Air Monitoring Branch
2) Signature: Date:
Benjamin T. Zachary - QA Officer - QA Branch
-------
Project: Model QAPP
Element No: 2
Revision No: 1.0
Date:7/5/01
Page 1 of 4
2.0 Table of Contents
The table of contents lists all the elements, references, and appendices contained in a QAPP, including a list
of tables and a list of figures that are used in the text. The major headings for most QAPPs should closely follow
the list of required elements. While the exact format of the QAPP does not have to follow the sequence given
here, it is generally more convenient to do so, and it provides a standard format to the QAPP reviewer. Moreover,
consistency in the format makes the document more familiar to users, who can expect to find a specific item in the
same place in every QAPP. The table of contents of the QAPP may include a document control component. This
information should appear in the upper right-hand comer of each page of the QAPP when document control
format is desired.
Section
Foreword
Acknowledgments
Acronyms and Abbreviations
Tables
Figures
Region Approval
A. PROJECT MANAGEMENT
1 . Title and Approval Page
2. Table of Contents
3. Distribution List
4. Project/Task Organization
4.1 Roles and Responsibilities
5. Problem Definition/Background
5.1 Problem Statement and Background
5.2 List of Pollutants
5.3 Location of Interest for HAPs
6. Project/Task Description
6.1 Description of Work to be Performed
6.2 Field Activities
6.3 Laboratory Activities
6.4 Project Assessment Techniques
6.5 Schedule of Activities
6.6 Project Records
7. Quality Objectives and Criteria for Measurement Data
7.1 Data Quality Objectives
7.2 Measurement Quality Objectives
-------
Project: Model QAPP
Element No: 2
Revision No: 1.0
Date:7/5/01
Page 2 of 4
Section
8. Special Training Requirements/Certification
8.1 Training
8.2 Certification
9. Documentation and Records
9.1 Information Included in the Reporting Package
9.2 Data Reporting Package and Documentation Control
9.3 Data Reporting Package Archiving and Retrieval
B. MEASUREMENT/DATA ACQUISITION
10. Sampling Design
10.1 Scheduled Project Activities, Including Measurement
Activities
10.2 Rationale for the Design
10.3 Design Assumptions
10.4 Procedure for Locating and Selecting
Environmental Samples
10.5 Classification of Measurements as Critical/Noncritical
10.6 Validation of Any non-standard Measurements
11. Sampling Methods Requirements
11.1 Purpose/Background
11.2 Sample Collection, Preparation, Decontamination
Procedures
11.3 Support Facilities for Sampling Methods
11.4 Sampling/Measurement System Corrective Action
11.5 Sampling Equipment, Preservation, and Holding Time
Requirements
12. Sample Custody
12.1 Sample Custody Procedure
13. Analytical Methods Requirements
13.1 Purpose/Background
13.2 Preparation of Samples
13.3 Analysis Methods
13.4 Internal QC and Corrective Action for Measurement System
13.5 Sample Contamination Prevention, Preservation and Holding
14. Quality Control Requirements
14.1 QA Procedures
15. Instrument/Equipment Testing, Inspection, and Maintenance
Requirements
15.1 Purpose/Background
15.2 Testing
15.3 Inspection
15.4 Maintenance
-------
Project: Model QAPP
Element No: 2
Revision No: 1.0
Date: 5/04/01
Page 3 of 4
Section
16. Instrument Calibration and Frequency
16.1 Instrumentation Requiring Calibration
16.2 Calibration Methods
16.3 Calibration Standard Materials and Apparatus
16.4 Calibration Standards
16.5 Calibration Frequency
17. Inspection/Acceptance for Supplies and Consumables
17.1 Purpose
17.2 Critical Supplies and Consumables
17.3 Acceptance Criteria
17.4 Tracking and Quality Verification of Supplies and
Consumables
18. Data Acquisition Requirements (non-direct measurements)
18.1 Acquisition of Non-Direct Measurement Data
19. Data Management
19.1 Background and Overview
19.2 Data Recording
19.3 Data Validation
19.4 Data Transformation
19.5 Data Transmittal
19.6 Data Reduction
19.7 Data Summary
19.8 Data Tracking
19.9 Data Storage and Retrieval
C. ASSESSMENT/OVERSIGHT
20. Assessments and Response Actions
20.1 Assessment Activities and Project Planning
20.2 Documentation of Assessment
21. Reports to Management
21.1 Frequency, Content, and Distribution of Reports
21.2 Responsible Organizations
22. Data Review
22.1 Data Review Design
22.2 Data Review Testing
22.3 Procedures
D. VALIDATION AND USABILITY
23. Validation, Verification and Analysis Methods
23.1 Process for Validating and Verifying Data
-------
Section
24. Reconciliation with Data Quality Objectives
24.1 Reconciling Results with DQOs
Appendices
A. Glossary (6/18/01)
B. Air Toxics Pilot Technical Systems Audit - Laboratory Form (12/00)
C. Air Toxics Pilot Technical Systems Audit - Field Form (12/00)
D. Toxics Pilot Monitoring Study - Measurement Guidelines (12/00)
-------
Project: Model QAPP
Element No: 3
Revision No: 1.0
Date: 7/5/01
Page 1 of 1
3.0 Distribution
All the persons and document files designated to receive copies of the QAPP, and any planned future
revisions, need to be listed in the QAPP. This list, together with the document control information, will help the
project manager ensure that all key personnel in the implementation of the QAPP have up-to-date copies of the
plan. A typical distribution list appears in Table 3-1.
A hardcopy of this QAPP has been distributed to the individuals in Table 3-1. The document is also
available on the Internet at http://www.toxacity.apcd.gov.
Table 3.1 Distribution List

Name                    Position                                Division/Branch

Toxa City Air Pollution Control District
Dr. Melvin Thomas       Air Pollution Control Officer           TCAPCD
Russell Kuntz           QA Division Director                    QA Division
John Holstine           QA Officer                              QA Division
Thomas Sutherland       QA Technician                           QA Division
Daniel Willis           Air Division Director                   Air Division
Holly J. Webster        Ambient Air Monitoring Branch Chief     Technical/Ambient Air Monitoring
James Courtney          Field Technician                        Technical/Ambient Air Monitoring
Robert Kirk             Field Technician                        Technical/Ambient Air Monitoring
Joe L. Craig            Field Technician                        Technical/Ambient Air Monitoring
Kent Field              Data Manager                            Technical/Ambient Air Monitoring
Alexander Barnett       Program Support Division Director       Program Support
Janet Hoppert           Shipping/Receiving Branch Chief         Program Support/Shipping & Rec.
David Bush              Clerk                                   Program Support/Shipping & Rec.
Gary Arcemont           Laboratory Branch Chief                 Technical/Laboratory
Lisa Killion            Lab Technician                          Technical/Laboratory
Robert Renelle          Lab Technician                          Technical/Laboratory
Mark Fredrickson        Lab Technician                          Technical/Laboratory

EPA Region 11
Dennis Mickelson        EPA Project Officer                     Air/Air Quality Monitoring
Benjamin T. Zachary     QA Officer                              Air/Quality Assurance
-------
Project: Model QAPP
Element No: 4
Revision No: 1.0
Date: 7/5/01
Page 1 of 10
4.0 Project/Task Organization
The purpose of the project organization is to provide EPA and other involved parties with a clear
understanding of the role that each party plays in the investigation or study and to provide the lines of authority
and reporting for the project.
4.1 Roles and Responsibilities
The specific roles, activities, and responsibilities of participants, as well as the internal lines of authority and
communication within and between organizations, should be detailed. The position of the QA Manager or QA
Officer should be described. Include the principal data users, the decision-maker, project manager, QA manager,
and all persons responsible for implementation of the QAPP. Also included should be the person responsible for
maintaining the QAPP and any individual approving deliverables other than the project manager. A concise chart
showing the project organization, the lines of responsibility, and the lines of communication should be presented.
For complex projects, it may be useful to include more than one chart—one for the overall project (with at least the
primary contact) and others for each organization.
Federal, State, Tribal and local agencies all have important roles in developing and implementing
satisfactory air monitoring programs. As part of the planning effort, EPA is responsible for developing
National Ambient Air Quality Standards (NAAQS), and identifying a minimum set of QC samples
from which to judge data quality. The State and local organizations are responsible for taking this
information and developing and implementing a quality system that will meet the data quality
requirements. Then, it is the responsibility of both EPA and the State and local organizations to assess
the quality of the data and take corrective action when appropriate. The responsibilities of each
organization follow.
4.1.1 Office of Air Quality Planning and Standards
OAQPS is the organization charged under the authority of the Clean Air Act (CAA) to protect and
enhance the quality of the nation's air resources. OAQPS sets standards for pollutants considered
harmful to public health or welfare and, in cooperation with EPA's Regional Offices and the States,
enforces compliance with the standards through state implementation plans (SIPs) and regulations
controlling emissions from stationary sources. The OAQPS evaluates the need to regulate potential air
pollutants, especially air toxics, and develops national standards; works with State and local agencies to
develop plans for meeting these standards; monitors national air quality trends and maintains a database
of information on air toxics and controls; provides technical guidance and training on air pollution
control strategies; and monitors compliance with air pollution standards.
-------
Project: Model QAPP
Element No: 4
Revision No: 1.0
Date: 7/5/01
Page 2 of 10
Within the OAQPS Emissions Monitoring and Analysis Division (EMAD), the Monitoring and Quality
Assurance Group (MQAG) is responsible for the oversight of the Ambient Air Quality Monitoring
Network. MQAG has the following responsibilities:
ensuring that the methods and procedures used in making air pollution measurements are
adequate to meet the program's objectives and that the resulting data are of satisfactory quality;
operating the National Performance Audit Program (NPAP);
evaluating the performance, through technical systems audits and management systems
reviews, of organizations making air pollution measurements of importance to the regulatory
process;
implementing satisfactory quality assurance programs over EPA's Ambient Air Quality
Monitoring Network;
ensuring that national regional laboratories are available to support toxics and QA programs;
ensuring that guidance pertaining to the quality assurance aspects of the Ambient Air Program
are written and revised as necessary;
rendering technical assistance to the EPA Regional Offices and air pollution monitoring
community.
4.1.2 EPA Region 11 Office
The EPA Regional Offices address environmental issues related to the States within their
jurisdiction and administer and oversee regulatory and congressionally mandated programs. The
major quality assurance responsibility of EPA's Regional Offices, with regard to the Ambient Air
Quality Program, is the coordination of quality assurance matters at the Regional level with the State
and local agencies. This is accomplished through the designation of EPA Regional Project Officers who are
responsible for the technical aspects of the program, including:
reviewing QAPPs by Regional QA Officers who are delegated the authority by the Regional
Administrator to review and approve QAPPs for the Agency;
supporting the air toxics audit evaluation program;
evaluating quality system performance, through technical systems audits and network reviews
whose frequency is addressed in the Code of Federal Regulations and Section 20;
acting as a liaison by making available the technical and quality assurance information
developed by EPA Headquarters and the Region to the State and local agencies, and making
EPA Headquarters aware of the unmet quality assurance needs of the State and local
agencies.
Toxa City will direct all technical and QA questions to Region 11.
4.1.3 Toxa City Air Pollution Control District
-------
Project: Model QAPP
Element No: 4
Revision No: 1.0
Date: 7/5/01
Page 3 of 10
40 CFR Part 58 defines a State Agency as "the air pollution control agency primarily responsible for
the development and implementation of a plan under the Act (CAA)". Section 302 of the CAA
provides a more detailed description of the air pollution control agency.
40 CFR Part 58 defines the Local Agency as "any local government agency, other than the state
agency, which is charged with the responsibility for carrying out a portion of the plan (SIP)".
The major responsibility of State and local agencies is the implementation of a satisfactory monitoring
program, which would naturally include the implementation of an appropriate quality assurance
program. It is the responsibility of State and local agencies to implement quality assurance programs in
all phases of the environmental data operation (EDO), including the field, their own laboratories, and in
any consulting and contractor laboratories which they may use to obtain data. An EDO is defined as
work performed to obtain, use, or report information pertaining to environmental processes or
conditions.
Figure 4.1 represents the organizational structure of the areas of the Toxa City Air Pollution Control
District (TCAPCD or the District) that are responsible for the activities of the air toxics ambient air
quality monitoring program. The following information lists the specific responsibilities of each individual,
grouped by the functions of the Director's Office and of the divisions related to Quality Assurance,
Technical Support, and Program Support.
4.1.3.1 Director's Office
Air Pollution Control Director - Dr. Melvin Thomas
The Director has overall responsibility for managing the Toxa City Air Pollution Control District
according to policy. The direct responsibility for assuring data quality rests with management.
Ultimately, the Director is responsible for establishing QA policy and for resolving QA issues identified
through the QA program. Major QA related responsibilities of the Director include:
approving the budget and planning processes;
assuring that the District develops and maintains a current and germane quality system;
assuring that the District develops and maintains a current air toxics QAPP and ensures
adherence to the document by staff, and where appropriate, other extramural cooperators;
establishing policies to ensure that QA requirements are incorporated in all environmental data
operations;
maintaining an active line of communication with the QA and technical managers;
conducting management systems reviews.
The Director delegates the responsibility of QA development and implementation in accordance with
District policy to the Division Directors. Oversight of the District's QA program is delegated to the QA
Division Director.
-------
Project: Model QAPP
Element No: 4
Revision No: 1.0
Date: 7/5/01
Page 4 of 10
4.1.3.2 QA Division
QA Division Director (QAD) - Russell Kuntz
The QA Division Director is the delegated manager of the District's QA Program. He has direct
access to the Director on all matters pertaining to quality assurance. The main responsibility of the
QAD is QA oversight, and ensuring that all personnel understand the District's QA policy and all
pertinent EPA QA policies and regulations specific to the Ambient Air Quality Monitoring Program.
The QAD provides technical support and reviews and approves QA products. Responsibilities
include:
developing and interpreting District QA policy and revising it as necessary;
developing a QA Annual Report for the Director;
reviewing acquisition packages (contracts, grants, cooperative agreements, inter-agency
agreements) to determine the necessary QA requirements;
developing QA budgets;
assisting staff scientists and project managers in developing QA documentation and in
providing answers to technical questions;
ensuring that all personnel involved in environmental data operations have access to any
training or QA information needed to be knowledgeable in QA requirements, protocols, and
technology of that activity;
reviewing and approving the QAPP for the ATMP;
ensuring that environmental data operations are covered by appropriate QA planning
documentation (e.g., QA project plans and data quality objectives);
ensuring that Management System Reviews (MSRs), assessments and audits are scheduled
and completed, and at times, conducting or participating in these QA activities;
tracking the QA/QC status of all programs;
recommending required management-level corrective actions;
serving as the program's QA liaison with EPA Regional QA Managers or QA Officers and the
Regional Project Officer.
The QAD has the authority to carry out these responsibilities and to bring to the attention of the
Director any issues associated with these responsibilities. The QAD delegates the responsibility of QA
development and implementation in accordance with District policy to the QA Officer and technician.
-------
Project: Model QAPP
Element No: 4
Revision No: 1.0
Date: 7/5/01
Page 5 of 10
Quality Assurance Officer - John Holstine
The QA Officer is a main point of contact within the QA Division. The QA Officer's responsibilities
include:
• implementing and overseeing the District's QA policy within the division;
• acting as a conduit for QA information to division staff;
• assisting the QAD in developing QA policies and procedures;
• coordinating the input to the QA Annual Report (QAAR);
• assisting in solving QA-related problems at the lowest possible organizational level;
• ensuring that an updated QAPP is in place for all environmental data operations associated with
the ATMP;
• ensuring that technical systems audits, audits of data quality, and data quality assessments occur
within the appropriate schedule, and conducting or participating in these audits;
• tracking and ensuring the timely implementation of corrective actions;
• ensuring that a management system review occurs every 3 years;
• ensuring that technical personnel follow the QAPP;
• reviewing precision and bias data;

Figure 4.1 Organizational Structure of Toxa City Air Pollution Control District for air toxics monitoring.
[Organization chart; the EPA Region 11 contacts shown are Dennis Mickelson, 872 669-2299, and
Benjamin T. Zachary, 872 669-2378.]
-------
Project: Model QAPP
Element No: 4
Revision No: 1.0
Date: 7/5/01
Page 6 of 10
• data validation;
• ensuring that all environmental data activities effectively follow the QA/QC requirements.
The QA officer has the authority to carry out these responsibilities and to bring to the attention of his or
her respective Division Director any issues related to these responsibilities. The QA officer delegates
the responsibility of QA development and implementation in accordance with District policy.
Quality Assurance Technician - Thomas Sutherland
The QA technician is the staff QA contact appointed by the QA officer. Tom Sutherland is the person
who performs all field and laboratory audits. Mr. Sutherland's responsibilities include:
remaining current on District QA policy and general and specific EPA QA policies and
regulations as they relate to the ATMP;
scheduling and implementing technical systems audits;
performing data quality assessments;
reviewing precision and bias data;
providing QA training to Air and Program Support Division technical staff;
ensuring timely follow-up and corrective actions resulting from auditing and evaluation
activities;
facilitating management systems reviews implemented by the QA Officer.
4.1.3.3 Technical Division
The technical divisions are responsible for all routine environmental data operations (EDOs) for the
ATMP.
Air Division Director - Daniel Willis
The Air Division Director is the delegated manager of the routine ATMP which includes the QA/QC
activities that are implemented as part of normal data collection activities. Responsibilities of the
Director include:
• communicating with EPA Project Officers and EPA QA personnel on issues related to routine
sampling and QA activities;
• understanding EPA monitoring and QA regulations and guidance, and ensuring subordinates
understand and follow these regulations and guidance;
• understanding District QA policy and ensuring subordinates understand and follow the policy;
• understanding and ensuring adherence to the QAPP;
• reviewing acquisition packages (contracts, grants, cooperative agreements, inter-agency
agreements) to determine the necessary QA requirements;
• developing budgets and providing program costs necessary for EPA allocation activities;
-------
Project: Model QAPP
Element No: 4
Revision No: 1.0
Date: 7/5/01
Page 7 of 10
• ensuring that all personnel involved in environmental data collection have access to any training
or QA information needed to be knowledgeable in QA requirements, protocols, and
technology;
• recommending required management-level corrective actions.
The Air Director delegates the responsibility for the development and implementation of individual
monitoring programs, in accordance with District policy, to the Air Division Branch Managers.
Air Monitoring Branch Manager - Holly J. Webster
Laboratory Branch Manager - Gary Arcemont
These two branches are responsible for overseeing the routine field/lab monitoring and QA activities of
the Ambient Air Quality Monitoring Program. The Branch Manager's responsibilities include:
implementing and overseeing the District's QA policy within the branch;
acting as a conduit for information to branch staff;
training staff in the requirements of the QA project plan and in the evaluation of QC
measurements;
assisting staff scientists and project managers in developing network designs, field/lab standard
operating procedures and appropriate field/lab QA documentation;
ensuring that an updated QAPP is in place for all environmental data operations associated
with the ATMP;
ensuring that technical personnel follow the QAPP;
assuring that the laboratory and field staff adhere to the QA/QC requirements of the specified
analytical methods and Standard Operating Procedures (SOPs);
assuring that the laboratory and field programs generate data of known and needed quality to
meet the program's Data Quality Objectives (DQOs);
reviewing and approving modifications to the SOPs for the field and laboratory programs, as
well as any new SOPs developed with the integration of new instruments.
Field Personnel - James Courtney, Robert Kirk, and Joe L. Craig
The field personnel are responsible for carrying out required tasks and for ensuring the quality of the
resulting data by adhering to the guidance and protocols specified in the QAPP and the SOPs for
field activities. Responsibilities include:
• participating in the development and implementation of the QAPP;
• participating in training and certification activities;
• writing and modifying SOPs;
• verifying that all required QA activities are performed and that measurement quality standards
are met as required in the QAPP;
• performing and documenting preventative maintenance;
• documenting deviations from established procedures and methods;
• reporting all problems and corrective actions to the Branch Managers;
-------
Project: Model QAPP
Element No: 4
Revision No: 1.0
Date: 7/5/01
Page 8 of 10
• assessing and reporting data quality;
• preparing and delivering reports to the Branch Manager;
• flagging suspect data;
• handling and transporting cartridges, filters, polyurethane foam (PUF) plugs, and other sampling
media to and from the field;
• maintaining chain-of-custody records in the field;
• calibrating samplers as specified by the QAPP and SOPs;
• loading and unloading samples;
• packing, shipping, or transporting the exposed samples in accordance with the SOPs and
QAPP;
• maintaining logbooks of QA/QC activities and equipment preventive maintenance logs.
Laboratory Personnel - Lisa Killion, Robert Renelle, Mark Fredrickson
Laboratory personnel are responsible for carrying out required tasks and for ensuring the quality of the
resulting data by adhering to the guidance and protocols specified in the air toxics QAPP and the SOPs
for laboratory activities. Their responsibilities include:
• participating in the development and implementation of the QAPP;
• participating in training and certification activities;
• participating in the development of data quality requirements (overall and laboratory) with the
appropriate QA staff;
• writing and modifying SOPs and good laboratory practices (GLPs);
• verifying that all required QA activities were performed and that measurement quality standards
were met as required in the QAPP;
• following all manufacturer's specifications;
• performing and documenting preventative maintenance;
• documenting deviations from established procedures and methods;
• reporting all problems and corrective actions to the Branch Manager;
• assessing and reporting data quality;
• preparing and delivering reports to the Branch Manager;
• flagging suspect data;
• preparing and delivering data to the Information Manager.
In addition, the laboratory personnel will perform the following duties:
• sample receiving and inspection from vendor;
• pre-sampling processing, assembling (for PUF) and preparation;
• clean-up and testing of canisters or PUF cartridges;
• Di-nitro-phenyl-hydrazine (DNPH) cartridge preparation;
• preparing the chain-of-custody forms for field use;
• post-sampling receiving of samples and processing of samples (i.e., refrigeration of DNPH
-------
Project: Model QAPP
Element No: 4
Revision No: 1.0
Date: 7/5/01
Page 9 of 10
cartridges and PUF cartridges);
• sample preparation, extraction, and clean-up;
• Analysis of the VOC, Semi-Volatile Organic Compounds (SVOC), metals and aldehydes
according to accepted SOPs.
Information Manager- Kent Field
The Information Manager is responsible for coordinating the information management activities of
the ATMP. The main responsibilities of the Information Manager include ensuring that data and
information collected for the ATMP are properly captured, stored, and transmitted for use by
program participants. Responsibilities include:
developing local data management standard operating procedures;
ensuring that information management activities are developed within reasonable time frames
for review and approval;
maintenance and upkeep of the Laboratory Information Management System (LIMS);
storage of raw analytical data, i.e., chromatograms from the various laboratory
instrumentation;
long term storage of data to Compact Disk (CD) or other digital storage media;
upkeep of LIMS software and upgrading when needed;
ensuring the adherence to the QAPP where applicable;
ensuring access to data for timely reporting and interpretation processes;
ensuring the development of data base guides (data base structures, user guidance
documents);
ensuring timely delivery of all required data to the AIRS system.
4.1.3.4 Program Support
The Program Support Division includes the areas of human resources, facilities maintenance, and
shipping and receiving.
Program Support Division Director - Alexander Barnett
Responsibilities of the Director include:
• communicating with the QA and Air Monitoring Divisions on specific needs;
• understanding EPA monitoring and QA regulations and guidance, and ensuring subordinates
understand and follow these regulations and guidance;
• understanding District QA policy and ensuring subordinates understand and follow the policy;
• understanding and ensuring adherence to the QAPP as it relates to program support activities;
• ensuring that all support personnel have access to any training or QA information needed to be
knowledgeable in QA requirements, protocols, and technology.
-------
Project: Model QAPP
Element No: 4
Revision No:
Date: 7/5/01
Page 10 of 10
Shipping/Receiving Branch Manager - Janet Hoppert
This branch is responsible for shipping and receiving equipment, supplies and consumables for the
routine field/lab monitoring and QA activities of the ATMP. The Branch Manager's responsibilities
include:
• implementing and overseeing the District's QA policy within the branch;
• acting as a conduit for information to branch staff;
• training staff in the requirements of the QA project plan as it relates to shipping/receiving;
• assisting staff in developing standard operating procedures;
• coordinating the Branch's input to the Quality Assurance Annual Report;
• ensuring that technical personnel follow the QAPP;
• reviewing and evaluating staff performance and conformance to the QAPP.
Clerk-David Bush
Mr. Bush provides support for the shipping and receiving of all equipment and consumable supplies for the
ATMP. Responsibilities include:
assisting in the development of standard operating procedures for shipping/receiving;
following SOPs for receiving, storage, chain-of-custody and transfer of filters, canisters and
cartridges;
informing appropriate field /lab staff of arrival of consumables, equipment, and samples;
documenting, tracking, and archiving shipping/receiving records.
-------
Project: Model QAPP
Element No: 5
Revision No: 1.0
Date: 7/5/01
Page 1 of 4
5.0 Problem Definition/Background
The background information provided in this element will place the problem in historical perspective,
giving readers and users of the QAPP a sense of the project's purpose and position relative to other project
and program phases and initiatives
5.1 Problem Statement and Background
5.1.1 Background
There are currently 188 hazardous air pollutants (HAPs), or air toxics, regulated under the Clean
Air Act (CAA) that have been associated with a wide variety of adverse health effects, including
cancer, neurological effects, reproductive and developmental effects, as well as ecosystem effects.
These air toxics are emitted from multiple sources, including major stationary, area, and mobile
sources, resulting in population exposure to these air toxics as they occur in the environment. While
in some cases the public may be exposed to an individual HAP, more typically people experience
exposures to multiple HAPs and from many sources. Exposures of concern result not only from the
inhalation of these HAPs, but also, for some HAPs, from multi-pathway exposures to air emissions.
For example, air emissions of mercury are deposited in water and people are exposed to mercury
through their consumption of contaminated fish.
5.1.2 Air Toxics Program
In order to address the concerns posed by air toxics emissions and to meet the city's strategic
goals, the TCAPCD has developed an ATMP designed to characterize, prioritize, and equitably
address the impacts of HAPs on the public health and the environment. The TCAPCD seeks to
address air toxics problems through a strategic combination of agencies' activities and authorities,
including regulatory approaches and voluntary partnerships.
5.1.3 The Role of Ambient Monitoring
Emissions data, ambient concentration measurements, modeled estimates, and health impact
information are all needed to fully assess air toxics impacts and to characterize risk. Specifically,
emissions data are needed to quantify the sources of air toxics impacts and aid in the development
of control strategies. Ambient monitoring data are then needed to understand the behavior of air
toxics in the atmosphere after they are emitted. Since ambient measurements cannot practically be
made everywhere, modeled estimates are needed to extrapolate our knowledge of air toxics
impacts into locations without monitors. Exposure assessments, together with health effects
information, are then needed to integrate all of these data into an understanding of the implications
of air toxics impacts and to characterize air toxics risks.
-------
Project: Model QAPP
Element No: 5
Revision No: 1.0
Date: 7/5/01
Page 2 of 4
This QAPP focuses on the role of ambient measurement data as one key element of the full air
toxics assessment process. The rest of this section describes the specific uses of ambient
monitoring data and outlines the key considerations for focusing the spatial, temporal, and
measurement aspects of a national air toxics monitoring effort.
The anticipated uses of ambient monitoring data should be kept in mind when designing the
measurement network. In order to better focus the data collection activities on the final use of the
data, a DQO process was performed, as described in Section 7 of this QAPP. From that process, the following
objective was determined for the ATMP:
Determine the highest concentrations expected to occur in the area covered by the
network, i.e., verify the spatial and temporal characteristics of HAPs within the city.
Since it is not possible to monitor everywhere, a monitoring network must be developed that is
representative of air toxics problems on a neighborhood scale and that provides a means to obtain
data on a more localized basis as appropriate and necessary. The appropriateness of a candidate
monitoring site is evaluated with respect to the data uses described above.
5.2 List of Pollutants
There are 33 HAPs identified in the draft Integrated Urban Air Toxics Strategy (UATS)1. They are
a subset of the 188 toxics identified in Section 112 of the CAA which are thought to have the
greatest impact on the public and the environment in urban areas. The TCAPCD staff reviewed the
33-HAP list and consulted with EPA and State of North Carolina staff. After several
consultations, a final list of compounds was selected. The list is based on:
• the EPA's Concept Paper2;
• the fact that a major portion of the 33 UATS HAPs can be measured with 4 field and
laboratory systems;
• the limitations of state-of-the-science instruments.
A number of compounds on the UATS list are difficult to characterize, or methods for them have not
yet been developed. These compounds will not be included in the pollutant list. If methods are
developed for these compounds in the future, the District may include them at that time. See
Table 5-1.
-------
Project: Model QAPP
Element No: 5
Revision No: 1.0
Date: 7/5/01
Page 3 of 4
Table 5.1 List of HAPs

Volatile Organic Compounds (EPA Method TO-15)
  Pollutants on the UATS list: benzene; 1,3-butadiene; carbon tetrachloride; chloroform;
  1,2-dichloropropane; methylene chloride; tetrachloroethene; trichloroethene; vinyl chloride;
  acrylonitrile; 1,2-dibromoethane; cis-1,3-dichloropropene; trans-1,3-dichloropropene;
  1,2-dichloroethane; 1,1,2,2-tetrachloroethane
  Additional HAPs: methyl chloride; methyl bromide; ethyl chloride; 1,1-dichloroethene;
  1,1-dichloroethane; 1,1,1-trichloroethane; 1,1,2-trichloroethane; toluene; chlorobenzene;
  ethylbenzene; m-xylene; p-xylene; styrene; o-xylene; 1,4-dichlorobenzene;
  1,2,4-trichlorobenzene; hexachloro-1,3-butadiene

Metals (EPA Method IO-3.5)
  Pollutants on the UATS list: arsenic; beryllium; cadmium; chromium; lead; manganese; nickel
  Additional HAPs: antimony; cobalt; selenium

Aldehydes and Ketones (EPA Method TO-11A)
  Pollutants on the UATS list: acetaldehyde; formaldehyde
  Additional HAPs: propionaldehyde; methyl ethyl ketone

Polycyclic Aromatic Hydrocarbons (EPA Method TO-13A)
  Additional HAPs: acephthalene; anthracene; benzo[a]pyrene; fluorene; pyrene; chrysene;
  benzo[a]anthracene; naphthalene
As can be seen from Table 5-1, there are a number of additional HAPs on the list. These are
-------
Project: Model QAPP
Element No: 5
Revision No: 1.0
Date: 7/5/01
Page 4 of 4
HAPs that the current analytical systems can measure. Although the additional compounds are not
considered to be as hazardous as the pollutants on the UATS list, data will be collected on these
compounds as well because, at some future date, they may be deemed hazardous.
The SVOCs that are on this list were detected during the pilot study. Therefore, it has been
determined that, if these compounds exist in the ambient environment, they should be identified and
quantified.
5.3 Locations of Interest for HAPs
Information on air toxics is needed for both industrial/downtown and suburban areas. The major
manufacturing and industrial areas are near the mouth of the bay, and several neighborhoods
surround this area. The TCAPCD has decided to target this area as one of the
monitoring locations since neighborhood scale and exposure are objectives of this program. The
other locations are suburban-oriented sites needed to characterize general exposure and temporal
and spatial variability.
5.3.1 Spatial and Temporal Considerations
The monitoring network will primarily emphasize long-term measures of air quality. The major part
of the effort to develop air quality and emissions data, therefore, will focus on year-round
information. To provide maximum flexibility in data use, however, the data collection will be based
on intermittent (e.g., every sixth day) collection of 24-hour samples throughout the year. Individual
24-hour data will be stored in EPA's Aerometric Information Retrieval System (AIRS) and the
District's database.
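To illustrate the 1-in-6 day schedule of 24-hour samples described above, the following Python sketch
generates the scheduled sample dates for one calendar year. It is an illustration only; the start date used
in the example is an assumed value, not a requirement of this plan.

from datetime import date, timedelta

def one_in_six_day_schedule(start: date) -> list:
    """Generate 24-hour sample start dates on a 1-in-6 day cycle for the start date's year."""
    schedule = []
    current = start
    while current.year == start.year:
        schedule.append(current)
        current += timedelta(days=6)  # next scheduled 24-hour sample
    return schedule

# Assumed example start date of January 1, 2001 (the planned start of routine sampling).
dates = one_in_six_day_schedule(date(2001, 1, 1))
print(len(dates), "scheduled sample days; first three:", dates[:3])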
References
1. National Air Toxics Program: The Integrated Urban Strategy-Report to Congress, EPA Document No.
453/R-99-007, July 2000, URL Address: http://www.epa.gov/ttn/atw/urban/urbanpg.html
2. Air Toxics Monitoring Concept Paper, Draft, February 29, 2000, URL address:
http://www.epa.gov/ttn/amtic/airtxfil.html
-------
Air Toxics Model QAPP
Element No: 6
Revision No: 1.0
Date: 7/5/01
Page 1 of 7
6.0 Project/Task Description
The purpose of the project/task description element is to provide the participants with a background
understanding of the project and the types of activities to be conducted, including the measurements that will
be taken and the associated QA/QC goals, procedures, and timetables for collecting the measurements.
6.1 Description of Work to be Performed
(1) Measurements that are expected during the course of the project. Describe the characteristic or property to
be studied and the measurement processes and techniques that will be used to collect data.
(2) Any special personnel and equipment requirements that may indicate the complexity of the project.
Describe any special personnel or equipment required for the specific type of work being planned or
measurements being taken.
(3) The assessment techniques needed for the project. The degree of quality assessment activity for a project
will depend on the project's complexity, duration, and objectives. A discussion of the timing of each
planned assessment and a brief outline of the roles of the different parties to be involved should be
included.
(4) A schedule for the work performed. The anticipated start and completion dates for the project should be
given. In addition, this discussion should include an approximate schedule of important project
milestones, such as the start of environmental measurement activities.
The measurement goal of the ATMP is to estimate the concentrations of air toxic compounds occurring as
particulates, gases, and semi-volatile organics, in units of nanograms per cubic meter (ng/m3), parts per
billion by volume (ppbv), or picograms per microliter (pg/uL). This is accomplished with four separate
collection media: passivated canisters, DNPH cartridges, poly-urethane foam/XAD resin cartridges, and
high-volume sampling on 8 x 10 in. quartz glass filters.
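As a simple illustration of how such concentrations are derived, the following Python sketch converts a
recovered analyte mass and the total volume of air sampled into a concentration in ng/m3. The numeric
values are hypothetical and are not acceptance values or results from this program.

def concentration_ng_per_m3(mass_ng: float, flow_m3_per_min: float, minutes: float) -> float:
    """Concentration = recovered analyte mass / total volume of air sampled."""
    total_volume_m3 = flow_m3_per_min * minutes
    return mass_ng / total_volume_m3

# Hypothetical example: 2500 ng of a metal recovered from a filter collected over a
# 24-hour sample at 1.13 m3/min (values chosen for illustration only).
print(round(concentration_ng_per_m3(2500.0, 1.13, 24 * 60), 2), "ng/m3")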
The following sections will describe the measurements required for the routine field and laboratory
activities for the network.
-------
Air Toxics Model QAPP
Element No: 6
Revision No: 1.0
Date: 7/5/01
Page 2 of 7
6.2 Field Activities
Tables 6.1, 6.2, 6.3, and 6.4 summarize some of the more critical performance requirements.
Table 6.1 Design/Performance Specifications - Total Suspended Particulates - Toxic Metals

Equipment                     Frequency     Acceptance Criteria             Reference
Filter Design Specs.                        See Reference 1                 See Reference 2
  Size                                      203 x 254 mm                    Vendor Spec.
  Medium                                    Quartz Glass Fiber Filter       Vendor Spec.
  Pore size                                 0.3 um                          Vendor Spec.
  Filter thickness                          0.50 mm                         Vendor Spec.
  Max. pressure drop                        600 mm Hg @ 1.13 m3/min         Vendor Spec.
  Collection efficiency                     99.95%                          Vendor Spec.
  Alkalinity                                6.5 ± 0.1 ppbv                  See TO-14A
Sampler Performance Specs.
  Sample Flow Rate                          180 cc/min                      Vendor Spec.
  Flow Regulation                           1.0 cc/min                      See Reference 2
  Flow Rate Precision         1 in 6 days   ± 10%                           TO-14A
  Flow Rate Accuracy          1 in 6 days   ± 10%                           TO-14A
  External Leakage                          Vendor specs                    NA
  Internal Leakage                          Vendor specs                    NA
  Clock/Timer                               24 hour ± 2 min accuracy        Sec. 6.1.8
-------
Air Toxics Model QAPP
Element No: 6
Revision No: 1.0
Date: 7/5/01
Page 3 of 7
Table 6.3 Design/Performance Specifications - Poly-Urethane Foam Sampler - Semi-Volatile Organic Compounds

Equipment                     Frequency     Acceptance Criteria                          Reference
Filter Design Specs.                        See Reference 3                              See Reference 3
  Size                                      101.6 mm spherical filter followed by        TO-13A Sec. 11.1
                                            22 mm x 76 mm plug
  Medium                                    Quartz Glass Fiber Filter and Poly           TO-13A Sec. 11.1
                                            Urethane Foam followed by XAD resin
  Pore size                                 0.3 um                                       TO-13A Sec. 11.1
  Filter thickness                          0.50 mm                                      TO-13A Sec. 11.1
  Max. pressure drop                        600 mm Hg @ 0.2 m3/min                       TO-13A Sec. 10.3
  Collection efficiency                     Varies by compound                           TO-13A Sec. 9.11
Sampler Performance Specs.
  Sample Flow Rate                          0.20 m3/min                                  Vendor Spec.
  Flow Regulation                           0.2 m3/min                                   Vendor Spec.
  Flow Rate Precision         1 in 6 days   ± 10%                                        Vendor Spec.
  Flow Rate Accuracy          1 in 6 days   ± 10%                                        Vendor Spec.
  External Leakage                          Vendor specs                                 NA
  Internal Leakage                          Vendor specs                                 NA
  Clock/Timer                               24 hour ± 2 min accuracy                     Vendor Spec.
Table 6.4 Design/Performance Specifications - Carbonyl Sampler - Aldehyde and Ketone Compounds

Equipment                     Frequency     Acceptance Criteria                          Reference
Filter Design Specs.                        See Reference 4                              See Reference 3
  Size                                      100 mm cylindrical cartridge                 TO-11A Sec. 7.1
  Medium                                    Silica gel coated with                       TO-11A Sec. 7.1
                                            2,4-dinitrophenylhydrazine (DNPH)
Sampler Performance Specs.
  Sample Flow Rate                          0.20 m3/min                                  Vendor Spec.
  Flow Regulation                           0.2 m3/min                                   Vendor Spec.
  Flow Rate Precision         1 in 6 days   ± 10%                                        Vendor Spec.
  Flow Rate Accuracy          1 in 6 days   ± 10%                                        Vendor Spec.
  External Leakage                          Vendor specs                                 NA
  Internal Leakage                          Vendor specs                                 NA
  Clock/Timer                               24 hour ± 2 min accuracy                     Vendor Spec.
The District assumes the sampling instruments are adequate for air toxics sampling. All of the
instruments operated in the field are vendor supplied. The descriptions of the samplers are similar to the
instruments described in the references noted above. Section 7.0 discusses the Measurement Quality
Objectives for each of the systems listed in Tables 6-1 through 6-4.
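The flow rate precision and accuracy criteria in Tables 6-1 through 6-4 are expressed as a percent
difference within ± 10%. As an illustration only, the following Python sketch shows one way a flow audit
result could be checked against such a criterion; the function names and the example flow values are
assumptions, not values taken from the District's SOPs.

def percent_difference(measured: float, audit_standard: float) -> float:
    """Percent difference (PD) of a sampler flow reading relative to the audit standard."""
    return 100.0 * (measured - audit_standard) / audit_standard

def within_acceptance(measured: float, audit_standard: float, limit_pct: float = 10.0) -> bool:
    """True if the absolute percent difference is within the acceptance criterion."""
    return abs(percent_difference(measured, audit_standard)) <= limit_pct

# Hypothetical flow audit: sampler indicates 1.05 m3/min against a 1.13 m3/min audit standard.
pd = percent_difference(1.05, 1.13)
print("PD = %.1f%%, acceptable: %s" % (pd, within_acceptance(1.05, 1.13)))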
-------
Air Toxics Model QAPP
Element No: 6
Revision No: 1.0
Date: 7/5/01
Page 4 of 7
6.2.1 Field Measurements
Tables 6.1, 6.2, 6.3, and 6.4 list the field measurements that must be collected. The information in these
tables is presented in the Compendia of Organic and Inorganic Methods listed in References 1-4. These
measurements are made by the air sampler and are stored in the instrument for downloading by the field
operator during routine visits.
6.3 Laboratory Activities
Laboratory activities for the air toxics program include preparing the filters, canisters and cartridges for
the routine field operator, which includes three general phases:
Pre-Sampling
Receiving filters, canisters or cartridges from the vendors;
Checking sample integrity;
Conditioning filters, storing canisters and cartridges;
Weighing filters;
Storing prior to field use;
Packaging filters, canisters and cartridges for field use;
Associated QA/QC activities;
Maintaining microbalance and analytical equipment at specified environmental conditions;
Equipment maintenance and calibrations.
Shipping/Receiving
Receiving filters, canisters and cartridges from the field and logging into database;
Storing filters, canisters and cartridges;
Associated QA/QC activities.
Post-Sampling
Checking filter, cartridge and canister integrity;
Stabilizing/weighing filters;
extraction of VOCs from canisters;
extraction of metals from quartz filter using hot acid/microwave extraction;
extraction of DNPH compounds;
extraction of SVOC from PUF plug, XAD-2 resin and quartz filter;
Analysis of samples extracted;
Data downloads from field samplers;
Data entry/upload to AIRS;
Storing filters/archiving;
-------
Air Toxics Model QAPP
Element No: 6
Revision No: 1.0
Date: 7/5/01
Page 5 of 7
Cleaning canisters;
Associated QA/QC activities.
The details for these activities are included in various sections of this document as well as References 1-
4.
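To illustrate how a single sample can be tracked through the pre-sampling, shipping/receiving, and
post-sampling phases listed above, the following Python sketch defines a minimal tracking record with an
ordered sequence of phases. The phase names and fields are illustrative assumptions and do not
represent the District's LIMS design.

from dataclasses import dataclass, field

# Ordered phases a sample medium passes through, paraphrasing the lists above.
PHASES = [
    "received_from_vendor", "prepared", "shipped_to_field",
    "sampled", "returned_to_lab", "extracted", "analyzed", "archived",
]

@dataclass
class SampleRecord:
    sample_id: str
    medium: str                       # e.g., "canister", "DNPH cartridge", "PUF", "quartz filter"
    history: list = field(default_factory=list)

    def advance(self, phase: str) -> None:
        """Record the next phase; phases must be logged in the defined order."""
        expected = PHASES[len(self.history)]
        if phase != expected:
            raise ValueError("expected phase '%s', got '%s'" % (expected, phase))
        self.history.append(phase)

# Hypothetical use: log a DNPH cartridge through its first three phases.
record = SampleRecord("TC-2001-0042", "DNPH cartridge")
for phase in PHASES[:3]:
    record.advance(phase)
print(record.history)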
6.4 Project Assessment Techniques
An assessment is an evaluation process used to measure the performance or effectiveness of a system
and its elements. As used here, assessment is an all-inclusive term used to denote any of the following:
audit, performance evaluation (PE), management systems review (MSR), peer review, inspection, or
surveillance. Definitions for each of these activities can be found in the glossary (Appendix A). Section
20 will discuss the details of the District's assessments.
Table 6.5 provides information on the parties implementing the assessments and their frequency.
Table 6.5 Assessment Schedule

Assessment Type                Assessment Agency            Frequency
Technical Systems Audit        EPA Regional Office          1 every 3 years
                               District's QA Office         Annually
Network Review                 EPA Regional Office          1 every 3 years
                               District's Air Division      Annually
Performance Evaluation         State's QA Office            Submit "blind" samples to laboratory annually
                               State's QA Office            1 every 3 years
Data Quality Assessment        District's QA Office         Annually
Performance Audits (field)     District's QA Office         Annually
Management Systems Review      EPA Regional QA Office       1 every 3 years
                               District's QA Office         Annually
-------
Air Toxics Model QAPP
Element No: 6
Revision No: 1.0
Date: 7/5/01
Page 6 of 7
6.5 Schedule of Activities
Table 6.6 contains a list of the critical activities required to plan, implement, and assess the air toxics
program.
Table 6.6 Schedule of Critical Air Toxics Activities

Activity                           Due Date                 Comments
Network development                June 15, 2000            Preliminary list of sites and samplers required
Sampler order                      August 12, 2000          Samplers ordered from National contract
Laboratory design/upgrade          August 12, 2000          Listing of laboratory requirements
Laboratory procurement             September 1, 2000        Ordering/purchase of all laboratory and miscellaneous field equipment
Personnel Requirements             September 1, 2000        Advertising for field and laboratory personnel (if required)
QAPP development                   Sept.-Dec. 2000          Development of the QAPP
Network design completion          July 1, 2000             Final network design
Samplers arrive                    October 15, 2000         Received by the District's Shipping and Receiving
Sampler siting/testing             November 2000            Establishment of sites and preliminary testing of samplers
Field/Laboratory Training          December 2000            Field and laboratory training activities and certification
QAPP Submittal                     October 1, 2000          QAPP submittal to EPA
QAPP Approval                      October 31, 2000         Approval by EPA
Pilot testing                      November-December 2000   Pilot activities to ensure efficiency of measurement system
Final Installation of 2000 sites   December 31, 2000        Sites must be established and ready to collect data
Routine Sampling Begins            January 1, 2001          Routine activities must start
6.6 Project Records
The District will establish and maintain procedures for the timely preparation, review, approval, issuance,
use, control, revision, and maintenance of documents and records. Table 6.7 lists the categories
and types of records and documents that are applicable to document control for air toxics information.
Key documents in each category are explained in more detail in Section 9.
Table 6.7 Critical Documents and Records

Categories                     Record/Document Types
Management and Organization    State Implementation Plan
                               Reporting agency information
                               Organizational structure
                               Personnel qualifications and training
                               Training Certification
                               Quality management plan
                               Document control plan
                               Grant allocations
Site Information               Network description
                               Site characterization file
                               Site maps
                               Site Pictures
Environmental Data Operations  QA Project Plans
                               Standard operating procedures (SOPs)
                               Field and laboratory notebooks
                               Sample handling/custody records
                               Inspection/maintenance records
Raw Data                       Any original data (routine and QC data) entry forms
                               Electronic deliverables of summary analytical runs
                               Associated QC and calibration runs
Data Reporting                 Air quality index report
                               Annual SLAMS air quality information
                               Data/summary reports
Data Management                Data algorithms
                               Data management plans/flowcharts
                               Air Toxics Data
Quality Assurance              Good Laboratory Practice
                               Network reviews
                               Control charts
                               Data quality assessments
                               QA reports
                               System audits
                               Response/Corrective action reports
                               Site Audits
References:
1. Compendium Method for the Determination of Inorganic Compounds in Air, United States Environmental Protection
Agency, June 1999, Section IO-3.
2. Compendium Method for the Determination of Toxic Organic Compounds in Air, United States Environmental
Protection Agency, Section TO-11A, January 1999.
3. Compendium Method for the Determination of Toxic Organic Compounds in Air, United States Environmental
Protection Agency, Section TO-14A, January 1999.
4. Compendium Method for the Determination of Toxic Organic Compounds in Air, United States Environmental
Protection Agency, Section TO-13A, January 1999.
-------
Project: Model QAPP
Element No: 7
Revision No: 1.0
Date: 7/5/01
Page 1 of 11
7.0 Quality Objectives and Criteria for Measurement Data
The purpose of this element is to document the DQOs of the project and to establish performance criteria for
the mandatory systematic planning process and measurement system that will be employed in generating the data.
7.1 Data Quality Objectives (DQOs)
7.1.1 Introduction
This section provides a description of the data quality objectives for the ambient air toxics
characterization in Toxa City that is currently under development. Consistent with the District's
requirement for systematic planning prior to a data collection effort, this document presents issues and
discusses trade-offs related to budget and practical constraints. Due to limited resources, it is important
to consider these trade-offs to plan an efficient and effective study design that collects high quality data
that addresses the questions that need to be answered. The most efficient way to accomplish these goals
is to establish criteria for defensible decision making before the study begins, and then develop a data
collection design based on these criteria. By using the DQO Process to plan environmental data
collection efforts, the TCAPCD can improve the effectiveness, efficiency, and defensibility of decisions
in a resource-effective manner.
It is the policy of the TCAPCD that all air toxics data generated for internal and external use shall meet
specific qualitative requirements, referred to as Data Quality Objectives. The DQOs were developed in
accordance with the guidelines stated in the "EPA Quality Manual for Environmental Programs."1 The
DQO process is detailed in US-EPA's "Guidance for the Data Quality Objectives Process," EPA QA/G-4.

The DQO Process is used to develop a resource-effective data collection design. It provides a systematic
procedure for defining the criteria that a data collection design should satisfy, including when to collect
samples, where to collect samples, the tolerable level of decision errors for the study, and how many
samples to collect. By using the DQO Process, the TCAPCD will assure that the type, quantity, and
quality of environmental data used in decision making will be appropriate for the intended application.
7.1.2 DQO Process
The DQO Process consists of seven steps. The output from each step influences the choices that will be
made later in the Process. During the first six steps of the DQO Process, the planning team developed
the decision performance criteria that were used to develop the data collection design. The final step of
the Process involves developing the data collection design based on the DQOs. Every step should be
completed before data collection begins.
The seven steps of the DQO process are:
State the Problem
Identify the Decision
Identify the Inputs to the Decision
Define the Study Boundaries
Develop a Decision Rule
Specify Tolerable Limits on Decision Errors
Optimize the Design

Each of these steps is examined in the following section. Each step has been performed to ensure an
efficient and effective study design.
(1) State the Problem: Currently, Toxa City does not have a sufficient amount of data of known quality,
or in the needed quantity, to understand the spatial and temporal characteristics of the monitoring area at
a neighborhood scale. Toxa City has evidence that a number of the hazardous air pollutants regulated
under the Clean Air Act are being emitted in the air shed of Toxa City. TCAPCD has been funded to
participate in the National Air Toxics Assessment (NATA) program whose initial ambient air monitoring
focus is to:
characterize ambient concentrations and deposition in representative monitoring areas;
provide data to support and evaluate dispersion and deposition models, and;
establish trends and evaluate effectiveness of HAP reduction strategies.
TCAPCD feels that if it can characterize ambient concentrations and deposition in Toxa City with
adequate data quality, the data will support the modeling and trends analysis goals. This is consistent
with the NATA Concept Paper1 goal of initially focusing on characterization of community-wide
concentrations in urban areas and ecosystem impacts, and on quantifying conditions in the vicinity of
localized hot spots or specific areas of concern, such as schools.
As mentioned in the NATA Concept Paper, "initial new monitoring together with data analysis of
existing measurements will be needed to provide a sufficient understanding of ambient air toxics
concentration throughout the country in order to decide on the appropriate quantity and quality of
data needed." Therefore, the TCAPCD study objective is consistent with this initial goal.
The current problem is:

Toxa City must develop a monitoring network to characterize HAPs, and it must determine
how much monitoring is needed and where to place the monitors. Toxa City does not have an
adequate understanding of the spatial and temporal characteristics of its monitoring area,
sampled at the neighborhood scale, to ensure adequate characterization of the annual
average concentrations.
In order to address this problem, TCAPCD has been provided with $1,500,000, over a five year period,
which is intended to cover all equipment and consumable purchases, data collection, and assessment
costs. TCAPCD must determine the appropriate tradeoffs (i.e., quality, quantity, instrument sensitivity,
precision, bias) to produce the desired results within the resource constraints. These tradeoffs will be
documented in order to help the TCAPCD determine the best monitoring design within budgets and data
quality constraints.
(2) Identify the Decision: The decision that must be made once the data are evaluated is whether or not
TCAPCD can provide meaningful annual HAP concentration estimates for Toxa City that adequately
represent the spatial and temporal characteristics of the city at an every-6-day sampling frequency.
Possible actions, as described in Table 7.1, are that the data from the study appear to adequately
represent Toxa City and the District continues its plans to implement an ambient air monitoring program;
or that the results indicate the estimate carries an inordinate amount of uncertainty that would need to be
corrected by increasing the number of monitors in Toxa City, increasing the sampling frequency,
stratifying the monitoring boundaries, or correcting sampling or analytical errors.
Table 7.1 Principal Study Questions and Alternative Actions

Principal Study Question: Is the ambient air HAPs concentration appropriately characterized, with adequate
spatial and temporal resolution and an appropriate quantity and quality of data?

Alternative Actions:
  Yes - Start implementation of the monitoring network.
  No - Add monitoring sites or increase the monitoring frequency, stratify boundary conditions, or correct
  measurement errors.
(3) Identify the Input to the Decision: For this pilot study the important inputs are:

• the actual 24-hour concentration estimates of the HAPs listed in Tables 7.4 to 7.7;
• measurements of overall precision and bias to quantify the sources of measurement error; and
• location information for each sampling site (latitude and longitude).

Several supporting inputs are available that helped in the development of this study and will be used to
support development of the final monitoring network. These are listed below:

• Initial monitoring results which indicate that certain HAPs have been measured in Toxa City;
• Gaussian plume and exposure models which indicate that certain areas of the city may have levels of
pollutants that are higher than EPA's benchmark values;
• A review of the emission inventory, which indicates that a number of pollutants of concern are being
generated within the city; location data are available for the emission sources;
• Meteorological data (i.e., wind rose information);
• Technical staff expertise in development of ambient air monitoring networks for criteria pollutants and
PAMS;
• Sampling instruments that can meet our requirements for sampling time, contamination, precision,
durability, and ease of use;
• Analytical instruments and methods that can meet our requirements for contamination, detectability,
repeatability, and bias; and
• A number of PAMS and criteria pollutant monitoring sites available that could be used as sampling
platforms.
Table 7.2 List of Top Ten HAPs in Toxa City

Pollutant                 Tons Per Year (1999 est.)   Area or Point Source
1. Benzene                30,000                      Area
2. Xylene                 25,000                      Area
3. Mercury                10,000                      Point
4. Chromium                7,000                      Point
5. Formaldehyde            6,590                      Area and Point
6. Vinyl Chloride          4,100                      Point
7. Methylene Chloride      2,220                      Point
8. Trichloroethylene         950                      Point
9. Naphthalene               400                      Point
10. Cadmium                  250                      Point
(4) Define the Study Boundaries: The spatial and temporal boundaries will be based upon what can
reasonably be achieved within our current and predicted resources for an ambient air monitoring network.
The spatial boundary, Toxa City, is described in detail in Section 10; in general, it is considered to be the
counties of Hillsburg and Pine Lake. Within this boundary, pollutant gradients have been subjectively
identified based upon proximity to known HAP emitters. These gradients will differ depending on the
HAP.

The temporal boundary is one year. The data are collected with the intent of providing an annual average.
These averages are based on 24-hour samples collected once every 6 days.
(5) Develop a Decision Rule: Given the objective to characterize sources of variability, the most
straightforward representation that both characterizes a major endpoint and separates out the magnitudes
of the distinct sources of variability (error) associated with that characterization is the following equation,
which was described in an EPA technical report titled Data Quality Objective Guidance for the
Ambient Air Toxics Characterization Pilot Study.
Y_ijk = μ + α_i + β_j + γ_ij + ε_ijk     (1)

(i.e., Measurement = Truth
+ Spatial Variability
+ Temporal Variability
+ Spatial-Temporal Interaction Variability
+ Sampling/Analytical Error)

where Y_ijk is the measured concentration, μ characterizes the major endpoint of concern (e.g., an area's
true annual average), α_i characterizes spatial variability, β_j characterizes temporal variability, γ_ij
characterizes spatial-temporal interaction variability, and ε_ijk characterizes sampling/analytical variability.
The first three sources of variability can be considered population variability, while the last (ε_ijk) can be
considered measurement uncertainty. In addition, our major concern with measurement error is those
errors that do not affect all sites equally (i.e., systematic bias in one sampler). Since all the sites will be
operated by one field technician and samples of any particular pollutant will be sent to one laboratory,
measurement errors affecting any particular site, sampler, or sample will be minimized. Therefore, the
difference in concentration from each of the monitoring sites on any given day can be considered the
spatial and temporal variability. However, each value will contain measurement uncertainty that must be
minimized as well as quantified in order to separate it from the population variability.
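To make the variance decomposition in Equation 1 concrete, the following Python sketch simulates the model and
recovers rough estimates of the measurement and population components. It is illustrative only: the site count,
sampling frequency, and variance magnitudes are hypothetical values, not figures from this QAPP.

# Illustrative sketch only: simulates the variance-components model in Equation 1
# and recovers rough estimates of each component. Site count, sampling frequency,
# and variance values are hypothetical, not taken from this QAPP.
import numpy as np

rng = np.random.default_rng(0)

n_sites, n_days, n_reps = 5, 61, 2                    # 5 sites, 1-in-6 day sampling, collocated pair
mu = 0.50                                             # "true" annual average, ppbv (assumed)
alpha = rng.normal(0, 0.10, n_sites)                  # spatial variability
beta = rng.normal(0, 0.15, n_days)                    # temporal variability
gamma = rng.normal(0, 0.05, (n_sites, n_days))        # space-time interaction
eps = rng.normal(0, 0.04, (n_sites, n_days, n_reps))  # sampling/analytical error

y = mu + alpha[:, None, None] + beta[None, :, None] + gamma[:, :, None] + eps

# Measurement uncertainty from collocated replicates: variance of (y1 - y2)/sqrt(2)
meas_var = np.var((y[..., 0] - y[..., 1]) / np.sqrt(2), ddof=1)

# Rough population components from site and day means (ignores small noise corrections)
site_means = y.mean(axis=(1, 2))
day_means = y.mean(axis=(0, 2))
print(f"estimated annual mean      : {y.mean():.3f} ppbv")
print(f"measurement (eps) variance : {meas_var:.4f}")
print(f"between-site variance      : {np.var(site_means, ddof=1):.4f}")
print(f"between-day variance       : {np.var(day_means, ddof=1):.4f}")

In this sketch, the collocated pair at each site/day isolates the sampling/analytical term, which is the same logic the
District will apply when separating measurement uncertainty from population variability.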
(6) Specify Tolerable Limits on Decision Errors: Since this study's objective is to characterize spatial
and temporal variability, no limits are placed on population variability. What is initially important is
that each sampling site provides a true estimate of what it represents (boundary condition); therefore, the
goal is to establish an adequate estimate of the boundary. TCAPCD must feel comfortable that it will be
able to provide reasonable annual estimates of HAPs. Since "risk-based concentrations" have been
established for some HAPs, the planning team decided it was important to have an established and
adequate level of confidence in concentrations reported at these levels. Since there are many HAPs, the
planning team selected one that they knew was present at an appreciable concentration in Toxa City
(Table 7.2) and that had a risk-based concentration above the method detection limit. Therefore,
trichloroethene was selected.

The planning team established a baseline condition, which is:

The annual average concentration for trichloroethene is greater than the risk-based
concentration of 0.61 ppbv.
From this statement, we can establish the two types of potential decision error:

• falsely accepting the baseline condition by stating that the annual average concentration for
trichloroethene is greater than the risk-based concentration when in truth it is not;
• falsely rejecting the baseline condition by stating that the annual average concentration for
trichloroethene is less than the risk-based concentration when in truth it is greater than the
risk-based concentration.
Table 7.3 also illustrates the false acceptance and false rejection decisions of this pilot study.
Table 7.3 False Acceptance and False Rejection Decisions

Decision based on sampling data: The annual average concentration for trichloroethene is greater than the
risk-based concentration of 0.61 ppbv (the baseline condition).
  - If the baseline is true (the true concentration is greater than the risk-based concentration): Correct decision.
  - If the alternative is true (the true concentration is not greater than the risk-based concentration): Decision
    error (false acceptance).

Decision based on sampling data: The annual average concentration for trichloroethene is not greater than the
risk-based concentration of 0.61 ppbv.
  - If the baseline is true: Decision error (false rejection).
  - If the alternative is true: Correct decision.
Decision errors occur due to the population and measurement uncertainty components that are discussed
above.
The planning team could just as easily have set up the baseline condition that the concentration was less than
the risk-based concentration. In either case, the planning team wanted to guard against making a false
decision that the HAP concentrations were low when in truth they were a potential health hazard. In
addition, the goal of the exercise in this step was to develop a monitoring system with acceptable levels of
population and measurement uncertainty (i.e., a correct sampling design and sampling frequency) in order to
make the decisions within tolerable levels of decision error.
The planning team then went about setting the tolerable levels of decision errors. Figure 7.1, the decision
performance goal diagram (DPGD), shows the case where a decision maker considers the more severe
decision error to occur above the Action Level and has labeled that condition as the baseline.
The plausible range of values, based on professional judgment, is approximately the detection limit to 1.0
ppbv. The Action Level is 0.61 ppbv. A false rejection would be saying the parameter is less than the
Action Level when, in fact, it is really greater. A false acceptance would be saying the parameter level is
above the Action Level when, in reality, it is below the Action Level. The gray region is the area where it
is considered tolerable to make a decision error. For example, suppose TCAPCD decided the true
parameter level was above the Action Level (0.61 ppbv) when in reality it was 0.55
ppbv. Although an error has occurred (false acceptance), it is not particularly severe because the effect of
a 0.06 ppbv difference on human health and financial resources is minimal. On the other hand, suppose
TCAPCD decided the true parameter level was above the Action Level (0.61 ppbv) when in reality it
was 0.45 ppbv. Again, an error has occurred (false acceptance), but it is severe because a difference of
0.16 ppbv is considerable. In this particular case the planning team chose 0.45 ppbv as the edge of their
gray region because it represented the case where errors in decision making have a great impact on
resources.

Figure 7.1 An example of a Decision Performance Goal Diagram. Baseline condition: the parameter exceeds the
Action Level (the more severe decision error occurs above the Action Level). The diagram plots the probability of
deciding that the parameter exceeds the Action Level against the true value of the parameter (mean concentration,
ppbv).

The planning team then assigned risk probabilities to the chance of making decision errors for
various true values of the parameter. The team agreed that, if the true value was 0.45 ppbv and they
decided (from the data to be collected) that the true value exceeded 0.61 ppbv, they were only willing
to accept a 10% risk of this happening. The team then considered the implications of what adverse effect
would occur if the true value was 0.3 ppbv but they decided the parameter was greater than 0.61 ppbv.
The analysis showed an additional expenditure of resources, so the planning team elected to take only a
5% risk of this happening. The planning team did a similar exercise with the tolerable false rejection
error rates.
Summary
• The baseline condition (i.e., the null hypothesis [H0]) was established as "the measured concentration
for the HAP is above the risk-based concentration."
• The gray region was designated as the area adjacent to the Action Level where the planning team
considered the consequences of a false acceptance decision error to be minimal. The planning
team specified a width of 0.16 ppbv for the gray region, based on its preference to guard against false
acceptance decision errors at a concentration of 0.45 ppbv (the lower bound of the gray region).
• Below the Action Level, the planning team set the maximum tolerable probability of making a false
acceptance error at 10% when the true parameter was between 0.45 and 0.61 ppbv, and at 5% when it
was below 0.45 ppbv. These limits were based on both experience and an economic analysis that showed
that these decision error rates reasonably balanced the cost of additional sampling/monitoring.
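The sketch below illustrates, under a simple normal approximation, how a false acceptance probability of the kind
discussed above can be computed for a candidate design. The decision rule (compare the observed annual mean
directly to the Action Level), the sample count, and the per-sample coefficient of variation are assumptions made
for illustration; in practice the planning team would adjust the design or the decision rule until the computed
probabilities fall within the 10% and 5% limits.

# Illustrative sketch only: probability of a false acceptance (deciding the annual
# mean exceeds the 0.61 ppbv action level when the true mean is lower), using a
# normal approximation for the annual mean. The decision rule, sample count, and
# per-sample CV below are assumptions for illustration, not values from this QAPP.
from math import erfc, sqrt

ACTION_LEVEL = 0.61          # ppbv (risk-based concentration for trichloroethene)
N_SAMPLES = 61               # ~1-in-6 day sampling for one year at one site (assumed)
CV = 1.0                     # assumed relative variability of a single 24-hr sample

def false_acceptance_prob(true_mean: float) -> float:
    """P(observed annual mean > action level | true mean), deciding 'above' when
    the observed annual mean exceeds the action level."""
    se = CV * true_mean / sqrt(N_SAMPLES)        # standard error of the annual mean
    z = (ACTION_LEVEL - true_mean) / se
    return 0.5 * erfc(z / sqrt(2))               # upper-tail normal probability

for true_mean in (0.30, 0.45, 0.55):
    print(f"true mean {true_mean:.2f} ppbv -> "
          f"P(false acceptance) = {false_acceptance_prob(true_mean):.3f}")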
(7) Optimize the Design: In order to achieve the DPGD, the planning team gathered preliminary
information from other monitoring programs, together with its own experience in monitoring HAPs, to
estimate the total uncertainty (population + measurement). The goal was to reduce
total uncertainty through an appropriate choice of sample design and data collection (sampling/analysis)
techniques. If the total variability can be reduced to a value less than that specified in Step 6, the result
will be either a reduction in decision error rates (given a fixed number of samples) or a reduction in the
number of samples (and, hence, resource expenditure) for a given set of decision error rates. Based
upon the number of samples taken in the proposed design, we estimated total variability around the mean
at the 95% confidence limits to be <20%. Based upon our initial estimates of variability and the
resources available to perform the study, the following design was established:

• Location of 5 sites to establish the spatial and temporal variability across a gradient of pollution
concentrations;
• Sampling frequency of once every six days in order to determine the adequacy of an annual estimate
(~300 samples).

Based upon this design, the DPGD can be met if total variability stays below the level specified in Step 6.
Section 10 explains the sampling design in more detail.
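The arithmetic behind the proposed design can be checked with a few lines; the per-sample coefficient of
variation used here is an assumed value for illustration, not an estimate derived from the District's data.

# Illustrative arithmetic only: approximate sample count for the proposed design and
# the 95% confidence half-width of an annual mean, assuming independent samples and
# a normal approximation. The per-sample CV is an assumed value, not a QAPP figure.
from math import sqrt

n_sites = 5
samples_per_site = 365 // 6 + 1        # 1-in-6 day schedule -> about 61 samples per year
total_samples = n_sites * samples_per_site
print(f"total samples per year: {total_samples}")     # roughly 300 samples

cv_single_sample = 0.80                # assumed relative variability of one 24-hr sample
half_width = 1.96 * cv_single_sample / sqrt(samples_per_site)
print(f"95% half-width of one site's annual mean: about ±{half_width:.0%} of the mean")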
7.2 Measurement Quality Objectives
Once a DQO is established, the quality of the data must be evaluated and controlled to ensure that it is
maintained within the established acceptance criteria. Measurement Quality Objectives (MQOs) are
designed to evaluate and control various phases (sampling, preparation, analysis) of the measurement
process to ensure that total measurement uncertainty is within the range prescribed by the DQOs.
MQOs can be defined in terms of the following data quality indicators:
Precision - a measure of mutual agreement among individual measurements of the same property usually under
prescribed similar conditions. This is the random component of error. Precision is estimated by various statistical
techniques using some derivation of the standard deviation.
Bias - the systematic or persistent distortion of a measurement process which causes error in one direction. Bias will
be determined by estimating the positive and negative deviation from the true value as a percentage of the true value.
Representativeness - a measure of the degree to which data accurately and precisely represent a characteristic of a
population, parameter variations at a sampling point, a process condition, or an environmental condition.
Detectability - the determination of the low-range critical value of a characteristic that a method-specific procedure
can reliably discern (40 CFR Part 136, Appendix B).
Completeness - a measure of the amount of valid data obtained from a measurement system compared to the amount
that was expected to be obtained under correct, normal conditions. Data completeness requirements are included in
the reference methods (40 CFR Pt. 50).
Comparability - a measure of confidence with which one data set can be compared to another.
Accuracy has been a term frequently used to represent closeness to "truth" and includes a combination of
precision and bias error components. If possible, the District will attempt to distinguish measurement uncertainties
into precision and bias components.
For each of these attributes, acceptance criteria can be developed for various phases of the environmental data
operation. In theory, if these MQOs are met, measurement uncertainty should be controlled to the levels required
by the DQO. Tables 7.4 through 7.7 list the MQOs for the pollutants to be measured in the pilot air toxics study.
More detailed descriptions of these MQOs and how they will be used to control and assess measurement
uncertainty are provided in other elements of this QAPP, as well as in the SOPs.
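As a simple illustration of how three of these indicators are calculated and compared against MQO limits of the
kind listed in Tables 7.4 through 7.7, consider the following sketch; all of the input values are hypothetical.

# Illustrative sketch only: computing three of the data quality indicators above
# (precision as a CV, bias as a percent deviation from a known value, and
# completeness) from hypothetical QC results, then comparing them to MQO limits
# of the kind listed in Tables 7.4-7.7. All numbers below are made up.
import statistics

# Replicate analyses of the same QC standard (hypothetical, ng/m3)
replicates = [4.8, 5.1, 4.9, 5.3, 5.0]
true_value = 5.0                         # certified concentration of the standard

mean = statistics.mean(replicates)
cv = statistics.stdev(replicates) / mean          # precision (coefficient of variation)
bias = (mean - true_value) / true_value           # bias relative to the true value

# Completeness: valid samples obtained vs. samples scheduled (hypothetical)
scheduled, valid = 61, 52
completeness = valid / scheduled

print(f"precision (CV): {cv:.1%}   (MQO example: <= 10%)")
print(f"bias          : {bias:+.1%} (MQO example: within +/- 15%)")
print(f"completeness  : {completeness:.0%}  (MQO example: > 75%)")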
Table 7.4 Measurement Quality Objectives - Air Toxics Metals

Compound    Reporting  Precision  Accuracy  Representativeness  Comparability/     Completeness  Minimum Detection
            Units      (CV)                                     Method Selection                 Limits (1)
arsenic     ng/m3      10%        +/- 15%   Neighborhood Scale  ICP-MS             >75%          0.30
beryllium   ng/m3      10%        +/- 15%   Neighborhood Scale  ICP-MS             >75%          0.02
cadmium     ng/m3      10%        +/- 15%   Neighborhood Scale  ICP-MS             >75%          0.02
chromium    ng/m3      10%        +/- 15%   Neighborhood Scale  ICP-MS             >75%          0.01
lead        ng/m3      10%        +/- 15%   Neighborhood Scale  ICP-MS             >75%          0.01
manganese   ng/m3      10%        +/- 15%   Neighborhood Scale  ICP-MS             >75%          0.02
nickel      ng/m3      10%        +/- 15%   Neighborhood Scale  ICP-MS             >75%          0.02
antimony    ng/m3      10%        +/- 15%   Neighborhood Scale  ICP-MS             >75%          0.01
cobalt      ng/m3      10%        +/- 15%   Neighborhood Scale  ICP-MS             >75%          0.01
selenium    ng/m3      10%        +/- 15%   Neighborhood Scale  ICP-MS             >75%          1.10

Table 7.5 Measurement Quality Objectives - Air Toxics Carbonyls

Compound             Reporting  Precision  Accuracy  Representativeness  Comparability/          Completeness  Minimum Detection
                     Units      (CV)                                     Method Selection                      Limits (2)
Acetaldehyde         ppbv       10%        +/- 15%   Neighborhood Scale  Liquid Chromatography   >75%          1.36
Formaldehyde         ppbv       10%        +/- 15%   Neighborhood Scale  Liquid Chromatography   >75%          1.45
Propionaldehyde      ppbv       10%        +/- 15%   Neighborhood Scale  Liquid Chromatography   >75%          1.28
methyl ethyl ketone  ppbv       10%        +/- 15%   Neighborhood Scale  Liquid Chromatography   >75%          1.50
Table 7.6 Measurement Quality Objectives - Air Toxics Volatile Organics

Compound                   Reporting  Precision  Accuracy  Representativeness  Comparability/      Completeness  Minimum Detection
                           Units      (CV)                                     Method Selection                  Limits (3)
benzene                    ppbv       10%        +/- 15%   Neighborhood Scale  Gas Chromatography  >75%          0.34
1,3-butadiene              ppbv       10%        +/- 15%   Neighborhood Scale  Gas Chromatography  >75%          1.00
carbon tetrachloride       ppbv       10%        +/- 15%   Neighborhood Scale  Gas Chromatography  >75%          0.42
chloroform                 ppbv       10%        +/- 15%   Neighborhood Scale  Gas Chromatography  >75%          0.25
1,2-dichloropropane        ppbv       10%        +/- 15%   Neighborhood Scale  Gas Chromatography  >75%          0.21
methylene chloride         ppbv       10%        +/- 15%   Neighborhood Scale  Gas Chromatography  >75%          1.38
tetrachloroethene          ppbv       10%        +/- 15%   Neighborhood Scale  Gas Chromatography  >75%          0.75
tetrachloroethane          ppbv       10%        +/- 15%   Neighborhood Scale  Gas Chromatography  >75%          0.28
trichloroethene            ppbv       10%        +/- 15%   Neighborhood Scale  Gas Chromatography  >75%          0.45
vinyl chloride             ppbv       10%        +/- 15%   Neighborhood Scale  Gas Chromatography  >75%          0.48
acrylonitrile              ppbv       10%        +/- 15%   Neighborhood Scale  Gas Chromatography  >75%          1.00
1,2-dibromoethane          ppbv       10%        +/- 15%   Neighborhood Scale  Gas Chromatography  >75%          0.05
cis-1,3-dichloropropene    ppbv       10%        +/- 15%   Neighborhood Scale  Gas Chromatography  >75%          0.36
trans-1,3-dichloropropene  ppbv       10%        +/- 15%   Neighborhood Scale  Gas Chromatography  >75%          0.06
1,2-dichloroethane         ppbv       10%        +/- 15%   Neighborhood Scale  Gas Chromatography  >75%          0.24
1,1,2,2-tetrachloroethane  ppbv       10%        +/- 15%   Neighborhood Scale  Gas Chromatography  >75%          0.28
methyl chloride            ppbv       10%        +/- 15%   Neighborhood Scale  Gas Chromatography  >75%          0.40
methyl bromide             ppbv       10%        +/- 15%   Neighborhood Scale  Gas Chromatography  >75%          0.53
ethyl chloride             ppbv       10%        +/- 15%   Neighborhood Scale  Gas Chromatography  >75%          0.19
1,1-dichloroethane         ppbv       10%        +/- 15%   Neighborhood Scale  Gas Chromatography  >75%          0.27
1,1-dichloroethene         ppbv       10%        +/- 15%   Neighborhood Scale  Gas Chromatography  >75%          0.50
1,1,1-trichloroethane      ppbv       10%        +/- 15%   Neighborhood Scale  Gas Chromatography  >75%          0.62
1,1,2-trichloroethane      ppbv       10%        +/- 15%   Neighborhood Scale  Gas Chromatography  >75%          0.50
toluene                    ppbv       10%        +/- 15%   Neighborhood Scale  Gas Chromatography  >75%          0.99
chlorobenzene              ppbv       10%        +/- 15%   Neighborhood Scale  Gas Chromatography  >75%          0.34
ethylbenzene               ppbv       10%        +/- 15%   Neighborhood Scale  Gas Chromatography  >75%          0.27
xylene (isomers)           ppbv       10%        +/- 15%   Neighborhood Scale  Gas Chromatography  >75%          0.76/0.57
styrene                    ppbv       10%        +/- 15%   Neighborhood Scale  Gas Chromatography  >75%          1.64
1,4-dichlorobenzene        ppbv       10%        +/- 15%   Neighborhood Scale  Gas Chromatography  >75%          0.70
1,2,4-trichlorobenzene     ppbv       10%        +/- 15%   Neighborhood Scale  Gas Chromatography  >75%          NA
hexachloro-1,3-butadiene   ppbv       10%        +/- 15%   Neighborhood Scale  Gas Chromatography  >75%          NA
Table 7.7 Measurement Quality Objectives - Air Toxics Semi-Volatile Organics

Compound            Reporting  Precision  Accuracy  Representativeness  Comparability/        Completeness  Minimum Detection
                    Units                                               Method Selection                    Limits (4)
acenaphthene        pg/uL      +/- 10%    +/- 20%   Neighborhood Scale  Gas Chrom/Mass Spec.  >75%          18.0
anthracene          pg/uL      +/- 10%    +/- 20%   Neighborhood Scale  Gas Chrom/Mass Spec.  >75%          21.0
benzo[a]pyrene      pg/uL      +/- 10%    +/- 20%   Neighborhood Scale  Gas Chrom/Mass Spec.  >75%          31.1
fluorene            pg/uL      +/- 10%    +/- 20%   Neighborhood Scale  Gas Chrom/Mass Spec.  >75%          18.5
pyrene              pg/uL      +/- 10%    +/- 20%   Neighborhood Scale  Gas Chrom/Mass Spec.  >75%          23.4
chrysene            pg/uL      +/- 10%    +/- 20%   Neighborhood Scale  Gas Chrom/Mass Spec.  >75%          26.7
benzo[a]anthracene  pg/uL      +/- 10%    +/- 20%   Neighborhood Scale  Gas Chrom/Mass Spec.  >75%          26.3
naphthalene         pg/uL      +/- 10%    +/- 20%   Neighborhood Scale  Gas Chrom/Mass Spec.  >75%          14.0
References
1. Compendium Method for the Determination of Inorganic Compounds in Air, United States Environmental Protection
Agency, June 1999, Section IO-3.
2. Compendium Method for the Determination of Toxic Organic Compounds in Air, United States Environmental
Protection Agency, Section TO-11A, January 1999.
3. Compendium Method for the Determination of Toxic Organic Compounds in Air, United States Environmental
Protection Agency, Section TO-14A, January 1999.
4. Compendium Method for the Determination of Toxic Organic Compounds in Air, United States Environmental
Protection Agency, Section TO-13A, January 1999.
-------
Project: Model QAPP
Element No: 8
Revision No: 1.0
Date: 7/5/01
Page 1 of 4
8.0 Special Training Requirements/Certification
The purpose of this element is to ensure that any specialized or unusual training requirements necessary to
complete the projects are known and furnished and the procedures are described in sufficient detail to ensure that
specific training skills can be verified, documented, and updated as necessary.
8.1 Training
Requirements for specialized training for nonroutine field sampling techniques, field analyses, laboratory
analyses, or data validation should be specified. Depending on the nature of the environmental data operation,
the QAPP may need to address compliance with specifically mandated training requirements.
Personnel assigned to the air toxics ambient air monitoring activities will meet the educational, work
experience, responsibility, personal attributes, and training requirements for their positions. Records on
personnel qualifications and training will be maintained in personnel files and will be accessible for review
during audit activities.
Adequate education and training are integral to any monitoring program that strives for reliable and
comparable data. Training is aimed at increasing the effectiveness of employees and the District. Table
8.1 presents the general training requirements for all employees, depending upon their job classification.
Table 8.1 TCAPCD Training Requirements

Job Classification           Training Title                                   Time/Frequency Requirement
Directors                    Executive Development Program                    As available
Branch Chief and above       Framework for Supervision                        1st 6 months
                             Keys to Managerial Excellence                    After completion of above
                             EEO for Managers and Supervisors;
                             Sexual Harassment;
                             Contract Administration for Supervisors;
                             40 hours of developmental activities             As available
Project Officers and above   Contract Administration                          Prior to responsibility
                             Contract Administration Recertification          Every three years
                             EEO for Managers and Supervisors                 As available
                             Grants Training;
                             Project Officer Training (contract/grants);
                             Ethics in Procurement;
                             Work statements for Negotiated Procurement       Prior to responsibility
Job Classification           Training Title                                   Time/Frequency Requirement
Field Personnel              24-Hour Field Safety                             1st time
                             8-Hour Field Safety Refresher                    Yearly
                             8-Hour First Aid/CPR                             Yearly
                             Blood borne pathogens                            1st time
Laboratory Personnel         24-Hour Laboratory Safety                        1st time
                             4-Hour Refresher                                 Yearly
                             Safety Video/Discussion                          Yearly
                             Chemical Spill Emergency Response                1st time
                             Blood borne pathogens                            1st time
8.1.1 Ambient Air Monitoring Training
Appropriate training will be available to employees supporting the Ambient Air Quality Monitoring
Program, commensurate with their duties. Such training may consist of classroom lectures, workshops,
teleconferences, and on-the-job training.
Over the years, a number of courses have been developed for personnel involved with ambient air
monitoring and quality assurance aspects. Formal QA/QC training is offered through the following
organizations:
Air Pollution Training Institute (APTI)  http://www.epa.gov/oar/oaq.apti.html
Air & Waste Management Association (AWMA)  http://awma.org/epr.htm
American Society for Quality Control (ASQC)  http://www.asqc.org/products/educat.html
EPA Institute
EPA Quality Assurance Division (QAD)  http://es.inel.gov/ncerqa/qal
EPA Regional Offices
Table 8.2 presents a sequence of core ambient air monitoring and QA courses for ambient air
monitoring staff, and QA managers. The suggested course sequences assume little or no experience in
QA/QC or air monitoring. Persons having experience in the subject matter described in the courses would
select courses according to their appropriate experience level. Courses not included in the core sequence
would be selected according to individual responsibilities, preferences, and available resources.
Table 8.2 Core Ambient Air Training Courses

Sequence  Course Title (SI = self instructional)                                                       Source
1*        Air Pollution Control Orientation Course (Revised), SI:422                                   APTI
2*        Principles and Practices of Air Pollution Control, 452                                       APTI
3*        Orientation to Quality Assurance Management                                                  QAD
4*        Introduction to Ambient Air Monitoring (Under Revision), SI:434                              APTI
5*        General Quality Assurance Considerations for Ambient Air Monitoring (Under Revision), SI:471 APTI
6*        Quality Assurance for Air Pollution Measurement Systems (Under Revision), 470                APTI
7*        Data Quality Objectives Workshop                                                             QAD
8*        Quality Assurance Project Plan                                                               QAD
9         Atmospheric Sampling (Under Revision), 435                                                   APTI
10        Analytical Methods for Air Quality Standards, 464                                            APTI
11        Chain-of-Custody Procedures for Samples and Data, SI:443                                     APTI
*         Data Quality Assessment                                                                      QAD
*         Management Systems Review                                                                    QAD
*         Beginning Environmental Statistical Techniques (Revised), SI:473A                            APTI
*         Introduction to Environmental Statistics, SI:473B                                            APTI
*         Statistics for Effective Decision Making                                                     ASQC
*         AIRS Training                                                                                OAQPS

* Courses recommended for QA Managers
8.2 Certification
Usually, the organizations participating in the project that are responsible for conducting training and health
and safety programs are also responsible for ensuring certification. Various commercial training courses are
available that meet some government regulations. Training and certification should be planned well in advance
for necessary personnel prior to the implementation of the project. All certificates or documentation
representing completion of specialized training should be maintained in personnel files.
For the air toxics program, the QA Division will issue training certifications for the successful
completion of field, laboratory, sample custody, and data management training. Certification will be based
upon a qualitative and quantitative assessment of each individual's adherence to the SOPs.
-------
Project: Model QAPP
Element No: 9
Revision No: 1.0
Date: 7/5/01
Page 1 of 5
9.0 Documentation and Records
The purpose of this element is to define which records are critical to the project and what information needs to
be included in reports, as well as the data reporting format and the document control procedures to be used.
Specification of the proper reporting format, compatible with data validation, will facilitate clear, direct
communication of the investigation and its conclusions and be a resource document for the design of future
studies.
For the ATMP, there are a number of documents and records that need to be retained. A document, from
a records management perspective, is a volume that contains information which describes, defines,
specifies, reports, certifies, or provides data or results pertaining to environmental programs. As defined in
the Federal Records Act of 1950 and the Paperwork Reduction Act of 1995 (now 44 U.S.C. 3101-
3107), records are: "...books, papers, maps, photographs, machine readable materials, or other
documentary materials, regardless of physical form or characteristics, made or received by an agency of
the United States Government under Federal Law or in connection with the transaction of public business
and preserved or appropriate for preservation by that agency or its legitimate successor as evidence of the
organization, functions, policies, decisions, procedures, operations, or other activities of the Government
or because of the informational value of data in them..." The TCAPCD follows these guidelines to assure
the public that the District's procedures are performed within the requirements of the Paperwork Reduction
Act.

The following information describes the District's document and records procedures for the ATMP.
Consistent with QAPP regulation and guidance, the District uses the term reporting package. It is defined
as all the information required to support the concentration data reported to EPA and the State, which
includes all data required to be collected as well as data deemed important by the District under its policies
and records management procedures. Table 9.1 identifies these documents and records.
9.1 Information Included in the Reporting Package
The selection of which records to include in a data reporting package must be determined based on how the
data will be used. Different "levels of effort" require different supporting QA/QC documentation. For example,
organizations conducting basic research have different reporting requirements from organizations collecting data
in support of litigation or in compliance with permits. When possible, field and laboratory records should be
integrated to provide a continuous track of reporting.
9.1.1 Routine Data Activities
The TCAPCD has a structured records management retrieval system that allows for the efficient archiving
and retrieval of records. The air toxics information will be included in this system. It is organized in a
similar manner to the EPA's records management system (EPA-220-B-97-003) and follows the same
coding scheme in order to facilitate easy retrieval of information during EPA technical systems audits and
network reviews. Table 9.1 includes the documents and records that will be filed according to the statute
of limitations discussed in Section 9.3. In order to archive the information as a cohesive unit, the air toxics
information will be filed under the individual codes depending on the chemical makeup of the compound.
Please see Table 9.1.
Table 9.1 Air Toxics Reporting Package Information

Categories                      Record/Document Types                                        File Codes
Management and Organization     State Implementation Plan                                    AIRP/217
                                Reporting agency information                                 AIRP/237
                                Organizational structure                                     ADMI/106
                                Personnel qualifications and training                        PERS/123
                                Training Certification                                       AIRP/482
                                Quality management plan                                      AIRP/216
                                Document control plan                                        ADMI/307
                                EPA Directives                                               DIRE/007
                                Grant allocations                                            BUDG/043
                                Support Contract                                             CONT/003, CONT/202
Site Information                Network description                                          AIRP/237
                                Site characterization file                                   AIRP/237
                                Site maps                                                    AIRP/237
                                Site Pictures                                                AUDV/708
Environmental Data Operations   QA Project Plans                                             PROG/185
                                Standard operating procedures (SOPs)                         SAMP/223
                                Field and laboratory notebooks                               SAMP/502
                                Sample handling/custody records                              TRAN/643
                                Inspection/Maintenance records                               AIRP/486
Raw Data                        Any original data (routine and QC data) including
                                data entry forms                                             SAMP/223
                                Electronic deliverables of summary analytical and
                                associated QC and calibration runs per instrument            SAMP/224
Data Reporting                  Air quality index report                                     AIRP/484
                                Data summary reports                                         AIRP/484
                                Journal articles/papers/presentations                        PUBL/250
Data Management                 Data algorithms                                              INFO/304
                                Data management plans/flowcharts                             INFO/304
                                Air toxics Data                                              INFO/160 - INFO/173
                                Data Management Systems                                      INFO/304 - INFO/170
Quality Assurance               Good Laboratory Practice                                     COMP/322
                                Network reviews                                              OVER/255
                                Control charts                                               SAMP/223
                                Data quality assessments                                     SAMP/223
                                QA reports                                                   OVER/203
                                System audits                                                OVER/255
                                Response/Corrective action reports                           PROG/082
                                Site Audits                                                  OVER/658
9.1.2 Annual Summary Reports Submitted to EPA
The TCAPCD shall submit to the EPA Region 11 Office an annual summary report of all the air toxics data
collected within that calendar year. The report will be submitted by April 1 of each year for the data
collected from January 1 to December 31 of the previous year. The report will contain the following
information:

Site and Monitoring Information
• City name;
• County name and street address of the site location;
• AIRS-AQS site code;
• AIRS-AQS monitoring method code.

Summary Data
• Annual arithmetic mean (a brief illustration of this calculation appears below), and
• Sampling schedule used (once every 6 days).
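The annual arithmetic mean itself is a straightforward calculation over the year's valid 1-in-6 day results; the
sketch below is a minimal illustration with made-up values, and the handling of invalid samples (simply excluded)
is an assumption for illustration rather than a stated District procedure.

# Illustrative sketch only: computing the annual arithmetic mean reported in the
# summary from a year of 1-in-6 day sample results. Values and the handling of
# invalid samples (simply skipped here) are assumptions for illustration.
def annual_arithmetic_mean(results):
    """results: list of (sample_date, concentration or None for an invalid sample)."""
    valid = [c for _, c in results if c is not None]
    return sum(valid) / len(valid) if valid else None

samples = [("2000-01-02", 0.42), ("2000-01-08", None), ("2000-01-14", 0.57)]  # etc.
print(f"annual arithmetic mean: {annual_arithmetic_mean(samples):.2f} ppbv")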
Dr. Melvin Thomas, as the senior air pollution control officer for the District, will certify that the annual
summary is accurate to the best of his knowledge. This certification will be based on the various
assessments and reports performed by the organization, in particular, the Quality Assurance Annual
Report (QAAR). The QAAR, discussed in Section 21, documents the quality of the air toxics data and the
effectiveness of the quality system.
9.2 Data Reporting Package Format and Documentation Control
The format of data reporting packages, whether for field or lab data, must be consistent with the requirements
and procedures used for data validation and data assessment. All individual records that represent actions taken
to achieve the objective of the data operation and the performance of specific QA functions are potential
components of the final data reporting package. This element of the QAPP should discuss how these various
components will be assembled to represent a concise and accurate record of all activities impacting data quality.
The discussion should detail the recording medium for the project, guidelines for hand-recorded data (e.g., using
indelible ink), procedures for correcting data (e.g., single line drawn through errors and initialed by the
responsible person), and documentation control. Procedures for making revisions to technical documents should
be clearly specified and the lines of authority indicated.
Table 9-1 represents the documents and records, at a minimum, that must be filed into the reporting
package. The details of these various documents and records will be discussed in the appropriate sections
of this document.
All raw data required for the calculation of air toxics concentrations, the submission to the AIRS database,
and QA/QC data, are collected electronically or on data forms that are included in the field and analytical
methods sections. All hardcopy information will be filled out in indelible ink. Corrections will be made by
drawing a single line through the incorrect entry, initialing and dating the correction, and placing the
correct entry alongside the incorrect entry if this can be accomplished legibly, or by providing the
information on a new line.
9.2.1 Notebooks
The District will issue notebooks to each field and laboratory technician. This notebook will be uniquely
numbered and associated with the individual and the ATMP. Although data entry forms are associated
with all routine environmental data operations, the notebooks can be used to record additional information
about these operations. All notebooks will be bound as well as paginated so that individual pages cannot
be removed unnoticeably.
Field notebooks - Notebooks will be issued for each sampling site. These will be 3-ring binders that will
contain the appropriate data forms for routine operations as well as inspection and maintenance forms and
SOPs.
Lab Notebooks - Notebooks will also be issued for the laboratory. These notebooks will be uniquely
numbered and associated with the ATMP. One notebook will be available for general comments/notes;
others will be associated with the temperature and humidity recording instruments, the refrigerator,
calibration equipment/standards, and the analytical balances and instruments used for this program.
Sample shipping/receipt- One notebook will be issued to the shipping and receiving facility. This
notebook will be uniquely numbered and associated with the ATMP. It will include standard forms and
areas for free form notes.
9.2.2 Electronic data collection
In order to reduce the potential for data entry errors, automated systems will be utilized where appropriate
and will record the same information that is found on data entry forms. In order to provide a back-up, a
hardcopy of automated data collection information will be stored for the appropriate time frame in project
files. The Information Manager will back up analytical data acquired by each laboratory instrument,
including tuning, calibrations and QC sample runs associated with samples.
9.3 Data Reporting Package Archiving and Retrieval
The length of storage for the data reporting package may be governed by regulatory requirements,
organizational policy, or contractual project requirements. This element of the QAPP should note the governing
authority for storage of, access to, and final disposal of all records
In general, all the information listed in Table 9-1 will be retained for 5 years from the date the grantee
submits its final expenditure report unless otherwise noted in the funding agreement. However, if any
litigation, claim, negotiation, audit or other action involving the records has been started before the
expiration of the 5-year period, the records will be retained until completion of the action and resolution of
all issues which arise from it, or until the end of the regular 5-year period, whichever is later. The District
will extend this retention period in order to store records for five full years past the year of collection. For
example, any data collected in calendar year 2000 (1/1/00 - 12/31/00) will be retained until, at a
minimum, January 1, 2006, unless the information is used for litigation purposes.
-------
Project: Model QAPP
Element No: 10
Revision No: 1.0
Date: 7/5/01
Page 1 of 10
10.0 Sampling Design
The purpose of this element is to describe all the relevant components of the experimental design; define the key
parameters to be estimated; indicate the number and type of samples expected; and describe where, when, and how
samples are to be taken. The level of detail should be sufficient that a person knowledgeable in this area could
understand how and why the samples will be collected. This element provides the main opportunity for QAPP
reviewers to ensure that the "right" samples will be taken. Strategies such as stratification, compositing, and
clustering should be discussed, and diagrams or maps showing sampling points should be included. Most of this
information should be available as outputs from the final steps of the planning (DQO) process.
The purpose of this section is to describe all of the relevant components of the monitoring network to be
operated by TCAPCD, including the network design for evaluating the quality of the data. This entails
describing the key parameters to be estimated, the rationale for the locations of the monitors and the
collocated samplers, the frequency of sampling at the primary and collocated samplers, the types of
samplers used at each site, and the frequency of performance evaluations. The network design components
comply with the regulations stipulated in Network Design and Site Exposure Criteria for Selected
Noncriteria Air Pollutants1.
10.1 Scheduled Project Activities, Including Management Activities
This element should give anticipated start and completion dates for the project as well as anticipated dates of
major milestones, such as the following:
• schedule of sampling events;
• schedule for analytical services by offsite laboratories;
• schedule for phases of sequential sampling (or testing), if applicable;
• schedule of test or trial runs; and
• schedule for peer review activities.
The use of bar charts showing time frames of various QAPP activities to identify both potential bottlenecks and
the need for concurrent activities is recommended.
TCAPCD will be monitoring concentrations at five locations. The order of installation of the primary
samplers has been determined based on anticipated concentrations at each of the locations. The sites with
the highest anticipated concentrations will be installed first, and the collocated samplers will be installed at
a later date. Table 10.1 represents the activities associated with the ordering and deployment of the
primary and collocated samplers.
Table 10.1 Schedule of Air Toxics Sampling-Related Activities

Activity                                        Due Date                                    Comments
Receive samplers                                July 1, 2000                                After receipt, begin conditioning of filters
Install samplers at site TC1                    September 2000                              First samplers installed: PUF and VOC
Install samplers at site TC2                    September 2000                              Second samplers installed: PUF, VOC, TSP
Install samplers at site TC3                    October 2000                                Third samplers installed: VOC, PUF, ALD
Install collocated samplers at site TC2         October 2000                                First collocated samplers installed: TSP
Install samplers at site TC4                    November 2000                               Fourth samplers installed: PUF, TSP and VOC
Install collocated samplers at site TC3         December 2000                               Second collocated samplers installed: PUF, VOC, ALD
Install samplers at TC5                         January 1, 2001                             VOC sampler only
Begin routine sampling at collocated            February 2001                               Begin sampler shakedown; make repairs/changes as needed
  sites TC1 and TC2
Begin routine sampling at collocated            February 2001
  sites TC3, TC4 and TC5
Begin sample analysis in laboratory                                                         Begin laboratory equipment shakedown; make adjustments as necessary
Report routine data to AIRS-AQS                 Ongoing - due within 90 days after end of
                                                quarterly reporting period
Performance Evaluations                         Receive 1st State/EPA blind lab samples     Needed to determine which, if any, monitors fail bias and/or precision limits
Report QA data to AIRS-AQS                      Ongoing - due within 90 days after end of
                                                quarterly reporting period
Review QA reports generated by AIRS             Ongoing
Primary network review                          Annually                                    Evaluate reasonableness of siting
10.2 Rationale for the Design
The objectives for an environmental study should be formulated in the planning stage of any investigation. The
requirements and the rationale of the design for the collection of data are derived from the quantitative outputs of the
DQO Process. The type of design used to collect data depends heavily on the key characteristic being investigated.
For example, if the purpose of the study is to estimate overall average contamination at a site or location, the
characteristic (or parameter) of interest would be the mean level of contamination. This information is identified in
Step 5 of the DQO Process. The relationship of this parameter to any decision that has to be made from the data
collected is obtained from Steps 2 and 3 of the DQO Process.
10.2.1 Primary Samplers
The purpose of the ATMP operated by Toxa City is to ascertain the spatial/temporal variability of the
urban area. To determine whether these characteristics are quantified with sufficient confidence, Toxa
City must address sampler type, sampling frequency, and sampler siting. By employing samplers that are
described in the appropriate compendia1'2'3'4, the data collected will be comparable to standard EPA
methods. By complying with the sampling frequency requirements of Network Design and Site
Exposure Criteria for Selected Noncriteria Air Pollutants5, Toxa City assumes that the sampling
frequency is sufficient to attain the desired confidence in the annual 95th percentile and annual mean of
concentrations in the vicinity of each monitor. By selecting sampler locations using the rules in Network
Design and Site Exposure Criteria for Selected Noncriteria Air Pollutants, Toxa City can be
confident that the concentrations within its jurisdiction are adequately characterized. Sampler type,
frequency, and siting are further described in section 10.4.
10.2.2 QA Samplers
The purpose of collocated samplers and the performance evaluation is to estimate the precision and bias
of the various systems samplers. The goal of the District is to have concentrations measured by a sampler
be within ±10% of the true concentration and that the precision have a coefficient of variation less than
10% for each monitoring system.. To estimate the level of bias and precision being achieved in the field,
at least one site will operate collocated samplers. Chapter 24outiines the equations that will be used to
determine precision. There will be 2 analytes from each instrument that will be used to determine the bias
and precision.
Field accuracy will be estimated using flow, temperature sensor and barometric checks. Laboratory
accuracy will be determined by the analysis of known reference analytes prepared by independent
laboratories submitted to the TCAPCD laboratory. If a sampler and laboratory equipment are operating
within the required bias, precision and accuracy levels, then the decision maker can proceed knowing
that the decisions will be supported by unambiguous data. Thus the key characteristics being measured
with the QA samplers are bias and precision.
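As a hedged illustration of how collocated precision and a field accuracy check might be summarized (the
District's actual equations are those referenced in Chapter 24 and are not reproduced here), the following sketch
uses a common relative-difference approach with made-up values.

# Illustrative sketch only: one common way to summarize collocated-sampler precision
# (relative differences of paired 24-hr results) and a field audit bias check
# (percent difference of an audited flow rate from the audit standard). This is not
# the District's Chapter 24 procedure, just a generic example with made-up data.
from math import sqrt

# Paired results from a primary and collocated sampler (hypothetical, ppbv)
pairs = [(0.48, 0.51), (0.62, 0.58), (0.35, 0.37), (0.55, 0.52)]

rel_diffs = [(x - y) / ((x + y) / 2) for x, y in pairs]       # relative difference per pair
precision_cv = sqrt(sum(d * d for d in rel_diffs) / (2 * len(pairs)))
print(f"collocated precision (CV): {precision_cv:.1%}   goal: < 10%")

# Flow-rate audit: sampler indicated flow vs. audit (transfer standard) flow, L/min
indicated, audit = 16.3, 16.7
flow_bias = (indicated - audit) / audit
print(f"flow audit bias: {flow_bias:+.1%}   goal: within +/- 10%")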
To determine whether these characteristics are measured with sufficient confidence, Toxa City must
address sampler type, sampling frequency, and sampler siting for the QA network. As with the primary
network, by using samplers as described in the TO and IO methods, maintaining the sampling frequency
specified in Network Design and Site Exposure Criteria for Selected Noncriteria Air Pollutants,
Toxa City assumes its QA network will measure bias and precision with sufficient confidence. These
issues are described in more detail in section 10.4.
10.3 Design Assumptions
The planning process usually recommends a specific data collection method (Step 7 of the DQO Process), but
the effectiveness of this methodology rests firmly on assumptions made to establish the data collection design.
Typical assumptions include the homogeneity of the medium to be sampled (for example, sludge, fine silt, or
wastewater effluent), the independence in the collection of individual samples (for example, four separate samples
rather than four aliquots derived from a single sample), and the stability of the conditions during sample collection
(for example, the effects of a rainstorm during collection of wastewater from an industrial plant). The assumptions
should have been considered during the DQO Process and should be summarized together with a contingency plan
to account for exceptions to the proposed sampling plan. An important part of the contingency plan is
documenting the procedures to be adopted in reporting deviations or anomalies observed after the data collection
has been completed. Examples include an extreme lack of homogeneity within a physical sample or the presence of
analytes that were not mentioned in the original sampling plan. Chapter 1 of EPA QA/G-9 provides an overview of
sampling plans and the assumptions needed for their implementation, and EPA QA/G-5S provides more detailed
guidance on the construction of sampling plans to meet the requirements generated by the DQO Process.
The sampling design is based on the assumption that following the rules and guidance provided in CFR
and Network Design and Site Exposure Criteria for Selected Noncriteria Air Pollutants will result in
data that can be used to measure compliance with the national standards. The only issue at Toxa City's
discretion is the sampler siting, and to a degree, sampling frequency. The siting assumes homogeneity of
concentrations within the MSA. Boundaries will be regularly reviewed, as part of the network reviews
(Section 20). The basis for creating and revising the boundaries is described in the following section.
10.4 Procedure for Locating and Selecting Environmental Samples
The most appropriate plan for a particular sampling application will depend on: the practicality and feasibility
(e.g., determining specific sampling locations) of the plan, the key characteristic (the parameter established in Step 5
of the DQO Process) to be estimated, and the implementation resource requirements (e.g., the costs of sample
collection, transportation, and analysis).
This element of the QAPP should also describe the frequency of sampling and specific sample locations (e.g.,
emissions inventory, population exposure, determination of highest concentration) and sampling materials.
Sometimes decisions on the number and location of samples will be made in the field; therefore, the QAPP should
describe how these decisions will be driven whether by actual observations or by field screening data. When
locational data are to be collected, stored, and transmitted, the methodology used must be specified and described
(or referenced) and include the following:
• procedures for finding prescribed sample locations,
• contingencies for cases where prescribed locations are inaccessible,
• location bias and its assessment, and
• procedures for reporting deviations from the sampling plan.
When appropriate, a map of the sample locations should be provided and locational map coordinates supplied.
EPA QA/G-5S provides nonmandatory guidance on the practicality of constructing sampling plans and references
to alternative sampling procedures.
10.4.1 Sampling Design
The design of the air toxics network must achieve the monitoring objective, which is to:
• Determine the highest concentrations expected to occur in the area covered by the network, i.e., to verify the spatial and temporal characteristics of HAPs within the city.
The procedure for siting the samplers to achieve this objective is based on judgmental sampling, as is the case for most ambient air monitoring networks. Judgmental sampling uses data from existing monitoring networks, knowledge of source emissions and population distribution, and inference from analyses of meteorology to select optimal sampler locations. In addition, a Geographic Information System (GIS) software package was used to help locate the samplers. Figures 10-1 and 10-2 illustrate the use of GIS for locating the samplers. Figure 10-1 shows that the highest population in the area is in the northwest and just southeast of the bay. Between these residential areas are the port facilities, power plants, and the majority of the industrial sources. This knowledge was used to locate the sampling areas. The exact locations are discussed in Section 10.4.2.
10.4.2 Sampling Locations
Toxa City is situated in two counties: Ffillsburg and Pine Lake. The boundaries were determined based on (1) the 1990 census data by census tract, (2) the boundaries of the existing MSAs, and (3) the surrounding geography. Figure 10-1 shows the population and major air toxics sources for the counties for which TCAPCD is responsible. According to the 1990 census, Ffillsburg County has a population of
834,054, while Pine Lake County has a population of 851,659. The population is evenly distributed throughout the MSA except in the downtown area (see Figure 10-1). As can be seen from Figure 10-1, the two counties surround a coastal bay.
[Map graphic: roadways, county boundaries, and population ranges by census tract]
Figure 10.1 Population distribution of Toxa City
[Map graphic: metals concentration ranges, county boundaries, population density (persons/sq mi), and a 20-mile scale bar]
Figure 10.2 Metals data and Population
Figure 10-2 illustrates the metals exposure, the population, the proposed air monitoring stations, and the major air toxics sources. As can be seen from this view, the areas that have the highest exposure are the districts at the northeastern end of the bay. This is where the major boat manufacturing activities exist. For metals, site TC2 will collect the highest concentrations. TC1, TC4, and TC5 will collect downwind levels and verify population exposure. As mentioned previously, the procedure for siting the samplers is based on the expertise of the monitoring staff with the help of the TCAPCD modelers. TCAPCD staff believe that five sites will be needed to adequately characterize the HAPs in the two counties. Two of the monitoring stations will be located in Pine Lake County and three in Ffillsburg County. Figure 10-2 shows a map of the proposed locations of the sites in relation to population and major air toxics release locations.
One site, TC1, will be the upwind/background site and will be located to quantify background concentrations. The siting of TC1 fulfills one of the DQOs for background concentrations. Site TC3 is located on the bay near the industrial center; this site satisfies the DQO for highest concentration. It is a middle scale monitoring station sited to capture maximum concentrations. Site TC2 is a neighborhood scale station and will host collocated monitoring. Sites TC4 and TC5 are downwind/suburban monitoring locations and are also neighborhood scale. The latitude/longitude coordinates for the five monitoring sites are listed in Table 10-2.
10.4.3 Sampling Frequency
The TCAPCD has set the frequency for the samplers to once every six days. Please see Table 11-1 for
details.
10.4.4 Collocated Sampling
According to the primary network design, Toxa City will deploy and operate one site (TC2) with a collocated TSP sampler. A second site, TC3, will have collocated PUF, aldehyde, and VOC samplers. According to 40 CFR Part 58, Appendix A, Section 3.5.2, for each method designation at least 25% (a minimum of one) of the samplers must be collocated. Although the 40 CFR 58 requirements do not directly apply to air toxics monitoring, the District will use them as guidelines for precision and bias. As a result, Toxa City will collocate at least one sampler of each type (an illustrative calculation follows Table 10-2). Based on the data collected in the Toxa City pilot study, the site most likely to monitor concentrations above the risk assessment benchmarks is assumed to be TC3. However, as data from the network become available, they will be reviewed on an annual basis to determine whether a different site is more appropriate for collocation. The collocated samplers will be operated on a 1 in 12 day sampling schedule, regardless of the sampling frequency of the primary samplers, and their run days will coincide with those of the primary samplers so that the primary and collocated samplers operate on the same days. See Table 10-2 for details on the location of primary and QA samplers.
Table 10.2 List of Collocated Samplers and Coordinates

Site Name   Samplers Operated      Collocated Samplers    Coordinates (Lat./Long.)
TC1         PUF, VOC               --                     27.89/-82.80
TC2         PUF, VOC, TSP          TSP                    28.12/-82.61
TC3         PUF, Aldehydes, VOC    Aldehydes, PUF, VOC    27.96/-82.39
TC4         PUF, VOC, TSP          --                     28.03/-82.16
TC5         VOC                    --                     27.71/-82.36
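The following sketch is illustrative only and is not part of the regulatory text; it applies the "at least 25%, minimum of one" convention cited above to hypothetical sampler counts (the counts and the function name are invented for the example).

# Illustrative only: collocated samplers required per method designation under the
# "at least 25%, minimum of one" convention cited from 40 CFR Part 58, Appendix A.
# The sampler counts below are hypothetical examples.
import math

def collocated_required(n_samplers: int) -> int:
    """At least 25% of the samplers of a given type, and never fewer than one."""
    return max(1, math.ceil(0.25 * n_samplers))

for method, count in {"TSP": 2, "PUF": 4, "VOC": 3, "Aldehyde": 1}.items():
    print(f"{method}: {count} samplers -> {collocated_required(count)} collocated")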
10.5 Classification of Measurements as Critical/Noncritical
All measurements should be classified as critical (i.e., required to achieve project objectives or limits on
decision errors, Step 6 of the DQO Process) or noncritical (for informational purposes only or needed to provide
background information). Critical measurements will undergo closer scrutiny during the data gathering and review
processes and will have first claim on limited budget resources. It is also possible to include the expected number of
samples to be tested by each procedure and the acceptance criteria for QC checks (as described in element B5,
"Quality Control Requirements").
The ambient concentration and site location data will be provided to AIRS. The information collected at collocated samplers is the same as that presented in Tables 6-1, 6-2, 6-3, and 6-4 for primary samplers. All of the measurements in these tables are considered critical because they form the basis for estimating bias and precision, which is critical for evaluating the ability of the decision makers to make decisions at the desired levels of confidence.
10.6 Validation of Any Non-Standard Measurements
For nonstandard sampling methods, sample matrices, or other unusual situations, appropriate method
validation study information may be needed to confirm the performance of the method for the particular matrix. The
purpose of this validation information is to assess the potential impact on the representativeness of the data
generated. For example, if qualitative data are needed from a modified method, rigorous validation may not be
necessary. Such validation studies may include round-robin studies performed by EPA or by other organizations.
If previous validation studies are not available, some level of single-user validation study or ruggedness study
should be performed during the project and included as part of the project's final report. This element of the QAPP
should clearly reference any available validation study information.
At this time there are no NAAQS for the air toxics compounds, with the exception of lead. Toxa City is deploying and operating instruments according to the descriptions in the applicable EPA guidance documents.
References
1. Network Design and Site Exposure Criteria for Selected Noncriteria Air Pollutants, EPA-450/4-84-022, September 1984.
11.0 Sampling Methods Requirements
Environmental samples should reflect the target population and parameters of interest. As with all other
considerations involving environmental measurements, sampling methods should be chosen with respect to the
intended application of the data. Just as methods of analysis vary in accordance with project needs, different sampling methods have different operational characteristics, such as cost, difficulty, and necessary equipment. In
addition, the sampling method can materially affect the representativeness, comparability, bias, and precision of
the final analytical result.
In the area of environmental sampling, there exists a great variety of sample types. It is beyond the scope of
this document to provide detailed advice for each sampling situation and sample type. Nevertheless, it is
possible to define certain common elements that are pertinent to many sampling situations (see EPA QA/G-5S).
If a separate sampling and analysis plan is required for the project, it should be included as an appendix to
the QAPP. The QAPP should simply refer to the appropriate portions of the sampling and analysis plan for the
pertinent information and not reiterate information.
11.1 Purpose/Background
The methods described herein provide for measurement of the concentrations of a number of hazardous air pollutants in ambient air over a 24-hour sampling period.
Since there are four separate instruments and, consequently, four separate analytical techniques, each of the sampling methods is different. General QA handling requirements are crucial for all sampling, so in that respect sample handling is similar across methods.
11.2 Sample Collection and Preparation
(1) Select and describe appropriate sampling methods from the appropriate compendia of methods. For each
parameter within each sampling situation, identify appropriate sampling methods from applicable EPA
regulations, compendia of methods, or other sources of methods that have been approved by EPA. When
EPA-sanctioned procedures are available, they will usually be selected. When EPA-sanctioned procedures
are not available, standard procedures from other organizations and disciplines may be used. In addition,
the QAPP should specify the type of sample to be collected (e.g., grab, composite, depth-integrated, flow-
weighted) together with the method of sample preservation.
(2) Discuss sampling methods' requirements. Each medium or contaminant matrix has its own characteristics
that define the method performance and the type of material to be sampled. Investigators should address the
following:
• choice of sampling method/collection;
• inclusion of all particles within the volume sampled; and
• correct subsampling to reduce the representative field sample into a representative laboratory aliquot.
(3) Describe the decontamination procedures and materials. Decontamination is primarily applicable in
situations of sample acquisition from solid, semi-solid, or liquid media, but it should be addressed, if
applicable, for continuous monitors as well. Conversely, if ppb-level detection is required, rigorous
decontamination or the use of disposable equipment is required.
Sample preparation is an essential portion of the ATMP. The following functions are required for
sample preparation:
TSP - filter receipt and inspection, filter numbering, conditioning and storage;
VOC - cleaning, testing, verification and storage of canisters;
SVOC - filter receipt and inspection, cleaning of filters, inspection, clean-up and certification of
PUF cartridges;
Aldehydes - receipt and storage of DNPH cartridges in the laboratory refrigerator.
Sample set-up of the air toxics samplers in the Toxa City network takes place any day after the previous sample has been recovered. For instance, for Sunday through Thursday run dates under the required 1 in 6 day sampling schedule, pickup occurs the day after the run; for Friday and Saturday run dates, pickup is on the following Monday. It is important to recognize that the only holding time that affects sample set-up is the 30-day window from the time a sample is pre-weighed/processed to the time it is installed in the monitor. At collocated sites, the second monitor will be set up to run at a sample frequency of 1 in 12 days; however, sample set-up will take place on the same day as for the primary sampler. Detailed sample set-up procedures are available in the Toxa City sampling methods standard operating procedures.
11.2.2 Sample Recovery
Sample recovery of any individual sample from the air toxics samplers in the Toxa City
network must occur within 72 hours of the end of the sample period for that sampler. For 1 in 6 day sampling, this will normally be the day after a sample is taken. The next sample would also be set up at this time. See Table 11.1.
Table 11.1 Sample Set-up, Run, and Recovery Dates (1 in 6 day sampling)

Week 1: Sunday - Sample Day 1; Monday - Recovery & Set-up; Saturday - Sample Day 3
Week 2: Monday - Recovery & Set-up; Friday - Sample Day 5
Week 3: Monday - Recovery & Set-up; Thursday - Sample Day 7; Friday - Recovery & Set-up
Week 4: Wednesday - Sample Day 9; Thursday - Recovery & Set-up
Week 5: Tuesday - Sample Day 11; Wednesday - Recovery & Set-up
Week 6: Monday - Sample Day 13; Tuesday - Recovery & Set-up
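To make the pattern in Table 11.1 easier to verify, the following sketch (illustrative only; the start date and the helper name are arbitrary assumptions) generates a 1 in 6 day run calendar starting on a Sunday and applies the recovery rules described above: pickup the day after the run, except that Friday and Saturday runs are picked up the following Monday.

# Illustrative sketch of the 1-in-6 run / recovery pattern shown in Table 11.1.
# The start date is arbitrary; only the weekday logic matters.
from datetime import date, timedelta

def recovery_date(run_day: date) -> date:
    """Day after the run, except Friday/Saturday runs are recovered the following Monday."""
    if run_day.weekday() == 4:       # Friday -> following Monday
        return run_day + timedelta(days=3)
    if run_day.weekday() == 5:       # Saturday -> following Monday
        return run_day + timedelta(days=2)
    return run_day + timedelta(days=1)

start = date(2001, 7, 1)             # a Sunday, chosen arbitrarily
for n in range(7):                   # seven consecutive 1-in-6 run days
    run = start + timedelta(days=6 * n)
    print(run.strftime("%a %Y-%m-%d"), "-> recovery", recovery_date(run).strftime("%a %Y-%m-%d"))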
11.3 Support Facilities for Sampling Methods
Support facilities vary widely in their analysis capabilities, from percentage-level accuracy to ppb-level
accuracy. The investigator must ascertain that the capabilities of the support facilities are commensurate with the
requirements of the sampling plan established in Step 7 of the DQO Process.
The main support facility for sampling is the sample trailer or shelter. At each sample location in the Toxa City network there is a climate-controlled sample trailer. The trailer has limited storage space for items used in support of air toxics sampling. Table 11.2 lists the supplies that are stored at each sample location trailer.
Table 11.2 Supplies at Storage Trailers

Item                              Minimum Quantity   Notes
Powder free gloves                1 box              Material must be inert and powder free
Fuses                             2                  Of the type specified in the sampler manual
Temperature standard              1                  In the range expected for this site and NIST traceable
Flow rate standard                1                  Calibrated from at least 15.0 LPM to 18.4 LPM and NIST traceable
Sampler operations manual         1 per model
Sampling SOPs                     1
Flow rate verification filter     2                  For TSP sampler
Tools                             1                  One tool kit with various wrenches, screwdrivers, etc.
Filter cassettes                  1                  For use with flow rate check filter or non-permeable membrane
Motor brushes                     1 set of 2         For TSP and PUF samplers
Various 1/8" and 1/4" fittings    1 box              For carbonyl and VOC samplers
Pumps                             1 box
Data download cable               1                  For use with laptop computer
Teflon end caps                   1 box              For capping the DNPH cartridges
Aluminum foil                     1 box              For carbonyl and PUF samplers
Ice chests                        2                  Spare ice chests for transporting samples
Since there are other items that the field operator may need during a site visit that are not expected to be
at each site, the operator is expected to bring these items with him/her.
11.4 Sampling/Measurement System Corrective Action
This section should address issues of responsibility for the quality of the data, the methods for making
changes and corrections, the criteria for deciding on a new sample location, and how these changes will be
documented. This section should describe what will be done if there are serious flaws with the implementation of
the sampling methodology and how these flaws will be corrected. For example, if part of the complete set of
samples is found to be inadmissible, how replacement samples will be obtained and how these new samples will
be integrated into the total set of data should be described.
Corrective action measures in the ATMP will be taken to ensure the data quality objectives are attained. There is the potential for many types of sampling and measurement system corrective actions. Table 11.3 details the expected problems and the corrective actions needed for a well-run network.
Table 11.3 Field Corrective Action

Item: Filter Inspection (Pre-sample)
  Problem: Pinhole(s) or torn filter.
  Action: 1) If additional filters have been brought, use one of them; void the filter with the pinhole or tear. 2) Use a new field blank filter as the sample filter. 3) Obtain a new filter from the lab.
  Notification: 1) Document on field data sheet. 2) Document on field data sheet. 3) Notify Field Manager.

Item: Filter Inspection (Post-sample)
  Problem: Torn filter, or particulate otherwise suspected of by-passing the 46.2 mm filter.
  Action: 1) Inspect the area downstream of where the filter rests in the sampler and determine if particulate has been by-passing the filter. 2) Inspect the in-line filter before the sample pump and determine if excessive loading has occurred; replace as necessary.
  Notification: 1) Document on field data sheet. 2) Document in log book.

Item: Flow Rate Erratic
  Problem: Heavy loading, or motor/motor brushes are worn.
  Action: Replace brushes or motor. Re-calibrate flow rate.
  Notification: Document in log book.

Item: Sample Flow Rate Verification
  Problem: Out of specification (± 10% of transfer standard).
  Action: 1) Completely remove the mass flow controller and perform a flow rate check. 2) Perform a leak test. 3) Check the flow rate at 3 points to determine whether the flow rate problem is with zero bias or slope. 4) Re-calibrate the flow rate.
  Notification: 1) Document on data sheet. 2) Document on data sheet. 3) Document on data sheet; notify Field Manager. 4) Document on data sheet; notify Field Manager.

Item: Leak Test
  Problem: VOC canisters will not hold pressure.
  Action: 1) Replace the fitting or nut on the sampler line. 2) Inspect connections to the mass flow controller and re-perform the leak test.
  Notification: 1) Document in log book. 2) Document in log book, notify Field Manager, and flag data since the last successful leak test.

Item: Sample Flow Rate
  Problem: Consistently low flows documented during the sample run.
  Action: 1) Check the programming of the sampler flow rate of the VOC/carbonyl sampler. 2) Check the flow with a flow rate verification filter and determine if the actual flow is low. 3) Inspect the in-line filter and PUF cartridge downstream of the filter location; replace as necessary.
  Notification: 1) Document in log book. 2) Document in log book. 3) Document in log book.

Item: Ambient Temperature Verification and Filter Temperature Verification
  Problem: Out of specification (± 1° C of standard).
  Action: 1) Make certain thermocouples are immersed in the same liquid at the same point without touching the sides or bottom of the container. 2) Use an ice bath or warm water bath to check a different temperature; if acceptable, re-perform the ambient temperature verification. 3) Connect a new thermocouple. 4) Check the ambient temperature with another NIST-traceable thermometer.
  Notification: 1) Document on data sheet. 2) Document on data sheet. 3) Document on data sheet; notify Field Manager. 4) Document on data sheet; notify Field Manager.

Item: Ambient Pressure Verification
  Problem: Out of specification (± 10 mm Hg).
  Action: 1) Make certain the pressure sensors are each exposed to the ambient air and are not in direct sunlight. 2) Call the local airport or another source of ambient pressure data and compare that pressure to the pressure data from the monitor's sensor; a pressure correction may be required. 3) Connect a new pressure sensor.
  Notification: 1) Document on data sheet. 2) Document on data sheet. 3) Document on data sheet; notify Field Manager.

Item: Elapsed Sample Time
  Problem: Out of specification (± 10 min/day).
  Action: Check programming; verify power outages.
  Notification: Notify Field Manager.

Item: Elapsed Sample Time
  Problem: Sample did not run.
  Action: 1) Check programming. 2) Try programming a sample run to start while the operator is at the site; use a flow verification filter.
  Notification: 1) Document on data sheet; notify Field Manager. 2) Document in log book; notify Field Manager.

Item: Power
  Problem: Power interruptions.
  Action: Check line voltage.
  Notification: Notify Field Manager.

Item: Power
  Problem: LCD panel on, but sampler not working.
  Action: Check the circuit breaker; some of the VOC and carbonyl samplers have battery back-up for data but will not operate without AC power.
  Notification: Document in log book.

Item: Data Downloading
  Problem: Data will not transfer to the laptop computer, or there is no printout from the carbonyl/VOC samplers.
  Action: Document key information on the sample data sheet. Make certain the problem is resolved before data are written over in the sampler microprocessor.
  Notification: Notify Field Manager.
In addition to these corrective actions, the samplers will also be calibrated when installed, after any major repairs, or when an audit flow rate check shows that the sampler is outside of ±10% relative to the audit flow value.
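The ±10% audit criterion above can be expressed as a simple check; the sketch below is illustrative only, and the function name and example flow values are invented.

# Illustrative check of the +/-10% audit flow criterion described above.
# Example values are invented; the acceptance limit is the one stated in the QAPP.
def needs_recalibration(sampler_flow_lpm: float, audit_flow_lpm: float, limit: float = 0.10) -> bool:
    """True if the sampler flow differs from the audit flow by more than the limit (fraction)."""
    return abs(sampler_flow_lpm - audit_flow_lpm) / audit_flow_lpm > limit

print(needs_recalibration(16.0, 16.7))   # about 4% difference -> False
print(needs_recalibration(14.5, 16.7))   # about 13% difference -> True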
11.5 Sampling Equipment, Preservation, and Holding Time Requirements
This section includes the requirements needed to prevent sample contamination (disposable samplers or
samplers capable of appropriate decontamination), the physical volume of the material to be collected (the size of
composite samples, core material, or the volume of water needed for analysis), the protection of physical
specimens to prevent contamination from outside sources, the temperature preservation requirements, and the
permissible holding times to ensure against degradation of sample integrity.
This section details the requirements needed to prevent sample contamination, the volume of air to be
sampled, how to protect the sample from contamination, temperature preservation requirements, and the
permissible holding times to ensure against degradation of sample integrity.
11.5.1 Sample Contamination Prevention
The quality system has rigid requirements for preventing sample contamination. Powder free gloves are worn while handling filter cassettes, PUF cartridges, and DNPH cartridges. Filters and cartridges are to be held in storage containers (static-resistant zip lock bags), as provided by the sampler manufacturer, during transport to and from the laboratory. Once samples have been analyzed, they are stored in static-resistant zip lock bags.
11.5.2 Sample Volume
The volume of air to be sampled is specified in the manufacturers' and method specifications. The different methods specify that certain minimum volumes must be collected. Samples are expected to run for 24 hours; therefore, the site operators must set the flow rates to collect sufficient sample to obtain the minimum sample volume. In some cases a shorter sample period may occur due to power outages. A valid sample run should not be less than 23 hours. If the sample period is less than 23 hours or greater than 25 hours, the sample will be flagged and the Branch Manager notified.
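As a hedged illustration of the 23 to 25 hour validity window just described, the sketch below also converts an average flow rate into a total sampled volume; the flow rate used and the helper names are placeholders, not method requirements.

# Illustrative sketch of the 23-25 hour validity window described above.
# The flow rate value and helper names are placeholders, not method values.
def run_is_valid(hours: float) -> bool:
    """A sample period shorter than 23 h or longer than 25 h is flagged."""
    return 23.0 <= hours <= 25.0

def sampled_volume_m3(flow_lpm: float, hours: float) -> float:
    """Total volume in cubic meters from an average flow in liters per minute."""
    return flow_lpm * 60.0 * hours / 1000.0

hours = 22.4                      # e.g., a run shortened by a power outage
print("valid run:", run_is_valid(hours))
print("volume (m3):", round(sampled_volume_m3(16.7, hours), 1))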
11.5.3 Temperature Preservation Requirements
The temperature requirements of the samples vary between methods. During transport from the laboratory to the sample location there are no specific requirements for temperature control, with the exception of the DNPH cartridges. Filters will be kept in their protective containers and in the transport container. Excessive heat must be avoided (e.g., do not leave samples in direct sunlight or in a closed car during summer). DNPH cartridges need to be stored at 4° C until they are loaded into the sampler. The filter temperature requirements are detailed in Table 11.4.
Table 11.4 Temperature Requirements

TSP filter, temperature control during sampling and until recovery: no requirements.
DNPH cartridge, temperature control pre- and post-sampling: 4° C or less (TO-11A Compendium, Section 9.4.3).
VOC canister, pre- and post-sampling: no requirements.
PUF cartridge and filter: 4° C or less (TO-13A, Section 6.2.7).
11.5.4 Permissible Holding Times
The permissible holding times for the samples are clearly detailed in the attached appendices. These holding times are provided in Table 11-5.
Table 11-5 Holding Times

TSP filter: no limits.
VOC canister: <30 days, from completion of the sample period to the time of analysis (TO-15 Compendium, Section 9.4.2.1).
PUF cartridge and filter: <24 hours (ideally), or 20 days if refrigerated, from the time of recovery to the time placed in the conditioning room (TO-13 Compendium, Section 11.3.19).
DNPH cartridge: <30 days, from the sample end date/time to the date of post-weigh (TO-11 Compendium, Section 11.1.1).
12.0 Sampling Custody
This element of the QAPP should clearly describe all procedures that are necessary for ensuring that:
1. samples are collected, transferred, stored, and analyzed by authorized personnel;
2. sample integrity is maintained during all phases of sample handling and analyses; and
3. an accurate written record is maintained of sample handling and treatment from the time of its
collection through laboratory procedures to disposal.
Proper sample custody minimizes accidents by assigning responsibility for all stages of sample handling and
ensures that problems will be detected and documented if they occur. A sample is in custody if it is in actual
physical possession or it is in a secured area that is restricted to authorized personnel. The level of custody
necessary is dependent upon the project's DQOs. While enforcement actions necessitate stringent custody
procedures, custody in other types of situations (i.e., academic research) may be primarily concerned only with the
tracking of sample collection, handling, and analysis.
Sample custody procedures are necessary to prove that the sample data correspond to the sample collected, if
data are intended to be legally defensible in court as evidence. In a number of situations, a complete, detailed,
unbroken chain of custody will allow the documentation and data to substitute for the physical evidence of the
samples (which are often hazardous waste) in a civil courtroom.
An outline of the scope of sample custody—starting from the planning of sample collection, field sampling,
sample analysis to sample disposal-should also be included. This discussion should further stress the
completion of sample custody procedures, which include the transfer of sample custody from field personnel to
lab, sample custody within the analytical lab during sample preparation and analysis, and data storage.
Figures 12.1 - 12.4 represent chain of custody forms that will be used to track the stages of filter handling throughout the data collection operation. Although entries on these forms will be made by hand, the information will be entered into a sample tracking system, where an electronic record will be kept.
This section will address sample custody procedures at the following stages:
Pre-sampling
Post-sampling
Sample receipt
Sample archive
DNPH Cartridge Chain of Custody Record

Pre-Sampling Cartridge Selection
  Site Operator Initial | Cart. ID  | Receipt Date | Monitor ID      | Install Date | Temp. Storage | Comments
  BLM                   | D990101   | 99/01/01     | 060021125811041 | 99/01/03     | 4 C           |
  BLM                   | DC990101  | 99/01/01     | 060021125811041 | 99/01/03     | 4 C           |

Post-Sampling Recovery
  Site Operator Final   | Cart. ID  | Monitor ID      | Removal Date | Removal Time | Comments
  BLM                   | D990101   | 060021125811041 | 99/01/03     | 0900         |
  BLM                   | DC990101  | 060021125811041 | 99/01/03     | 0900         |
  Free Form Notes -

Receipt (Box 1 Max Temp / Min Temp; Box 2 Max Temp / Min Temp)
  Receiver ID | Filter ID | Date Received | Receipt Time | Shipping Integrity Flags | Archived | Sent to Lab
  SBM         | D990101   | 99/01/04      | 1030         | GSI                      |          | X
  SBM         | DC990101  | 99/01/04      | 1030         | GSI                      |          | X
  Free Form Notes -

Transfer
  Relinquished by: SBM   Date/Time: 99/01/04 / 1130     Received by: FIN   Date/Time: 99/01/04 / 1130

Figure 12.1 Example DNPH Cartridge chain of custody record
VOC Canister Chain of Custody Record

Pre-Sampling Canister Selection
  Site Operator Initial | Can. ID  | Receipt Date | Monitor ID      | Install Date | Comments
  BLM                   | V990101  | 99/01/01     | 060021125811041 | 1/1/00       |

Post-Sampling Canister Recovery
  Site Operator Final   | Can. ID  | Monitor ID      | Removal Date | Removal Time | Comments
  BLM                   | V990101  | 060021125811041 | 99/01/02     | 0900         |
  Free Form Notes -

Canister Receipt
  Receiver ID | Can ID   | Date Received | Receipt Time | Shipping Integrity Flags | Sent to Lab
  SBM         | V990101  | 99/01/04      | 1030         | GSI                      | X
  Free Form Notes -

Figure 12.2 Example VOC Canister chain of custody record
PUF Cartridge Chain of Custody Record

Pre-Sampling Cartridge Selection
  Site Operator Initial | Cart. ID   | Receipt Date | Monitor ID      | Install Date | Comments
  BLM                   | P990101    | 99/01/01     | 060021125811041 | 99/01/03     |
  BLM                   | PFB990101  | 99/01/01     | 060021125811041 | 99/01/03     |

Post-Sampling Recovery
  Site Operator Final   | Cart. ID   | Monitor ID      | Removal Date | Removal Time | Comments
  BLM                   | P990101    | 060021125811041 | 99/01/03     | 0900         |
  BLM                   | PFB990101  | 060021125811041 | 99/01/03     | 0900         |
  Free Form Notes -

Receipt (Box 1 Max Temp / Min Temp; Box 2 Max Temp / Min Temp)
  Receiver ID | Filter ID  | Date Received | Receipt Time | Shipping Integrity Flags | Temp of Sample | Sent to Lab
  SBM         | P990101    | 99/01/04      | 1030         | GSI                      |                | X
  SBM         | PFB990101  | 99/01/04      | 1030         | GSI                      |                | X
  Free Form Notes -

Transfer
  Relinquished by: SBM   Date/Time: 99/01/04 / 1130     Received by: FIN   Date/Time: 99/01/04 / 1130

Figure 12.3 Example PUF Cartridge chain of custody record
TSP Filter Chain of Custody Record

Pre-Sampling Filter Selection
  Site Operator Initial | Filter ID | Cont. ID | Receipt Date | Monitor ID      | Sampler ID | Installation Date | Comments
  BLM                   | M990101   | MC001    | 99/01/01     | 060021125811041 | AD001      | 99/01/01          |

Post-Sampling Filter Recovery
  Site Operator Final   | Filter ID | Cont. ID | Monitor ID      | Sampler ID | Removal Date | Removal Time | Field Qualifiers
  BLM                   | M990101   | MC001    | 060021125811041 | AD001      | 99/01/03     | 0900         |
  Free Form Notes -

Filter Receipt
  Receiver ID | Filter ID | Cont. ID | Date Received | Receipt Time | Shipping Integrity Flags | Archived | Sent to Lab
  SBM         | M990101   | MC001    | 99/01/04      | 1030         | GSI                      |          | X
  Free Form Notes -

Figure 12.4 Example of the TSP/Metal Chain of Custody Form
Archiving Tracking Form
  Sample ID  | Sample Type | Analysis Date | Archive Date | Box ID/Box #       | Archived By | Comments
  MC990101   | TSP         | 99/07/05      | 99/07/06     | 060021125811041/1  | FIN         |
  PFB990101  | PUF         | 99/07/05      | 99/07/06     | 060021125811041/1  | FIN         |

Figure 12.5 General archive form
12.1 Sample Custody Procedure
One of the most important values in the sample custody procedure is the unique sample ID number, illustrated in Figures 12.1 - 12.4. The ID is an alphanumeric value. The alpha values identify the type of sample (V, P, D, or M), a field blank (FB), a lab blank (LB), or a collocated sample (C). The next two values (YY) represent the last two digits of the calendar year, and the next 4 digits represent a unique date (MMDD). Therefore, for 1998 the first routine filter will be numbered M980101 for a metals filter, and the collocated sample will be MC980101. The field blank for the same day would be labeled MFB980101. The filter ID will be generated by the laboratory analyst at the time of preparation of the sample.
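A small illustrative sketch of the ID convention just described follows; the function names are invented, and the only convention encoded is the one stated in this section (sample-type letter, optional C/FB/LB qualifier, two-digit year, and MMDD date).

# Illustrative sketch of the sample ID convention described in Section 12.1.
# Function names are invented; the convention itself is the one described above.
from datetime import date

TYPE_CODES = {"VOC": "V", "PUF": "P", "DNPH": "D", "Metals": "M"}

def build_sample_id(sample_type: str, sample_date: date, qualifier: str = "") -> str:
    """qualifier is '' for routine, 'C' for collocated, 'FB' for field blank, 'LB' for lab blank."""
    return f"{TYPE_CODES[sample_type]}{qualifier}{sample_date:%y%m%d}"

print(build_sample_id("Metals", date(1998, 1, 1)))          # M980101
print(build_sample_id("Metals", date(1998, 1, 1), "C"))     # MC980101
print(build_sample_id("Metals", date(1998, 1, 1), "FB"))    # MFB980101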
12.1.1 Pre-Sampling Custody
The District's pre-sampling SOPs define how the samples will be enumerated, conditioned, weighed, placed into the protective shipping container, sealed with tape, and stored or refrigerated. See Table 11-5 for details on sample holding times. The Inventory Sheets containing the ID, sample type, container ID, and pre-sampling date will be attached to the field shelf for use by the site operator. Each sampling period, the site operators will select the samples that they will use in the field. The number selected will depend on the time in the field prior to returning to the laboratory and the
number of samplers to be serviced. The site operator will perform the following Pre-sampling activities:
1. Contact Mr. Arcemont or Ms. Killion for access to laboratory.
2. Put on appropriate laboratory attire.
3. Enter the filter storage area.
4. Review the Inventory Sheet and select the next set of samples on the sheet. Ensure the seals are intact. Since the site operator cannot check the sample ID, he or she will have to use the container ID value.
5. Take the Chain of Custody Records for each site to be visited. Fill out the first 4 columns of the "Pre-Sampling Selection" portion of the Chain of Custody Record (Figures 12.1 - 12.4) for each sample.
6. Initial the column "Site Operator" on the Inventory Sheets to signify selection of the filters.
7. Pack samples in sample coolers for travel to the field.
Upon arrival at a site:
8. Select the appropriate samples.
9. Once the samples are installed at the site, complete the remainder of the columns of the "Pre-Sampling Selection" portion of the Chain of Custody Records (Figures 12.1 - 12.4).
12.1.2 Post Sampling Custody
The field sampling SOPs specify the techniques for properly collecting and handling the sample filters.
Upon visiting the site:
1. Select the appropriate Chain of Custody Records. Ensure that the sample IDs are correct.
2. Remove the sample. Refer to Appendices A-D for explicit details on unloading samples. Briefly examine the sample, place it into the protective container per the SOPs, and seal it with tape.
3. Place the protective container(s) into the shipping/transport container with the appropriate temperature control devices.
4. Record the "Post-Sampling Filter Recovery" information on the Filter Chain of Custody Record.
12.1.3 Sample Receipt
The samples, whether transported by the site operator or by next-day air, will be received by either Janet Hoppert or David Bush at the Shipping/Receiving Office. The Shipping/Receiving Office will:
1. Receive the shipping/transport container(s).
2. Upon receipt, open the container(s) to find the Filter Chain of Custody Record(s), or collect the originals from the site operator (if delivered by the operator).
3. Fill out the "Filter Receipt" area of the Filter Chain of Custody Record(s). Check the sample container seals.
4. If the samples are delivered on a weekday, follow step 5; if the sample(s) are delivered on a weekend, follow step 6.
5. Check the "Sent to Laboratory" column of the Filter Chain of Custody Records(s) and
transport the filters to the appropriate laboratory room . Upon delivery to the laboratory,
complete the "Filter Transfer" area of the Filter Chain of Custody Records(s).
6. Store the samples in the refrigerator and check the "archived" column of the Filter Chain of
Custody Records(s). On the Monday of the following week, deliver the archived filters to the
laboratory and complete the "Filter Transfer" area of the Filter Chain of Custody Records(s).
12.1.4 Sample Archive
Once the analysis laboratory receives the filters, the laboratory staff will use their raw data entry sheets to log the samples back in from receiving and prepare them for post-sampling weighing activities. These activities are included in the analytical SOPs. The laboratory technicians will take the filters out of the protective containers or folders and examine them for integrity, which will be noted on the data entry sheets. During all post-sampling activities, filter custody will be the responsibility of Mr. Arcemont. The samples will be stored within the laboratory freezer. Access to the laboratory is restricted to Ms. Killion and Mr. Arcemont.
Upon completion of post-sampling weighing activities, the archiving tracking form (Figure 12.5) will be used by the laboratory technicians to archive the filter. Each filter will be packaged according to the SOPs and stored in a box uniquely identified by site ID and box number. Samples will be archived in the filter storage facility for one year past the date of collection.
13.0 Analytical Methods Requirements
The choice of analytical methods will be influenced by the performance criteria, Data Quality
Objectives, and possible regulatory criteria. Qualification requirements may range from functional group
contaminant identification only to complete individual contaminant specification. If appropriate, a citation of
analytical procedures may be sufficient if the analytical method is a complete SOP, such as one of the
Contract Lab Program Statements of Work. In other situations, complete step-wise analytical and/or sample
preparation procedures will need to be attached to the QAPP if the procedure is unique or an adaptation of a
"standard" method.
Specific monitoring methods and requirements to demonstrate compliance traditionally were specified
in the applicable regulations and/or permits. However, this approach is being replaced by the Performance-
Based Measurement System (PBMS). PBMS is a process in which data quality needs, mandates, or
limitations of a program or project are specified and serve as a criterion for selecting appropriate methods.
The regulated body selects the most cost-effective methods that meet the criteria specified in the PBMS.
Under the PBMS framework, the performance of the method employed is emphasized rather than the specific
technique or procedure used in the analysis. Equally stressed in this system is the requirement that the
performance of the method be documented and certified by the laboratory that appropriate QA/QC
procedures have been conducted to verify the performance. PBMS applies to physical, chemical, and
biological techniques of analysis performed in the field as well as in the laboratory. PBMS does not apply
to the method-defined parameters.
The QAPP should also address the issue of the quality of analytical data as indicated by the data's
ability to meet the QC acceptance criteria. This section should describe what should be done if the
calibration check samples exceed the control limits due to mechanical failure of the instrumentation, a drift in
the calibration curve occurs, or if a reagent blank indicates contamination. This section should also indicate
the authorities responsible for the quality of the data, the protocols for making changes and implementing
corrective actions, and the methods for reporting the data and its limitations.
Laboratory contamination from the processing of hazardous materials such as toxic or radioactive
samples for analysis and their ultimate disposal should be considered during the planning stages for
selection of analysis methods. Safe handling requirements for project samples in the laboratory with
appropriate decontamination and waste disposal procedures should also be described.
13.1 Purpose/Background
The methods stated here provide for gravimetric, spectrophotometric, and chromatographic analyses of samples collected in the Toxa City network. The basic methods used by the agency are based on the Toxic Organic and Inorganic Compendia1,2,3,4. These are listed in the References at the end of this section.
13.2 Preparation of Samples
Preparation procedures should be described and standard methods cited and used where possible.
Step-by-step operating procedures for the preparation of the project samples should be listed in an
appendix. The sampling containers, methods of preservation, holding times, holding conditions, number
and types of all QA/QC samples to be collected, percent recovery, and names of the laboratories that will
perform the analyses need to be specifically referenced.
The Toxa City network consists of 5 sites. The primary samplers will operate on a 1 in 6 day schedule, and the collocated samplers on a 1 in 12 day schedule. Therefore, the approximate number of routine samples that have to be prepared, used, transported, and conditioned is 24 per week. In addition, field blanks and lab blanks must also be prepared. See the attached SOPs for activities associated with preparing pre-sample batches.
Upon delivery of approved sample media for use in the Toxa City network, the receipt is documented and the pre-sampling media are stored in the conditioning room/laboratory. Storing the media in the laboratory makes it easier to maximize the amount of time available for conditioning. Upon receipt, cases of media will be labeled with the date of receipt, opened one at a time, and used completely before another case is opened. In the case of canisters, each canister will be cleaned according to the cleaning procedures in Appendix D. DNPH cartridges will be stored in a refrigerator until taken into the field. All TSP filters in a lot will be used before a case containing another lot is opened. When more than one case is available to open, the "First In - First Out" rule will apply; that is, the first case of filters received is the first case that will be used.
13.3 Analysis Method
The citation of an analytical method may not always be sufficient to fully characterize a method because the
analysis of a sample may require deviation from a standard method and selection from the range of options in the
method. The SOP for each analytical method should be cited or attached to the QAPP, and all deviations or
alternative selections should be detailed in the QAPP.
Often the selected analytical methods may be presented conveniently in one or several tables describing the
matrix, the analytes to be measured, the analysis methods, the type, the precision/accuracy data, the performance
acceptance criteria, the calibration criteria, etc.
13.3.1 Analytical Equipment and Method
The instruments used for analysis are listed in Table 13.1.
Table 13.1 Instruments Used in the Toxa City Laboratory

Parameter   Instrument    Method                                  Range
Metals      AnTech 3000   Inductively Coupled Plasma              0.01 to 50 ug/m3
Aldehydes   AnTech 3001   High Pressure Liquid Chromatography     0.01 to 25 ppbv
VOCs        AnTech 3001   Gas Chromatography                      0.001 to 100 ppbv
SVOCs       AnTech 3001   Gas Chromatography/Mass Spectrometry    0.01 to 50 ppbv
13.3.2 Environmental Control
The Toxa City TSP weigh room facility is an environmentally controlled room with temperature and humidity control. Temperature is controlled within the range of 20 - 30° C, and humidity is controlled at 30 - 40% relative humidity. Temperature and relative humidity are measured and recorded continuously during equilibration. The balance is located on a vibration-free table and is protected from, or located out of the path of, any sources of drafts. Filters are conditioned before both the pre- and post-sampling weighings; they must be conditioned for at least 24 hours to allow their weights to stabilize before being weighed. The areas used for preparation of the canister and PUF samples are clean laboratory benches in the main part of the lab. These areas are cleaned periodically to eliminate contamination of samples, which is particularly important for the PUF samples: small amounts of contaminants present in the laboratory atmosphere can contaminate them, so great care is exercised to keep the lab atmosphere free of SVOCs. Lab blanks for PUFs are run once every 10 samples. DNPH cartridges must be stored at 4° C until they are extracted and analyzed.
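An illustrative check of the weigh-room criteria just stated follows; the 20 - 30° C and 30 - 40% RH windows are taken from this section, while the function name and example readings are invented.

# Illustrative check of the weigh-room criteria stated above (20-30 C, 30-40% RH).
# Function name and example readings are invented for the sketch.
def weigh_room_ok(temp_c: float, rh_percent: float) -> bool:
    """True if temperature and relative humidity are within the stated control ranges."""
    return 20.0 <= temp_c <= 30.0 and 30.0 <= rh_percent <= 40.0

readings = [(22.5, 35.0), (24.0, 42.0), (31.0, 33.0)]
for temp, rh in readings:
    print(f"{temp} C, {rh}% RH -> {'within limits' if weigh_room_ok(temp, rh) else 'out of limits'}")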
13.4 Internal QC and Corrective Action for Measurement System
A QC notebook or database (with disk backups) will be maintained that will contain QC data, including calibrations, maintenance information, routine internal QC checks of mass reference standards and of laboratory and field blanks, and external QA audits. It is a requirement that QC charts be maintained for each instrument and included in its maintenance notebook. These charts may allow the discovery of excess drift that could signal an instrument malfunction.
At the beginning of each analysis day, after the analyst has completed zeroing and calibrating the instruments and measuring the working standard, the analyst will analyze the laboratory blanks established for the current set of samples to be analyzed.
Corrective action measures in the system will be taken to ensure good quality data. There is the
potential for many types of sampling and measurement system corrective actions. Each of the SOPs
outline exact actions that will be taken if the analytical systems are out of control.
13.5 Sample Contamination Prevention, Preservation, and Holding
13.5.1 Sample Contamination Prevention
The analytical support component of the network has rigid requirements for preventing sample contamination. To minimize contamination, the sample media clean-up and sample preparation rooms are separate from the instrumentation rooms. In addition, the heating and ventilation (HVAC) system is checked annually by certified technicians. Hoods are also checked annually. TSP filters are equilibrated/conditioned and stored in the same room where they are weighed. Powder free gloves are worn while handling filters, and filters are handled only with smooth, non-serrated forceps. Upon determination of its pre-sampling weight, each filter is placed in its filter holding jacket for storage. For the VOC analytical method, the best prevention of contamination is not opening the canister in the
laboratory. All post-sampling canisters that enter the laboratory should be under a pressure of 12 - 14 psig. With positive pressure, it is less likely that the sample will be contaminated. However, care must be taken when canisters are under vacuum and stored in the laboratory: if there is a slight leak in the canister cap or valve, laboratory air can enter the canister and contaminate the run.
For DNPH cartridges, the best prevention is to not take the cartridges out of the sealed shipping
packet until they are loaded into the sampler in the field. TCAPCD purchases the cartridges from a
chemical supply house with the DNPH coating already applied. Upon receipt and log-in, the cartridges
are immediately stored in a refrigerator within the sealed package. The field technicians remove the
cartridges (still in the sealed Mylar package) from the refrigerator and log-out the samples. The
samples are then refrigerated at the field monitoring site. When the technician loads the samples into the
aldehyde sampler, the DNPH cartridges are removed from their Mylar package and installed.
Semi-volatile organic compound (SVOC) contamination is the most difficult of all the air toxics to prevent. When SVOC samples are refluxed, small quantities of SVOCs can become volatilized in the laboratory. Therefore, it is very important to have a properly operating HVAC system in the lab. The HEPA filter in the HVAC system is changed monthly to avoid contamination of laboratory air. In addition, good laboratory practice is followed to avoid contamination of samples upon receipt.
13.5.2 Temperature Preservation Requirements
The temperature requirements for the laboratory and field situations are detailed in the IO and TO methods. In the weigh room laboratory, the TSP filters must be conditioned for a minimum of 24 hours prior to pre-weighing, although a longer period of conditioning may be required. The weigh room laboratory temperature must be maintained between 20 and 30° C, with no more than a ±5° C change over the 24-hour period prior to weighing the filters. During transport from the weigh room to the sample location, there are no specific requirements for temperature control; however, the filters will be kept in their protective containers and excessive heat avoided.
The specific temperature preservation requirements for VOC canisters, SVOC media, and DNPH cartridges are clearly detailed in the TO and IO methods1,2,3,4. These requirements pertain to the sample media before collection and to both the media and the sample after collection. Additionally, there are requirements for temperature control during sample collection; these are listed in Table 11.4.
13.5.4 Permissible Holding Times
The permissible holding times for the samples are clearly detailed in the TO and IO Compendia1,2,3,4. See Table 11-5.
References
1. Compendium Method for the Determination of Inorganic Compounds in Air, United States Environmental
Protection Agency, June 1999, Section IO-3.
2. Compendium Method for the Determination of Toxic Organic Compounds in Air, United States Environmental
Protection Agency, Section TO-11 A, January 1999
3. Compendium Method for the Determination of Toxic Organic Compounds in Air, United States Environmental Protection Agency, Section TO-14A, January 1999
4. Compendium Method for the Determination of Toxic Organic Compounds in Air, United States Environmental
Protection Agency, Section TO-13A, January 1999
14.0 Quality Control Requirements
QC is "the overall system of technical activities that measures the attributes and performance of a process,
item, or service against defined standards to verify that they meet the stated requirements established by the
customer." QC is both corrective and proactive in establishing techniques to prevent the generation of
unacceptable data, and so the policy for corrective action should be outlined. This element will rely on information
developed in section 7, "Quality Objectives and Criteria for Measurement Data," which establishes measurement
performance criteria.
To assure the quality of data from air monitoring measurements, two distinct and important interrelated
functions must be performed. One function is the control of the measurement process through broad
quality assurance activities, such as establishing policies and procedures, developing data quality
objectives, assigning roles and responsibilities,
conducting oversight and reviews, and implementing corrective actions. The other function is the control of the measurement process through the implementation of specific quality control procedures, such as audits, calibrations, checks, replicates, routine self-assessments, etc. Figure 14.1 (quality control and quality assessment activities) summarizes these activities: internal quality control measures such as training, technical competence of analysts, good laboratory practices (GLP), good measurement practices (GMP), standard operating procedures (SOPs), proper facilities and instrumentation, proper documentation, internal standard reference materials, replicate measurements, internal on-going inspections, quality control charts, and interchange of analysts and instruments; and external quality assessments such as external standard reference materials (NPAP), technical systems audits, interlaboratory comparisons, DQO/MQO assessments, and network reviews. In general, the greater the
control of a given monitoring system, the better will be the resulting quality of the monitoring data.
Quality Control (QC) is the overall system of technical activities that measures the attributes and
performance of a process. In the case of the ATMP, QC activities are used to ensure that
measurement uncertainty, as discussed in Section 7, is maintained within acceptance criteria for the
attainment of the DQO. Figure 14.1 represents a number of QC activities that help to evaluate and
control data quality for the program. Many of the activities in this figure are implemented by the Air
Division and are discussed in the appropriate sections of this QAPP.
14.1 QC Procedures
This element will need to furnish information on any QC checks not defined in other QAPP elements and
should reference other elements that contain this information where possible.
Many of these QC checks result in measurement data that are used to compute statistical indicators of
data quality. For example, a series of dilute solutions may be measured repeatedly to produce an estimate of
the instrument detection limit. The formulas for calculating such Data Quality Indicators (DQIs) should be
provided or referenced in the text. This element should also prescribe any limits that define acceptable data
quality for these indicators (see also Appendix D, "Data Quality Indicators"). A QC checklist should be used
to discuss the relation of QC to the overall project objectives with respect to:
• the frequency of the check and the point in the measurement process in which the check sample is introduced,
• the traceability of the standards,
• the matrix of the check sample,
• the level or concentration of the analyte of interest,
• the actions to be taken in the event that a QC check identifies a failed or changed measurement system,
• the formulas for estimating DQIs, and
• the procedures for documenting QC results, including control charts.
Finally, this element should describe how the QC check data will be used to determine that measurement
performance is acceptable. This step can be accomplished by establishing QC "warning" and "control" limits
for the statistical data generated by the QC checks (see standard QC textbooks or refer to EPA QA/G-5T for
operational details).
Day-to-day quality control is implemented through the use of various check samples or instruments that
are used for comparison. The measurement quality objectives table in Section 7 contains a complete
listing of these QC samples as well as other requirements for the program. The procedures for collecting and analyzing the target compounds are included in the field and analytical methods sections (Sections 11 and 13, respectively). The following information provides some additional descriptions of these QC
activities, how they will be used in the evaluation process, and what corrective actions will be taken
when they do not meet acceptance criteria.
14.1.1 Calibrations
Calibration is the comparison of a measurement standard or instrument with another standard or
instrument to report, or eliminate by adjustment, any variation (deviation) in the accuracy of the item
being compared. The purpose of calibration is to minimize bias.
Calibration activities for air toxics samplers follow a two step process:
1. Certifying the calibration standard and/or transfer standard against an authoritative standard,
and
2. Comparing the calibration standard and/or transfer standard against the routine sampling/analytical instruments.
Calibration requirements for the critical field and laboratory equipment are found in the respective
SOPs.
14.1.2 Blanks
Blank samples are used to determine contamination arising principally from four sources: the
environment from which the sample was collected/analyzed, the reagents used in the analysis, the
apparatus used, and the operator/analyst performing the analysis. Three types of blanks will be
implemented in the air toxics program:
Lot blanks - shipments of 8 x 11 inch filters will be periodically sent from the vendor to TCAPCD.
Each shipment must be tested to determine the length of time it takes the filters to stabilize. Upon
arrival of each shipment, 3 lot blanks will be randomly selected from the shipment and subjected to the conditioning/pre-sampling weighing procedures. The blanks will be measured every 24 hours for a minimum of one week to determine the length of time it takes the filters to reach a stable weight.
Field blanks - provide an estimate of total measurement system contamination. By comparing
information from laboratory blanks against the field blanks, one can assess contamination from field
activities. Details of the use of the field blanks can be found in field SOPs. Field blanks will be utilized
for the aldehydes, metals and SVOCs. Field blanks cannot be utilized with the VOC canisters since
they arrive in the field under vacuum.
Lab blanks - provide an estimate of contamination occurring at the weighing/analysis facility. Details of the use of the lab blanks can be found in the SOPs. Lab blanks will be utilized for the aldehydes, metals, VOCs and SVOCs. Lab blanks for VOCs are generated by the canister cleaning
system.
Blank Evaluation
The laboratory will include 3 field and 3 lab blanks in each analysis batch. A batch is defined in section
14.2. The following statistics will be generated for data evaluation purposes:
Difference for a single check (d_i) - The difference, d_i, for each check is calculated using Equation 1, where X_i represents the concentration produced from the original weight and Y_i represents the concentration reported for the duplicate weight (TSP/metals only):

    d_i = Y_i - X_i          (Equation 1)
Percent Difference for a Single Check (d_i) - The percent difference, d_i, for each check is calculated using Equation 2, where X_i represents the original concentration and Y_i represents the concentration reported for the duplicate:

    d_i = [(Y_i - X_i) / X_i] × 100          (Equation 2)

Mean difference for a batch (d̄) - The mean difference, d̄, for both field and lab blanks within an analysis batch is calculated using Equation 3, where d_1 through d_n represent the individual differences (calculated from Equation 1) and n represents the number of blanks in the batch:

    d̄ = (d_1 + d_2 + ... + d_n) / n          (Equation 3)
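The blank statistics above reduce to simple arithmetic. The following minimal sketch (not part of the QAPP requirements) shows how Equations 1 through 3 might be computed for a batch of blanks; the function names, example values and acceptance limit are hypothetical.

```python
# Illustrative sketch of the blank-evaluation statistics (Equations 1-3).
# The function names, example values, and acceptance limit are hypothetical
# and are not taken from this QAPP.

def single_difference(original, duplicate):
    """Equation 1: d_i = Y_i - X_i (duplicate minus original)."""
    return duplicate - original

def percent_difference(original, duplicate):
    """Equation 2: d_i = ((Y_i - X_i) / X_i) * 100."""
    return (duplicate - original) / original * 100.0

def mean_difference(differences):
    """Equation 3: mean of the individual differences in a batch."""
    return sum(differences) / len(differences)

# Example batch of 3 field blanks: (original, duplicate) measurements.
field_blanks = [(0.010, 0.012), (0.011, 0.010), (0.009, 0.011)]

d_values = [single_difference(x, y) for x, y in field_blanks]
d_bar = mean_difference(d_values)

ACCEPTANCE_LIMIT = 0.003  # hypothetical; actual limits come from the SOPs/Table 14.1
if abs(d_bar) > ACCEPTANCE_LIMIT:
    print(f"Flag batch: mean field-blank difference {d_bar:.4f} exceeds the limit")
else:
    print(f"Batch acceptable: mean field-blank difference {d_bar:.4f}")
```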
Corrective action- The acceptance criteria for field blanks are discussed in the individual SOPs.
Field and lab blank differences are determined by Equation 1. However, the mean difference based upon the number of blanks in each batch will be used for comparison against the acceptance criteria. If the mean difference of either the field or laboratory blanks is greater than the accepted values in Table 14.1, this will be noted in the QA report. For TSP filters, the laboratory balance will be checked for proper operation. If the blank means of either the field or lab blanks are still outside the acceptance criteria, all samples within the analysis session will be flagged with the appropriate flag and
efforts will be made to determine the source of contamination. In theory, field blanks should contain
more contamination than laboratory blanks. Therefore, if the field blanks are outside of the criteria while
the lab blanks are acceptable, analysis can continue on the next batch of samples while field
contamination sources are investigated. If the mean difference of the laboratory blanks is greater than the acceptance criteria, laboratory analysis will stop until the issue is satisfactorily resolved. The laboratory
technician will alert the Laboratory Branch Manager and QA Officer of the problem. The problem and
solution will be reported and appropriately filed under response and corrective action reports that will
be summarized in the QA report.
Lab and field blanks will be control charted (see Section 14.3). The percent difference calculation
(equation 2) is used for control charting purposes and can be used to determine status.
14.1.3 Precision Checks
Precision is the measure of mutual agreement among individual measurements of the same property,
usually under prescribed similar conditions. In order to meet the data quality objectives for precision,
the Division must ensure the entire measurement process is within statistical control. Precision
measurements will be obtained using collocated monitoring.
Collocated Monitoring
In order to evaluate total measurement precision, collocated monitoring will be implemented.
Therefore, every method designation will have:
a. Each type of monitor collocated;
b. The VOC, PUF and Aldehyde samplers will be collocated at site ;
c. The TSP sampler will be collocated at TC 2.
Evaluation of Collocated Data- All collocated data will be reported to AIRS. The following
algorithms will be used to evaluate collocated data. Collocated measurement pairs are selected for use
in the precision calculations only when both measurements are within the acceptance criteria. Please see
Table 14.1.
Percent Difference for a Collocated Check (d_i) - The percent difference, d_i, for each check is calculated using the following equation, where X_i represents the concentration produced from the primary sampler and Y_i represents the concentration reported for the duplicate sampler:

    d_i = [(Y_i - X_i) / X_i] × 100
Precision of a Single Sampler - Quarterly Basis. For sampler i, the individual percent differences produced during the calendar quarter are pooled using the following equations, where d̄_i and S_i are the mean and standard deviation of the individual percent differences and n is the number of checks made during the calendar quarter. Precision data must be generated for each individual compound.

    Upper 95 percent limit = d̄_i + 1.96 × S_i / √2

    Lower 95 percent limit = d̄_i - 1.96 × S_i / √2
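The quarterly precision calculation can be illustrated with the short sketch below; the collocated pairs are invented and the function names are hypothetical, with the 20% figure taken from the corrective action discussion that follows.

```python
# Illustrative sketch of the quarterly collocated-precision statistics.
# The collocated pairs below are invented numbers, not program data.
import math
import statistics

def percent_difference(primary, duplicate):
    """Percent difference of a collocated pair (duplicate relative to primary)."""
    return (duplicate - primary) / primary * 100.0

# Hypothetical collocated pairs (primary, duplicate) for one compound in a quarter.
pairs = [(1.10, 1.05), (0.92, 0.95), (1.30, 1.28), (0.75, 0.80)]

d = [percent_difference(x, y) for x, y in pairs]
d_bar = statistics.mean(d)
s = statistics.stdev(d)

upper = d_bar + 1.96 * s / math.sqrt(2)
lower = d_bar - 1.96 * s / math.sqrt(2)
print(f"mean % difference = {d_bar:.2f}, 95% limits = ({lower:.2f}, {upper:.2f})")

# Hypothetical check against the 20% criterion discussed in the corrective
# action paragraph below.
if max(abs(upper), abs(lower)) > 20.0:
    print("Quarterly limits exceed 20%: flag the routine data for this monitor.")
```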
Corrective Action: Quarter - Usually, corrective action will be initiated and imprecision rectified
before a quarter's worth of data fail to meet the 15% confidence limits (CL). However, in the case where the quarter's CL is greater than 20%, the routine data for that monitor for that quarter will be flagged.
The QA Office, the Lab and the Air Monitoring Branch Managers will work together to identify the
problem and a solution. The EPA Regional Office will be alerted of the issue and may be asked to help
find a common solution. The problem and solution will be reported and appropriately filed under
response and corrective action. This information will also be included in the annual QA report.
Table 14.1 Precision Acceptance Criteria

Parameter | Decision
Both samples did not run 24 hours +/- 10 min. | Do not accept
One or both filters are damaged or exhibit a pinhole or tear | Do not accept
One or both samplers has erratic flow pattern | Do not accept
The difference in the pressure of the VOC canisters is > 2 psig | Do not accept
One or both PUF plugs or filters are damaged | Do not accept
One or both samples are not kept within the holding and storage temperature requirements for any length of time | Do not accept
14.1.4 Accuracy Checks
Accuracy is defined as the degree of agreement between an observed value and an accepted reference
value and includes a combination of random error (precision) and systematic error (bias). Three
accuracy checks are implemented in the air toxics monitoring program:
•	flow rate audits;
•	balance checks; and
•	laboratory audits.
Flow Rate Audits
The flow rate audit is made by measuring the field instrument's normal operating flow rate using a
certified flow rate transfer standard. The flow rate standard used for auditing will not be the same flow
rate standard used to calibrate the analyzer. However, both the calibration standard and the audit
standard may be referenced to the same primary flow rate or volume standard. Report the audit
(actual) flow rate and the corresponding flow rate indicated or assumed by the sampler. The
procedures used to calculate measurement uncertainty are described below.
Accuracy of a Single Sampler - Single Check (Quarterly) Basis (d_i). The percent difference (d_i) for a single flow rate audit i is calculated using the following equation, where X_i represents the audit standard flow rate (known) and Y_i represents the indicated flow rate:

    d_i = [(Y_i - X_i) / X_i] × 100
Balance Checks- Balance checks are frequent checks of the balance working standards (100 and
200 mg standards) against the balance to ensure that the balance is within acceptance criteria
throughout the pre- and post-sampling weighing sessions. Toxa City will use ASTM class 1 weights for
its primary and secondary (working) standards. Both working standards will be measured at the beginning and end of each sample batch. Balance check samples will be control charted.
Balance Check Evaluation - The following algorithm will be used to evaluate the balance checks.

Difference for a single check (d_i) - The difference, d_i, for each check is calculated as the reported weight minus the certified weight, where X_i represents the certified mass of the check weight and Y_i represents the reported weight:

    d_i = Y_i - X_i
Corrective Action - The difference between the reported weight and the certified weight must be < 5 mg. Since this is the first check before any pre- or post-sampling weighings, if the acceptance criterion is not met, corrective action will be initiated. Corrective action may be as simple as allowing the balance to perform internal calibrations or to warm up sufficiently, which may require checking the balance weights a number of times. If the acceptance criterion is still not met, the laboratory technician will be required to verify the working standards against the primary standards. Finally, if it is established that the balance does not meet acceptance criteria for both the working and primary standards, and other troubleshooting techniques fail, the Libra Balance Company service technician will be called to
perform corrective action.
If the balance check fails acceptance criteria during a run, the 10 filters weighed prior to the failure
will be rerun. If the balance check continues to fail, troubleshooting, as discussed above, will be initiated. The values of the 10 samples weighed prior to the failure will be recorded and flagged, but will remain with the unweighed samples in the batch to be reweighed when the balance meets the
acceptance criteria. The data acquisition system will flag any balance check outside the acceptance
criteria. The samples that were flagged will be un-flagged once the balance comes into compliance with
the QC procedure.
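As a purely illustrative aid, the sketch below expresses the balance-check logic described above; the 5 mg limit comes from this section, while the data layout and function names are hypothetical.

```python
# Illustrative sketch of the balance-check evaluation. The 5 mg limit comes
# from this section; the function names and data layout are hypothetical.

BALANCE_CHECK_LIMIT_MG = 5.0

def balance_check_ok(certified_mg, reported_mg):
    """Single check: |reported - certified| must be within the 5 mg limit."""
    return abs(reported_mg - certified_mg) <= BALANCE_CHECK_LIMIT_MG

def weigh_batch(filter_ids, read_balance, standard_mg=100.0):
    """Weigh filters in groups of 10, re-checking the working standard after each group.

    'read_balance' is a stand-in for the actual balance reading (of a filter ID
    or of the working standard).  Filters weighed just before a failed check are
    flagged and held to be reweighed once the balance passes again.
    """
    weights, flagged = {}, []
    for start in range(0, len(filter_ids), 10):
        group = filter_ids[start:start + 10]
        for fid in group:
            weights[fid] = read_balance(fid)
        if not balance_check_ok(standard_mg, read_balance(standard_mg)):
            flagged.extend(group)  # recorded but flagged; reweigh after compliance
    return weights, flagged
```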
Accuracy of a Laboratory Audit - Single Check (Annual) Basis (d_i). The laboratory audit is an
independent check that is generated by an outside laboratory. Each calendar year, the EPA or State
designated laboratory will be sending the TCAPCD laboratory a sample of metals on a quartz filter,
aldehydes in a DNPH cartridge, a canister with VOCs and a PUF sample with SVOC. The TCAPCD
lab will analyze the samples and send the results to the EPA certified laboratory. The audit sample for
each system will be mailed directly to the laboratory. The lab technician will handle the audit sample in
the same manner as all other samples. Once the analysis is performed, the results will be reviewed by
the lab supervisor. These results will then be sent to the EPA certified laboratory. The equation used
to define the percent difference (d_i) for each individual compound audit i is:

    d_i = [(Y_i - X_i) / X_i] × 100

where X_i represents the audit standard concentration from the certified laboratory (known) and Y_i represents the value obtained from the TCAPCD laboratory.
15.0 Instrument/Equipment Testing, Inspection, and Maintenance
Requirements
The purpose of this element of the QAPP is to discuss the procedures used to verify that all instruments
and equipment are maintained in sound operating condition and are capable of operating at acceptable
performance levels. This section describes how inspections and acceptance testing of environmental sampling
and measurement systems and their components will be performed and documented.
15.1 Purpose/Background
The purpose of this element in the Toxa City QAPP is to discuss the procedures used to verify that all
instruments and equipment are maintained in sound operating condition and are capable of operating at
acceptable performance levels.
15.2 Testing
The procedures described should (1) reflect consideration of the possible effect of equipment failure on
overall data quality, including timely delivery of project results; (2) address any relevant site-specific effects
(e.g., environmental conditions); and (3) include procedures for assessing the equipment status. This
element should address the scheduling of routine calibration and maintenance activities, the steps that will
be taken to minimize instrument downtime, and the prescribed corrective action procedures for addressing
unacceptable inspection or assessment results. This element should also include periodic maintenance
procedures and describe the availability of spare parts and how an inventory of these parts is monitored and
maintained. The reader should be supplied with sufficient information to review the adequacy of the
instrument/equipment management program. Appending SOPs containing this information to the QAPP and
referencing the SOPs in the text are acceptable.
Inspection and testing procedures may employ reference materials, such as the National Institute of
Standards and Technology's (NIST's) Standard Reference Materials (SRMs), as well as QC standards or an
equipment certification program. The accuracy of calibration standards is important because all data will be
measured in reference to the standard used. The types of standards or special programs should be noted in
this element, including the inspection and acceptance testing criteria for all components. The acceptance
limits for verifying the accuracy of all working standards against primary grade standards should also be
provided.
All samplers used in the Toxa City ATMP will be similar to the instruments described in the TO and
IO Compendia. Therefore, they are assumed to be of sufficient quality for the data collection
operation. Prior to field installation, Toxa City will assemble and run the samplers at the laboratory
facilities. The field operators will perform external and internal leak checks and temperature, pressure
and flow rate verification checks. If any of these checks are out of specification, the field technicians
will attempt to correct them. If the problem is beyond their expertise, the division director will contact
the vendor for guidance. If the vendor does not provide sufficient support, then the instrument will be
returned to the vendor. Once installed at the site, the field operators will run the tests at least one more
time. If the sampling instrument meets the acceptance criteria, it will be assumed to be operating
properly.
15.3 Inspection
Inspections of various equipment and components are described here. Inspections are subdivided into two sections: one pertaining to laboratory issues and one associated with field activities.
15.3.1 Inspection in Laboratory
There are several items that need routine inspection in the laboratory. Table 15.1 details the items to inspect and how to appropriately document the inspection. All of the different areas of the laboratory
(TSP mass weight, Gas Chromatography/Mass Spec., Liquid Chromatography and the ICP rooms)
will be maintained according to Table 15.1.
Table 15.1 Inspections in the Laboratory

Item | Inspection Frequency | Inspection Parameter | Action if Item Fails Inspection | Documentation Requirement
Weighing Room Temperature | Daily | 20 - 30 °C | 1) Check HVAC system; 2) Call service provider that holds maintenance agreement | 1) Document in log book; 2) Notify Lab Manager
Weighing Room Humidity | Daily | 30 - 40% RH | 1) Check HVAC system; 2) Call service provider that holds maintenance agreement | 1) Document in log book; 2) Notify Lab Manager
Weighing Room Cleanliness | Monthly | Use glove and visually inspect | Clean room | Document in log book
GC/MS Room Temperature | Daily | 20 - 30 °C | 1) Check HVAC system; 2) Call service provider that holds maintenance agreement | Document in log book
GC/MS Cleanliness | Monthly | Use glove and visually inspect | Clean room and remove clutter; put canisters back into rack | Document in log book
ICP Temperature | Daily | 20 - 30 °C | 1) Check HVAC system; 2) Call service provider that holds maintenance agreement | Document in log book
ICP Cleanliness | Monthly | Use glove and visually inspect | Clean room and remove clutter; store and clean vials; discard old filters | Document in log book
HPLC Room Temperature | Daily | 20 - 30 °C | 1) Check HVAC system; 2) Call service provider that holds maintenance agreement | Document in log book
HPLC Cleanliness | Monthly | Use glove and visually inspect | Clean room and store PUF cartridges | Document in log book
Extraction Room | Weekly | Use glove and visually inspect | Thoroughly clean room and remove all materials; clean all removable instruments and autoclave | Document in log book
15.3.2 Inspection of Field Items
There are several items to inspect in the field before and after a sample has been taken. The attached
appendices discuss in detail the items that need to be inspected. Please refer to the attached SOPs.
15.4 Maintenance
There are many items that need maintenance attention in the network. This section describes the
laboratory and field items.
15.4.1 Laboratory Maintenance Items
The successful execution of a preventive maintenance program for the laboratory will go a long way
towards the success of the entire program. In the Toxa City network, laboratory preventive
maintenance is handled through the use of several contractors. The Smith and Jones HVAC
Company has a contract to take care of all preventive maintenance associated with the heating,
ventilation, and air conditioning (HVAC) system. In addition to these contracts, the TCAPCD also hires LabTech Inc. to perform the maintenance on the ICP, GC/MS and the two Liquid Chromatographs. The Smith and Jones HVAC Company can be paged for all emergencies pertaining to the laboratory HVAC system. Preventive maintenance for the micro-balance is performed by the Libra Balance Company service technician. Preventive maintenance for all analytical instruments is
scheduled to occur at initial set-up and every 6-months thereafter. In the event that there is a problem
with the analytical instruments that cannot be resolved within the Toxa City organization, the Libra
Balance Company and LabTech Inc. service technician can be paged. The District's service
agreement with Libra Balance Company and LabTech Inc. calls for service within 24 hours. The
service technician will also have a working micro-balance in his/her possession that will be loaned to
Toxa City in the case that the District's micro-balance cannot be repaired on-site. In the event one of the other analytical instruments fails, the service technicians for the vendors will visit the TCAPCD
laboratory and ascertain the problem. The parts will be shipped and replaced as soon as possible.
Service agreements with the Smith and Jones HVAC Company, Libra Balance Company and LabTech Inc. are expected to be renewed each year. In the event that any company's service agreement
is not renewed, a new service provider will be selected and a contract put in place. The following tables detail the maintenance items, how frequently they will be performed or replaced, and who will be responsible for performing the maintenance.
Table 15.2 Preventive Maintenance in Weigh Room Laboratories

Item | Maintenance Frequency | Responsible Party
Multi-point micro-balance maintenance calibration | 6 Months | Libra Balance Company
Comparison of NIST standards to laboratory working and primary standards | 6 Months | Libra Balance Company
Verify humidity and temperature sensors | Monthly | Balance Analyst
HEPA filter replacement | Monthly | Balance Analyst
HVAC system preventive maintenance | Yearly | Smith and Jones HVAC
Computer back-up | Weekly | Lab Analyst
Computer virus check | Weekly | Lab Analyst
Computer system preventive maintenance (clean out old files, compress hard drive, inspect) | Yearly | PC support personnel
Table 15.3 Preventive Maintenance in VOC Laboratories

Item | Maintenance Frequency | Responsible Party
Multi-point maintenance calibration | 6 Months, or after initial setup, after maintenance or repair, or after the column is replaced | Lab Analyst
Comparison of NIST standards to laboratory working and primary standards | Weekly | Lab Analyst
Filament replacement | As necessary | Lab Analyst
Carrier gas scrubber replaced | When trap color indicates | Lab Analyst
MS quadrupoles or ion source cleaned | Every 3 months | Lab Analyst
RF generator replaced | As needed | Lab Analyst
Test lines for pressure integrity | Annually | Lab Analyst
Replace traps | As needed | Lab Analyst
Computer back-up | Weekly | Lab Analyst
Computer virus check | Weekly | Lab Analyst
Computer system preventive maintenance (clean out old files, compress hard drive, inspect) | Yearly | PC support personnel
Table 15.4 Preventive Maintenance in Liquid Chromatography Laboratory

Item | Maintenance Frequency | Responsible Party
Multi-point maintenance calibration | 6 Months | LabTech Inc.
Comparison of NIST standards to laboratory working and primary standards | 6 Months | Lab Analyst
Replace chromatography column | As needed | Lab Analyst
Replace delivery system motor | 2 years | LabTech Inc.
Change column guard | As needed | Lab Analyst
Replace Teflon delivery tubing | Yearly | Lab Analyst
Test acetonitrile used for sample extraction | Monthly | Lab Analyst
Computer back-up | Weekly | Lab Analyst
Computer virus check | Weekly | Lab Analyst
Computer system preventive maintenance (clean out old files, compress hard drive, inspect) | Yearly | PC support personnel
Table 15.5 Preventive Maintenance in Inductively Coupled Plasma Laboratories

Item | Maintenance Frequency | Responsible Party
Instrument tuning | Initial setup | LabTech Inc.
Torch and spray chambers cleaned | 3 months | Lab Analyst
Multi-point maintenance calibration | 6 Months | LabTech Inc.
Comparison of NIST standards to laboratory working and primary standards | Monthly | Lab Analyst
Clean oven | Monthly | Lab Analyst
Plasma generator | Monthly | Lab Analyst
Heat generator | Yearly | LabTech Inc.
Computer back-up | Weekly | Balance Analyst
Computer virus check | Weekly | Balance Analyst
Computer system preventive maintenance (clean out old files, compress hard drive, inspect) | Yearly | PC support personnel
15.4.2 Field Maintenance Items
There are many items associated with appropriate preventive maintenance of a successful field
program. Table 15.6 details the appropriate maintenance checks of the samplers and their frequencies.
Table 15.6 Preventive Maintenance on Field Instruments

Instrument | Item | Maintenance Frequency | Responsible Party
TSP sampler | Motor brush replacement | 3 Months | Field Technician
TSP sampler | Clean inside of sampler | 6 Months | Field Technician
TSP sampler | Replace motor | Annually | Field Technician
PUF sampler | Replace motor | Annually | Field Technician
PUF sampler | Replace motor gaskets | When motor is replaced | Field Technician
PUF sampler | Filter screen inspected for impacted deposits or bits of filter | Annually | Field Technician
PUF sampler | Check connecting tube and power lines for holes, crimps or cracks | 6 months | Field Technician
PUF sampler | Motor brush replacement | 3 Months | Field Technician
PUF sampler | Clean inside of sampler | 6 Months | Field Technician
VOC sampler | Replace motor | Annually | Field Technician
VOC sampler | Replace sample lines | Annually | Senior Field Technician
VOC sampler | Clean flow controller | Annually | Senior Field Technician
VOC sampler | Replace 1/8" connectors | Annually | Field Technician
Aldehyde sampler | Cartridge connectors | Annually | Field Technician
Aldehyde sampler | Replace motor brushes | Annually | Field Technician
Aldehyde sampler | Fan motor replacement | 2 years | Field Technician
Aldehyde sampler | Clean inside of sampler | 6 Months | Field Technician
16.0 Instrument Calibration and Frequency
This element of the QAPP concerns the calibration procedures that will be used for instrumental
analytical methods and other measurement methods that are used in environmental measurements. It is
necessary to distinguish between defining calibration as the checking of physical measurements against
accepted standards and as determining the relationship (function) of the response versus the concentration.
The American Chemical Society (ACS) limits the definition of the term calibration to the checking of
physical measurements against accepted standards, and uses the term standardization to describe the
determination of the response function.
16.1 Instrumentation Requiring Calibration
The QAPP should identify any equipment or instrumentation that requires calibration to maintain
acceptable performance. While the primary focus of this element is on instruments of the measurement
system (sampling and measurement equipment), all methods require standardization to determine the
relationship between response and concentration.
16.1.1 Analysis of Instruments - Laboratory
The laboratory support for Toxa City includes calibration. As indicated in Section 13, the instruments
are calibrated using NIST traceable standards (if available) once a year under a service agreement.
For the Libra 101, the service technician performs routine maintenance and makes any balance
response adjustments that the calibration shows to be necessary. During the visit by the service
technician, both the in-house primary and secondary (working) standards are checked against the
service technician's standards to ensure acceptability. All of these actions are documented in the service technician's report, a copy of which is provided to the laboratory manager and, after review, is appropriately filed.
The laboratory also maintains a set of standards for each of the laboratory systems. Please see Table
16.1. Below are brief statements on how these calibrations are performed.
For the Libra 101, the technician uses 3 Class A weights to verify that the balance is weighing
within the tolerance limits. Once this is performed, the balance is tared. Filters are weighed in batches of 10 samples. After a sample batch has been weighed, the technician re-weighs one filter (duplicate weight) and re-tares the balance. At the end of the day (or end of the weighing session) the technician reweighs the 3 Class A weights. Any difference in weight is noted.
For the Gas Chromatographs, the NIST Traceable cylinder is attached to a mass flow control
calibration unit. The benzene and methylene chloride are blended down to a concentration at approximately 80% of the range of compounds found at ambient concentrations. This is usually ~ 20 ppbv. The gas chromatograph is allowed to reach
operating conditions. The gas from the mass flow controller is injected into the system and the
carrier helium is allowed to flow. Once the calibration gas is allowed to enter, two peaks
should appear. The mass flow controller is then adjusted to bring the gas concentration to ~40% of the range. This process is then repeated at 20% of the range. Zero
air is then generated and a baseline is determined. The system is now ready to accept ambient
concentrations. After the day's batches are run, a single point (80%) is injected into the GC.
After the Inductively Coupled Plasma unit is allowed to come to operating conditions, a
standard solution of metals is injected into the ICP. The responses are noted. Distilled, ion-free water is then injected into the ICP. This allows the system to reach a baseline.
For the Liquid Chromatographs (aldehydes), the procedure is the same, with the exception of the compounds injected. 2,4-Dinitrophenylhydrazine (DNPH) is dissolved in ultra-pure acetonitrile; these become the standard solutions. After the LCs have come to operating conditions, ultra-pure acetonitrile is injected. This allows the system to reach a baseline. Then a concentration at 80% of the normal ambient concentration of DNPH in acetonitrile is injected into the
LC. Response peaks are observed and recorded. This procedure is repeated at the end of the
analysis batch run.
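The multipoint standardization described above (zero air plus roughly 20%, 40% and 80% of the ambient range) amounts to establishing a response-versus-concentration relationship. The sketch below is illustrative only: the concentrations and responses are invented, and an ordinary least-squares fit is shown as one common way to establish such a relationship, not as a procedure prescribed by this QAPP.

```python
# Illustrative least-squares standardization (response vs. concentration).
# The calibration points below are invented and do not come from this QAPP.

def fit_line(conc, resp):
    """Ordinary least-squares fit: resp = slope * conc + intercept."""
    n = len(conc)
    mean_x = sum(conc) / n
    mean_y = sum(resp) / n
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(conc, resp))
    sxx = sum((x - mean_x) ** 2 for x in conc)
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    return slope, intercept

# Hypothetical points: zero air plus ~20%, 40% and 80% of the range (ppbv).
concentration = [0.0, 5.0, 10.0, 20.0]
response = [0.02, 1.05, 2.11, 4.18]      # detector peak area, arbitrary units

slope, intercept = fit_line(concentration, response)
print(f"response = {slope:.3f} * concentration + {intercept:.3f}")

# An unknown sample's concentration is then read back from its response.
unknown_response = 3.00
print(f"estimated concentration: {(unknown_response - intercept) / slope:.2f} ppbv")
```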
Table 16.1 Lab Instrument Standards

Manufacturer | Instrument | Type of Standard | Frequency | NIST Traceability
Libra 101 (filter weights) | Balance | Class A weights | 1 every 10 samples | Class A weights
Antech 3000 (metals) | Inductively Coupled Plasma | High purity reagents - high purity grade standards | Before and after each batch run | 99.99% pure ultra high grade standard solutions
ZanTech 3001 (aldehydes) | Liquid Chromatograph | High purity 2,4-dinitrophenylhydrazine crystals dissolved in acetonitrile | Before and after each batch run | Reagent grade available from chemical vendor
AnTech 3001 (SVOCs) | Gas Chromatograph | High purity benzo[a]pyrene standard solutions | Before and after each batch run | Reagent grade available from chemical vendor
AnTech 3001 (VOCs) | Gas Chromatograph | Compressed gas cylinder | Before and after each batch run | Benzene and methylene chloride are NIST traceable through vendor
16.1.2 Flow Rate - Laboratory
Laboratory technicians perform the comparison of the flow rate transfer standard to a NIST-traceable
primary flow rate standard and once every year send the primary standard to NIST for recertification.
The laboratory and field personnel chose an automatic dry-piston flow meter for field calibrations and
flow rate verifications of the flow rates of the network samplers. This type of device has the advantage
of providing volumetric flow rate values directly, without requiring conversion from mass flow
measurements, temperature, pressure, or water vapor corrections. In addition, the manual bubble
flowmeter will be used in the lab as a primary standard and as a backup to the dry-piston flowmeter,
where the absence of wind and relatively low humidity will have less negative effect on flowmeter
performance.
Upon initial receipt of any new, repaired, or replaced air toxics sampler, a field technician will perform
a multipoint flow rate calibration verification on the sampler flow rate to determine if initial performance
is acceptable. Once sampler flow rate is accepted, the lab performs the calibration and verifications at
the frequency specified in Section 14, as well as directly performing or arranging to have another party
perform the tests needed to recertify the organization's standards.
16.1.3 Sampler Temperature, Pressure, Time Sensors - Laboratory
The lab arranges support for the field calibration of temperature and pressure sensors by acquiring
the necessary equipment and consumables and preparing and lab-testing the temperature comparison
apparatus. A stationary mercury manometer in the laboratory is used as a primary standard to calibrate
the two electronic aneroid barometers that go out in the field as transfer standards.
16.1.4 Field
The following calibrations are performed in the field:
•	calibration of the volumetric flow rate meter of each sampler against the working standard;
•	calibration of sampler temperature and pressure sensors against the working temperature standard (VOC and aldehyde samplers only); and
•	calibration of the min/max thermometers, normally located in the coolers in which DNPH cartridges, PUFs and XAD are transported to and from the sampler in the field, against the laboratory-checked working standard thermometer.
16.2 Calibration Method that Will Be Used for Each Instrument
The QAPP must describe the calibration method for each instrument in enough detail for another
researcher to duplicate the calibration method. It may reference external documents such as EPA-
designated calibration procedures or SOPs providing that these documents can be easily obtained.
Nonstandard calibration methods or modified standard calibration methods should be fully documented
and justified.
Most EPA-approved analytical methods require multipoint (three or more) calibrations that include zeros,
or blanks, and higher levels so that unknowns fall within the calibration range and are bracketed by
calibration points. The number of calibration points, the calibration range, and any replication (repeated
measures at each level) should be given in the QAPP.
The QAPP should describe how calibration data will be analyzed. The use of statistical QC techniques
to process data across multiple calibrations to detect gradual degradations in the measurement system
should be described. The QAPP should describe any corrective action that will be taken if calibration (or
calibration check) data fail to meet the acceptance criteria, including recalibration. References to appended
SOPs containing the calibration procedures are an acceptable alternative to describing the calibration
procedures within the text of the QAPP.
16.2.1 Laboratory - Gravimetric (Mass) Calibration
The calibration and QC (verification) checks of the microbalance are addressed in Sections 16.1.1 and
13.3 of this QAPP. For the following 3 reasons, the multipoint calibration for this method will be zero,
100 and 200 mg: 1) the required sample collection filters weigh between 100 and 200 mg; 2) the anticipated range of sample loadings for the 24-hour sample period is rarely going to be more than a few hundred milligrams; and 3) the lowest, commercially available check weights that are certified according to
nationally accepted standards are only in the single milligram range. Since the critical weight is not the
absolute unloaded or loaded filter weight, but the difference between the two, the lack of microgram
standard check weights is not considered cause for concern about data quality, as long as proper
weighing procedure precautions are taken for controlling contamination, or other sources of mass
variation in the procedure.
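Since the reported quantity is the difference between the loaded and unloaded filter weights, the gravimetric arithmetic is straightforward. The sketch below is illustrative only; the 0, 100 and 200 mg check levels come from this section, while the tolerance value and example weights are hypothetical.

```python
# Illustrative gravimetric bookkeeping. The 0/100/200 mg check levels come
# from this section; the tolerance and example weights are hypothetical.

CHECK_LEVELS_MG = [0.0, 100.0, 200.0]   # multipoint calibration check levels
TOLERANCE_MG = 0.05                     # hypothetical acceptance tolerance

def multipoint_check(readings_mg):
    """Compare balance readings at each check level against the certified values."""
    return all(abs(reading - certified) <= TOLERANCE_MG
               for certified, reading in zip(CHECK_LEVELS_MG, readings_mg))

def net_mass(pre_weight_mg, post_weight_mg):
    """Critical quantity: loaded (post-sampling) minus unloaded (pre-sampling) weight."""
    return post_weight_mg - pre_weight_mg

print(multipoint_check([0.01, 100.02, 199.98]))      # True -> balance acceptable
print(f"{net_mass(145.20, 147.85):.2f} mg collected")
```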
16.2.2 Laboratory/Field - Flow Calibration
The Air Monitoring and Laboratory Branch Managers conduct spot checks of lab and field notebooks
to ensure that the lab and field personnel are following the SOPs, including the QA/QC checks,
acceptance criteria and frequencies.
Method Summary: After equilibrating the calibration device to the ambient conditions, connect the
flow calibration device to the sampler down tube or filter holding device. If the sampler has not been
calibrated before, or if the previous calibration was not acceptable, perform a leak check according to
the manufacturer's operational instruction manual, which is incorporated into Toxa City ATMP SOPs.
Otherwise, place the sampler in calibration or "run" mode and perform a one-point
calibration/verification or a one-point flow rate verification. The field staff will perform a leak check only if the calibration or verification results are outside the acceptance criteria.
Following the calibration or verification, turn off the sampler pump, remove the filter, cartridge, or PUF
holder, remove the flow calibration device, (and flow adaptor device if applicable), and replace the
sampler inlet or hood. If the flow rate is determined to be outside of the required target flow rate,
attempt to determine possible causes by minor diagnostic and trouble shooting techniques (e.g., leak
checks), including those listed in the manufacturer's operating instruction manual.
16.2.3 Sampler Pressure Calibration Procedure
General: According to ASTM Standard D 3631 (ASTM 1977), a barometer can be calibrated by
comparing it with a secondary standard traceable to a NIST primary standard.
Precautionary Note: Protect all barometers from violent mechanical shock and sudden changes in
pressure. A barometer subjected to either of these events must be recalibrated. Maintain the vertical
and horizontal temperature gradients across the instruments at less than 0.1 °C/m. Locate the instrument
so as to avoid direct sunlight, drafts, and vibration.
A Fortin mercury type of barometer is used in the laboratory to calibrate and verify the
aneroid barometer used in the field to verify the barometric sensors of samplers. Details are provided
in the appropriate SOP.
16.3 Calibration Standard Materials and Apparatus
Some instruments are calibrated using calibration apparatus rather than calibration standards. For
example, an ozone generator is part of a system used to calibrate continuous ozone monitors. Commercially
available calibration apparatus should be listed together with the make (the manufacturer's name), the model
number, and the specific variable control settings that will be used during the calibrations. A calibration
apparatus that is not commercially available should be described in enough detail for another researcher to
duplicate the apparatus and follow the calibration procedure.
Table 16.2 presents a summary of the specific standard materials and apparatus used in calibrating
measurement systems.
Table 16.2 Standard Materials and/or Apparatus for Air Toxics Calibration

Parameter (M = Material, A = Apparatus) | Std. Material | Std. Apparatus | Mfr. Name | Model # | Frequency of Calibration
Mass (M) | Class A wgts | NA | ScalesTech. Inc. | 111 | NA
Temperature (M+A) | Hg | Thermometer | Hot Water Inc. | 5500 | NA
Temperature (M+A) | NA | Thermistor | True Temp. | 8910 | Annually
Pressure (M+A) | Hg | Fortin | You Better... | 22 | NA
Pressure (A) | NA | Aneroid | Aviators Choice | 7-11 | Quarterly
Flow Rate (A) | NA | Piston Meter | Flowtech Inc. | F199 | Annually
Flow Rate (A) | NA | Bubble Meter | SaapTech. Inc. | LG88 | NA
Flow Rate (A) | NA | High Volume Flow | Top Hat Inc. | TP-1 | Annually
Flow Rate
The flow rate standard apparatus used for flow-rate calibration (field: NIST-traceable, piston-type volumetric flow rate meter; laboratory: NIST-traceable manual soap bubble flow meter and time
monitor) has its own certification and is traceable to other standards for volume or flow rate which are
themselves NIST-traceable. A calibration relationship for the flow-rate standard, such as an equation,
curve, or family of curves, is established by the manufacturer (and verified if needed) that is accurate to
within 2% over the expected range of ambient temperatures and pressures at which the flow-rate
standard is used. The flow rate standard will be recalibrated and recertified at least annually.
The actual frequency with which this recertification process must be completed depends on the type of
flow rate standard; some are much more likely to be stable than others. The Division will maintain a
control chart (a running plot of the difference or percent difference between the flow-rate standard and
the NIST-traceable primary flow-rate or volume standard) for all comparisons. In addition to providing
excellent documentation of the certification of the standard, a control chart also gives a good indication
of the stability of the standard. If the two standard-deviation control limits are close together, the chart
indicates that the standard is very stable and could be certified less frequently. The minimum
recertification frequency is 1 year. On the other hand, if the limits are wide, the chart would indicate a
less stable standard that will be recertified more often.
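As an illustration of the control-chart logic described above, the sketch below computes two-standard-deviation limits from a history of percent differences between a flow-rate transfer standard and the primary standard; the values are invented and the variable names are hypothetical.

```python
# Illustrative two-sigma control chart for flow-rate standard certifications.
# The percent-difference history below is invented.
import statistics

# Percent difference between the flow-rate transfer standard and the
# NIST-traceable primary standard at each certification check.
history = [0.4, -0.2, 0.1, 0.6, -0.3, 0.2, 0.0, 0.5]

center = statistics.mean(history)
sigma = statistics.stdev(history)
upper, lower = center + 2 * sigma, center - 2 * sigma
print(f"center = {center:.2f}%, control limits = ({lower:.2f}%, {upper:.2f}%)")

# A new comparison is judged against the limits: narrow limits indicate a very
# stable standard, while a point outside them indicates drift worth investigating.
new_value = 1.2
if not (lower <= new_value <= upper):
    print("New comparison is outside the control limits; investigate the standard.")
```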
The High Volume sampler flow rate device is a Top Hat Inc., TP-1, which is certified to a NIST
traceable Roots meter. The High Volume orifice is sent to the State's certification laboratory on an
annual basis to verify its flow rate.
Temperature
The operations manuals associated with the TCAPCD samplers identify types of temperature
standards recommended for calibration and provide a detailed calibration procedure for each type that
is specifically designed for the particular sampler.
The EPA Quality Assurance Handbook, Volume IV ( EPA 1995), Section 4.3.5.1, gives information
on calibration equipment and methods for assessing response characteristics of temperature sensors.
The temperature standard used for temperature calibration will have its own certification and be
traceable to a NIST primary standard. A calibration relationship to the temperature standard (an
equation or a curve) will be established that is accurate to within 2% over the expected range of
ambient temperatures at which the temperature standard is to be used. The temperature standard must
be reverified and recertified at least annually. The actual frequency of recertification depends on the
type of temperature standard; some are much more stable than others. The Division will use a NIST-traceable mercury-in-glass thermometer for laboratory calibration and certification of the field thermistor.
The temperature sensor standards chosen by the lab and field staff and managers are both based on
standard materials contained in standardized apparatus; each has been standardized (compared in a
strictly controlled procedure) against temperature standards the manufacturers obtained from NIST.
The TCAPCD laboratory standards are 2 NIST-traceable mercury-in-glass thermometers from the
Hot Water Inc., each with its own certificate summarizing the company's NIST traceability protocol and documenting the technician's signature, comparison date, identification of the NIST standard used,
and the mean and standard deviation of the comparison results. There are 2 thermometers with
overlapping ranges that span the complete range of typically measured summer to winter lab and field
temperature values.
The TCAPCD field temperature standards are two True Temp. 8910® thermistor probes and one
digital readout module with RS232C jack and cable connector available for linkage to a data logger or
portable computer. The two probes have different optimum ranges, one including the full range of
temperatures ever recorded in the summer and the other including the full range of temperatures ever
recorded in the winter by the National Weather Service at the Toxa City sites. Each probe came with a
certificate of NIST-traceability with the same kind of information as the thermometer certificates
contained.
Pressure
The Fortin mercurial type of barometer works on fundamental principles of length and mass and is
therefore more accurate but more difficult to read and correct than other types. By comparison, the
precision aneroid barometer is an evacuated capsule with a flexible bellows coupled through
mechanical, electrical, or optical linkage to an indicator. It is potentially less accurate than the Fortin
type but can be transported with less risk to the reliability of its measurements and presents no danger
from mercury spills. The Fortin type of barometer is best employed as a higher quality laboratory
standard which is used to adjust and certify an aneroid barometer in the laboratory. The Toxa City
pressure standard is a You Better Believe It® Model 22 Fortin-type mercury barometer. The field
working standard is an Aviator's Choice® 7-11 aneroid barometer with digital readout.
16.5 Document Calibration Frequency
See Table 16.1 for a summary of Primary and Working Standards QC checks that includes the frequency, acceptance criteria and references for calibration and verification tests. All of these events, as well as sampler and calibration equipment maintenance, will be documented in field data records and notebooks and annotated with the appropriate flags.
by the respective technical staff will be kept in record notebooks as well. The records will normally be
controlled by the Branch Managers, and located in the labs or field sites when in use or at the
manager's offices when being reviewed or used for data validation.
References
1. ASTM. 1977. Standard test methods for measuring surface atmospheric pressure. American Society for Testing
and Materials. Philadelphia, PA. Standard D 3631-84.
2. ASTM. 1995. Standard test methods for measuring surface atmospheric pressure. American Society for Testing
and Materials. Publication number ASTM D3631-95.
3. EPA. 1995. Quality Assurance Handbook for Air Pollution Measurement Systems Volume IV: Meteorological
Measurements. U.S. Environmental Protection Agency. Document No. EPA/600/R-94/038d. Revised March.
4. NIST. 1976. Liquid-in-glass thermometry. National Institute of Standards and Technology. NBS Monograph 150.
January.
5. NIST. 1986. Thermometer calibration: a model for state calibration laboratories. National Institute of Standards
and Technology. NBS Monograph 174. January.
6. NIST. 1988. Liquid-in-glass thermometer calibration service. National Institute of Standards and Technology.
Special publication 250-23. September.
7. NIST. 1989. The calibration of thermocouples and thermocouple materials. National Institute of Standards and
Technology. Special publication 250-35. April 1989
17.0 Inspection/Acceptance for Supplies and Consumables
Describe how and by whom supplies and consumables shall be inspected and accepted for use in the
project. State acceptance criteria for such supplies and consumables.
17.1 Purpose
The purpose of this element is to establish and document a system for inspecting and accepting all
supplies and consumables that may directly or indirectly affect the quality of the Program. The Toxa
City Air Toxics Monitoring Network relies on various supplies and consumables that are critical to its
operation. By having documented inspection and acceptance criteria, consistency of the supplies can
be assured. This section details the supplies/consumables, their acceptance criteria, and the required
documentation for tracking this process.
17.2 Critical Supplies and Consumables
Clearly identify and document all supplies and consumables that may directly or indirectly affect the quality
of the project or task. See Figures 10 and 11 for example documentation of inspection/acceptance testing
requirements. Typical examples include sample bottles, calibration gases, reagents, hoses, materials for
decontamination activities, deionized water, and potable water.
For each item identified, document the inspection or acceptance testing requirements or specifications (e.g.,
concentration, purity, cell viability, activity, or source of procurement) in addition to any requirements for
certificates of purity or analysis.
Table 17.1 details the various components for the laboratory and field operations.
Table 17.1 Critical Field Supplies and Consumables

Area | Item | Description | Vendor | Model Number
TSP Sampler | 8 x 11" quartz filters | Quartz filter | FilterTech Inc. | NA
TSP Sampler | High volume motor | 20 amp blower motor | XYZ Company | X300
TSP Sampler | Motor brushes | Carbon brush elements | XYZ Company | X301
VOC Sampler | Stainless steel tubing | Clean SS tubing | Steeltech | X3301
VOC Sampler | Mass flow controller | 0-50 cc/min | Flowtech Inc. | FL100
Aldehyde Sampler | DNPH cartridges | DNPH-coated plastic cartridges | CartTech Inc. | D100
Aldehyde Sampler | Fuses | In sampler | FuseTech Inc. | F100
Aldehyde Sampler | Mass flow controller | 0-100 cc/min | Flowtech Inc. | F1101
Aldehyde Sampler | Motor | 0-200 cc/min | Flowtech Inc. | FL3021
PUF Sampler | Low volume motor | 16.7 l/min | Flowtech Inc. | X401
PUF Sampler | 76 mm filter | Quartz | XYZ Company | X402
PUF Sampler | PUF cartridge with XAD resin | Sampling media | XYZ Company | D100
PUF Sampler | Chart paper | Flow check | XYZ Company | X101
PUF Sampler | Motor brushes | Carbon brush elements | XYZ Company |
Table 17.2 Critical Laboratory Supplies and Consumables

Area | Item | Description | Vendor | Model Number
Weigh Room | Staticide | Anti-static solution | WeighTech | W1024
Weigh Room | Forceps | Non-serrated/Teflon coated | WeighTech | WWW
Weigh Room | Air filters | High efficiency | Purchase local |
All | Powder free antistatic gloves | Vinyl, Class M4.5 | Fisher Scientific® | 11-393-85A
All | Low-lint wipes | 4.5" x 8.5" cleaning wipes | Kimwipes® | 34155
Liquid Chromatography | Teflon tubing | 1/8" PTFE tubing | TubeTech Inc. | TW8
Liquid Chromatography | Chromatography column | 36" column | ZanTech Inc. | CW01
GC/MS | Chromatography column | 48" column | ZanTech Inc. | C1004
GC/MS | FID detector | High detection | ZanTech Inc. | DW01
GC/MS | Helium | Carrier gas | CylinderTech | H10023
GC/MS | Hydrogen gas | Flame gas | CylinderTech | H10022
GC/MS | Zero air | Calibration gas | CylinderTech | H10024
GC/MS | Liquid nitrogen | 200 gallon tank | All Gases Inc. | H10021
GC/MS | Silica gel | Canister | ZanTech Inc. | S10022
GC/MS | Cryogenic traps | Stainless steel | CylinderTech | H10023
ICP | Argon coolant | Coolant flow | CylinderTech | A10022
ICP | Deionized H2O | Post flush | Various vendors |
ICP | Photomultiplier tube | Analytical element | ZanTech Inc. | FT10045
All Instruments | Reagent grade solvents | See SOPs | Various vendors |
All Instruments | Reagent grade solvents | See SOPs | Various vendors |
All Instruments | Various sizes of ferrules, tubing and connectors | See SOPs | Various vendors |
17.3 Acceptance Criteria
Acceptance criteria must be consistent with overall project technical and quality criteria . If special
requirements are needed for particular supplies or consumables, a clear agreement should be established with
the supplier, including the methods used for evaluation and the provisions for settling disparities.
Acceptance criteria must be consistent with overall project technical and quality criteria. It is the responsibility of the air monitoring branch chief and the field technicians to update the criteria for acceptance of consumables. As requirements change, so do the acceptance criteria. Knowledge of field and laboratory equipment and experience are the best guides to acceptance criteria. Other acceptance checks, such as inspection for damage due to shipping, can only be performed once the equipment has arrived on site.
17.4 Tracking and Quality Verification of Supplies and Consumables
Procedures should be established to ensure that inspections or acceptance testing of supplies and
consumables are adequately documented by permanent, dated, and signed records or logs that uniquely
identify the critical supplies or consumables, the date received, the date tested, the date to be retested (if
applicable), and the expiration date. These records should be kept by the responsible individual(s) (see
Figure 13 for an example log).
Tracking and quality verification of supplies and consumables have two main components. The first is
the need of the end user of the supply or consumable to have an item of the required quality. The
second need is for the purchasing District to accurately track goods received so that payment or
credit of invoices can be approved. In order to address these two issues, the following steps outline the proper tracking and documentation procedures:
1. Receiving personnel will perform a rudimentary inspection of the packages as they are received
from the courier or shipping company. Note any obvious problems with a received shipment, such as a crushed box or wet cardboard.
2. The package will be opened, inspected and contents compared against the packing slip.
3. If there is a problem with the equipment/supply, note it on the packing list, notify the branch chief
of the receiving area and immediately call the vendor.
4. If the equipment/supplies appear to be complete and in good condition, sign and date the
packing list and send to accounts payable so that payment can be made in a timely manner.
5. Notify appropriate personnel that equipment/supplies are available. For items such as the filters,
it is critical to notify the laboratory manager of the weigh room so sufficient time for processing
of the filters can be allowed.
6. Stock equipment/supplies in appropriate pre-determined area.
7. For supplies, consumables, and equipment used throughout the program, document when these
items are changed out. A sign-in/sign-out sheet is placed outside of the stockroom. All personnel must sign out any consumables removed from or added to the stock room. A lab technician then enters these data into the equipment tracking database. The database will allow staff at all levels (Division Director, Branch Chief, lab and field technicians) to tell whether items and consumables are in stock (see the sketch following this list).
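The sketch below illustrates, in a simplified form, the kind of sign-out record and stock query the tracking database described in step 7 might support; the item names, quantities and functions are hypothetical and are not part of the TCAPCD system.

```python
# Simplified, hypothetical sketch of the consumables sign-in/sign-out tracking
# described in step 7; item names and quantities are invented.
from collections import defaultdict
from datetime import date

stock = defaultdict(int)   # item -> quantity currently in the stockroom
log = []                   # chronological sign-in/sign-out entries

def record(item, quantity, person):
    """Positive quantity = added to stock; negative quantity = signed out."""
    stock[item] += quantity
    log.append({"date": date.today().isoformat(), "item": item,
                "quantity": quantity, "person": person})

record("DNPH cartridges (D100)", 50, "receiving clerk")
record("DNPH cartridges (D100)", -6, "field technician")

print(stock["DNPH cartridges (D100)"])   # 44 remaining in stock
print(log[-1])                           # most recent sign-out entry
```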
18.0 Data Acquisition Requirements
This element of the QAPP should clearly identify the intended sources of previously collected data and
other information that will be used in this project. Information that is non-representative and possibly biased
and is used uncritically may lead to decision errors. The care and skepticism applied to the generation of new
data are also appropriate to the use of previously compiled data (for example, data sources such as handbooks
and computerized databases).
This section addresses data not obtained by direct measurement from the Air Toxics Monitoring
Program. This includes both outside data and historical monitoring data. Non-monitoring data and
historical monitoring data are used by the Program in a variety of ways. Use of information that fails to
meet the necessary Data Quality Objectives (DQOs) for the ATMP could lead to erroneous trend reports and
regulatory decision errors. The policies and procedures described in this section apply both to data
acquired through the TCAPCD ATMP and to information previously acquired and/or acquired from
outside sources.
18.1 Acquisition of Non-Direct Measurement Data
This element's criteria should be developed to support the objectives of element A7. Acceptance criteria for
each collection of data being considered for use in this project should be explicitly stated, especially with respect
to:
Representativeness. Were the data collected from a population that is sufficiently similar to the population of
interest and the population boundaries? How will potentially confounding effects (for example, season, time of day,
and cell type) be addressed so that these effects do not unduly alter the summary information?
Bias. Are there characteristics of the data set that would shift the conclusions? For example, has bias in analysis
results been documented? Is there sufficient information to estimate and correct bias?
Precision. How is the spread in the results estimated? Does the estimate of variability indicate that it is sufficiently
small to meet the objectives of this project as stated in element A7? See also Appendix D.
Qualifiers. Are the data evaluated in a manner that permits logical decisions on whether or not the data are
applicable to the current project? Is the system of qualifying or flagging data adequately documented to allow the
combination of data sets?
Summarization. Is the data summarization process clear and sufficiently consistent with the goals of this project?
(See element D2 for further discussion.) Ideally, observations and transformation equations are available so that
their assumptions can be evaluated against the objectives of the current project.
This element should also include a discussion on limitations on the use of the data and the nature of the
uncertainty of the data.
The ATMP relies on data that are generated through field and laboratory operations; however, other
significant data are obtained from sources outside the TCAPCD or from historical records. This section
-------
Project: Model QAPP
Element No: 18
Revision No: 1
Date: 7/5/01
Page 2 of 3
lists these data sources and addresses quality issues related to the ATMP.
Chemical and Physical Properties Data
Physical and chemical properties data and conversion constants are often required in the processing of
raw data into reporting units. Information of this type that has not already been specified in the
monitoring regulations will be obtained from the nationally and internationally recognized sources listed
below. Other data sources may be used with approval of the Air Division QA Officer.
• National Institute of Standards and Technology (NIST);
• ISO, IUPAC, ANSI, and other widely-recognized national and international standards
organizations;
• U.S. EPA;
• The current edition of certain standard handbooks may be used without prior approval of the
Toxa City QA Officer. Two that are relevant to the air toxics monitoring program are
CRC Press' Handbook of Chemistry and Physics and The Merck Index.
Sampler Operation and Manufacturers' Literature
Another important source of information needed for sampler operation is manufacturers' literature.
Operations manuals and users' manuals frequently provide numerical information and equations
pertaining to specific equipment. TCAPCD personnel are cautioned that such information is sometimes
in error, and appropriate cross-checks will be made to verify the reasonableness of information
contained in manuals. Whenever possible, the field operators will compare physical and chemical
constants in the operators' manuals to those given in the sources listed above. If discrepancies are found,
the correct value will be determined by contacting the manufacturer. The following types of errors are commonly
found in such manuals:
• insufficient precision;
• outdated values for physical constants;
• typographical errors;
• incorrectly specified units;
• inconsistent values within a manual, and
• use of different reference conditions than those called for in EPA regulations.
Geographic Location
Another type of data that will commonly be used in conjunction with the Monitoring Program is
geographic information. For the current sites, the District will determine site locations using global positioning
systems (GPS) that meet the EPA Locational Data Policy requirement of 25-meter accuracy. USGS maps were used
as the primary means for locating and siting stations in the existing network. Geographic locations of
Toxa City monitoring sites that are no longer in operation will not be re-determined.
-------
Project: Model QAPP
Element No: 18
Revision No: 1
Date: 7/5/01
Page 3 of 3
External Monitoring Data Bases
It is the policy of the TCAPCD that no data obtained from the Internet, computer bulletin boards, or
data bases from outside organizations shall be used in creating reportable data or published reports
without approval of the Air Division Director. This policy is intended to ensure the use of high quality
data in Toxa City publications.
Data from the EPA AIRS data base may be used in published reports with appropriate caution. Care
must be taken in reviewing/using any data that contain flags or data qualifiers. If data are flagged, such
data shall not be used unless it is clear that the data still meet critical QA/QC requirements. It is
impossible to assure that a data base such as AIRS is completely free from errors, including outliers and
biases, so caution and skepticism are called for in comparing Toxa City data with data from other reporting agencies
as reported in AIRS. Users should review available QA/QC information to assure that the external data
are comparable with Toxa City measurements and that the original data generator had an acceptable
QA program in place.
Lead and Speciated Particulate Data
The TCAPCD has been routinely monitoring airborne lead since 1981. Early data are likely to be
problematic because of significantly higher detection limits. Caution is needed in directly comparing these
historical data with current data because of the difference in size fractions.
Existing chemical speciation data for elements other than lead are very limited. Some speciation data
from PM2.5 Speciation Samples were obtained by the Toxa City Institute of Technology in cooperation
with the District of Health during a 1999 research study sponsored by the U.S. EPA. These results may
be used to provide a historical baseline for the speciation results to be obtained by the PM2.5 Ambient
Air Quality Monitoring Program; however, it is unclear whether the quality of these data is sufficient to
allow direct comparison with new toxics data.
U.S. Weather Service Data
Meteorological information is gathered from the U.S. Weather Service station at the Toxa City
International Airport. Parameters include: temperature, relative humidity, barometric pressure, rainfall,
wind speed, wind direction, cloud type/layers, percentage cloud cover and visibility range. Historically,
these data have not been used to calculate pollutant concentration values for any of the Toxa City
monitoring sites, which each have the required meteorological sensors. However, NWS data are often
included in summary reports. No changes to the way in which these data are collected are anticipated
due to the addition of the air toxics data to the Toxa City Air Pollution Control District.
-------
Project: Model QAPP
Element No: 19
Revision No: 1
Date: 7/5/01
Page 1 of 11
19.0 Data Management
19.1 Background and Overview
This element should present an overview of all mathematical operations and analyses performed on raw
("as-collected") data to change their form of expression, location, quantity, or dimensionality. These operations
include data recording, validation, transformation, transmittal, reduction, analysis, management, storage, and
retrieval. A diagram that illustrates the source(s) of the data, the processing steps, the intermediate and final data
files, and the reports produced may be helpful, particularly when there are multiple data sources and data files.
When appropriate, the data values should be subjected to the same chain-of-custody requirements as outlined in
element B3. Appendix G has further details.
This section describes the data management operations pertaining to measurements for the air toxics
stations operated by TCAPCD. This includes an overview of the mathematical operations and analyses
performed on raw ("as-collected") data. These operations include data recording, validation,
transformation, transmittal, reduction, analysis, management, storage, and retrieval.
Data processing for air toxics data is summarized in Figure 19-1. Data processing steps are
integrated, to the extent possible, into the existing data processing system used for the TCAPCD air
toxics network. The data base resides on a machine running the Windows NT Server operating system,
which is also the main file server for the Air Quality Division. This machine is shown in the upper left of
Figure 19-1.
The sample tracking and chain of custody information are entered into the Laboratory Information
Management System (LIMS) at four main stages as shown in Figure 19-1. Managers are able to
obtain reports on the status of samples, the location of specific samples, etc., using the LIMS. All users must be
authorized by the Manager, Air Quality Division, and receive a password necessary to log on to the
LIMS. Different privileges are given to each authorized user depending on that person's need. The
following privilege levels are defined:
• Data Entry Privilege - The individual may see and modify only data within the LIMS that he or she has
personally entered. After a data set has been "committed" to the system by the data entry
operator, all further changes will generate entries in the system audit trail;
• Reporting Privilege - The individual may view data and generate reports but has no additional privileges;
• Data Administration Privilege - Data Administrators for the LIMS are allowed to change data as
a result of QA screening and related reasons. All operations resulting in changes to data values
are logged to the audit trail. The Data Administrator is responsible for performing the following
tasks on a regular basis:
• Merging/correcting the duplicate data entry files;
• Running verification/validation routines, correcting data as necessary and
generating summary data reports for management;
-------
Project: Model QAPP
Element No: 19
Revision No: 1
Date: 7/5/01
Page 2 of 11
• Uploading verified/validated data to EPA AIRS.
Figure 19.1 Data Management and Sample Flow Diagram
-------
Project: Model QAPP
Element No: 19
Revision No: 1
Date: 7/5/01
Page 3 of 11
19.2 Data Recording
Any internal checks (including verification and validation checks) that will be used to ensure data quality
during data encoding in the data entry process should be identified together with the mechanism for detailing and
correcting recording errors. Examples of data entry forms and checklists should be included.
Data entry, validation, and verification functions are all integrated in the LIMS. Bench sheets shown in
Figure 19.1 are entered by laboratory personnel. Procedures for filling out the laboratory sheets and
for subsequent data entry are provided in the SOPs listed in Table 19.1.
19.3 Data Validation
The details of the process of data validation and pre-specified criteria should be documented in this element
of the QAPP. This element should address how the method, instrument, or system performs the function it is
intended to perform consistently, reliably, and accurately in generating the data. Part D of this document addresses the
overall project data validation, which is performed after the project has been completed.
Data validation is a combination of checking that data processing operations have been carried out
correctly and of monitoring the quality of the field operations. Data validation can identify problems in
either of these areas. Once problems are identified, the data can be corrected or invalidated, and
corrective actions can be taken for field or laboratory operations. Numerical data stored in the LIMS
are never internally overwritten by condition flags. Flags denoting error conditions or QA status are
saved as separate fields in the data base, so that it is possible to recover the original data.
The following validation functions are incorporated into the LIMS to ensure the quality of data entry and data
processing operations:
• Duplicate Key Entry - the following data are subjected to duplicate entry by different operators:
filter weight reports, field data sheets, chain of custody sheets. The results of duplicate key entry are
compared and errors are corrected at biweekly intervals. The method for entering the data is
given in the SOPs. Procedures for reconciling the duplicate entries are also given in the SOPs.
• Range Checks - almost all monitored parameters have simple range checks programmed in. For
example, valid times must be between 00:00 and 23:59, summer temperatures must be between 10
and 50 degrees Celsius, etc. The data entry operator is notified immediately when an entry is out of
range. The operator has the option of correcting the entry or overriding the range limit. The specific
values used for range checks may vary depending on season and other factors. Since these range
limits for data input are not regulatory requirements, the Air Division QA Officer may adjust them
from time to time to better meet quality goals.
• Completeness Checks - When the data are processed certain completeness criteria must be met.
For example, each sample must have a start time, an end time, an average flow rate, dates weighed
-------
Project: Model QAPP
Element No: 19
Revision No: 1
Date: 7/5/01
Page 4 of 11
or analyzed and operator and technician names. The data entry operator will be notified if an
incomplete record has been entered before the record can be closed.
• Internal Consistency and Other Reasonableness Checks - Several other internal consistency
checks are built into the LIMS. For example, the end time of a sample must be greater than the start
time. Computed filter volume (integrated flow) must be approximately equal to the exposure time
multiplied by the nominal flow. Additional consistency and other checks will be implemented as the
result of problems encountered during data screening.
• Data Retention - Raw data sheets are retained on file in the Air Quality Division office for a
minimum of five years, and are readily available for audits and data verification activities. After five
years, hardcopy records and computer backup media are cataloged and boxed for storage at the
Toxa City Services Warehouse. Physical samples such as filters shall be discarded with appropriate
attention to proper disposal of potentially hazardous materials.
• Statistical Data Checks - Errors found during statistical screening will be traced back to original
data entry files and to the raw data sheets, if necessary. These checks shall be run on a monthly
schedule and prior to any data submission to AIRS. Data validation is the process by which raw
data are screened and assessed before they can be included in the main data base (i.e., the LIMS).
• Sample Batch Data Validation - discussed in Section 23, this check associates flags that are
generated by QC values outside of acceptance criteria with a sample batch. Batches containing too
many flags will be rerun and/or invalidated.
Table 19.1 summarizes the validation checks applicable to the data.
Table 19.1 Validation Check Summaries

Type of Data Check                               Electronic          Manual     Automated
                                                 Transmission        Checks     Checks
                                                 and Storage
Data Parity and Transmission Protocol Checks          •
Duplicate Key Entry                                                    •            •
Date and Time Consistency                                                           •
Completeness of Required Fields                                                     •
Range Checking                                                         •            •
Statistical Outlier Checking                                           •
Manual Inspection of Charts and Reports                                •
Field and Lab Blank Checks                                             •
The objective of the TCAPCD will be to optimize the performance of its monitoring equipment.
Initially, the results of collocated operations will be control charted (see Section 14). From these charts,
control limits will be established to flag potential problems.
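For illustration only, the following sketch (in Python) shows how automated range, completeness, and
internal consistency checks of the kind described above might be implemented. The field names, limits,
and flag codes are assumptions made for this example and do not represent the actual TCAPCD LIMS
implementation.

    # Illustrative sketch only -- field names, limits, and flags are assumed,
    # not taken from the TCAPCD LIMS.
    SUMMER_TEMP_RANGE_C = (10.0, 50.0)            # example range-check limits
    REQUIRED_FIELDS = ["start_time", "end_time", "avg_flow_lpm",
                       "date_analyzed", "operator", "technician"]

    def validate_record(rec, nominal_flow_lpm, tolerance=0.10):
        """Return a list of flags for one sample record (a dict)."""
        flags = []
        # Completeness check: every required field must be present.
        for field in REQUIRED_FIELDS:
            if rec.get(field) in (None, ""):
                flags.append("MISSING:" + field)
        # Range check: summer ambient temperature limits.
        temp = rec.get("ambient_temp_c")
        if temp is not None and not (SUMMER_TEMP_RANGE_C[0] <= temp <= SUMMER_TEMP_RANGE_C[1]):
            flags.append("RANGE:ambient_temp_c")
        # Internal consistency: the end time must follow the start time.
        if rec.get("start_time") and rec.get("end_time") and rec["end_time"] <= rec["start_time"]:
            flags.append("CONSISTENCY:end_before_start")
        # Reasonableness: integrated volume ~ nominal flow x exposure time.
        if rec.get("volume_m3") is not None and rec.get("exposure_min") is not None:
            expected_m3 = nominal_flow_lpm * rec["exposure_min"] / 1000.0
            if abs(rec["volume_m3"] - expected_m3) > tolerance * expected_m3:
                flags.append("CONSISTENCY:volume_vs_flow")
        return flags

    record = {"start_time": "00:00", "end_time": "23:59", "avg_flow_lpm": 16.7,
              "date_analyzed": "2001-07-05", "operator": "JS", "technician": "KM",
              "ambient_temp_c": 27.5, "volume_m3": 24.0, "exposure_min": 1439}
    print(validate_record(record, nominal_flow_lpm=16.7))   # prints []

In practice the specific limits would be taken from the SOPs and adjusted by the Air Division QA
Officer, and a non-empty flag list would be stored in separate flag fields rather than overwriting the
numerical data.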
-------
Project: Model QAPP
Element No: 19
Revision No: 1
Date: 7/5/01
Page 5 of 11
19.4 Data Transformation
Data transformation is the conversion of individual data point values into related values or possibly symbols
using conversion formulas (e.g., units conversion or logarithmic conversion) or a system for replacement. The
transformations can be reversible (e.g., as in the conversion of data points using a formula) or irreversible (e.g.,
when a symbol replaces actual values and the value is lost). The procedures for all data transformations should be
described and recorded in this element. The procedure for converting calibration readings into an equation that
will be applied to measurement readings should be documented in the QAPP. Transformation and aggregation of
data for statistical analysis should be outlined in element D3, "Reconciliation with Data Quality Objectives."
Calculations for transforming raw data from measured units to final concentrations are relatively
straightforward.
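As a simple illustration, the sketch below converts a collected mass and an integrated sample volume
into a concentration in ug/m3. The numbers and the function name are assumptions used only for the
example; actual calculations follow the applicable TO Compendium method and SOPs.

    # Illustrative only: concentration = collected mass / sampled air volume.
    def concentration_ug_per_m3(mass_gain_ug, avg_flow_lpm, exposure_min):
        volume_m3 = avg_flow_lpm * exposure_min / 1000.0   # liters to cubic meters
        return mass_gain_ug / volume_m3

    # Example: 120 ug collected at 16.7 L/min over 24 hours (1440 min);
    # sampled volume = 24.048 m3, so the concentration is about 4.99 ug/m3.
    print(round(concentration_ug_per_m3(120.0, 16.7, 1440.0), 2))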
19.5 Data Transmittal
Data transmittal occurs when data are transferred from one person or location to another or when data are
copied from one form to another. Some examples of data transmittal are copying raw data from a notebook onto a
data entry form for keying into a computer file and electronic transfer of data over a telephone or computer
network. The QAPP should describe each data transfer step and the procedures that will be used to characterize
data transmittal error rates and to minimize information loss in the transmittal.
Data transmittal occurs when data are transferred from one person or location to another or when data
are copied from one form to another. Some examples of data transmittal are copying raw data from a
notebook onto a data entry form for keying into a computer file and electronic transfer of data over a
telephone or computer network. Table 19.2 summarizes data transfer operations.
Table 19.2 Data Transfer Operations

Description of Data Transfer: Keying Data into the LIMS
Originator: Laboratory Technician (hand-written data form)
Recipient: Data Processing Personnel
QA Measures Applied: Double Key Entry

Description of Data Transfer: Electronic data transfer (between computers or over network)
QA Measures Applied: Parity Checking; transmission protocols

Description of Data Transfer: Filter Receiving and Chain-of-Custody
Originator: Shipping and Receiving Clerk
Recipient: The LIMS computer (shipping clerk enters data at a local terminal)
QA Measures Applied: Sample numbers are verified automatically; reports indicate missing filters and/or incorrect data entries

Description of Data Transfer: Calibration and Audit Data
Originator: Auditor or field supervisor
Recipient: Air Quality Field Supervisor
QA Measures Applied: Entries are checked by Air Quality Supervisor and QA Officer

Description of Data Transfer: AIRS data summaries
Originator: Air Quality Supervisor
Recipient: AIRS (U.S. EPA)
QA Measures Applied: Entries are checked by Air Quality Supervisor and QA Officer
The TCAPCD will report all ambient air quality data and information specified by the AIRS Users Guide
-------
Project: Model QAPP
Element No: 19
Revision No: 1
Date: 7/5/01
Page 6 of 11
(Volume II, Air Quality Data Coding, and Volume III, Air Quality Data Storage), coded in the
AIRS-AQS format. Such air quality data and information will be fully screened and validated and will
be submitted directly to the AIRS-AQS via electronic transmission, in the format of the AIRS-AQS,
and in accordance with the quarterly schedule. The specific quarterly reporting periods and due dates
are shown in the Table 19.3.
Table 19.3 Data Reporting Schedule

Reporting Period                Due Date
January 1 - March 31            June 30
April 1 - June 30               September 30
July 1 - September 30           December 31
October 1 - December 31         March 31
19.6 Data Reduction
Data reduction includes all processes that change the number of data items. This process is distinct from data
transformation in that it entails an irreversible reduction in the size of the data set and an associated loss of detail.
For manual calculations, the QAPP should include an example in which typical raw data are reduced. For
automated data processing, the QAPP should clearly indicate how the raw data are to be reduced with a
well-defined audit trail, and reference to the specific software documentation should be provided.
Data reduction processes involve aggregating and summarizing results so that they can be understood
and interpreted in different ways. Since air toxics monitoring is not subject to regulatory requirements such as the
NAAQS, the data are not required to be reported regularly to the U.S. EPA. Examples of
data summaries include:
• average concentration for a station or set of stations for a specific time period;
• accuracy, bias, and precision statistics;
• data completeness reports based on numbers of valid samples collected during a specified
period.
The Audit Trail is another important concept associated with data transformations and reductions. An
audit trail is a data structure that provides documentation for changes made to a data set during
processing. Typical reasons for data changes that would be recorded include the following:
• corrections of data input due to human error;
• application of revised calibration factors;
• addition of new or supplementary data;
-------
Project: Model QAPP
Element No: 19
Revision No: 1
Date: 7/5/01
Page 7 of 11
• flagging of data as invalid or suspect;
• logging of the date and times when automated data validation programs are run.
The audit trail is implemented as a separate table in a relational data base. Audit trail records will include
the following fields:
• operator's identity (ID code);
• date and time of the change;
• table and field names for the changed data item;
• reason for the change;
• full identifying information for the item changed (date, time, site location, parameter, etc.);
• value of the item before and after the change.
When routine data screening programs are run, the following additional data are recorded in the audit
trail:
• version number of the screening program;
• values of screening limits (e.g., upper and lower acceptance limits for each parameter);
• numerical value of each data item flagged and the flag applied.
The audit trail is produced automatically and can only document changes; there is no "undo" capability
for reversing changes after they have been made. Available reports based on the audit trail include:
• log of routine data validation, screening, and reporting program runs;
• report of data changes by station for a specified time period;
• report of data changes for a specified purpose;
• report of data changes made by a specified person.
Because of storage requirements, the System Administrator must periodically move old audit trail
records to backup media. Audit trail information will not be moved to backup media until after the data
are reported to AIRS. All backups will be retained so that any audit trail information can be retrieved
for at least three years.
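For illustration, a minimal sketch of an audit trail record containing the fields listed above is shown
below in Python. The class and field names are assumptions for the example and are not the actual
data base schema.

    # Illustrative audit trail record -- names are assumed, not the actual schema.
    from dataclasses import dataclass
    from datetime import datetime
    from typing import Optional

    @dataclass
    class AuditTrailRecord:
        operator_id: str             # operator's identity (ID code)
        changed_at: datetime         # date and time of the change
        table_name: str              # table containing the changed data item
        field_name: str              # field containing the changed data item
        reason: str                  # reason for the change
        item_key: dict               # identifying info (date, time, site, parameter, ...)
        old_value: Optional[float]   # value before the change
        new_value: Optional[float]   # value after the change

    # Example entry documenting a correction of a data input error:
    rec = AuditTrailRecord(
        operator_id="jdoe",
        changed_at=datetime(2001, 7, 5, 14, 30),
        table_name="sample_results",
        field_name="benzene_ppbv",
        reason="correction of data input due to human error",
        item_key={"site": "TC-01", "sample_date": "2001-06-12", "parameter": "benzene"},
        old_value=1.32,
        new_value=1.23,
    )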
19.7 Data Summary
Data analysis sometimes involves comparing suitably reduced data with a conceptual model (e.g., a dispersion
model or an infectivity model). It frequently includes computation of summary statistics, standard errors,
confidence intervals, tests of hypotheses relative to model parameters, and goodness-of-fit tests. This element
should briefly outline the proposed methodology for data analysis and a more detailed discussion should be
included in the final report.
-------
Project: Model QAPP
Element No: 19
Revision No: 1
Date: 7/5/01
Page 8 of 11
The TCAPCD is currently implementing the data summary and analysis program. It is anticipated that as
the Monitoring Program develops, additional data analysis procedures will be developed. The following
specific summary statistics will be tracked and reported for the network:
• Single sampler bias or accuracy (based on audit flow checks and laboratory audits);
• Single sampler precision (based on collocated data);
• Network-wide bias and precision;
• Data completeness.
Equations used for these reports are given in Table 19.4.
Table 19.4 Report Equations
Criterion
Equation
Accuracy of Single Sampler Flow
- Single Check (d;) X, is reference
flow; Y; is measured flow
100
Bias of a Single Sampler - Annual
Basis (Dj)- average of individual
percent differences between sampler
and reference value; rij is the number
of measurements over the period
i-i
Percent Difference for a Single
Check (di) - X; and Y; are
concentrations from the primary and
duplicate samplers, respectively.
100
Upper 95% Confidence Limit
Limit =d, +L96*S,•/• 2
Lower 95% Confidence Limit
Limit =d,_L96*S, /• 2
Completeness
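For illustration, the following Python sketch computes the Table 19.4 statistics from a set of checks.
The function names and the example data are assumptions made only for the example.

    # Sketch of the Table 19.4 statistics; data values are illustrative.
    from math import sqrt
    from statistics import mean, stdev

    def percent_difference(measured, reference):
        """d_i = 100 * (Y_i - X_i) / X_i for one check."""
        return 100.0 * (measured - reference) / reference

    def bias_and_confidence_limits(diffs):
        """Mean percent difference and approximate 95% confidence limits."""
        d_bar = mean(diffs)
        s = stdev(diffs)
        half_width = 1.96 * s / sqrt(2)
        return d_bar, d_bar - half_width, d_bar + half_width

    def completeness(valid, scheduled):
        return 100.0 * valid / scheduled

    # Example: collocated concentration pairs (primary X_i, duplicate Y_i)
    pairs = [(1.20, 1.25), (0.95, 0.91), (2.10, 2.22)]
    diffs = [percent_difference(y, x) for x, y in pairs]
    print(bias_and_confidence_limits(diffs))
    print(completeness(valid=58, scheduled=61))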
-------
Project: Model QAPP
Element No: 19
Revision No: 1
Date: 7/5/01
Page 9 of 11
19.8 Data Tracking
Data management includes tracking the status of data as they are collected, transmitted, and processed. The
QAPP should describe the established procedures for tracking the flow of data through the data processing
system.
The LIMS contains the input functions and reports necessary to track and account for the
whereabouts of filters and the status of data processing operations for specific data. Information about
filter location is updated at distributed data entry terminals at the points of significant operations. The
following input locations are used to track sample location and status:
• Laboratory (initial receipt)
- Sample receipt (by lot);
- Pre-sampling processing or weighing (individual filter or cartridge number first enters the
system);
- Canister number (VOC only);
• Filter packaged for the laboratory (filter numbers in each package are recorded);
• Shipping (package numbers are entered for both sending and receiving);
• Laboratory (receipt from field)
- Package receipt (package is opened and filter numbers are logged in);
- Filter post-sampling weighing;
- Filter archival.
In most cases the tracking data base and the monitoring data base are updated simultaneously. For
example, when the filter is pre-weighed, the weight is entered into the monitoring data base and the filter
number and status are entered into the tracking data base. For the VOC system, the sample handling is
different. The VOC canisters are reused many times before they are retired from field use. Each
canister has its own unique code that designates the can number. When the canister is sent into the field,
a canister number becomes a portion of the tracking code. This allows the sample that was in the
canister to be tracked.
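As an illustration, the sketch below shows how a canister number might be folded into a tracking code
and how a location/status update could be logged. The code format and the status values are
assumptions, not the actual LIMS conventions.

    # Illustrative only -- tracking code format and statuses are assumed.
    def make_tracking_code(sample_id, canister_id=None):
        """For VOC samples, append the canister number to the tracking code."""
        return f"{sample_id}-{canister_id}" if canister_id else sample_id

    def log_status(tracking_log, tracking_code, location, status):
        """Record where a sample is and what has been done to it."""
        tracking_log.setdefault(tracking_code, []).append((location, status))

    log = {}
    code = make_tracking_code("TC2001-0457", canister_id="CAN-031")
    log_status(log, code, "field site TC-01", "sampling complete")
    log_status(log, code, "laboratory", "received from field")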
The Air Division Branch Chief or designee is responsible for tracking sample status at least twice per
week and following up on anomalies such as excessive holding time in the laboratory before analysis.
-------
Project: Model QAPP
Element No: 19
Revision No: 1
Date: 7/5/01
Page 10 of 11
19.9 Data Storage and Retrieval
The QAPP should discuss data storage and retrieval including security and time of retention, and it should
document the complete control system. The QAPP should also discuss the performance requirements of the data
processing system, including provisions for the batch processing schedule and the data storage facilities.
Data archival policies are shown in Table 19.5.
Table 19.5 Data Archive Policies

Data Type: Weighing records; chain of custody forms
Medium: Hardcopy; Location: Laboratory; Retention Time: 3 years; Final Disposition: Discarded

Data Type: Laboratory Notebooks
Medium: Hardcopy; Location: Laboratory; Retention Time: 3 years; Final Disposition: N/A

Data Type: Field Notebooks
Medium: Hardcopy; Location: Air Quality Division; Retention Time: 3 years; Final Disposition: Discarded

Data Type: Data Base (excluding Audit Trail records)
Medium: Electronic (on-line); Location: Air Quality Division; Retention Time: indefinite (may be
moved to backup media after 5 years); Final Disposition: Backup tapes retained indefinitely

Data Type: Audit Trail records
Medium: Hardcopy and electronic reports; Location: Air Quality Division; Retention Time: 3 years;
Final Disposition: N/A

Data Type: TSP Quartz filters
Medium: Filters; Location: Laboratory; Retention Time: 1 year; Final Disposition: Discarded

Data Type: PUF
Medium: Foam; Location: Laboratory; Retention Time: reused after cleaning; Final Disposition: Discarded

Data Type: VOC canisters
Medium: Metal can; Location: Laboratory; Retention Time: reused after cleaning; Final Disposition: Recycled

Data Type: DNPH cartridge
Medium: Plastic cartridge; Location: Laboratory; Retention Time: 6 months; Final Disposition: Discarded
The data reside on a Local Area Network on the TCAPCD server. This computer has the following
specifications:
• Storage: 18 GB (SCSI RAID 0 array);
• Backup: DAT (3 GB per tape) - incremental backups daily; full backups biweekly;
• Network: Windows NT, 100 Mbps Ethernet network (currently 23 Windows 95 and NT
workstations on site; additional workstations via 28.8 kbps dial-in modem);
-------
Project: Model QAPP
Element No: 19
Revision No: 1
Date: 7/5/01
Page 11 of 11
• Security: Password protection on all workstations and dial-in lines; Additional password
protection applied by application software.
Security of data in the data base is ensured by the following controls:
• Password protection on the data base that defines three levels of access to the data;
• Regular password changes (quarterly for continuing personnel; passwords for personnel leaving
the Air Division will be canceled immediately);
• Independent password protection on all dial-in lines;
• Logging of all incoming communication sessions, including the originating telephone number, the
user's ID, and connect times;
• Storage of media including backup tapes in locked, restricted access areas.
-------
Project: Model QAPP
Element No: 20
Revision No: 1
Date: 7/5/01
Page 1 of 8
20.0 Assessments and Response Actions
During the planning process, many options for sampling design (ref EPA QA/G-5S, Guidance on Sampling
Design to Support QAPPs), sample handling, sample cleanup and analysis, and data reduction are evaluated and
chosen for the project. In order to ensure that the data collection is conducted as planned, a process of evaluation
of the collected data is necessary. This element of the QAPP describes the internal and external checks necessary
to ensure that:
all elements of the QAPP are correctly implemented as prescribed,
the quality of the data generated by implementation of the QAPP is adequate, and
corrective actions, when needed, are implemented in a timely manner and their effectiveness is confirmed.
Although any external assessments that are planned should be described in the QAPP, the most important
part of this element is documenting all planned internal assessments. Generally, internal assessments are initiated
or performed by the internal QA Officer so the activities described in this element of the QAPP should be related to
the responsibilities of the QA Officer.
An assessment is defined as an evaluation process used to measure the performance or effectiveness of
the quality system, the establishment of the monitoring network and sites, and the various measurement
phases of the data operation.
The results of quality assurance assessments indicate whether the control efforts are adequate or need to
be improved. Documentation of all quality assurance and quality control efforts implemented during the
data collection, analysis, and reporting phases is important to data users, who can then consider the
impact of these control efforts on the data quality (see Section 21). Both qualitative and quantitative
assessments of the effectiveness of these control efforts will identify those areas most likely to impact the
data quality and to what extent. In order to ensure the adequate performance of the quality system, the
TCAPCD, in conjunction with the State and the EPA Regional Office, will perform the following assessments:
20.1 ASSESSMENT ACTIVITIES AND PROJECT PLANNING
20.1.1 Management Systems Review
A management systems review (MSR) is a qualitative assessment of a data collection operation or
organization to establish whether the prevailing quality management structure, policies, practices, and
procedures are adequate. MSRs are conducted every three years by the QA Division. The MSR will use
appropriate regulations, and the QAPP to determine the adequate operation of the air program and its
related quality system. The quality assurance activities for all criteria pollutants, as well as air toxics, will be
part of the MSR. The QA Office Director's staff will report its findings to the appropriate Divisions
within 30 days of completion of the MSR. The report will be appropriately filed. Follow-up and
progress on corrective action(s) will be determined during regularly scheduled division directors' meetings.
-------
Project: Model QAPP
Element No: 20
Revision No: 1
Date: 7/5/01
Page 2 of 8
20.1.2 Network Reviews
Conformance with network requirements is evaluated through periodic review of the monitoring network. The
network review is used to determine how well a particular air monitoring network is achieving its required
air monitoring objective and how it should be modified to continue to meet that objective. The network
review will be accomplished every 3 years. Since the states are also required to perform these reviews,
the District will coordinate its activity with the State in order to perform the activity at the same time (if
possible). The Air Monitoring Branch will be responsible for conducting the network review.
The following criteria will be considered during the review:
• date of last review;
• areas where attainment/nonattainment redesignations are taking place or are likely to take place;
• results of special studies, saturation sampling, point source oriented ambient monitoring, etc.;
• proposed network modifications since the last network review.
In addition, pollutant-specific priorities may be considered in areas where models show that persons may be
at risk.
Prior to the implementation of the network review, significant data and information pertaining to the
review will be compiled and evaluated. Such information might include the following:
• network files (including updated site information and site photographs);
• AIRS reports (AMP220, 225, 380, 390, 450);
• air quality summaries for the past five years for the monitors in the network;
• air toxics emissions trends reports for major metropolitan area;
• emission information, such as emission density maps for the region in which the monitor is
located and emission maps showing the major sources of emissions;
• National Weather Service summaries for monitoring network area.
Upon receipt, the information will be checked to ensure it is the most current. Discrepancies will be
noted on the checklist and resolved during the review. Files and/or photographs that need to be updated
will also be identified. The following categories will be emphasized during network reviews:
Adequacy of the network will be determined by using the following information:
• maps of historical monitoring data;
• maps of emission densities;
• dispersion modeling;
• special studies/saturation sampling;
• best professional judgement;
-------
Project: Model QAPP
Element No: 20
Revision No: 1
Date: 7/5/01
Page 3 of 8
• SIP requirements;
• GIS updates.
The number of samplers operating can be determined from the AMP220 report in AIRS. The number
of monitors required, based on concentration levels and population, can be determined from the
AMP450 report and the latest census population data.
Location of Monitors- Adequacy of the location of monitors can only be determined on the basis of
stated objectives. Maps, graphical overlays, and GIS-based information will be helpful in visualizing or
assessing the adequacy of monitor locations. Plots of potential emissions and/or historical monitoring data
versus monitor locations will also be used.
During the network review, the stated objective for each monitoring location or site (see section 10)
will be "reconfirmed" and the spatial scale "reverified" and then compared to each location to determine
whether these objectives can still be attained at the present location.
Probe Siting Requirements- The on-site visit will consist of physical measurements and
observations to determine the best locations. Prior to the site visit, the reviewer will obtain and review the
following:
• most recent hard copy of site description (including any photographs);
• data on the seasons with the greatest potential for high concentrations for specified pollutants;
• predominant wind direction by season.
A checklist similar to the checklist used by the EPA Regional offices during their scheduled network
reviews will be used. This checklist can be found in the SLAMS/NAMS/PAMS Network Review
Guidance, which is intended to assist the reviewers. In addition to the items on the checklist, the
reviewer will also perform the following tasks:
• ensure that the inlet is clean;
• record findings in field notebook and/or checklist;
• take photographs/videotape in the 8 directions;
• document site conditions, with additional photographs/videotape.
Other Discussion Topics- In addition to the items included in the checklists, other subjects for
discussion as part of the network review and overall adequacy of the monitoring program will include:
• installation of new monitors;
• relocation of existing monitors;
• siting criteria problems and suggested solutions;
• problems with data submittals and data completeness;
• maintenance and replacement of existing monitors and related equipment;
-------
Project: Model QAPP
Element No: 20
Revision No: 1
Date: 7/5/01
Page 4 of 8
• quality assurance problems;
• air quality studies and special monitoring programs;
• other issues;
-proposed regulations;
-funding.
A report of the network review will be written within two months of the review and appropriately filed.
20.1.3 Technical Systems Audits
A TSA is a thorough and systematic on-site qualitative audit, where facilities, equipment, personnel,
training, procedures, and record keeping are examined for conformance to the QAPP. TSAs of the
network will be accomplished every three years and will be staggered with the required TSA conducted by the
State QA Office. The QA Office will implement the TSA either as a team or as an individual auditor. The
QA Office will perform three TSA activities that can be accomplished separately or combined:
• Field - handling, sampling, shipping;
• Laboratory - Pre-sampling, shipping, receiving, post-sampling weighing, analysis, archiving, and
associated QA/QC;
• Data management - Information collection, flagging, data editing, security, upload.
Key personnel to be interviewed during the audit are those individuals with responsibilities for: planning,
field operations, laboratory operations, QA/QC, data management, and reporting.
To increase uniformity of the TSA, an audit checklist will be developed and used. This checklist is
based on the EPA R-5 guidance.
The audit team will prepare a brief written summary of findings, organized into the following areas:
planning, field operations, laboratory operations, quality assurance/quality control, data management, and
reporting. Problems with specific areas will be discussed and an attempt made to rank them in order of
their potential impact on data quality.
The audit finding form has been designed such that one is filled out for each major deficiency that
requires formal corrective action. The finding should include items such as the systems impacted, the estimated time
period of the deficiency, the site(s) affected, and the reason for the action. The finding form will inform the Division about
serious problems that may compromise the quality of the data and therefore require specific corrective
actions. Finding forms are initiated by the Audit Team and discussed at the debriefing. During the debriefing, if
the audited group is in agreement with the finding, the form is signed by the group's branch manager or his
designee during the exit interview. If a disagreement occurs, the Audit Team will record the opinions of
the group audited and set a later date to address the finding at issue.
Post-Audit Activities- The major post-audit activity is the preparation of the systems audit report.
-------
Project: Model QAPP
Element No: 20
Revision No: 1
Date: 7/5/01
Page 5 of 8
The report will include:
• audit title and number and any other identifying information;
• audit team leaders, audit team participants and audited participants;
• background information about the project, purpose of the audit, dates of the audit; particular
measurement phase or parameters that were audited, and a brief description of the audit
process;
• summary and conclusions of the audit and corrective actions required;
• attachments or appendices that include all audit evaluations and audit finding forms.
To prepare the report, the audit team will meet and compare observations with collected documents
and results of interviews and discussions with key personnel. Expected QA Project Plan implementation
is compared with observed accomplishments and deficiencies and the audit findings are reviewed in
detail. Within thirty (30) calendar days of the completion of the audit, the audit report will be prepared
and submitted. The systems audit report will be submitted to the appropriate branch managers and
appropriately filed.
If the branch has written comments or questions concerning the audit report, the Audit Team will review
and incorporate them as appropriate, and subsequently prepare and resubmit a report in final form within
thirty (30) days of receipt of the written comments. The report will include an agreed-upon schedule for
corrective action implementation.
Follow-up and Corrective Action Requirements- The QA Office and the audited organization may
work together to resolve required corrective actions. As part of corrective action and follow-up, an audit
finding response letter will be generated by the audited organization. The audit finding response letter will
address what actions are being implemented to correct the finding of the TSA. The audit response letter
will be completed by the audited organization within 30 days of acceptance of the audit report.
20.1.4 Performance Audit
A Performance Audit is a field operations audit that ascertains whether the samplers are operating
within the specified limits as stated in the SOPs and QAPP. The Performance Audit is performed every
year in conjunction with the field TSA. The audit consists of challenging the samplers using
independent NIST-traceable orifices or other flow devices. Once the audit has been performed, the
flow rate is calculated and compared against the flow rates specified in the QAPP or SOPs. If the
flow rates are not within these ranges, then the field operations technician is notified and corrective action
ensues. Once the field technicians have remedied the situation, a post audit confirms the adjustment or
maintenance. The audit results are then written in a detailed report and are included in the QAAR.
-------
Project: Model QAPP
Element No: 20
Revision No: 1
Date: 7/5/01
Page 6 of 8
20.1.5 Data Quality Assessments
A data quality assessment (DQA) is the statistical analysis of environmental data to determine whether
the quality of data is adequate to support the decisions that are based on the DQOs. Data are
appropriate if the level of uncertainty in a decision based on the data is acceptable. The DQA process is
described in detail in Guidance for the Data Quality Assessment Process, EPA QA/G-9 and is
summarized below.
1. Review the data quality objectives (DQOs) and sampling design of the program: review the
DQO. Define statistical hypothesis, tolerance limits, and/or confidence intervals.
2. Conduct preliminary data review. Review Precision & Accuracy (P&A) and other available
QA reports, calculate summary statistics, plots and graphs. Look for patterns, relationships, or
anomalies.
3. Select the statistical test: select the best test for analysis based on the preliminary review, and
identify underlying assumptions about the data for that test.
4. Verify test assumptions: decide whether the underlying assumptions made by the selected test
hold true for the data and determine the consequences if they do not.
5. Perform the statistical test: perform test and document inferences. Evaluate the performance
for future use.
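As a simple illustration of steps 2 through 5, the sketch below checks whether the 95% confidence
interval for sampler bias falls within an assumed DQO acceptance goal of +/- 25%. The goal and the
example data are assumptions; the actual limits come from the DQO process referenced in element A7.

    # Minimal DQA-style check: is the 95% confidence interval for bias
    # within an assumed acceptance goal? Data and goal are illustrative.
    from math import sqrt
    from statistics import mean, stdev

    def bias_within_goal(percent_diffs, goal=25.0):
        d_bar = mean(percent_diffs)
        half_width = 1.96 * stdev(percent_diffs) / sqrt(len(percent_diffs))
        lower, upper = d_bar - half_width, d_bar + half_width
        return (-goal <= lower) and (upper <= goal)

    audit_diffs = [3.1, -4.2, 7.8, 1.5, -2.3, 5.0]   # example audit % differences
    print(bias_within_goal(audit_diffs))             # True when the goal is met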
Data quality assessment will be included in the QAAR. Details of these reports are discussed in
Section 21.
Measurement uncertainty will be estimated for both automated and manual methods. Terminology
associated with measurement uncertainty is found within 40 CFR Part 58 Appendix A and includes: (a)
Precision - a measurement of mutual agreement among individual measurements of the same property
usually under prescribed similar conditions, expressed generally in terms of the standard deviation; (b)
Accuracy- the degree of agreement between an observed value and an accepted reference value,
accuracy includes a combination of random error (precision) and systematic error (bias) components
which are due to sampling and analytical operations; (c) Bias- the systematic or persistent distortion of a
measurement process which causes errors in one direction. The individual results of these tests for each
method or analyzer shall be reported to EPA.
Estimates of the data quality will be calculated on the basis of single monitors and aggregated to all
monitors.
20.1.6 Performance Evaluations
-------
Project: Model QAPP
Element No: 20
Revision No: 1
Date: 7/5/01
Page 7 of 8
The PE is an assessment tool for the laboratory operations. The State's Laboratory Division creates
"blind" samples and sends them periodically to the District's laboratory. Upon receipt, the laboratory
logs in the samples and handles them using the same routines as any other sample. The PE sample is analyzed in
accordance with the SOPs and QAPP. The results are then sent to the Laboratory Branch Manager for
final review. Then the results are reported to the State's Laboratory Director. The State's laboratory
writes up a PE report and sends a copy of the results to the Laboratory Branch Manager and the EPA
QA Office. Any results outside of the State's acceptance criteria are then noted in the PE report. The
TCAPCD has 120 days to address any deficiencies noted in the PE Report.
20.2 Documentation of Assessments
The following material describes what should be documented in a QAPP after consideration of the above
issues and types of assessments:
Number, Frequency, and Types of Assessments- Depending upon the nature of the project, there may be more than
one assessment. A schedule of the number, frequencies, and types of assessments required should be given.
Assessment Personnel- The QAPP should specify the individuals, or at least the specific organizational units, who
will perform the assessments. Internal audits are usually performed by personnel who work for the organization
performing the project work but who are organizationally independent of the management of the project. External
audits are performed by personnel of organizations not connected with the project but who are technically
qualified and who understand the QA requirements of the project.
Schedule of Assessment Activities- A schedule of audit activities, together with relevant criteria for assessment,
should be given to the extent that it is known in advance of project activities.
Reporting and Resolution of Issues- Audits, peer reviews, and other assessments often reveal findings of practice or
procedure that do not conform to the written QAPP. Because these issues must be addressed in a timely manner,
the protocol for resolving them should be given here together with the proposed actions to ensure that the
corrective actions were performed effectively. The person to whom the concerns should be addressed, the
decision-making hierarchy, the schedule and format for oral and written reports, and the responsibility for
corrective action should all be discussed in this element. It also should explicitly define the unsatisfactory
conditions upon which the assessors are authorized to act and list the project personnel who should receive
assessment reports.
-------
Project: Model QAPP
Element No: 20
Revision No: 1
Date: 7/5/01
Page 8 of 8
Table 20.1 Assessment Summary

Assessment Activity: Management Systems Reviews
Frequency: 1/3 years; Personnel Responsible: Directors Office; Schedule: 1/1/00;
Report Completion: 30 days after activity; Reporting/Resolution: Directors Office to QA, Air,
Program Support Divisions

Assessment Activity: Network Reviews (App D)
Frequency: 1/year; Personnel Responsible: Air Division; Schedule: 1/1/00;
Report Completion: 30 days after activity; Reporting/Resolution: Air Division to Air Monitoring Branch

Assessment Activity: Network Reviews (App E)
Frequency: 1/3 years; Personnel Responsible: Air Division; Schedule: 1/1/00;
Report Completion: 30 days after activity; Reporting/Resolution: Air Division to Air Monitoring Branch

Assessment Activity: Technical Systems Audits
Frequency: 1/3 years; Personnel Responsible: QA Office; Schedule: 5/1/99;
Report Completion: 30 days after activity; Reporting/Resolution: QA Division to Air Monitoring Division

Assessment Activity: Audits of Data Quality
Frequency: 1/year; Personnel Responsible: QA Office; Schedule: 5/1/99;
Report Completion: 30 days after activity; Reporting/Resolution: QA Division to Air Monitoring Division

Assessment Activity: Performance Audits
Frequency: 1/year; Personnel Responsible: QA/Air Monitoring Divisions; Schedule: 1/1/00;
Report Completion: 120 days after end of calendar year; Reporting/Resolution: QA Division

Assessment Activity: Performance Evaluation
Frequency: 1/year; Personnel Responsible: State Laboratory Division; Schedule: 1/1/00;
Report Completion: 120 days after end of calendar year; Reporting/Resolution: Laboratory Branch Manager
-------
Project: Model QAPP
Element No: 21
Revision No: 1
Date: 7/5/01
Page 1 of 2
21.0 Reports to Management
Effective communication between all personnel is an integral part of a quality system. Planned reports provide
a structure for apprizing management of the project schedule, the deviations from approved QA and test plans, the
impact of these deviations on data quality, and the potential uncertainties in decisions based on the data. Verbal
communication on deviations from QA plans should be noted in summary form in element Dl of the QAPP.
This section describes the quality-related reports and communications to management necessary to
support air toxics network operations and the associated data acquisition, validation, assessment, and
reporting.
Important benefits of regular QA reports to management include the opportunity to alert the management
of data quality problems, to propose viable solutions to problems, and to procure necessary additional
resources. Management should not rely entirely upon the MSR and TSA for their assessment of the
data. The MSR and TSA only occur once every three years. Quality assessment, including the
evaluation of the technical systems, the measurement of performance, and the assessment of data, is
conducted to help ensure that measurement results meet program objectives and to ensure that necessary
corrective actions are taken early, when they will be most effective.
Effective communication among all personnel is an integral part of a quality system. Regular, planned
quality reporting provides a means for tracking the following:
adherence to scheduled delivery of data and reports,
documentation of deviations from approved QA and test plans, and the impact of these
deviations on data quality;
analysis of the potential uncertainties in decisions based on the data.
21.1 Frequency, Content, and Distribution of Reports
The QAPP should indicate the frequency, content, and distribution of the reports so that management may
anticipate events and move to ameliorate potentially adverse results. An important benefit of the status reports is
the opportunity to alert the management of data quality problems, propose viable solutions, and procure
additional resources. If program assessment (including the evaluation of the technical systems, the measurement
of performance, and the assessment of data) is not conducted on a continual basis, the integrity of the data
generated in the program may not meet the quality requirements. These audit reports, submitted in a timely
manner, will provide an opportunity to implement corrective actions when most appropriate.
Required reports to management for monitoring in general are discussed in various sections of 40 CFR
Parts 53 and 58. Guidance for management report format and content are provided in guidance
developed by EPA's Quality Assurance Division (QAD) and the Office of Air Quality Planning and
Standards. These reports are described in the following subsections.
-------
Project: Model QAPP
Element No: 21
Revision No: 1
Date: 7/5/01
Page 2 of 2
21.1.1 QA Annual Report
Periodic assessments of air toxics data are required to be reported to EPA (40 CFR 58 Appendix A,
Section 1.4, revised July 18, 1997). The Toxa City Air Pollution Control District Air Division's QA Annual
Report is issued to meet this requirement. This document describes the quality objectives for
measurement data and how those objectives have been met.
The QA Annual Report will include quality information for each air toxic monitored in the network.
Each section includes the following topics:
program overview and update;
quality objectives for measurement data;
data quality assessment.
For reporting air toxics measurement uncertainties, the QA Annual Report contains the following
summary information:
• Flow Rate Audits;
• Collocated Sampler Audits, using estimates of precision and bias;
• Laboratory audits which include "round-robin" cylinders that are shared among many
laboratories;
• NPAP audits.
21.1.2 Network Reviews
Section 20 discusses the contents of the network review.
21.1.3 Technical System Audit Reports
The TCAPCD performs Technical System Audits of the monitoring system (section 20). These reports
will be filed and made available to EPA personnel during their technical systems audits.
External systems audits are conducted at least every three years by the EPA Regional Office as required
by 40 CFR Part 58, Appendix A, Section 2.5. Further instructions are available from either the EPA
Regional QA Coordinator or the Systems Audit QA Coordinator, Office of Air Quality Planning and
Standards, Emissions Monitoring and Analysis Division (MD-14), U.S. Environmental Protection
Agency, Research Triangle Park, NC 27711.
21.1.5 Response/Corrective Action Reports
The Response/Corrective Action Report procedure will be followed whenever a problem is found such
as a safety defect, an operational problem, or a failure to comply with procedures. A
Response/Corrective Action Report is one of the most important ongoing reports to management
because it documents primary QA activities and provides valuable records of QA activities.
-------
Project: Model QAPP
Element No: 22
Revision No: 1
Date: 7/5/01
Page 1 of 6
22.0 Data Review
How closely a measurement represents the actual environment at a given time and location is a complex
issue that is considered during development of element B1. See Guidance on Sampling Designs to Support
QAPPs (EPA QA/G-5S). Acceptable tolerances for each critical sample coordinate and the action to be taken if
the tolerances are exceeded should be specified in element B1.
Each agency must develop its own sets of data review tools and criteria. The use of computers can greatly
enhance the amount of data that can be reviewed and processed. There are many tools available to the modern
air quality professional.
22.1 Data Review Design
The primary purpose of this section is to describe the data validation procedures which are used by the
TCAPCD to process ambient air toxics data. Data validation refers to those activities performed after
the fact, that is, after the data have been collected. The difference between data validation and quality
control techniques is that the quality control techniques attempt to minimize the amount of bad data being
collected, while data validation seeks to prevent any bad data from getting through the data collection and
storage systems.
It is preferable that data review be performed as soon as possible after the data collection, so that the
questionable data can be checked by recalling information on unusual events and on meteorological
conditions which can aid in the validation. Also, timely corrective actions should be taken when
indicated to minimize further generation of questionable data. The data review group will attempt to
review the data within 1 month after the end of the month of sampling. This will also help with getting the
data loaded onto AIRS in a timely manner, as described in Section 19.5.
Personnel performing data review should:
• Be familiar with typical diurnal concentration variations (e.g., the time daily maximum concentrations
occur and the interrelationship of pollutants). For example, benzene, toluene, and xylene
concentrations usually increase and decrease together because they are attributable to mobile
sources, whereas metals are usually attributable to manufacturing processes and may have a longer
temporal cycle.
• Be familiar with the type of instrument malfunctions which cause characteristic trace irregularities.
• Recognize that cyclical or repetitive variations (at the same time each day or at periodic intervals
during the day) may be caused by excessive line voltage or temperature variations. Nearby source
activity can also cause erroneous or non-representative measurements.
• Recognize that flow traces showing little or no activity often indicate flow problems, or sample line
leaks.
-------
Project: Model QAPP
Element No: 22
Revision No: 1
Date: 7/5/01
Page 2 of 4
A wide variety of information is available with which to validate air toxics data. Among the sources are the
following, along with their uses:
Multi-point Calibration Forms - the multipoint forms should be used to establish proper initial
calibration and can be used to show changes in the calibration;
Span Control Charts - these charts will be the most valuable tool in spotting data that is out of
control limits;
Site and Instrument Logs - because all station activities are noted in one or both of these logs, one
can obtain a good picture of station operations by reading these logs;
Data From Other Air Quality Stations - data from nearby air quality stations can be
compared to help identify invalid data;
Blanks, Replicates and Spikes - these QC indicators can be used to ascertain whether sample
handling or analysis is causing bias in the data set.
Monthly Summary Reports - The Monthly Summary Reports are outputs from the Analytical
Laboratory LIMS. These reports are "canned" reports provided by the computer vendor
who writes the interface software. These reports provide the following information:
Completeness report;
Initial Calibration Report from the Analytical Instruments;
Laboratory Control Sample Recoveries;
Field or Laboratory Surrogate Recoveries;
Spike Recoveries;
Laboratory Duplicate Results;
Serial Dilution Results.
22.2 Data Review Testing
Recently, the TCAPCD has received a copy of the newly developed program VOCDat. This program
was developed by EPA-OAQPS for PAMS data validation. However, the TCAPCD will apply this to
the Organic Toxics data by using the following VOCDat tests:
22.2.1 Data Identification Checks
Data with improper identification codes are useless. Four equally important identification fields which
must be correct are time, location, parameter, and sampler ID.
22.2.2 Unusual Event Review
Extrinsic events (e.g., construction activity, dust storms, unusual traffic volume, and traffic jams) can
explain unusual data. This information could also be used to explain why no data are reported for a
specified time interval, or it could be the basis for deleting data from a file for specific analytical purposes.
22.2.3 Relationship Checks
Toxics data sets contain many physically or chemically related parameters. These relations can be
routinely checked to ensure that the measured value of an individual parameter does not exceed the
corresponding measured value of an aggregate parameter which includes that individual parameter. For
example, benzene, toluene, and xylene are mobile-source driven, so their relative concentrations are
expected to track one another (within roughly +/- 10 ppbv) when the values are recorded at the same
time and location. Data sets in which individual parameter values exceed the corresponding aggregate
values are flagged for further investigation. Minor exceptions to allow for measurement system noise may
be permitted in cases where the individual value is a large percentage of the aggregate value.
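As an illustration only, the sketch below shows how such relationship checks could be automated; the
column names, the data layout, and the 10 ppbv screening value used for the mobile-source species are
assumptions, not requirements of this QAPP.

    # Hypothetical relationship checks on a validated data set (pandas assumed).
    import pandas as pd

    def aggregate_check(df, individual, aggregate, tolerance=0.05):
        """Return rows where an individual parameter exceeds its aggregate
        parameter by more than an allowed measurement-noise tolerance."""
        return df[df[individual] > df[aggregate] * (1.0 + tolerance)]

    def btx_divergence_check(df, limit_ppbv=10.0):
        """Return rows where benzene, toluene, and xylene (assumed column
        names) diverge from one another by more than the screening value."""
        btx = df[["benzene", "toluene", "xylene"]]
        return df[(btx.max(axis=1) - btx.min(axis=1)) > limit_ppbv]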
22.2.4 Review of Spikes, Blanks and Replicates
An additional check of the data set is to verify that the spikes, blanks and replicate samples have been
reviewed. Generally, recovery of spikes in samples should be greater than 80%. Blanks should not be
more than 3 times the MDL for any compound. The difference in concentration of replicates should be
within +/- 10%. If any of these results falls outside these boundaries, the reviewer should notify the air
monitoring branch supervisor for direction. The air branch supervisor will discuss these results with the
lab branch supervisor and the QA officer. The three will decide whether any of these results can or will
invalidate a single run or batch.
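A minimal sketch of these acceptance checks is shown below; the function names and arguments are
illustrative assumptions, and the numeric limits simply restate the criteria given above.

    # Hypothetical QC acceptance checks for spikes, blanks, and replicates.
    def spike_acceptable(measured, amount_spiked):
        # Spike recovery should generally be greater than 80 percent.
        return (measured / amount_spiked) * 100.0 > 80.0

    def blank_acceptable(blank_result, mdl):
        # Blanks should not exceed three times the method detection limit.
        return blank_result <= 3.0 * mdl

    def replicate_acceptable(first, second):
        # Replicate concentrations should agree within +/- 10 percent
        # (computed here relative to the pair mean, an assumption).
        mean = (first + second) / 2.0
        return mean > 0 and abs(first - second) / mean * 100.0 <= 10.0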
22.3 Consistency Testing
These tests check values in a data set which appear atypical when compared to the whole data set.
Common anomalies of this type include unusually high or low values (outliers) and large differences in
adjacent values. These tests will not detect errors which alter all values of the data set by either an
additive or multiplicative factor (e.g., an error in the use of the scale). The following tests for internal
consistency are used:
• Data plots
• Ratio test
• Student's t-test
22.3.1 Tests for Historical and Temporal Consistency
These tests check the consistency of the data set with respect to similar data recorded in the past. In
particular, these procedures will detect changes where each item is increased by a constant or by a
multiplicative factor. Gross limit checks are useful in detecting data values that are either highly
unlikely or considered impossible. The use of upper and lower 95% confidence limits is very useful in
identifying outliers.
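The sketch below illustrates one way a gross limit check and a confidence-limit outlier screen could be
combined; the gross limits themselves are placeholders to be set by the monitoring organization.

    # Hypothetical gross-limit and outlier screen against a data set.
    import numpy as np

    def screen_values(values, lower_gross, upper_gross):
        """Return a boolean array flagging values outside the gross limits or
        outside an approximate 95% band (mean +/- 1.96 standard deviations)."""
        v = np.asarray(values, dtype=float)
        mean, sd = v.mean(), v.std(ddof=1)
        gross = (v < lower_gross) | (v > upper_gross)
        outlier = np.abs(v - mean) > 1.96 * sd
        return gross | outlier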
22.3.2 Pattern and Successive Difference Tests
These tests check data for pollutant behavior which has never or only very rarely occurred in the past.
Values representing pollutant behavior outside of these predetermined limits are then flagged for further
investigation. Pattern tests place upper limits on the following (a minimal sketch of these tests follows
the list):
• The individual concentration value (maximum-hour test),
• The difference in adjacent concentration values (adjacent hour test),
• The difference or percentage difference between a value and both of its adjacent values (spike
  test), and
• The average of three or more consecutive values (consecutive value test).
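The sketch below is an illustrative implementation of these pattern tests; the limits are placeholders that
each agency would derive from its own historical data, and the function layout is an assumption.

    # Hypothetical pattern and successive-difference screens on a list of values.
    def pattern_flags(series, max_value, max_adjacent_diff, max_spike_pct,
                      max_running_mean, window=3):
        flags = []
        for i, v in enumerate(series):
            reasons = []
            if v > max_value:
                reasons.append("maximum-hour")      # individual value too high
            if i > 0 and abs(v - series[i - 1]) > max_adjacent_diff:
                reasons.append("adjacent-hour")     # large jump from prior value
            if 0 < i < len(series) - 1:
                neighbors = (series[i - 1] + series[i + 1]) / 2.0
                if neighbors > 0 and abs(v - neighbors) / neighbors * 100.0 > max_spike_pct:
                    reasons.append("spike")         # value stands out from both neighbors
            if i >= window - 1 and sum(series[i - window + 1:i + 1]) / window > max_running_mean:
                reasons.append("consecutive-value") # running average too high
            flags.append(reasons)
        return flags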
22.3.3 Parameter Relationship Tests
Parameter relationship tests can be divided into deterministic tests involving the theoretical relationships
between parameters (e.g., ratios between benzene and toluene) and empirical tests which determine
whether a parameter is behaving normally in relation to the observed behavior of one or more
other parameters. Determining the "normal" behavior of related parameters requires the detailed review
of historical data.
-------
Project: Model QAPP
Element No: 23
Revision No: 1
Date: 7/5/01
Page 1 of 4
23.0 Data Validation, Verification and Analysis
The purpose of this element is to describe, in detail, the process for validating (determining if data satisfy
QAPP-defined user requirements) and verifying (ensuring that conclusions can be correctly drawn) project data.
The amount of data validated is directly related to the DQOs developed for the project. The percentage
validated for the specific project together with its rationale should be outlined or referenced. Diagrams should be
developed showing the various roles and responsibilities with respect to the flow of data as the project
progresses. The QAPP should have a clear definition of what is implied by "verification" and "validation."
Many of the processes for verifying and validating the measurement phases of the data collection
operation have been discussed in Section 22. If these processes, as written in the QAPP, are followed,
and the sites are representative of the boundary conditions for which they were selected, one would
expect to achieve the DQOs. However, exceptional field events may occur, and field and laboratory
activities may negatively affect the integrity of samples. In addition, it is expected that some of the QC
checks will fail to meet the acceptance criteria. This section will outline how the District will take the
data to a higher level of analysis. This will be accomplished by performing software tests, plotting, and
other methods of analysis.
23.1 Describe the Process for Validating and Verifying Data
Each sample should be verified to ensure that the procedures used to generate the data (as identified in
element B4 of the QAPP) were implemented as specified. Acceptance criteria should be developed for important
components of the procedures, along with suitable codes for characterizing each sample's deviation from the
procedure. Data validation activities should determine how seriously a sample deviated beyond the acceptable
limit so that the potential effects of the deviation can be evaluated during DQA.
23.1.1 Verification of Samples
After a sample batch is completed, a thorough review of the data will be conducted for completeness and
data entry accuracy. All raw data that are hand entered on data sheets will be double keyed into the
LEVIS, as discussed in Section 19. For the chromatographic data, the data will be transferred from a
Level 1 to a Level 2 status. The entries are compared to reduce the possibility of entry and transcription
errors. Once the data are entered into the LEVIS, the system will review the data for routine data outliers
and data outside of acceptance criteria. These data will be flagged appropriately. All flagged data will be
reverified to confirm that the values were entered correctly. The data qualifiers or flags can be found in
the SOPs.
23.1.2 Validation
Validation of measurement data will require two stages: one at Level I and one at Level II. Records
of all invalid samples will be filed for 5 years. Information will include a brief summary of why the sample
was invalidated along with the associated flags. This record will be available on the LEVIS since all
samples that were analyzed will be recorded. At least one flag will be associated with an invalid sample:
the "INV" flag signifying invalid, the "NAR" flag when no analysis result is reported, or the "BDL" flag,
which means below the detection limit. Additional flags will usually be associated with the NAR,
INV or BDL flags to help describe the reason for these flags, as well as free-form notes from the field
operator or laboratory technician.
Validation of Measurement Values
Certain criteria based upon field operator and laboratory technician judgement have been developed that
will be used to invalidate a sample or measurement. The flags listed in Table 22-1 will be used to
determine whether individual samples, or samples from a particular instrument, will be invalidated. In all
cases the sample will be returned to the laboratory for further examination. When the laboratory
technician reviews the field sheet and chain-of-custody forms, he/she will look for flag values. Filters that
have flags related to obvious contamination (CON), filter damage (DAM), or field accidents (FAC) will
be immediately examined. Upon concurrence of the laboratory technician and laboratory branch manager,
these samples will be invalidated. The flag "NAR" for no analysis result will be placed in the flag area
associated with this sample, along with the other associated flags.
Other flags listed may be used alone or in combination to invalidate samples. Since the possible flag
combinations are overwhelming and cannot be anticipated, the air division will review these flags and
determine whether single values, or values from a site for a particular time period, will be invalidated. The
division will keep a record of the combinations of flags that resulted in invalidating a sample or set of
samples. As mentioned above, all data invalidation will be documented. Table 23.1 contains criteria that
can be used to invalidate single samples based on single flags.
Table 23.1 Single Flag Invalidation Criteria for Single Samples

Requirement               Flag   Comment
Contamination             CON    Concurrence with lab technician and branch manager
Filter Damage             DAM    Concurrence with lab technician and branch manager
Event                     EVT    Exceptional, known field event expected to have affected the
                                 sample. Concurrence with lab technician and branch manager
Laboratory Accident       LAC    Concurrence with lab technician and branch manager
Below Detection Limit     BDL    Value is below the minimum detection limit of the analytical
                                 system
Field Accident            FAC    Concurrence with lab technician and branch manager
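As an illustration only, the single-flag criteria of Table 23.1 might be applied in software along the lines of
the sketch below; the flag codes come from the table, while the function name, return values, and the
explicit concurrence argument are assumptions.

    # Hypothetical application of the Table 23.1 single-flag criteria.
    CONCURRENCE_FLAGS = {"CON", "DAM", "EVT", "LAC", "FAC"}  # need lab/branch concurrence
    AUTOMATIC_FLAGS = {"BDL"}                                # below detection limit

    def invalidation_action(flags, concurrence_given=False):
        """Return 'invalidate', 'hold for review', or 'accept' for a sample."""
        flags = set(flags)
        if flags & AUTOMATIC_FLAGS:
            return "invalidate"
        if flags & CONCURRENCE_FLAGS:
            # Invalidation only upon concurrence of the laboratory technician
            # and the laboratory branch manager.
            return "invalidate" if concurrence_given else "hold for review"
        return "accept"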
23.2 Data Analysis
Once the data have been reviewed, verified, and validated, they should be loaded into a computer archive.
This section will describe how the data will be analyzed in order to put the values collected into context
with the environment.
Data analysis refers to the process of attempting to make sense of the data that are collected. The list in
Table 5-1 shows that there are a large number of parameters to analyze. However, many of these have
similar characteristics: volatile organics, semi-volatile organics, and particulate metals. One would assume
that their physical and chemical properties could be used to group them together.
This section will state how the District will begin to analyze the data to ascertain what the data illustrate
and how they should be applied.
23.2.1 Analytical Tests
The District will employ several software programs to analyze the data. These are listed below
with a short explanation of each.
Spreadsheet - The District will perform a rudimentary analysis on the data sets using EXCEL
spreadsheets. Spreadsheets allow the user to input data and to statistically analyze, plot, and graph the
data. This type of analysis will allow the user to see if there are any variations in the data sets. In
addition, various statistical tests, such as tests for linearity, slope, intercept, or correlation coefficient, can
be generated between two strings of data. Box-and-whisker, scatter, and other plots can be employed.
Time series plots can help identify the following trends (a plotting sketch follows the list):
• Large jumps or dips in concentrations
• Periodicity of peaks within a month or quarter
• Expected or unexpected relationships among species
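A minimal plotting sketch is shown below; the file name and column names are placeholders for whatever
export the District's spreadsheet or LEVIS produces.

    # Hypothetical time series plot for a few compounds, to look for jumps,
    # dips, periodic peaks, and relationships among species.
    import pandas as pd
    import matplotlib.pyplot as plt

    df = pd.read_csv("air_toxics_samples.csv", parse_dates=["sample_date"])  # assumed layout
    df = df.set_index("sample_date").sort_index()

    ax = df[["benzene", "toluene", "xylene"]].plot(marker="o")  # assumed column names
    ax.set_ylabel("Concentration (ppbv)")
    ax.set_title("Time series of mobile-source VOCs")
    plt.tight_layout()
    plt.show()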
VOCDat - As stated in Section 22, the EPA has placed resources into creating software that can analyze
data. One such program is VOCDat. This software program was originally written for input of PAMS
data. VOCDat is a Windows-based program that provides a graphical platform from which to display
collected VOC data, to perform quality control tasks on the data, and to conduct exploratory data analysis.
This program will enable the TCAPCD to rapidly validate and release its air toxics VOC data to AIRS.
VOCDat displays the concentrations of the VOC data using scatter, fingerprint, and time series plots.
Customizable screening criteria may be applied to the data, and the quality control codes may be changed
for individual data points as well as for the entire sample on all plots. VOCDat allows a user to determine
what percentage a particular compound is of the total. This test allows the user to see whether the
data exceed the 3-sigma rule for outliers. For more details, please see Section 22.2.
Wind Rose Plots - Recently the TCAPCD has purchased a wind rose program that will accept pollutant
data. The wind direction, wind speed, and pollutant data will be input into the program, and wind roses
showing the relative direction and speed of pollutants (transport) will be graphically displayed.
GIS - A GIS program allows the user to overlay concentration data on geographic data. By
creating "views", the user can overlay temporally changing data into a spatial analysis tool. Plots of
concentration data can be displayed temporally and spatially.
-------
Project: Model QAPP
Element No: 24
Revision No: 1
Date: 7/5/01
Page 1 of 5
24.0 Reconciliation with Data Quality Objectives
24.1 Reconciling Results with DQOs
The DQA process has been developed for cases where formal DQOs have been established. Guidance for
Data Quality Assessment (EPA QA/G-9) focuses on evaluating data for fitness in decision-making and also
provides many graphical and statistical tools.
DQA is a key part of the assessment phase of the data life cycle, as shown in Figure 9. As the part of the
assessment phase that follows data validation and verification, DQA determines how well the validated data can
support their intended use. If an approach other than DQA has been selected, an outline of the proposed activities
should be included.
The DQOs for the air toxics monitoring network were developed in Section 7 and are stated below.
Determine the highest concentrations expected to occur in the area covered by the network,
i.e., to verify the spatial and temporal characteristics of HAPs within the city.
This section of the QAPP will outline the assessment procedures that Toxa City will follow to determine
whether the monitors and laboratory analyses are producing data that comply with the stated goals. This
section will then clearly state what action will be taken as a result of the assessment process. Such an
assessment is termed a Data Quality Assessment (DQA) and is thoroughly described in EPA QA/G-9:
Guidance for Data Quality Assessment1.
For the stated DQO, the assessment process must follow statistical routines. The following five steps will
discuss how this will be achieved.
24.2 Five Steps of DQA Process
As described in EPA QA/G-9, the DQA process is comprised of five steps. The steps are detailed
below.
24.2.1 Review DQOs and Sampling Design
Section 7 of this QAPP contains the details for the development of the DQOs, including defining the
objectives of the air toxics monitoring network, and developing limits on the decision errors. Section
10 of this QAPP contains the details for the sampling design, including the rationale for the design, the
design assumptions, and the sampling locations and frequency. If any deviations from the sampling design
have occurred, these will be indicated and their potential effect carefully considered throughout the entire
DQA. Since this program is in its formative stages, no assessments have been performed. However, the
State of North Carolina performs annual network reviews. The TCAPCD will request that the State
Agency review the network siting and maintenance.
24.2.2 Conduct Preliminary Data Review
A preliminary data review will be performed to uncover potential limitations to using the data, to reveal
outliers, and generally to explore the basic structure of the data. The first step is to review the quality
assurance reports. The second step is to calculate basic summary statistics, generate graphical
presentations of the data, and review these summary statistics and graphs.
Review Quality Assurance Reports - Toxa City will review all relevant quality assurance reports,
internal and external, that describe the data collection and reporting process. Particular attention will be
directed to looking for anomalies in recorded data, missing values, and any deviations from standard
operating procedures. This is a qualitative review. However, any concerns will be further investigated in
the next two steps.
24.2.3 Select the Statistical Test
Toxa City will generate summary statistics for each of its primary and QA samplers. The summary
statistics will be calculated at the annual and three-year levels and will include only valid samples.
The following statistical tests will be performed (an illustrative sketch follows this list):
• Test to examine the distribution of the data
• Simple annual and 3-year averages of all pollutants for examination of trends
• Examination of bias and precision of the data as described in Table 19.6
• Seasonal averages to determine any seasonal variability
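The sketch below shows one way the annual, three-year, and seasonal summaries could be generated;
the input file, column names, season definition, and validity flag are assumptions for illustration.

    # Hypothetical annual, three-year, and seasonal summary statistics.
    import pandas as pd

    df = pd.read_csv("validated_samples.csv", parse_dates=["sample_date"])  # assumed layout
    df = df[df["valid"]]                                  # keep only valid samples (assumed flag)

    df["year"] = df["sample_date"].dt.year
    df["season"] = df["sample_date"].dt.month % 12 // 3   # 0=winter ... 3=fall (assumed)

    annual = df.groupby(["site", "pollutant", "year"])["concentration"].agg(["mean", "max", "count"])
    three_year = df.groupby(["site", "pollutant"])["concentration"].mean()
    seasonal = df.groupby(["site", "pollutant", "season"])["concentration"].mean()
    print(annual, three_year, seasonal, sep="\n\n")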
Particular attention will be given to the impact on the statistics caused by the observations noted in the
quality assurance review. In fact, Toxa City may evaluate the influence of a potential outlier by evaluating
the change in the summary statistics resulting from exclusion of the outlier.
Toxa City will generate some graphics to present the results from the summary statistics and to show the
spatial continuity over Toxa City. Maps will be created for the annual and three-year means, maxima,
and interquartile ranges for a total of 6 maps. The maps will help uncover potential outliers and will help
in the network design review. Additionally, basic histograms will be generated for each of the primary
and QA samplers and for the percent difference at the collocated sites. The histograms will be useful in
identifying anomalies and evaluating the normality assumption in the measurement errors. GIS spatial
analysis will also be performed to see if meteorology and topography have any influence on the
concentrations.
24.2.4 Verify Assumptions of Statistical Test
There are no NAAQS against which to compare air toxics. Therefore, verification of the data must be
done against estimated values, such as models. However, before this can occur, the distribution of the
data must be examined and tests for trends and for outliers must be performed.
Normal distribution for measurement error - Assuming that measurement errors are normally
distributed is common in environmental monitoring. Toxa City has not investigated the sensitivity of the
statistical test to violation of this assumption, although small departures from normality generally do not
create serious problems. Toxa City will evaluate the reasonableness of the normality assumption by
reviewing a normal probability plot and employing the Coefficient of Variance Test. If the plot or
statistics indicate possible violations of normality, Toxa City may need to determine the sensitivity of the
DQOs to departures from normality.
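One way to produce the normal probability plot together with a companion normality statistic is sketched
below; the Shapiro-Wilk test is used here only as an illustrative stand-in for whichever formal test Toxa
City adopts, and the input file is a placeholder.

    # Hypothetical normality check for measurement errors (e.g., collocated
    # percent differences): normal probability plot plus Shapiro-Wilk statistic.
    import numpy as np
    from scipy import stats
    import matplotlib.pyplot as plt

    errors = np.loadtxt("percent_differences.txt")    # assumed input file

    stats.probplot(errors, dist="norm", plot=plt)     # normal probability (Q-Q) plot
    plt.title("Normal probability plot of measurement errors")
    plt.show()

    w, p = stats.shapiro(errors)
    print(f"Shapiro-Wilk W = {w:.3f}, p = {p:.3f}")   # small p suggests non-normality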
Trends Analysis - It is recommended that a simple linear regression test be performed to observe the
temporal variations in the data sets. Air toxics data can be roughly divided into two categories: point and
area sources. In terms of area sources, many of which may be mobile sources, one would assume that
mobile-related toxics would vary with the diurnal variations of traffic in urban and suburban
environments. The linear regression test would provide information on whether certain compounds are
tied to mobile sources. For instance, benzene is identified as a major mobile-source HAP. If a linear
regression is performed against a compound whose source is unknown, then a small correlation
coefficient would provide information on its possible source. In addition to the linear regression test, it is
recommended that annual and 3-year average trend plots be generated. These plots can provide
long-term temporal information. They will also give the TCAPCD justification to reduce the network if
trends illustrate that the values are decreasing.
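A sketch of the recommended regression tests is given below; the input layout and column names are
assumptions, and benzene is used as the mobile-source tracer mentioned above.

    # Hypothetical trend tests: regress a compound of unknown origin against
    # benzene (a mobile-source HAP) and against time.
    import pandas as pd
    from scipy import stats

    df = pd.read_csv("validated_samples_wide.csv", parse_dates=["sample_date"])  # assumed layout

    # Correlation with a mobile-source tracer: a small r suggests a non-mobile source.
    slope, intercept, r, p, stderr = stats.linregress(df["benzene"], df["unknown_compound"])
    print(f"vs benzene: r = {r:.2f}, p = {p:.3f}")

    # Long-term temporal trend: a significant negative slope supports reducing the network.
    days = (df["sample_date"] - df["sample_date"].min()).dt.days
    slope, intercept, r, p, stderr = stats.linregress(days, df["unknown_compound"])
    print(f"trend: slope = {slope:.4f} per day, p = {p:.3f}")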
Measurement precision and bias - For each sampling system, TCAPCD will review the 95%
confidence limits as determined in Table 19.2. If any exceed 10%, Toxa City may need to determine the
sensitivity of the DQOs to larger levels of measurement imprecision. Before describing the algorithm,
some groundwork is needed. When less than three years of collocated data are available, the three-year
bias and precision estimates must be predicted. Toxa City's strategy for accomplishing this will be to use
all available quarters of data as the basis for projecting where the bias and precision estimates will be at
the end of the three-year monitoring period.
Toxa City will develop confidence intervals for the bias and precision estimates. This will be
accomplished using a re-sampling technique. The protocol for creating the confidence intervals uses the
following equations.
Bias Algorithm:
1. For each measurement pair, use Equation 19 from Section 14 to estimate the
percent relative bias, d_i. To reiterate, this equation is:

     d_i = [(Y_i - X_i) / X_i] x 100

where X_i represents the concentration recorded by the primary sampler, and Y_i represents the
concentration recorded by the collocated sampler.
2. Summarize the percent relative bias to the quarterly level, D_j,q, according to

     D_j,q = (1 / n_j,q) x sum(d_i),  i = 1, ..., n_j,q

where n_j,q is the number of collocated pairs in quarter q for site j.

3. Summarize the quarterly bias estimates to the three-year level using

     D_j = sum(w_q x D_j,q) / sum(w_q),  q = 1, ..., n_q

where n_q is the number of quarters with actual collocated data and w_q is the weight for quarter q.
4. Examine D_j,q to determine whether one sampler is consistently measuring above or below the other.
To formally test this, a non-parametric test will be used. The test is called the Wilcoxon Signed Rank
Test and is described in EPA QA/G-92. If the null hypothesis is rejected, then one of the samplers is
consistently measuring above or below the other. This information may be helpful in directing the
investigation into the cause of the bias.
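A rough sketch of these four steps is shown below; the function names and data structures are illustrative
assumptions, and scipy's one-sample Wilcoxon routine stands in for the signed rank test cited above.

    # Hypothetical implementation of the bias algorithm (steps 1-4).
    import numpy as np
    from scipy import stats

    def percent_relative_bias(x_primary, y_collocated):
        # Step 1: d_i = (Y_i - X_i) / X_i * 100 for each collocated pair.
        x = np.asarray(x_primary, dtype=float)
        y = np.asarray(y_collocated, dtype=float)
        return (y - x) / x * 100.0

    def quarterly_bias(d, quarters):
        # Step 2: average the d_i within each quarter.
        d, quarters = np.asarray(d), np.asarray(quarters)
        return {q: d[quarters == q].mean() for q in np.unique(quarters)}

    def three_year_bias(quarterly, weights):
        # Step 3: weighted average of the quarterly estimates.
        total_weight = sum(weights[q] for q in quarterly)
        return sum(weights[q] * quarterly[q] for q in quarterly) / total_weight

    def signed_rank_test(d):
        # Step 4: Wilcoxon signed rank test of whether one sampler reads
        # consistently above or below the other (null: median difference is zero).
        return stats.wilcoxon(d)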
Precision Algorithm:
1. For each measurement pair, calculate the coefficient of variation according to Equation 20 from
Section 14, repeated below:

     CV_i = [(Y_i - X_i) / X_i] x 100
2. Summarize the 95% confidence limits at the quarterly level, where n is the number of collocated
pairs in the quarter, according to:

     Upper 95% confidence limit = d_bar + 1.96 x S / sqrt(2)
     Lower 95% confidence limit = d_bar - 1.96 x S / sqrt(2)

where d_bar is the mean and S is the standard deviation of the per-pair values for the quarter.
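A companion sketch for the precision confidence limits is shown below, assuming the per-pair values for
a quarter are supplied as a simple array; it mirrors the expressions reproduced above.

    # Hypothetical quarterly 95% confidence limits for collocated precision.
    import numpy as np

    def quarterly_confidence_limits(d):
        """d is the array of per-pair percent differences for one quarter."""
        d = np.asarray(d, dtype=float)
        d_bar = d.mean()
        s = d.std(ddof=1)                     # sample standard deviation
        half_width = 1.96 * s / np.sqrt(2.0)  # as in the expressions above
        return d_bar - half_width, d_bar + half_width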
24.2.5 Draw Conclusions from the Data
If the sampling design and the statistical tests are borne out, it can be assumed that the network design
and the uncertainty of the data are acceptable. This conclusion can then be written in the Annual Report
to management. Management may then decide whether to perform risk assessments, allow the State and
EPA to analyze the data, or work closely with the nearby university to determine whether these data can
be used to assess conclusions from health effects studies.
24.3 Action Plan Based on Conclusions from DQA
A thorough DQA process will be completed during the summer of each year. For this section, Toxa
City will assume that the assumptions used for developing the DQOs have been met. If this is not the
case, Toxa City must first revisit the impact on the bias and precision limits determined by the DQO
process. At some point in time, it may be necessary to reduce the network. This would happen under
the following scenarios:
• The data at a particular location show values that are very low or at the detection limit. If this
  occurs, it will be the District's option to re-locate the sampler or remove it from service.
• Vandalism or loss of right of way.
References
1. Guidance for the Data Quality Assessment Process, EPA QA/G-9, U.S. Environmental Protection Agency, QAD,
EPA/600/R-96/084, July 1996.
2. U.S. EPA (1997b) Revised Requirements for Designation of Reference and Equivalent Methods for Air Toxics and
Ambient Air Quality Surveillance for Particulate Matter - Final Rule. 40 CFR Parts 53 and 58. Federal Register,
62(138):38763-38854. July 18, 1997.
-------
Appendices
-------
Project: Model QAPP
Appendix A
Revision No: 1
Date: 7/5/01
Page 1 of 14
Appendix A
Glossary
The following glossary is taken from the document EPA Guidance for Quality Assurance Project
Plans, EPA QA/G-5.
GLOSSARY OF QUALITY ASSURANCE AND RELATED TERMS
Acceptance criteria — Specified limits placed on characteristics of an item, process, or service defined
in requirements documents. (ASQC Definitions)
Accuracy — A measure of the closeness of an individual measurement or the average of a number of
measurements to the true value. Accuracy includes a combination of random error (precision) and
systematic error (bias) components that are due to sampling and analytical operations; the EPA
recommends using the terms "precision" and "bias," rather than "accuracy," to convey the information
usually associated with accuracy. Refer to Appendix D, Data Quality Indicators for a more detailed
definition.
Activity — An all-inclusive term describing a specific set of operations of related tasks to be performed,
either serially or in parallel (e.g., research and development, field sampling, analytical operations,
equipment fabrication), that, in total, result in a product or service.
Assessment — The evaluation process used to measure the performance or effectiveness of a system
and its elements. As used here, assessment is an all-inclusive term used to denote any of the following:
audit, performance evaluation (PE), management systems review (MSR), peer review, inspection, or
surveillance.
Audit (quality) — A systematic and independent examination to determine whether quality activities and
related results comply with planned arrangements and whether these arrangements are implemented
effectively and are suitable to achieve objectives.
Audit of Data Quality (ADQ) — A qualitative and quantitative evaluation of the documentation and
procedures associated with environmental measurements to verify that the resulting data are of
acceptable quality.
Authenticate — The act of establishing an item as genuine, valid, or authoritative.
Bias — The systematic or persistent distortion of a measurement process, which causes errors in one
direction (i.e., the expected sample measurement is different from the sample's true value). Refer to
Appendix D, Data Quality Indicators, for a more detailed definition.
Blank — A sample subjected to the usual analytical or measurement process to establish a zero baseline
or background value. Sometimes used to adjust or correct routine analytical results. A sample that is
intended to contain none of the analytes of interest. A blank is used to detect contamination during
sample handling preparation and/or analysis.
Calibration — A comparison of a measurement standard, instrument, or item with a standard or
instrument of higher accuracy to detect and quantify inaccuracies and to report or eliminate those
inaccuracies by adjustments.
Calibration drift — The deviation in instrument response from a reference value over a period of time
before recalibration.
Certification — The process of testing and evaluation against specifications designed to document,
verify, and recognize the competence of a person, organization, or other entity to perform a function or
service, usually for a specified time.
Chain of custody — An unbroken trail of accountability that ensures the physical security of samples,
data, and records.
Characteristic — Any property or attribute of a datum, item, process, or service that is distinct,
describable, and/or measurable.
Check standard — A standard prepared independently of the calibration standards and analyzed exactly
like the samples. Check standard results are used to estimate analytical precision and to indicate the
presence of bias due to the calibration of the analytical system.
Collocated samples — Two or more portions collected at the same point in time and space so as to be
considered identical. These samples are also known as field replicates and should be identified as such.
Comparability — A measure of the confidence with which one data set or method can be compared to
another.
Completeness — A measure of the amount of valid data obtained from a measurement system
compared to the amount that was expected to be obtained under correct, normal conditions. Refer to
Appendix D, Data Quality Indicators, for a more detailed definition.
Computer program — A sequence of instructions suitable for processing by a computer. Processing
may include the use of an assembler, a compiler, an interpreter, or a translator to prepare the program for
execution. A computer program may be stored on magnetic media and referred to as "software," or it
may be stored permanently on computer chips, referred to as "firmware." Computer programs covered
in a QAPP are those used for design analysis, data acquisition, data reduction, data storage (databases),
operation or control, and database or document control registers when used as the controlled source of
quality information.
Confidence Interval — The numerical interval constructed around a point estimate of a population
parameter, combined with a probability statement (the confidence coefficient) linking it to the population's
true parameter value. If the same confidence interval construction technique and assumptions are used to
calculate future intervals, they will include the unknown population parameter with the same specified
probability.
Confidentiality procedure — A procedure used to protect confidential business information (including
proprietary data and personnel records) from unauthorized access.
Configuration — The functional, physical, and procedural characteristics of an item, experiment, or
document.
Conformance — An affirmative indication or judgment that a product or service has met the
requirements of the relevant specification, contract, or regulation; also, the state of meeting the
requirements.
Consensus standard — A standard established by a group representing a cross section of a particular
industry or trade, or a part thereof.
Contractor — Any organization or individual contracting to furnish services or items or to perform work.
Corrective action — Any measures taken to rectify conditions adverse to quality and, where possible, to
preclude their recurrence.
Correlation coefficient — A number between -1 and 1 that indicates the degree of linearity between
two variables or sets of numbers. The closer to -1 or +1, the stronger the linear relationship between the
two (i.e., the better the correlation). Values close to zero suggest no correlation between the two
variables. The most common correlation coefficient is the product-moment, a measure of the degree of
linear relationship between two variables.
Data of known quality — Data that have the qualitative and quantitative components associated with
their derivation documented appropriately for their intended use, and when such documentation is
verifiable and defensible.
Data Quality Assessment (DQA) — The scientific and statistical evaluation of data to determine if
data obtained from environmental operations are of the right type, quality, and quantity to support their
intended use. The five steps of the DQA Process include: 1) reviewing the DQOs and sampling design,
2) conducting a preliminary data review, 3) selecting the statistical test, 4) verifying the assumptions of the
statistical test, and 5) drawing conclusions from the data.
Data Quality Indicators (DQIs) — The quantitative statistics and qualitative descriptors that are used
to interpret the degree of acceptability or utility of data to the user. The principal data quality indicators
are bias, precision, accuracy (bias is preferred), comparability, completeness, and representativeness.
Data Quality Objectives (DQOs) — The qualitative and quantitative statements derived from the
DQO Process that clarify the study's technical and quality objectives, define the appropriate type of data, and
specify tolerable levels of potential decision errors that will be used as the basis for establishing the quality
and quantity of data needed to support decisions.
Data Quality Objectives (DQO) Process — A systematic strategic planning tool based on the
scientific method that identifies and defines the type, quality, and quantity of data needed to satisfy a
specified use. The key elements of the DQO process include:
• state the problem,
• identify the decision,
• identify the inputs to the decision,
• define the boundaries of the study,
• develop a decision rule,
• specify tolerable limits on decision errors, and
• optimize the design for obtaining data.
DQOs are the qualitative and quantitative outputs from the DQO Process.
Data reduction — The process of transforming the number of data items by arithmetic or statistical
calculations, standard curves, and concentration factors, and collating them into a more useful form. Data
reduction is irreversible and generally results in a reduced data set and an associated loss of detail.
Data usability — The process of ensuring or determining whether the quality of the data produced meets
the intended use of the data.
Deficiency — An unauthorized deviation from acceptable procedures or practices, or a defect in an item.
Demonstrated capability — The capability to meet a procurement's technical and quality specifications
through evidence presented by the supplier to substantiate its claims and in a manner defined by the
customer.
Design — The specifications, drawings, design criteria, and performance requirements. Also, the result
of deliberate planning, analysis, mathematical manipulations, and design processes.
Design change — Any revision or alteration of the technical requirements defined by approved and
issued design output documents and approved and issued changes thereto.
Design review — A documented evaluation by a team, including personnel such as the responsible
designers, the client for whom the work or product is being designed, and a quality assurance (QA)
representative but excluding the original designers, to determine if a proposed design will meet the
established design criteria and perform as expected when implemented.
Detection Limit (DL) — A measure of the capability of an analytical method to distinguish samples that
do not contain a specific analyte from samples that contain low concentrations of the analyte; the lowest
concentration or amount of the target analyte that can be determined to be different from zero by a single
measurement at a stated level of probability. DLs are analyte- and matrix-specific and may be
laboratory-dependent.
Distribution — 1) The apportionment of an environmental contaminant at a point over time, over an area,
or within a volume; 2) a probability function (density function, mass function, or distribution function) used
to describe a set of observations (statistical sample) or a population from which the observations are
generated.
Document — Any written or pictorial information describing, defining, specifying, reporting, or certifying
activities, requirements, procedures, or results.
Document control — The policies and procedures used by an organization to ensure that its documents
and their revisions are proposed, reviewed, approved for release, inventoried, distributed, archived, stored,
and retrieved in accordance with the organization's requirements.
Duplicate samples — Two samples taken from and representative of the same population and carried
through all steps of the sampling and analytical procedures in an identical manner. Duplicate samples are
used to assess variance of the total method, including sampling and analysis. See also collocated sample.
Environmental conditions — The description of a physical medium (e.g., air, water, soil, sediment) or a
biological system expressed in terms of its physical, chemical, radiological, or biological characteristics.
Environmental data — Any parameters or pieces of information collected or produced from
measurements, analyses, or models of environmental processes, conditions, and effects of pollutants on
human health and the ecology, including results from laboratory analyses or from experimental systems
representing such processes and conditions.
Environmental data operations — Any work performed to obtain, use, or report information pertaining
to environmental processes and conditions.
Environmental monitoring — The process of measuring or collecting environmental data.
Environmental processes — Any manufactured or natural processes that produce discharges to, or
that impact, the ambient environment.
Environmental programs — An all-inclusive term pertaining to any work or activities involving the
environment, including but not limited to: characterization of environmental processes and conditions;
environmental monitoring; environmental research and development; the design, construction, and
operation of environmental technologies; and laboratory operations on environmental samples.
Environmental technology — An all-inclusive term used to describe pollution control devices and
systems, waste treatment processes and storage facilities, and site remediation technologies and their
components that may be utilized to remove pollutants or contaminants from, or to prevent them from
entering, the environment. Examples include wet scrubbers (air), soil washing (soil), granulated activated
carbon unit (water), and filtration (air, water). Usually, this term applies to hardware-based systems;
however, it can also apply to methods or techniques used for pollution prevention, pollutant reduction, or
containment of contamination to prevent further movement of the contaminants, such as capping,
solidification or vitrification, and biological treatment.
Estimate — A characteristic from the sample from which inferences on parameters can be made.
Evidentiary records — Any records identified as part of litigation and subject to restricted access,
custody, use, and disposal.
Expedited change — An abbreviated method of revising a document at the work location where the
document is used when the normal change process would cause unnecessary or intolerable delay in the
work.
Field blank — A blank used to provide information about contaminants that may be introduced during
sample collection, storage, and transport. A clean sample, carried to the sampling site, exposed to
sampling conditions, returned to the laboratory, and treated as an environmental sample.
Field (matrix) spike — A sample prepared at the sampling point (i.e., in the field) by adding a known
mass of the target analyte to a specified amount of the sample. Field matrix spikes are used, for example,
to determine the effect of the sample preservation, shipment, storage, and preparation on analyte recovery
efficiency (the analytical bias).
Field split samples — Two or more representative portions taken from the same sample and submitted
for analysis to different laboratories to estimate interlaboratory precision.
Financial assistance — The process by which funds are provided by one organization (usually
governmental) to another organization for the purpose of performing work or furnishing services or items.
Financial assistance mechanisms include grants, cooperative agreements, and governmental interagency
agreements.
Finding — An assessment conclusion that identifies a condition having a significant effect on an item or
activity. An assessment finding may be positive or negative, and is normally accompanied by specific
examples of the observed condition.
Goodness-of-fit test — The application of the chi square distribution in comparing the frequency
distribution of a statistic observed in a sample with the expected frequency distribution based on some
theoretical model.
Grade — The category or rank given to entities having the same functional use but different
requirements for quality.
Graded approach — The process of basing the level of application of managerial controls applied to an
item or work according to the intended use of the results and the degree of confidence needed in the
quality of the results. (See also Data Quality Objectives (DQO) Process.)
Guidance — A suggested practice that is not mandatory, intended as an aid or example in complying
with a standard or requirement.
Guideline — A suggested practice that is not mandatory in programs intended to comply with a
standard.
Hazardous waste — Any waste material that satisfies the definition of hazardous waste given in 40 CFR
261, "Identification and Listing of Hazardous Waste."
Holding time — The period of time a sample may be stored prior to its required analysis. While
exceeding the holding time does not necessarily negate the veracity of analytical results, it causes the
qualifying or "flagging" of any data not meeting all of the specified acceptance criteria.
Identification error — The misidentification of an analyte. In this error type, the contaminant of
concern is unidentified and the measured concentration is incorrectly assigned to another contaminant.
Independent assessment — An assessment performed by a qualified individual, group, or organization
that is not a part of the organization directly performing and accountable for the work being assessed.
Inspection — The examination or measurement of an item or activity to verify conformance to specific
requirements.
Internal standard — A standard added to a test portion of a sample in a known amount and carried
through the entire determination procedure as a reference for calibrating and controlling the precision and
bias of the applied analytical method.
Item — An all-inclusive term used in place of the following: appurtenance, facility, sample, assembly,
component, equipment, material, module, part, product, structure, subassembly, subsystem, system, unit,
documented concepts, or data.
Laboratory split samples — Two or more representative portions taken from the same sample and
analyzed by different laboratories to estimate the interlaboratory precision or variability and the data
comparability.
Limit of quantitation — The minimum concentration of an analyte or category of analytes in a specific
matrix that can be identified and quantified above the method detection limit and within specified limits of
precision and bias during routine analytical operating conditions.
Management — Those individuals directly responsible and accountable for planning, implementing, and
assessing work.
Management system — A structured, nontechnical system describing the policies, objectives, principles,
organizational authority, responsibilities, accountability, and implementation plan of an organization for
conducting work and producing items and services.
Management Systems Review (MSR) — The qualitative assessment of a data collection operation
and/or organization(s) to establish whether the prevailing quality management structure, policies, practices,
and procedures are adequate for ensuring that the type and quality of data needed are obtained.
Matrix spike — A sample prepared by adding a known mass of a target analyte to a specified amount
of matrix sample for which an independent estimate of the target analyte concentration is available.
Spiked samples are used, for example, to determine the effect of the matrix on a method's recovery
efficiency.
May — When used in a sentence, a term denoting permission but not a necessity.
Mean (arithmetic) — The sum of all the values of a set of measurements divided by the number of
values in the set; a measure of central tendency.
Mean squared error — A statistical term for variance added to the square of the bias.
Measurement and Testing Equipment (M&TE) — Tools, gauges, instruments, sampling devices, or
systems used to calibrate, measure, test, or inspect in order to control or acquire data to verify
conformance to specified requirements.
Memory effects error — The effect that a relatively high concentration sample has on the
measurement of a lower concentration sample of the same analyte when the higher concentration sample
precedes the lower concentration sample in the same analytical instrument.
Method — A body of procedures and techniques for performing an activity (e.g., sampling, chemical
analysis, quantification), systematically presented in the order in which they are to be executed.
Method blank — A blank prepared to represent the sample matrix as closely as possible and analyzed
exactly like the calibration standards, samples, and quality control (QC) samples. Results of method
blanks provide an estimate of the within-batch variability of the blank response and an indication of bias
introduced by the analytical procedure.
Mid-range check — A standard used to establish whether the middle of a measurement method's
calibrated range is still within specifications.
Mixed waste — A hazardous waste material, as defined by 40 CFR 261 under the Resource Conservation
and Recovery Act (RCRA), mixed with radioactive waste subject to the requirements of the Atomic
Energy Act.
Must — When used in a sentence, a term denoting a requirement that has to be met.
Nonconformance — A deficiency in a characteristic, documentation, or procedure that renders the
quality of an item or activity unacceptable or indeterminate; nonfulfillment of a specified requirement.
Objective evidence — Any documented statement of fact, other information, or record, either
quantitative or qualitative, pertaining to the quality of an item or activity, based on observations,
measurements, or tests that can be verified.
Observation — An assessment conclusion that identifies a condition (either positive or negative) that
does not represent a significant impact on an item or activity. An observation may identify a condition
that has not yet caused a degradation of quality.
Organization — A company, corporation, firm, enterprise, or institution, or part thereof, whether
incorporated or not, public or private, that has its own functions and administration.
Organization structure — The responsibilities, authorities, and relationships, arranged in a pattern,
through which an organization performs its functions.
Outlier — An extreme observation that is shown to have a low probability of belonging to a specified
data population.
Parameter — A quantity, usually unknown, such as a mean or a standard deviation characterizing a
population. Commonly misused for "variable," "characteristic," or "property."
Peer review — A documented critical review of work generally beyond the state of the art or
characterized by the existence of potential uncertainty. Conducted by qualified individuals (or an
organization) who are independent of those who performed the work but collectively equivalent in
technical expertise (i.e., peers) to those who performed the original work. Peer reviews are conducted to
ensure that activities are technically adequate, competently performed, properly documented, and satisfy
established technical and quality requirements. An in-depth assessment of the assumptions, calculations,
extrapolations, alternate interpretations, methodology, acceptance criteria, and conclusions pertaining to
specific work and of the documentation that supports them. Peer reviews provide an evaluation of a
subject where quantitative methods of analysis or measures of success are unavailable or undefined, such
as in research and development.
Performance Evaluation (PE) — A type of audit in which the quantitative data generated in a
measurement system are obtained independently and compared with routinely obtained data to evaluate
the proficiency of an analyst or laboratory.
Pollution prevention — An organized, comprehensive effort to systematically reduce or eliminate
pollutants or contaminants prior to their generation or their release or discharge into the environment.
Population — The totality of items or units of material under consideration or study.
Precision — A measure of mutual agreement among individual measurements of the same property,
usually under prescribed similar conditions expressed generally in terms of the standard deviation. Refer
to Appendix D, Data Quality Indicators, for a more detailed definition.
Procedure — A specified way to perform an activity.
Process — A set of interrelated resources and activities that transforms inputs into outputs. Examples of
processes include analysis, design, data collection, operation, fabrication, and calculation.
Project — An organized set of activities within a program.
Qualified data — Any data that have been modified or adjusted as part of statistical or mathematical
evaluation, data validation, or data verification operations.
Qualified services — An indication that suppliers providing services have been evaluated and
determined to meet the technical and quality requirements of the client as provided by approved
procurement documents and demonstrated by the supplier to the client's satisfaction.
Quality — The totality of features and characteristics of a product or service that bears on its ability to
meet the stated or implied needs and expectations of the user.
Quality Assurance (QA) — An integrated system of management activities involving planning,
implementation, assessment, reporting, and quality improvement to ensure that a process, item, or service
is of the type and quality needed and expected by the client.
Quality Assurance Program Description/Plan — See quality management plan.
Quality Assurance Project Plan (QAPP) — A formal document describing in comprehensive detail the
necessary quality assurance (QA), quality control (QC), and other technical activities that must be
implemented to ensure that the results of the work performed will satisfy the stated performance criteria.
The QAPP components are divided into four classes: 1) Project Management, 2) Measurement/Data
Acquisition, 3) Assessment/Oversight, and 4) Data Validation and Usability. Guidance and requirements
on preparation of QAPPs can be found in EPA QA/R-5 and QA/G-5.
Quality Control (QC) — The overall system of technical activities that measures the attributes and
performance of a process, item, or service against defined standards to verify that they meet the stated
requirements established by the customer; operational techniques and activities that are used to fulfill
requirements for quality. The system of activities and checks used to ensure that measurement systems
are maintained within prescribed limits, providing protection against "out of control" conditions and
ensuring the results are of acceptable quality.
Quality control (QC) sample — An uncontaminated sample matrix spiked with known amounts of
analytes from a source independent of the calibration standards. Generally used to establish intra-
laboratory or analyst-specific precision and bias or to assess the performance of all or a portion of the
measurement system.
Quality improvement — A management program for improving the quality of operations. Such
management programs generally entail a formal mechanism for encouraging worker recommendations
with timely management evaluation and feedback or implementation.
Quality management — That aspect of the overall management system of the organization that
determines and implements the quality policy. Quality management includes strategic planning, allocation
of resources, and other systematic activities (e.g., planning, implementation, and assessment) pertaining to
the quality system.
Quality Management Plan (QMP) — A formal document that describes the quality system in terms of
the organization's structure, the functional responsibilities of management and staff, the lines of authority,
and the required interfaces for those planning, implementing, and assessing all activities conducted.
Quality system — A structured and documented management system describing the policies, objectives,
principles, organizational authority, responsibilities, accountability, and implementation plan of an
organization for ensuring quality in its work processes, products (items), and services. The quality system
provides the framework for planning, implementing, and assessing work performed by the organization
and for carrying out required quality assurance (QA) and quality control (QC).
Radioactive waste — Waste material containing, or contaminated by, radionuclides, subject to the
requirements of the Atomic Energy Act.
Readiness review — A systematic, documented review of the readiness for the start-up or continued
use of a facility, process, or activity. Readiness reviews are typically conducted before proceeding
beyond project milestones and prior to initiation of a major phase of work.
Record (quality) — A document that furnishes objective evidence of the quality of items or activities
and that has been verified and authenticated as technically complete and correct. Records may include
photographs, drawings, magnetic tape, and other data recording media.
Recovery — The act of determining whether or not the methodology measures all of the analyte
contained in a sample. Refer to Appendix D, Data Quality Indicators, for a more detailed definition.
Remediation — The process of reducing the concentration of a contaminant (or contaminants) in air,
water, or soil media to a level that poses an acceptable risk to human health.
Repeatability — The degree of agreement between independent test results produced by the same
analyst, using the same test method and equipment on random aliquots of the same sample within a short
time period.
Reporting limit — The lowest concentration or amount of the target analyte required to be reported
from a data collection project. Reporting limits are generally greater than detection limits and are usually
not associated with a probability level.
Representativeness — A measure of the degree to which data accurately and precisely represent a
characteristic of a population, a parameter variation at a sampling point, a process condition, or an
environmental condition. See also Appendix D, Data Quality Indicators.
Reproducibility — The precision, usually expressed as variance, that measures the variability among the
results of measurements of the same sample at different laboratories.
Requirement — A formal statement of a need and the expected manner in which it is to be met.
Research (applied) — A process, the objective of which is to gain the knowledge or understanding
necessary for determining the means by which a recognized and specific need may be met.
Research (basic) — A process, the objective of which is to gain fuller knowledge or understanding of
the fundamental aspects of phenomena and of observable facts without specific applications toward
processes or products in mind.
Research development/demonstration — The systematic use of the knowledge and understanding
gained from research and directed toward the production of useful materials, devices, systems, or
methods, including prototypes and processes.
Round-robin study — A method validation study involving a predetermined number of laboratories or
analysts, all analyzing the same sample(s) by the same method. In a round-robin study, all results are
compared and used to develop summary statistics such as interlaboratory precision and method bias or
recovery efficiency.
Ruggedness study — The carefully ordered testing of an analytical method while making slight
variations in test conditions (as might be expected in routine use) to determine how such variations affect
test results. If a variation affects the results significantly, the method restrictions are tightened to
minimize this variability.
Scientific method — The principles and processes regarded as necessary for scientific investigation,
including rules for concept or hypothesis formulation, conduct of experiments, and validation of
hypotheses by analysis of observations.
Self-assessment — The assessments of work conducted by individuals, groups, or organizations directly
responsible for overseeing and/or performing the work.
Sensitivity — The capability of a method or instrument to discriminate between measurement responses
representing different levels of a variable of interest. Refer to Appendix D, Data Quality Indicators,
for a more detailed definition.
Service — The result generated by activities at the interface between the supplier and the customer, and
the supplier's internal activities to meet customer needs. Such activities in environmental programs include
design, inspection, laboratory and/or field analysis, repair, and installation.
-------
Shall — A term denoting a requirement that is mandatory whenever the criterion for conformance with
the specification permits no deviation. This term does not prohibit the use of alternative approaches or
methods for implementing the specification so long as the requirement is fulfilled.
Should — A term denoting a guideline or recommendation whenever noncompliance with the
specification is permissible.
Significant condition — Any state, status, incident, or situation of an environmental process or condition,
or environmental technology in which the work being performed will be adversely affected sufficiently to
require corrective action to satisfy quality objectives or specifications and safety requirements.
Software life cycle — The period of time that starts when a software product is conceived and ends
when the software product is no longer available for routine use. The software life cycle typically
includes a requirement phase, a design phase, an implementation phase, a test phase, an installation and
check-out phase, an operation and maintenance phase, and sometimes a retirement phase.
Source reduction — Any practice that reduces the quantity of hazardous substances, contaminants, or
pollutants.
Span check — A standard used to establish that a measurement method is not deviating from its
calibrated range.
Specification — A document stating requirements and referring to or including drawings or other
relevant documents. Specifications should indicate the means and criteria for determining conformance.
Spike — A substance that is added to an environmental sample to increase the concentration of target
analytes by known amounts; used to assess measurement accuracy (spike recovery). Spike duplicates
are used to assess measurement precision.
Split samples — Two or more representative portions taken from one sample in the field or in the
laboratory and analyzed by different analysts or laboratories. Split samples are quality control (QC)
samples that are used to assess analytical variability and comparability.
Standard deviation — A measure of the dispersion or imprecision of a sample or population distribution,
expressed as the positive square root of the variance; it has the same unit of measurement as the mean.
Standard Operating Procedure (SOP) — A written document that details the method for an operation,
analysis, or action with thoroughly prescribed techniques and steps and that is officially approved as the
method for performing certain routine or repetitive tasks.
Supplier — Any individual or organization furnishing items or services or performing work according to a
procurement document or a financial assistance agreement. An all-inclusive term used in place of any of
the following: vendor, seller, contractor, subcontractor, fabricator, or consultant.
Surrogate spike or analyte — A pure substance with properties that mimic the analyte of interest. It is
unlikely to be found in environmental samples and is added to them to establish that the analytical method
has been performed properly.
-------
Surveillance (quality) — Continual or frequent monitoring and verification of the status of an entity and
the analysis of records to ensure that specified requirements are being fulfilled.
Technical review — A documented critical review of work that has been performed within the state of
the art. The review is accomplished by one or more qualified reviewers who are independent of those
who performed the work but are collectively equivalent in technical expertise to those who performed the
original work. The review is an in-depth analysis and evaluation of documents, activities, material, data,
or items that require technical verification or validation for applicability, correctness, adequacy,
completeness, and assurance that established requirements have been satisfied.
Technical Systems Audit (TSA) — A thorough, systematic, on-site qualitative audit of facilities,
equipment, personnel, training, procedures, record keeping, data validation, data management, and
reporting aspects of a system.
Traceability — The ability to trace the history, application, or location of an entity by means of recorded
identifications. In a calibration sense, traceability relates measuring equipment to national or international
standards, primary standards, basic physical constants or properties, or reference materials. In a data
collection sense, it relates calculations and data generated throughout the project back to the requirements
for the quality of the project.
Trip blank — A clean sample of a matrix that is taken to the sampling site and transported to the
laboratory for analysis without having been exposed to sampling procedures.
Validation — Confirmation by examination and provision of objective evidence that the particular
requirements for a specific intended use have been fulfilled. In design and development, validation
concerns the process of examining a product or result to determine conformance to user needs. See also
Appendix G, Data Management.
Variance (statistical) — A measure of dispersion of a sample or population distribution. Population
variance is the sum of squares of deviation from the mean divided by the population size (number of
elements). Sample variance is the sum of squares of deviations from the mean divided by the degrees of
freedom (number of observations minus one).
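The variance and standard deviation definitions above can be illustrated with a short calculation. The following sketch is added here for illustration only; it is not part of the model QAPP, and the replicate values are hypothetical. It computes both statistics exactly as defined:

    def population_variance(values):
        """Sum of squared deviations from the mean, divided by the population size."""
        mean = sum(values) / len(values)
        return sum((x - mean) ** 2 for x in values) / len(values)

    def sample_variance(values):
        """Sum of squared deviations from the mean, divided by the degrees of freedom (n - 1)."""
        mean = sum(values) / len(values)
        return sum((x - mean) ** 2 for x in values) / (len(values) - 1)

    def standard_deviation(values, sample=True):
        """Positive square root of the variance; carries the same unit as the mean."""
        variance = sample_variance(values) if sample else population_variance(values)
        return variance ** 0.5

    # Five hypothetical replicate benzene results in ppbv
    replicates = [1.02, 0.98, 1.05, 1.01, 0.97]
    print(round(sample_variance(replicates), 5))     # 0.00103
    print(round(standard_deviation(replicates), 4))  # 0.0321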
Verification — Confirmation by examination and provision of objective evidence that specified
requirements have been fulfilled. In design and development, verification concerns the process of
examining a result of a given activity to determine conformance to the stated requirements for that
activity.
-------
Appendix B
Air Toxics Pilot Program Technical System Audits
Laboratory Form
The following section contains the Technical Systems Audit Form that was developed for the Air Toxics
Pilot Program. The form was developed between September and November 1999.
-------
Air Toxics Pilot Program - Technical Systems Audit
Laboratory Form
Part 1- Systems Audit Checklist for Quality System Documentation
Laboratory
Assessor Name and Affiliation
Observer(s) Name and Affiliation
Reporting Organization
Assessment Date
AUDIT QUESTIONS | RESPONSE (Y / N / NA) | COMMENTS
1 . Is there an approved quality assurance
project plan (QAPP) for the overall
program and has it been reviewed by all
appropriate personnel?
2. Is a copy of the approved QAPP
available for review by field operators
and laboratory analysts? If not, briefly
describe how and where QA and quality
control (QC) requirements and
procedures are documented and are
made available to them.
3. Is the design and implementation of the
program as is specified in the QAPP?
4. Are there deviations from the QAPP?
5 . How are any deviations from the QAPP
noted?
6. Are the established procedures for
corrective or response actions when
MQOs (e.g out-of-cpntrol calibration
data) metr If yes, briefly describe them.
7. Are corrective action procedures
consistent with the QAPP?
8. Have any such corrective actions been
taken during the program?
9. Are the SOPs complete, up-to-date and
followed?
-------
10. Are written and approved standard
operating procedures (SOPs) used in
the program? If so, are these the SOPs
that were written up in the EPA Field
QAPP? Are they available for review
by field operators and laboratory
analysts? If not, briefly describe how
and where the program's operating
procedures are documented.
Additional Questions or Comments:
-------
Part 2- Systems Audit Checklist for Management and Organization
Laboratory
Assessment Date
AUDIT QUESTIONS | RESPONSE (Y / N / NA) | COMMENTS
A. ORGANIZATION AND RESPONSIBILITIES
Identify the following personnel and determine whether they have the listed
responsibilities:
1 . Lab Analysis Manager:
- Coordinates lab operations,
- Logistical support of lab operations,
- Training monitoring lab technicians,
and
- Review of routine lab data and quality
control data.
2. Lab Technician(s):
- receive samples,
- analyze samples,
- perform QA/QC checks,
- report data to Lab Manager
4. Who is authorized to halt the program in
the event of a health or safety hazard or
inadequate quality?
Additional Questions or Comments:
B. TRAINING AND SAFETY
1 . Do the lab technicians have the training
and experience for the operation of the
equipment?
-------
2. Are the staff aware of the hazards with
which they are in contact (e.g., benzene or
x-ray fluorescence)?
3. Does the program maintain current
summaries of the training/certification
and qualifications for program
personnel?
4. Is there special safety equipment that is
required for health and safety?
5. Are personnel outfitted with any required
safety equipment?
6. Are personnel adequately trained
regarding appropriate safety procedures?
Additional Questions or Comments:
-------
Part 3- Systems Audit Checklist for Monitoring Site
Laboratory
Assessment Date
AUDIT QUESTIONS | RESPONSE (Y / N / NA) | COMMENTS
A. Laboratory QA
1. Are the equipment calibration and
maintenance logs and data sheets filled out
promptly, clearly and completely?
2. Does the operator keep the
filter/sample/sample handling/preparation
area neat and clean?
3. Is there a copy of the applicable QAPP
available to the lab technicians?
4. Are copies of the SOPs available?
Additional Comments:
-------
B. Sample Handling
1 . Are all samples handled with the
necessary care and finesse to avoid
contamination and/or loss of material?
2. Check log books at the lab to verify that
field and lab blanks are being collected
and analyzed.
3 . Are blanks routinely used by the
monitoring organization? Check log
books at the lab to verify field blanks are
run periodically, as specified by the
weighing laboratory.
Trip blanks one set every 30 days
Field blanks one set every 10 days
4. Observe the following handling steps for
routine samples, verifying that the lab
tech follows the sample handling SOPs
correctly:
- receipt of samples at the sampling site
and unpacking
- completion of sample logbook entries
and other required documentation
- packing and sending to the field
- completion of chain of custody and field
data forms supplied by the reporting
organization
- samples shipped to other labs?
5 . Request the lab tech perform the field
blank sample-handling procedures (if
not possible, go through the SOP step-
by-step and verify that the technician
knows the correct procedures.):
- receipt of samples at the lab and
unpacking
- completion of sample logbook entries
and other required documentation
- inspection of the sample prior to
analysis
- completion of chain of custody and
field data forms supplied by the
reporting organization
-------
Additional Questions or Comments:
D. Calibration
1 . Is the flow rate standard used for lab
equipment calibration/verification
recalibrated or reverified against a NIST-
traceable standard at least annually?
2. Is the barometric pressure standard used
for lab equipment calibration/verification
recalibrated or re-verified against a
NIST-traceable standard at least
annually?
3. Is the temperature standard used for
routine calibration/verification
recalibrated or re-verified against a
NIST-traceable standard at least
annually?
-------
4. Obtain the SOPs used for the following
activities and observe the operator
perform the periodic verifications:
- leak check
- temperature verification
- barometric pressure verification
- flow rate check
Additional Questions or Comments:
-------
E. Sample Handling
1 . Is the sample handling area clean?
2. Is the sample handling area cleaned
before each unloading session?
3 . Are the filters handled with non-
powder latex gloves?
4. Are the filter handling forceps different
from mass reference standards
forceps?
5. Is the temperature of samples (i.e.,
DNPH cartridges) being recorded
upon receipt?
6. Are all extracts and cartridges stored
according to the QAPP and SOPs?
Additional Questions or Comments:
-------
Part 4- MQOs for Laboratory Systems
Laboratory
Assessment Date
Table 1. Analysis Matrices, Reporting Units, Holding Times and Preservation
Techniques
Parameter | Matrix | Units | Maximum Holding Time | Preservation | Compliance?
PM mass | quartz/glass | mg/m3 | <30 days at 4 °C | Store at 4 °C |
Trace metals | quartz/glass | µg/m3 | 60 days | Store at 4 °C |
PAHs, PCBs, Pesticides | QF/PUF | ng/m3 | 7 days (before extraction); 40 days (after extraction) | Store at 4 °C |
Carbonyls | DNPH | ppb | 30 days | Store at 4 °C |
VOC | stainless steel | ppb | 30 days | None |
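The holding times in Table 1 are the kind of criterion that can be screened automatically during data review. The sketch below is illustrative only and is not part of the model QAPP; the parameter names and dates are hypothetical, and the two-stage PAH/PCB/pesticide holding time is omitted for brevity.

    from datetime import date

    # Maximum holding times from Table 1, in days (illustrative lookup)
    MAX_HOLDING_DAYS = {
        "PM mass": 30,
        "Trace metals": 60,
        "Carbonyls": 30,
        "VOC": 30,
    }

    def holding_time_ok(parameter, collected, analyzed):
        """True when the elapsed days between collection and analysis are within the limit."""
        elapsed = (analyzed - collected).days
        return elapsed <= MAX_HOLDING_DAYS[parameter]

    print(holding_time_ok("Carbonyls", date(2001, 3, 1), date(2001, 3, 28)))  # True, 27 days
    print(holding_time_ok("VOC", date(2001, 3, 1), date(2001, 4, 15)))        # False, 45 days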
Table 2 Measurement Quality Objectives- X-Ray Fluorescence (XRF) Analysis of
Metals in Ambient Air Coarse (TSP/PM10) Particulate Matter
Requirement | Frequency | Acceptance Criteria | Compliance?
X-ray attenuation corrections | Each run | Not specified |
Interference corrections | Each run | Not specified |
Flow fraction collection | Each run | Not specified |
Field filter/sample blank | 1 per paired sample | Less than UDL for target analytes |
Lab filter/sample blank | 1 per run | Less than LDL for target analytes |
Run-time QC: peak areas, background areas, centroid, FWHM | 1 per run | Target and tolerance parameters by element; must be within tolerance units |
SRM 1833 and SRM 1832 | 1 per run | Uncertainty intervals for analytical results and certified values must overlap |
Chi-square measure of fit | 1 per run | <1.0 |
-------
Table 3. Measurement Quality Objectives - Gravimetric Analysis of Ambient Air Coarse (TSP/PM10)
Particulate Matter
Requirement | Frequency | Acceptance Criteria | Compliance?
After pre-weighing | All filter/samples | <30 days before sampling |
Before post-weighing | All filter/samples | <30 days at 4 °C from sampling end date |
Sampling period | All data | 24 ± 0.25 hours |
Reporting units | All data | mg/m3 |
Lower detection limit | All data | 2 mg/m3 |
Upper concentration limit | All data | 200 mg/m3 |
Visual defect check | All filter/samples | No visible defects |
Equilibration | All filter/samples | 24 hours minimum |
Temperature range | All filter/samples | 30-40 °C |
Temperature control | All filter/samples | ± 2 °C SD over 24 hours |
Humidity range | All filter/samples | 30-40% RH |
Humidity control | All filter/samples | ± 5% RH SD over 24 hours |
Pre/Post sampling RH | All filter/samples | ± 5% RH |
Balance | All filter/samples | Located in filter/sample conditioning room |
Lot blanks | 3 filter/samples per lot | <15 mg change between weighings |
Field filter/sample blank | 1 per paired sample | ± 30 mg change between weighings |
Lab filter/sample blank | 1 per weighing session | ± 15 µg change between weighings |
Balance check | Beginning, every 10th sample, end | ± 3 mg |
Duplicate filter/sample weighing | Every filter/sample | ± 15 mg change between weighings |
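Two of the checks in Table 3 lend themselves to a simple worked example: agreement between duplicate filter/sample weighings and the 24-hour stability of the conditioning environment. The sketch below is illustrative only; the limits are taken from the table as shown, and the weights and readings are hypothetical.

    def duplicate_weighing_ok(first_mg, second_mg, limit_mg=15.0):
        """Duplicate filter/sample weighings must agree within the stated limit (mg)."""
        return abs(first_mg - second_mg) <= limit_mg

    def sample_sd(values):
        """Sample standard deviation of a list of readings."""
        mean = sum(values) / len(values)
        return (sum((v - mean) ** 2 for v in values) / (len(values) - 1)) ** 0.5

    def conditioning_ok(temps_c, rh_percent):
        """Temperature SD within 2 deg C and relative humidity SD within 5% RH over 24 hours."""
        return sample_sd(temps_c) <= 2.0 and sample_sd(rh_percent) <= 5.0

    print(duplicate_weighing_ok(4712.0, 4703.0))                        # True: 9 mg apart
    print(conditioning_ok([34.8, 35.1, 35.0, 34.9], [34, 36, 35, 33]))  # True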
-------
Table 4. Measurement Quality Objectives-GC/MS Analysis of PAHs
Requirement | Frequency | Acceptance Criteria | Compliance?
Purchase specifications (filters) | All filter/samples | Binderless quartz microfiber filter/samples, 47-mm diameter |
Purchase specifications (PUFs) | All PUFs | 6.0-cm diameter cylindrical plug cut from 7.6-cm, 0.022 g/cm3 stock |
Visual defect check | All filter/samples and PUFs | No visible defects |
Lot blanks | 1 filter/sample and PUF per lot | PAHs below MDL |
Field surrogates | All filter/samples and PUFs | 60-120% recovery |
Lab surrogates | All filter/samples and PUFs | 1 µg of two deuterated PAHs |
Internal standards | All extracts | 0.5 µg of five deuterated PAHs |
GC/MS tuning | Every 12 hours of operation, or after corrective action | With decafluorotriphenylphosphine (DFTPP) to meet mass spectral ion abundance criteria |
GC/MS calibration | After corrective action | Five calibration standards containing target compounds, internal standards and surrogate compounds between MDL and detector saturation |
GC/MS continuing calibration | After GC/MS tuning | One calibration standard (as above) is within ±30% of the initial calibration |
Laboratory method blank | Every batch of samples | -50% to +100% area response and ±20.0 seconds retention time for internal standards; PAHs below MDL |
Laboratory control spike | Every batch of samples | -50% to +100% area response and ±20.0 seconds retention time for internal standards; 60-120% recovery of PAHs |
-------
Table 5. Measurement Quality Objectives-GC/ECD Analysis of PCBs and Pesticides
Requirement | Frequency | Acceptance Criteria | Compliance?
Field blank | One per sampling event | <10 ng single compound/sample, <100 ng multiple compounds/sample |
Spiked trip blank | One per sampling event | 65-125% recovery |
Solvent blank | Each batch of samples | <10 ng single compound/sample, <100 ng multiple compounds/sample |
GC/ECD calibration | After corrective action | Three calibration standards in the linear range (<20% RSD), 85-115% recovery |
GC/ECD continuing calibration | Beginning of each day and after every 10 samples | One midpoint-calibration standard with <15% RSD |
Sampling efficiency | At project start, and at least once per quarter | Recovery of >75% at <15% RSD of target compounds on a spiked filter/sample under normal sampling conditions |
Table 6. Measurement Quality Objectives-Carbonyls Analysis
Requirement | Frequency | Acceptance Criteria | Compliance?
Sample holding times | All cartridges | <30 days at 4 °C |
Sampling period | All data | 24 ± 0.25 hours |
Reporting units | All data | ppb |
Detection limit | All data | 1 ppb |
Lower detection limit | All data | 5 ppb |
Upper concentration limit | | 100 ppb |
Purchase specifications | All cartridges | 2,4-dinitrophenylhydrazine (DNPH) coated cartridges, 50-mm diameter |
Lot blanks | 1 filter/sample per lot | All carbonyls less than 1 ppb |
Field blank | One per sampling event | > 2 ppb of any carbonyl |
Replicate sample analysis | At project start, and once per quarter | <20% RSD |
Instrument calibration | Once per sample batch | Known volume and concentration of acetaldehyde |
Spiked lab blank | | <5% RSD |
Lab blank | | > 1 ppb |
-------
Table 7 Measurement Quality Objectives-Metals by ICP/MS
Requirement | Frequency | Acceptance Criteria | Compliance?
Before shipping | | <3 days at 4 °C |
Before digestion | | <7 days at 4 °C |
After digestion | | <30 days at 4 °C |
Sampling period | All data | 24 ± 1 hour |
Reporting units | All data | µg/m3 |
Detection limit | All data | 2 µg/L |
Glassware pre-conditioning | All glassware and plasticware | Washed in 1:1 nitric acid in a clean room, double-wrapped in sealed plastic bags |
Field blank | One per sampling event | Metals below MDL |
Replicate sample analysis | 30 paired analyses | <20% RSD |
Instrument calibration | Daily | Five standard concentrations, R2 > 95% |
Calibration check | Beginning of run and after every 10 samples | One mid-point standard, <5% RSD |
NIST SRM 1648 | Daily | 70-120% recovery |
Lab replicate | One per sampling event | <15% RSD |
Lab splits (with another lab) | >10 samples | <20% RSD |
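Several acceptance criteria in Table 7 (and in the other MQO tables) are expressed as a relative standard deviation or a percent recovery. The sketch below is illustrative only; the concentrations are hypothetical, and the 20% RSD and 70-120% recovery limits are taken from the table as shown.

    def percent_rsd(values):
        """100 times the sample standard deviation divided by the mean of replicate results."""
        mean = sum(values) / len(values)
        sd = (sum((v - mean) ** 2 for v in values) / (len(values) - 1)) ** 0.5
        return 100.0 * sd / mean

    def percent_recovery(measured, certified):
        """100 times the measured concentration divided by the certified (or spiked) concentration."""
        return 100.0 * measured / certified

    lead_replicates = [12.1, 11.4, 12.6]                    # hypothetical lead results, ug/L in digestate
    print(percent_rsd(lead_replicates) < 20.0)              # True: about 5% RSD
    print(70.0 <= percent_recovery(0.61, 0.655) <= 120.0)   # True: about 93% recovery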
-------
Table 8 Measurement Quality Objectives-Volatile Organic Compounds
Requirement | Frequency | Acceptance Criteria | Compliance?
Sample holding times | All canisters | <30 days |
Sampling period | All data | 24 ± 0.25 hours |
Reporting units | All data | ppb |
Detection limit | All data | 0.1 ppb |
Lower detection limit | All data | 5 ppb |
Upper concentration limit | All data | 100 ppb |
Purchase specifications | Canisters | Electro-polished stainless steel canisters |
Replicate sample analysis | At project start, and once per quarter | <20% RSD |
Instrument calibration | Once per sample batch | 5-species calibration point at beginning and end of batch run |
Spiked lab blank | Once per batch | <5% RSD |
Lab blank | Once per batch | Total VOC below 5 ppb |
-------
Appendix C
Air Toxics Pilot Program Technical System Audits
Field Form
The following section contains the Technical Systems Audit Form that was developed for the Air Toxics
Pilot Program. The form was developed between September and November 1999.
-------
Air Toxics Pilot Program - Technical Systems Audit
Field Form
Part 1- Systems Audit Checklist for Quality System Documentation
Monitoring Site Location
Assessor Name and Affiliation
Observer(s) Name and Affiliation
Reporting Organization
Assessment Date
AUDIT QUESTIONS | RESPONSE (Y / N / NA) | COMMENTS
1 . Is there an approved quality assurance
project plan (QAPP) for the overall
program and has it been reviewed by all
appropriate personnel?
2. Is a copy of the approved QAPP
available for review by field operators?
If not, briefly describe how and where
QA and quality control (QC)
requirements and procedures are
documented and are made available to
them.
3 . Is the design and implementation of the
program as is specified in the QAPP?
4. Are there deviations from the QAPP?
5 . How are any deviations from the
QAPP noted?
6. What are the critical measurements in
the program as defined in the QAPP?
7. Are there established procedures for
corrective or response actions when
MQOs (e.g., out-of-control calibration
data) are not met? If yes, briefly
describe them.
8. Are corrective action procedures
consistent with the QAPP?
9. Have any such corrective actions been
taken during the program?
-------
10. Are written and approved standard
operating procedures (SOPs) used in
the program? If so, list them on the
attached sheet and note whether they
are available for review by field
operators and laboratory analysts. If
not, briefly describe how and where the
program's operating procedures are
documented.
11. Are the SOPs complete, up-to-date,
and followed?
Additional Questions or Comments:
-------
Part 2- Systems Audit Checklist for Management and Organization
Monitoring Station
Assessment Date
AUDIT QUESTIONS | RESPONSE (Y / N / NA) | COMMENTS
A. ORGANIZATION AND RESPONSIBILITIES
Identify the following personnel and determine whether they have the listed
responsibilities:
1. Field Operations Manager:
- Development of monitoring network,
- Coordinates field operations,
- Logistical support of field operations,
- Training monitoring site operators, and
- Review of routine sampler data and
quality control data.
2. Monitoring Site Operator(s):
- Operation of samplers,
- Calibration of samplers,
- Maintenance of samplers,
- Maintenance of monitoring site
4. Who is authorized to halt the program
in the event of a health or safety hazard
or inadequate quality?
Additional Questions or Comments:
B. TRAINING AND SAFETY
1. Do the monitoring site operators have
training or experience for the operation
of the sampler?
2. Has the operator been trained in the
particular hazards of the
instruments/materials with which they
are operating?
-------
3. Is there special safety equipment
required to ensure the health and safety
of personnel?
4. Are personnel outfitted with any
required safety equipment?
5. Are personnel adequately trained
regarding appropriate safety
procedures?
Additional Questions or Comments:
-------
Part 3- Systems Audit Checklist for Monitoring Site
Monitoring Site
Assessment Date
AUDIT QUESTIONS | RESPONSE (Y / N / NA) | COMMENTS
A. Sampler Siting
1 . Does the location for the samplers and
collocated samplers conform with the
siting requirements of 40 CFR 58,
Appendices A and E?
2. Are there any changes at the site that
might compromise original siting
criteria (e.g., fast-growing trees or
shrubs, new construction)?
Additional Questions or Comments:
B. Monitoring Site
1 . Are site logbooks and required data
sheets filled in promptly, clearly, and
completely?
2. Does the operator keep the sample-
handling area neat and clean?
3. Is (are) a copy of the applicable
QAPP(s) available to the site operator?
4. Are copies of applicable SOPs
available to the site operator?
5 . Do the sampler(s) appear to be well
maintained and free of dirt and debris,
bird/animal/insect nests, excessive rust
and corrosion, etc.?
6. Are the walkways to the station and
equipment kept free of tall grass,
weeds, and debris?
7. Is the station shelter (if any) clean and
in good repair?
-------
Additional Questions or Comments:
C. Sample Handling
1 . Are all samples handled with the
necessary care and finesse to avoid
contamination and/or loss of material?
2. Are blanks routinely used by the
monitoring organization? Check log
books at the site to verify field blanks
are run periodically, as specified by the
weighing laboratory.
Trip blanks should be 1 in 30 days.
Approximately 10% of samples should
be field blanks.
-------
3. Observe the following handling steps
for routine samples, verifying that the
operator follows the sample handling
SOPs correctly:
- receipt of samples at the sampling site
and unpacking
- completion of sample logbook entries
and other required documentation
- inspection of the sample prior to
sampling
- installation of sample in the sampler
- retrieval from the sampler after
sampling
- packing and sending to the laboratory
- completion of chain of custody and
field data forms supplied by the
reporting organization
- samples shipped
4. Request the operator to perform the
field blank sample-handling procedures
(if not possible, go through the SOP
step-by-step and verify that the
operator knows the correct
procedures.):
- receipt of samples at the sampling site
and unpacking
- completion of sample logbook entries
and other required documentation
- inspection of the sample prior to
sampling
- installation of sample in the sampler
- retrieval from the sampler (without
sampling)
- packing and sending to the laboratory
- completion of chain of custody and
field data forms supplied by the
reporting organization
-------
Additional Questions or Comments:
D. Calibration
1 . Is the flow rate standard used for
routine sampler calibration/verification
recalibrated or reverified against a
NIST-traceable standard at least
annually?
2. Is the calibration relationship for the
flow rate standard (e.g., an equation,
curve, or family of curves relating
actual flow rate [Qa] to the flow rate
indicator reading) accurate to within
what is specified in the QAPP over the
expected range of ambient tem-
peratures and pressures at which the
flow rate standard may be used?
3. Is the barometric pressure standard
used for routine sampler
calibration/verification recalibrated or
re-verified against a NIST-traceable
standard at least annually?
-------
4. Is the temperature standard used for
routine sampler calibration/verification
recalibrated or re-verified against a
NIST-traceable standard at least
annually?
5. Obtain the SOPs used for the
following activities and observe the
operator perform the periodic
verifications:
- leak check
- temperature verification
- barometric pressure verification
- flow rate check
-------
E. Sample Handling
1 . Is the sample handling area clean?
2. Is the sample handling area cleaned
before each unloading session?
3. Are the filters and DNPH cartridges
handled with non-powder latex gloves?
4. Are the DNPH cartridges stored in a
refrigerator while at the monitoring
location?
5. Describe the procedure that is
followed after an exposed sample is
received from the field, including the
sample storage temperature.
Additional Questions or Comments:
-------
Part 4- MQOs for Monitoring Samplers
Monitoring Site
Assessment Date
Table 1. Total Suspended Particulate Sampler for Metals Testing, Inspection and Maintenance
Requirements
Check/Maintenance | Frequency | Requirement | Performed?
Clock check | Once per week | Current date, time + 30 minutes |
Flow rate multipoint calibration | Quarterly | 3 points between 39-60 cfm |
Leak check | Every run | Per Operating Manual |
Motor brushes | When they fail or every six months, whichever comes first | Per Service Manual |
Clean inside of housing cover | Semiannual inspection | |
Clean air screens | Semiannual | Clear of obstructions to flow |
Check timer, electrical cords and tubing | Semiannual | Per Service Manual |
Table 2. Volatile Organic Compounds (VOC) Sampler Testing, Inspection and
Maintenance Requirements
Checks/Maintenance | Frequency | Requirement | Performed?
Clock check | Once per week | Current date, time + 1 minute |
Pressure gauge | Quarterly | Ambient pressure +/- 1 psig |
Flow rate check | Quarterly | 70 cc/min +/- 5 cc/min |
Leak check | Each run, for two canisters | Loss of < 0.1 psig / 5 minutes |
Sampler inlet | Quarterly | Visual inspection |
Computer backup battery | Semiannual inspection; replace as necessary | Per Service Manual |
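The flow-rate and leak-check requirements in Table 2 can likewise be evaluated with a few lines of logic. The sketch below is illustrative only; the gauge readings are hypothetical, and the 70 +/- 5 cc/min and < 0.1 psig per 5 minutes limits are taken from the table as shown.

    def flow_check_ok(measured_cc_min, target=70.0, tolerance=5.0):
        """Measured sampler flow must fall within the target +/- tolerance window (cc/min)."""
        return abs(measured_cc_min - target) <= tolerance

    def leak_check_ok(start_psig, end_psig, minutes, limit_per_5min=0.1):
        """Pressure loss, scaled to a 5-minute interval, must stay below the limit (psig)."""
        loss_per_5min = (start_psig - end_psig) * (5.0 / minutes)
        return loss_per_5min < limit_per_5min

    print(flow_check_ok(72.4))              # True: within 70 +/- 5 cc/min
    print(leak_check_ok(29.95, 29.90, 5))   # True: 0.05 psig loss over 5 minutes
    print(leak_check_ok(30.00, 29.80, 5))   # False: 0.20 psig loss over 5 minutes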
-------
Table 3. DNPH Carbonyl Sampler Testing, Inspection and Maintenance Requirements
Checks/Maintenance | Frequency | Requirement | Performed?
Clock check | Once per week | Current date, time + 1 minute |
Flow rate check | Quarterly | 1.01 l/min +/- 10 cc/min |
Leak check | Each run, for two cartridges | Loss of < 0.1 psig / 5 minutes |
Sampler inlet | Quarterly | Visual inspection |
Computer backup battery | Semiannual inspection; replace as necessary | Per Service Manual |
Table 4. SemiVolatile Organic Compounds Testing, Inspection and Maintenance
Requirements
Checks/Maintenance | Frequency | Requirement | Performed?
Clock check | Once per week | Current date, time + 2 minutes |
Flow rate check | Quarterly | 0.2 m3/min +/- 0.02 m3/min |
Leak check | Every run | Per Operating Manual |
Motor brushes | When they fail or every six months, whichever comes first | Per Service Manual |
Clean inside of housing cover | Semiannual inspection | |
Clean air screens | Semiannual | Clear of obstructions to flow |
Check timer, electrical cords and tubing | Semiannual | Per Service Manual |
-------
Appendix D
Field Operations and Analytical and Calibration Procedures
This model QAPP contains only a placeholder for the SOPs. SOPs should be developed by the
State and Local Agencies, since these are specific to the agencies' methods. The following document
can assist the agencies in creating their SOPs. The Internet address is:
http://www.epa.gov/ttn/amtic/airtxfil.html, and the document name is "Pilot City Air Toxics
Measurement Summary," February 2001, EPA No. 454/R-01-003. This document discusses
the findings and recommendations of the Air Toxics Pilot City Measurement Workgroup from
November 2000 to January 2001.
-------
TECHNICAL REPORT DATA
(Please read Instructions on reverse before completing)
1. REPORT NO.
EPA-354/RO1-001
3. RECIPIENT'S ACCESSION NO.
4. TITLE AND SUBTITLE
Quality Assurance Guidance Document., Quality Assurance
Project Plan for the Air Toxics Monitoring Program
5. REPORT DATE
07/01
6. PERFORMING ORGANIZATION CODE
7. AUTHOR(S)
Dennis Mikel, Michael Papp
8. PERFORMING ORGANIZATION REPORT NO.
9. PERFORMING ORGANIZATION NAME AND ADDRESS
U.S. Environmental Protection Agency
Office of Air Quality Planning and Standards
Research Triangle Park, NC 27711
10. PROGRAM ELEMENT NO.
11. CONTRACT/GRANT NO.
12. SPONSORING AGENCY NAME AND ADDRESS
Director
Office of Air Quality Planning and Standards
Office of Air and Radiation
U.S. Environmental Protection Agency
Research Triangle Park, NC 27711
13. TYPE OF REPORT AND PERIOD COVERED
14. SPONSORING AGENCY CODE
EPA/200/04
15. SUPPLEMENTARY NOTES
16. ABSTRACT
The Quality Guidance Document is the Quality Assurance Project Plan that outlines the field operations for a Model
Air Toxics Monitoring Program. The guidance document gives details on how to
set up, operate, and perform all quality control and assurance duties that are required to provide precise, accurate
and representative data. This guidance document also has two appendices. The first appendix is the glossary of
terms. The second appendix references an AMTIC document. This Guidance Document is written in model
format. The QAPP uses a fictitious city and outlines how an agency should approach the task of writing a QAPP.
KEY WORDS AND DOCUMENT ANALYSIS
a. DESCRIPTORS | b. IDENTIFIERS/OPEN ENDED TERMS | c. COSATI Field/Group
Air Quality Monitoring
Quality Assurance
Air Pollution Control
18. DISTRIBUTION STATEMENT
Release Unlimited
19. SECURITY CLASS (Report)
Unclassified
20. SECURITY CLASS (Page)
Unclassified
21. NO. OF PAGES
22. PRICE
EPA Form 2220-1 (Rev. 4-77)    PREVIOUS EDITION IS OBSOLETE
------- |